Symmetry and antisymmetry properties of optimal solutions to regression problems

  • Original Paper
  • Optimization Letters

Abstract

Besides requiring a good fit of the learned model to the empirical data, machine learning problems usually require such a model to satisfy additional constraints. Their satisfaction can be either imposed a priori, or checked a posteriori, once the optimal solution to the learning problem has been determined. In this framework, the paper proves that the optimal solutions to several batch and online regression problems (specifically, the Ordinary Least Squares, Tikhonov regularization, and Kalman filtering problems) satisfy, under certain conditions, either symmetry or antisymmetry constraints, where the symmetry/antisymmetry is defined with respect to a suitable transformation of the data. Computational issues related to the theoretical results (i.e., the reduction of the dimensions of the matrices involved in computing the optimal solutions) are also discussed. The results, which are validated numerically, have potential applications in machine-learning problems such as pairwise binary classification, learning of preference relations, and learning the weights associated with the directed arcs of a graph under symmetry/antisymmetry constraints.
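For orientation, the flavor of these results can be reproduced in a few lines. The sketch below is not the paper's code: the block-swap transformation P, the ridge form of Tikhonov regularization, and the dataset-invariance conditions are illustrative assumptions. It checks numerically that the regularized least-squares solution is symmetric when every training pair (x, y) appears together with its transformed copy (Px, y), and antisymmetric when the copy instead carries the negated target (Px, -y).

    import numpy as np

    rng = np.random.default_rng(0)
    d, n, lam = 4, 50, 0.1          # block size, sample count, regularization

    # Block-swap transformation P: (x_a, x_b) -> (x_b, x_a), with P = P^T = P^{-1}.
    P = np.block([[np.zeros((d, d)), np.eye(d)],
                  [np.eye(d),        np.zeros((d, d))]])

    def ridge(X, y, lam):
        # Tikhonov-regularized least squares: w = (X^T X + lam I)^{-1} X^T y.
        return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

    X0 = rng.standard_normal((n, 2 * d))
    y0 = rng.standard_normal(n)

    # Symmetric case: each example (x, y) comes with its swapped copy (Px, y).
    w_sym = ridge(np.vstack([X0, X0 @ P]), np.concatenate([y0, y0]), lam)
    print(np.allclose(P @ w_sym, w_sym))     # True: P w = w

    # Antisymmetric case: the swapped copy carries the negated target (Px, -y).
    w_anti = ridge(np.vstack([X0, X0 @ P]), np.concatenate([y0, -y0]), lam)
    print(np.allclose(P @ w_anti, -w_anti))  # True: P w = -w

Setting lam to zero recovers the Ordinary Least Squares case (with the pseudoinverse in place of the solve when \(X^{\top}X\) is singular).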


Notes

  1. Its particular expressions reported in Assumption 4.1 follow by imposing, respectively, \(\bar{\mathbf{w}}^{(2)}=\bar{\mathbf{w}}^{(1)}\) in the symmetric case, and \(\bar{\mathbf{w}}^{(2)}=-\bar{\mathbf{w}}^{(1)}\) in the antisymmetric case.
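Relatedly, a minimal sketch of the dimension reduction mentioned in the abstract, assuming a data matrix partitioned into blocks \(\mathbf{A}^{(1)}\), \(\mathbf{A}^{(2)}\) conformal with \(\bar{\mathbf{w}}^{(1)}\), \(\bar{\mathbf{w}}^{(2)}\) (the paper's Assumption 4.1 is not reproduced here): imposing either constraint eliminates half of the unknowns,

\[ \min_{\bar{\mathbf{w}}}\,\bigl\|\mathbf{A}^{(1)}\bar{\mathbf{w}}^{(1)}+\mathbf{A}^{(2)}\bar{\mathbf{w}}^{(2)}-\mathbf{y}\bigr\|_2^2 \ \ \text{s.t.}\ \ \bar{\mathbf{w}}^{(2)}=\pm\,\bar{\mathbf{w}}^{(1)} \quad\Longrightarrow\quad \min_{\bar{\mathbf{w}}^{(1)}}\,\bigl\|\bigl(\mathbf{A}^{(1)}\pm\mathbf{A}^{(2)}\bigr)\bar{\mathbf{w}}^{(1)}-\mathbf{y}\bigr\|_2^2, \]

with the \(+\) sign in the symmetric case and the \(-\) sign in the antisymmetric one, so the matrices entering the normal equations have half as many columns.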


Author information


Corresponding author

Correspondence to Giorgio Gnecco.


About this article


Cite this article

Gnecco, G. Symmetry and antisymmetry properties of optimal solutions to regression problems. Optim Lett 11, 1427–1442 (2017). https://doi.org/10.1007/s11590-016-1101-x

