Metric Nearness Made Practical

Authors

  • Wenye Li, The Chinese University of Hong Kong, Shenzhen; Shenzhen Research Institute of Big Data
  • Fangchen Yu, The Chinese University of Hong Kong, Shenzhen
  • Zichen Ma, The Chinese University of Hong Kong, Shenzhen

DOI:

https://doi.org/10.1609/aaai.v37i7.26041

Keywords:

ML: Optimization, ML: Unsupervised & Self-Supervised Learning

Abstract

Given a square matrix of noisy dissimilarity measures between pairs of data samples, the metric nearness model computes the best approximation of the matrix from the set of valid distance metrics. Despite its wide applications in machine learning and data processing tasks, the model is computationally demanding to solve because of the large number of metric (triangle-inequality) constraints that define the feasible region. We design a practical two-stage approach that tackles this challenge and improves the model's scalability and applicability. The first stage computes a fast yet high-quality approximate solution from a set of isometrically embeddable metrics, further improved by an effective heuristic. The second stage refines the approximate solution with the Halpern-Lions-Wittmann-Bauschke (HLWB) projection algorithm, which converges quickly to the optimal solution. In empirical evaluations, the proposed approach runs at least an order of magnitude faster than state-of-the-art solutions, with significantly improved scalability, full conformity to constraints, lower memory consumption, and other features desirable in real applications.
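To make the "metric constraints" in the abstract concrete: a valid metric must satisfy the triangle inequality d(i, j) ≤ d(i, k) + d(k, j) for every triple, which is why the feasible region has O(n³) constraints. The sketch below is an illustration only, not the paper's two-stage HLWB method: it restores all triangle inequalities via Floyd-Warshall shortest paths, yielding the largest metric that is entrywise no greater than the input (the function names are my own).

```python
def enforce_triangle(D):
    """Return the all-pairs-shortest-path closure of a symmetric
    dissimilarity matrix D (zero diagonal, nonnegative entries).
    The result satisfies every triangle inequality."""
    n = len(D)
    M = [row[:] for row in D]            # work on a copy
    for k in range(n):                   # standard Floyd-Warshall update
        for i in range(n):
            for j in range(n):
                if M[i][k] + M[k][j] < M[i][j]:
                    M[i][j] = M[i][k] + M[k][j]
    return M

def violations(M):
    """Count triples (i, j, k) with M[i][j] > M[i][k] + M[k][j]."""
    n = len(M)
    return sum(1 for i in range(n) for j in range(n) for k in range(n)
               if M[i][j] > M[i][k] + M[k][j] + 1e-12)
```

For example, D = [[0, 1, 10], [1, 0, 2], [10, 2, 0]] violates the triangle inequality (10 > 1 + 2), and the closure lowers the offending entry to 3. This simple repair only ever decreases entries; the paper's approach instead seeks the nearest metric under a norm, which may move entries in either direction.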

Published

2023-06-26

How to Cite

Li, W., Yu, F., & Ma, Z. (2023). Metric Nearness Made Practical. Proceedings of the AAAI Conference on Artificial Intelligence, 37(7), 8648-8656. https://doi.org/10.1609/aaai.v37i7.26041

Section

AAAI Technical Track on Machine Learning II