DOI: 10.1145/3533028.3533302

LLVM code optimisation for automatic differentiation: when forward and reverse mode lead in the same direction

Published: 12 June 2022

ABSTRACT

Both forward and reverse mode automatic differentiation compute the derivatives of a model function automatically, as needed for gradient descent. Reverse mode calculates all partial derivatives in a single pass, whereas forward mode must rerun the algorithm once for every variable with respect to which a derivative is needed. To enable in-database machine learning, we have integrated automatic differentiation as an SQL operator inside the Umbra database system. To benchmark code generation to GPU, we implement both forward and reverse mode automatic differentiation. Inspection of the optimised LLVM code shows that nearly the same machine code is executed once the generated code has been optimised. Thus, both modes yield similar runtimes but different compilation times.
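
A minimal sketch, not the Umbra operator or the generated LLVM code described in the paper: the Dual and Tape types below are hypothetical and only illustrate how forward mode (one pass per input variable) and reverse mode (all partial derivatives from one recorded pass) compute the same gradient of f(x, y) = x*y + sin(x).

```cpp
// Sketch only: contrasts forward and reverse mode AD on f(x, y) = x*y + sin(x).
#include <cmath>
#include <cstdio>
#include <vector>

// ---- Forward mode: dual numbers carry the value and one directional derivative.
struct Dual {
    double val;  // primal value
    double dot;  // derivative w.r.t. the seeded input variable
};
Dual operator*(Dual a, Dual b) { return {a.val * b.val, a.dot * b.val + a.val * b.dot}; }
Dual operator+(Dual a, Dual b) { return {a.val + b.val, a.dot + b.dot}; }
Dual sin(Dual a) { return {std::sin(a.val), std::cos(a.val) * a.dot}; }

Dual f(Dual x, Dual y) { return x * y + sin(x); }

// ---- Reverse mode: record operations on a tape, then propagate adjoints backwards.
struct Tape {
    struct Node { int lhs, rhs; double dlhs, drhs; };  // parent indices and local partials
    std::vector<Node> nodes;
    int var() { nodes.push_back({-1, -1, 0.0, 0.0}); return (int)nodes.size() - 1; }
    int binary(int l, int r, double dl, double dr) {
        nodes.push_back({l, r, dl, dr});
        return (int)nodes.size() - 1;
    }
    std::vector<double> gradient(int output) {
        std::vector<double> adj(nodes.size(), 0.0);
        adj[output] = 1.0;                       // seed d(output)/d(output) = 1
        for (int i = output; i >= 0; --i) {      // single backward sweep over the tape
            if (nodes[i].lhs >= 0) adj[nodes[i].lhs] += nodes[i].dlhs * adj[i];
            if (nodes[i].rhs >= 0) adj[nodes[i].rhs] += nodes[i].drhs * adj[i];
        }
        return adj;
    }
};

int main() {
    double xv = 2.0, yv = 3.0;

    // Forward mode: two passes, seeding dx = 1 and then dy = 1.
    Dual dfdx = f({xv, 1.0}, {yv, 0.0});
    Dual dfdy = f({xv, 0.0}, {yv, 1.0});
    std::printf("forward : df/dx=%f df/dy=%f\n", dfdx.dot, dfdy.dot);

    // Reverse mode: one pass building the tape, one backward sweep for all partials.
    Tape t;
    int x   = t.var(), y = t.var();
    int xy  = t.binary(x, y, yv, xv);             // d(x*y)/dx = y, d(x*y)/dy = x
    int sx  = t.binary(x, x, std::cos(xv), 0.0);  // d(sin x)/dx = cos x; second slot unused
    int out = t.binary(xy, sx, 1.0, 1.0);         // sum node
    std::vector<double> adj = t.gradient(out);
    std::printf("reverse : df/dx=%f df/dy=%f\n", adj[x], adj[y]);
}
```

For this single-output function, the reverse sweep yields both df/dx and df/dy at once, while forward mode must be seeded and rerun per input variable, which is exactly the trade-off the abstract describes.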


  • Published in

    DEEM '22: Proceedings of the Sixth Workshop on Data Management for End-To-End Machine Learning
    June 2022
    63 pages
    ISBN: 9781450393751
    DOI: 10.1145/3533028

    Copyright © 2022 ACM

    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    • Published: 12 June 2022


    Qualifiers

    • research-article

    Acceptance Rates

    DEEM '22 Paper Acceptance Rate: 9 of 13 submissions, 69%
    Overall Acceptance Rate: 23 of 37 submissions, 62%
