Attention-driven transfer learning framework for dynamic model guided time domain chatter detection

Published in: Journal of Intelligent Manufacturing

Abstract

Online chatter detection is crucial for ensuring the quality and safety of high-speed milling. The rapid development of deep learning provides a promising tool for chatter detection. However, most previous chatter detection studies rely on complex signal processing techniques, which separate feature extraction from chatter detection and reduce detection efficiency. In addition, these studies cover only a limited range of machining conditions, because their models are developed from experimental data, and performing experiments over numerous combinations of machining parameters is expensive and time-consuming. To address these drawbacks, this paper proposes a transfer learning chatter detection framework that does not rely on any experimental data. The framework comprises a dynamic milling process model, a Double Attention-driven One-Dimensional Convolutional Neural Network (DAOCNN), and an ensemble prediction strategy. First, the dynamic milling process model generates simulated cutting force signals over a wide range of machining parameters, providing abundant training data and saving experimental cost. Then, without any complex signal processing, the proposed DAOCNN predicts detection results directly from the time-domain signals, eliminating the separation of feature extraction and chatter detection. Finally, a novel ensemble prediction strategy ensures accurate and robust predictions. To validate the effectiveness of the proposed framework, actual milling experiments are carried out under different cutting conditions, and comparative studies with other detection approaches and ensemble methods are performed. The results demonstrate that milling stability is predicted accurately and efficiently, indicating that the proposed framework can be a promising tool for online chatter detection.
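The abstract names the pipeline's three components but not their internals. As a rough illustration only, the overall data flow can be sketched in NumPy: a 1-D convolution over a time-domain force signal, a channel-attention gate followed by a temporal-attention gate standing in for the "double attention" mechanism, and majority voting standing in for the ensemble strategy. Every function name, the toy gating scheme, and the random parameters below are assumptions for illustration, not the authors' DAOCNN architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv1d(x, w):
    """Valid 1-D convolution of a (channels, time) signal with
    (out_ch, in_ch, k) filters."""
    out_ch, in_ch, k = w.shape
    t_out = x.shape[1] - k + 1
    y = np.zeros((out_ch, t_out))
    for o in range(out_ch):
        for t in range(t_out):
            y[o, t] = np.sum(w[o] * x[:, t:t + k])
    return y

def channel_attention(f):
    """Weight each channel by a gate from its global average
    (squeeze-and-excitation-like toy, no learned MLP)."""
    gate = sigmoid(f.mean(axis=1, keepdims=True))  # shape (C, 1)
    return f * gate

def temporal_attention(f):
    """Weight each time step by a gate from the cross-channel mean."""
    gate = sigmoid(f.mean(axis=0, keepdims=True))  # shape (1, T)
    return f * gate

def daocnn_like_forward(x, w, head):
    """Conv -> ReLU -> double attention (channel, then temporal)
    -> global average pooling -> linear head -> chatter probability."""
    f = np.maximum(conv1d(x, w), 0.0)
    f = temporal_attention(channel_attention(f))
    z = f.mean(axis=1)          # global average pool over time
    return sigmoid(z @ head)    # scalar probability in (0, 1)

def ensemble_predict(x, models, threshold=0.5):
    """Majority vote over several independently parameterised models."""
    votes = [daocnn_like_forward(x, w, h) > threshold for w, h in models]
    return int(sum(votes) > len(votes) / 2)

# Toy usage: one-channel force signal, three randomly initialised models.
signal = rng.standard_normal((1, 128))
models = [(rng.standard_normal((4, 1, 5)) * 0.1,
           rng.standard_normal(4) * 0.1) for _ in range(3)]
label = ensemble_predict(signal, models)  # 0 = stable, 1 = chatter
```

In the paper's setting the training inputs would be the simulated cutting-force signals from the dynamic milling model rather than random noise, and the parameters would be learned; the sketch only shows how time-domain input can reach a stability label without hand-crafted signal-processing features.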

(Figures 1–15 appear in the full article.)

Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grants 52075267 and 52005332), the China Scholarship Council (No. 202106840031), and SIMTech, A*STAR (C21-J3-026).

Author information

Corresponding author

Correspondence to Jeong Hoon Ko.

Ethics declarations

Ethical approval

This material is the authors’ own original work, which has not been previously published elsewhere. The paper is not currently being considered for publication elsewhere. The paper reflects the authors’ own research and analysis in a truthful and complete manner.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Yin, C., Wang, Y., Ko, J.H. et al. Attention-driven transfer learning framework for dynamic model guided time domain chatter detection. J Intell Manuf 35, 1867–1885 (2024). https://doi.org/10.1007/s10845-023-02133-0
