We propose a method for detecting ad-hoc meetings based on the cross-correlation of audio feature data collected from personal mobile terminals. The method can detect whether a conversation is taking place between each pair of users without access to raw audio data. In a two-day evaluation with eight users, the method detected meeting contexts with an average F-measure of 0.9. We also introduce example applications, such as a document search application in which the detected meeting context is used as a file annotation.
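The core idea of cross-correlating per-user audio features can be illustrated with a minimal sketch. This is not the authors' implementation: the feature sequences, lag window, and scoring function here are hypothetical stand-ins (e.g. frame-level energy values), chosen only to show why two terminals hearing the same conversation yield a high correlation peak while unrelated terminals do not.

```python
import random
import statistics

def conversation_score(feat_a, feat_b, max_lag=10):
    """Peak normalized cross-correlation between two users' audio-feature
    sequences (hypothetical frame-level features, not the paper's exact ones).
    A high peak suggests both terminals captured the same conversation."""
    def norm(x):
        # Zero-mean, unit-variance normalization so the score is scale-free.
        m = statistics.fmean(x)
        s = statistics.pstdev(x) or 1e-9
        return [(v - m) / s for v in x]

    a, b = norm(feat_a), norm(feat_b)
    n = len(a)
    best = 0.0
    for k in range(-max_lag, max_lag + 1):
        # Correlate b shifted by lag k against a, over the overlapping frames.
        score = sum(a[i] * b[i + k] for i in range(n) if 0 <= i + k < n) / n
        best = max(best, score)
    return best

# Two terminals near the same speech correlate strongly; an unrelated
# terminal hearing independent noise does not.
random.seed(0)
speech = [random.gauss(0, 1) for _ in range(200)]
near = [v + 0.1 * random.gauss(0, 1) for v in speech]  # same conversation
far = [random.gauss(0, 1) for _ in range(200)]          # different room
print(conversation_score(speech, near) > conversation_score(speech, far))
```

A threshold on this peak score (tuned on held-out data) would then give the binary conversation/no-conversation decision for each user pair.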
Cite as: Okamoto, M., Iketani, N., Nishimura, K., Kikuchi, M., Cho, K., Hattori, M., Tsuboi, S. (2008) Finding two-level interpersonal context: proximity and conversation detection from personal audio feature data. Proc. Interspeech 2008, 2482-2485, doi: 10.21437/Interspeech.2008-615
@inproceedings{okamoto08b_interspeech,
  author={Masayuki Okamoto and Naoki Iketani and Keisuke Nishimura and Masaaki Kikuchi and Kenta Cho and Masanori Hattori and Sougo Tsuboi},
  title={{Finding two-level interpersonal context: proximity and conversation detection from personal audio feature data}},
  year={2008},
  booktitle={Proc. Interspeech 2008},
  pages={2482--2485},
  doi={10.21437/Interspeech.2008-615}
}