We deep dive today into an ISWC 2024 Honorable Mention.
Self-recording of eating behaviors is a step toward a healthy lifestyle recommended by many health professionals. However, the current practice of manually recording eating activities with paper records or smartphone apps is often unsustainable and inaccurate. Smart glasses have emerged as a promising wearable form factor for tracking eating behaviors, but existing systems primarily identify when eating occurs without capturing details of the eating activities (e.g., what is being eaten). In this paper, we present EchoGuide, an application and system pipeline that leverages low-power active acoustic sensing to guide head-mounted cameras in capturing egocentric videos, enabling efficient and detailed analysis of eating activities. By combining active acoustic sensing for eating detection with video captioning models and large language models for retrieval augmentation, EchoGuide intelligently clips and analyzes videos to create concise, relevant records of eating activity. We evaluated EchoGuide with 9 participants in naturalistic settings involving eating activities, demonstrating high-quality summarization and significant reductions in the amount of video data needed, paving the way for practical, scalable eating activity tracking.
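The abstract describes a gated pipeline: a low-power acoustic detector decides when eating is happening, only those windows of egocentric video get clipped and captioned, and a language model condenses the captions into an eating record. Below is a minimal sketch of that flow, assuming hypothetical per-second eating-probability scores and stand-in captioning/LLM functions; none of the names or thresholds come from the paper itself.

```python
# Hypothetical sketch of an EchoGuide-style pipeline: acoustic detection gates
# the camera, captioning runs only on detected clips, an LLM summarizes them.
from dataclasses import dataclass
from typing import List


@dataclass
class Clip:
    start_s: float  # clip start time in the egocentric video (seconds)
    end_s: float    # clip end time (seconds)


def detect_eating_windows(scores: List[float], threshold: float = 0.8) -> List[Clip]:
    """Stand-in for the low-power active acoustic eating detector.

    Each element of `scores` is assumed to be a per-second eating-probability
    from an on-device classifier; contiguous runs above `threshold` become
    candidate clips to record.
    """
    clips, start = [], None
    for t, p in enumerate(scores):
        if p >= threshold and start is None:
            start = t
        elif p < threshold and start is not None:
            clips.append(Clip(float(start), float(t)))
            start = None
    if start is not None:
        clips.append(Clip(float(start), float(len(scores))))
    return clips


def caption_clip(clip: Clip) -> str:
    """Stand-in for a video captioning model applied only to detected clips."""
    return f"Person eats food with a utensil between {clip.start_s:.0f}s and {clip.end_s:.0f}s."


def summarize_with_llm(captions: List[str]) -> str:
    """Stand-in for the retrieval-augmented LLM that turns captions into a record."""
    return "Eating record: " + " ".join(captions)


if __name__ == "__main__":
    # Simulated per-second eating probabilities from the acoustic sensor.
    scores = [0.1, 0.2, 0.9, 0.95, 0.92, 0.3, 0.1, 0.85, 0.9, 0.2]
    clips = detect_eating_windows(scores)        # acoustic sensing gates the camera
    captions = [caption_clip(c) for c in clips]  # caption only the clipped video
    print(summarize_with_llm(captions))          # concise, relevant activity record
```

The point of the gating step is the data reduction the abstract claims: captioning and summarization only ever see the short clips the acoustic detector flags, not the full day of egocentric video.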
39 episodes