Apple researchers discover how AI can infer human activity from motion and sound

The news: New Apple research suggests the iPhone maker is pairing large language models (LLMs) with traditional device sensors to build a more precise understanding of what a user is doing in real time. The approach is likely to surface in sensor-enabled smartphones, computers, and smart home hubs built around ambient intelligence.

Apple’s study shows that LLMs can identify everyday activities by combining simple audio captions and motion data, per 9to5Mac. The LLM never received raw user audio, only brief text captions generated by audio transcription models.
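In practical terms, the setup described amounts to late fusion of text summaries: an audio model produces a short caption, motion data is reduced to a brief description, and both are composed into a prompt from which the LLM picks the most likely activity. A minimal Python sketch of that idea follows; query_llm, ACTIVITIES, and the example inputs are hypothetical stand-ins, not Apple's actual pipeline.

    # Illustrative sketch only, assuming a generic text-completion LLM
    # behind a hypothetical query_llm() helper.

    ACTIVITIES = ["cooking", "exercising", "commuting", "working at a laptop", "watching TV"]

    def query_llm(prompt: str) -> str:
        """Hypothetical stand-in for any LLM completion API."""
        raise NotImplementedError("wire up an LLM provider here")

    def classify_activity(audio_caption: str, motion_summary: str) -> str:
        # The privacy property the study highlights: the LLM sees only
        # short text captions, never raw audio or raw sensor streams.
        prompt = (
            "Given these observations from a device, pick the single most "
            f"likely activity from {ACTIVITIES}.\n"
            f"Audio caption: {audio_caption}\n"
            f"Motion summary: {motion_summary}\n"
            "Activity:"
        )
        return query_llm(prompt).strip()

    # Example inputs a captioning model and a motion pipeline might produce:
    # classify_activity("water running, dishes clinking", "standing, small arm movements")

The privacy property falls out of the interface design itself: only brief text descriptions ever cross into the LLM.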

For marketers, this points toward context-aware experiences and ambient interactions that respond to user intent rather than taps.

Key stat: 88.6% of programmatic buyers in North America now use intent data, and 85.0% rely on behavioral signals, far surpassing demographic or lifestyle targeting, per Datonics. 

Apple’s research points to a future where those signals become richer, more moment-based, and privacy-safe. Instead of broad categories like “in-market” for shopping intent, devices could detect whether someone is focused at a laptop, prepping dinner, or in motion—real-world cues that shape relevance.

Privacy caveats remain: Context-aware AI must stay transparent, optional, and user-controlled to preserve trust, unlike Meta's use of AI conversations to personalize ads.

Apple is unlikely to expose granular activity labels to advertisers; instead, brands may receive tokenized, privacy-safe signals tied to timing and utility rather than user identity.
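What such a signal could look like is speculative, but the shape implied is a coarse, anonymous context hint rather than a user profile. A hypothetical Python sketch, with ContextSignal and mint_signal invented purely for illustration:

    from dataclasses import dataclass
    import secrets
    import time

    @dataclass(frozen=True)
    class ContextSignal:
        """Hypothetical privacy-safe signal: coarse context, no user identity."""
        token: str             # opaque, per-signal identifier, not linkable to a person
        activity_bucket: str   # coarse category, e.g. "cooking", not raw sensor data
        confidence: float      # model confidence in the bucket
        epoch_minute: int      # coarse timestamp for timing/utility, not tracking

    def mint_signal(activity_bucket: str, confidence: float) -> ContextSignal:
        return ContextSignal(
            token=secrets.token_hex(8),  # fresh token each time: no cross-session identity
            activity_bucket=activity_bucket,
            confidence=round(confidence, 2),
            epoch_minute=int(time.time() // 60),
        )

    # mint_signal("cooking", 0.87) -> a one-off context hint a brand could act on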

What this means for advertisers: Apple's research shows how multimodal AI can unlock real-time, intent-driven interactions without compromising privacy, marking a shift from reactive design to anticipatory experiences shaped by context.

Brands should explore how to design for moments, not messages: build content and promotions that surface organically based on a user's activity, be it cooking, commuting, or exercising, so the brand shows up when it matters most.
