Snap Partner Summit is where the Snapchat platform's roadmap becomes public — and 2024's edition was arguably the most consequential in the company's history. With the 5th generation Spectacles launching in September and Lens Studio 5 shipping major new capabilities, the platform is moving from social media feature to spatial computing infrastructure.
Here's Verticar's breakdown of what matters most for AR content creators and brands building on Snap.
Spectacles 5th Gen: the most capable AR glasses yet
The headline announcement was Spectacles. Snap's 5th generation AR glasses are a genuine step-change from previous versions — see-through waveguide lenses, hand tracking, eye tracking, persistent world understanding, and Snap OS running full-stack on the device.
What this means for creators: for the first time, you can build AR experiences that understand the physical world, respond to natural gestures, and anchor digital content to reality with latency low enough (Snap quotes 13 ms motion-to-photon) that it stays locked in place. The "AR on your phone" experience is compelling; the "AR in your field of view" experience is transformative.
Spectacles 5th gen is currently developer-only at $99/month. Snap will use this period to build out the experience ecosystem ahead of the consumer launch, which it has since confirmed will be the 6th generation "Specs", arriving in 2026.
Lens Studio 5: the biggest update in years
Lens Studio 5 shipped alongside the Summit and introduced a rewritten rendering pipeline, new asset types, and first-class Spectacles support. Key additions for developers:
- Spectacles Interaction Kit (SIK): Pre-built spatial UI components — buttons, sliders, panels — all optimised for hand and eye input. Reduces spatial UI development time dramatically.
- SnapML integration: Bundle custom machine learning models directly into Lenses. Enables real-time object recognition, activity detection and custom gesture triggers.
- Multiplayer simulation: Test shared-world multiplayer Lenses in the editor without needing multiple physical devices. A massive quality-of-life win for teams building collaborative experiences.
- Custom ML object trackers: Turn any physical object into an interaction surface — products, packaging, printed materials, even hand gestures — using custom-trained models.
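To make the custom-tracker idea concrete, here is a minimal sketch of the event-driven pattern Lens Studio scripts follow: a tracker reports when a physical object enters or leaves view, and Lens logic toggles AR content in response. All names here (`ObjectTracker`, `onFound`, `onLost`, `report`) are illustrative stand-ins, not the real Lens Studio API.

```typescript
// Hypothetical sketch: a custom-trained object tracker drives Lens logic
// via found/lost callbacks, mirroring the callback style of Lens Studio
// scripts. Names are illustrative, not Snap's actual API.

type TrackerCallback = () => void;

class ObjectTracker {
  private foundHandlers: TrackerCallback[] = [];
  private lostHandlers: TrackerCallback[] = [];

  onFound(cb: TrackerCallback): void { this.foundHandlers.push(cb); }
  onLost(cb: TrackerCallback): void { this.lostHandlers.push(cb); }

  // Simulates a detection result arriving from the custom ML model.
  report(visible: boolean): void {
    const handlers = visible ? this.foundHandlers : this.lostHandlers;
    handlers.forEach((cb) => cb());
  }
}

// Lens logic: show a 3D overlay only while the tracked product is in view.
let overlayVisible = false;
const productTracker = new ObjectTracker();
productTracker.onFound(() => { overlayVisible = true; });
productTracker.onLost(() => { overlayVisible = false; });

productTracker.report(true);
console.log(overlayVisible);  // true: product detected, overlay shown
productTracker.report(false);
console.log(overlayVisible);  // false: product gone, overlay hidden
```

The same shape applies whether the trigger is a product, packaging, or a gesture: the model does the recognition, and the Lens only handles the found/lost transitions.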
AI integration: the platform gets smarter
Snap announced deep AI integrations that will eventually ship to both Spectacles and the main Snapchat camera. The most significant: Remote Service Gateway, which allows Lenses to send secure, camera-informed queries to third-party AI services while preserving user privacy.
In practice this means a Lens can see what the camera sees, send a privacy-preserving representation to GPT-4o or Gemini Vision, and receive a response — all within the AR experience. The use cases range from real-time product recognition to contextual recommendations to AI-powered creative direction.
This isn't a future capability — it's available now in Lens Studio for Spectacles developers, and will come to the main Snapchat camera over time.
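The flow described above can be sketched in miniature. This is an illustrative simulation of the pattern, not Snap's actual Remote Service Gateway API: the camera frame is reduced to a coarse, metadata-free representation before anything leaves the device, and the remote vision model (here a local stand-in) answers from that payload alone.

```typescript
// Illustrative sketch of a privacy-preserving camera query. All function
// and type names are hypothetical; only the pattern (reduce on-device,
// query remotely, render the answer) reflects the description above.

interface Frame { width: number; height: number; pixels: Uint8Array; }

// Step 1: shrink the frame to a coarse thumbnail on-device, so the remote
// service never receives the full-resolution image (nearest-neighbour).
function toPrivacyPreservingPayload(frame: Frame, size = 64): Frame {
  const pixels = new Uint8Array(size * size);
  for (let y = 0; y < size; y++) {
    for (let x = 0; x < size; x++) {
      const sx = Math.floor((x * frame.width) / size);
      const sy = Math.floor((y * frame.height) / size);
      pixels[y * size + x] = frame.pixels[sy * frame.width + sx];
    }
  }
  return { width: size, height: size, pixels };
}

// Step 2: stand-in for the remote vision model (GPT-4o, Gemini Vision, ...).
async function queryVisionModel(payload: Frame, prompt: string): Promise<string> {
  return `Answered from a ${payload.width}x${payload.height} thumbnail for: "${prompt}"`;
}

// Step 3: the Lens ties the two together and renders the response.
async function askAboutScene(frame: Frame, prompt: string): Promise<string> {
  const payload = toPrivacyPreservingPayload(frame);
  return queryVisionModel(payload, prompt);
}

const cameraFrame: Frame = {
  width: 640, height: 480, pixels: new Uint8Array(640 * 480),
};
askAboutScene(cameraFrame, "What product is on the shelf?")
  .then((answer) => console.log(answer));
```

The key design point is that the reduction step runs on-device: whatever representation the gateway actually uses, the third-party service only ever sees the reduced payload, never the raw camera feed.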
Location-based AR: Guided Mode and fleet tools
One of the most underreported announcements: Snap shipped a suite of tools specifically for location-based AR deployments. Guided Mode lets operators configure Spectacles to launch a specific Lens automatically when worn — perfect for retail installations, museum experiences, and brand activations where you don't want users navigating menus.
Fleet Management allows remote monitoring and control of multiple Spectacles devices, enabling large-scale deployments at venues. This is the infrastructure layer that makes Spectacles-powered installations commercially viable.
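To illustrate what that infrastructure layer looks like in practice, here is a hypothetical sketch of the state a fleet dashboard might track for a venue deployment. The field names and thresholds are invented for illustration; they are not Snap's Fleet Management API.

```typescript
// Hypothetical fleet model for a Guided Mode deployment: every headset is
// pinned to one Lens that auto-launches when worn, and an operator view
// flags devices that need attention. All names are illustrative.

interface FleetDevice {
  id: string;
  assignedLens: string;   // Guided Mode: this Lens launches automatically
  batteryPercent: number;
  online: boolean;
}

function devicesNeedingAttention(fleet: FleetDevice[], minBattery = 20): FleetDevice[] {
  return fleet.filter((d) => !d.online || d.batteryPercent < minBattery);
}

const museumFleet: FleetDevice[] = [
  { id: "spectacles-01", assignedLens: "gallery-tour", batteryPercent: 84, online: true },
  { id: "spectacles-02", assignedLens: "gallery-tour", batteryPercent: 12, online: true },
  { id: "spectacles-03", assignedLens: "gallery-tour", batteryPercent: 67, online: false },
];

console.log(devicesNeedingAttention(museumFleet).map((d) => d.id));
// ["spectacles-02", "spectacles-03"]: one low on battery, one offline
```

However the real tooling is shaped, this is the operational question it answers: of the headsets on the floor right now, which ones can a visitor pick up and use?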
What this means for brands
The Snap platform in 2024 has capabilities that brands are largely unaware of. Most marketers still think of Snap AR as "face filters for teenagers." The reality is that Snap now offers:
- Full-body spatial AR with persistent world anchoring
- Custom AI-powered lens logic with external model queries
- Multiplayer shared AR experiences
- Hands-free AR glasses deployments for physical locations
- Real-time 3D object generation with Snap3D API
Brands that engage with these capabilities now will be the ones with proven AR infrastructure when Specs launch to consumers in 2026. The learning curve is real — building for spatial computing is different to building for a phone camera — and the time to start is now.
Verticar's position
Verticar attended Summit as a Snap certified partner. We're already building on Spectacles and have direct relationships with Snap's developer and partner teams. If you're a brand looking to explore what's possible on the Snap platform — from a campaign lens to a full Spectacles deployment — we're the right partner.
Curious what Spectacles could do for your brand? Read our AR Glasses page or get in touch directly.