Deep Dive · Snapchat Spectacles

Spectacles
So Far

Nine years. Five generations. A $3 billion bet on the future of computing. The complete history of Snapchat Spectacles — and what Verticar has built on them.

Author Verticar Studio
Date March 2026
Read time 12 min

When Snap Inc. quietly sold the first pair of Spectacles from a vending machine in Venice Beach in November 2016, nobody predicted the company would eventually build toward the most ambitious consumer AR glasses the world has ever seen. What started as a toy — a pair of sunglasses with a button that shoots circular videos — has evolved over five hardware generations into a fully see-through AR computer you wear on your face, running an entire operating system, with hand tracking, eye tracking, world understanding and AI built in.

This is the complete story of Spectacles, told from the perspective of a studio that has been building on them since the developer release of Generation 5.


The Five Generations

2016
Gen 1
The original

Spectacles 1st Generation — The toy that started it all

Snap launched Spectacles in November 2016 with a strategy as unusual as the product itself: a single bright yellow "Snapbot" vending machine that appeared without announcement in different locations across the US, always generating a queue. The scarcity was engineered — and it worked. The internet went wild.

The first Spectacles were simple: a pair of sunglasses with a circular camera on the front, a button to record 10-second videos in the circular format native to Snapchat, and a charging case that doubled as a power bank. There was no AR. No display. No processing. Just a camera, some lights, and a Bluetooth connection to your phone.

Snap sold around 150,000 units, took a reported $40M write-down on excess inventory, and quietly wound the product down by 2018. Gen 1 was a proof of concept for wearable content creation, not a computing platform — but it established Snap's belief that the camera would eventually move from your pocket to your face.

2018
Gen 2
Refined design

Spectacles 2nd Generation — Slimmer, smarter, still no AR

Generation 2 arrived in April 2018, this time available for purchase online rather than through vending machine theatre. The form factor was refined — lighter, slimmer, more conventionally wearable — and Snap added a polarised lens option for the first time.

The camera was upgraded for better photo and video quality — and photo capture was added alongside video for the first time. Audio recording also improved. But fundamentally, this was still a camera you wore, not a computer you interacted with. There was still no display — you watched your recordings on your phone.

Gen 2 performed modestly but didn't break through to mainstream adoption. Snap continued its investment in the wearables space despite the commercial headwinds, which in hindsight looks like one of the more prescient decisions in consumer tech history.

2019
Gen 3
First with AR

Spectacles 3rd Generation — Depth, dimension and the first taste of AR

Generation 3, launched in September 2019 at $380, was a meaningful leap forward. Two HD cameras plus two depth cameras gave Spectacles 3 the ability to capture genuine 3D content — videos with real depth information that could be processed into 3D Snaps and, crucially, used to anchor simple AR effects in the recorded video.

This was the first generation where the word "AR" applied in any meaningful sense. Using Lens Studio, creators could build effects that understood the depth data captured by the glasses and placed digital objects in the scene in a spatially aware way. The AR wasn't rendered live as you wore the glasses — effects were applied to the captured footage afterwards — but the results looked dramatically more grounded than phone-camera AR had ever been.

Gen 3 sold better than its predecessors but remained a niche product — priced beyond casual adoption and with a creative ceiling that required Lens Studio expertise to explore. But it proved the hardware roadmap: more cameras, more data, richer AR.

2021
Gen 4
First see-through display

Spectacles 4th Generation — The first real AR glasses

May 2021. Snap CEO Evan Spiegel walks onto a stage and puts on a pair of glasses. He looks up at a wall. Digital objects appear to float on it — visible through the lenses, anchored in the real world, interactive. This was the first time Spectacles had a display.

Generation 4 used waveguide displays to project AR imagery into the wearer's field of view through see-through lenses. For the first time, AR wasn't something you recorded and watched back — it was something you saw in real time as you moved through the world. World tracking, face tracking and a custom processor enabled interactive experiences built in Lens Studio.

Snap was explicit: Generation 4 was a developer device only, not a consumer product. It wasn't sold at retail — Snap distributed units to selected creators and developers by application, positioning the glasses as a platform for experimentation. The resolution was limited, the battery life short, and the form factor bulky. But it was proof that see-through AR in a consumer glasses form factor was achievable — and that Snap had the technology to get there.

2024
Gen 5
Most advanced AR glasses ever made

Spectacles 5th Generation — A fully capable spatial computer

September 2024. After three years of iterating on the developer platform, Snap launched the 5th generation Spectacles at their Partner Summit — and the leap was dramatic. See-through waveguide displays with significantly improved brightness and field of view. Hand tracking with millimetre-level precision. Eye tracking. World understanding that persists across sessions. Spatial audio. Snap OS — a full operating system purpose-built for AR glasses — running on dual Qualcomm Snapdragon processors.

The hardware is backed by Lens Studio 5, which ships with the Spectacles Interaction Kit (pre-built spatial UI components), SnapML integration for custom machine learning models, multiplayer simulation tools, and an AI gateway that allows Lenses to query GPT-4o or Gemini Vision from within the experience.

Generation 5 is still developer-only at $99/month. But the platform is no longer an experiment — it's infrastructure. The 2025 update added the Depth Module API, Automated Speech Recognition in 40+ languages, the Snap3D API for real-time AI-generated 3D objects, Fleet Management for multi-device deployments, and Guided Mode for location-based installations. This is a platform a studio can build a serious product on.

And that's exactly what Verticar is doing.

The road to Specs — 2026 and the consumer launch

In June 2025, Evan Spiegel took to a stage again and made the announcement the developer community had been building towards: the 6th generation "Snap Specs" would launch as a consumer product in 2026. No subscription. Accessible pricing. A real product for real people.

He framed it bluntly: "We've spent 11 years and more than $3 billion to invent a new type of computer for augmented reality." That framing is important — Snap isn't positioning Specs as an accessory or a gadget. They're positioning it as the successor to the smartphone as a computing form factor.

The 6th generation will integrate both OpenAI GPT and Google Gemini AI natively, enabling Lenses that can understand what you're looking at, respond to voice, and provide contextual intelligence in real time — all without a phone in your hand. The developer ecosystem being built on Generation 5 today is the same platform that will run on consumer Specs.

This is why the timing matters. The studios building Spectacles experiences now are the ones who'll have production-ready content at the consumer launch moment. That moment is approaching fast.


What Verticar has built on Spectacles

Since joining the Spectacles developer program, Verticar has been building three distinct experiences — each exploring a different dimension of what spatial AR can be. Here's a look at what we've shipped.

Verticar × Spectacles — Project 01

Bitmoji Farm Simulator

A spatial AR game that transforms any flat surface into a working Bitmoji farm — plant, tend and harvest crops as your personalised Bitmoji character, visible through the glasses lenses. Built primarily around hand tracking as the input method: reach out, interact with objects, trigger actions with natural gestures. No controllers. No phone. Just your hands and the world around you.

Snap OS · Hand Tracking · World AR · Bitmoji SDK

Verticar × Spectacles — Project 02

Chef's Assistant

An AI-powered hands-free cooking guide that lives in your field of view while you work in the kitchen. Recipe steps float at eye level. Timers tick in the corner. Ingredient quantities appear when you glance at the counter. Powered by Gemini vision integration, the Lens identifies what's on the surface and suggests the next step — all without touching your phone, without a screen in your way, without breaking your flow.

Gemini AI · World Tracking · Object Recognition · Spatial UI

Verticar × Spectacles — Project 03

Bodyweight Class

A fully spatial fitness experience — a life-sized virtual coach visible in the room around you, demonstrating every movement at scale and guiding you through a complete bodyweight workout with real-time rep counting, form cues and rest intervals. Your hands stay free to train. Your eyes stay on the coach. The phone stays in your pocket. This is what AR glasses are made for.

Body Tracking · Spatial Coach · Rep Counting · Fitness AR
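To give a flavour of what "real-time rep counting" means in practice: stripped of the tracking layer, it reduces to watching a joint angle cross thresholds with a simple two-phase state machine. This is our own illustrative sketch — the class, angle values and thresholds are hypothetical, not the actual body-tracking API:

```typescript
// Counts one rep each time a tracked joint angle travels from full
// extension through the bottom of the movement and back up again
// (e.g. elbow angle during a push-up).
class RepCounter {
  reps = 0;
  private phase: "up" | "down" = "up";

  // Hypothetical thresholds in degrees: below 90° counts as the bottom
  // of the rep, above 160° as full extension.
  constructor(private downAngle = 90, private upAngle = 160) {}

  // Called once per tracking frame with the current joint angle.
  update(jointAngle: number): number {
    if (this.phase === "up" && jointAngle < this.downAngle) {
      this.phase = "down"; // reached the bottom of the movement
    } else if (this.phase === "down" && jointAngle > this.upAngle) {
      this.phase = "up"; // back to extension: one complete rep
      this.reps += 1;
    }
    return this.reps;
  }
}
```

The two-phase hysteresis is the important part: a rep only counts after the full down-then-up cycle, so partial movements and tracking jitter near a threshold don't inflate the count — which is also where form cues can hook in.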

What these projects tell us

Building these three experiences taught us a lot about what Spectacles can and can't do well right now — and about what spatial AR as a medium actually demands from creators.

Input is the new canvas. Every interaction we designed started with the question: how does this feel in your hands, in the physical world? Touch screens disappear. Spatial computing is inherently embodied — the most natural interactions are the most successful ones. Hand tracking on Gen 5 is remarkably capable, and designing around it rather than forcing phone-native UI patterns is the difference between an experience that feels futuristic and one that just feels clunky.
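To make that concrete: the most basic spatial gesture, the pinch, comes down to a fingertip-distance check with hysteresis so the state doesn't flicker at the boundary. The Spectacles Interaction Kit ships its own pinch events — this sketch, with its types and thresholds, is purely our own illustration of the underlying logic:

```typescript
// Minimal pinch detector: fires once when thumb and index fingertips
// come together, with hysteresis so the state doesn't flicker.
type Vec3 = { x: number; y: number; z: number };

const dist = (a: Vec3, b: Vec3): number =>
  Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);

class PinchDetector {
  private pinching = false;
  // Hypothetical thresholds in metres: start a pinch under 2 cm,
  // release only once the fingertips separate beyond 4 cm.
  constructor(private startDist = 0.02, private endDist = 0.04) {}

  // Returns true only on the frame the pinch begins.
  update(thumbTip: Vec3, indexTip: Vec3): boolean {
    const d = dist(thumbTip, indexTip);
    if (!this.pinching && d < this.startDist) {
      this.pinching = true;
      return true; // pinch started this frame
    }
    if (this.pinching && d > this.endDist) {
      this.pinching = false; // fingers separated: pinch released
    }
    return false;
  }

  get isPinching(): boolean {
    return this.pinching;
  }
}
```

The gap between the start and release thresholds is the design choice that makes hand input feel solid rather than twitchy — the same principle applies to every gesture, not just the pinch.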

AI changes everything. The Chef's Assistant wouldn't exist without the Gemini integration in Lens Studio 5. The ability to point a camera at the physical world and get intelligent, contextual responses in real time — surfaced as spatial AR overlays — is a genuinely new capability. It turns Spectacles from a display into a perception layer. That's a very different kind of product.

The content problem is real. The biggest constraint on Spectacles adoption isn't hardware — Gen 5 is impressively capable — it's content. Without compelling experiences to wear them for, consumers won't adopt them. The job for studios like Verticar between now and the 2026 Specs launch is to build the content library that gives people a reason to put them on.

Looking ahead to Specs 2026

The consumer launch of Snap Specs in 2026 will be one of the defining consumer-tech moments of the decade. For the first time, a genuinely capable pair of AR glasses — one refined over nine years and five hardware generations — will be available to buy. Not as a developer subscription. Not as a prototype. As a product.

The brands and content studios that are present at that moment with polished, purposeful Spectacles experiences will have an enormous first-mover advantage. The ones who wait until after launch to start building will be playing catch-up in a medium that rewards early expertise.

Verticar is building now. Our experiences are live. Our understanding of the platform is deep. If your brand wants to be at the Specs launch with something worth wearing — let's start that conversation today.


Spectacles Gen 5 is available to developers at spectacles.com. Snap Specs consumer launch is confirmed for 2026. Verticar is a certified Lens Studio developer and Snap partner.

Build for Specs
before 2026

Verticar is building Spectacles experiences now — spatial games, AI utilities and brand activations. If your brand wants to be ready at launch, start the conversation today.