Meta Spark (now increasingly referred to under the wider Meta AR umbrella) is the company's creator toolkit for building AR effects on Instagram and Facebook. With over two billion active users across these platforms, it represents the single largest distribution channel for AR content in the world.
At Verticar, we build on both Snap's Lens Studio and Meta Spark regularly — sometimes for the same client. Here's our honest assessment of what Meta's AR platform offers, where it excels, and where it falls short.
What Meta Spark is
Meta Spark is a desktop application (Mac and Windows) for creating AR filters and effects that run inside the Instagram and Facebook camera. Published effects appear as camera filters accessible to anyone in those apps — no download required. Effects can be triggered by face, body, world (environment), hands, or image targets.
The distribution advantage is massive: with over 600 million daily Instagram Story users, a well-placed AR effect can reach scale that would require significant media spend on other channels. The platform's Creator Marketplace also connects brand-ready effects with influencer creators who can drive authentic distribution.
What Meta Spark does well
Image target AR
Meta Spark's image target (marker-based) AR is arguably the best in class among consumer social platforms. You define a trigger image — packaging, a poster, a QR code — and the effect activates when the camera recognises it. We used this capability to great effect on our Level Shoes project, where the packaging itself became the AR activation point.
The robustness of the tracking and the quality of the anchored world effects in Meta Spark's image target implementation are genuinely impressive. Experiences stay locked to the target even in challenging lighting conditions.
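The activation pattern is simple to reason about: anchored content should render only while the trigger image is in view. A minimal sketch of that gating pattern in plain JavaScript (a hypothetical tracker interface of our own, not the Meta Spark API):

```javascript
// Hypothetical sketch, not the Meta Spark API: a target "gate" that
// shows anchored content on a found event and hides it on a lost event.
function createTargetGate() {
  let inView = false;
  const listeners = [];
  return {
    onChange(fn) { listeners.push(fn); }, // subscribe to visibility changes
    found() { inView = true; listeners.forEach((fn) => fn(true)); },
    lost() { inView = false; listeners.forEach((fn) => fn(false)); },
    isInView() { return inView; },
  };
}

// Usage: bind a stand-in render flag to the gate
const gate = createTargetGate();
let effectVisible = false;
gate.onChange((visible) => { effectVisible = visible; });
gate.found(); // effectVisible is now true
```

The same found/lost event shape generalises to any marker-based trigger, whether the target is packaging, a poster, or a QR code.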
Body tracking
Full-body tracking in Meta Spark is smooth and real-time, making it ideal for clothing try-ons, body paint effects, and full-figure brand overlays. Combined with the Instagram Reels format, body-tracked effects have strong shareability — they're visually striking in the 9:16 vertical format.
The reach argument
If your goal is raw impressions, Meta can't be beaten on social AR. The combination of Facebook's older demographic (higher purchasing power) with Instagram's visual culture and Gen Z penetration gives brands access to an unusually broad audience with a single effect.
Where Meta Spark falls short
Developer experience
Compared to Lens Studio, Spark's developer tooling is less mature. The JavaScript-based scripting environment is capable, but the documentation is inconsistent, community resources are thinner, and the platform has fewer pre-built templates for complex interaction patterns. At Verticar, the typical build time for an equivalent effect is 20–30% longer in Spark than in Lens Studio.
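Spark's scripting model is reactive: rather than polling values every frame, you bind outputs to signals that update automatically. A hypothetical miniature of that pattern in plain JavaScript (illustrating the idea, not reproducing the Spark API):

```javascript
// Miniature reactive signal, our own illustration of the pattern
// Spark scripting uses; not the Spark API itself.
function signal(initial) {
  let value = initial;
  const subs = [];
  return {
    get: () => value,
    set(next) { value = next; subs.forEach((fn) => fn(next)); },
    // map() derives a new signal that tracks this one through fn
    map(fn) {
      const derived = signal(fn(value));
      subs.push((v) => derived.set(fn(v)));
      return derived;
    },
  };
}

// e.g. drive an overlay's opacity from a stand-in mouth-openness value
const mouthOpenness = signal(0);
const overlayOpacity = mouthOpenness.map((v) => Math.min(1, v * 2));
mouthOpenness.set(0.25); // overlayOpacity.get() is now 0.5
```

Once the binding is declared, the derived value stays current without any per-frame update loop, which is the core idea to internalise before working in Spark's scripting environment.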
Review process
Meta's effect review process is slower and less predictable than Snap's. Campaign timelines need to account for a 3–7 day review window that can extend significantly if the effect involves branded content or product references. We've experienced review times of over two weeks for effects that were straightforward in intent but triggered manual review flags.
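In practice we back-plan submissions from the launch date. A toy helper (our own illustration, not a Meta tool) that pads the review window quoted above with a safety buffer for effects likely to trigger manual review:

```javascript
// Illustrative only: pads the review window with a safety buffer and
// returns the latest date an effect should be submitted for review.
const DAY_MS = 24 * 60 * 60 * 1000;

function latestSubmission(launchDate, reviewDays = 7, bufferDays = 7) {
  return new Date(launchDate.getTime() - (reviewDays + bufferDays) * DAY_MS);
}

// For a 15 March launch with the default 14-day total lead time:
const submitBy = latestSubmission(new Date("2025-03-15T00:00:00Z"));
// submitBy is 1 March 2025
```

The default buffer here is an assumption; branded content or product references warrant an even larger one.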
Compute constraints
Meta Spark effects run on-device under significant performance constraints, particularly on Android, where the range of hardware is enormous. Complex 3D environments, high-poly assets, and computationally expensive shaders that work perfectly in the Spark desktop simulator can drop frames or fail outright on lower-end devices in the wild. Performance testing across device tiers is essential.
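One way to manage this is to define per-tier asset budgets up front and author fallbacks against them. The tiers and numbers below are our own illustrative assumptions, not published Spark limits:

```javascript
// Assumed tiers and budgets for illustration; not published Spark limits.
const ASSET_BUDGETS = {
  high: { maxTriangles: 50000, maxTextureSize: 1024, customShaders: true },
  mid:  { maxTriangles: 20000, maxTextureSize: 512,  customShaders: true },
  low:  { maxTriangles: 8000,  maxTextureSize: 256,  customShaders: false },
};

function budgetFor(tier) {
  // Unknown hardware defaults to the most conservative budget
  return ASSET_BUDGETS[tier] ?? ASSET_BUDGETS.low;
}
```

Defaulting unknown devices to the lowest tier is the safer failure mode: an effect that looks slightly plainer still ships, while one that stutters gets closed and never shared.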
Snap vs. Meta: how we choose
At Verticar, we recommend platforms based on the specific brief rather than defaulting to one over another. Our general heuristics:
- Go Meta Spark when: the brief requires physical-world triggers (packaging, OOH), when the target audience is primarily Instagram users, or when the campaign needs to reach older demographics alongside younger ones.
- Go Lens Studio when: the brief requires the fastest review cycle, the highest creative ambition in face and body AR, AR glasses capability, or when Spotlight/TikTok-equivalent distribution is the goal.
- Go both when: the budget and timeline allow, and the campaign needs maximum organic reach. We've built multi-platform AR campaigns for clients like Racesquare where the same creative concept was adapted to both platforms simultaneously.
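The heuristics above can be written down as a toy scoring function. This is a sketch of our decision process with invented field names, not a real Verticar tool:

```javascript
// Toy encoding of the platform heuristics above; the brief fields are
// our own invention for illustration.
function recommendPlatform(brief) {
  if (brief.budgetForBoth && brief.maxOrganicReach) return "both";

  const metaScore =
    (brief.physicalWorldTriggers ? 1 : 0) + // packaging, OOH
    (brief.instagramFirstAudience ? 1 : 0) +
    (brief.olderDemographics ? 1 : 0);

  const snapScore =
    (brief.fastestReviewCycle ? 1 : 0) +
    (brief.ambitiousFaceBodyAR ? 1 : 0) +
    (brief.arGlasses ? 1 : 0) +
    (brief.spotlightDistribution ? 1 : 0);

  // Ties are a judgment call; this sketch defaults to Meta for reach
  return metaScore >= snapScore ? "Meta Spark" : "Lens Studio";
}
```

In reality the choice is rarely a clean score, but making the criteria explicit keeps the platform conversation grounded in the brief rather than in habit.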
The future of Meta AR
Meta is investing heavily in AR hardware: Ray-Ban Meta smart glasses already have a large install base, and Meta's internal AR glasses project (codenamed Orion) represents the company's long-term bet on wearable computing. As Meta builds towards a consumer AR headset, the Spark ecosystem will likely evolve from social filters into the creative layer for its hardware platform.
For brands, this means building Meta AR capability now creates optionality — the skills and assets built for Spark today will transfer to Meta's AR glasses ecosystem when it matures.
Building AR across Snap and Meta? Talk to Verticar — we work on both platforms daily and can advise on the right approach for your brief.