Snap x CodeLab
Introduction
For CodeLab’s 2025 Spring cohort, our team partnered with Snap to take on the Animal Kingdom Lens Challenge, a creative opportunity to explore how AR can bring immersive experiences to life.
Over six weeks, four cross-functional teams of designers and developers collaborated to design, prototype, and publish interactive AR lenses using Snap’s Lens Studio.
This project was part of Snap’s broader mission to empower creators and developers with tools to build playful, expressive, and meaningful AR experiences for a global audience.
The Team
Timeframe
Project Duration
April 14th — May 30th, 2025 (6 weeks)
Estimated Timeline
Intro Phase (Week of April 14)
- Virtual kickoff, Snap AR ecosystem intro, first Lens walkthrough
- Outcome: Understand what’s possible with AR
Phase 1 (April 15 — April 25)
- Lens Studio basics, asset library, build first AR experience
- Outcome: Foundational skills, initial lens built
Phase 2 (April 28 — May 16)
- Challenge brief released, build 3–4 Lenses, team milestones
- Outcome: Team AR project submission
Phase 3 (May 19 — May 30)
- Final pitch, Snap feedback, iteration + publishing
- Outcome: Final lenses published and reviewed
Tools
Design — Figma, Adobe Illustrator
Development — Lens Studio, JavaScript
Project Goals
Through the CodeLab x Snap partnership, we formed four cross-functional teams to conceptualize and build original lenses under the “Animal Kingdom” challenge theme.
From selfie-view filters that respond to birth year inputs to interactive games that challenge users to feed virtual cats, our lenses tackled both artistic storytelling and backend logic. Across all projects, we emphasized collaborative iteration, real-time testing, and feedback loops from Snap mentors.
Business Value & Impact
By creating four distinct lenses, the teams introduced a wide range of AR capabilities, including both face-tracked and world-tracked experiences. For example, some lenses focused on self-expression using animated accessories and head bindings, while others invited users to interact with animated 3D characters in their physical space. This variety not only broadened the creative use of Lens Studio but also aligned with Snap’s goal of offering diverse and culturally resonant experiences for its global user base.
Each team pushed the boundaries of Lens Studio by implementing and debugging key platform features such as body tracking, the AnimationMixer, world tracking, and Behavior scripting. In doing so, they validated how these tools perform in realistic development cycles, from setup to testing, helping surface practical limitations and creative opportunities within Lens Studio.
Lens 1: “What is your Zodiac Animal?”
Overview
Inspired by Chinese zodiac traditions and in celebration of AAPI month, this lens allows users to discover their zodiac animal in an interactive and personalized way. The lens asks users to select their birth year from a drop-down menu. Once selected, a celebratory animation is triggered: their corresponding zodiac animal appears perched on their shoulder, along with a floating card listing personality traits associated with their sign. Gong sound effects and festive red/gold banners complete the moment.
Design Process:
We began by sketching out how users would move through the experience: selecting their birth year, triggering a reveal animation, and receiving a personalized zodiac result. Our inspiration came from traditional Lunar New Year decorations and temple signage, which helped us shape the overall visual direction: bold reds and golds with stylized animal designs.
To ensure cultural relevance and visual appeal, we researched zodiac animal traits and built a UI that balanced fun and elegance. We also included relevant facts about each zodiac animal, so the lens is not only fun and interactive but also educational.
Development Process:
Body Tracking:
The core features of the What’s Your Zodiac? lens include head and shoulder tracking, which attach illustrations to the user as they interact with the lens. After the user taps the “Reveal my zodiac!” button, the informational banner describing the user’s zodiac attaches to their forehead, and the corresponding zodiac animal attaches to each of the user’s shoulders.
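The attachment itself is set up in the editor (the banner under a head binding, the animals under body-tracking attachment points), so the script’s main job is to keep those bound objects hidden until the reveal. Below is a minimal sketch of that toggle; the input names are hypothetical and assigned in the Inspector.

```javascript
// ZodiacReveal.js: keep the tracked objects hidden until the reveal is triggered.
//@input SceneObject foreheadBanner    // child of the head binding
//@input SceneObject leftShoulderPet   // child of a shoulder attachment point
//@input SceneObject rightShoulderPet

function setRevealed(isRevealed) {
    script.foreheadBanner.enabled = isRevealed;
    script.leftShoulderPet.enabled = isRevealed;
    script.rightShoulderPet.enabled = isRevealed;
}

// Start hidden; the button script calls script.api.reveal() once the year is confirmed.
setRevealed(false);
script.api.reveal = function () {
    setRevealed(true);
};
```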
Scripting:
All interactive features were controlled through JavaScript scripting in Lens Studio. To make images tappable (such as for buttons), we added an InteractionComponent to the image so it could capture the user’s taps. We then handled those interactions in script files, which let us implement the increment and decrement controls for the birth-year selector and reveal the user’s zodiac for the selected year once the “Reveal my zodiac!” button was tapped.
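A minimal sketch of that year-selector logic is below. It assumes the “+”, “−”, and “Reveal my zodiac!” images each have an InteractionComponent wired (for example through a Behavior’s “Call Object API” response) to the matching function; the input names and the zodiac lookup are illustrative.

```javascript
// ZodiacSelector.js: increment/decrement a birth year and map it to a zodiac sign.
//@input Component.Text yearText     // on-screen year display
//@input Component.Text resultText   // banner text showing the zodiac result

// One common ordering of the 12-year cycle; (year - 4) % 12 maps 2020 to Rat.
var ANIMALS = ["Rat", "Ox", "Tiger", "Rabbit", "Dragon", "Snake",
               "Horse", "Goat", "Monkey", "Rooster", "Dog", "Pig"];

var year = 2000;

function refreshYearText() {
    script.yearText.text = year.toString();
}

// Exposed so the button scripts / Behaviors can call these on tap.
script.api.incrementYear = function () { year += 1; refreshYearText(); };
script.api.decrementYear = function () { year -= 1; refreshYearText(); };

script.api.revealZodiac = function () {
    var animal = ANIMALS[((year - 4) % 12 + 12) % 12];
    script.resultText.text = "Your zodiac animal is the " + animal + "!";
};

refreshYearText();
```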
Technical Challenges + Decision:
At first, we tried using a head-bound UI for the year selector, thinking it would make the experience feel more immersive. But in testing, we ran into issues as the UI kept disappearing or moving around too much when the user moved their head. It was frustrating and didn’t feel reliable. After discussing it with the Snap AR team, we decided to switch to a simple, static UI at the top of the screen instead. It made everything easier to use and more consistent across different devices.
Lens 2: “unCovery: Animal Magazine”
Overview:
unCovery brings wonder into Snapchatters’ everyday lives by framing their animal on a magazine cover that focuses attention and curiosity on it. The lens cycles through different magazine covers, and the Snapchatter selects a cover as they snap a photo of their animal. Afterward, they have the option to edit the default text and post their unCovery magazine cover!
Design Process:
After drafting several ideas that fit the theme and considering our constraints, we were drawn to the magazine cover, which centers on a photo and incorporates layers of graphic elements. What sets this lens apart from other Vogue-style lenses on Snapchat is that it is built from five layers, whereas existing magazine lenses typically use only two. To bring more variation to the experience, there are four alternating magazine covers that vary in color, shape, and style.
Development Process:
Randomization:
We used scripting to randomize the magazine layout components such as colors, headlines, and decorative graphics. Each time the user taps the screen, a new combination of assets is pulled from a predefined set, giving the feel of an ever-changing editorial shoot.
Tap-to-Select Mechanism:
The user taps anywhere on the screen to trigger a full layout refresh. This was implemented through Lens Studio’s TouchComponent, which triggers the randomizer function and repositions or reassigns the assets in real time.
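A minimal sketch of that refresh logic is below, assuming each layer’s variants are assigned as object arrays in the Inspector. The input names are illustrative, and a screen-level tap event stands in for however the Touch Component ultimately forwards the tap.

```javascript
// CoverRandomizer.js: pick a random combination of cover layers on each tap.
//@input SceneObject[] backgrounds   // layer 1: background overlays
//@input SceneObject[] frames        // layer 2: frame decorations
//@input SceneObject[] headlines     // layer 3: headline variants
//@input SceneObject[] subheadings   // layer 4: subheading variants
//@input SceneObject[] stickers      // layer 5: decorative graphics

// Enable exactly one object from a set and hide the rest.
function showRandom(set) {
    if (!set || set.length === 0) { return; }
    var pick = Math.floor(Math.random() * set.length);
    for (var i = 0; i < set.length; i++) {
        set[i].enabled = (i === pick);
    }
}

function randomizeCover() {
    showRandom(script.backgrounds);
    showRandom(script.frames);
    showRandom(script.headlines);
    showRandom(script.subheadings);
    showRandom(script.stickers);
}

// Refresh the layout whenever the user taps the screen.
script.createEvent("TapEvent").bind(randomizeCover);

// Start with a random cover so the lens never opens blank.
randomizeCover();
```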
Technical Challenges + Decision:
One of our biggest technical challenges was managing the complexity of layering multiple 2D elements in a way that felt polished and dynamic. Unlike most magazine-style lenses that use only two visual layers, this lens was designed with five distinct visual layers — including a dynamic headline, subheadings, frame decorations, and background overlays.
Additionally, ensuring that the lens remained lightweight and within Snap’s performance limits was a challenge. Too many high-resolution graphics or overlapping animations caused preview lag or slow responsiveness. To solve this, we optimized all visual assets by reducing their file sizes and testing different compression levels to retain visual quality while keeping the experience smooth.
Finally, we initially explored letting users fully customize text in-lens, but quickly ran into limitations with Lens Studio’s support for dynamic text input. As a workaround, we kept the default text editable after the snap is taken, which encourages users to personalize their post using Snapchat’s built-in caption tool.
Lens 3: “Pet Gala”
Overview:
Pet Gala is a playful AR experience that lets users dress up their pets in Met Gala-inspired outfits. With a scrollable carousel of looks, fun accessories, and tap-to-change backgrounds, users can create their own fashion-forward pet photoshoot. It’s a lighthearted way to bring fashion and creativity into the world of pet AR.
Design Process:
For the outfits, we wanted them to feel more immersive and interactive, so we aimed for a 3D look. We started by designing the components in Figma, then brought them into Illustrator to add depth and detail. We used shading and perspective to make them pop, and exported everything as SVGs for the developers to implement.
Development Process:
Scripting:
To make everything work smoothly, we used JavaScript in Lens Studio. Tap interactions were tracked using Behavior events, which then triggered visual updates to the active pet outfit or background. We used object toggling and visibility switching to show and hide elements on demand, keeping the transitions lightweight and fast. We also added logic to reset the look after a full cycle, ensuring the lens never felt stuck or confusing for users.
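A minimal sketch of that toggling logic is below, assuming the outfit and background variants are assigned as object arrays in the Inspector and the Behavior-registered taps call into this script’s API; all of the names are illustrative.

```javascript
// PetGalaCycler.js: cycle through outfits and backgrounds by toggling visibility.
//@input SceneObject[] outfits      // accessory/outfit sets anchored to the pet
//@input SceneObject[] backgrounds  // environment overlays

var outfitIndex = 0;
var backgroundIndex = 0;

// Show only the item at `index`, hiding the rest of the set.
function showOnly(set, index) {
    for (var i = 0; i < set.length; i++) {
        set[i].enabled = (i === index);
    }
}

// Called from a Behavior ("Call Object API") when the outfit button is tapped.
// The modulo wrap resets the look after a full cycle through the list.
script.api.nextOutfit = function () {
    outfitIndex = (outfitIndex + 1) % script.outfits.length;
    showOnly(script.outfits, outfitIndex);
};

// Called when the background is tapped.
script.api.nextBackground = function () {
    backgroundIndex = (backgroundIndex + 1) % script.backgrounds.length;
    showOnly(script.backgrounds, backgroundIndex);
};

// Initialize with the first look and environment.
showOnly(script.outfits, outfitIndex);
showOnly(script.backgrounds, backgroundIndex);
```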
Pet Tracking:
We started with Lens Studio’s Pet Tracking template, which allowed us to anchor virtual accessories and outfits directly onto the pet’s body. This involved aligning 3D assets (like hats, glasses, or capes) with the tracking points around the pet’s head. We tested these placements on various pet photos and videos to ensure the assets moved naturally with different angles and movements, keeping the lens playful but stable.
Tap-to-change:
Initially, we aimed to use a carousel-style UI to let users scroll through fashion choices, but due to time constraints and version issues in Lens Studio 5.9, we pivoted to a simpler tap-based interaction. We scripted a feature where tapping on-screen buttons cycles through a predefined list of accessories, creating unique outfit combinations. Tapping the background also cycles through different environments.
Technical Challenges + Decision:
We originally aimed to implement a scrollable carousel UI, but due to time limits and platform constraints, we ran into too many bugs. Instead, we simplified it to tap-to-change, which still gave users full control without overcomplicating the interface.
Additionally, Lens Studio v5.9 gave us some issues: the preview would glitch, and we couldn’t save projects properly. To avoid losing work, we switched to saving and sharing the lens as a .zip backup, which kept development moving smoothly.
Lens 4: “Cat Pop” — Feed the Cat Game
Overview:
Our team aimed to create an interactive and engaging AR lens that uses gamification to enhance the user experience. The user’s objective in the mini-game is to feed fish to as many cats as they can. As the user scans their surroundings with the world-facing camera, cats sporadically spawn behind different bushes. However, if the user accidentally feeds a raccoon, the game ends, prompting them to save their highest score and share it with friends and family. Cat Pop challenges cat lovers to stay alert and engaged while encouraging them to interact and compete with other users.
Design Process:
During the design process, we utilized Figma and Adobe Illustrator. After several rounds of iterations, we established a fun and playful theme for the components. We selected the color palette and typeface because they evoked a lighthearted tone and ensured easy readability. Initial sketches were designed with Figma and then later refined into high-fidelity assets in Adobe Illustrator.
We created both 2D and 3D assets in Adobe Illustrator. However, since the 3D asset files were too large to import efficiently, we redirected our focus to the 2D assets, which ensured optimal performance and kept the development stages efficient.
2D Assets
3D Assets
Development Process:
Randomized Spawning Logic:
For Cat Pop, we focused on building the interactive game logic and laying out the core gameplay features. The main mechanic centered around cats popping up from behind bushes on the screen — users had to tap the cat to feed it fish and earn points. We handled randomized character spawning, visibility toggling, tap detection, score tracking, and careful layering of UI elements using Lens Studio’s Orthographic Camera and Render Order system.
Touch Interactions & Score Logic:
We used Behavior scripting to register taps. Each animal prefab had a script attached to it that determined whether the interaction resulted in a score increase (cat) or a penalty (raccoon). These tap callbacks were tied directly to the character’s active state, ensuring responsive feedback for players.
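A minimal sketch of that per-character script is below. It assumes each prefab also carries a touch/interaction component so taps only land on that character, and that a shared game-controller script exposes addScore and endGame through its API; all of the names are illustrative.

```javascript
// CharacterTap.js: attached to each cat/raccoon prefab.
//@input bool isRaccoon                           // true on the raccoon prefab, false on cats
//@input Component.ScriptComponent gameController // assumed to expose api.addScore / api.endGame

script.createEvent("TapEvent").bind(function (eventData) {
    // Ignore taps while this character is hidden behind its bush.
    if (!script.getSceneObject().enabled) { return; }

    if (script.isRaccoon) {
        script.gameController.api.endGame();      // feeding the raccoon ends the run
    } else {
        script.gameController.api.addScore(1);    // feeding a cat earns a point
        script.getSceneObject().enabled = false;  // hide the fed cat until the next spawn
    }
});
```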
Technical Challenges + Decision:
One of the major tasks was to script how animals would appear. The goal was to randomly spawn either a cat or a raccoon behind a bush and allow the user to tap on the correct one (the cat) to score points. The first version simply selected a random prefab, but it was inconsistent, as sometimes characters wouldn’t appear at all or wouldn’t register taps.
To fix this, we created a generalized spawn function. This function handled:
- Randomly selecting a cat or raccoon
- Activating and positioning the character behind the bush
- Ensuring it was layered properly in the world-view
- Registering tap events for interaction
This consolidated approach made the characters reliably appear and respond to input while keeping the logic clean and easy to adjust. It also gave us flexibility for game balancing, which made tweaking spawn timing or adjusting difficulty much simpler.
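A minimal sketch of what that consolidated function could look like is below, assuming the characters and bushes are screen images positioned with Screen Transforms; the input names, the spawn probability, and the timing values are purely illustrative, and copying the bush anchor is just one way to place the character.

```javascript
// Spawner.js: one entry point that picks, places, and shows a character behind a random bush.
//@input SceneObject cat
//@input SceneObject raccoon
//@input SceneObject[] bushes            // screen images marking the spawn spots
//@input float raccoonChance = 0.3       // tuning knob for difficulty
//@input float spawnInterval = 1.5       // seconds between spawns

function spawnCharacter() {
    // 1. Randomly select a cat or a raccoon.
    var isRaccoon = Math.random() < script.raccoonChance;
    var character = isRaccoon ? script.raccoon : script.cat;
    var other = isRaccoon ? script.cat : script.raccoon;
    other.enabled = false;

    // 2. Position the character at a randomly chosen bush.
    var bush = script.bushes[Math.floor(Math.random() * script.bushes.length)];
    var bushCenter = bush.getComponent("Component.ScreenTransform").anchors.getCenter();
    character.getComponent("Component.ScreenTransform").anchors.setCenter(bushCenter);

    // 3. Activate it; render order set in the editor keeps it layered behind the bush.
    character.enabled = true;
}

// 4. Spawn on a fixed cadence; tap handling lives on the character prefabs themselves.
var delayed = script.createEvent("DelayedCallbackEvent");
delayed.bind(function () {
    spawnCharacter();
    delayed.reset(script.spawnInterval);
});
delayed.reset(script.spawnInterval);
```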
Another important decision we made was switching from a 3D setup to a 2D Screen Image approach because the lens exceeded Lens Studio’s 8MB cap. This allowed us to precisely position and scale elements like the cat, raccoon, bush, and background, while also improving performance on Snapchat. We used Lens Studio’s scripting system (JavaScript) to manage the game state and user interactions, and built the entire UI using Screen Transform components and TapEvent listeners to track inputs.
Closing Remarks
We’re incredibly grateful to Lindsey Heisser, our client contact at Snap, and the Snap AR Team for their guidance and encouragement throughout this journey. This project wouldn’t have been possible without their support.
The Snap Lens Challenge gave our team a hands-on crash course in augmented reality (AR) — not just as a technology, but as a creative storytelling platform. Through collaboration, experimentation, and iteration, we explored what it means to design experiences that are playful, technically ambitious, and user-driven.
From scripting interactive game logic to designing culturally resonant visuals, every team member walked away with a deeper understanding of Snap’s AR ecosystem and the creative potential of Lens Studio.
And now, we hope you’ll give the lenses a try and explore the Animal Kingdom for yourself!