What to know about Niantic's new SDK for the most amazing AR experiences

Come see Niantic at our Immerse Global Summit during Metacenter Global Week in Orlando, Oct 17-19

Ahead of the general availability of Niantic Lightship 3.0, Justin Sneddon, Group Product Manager on Niantic Lightship, will present how this SDK can help you build the most amazing AR experiences at our Immerse Global Summit during Metacenter Global Week in Orlando.

Here’s a sneak peek!

Beyond the AR Horizon

Lightship ARDK 3.0 takes what ARKit and ARCore offer in Unity via AR Foundation and cranks it up a notch. But that’s just the beginning. Lightship’s tools are designed to fill in the gaps and push the boundaries of computer vision technology. Buckle up, because we’re about to take you through some game-changing features: Depth, Meshing, Semantics, Navigation, Shared AR (Multiplayer), the Visual Positioning System (VPS), and Debugging Tools (Playback and Mocking).

Depth - The Foundation of AR Awesomeness

Depth is the secret sauce behind every AR experience. It’s what helps us figure out where to place objects and how they should interact with the real world. Lightship’s depth is something truly special. Why, you ask? Well, it all comes down to our passion for getting people outdoors.

Lightship’s depth is trained on vast outdoor environments, which means it can provide incredibly accurate depth from a single camera. Plus, it’s not limited to a short range like the LiDAR on iPhone Pros. Lightship’s depth can reach a whopping 40+ meters, and it works on all AR-capable phones. Yes, that includes all iPhones and most Androids!

And why does that extended range matter? Imagine summoning a massive dragon into your AR world: this creature has a wingspan that far exceeds LiDAR’s roughly five-meter range. With Lightship’s long-range depth, you can place it 10 to 20 meters away from your camera and capture every breathtaking detail.

What else can you do with this supercharged depth? Let me break it down for you (there’s a code sketch right after this list):

  • Placement: Convert a screen point to a real-world position and place digital objects there.

  • Measurement: Know the distance to anything on your screen.

  • Occlusion: Use depth information to seamlessly blend digital objects into the real world.
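
Here’s what that might look like in code. This is a minimal sketch, not Lightship-specific API: it uses the standard AR Foundation calls (ARRaycastManager, AROcclusionManager) that Lightship 3.0 plugs into as a provider, and it assumes that provider supports depth-based raycasts. The serialized manager references and the dragonPrefab field are hypothetical placeholders you’d wire up in your own scene.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: tap-to-place and tap-to-measure with depth-based raycasts.
public class DepthPlacement : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;     // scene reference (placeholder)
    [SerializeField] AROcclusionManager occlusionManager; // scene reference (placeholder)
    [SerializeField] GameObject dragonPrefab;             // hypothetical prefab to summon

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    void Start()
    {
        // Occlusion: request the best environment depth the provider offers.
        // With the Lightship plugin enabled, this is backed by its depth model.
        occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
    }

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        Vector2 screenPoint = Input.GetTouch(0).position;

        // Placement: convert the tapped screen point into a world-space pose
        // using the depth estimate (assumes the provider supports depth raycasts).
        if (raycastManager.Raycast(screenPoint, s_Hits, TrackableType.Depth))
        {
            Pose hitPose = s_Hits[0].pose;
            Instantiate(dragonPrefab, hitPose.position, hitPose.rotation);

            // Measurement: distance from the camera (tagged MainCamera) to that point.
            float meters = Vector3.Distance(Camera.main.transform.position, hitPose.position);
            Debug.Log($"Placed {meters:F1} m away");
        }
    }
}
```

Notice that occlusion, the third item, mostly comes down to that one requestedEnvironmentDepthMode setting: once the occlusion manager is supplying environment depth, the standard AR camera background takes care of hiding digital objects behind real ones.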

But wait, there’s more! When you combine depth with semantics (stay tuned for that!), the possibilities become endless. Visual effects like depth pulses, depth of field, toon filters, edge detection, and more come to life. I’ll walk you through how to create the first two of those effects.
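
All of those effects start from the same raw material: the per-pixel depth buffer. As a taste of what’s coming, here’s a hedged sketch (again plain AR Foundation, with a format check because devices differ) that pulls the latest depth image onto the CPU and samples the center pixel, the same data a pulse or depth-of-field shader would read on the GPU:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: read the raw environment depth image on the CPU.
public class DepthSampler : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager; // scene reference (placeholder)

    void Update()
    {
        // Grab the latest environment depth image, if the provider has one ready.
        if (!occlusionManager.TryAcquireEnvironmentDepthCpuImage(out XRCpuImage image))
            return;

        using (image) // XRCpuImage wraps native memory and must be disposed
        {
            // Assume 32-bit float depth in meters; some devices report DepthUint16 instead.
            if (image.format != XRCpuImage.Format.DepthFloat32)
                return;

            XRCpuImage.Plane plane = image.GetPlane(0);
            var depths = plane.data.Reinterpret<float>(1); // reinterpret bytes as floats

            int strideInFloats = plane.rowStride / sizeof(float);
            int center = (image.height / 2) * strideInFloats + image.width / 2;

            Debug.Log($"Depth at the center of the frame: {depths[center]:F2} m");
        }
    }
}
```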

And there you have it, folks! Niantic Lightship is all about taking your AR game to new heights. If you’re as excited as I am, you can dig deeper into these features in my upcoming blog posts, complete with examples and source code.

Register now!