The future is open.
See it with the Mapbox Vision SDK.
Living maps, intelligent navigation, augmented reality, and high-performance computer vision combine to provide a revolutionary viewport into what’s ahead. The Mapbox Vision SDK transforms connected cameras into a second set of eyes for your car. With live imagery processed directly on your device, we’re bringing visual context to our live location platform and rethinking how machines and humans alike interact with the road.
The Mapbox Vision SDK also works in conjunction with our live traffic and navigation, allowing developers on iOS and Android to build a heads-up display experience in their native apps.
These are some of the companies that have stepped into the driver’s seat.
We’re excited to explore a new set of opportunities with the Mapbox Vision SDK through our incubation efforts with infotainment systems/clusters within our HMI toolchain, EB GUIDE, along with HAD functionalities for autonomous driving, with EB robinos.
– Walter Sullivan, Head of Innovation & Incubation, Elektrobit
The Mapbox Vision SDK provides unprecedented access to real-time road data that feeds into Mapbox’s live location platform. Imagery is processed directly on the device, so current road conditions and live event data can deliver low-latency, low-bandwidth updates to the living map. Highly efficient neural networks run solely on the device, with no need to upload and process video in the cloud. The Mapbox Vision SDK is powerful enough to understand today’s driving environment in real time, yet compact enough to run on billions of mobile devices.
Today, the Vision SDK supports iOS. Android support is coming in October, and embedded Linux platforms will follow later this year. When run on a smartphone, no hardware is needed beyond a windshield or dash mount.
The Vision SDK is now in private beta and available to select developers.
What’s your Vision?
Apply to join the Vision SDK beta.
Seats are limited.