For many city-goers, a new place is best seen on foot: you can take in the neighborhoods and buildings as you walk. Often these urban explorers have a phone outstretched in one hand. With NaviGuide AR and the Bose AR platform, metropolitan wanderers can keep their eyes off the screen and on the sights!
Audio can deliver a wealth of information without you ever pulling the phone out of your pocket. NaviGuide AR, a new app by motion-mapping company Navisens, uses audio AR to supply turn-by-turn walking directions and identify landmarks. With Bose AR-enabled headphones or glasses, NaviGuide AR can easily access head orientation and gesture controls. These additional inputs identify your surroundings even more accurately, all without a glance at your phone.
“It’s surprisingly seamless to wear Bose Frames and interact with our app,” Navisens founder Ash Donikian said. “Actually experiencing the app with Bose Frames while walking around in the real world was such a great experience.”
Most walking directions in a city aren’t very accurate and usually require looking at your phone to double-check street names. What makes NaviGuide AR unique is that it doesn’t rely on GPS, which can be off the mark around tall buildings. Instead, it uses the phone’s own motion sensors to provide precise locations, even indoors. The Navisens technology was originally developed for firefighters, who can’t count on WiFi or other signals being available inside a burning building.
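To make the idea concrete, here is a minimal dead-reckoning sketch in Swift. It is purely illustrative, not Navisens’ actual algorithm: it advances a 2D position one detected step at a time from an estimated stride length and heading, the kind of update a sensor-only positioning system builds on.

```swift
import Foundation

// Illustrative dead-reckoning sketch (NOT Navisens' algorithm):
// update a 2D position from step events using stride length and heading.
struct Position {
    var x: Double  // meters east of the starting point
    var y: Double  // meters north of the starting point
}

/// Advance the position by one detected step.
/// - heading: direction of travel in radians, 0 = north, clockwise positive
/// - stepLength: estimated stride in meters
func advance(_ p: Position, heading: Double, stepLength: Double) -> Position {
    Position(x: p.x + stepLength * sin(heading),
             y: p.y + stepLength * cos(heading))
}

var pos = Position(x: 0, y: 0)
pos = advance(pos, heading: 0, stepLength: 0.7)        // one step north
pos = advance(pos, heading: .pi / 2, stepLength: 0.7)  // one step east
print(pos.x, pos.y)  // both coordinates ≈ 0.7
```

In practice a system like this fuses many more signals (accelerometer, gyroscope, barometer) to keep the estimate from drifting, but the core position update is this simple.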
Put on a pair of Bose AR Frames or headphones and NaviGuide gets even smarter. Using the gyroscope sensor available through the Bose AR SDK, the app is able to figure out where the user is looking. Precise location plus head direction means NaviGuide AR can accurately identify landmarks and tourist attractions within the user’s view.
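As an illustration of how location plus head direction can pick out a landmark, the sketch below (hypothetical names and thresholds, not NaviGuide AR code) computes the bearing from the user to each point of interest and keeps those that fall within a field of view centered on the head yaw.

```swift
import Foundation

// Hypothetical landmark model: positions in meters relative to the user.
struct Landmark { let name: String; let east: Double; let north: Double }

/// Bearing from the user to a point, in radians, 0 = north, clockwise.
func bearing(east: Double, north: Double) -> Double {
    atan2(east, north)
}

/// Smallest absolute angular difference between two bearings.
func angularDistance(_ a: Double, _ b: Double) -> Double {
    let d = abs(a - b).truncatingRemainder(dividingBy: 2 * .pi)
    return min(d, 2 * .pi - d)
}

/// Names of landmarks within `fov` radians of the user's head yaw.
func inView(_ landmarks: [Landmark], headYaw: Double, fov: Double) -> [String] {
    landmarks
        .filter { angularDistance(bearing(east: $0.east, north: $0.north),
                                  headYaw) <= fov / 2 }
        .map { $0.name }
}

let pois = [Landmark(name: "City Hall", east: 100, north: 0),
            Landmark(name: "Old Bridge", east: 0, north: 200)]
// User looking due east (yaw = π/2) with a 60° field of view:
print(inView(pois, headYaw: .pi / 2, fov: .pi / 3))  // ["City Hall"]
```

The better the position and yaw estimates, the narrower the field of view can be, which is why precise indoor-grade positioning makes the identification so much more reliable.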
“The Bose developer team has been great to work with. They've been very responsive in listening to our feedback and providing SDK updates to meet our requirements, particularly in relation to sensor data acquisition,” said James Grantham, Navisens software engineer.
The Navisens team worked with early developer access to quickly integrate the Bose AR libraries. The whole team got involved, excitedly testing the app with Bose Frames. In addition to accessing head position, the app uses the intuitive “tap” functionality for user input.
You don’t need advanced sensor technology of your own to integrate with Bose AR. The SDK provides abstractions for common inputs, such as tap, double tap, head nod, and head shake. The gyroscope readings that capture head position are translated into a simple X/Y/Z coordinate system.
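A minimal way to model that gesture abstraction is sketched below. The real Bose AR SDK delivers these events through its own listener types; the enum, the dispatcher, and the actions mapped to each gesture here are all hypothetical stand-ins for how an app might route them.

```swift
// Hypothetical model of the gesture abstraction, not the Bose AR SDK API.
enum Gesture {
    case tap, doubleTap, headNod, headShake
}

struct GestureDispatcher {
    // Log of actions taken, for illustration; a real app would trigger
    // navigation or playback behavior here instead.
    var actions: [String] = []

    mutating func handle(_ gesture: Gesture) {
        switch gesture {
        case .tap:       actions.append("select landmark")
        case .doubleTap: actions.append("repeat last direction")
        case .headNod:   actions.append("confirm")
        case .headShake: actions.append("dismiss")
        }
    }
}

var dispatcher = GestureDispatcher()
dispatcher.handle(.tap)
dispatcher.handle(.headNod)
print(dispatcher.actions)  // ["select landmark", "confirm"]
```

Because the SDK hands the app discrete, named gestures rather than raw sensor traces, routing them is no harder than handling button taps on screen.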
“Using the Bose SDK was quite straightforward,” Donikian said. Navisens used the SDK to put the Bose AR sensors to the test. As is typical of Navisens’ experience with early devices, they were able to provide feedback for the Bose team to make improvements. Both the Bose AR SDK and firmware are stronger due to the robotics-based algorithms Navisens uses for its core technology.
Navisens is used to working with industrial devices, so the team enjoyed building on a consumer product with a large existing user base. “Many other AR wearables are bulky, large, require external battery packs, and aren't suitable for day-to-day tasks,” Donikian said. By contrast, users are already familiar with the Bose Frames form factor, making it a wearable you actually want to wear. And because developers like Navisens can enhance the experience, the possibilities are wide open.
The team plans to add more data sources to NaviGuide AR so it can recognize more landmarks. In the future, they would like to connect to additional services to bring more capabilities into the app. Imagine booking restaurants and other venues with just your voice, your location, and natural device inputs.
The Bose AR platform offers a unique interface and delivery channel for all sorts of projects, from navigation, as we’ve seen with NaviGuide AR, to storytelling, gaming, and more. Build Bose AR intelligence into your next mobile project.