Bose and Capitol Records recently held a CodeJam in Detroit, Michigan. Detroit is in the midst of a revival, with music and technology driving innovation. One of the projects that stood out to us was Qast, a unique application that ties micro-location services into the Bose AR experience. Qast was named “Overall Winner” at the awards ceremony. The first implementation of their experience will tie into the roots of Motown with an interactive tour.

Qast integration with the Motown Museum

During the Detroit CodeJam, seven teams focused on building experiences centered around the music industry. Qast, in particular, set out to build micro-location-based audio experiences, starting with a tour of the Motown Museum in Detroit, Michigan. The team comprises four members: Andrew O’Brien, Austin Welch, Ali Azzawi, and Luis A. Veras Landron.

Qast is designed for venue owners and individuals looking to build informationally enhanced physical environments. They are building the experience so that arbitrary data can be tethered to a highly specific location, called a micro-location. The main goal of this technology is to usher in a new era of increasingly screenless interaction with enhanced physical spaces by leveraging Bose AR technology.
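
As a rough illustration of what tethering data to a micro-location can look like, the sketch below models a micro-location as a coordinate with a small radius plus a containment check. The SoundZone type and its fields here are hypothetical, not Qast’s actual data model.

import CoreLocation

// Hypothetical model of a micro-location: a coordinate with a small radius.
struct SoundZone {
    let id: String
    let name: String
    let center: CLLocationCoordinate2D
    let radiusMeters: CLLocationDistance

    // Returns true when the given location falls inside this SoundZone.
    func contains(_ location: CLLocation) -> Bool {
        let centerLocation = CLLocation(latitude: center.latitude,
                                        longitude: center.longitude)
        return location.distance(from: centerLocation) <= radiusMeters
    }
}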

Qast integrated the Bose AR iOS SDK into their mobile application to leverage the heading data and micro-location accuracy from the user’s Bose AR-enabled device. Upon entering a location with a SoundZone overlay, the user receives data from that location via the Bose AR-enabled device.

The application stack comprises a Bose AR-enabled device, the Bose AR iOS SDK, the Capitol Records streaming API, the MapBox API, and a Firebase backend. Once a user enters a Qast-enabled location, called a SoundZone, the Capitol Records streaming API is used to play music unique to that SoundZone. To get more information, the user can tap the SoundZone in-app. Alternatively, the heading information from the user’s device can be used to provide SoundZone previews: a user can look in a particular direction and use a gesture on the Bose AR-enabled hardware to get more information from that SoundZone without taking out their phone, as sketched below.
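
For the gesture-driven preview, the sketch below assumes the receivedGesture callback from the Bose Wearable SDK’s SensorDispatchHandler protocol (and that the gesture has already been enabled on the device); soundZone(facing:), currentHeadingDegrees, and presentPreview(for:) are hypothetical stand-ins for Qast’s own logic.

extension MapViewController {

    // Fired when the user performs a gesture (e.g., a double tap) on the
    // Bose AR-enabled hardware, assuming the gesture was enabled via the
    // device's gesture configuration.
    func receivedGesture(type: GestureType, timestamp: SensorTimestamp) {
        guard type == .doubleTap else { return }

        // Hypothetical helpers: find the SoundZone the user is facing from
        // the latest heading, then surface its details without the phone
        // leaving the user's pocket.
        if let zone = soundZone(facing: currentHeadingDegrees) {
            presentPreview(for: zone)
        }
    }
}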

Once a Bose AR-enabled device is securely paired with the application, a session is created for the device. When the application activates the sensors on the hardware, it can subscribe to the data coming from those sensors. The application receives the data via the SensorDispatchHandler protocol, whose methods deliver the specific data of interest.
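
The setup that the gist’s opening comment describes might look roughly like this. It is a sketch based on the Bose Wearable SDK for iOS; exact type names, sample periods, and the pairing flow may differ in Qast’s app.

import BoseWearable
import UIKit

class MapViewController: UIViewController {

    // Session for the securely paired Bose AR-enabled device, created by
    // the SDK's device-search flow, and a dispatcher that delivers the
    // device's sensor data on the main queue.
    var session: WearableDeviceSession!
    let sensorDispatch = SensorDispatch(queue: .main)

    override func viewDidLoad() {
        super.viewDidLoad()

        // Route incoming sensor data to this view controller's
        // SensorDispatchHandler methods (e.g., receivedRotation below).
        sensorDispatch.handler = self

        // Activate the rotation sensor so the hardware streams
        // absolute-rotation quaternions every 20 ms.
        session.device?.configureSensors { config in
            config.disableAll()
            config.enable(sensor: .rotation, at: ._20ms)
        }
    }
}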

Below is a gist of the application receiving absolute-rotation data. Absolute rotation activates the gyroscope, accelerometer, and magnetometer on the Bose AR-enabled hardware, and the compass heading is accessible via the yaw of the quaternion value provided by the rotation sensor. In the gist, the application converts the raw data coming from the hardware to degrees (step 1), grabs the user’s location from the MapBox API (step 2), converts the yaw value to magnetic degrees (step 3), and updates the direction with the user’s heading in the UI (step 4).

GitHub Gist for SensorDispatchHandler:

extension MapViewController: SensorDispatchHandler {
    
    /*
       After initializing a WearableDeviceSession and calling sensorDispatch.handler = self in viewDidLoad(),
       your view controllers can receive raw heading information with the receivedRotation delegate method.
       We use this information to render a custom vision cone on screen.
    */
    
    func receivedRotation(quaternion: Quaternion, accuracy: QuaternionAccuracy, timestamp: SensorTimestamp) {

        // 1. Convert raw heading data to degrees
        let qMap = Quaternion(ix: 1, iy: 0, iz: 0, r: 0)
        let qResult = quaternion * qMap
        let yaw: Double = (-qResult.zRotation).toDegrees()
        
        // 2. Get user location from the Mapbox MapView
        guard let userLocation = mapView.userLocation?.location, userLocation.horizontalAccuracy > 0 else {
            return
        }
        
        // 3. Convert degrees to magnetic degrees (0°-360°)
        let magneticDegrees: Double = (yaw < 0) ? 360 + yaw : yaw
        
        // 4. Update the direction of a custom vision cone to show user heading in the UI
        vision.updateVisionPath(center: userLocation.coordinate, orientation: 360 - magneticDegrees)
        locationManager.visionPolygon(for: userLocation.coordinate, orientation: 360 - magneticDegrees)
    }
}
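
Note that the gist assumes a small radians-to-degrees helper on Double, which is not shown in the excerpt above; a minimal version might look like:

extension Double {
    // Converts a value in radians to degrees.
    func toDegrees() -> Double {
        return self * 180.0 / .pi
    }
}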

Micro-location accuracy and heading data received without users taking out their phones are prerequisites for many of the AR experiences Qast hopes to see in the future. Uniting these two data sources opens up a whole new world of features for entertainment, retail, and exhibition applications. Technologies like Bose AR make it possible to explore how audio-first experiences can be built and used.

Interested in building with Bose AR? Check out developer.bose.com/bose-ar for our docs and SDK. We will follow up on this blog post with links to the app once it is published in the App Store.