Smart Spaces in 137 Seconds?

Just over a year ago we published a blog post entitled “Beyond the Beacon: BLE Just Got Reel” in which we showed how our technology could detect and identify iOS 7 devices using Bluetooth Smart technology. The video from that post has been viewed over ten thousand times, was featured in this GigaOM article and helped us land an opportunity to present at Bluetooth World in April 2014.

The above video crams all of the progress we’ve made since then into 137 seconds. This blog post will direct the curious viewer to all of the little pieces which together make up what’s shown in the video.

What are those black and white hardware devices?

We call them reelceivers, we designed them from the ground up, and what they do is listen for wireless advertising packets from nearby devices. In other words, they detect and identify things like beacons, wearables, smartphones and our active RFID tags. Here’s how to find out more:

What’s the open-source software running on the PC?

It would have taken way too much time to show it all in the video, so we’ll direct you to everything here:

  • barnowl is our middleware package which interfaces with the hardware
  • hlc-server is a contextual API built around barnowl
  • smartspaces is the webpage you see in the video, including the server-side part behind the scenes
  • Make a Smart Space is the tutorial on diyActive that brings it all together, and is your best starting point (and yes, it features the video too!)

What’s the “one API call” mentioned in the video?

You just ask hlc-server what is /at/place (where place is a human-friendly location name) and it will return to you:

  • a list of all the devices that are present
  • a processed version of whatever they sent in their last wireless transmission(s)
  • a link to any data associated with that device (for example: this is Jeff’s JSON)

That last one is what makes the technology so powerful. Anyone can associate digital information with their wireless device: hlc-server just makes the link between the unique device identifier (for instance the MAC address) and a URL which lists all the data in JSON format. Have a look at the Hyperlocal Context page of our website and if you’re still keen to know more, read our scientific article: Hyperlocal Context to Facilitate an Internet of Things Understanding of the World.
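To make the idea concrete, here’s a minimal sketch of that identifier-to-URL association and the /at/place lookup. Everything in it is illustrative: the MAC addresses, URLs, and the `contextAt` function are made-up examples, not hlc-server’s actual internals.

```javascript
// Sketch of the association hlc-server maintains between a unique device
// identifier (for instance a MAC address) and a URL of JSON data.
// The addresses and URLs below are made-up examples.
const associations = new Map([
  ['00:1a:7d:da:71:13', 'https://example.com/jeff.json'],
  ['7c:2f:80:aa:bb:cc', 'https://example.com/beacon.json']
]);

// Given the devices detected at a place, return what the API describes:
// each device's identifier, its last transmission, and a link to any
// associated data (or null if the owner hasn't associated anything).
function contextAt(detectedDevices) {
  return detectedDevices.map((device) => ({
    identifier: device.identifier,
    lastTransmission: device.lastTransmission,
    url: associations.get(device.identifier) || null
  }));
}
```

A device with no association still shows up in the list; it simply has no link to follow, which is exactly why opting in by associating a URL is what unlocks the value.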

What are all the devices shown on the screen?

In no particular order:

Yeah, Smart Spaces are detecting more and more of the billions of Bluetooth Smart devices shipping every year! hlc-server can determine the device type of most of these based on either the UUID it transmits or the companyCode. If you’d like your device to be recognizable too, please contact us and we’ll include it in the next build.
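As a rough illustration of that device-type lookup, here’s a hedged sketch. The Apple company identifier (0x004C) is a real Bluetooth SIG assignment; the UUID, the table contents, and the `inferDeviceType` function itself are hypothetical, not hlc-server’s actual tables.

```javascript
// Hypothetical sketch: infer a device type from either the 128-bit UUID
// it transmits or its Bluetooth company code. The UUID below is made-up;
// '004c' is Apple's assigned company identifier.
const KNOWN_UUIDS = {
  'e1e2e3e4e5e6e7e8e9eae1e2e3e4e5e6': 'Hypothetical Wearable'
};
const KNOWN_COMPANY_CODES = {
  '004c': 'Apple device'
};

function inferDeviceType(advertisement) {
  if (advertisement.uuid && KNOWN_UUIDS[advertisement.uuid]) {
    return KNOWN_UUIDS[advertisement.uuid];       // matched by UUID
  }
  if (advertisement.companyCode &&
      KNOWN_COMPANY_CODES[advertisement.companyCode]) {
    return KNOWN_COMPANY_CODES[advertisement.companyCode]; // matched by company code
  }
  return 'Unknown device';                        // next build, perhaps?
}
```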

Is there some way I can see that website live?

Yes, you can! Check out the live hyperlocal context from our office (which is slowly becoming a museum of trophies and prototypes as well…), or the live hyperlocal context from Notman House in Montreal, the place where we first experimented with Log in to Life, the precursor of Smart Spaces.

How did you make the Nexus 5 work with the technology?

Okay, this answer will be a bit technical, so bear with us:

  1. the Bluetooth Smart reelceivers regularly send ADV_DISCOVER_IND packets
  2. smartphones (both Apple and Android) are curious and send SCAN_REQ packets in response to learn more about the Bluetooth Smart reelceivers
  3. those SCAN_REQ packets include the 48-bit advertiser address of the smartphone
  4. the Nexus 5 on Android 4.4.4 uses a public advertiser address (in other words it doesn’t change)
  5. therefore, whenever the Nexus 5 scans for nearby Bluetooth Smart devices, and a Bluetooth Smart reelceiver is around, the reelceiver uniquely identifies the Nexus 5

So, in the web interface of hlc-server we simply associate the advertiser address of the Nexus 5 with a URL containing JSON data and it works! Note that iOS devices can also be identified in this manner, but they change their advertiser addresses every 15 minutes or so, which makes this technique pointless for them (but there are alternatives!).
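Steps 3 through 5 above boil down to a very small piece of logic, sketched here. The packet structure, advertiser address, and URL are all made-up examples, not the reelceiver’s actual firmware or hlc-server’s code.

```javascript
// Sketch of steps 3-5: when a reelceiver hears a SCAN_REQ, the packet's
// 48-bit advertiser address uniquely identifies the sender (the Nexus 5
// on Android 4.4.4 uses a public address which doesn't change).
// The address and URL below are hypothetical.
const knownDevices = {
  'd0:51:62:aa:bb:cc': 'https://example.com/nexus5.json'
};

function handlePacket(packet) {
  if (packet.type !== 'SCAN_REQ') {
    return null;                              // only scan requests carry
  }                                           // the scanner's address
  const address = packet.advertiserAddress;   // 48-bit advertiser address
  return knownDevices[address] || null;       // associated JSON URL, if any
}
```

This is also why the technique breaks down for iOS: once the advertiser address rotates, the lookup key is gone.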

How do you aggregate the tweets of everyone present at a location?

In smartspaces, we use the Twitter handle of every person detected to load their most interesting tweets and then cycle through them in the “Social View” for roughly 24 hours. This works when people opt-in with a compatible device and share their Twitter handle, and it’s reely, reely cool at places like coworking spaces!
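A rotation like the Social View’s can be sketched as a simple round-robin over the handles of everyone present. The handles and tweets below are made-up, and smartspaces’ actual selection of “most interesting” tweets is not shown here.

```javascript
// Hypothetical sketch of the Social View rotation: cycle through the
// tweets of everyone detected at the place, one person at a time.
function* cycleTweets(tweetsByHandle) {
  const handles = Object.keys(tweetsByHandle);
  let index = 0;
  while (handles.length > 0) {
    const handle = handles[index % handles.length];      // round-robin by person
    const tweets = tweetsByHandle[handle];
    const which = Math.floor(index / handles.length) % tweets.length;
    yield { handle, tweet: tweets[which] };              // next tweet for that person
    index++;
  }
}
```

Each call to the generator yields the next (handle, tweet) pair, alternating between the people present rather than exhausting one person’s tweets first.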

Could you really decide what song to play based on the preferences of everyone present?

We’ve been throwing around this idea for two and a half years and nothing would make us happier than for someone to make it a reality using our platform. All of the ingredients are finally there, so please be the one to make it happen (and let us know when you do!).

Why is there so much audio static in the video?

This is best explained as a MasterCard commercial:

Digital SLR: $800. Wide-angle Japanese lens: $750. Directional microphone: $100. Not realizing that the 99 cent AA battery in the mic was almost dead until after everything was filmed: PRICELESS.

Thanks for watching and reading and stay tuned. There’s plenty more in the works!