Three years ago today on this blog, we published Beyond the Beacon: BLE just got Reel. In honour of that anniversary, today we publish Three years Beyond the Beacon: the Physical Web just got Personal.
The original Beyond the Beacon post was prepared with a sense of urgency to coincide with Apple’s iOS 7 release featuring iBeacon support. We were eager to show our vision for Bluetooth Low Energy (BLE) technology, which the iOS 7 release enabled for the first time. Despite being hastily recorded, the video accompanying the blog post quickly gathered over 10,000 views, in large part due to the provocative Loophole in iBeacon could let iPhones guard your likes instead of bombard you with coupons article in which it is featured. In that GigaOM article, Stacey Higginbotham writes:
[reelyActive] is perhaps a bit too far ahead of its time right now, in that it’s thinking beyond beacons to a fully connected world where a person’s preferences are communicated by their device and stored in the cloud.
Indeed we were too far ahead of our time back then. Fast-forward to today, however, and we proudly share how we’ve assembled established technologies in a novel way to enable exactly the above. And, curiously, this time around it is instead Google whose products and projects have made this possible.
The animation above illustrates how the vision of communicating a person’s preferences via their device effectively involves two steps:
- the mobile device wirelessly advertises the source of a person’s data to any and all devices in range
- those devices query that source and receive structured data which represents the person in a standard way
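The first step above is made practical by Eddystone-URL’s compact encoding, which squeezes a full URL into a BLE advertising packet by replacing common scheme prefixes and domain suffixes with single bytes. Here is a minimal sketch of that encoding, following the layout in the public Eddystone-URL specification; the example URL and the placeholder TX power value are our own illustrative assumptions:

```python
# Sketch of an Eddystone-URL frame encoder (Step 1).
# Frame layout per the open Eddystone-URL specification:
#   byte 0:  frame type (0x10 = Eddystone-URL)
#   byte 1:  calibrated TX power at 0 m (placeholder value here)
#   byte 2:  URL scheme prefix code
#   bytes 3+: encoded URL, with common suffixes compressed to one byte

SCHEMES = {"http://www.": 0x00, "https://www.": 0x01,
           "http://": 0x02, "https://": 0x03}
SUFFIXES = {".com/": 0x00, ".org/": 0x01, ".edu/": 0x02, ".net/": 0x03,
            ".info/": 0x04, ".biz/": 0x05, ".gov/": 0x06, ".com": 0x07,
            ".org": 0x08, ".edu": 0x09, ".net": 0x0a, ".info": 0x0b,
            ".biz": 0x0c, ".gov": 0x0d}

def encode_eddystone_url(url, tx_power=0xba):
    """Return the Eddystone-URL frame bytes for the given URL."""
    # Match the longest scheme prefix first (https://www. before https://)
    for scheme in sorted(SCHEMES, key=len, reverse=True):
        if url.startswith(scheme):
            frame = bytes([0x10, tx_power, SCHEMES[scheme]])
            rest = url[len(scheme):]
            break
    else:
        raise ValueError("URL must start with http(s)://")
    encoded = bytearray()
    while rest:
        # Compress a known suffix to one byte, else copy the character
        for suffix in sorted(SUFFIXES, key=len, reverse=True):
            if rest.startswith(suffix):
                encoded.append(SUFFIXES[suffix])
                rest = rest[len(suffix):]
                break
        else:
            encoded.append(ord(rest[0]))
            rest = rest[1:]
    if len(encoded) > 17:  # spec limit on the encoded URL portion
        raise ValueError("encoded URL exceeds 17 bytes")
    return frame + bytes(encoded)

frame = encode_eddystone_url("https://example.com/p")
```

The 17-byte limit is why short URLs (or URL shorteners) matter so much in this model: the entire pointer to a person’s data has to fit in a single advertising packet.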
In the retail example, this means that the data which a client chooses to share could be immediately discovered by a nearby employee with a smartphone, by digital signage, or by sensor infrastructure throughout the store itself. The onus is then on those recipients to deliver a personalised experience to the client, who would expect nothing less in return!
This is both technically and functionally viable today on recent Android devices. An app advertises, in a Google Eddystone packet sent over BLE, a URL pointing to the person’s hosted data (Step 1), and any receiving device queries that URL in exchange for structured data in the standardised form of Schema.org and JSON-LD (Step 2). The recipient can easily make sense of the person’s data, which is structured precisely as per Google and other search engines’ suggestion, such that they may “organise and display it in creative ways”.
At the time of writing, iOS devices restrict what may be transmitted over BLE, preventing the use of Eddystone, a product of Google’s Physical Web project. However, if you have a recent Android device and a BLE-enabled computing device such as a Raspberry Pi 3, using reelyApp and our open source software suite, you can do exactly what we show in this video.
For those who like to look under the hood, the personal data is stored in a hosted json-silo, accessed using cormorant and rendered using cuttlefish. The Pi is set up from scratch following this tutorial.
Three years ago, Apple opened the door to smartphones advertising themselves to their surroundings. Over the past two years, Google has advanced projects to both identify and represent physical “things” in a standard way. While both companies continue to champion the smartphone-as-receiver model, we’re more excited than ever about the smartphone-as-transmitter model, which literally flips advertising on its head, fostering a user-centric, opt-in model of real-time information sharing on a human scale. Thanks to emerging standards and just enough wiggle room for permissionless innovation, today we proudly proclaim that the Physical Web just got Personal.