Saturday, 29 November 2014

Where in the World are all the Beacon Developers?

Publishers of Android apps have access to the Google Play Developer Dashboard, which provides each publisher with various statistics regarding any apps that they have released on Google Play.

When considering usage for my Beacon Scanner and Logger app, it occurred to me that these statistics can be viewed as a sort of "league table", ranking countries by the number of beacon developers who have downloaded the app. Note that I'm assuming developers are the app's primary audience; even if that's not entirely the case, I suspect it's reasonable to assume the ratio of developers to non-developers would be roughly the same from country to country.

Of 536 total downloads, 260 were still active at the time of writing, i.e. the app had been downloaded to a device but had not been deleted.

As can be seen in the chart and accompanying table copied below, the app was most popular in the US (which accounted for just over 22% of active installations) with Germany (11%) and the UK (6%) second and third respectively.


Breakdown of current app installs by country, as at 26th November 2014
There are issues that compromise the usefulness of my analysis, including:
  • The sample size is probably not large enough for the results to be statistically significant.
  • I've made some fairly crude assumptions about the nature of the audience.
  • The app is only available in English, which would obviously limit its appeal to non-English speakers.

Adding Google Analytics data from this blog gives another view of user location, though some posts would be relevant to a non-beacon-centric audience, which dilutes the value of this data somewhat.

The relatively high rank for Australia could well be down to the fact that I've reviewed the BlueCats beacon, which is an Australian product, and much of the traffic from Poland is probably attributable to the fact that our development centre is based there.

Google Analytics data for my blog, for the year to date.

Despite the many caveats laid out above, I find this analysis interesting and probably worth sharing. Feel free to leave feedback if you have any thoughts on this.

Tuesday, 25 November 2014

Microsoft Future Decoded Tech Day, London, 12th November 2014 - A Delegate's View

I attended Microsoft's Future Decoded Tech Day (#FutureDecoded) at the Excel Centre, London on 12th November and thought I'd share some of my notes from the event.

The day was split into morning sessions, which featured on-stage presentations in a theatre-like environment, and afternoon technical sessions, where delegates broke out into smaller groups to focus on specialised topics. The presentations were ably introduced by Microsoft UK Technical Evangelist Andrew Spooner.

Morning Sessions - Keynote Lectures:

The morning's first session was a lecture on Open Data from internationally recognised expert Sir Nigel Shadbolt. Sir Nigel was co-lead, along with Sir Tim Berners-Lee, of the UK Government's project to make many of its data sets available for public use (data.gov.uk).

Sir Nigel and Sir Tim went on to create the Open Data Institute, a non-profit organisation founded in the UK but now with an international presence, to help drive the evolution of open data systems.

Crowd-sourcing of data was a key topic within Sir Nigel's lecture. He gave a compelling demonstration of crowd-sourced data as a force for good, citing Ushahidi, which was created to collect eyewitness accounts of the intimidation and violence suffered by voters in Kenya's disputed 2007 presidential election. Ushahidi has since been used as a tool in several crisis situations, including the aftermath of earthquakes in Chile, Haiti and New Zealand.

Sir Nigel also highlighted the contribution made by OpenStreetMap following the Haitian earthquake of January 2010. There were no high resolution maps of the affected area at the time of the earthquake, which seriously affected disaster relief efforts. Rescue workers were able to rapidly crowd-source data using OpenStreetMap and build up the view of the area they needed to effectively co-ordinate their response.

A major take-away for me from this lecture was the concept of "linked data", an example of which is the ability to access multiple data sets related to a specific location via the location's postal code. The postal code effectively acts as a URI for these linked data sets. Linking data in this way makes it easier to discover, access, and reuse data in novel and potentially valuable ways. The Glasgow Future Cities Demonstrator referenced towards the end of this blog post provides examples of the benefits this approach can offer.

As a fan of attention-grabbing statistics, and being a casual follower of Formula 1, I found this presentation from Lotus F1's IT Director, Michael Taylor, fascinating.

The Lotus F1 team make extensive use of computer and physical modelling and on-car telemetry, which means there are petabytes of data sloshing around the various systems that Taylor's team are responsible for.

The team are constantly proposing, modelling and applying changes to the car. If my notes are correct, the rate at which design changes are made equates to 1 change every 7 minutes over the course of a season, though most don't actually make it onto the car.

I found the parallels between the continuous delivery cycle to which an F1 team works and the context within which a DevOps team operates interesting and instructive:
  • Success is predicated on how swiftly value is realised from change.
  • Agility enables the right change to be enacted at the right time.
  • Appropriate tooling and process mitigates the risk of a change introducing unwanted behaviour.

F1 seems like an amazing, if extremely challenging, place to be an IT professional. Taylor is a fine ambassador for his team and I suspect many of those in the room for this session will be taking an interest in Lotus F1's race results going forward. 

Lotus F1 car.

Andrew Spooner invited Or Arbel onto the stage to discuss Arbel's messaging app, Yo. I'm perhaps not the target audience, but when I first read about the app earlier this year the word that sprang to mind was not "Yo" but "Why?" Arbel said during the interview that his intention was now to extend Yo, exposing an API so it could be used as a notification service for 3rd party apps.

I can get a better handle on this than on the app's current primary use case, though I'm not yet completely convinced. I'm not sure what Yo's API gives you that you can't already get from Urban Airship et al., though it seems the Miami Dolphins disagree with me.

To UK-based gamers of a certain age, Dr David Braben will be forever synonymous with the space-based game Elite, which he co-authored with Ian Bell in 1984. Elite quickly acquired legendary status due to its immersive nature and compelling gameplay, and within gaming circles it continues to be spoken of in hushed and reverential tones to this day.

Dr Braben is a founding member of the Raspberry Pi Foundation and CEO of Frontier Developments, and the majority of his session saw him discussing the development of Frontier's new game, and most recent sequel to Elite, Elite Dangerous. While ostensibly a game demo, what we actually got verged on an astronomy lecture, as Dr Braben showcased the game's realistic modelling of the Milky Way galaxy; players are free to explore any one of the galaxy's 400-odd billion star systems*, for example.

Elite Dangerous is able to model this astronomical number of star systems (sorry, pun intended) within the constraints of the host machine by continuing the franchise's tradition of using procedural generation when constructing the in-game galaxy. The results are extremely impressive, both technically and aesthetically.

I was fortunate to snag an access code to the playable demo via a prize draw later in the day and look forward to exploring the game myself, prior to its official release in December.

* 400 billion is in the upper range of the galaxy's estimated stellar population, in case you're interested. The lowest estimate is around 100 billion, so the development team certainly can't be accused of taking the easy option. ;)

Professor Cox is well known to UK audiences as a science broadcaster and distinguished physicist. Inviting him to speak at the event seems a very smart move on Microsoft's part, as his presence may well have been the draw that convinced some delegates to attend.

The lecture was a treat for those of us in the audience with an interest in physics, cosmology, engineering, or the history of science, as all four subjects were addressed in a thoroughly engaging and entertaining way. Professor Cox is a talented public speaker and gifted storyteller; his ability to take the audience on a journey is key to making the sometimes complex source material he deals with so widely accessible. For me the lecture formed a nice companion piece to Professor Cox's recent TV work, featuring elements from his "Wonders...", "Human Universe" and "Science Britannica" series.

The recreation of Galileo's experiment in a NASA vacuum chamber from "Human Universe" was a particular highlight. I love the reaction from the scientists in the room, all of whom obviously knew the expected result but had the look of awestruck children when confronted with the reality.

If you're a science geek, or have little science geeks in your house, and you have a chance to hear Professor Cox speak in person I strongly recommend the experience.

Professor Brian Cox (left) and graph.

During the lunch break I happened to walk past the halls in which the Appsworld mobile exhibition was being held. I launched my beacon scanner app and got the following hits.

Location: 51.509708,0.0246251 UUID: e2c56db5-dffb-48d2-b060-d0f5a71096e0 Maj. Mnr.: 9-1 RSSI: -99 Proximity: Far Power: -59 Timestamp: Nov 12, 2014 13:07:27.024

Location: 51.509708,0.0246251 UUID: e2c56db5-dffb-48d2-b060-d0f5a71096e0 Maj. Mnr.: 6-4 RSSI: -96 Proximity: Far Power: -59 Timestamp: Nov 12, 2014 13:07:28.146

Location: Unavailable UUID: e2c56db5-dffb-48d2-b060-d0f5a71096e0 Maj. Mnr.: 9-5 RSSI: -95 Proximity: Far Power: -58 Timestamp: Nov 12, 2014 13:09:50.880

It's still quite a rarity to detect anything other than my own development beacons when out and about so this was good to see. I'd have doubtless encountered more if I'd wandered into the exhibition hall and I'd have loved to explore further, but a quick look at the Moto 360 confirmed it would soon be time for the afternoon sessions to begin...

Afternoon Sessions - Technology Tracks:

There was a large and diverse range of sessions available during the afternoon. The most popular seemed to be the "Modern Development with Visual Studio" track, which was significantly oversubscribed. When I walked past the room in which this track was hosted, people were queuing outside the room and were being admitted on a "one in one out" basis. The last time I'd seen such a long queue of my fellow geeks we were all waiting to meet Peter Mayhew.

The "Developing Solutions for the Internet of Things" track that I had chosen was less busy, but still well attended. There were three main sessions within the track:

A Technical Overview of Microsoft’s Architecture for IoT: From the Edge to Analytics - Jeff Wettlaufer, Microsoft Senior Product Marketing Manager, Internet of Things

Check out Microsoft's IoT resources online - note the "Internet of Your Things" tagline,
which cropped up a few times during Jeff Wettlaufer's session.

Jeff Wettlaufer provided an overview of Microsoft's Azure IoT solution. This was a fairly high level session, which yielded the following points of interest:
  • There are 19 Azure datacenters, offering 600,000 cores.
  • Once your data is in Azure, you can use Power BI or HDInsight (Microsoft Cloud Hadoop) to analyse it.
  • Microsoft knows that not everyone is a .NET developer and is happy to provide support, documentation, and sample code for a variety of languages, including Python, PHP and Java.
  • Works with any client device and any client OS.
  • Any secure comms protocol is accepted.
  • Power Map for Excel looks impressive; it's a great way to visualise location-specific data, such as sales for specific retail locations. Definitely worth checking out.

Power Map in Action - this slide may actually be from the following presentation,
but I'm pretty sure Jeff Wettlaufer introduced the tool.

Wettlaufer ran a real-time demo, which showed data generated by two Raspberry Pis on the lectern in front of him being pushed to Azure queues and then consumed by worker processes. Configuration seemed fairly straightforward and the demo worked as expected.

It was good to hear directly from someone in a senior position on Microsoft's IoT team, and the fact that Wettlaufer had flown in from Redmond lent emphasis to the importance Microsoft are placing on Azure for IoT.

The Cloud & IoT - Clemens Vasters, Microsoft Technical Lead, Azure Service Bus Team

Here we dug deeper into the components that enable Azure for IoT to work at scale. There were some impressive numbers thrown around and Vasters generously shared his experiences of architecting large scale, highly available systems.

Azure IoT principles
Some observations and facts provided by Clemens (not verbatim, so any errors are mine):
  • "IoT is when those with the least experience in building very large, secure, high-scale, high-availability multi-node cloud solutions are confronted with having to build them".
  • A Unit of Management in Azure is 420 instances max.
  • IoT devices are peripherals to the system, in the same way printers are peripherals on office LANs.
  • Do not scale further than you can test (common sense, but it's perhaps surprising how often this principle is violated).
  • Security needs to be as close to the device as possible and communication needs to be one way, where possible. Devices should not listen for inbound traffic - they can't protect themselves!
  • Expecting devices such as domestic hubs to effectively handle security for a plethora of connected devices is likely to end in tears. UPnP is insecure and does not help, turn it off if you can.
  • Service assisted communication can help to secure your solution (see Vasters' slide on the subject below, and the sketch that follows it).
Service Assisted Communication
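
To make the outbound-only principle concrete, here's a minimal sketch of service-assisted communication from the device's side. The endpoint URL and token scheme are hypothetical stand-ins of my own; a real deployment would more likely hold a long-lived connection via a broker such as Service Bus, as Vasters described, but the key property is the same: the device dials out and never listens.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import javax.net.ssl.HttpsURLConnection;

public class OutboundOnlyDevice {

    // Hypothetical command endpoint - the device initiates every connection.
    private static final String COMMAND_URL = "https://example-service.net/devices/42/commands";

    public static void main(String[] args) throws Exception {
        while (true) {
            // Outbound HTTPS poll; the device never opens a listening socket,
            // so there is no inbound attack surface to defend.
            HttpsURLConnection conn = (HttpsURLConnection) new URL(COMMAND_URL).openConnection();
            conn.setRequestProperty("Authorization", "Bearer device-token"); // hypothetical auth scheme
            try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
                String command = in.readLine();
                if (command != null) {
                    System.out.println("Executing command: " + command);
                }
            } finally {
                conn.disconnect();
            }
            Thread.sleep(30000); // simple poll interval; a long poll would be more responsive
        }
    }
}
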
Working definitions of scale:
  • Hyperscale - up to a million concurrent clients.
  • Enterprise - typically 10,000 - 20,000 concurrent clients.
  • Web - typically a few thousand or fewer concurrent clients (there are exceptions - e.g. Twitter & Facebook).
IoT will be at the high end of scale, particularly for telemetry and Command & Control systems.

Microsoft have copious documentation on Azure, which I won't attempt to summarise, but the concept of Event Hubs is worth mentioning here. Event Hubs are the ingress point for IoT data into Azure, which is a key architectural difference between "regular" Azure and Azure for IoT. Once your data has been ingested by an event hub and passed into Azure, you're then free to store or transform the data as required. Also of note is Stream Analytics, which when used in conjunction with Event Hubs allows incoming data to be analysed in real time.

Azure Event Hubs
Event Hubs with Stream Analytics
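
To give a flavour of how simple basic ingestion can be, here's a minimal sketch that posts a single event to an Event Hub over its documented REST endpoint. The namespace, hub name, payload and SAS token below are placeholders of my own, so check the Azure documentation for current details before relying on any of this.

import java.io.OutputStream;
import java.net.URL;
import javax.net.ssl.HttpsURLConnection;

public class EventHubSender {

    // Placeholder namespace, hub name and pre-generated SAS token.
    private static final String EVENT_URL =
            "https://my-namespace.servicebus.windows.net/my-event-hub/messages";
    private static final String SAS_TOKEN =
            "SharedAccessSignature sr=...&sig=...&se=...&skn=...";

    public static void main(String[] args) throws Exception {
        String payload = "{\"deviceId\":\"sensor-01\",\"temperature\":21.5}";

        HttpsURLConnection conn = (HttpsURLConnection) new URL(EVENT_URL).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Authorization", SAS_TOKEN);
        conn.setRequestProperty("Content-Type", "application/atom+xml;type=entry;charset=utf-8");

        try (OutputStream out = conn.getOutputStream()) {
            out.write(payload.getBytes("UTF-8"));
        }

        // 201 Created indicates the event was accepted for ingestion.
        System.out.println("Response: " + conn.getResponseCode());
        conn.disconnect();
    }
}
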
I got a lot out of this session; Vasters is an extremely effective and honest communicator who is more than happy to share his knowledge and opinions. I'd definitely recommend catching one of his sessions if you're attending a conference at which he's speaking.

Building the Glasgow Future Cities Demonstrator – IoT, Big Data & the Cloud - Dr Colin Birchenall, Chief Architect, Future City Glasgow

Dr Birchenall presented a view of Glasgow's Future City initiative, the result of a successful bid for funding from the UK Government's Technology Strategy Board (now renamed Innovate UK). In a nutshell, the project leverages technology for the benefit of the city and its communities and provides a learning resource for cities looking to embark on similar schemes.

The city has created a state-of-the-art operations centre, where data from its CCTV network and traffic sensors is presented to the operations team. The operations centre opened in late 2013 and was immediately called into action supporting the emergency response to the tragedy at the city's Clutha Vaults, where a police helicopter crashed into the building with devastating consequences.

Over 200 data streams have been identified and more than 100 have been publicly exposed via Glasgow's Open Data Portal, which includes an interactive open map that stakeholders can use to view locations of various services and events, from allotments to traffic accidents. Developers are also encouraged to use the data to provide alternate views and build additional services.

I think this is an important and potentially influential initiative; if any of the above is at all interesting to you and you'd like to know more, I'd strongly recommend a visit to the Future City Glasgow website.

We started the day with a high level view of the possibilities presented by Open Data and this session described how these possibilities were being realised, making the two sessions natural bookends for the day.

And finally...

"Where we're going we don't need roads" - this DeLorean was parked near the entrance to the DLR station.
I think I ended the day with a decent grasp of the high level architecture of Azure, both generally and specifically as deployed in an IoT scenario. I picked up a few tips that would probably serve me well and I felt inspired to follow up on some of the things I'd seen and heard.

My thanks and congratulations go to the organising team for putting together an interesting and entertaining event that delivered genuine value, at least to this delegate. Hopefully Microsoft will make Future Decoded a recurring fixture; I'll definitely be attending if they do.


Thursday, 20 November 2014

Rapid Cross Platform Mobile App Development with Evothings

I've been working on an experiment that features a requirement for Android and iOS client apps, and maybe Windows Phone too. I have reluctantly had to accept the fact that coding up at least two client apps and a back end is probably asking a bit much of myself, at least if I'd like to get the whole stack up and running anytime soon.

This epiphany led me to embark on a quest for a tool that will give me access to the full range of client platforms, while wielding a single codebase (I'm pretty sure this is how Sauron got started, so I hope things end better for me than they did for him).

My background is in web apps, so I decided to take a look at Apache Cordova, as this would mean I could build the majority of the app in HTML, CSS and JavaScript, with which I'm familiar.

I'd come across Cordova and PhoneGap before, even going so far as installing Cordova, though problems with my installation on that occasion led me to abandon cross platform development and develop a native Android app.

A few weeks back I had another look at Cordova and got it up and running on my Windows machine. The cause of my previous problems had apparently been resolved since I last looked, probably either because I have a new laptop or because I'm using a different Cordova build, and I soon had a fresh installation to play with.

I then started googling for Cordova tutorials and stumbled upon Evothings. I tend to develop interactively, with many small iterations, so the promise of rapid app deployment to connected Android and iOS devices resonated strongly with me.

I downloaded the Evothings Workbench (then at version 0.8.x; version 1.0.0 has since been released) and, after following the excellent instructions, was quickly up and running. The Workbench is effectively a server that runs on your development machine (both Windows and Mac versions are available), to which the Evothings client for Android or iOS can be wirelessly connected.

You can download the client app from Google Play or the App Store; both Workbench and client are available free of charge.

I can't recommend Evothings highly enough; it's quick, stable and a joy to use. I was able to quickly adapt the Evothings iBeacon demo app to my needs and get it talking to a Google App Engine back end with minimal fuss - more on this in a subsequent post.

Evothings is positioned as a tinker tool for IoT development, so if you have an Arduino lying around or any other connected devices you'd like to hack it's a great place to start; it just so happens to be a very capable part of my current mobile app development toolchain too. 

Thursday, 6 November 2014

Microsoft are trying to make public spaces more accessible to blind and partially sighted people...using beacons.

I'm attending Microsoft's Future Decoded event next week, and was wondering what the company was doing in the beacon space. The answer is both unexpected and inspirational.

This is a great use for the technology and one that, as the linked article suggests, would have benefits to users well beyond the target audience. I wish the team luck and will be really interested to see how this initiative progresses.

Thursday, 23 October 2014

Two Weeks with the Moto 360 - Device Review

I've had my Moto 360 for a couple of weeks now, so I thought I'd share my experiences of using the device so far. The short version is I'm generally happy with the watch and the few minor negatives are greatly outweighed by the general geeky coolness of strapping a voice activated computer to my wrist. If you'd like the slightly longer version, please read on...



The Battery

Battery life on the Moto has not been an issue for me, though my venerable Nexus 4 has shown the strain of doing much of the heavy lifting for the device, even running out of juice mid-evening on one occasion. If your phone already has issues retaining charge for a full day, I'm afraid pairing it with a smartwatch is only going to make things worse.

The watch typically leaves its cradle around 7am and turns in for the night at around 11pm. The lowest remaining charge percentage I've seen so far is 11%, with between 40% and 70% being typical. Pretty much the first thing the watch did after I paired it was update itself, so I'm assuming I'm benefitting from enhancements in the recent software update described on this Reddit thread.

Wireless Charging

I love this feature, and I don't think I'd consider a smartwatch that didn't have it. The Moto's cradle is nicely designed and effectively turns the watch into a bedside clock, as can be seen in the image below. I also invested in a desktop wireless charging pad, which happily charges the watch and on which my Nexus 4 now spends much of the working day.

Image credit - Ryan Whitwam, Android Police

Voice Recognition

I'm impressed with the Moto's voice recognition capability. I can't think of a particular instance of a mistake it has made, with both Google searches and text messages performing as expected (maybe that should be "as hoped" rather than "as expected").

Text messaging in particular works really well, and in my case is a definite time saver over jabbing at a touchscreen - the first time I sent an SMS merely by talking to the watch was a definite sci-fi moment.

Self-consciousness has prevented me from making greater use of the voice interface while in public. Maybe we'll soon see crowds of people roaming the streets while talking to their wrists, but I strongly suspect not. Perhaps subvocal control systems hitting the mainstream will put a dent in the touchscreen interface's dominance, but that's a good few years away.

Health and Exercise

The Moto's health and exercise functionality is interesting, with hardware and supporting apps for heart and step monitoring baked in to the device. I don't have data on the accuracy of the heart sensor, but it has sometimes needed a few attempts to get a reading from me and it seems unlikely to be as accurate as a dedicated strap-on sports monitor.

I downloaded Endomondo, to test its Android Wear integration, at the request of one of my Objectivity colleagues and the results were fairly impressive. The app can be voice controlled, and a UI for stopping and starting workouts is displayed on the watch face. When you've finished your punishing hill run (or gentle stroll from the hotel to our Polish office in my case) you can view your workout stats, which include maximum and average heart rate as provided by the Moto.

My workout stats - I was overtaken by two snails and a tortoise.

Minor Issues

Compromises have definitely been made, the most obvious of which is the "flat tyre" at the bottom of the screen that houses the light sensor and screen driver hardware. The degree to which this bothers you will depend largely on your aesthetic sensibilities, but it clearly wasn't a deal breaker for me. I'll be surprised if this feature is present on any future versions of the device.

I've read reports elsewhere of battery capacity and performance being compromised, though neither has proven an issue for me so far - YMMV.

The watch face doesn't always illuminate when I rotate my arm to glance at it, which is obviously a little annoying. Clearly my biomechanical profile is not fully supported by the device. Fortunately I can confirm the beer mitt swing (seen at the tail end of this clip from the British sitcom Men Behaving Badly) works every time for me.

Active notifications obscure approximately half of the watch face. To be fair to the Moto, this is an Android Wear feature and is not specific to this device. Depending on the watch face chosen, a visible notification can make it difficult or impossible to see the current time until the notification is dismissed. The current mechanism for dismissing notifications is a swipe, but if I could dismiss them with a quick shake of the device this would probably work better for me.

Conclusion

I'm happy with the watch. I wear it daily and I continue to find it genuinely useful. It's nicely designed, well made, and comfortable to wear. It also tells the time, which is a definite bonus.

As seems to be the norm for technology products it's a little more expensive in the UK than in the US, but £200 seems just about reasonable for this particular device.

I'm now eagerly awaiting the Android Lollipop update, to see if the 5.0 version of Android Wear will feature the ability to use the watch as a beacon. :)


Thursday, 9 October 2014

iBeacons in Retail Event, Presented by Academia - The Swan at Shakespeare's Globe, London, September 25th 2014

I attended an event in London recently, along with 50 or so other consultants, technologists and retailers, to explore the subject of iBeacons in Retail. The event was organised by Academia, a leading provider of IT equipment and services to the UK education sector.

Though the ultimate purpose of the event was to allow Academia and Atama to raise awareness of their new joint venture, BeaconSense, Academia also sees value in helping to build a community around the beacon ecosystem. Judging by the degree of networking going on during breaks and over lunch, it seems they've made a good start on this.

The event featured 5 presentations, which I'll attempt to summarise:

1) Welcome, introduction to Academia and BeaconSense: Jesse Westgate - Academia

We opened with an introduction to Academia. We were given an overview of their BeaconSense solution, which they describe as "...a suite of hardware and software solutions delivering location based advertising and information services to key markets such as Retail and Leisure industries using Bluetooth technology and customised mobile apps". The presentation was accompanied by a brochure for the service, and the information shared was at a high level (i.e. no screen shots or detailed use cases).

There was no discussion of existing clients or of reference implementations, but the involvement of Pivotal (more on this later) gives the venture access to serious Big Data capability, and the Atama beacon is shaping up to be a solid product, so I will be watching with interest to see how BeaconSense develops.

2) iBeacon technology, insights & development by Eric Ferraz CEO – Atama

Atama's CEO introduced its new iBeacon product, which is at the advanced prototype stage, having been up and running in their lab for the last couple of months.

The device runs on 4 AA batteries and has some interesting features, including time based power management and variable advertising intervals.

Commenting on the device's battery profile, Ferraz asked the question "does it matter if beacons are bigger?" Personally I'd say "yes", at least if you want them on a name tag or asset label, but I agree that in the majority of use cases increased battery life is likely to be a worthwhile trade-off for a less low-profile enclosure.

I found the device's management capabilities particularly interesting. Atama beacons participate in a mesh network via a 6LoWPAN back channel. 6LoWPAN uses low frequencies at very low power, meaning that signals are not affected as much by environmental obstructions as higher-frequency Bluetooth transmissions, thereby giving longer effective range. Beacons within the mesh communicate with a bridge, which in turn talks to a cloud-based management service. Centralised remote management obviously simplifies set-up and ongoing maintenance, and this definitely feels like the way to go. Atama are not alone in this space though, Kontakt's cloud beacon being the obvious competitor.

All things considered the Atama beacon seems like a flexible and capable device with a solid list of features, and I look forward to trialling it.

AltBeacon was mentioned in passing and Android support was briefly discussed, though as this was an iBeacon centric event with Apple in attendance there was little appetite to address the subject in detail.

3) Case Study by Jess Stephens CMO – Smart Focus

This presentation was the stand out for me, as Jess Stephens' company Tag Points (recently acquired by Smart Focus, of which Stephens is the Chief Marketing Officer) have real world experience of running a service to which iBeacon technology is integral.

Stephens gave a detailed overview of case studies involving the Swan Centre in Eastleigh, UK and the Meadowhall Shopping Centre in Sheffield, UK. It was apparent in both cases that the engagement of mall ownership was instrumental in building a compelling offering that in turn drove customer engagement. Tag Points' case studies are available online, with another from Smart Focus to follow shortly; I strongly recommend checking these out if you're interested in a view from the trenches.

In terms of technology the Tag Points system is beacon agnostic, and supports both iOS and Android clients.

Stephens commented in passing that "Owning your own digital airspace" is likely to become increasingly important, akin to owning your internet domain name. This struck me as an interesting observation, and it's definitely worthy of further consideration.

4) Business Intelligence & Data Analysis by Chris Mills CTO – Pivotal

Pivotal is a provider of Big Data solutions, formed as a joint venture between EMC, VMware and GE. Given its backers it seems logical to suspect the company is not short of resources, and it's clearly able to engage with top-tier clients, so the fact that BeaconSense is able to offer insights derived via Pivotal has the potential to be an important differentiator.

Chris Mills' presentation focussed on a few case studies, including profiling the location of mobile phone calls in real time (for quality of service, not surveillance purposes, you'll be relieved to hear) and analysis of data provided after fitting out a US sports arena with iBeacons. Retail banking was also mentioned as a sector of interest.

Mills remarked that beacons effectively just provide another new data point to Pivotal. I suspect that organisations with an existing investment in Big Data are likely to take a similar view, and would be keen to derive maximum value from their beacon-specific customer engagement data by importing it into their existing solution.

5) Apple in Retail by Tom Grant - Apple Systems Engineer.

We were asked to treat the detail of Tom Grant's presentation as confidential, which I found slightly odd, as I don't recall anything being mentioned that wasn't already in the public domain. The presentation comprised a high level technical discussion and some case studies, which is unfortunately about as much as I can say without getting into specifics.

Grant is an engaging speaker, and I saw his attendance as evidence that Apple remains committed to the technology and to the UK iBeacon community.

After a brief Q&A, during which there was a short discussion of whether Android apps can consume iBeacon advertising packets (they can, if you were in any doubt), delegates retired for a quick tour of Shakespeare's Globe Theatre and a networking lunch.
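
For the curious, here's a rough sketch of why the answer is "yes": on Android 4.3 and above an app can scan for BLE advertisements and pick the iBeacon prefix out of the raw scan record itself. The byte offsets below reflect the widely published iBeacon advertisement format rather than anything presented at the event, so treat this as illustrative only.

import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;

public class IBeaconScanSketch {

    // Scan callback for the pre-Lollipop BLE API (requires BLUETOOTH and
    // BLUETOOTH_ADMIN permissions; startLeScan was deprecated in API 21).
    private final BluetoothAdapter.LeScanCallback callback =
            new BluetoothAdapter.LeScanCallback() {
                @Override
                public void onLeScan(BluetoothDevice device, int rssi, byte[] scanRecord) {
                    // iBeacon advertisements carry Apple's manufacturer data:
                    // 0x4C 0x00 0x02 0x15, then a 16-byte UUID, major, minor and TX power.
                    for (int i = 0; i < scanRecord.length - 3; i++) {
                        if ((scanRecord[i] & 0xFF) == 0x4C && scanRecord[i + 1] == 0x00
                                && scanRecord[i + 2] == 0x02 && (scanRecord[i + 3] & 0xFF) == 0x15) {
                            System.out.println("iBeacon detected, RSSI " + rssi);
                            return;
                        }
                    }
                }
            };

    public void startScan(BluetoothAdapter adapter) {
        adapter.startLeScan(callback);
    }
}
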

I found the event extremely valuable; I learned from the experiences of others and made new contacts within the wider beacon community. Academia should be congratulated for having the foresight to put the event together, for attracting a strong line up of speakers, and for ensuring it ran so smoothly.

I'm convinced that beacons provide great potential for building novel and compelling interactions, and for building new businesses to deliver them.

It's an exciting time to be a Technologist.

Tuesday, 23 September 2014

Version 1.4 of Beacon Scanner and Logger Supports Logging of Location Data

Version 1.4 of Beacon Scanner and Logger (free) has been published on Google Play.

The headline feature for this release is the ability to log latitude and longitude for each beacon. The app uses Google Play Location Services to do this and is configured to run in "fine" mode for the best possible accuracy.
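
For anyone curious about what this involves, the sketch below shows roughly how a "fine" location fix is obtained from Google Play Location Services via the fused location provider. It's a simplified illustration rather than the app's actual code (which is on GitHub), and it assumes the ACCESS_FINE_LOCATION permission has been declared in the manifest.

import android.content.Context;
import android.location.Location;
import android.os.Bundle;
import android.util.Log;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.location.LocationListener;
import com.google.android.gms.location.LocationRequest;
import com.google.android.gms.location.LocationServices;

public class LocationLogger implements GoogleApiClient.ConnectionCallbacks, LocationListener {

    private final GoogleApiClient client;

    public LocationLogger(Context context) {
        client = new GoogleApiClient.Builder(context)
                .addApi(LocationServices.API)
                .addConnectionCallbacks(this)
                .build();
        client.connect();
    }

    @Override
    public void onConnected(Bundle connectionHint) {
        // PRIORITY_HIGH_ACCURACY corresponds to "fine" mode - GPS where available.
        LocationRequest request = LocationRequest.create()
                .setPriority(LocationRequest.PRIORITY_HIGH_ACCURACY)
                .setInterval(5000);
        LocationServices.FusedLocationApi.requestLocationUpdates(client, request, this);
    }

    @Override
    public void onConnectionSuspended(int cause) { }

    @Override
    public void onLocationChanged(Location location) {
        // The latitude and longitude to log against each detected beacon.
        Log.d("BeaconLogger", location.getLatitude() + "," + location.getLongitude());
    }
}
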

I've switched from Eclipse to Android Studio for this release of the code so there will be a delay getting the source into GitHub.

As always, comments and feedback are welcome. The app was installed on 120 devices the last time I checked, so very many thanks to all who took the time to download and install it - I hope it's been useful and that it continues to earn a place on your devices.

Friday, 22 August 2014

Walgreen's Shopping Experience Now Features Gamification and Augmented Reality

U.S. drugstore Walgreens and digital agency Aisle411 have collaborated to produce a pretty compelling in-store shopping experience that makes use of Google's Project Tango.

It looks like shoppers need to create a list before visiting the store, which seems the only obvious weak point in this solution; they're then shown a route to all the items they'd like to buy, with personalised special offers displayed en route.

The gamification aspect looks to be limited to the shopper picking up loyalty points for following the prescribed route, but there's plenty of potential for adding further activities.

Build a set of scales into the trolley, add self-scanning, and it strikes me there'd be no need for a physical checkout either. Here's the video.

More here in French, though Chrome does a reasonable job of translating the article into English.

Wednesday, 20 August 2014

Handling Preferences in an Android Application: an Example

While the main function of the Beacon Scanner & Logger app is to scan for and log beacon data, there are a number of functions common to many Android apps that I've needed to implement while putting the app together. I thought I'd document how I implemented some of this functionality in a "cookbook" style, in the hope that it proves useful to someone.

It's entirely possible I've made an omission or a mistake in the following. Please feel free to let me know if you spot anything amiss and in return I promise to bestow upon you my eternal gratitude. 

First up is handling application preferences. Implementing preferences is actually not too difficult, as the framework handles most of the heavy lifting for the developer, but a fair amount of configuration and a few lines of code are required to get preference support up and running.

I've stripped out some of the code from the following examples, where the code removed is not directly related to handling preferences. If you'd like to view the code in its entirety, feel free to browse the app's GitHub repository.

The following steps are not presented in any particular order, but they are all mandatory.

Create a PreferenceFragment 

Since the Honeycomb release of Android, the recommended mechanism for handling preferences has been to use a Fragment rather than an Activity. Far be it from me to go against the Android team's sage advice, so I've used a Fragment in the Beacon Scanner app. The Fragment is responsible for loading our preference definitions from the XML file we'll create in the next step, as you'll see from the code copied below:

package net.jmodwyer.beacon.beaconPoC;

import net.jmodwyer.ibeacon.ibeaconPoC.R;
import android.os.Bundle;
import android.preference.PreferenceFragment;

public class BeaconPoCPreferencesFragment extends PreferenceFragment {

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Load the preference definitions from res/xml/preferences.xml.
        addPreferencesFromResource(R.xml.preferences);
    }

}

Create an XML Preferences file

We'll need to create a file to hold our preference definitions. This file should live in our application's \res\xml folder, and in this example I've called the file preferences.xml, which you'll see is referenced in the PreferenceFragment implementation shown in the previous step. Here are the contents of the file for Beacon Scanner & Logger:

<PreferenceScreen xmlns:android="http://schemas.android.com/apk/res/android" >
    
    <PreferenceCategory android:title="Beacon Properties to capture:" >

        <CheckBoxPreference
            android:defaultValue="true"
            android:key="index"
            android:summary="Log a count of beacon pings detected so far"
            android:title="Row Number" />        
        
        <CheckBoxPreference
            android:defaultValue="true"
            android:key="uuid"
            android:summary="Log the UUID"
            android:title="UUID" />
                
        <CheckBoxPreference
            android:defaultValue="true"
            android:key="majorMinor"
            android:summary="Log the Major and Minor values"
            android:title="Major Minor" />
        
        <CheckBoxPreference
            android:defaultValue="true"
            android:key="rssi"
            android:summary="Log the RSSI value"
            android:title="RSSI" />
        
        <CheckBoxPreference
            android:defaultValue="true"
            android:key="proximity"
            android:summary="Log the Proximity value"
            android:title="Proximity" />
        
        <CheckBoxPreference
            android:defaultValue="true"
            android:key="power"
            android:summary="Log the Power value"
            android:title="Power" />
                                                                        
        <CheckBoxPreference
            android:defaultValue="false"
            android:key="timestamp"
            android:summary="Log a Timestamp every time a beacon is detected"
            android:title="Timestamp" />
                                        
    </PreferenceCategory>

</PreferenceScreen>

Create an Activity so we can interact with our Preferences 

We'll need to provide an Activity that can be invoked when the user selects the Preferences option within the app. This Activity is responsible for invoking the PreferenceFragment we created earlier. Here's the code:

package net.jmodwyer.beacon.beaconPoC;

import android.app.Activity;
import android.os.Bundle;

public class AppPreferenceActivity extends Activity {

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Display our PreferenceFragment as this Activity's content.
        getFragmentManager().beginTransaction().replace(android.R.id.content,
                new BeaconPoCPreferencesFragment()).commit();
    }

}


Add the Activity to our AndroidManifest.xml file.

Every Activity needs an entry in AndroidManifest.xml, and this is ours:

        <activity
            android:name="net.jmodwyer.beacon.beaconPoC.AppPreferenceActivity"
            android:label="@string/app_name" >
            <intent-filter>
                <action android:name="net.jmodwyer.ibeacon.ibeaconPoC.AppPreferenceActivity" />
                <category android:name="android.intent.category.DEFAULT" />
            </intent-filter>
        </activity>

Create a \res\menu\main_activity_actions.xml file

We need to create an XML file to define the options we'll display on our action bar menu.

<menu xmlns:android="http://schemas.android.com/apk/res/android">

    <item android:id="@+id/Settings"
          android:title="@string/Settings"
          android:showAsAction="never"/>

</menu>

Modify \res\values\strings.xml

We defined a string called "Settings" in the main_activity_actions.xml file, so we now need to provide a definition for this in our application's strings.xml file.

<resources>

    <string name="Settings">Settings</string>

</resources>

Update our main Activity class

We need to make a number of changes to our main Activity class that will enable preference functionality within the app. For the sake of readability I've removed code that isn't related to the handling of preferences.

First we add code to our onCreate() method to load up the default values from our preferences file:

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Apply defaults from res/xml/preferences.xml; the "false" flag means
        // this happens on first run only, so existing user settings survive.
        PreferenceManager.setDefaultValues(this, R.xml.preferences, false);
    }

Then we ensure the options menu, which is home to our Preferences option, is displayed when our main Activity is invoked:

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        MenuInflater inflater = getMenuInflater();
        inflater.inflate(R.menu.main_activity_actions, menu);
        return super.onCreateOptionsMenu(menu);
    }

We'll need to override the onOptionsItemSelected(MenuItem) method so we can handle the user selecting the "Settings" option, which invokes our Preferences, from the action bar.

    // Handle the user selecting "Settings" from the action bar.
    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        switch (item.getItemId()) {
            case R.id.Settings:
                // Show settings
                Intent api = new Intent(this, AppPreferenceActivity.class);
                startActivityForResult(api, 0);
                return true;
            default:
                return super.onOptionsItemSelected(item);
        }
    }

Those are the only modifications required to load and support modification of preferences, but we obviously want to use the values stored in these preferences within the application. Here's a section of the app's startScanning method, which reads preference values and stores them in Activity-scoped variables for use in a subsequent method (note the use of constants for preference keys, which are declared elsewhere in the Activity class):

    private void startScanning(Button scanButton) {

        // Get current values for logging preferences.
        SharedPreferences sharedPrefs = PreferenceManager.getDefaultSharedPreferences(this);
        HashMap<String, Object> prefs = new HashMap<String, Object>();
        prefs.putAll(sharedPrefs.getAll());

        index = (Boolean) prefs.get(PREFERENCE_INDEX);
        uuid = (Boolean) prefs.get(PREFERENCE_UUID);
        majorMinor = (Boolean) prefs.get(PREFERENCE_MAJORMINOR);
        rssi = (Boolean) prefs.get(PREFERENCE_RSSI);
        proximity = (Boolean) prefs.get(PREFERENCE_PROXIMITY);
        power = (Boolean) prefs.get(PREFERENCE_POWER);
        timestamp = (Boolean) prefs.get(PREFERENCE_TIMESTAMP);
    }


...And we're done. If we follow the above steps we should now have an application in which we can display, modify and retrieve preferences. If you're still with me then thanks for your time, I hope this has helped you, and I'd love to have your feedback.
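
As a closing aside: if you need to react the moment the user changes a setting, rather than re-reading everything in startScanning(), SharedPreferences also supports change listeners. The class below is a hypothetical illustration rather than part of the app:

import android.content.Context;
import android.content.SharedPreferences;
import android.preference.PreferenceManager;
import android.util.Log;

public class PreferenceChangeExample {

    // Held in a field because SharedPreferences keeps only a weak reference
    // to registered listeners - a local variable would be garbage collected.
    private final SharedPreferences.OnSharedPreferenceChangeListener listener =
            new SharedPreferences.OnSharedPreferenceChangeListener() {
                @Override
                public void onSharedPreferenceChanged(SharedPreferences prefs, String key) {
                    // All of this app's preferences are booleans (CheckBoxPreference).
                    boolean value = prefs.getBoolean(key, false);
                    Log.d("BeaconPoC", "Preference " + key + " changed to " + value);
                }
            };

    public void register(Context context) {
        PreferenceManager.getDefaultSharedPreferences(context)
                .registerOnSharedPreferenceChangeListener(listener);
    }
}
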

Version 1.3 of Beacon Scanner & Logger Supports Scanning in Background

Version 1.3 of Beacon Scanner and Logger has been released to Google Play. The new version allows the app to continue scanning when in background mode, but not while the phone is locked or has automatically entered sleep mode. I'm investigating the best way of implementing scanning while locked for a subsequent version of the app.
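
For anyone wondering what scanning while locked might involve, one common approach is to hold a partial wake lock so the CPU keeps running after the screen switches off. The sketch below illustrates the idea; it's not necessarily the approach the app will end up using, and a held wake lock has an obvious battery cost.

import android.content.Context;
import android.os.PowerManager;

public class ScanLockHelper {

    private PowerManager.WakeLock wakeLock;

    // Acquire a partial wake lock so scanning can continue with the screen
    // off. Requires the WAKE_LOCK permission in the manifest.
    public void startLockedScanning(Context context) {
        PowerManager pm = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
        wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "BeaconScanner");
        wakeLock.acquire();
    }

    // Always release the lock when scanning stops, or the battery will suffer.
    public void stopLockedScanning() {
        if (wakeLock != null && wakeLock.isHeld()) {
            wakeLock.release();
        }
    }
}
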

As usual, the app is available to download for free, and source code is available on GitHub.

Reviews and comments are welcome, and if there's a feature you'd like to see then please let me know.

Wednesday, 6 August 2014

Version 1.2 of Beacon Scanner & Logger (free) Released

Version 1.2 of Beacon Scanner & Logger (free) (formerly iBeacon Scanner (free)) has been released to Google Play.

This version of the app adds support for detecting and logging AltBeacons, and now uses the background power save feature of the Radius Networks Android Beacon Library SDK.
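
For reference, enabling the power save feature amounts to a couple of lines in the app's Application class, along these lines (a sketch based on the library's documented usage; class names may differ between library versions):

import android.app.Application;
import org.altbeacon.beacon.powersave.BackgroundPowerSaver;

public class BeaconApplication extends Application {

    // Keep a reference so the power saver lives as long as the app does.
    private BackgroundPowerSaver backgroundPowerSaver;

    @Override
    public void onCreate() {
        super.onCreate();
        // Automatically slows the scan cycle down whenever no UI is visible.
        backgroundPowerSaver = new BackgroundPowerSaver(this);
    }
}
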

The app was published at 12:00 UK time, and will doubtless take a few hours to percolate through the Google Play App store.

Tuesday, 5 August 2014

Beacon Scanner and Logger App Now Uses AltBeacon Library from Radius Networks

I've updated the Beacon Scanner app so that it now uses the AltBeacon library from Radius Networks and removed any references to the iBeacon specific library previously used.

The app itself is currently still iBeacon specific, but AltBeacon support (and hopefully Gimbal Series 10 support) will be added shortly. Update - here's a section of one of the output files generated by the app that shows both an AltBeacon (actually a virtual AltBeacon) and an iBeacon being detected:

42 UUID: 00000000-0000-0000-0000-000000000000 Maj. Mnr.: 1-1 RSSI: -68 Proximity: Near Power: -59
43 UUID: 52414449-5553-4e45-5457-4f524b53434f Maj. Mnr.: 0-0 RSSI: -54 Proximity: Immediate Power: -59
44 UUID: 52414449-5553-4e45-5457-4f524b53434f Maj. Mnr.: 0-0 RSSI: -44 Proximity: Immediate Power: -59
45 UUID: 00000000-0000-0000-0000-000000000000 Maj. Mnr.: 1-1 RSSI: -70 Proximity: Near Power: -59
46 UUID: 52414449-5553-4e45-5457-4f524b53434f Maj. Mnr.: 0-0 RSSI: -42 Proximity: Immediate Power: -59
47 UUID: 52414449-5553-4e45-5457-4f524b53434f Maj. Mnr.: 0-0 RSSI: -52 Proximity: Immediate Power: -59
48 UUID: 00000000-0000-0000-0000-000000000000 Maj. Mnr.: 1-1 RSSI: -73 Proximity: Near Power: -59
49 UUID: 00000000-0000-0000-0000-000000000000 Maj. Mnr.: 1-1 RSSI: -55 Proximity: Near Power: -59
50 UUID: 52414449-5553-4e45-5457-4f524b53434f Maj. Mnr.: 0-0 RSSI: -55 Proximity: Immediate Power: -59

Entries showing the UUID constructed entirely of zeroes correspond to the AltBeacon.
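
In case it's useful to anyone wiring this up themselves, detecting both formats with the Radius Networks library comes down to registering an extra BeaconParser; AltBeacon is the library's default. The sketch below assumes the 2.x API, and the layout string is the widely published iBeacon byte layout rather than anything official, so verify it against the library version you're using:

import android.content.Context;
import org.altbeacon.beacon.BeaconManager;
import org.altbeacon.beacon.BeaconParser;

public class ParserSetup {

    // Adding a second parser with the iBeacon byte layout lets the same
    // scan pick up both AltBeacons and iBeacons.
    public static void addIBeaconSupport(Context context) {
        BeaconManager beaconManager = BeaconManager.getInstanceForApplication(context);
        beaconManager.getBeaconParsers().add(new BeaconParser()
                .setBeaconLayout("m:2-3=0215,i:4-19,i:20-21,i:22-23,p:24-24"));
    }
}
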

Support for the Gimbal device is next on the list, though judging by this question on Stack Overflow from David G Young, that could be tricky.

Source code is available now on GitHub, as is an apk file; note the repository name has changed to remove the reference to "iBeacon". Some refactoring is needed, but I wanted to share the work-in-progress version in case it proves useful to anyone now.

I'll update the app on Google Play once I've tidied up the code. I'm planning to add background scanning support too, but this may have to wait for a subsequent release.