Location and motion-centric technology in Apple's iPhone 5S could have massive implications for travel and life-logging apps such as Rove, Foursquare, GateGuru, and Moves.
What hasn't been widely appreciated is just how fully travel apps that rely on location services and motion sensing can take advantage of the new technology.
On the iPhone 5S, apps using motion sensors can now work much more power-efficiently and deliver richer, more precise tracking -- even when the device is asleep. The implications are huge.
Here are three reasons why:
1. The M7 co-processor
Until now, on iOS 6, apps like Rove or Foursquare have periodically sampled accelerometer and/or GPS data to detect your general location and to guess whether you are stationary, walking, driving, flying, and more. In the case of Rove, these measurements are made every couple of minutes to minimize battery consumption, and even less frequently if you are stationary or moving fast.
This approach is not workable for apps that need to measure every step you take or every move you make -- like most fitness apps. The GPS and accelerometer sensors can't be sampled as precisely or as frequently as those apps require, because of the battery drain that would cause and because signal quality indoors is poor.
This is about to change. Thanks to the M7 motion-sensing co-processor chip in the iPhone 5S, access to continuous motion data now comes without tying up the main processor or draining the battery!
The M7, acting on its own, aggregates the data from the accelerometer, gyroscope, and compass, and allows those sensors to stay active and be analyzed all the time, even when the phone itself is asleep.
Motion-logging apps will no longer need to run nonstop in the background; instead they will use the data collected by the M7. This means that the overall energy footprint will drop while the precision and quality of your data improve.
Some might wonder why adding an additional processor that continuously tracks your motion would increase the overall lifetime of a full battery charge.
The M7 is a small processor optimized for one task (measuring your movement) and therefore has a much lower energy footprint than the main processor for the same task.
It will also allow the main processor to lie dormant more often and only "wake up" when motion is detected, similar to how your screen saver works. For example, the operating system could reduce how often it pings for new networks if you are not moving.
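To make this concrete, here is a rough sketch, in modern Swift, of how an app could read the motion history the M7 records on its own instead of sampling sensors in the background. It assumes Apple's `CMMotionActivityManager`, which keeps roughly the last seven days of activity data; the variable names are mine, and this requires a real M7-equipped device to run.

```swift
import CoreMotion

// Hypothetical sketch: query the activity history the M7 co-processor
// has already recorded, rather than running our own background sampling.
let manager = CMMotionActivityManager()
let weekAgo = Date(timeIntervalSinceNow: -7 * 24 * 60 * 60)

if CMMotionActivityManager.isActivityAvailable() {
    manager.queryActivityStarting(from: weekAgo, to: Date(), to: .main) { activities, error in
        guard let activities = activities, error == nil else { return }
        // Each CMMotionActivity carries a start date, a confidence level,
        // and Boolean flags such as walking, running, automotive, stationary.
        let walkingSegments = activities.filter { $0.walking }
        print("Walking segments recorded in the last week: \(walkingSegments.count)")
    }
}
```

The point of the design is visible here: the app asks for data after the fact, so it never has to stay awake while the movement is happening.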
2. The new motion API
With the 5S, developers will not only have access to raw vectorial data but also human-readable motion states: walking, automotive, running, and stationary.
At Rove, this is something we currently have to do in-house: we developed models that use raw acceleration data to guess your motion states.
This new Core Motion API won't replace our current models, but it will allow us to improve them and focus on other aspects of our "guessing algorithm," such as knowing whether you are riding a boat or a train versus just generic "automotive" transport -- similar to how we currently detect flights and more.
With this new API, Apple is clearly encouraging developers with no data-mining background to build new motion-aware apps.
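A sketch of what those human-readable states look like to a developer, again in modern Swift against Apple's `CMMotionActivityManager` (device-only; the print labels are illustrative):

```swift
import CoreMotion

// Hypothetical sketch: subscribe to live, human-readable motion states
// instead of classifying raw accelerometer vectors ourselves.
let activityManager = CMMotionActivityManager()

if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        // The API reports one or more state flags plus a confidence level.
        if activity.automotive {
            print("driving (confidence: \(activity.confidence.rawValue))")
        } else if activity.running {
            print("running")
        } else if activity.walking {
            print("walking")
        } else if activity.stationary {
            print("stationary")
        }
    }
}
```

Compare this to building and tuning a classifier over raw acceleration data -- the barrier to entry for motion-aware apps drops dramatically.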
3. iBeacon

One of the major problems that location-based apps run into is the accuracy of indoor GPS measurements. GPS was designed as an outdoor localization system. Satellite signals are often blocked by building structures or bounce off walls in cities such as New York, resulting in an imprecise read on your location.
Location detection is somewhat improved by triangulation using Wi-Fi signals or cell towers -- but not nearly enough. Obviously this reduces any app's ability to detect the exact venue you're in.
To mitigate this issue, Rove uses your entry and exit points to approximate indoor locations.
This is about to get much easier and much better thanks to iBeacon: a Bluetooth Low Energy transmitter that broadcasts a radio signal with a unique identifier, which an iOS device (and eventually Android devices) can use to estimate its distance from the beacon.
An airport terminal could set up iBeacons, sold by companies like Estimote, to make it easier for apps like GateGuru to direct people to duty-free shopping. Museums could offer tour apps that are precisely pinpointed to individual rooms.
It will be as if there were a couple of GPS satellites in every single venue.
Apps will now get your coordinates in high definition, so to speak, meaning they will not only know which restaurant you're in but also which table you're sitting at.
Creepy? Maybe a little. But that's the kind of precision iBeacon could support! This will improve the quality of existing location-based apps and open the door to a whole new set of features -- for example, keeping track of which paintings you stared at the most in a museum. Imagine that!
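For the curious, ranging beacons looks something like this sketch in modern Swift, using Apple's Core Location API. The UUID is a placeholder for whatever identifier a venue publishes for its beacons, and the region name is invented for illustration; beacon ranging only works on a physical device.

```swift
import CoreLocation

// Hypothetical sketch: listen for nearby iBeacons and read out
// a rough distance estimate for each one.
class BeaconRanger: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!, // placeholder UUID
        identifier: "terminal-beacons") // illustrative name

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(in: region)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        for beacon in beacons {
            // proximity is immediate, near, far, or unknown;
            // accuracy is a rough distance estimate in meters.
            print("beacon \(beacon.major)/\(beacon.minor): ~\(beacon.accuracy) m")
        }
    }
}
```

Each beacon's major/minor numbers let a venue distinguish, say, one gate or gallery room from another under the same UUID -- which is what makes table-level precision plausible.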
In the future, Apple might combine this technology with something similar to what Google's Motorola Mobility did with the Moto X: include a low-power chip whose single purpose is to crunch data from the device's microphone and help establish context -- for example, detecting that you're in a noisy setting and automatically raising the volume.
Overall, sensors that are always on enable app developers to create a variety of functions previously undreamt of.
NB: This is a guest viewpoint from Edouard Tabet, founder of Rove, an iPhone app that lets travelers track in rich detail where and what they saw on a journey.