News broke recently that Apple (NASDAQ:AAPL) hired Philip Stanger, CEO of indoor location technology start-up WiFarer, for a "leadership role" at Apple. Blog after blog has repeated the story, with the assumption that Stanger will be working on improving the Apple Maps user experience.
On the surface, this makes sense. Apple Maps has not lived up to expectations, and Apple needs the next version of Apple Maps to be a lot better. That alone would be worth hiring for.
But if we take a deeper look, it appears that Apple has a lot more than improved maps on the horizon. It appears that Apple is about to launch a revolution in indoor location technology. It's called SLAM, which stands for "simultaneous localization and mapping." In layman's terms, it could be described as "figure out where you are as you walk around." It has the potential to open up mapping and location-based applications in any indoor site worldwide.
SLAM technology, explained below, has been around for years, primarily in R&D labs building robots, space rovers, aerial drones, and the like. Recently 5-6 start-up companies (out of over a hundred in the indoor location area) have been working on incorporating SLAM into indoor location systems on smartphones. SLAM is definitely on the mobile horizon.
I speculate that Apple is preparing to release SLAM on iOS. The reasons are (1) their recent hiring of Philip Stanger, with his focus on indoor location, (2) a recently published patent application, (3) their acquisition of start-up company WiFiSLAM one year ago, (4) their introduction of the M7 motion sensing chip in the iPhone 5s, (5) some plans for the upcoming WWDC conference, and (6) Apple's out-of-character history of both supporting and preventing indoor location technology on iPhones.
Indoor location positioning on smartphones initially used a process called trilateration (or multilateration), which basically measures the signal strengths that the phone receives from cellular antennas and Wi-Fi access points in the area, and estimates how far the phone is from each of those antennas and access points. It then estimates the position whose distances to those antennas and access points best match the estimates. This is how Android and Google Maps estimate indoor location now, and how iPhones used to estimate indoor location based on Apple's previous alliance with Skyhook Wireless.
The problem with trilateration is that signal strengths are generally not an accurate way to estimate distance. If a signal strength from a particular antenna is weak, it might be because the phone is far from that antenna, or it might be because there is a big metal beam in between the phone and that antenna. Because of this, trilateration systems tend not to be accurate enough for location-based mapping and applications.
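The idea is simple enough to sketch. Here is a minimal illustration in Python; the access point positions, the reference signal strengths, and the path-loss exponent are all hypothetical, and the log-distance path-loss model used to turn signal strength into distance is exactly where the real-world inaccuracy creeps in:

```python
import math

# Hypothetical site: three access points with known (x, y) positions in
# metres and a calibrated RSSI at 1 m. All numbers are illustrative.
ACCESS_POINTS = [
    {"pos": (0.0, 0.0), "rssi_1m": -40.0},
    {"pos": (10.0, 0.0), "rssi_1m": -40.0},
    {"pos": (0.0, 10.0), "rssi_1m": -40.0},
]
PATH_LOSS_EXPONENT = 2.0  # free-space value; real buildings vary widely


def rssi_to_distance(rssi, rssi_1m, n=PATH_LOSS_EXPONENT):
    # Log-distance path-loss model: rssi = rssi_1m - 10 * n * log10(d).
    # A metal beam between phone and antenna breaks this assumption.
    return 10 ** ((rssi_1m - rssi) / (10 * n))


def trilaterate(observed_rssis, steps=5000, lr=0.01):
    # Find the point whose distances to each access point best match the
    # distance estimates, by gradient descent on the squared residuals.
    dists = [rssi_to_distance(r, ap["rssi_1m"])
             for r, ap in zip(observed_rssis, ACCESS_POINTS)]
    x, y = 5.0, 5.0  # start in the middle of the site
    for _ in range(steps):
        gx = gy = 0.0
        for d, ap in zip(dists, ACCESS_POINTS):
            ax, ay = ap["pos"]
            r = math.hypot(x - ax, y - ay) or 1e-9
            # Gradient of (r - d)^2 with respect to (x, y).
            gx += 2 * (r - d) * (x - ax) / r
            gy += 2 * (r - d) * (y - ay) / r
        x -= lr * gx
        y -= lr * gy
    return x, y
```

With clean signals this recovers the phone's position well; the trouble described above is that observed RSSI rarely follows the path-loss model indoors.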
For that reason, the vast majority of indoor location solutions on the market have moved to a method called signal fingerprinting. In this method, a person walks around each site and takes recordings of the signal strengths that the phone is receiving at that point. A database of collections of signal strengths at hundreds of points on a site's map can enable a phone to estimate its location based on observed signal strengths more accurately than using trilateration.
The problem with signal fingerprinting, however, is that every site that will work in the system must be prepared by a person who walks around the site recording signal strengths. It also requires that the system have a map of every site.
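The matching step at the heart of fingerprinting is straightforward. A minimal sketch, with a hypothetical hand-surveyed database of three points (positions and dBm values invented for illustration):

```python
# Hypothetical fingerprint database built by a human surveyor:
# surveyed (x, y) point -> {access point id: recorded RSSI in dBm}.
FINGERPRINTS = {
    (2.0, 3.0): {"ap1": -48, "ap2": -61, "ap3": -70},
    (7.0, 3.0): {"ap1": -60, "ap2": -49, "ap3": -66},
    (4.0, 8.0): {"ap1": -58, "ap2": -64, "ap3": -47},
}


def locate(observed):
    # Nearest-neighbour match: pick the surveyed point whose recorded
    # signal strengths are closest (mean squared difference, in dB)
    # to what the phone observes right now.
    def mismatch(recorded):
        common = set(recorded) & set(observed)
        if not common:
            return float("inf")
        return sum((recorded[ap] - observed[ap]) ** 2 for ap in common) / len(common)

    return min(FINGERPRINTS, key=lambda point: mismatch(FINGERPRINTS[point]))
```

Because the database stores what signals actually look like at each spot, metal beams and other distortions are baked in rather than being a source of error, which is why this beats trilateration in practice.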
A third approach that is very common recently is called sensor fusion (also known as inertial navigation). In this method, a phone knows its initial location (say, using GPS outdoors) and uses sensors in the phone to measure every movement from that initial location. This approach is very promising because of its generality, but suffers from small errors in motion sensing that after a few minutes can accumulate into big errors. It also requires a significant amount of battery power to monitor sensors every few seconds.
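The drift problem is easy to see in a toy model. This sketch of pedestrian dead reckoning (one flavour of sensor fusion) is purely illustrative; the step length and heading-noise figures are assumptions, not measurements from any real phone:

```python
import math
import random


def dead_reckon(start, headings_deg, step_len=0.7, heading_noise_deg=5.0):
    # Pedestrian dead reckoning: each detected step advances the position
    # estimate by step_len metres along the compass heading. A small
    # Gaussian error on every heading reading accumulates without bound,
    # which is why the estimate degrades after a few minutes of walking.
    x, y = start
    path = [(x, y)]
    for true_heading in headings_deg:
        noisy = true_heading + random.gauss(0.0, heading_noise_deg)
        x += step_len * math.cos(math.radians(noisy))
        y += step_len * math.sin(math.radians(noisy))
        path.append((x, y))
    return path
```

Run a few hundred simulated steps and the gap between the noisy path and the true one grows steadily, mirroring the accumulating error described above.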
A fourth approach, used for years but recently popularized by Apple, is to place radio beacons around each site to be covered; these transmit special signals that let nearby phones estimate their locations. But this will only work in sites that have those beacons installed.
Enter SLAM. SLAM basically attempts to automate the process of collecting signal fingerprints, and learn maps of the sites at the same time, by using other methods for the first few times the phone moves around the site. In SLAM, phones moving around a site estimate their locations using sensor fusion, and record signal fingerprints and map data as they go. The location estimates during this process will be very inaccurate, because of the accumulating errors of sensor fusion, but after tens or hundreds of times moving around a site, the system can combine all the data received to put together a signal database and a map of the site. Once the map and signal database are ready, future phones in the site can estimate their locations more accurately.
In other words, SLAM enables a set of phones to learn how to position themselves in a site, and to learn a map of the site at the same time. This is simultaneous localization and mapping.
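The combining step can be sketched in miniature. This is not WiFiSLAM's or Apple's algorithm (neither has published one); it is a hypothetical illustration of the key insight that observations with matching signal vectors came from roughly the same physical spot, so averaging their independently drifted position estimates cancels much of the dead-reckoning error:

```python
from collections import defaultdict


def build_signal_map(traces):
    # Each trace is one walk through the site: a list of
    # (dead-reckoned (x, y) estimate, {access point id: RSSI}) samples.
    def signature(rssi_by_ap):
        # Bucket RSSI values coarsely (5 dB bins) so similar readings
        # taken near the same spot collide on the same key.
        return tuple(sorted((ap, round(r / 5)) for ap, r in rssi_by_ap.items()))

    buckets = defaultdict(list)
    for trace in traces:
        for pos, rssi_by_ap in trace:
            buckets[signature(rssi_by_ap)].append(pos)

    # Average the position estimates in each bucket: the per-trace drift
    # errors are independent, so they largely cancel in the mean.
    signal_map = {}
    for sig, positions in buckets.items():
        n = len(positions)
        signal_map[sig] = (sum(x for x, _ in positions) / n,
                           sum(y for _, y in positions) / n)
    return signal_map
```

The resulting database maps signal signatures to refined positions, which is exactly the artifact that fingerprinting needed a human surveyor to produce.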
Companies working in the area of mobile SLAM report that it takes ten or twenty times walking around a site to learn the site.
This is where Apple comes in. Imagine if every iPhone were SLAMming indoor sites as it moved around. With all those iPhones out there, in no time Apple could build up maps and signal databases for indoor sites all over the world, all using SLAM. These maps and databases could then be used to estimate indoor locations universally and efficiently. This would leapfrog all efforts to date in indoor location positioning, and give Apple the lead in supporting location-based applications indoors.
WiFiSLAM, the company that Apple acquired last year, had developed and released technology for signal fingerprinting. They pioneered a mobile application for collecting signal fingerprints easily. But they also, more quietly, developed strong technology for sensor fusion, which they demonstrated walking around a parking lot. (The picture shown is taken from a WiFiSLAM video.) While they never discussed combining the two technologies, their name makes it clear that their vision was SLAM. It would make sense for that vision, and the two technologies that can be combined, to drive Apple's acquisition.
Moreover, the WiFiSLAM team that's now at Apple has been working on SLAM, and submitted a patent application in 2012 that discusses it. This application describes a system in which "as the user moves along actual path, the sensors of the mobile will record the movement, and the location software will maintain a log of the sensor information, as well as the signal strengths... A loop emphasizes the continual iteration of this process. In one embodiment, this step occurs multiple times per second...." This is describing the basic SLAM process as explained above, in which iterations of phones moving around a site collect signal fingerprints and other data about the site, that can be used later for positioning.
One of the challenges in mobile SLAM is that the phone needs to track its location, using built-in sensors, as it moves around unmapped locations. This inherently takes a lot of CPU and battery. But in September 2013, Apple added a new chip to the iPhone 5s, the M7 motion sensing chip, which has the ability to "continuously measure motion data" using the phone's sensors. Apple's announcement discussed its use in exercise apps and gesture control, but they also hinted that it would be used in Maps. Nobody at the time appears to have connected this to indoor location positioning by sensor fusion or SLAM.
This wouldn't be the first time for Apple to add hardware and only reveal its purpose later. iPhones supported Bluetooth Low Energy for a considerable amount of time before Apple announced iBeacons for indoor location. The M7 chip may be the next technical feature to have its purpose revealed.
For the past several years, Apple has had a love/hate relationship with indoor location. Early on, Apple delivered rudimentary indoor location positioning through its alliance with Skyhook Wireless, but they cancelled that alliance a few years ago. While Apple usually does everything possible to enable a large ecosystem of third-party applications, it has recently blocked application developers from accessing Wi-Fi signal strength data that is critical for indoor location apps, making iPhones lag behind Android in this area. The most logical reason for this is that Apple wants to keep indoor location on iPhones to themselves.
For its WWDC conference in June, Apple has announced a session on the subject of Location and Motion and an event on the subject of Apps and Location. Might this year's WWDC be the occasion of Apple's revealing a grand plan to support worldwide indoor location positioning, based on SLAM?
Apple has long been one of the hardest companies to predict, with a knack for surprising the world and making their announcements seem obvious at the same time. The prediction that they will announce SLAM is purely speculation. But if Apple isn't the first to SLAM the world, Google (NASDAQ:GOOG) (NASDAQ:GOOGL) or Microsoft (NASDAQ:MSFT), or a major phone manufacturer, will soon.
There are over a hundred companies working on indoor location positioning, several of which are working actively on SLAM. Some others are working on other approaches to universal indoor location positioning, meaning indoor location technology that can work anywhere. Who will be the first to deliver a phone that can determine its location, accurately enough to be useful, in any indoor location?
Disclosure: I have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.