Insights

The Most Interesting Work With AI Is Happening in the Shade of Apple's Lab


The most exciting work often happens in the shade, until the moment the results can speak for themselves. When it comes to artificial intelligence (AI), the same applies to a range of large technology companies that keep some of their most interesting acquisitions and projects inside their labs until significant achievements can be revealed.

Within just the last two years, companies like Google, Microsoft, Salesforce (PredictionIO), Cloudera (Sense), Intel (Nervana), and others have made notable acquisitions. For example, in February, Google DeepMind launched a new division called DeepMind Health and acquired Hark, a university spinout with a healthcare app. Then in September, Google acquired the natural language understanding startup API.AI.

Microsoft has not been lagging in its involvement with AI startups either. In February, the company announced that it had entered into a definitive agreement to acquire SwiftKey, whose software keyboard and SDK power more than 300 million Android and iOS devices. Later, in August, Microsoft acquired Genee, an AI-powered scheduling service.

The list of interesting acqui-hires is vast, but for the moment, let's focus on one of the dark horses of the AI industry – Apple. The California-based company is extremely secretive about the advancements powering its technology, but it is certainly one of the most committed to building an AI brain into all of its devices and to reimagining the way we interact with virtually everything around us.

Apple and the AI startup ecosystem

Quietly, Apple has been gathering a very interesting talent pool under its roof. The company has acquired AI startups such as Emotient, which uses artificial intelligence to read people’s emotions by analyzing facial expressions; Perceptio, which develops technology to let companies run advanced artificial intelligence systems on smartphones without needing to share as much user data; and Vocal IQ.

“Vocal IQ introduced the world’s first self-learning dialogue API – putting real, natural conversation between people and their devices. Every time your application is used, it gets a little bit smarter. Previous conversations are central to its learning process – allowing the system to better understand future requests and in turn, react more intelligently,” as the solution was described in Forbes.

Perceptio, in particular, is seen by experts as fitting Apple’s strategy of minimizing its use of customer data and doing as much processing as possible on the device.

Further, in August 2016, the company added Turi, a machine learning and AI startup, to its team. As reported by GeekWire, the acquisition reflects a larger push by Apple into AI and machine learning and promises to further increase the California-based company’s presence in the Seattle region, where Apple has been building an engineering outpost for the past two years.

Later, in September, Apple quietly acquired Tuplejump, an India- and US-based machine learning team. TechCrunch captured a description of the company from the Wayback Machine, since its website was taken down after the acquisition:

“A few years ago people realized that the volume of data that businesses generate was becoming unwieldy. A new set of technologies to handle these huge amounts of data cropped up. We were one of the early adopters of these ‘big data’ technologies. Having helped Fortune 500 companies adopt these technologies we quickly realized how complicated they were and how much simpler they could get.

"Thus started our quest to simplify data management technologies and make them extremely simple to use. We are building technology that is simple to use, scalable and will allow people to ask difficult questions on huge datasets.”

TechCrunch suggests that in this acquisition, Apple was particularly interested in FiloDB, a new open-source distributed columnar database from Tuplejump. FiloDB is designed to ingest streaming data of various types – including machine, event, and time-series data – and to run very fast analytical queries over it.

While the startups mentioned vary in their fields of work, they certainly share common ground – how to deal with large streams of complex data and make sense of them in the most efficient and accurate manner. Turi, for example, lets developers build apps with ML and AI capabilities that automatically scale and tune. GeekWire notes that the company’s products – which include the Turi Machine Learning Platform, GraphLab Create, Turi Distributed, and Turi Predictive Services – are largely designed to help large and small organizations make better sense of data. Use cases include recommendation engines, fraud detection, predicting customer churn, sentiment analysis, and customer segmentation.
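Turi’s internal platform is not described in detail in public reports, but purely as an illustration of the recommendation-engine use case listed above, here is a minimal sketch written with Turi Create, the open-source Python library descended from GraphLab Create (the dataset and identifiers below are made up for the example):

import turicreate as tc

# Hypothetical past user-item interactions (e.g., purchases or listens).
interactions = tc.SFrame({
    'user_id': ['u1', 'u1', 'u2', 'u2', 'u3'],
    'item_id': ['a', 'b', 'a', 'c', 'b'],
})

# Train a collaborative-filtering recommender on the interaction data.
model = tc.recommender.create(interactions, user_id='user_id', item_id='item_id')

# Recommend the top three items for each known user.
print(model.recommend(k=3))

This is the kind of workflow – data in, model trained, predictions served – that products like GraphLab Create and Turi Predictive Services were built to scale for organizations large and small.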

Professionals have suggested other ideas with respect to Apple’s AI spree. "Apple has been acquiring machine-learning companies. Increasingly, services and products are becoming AI-based – bots, speech recognition, tagging of user's photos, finding faces," said Viral Shah, co-creator of the open-source language Julia and co-author of 'Rebooting India' along with Nandan Nilekani, co-founder of Infosys.

"If Apple is building a self-driving car that will also have machine learning and AI in it, it is only natural for them to acquire talent around machine learning and AI," Shah added to The Economic Times.

Apple takes steps to preserve its secrecy and independence in AI-related activities. In fact, the company has recently declined to join the likes of Google and Facebook in a new consortium that aims to ensure artificial intelligence technology is developed in a safe, ethical, and transparent manner.

AI-powered evolution of the whole company

The depth of Apple’s secretiveness has been well described in a story by Steven Levy of Backchannel, which starts its ‘quest’ with an example kept under cover: two years ago, on July 30, 2014, Siri had a brain transplant, and nobody was told about it:

“Apple moved Siri voice recognition to a neural-net based system for US users on that late July day (it went worldwide on August 15, 2014.) Some of the previous techniques remained operational – if you’re keeping score at home, this includes “hidden Markov models” – but now the system leverages machine learning techniques, including deep neural networks (DNN), convolutional neural networks, long short-term memory units, gated recurrent units and n-grams. (Glad you asked.) When users made the upgrade, Siri still looked the same, but now it was supercharged with deep learning.”
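Levy is describing Apple’s internal systems, which have never been published. Purely to illustrate one of the building blocks he names, here is a minimal sketch of a long short-term memory (LSTM) acoustic model that scores phonemes frame by frame, written in PyTorch with hypothetical sizes and names (this is not Apple’s implementation):

import torch
import torch.nn as nn

class AcousticModel(nn.Module):
    """Illustrative LSTM model mapping audio feature frames to phoneme scores."""
    def __init__(self, n_features=40, hidden=128, n_phonemes=48):
        super().__init__()
        # In hybrid systems, a recurrent net like this replaces the per-frame
        # emission model of a classic hidden-Markov-model pipeline.
        self.lstm = nn.LSTM(n_features, hidden, num_layers=2, batch_first=True)
        self.out = nn.Linear(hidden, n_phonemes)

    def forward(self, frames):           # frames: (batch, time, n_features)
        states, _ = self.lstm(frames)    # one hidden state per audio frame
        return self.out(states)          # per-frame phoneme scores

model = AcousticModel()
dummy = torch.randn(1, 100, 40)          # roughly one second of 40-dim filterbank frames
print(model(dummy).shape)                # torch.Size([1, 100, 48])

In a production speech recognizer, per-frame scores like these would still be combined with a decoder, which is where the hidden Markov models Levy mentions remain in the picture.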

At that time, Apple did not publicize the development, and an average user would notice that something was different only because requests were processed with fewer errors. One of the most interesting parts of the story, however, comes further on, emphasizing Apple’s hallmark in the AI industry:

“Until recently, when Apple’s hiring in the AI field has stepped up and the company has made a few high-profile acquisitions, observers have viewed Apple as a laggard in what is shaping up as the most heated competition in the industry: the race to best use those powerful AI tools. Because Apple has always been so tight-lipped about what goes on behind badged doors, the AI cognoscenti didn’t know what Apple was up to in machine learning.”


Image source: An exclusive inside look at how artificial intelligence and machine learning work at Apple, Steven Levy

The company, in general, has been a ‘black sheep’ in the sense that its approach to working with AI differs from the rest of the mighty four – acquisitions have not been the most important part of its AI-related initiatives. Apple leverages the assets it has within its walls, and if it is lacking the appropriate talent, it brings on board people who want to achieve results.

Speaking at Apple's quarterly earnings call, Apple CEO Tim Cook commented, "We have focused our AI efforts on the features that best enhance the customer experience. For example, machine learning enables Siri to understand words as well as the intent behind them. That means Siri does a better job understanding and even predicting what you want, then delivering the right responses to requests. <...> Machine learning is improving facial and image recognition in photos, predicting word choice while typing in messages and mail, and providing context awareness in maps for better directions. Deep learning within our products even enables them to recognize usage patterns and improve their own battery life. And most importantly, we deliver these intelligent services while protecting users' privacy. Most of the AI processing takes place on the device rather than being sent to the cloud."

One of the possible reasons Apple does not speak much of its work with AI is that it is right in front of over 100 million iPhone owners in the US – Apple’s devices are an embodiment of the company’s achievements in working with AI. The devices themselves have brains that are applied to make every interaction seamless and even to predict the user’s behavior. Apple’s AI efforts show up in the ability to identify a caller who isn’t in the contact list (but did email you recently), or in the shortlist of apps you are most likely to open next when you swipe on your screen.

Elena Mesropyan


Global Head of Content, MEDICI

Elena is a research professional with a background in social sciences and extensive experience in consumer behavior studies and marketing analytics. She is passionate about technologies enabling financial inclusion for underprivileged and vulnerable groups of the population around the world.