Machine Learning To Understand Every User And Anticipate Their Needs

Machine learning empowers computers with the ability to learn for themselves, as opposed to being explicitly programmed. In other words, computers infer patterns from data instead of just following hand-written rules.

The ideas have been around for 60 years or so, but it’s safe to say that machine learning is presently a hot topic, and is having a big impact on mobile-first businesses. Nowhere was this more apparent than at this year’s Apple Worldwide Developers Conference (WWDC 2017).

It might be worth pausing to reflect on why that is the case. Put simply, in this new era of data-driven marketing, we quickly reach a point (indeed, we have reached it already) at which there is simply too much data for even the smartest, most sophisticated and hardworking human beings to process. And it's certainly not possible for that data to be processed, analyzed and acted upon in anything close to real time.

Enter machine learning. By letting the machine do the thinking, organizations can create and deliver thoughtful, relevant campaigns - and finally deliver on the value embedded in the billions of data events collected each day.

Lessons From WWDC

One of the main themes throughout WWDC 2017 was the further development of on-device machine learning to anticipate customer needs - and, just as importantly, to surface the relevant information or interaction to meet those needs. This represents a significant change in marketing: the move from responding to demand to anticipating it, and machine learning will be at the forefront of that transition.

To this end, Apple is adding to its machine learning framework, Core ML, in two ways:

First, developers will have direct access to the GPU to run their own machine-learning models on the device to make apps more intelligent.

Second, Apple will make available pre-built machine-learning models. These include real-time image recognition, text prediction, sentiment analysis, face detection, handwriting detection, emotion detection, and entity recognition.

WWDC Machine Learning

All good stuff, but the larger point is that machine learning is now just an API call, and is available to all. It will inevitably make user experiences more prescient, and apps more intelligent.
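To make that "just an API call" idea concrete, here is a deliberately tiny sentiment-analysis sketch in Python. This is not Core ML itself (which is consumed from Swift or Objective-C on the device); it is a language-agnostic toy, and every name in it is hypothetical - but the calling pattern is the point: the caller passes in raw data and gets a prediction back from a single function call.

```python
# Toy sketch of "sentiment analysis as an API call". All names are
# hypothetical; a real pre-built model would be far more sophisticated.
POSITIVE = {"great", "love", "excellent", "happy", "wonderful"}
NEGATIVE = {"bad", "hate", "terrible", "sad", "awful"}

def predict_sentiment(text: str) -> str:
    """Classify text by counting hits against tiny positive/negative lexicons."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(predict_sentiment("I love this great app"))  # positive
```

From the app developer's point of view, the pre-built Core ML models work the same way: the model's internals are opaque, and intelligence arrives as a function call.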

As an example, see how Amazon deploys image recognition to simplify purchasing. Note that this is for purchases on Amazon via the app, and not in-store where the photo was likely taken. One consequence of the rise in machine learning will be this increased concentration of sophisticated user experiences within the extended app environment. Pinterest, Google and, more recently, eBay have introduced similar visual search tools to drive purchases.

Another illustration is the new Siri-powered watch face, which uses machine learning to customize its content in real time throughout the day, including reminders, traffic information, upcoming meetings, news, smart home controls and more. This is a perfect example of the information and interactions customers need and want, delivered right when they need them. And that is the ultimate promise of machine learning.

The good news is that Swrve does machine learning today. For instance, push notifications can be optimally timed so that each one is sent at the time that individual user has typically interacted with your app - all through machine learning. The results speak for themselves, with customers increasing engagement by 30-50%.
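The core idea behind per-user send-time optimization can be sketched in a few lines of Python. To be clear, Swrve's actual model is not public, and every name below is hypothetical; this is a minimal illustration assuming the simplest possible approach - pick the hour of day at which a given user has most often opened past notifications.

```python
from collections import Counter

def best_send_hour(open_hours, default=18):
    """Return the hour (0-23) at which this user most often engaged with
    past notifications, falling back to a default when there is no history.

    `open_hours` is a list of hours extracted from the user's historical
    engagement events. (Hypothetical sketch, not Swrve's actual model.)
    """
    if not open_hours:
        return default
    counts = Counter(open_hours)
    # most_common(1) returns [(hour, count)] for the top-scoring hour
    return counts.most_common(1)[0][0]

# Example: a user who mostly engages in the evening
print(best_send_hour([21, 8, 21, 22, 21, 13]))  # 21
```

A production system would go further - smoothing sparse histories with population-level priors, handling time zones, and updating as behavior shifts - but the per-user histogram is the essential building block.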

To find out how Swrve can help you anticipate your customers' needs, have a chat with us or check out our Blueprints.