
How Small Companies Can Use Big Data To Grow And Improve

Jennifer Horowitz

Small businesses can cost-effectively analyze large data sets to improve their marketing, raise product quality, and deepen customer relationships. As Big Data becomes a key basis of competition, leaders in every business sector must learn how to harness it.

Big Data is for organizations of any size: data management has become a skill that separates today’s market leaders from those that are no longer influential. A mid-2014 report from Signals and Systems projected that the Big Data market will grow roughly 17% a year to reach $76 billion by 2020.

Technically, Big Data refers to technologies and initiatives involving data that is too massive for traditional skills, technologies, and infrastructure to address efficiently.

The phenomenon isn’t new: according to the Oxford English Dictionary, the term “information explosion” was first used in 1941, more than 70 years ago, in an early attempt to describe the growth in the volume of data.

Big Data was initially a unique resource available only to large corporations and statisticians. With the growth of the Internet, smartphones, wireless networks, sensors, social media, and other digital technologies, companies of all sizes are now able to leverage this trend.

As Big Data grows, managed service providers (MSPs) looking for new opportunities can offer their services to small and midsize businesses (SMBs). Markets and Markets predicts that third-party MSPs cut recurring in-house costs by 30-40% and can deliver efficiency improvements of as much as 60%. Meanwhile, small businesses face a big problem finding data storage as the number of devices and the volume of data they generate keep growing.

MSPs can expand their cloud services as SMBs look for bigger and better data storage alternatives. This means new growth and partnerships for MSPs that choose to expand their suite of services.

In addition to expanding storage options, MSPs can look to analytics performance and database management. By helping small businesses better evaluate their data, MSPs can provide a streamlined backup and recovery system that ensures data is not cluttered on a user’s mobile device.

Big Data leaders and laggards

A.T. Kearney, a global management consultancy, and Carnegie Mellon University investigated the corporate use of Big Data in their first-ever Leadership Excellence in Analytic Practices (LEAP) study, published in July/August 2014. They divided companies into four categories: leaders, explorers, followers, and laggards. Here’s what the leaders were doing with Big Data.

An inclusive atmosphere: This begins with a hands-on, dynamic policy of executive sponsorship and mindshare about Big Data. This fosters team-building, cross-functional collaboration, and company-wide confidence in data-driven methodologies.

The need for speed: Leaders used approaches that focused on rapid experimentation, mobilization, and deployment. This was primarily through pilot programs and proof-of-concept modeling.

Forward-thinking: These policies bred innovation, growth, and better operational efficiency. While Big Data was used for reporting on past efforts, leaders focused on future endeavors. They evaluated risks. They studied costs and benefits and balanced the tradeoffs between them. Then they charted a course.

Building on Big Data

According to the IBM Institute for Business Value, 26% of companies see returns from Big Data after six months, and 63% see returns after one year. Forty percent report using Big Data to solve operational challenges.

The world will become more and more reliant on data-driven metrics in the years to come, and businesses need to recognize that fact. Using the power of analytics can shift a company into high gear, while failing to do so could leave them stuck in neutral.

Want more strategies to help your business tap the power of analytics? See Top Five Big Data Challenges For CIOs.


Real-Time Data Transforms Political Journalism, But Context Remains Vital

John Graham

The run-up to the 2016 U.S. election is being covered in interesting new ways by the political media, with Big Data analysis and real-time opinion polling offering journalists much deeper insight than ever before. The trend of “data journalism” is peaking as the media embraces advanced technologies that allow it to deliver a new breed of numbers-driven, fact-based journalism.

The tools being used for data journalism open up possibilities for fresh perspectives, more in-depth reporting, and new stories behind the numbers that have never been seen before. Traditional journalists are beginning to see how data journalism can complement their reporting, and the U.S. election is serving as an ideal testing ground. Political reporters are lapping up the improved data literacy and access to objective analysis, which is helping to make their reports more thorough and informative.

Consequently, American voters are becoming digital voters. They have access to real-time, data-driven information and public sentiment, which is empowering them with broader insight. They’re relying on this to help them make up their minds before they cast their vote, and it’s given many voters a renewed interest in becoming informed citizens able to make an educated choice.

However, the rise of data-driven journalism brings with it a potential pitfall for media organizations and readers alike. Digital information overload will bring about a fatigue around numbers if reporting quantity becomes more highly valued than quality. Having access to mountains of data is a huge benefit, but a reporter still has to be a journalist first to ensure they’re not getting buried under the numbers and missing the stories.

In other words, a political journalist still needs to be a politico, not just a statistician. They could fall into the trap of placing too much importance on meaningless correlations as indicators of voter sentiment, losing their grasp on what made them a great political reporter in the first place. As data gets bigger, this will become harder to resist. So they need to become experts in making Big Data small—rather than obsessing over the numbers, obsessing over figuring out what they really mean. In doing that, they have an unprecedented opportunity to make people more informed rather than simply overwhelming them with a series of conflicting data sets.

Some media organizations are already tackling the challenge of remaining relevant in a world of information overload. Using big data and visualizations, they are making great strides in making data journalism more accessible to reporters, politicos, and voters, which is proving its worth in giving political reporting a new lease of life.

Reuters’ Polling Explorer tool is an example of how this is being done, offering up customizable data visualizations focusing on the biggest talking points in the U.S. leading up to the election. It’s an entirely new scale of public opinion measurement, presented in a way anyone can understand and use, while enabling Reuters to usher in its own improved brand of accurate, fact-based, and timely journalism.

We can see the true potential of using real-time data analysis to measure up-to-the-minute public opinion in one poll on the most important problem facing the US today. Immediately after the Paris attacks in November, terrorism skyrocketed way above the economy as the number-one issue, rising sharply again straight after the December San Bernardino attack. For Reuters, this is just one of many examples of their greatly increased ability to find outliers in the data.

Reuters Polling Explorer runs on SAP HANA, an in-memory data platform that allows Reuters to access and analyze 100 million survey responses for quicker and more efficient reporting of public opinion.

For more on data analytics in today’s media environment, see How Big Data Is Changing The News Industry.



About John Graham

John Graham is president of SAP Canada. Driving growth across SAP’s industry-leading cloud, mobile, and database solutions, he is helping more than 9,500 Canadian customers in 25 industries become best-run businesses.

Smart Machines Create Markets For Cyber-Physical Advances

Marion Heidenreich

Today, industrial machines are more intelligent than ever before. These intelligent machines are changing companies in many ways.

Why smart machines?

Mobile networked computers were a key breakthrough for making smart machines. Big Data allows machines and computers to store information and analyze complex patterns. Cloud computing offers broad access to information and more storage.

These computerized machines are both physical and virtual. Some call them “cyber-physical” machines. Technology lets them be self-aware and connected to each other and larger systems.

Businesses change their approaches

Intelligent machines allow companies to innovate in many areas. For one, the value proposition for customers is evolving. Businesses now model and plan in different ways in many industries.

Makers of industrial machines and parts work in new ways within the organization. Engineering now partners with mechanical, electronic, and software staff to develop new products. Manufacturing now seamlessly ties what happens on the shop floor to the customer.

Service models are changing too. Scheduled and reactive servicing of machines is fading. Now intelligent machines track themselves. Machines detect problems and report them automatically. Major problems or failures are predicted and reported.

A data mining example

One good industrial example is mining, which can be dangerous and difficult. As ores become scarcer, the cost of mining has increased.

“Smart machines” arrived in mining in the late 1990s. Software and hardware let remote users change settings. Operators could control hydraulic levers from a safe distance. Sensors observed performance and diagnosed issues.

Data cables connected machines to computers on the surface. Continuous and remote monitoring of the machines grew. Over time, embedded sensors helped improve monitoring, diagnostics, and data storage.

The technology means workers only go underground to fix specific issues. As a result, accident and injury risk is lower.

New wireless technology now lets mining companies connect data from many mine sites. Service centers access large amounts of data and can improve performance. Maintenance is prioritized and equipment downtime is reduced.

Opportunity abounds

For companies, the time is now. Today, mobile “connected things” generate 17% of the digital universe. By 2020 that share is expected to grow to 27%.

You might not be investing in this so-called “Internet of Things” (devices that connect to each other). But it’s a good bet your competitors are. A December 2015 study reported 33% of industrial companies are investing in the Internet of Things. Another 25% are considering it.

There are risks

This dawning era of manufacturing is exciting. But there are concerns. Cyber attacks on the Internet of Things are not new. But as the use of intelligent machines grows, so does the threat of cyber attacks on industry.

Data confidentiality and privacy are concerns. So too are software and hardware vulnerabilities. Exposure to attack lies not just in the virtual space but the physical too. Tampering with unattended machines and theft pose serious risk.

To address these threats, industries must invest in cybersecurity along with smart machines.

Conclusion

The potential advantages of smart machines are staggering. They can reshape industries and change how companies produce new products and create new markets.

For more information, please download the white paper Digital Manufacturing: Powering the Fourth Industrial Revolution.



About Marion Heidenreich

Marion Heidenreich is a solution manager for the SAP Industrial Machinery and Components Business Unit who focuses on solution innovations like Product Costing on SAP HANA and cloud solutions, as well as providing financial and business analysis for industry business strategy definition and business planning.

Unlock Your Digital Super Powers: How Digitization Helps Companies Be Live Businesses

Erik Marcade and Fawn Fitter

The Port of Hamburg handles 9 million cargo containers a year, making it one of the world’s busiest container ports. According to the Hamburg Port Authority (HPA), that volume doubled in the last decade, and it’s expected to at least double again in the next decade—but there’s no room to build new roads in the center of Hamburg, one of Germany’s historic cities. The port needed a way to move more freight more efficiently with the physical infrastructure it already has.

The answer, according to an article on ZDNet, was to digitize the processes of managing traffic into, within, and back out of the port. By deploying a combination of sensors, telematics systems, smart algorithms, and cloud data processing, the Port of Hamburg now collects and analyzes a vast amount of data about ship arrivals and delays, parking availability, ground traffic, active roadwork, and more. It generates a continuously updated model of current port conditions, then pushes the results through mobile apps to truck drivers, letting them know exactly when ships are ready to drop off or receive containers and optimizing their routes. According to the HPA, they are now on track to handle 25 million cargo containers a year by 2025 without further congestion or construction, helping shipping companies bring more goods and raw materials in less time to businesses and consumers all across Europe.

In the past, the port could only have solved its problem with backhoes and building permits—which, given the physical constraints, means the problem would have been unsolvable. Today, though, software and sensors are allowing it to improve processes and operations to a previously impossible extent. Big Data analysis, data mining, machine learning, artificial intelligence (AI), and other technologies have finally become sophisticated enough to identify patterns not just in terabytes but in petabytes of data, make decisions accordingly, and learn from the results, all in seconds. These technologies make it possible to digitize all kinds of business processes, helping organizations become more responsive to changing market conditions and more able to customize interactions to individual customer needs. Digitization also streamlines and automates these processes, freeing employees to focus on tasks that require a human touch, like developing innovative strategies or navigating office politics.

In short, digitizing business processes is key to ensuring that the business can deliver relevant, personalized responses to the market in real time. And that, in turn, is the foundation of the Live Business—a business able to coordinate multiple functions in order to respond to and even anticipate customer demand at any moment.

Some industries and organizations are on the verge of discovering how business process digitization can help them go live. Others have already started putting it into action: fine-tuning operations to an unprecedented level across departments and at every point in the supply chain, cutting costs while turbocharging productivity, and spotting trends and making decisions at speeds that can only be called superhuman.

Balancing Insight and Action

Two kinds of algorithms drive process digitization, says Chandran Saravana, senior director of advanced analytics at SAP. Edge algorithms operate at the point where customers or other end users interact directly with a sensor, application, or Internet-enabled device. These algorithms, such as speech or image recognition, focus on simplicity and accuracy. They make decisions based primarily on their ability to interpret input with precision and then deliver a result in real time.

Edge algorithms work in tandem with, and sometimes mature into, server-level algorithms, which report on both the results of data analysis and the analytical process itself. For example, the complex systems that generate credit scores assess how creditworthy an individual is, but they also explain to both the lender and the credit applicant why a score is low or high, what factors went into calculating it, and what an applicant can do to raise the score in the future. These server-based algorithms gather data from edge algorithms, learn from their own results, and become more accurate through continuous feedback. The business can then track the results over time to understand how well the digitized process is performing and how to improve it.
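To make that division of labor concrete, here is a minimal, hypothetical Python sketch. It is not SAP's architecture or any production credit-scoring system; the class names, weights, and update rule are invented. An edge scorer returns a fast decision on each interaction, while a server-level component records outcomes, explains which factors drove a score, and periodically refits the edge weights from feedback.

```python
# Hypothetical sketch of edge vs. server-level algorithms (illustration only).
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class EdgeScorer:
    """Runs at the point of interaction: fast, simple, returns a result immediately."""
    weights: Dict[str, float]
    threshold: float = 0.5

    def score(self, features: Dict[str, float]) -> float:
        # Simple weighted sum, clamped to [0, 1]; prioritizes speed and simplicity.
        raw = sum(self.weights.get(k, 0.0) * v for k, v in features.items())
        return max(0.0, min(1.0, raw))

    def decide(self, features: Dict[str, float]) -> bool:
        return self.score(features) >= self.threshold

@dataclass
class ServerModel:
    """Aggregates edge results, explains them, and learns from feedback over time."""
    history: List[Dict] = field(default_factory=list)

    def record(self, features: Dict[str, float], score: float, outcome: bool) -> None:
        self.history.append({"features": features, "score": score, "outcome": outcome})

    def explain(self, scorer: EdgeScorer, features: Dict[str, float]) -> Dict[str, float]:
        # Which factors contributed most to this score (and could raise or lower it)?
        return {k: scorer.weights.get(k, 0.0) * v for k, v in features.items()}

    def refit(self, scorer: EdgeScorer, lr: float = 0.05) -> None:
        # Nudge weights toward observed outcomes; a stand-in for real retraining.
        for rec in self.history:
            error = (1.0 if rec["outcome"] else 0.0) - rec["score"]
            for k, v in rec["features"].items():
                scorer.weights[k] = scorer.weights.get(k, 0.0) + lr * error * v

edge = EdgeScorer(weights={"on_time_payments": 0.6, "utilization": -0.3})
server = ServerModel()
applicant = {"on_time_payments": 0.9, "utilization": 0.4}
s = edge.score(applicant)
server.record(applicant, s, outcome=True)   # feedback arrives later
print(edge.decide(applicant), server.explain(edge, applicant))
server.refit(edge)
```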

From Data Scarcity to a Glut

To operate in real time, businesses need an accurate data model that compares what’s already known about a situation to what’s happened in similar situations in the past to reach a lightning-fast conclusion about what’s most likely to happen next. The greatest barrier to this level of responsiveness used to be a lack of data, but the exponential growth of data volumes in the last decade has flipped this problem on its head. Today, the big challenge for companies is having too much data and not enough time or power to process it, says Saravana.

Even the smartest human is incapable of gathering all the data about a given situation, never mind considering all the possible outcomes. Nor can a human mind reach conclusions at the speed necessary to drive Live Business. On the other hand, carefully crafted algorithms can process terabytes or even petabytes of data, analyze patterns and detect outliers, arrive at a decision in seconds or less—and even learn from their mistakes (see How to Train Your Algorithm).

How to Train Your Algorithm 

The data that feeds process digitization can’t just simmer.
It needs constant stirring.

Successfully digitizing a business process requires you to build a model of the business process based on existing data. For example, a bank creates a customer record that includes not just the customer’s name, address, and date of birth but also the amount and date of the first deposit, the type of account, and so forth. Over time, as the customer develops a history with the bank and the bank introduces new products and services, customer records expand to include more data. Predictive analytics can then extrapolate from these records to reach conclusions about new customers, such as calculating the likelihood that someone who just opened a money market account with a large balance will apply for a mortgage in the next year.
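As a rough illustration of that kind of extrapolation (the field names, figures, and model choice are invented for the example, not drawn from any real bank), a simple classifier could estimate the likelihood that a new customer applies for a mortgage based on fields in the customer record:

```python
# Illustrative only: predicting mortgage applications from simple customer-record fields.
# Data and feature names are made up for the example.
import pandas as pd
from sklearn.linear_model import LogisticRegression

records = pd.DataFrame({
    "first_deposit_amount":  [500, 25000, 1200, 80000, 300, 40000],
    "months_as_customer":    [3,   48,    12,   60,    1,   24],
    "has_money_market_acct": [0,   1,     0,    1,     0,   1],
    "applied_for_mortgage":  [0,   1,     0,    1,     0,   1],   # historical outcome
})

X = records[["first_deposit_amount", "months_as_customer", "has_money_market_acct"]]
y = records["applied_for_mortgage"]
model = LogisticRegression().fit(X, y)

# A brand-new customer who just opened a money market account with a large balance:
new_customer = pd.DataFrame([{
    "first_deposit_amount": 60000,
    "months_as_customer": 0,
    "has_money_market_acct": 1,
}])
likelihood = model.predict_proba(new_customer)[0, 1]
print(f"Estimated probability of a mortgage application this year: {likelihood:.0%}")
```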

To keep data models accurate, you have to have enough data to ensure that your models are complete—that is, that they account for every possible predictable outcome. The model also has to push outlying data and exceptions, which create unpredictable outcomes, to human beings who can address their special circumstances. For example, an algorithm may be able to determine that a delivery will fail to show up as scheduled and can point to the most likely reasons why, but it can only do that based on the data it can access. It may take a human to start the process of locating the misdirected shipment, expediting a replacement, and establishing what went wrong by using business knowledge not yet included in the data model.

Indeed, data models need to be monitored for relevance. Whenever the results of a predictive model start to drift significantly from expectations, it’s time to examine the model to determine whether you need to dump old data that no longer reflects your customer base, add a new product or subtract a defunct one, or include a new variable, such as marital status or length of customer relationship, that further refines your results.
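One simple, hypothetical way to operationalize that monitoring (the metric, threshold, and sample numbers below are invented for illustration) is to compare the model's recent hit rate against its historical baseline and flag it for review when the gap grows too large:

```python
# Illustrative drift check: flag a predictive model for review when recent
# accuracy drifts too far from its historical baseline. Numbers are invented.

def drift_check(baseline_accuracy: float,
                recent_predictions: list,
                recent_outcomes: list,
                tolerance: float = 0.10) -> bool:
    """Return True if the model should be re-examined (drift suspected)."""
    hits = sum(p == o for p, o in zip(recent_predictions, recent_outcomes))
    recent_accuracy = hits / len(recent_outcomes)
    return (baseline_accuracy - recent_accuracy) > tolerance

# The model historically got ~85% of predictions right; the latest batch looks worse.
preds   = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
actuals = [0, 0, 1, 0, 0, 1, 1, 0, 0, 1]
if drift_check(0.85, preds, actuals):
    print("Drift suspected: revisit stale data, missing products, or new variables.")
```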

It’s also important to remember that data doesn’t need to be perfect—and, in fact, probably shouldn’t be, no matter what you might have heard about the difficulty of starting predictive analytics with lower-quality data. To train an optical character recognition system to recognize and read handwriting in real time, for example, your data stores of block printing and cursive writing samples also have to include a few sloppy scrawls so the system can learn to decode them.

On the other hand, in a fast-changing marketplace, all the products and services in your database need consistent and unchanging references, even though outside the database, names, SKUs, and other identifiers for a single item may vary from one month or one order to the next. Without consistency, your business process model won’t be accurate, nor will the results.

Finally, when you’re using algorithms to generate recommendations to drive your business process, the process needs to include opportunities to test new messages and products against existing successful ones as well as against random offerings, Saravana says. Otherwise, instead of responding to your customers’ needs, your automated system will actually control their choices by presenting them with only a limited group of options drawn from those that have already received the most positive results.

Any process is only as good as it’s been designed to be. Digitizing business processes doesn’t eliminate the possibility of mistakes and problems, but it does ensure that the mistakes and problems that arise are easy to spot and fix.

From Waste to Gold

Organizations moving to digitize and streamline core processes are even discovering new business opportunities and building new digitized models around them. That’s what happened at Hopper, an airfare prediction app firm in Cambridge, Massachusetts, which discovered in 2013 that it could mine its archives of billions of itineraries to spot historical trends in airfare pricing—data that was previously considered “waste product,” according to Hopper’s chief data scientist, Patrick Surry.

Hopper developed AI algorithms to correlate those past trends with current fares and to predict whether and when the price of any given flight was likely to rise or fall. The results were so accurate that Hopper jettisoned its previous business model. “We check up to 3 billion itineraries live, in real time, each day, then compare them to the last three to four years of historical airfare data,” Surry says. “When consumers ask our smartphone app whether they should buy now or wait, we can tell them, ‘yes, that’s a good deal, buy it now,’ or ‘no, we think that fare is too expensive, we predict it will drop, and we’ll alert you when it does.’ And we can give them that answer in less than one second.”


While trying to predict airfare trends is nothing new, Hopper has told TechCrunch that it can not only save users up to 40% on airfares but it can also find them the lowest possible price 95% of the time. Surry says that’s all due to Hopper’s algorithms and data models.
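Hopper has not published its models, but the basic shape of such a buy-or-wait recommendation can be sketched with a toy rule: compare the quoted fare to the distribution of historical fares for the same route and travel window, and advise buying only if the quote sits in the cheap tail. Everything below (the function name, the 20th-percentile cutoff, the sample fares) is invented for illustration and is not Hopper's actual algorithm.

```python
# Toy buy-or-wait rule in the spirit of airfare prediction; not Hopper's actual model.
from statistics import quantiles

def buy_or_wait(quoted_fare: float, historical_fares: list,
                good_deal_percentile: int = 20) -> str:
    """Advise 'buy' if the quote is cheaper than most comparable historical fares."""
    # quantiles(..., n=100) returns the 1st..99th percentile cut points.
    cutpoints = quantiles(historical_fares, n=100)
    good_deal_cutoff = cutpoints[good_deal_percentile - 1]
    if quoted_fare <= good_deal_cutoff:
        return f"Buy now: ${quoted_fare:.0f} is in the cheapest {good_deal_percentile}% of past fares."
    return f"Wait: ${quoted_fare:.0f} looks expensive; we'll alert you if it drops."

# Several years of (made-up) fares for the same route and travel window:
past_fares = [310, 290, 355, 410, 275, 330, 505, 365, 298, 342, 387, 269, 450, 320, 305]
print(buy_or_wait(285.0, past_fares))
print(buy_or_wait(430.0, past_fares))
```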

The Hopper app launched on iOS in January 2015 and on Android eight months later. The company also switched in September 2015 from directing customers to external travel agencies to taking bookings directly through the app for a small fee. The Hopper app has already been downloaded to more than 2 million phones worldwide.

Surry predicts that we’ll soon see sophisticated chatbots that can start with vague requests from customers like “I want to go somewhere warm in February for less than $500,” proceed to ask questions that help users narrow their options, and finally book a trip that meets all their desired parameters. Eventually, he says, these chatbots will be able to handle millions of interactions simultaneously, allowing a wide variety of companies to reassign human call center agents to the handling of high-value transactions and exceptions to the rules built into the digitized booking process.

Port of Hamburg Lets the Machines Untangle Complexity

In early 2015, AI experts told Wired magazine that at least another 10 years would pass before a computer could best the top human players at Go, an ancient game that’s exponentially harder than chess. Yet before the end of that same year, Wired also reported that machine learning techniques drove Google’s AlphaGo AI to win four games out of five against one of the world’s top Go players. This feat proves just how good algorithms have become at managing extremely complex situations with multiple interdependent choices, Saravana points out.

The Port of Hamburg, which has digitized traffic management for an estimated 40,000 trucks a day, is a good example. In the past, truck drivers had to show up at the port to check traffic and parking message boards. If they arrived before their ships docked, they had to drive around or park in the neighboring residential area, contributing to congestion and air pollution while they waited to load or unload. Today, the HPA’s smartPORT mobile app tracks individual trucks using telematics. It customizes the information that drivers receive based on location and optimizes truck routes and parking in real time so drivers can make more stops a day with less wasted time and fuel.

The platform that drives the smartPORT app also uses sensor data in other ways: it tracks wind speed and direction and transmits the data to ship pilots so they can navigate in and out of the port more safely. It monitors emissions and their impact on air quality in various locations in order to adjust operations in real time for better control over environmental impact. It automatically activates streetlights for vehicle and pedestrian traffic, then switches them off again to save energy when the road is empty. This ability to coordinate and optimize multiple business functions on the fly makes the Port of Hamburg a textbook example of a Live Business.

Digitization Is Not Bounded by Industry

Other retail and B2B businesses of all types will inevitably join the Port of Hamburg in further digitizing processes, both in predictable ways and in those we can only begin to imagine.

Customer service, for example, is likely to be in the vanguard. Automated systems already feed information about customers to online and phone-based service representatives in real time, generate cross-selling and upselling opportunities based on past transactions, and answer customers’ frequently asked questions. Saravana foresees these systems becoming even more sophisticated, powered by AI algorithms that are virtually indistinguishable from human customer service agents in their ability to handle complex live interactions in real time.

In manufacturing and IT, Sven Bauszus, global vice president and general manager for predictive analytics at SAP, forecasts that sensors and predictive analysis will further automate the process of scheduling and performing maintenance, such as monitoring equipment for signs of failure in real time, predicting when parts or entire machines will need replacement, and even ordering replacements preemptively. Similarly, combining AI, sensors, data mining, and other technologies will enable factories to optimize workforce assignments in real time based on past trends, current orders, and changing market conditions.

Public health will be able to go live with technology that spots outbreaks of infectious disease, determines where medical professionals and support personnel are needed most and how many to send, and helps ensure that they arrive quickly with the right medication and equipment to treat patients and eradicate the root cause. It will also make it easier to track communicable illnesses, find people who are symptomatic, and recommend approaches to controlling the spread of the illness, Bauszus says.

He also predicts that the insurance industry, which has already begun to digitize its claims-handling processes, will refine its ability to sort through more claims in less time with greater accuracy and higher customer satisfaction. Algorithms will be better and faster at flagging claims that have a high probability of being fraudulent and then pushing them to claims inspectors for investigation. Simultaneously, the same technology will be able to identify and resolve valid claims in real time, possibly even cutting a check or depositing money directly into the insured person’s bank account within minutes.

Financial services firms will be able to apply machine learning, data mining, and AI to accelerate the process of rating borrowers’ credit and detecting fraud. Instead of filling out a detailed application, consumers might be able to get on-the-spot approval for a credit card or loan after inputting only enough information to be identified. Similarly, banks will be able to alert customers to suspicious transactions by text message or phone call—not within a day or an hour, as is common now, but in a minute or less.

Pitfalls and Possibilities

As intelligent as business processes can be programmed to be, there will always be a point beyond which they have to be supervised. Indeed, Saravana forecasts increasing regulation around when business processes can and can’t be digitized. Especially in areas involving data security, physical security, and health and safety, it’s one thing to allow machines to parse data and arrive at decisions to drive a critical business process, but it’s another thing entirely to allow them to act on those decisions without human oversight.

Automated, impersonal decision making is fine for supply chain automation, demand forecasting, inventory management, and other processes that need faster-than-human response times. In human-facing interactions, though, Saravana insists that it’s still best to digitize the part of the process that generates decisions, but leave it to a human to finalize the decision and decide how to put it into action.

“Any time the interaction is machine-to-machine, you don’t need a human to slow the process down,” he says. “But when the interaction involves a person, it’s much more tricky, because people have preferences, tastes, the ability to try something different, the ability to get fatigued—people are only statistically predictable.”

For example, technology has made it entirely possible to build a corporate security system that can gather information from cameras, sensors, voice recognition technology, and other IP-enabled devices. The system can then feed that information in a steady stream to an algorithm designed to identify potentially suspicious activity and act in real time to prevent or stop it while alerting the authorities. But what happens when an executive stays in the office unusually late to work on a presentation and the security system misidentifies her as an unauthorized intruder? What if the algorithm decides to lock the emergency exits, shut down the executive’s network access, or disable her with a Taser instead of simply sending an alert to the head of security asking what to do while waiting for the police to come?

The Risk Is Doing Nothing

The greater, if less dramatic, risk associated with digitizing business processes is simply failing to pursue it. It’s true that taking advantage of new digital technologies can be costly in the short term. There’s no question that companies have to invest in hardware, software, and qualified staff in order to prepare enormous data volumes for storage and analysis. They also have to implement new data sources such as sensors or Internet-connected devices, develop data models, and create and test algorithms to drive business processes that are currently analog. But as with any new technology, Saravana advises, it’s better to start small with a key use case, rack up a quick win with high ROI, and expand gradually than to drag your heels out of a failure to grasp the long-term potential.

The economy is digitizing rapidly, but not evenly. According to the McKinsey Global Institute’s December 2015 Digital America report, “The race to keep up with technology and put it to the most effective business use is producing digital ‘haves’ and ‘have-mores’—and the large, persistent gap between them is becoming a decisive factor in competition across the economy.” Companies that want to be among the have-mores need to commit to Live Business today. Failing to explore it now will put them on the wrong side of the gap and, in the long run, rack up a high price tag in unrealized efficiencies and missed opportunities.



About Erik Marcade

Erik Marcade is vice president of Advanced Analytics Products at SAP.


5 Things Pokémon Go Taught Me About The Future Of Marketing

Madelyn Bayer

In case you haven’t been outside lately, there is a game taking over the millennial world right now – it’s called Pokémon Go.

Pokémon Go is a mobile app that you can download for iOS or Android. It’s free to download and play, but you have the option to use real money to buy in-game currency called PokéCoins. PokéCoins are used to purchase Pokéballs, the in-game item you need to catch Pokémon. The game uses your phone’s GPS to obtain your real-world location and augmented reality to bring up Pokémon characters on your screen, placing them on top of what you see in front of you. You—the digital you—can be customised with clothing, a faction (a “team” of players you can join), and other options, and you level up as you play.

On the surface, it’s a fun mobile game whose popularity is as intriguing as it is entertaining, but the superficial fun of the app has led to some real results: Nintendo’s valuation has increased by an estimated $7.5 billion thanks to the game.

With results like that, this app is more than just a game; it may open up a whole new realm of digital marketing. So I researched some of the key marketing lessons from Pokémon Go.

  1. Keep it small and simple. Gone are the days of needing to invest in large ad campaigns and advertising budgets. How many ads did we see leading up to the Pokémon Go launch? Very few. Pokémon Go didn’t invest much into advertising because it didn’t need it – either the ad executives in charge knew that the success of the app would be dependent on the marketing and viral factors listed here, or they didn’t expect the app to be a breakout hit. Regardless, the bottom line is that you don’t need a massive advertising budget to be a great marketer; you just need to be able to connect with people. Simplicity is key: Well-designed websites, e-commerce platforms, apps, and products should welcome new users and make it extremely easy for all to get involved (a lesson learned from breakout social media apps like Instagram and Snapchat).
  2. Have an agile digital platform. If you don’t have an agile digital marketing platform, you will miss the boat. This lesson has been proven time and time again in today’s digital world. The marketing game changes faster than most brands can keep up with – but being able to react quickly to trends like this is essential. Failing fast, minimum viable product, and agile: These are fast becoming key phrases in marketing teams’ vocabulary. Whether you are launching a social campaign, a consumer app, or a large-scale marketing operation, you must be able to stand it up quickly, test it, iterate on it, and send it out quickly.
  3. Loyalty is everything. If you want to increase customer loyalty, you must reward your users for continuing to invest in your product. Pokémon Go players get bonuses and incentives for levelling up, taking on gyms, catching new Pokémon, and even walking. The thrill of finding a rare Pokémon or winning an intense battle is enough to keep users yearning for more, even through the less-active parts of the game. There are definite rewards for continued investment, and that’s what keeps users playing—sometimes at the expense of productivity. When I think of the apps I know and love, this feature is nothing new, but it is very important. Gamification and loyalty are what keep me checking in on the highly addictive Air New Zealand app, for example, tuning in each Tuesday for the reverse auctions to grab flight seats. Creating an individualised offering for every consumer is a hot trend for retailers right now, and it may also be part of the lessons learned from Pokémon Go.
  4. Appeal to the new generation of augmented-reality and virtual-reality natives. Just as Gen Y are considered digital natives because they grew up with Internet access, the emerging Gen Z will be known as AR and VR natives – what feels new to us now will be the new normal for kids growing up today. That’s not to say every brand should jump on the AR or VR bandwagon. But learn from what this game has taught us: Why is this game taking over the world? What insights can be adapted to generate positive brand engagement? We have evolved past the age of disruptive placement and are now in an era of behavioral targeting. One of the biggest challenges retailers face is knowing where their customers are at any given point in time. How do they reward their customers at the point of sale? Could the next wave of retail disruption be the gamification of shopping in virtual reality?
  5. Privacy vs. Personalisation. That old chestnut. According to the SAP New Zealand Digital Experience Report 2016, New Zealanders rated receiving relevant offers without having their privacy infringed among the most important attributes of a satisfying digital experience. This is interesting considering the backlash over the data Niantic actively collects on Pokémon Go users. It seems this hasn’t deterred users too much; the explanation may lie deeper in the New Zealand Digital Experience Report research.

Arguably, Pokémon Go ticks all the boxes when we look at the consumer-rated digital experience attributes identified in the report – though there may be one exception if we consider the recent user safety horror stories that are starting to come out.

So what has all this taught us? It links back to the report: The better the digital experience – defined by those attributes – the happier consumers are to give up their data. The report’s figures show consumers’ willingness to give up certain personal information depending on whether or not they have had a satisfactory digital experience. As we all know, data, or information, is the currency of the future, and lessons like these offer important takeaways for all digital marketers looking to gain real consumer insights and preferences.

If you haven’t already given Pokémon Go a go, see what all the fuss is about. Whether the game is a passing fad or the next trend in digital marketing remains to be seen, but it offers some interesting ideas to consider before you launch your next campaign to consumers.

For more insight on where marketing is headed, see MarTech: The Future Of Digital Marketing.



About Madelyn Bayer

In my role as an Industry Value Associate at SAP Australia and New Zealand, I help organisations calculate and realise the value that new systems and technology will have on their operations. My role covers industries spanning utilities, public sector, consumer products, and retail, with a specific focus on customer engagement and commerce solutions. Through this role I have developed a strong understanding of megatrends, cloud computing, enterprise software, the networked economy, the Internet of Things, millennials, and digital consumers. I am particularly passionate about creating sustainable solutions to world problems through technology.