Q4 2018

Can You Keep a (Data) Secret?

The era of data ethics is upon us—and only the companies with the highest standards will win over global customers.

The web is worldwide, but data privacy depends very much upon where you live, a political holdover from pre-digital times that has demarcated the globe into privacy haves and have-nots. In the former East Germany, for example, the secret police, known as the Stasi, kept detailed files on 4 million of its own citizens (out of a total population of 17 million), as well as on 2 million West Germans.

The Stasi, and its even more sinister forebear, the Gestapo, have made Germans extremely protective of their privacy. No wonder, then, that the world’s first data protection law came into effect in (then West) Germany in 1978.

Germans long ago came to realize that their personal safety and self-determination include their data. Consumers around the world are slowly coming around to the same belief, as our global dependence on distributed computing forces a worldwide reckoning over privacy and data ethics.

Consumers now see that many “we protect your data” statements aren’t designed with their welfare in mind. More than 1 billion records have already been exposed in data breaches this year, compared with 55 million for all of 2005.

As business increasingly crosses borders, the companies with the highest degree of data ethics will gain a competitive advantage. The first step toward that goal is instituting data ethics policies that satisfy customer expectations without dampening innovation.

On your marks, get set, go.

A Change Years in the Making

In the years following the passage of Germany’s 1978 data privacy law, few other countries followed suit—making Germany the model island of privacy in an otherwise no-holds-barred data collection world.

It could have continued that way for the foreseeable future if not for a series of data attacks, breaches, and intentional data harvesting that didn’t sit well with consumers, consumer advocates, and, eventually, regulators.

But of course, the regulators would have had little sway were it not for a major shift in customer thinking about privacy. Consumers are better informed now and savvier about security, privacy, and data rights; they are demanding more transparency and accountability and weighing the trade-offs between usability and risk.

Part of this has been prompted by the advent of products like Internet of Things–enabled wearables. These products have helped make consumers more aware of the risks of data collection and of how their personal data has been, or can be, used. Now corporate customers are thinking along the same lines when considering products in the B2B world.

The market is paying attention—no company wants to be the poster child for privacy neglect. The pendulum has swung toward privacy, and companies must now show customers that they’re taking it seriously.

Privacy as a Competitive Advantage

Privacy begins with trust. Trust is now the most powerful digital currency, and it powers today’s economy. Its constituent elements are data protection and privacy. Together, they must be part of every company’s operational DNA.

The General Data Protection Regulation (GDPR) has thrown a spotlight on global privacy and forced companies to reevaluate their data trail because the regulation is backed up with firepower in the form of penalties—fines of up to €20 million or up to 4% of annual global revenue, whichever is greater.
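
The penalty ceiling is simply the greater of two numbers, and a short calculation makes it concrete. Below is a minimal sketch in Python; the €1 billion revenue figure is a hypothetical illustration, not drawn from any company mentioned here.

    # Minimal sketch of the GDPR fine ceiling described above.
    # The revenue figure below is a hypothetical example.
    def max_gdpr_fine_eur(annual_global_revenue_eur: float) -> float:
        """Upper bound of a GDPR fine: EUR 20 million or 4% of
        annual global revenue, whichever is greater."""
        return max(20_000_000.0, 0.04 * annual_global_revenue_eur)

    # For a company with EUR 1 billion in annual global revenue,
    # the 4% rule dominates: the ceiling is EUR 40 million.
    print(max_gdpr_fine_eur(1_000_000_000))  # 40000000.0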

Depending on their security maturity, companies have found the GDPR either an easy adaptation or a heavy burden. The regulation has also prompted non-EU countries and their citizens to take a serious look at their own data protection laws. Although the GDPR is a European law, it affects more than just Europe-based companies and EU citizens. If your competitor is GDPR compliant and you’re not, guess who’s going to win business?

The GDPR is, in effect, very close to a certification (more on certifications later). Companies that are GDPR compliant show that they take privacy and security seriously and that they have privacy policies accessible to and understandable by the general public, so that customers (not just the in-house legal department) know what they’re signing up for.

Companies earn trust by being transparent. Every company needs a “trust model.” There are some basic principles for developing such a model. They include incorporating security into software development from the very beginning, creating strict rules when it comes to who can access data and for what reason, and installing a program of consistent monitoring.
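
To make the access-rules and monitoring principles concrete, here is a minimal Python sketch of purpose-based access control with a built-in audit log. The roles, data categories, and purposes are hypothetical assumptions for illustration, not a prescribed design.

    from datetime import datetime, timezone

    # Illustrative policy only: the roles, data categories, and
    # purposes here are hypothetical examples.
    ACCESS_POLICY = {
        ("support_agent", "contact_info"): {"resolve_ticket"},
        ("analyst", "usage_stats"): {"product_improvement"},
    }

    AUDIT_LOG = []  # consistent monitoring: every decision is recorded

    def request_access(role: str, data_category: str, purpose: str) -> bool:
        """Grant access only if this role may touch this data for this purpose."""
        granted = purpose in ACCESS_POLICY.get((role, data_category), set())
        AUDIT_LOG.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "role": role,
            "data": data_category,
            "purpose": purpose,
            "granted": granted,
        })
        return granted

    # A support agent may read contact info to resolve a ticket...
    assert request_access("support_agent", "contact_info", "resolve_ticket")
    # ...but not for marketing, and the denial is logged for review.
    assert not request_access("support_agent", "contact_info", "marketing")

Every decision, granted or denied, lands in the audit log, which is what makes consistent monitoring possible.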

Toward an Ethical Foundation

Codes of ethics for data already exist, like that of the Association for Computing Machinery, which updated its code last summer. “The question is whether those high-level statements and principles actually provide meaningful guidance because they’re often difficult to translate to day-to-day thinking as technology products are being built,” says Solon Barocas, assistant professor in the department of information science at Cornell University in New York. “But high-level statements have value because they signal that the community itself values these things.”

Barocas teaches a data science ethics course, one of a wave of such courses that have recently appeared at universities across the United States. The core idea is to train the next generation of data scientists to incorporate these principles into their work from the get-go. Recognizing a competitive edge, globally ambitious companies will look for this kind of training when hiring data scientists.

The tech industry has existing practices and policies, Barocas says, but they haven’t been particularly well conveyed outside of the industry. That’s beginning to change, however (see How to Create a Data Ethics Policy).

How to Create a Data Ethics Policy

Data has become a tool for building trust with customers.

What does a data ethics policy look like? Here are six categories, adapted from the guidelines created by dataethics.eu.

Human first. Data is always borrowed, never owned; individuals’ interests, rights, and well-being are prioritized; individuals benefit from companies’ use of their data; and systems emphasize privacy by design.

Individual control. Individuals have primary control over, and should be fully aware of, how their data is collected, used, and kept.

Transparency. Companies must be transparent in how and where data is stored and must be able and willing to explain the artificial intelligence and algorithms they use.

Behavioral design. Companies must not try to influence customers’ behavior in ways that are not beneficial to their interests.

Accountability. Responsibility for protecting the personal data that informs data processing should extend to all business agents involved with the data.

Equality. Companies must pay attention to ensure that data is used without bias.


Done right, data ethics isn’t anti-competitive but is a competitive advantage, says Pernille Tranberg, co-founder of Copenhagen-based dataethics.eu and co-author of the book Data Ethics: The New Competitive Advantage.

But we should beware of “ethics washing” and “privacy washing,” she says. Think back to the early days of environmental sustainability, when some companies took serious steps to reduce their ecological impact while plenty of others were accused of “greenwashing,” not backing up the talk with action. This is a real danger, and it’s already happening, says Tranberg.

Companies that are transparent and that give people enough reasons to trust them will thrive. Those that don’t do this will have a much harder time in the future.

One.Thing.Less co-founder James Aschberger

The answer is to develop third-party certification and verification schemes like those created for the environmental movement. “A lot of companies say, ‘We really anonymize data, we can’t go back and identify you,’” says Tranberg. “That’s very cool if they really do it, but they need to have somebody to check that independently.”

Companies are also recognizing the value in independent certifications for data ethics and privacy, says Tranberg. “I already see some companies that are asking for it.”

She thinks it’s likely we’ll start out with many small certification schemes; as these mature and grow into larger organizations, governments will need to become involved. One example is the European Privacy Seal (EuroPriSe), a privacy certification for IT products. The (surprise) German organization was started in 2009 as an EU-funded project and is now a global certification.

You Can Take It with You

But it’s not enough to assure customers that their data is being treated with kid gloves, because soon customers will expect to be able to take their data wherever they want. Article 20 of the GDPR outlines the rules for data portability (the ability to take data from one platform to another).

Although many big companies that depend on data might not appreciate Article 20, it’s intended to increase competition by opening the field to other platforms. It also gives consumers the power to express their displeasure with a particular company by taking their data away.

Achieving portability could spark new ideas about data monetization, and startups are already appearing with products to do just that. Several new models are emerging (see New Models for Data Management).

New Models for Data Management

With privacy comes responsibility. Will customers be able to handle it?

Driven by opportunity, outrage, or concern—or all three—there are plenty of ideas that aim to change how data is managed and monetized. If any of these ideas take off, we’ll experience a massive change in the data economy.

Personal data brokers. Think financial adviser but for personal data. Upending the passive role of the consumer, the (maybe) up-and-coming industry promises to help consumers manage and monetize their data.

Self-sovereign identity. First conceived in the early days of computing in the 1970s, self-sovereign identities, unlike today’s digital identities scattered across platforms, are central, singular, and created and maintained by the individual. Individuals can store and share them, or not, as they like. Sovrin.org calls itself a “decentralized, global public utility,” providing infrastructure for self-sovereign identities.

Personal data marketplace. Personal data marketplaces aim to sell data for cash (or, in some cases, cryptocurrency) by brokering sales with buyers. One startup among many, Datacoup, has a platform that merges personal data from various digital platforms into a single anonymous data profile. Interested parties can buy that data through the platform, and the data holder gets paid.

Data labor unions. The recent book Radical Markets: Uprooting Capitalism and Democracy for a Just Society, by Eric A. Posner, a professor at the University of Chicago Law School, and E. Glen Weyl, a senior researcher at Microsoft, posits that data is a form of labor and suggests creating data labor unions to enable fair compensation for data.

Personal data storage. From the famed Sir Tim Berners-Lee, creator of the World Wide Web, comes Solid, a new decentralized, open-source web platform. A Solid POD (personal online data) stores personal info; Solid POD owners can decide with whom to share access to their POD. Personal data can be shared across apps.

How Can Everyone Win?

Consumers want to see benefits from the use of their data, and companies want to grow. The two aren’t mutually exclusive.

Providing transparency can smooth the path to consent. A company that provides consumers with a clear understanding of how their personal data is used and secured, and that makes clear how consumers can benefit from sharing their data, is more likely to win approval for that data use.

Switzerland-based startup One.Thing.Less wants to take the work out of privacy for both consumers and corporations with an app that facilitates communication about data privacy and use policies, starting with simple questions. It’s an opportunity for companies to engage their customers, says co-founder James Aschberger, and a win-win for both parties. Making the data relationship clear and simple is another competitive advantage: it gives consumers peace of mind and signals that a corporation isn’t hiding its real intentions behind a wall of terms-of-use agreements that few actually read.

But a future in which individuals are in complete control of their own data isn’t a given. It’s not clear that people are better at managing privacy than companies, says Cornell’s Barocas. The idea already has a long history, beginning in the 1990s, and an equally long track record of failure.

One.Thing.Less’s Aschberger isn’t convinced that there’s enough money in personal data for it to ever become a substantial source of income for individuals. But he does think that we’re moving toward a future that’s more collaborative, allowing individuals to control their data but also provide, if they so choose, more accurate data. And companies might not have much choice in the matter.

“It will become very interesting because I believe that those companies that are transparent and that give people enough reasons to trust them will thrive,” he says. “And I think that those that don’t do this will have a much harder time in the future.” D!

About the Authors

Timo Elliott is Vice President and Global Innovation Evangelist at SAP.

Lakshmi Hanspal is Vice President and Chief Security Officer at SAP Ariba.

John Schitka is Solutions Marketing Manager at SAP Canada.

Danielle Beurteaux is a New York–based writer who covers business, technology, and philanthropy. Her work has appeared in The New York Times and on Popular Mechanics, CNN, and Institutional Investor’s Alpha, among other outlets.

Read more thought-provoking articles in the latest issue of Digitalist Magazine, Executive Quarterly.