How Access To Voter Data Fuels Campaigns’ Drive To Win Your Vote

Reuters Content Solutions

Running for elective office is about building an organization to promote the candidate’s views. It’s about attracting professionals and volunteers to make calls and knock on doors. It’s about raising money. But today a political campaign of any size needs to be about collecting and analyzing data about voters: who supports the candidate, who can be persuaded, who may give money and who will show up at the polls.

The political parties provide their candidates with access to data, but access is just the start. To make productive use of data, campaigns must first ensure it is of high quality and then employ analysis to identify supporters, encourage them to volunteer and donate and, most importantly, get out to vote.

Sophisticated data analysis is not a level playing field—it costs a lot of money. What’s more, effective analysis of data creates a virtuous cycle: More resources enable a campaign to collect more data about voters’ views, to find more supporters, to refine messages that resonate, to recruit more volunteers, to attract even more donors and to get their supporters to the polls.

Rivals who are less adept with data or have fewer resources are at a significant disadvantage, says Daniel Kreiss, who teaches media studies at the University of North Carolina and is the author of “Prototype Politics: Technology-Intensive Campaigning and the Data of Democracy.”

“One of the things that data helps do is to figure out which groups of voters do we need to target, and which groups of voters do we need to spend our time and our messaging resources on,” says Kreiss. “And how do we efficiently do that to get more votes, again on Election Day, than the next person. So data matters greatly in terms of resources.”

A number of companies have emerged in recent years to help candidates and campaigns crunch data. NationBuilder, for example, prides itself on serving all political persuasions. Emily Schwartz, NationBuilder’s vice president of organizing, says that today’s data tools make analytical resources available to local and grassroots campaigns as well as national ones.

NationBuilder’s service is free to try, then users pay monthly fees based on the level of services, such as software for email campaigns and campaign-focused websites.

Other organizations offer data help to candidates:

  • i360, whose backers include the conservative Koch brothers, bills itself as “the leading data and technology resource for the free market political advocacy community.”
  • NGP VAN is a voter-data management platform for Democratic candidates and progressive organizations.
  • The Republican National Committee’s Data Center 2016 project is a proprietary voter file designed to give its candidates ammunition in data-driven elections.

The Obama model

Like other political experts, Meta S. Brown, president of the consultancy A4A Brown Inc. and author of “Data Mining for Dummies,” points to President Obama’s 2012 re-election victory over Mitt Romney as a recent high point in the use of data. In that election, the Obama campaign sent out volunteers to go door to door asking voters their opinions.

“You can use that information in every way you campaign,” says Brown. “They can use it in advertising and how to do ad buys effectively. That was a big competitive advantage of the Obama campaign over the Romney campaign.”

That data crunching provided another type of competitive advantage, she adds: The Obama campaign could spend fewer ad dollars to reach voters, on average, because it bought only the TV programs and time slots it needed.

This kind of sophisticated data analysis goes well beyond TV. It enables a campaign to microtarget people through email messages, social media, online ads, and follow-up visits and phone calls to the homes of supporters to encourage them to vote.

Brown points to the Obama campaign’s expert use of social media as a model in the modern campaign. The campaign found supporters on Facebook (through their “likes” or message postings) and got them to send supportive messages to friends in swing states like Ohio. “That ability to take advantage of social media, that really is a tipping point,” says Brown.

The effective use of data can also reinforce a campaign’s relative strength against an opponent.

Campaigns that collect voters’ email addresses can conduct experiments on which messages are more effective, down to the email subject line, says Kreiss. By testing which type of message delivers better results—yielding more volunteers or more donations—the campaigns refine their efforts. The Obama campaign estimated that insights gained through these “A/B tests” added $100 million in donations during the 2012 race, according to Kreiss.
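The A/B testing Kreiss describes boils down to comparing response rates between two randomly split email lists. A minimal sketch in Python, using a standard two-proportion z-test, might look like this; the function name and all counts are invented for illustration and are not the Obama campaign's actual figures.

```python
import math

def ab_test(conversions_a, sent_a, conversions_b, sent_b):
    """Two-proportion z-test: did subject line B outperform subject line A?"""
    p_a = conversions_a / sent_a
    p_b = conversions_b / sent_b
    # Pooled proportion under the null hypothesis of no difference
    p = (conversions_a + conversions_b) / (sent_a + sent_b)
    se = math.sqrt(p * (1 - p) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical send: 10,000 emails per subject-line variant
p_a, p_b, z = ab_test(conversions_a=220, sent_a=10_000,
                      conversions_b=290, sent_b=10_000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}")
```

A z-score above about 1.96 means the difference is statistically significant at the 95% level, so the campaign can adopt variant B with some confidence rather than chasing noise.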

It starts with voter data

Campaigns start their data work with voter records. Voter data is largely a matter of public record in the United States. That includes names and mailing addresses, and often whether the person voted in past elections. It sometimes includes party affiliation.

For data experts, getting these names is just the start. The information needs to be in a format that computers can read. Voter rolls are subject to change as people move, die, change names, or register for the first time. Keeping an accurate and up-to-date voter database is a major task for registration officials. A 2012 study by the Pew Center on the States found that about 24 million voter registrations in the United States—one in every eight—were no longer valid or were inaccurate. And voter lists do not come in a standard format, says Schwartz of NationBuilder. Wisconsin, Virginia, and Washington, for example, do not provide party affiliation with their voter lists.

Brown notes that data has to be organized to be useful, and that takes work. “It doesn’t matter who the source is, you have to expect some quality problems and do some investigations,” she says. That’s where the well-funded and data-savvy campaign has a significant advantage.
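Brown's point about data quality can be made concrete with a small sketch: normalize each raw record, then merge duplicates on a crude key. The field names, matching rules, and records below are invented for illustration; real voter-file hygiene is far more involved.

```python
def normalize(record):
    """Canonicalize a raw voter record so near-duplicates can be matched."""
    return {
        "name": " ".join(record["name"].split()).title(),
        "address": " ".join(record["address"].upper().split()),
        "party": record.get("party", "").upper() or "UNKNOWN",  # some states omit party
    }

def dedupe(records):
    """Keep one record per (name, address) pair -- a deliberately crude merge key."""
    seen = {}
    for raw in records:
        rec = normalize(raw)
        seen[(rec["name"], rec["address"])] = rec
    return list(seen.values())

raw_rolls = [
    {"name": "jane  doe", "address": "12 Oak St", "party": "dem"},
    {"name": "JANE DOE", "address": "12  oak st"},            # same voter, messier entry
    {"name": "John Smith", "address": "9 Elm Ave", "party": "rep"},
]
clean = dedupe(raw_rolls)
print(len(clean))  # → 2
```

Even this toy version shows why the work is expensive: every normalization rule (casing, whitespace, missing fields) is a judgment call, and a well-funded campaign can afford far more of them.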

Data and our democracy

Experts on data analysis disagree on its implications for the country. Kreiss says cynics argue that candidates use data to manipulate voters, but he argues for the benefits of analysis.

“We live in a world where it is a lot harder to reach voters than it was 40 years ago because people’s media habits have changed significantly,” he says. “So to the extent that data is enrolled in the ability of campaigns to actually figure out which sorts of voters should we be talking to, how do we mobilize them, how do we get them to the polls and, ultimately, what should we be saying in order to get people excited about particular candidates, I think that is a good thing for democracy.”

The rise of data crunching in politics also tends to favor incumbents, because those who have been in the game before will have more detailed data in their files.

Eitan Hersh, an assistant professor of political science at Yale, says that while political parties share data with candidates about voters in their district, incumbents are able to take advantage of the data they have collected in previous campaigns to reconnect with previously identified supporters.

“They make lists of everyone who asked for a yard sign and made a donation and volunteered. And if you have been in Congress for 10 years, and you have a whole bunch of people on that list, that can be very valuable,” says Hersh. “If a challenger comes around, they might not have that.”

The technology used to analyze voter data—which costs less than buying TV ads—can help an upstart to level the playing field, but it requires the insurgent campaign to have data savvy and a lot of volunteers.

“One huge advantage that is closely tied to data is volunteer support because a lot of the strategies that utilize data are things like door-to-door canvassing and phone banking, strategies where the data helps you [know] who to contact and helps you study the effectiveness of contacts,” says Hersh. “Once you have a certain level of data access, that data can power volunteers. And it’s hard to manufacture volunteers.”

Another effect of campaign data-crunching is the potential for increased voter turnout.

Ryan Enos, an associate professor of government at Harvard, co-authored a study of get-out-the-vote efforts in the 2012 presidential race. Both the Obama and Romney campaigns successfully used data-driven techniques to encourage citizens to vote. In total, the campaigns raised voter turnout by 2.6 million people, or 7 percent, says Enos. “If we define a healthy democracy as one with larger participation, then that is probably a good thing,” he says.

However, there is a participation gap in who responds to these get-out-the-vote efforts, says Enos. People who are already more likely to vote are more responsive. “What we find is that these sorts of techniques work best for people who tend to be politically conservative. They tend to be richer, they tend not to be racial minorities. In some ways what these techniques do is widen the gap in what we might call participators and non-participators,” he says.

At the end of the day, successful use of data is no substitute for a strong candidate. Schwartz, who worked on Obama’s 2008 campaign, notes that effectively using data saves time and money in targeting voters. It does not substitute for a candidate’s message. “They ran an incredible campaign,” she says of 2008. “But they also had Barack Obama.”

This blog was written through a partnership with Thomson Reuters. To learn how SAP is helping them Run Live, click here.

About Reuters Content Solutions

Reuters Content Solutions develops tailored, multimedia content with a journalistic approach. It operates independently from Reuters editorial.

Data Analysts And Scientists More Important Than Ever For The Enterprise

Daniel Newman

The business world is now firmly in the age of data. Not that data wasn’t relevant before; it was just nowhere close to the speed and volume available to us today. Businesses are buckling under the deluge of petabytes, exabytes, and zettabytes. Within those bytes lies valuable information on customer behavior, key business insights, and revenue generation. However, all that data is practically useless to a business that cannot identify the right data. And without the talent and resources to capture that data, organize it, dissect it, draw actionable insights from it and, finally, deliver those insights in a meaningful way, its data initiatives will fail.

Rise of the CDO

Companies of all sizes can easily find themselves drowning in data generated from websites, landing pages, social streams, emails, text messages, and many other sources. Additionally, there is data in their own repositories. With so much data at their disposal, companies are under mounting pressure to utilize it to generate insights. These insights are critical because they can (and should) drive the overall business strategy and help companies make better business decisions. To leverage the power of data analytics, businesses need more “top-management muscle” specialized in the field of data science. This specialized field has led to the creation of roles like Chief Data Officer (CDO).

In addition, with more companies undertaking digital transformations, there’s greater impetus for the C-suite to make data-driven decisions. The CDO helps make data-driven decisions and also develops a digital business strategy around those decisions. As data grows at an unstoppable rate, becoming an inseparable part of key business functions, we will see the CDO act as a bridge between other C-suite execs.

Data skills an emerging business necessity

So far, only large enterprises with substantial data mining and management needs maintain in-house solutions; these in-house teams and technologies handle growing sets of diverse and dispersed data. Other companies work with third-party service providers to develop and execute their big data strategies.

As the amount of data grows, the need to mine it for insights becomes a key business requirement. For both large and small businesses, data-centric roles such as data analysts and data scientists will see strong upward mobility. There is going to be a huge opportunity for critical thinkers to turn their analytical skills into rapidly growing roles in the field of data science. In fact, data skills are now a prized qualification for titles like IT project manager and computer systems analyst.

Forbes cited the McKinsey Global Institute’s prediction that by 2018 there could be a massive shortage of data-skilled professionals. This points to a gap between supply and demand, with the need for data skills at an all-time high. With an increasing number of companies adopting big data strategies, salaries for data jobs are going through the roof, turning these positions into highly coveted ones.

According to Harvard Professor Gary King, “There is a big data revolution. The big data revolution is that now we can do something with the data.” The big problem is that most enterprises don’t know what to do with data. Data professionals are helping businesses figure that out. So if you’re casting about for where to apply your skills and want to take advantage of one of the best career paths in the job market today, focus on data science.

I’m compensated by University of Phoenix for this blog. As always, all thoughts and opinions are my own.

For more insight on our increasingly connected future, see The $19 Trillion Question: Are You Undervaluing The Internet Of Things?

The post Data Analysts and Scientists More Important Than Ever For the Enterprise appeared first on Millennial CEO.


About Daniel Newman

Daniel Newman serves as the co-founder and CEO of EC3, a quickly growing hosted IT and communication service provider. Prior to this role, Daniel held several prominent leadership positions, including CEO of United Visual, parent company to United Visual Systems, United Visual Productions, and United GlobalComm, a family of companies focused on visual communications and audiovisual technologies. Daniel is also widely published and active in the social media community. He is the author of the Amazon best-selling business book "The Millennial CEO." He also co-founded the global online community 12 Most and was recognized by the Huffington Post as one of the 100 Business and Leadership Accounts to Follow on Twitter. Newman is an adjunct professor of management at North Central College. He earned his undergraduate degree in marketing from Northern Illinois University and an executive MBA from North Central College in Naperville, IL. Newman resides in Aurora, Illinois, with his wife, Lisa, and his two daughters, Hailey (9) and Avery (5). A lifelong Chicago native, Newman is an avid golfer, a fitness fan, and a classically trained pianist.

When Good Is Good Enough: Guiding Business Users On BI Practices

Ina Felsheim

In Part One of this blog series, I talked about changing your IT culture to better support self-service BI and data discovery. Absolutely essential. However, your work is not done!

Self-service BI and data discovery will rapidly expand the number of people using your BI solutions. Yet many of these more casual users will not be well versed in BI and visualization best practices.

When your user base rapidly expands to more casual users, you need to help educate them on what is important. For example, one IT manager told me that his casual BI users were making visualizations with very difficult-to-read charts and customizing color palettes to incredible degrees.

I had a similar experience when I was a technical writer. One of our lead writers was so concerned with the readability of every sentence that he went through our 300+ page manuals (yes, they were printed then) and manually adjusted all of the line breaks and page breaks. Yes, readability was incrementally improved. But any subsequent change (new technical capabilities, edits, larger graphics) required re-adjusting all of those manual “optimizations.” The time it took to do the initial optimization was considerable, to say nothing of maintaining it. Meanwhile, the technical writing team was falling behind on new deliverables.

The same scenario applies to your new casual BI users. This new group needs guidance to help them focus on the highest value practices:

  • Customization of color and appearance of visualizations: When is this customization necessary for a management deliverable, versus indulging an OCD tendency? I too have to stop myself from obsessing about the font, line spacing, and that a certain blue is just a bit different than another shade of blue. Yes, these options do matter. But help these casual users determine when that time is well spent.
  • Proper visualizations: When is a spinning 3D pie chart necessary to grab someone’s attention? BI professionals would firmly say “NEVER!” But these casual users do not have a lot of depth on BI best practices. Give them a few simple guidelines as to when “flash” should give way to understanding. Consider offering a monthly one-hour Lunch and Learn that shows them how to create impactful, polished visuals. Understanding whether their visualizations will be viewed casually on the way to a meeting, or dissected at a laptop, also helps determine how much time to spend optimizing a visualization. No, you can’t just mandate that they all read Tufte.
  • Predictive: Provide advanced analytics capabilities like forecasting and regression directly in their casual BI tools. Using these capabilities will really help them wow their audience with substance instead of flash.
  • Feature requests: Make sure you understand the motivation and business value behind some of the casual users’ requests. These casual users are less likely to understand the implications of supporting specific requests across an enterprise, so make sure you are collaborating on use cases and priorities for substantive requests.
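The forecasting-and-regression point above can be illustrated with the simplest possible model: an ordinary least-squares trend line fitted to a short series, then extended one period ahead. The monthly figures here are made up, and real BI tools of course wrap far more robust models than this sketch.

```python
def fit_trend(values):
    """Ordinary least-squares line y = a + b*t over t = 0, 1, 2, ..."""
    n = len(values)
    ts = range(n)
    t_mean = sum(ts) / n
    y_mean = sum(values) / n
    b = sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, values)) / \
        sum((t - t_mean) ** 2 for t in ts)
    a = y_mean - b * t_mean
    return a, b

# Hypothetical monthly sales; forecast the next period
sales = [100, 108, 115, 121, 130]
a, b = fit_trend(sales)
forecast = a + b * len(sales)
print(round(forecast, 1))  # → 136.7
```

Even a one-step trend extension like this gives a casual user substance ("we project roughly 137 next month, growing about 7 per month") rather than flash.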

By working with your casual BI users on the above points, you will be able to collectively understand when the absolute exact request is critical (and supports good visualization practices), and when it is an “optimization” that may impact productivity. In many cases, “good” is good enough for the fast turnaround of data discovery.

Next week, I’ll wrap this series up with hints on getting your casual users to embrace the “we” not “me” mentality.

Read Part One of this series: Changing The IT Culture For Self-Service BI Success.

Follow me on Twitter: @InaSAP


Running Future Cities on Blockchain

Dan Wellers , Raimund Gross and Ulrich Scholl

Building on the Blockchain Framework

Some experts say these seemingly far-future speculations about the possibilities of combining technologies using blockchain are actually both inevitable and imminent:

  • Democratizing design and manufacturing by enabling individuals and small businesses to buy, sell, share, and digitally remix products affordably while protecting intellectual property rights.
  • Decentralizing warehousing and logistics by combining autonomous vehicles, 3D printers, and smart contracts to optimize delivery of products and materials, and even to create them on site as needed.
  • Distributing commerce by mixing virtual reality, 3D scanning and printing, self-driving vehicles, and artificial intelligence into immersive, personalized, on-demand shopping experiences that still protect buyers’ personal and proprietary data.

The City of the Future

Imagine that every agency, building, office, residence, and piece of infrastructure has an entry on a blockchain used as a city’s digital ledger. This “digital twin” could transform the delivery of city services.

For example:

  • Property owners could easily monetize assets by renting rooms, selling solar power back to the grid, and more.
  • Utilities could use customer data and AIs to make energy-saving recommendations, and smart contracts to automatically adjust power usage for greater efficiency.
  • Embedded sensors could sense problems (like a water main break) and alert an AI to send a technician with the right parts, tools, and training.
  • Autonomous vehicles could route themselves to open parking spaces or charging stations, and pay for services safely and automatically.
  • Cities could improve traffic monitoring and routing, saving commuters’ time and fuel while increasing productivity.

Every interaction would be transparent and verifiable, providing more data to analyze for future improvements.

Welcome to the Next Industrial Revolution

When exponential technologies intersect and combine, transformation happens on a massive scale. It’s time to start thinking through outcomes in a disciplined, proactive way to prepare for a future we’re only just beginning to imagine.

Download the executive brief Running Future Cities on Blockchain.

Read the full article Pulling Cities Into The Future With Blockchain


About Dan Wellers

Dan Wellers is founder and leader of Digital Futures at SAP, a strategic insights and thought leadership discipline that explores how digital technologies drive exponential change in business and society.

Raimund Gross

About Raimund Gross

Raimund Gross is a solution architect and futurist at SAP Innovation Center Network, where he evaluates emerging technologies and trends to address the challenges of businesses arising from digitization. He is currently evaluating the impact of blockchain for SAP and our enterprise customers.

Ulrich Scholl

About Ulrich Scholl

Ulrich Scholl is Vice President of Industry Cloud and Custom Development at SAP. In this role, Ulrich discovers and implements best practices to help further the understanding and adoption of the SAP portfolio of industry cloud innovations.


Are AI And Machine Learning Killing Analytics As We Know It?

Joerg Koesters

According to IDC, artificial intelligence (AI) is expected to become pervasive across customer journeys, supply networks, merchandising, and marketing and commerce because it provides better insights to optimize retail execution. For example, in the next two years:

  • 40% of digital transformation initiatives will be supported by cognitive computing and AI capabilities to provide critical, on-time insights for new operating and monetization models.
  • 30% of major retailers will adopt a retail omnichannel commerce platform that integrates a data analytics layer that centrally orchestrates omnichannel capabilities.

One thing is clear: new analytic technologies are expected to radically change analytics – and retail – as we know them.

AI and machine learning defined in the context of retail

AI is defined broadly as the ability of computers to mimic human thinking and logic. Machine learning is a subset of AI that focuses on how computers can learn from data without being programmed through the use of algorithms that adapt to change; in other words, they can “learn” continuously in response to new data. We’re seeing these breakthroughs now because of massive improvements in hardware (for example, GPUs and multicore processing) that can handle Big Data volumes and run deep learning algorithms needed to analyze and learn from the data.
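As a toy illustration of “learning from data without being programmed,” here is a one-nearest-neighbor classifier: no segmentation rules are coded anywhere, and the behavior comes entirely from labeled examples. The feature names and data are invented, and this is a teaching sketch, not how production retail systems work.

```python
def nearest_neighbor(train, point):
    """Classify `point` by copying the label of the closest training example."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    features, label = min(train, key=lambda ex: dist2(ex[0], point))
    return label

# Invented examples: (visits_per_month, avg_basket_size) -> customer segment
train = [
    ((1, 20), "occasional"),
    ((2, 15), "occasional"),
    ((8, 60), "loyal"),
    ((10, 80), "loyal"),
]
print(nearest_neighbor(train, (9, 70)))   # → loyal
print(nearest_neighbor(train, (1, 18)))   # → occasional
```

Feed the model new labeled examples and its predictions shift automatically, which is exactly the “adapt to change” property the definition describes.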

Ivano Ortis, vice president at IDC, recently shared with me how he believes, “Artificial intelligence will take analytics to the next level and will be the foundation for retail innovation, as reported by one out of every two retailers globally. AI enables scale, automation, and unprecedented precision and will drive customer experience innovation when applied to both hyper micro customer segmentation and contextual interaction.”

Given the capabilities of AI and machine learning, it’s easy to see how they can be powerful tools for retailers. Now computers can read and listen to data, understand and learn from it, and instantly and accurately recommend the next best action without having to be explicitly programmed. This is a boon for retailers seeking to accurately predict demand, anticipate customer behavior, and optimize and personalize customer experiences.

For example, it can be used to automate:

  • Personalized product recommendations based on data about each customer’s unique interests and buying propensity
  • The selection of additional upsell and cross-sell options that drive greater customer value
  • Chat bots that can drive intelligent and meaningful engagement with customers
  • Recommendations on additional services and offerings based on past and current buying data and customer data
  • Planogram analyses, which support in-store merchandising by telling people what’s missing, comparing sales to shelf space, and accelerating shelf replenishment by automating reorders
  • Pricing engines used to make tailored, situational pricing decisions
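The simplest form of the personalized recommendations listed above counts which products co-occur in past baskets and suggests the most frequent companion. The basket data below is invented, and real recommendation engines use far richer signals, but the core idea fits in a few lines.

```python
from collections import Counter
from itertools import permutations

def build_cooccurrence(baskets):
    """Count how often each product pair appears together in a basket."""
    co = {}
    for basket in baskets:
        for a, b in permutations(set(basket), 2):
            co.setdefault(a, Counter())[b] += 1
    return co

def recommend(co, product):
    """Return the most frequent companion of `product` across all baskets."""
    companions = co.get(product)
    return companions.most_common(1)[0][0] if companions else None

baskets = [
    ["bread", "butter", "jam"],
    ["bread", "butter"],
    ["bread", "milk"],
]
co = build_cooccurrence(baskets)
print(recommend(co, "bread"))  # → butter
```

Swap the toy baskets for a transaction log and the same counting logic drives a "customers who bought X also bought Y" upsell prompt.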

Particularly in the United States, retailers are already able to collect large volumes of transaction-based and behavioral data from their customers. And as data volumes grow and processing power improves, machine learning becomes increasingly applicable in a wider range of retail areas to further optimize business processes and drive more impactful personalized and contextual consumer experiences and products.

The transformation of retail has already begun

The impacts of AI and machine learning are already being felt. For example:

  • Retailers are predicting demand with machine learning in combination with IoT technologies to optimize store businesses and relieve workforces
  • Advertisements are being personalized based on in-store camera detections, taking over the semi-manual clienteling tasks of store employees
  • Retailers can monitor wait times in checkout lines to understand store traffic and merchandising effectiveness at the individual store level – and then tailor assortments and store layouts to maximize basket size, satisfaction, and sell through
  • Systems can now recognize and predict customer behavior and improve employee productivity by turning scheduled tasks into on-demand activities
  • Camera systems can detect the “fresh” status of perishable products before onsite employees can
  • Brick-and-mortar stores are automating operational tasks, such as setting shelf pricing, determining product assortments and mixes, and optimizing trade promotions
  • In-store apps can tell how long a customer has been in a certain aisle and deliver targeted offers and recommendations (via his or her mobile device) based on data about personal consumption histories and preferences

A recent McKinsey study provided examples that quantify the potential value of these technologies in transforming how retailers operate and compete. For example:

  • U.S. retailer supply chain operations that have adopted data and analytics have seen up to a 19% increase in operating margin over the last five years. Using data and analytics to improve merchandising, including pricing, assortment, and placement optimization, is leading to an additional 16% in operating margin improvement.
  • Personalizing advertising is one of the strongest use cases for machine learning today. Additional retail use cases with high potential include optimizing pricing, routing, and scheduling based on real-time data in travel and logistics, as well as optimizing merchandising strategies.

Exploiting the full value of data

Thin margins (especially in the grocery sector) and pressure from industry-leading early adopters such as Amazon and Walmart have created strong incentives to put customer data to work to improve everything from cross-selling additional products to reducing costs throughout the entire value chain. But McKinsey has assessed that the U.S. retail sector has realized only 30-40% of the potential margin improvements and productivity growth its analysts envisioned in 2011 – and a large share of the value of this growth has gone to consumers through lower prices. So thus far, only a fraction of the potential value from AI and machine learning has been realized.

According to Forbes, U.S. retailers have the potential to see a 60%+ increase in net margin and 0.5–1.0% annual productivity growth. But there are major barriers to realizing this value, including lack of analytical talent and siloed data within companies.

This is where machine learning and analytics kick in. AI and machine learning can help scale the repetitive analytics tasks required to drive leverage of the available data. When deployed on a companywide, real-time analytics platform, they can become the single source of truth that all enterprise functions rely on to make better decisions.

How will this change analytics?

So how will AI and machine learning change retail analytics? We expect that AI and machine learning will not kill analytics as we know it, but rather give it a new and even more impactful role in driving the future of retail. For example, we anticipate that:

  • Retailers will include machine learning algorithms as an additional factor when analyzing and monitoring business outcomes
  • They will use AI and machine learning to sharpen analytic algorithms, detect more early warning signals, anticipate trends, and have accurate answers before competitors do
  • Analytics will happen in real time and act as the glue between all areas of the business
  • Analytics will increasingly focus on analyzing manufacturing machine behavior, not just business and consumer behavior

Ivano Ortis at IDC authored a recent report, “Why Retail Analytics are a Foundation for Retail Profits,” in which he provides further insights on this topic. He notes how retail leaders will use new kinds of analytics to drive greater profitability, further differentiate the customer experience, and compete more effectively: “In conclusion, commerce and technology will converge, enabling retailers to achieve short-term ROI objectives while discovering untapped demand. But implementing analytics will require coordination across key management roles and business processes up and down each retail organization. Early adopters are realizing demonstrably significant value from their initiatives – double-digit improvements in margins, same-store and e-commerce revenue, inventory positions and sell-through, and core marketing metrics. A huge opportunity awaits.”

So how do you see your retail business adopting advanced analytics like AI and machine learning? I encourage you to read IDC’s report in detail, as it provides valuable insights to help you invest in – and apply – new kinds of analytics that will be essential to profitable growth.

For more information, download IDC’s “Why Retail Analytics are a Foundation for Retail Profits.”


About Joerg Koesters

Joerg Koesters is the Head of Retail Marketing and Communication at SAP. A technology marketing executive with 20 years of experience in marketing, sales, and consulting, Joerg has deep knowledge of retail and consumer products, having worked both in the industry and in the technology sector.