How Access To Voter Data Fuels Campaigns’ Drive To Win Your Vote

Reuters Content Solutions

Running for elective office is about building an organization to promote the candidate’s views. It’s about attracting professionals and volunteers to make calls and knock on doors. It’s about raising money. But today a political campaign of any size needs to be about collecting and analyzing data about voters: who supports the candidate, who can be persuaded, who may give money and who will show up at the polls.

The political parties provide their candidates with access to data, but access is just the start. To make productive use of data, campaigns must first ensure it is of high quality and then employ analysis to identify supporters, encourage them to volunteer and donate and, most importantly, get out to vote.

Sophisticated data analysis is not a level playing field—it costs a lot of money. What’s more, effective analysis of data creates a virtuous cycle: More resources enable a campaign to collect more data about voters’ views, to find more supporters, to refine messages that resonate, to recruit more volunteers, to attract even more donors and to get their supporters to the polls.

Rivals who are less adept with data or have fewer resources are at a significant disadvantage, says Daniel Kreiss, who teaches media studies at the University of North Carolina and is the author of “Prototype Politics: Technology-Intensive Campaigning and the Data of Democracy.”

“One of the things that data helps do is to figure out which groups of voters do we need to target, and which groups of voters do we need to spend our time and our messaging resources on,” says Kreiss. “And how do we efficiently do that to get more votes, again on Election Day, than the next person. So data matters greatly in terms of resources.”

A number of companies have emerged in recent years to help candidates and campaigns crunch data. NationBuilder, for example, prides itself on serving all political persuasions. Emily Schwartz, NationBuilder’s vice president of organizing, says that today’s data tools make analytical resources available to local and grassroots campaigns as well as national ones.

NationBuilder’s service is free to try, then users pay monthly fees based on the level of services, such as software for email campaigns and campaign-focused websites.

Other organizations offer data help to candidates:

  • i360, whose backers include the conservative Koch brothers, bills itself as “the leading data and technology resource for the free market political advocacy community.”
  • NGP VAN is a voter-data management platform for Democratic candidates and progressive organizations.
  • The Republican National Committee’s Data Center 2016 project is a proprietary voter file designed to give its candidates ammunition in data-driven elections.

The Obama model

Like other political experts, Meta S. Brown, president of the consultancy A4A Brown Inc. and author of “Data Mining for Dummies,” points to President Obama’s 2012 re-election victory over Mitt Romney as a recent high point in the use of data. In that election, the Obama campaign sent out volunteers to go door to door asking voters their opinions.

“You can use that information in every way you campaign,” says Brown. “They can use it in advertising and how to do ad buys effectively. That was a big competitive advantage of the Obama campaign over the Romney campaign.”

That data crunching provided another type of competitive advantage, she adds: The Obama campaign could spend fewer ad dollars to reach voters, on average, because they selected only those TV programs and times they needed.

This kind of sophisticated data analysis goes well beyond TV. It enables a campaign to microtarget people through email messages, social media, online ads, follow-up visits and phone calls to the homes of supporters to encourage them to vote.

Brown points to the Obama campaign’s expert use of social media as a model in the modern campaign. The campaign found supporters on Facebook (through their “likes” or message postings) and got them to send supportive messages to friends in swing states like Ohio. “That ability to take advantage of social media, that really is a tipping point,” says Brown.

The effective use of data can also reinforce a campaign’s relative strength against an opponent.

Campaigns that collect voters’ email addresses can conduct experiments on which messages are more effective, down to the email subject line, says Kreiss. By testing which type of message delivers better results—yielding more volunteers or more donations—the campaigns refine their efforts. The Obama campaign estimated that insights gained through these “A/B tests” added $100 million in donations during the 2012 race, according to Kreiss.
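
The mechanics of such an A/B test are straightforward to sketch. The example below is illustrative only—the send counts and conversion numbers are invented, not figures from any campaign—and compares two email subject lines with a standard two-proportion z-test using only Python's standard library:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B's rate differ from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: subject lines A and B, each sent to 10,000 addresses
z, p = two_proportion_z(conv_a=220, n_a=10_000, conv_b=275, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests B genuinely outperforms A
```

In practice a campaign runs many such tests and acts only on differences that are both statistically and practically significant.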

It starts with voter data

Campaigns start their data work with voter records, which are largely a matter of public record in the United States. These records include names and mailing addresses, often whether a person voted in past elections, and sometimes party affiliation.

For data experts, getting these names is just the start. The information needs to be in a format that computers can read. Voter rolls are subject to change as people move, die, change names or register for the first time. Keeping an accurate and up-to-date voter database is a major task for registration officials. A 2012 study by the Pew Center on the States found that about 24 million voter registrations in the United States—one in every eight—are no longer valid or are inaccurate. And voter lists do not come in a standard format, says Schwartz of NationBuilder. Wisconsin, Virginia and Washington, for example, do not provide party affiliation with their voter lists.

Brown notes that data has to be organized to be useful, and that takes work. “It doesn’t matter who the source is, you have to expect some quality problems and do some investigations,” she says. That’s where the well-funded and data-savvy campaign has a significant advantage.
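
To make that cleanup work concrete, here is a minimal sketch of deduplicating voter rows whose names and addresses are formatted inconsistently. The records and normalization rules are hypothetical and far cruder than what real voter-file vendors use:

```python
import csv, io, re

def normalize(rec):
    """Canonicalize a voter record so near-duplicates collapse to one key."""
    name = re.sub(r"\s+", " ", rec["name"].strip().upper())
    addr = re.sub(r"\s+", " ", rec["address"].strip().upper())
    addr = addr.replace("STREET", "ST").replace("AVENUE", "AVE")
    return (name, addr)

raw = io.StringIO(
    "name,address,party\n"
    "Jane Doe,12 Oak Street,D\n"
    "JANE  DOE,12 Oak St,D\n"   # same voter, formatted differently
    "John Roe,40 Elm Ave,\n"    # party missing, as in some states' files
)
seen, cleaned = set(), []
for rec in csv.DictReader(raw):
    key = normalize(rec)
    if key not in seen:         # keep only the first copy of each voter
        seen.add(key)
        cleaned.append(rec)
print(len(cleaned))  # 2
```

Real voter files add harder problems—moves, name changes, deaths—which is why keeping them current is a major, ongoing task.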

Data and our democracy

Experts on data analysis disagree on its implications for the country. Kreiss says cynics argue that candidates use data to manipulate voters, but he argues for the benefits of analysis.

“We live in world where it is a lot harder to reach voters than it was 40 years ago because people’s media habits have changed significantly,” he says. “So to the extent that data is enrolled in the ability of campaigns to actually figure out which sorts of voters should we be talking to, how do we mobilize them, how do we get them to the polls and, ultimately, what should we be saying in order to get people excited about particular candidates, I think that is a good thing for democracy.”

The rise of data crunching in politics also tends to favor incumbents, because those who have been in the game before will have more detailed data in their files.

Eitan Hersh, an assistant professor of political science at Yale, says that while political parties share data with candidates about voters in their district, incumbents are able to take advantage of the data they have collected in previous campaigns to reconnect with previously identified supporters.

“They make lists of everyone who asked for a yard sign and made a donation and volunteered. And if you have been in Congress for 10 years, and you have a whole bunch of people on that list, that can be very valuable,” says Hersh. “If a challenger comes around, they might not have that.”

The technology used to analyze voter data—which costs less than buying TV ads—can help an upstart to level the playing field, but it requires the insurgent campaign to have data savvy and a lot of volunteers.

“One huge advantage that is closely tied to data is volunteer support because a lot of the strategies that utilize data are things like door-to-door canvassing and phone banking, strategies where the data helps you [know] who to contact and helps you study the effectiveness of contacts,” says Hersh. “Once you have a certain level of data access, that data can power volunteers. And it’s hard to manufacture volunteers.”

Another effect of campaign data-crunching is the potential for increased voter turnout.

Ryan Enos, an associate professor of government at Harvard, co-authored a study of get-out-the-vote efforts in the 2012 presidential race. Both the Obama and Romney campaigns successfully used data-driven techniques to encourage citizens to vote. In total, the campaigns raised voter turnout by 2.6 million people, or 7 percent, says Enos. “If we define a healthy democracy as one with larger participation, then that is probably a good thing,” he says.

However, there is a participation gap in who responds to these get-out-the-vote efforts, says Enos. People who are already more likely to vote are more responsive. “What we find is that these sorts of techniques work best for people who tend to be politically conservative. They tend to be richer, they tend not to be racial minorities. In some ways what these techniques do is widen the gap in what we might call participators and non-participators,” he says.

At the end of the day, successful use of data is no substitute for a strong candidate. Schwartz, who worked on Obama’s 2008 campaign, notes that effectively using data saves time and money in targeting voters. It does not substitute for a candidate’s message. “They ran an incredible campaign,” she says of 2008. “But they also had Barack Obama.”

This blog was written through a partnership with Thomson Reuters. To learn how SAP is helping them Run Live, click here.

About Reuters Content Solutions

Reuters Solutions develops tailored, multimedia content with a journalistic approach. Reuters Solutions operates independently from Reuters editorial.

Data Analysts And Scientists More Important Than Ever For The Enterprise

Daniel Newman

The business world is now firmly in the age of data. Not that data wasn’t relevant before; it was just nowhere close to the speed and volume that’s available to us today. Businesses are buckling under the deluge of petabytes, exabytes, and zettabytes. Within these bytes lies valuable information on customer behavior, key business insights, and revenue generation. But all that data is practically useless to a business that cannot identify the right data. And without the talent and resources to capture that data, organize it, dissect it, draw actionable insights from it and, finally, deliver those insights in a meaningful way, its data initiatives will fail.

Rise of the CDO

Companies of all sizes can easily find themselves drowning in data generated from websites, landing pages, social streams, emails, text messages, and many other sources. Additionally, there is data in their own repositories. With so much data at their disposal, companies are under mounting pressure to utilize it to generate insights. These insights are critical because they can (and should) drive the overall business strategy and help companies make better business decisions. To leverage the power of data analytics, businesses need more “top-management muscle” specialized in the field of data science. This specialized field has led to the creation of roles like Chief Data Officer (CDO).

In addition, with more companies undertaking digital transformations, there’s greater impetus for the C-suite to make data-driven decisions. The CDO helps make data-driven decisions and also develops a digital business strategy around those decisions. As data grows at an unstoppable rate, becoming an inseparable part of key business functions, we will see the CDO act as a bridge between other C-suite execs.

Data skills an emerging business necessity

So far, only large enterprises with substantial data mining and management needs maintain in-house teams and technologies to handle their growing sets of diverse and dispersed data. Others work with third-party service providers to develop and execute their big data strategies.

As the amount of data grows, the need to mine it for insights becomes a key business requirement. For both large and small businesses, data-centric roles such as data analysts and data scientists will see strong upward mobility. There is going to be a huge opportunity for critical thinkers to turn their analytical skills into rapidly growing roles in the field of data science. In fact, data skills are now a prized qualification for titles like IT project manager and computer systems analyst.

Forbes cited the McKinsey Global Institute’s prediction that by 2018 there could be a massive shortage of data-skilled professionals. This points to a disruption at the demand-supply level, with demand for data skills at an all-time high. With an increasing number of companies adopting big data strategies, salaries for data jobs are going through the roof, turning these positions into highly coveted ones.

According to Harvard Professor Gary King, “There is a big data revolution. The big data revolution is that now we can do something with the data.” The big problem is that most enterprises don’t know what to do with data. Data professionals are helping businesses figure that out. So if you’re casting about for where to apply your skills and want to take advantage of one of the best career paths in the job market today, focus on data science.

I’m compensated by University of Phoenix for this blog. As always, all thoughts and opinions are my own.

For more insight on our increasingly connected future, see The $19 Trillion Question: Are You Undervaluing The Internet Of Things?

The post Data Analysts and Scientists More Important Than Ever For the Enterprise appeared first on Millennial CEO.


About Daniel Newman

Daniel Newman serves as the Co-Founder and CEO of EC3, a quickly growing hosted IT and communication service provider. Prior to this role, Daniel held several prominent leadership positions, including serving as CEO of United Visual, parent company to United Visual Systems, United Visual Productions, and United GlobalComm, a family of companies focused on visual communications and audiovisual technologies. Daniel is also widely published and active in the social media community. He is the author of the Amazon best-selling business book "The Millennial CEO." He also co-founded the global online community 12 Most and was recognized by the Huffington Post as one of the 100 Business and Leadership Accounts to Follow on Twitter. Newman is an adjunct professor of management at North Central College. He earned his undergraduate degree in marketing at Northern Illinois University and an executive MBA from North Central College in Naperville, IL. Newman resides in Aurora, Illinois, with his wife, Lisa, and his two daughters, Hailey (9) and Avery (5). A lifelong Chicago native, Newman is an avid golfer, a fitness fan, and a classically trained pianist.

When Good Is Good Enough: Guiding Business Users On BI Practices

Ina Felsheim

In Part One of this blog series, I talked about changing your IT culture to better support self-service BI and data discovery. Absolutely essential. However, your work is not done!

Self-service BI and data discovery will rapidly expand the number of people using your BI solutions. Yet many of these more casual users will not be well versed in BI and visualization best practices.

When your user base rapidly expands to more casual users, you need to help educate them on what is important. For example, one IT manager told me that his casual BI users were making visualizations with very difficult-to-read charts and customizing color palettes to incredible degrees.

I had a similar experience when I was a technical writer. One of our lead writers was so concerned with readability of every sentence that he was going through the 300+ page manuals (yes, they were printed then) and manually adjusting all of the line breaks and page breaks. (!) Yes, readability was incrementally improved. But now any number of changes—technical updates, edits, inserting larger graphics—required re-adjusting all of those manual “optimizations.” The time it took just to do the additional optimization was incredible, much less the maintenance of these optimizations! Meanwhile, the technical writing team was falling behind on new deliverables.

The same scenario applies to your new casual BI users. This new group needs guidance to help them focus on the highest value practices:

  • Customization of color and appearance of visualizations: When is this customization necessary for a management deliverable, versus indulging an OCD tendency? I too have to stop myself from obsessing about the font, line spacing, and that a certain blue is just a bit different than another shade of blue. Yes, these options do matter. But help these casual users determine when that time is well spent.
  • Proper visualizations: When is a spinning 3D pie chart necessary to grab someone’s attention? BI professionals would firmly say “NEVER!” But these casual users do not have a lot of depth on BI best practices. Give them a few simple guidelines as to when “flash” should give way to understanding. Consider offering a monthly one-hour Lunch and Learn that shows them how to create impactful, polished visuals. Understanding if their visualizations are going to be viewed casually on the way to a meeting, or dissected at a laptop, also helps determine how much time to spend optimizing a visualization. No, you can’t just mandate that they all read Tufte.
  • Predictive: Provide advanced analytics capabilities like forecasting and regression directly in their casual BI tools. Using these capabilities will really help them wow their audience with substance instead of flash.
  • Feature requests: Make sure you understand the motivation and business value behind some of the casual users’ requests. These casual users are less likely to understand the implications of supporting specific requests across an enterprise, so make sure you are collaborating on use cases and priorities for substantive requests.
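
As a rough illustration of the “Predictive” point above, the sketch below fits an ordinary least-squares trend line to a short series and extrapolates one step ahead. The revenue figures are invented, and production BI tools offer far richer forecasting models than this:

```python
def linear_forecast(ys, steps_ahead=1):
    """Fit y = a + b*x by ordinary least squares and extrapolate."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y over variance of x
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a + b * (n - 1 + steps_ahead)

# Hypothetical monthly revenue figures on a dashboard
revenue = [100, 104, 109, 113, 118]
print(round(linear_forecast(revenue, steps_ahead=1), 1))  # 122.3
```

Even a simple trend line like this gives a casual user substance to present, rather than decoration.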

By working with your casual BI users on the above points, you will be able to collectively understand when the absolute exact request is critical (and supports good visualization practices), and when it is an “optimization” that may impact productivity. In many cases, “good” is good enough for the fast turnaround of data discovery.

Next week, I’ll wrap this series up with hints on getting your casual users to embrace the “we” not “me” mentality.

Read Part One of this series: Changing The IT Culture For Self-Service BI Success.

Follow me on Twitter: @InaSAP


Diving Deep Into Digital Experiences

Kai Goerlich


  • Google Cardboard VR goggles cost US$8
  • By 2019, immersive solutions will be adopted in 20% of enterprise businesses
  • By 2025, the market for immersive hardware and software technology could be $182 billion
  • In 2017, Lowe’s launched Holoroom How To VR DIY clinics

From Dipping a Toe to Fully Immersed

The first wave of virtual reality (VR) and augmented reality (AR) is here, using smartphones, glasses, and goggles to place us in the middle of 360-degree digital environments or overlay digital artifacts on the physical world. Prototypes, pilot projects, and first movers have already emerged:

  • Guiding warehouse pickers, cargo loaders, and truck drivers with AR
  • Overlaying constantly updated blueprints, measurements, and other construction data on building sites in real time with AR
  • Building 3D machine prototypes in VR for virtual testing and maintenance planning
  • Exhibiting new appliances and fixtures in a VR mockup of the customer’s home
  • Teaching medicine with AR tools that overlay diagnostics and instructions on patients’ bodies

A Vast Sea of Possibilities

Immersive technologies leapt forward in spring 2017 with the introduction of three new products:

  • Nvidia’s Project Holodeck, which generates shared photorealistic VR environments
  • A cloud-based platform for industrial AR from Lenovo New Vision AR and Wikitude
  • A workspace and headset from Meta that lets users use their hands to interact with AR artifacts

The Truly Digital Workplace

New immersive experiences won’t simply be new tools for existing tasks. They promise to create entirely new ways of working.

VR avatars that look and sound like their owners will soon be able to meet in realistic virtual meeting spaces without requiring users to leave their desks or even their homes. With enough computing power and a smart-enough AI, we could soon let VR avatars act as our proxies while we’re doing other things—and (theoretically) do it well enough that no one can tell the difference.

We’ll need a way to signal when an avatar is being human driven in real time, when it’s on autopilot, and when it’s owned by a bot.

What Is Immersion?

A completely immersive experience that’s indistinguishable from real life is impossible given the current constraints on power, throughput, and battery life.

To make current digital experiences more convincing, we’ll need interactive sensors in objects and materials, more powerful infrastructure to create realistic images, and smarter interfaces to interpret and interact with data.

When everything around us is intelligent and interactive, every environment could have an AR overlay or VR presence, with use cases ranging from gaming to firefighting.

We could see a backlash touting the superiority of the unmediated physical world—but multisensory immersive experiences that we can navigate in 360-degree space will change what we consider “real.”

Download the executive brief Diving Deep Into Digital Experiences.

Read the full article Swimming in the Immersive Digital Experience.


About Kai Goerlich

Kai Goerlich is the Chief Futurist at the SAP Innovation Center Network. His specialties include competitive intelligence, market intelligence, corporate foresight, trends, futuring, and ideation. Share your thoughts with Kai on Twitter @KaiGoe.


Jenny Dearborn: Soft Skills Will Be Essential for Future Careers

Jenny Dearborn

The Japanese culture has always shown a special reverence for its elderly. That’s why, in 1963, the government began a tradition of giving a silver dish, called a sakazuki, to each citizen who reached the age of 100 by Keiro no Hi (Respect for the Elders Day), which is celebrated on the third Monday of each September.

That first year, there were 153 recipients, according to The Japan Times. By 2016, the number had swelled to more than 65,000, and the dishes cost the already cash-strapped government more than US$2 million, Business Insider reports. Despite the country’s continued devotion to its seniors, the article continues, the government felt obliged to downgrade the finish of the dishes to silver plating to save money.

What tends to get lost in discussions about automation taking over jobs and Millennials taking over the workplace is the impact of increased longevity. In the future, people will need to be in the workforce much longer than they are today. Half of the people born in Japan today, for example, are predicted to live to 107, making their ancestors seem fragile, according to Lynda Gratton and Andrew Scott, professors at the London Business School and authors of The 100-Year Life: Living and Working in an Age of Longevity.

The End of the Three-Stage Career

Assuming that advances in healthcare continue, future generations in wealthier societies could be looking at careers lasting 65 or more years, rather than the roughly 40-year careers of today’s 70-year-olds, write Gratton and Scott. The three-stage model of employment that dominates the global economy today—education, work, and retirement—will be blown out of the water.

It will be replaced by a new model in which people continually learn new skills and shed old ones. Consider that today’s most in-demand occupations and specialties did not exist 10 years ago, according to The Future of Jobs, a report from the World Economic Forum.

And the pace of change is only going to accelerate. Sixty-five percent of children entering primary school today will ultimately end up working in jobs that don’t yet exist, the report notes.

Our current educational systems are not equipped to cope with this degree of change. For example, roughly half of the subject knowledge acquired during the first year of a four-year technical degree, such as computer science, is outdated by the time students graduate, the report continues.

Skills That Transcend the Job Market

Instead of treating post-secondary education as a jumping-off point for a specific career path, we may see a switch to a shorter school career that focuses more on skills that transcend a constantly shifting job market. Today, some of these skills, such as complex problem solving and critical thinking, are taught mostly in the context of broader disciplines, such as math or the humanities.

Other competencies that will become critically important in the future are currently treated as if they come naturally or over time with maturity or experience. We receive little, if any, formal training, for example, in creativity and innovation, empathy, emotional intelligence, cross-cultural awareness, persuasion, active listening, and acceptance of change. (No wonder the self-help marketplace continues to thrive!)

These skills, which today are heaped together under the dismissive “soft” rubric, are going to harden up to become indispensable. They will become more important, thanks to artificial intelligence and machine learning, which will usher in an era of infinite information, rendering the concept of an expert in most of today’s job disciplines a quaint relic. As our ability to know more than those around us decreases, our need to be able to collaborate well (with both humans and machines) will help define our success in the future.

Individuals and organizations alike will have to learn how to become more flexible and ready to give up set-in-stone ideas about how businesses and careers are supposed to operate. Given the rapid advances in knowledge and attendant skills that the future will bring, we must be willing to say, repeatedly, that whatever we’ve learned to that point doesn’t apply anymore.

Careers will become more like life itself: a series of unpredictable, fluid experiences rather than a tightly scripted narrative. We need to think about the way forward and be more willing to accept change at the individual and organizational levels.

Rethink Employee Training

One way that organizations can help employees manage this shift is by rethinking training. Today, overworked and overwhelmed employees devote just 1% of their workweek to learning, according to a study by consultancy Bersin by Deloitte. Meanwhile, top business leaders such as Bill Gates and Nike founder Phil Knight spend about five hours a week reading, thinking, and experimenting, according to an article in Inc. magazine.

If organizations are to avoid high turnover costs in a world where the need for new skills is shifting constantly, they must give employees more time for learning and make training courses more relevant to the future needs of organizations and individuals, not just to their current needs.

The amount of learning required will vary by role. That’s why at SAP we’re creating learning personas for specific roles in the company and determining how many hours will be required for each. We’re also dividing up training hours into distinct topics:

  • Law: 10%. This is training required by law, such as training to prevent sexual harassment in the workplace.

  • Company: 20%. Company training includes internal policies and systems.

  • Business: 30%. Employees learn skills required for their current roles in their business units.

  • Future: 40%. This is internal, external, and employee-driven training to close critical skill gaps for jobs of the future.
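
Applied to a hypothetical persona budgeted for 40 learning hours a year (the total is an assumption for illustration, not an SAP figure), the four-way split works out as follows:

```python
def allocate_training(total_hours):
    """Split a persona's annual training hours by the four topic shares above."""
    shares = {"Law": 0.10, "Company": 0.20, "Business": 0.30, "Future": 0.40}
    return {topic: total_hours * share for topic, share in shares.items()}

print(allocate_training(40))
# {'Law': 4.0, 'Company': 8.0, 'Business': 12.0, 'Future': 16.0}
```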

In the future, we will always need to learn, grow, read, seek out knowledge and truth, and better ourselves with new skills. With the support of employers and educators, we will transform our hardwired fear of change into excitement for change.

We must be able to say to ourselves, “I’m excited to learn something new that I never thought I could do or that never seemed possible before.” D!