When You Hear Hoofbeats, Don’t Think Zebras.

Adriel Sanchez

My wife said that to me the other day as I described a situation at work. If you’re in the medical field like she is, you’d know exactly what she meant. If you’re just a marketing guy like me, the exchange might have been closer to ours:

“When you hear hoofbeats, don’t think zebras.”
“What the hell are you talking about?”
“When you hear hoofbeats, look for horses, not zebras.”
“Repeating it isn’t going to help me understand it any better, honey.”

So I did what any dumb marketing guy would do. I went to Wikipedia.

Turns out the adage was coined by Dr. Theodore Woodward in the late 1940s. Dr. Woodward (or just Doc, as I like to imagine they called all doctors back then) used it to describe a very simple, yet fundamental, principle in medicine.

If you hear hooves behind you, don’t expect to see a zebra when you turn around. Chances are it’s a horse. In other words, look for the simplest, most common explanation for a problem first. Only when that’s ruled out should you look for rarer, more complicated explanations. The medical term for this is differential diagnosis. So if you ever find yourself lying in a hospital bed, surrounded by a team of doctors talking about horses and zebras, don’t get excited. They’re not bringing you an exotic pet.

Research has shown that our brains are actually wired to fight against the simplest explanation, in at least two ways.

First, there’s the well-known observation that ‘the striking and the novel stay longer in the mind,’ first recorded in the Rhetorica ad Herennium, circa 85 BC. So if you’re in NY and see a pigeon, you probably won’t remember it. But if you see a gang of drunk clowns stumbling down 5th Avenue at 10 AM on a weekday, you’re more likely to retain that information. Incidentally, this actually happened to me.

Second, there’s the availability heuristic: a bias toward assigning more importance to whatever you can easily call to mind. If you can think of it, it must be important. Because you’re so important. Or at least you think you are. Put another way, events that are more easily remembered are interpreted as more probable.

The principle is very simple, but it’s incredibly important that it be drilled into the heads of medical students early, because looking for zebras is dangerous. At best it’ll delay treatment. Worst case, you’ll be treated for the wrong thing, and what’s really going on will kill you.

While those of us in marketing don’t typically deal in issues of life and death, I couldn’t help but relate this concept of horses and zebras to problems I’ve encountered in my own work.

Whether your ‘patient’ is a demand generation campaign that’s not producing results, a website that’s not converting as many buyers as you’d like, or a new corporate blog that’s not getting much traffic, the range of diagnoses is pretty clear. At the highest level, it’s either what you’re selling (offer), who you’re selling it to (audience), or how you’re selling it (creative, or more broadly, execution).

The trick is to tell the difference between the horses and zebras that could be bringing down your performance. Could the problem be that you chose a black and white photo instead of a color one for your landing page? Maybe, but that’s not where to look first.

Did you target the right people? Does your offer stink? Those are your horses. Don’t spend your time theorizing about the correct shade of blue for your banner ad until you’ve ruled those out.

Granted, that’s a simplistic example, using the 40/40/20 rule to illustrate my point. It’s arguable that these days, how you communicate the message (creative) matters much more, with the rise of social, mobile, and changing habits in how people consume content. I’ll cover my views on how the 40/40/20 rule is evolving in a later post. For now, suffice it to say that your horses and zebras will be dictated by the symptoms your marketing effort is experiencing, and they can vary with any number of circumstances, as they often do in medicine.

So whether you’re tackling a marketing problem or a general business problem, remember this lesson from the people we trust with our lives. Knowing that we’re wired to give undue weight to relatively improbable explanations will also help keep you in check.

Put another way, KISS (Keep it Simple, Stupid). But remember, just because it’s simple doesn’t mean it’s easy.

Follow the conversation @Adriel_S or #marketingpfft.  


Recommended for you:

When Can Your Plant Talk? Mine Did Yesterday

SAP Guest

By Reuven Gorsht, Vice President, Customer Strategy, SAP

What if your plant could talk and let you know when it needs watering? How about that garage door you keep forgetting to close? Wouldn’t it be nice if it could send you a friendly reminder? This capability is here today, thanks to the ingenuity of a few individuals who are leading the charge to help us realize the incredible potential of the Internet of Things.

Last year, two MIT Media Lab grads set out to bring the Internet of Things closer to reality by connecting everyday objects to humans in a natural way. John Kestner and David Carr created Twine, a 2.5-inch-square device with on-board temperature and vibration sensors, as well as an expansion connector for other sensors, such as moisture sensors and magnetic switches, all tightly integrated through WiFi with a cloud-based service.

They started an ambitious campaign to raise money on the popular crowdfunding site Kickstarter, looking to raise $35,000 for research and development. Instead, they raised over $550,000 from nearly 4,000 enthusiastic backers.

Why the excitement? Twine creates your personal gateway to the Internet of Things. The real magic happens in the cloud-based software, which sets up the device and allows for infinite possibilities without your ever writing a single line of code. Creating rules and filters that trigger messages, tweets, or even HTTP requests is elegant and effortless. You can set up Twine to do virtually anything that can be monitored by the sensors, from detecting whether your washing machine has sprung a leak to sending you a text when the mailman opens your mailbox to deliver the day’s mail.
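Twine’s whole point is that users never write code, but the condition-and-action pattern behind its rules is simple enough to sketch. The following Python snippet is purely illustrative and assumes nothing about Twine’s real software; the sensor names, threshold, and message are all invented:

```python
# Hypothetical sketch of a sensor rule engine in the spirit of Twine:
# each rule pairs a condition on sensor readings with an action to fire.

def moisture_low(readings):
    """Condition: soil moisture below an (invented) threshold of 20."""
    return readings.get("moisture", 100) < 20

def send_text(readings):
    """Action: a real system might call an SMS gateway or an HTTP API here."""
    return "Text sent: plant needs watering (moisture=%s)" % readings["moisture"]

RULES = [(moisture_low, send_text)]

def evaluate(readings, rules=RULES):
    """Run every rule whose condition matches; collect the actions fired."""
    return [action(readings) for condition, action in rules if condition(readings)]

print(evaluate({"moisture": 12}))  # → ['Text sent: plant needs watering (moisture=12)']
print(evaluate({"moisture": 80}))  # → []
```

The design mirrors what the article describes: the sensors only report readings, and all the “smarts” live in a rule layer that anyone could configure, which is what a graphical rule builder hides from the user.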

In 2008, the number of connected devices outgrew the world’s population. Projections show that by 2020 the number of connected devices will exceed 50 billion (or about 6.5 devices for every person on the planet).

According to a new study from GE, productivity gains resulting from the “Internet of Things” could add between $10 trillion and $15 trillion to global GDP over the next 20 years. A 1 percent increase in efficiency could save up to $30 billion in aviation, $6 billion in power generation and $63 billion in healthcare costs. With sizable gains to be had for both consumers and businesses, we could be embarking on a sixth wave of innovation, fueled by smarter connected products and machines working to reduce waste, create new efficiency and power entirely new business models.

Twine is just one of the forays into the possibilities that can exist when we realize the full potential of smart, interconnected products. It is a building block for a future of smarter homes, cities, transportation, healthcare and others that will be driven by linking the interrelationships between systems and individuals. The text message you receive from your rosebush, in this case, is really an invitation to a smarter world, a world where ordinary objects are transformed and enlightened so that they could serve you better.

You can follow me on twitter: @reuvengorsht



Why Big Brands Are Going Local In 2013

Steve Olenski

“For consumers, local search is a near-daily ritual – 4 in 10 individuals use local search once a day, while two-thirds use local search at least 3-4 times per week.”

That’s one of the key findings from a recent YP study which revealed among many things “local search is not only pervasive and growing, but also changing in ways that are important for consumers, businesses and the search industry.”

The study, commissioned from immr, a marketing firm, queried 1,100 consumers to find out how they conduct their searches and which devices they use most.

And not surprisingly, the more devices a person has, the more local searches they do.

The study broke consumers down into three key segments, which it labels as (1) consumers with PCs and feature phones; (2) consumers with PCs and smartphones; and (3) consumers with PCs, smartphones, and tablets.

The numbers bear this out:

  • On average, the first segment, with “PCs only,” averages about five local searches a week.
  • With the addition of smartphones, the volume of local search nearly triples, to 13.5 local searches per week.
  • Consumers with all three devices – PCs, smartphones and tablets – do more than 21 local searches per week, or an average of three a day.

Perhaps those surveyed by Balihoo had already seen the YP study. That would explain the results of Balihoo’s own recent survey, which showed that nearly half of big brands plan to spend more on local marketing in 2013 than they did in 2012.

It would also explain why, according to the Balihoo survey:

  • Over 67% of respondents rated digital marketing as “very important” or “extremely important” to their local marketing
  • Over 66% cited SEO as one of the key digital marketing tactics they currently rely on
  • Mobile marketing is among the top three mediums they want to add to their existing mix, with over 35% planning to add it to their arsenal; local blogs and online customer reviews round out the top three
  • More big brands also plan on adding local search registration and localized websites

Back to the YP survey, whose conclusions offered some very interesting summations. I think you’ll see a common theme.

  • A near-daily ritual for many consumers, local search generates an enormous volume of activity, driven by its utility, its accessibility, and a growing body of searchable local content.
  • A significant portion of local search is being done on smartphones and tablets. By removing barriers and friction, always-on devices increase the incidence and frequency of local search, at home as well as on the go.
  • As the penetration of mobile devices increases, the overall volume of local search is likely to increase in parallel.

See the commonality?

One word: Mobile.

Clearly the newest frontier – no, Captain Kirk, not the final frontier – is mobile marketing. All marketers know its value. They see its enormous potential. It’s one of the reasons I wrote Mobile Marketing Too Large For Brands To Ignore not long ago.

But alas far too many marketers sit in their conference rooms and choose to ignore the large gray pachyderm sitting in the corner.

Sources: YP, MarketingCharts, Google Images

Named one of the Top 100 Influencers In Social Media (#41) by Social Technology Review and a Top 50 Social Media Blogger by Kred, Steve Olenski is a senior content strategist at Responsys, a leading global provider of on-demand email and cross-channel marketing solutions. 



IT Executives Get Ready To Win With Cloud Computing

Lindsey Nelson

CIO links IT efforts to positive business outcomes in a report to the board

What drives IT in your organization – cost or agility? Within the IT organization, most discussions currently focus on cost controls rather than the greater potential benefit: business agility. More often, we hear “How much capital and operational expense can I cut with cloud?” Yet business leaders outside the IT function are beginning to change that conversation to “How will cloud improve revenue or my company’s competitiveness?”

Beyond cost reduction, demonstrating the true value of cloud computing has its challenges. In a newly released business-agility survey, corporate decision makers linked cloud computing directly to business agility. It shows that the hype around cloud computing is maturing into evidence that cloud can really support both IT and business transformation.

Business leaders link cloud computing directly to significant business improvement

According to a joint survey by VMware and AbsolutData of 600 corporate leaders from around the world, the majority of respondents believe that cloud computing can help:

  • Make the entire organization more “business agile” and “responsive”
  • Achieve 10% greater business agility outcomes such as key revenue growth, cost reduction, and risk mitigation
  • Drive business agility that is three times more likely to be “much better than the competition”
  • Accelerate the execution and maintenance of an architecture that supports business process changes

As you can see, many forward-thinking CIOs view cloud computing as a strategic weapon – and not just for IT. Many believe that it can also enable full business transformation that can eventually change how their business operates. According to McKinsey & Company, a global management consulting firm, the benefits of agility include faster revenue growth, greater and more lasting cost reduction, and more effective management of risks and reputational threats. And by delivering these advantages to the boardroom, CIOs can ensure that IT will garner the attention it deserves.

In a couple of years, we will most likely hear companies talk about how cloud technology helped them create a much tighter connection between IT transformation and business transformation. And that will be a significant win for IT.

Want to learn more about the link between cloud computing and agility?

Take a moment to read “Business Agility and the True Economics of Cloud Computing” to explore more insights and findings from this research.



Hurricane Sandy, 5 Weeks Later

Heather McIlvaine

Looking back, the mistakes seem so obvious: data centers located on a low-lying island (Manhattan), and back-up generators kept in basements. Now that the floodwaters have receded, it’s time to ask ourselves: What have we learned from Hurricane Sandy?



Between October 24 and October 30, Hurricane Sandy wreaked havoc in eight countries from the Caribbean to Canada. It destroyed homes and livelihoods and left upwards of 200 people dead. In the United States, the storm is said to have affected 24 states. While cities as far inland as Cleveland, Ohio, felt the impact of Hurricane Sandy, it was the East Coast (New Jersey, New York, Pennsylvania, Connecticut, Massachusetts, and Maine) that bore the brunt of it. Today, the total cost of damage is estimated to be at least U.S.$50 billion. For many, a return to pre-Sandy conditions is still a long way off.

Lessons learned

Now, five weeks after the storm, we ask ourselves: what have we learned from it? For one thing, there’s the environmental aspect. Rising sea levels and a warming planet, some say, have increased the likelihood of more frequent Sandy-like superstorms in the future. There’s also the sociopolitical aspect. The damage caused by the storm in the United States was severe. And with far fewer resources, countries in the Caribbean will have an even harder time recovering. Technology provides yet another lens through which to view the storm: it played a crucial role in responding to the event and, at the same time, was rendered useless by it.

For example, many New York City hospitals faced the possibility of power outages and had to evacuate patients to other locations. At least one step of the harrowing journey was made smoother through technology. In this case, it was New York’s statewide use of electronic medical records, which gave doctors immediate and secure access to incoming patients’ medical charts and information. However, electronic medical records would not have been much help had the power outages been more widespread.

As it was, over seven million people in the United States lost power during Hurricane Sandy. What’s more, the outages had far-reaching secondary effects. Take Verizon. With no power, the telecommunications provider couldn’t operate the safety system that keeps water out of its cables, and customers lost access to Internet, TV, and telephone service. In the end, service technicians used thermal imaging to spot damaged cables and replace them at a fast pace. A fitting turn of events, in which one technology was creatively used to repair the technology that failed in the first place.

In comparison, other companies were forced to use some very low-tech solutions to keep things running during the storm. When basement flooding took out the back-up generators at Peer 1 Hosting, a data center operator located in lower Manhattan, the company had only one hope of staying online: a single generator located on the building’s rooftop. More than 30 customers helped data center employees carry thousands of gallons of fuel, bucket by bucket, up 17 flights of stairs to the generator. Amazingly, their efforts were successful and the data center remained operational during the hurricane.

20,000 gallons of diesel fuel per day

The same outcome might have been possible with much less effort, however, if the generators and fuel tanks hadn’t been located below sea level. Since flooding is not uncommon on a low-lying island like Manhattan, the basement would seem to be the worst place to store resources that are critical for disaster response. But Peer 1 wasn’t alone in this. Internap, another data center operator in the same building, also lost its fuel pumps to water damage. It was able to hook up its generator to a fuel truck parked outside the office, which provided the data center with up to 20,000 gallons of fuel per day for four days. Certainly an effective, but expensive, alternative to buckets.

A number of other data centers in Manhattan were forced to shut down for some amount of time during the storm, not because they didn’t have a working generator, but due to lack of fuel, or access to it. In fact, the top three reasons for generator failure, according to Gartner’s conversations with customers, all have to do with fuel. These are: failed sensors (companies thought they had fuel when they didn’t), stale fuel affecting performance and engine operation, and improper fuel storage resulting in contamination. Customers also said that fuel stored in a risky location (basements!) and inadequate supply prevented them from using generator power during an outage.

It’s surprising that so many data centers have a disaster recovery plan in place, but then neglect this crucial aspect. At least one company seems to have learned from Hurricane Sandy: Internap is considering the installation of submersible fuel pumps. For next year’s superstorm.

