Are You Planning To Embark On An Advanced Analytics Journey?

Paul Pallath

Welcome to the new world! The way data is generated and captured has come of age. Traditionally, the most common way of generating data from B2C/B2B2C business processes was to capture interactions in transactional systems in a highly structured format. But the technological landscape has shifted, and with it the way data is generated and captured.

What data supports this view?

According to the ESG Digital Archive Market Forecast, unstructured data accounts for more than 88% of the growth in data volumes, far outpacing structured data. What’s more, Computerworld reports that unstructured information may account for 70% to 80% of all data in organizations.

The change has come because the 21st century has redefined the way business is conducted. Significant advances in Internet technology have forced most businesses to establish an online presence to stay relevant. In turn, every interaction a customer has in this digital ecosystem leaves behind a footprint containing huge amounts of information.

Social media presences, for individuals and businesses alike, have increased the speed at which information travels and made it easy to share opinions through blogs and multimedia content. The result is the constant generation of large amounts of unstructured data.

All that unstructured data is good news for data scientists

For data scientists, however, this flood of unstructured data is good news.

These figures imply that we now have yottabytes (10^24 bytes) of data at our disposal for deriving business value – and that amount is about to increase.

The Internet of Things, with its emphasis on completely connected systems, has made high-speed streaming data widely available. This opens the door to innovations that use data to build technologies enabling machines to talk to one another (and perhaps eventually become intelligent enough to remove humans from the loop)! Given this trend, brontobytes (10^27 bytes) of data to work with will soon be a reality for data scientists.

So, what is the best way for a business to capture and benefit from this information? Of course, capturing the massive swathes of data available is an important part of the Big Data story. But it’s not the most important part.

The most vital activity is to generate insights that add value to your business. This takes vision, it takes change, it takes…advanced analytics.

For organizations embarking on a journey into advanced analytics, it’s vital to keep in mind these important considerations:

  • How do we measure business value and return on investment?
  • How do we use advanced analytics effectively?
  • Is advanced analytics just another technology project?
  • Is Big Data equal to high quality insight?

In next week’s Predictive blog, I’ll discuss each one of these considerations in more detail.

For expert insight on your own digital transformation, listen to Coffee Talk with Game Changers on The Digital Economy: How Organizations Adapt.



About Paul Pallath

Dr. Paul Pallath is the Chief Data Scientist and Senior Director with the Advanced Analytics Organisation at SAP. With over 20 years of experience in machine learning, Paul has published research on machine learning and data mining in international journals and conferences and has invented several patentable ideas. He holds a Master’s degree in Computer Applications (with a gold medal) and a PhD in machine learning, both from the Indian Institute of Technology, Delhi.

Data Analysts And Scientists More Important Than Ever For The Enterprise

Daniel Newman

The business world is now firmly in the age of data. Not that data wasn’t relevant before; it was just nowhere close to the speed and volume that’s available to us today. Businesses are buckling under the deluge of petabytes, exabytes, and zettabytes. Within these bytes lies valuable information on customer behavior, key business insights, and revenue generation. However, all that data is practically useless for businesses without the ability to identify the right data. And if they don’t have the talent and resources to capture the right data, organize it, dissect it, draw actionable insights from it and, finally, deliver those insights in a meaningful way, their data initiatives will fail.

Rise of the CDO

Companies of all sizes can easily find themselves drowning in data generated from websites, landing pages, social streams, emails, text messages, and many other sources, in addition to the data in their own repositories. With so much data at their disposal, companies are under mounting pressure to utilize it to generate insights. These insights are critical because they can (and should) drive the overall business strategy and help companies make better business decisions. To leverage the power of data analytics, businesses need more “top-management muscle” specialized in the field of data science. This specialization has led to the creation of roles like Chief Data Officer (CDO).

In addition, with more companies undertaking digital transformations, there’s greater impetus for the C-suite to make data-driven decisions. The CDO helps make data-driven decisions and also develops a digital business strategy around those decisions. As data grows at an unstoppable rate, becoming an inseparable part of key business functions, we will see the CDO act as a bridge between other C-suite execs.

Data skills an emerging business necessity

So far, only large enterprises with substantial data mining and management needs have maintained in-house solutions: teams and technologies that handle their growing sets of diverse and dispersed data. Others work with third-party service providers to develop and execute their big data strategies.

As the amount of data grows, the need to mine it for insights becomes a key business requirement. For both large and small businesses, data-centric roles such as data analysts and data scientists will see sustained upward mobility. There is going to be a huge opportunity for critical thinkers to turn their analytical skills into rapidly growing roles in the field of data science. In fact, data skills are now a prized qualification for titles like IT project manager and computer systems analyst.

Forbes cited the McKinsey Global Institute’s prediction that by 2018 there could be a massive shortage of data-skilled professionals. This points to a demand-supply imbalance, with the need for data skills at an all-time high. With an increasing number of companies adopting big data strategies, salaries for data jobs are going through the roof, turning these positions into highly coveted ones.

According to Harvard Professor Gary King, “There is a big data revolution. The big data revolution is that now we can do something with the data.” The big problem is that most enterprises don’t know what to do with data. Data professionals are helping businesses figure that out. So if you’re casting about for where to apply your skills and want to take advantage of one of the best career paths in the job market today, focus on data science.

I’m compensated by University of Phoenix for this blog. As always, all thoughts and opinions are my own.

For more insight on our increasingly connected future, see The $19 Trillion Question: Are You Undervaluing The Internet Of Things?

The post Data Analysts and Scientists More Important Than Ever For the Enterprise appeared first on Millennial CEO.



About Daniel Newman

Daniel Newman serves as the Co-Founder and CEO of EC3, a quickly growing hosted IT and communication service provider. Prior to this role, Daniel held several prominent leadership positions, including serving as CEO of United Visual, parent company to United Visual Systems, United Visual Productions, and United GlobalComm, a family of companies focused on visual communications and audio-visual technologies.
Daniel is also widely published and active in the social media community. He is the author of the Amazon best-selling business book “The Millennial CEO.” Daniel also co-founded the global online community 12 Most and was recognized by the Huffington Post as one of the 100 Business and Leadership Accounts to Follow on Twitter.
Newman is an adjunct professor of management at North Central College. He earned his undergraduate degree in marketing at Northern Illinois University and an Executive MBA from North Central College in Naperville, IL. Newman currently resides in Aurora, Illinois, with his wife, Lisa, and his two daughters, Hailey (9) and Avery (5).
A lifelong Chicago native, Newman is an avid golfer, a fitness fan, and a classically trained pianist.

When Good Is Good Enough: Guiding Business Users On BI Practices

Ina Felsheim

In Part One of this blog series, I talked about changing your IT culture to better support self-service BI and data discovery. Absolutely essential. However, your work is not done!

Self-service BI and data discovery will rapidly expand the number of people using your BI solutions. Yet many of these more casual users will not be well versed in BI and visualization best practices.

When your user base rapidly expands to include more casual users, you need to help educate them on what is important. For example, one IT manager told me that his casual BI users were building very difficult-to-read charts and customizing color palettes to incredible degrees.

I had a similar experience when I was a technical writer. One of our lead writers was so concerned with the readability of every sentence that he went through the 300+ page manuals (yes, they were printed then) and manually adjusted all of the line breaks and page breaks. (!) Yes, readability was incrementally improved. But then any number of changes (new technical capabilities, edits, larger graphics) required re-adjusting all of those manual “optimizations.” The time it took just to do the initial optimization was considerable, to say nothing of maintaining it. Meanwhile, the technical writing team was falling behind on new deliverables.

The same scenario applies to your new casual BI users. This new group needs guidance to help them focus on the highest value practices:

  • Customization of color and appearance of visualizations: When is this customization necessary for a management deliverable, versus indulging an OCD tendency? I too have to stop myself from obsessing about the font, the line spacing, and whether a certain blue is just a bit different from another shade of blue. Yes, these options do matter. But help these casual users determine when that time is well spent.
  • Proper visualizations: When is a spinning 3D pie chart necessary to grab someone’s attention? BI professionals would firmly say “NEVER!” But these casual users do not have a lot of depth on BI best practices. Give them a few simple guidelines as to when “flash” needs to subsume understanding. Consider offering a monthly one-hour Lunch and Learn that shows them how to create impactful, polished visuals. Understanding if their visualizations are going to be viewed casually on the way to a meeting, or dissected at a laptop, also helps determine how much time to spend optimizing a visualization. No, you can’t just mandate that they all read Tufte.
  • Predictive: Provide advanced analytics capabilities like forecasting and regression directly in their casual BI tools (see the short sketch after this list). Using these capabilities will really help them wow their audience with substance instead of flash.
  • Feature requests: Make sure you understand the motivation and business value behind some of the casual users’ requests. These casual users are less likely to understand the implications of supporting specific requests across an enterprise, so make sure you are collaborating on use cases and priorities for substantive requests.
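To make the “predictive” point concrete, here is a minimal sketch of the kind of one-step trend forecast a casual user benefits from having built into a tool. It is plain Python with NumPy rather than any particular BI product’s API, and the revenue figures are invented for illustration:

```python
# Minimal sketch of a built-in "forecasting" capability (illustrative only,
# not any specific BI tool's API). Fit a linear trend to monthly revenue
# and project it three months ahead.
import numpy as np

monthly_revenue = [120.0, 125.5, 131.2, 128.9, 140.3, 145.8]  # made-up figures
months = np.arange(len(monthly_revenue))

# Ordinary least-squares fit of a straight line (degree-1 polynomial).
slope, intercept = np.polyfit(months, monthly_revenue, deg=1)

# Extend the fitted trend three periods beyond the observed data.
future_months = np.arange(len(monthly_revenue), len(monthly_revenue) + 3)
forecast = slope * future_months + intercept
print([round(value, 1) for value in forecast])
```

The point for casual users is not the math itself: a one-click regression or forecast like this carries far more persuasive weight with an audience than another custom color palette.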

By working with your casual BI users on the above points, you will be able to collectively understand when the absolute exact request is critical (and supports good visualization practices), and when it is an “optimization” that may impact productivity. In many cases, “good” is good enough for the fast turnaround of data discovery.

Next week, I’ll wrap this series up with hints on getting your casual users to embrace the “we” not “me” mentality.

Read Part One of this series: Changing The IT Culture For Self-Service BI Success.

Follow me on Twitter: @InaSAP


More Than Noise: Digital Trends That Are Bigger Than You Think

By Maurizio Cattaneo, David Delaney, Volker Hildebrand, and Neal Ungerleider

In the tech world in 2017, several trends emerged as signals amid the noise, signifying much larger changes to come.

As we noted in last year’s More Than Noise list, things are changing—and the changes are occurring in ways that don’t necessarily fit into the prevailing narrative.

While many of 2017’s signals have a dark tint to them, perhaps reflecting the times we live in, we have sought out some rays of light to illuminate the way forward. The following signals differ considerably, but understanding them can help guide businesses in the right direction for 2018 and beyond.

When a team of psychologists, linguists, and software engineers created Woebot, an AI chatbot that helps people learn cognitive behavioral therapy techniques for managing mental health issues like anxiety and depression, they did something unusual, at least when it comes to chatbots: they submitted it for peer review.

Stanford University researchers recruited a sample group of 70 college-age participants on social media to take part in a randomized controlled study of Woebot. The researchers found that their creation was useful for improving anxiety and depression symptoms. A study of the user interaction with the bot was submitted for peer review and published in the Journal of Medical Internet Research Mental Health in June 2017.

While Woebot may not revolutionize the field of psychology, it could change the way we view AI development. Well-known figures such as Elon Musk and Bill Gates have expressed concerns that artificial intelligence is essentially ungovernable. Peer review, such as with the Stanford study, is one way to approach this challenge and figure out how to properly evaluate and find a place for these software programs.

The healthcare community could be onto something. We’ve already seen instances where AI chatbots have spun out of control, such as when internet trolls trained Microsoft’s Tay to become a hate-spewing misanthrope. Bots are only as good as their design; making sure they stay on message and don’t act in unexpected ways is crucial.

This is especially true in healthcare. When chatbots are offering therapeutic services, they must be properly designed, vetted, and tested to maintain patient safety.

It may be prudent to apply the same level of caution to a business setting. By treating chatbots as if they’re akin to medicine or drugs, we have a model for thorough vetting that, while not perfect, is generally effective and time tested.

It may seem like overkill to think of chatbots that manage pizza orders or help resolve parking tickets as potential health threats. But it’s already clear that AI can have unintended side effects that could extend far beyond Tay’s loathsome behavior.

For example, in July, Facebook shut down an experiment where it challenged two AIs to negotiate with each other over a trade. When the experiment began, the two chatbots quickly went rogue, developing linguistic shortcuts to reduce negotiating time and leaving their creators unable to understand what they were saying.


The implications are chilling. Do we want AIs interacting in a secret language because designers didn’t fully understand what they were designing?

In this context, the healthcare community’s conservative approach doesn’t seem so farfetched. Woebot could ultimately become an example of the kind of oversight that’s needed for all AIs.

Meanwhile, it’s clear that chatbots have great potential in healthcare—not just for treating mental health issues but for helping patients understand symptoms, build treatment regimens, and more. They could also help unclog barriers to healthcare, which is plagued worldwide by high prices, long wait times, and other challenges. While they are not a substitute for actual humans, chatbots can be used by anyone with a computer or smartphone, 24 hours a day, seven days a week, regardless of financial status.

Finding the right governance for AI development won’t happen overnight. But peer review, extensive internal quality analysis, and other processes will go a long way to ensuring bots function as expected. Otherwise, companies and their customers could pay a big price.

Elon Musk is an expert at dominating the news cycle with his sci-fi premonitions about space travel and high-speed hyperloops. However, he captured media attention in Australia in April 2017 for something much more down to earth: how to deal with blackouts and power outages.

In 2016, a massive blackout hit the state of South Australia following a storm. Although power was restored quickly in Adelaide, the capital, people in the wide stretches of arid desert that surround it spent days waiting for the power to return. That hit South Australia’s wine and livestock industries especially hard.

South Australia’s electrical grid currently gets more than half of its energy from wind and solar, with coal and gas plants acting as backups for when the sun hides or the wind doesn’t blow, according to ABC News Australia. But this network is vulnerable to sudden loss of generation—which is exactly what happened in the storm that caused the 2016 blackout, when tornadoes ripped through some key transmission lines. Getting the system back on stable footing has been an issue ever since.

Displaying his usual talent for showmanship, Musk stepped in and promised to build the world’s largest battery to store backup energy for the network—and he pledged to complete it within 100 days of signing the contract or the battery would be free. Pen met paper with South Australia and French utility Neoen in September. As of press time in November, construction was underway.

For South Australia, the Tesla deal offers an easy and secure way to store renewable energy. Tesla’s 129 MWh battery will be the most powerful battery system in the world by 60% once completed, according to Gizmodo. The battery, which is stationed at a wind farm, will cover temporary drops in wind power and kick in to help conventional gas and coal plants balance generation with demand across the network. South Australian citizens and politicians largely support the project, which Tesla claims will be able to power 30,000 homes.

Until Musk made his bold promise, batteries did not figure much in renewable energy networks, mostly because they just aren’t that good. They have limited charges, are difficult to build, and are difficult to manage. Utilities also worry about relying on the same lithium-ion battery technology as cellphone makers like Samsung, whose Galaxy Note 7 had to be recalled in 2016 after some defective batteries burst into flames, according to CNET.

However, when made right, the batteries are safe. It’s just that they’ve traditionally been too expensive for large-scale uses such as renewable power storage. But battery innovations such as Tesla’s could radically change how we power the economy. According to a study that appeared this year in Nature, the continued drop in the cost of battery storage has made renewable energy price-competitive with traditional fossil fuels.

This is a massive shift. Or, as David Roberts of news site Vox puts it, “Batteries are soon going to disrupt power markets at all scales.” Furthermore, if the cost of batteries continues to drop, supply chains could experience radical energy cost savings. This could disrupt energy utilities, manufacturing, transportation, and construction, to name just a few, and create many opportunities while changing established business models. (For more on how renewable energy will affect business, read the feature “Tick Tock” in this issue.)

Battery research and development has become big business. Thanks to electric cars and powerful smartphones, there has been incredible pressure to make more powerful batteries that last longer between charges.

The proof of this is in the R&D funding pudding. A Brookings Institution report notes that both the Chinese and U.S. governments offer generous subsidies for lithium-ion battery advancement. Automakers such as Daimler and BMW have established divisions marketing residential and commercial energy storage products. Boeing, Airbus, Rolls-Royce, and General Electric are all experimenting with various electric propulsion systems for aircraft—which means that hybrid airplanes are also a possibility.

Meanwhile, governments around the world are accelerating battery research investment by banning internal combustion vehicles. Britain, France, India, and Norway are seeking to go all electric as early as 2025 and by 2040 at the latest.

In the meantime, expect huge investment and new battery innovation from interested parties across industries that all share a stake in the outcome. This past September, for example, Volkswagen announced a €50 billion research investment in batteries to help bring 300 electric vehicle models to market by 2030.

At first, it sounds like a narrative device from a science fiction novel or a particularly bad urban legend.

Powerful cameras in several Chinese cities capture photographs of jaywalkers as they cross the street and, several minutes later, display their photograph, name, and home address on a large screen posted at the intersection. Several days later, a summons appears in the offender’s mailbox demanding payment of a fine or fulfillment of community service.

As Orwellian as it seems, this technology is very real for residents of Jinan and several other Chinese cities. According to a Xinhua interview with Li Yong of the Jinan traffic police, “Since the new technology has been adopted, the cases of jaywalking have been reduced from 200 to 20 each day at the major intersection of Jingshi and Shungeng roads.”

The sophisticated cameras and facial recognition systems already used in China—and their near–real-time public shaming—are an example of how machine learning, mobile phone surveillance, and internet activity tracking are being used to censor and control populations. Most worryingly, the prospect of real-time surveillance makes running surveillance states such as the former East Germany and current North Korea much more financially efficient.

According to a 2015 discussion paper by the Institute for the Study of Labor, a German research center, by the 1980s almost 0.5% of the East German population was directly employed by the Stasi, the country’s state security service and secret police—1 for every 166 citizens. An additional 1.1% of the population (1 for every 66 citizens) were working as unofficial informers, which represented a massive economic drain. Automated, real-time, algorithm-driven monitoring could potentially drive the cost of controlling the population down substantially in police states—and elsewhere.

We could see a radical new era of censorship that is much more manipulative than anything that has come before. Previously, dissidents were identified when investigators manually combed through photos, read writings, or listened in on phone calls. Real-time algorithmic monitoring means that acts of perceived defiance can be identified and deleted in the moment and their perpetrators marked for swift judgment before they can make an impression on others.

Businesses need to be aware of the wider trend toward real-time, automated censorship and how it might be used in both commercial and governmental settings. These tools can easily be used in countries with unstable political dynamics and could become a real concern for businesses that operate across borders. Businesses must learn to educate and protect employees when technology can censor and punish in real time.

Indeed, the technologies used for this kind of repression could be easily adapted from those that have already been developed for businesses. For instance, both Facebook and Google use near–real-time facial identification algorithms that automatically identify people in images uploaded by users—which helps the companies build out their social graphs and target users with profitable advertisements. Automated algorithms also flag Facebook posts that potentially violate the company’s terms of service.

China is already using these technologies to control its own people in ways that are largely hidden to outsiders.

According to a report by the University of Toronto’s Citizen Lab, the popular Chinese social network WeChat operates under a policy its authors call “One App, Two Systems.” Users with Chinese phone numbers are subjected to dynamic keyword censorship that changes depending on current events and whether a user is in a private chat or in a group. Depending on the political winds, users are blocked from accessing a range of websites that report critically on China through WeChat’s internal browser. Non-Chinese users, however, are not subject to any of these restrictions.

The censorship is also designed to be invisible. Messages are blocked without any user notification, and China has intermittently blocked WhatsApp and other foreign social networks. As a result, Chinese users are steered toward national social networks, which are more compliant with government pressure.

China’s policies play into a larger global trend: the nationalization of the internet. China, Russia, the European Union, and the United States have all adopted different approaches to censorship, user privacy, and surveillance. Although some social networks, such as WeChat or Russia’s VKontakte, are popular primarily in one country, nationalizing the internet challenges users of multinational services such as Facebook and YouTube. These different approaches, which affect everything from data safe harbor laws to the legal consequences of posting inflammatory material, have implications for businesses working in multiple countries, as well.

For instance, Twitter is legally obligated to hide Nazi and neo-fascist imagery and some tweets in Germany and France—but not elsewhere. YouTube was officially banned in Turkey for two years because of videos a Turkish court deemed “insulting to the memory of Mustafa Kemal Atatürk,” father of modern Turkey. In Russia, Google must keep Russian users’ personal data on servers located inside Russia to comply with government policy.

While China is a pioneer in the field of instant censorship, tech companies in the United States are matching China’s progress, which could potentially have a chilling effect on democracy. In 2016, Apple applied for a patent on technology that censors audio streams in real time—automating the previously manual process of censoring curse words in streaming audio.

In March, after U.S. President Donald Trump told Fox News, “I think maybe I wouldn’t be [president] if it wasn’t for Twitter,” Twitter founder Evan “Ev” Williams did something highly unusual for the creator of a massive social network.

He apologized.

Speaking with David Streitfeld of The New York Times, Williams said, “It’s a very bad thing, Twitter’s role in that. If it’s true that he wouldn’t be president if it weren’t for Twitter, then yeah, I’m sorry.”

Entrepreneurs tend to be very proud of their innovations. Williams, however, offers a far more ambivalent response to his creation’s success. Much of the 2016 presidential election’s rancor was fueled by Twitter, and the instant gratification of Twitter attracts trolls, bullies, and bigots just as easily as it attracts politicians, celebrities, comedians, and sports fans.

Services such as Twitter, Facebook, YouTube, and Instagram are designed through a mix of look and feel, algorithmic wizardry, and psychological techniques to hang on to users for as long as possible—which helps the services sell more advertisements and make more money. Toxic political discourse and online harassment are unintended side effects of the economic-driven urge to keep users engaged no matter what.

Keeping users’ eyeballs on their screens requires endless hours of multivariate testing, user research, and algorithm refinement. For instance, Casey Newton of tech publication The Verge notes that Google Brain, Google’s AI division, plays a key part in generating YouTube’s video recommendations.

“Before, if I watch this video from a comedian, our recommendations were pretty good at saying, here’s another one just like it,” Jim McFadden, the technical lead for YouTube recommendations, told Newton. “But the Google Brain model figures out other comedians who are similar but not exactly the same—even more adjacent relationships. It’s able to see patterns that are less obvious.”
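For readers curious what those “adjacent relationships” look like mechanically, here is a toy sketch of similarity-based recommendation. It is not YouTube’s or Google Brain’s actual system; the items and embedding vectors below are invented. But the general principle of ranking candidates by how close their learned vectors sit to what the user just watched is what lets a recommender surface items that are similar without being identical:

```python
# Toy illustration of embedding-based "adjacent" recommendations
# (invented items and vectors; not any production system).
import numpy as np

embeddings = {
    "comedian_a_standup": np.array([0.90, 0.10, 0.30]),
    "comedian_b_standup": np.array([0.85, 0.15, 0.35]),   # very similar act
    "comedy_podcast_clip": np.array([0.70, 0.40, 0.30]),  # adjacent, not identical
    "cooking_tutorial": np.array([0.05, 0.90, 0.20]),     # unrelated
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means identical direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

just_watched = "comedian_a_standup"
watched_vec = embeddings[just_watched]

ranked = sorted(
    ((name, cosine_similarity(watched_vec, vec))
     for name, vec in embeddings.items() if name != just_watched),
    key=lambda pair: pair[1],
    reverse=True,
)
print(ranked)  # similar and adjacent items rank above the unrelated one
```

A strict nearest-neighbor match would surface only the near-duplicate; how far down that ranked list a system is willing to reach is part of what makes a feed feel varied rather than repetitive.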

A never-ending flow of content that is interesting without being repetitive is harder to resist. With users glued to online services, addiction and other behavioral problems occur to an unhealthy degree. According to a 2016 poll by nonprofit research company Common Sense Media, 50% of American teenagers believe they are addicted to their smartphones.

This pattern is extending into the workplace. Seventy-five percent of companies told research company Harris Poll in 2016 that two or more hours a day are lost in productivity because employees are distracted. The number one reason? Cellphones and texting, according to 55% of those companies surveyed. Another 41% pointed to the internet.

Tristan Harris, a former design ethicist at Google, argues that many product designers for online services try to exploit psychological vulnerabilities in a bid to keep users engaged for longer periods. Harris refers to an iPhone as “a slot machine in my pocket” and argues that user interface (UI) and user experience (UX) designers need to adopt something akin to a Hippocratic Oath to stop exploiting users’ psychological vulnerabilities.

In fact, there is an entire school of study devoted to “dark UX”: small design tweaks made to increase profits. These can be as innocuous as a “Buy Now” button in a visually pleasing color, or as controversial as Facebook’s 2012 experiment, in which the company tweaked its algorithm to show a randomly selected group of almost 700,000 users (who had not given their permission) newsfeeds skewed more positive for some and more negative for others, in order to gauge the impact on their emotional states, according to an article in Wired.

As computers, smartphones, and televisions come ever closer to convergence, these issues matter increasingly to businesses. Some of the universal side effects of addiction are lost productivity at work and poor health. Businesses should offer training and help for employees who can’t stop checking their smartphones.

Mindfulness-centered mobile apps such as Headspace, Calm, and Forest offer one way to break the habit. Users can also choose to break internet addiction by going for a walk, turning their computers off, or using tools like StayFocusd or Freedom to block addictive websites or apps.

Most importantly, companies in the business of creating tech products need to design software and hardware that discourages addictive behavior. This means avoiding bad designs that emphasize engagement metrics over human health. A world of advertising preroll showing up on smart refrigerator touchscreens at 2 a.m. benefits no one.

According to a 2014 study in Cyberpsychology, Behavior and Social Networking, approximately 6% of the world’s population suffers from internet addiction to one degree or another. As more users in emerging economies gain access to cheap data, smartphones, and laptops, that percentage will only increase. For businesses, getting a head start on stopping internet addiction will make employees happier and more productive.


About the Authors

Maurizio Cattaneo is Director, Delivery Execution, Energy, and Natural Resources, at SAP.

David Delaney is Global Vice President and Chief Medical Officer, SAP Health.

Volker Hildebrand is Global Vice President for SAP Hybris solutions.

Neal Ungerleider is a Los Angeles-based technology journalist and consultant.


Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.



No Longer Soft Skills: Five Crucial Workplace Skills Everyone Should Learn

Carmen O'Shea

My child’s elementary school focuses on skills it believes support children in becoming changemakers. Through an integrated, project-based curriculum, it explicitly teaches and assesses “learner values” such as iteration, risk, failure, collaboration, and perspective. The school’s philosophy is that these attributes, long considered “soft skills,” have become the crucial educational priorities for this generation.

Why do they believe this? Much knowledge is now easily accessed and readily queried, such that the acquisition of specific content or know-how is far less important than how to apply that content in different situations and how to interact with others in the pursuit of goals. This holds true in the workplace as well as the academic environment. When I think about how I operate in my job at a large technology company, it’s not really what I know but what I do with what I know, and whom I engage to get things accomplished.

Watching the school teach these skills just as they do math or language has made me stop and consider what they look like for an employee. I wanted to share my thoughts on five qualities beyond relevant academic skills or professional experience that are just as important (if not more so) in predicting top work performance. These are more qualitative skills that managers should hire for, employees should develop, and organizations should optimize for.

  • Empathy: the ability to see and integrate multiple perspectives and to understand the impact of how others think. Empathy can also mean advocating and showing compassion for oneself and for others. Empathy is assuming good intent even when someone has said or done something we dislike – stopping to pause, attempting to understand, and responding compassionately in a difficult workplace situation. Empathy also extends beyond the professional environment to a more personal level, intuiting what truly drives a colleague or employee.
  • Resilience: the ability to take risks even when you know you may fail, and then to bounce back, sometimes repeatedly, from failure. Inherent in resilience is the idea of iteration – that it is often essential to try things multiple times, in multiple ways, from multiple angles, before achieving a desired outcome. Resilience is receiving difficult yet constructive feedback from a manager or peer and resolving to act positively on it instead of wallowing or harboring a grudge. Resilience is maintaining a sense of optimism even in a down quarter at work.
  • Creativity: the ability to think differently or expansively and to approach a problem from multiple angles. Sometimes it’s called “thinking outside the box.” Creativity often includes inquiry, the act of questioning and satisfying one’s curiosity about particular topics. Torrance defined it along several parameters – the number of ideas generated, the number of categories of ideas, the originality of ideas, and how thoroughly each idea is elaborated. We see it in action during the brainstorming phases of projects, but it’s also possible to apply creativity on a continual basis: by pushing colleagues to expand on their thoughts, by not being satisfied with a less-than-stellar answer, by taking time to understand how multiple approaches to an issue could be combined, or by simply trying something new in a familiar situation.
  • Collaboration: the ability to interact and work productively with others, in groups of any size. Effective collaboration requires empathy, especially when collaborators have different backgrounds, styles, or thought processes. Collaboration also requires exemplary communication skills, both oral and written, as well as reflective listening. So many of our tasks on the job require collaboration with others, whether to inform, persuade, learn, or engage, and these interactions form the bedrock for innovation. It’s tough to innovate without collaborating.
  • Flexibility: the ability to adapt or change course if that is what the situation demands. Flexibility includes letting go of one’s own idea in the interest of attaining a goal more quickly. It can also include developing a comfort level with uncertainty or ambiguity, especially in times of change. Flexibility is a willingness to absorb feedback objectively and course-correct as needed without personalizing the information or demonizing the person who provided it. Expounding on another’s idea (not our own) in a brainstorming session demonstrates flexibility, as does remaining calm while an org change takes effect and roles are temporarily unclear.

When employees exhibit these qualities, they are better able to understand their purpose at work and to unleash their passions in the pursuit of that purpose. When teams exhibit these qualities, achievement and employee engagement are higher.  I wager that retention and innovation will improve as well. It’s heartening that as a society we’re beginning to consider how to best prepare our children educationally for the kind of work environments they will encounter after they finish their academic journey.

Do you also see these qualities as valuable in assessing employee fit? How can managers and organizations better identify, train and reward employees for living these qualities?

For more on this topic, see Your Business Needs People With Skills, Not Just Qualifications.



About Carmen O'Shea

Carmen O’Shea is the Senior Vice President of HR Change & Engagement at SAP. She leads a global team supporting major transformation initiatives across the company, focused on change management, employee engagement, and creative marketing and interaction. You can follow Carmen on Twitter.