
How To Really Boost Mining Safety And Productivity

Ruediger Schroedter

Digital business is all about streamlining. Every organization collects data, but not all know how to use it. At their best, data and analytics help businesses make smarter decisions. The key is knowing what to measure—and who to share it with.

Today, new enterprise software is revolutionizing mining. From predictive software to 3D printing, the future of mining is here. But to harness its power, companies must first reimagine their business processes.

Predictability is bliss

Everyone can agree that the fewer unexpected disruptions, the better. But how can mines reduce unwelcome surprises? The answer, in a nutshell, is to measure everything.

Digital tools help employees to track alerts and interruptions in real time. And through predictive sensors, operators know which machines need repair—and when. This powerful technology keeps all site teams informed and in the loop.
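To make that idea concrete, here is a minimal sketch, assuming a made-up vibration sensor and threshold, of how a rolling average over recent readings might flag a machine for maintenance. It illustrates the concept only and is not any vendor's actual monitoring product.

```python
# Illustrative only: flag machines whose recent vibration readings drift
# above a maintenance threshold. Sensor names and limits are invented.
from statistics import mean

MAINTENANCE_THRESHOLD_MM_S = 7.1  # hypothetical vibration limit (mm/s RMS)
WINDOW = 10                       # number of recent readings to average

def needs_repair(recent_readings):
    """Return True if the rolling average exceeds the threshold."""
    window = recent_readings[-WINDOW:]
    return len(window) == WINDOW and mean(window) > MAINTENANCE_THRESHOLD_MM_S

readings = {
    "crusher_3": [6.8, 7.0, 7.2, 7.4, 7.3, 7.5, 7.6, 7.4, 7.7, 7.8],
    "conveyor_1": [2.1, 2.0, 2.2, 2.1, 2.3, 2.2, 2.1, 2.0, 2.2, 2.1],
}

for machine, values in readings.items():
    if needs_repair(values):
        print(f"ALERT: schedule maintenance for {machine}")
```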

Bonus: Click here to learn more about Digital Transformation in Mining

But what about the safety of the site itself? Whether in metals or coal, no site is free of risks. In the past, determining reserves and ore quality took a lot of effort and manpower. But new sensor-based tools are expediting the decision-making process. Thanks to geomodeling, determining project viability is now more accurate—and more cost-effective.

With the site developed, companies can now maximize its potential. At AngloGold Ashanti, they use geomodeling to identify high-grade ore bodies. Then through half-level mining, they can target the ore and avoid the waste. Through digital transformation, AngloGold Ashanti's metals extraction processes are more precise, predictable, and profitable.

Prioritizing sustainability and safety

Sustainability is more important in mining than ever before. Today’s mines must meet strict goals without losing productivity. Meanwhile, renewable energy is becoming cheaper and more practical. New mines will depend on these resources and use real-time analytics to track output. Embracing sustainability now may help attract capital and create new revenue streams.

Just as important to digital business is safety. But how can digital tools help? A survey by The Economist found promise in mobile applications. Almost half of organizations used mobile to communicate in the field or underground. Many also used mobile to track on-site hazards as they develop.

For example, at Codelco, underground workers wear bionic jackets with safety sensors. These tools provide real-time monitoring of location and crucial environmental factors. By tracking light, sound, and oxygen, mine workers can expect and avoid potential danger.
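As a rough illustration of how such a check might work, a wearable could simply compare each reading against a configured safety range. The sensor names and limits below are assumptions for the example, not Codelco's actual values.

```python
# Illustrative sketch of checking wearable sensor readings against safety
# limits. The limits below are assumptions, not Codelco's real thresholds.
SAFETY_LIMITS = {
    "oxygen_pct": (19.5, None),   # minimum acceptable oxygen level
    "noise_db": (None, 85.0),     # maximum sustained noise exposure
    "light_lux": (50.0, None),    # minimum illumination for safe work
}

def check_reading(metric, value):
    low, high = SAFETY_LIMITS[metric]
    if low is not None and value < low:
        return f"WARNING: {metric} too low ({value})"
    if high is not None and value > high:
        return f"WARNING: {metric} too high ({value})"
    return None

sample = {"oxygen_pct": 18.9, "noise_db": 92.0, "light_lux": 120.0}
for metric, value in sample.items():
    warning = check_reading(metric, value)
    if warning:
        print(warning)
```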


3D printing and the supply chain

How can businesses create a more efficient, cost-effective supply chain? Digital transformation thrives on collaboration and customization. And with 3D printing, warehouses of spare parts and stockpiles will be a thing of the past. These powerful devices will speed up repairs and reduce remote shipping costs.

Here’s an example: Mines rely on topographic maps for safe and effective resource extraction. Two-dimensional maps may be the standard, but 3D maps are proving far more reliable. Not only do these maps improve planning, they also help investors analyze site potential.

In fact, 3D printing may benefit the metals industry in other ways as well. Many industrial 3D printers rely on metal powders for production, made from titanium, steel, cobalt, and other raw materials. With these resources in demand, many hope that 3D printing will help stabilize market prices.

The benefits of digital transformation are everywhere, from smarter mine modeling to sustainable sourcing. This allows for real-time responses and decisive, efficient action. The net result? A more powerful, productive digital mine—and a safer one.

Join a live Twitter chat on Digitalization in Mining on May 4th from 10-11 a.m. EST: #digitalmining

The global mining and metals industry is coming together to discuss how digital innovation is impacting the mining industry, July 12-14 at the International SAP Conference for Mining and Metals in Frankfurt, Germany. Find out more and register. Don’t miss this opportunity to meet with world leaders and learn how your organization can become a connected digital enterprise.

Keep up with speakers, attendees, and pre-event activities by following sapmmconf and @sapmillmining on Twitter.


Ruediger Schroedter

About Ruediger Schroedter

Ruediger Schroedter is responsible for solution management of SAP solutions for the mining industry worldwide. He has spent more than 15 years in the mill products and mining industries and gained extensive experience implementing SAP solutions for customers in these industries before joining SAP.

Data Analysts And Scientists More Important Than Ever For The Enterprise

Daniel Newman

The business world is now firmly in the age of data. Not that data wasn't relevant before; it was just nowhere close to the speed and volume that's available to us today. Businesses are buckling under the deluge of petabytes, exabytes, and zettabytes. Within these bytes lies valuable information on customer behavior, key business insights, and revenue generation. However, all that data is practically useless for businesses without the ability to identify the right data. Plus, if they don't have the talent and resources to capture the right data, organize it, dissect it, draw actionable insights from it and, finally, deliver those insights in a meaningful way, their data initiatives will fail.

Rise of the CDO

Companies of all sizes can easily find themselves drowning in data generated from websites, landing pages, social streams, emails, text messages, and many other sources. Additionally, there is data in their own repositories. With so much data at their disposal, companies are under mounting pressure to utilize it to generate insights. These insights are critical because they can (and should) drive the overall business strategy and help companies make better business decisions. To leverage the power of data analytics, businesses need more "top-management muscle" specialized in the field of data science. This specialized field has led to the creation of roles like Chief Data Officer (CDO).

In addition, with more companies undertaking digital transformations, there’s greater impetus for the C-suite to make data-driven decisions. The CDO helps make data-driven decisions and also develops a digital business strategy around those decisions. As data grows at an unstoppable rate, becoming an inseparable part of key business functions, we will see the CDO act as a bridge between other C-suite execs.

Data skills an emerging business necessity

So far, only large enterprises with bigger data mining and management needs maintain in-house solutions. These in-house teams and technologies handle the growing sets of diverse and dispersed data. Others work with third-party service providers to develop and execute their big data strategies.

As the amount of data grows, the need to mine it for insights becomes a key business requirement. For both large and small businesses, data-centric roles such as data analysts and data scientists will experience continued upward mobility. There is going to be a huge opportunity for critical thinkers to turn their analytical skills into rapidly growing roles in the field of data science. In fact, data skills are now a prized qualification for titles like IT project manager and computer systems analyst.

Forbes cited the McKinsey Global Institute's prediction that by 2018 there could be a massive shortage of data-skilled professionals. This indicates a disruption at the demand-supply level, with the need for data skills at an all-time high. With an increasing number of companies adopting big data strategies, salaries for data jobs are going through the roof, turning these positions into highly coveted ones.

According to Harvard Professor Gary King, “There is a big data revolution. The big data revolution is that now we can do something with the data.” The big problem is that most enterprises don’t know what to do with data. Data professionals are helping businesses figure that out. So if you’re casting about for where to apply your skills and want to take advantage of one of the best career paths in the job market today, focus on data science.

I’m compensated by University of Phoenix for this blog. As always, all thoughts and opinions are my own.

For more insight on our increasingly connected future, see The $19 Trillion Question: Are You Undervaluing The Internet Of Things?

The post Data Analysts and Scientists More Important Than Ever For the Enterprise appeared first on Millennial CEO.


Daniel Newman

About Daniel Newman

Daniel Newman serves as the Co-Founder and CEO of EC3, a quickly growing hosted IT and communication service provider. Prior to this role, Daniel held several prominent leadership roles, including serving as CEO of United Visual, parent company to United Visual Systems, United Visual Productions, and United GlobalComm, a family of companies focused on visual communications and audio-visual technologies. Daniel is also widely published and active in the social media community. He is the author of the Amazon best-selling business book "The Millennial CEO." Daniel also co-founded the global online community 12 Most and was recognized by the Huffington Post as one of the 100 Business and Leadership Accounts to Follow on Twitter. Newman is an Adjunct Professor of Management at North Central College. He attained his undergraduate degree in Marketing at Northern Illinois University and an Executive MBA from North Central College in Naperville, IL. Newman currently resides in Aurora, Illinois, with his wife (Lisa) and his two daughters (Hailey, 9, and Avery, 5). A lifelong Chicago native, Newman is an avid golfer, a fitness fan, and a classically trained pianist.

When Good Is Good Enough: Guiding Business Users On BI Practices

Ina Felsheim

In Part One of this blog series, I talked about changing your IT culture to better support self-service BI and data discovery. Absolutely essential. However, your work is not done!

Self-service BI and data discovery will rapidly expand the number of users working with your BI solutions. Yet many of these more casual users will not be well versed in BI and visualization best practices.

When your user base rapidly expands to more casual users, you need to help educate them on what is important. For example, one IT manager told me that his casual BI users were making visualizations with very difficult-to-read charts and customizing color palettes to incredible degrees.

I had a similar experience when I was a technical writer. One of our lead writers was so concerned with the readability of every sentence that he went through the 300+ page manuals (yes, they were printed then) and manually adjusted all of the line breaks and page breaks. (!) Yes, readability was incrementally improved. But then any number of changes (technical capabilities, edits, inserting larger graphics) required re-adjusting all of those manual "optimizations." The time it took just to do the additional optimization was incredible, to say nothing of maintaining it! Meanwhile, the technical writing team was falling behind on new deliverables.

The same scenario applies to your new casual BI users. This new group needs guidance to help them focus on the highest value practices:

  • Customization of color and appearance of visualizations: When is this customization necessary for a management deliverable, versus indulging an OCD tendency? I too have to stop myself from obsessing about the font, line spacing, and that a certain blue is just a bit different than another shade of blue. Yes, these options do matter. But help these casual users determine when that time is well spent.
  • Proper visualizations: When is a spinning 3D pie chart necessary to grab someone’s attention? BI professionals would firmly say “NEVER!” But these casual users do not have a lot of depth on BI best practices. Give them a few simple guidelines on when “flash” should give way to understanding. Consider offering a monthly one-hour Lunch and Learn that shows them how to create impactful, polished visuals. Understanding whether their visualizations will be viewed casually on the way to a meeting or dissected at a laptop also helps determine how much time to spend optimizing a visualization. No, you can’t just mandate that they all read Tufte.
  • Predictive: Provide advanced analytics capabilities like forecasting and regression directly in their casual BI tools. Using these capabilities will really help them wow their audience with substance instead of flash (see the simple forecasting sketch after this list).
  • Feature requests: Make sure you understand the motivation and business value behind some of the casual users’ requests. These casual users are less likely to understand the implications of supporting specific requests across an enterprise, so make sure you are collaborating on use cases and priorities for substantive requests.
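For the predictive point above, here is a minimal sketch of the kind of simple trend forecast a casual BI user might run: a least-squares line fit projected forward. The sales figures are invented for illustration, and a real BI tool would wrap this in a point-and-click interface.

```python
# A minimal trend forecast of the kind casual BI users might apply:
# fit a straight line to monthly sales and project the next quarter.
import numpy as np

months = np.arange(1, 13)                        # Jan..Dec
sales = np.array([102, 108, 111, 115, 121, 124,
                  130, 133, 138, 141, 147, 151])  # units sold (invented)

slope, intercept = np.polyfit(months, sales, 1)  # least-squares linear fit

for future_month in (13, 14, 15):
    forecast = slope * future_month + intercept
    print(f"Month {future_month}: forecast {forecast:.0f} units")
```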

By working with your casual BI users on the above points, you will be able to collectively understand when the absolute exact request is critical (and supports good visualization practices), and when it is an “optimization” that may impact productivity. In many cases, “good” is good enough for the fast turnaround of data discovery.

Next week, I’ll wrap this series up with hints on getting your casual users to embrace the “we” not “me” mentality.

Read Part One of this series: Changing The IT Culture For Self-Service BI Success.

Follow me on Twitter: @InaSAP


More Than Noise: 5 Digital Stories From 2016 That Are Bigger Than You Think

Dan Wellers, Michael Rander, Kai Göerlich, Josh Waddell, Saravana Chandran, and Stephanie Overby

These days it seems that we are witnessing waves of extreme disruption rather than incremental technology change. While some tech news stories have been just so much noise, unlikely to have long-term impact, a few are important signals of much bigger, longer-term changes afoot.

From bots to blockchains, augmented realities to human-machine convergence, a number of rapidly advancing technological capabilities hit important inflection points in 2016. We looked at five important emerging technology news stories that happened this year and the trends set in motion that will have an impact for a long time to come.


Immersive experiences were one of three top-level trends identified by Gartner for 2016, and that was evident in the enormous popularity of Pokémon Go. While the hype may have come and gone, the immersive technologies that have been quietly advancing in the background for years are ready to boil over into the big time—and into the enterprise.

The free location-based augmented reality (AR) game took off shortly after Nintendo launched it in July, and it became the most downloaded app in Apple’s app store history in its first week, as reported by TechCrunch. Average daily usage of the app on Android devices in July 2016 exceeded that of the standard-bearers Snapchat, Instagram, and Facebook, according to SimilarWeb. Within two months, Pokémon Go had generated more than US$440 million, according to Sensor Tower.

Unlike virtual reality (VR), which immerses us in a simulated world, AR layers computer-generated information such as graphics, sound, or other data on top of our view of the real world. In the case of Pokémon Go, players venture through the physical world using a digital map to search for Pokémon characters.

The game’s instant global acceptance was a surprise. Most watching this space expected an immersive headset device like Oculus Rift or Google Cardboard to steal the headlines. But it took Pikachu and the gang to break through. Pokémon Go capitalized on a generation’s nostalgia for its childhood and harnessed the latest advancements in key AR enabling technologies such as geolocation and computer vision.
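For a sense of what the geolocation piece involves, here is a toy sketch (not the game's actual code) of the proximity check an AR game might perform, using the haversine formula to measure the distance between two GPS points. The coordinates and the 40-meter "encounter radius" are assumptions for the example.

```python
# Toy illustration of a geolocation proximity check like the one an AR
# game performs. Coordinates and the 40 m encounter radius are invented.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

player = (50.1109, 8.6821)     # hypothetical player position
character = (50.1111, 8.6824)  # hypothetical character spawn point

if haversine_m(*player, *character) <= 40:
    print("Character appears on screen")
```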

Just as mobile technologies percolated inside companies for several years before the iPhone exploded onto the market, companies have been dabbling in AR since the beginning of the decade. IKEA created an AR catalog app in 2013 to help customers visualize how their KIVIK modular sofa, for example, would look in their living rooms. Mitsubishi Electric has been perfecting an AR application, introduced in 2011, that enables homeowners to visualize its HVAC products in their homes. Newport News Shipbuilding has launched some 30 AR projects to help the company build and maintain its vessels. Tech giants including Facebook, HP, and Apple have been snapping up immersive tech startups for some time.

The overnight success of Pokémon Go will fuel interest in and understanding of all mediated reality technology—virtual and augmented. It’s created a shorthand for describing immersive reality and could launch a wave of technology consumerization the likes of which we haven’t seen since the iPhone instigated a tsunami of smartphone usage. Enterprises would be wise to figure out the role of immersive technology sooner rather than later. “AR and VR will both be the new normal within five years,” says futurist Gerd Leonhard, noting that the biggest hurdles may be mobile bandwidth availability and concerns about sensory overload. “Pokémon is an obvious opening scene only—professional use of AR and VR will explode.”


Blockchains, the decentralized digital ledgers of transactions that are processed by a distributed network, first made headlines as the foundation for new types of financial transactions beginning with Bitcoin in 2009. According to Greenwich Associates, financial and technology companies will invest an estimated $1 billion in blockchain technology in 2016. But, as Gartner recently pointed out, there could be even more rapid evolution and acceptance in the areas of manufacturing, government, healthcare, and education.

By the 2020s, blockchain-based systems will reduce or eliminate many points of friction for a variety of business transactions. Individuals and companies will be able to exchange a wide range of digitized or digitally represented assets and value with anyone else, according to PwC. The supervised peer-to-peer network concept “is the future,” says Leonhard.

But the most important blockchain-related news of 2016 revealed a weak link in the application of technology that is touted as an immutable record.

In theory, blockchain technology creates a highly tamper-resistant structure that makes transactions secure and verifiable through a massively distributed digital ledger. All the transactions that take place are recorded in this ledger, which lives on many computers. High-grade encryption makes it nearly impossible for someone to cheat the system.
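To see why tampering is so hard, here is a minimal sketch of the hash-chaining idea at the core of a blockchain. It leaves out the distributed consensus, signatures, and mining that a real system adds; it only shows why altering an old record invalidates everything recorded after it.

```python
# Minimal sketch of hash-chaining: each block stores the hash of the
# previous block, so changing history breaks every later hash check.
import hashlib
import json

def block_hash(content):
    return hashlib.sha256(json.dumps(content, sort_keys=True).encode()).hexdigest()

def append_block(chain, transaction):
    previous = chain[-1]["hash"] if chain else "0" * 64
    content = {"transaction": transaction, "previous_hash": previous}
    chain.append({**content, "hash": block_hash(content)})

def is_valid(chain):
    for i, block in enumerate(chain):
        content = {"transaction": block["transaction"],
                   "previous_hash": block["previous_hash"]}
        if block["hash"] != block_hash(content):
            return False
        if i > 0 and block["previous_hash"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = []
append_block(ledger, {"from": "alice", "to": "bob", "amount": 5})
append_block(ledger, {"from": "bob", "to": "carol", "amount": 2})
print(is_valid(ledger))                    # True
ledger[0]["transaction"]["amount"] = 500   # tamper with history
print(is_valid(ledger))                    # False: stored hash no longer matches
```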

In practice, however, blockchain-based transactions and contracts are only as good as the code that enables them.

Case in point: The DAO, one of the first major implementations of a “Decentralized Autonomous Organization” (for which the fund is named). The DAO was a crowdfunded venture capital fund using cryptocurrency for investments and run through smart contracts. The rules that govern those smart contracts, along with all financial transaction records, are maintained on the blockchain. In June, the DAO revealed that an individual exploited a vulnerability in the company’s smart contract code to take control of nearly $60 million worth of the company’s digital currency.

The fund’s investors voted to basically rewrite the smart contract code and roll back the transaction, in essence going against the intent of blockchain-based smart contracts, which are supposed to be irreversible once they self-execute.

The DAO’s experience confirmed one of the inherent risks of distributed ledger technology—and, in particular, the risk of running a very large fund autonomously through smart contracts based on blockchain technology. Smart contract code must be as error-free as possible. As Cornell University professor and hacker Emin Gün Sirer wrote in his blog, “writing a robust, secure smart contract requires extreme amounts of diligence. It’s more similar to writing code for a nuclear power reactor, than to writing loose web code.” Since smart contracts are intended to be executed irreversibly on the blockchain, their code should not be rewritten and improved over time, as software typically is. But since no code can ever be completely airtight, smart contracts may have to build in contingency plans for when weaknesses in their code are exploited.
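The DAO exploit is widely described as a reentrancy flaw: funds were sent out before the internal balance was updated, so the attacker's code could call back in and withdraw again. Here is a simplified Python analogue (not Solidity, and not the DAO's actual code) of that ordering bug; the names and figures are invented.

```python
# Simplified analogue of a reentrancy-style flaw: the external call
# happens before the balance is zeroed, so a malicious callback can
# trigger repeated withdrawals from the same balance.
class VulnerableFund:
    def __init__(self, balances):
        self.balances = dict(balances)

    def withdraw(self, account, send):
        amount = self.balances[account]
        if amount > 0:
            send(amount)                  # BUG: external call happens first...
            self.balances[account] = 0    # ...balance is only zeroed afterwards

fund = VulnerableFund({"attacker": 10})
stolen = []

def malicious_send(amount):
    stolen.append(amount)
    if len(stolen) < 3:                   # re-enter withdraw before the reset
        fund.withdraw("attacker", malicious_send)

fund.withdraw("attacker", malicious_send)
print(sum(stolen))  # 30 drained from a 10-unit balance
```

The usual remedy, often called the checks-effects-interactions pattern, is simply to update the balance before making the external call.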

Importantly, the incident was not a result of any inherent weakness in the blockchain or distributed ledger technology generally. It will not be the end of cryptocurrencies or smart contracts. And it’s leading to more consideration of editable blockchains, which proponents say would only be used in extraordinary circumstances, according to Technology Review.


Application programming interfaces (APIs), the computer codes that serve as a bridge between software applications, are not traditionally a hot topic outside of coder circles. But they are critical components in much of the consumer technology we’ve all come to rely on day-to-day.

One of the most important events in API history was the introduction of such an interface for Google Maps a decade ago. The map app was so popular that everyone wanted to incorporate its capabilities into their own systems. So Google released an API that enabled developers to connect to and use the technology without having to hack into it. The result was the launch of hundreds of inventive location-enabled apps using Google technology. Today, millions of web sites and apps use Google Maps APIs, from Allstate’s GoodHome app, which shows homeowners a personalized risk assessment of their properties, to Harley-Davidson’s Ride Planner to 7-Eleven’s app for finding the nearest Slurpee.
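In practice, "using the API" amounts to a short HTTP request. The sketch below calls the Google Maps Geocoding web service to turn an address into coordinates; it assumes you have registered for an API key, and error handling is omitted for brevity.

```python
# Minimal example of calling the Google Maps Geocoding API over HTTP.
# Assumes a valid API key; real code would check the response status.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder

response = requests.get(
    "https://maps.googleapis.com/maps/api/geocode/json",
    params={"address": "1600 Amphitheatre Parkway, Mountain View, CA",
            "key": API_KEY},
)
result = response.json()["results"][0]
location = result["geometry"]["location"]
print(location["lat"], location["lng"])
```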

Ultimately, it became de rigueur for apps to open up their systems in a safe way for experimentation by others through APIs. Technology professional Kin Lane, who tracks the now enormous world of APIs, has said, “APIs bring together a unique blend of technology, business, and politics into a transparent, self-service mix that can foster innovation.”

Thus it was significant when Apple announced in June that it would open up Siri to third-party developers through an API, giving the wider world the ability to integrate Siri’s voice commands into their apps. The move came on the heels of similar decisions by Amazon, Facebook, and Microsoft, all of which have AI bots or assistants of their own. And in October, Google opened up its Google Assistant as well.

The introduction of APIs confirms that the AI technology behind these bots has matured significantly—and that a new wave of AI-based innovation is nigh.

The best way to spark that innovation is to open up AI technologies such as Siri so that coders can use them as platforms to build new apps that can more rapidly expand AI uses and capabilities. Call it the “platformication” of AI. The value will be less in the specific AI products a company introduces than in the value of the platform for innovation. And that depends on the quality of the API. The tech company that attracts the best and brightest will win. AI platforms are just beginning to emerge and the question is: Who will be the platform leader?


In June, Swiss citizens voted on a proposal to introduce a guaranteed basic income for all of its citizens, as reported by BBC News. It was the first country to take the issue to the polls, but it won’t be the last. Discussions about the impact of both automation and the advancing gig economy on individual livelihoods are happening around the world. Other countries—including the United States—are looking at solutions to the problem. Both Finland and the Netherlands have universal guaranteed income pilots planned for next year. Meanwhile, American startup incubator Y Combinator is launching an experiment to give 100 families in Oakland, California, a minimum wage for five years with no strings attached, according to Quartz.

The world is on the verge of potential job loss at a scale and speed never seen before. The Industrial Revolution was more of an evolution, happening over more than a century. The ongoing digital revolution is happening in relative hyper speed.

No one is exactly sure how increased automation and digitization will affect the world’s workforce. One 2013 study suggests that as much as 47% of the U.S. workforce is at risk of being replaced by machines over the next two decades, but even a conservative estimate of 10% could have a dramatic impact, not just on workers but on society as a whole.

The proposed solution in Switzerland did not pass, in part because a major political party did not introduce it, and citizens are only beginning to consider the potential implications of digitization on their incomes. What’s more, the idea of simply guaranteeing pay runs contrary to long-held notions in many societies that humans ought to earn their keep.

Whether or not state-funded support is the answer is just one of the questions that must be answered. The votes and pilots underway make it clear that governments will have to respond with some policy measures. The question is: What will those measures be? The larger impact of mass job displacement, what future employment conditions might look like, and what the responsibilities of institutions are in ensuring that we can support ourselves are among the issues that policy makers will need to address.

New business models resulting from digitization will create some new types of roles—but those will require training and perhaps continued education. And not all of those who will be displaced will be in a position to remake their careers. Just consider taxi drivers: In the United States, about 223,000 people currently earn their living behind the wheel of a hired car. The average New York livery driver is 46 years old, according to the New York City Taxi and Limousine Commission, and no formal education is required. When self-driving cars take over, those jobs will go away and the men and women who held them may not be qualified for the new positions that emerge.

As digitization dramatically changes the constructs of commerce and work, no one is quite sure how people will be impacted. But waiting to see how it all shakes out is not a winning strategy. Companies and governments today will have to experiment with potential solutions before the severity of the problem is clear. Among the questions that will have to be answered: How can we retrain large parts of the workforce? How will we support those who fall through the cracks? Will we prioritize and fund education? Technological progress and shifting work models will continue, whether or not we plan for their consequences.


In April, a young man, who was believed to have permanently lost feeling in and control over his hands and legs as the result of a devastating spine injury, became able to use his right hand and fingers again. He used technology that transmits his thoughts directly to his hand muscles, bypassing his injured spinal cord. Doctors implanted a computer chip into the quadriplegic’s brain two years ago and—with ongoing training and practice—he can now perform everyday tasks like pouring from a bottle and playing video games.

The system reconnected the man’s brain directly to his muscles—the first time that engineers have successfully bypassed the nervous system’s information superhighway, the spinal cord. It’s the medical equivalent of moving from wired to wireless computing.

The man has in essence become a cyborg, a term first coined in 1960 to describe “self-regulating human-machine systems.” Yet the beneficiary of this scientific advance himself said, “You’re not going to be looked on as, ‘Oh, I’m a cyborg now because I have this big huge prosthetic on the side of my arm.’ It’s something a lot more natural and intuitive to learn because I can see my own hand reacting.”

As described in IEEE Spectrum, the “neural-bypass system” records signals that the man generates when thinking about moving his hand, decodes those signals, and routes them to the electric sleeve around his arm to stimulate movement: “The result looks surprisingly simple and natural: When Burkhart thinks about picking up a bottle, he picks up the bottle. When he thinks about playing a chord in Guitar Hero, he plays the chord.”

What seems straightforward on the surface is powered by a sophisticated algorithm that can analyze the vast amounts of data the man’s brain produces, separating important signals from noise.
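As a toy sketch of that signal-versus-noise problem (vastly simpler than a real neural decoder, with invented numbers), one can smooth noisy samples and apply a threshold to detect when an intended movement begins:

```python
# Toy sketch of separating signal from noise: smooth noisy samples with a
# moving average, then threshold to detect an "intended movement".
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 400)
intent = (t > 1.0).astype(float)           # the "real" signal: intent switches on at t = 1 s
raw = intent + rng.normal(0, 0.6, t.size)  # heavy sensor noise on top

window = 25
smoothed = np.convolve(raw, np.ones(window) / window, mode="same")
detected = smoothed > 0.5                  # simple threshold decoder

onset = t[np.argmax(detected)]
print(f"Movement intent detected at about t = {onset:.2f} s")
```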

The fact that engineers have begun to unlock the complex code that controls brain-body communication opens up enormous possibilities. Neural prostheses (cochlear implants) have already reversed hearing loss. Light-sensitive chips serving as artificial retinas are showing progress in restoring vision. Other researchers are exploring computer implants that can read human thoughts directly to signal an external computer to help people speak or move in new ways. “Human and machine are converging,” says Leonhard.

The National Academy of Engineering predicts that “the intersection of engineering and neuroscience promises great advances in healthcare, manufacturing, and communication.”

Burkhart spent two years in training with the computer that has helped power his arm to get this far. It’s the result of more than a decade of development in brain-computer interfaces. And it can currently be used only in the lab; researchers are working on a system for home use. But it’s a clear indication of how quickly the lines between man and machine are blurring—and it opens the door for further computerized reanimation in many new scenarios.

This fall, Switzerland hosted its first cyborg Olympics, in which disabled patients compete using the latest assistive technologies, including robot exoskeletons and brainwave-readers. Paraplegic athletes use electrical stimulation systems to compete in cycling, for example. The winners are those who can control their device the best. “Instead of celebrating the human body moving under its own power,” said a recent article in IEEE Spectrum, “the cyborg games will celebrate the strength and ingenuity of human-machine collaborations.” D!

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


Dan Wellers

About Dan Wellers

Dan Wellers is the Global Lead of Digital Futures at SAP, which explores how organizations can anticipate the future impact of exponential technologies. Dan has extensive experience in technology marketing and business strategy, plus management, consulting, and sales.


The Future Of Work Is Now

Stefan Ries

Far beyond collaboration, the digitization of work determines how we work and engage people. Technologies such as artificial intelligence, machine learning, robotics, analytics, and the cloud change the way we recruit, develop talent, and make our workforce more inclusive. They also introduce new jobs, largely with different skill-set requirements. Some of today’s most-wanted jobs did not exist five years ago – and many jobs we wouldn’t even imagine today will arise in the near future. Our workplace is changing at light speed.

“Beyond collaboration, the digitization of work determines how we work and engage people”

Technology accelerates the transformation of businesses and industries. We need to prepare our businesses for the future and anticipate skills requirements and workforce changes. While some of these developments are unpredictable, it is up to thought and industry leaders like us to take control and shape the future of work.

SAP Future Factor, an interactive Web series: Engaging with thought leaders about the future of work

Welcome to the SAP Future Factor Web Salon, an interactive Web series featuring perspectives of thought leaders from academia, business, and government about the workplace of the future. The series drives a continuous exchange about the impacts of digitization on organizations and shares insight on innovative practices already in place.

The inaugural episode features SAP chief human resources officer Stefan Ries and Kevin Kruse, leadership expert and author of the New York Times best-seller “We: How to Increase Performance and Profits Through Full Engagement.” The two thought leaders exchange views on the opportunities and challenges of a digitized workplace and business culture. Their discussion will touch on the rising digital workplace, new ways to collaborate, the role technology plays to foster diversity and inclusion, employee engagement, and talent development.

Choose the topics that match your needs

Tomorrow’s workplace is all about choices – and so is the format of the SAP Future Factor Web series. All episodes are fully interactive, giving you the opportunity to interact with the content of the video by choosing topics of interest to you and your business. You determine what you would like to view and learn about, and in what order.

Episode 1 features the following topics:

  • Impacts of Digitization
  • HR’s Role in a Digitized World
  • Cloud Culture
  • Business Beyond Bias
  • Man vs. Machine
  • Rise of Social Intelligence

The future is now. Engage with us in the SAP Future Factor!

We hope you will enjoy the first episode. Tell us what you think.

Are the biggest trends from the last year on your radar screen? See More Than Noise: 5 Digital Stories From 2016 That Are Bigger Than You Think.


Stefan Ries

About Stefan Ries

Stefan Ries is Chief Human Resources Officer (CHRO), Labor Relations Director, and a member of the Executive Board of SAP SE. Stefan was born in Bavaria and raised in Constance, Germany, where he spent most of his youth. After receiving his master’s degree in business economics from the University of Constance in 1991, he moved to Munich. He started his career as an HR manager at Microsoft, overseeing HR duties in Austria, Switzerland, and Eastern European countries. In July 1994, he went on to lead the HR function for Compaq Computer in Europe, the Middle East, and Africa. Following the company’s acquisitions of Tandem Computers and Digital Equipment Corporation in 1999 and 2000, Stefan led the entire HR organization for Compaq in Germany. Stefan first joined SAP in 2002 and later became responsible for various HR functions, heading up the HR business partner organization and overseeing all HR functions on an operational level. To support innovation, Stefan attaches great importance to a diverse working culture. He is convinced that appreciating the differences among people, their unique backgrounds, and their personalities is a key success factor for SAP.