
5 Ways The Internet Of Things Is Changing The Game For Education And Learning

Geetika Shukla

There’s been so much buzz about the Internet of Things (IoT) lately – maybe not as much as for the U.S. presidential campaigns, but it’s pretty close. For today’s youngsters, the day will come when a computer is no longer seen as a separate object or device. With technology very much entwined in the basic fabric of everyday living, our children might feel offended if their obedient room lamp doesn’t immediately acknowledge their presence by switching itself on.

Over time, the IoT will become a mindset rather than a steady stream of new technology. Every other device in our homes, workplaces, and surrounding environments will be intelligent enough to connect and talk to the others, but what people will inevitably focus on is the transformational possibilities for our world.

The realm of education is no exception to the IoT’s influence. Until now, educational technology has pivoted more or less around virtual conferencing and classrooms, online tutorials, and similar offerings. However, this is only the beginning. Here are five ways the IoT can transform education.

1. Connect academies all over the map

Some of the latest IoT artillery in this field includes digital highlighters, smart boards, and even smarter boards. Tools like C-Pen and Scanmarker can transfer printed text to your smartphone or another connected app at incredible speed, while interactive boards can receive, acknowledge, and reciprocate information, simplifying and accelerating the overall learning experience.

Just imagine a scenario where students sitting in a classroom or at their desk at home can interact with their classmates, mentors, and educators scattered across the world. Now, let’s suppose the lesson of the day is focused on sea life. To give students a really exciting – and highly educational – experience, the teacher decides to access live information generated through sensors and live feeds monitoring a particular body of water.

2. Conserve and sustain to survive and flourish

With the aid of the IoT, schools can pursue options such as energy and environmental conservation, ecosystem regulation, and smarter traffic and transport management, all of which can help them build up their budgets and offer better learning opportunities. For example, a school district in Pennsylvania saved a fortune on energy by using the IoT to support its energy monitoring and control program and reinvested the savings into its education programs. After all, living a green lifestyle is the way to go for all of us – we might as well put it to work so we can invest in more critical areas.

3. Win over students (and parents) with a safe and secure learning environment

The safety and security of students are paramount – whether you are a parent, educational authority, security official, or concerned citizen. With connected sensors, RFID tags, cameras, and other smart devices, entire buildings can be monitored around the clock. Instant notifications, alerts, and configured responses would be a significant addition to the safety and security of schools and other educational institutions.
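
To make the idea concrete, here is a minimal sketch in Python of the kind of rule such a system might run: it watches a pretend feed of connected-sensor events and raises a notification when a door opens outside school hours. The sensor names, timestamps, and the definition of after-hours are invented for illustration, not taken from any real deployment.

    # Toy alerting rule: notify security when a door sensor reports an opening
    # outside assumed school hours. All data below is made up.
    from datetime import datetime

    sensor_events = [  # pretend feed from connected door sensors
        {"sensor": "gate-3-door", "open": True, "time": datetime(2016, 11, 12, 22, 40)},
        {"sensor": "lab-2-door", "open": True, "time": datetime(2016, 11, 14, 9, 5)},
    ]

    def after_hours(ts):
        # assumed school hours: weekdays, 6:00 to 19:00
        return ts.weekday() >= 5 or ts.hour >= 19 or ts.hour < 6

    for event in sensor_events:
        if event["open"] and after_hours(event["time"]):
            print(f"ALERT: {event['sensor']} opened at {event['time']:%a %H:%M} - notify security")

Only the first event (a Saturday night) would trigger an alert; a real system would push the notification to a phone or control room rather than print it.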

4. Grant parity for all

The connected world of everything has a lot to offer students who need modified learning plans and exceptions. A number of devices, tools, and apps already create appropriate learning experiences for these students and bring them on par with the rest of the class. One such example is the Lechal shoe project, which enables the visually challenged to better navigate the world through technology.

5. Turn learners into creators

The IoT indeed promotes and paves the way for creativity – and for children, there’s nothing better than learning the nuances and applications of hyperconnectivity firsthand. Given all the predictions about the enormous number of connected, decision-making devices in the years to come, this is an excellent opportunity for schoolchildren to understand, build, and control such systems themselves.

The future trajectory of IoT-enabled education: Bumpy or smooth?

The IoT has the potential to strip away common barriers in education such as economic status, geography, language, and physical location. But once the initial glitz of being “super and hyperconnected” fizzles out, there are more important questions that need to be answered.

Converging education with technology is not just about bringing learning resources or making learning simpler and faster – it’s about quality, impact, and community acceptance too. Even with all the fancy resources and technology at our children’s fingertips, it is still a long and tough road ahead for the IoT to reform education in a path-breaking and everlasting way. Nevertheless, the seeds are sown well and the harvest appears to be promising.

Learning doesn’t stop when you graduate; if you want to be successful, it’s a lifelong endeavor. Learn How to Create a Culture of Continuous Learning.



About Geetika Shukla

I have spent eight wonderful years with SAP. Beyond the world of process orchestration, cloud, and platforms, I have a keen eye for the latest technological trends and a fascination for all the digital buzz.

University: Then And Now

Katerina Hoare

Back in 1975, the popular opinion was that just passing your university classes – “P’s get degrees” – was enough to set you up for a successful career. Today, it’s very different. P’s may get you degrees… but they certainly won’t guarantee you a graduate job.

The age of simply passing subjects while spending vital study time down at the pub is over. Hundreds of equally competent university students now vie for a very limited number of graduate positions. Universities rely on their students to be successful when entering the workforce in order to reinforce the value of the degrees they offer. And think about all the times today’s students are told by a parent or a nosy aunt, “Ohhh, that degree may be interesting, but can you get a job with it straight away?” Ouch, reality hits.

The student’s objective has changed since my father’s time, when university was a chance to make lifelong friends, collect a piece of paper at the end, and set off job hunting. Today it is a more stressful experience, with a less accepting and nurturing employment market waiting on the other side of that graduation ceremony.

But how can universities help their students? How can they give them the best possible chance of securing jobs after graduation and, in doing so, increase the value of the degrees and courses of study they offer? What if there were a way for professors, lecturers, and tutors to identify students at risk of failing before they even enroll in a subject, course, or major?

Before choosing majors for bachelor’s degrees, students must pass a collection of prerequisite subjects in their first and second years. A student’s ability to meet particular assessment criteria in those first two years can determine their success in passing the critical third-year subjects of their degree major. But remember: it isn’t enough to just pass classes these days. Students applying for internships and jobs learn that academic averages and GPAs determine whether they get to the interview stage for a desired role. I was constantly warned by friends in my first year that I needed to study hard to keep my grades high enough to earn a Distinction average. Luckily I did, which made me successful when applying for internships and jobs after graduation. But what about less informed students who aren’t in the loop about the need for a Distinction average? How can their university support them?

Are analytics the answer?

What if a university used analytics to identify a first-year student who scored a 58 in first-year finance (when the average for the cohort was 78) and, rather than enrolling him in a second-year finance course, automatically placed him into a tutorial class run by the subject lecturer? Wouldn’t that be better? Wouldn’t that improve the student experience and result in a higher mark and a higher probability of graduate employment?

What if a university used analytics to look at a student’s high school marks in mathematical methods and proactively nursed lower-performing students through their first-year compulsory quantitative statistics courses?
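
To make this concrete, here is a minimal Python sketch of the kind of rule an analytics layer could apply: flag any student whose first-year mark sits well below the cohort average and place them into a supported tutorial before second-year enrolment. The marks and the one-standard-deviation threshold are illustrative assumptions, not a real university’s policy.

    # Illustrative only: flag first-year students whose mark is well below the
    # cohort average and suggest a supported tutorial before second-year enrolment.
    from statistics import mean, stdev

    first_year_finance = {  # hypothetical marks for a small cohort
        "student_a": 58,
        "student_b": 78,
        "student_c": 81,
        "student_d": 74,
        "student_e": 79,
    }

    avg = mean(first_year_finance.values())
    spread = stdev(first_year_finance.values())

    for student, mark in first_year_finance.items():
        if mark < avg - spread:  # assumed threshold: one standard deviation below average
            print(f"{student}: {mark} (cohort average {avg:.0f}) -> place in supported tutorial")
        else:
            print(f"{student}: {mark} -> enrol directly in second-year finance")

A production system would draw on far richer data, but the principle is the same: act on the signal before the student fails, not after.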

Around 1 in 5 students will drop out of university or change their course within their first year. The Australian undergraduate population more than doubled in just two decades, from just over 159,000 students in 1994 to over 405,000 students in 2014. At that scale, the higher education sector is losing approximately $810 million in revenue per year (a fifth of roughly 405,000 enrolled students dropping out, each carrying an average first-year course debt of $10,000). In my own personal network, I have multiple friends who swapped and changed both bachelor’s courses and universities in their first year because they struggled, felt isolated, and didn’t receive the assistance from the faculty that they were accustomed to (and still needed) in high school.

Luckily, universities today are catching up with the need to deliver value to their students and to nurse them through their first year. Predictive analytics offers faculty an opportunity to proactively identify at-risk students and put safeguards in place to ensure their optimal performance, which in turn gives those students a higher probability of securing a job after graduation.

University life has changed extensively since my father’s time, and it will continue to do so as employment markets change and students are encouraged to complete an undergraduate degree in increasing numbers. In failing to proactively identify and support at-risk students, universities are leaving revenue on the table and churning out graduates who are likely to experience difficulty securing a job post-graduation.

Learn more about higher education today in The Digital University.



About Katerina Hoare

Recent university graduate, passionate about leveraging SAP technology to improve the student experience and help universities run better.

The Digital University

Katerina Hoare

Pick up a copy of The Financial Review or The Australian, and you’ll notice that both newspapers focus heavily on the digital strategies of top Australian companies. Read anything about the digital economy and you’ll be presented with statistics claiming that 40% of the companies on today’s S&P 500 will no longer exist in 10 years (Diamandis, 2016).

With such a harsh statistic, the message to the market is clear: Adapt to the digital world or perish. Everyone knows about Kodak’s failure to embrace the digital camera and adapt to the new digital market. As a result, Kodak filed for bankruptcy in January 2012.

The upshot? These days, big companies understand the need for a digital strategy: the market knows it, and so do investors.

But what are the implications for universities and the higher education sector of failing to adapt to the market and invest in their own digital strategy? What are they missing out on? How can they avoid having their own “Kodak moment?” Let’s investigate.


The most common misconception of the “digital university” is that it is simply a university that offers online learning and courses. But it is much more than that. A true digital university is one that embraces the digital age from back to front, inside out, and beginning to end. A digital university is one in which lecturers teach students about the latest digital trends whilst using that very technology to deliver the most up-to-date insights. It is one in which PhD students openly seek out digital technology to support new, unknown, and untested ideas and innovations. It is a university in which students play with the latest technology and imagine working in the most digital workplaces, where meetings are held between holograms and everyone’s working from the couch at home wearing their comfiest pyjamas.

So why isn’t the traditional “back office” doing the same? Why isn’t today’s campus environment currently offering a digitized experience?

According to PwC, many universities are in fact developing specific digital strategies in reaction to the massive shift towards using new technology, “yet [they] lack the vision, capability or commitment to implement them effectively.” 

Sounds like a bit of a Kodak moment, doesn’t it? Kodak was aware of the threat posed by the digital camera, but instead chose to repeatedly focus on its picture chemical business. Wave after wave of CEOs “would bemoan his predecessor’s failure to transform the organization to digital, declare his own intention to do so, and proceed to fail at the transition as well” (Mui, 2012).

We understand that digital transformation is complex, difficult, and entangled in the politics and relationships of the university. The business benefits of transformation are not always clear-cut, and articulating them clearly is hard. Kodak knew that it had to transform its business to digital, but it failed to grasp the “why” before tackling the “how.” The “how” is usually the easy path. Entire service organisations filled with buzzing consultants exist to help you achieve the “how.”

But the “why?” That’s more difficult, though certainly not unachievable. Retail is one industry that has weathered many storms on the road to digital transformation. Retailers have embraced the digital age, in which tweets are more effective than letters or catalogs in the mail. They have learned to service their customers, treat them well, communicate with them in the way they prefer, and ultimately make them happy.

Retail organisations have done this by going digital. Marketing algorithms help retailers leverage data to alert customers to products they may be interested in based on past search history, purchases, and buying behaviour. Retail organisations know that social media is the primary communication channel for many; that’s why Instagram is now a viable advertising channel, as are Facebook, Pinterest, and Twitter. But how could a university better service the student and itself by going digital?

There are many answers to this question, but here are a few key ones.

  1. Student and learning analytics—focus on student success: Identify students at risk of failing not only a particular subject but also any assessment piece or task within it, based on their interactions with the learning materials required to complete the assessment. Think in terms of assessment/marking criteria forms, assessment instructions, even how many times students access or download required lecture slides/readings, rather than relying solely on “static” characteristics such as socio-economic background and ATAR score (see the sketch after this list).
  2. Forecasting—focus on course planning: Forecast the number of lecturers/tutors required for courses before students have even officially enrolled. You can then hire the appropriate number of tutors ahead of the stampede of first-years who enroll because they heard from a mate of a mate that Sociology 101 is really easy and the lecturer is related to XYZ sports star.
  3. The “business”—focus on finance: Calculate and influence the cost per subject/per course and the overall fiscal position of the university, as a business, by leveraging real-time data from HR, procurement, and even council land rates.
  4. The core—focus on student admission and retention: Use insight driven by analytics to identify and target students in order to increase conversion rates of the most desirable undergraduate, postgraduate, and PhD students. Provide students the digital university experience they expect, through channels of communication that blend into their already seamlessly integrated iOS 10 iPhone calendars and lives.
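
As a rough illustration of the first point, the sketch below (in Python, using scikit-learn) trains a tiny model to score the risk of failing an assessment from engagement signals such as slide downloads and whether the assessment brief was opened. The feature values, labels, and column meanings are all invented for demonstration; a real learning-analytics pipeline would pull these signals from the learning management system.

    # Illustrative sketch: score failure risk from engagement signals rather than
    # static traits. All feature values and labels below are invented.
    from sklearn.linear_model import LogisticRegression

    # columns: [lecture slides downloaded, assessment brief opened (0/1), marking criteria opened (0/1)]
    engagement = [
        [12, 1, 1],
        [10, 1, 1],
        [2, 0, 0],
        [1, 0, 1],
        [9, 1, 0],
        [0, 0, 0],
    ]
    failed_assessment = [0, 0, 1, 1, 0, 1]  # 1 = failed the assessment piece

    model = LogisticRegression().fit(engagement, failed_assessment)

    # score a current student who has barely touched the materials before the deadline
    risk = model.predict_proba([[1, 0, 0]])[0][1]
    print(f"estimated risk of failing this assessment: {risk:.0%}")

The output is only as good as the behavioural data behind it, which is exactly why engagement signals beat static characteristics such as ATAR score.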

In today’s digital age, where the voice of the customer is more prevalent than ever, turning your students into advocates for your university is one of the most powerful marketing tools available. In Australia and throughout the world, competition for students is increasing, and with the push towards more tertiary-educated people a shared global trend, that competition is only going to get tougher.

Universities that are not adapting to this new digital era will be left behind. Whether you want to be a digital leader or simply stay relevant in the digital age, the time to act is now.




About Katerina Hoare

Recent university graduate, passionate about leveraging SAP technology to improve the student experience and help universities run better.

More Than Noise: 5 Digital Stories From 2016 That Are Bigger Than You Think

Dan Wellers, Michael Rander, Kai Goerlich, Josh Waddell, Saravana Chandran, and Stephanie Overby

These days it seems that we are witnessing waves of extreme disruption rather than incremental technology change. While some tech news stories have been just so much noise, unlikely to have long-term impact, a few are important signals of much bigger, longer-term changes afoot.

From bots to blockchains, augmented realities to human-machine convergence, a number of rapidly advancing technological capabilities hit important inflection points in 2016. We looked at five important emerging technology news stories that happened this year and the trends set in motion that will have an impact for a long time to come.


Immersive experiences were one of three top-level trends identified by Gartner for 2016, and that was evident in the enormous popularity of Pokémon Go. While the hype may have come and gone, the immersive technologies that have been quietly advancing in the background for years are ready to boil over into the big time—and into the enterprise.

The free location-based augmented reality (AR) game took off shortly after Nintendo launched it in July, and it became the most downloaded app in Apple’s app store history in its first week, as reported by TechCrunch. Average daily usage of the app on Android devices in July 2016 exceeded that of the standard-bearers Snapchat, Instagram, and Facebook, according to SimilarWeb. Within two months, Pokémon Go had generated more than US$440 million, according to Sensor Tower.

Unlike virtual reality (VR), which immerses us in a simulated world, AR layers computer-generated information such as graphics, sound, or other data on top of our view of the real world. In the case of Pokémon Go, players venture through the physical world using a digital map to search for Pokémon characters.

The game’s instant global acceptance was a surprise. Most people watching this space expected an immersive headset device like Oculus Rift or Google Cardboard to steal the headlines. But it took Pikachu and the gang to break through. Pokémon Go capitalized on a generation’s nostalgia for its childhood and harnessed the latest advancements in key AR enabling technologies such as geolocation and computer vision.
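
Under the hood, the location-based part of such a game comes down to simple geolocation math: is a virtual character close enough to the player’s reported position to appear on screen? Here is a minimal sketch in Python; the coordinates and the 50-meter encounter radius are invented.

    # Geolocation sketch: decide whether a virtual character is near enough to appear.
    from math import radians, sin, cos, asin, sqrt

    def distance_m(lat1, lon1, lat2, lon2):
        # great-circle distance between two points in meters (haversine formula)
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371000 * asin(sqrt(a))

    player = (40.7580, -73.9855)     # made-up player position
    character = (40.7582, -73.9852)  # made-up spawn point

    if distance_m(*player, *character) <= 50:  # assumed 50 m encounter radius
        print("character appears in the camera view")
    else:
        print("keep walking")

The computer-vision side, anchoring the character to the live camera image, is the other half of the AR experience.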

Just as mobile technologies percolated inside companies for several years before the iPhone exploded onto the market, companies have been dabbling in AR since the beginning of the decade. IKEA created an AR catalog app in 2013 to help customers visualize how their KIVIK modular sofa, for example, would look in their living rooms. Mitsubishi Electric has been perfecting an AR application, introduced in 2011, that enables homeowners to visualize its HVAC products in their homes. Newport News Shipbuilding has launched some 30 AR projects to help the company build and maintain its vessels. Tech giants including Facebook, HP, and Apple have been snapping up immersive tech startups for some time.

The overnight success of Pokémon Go will fuel interest in and understanding of all mediated reality technology—virtual and augmented. It’s created a shorthand for describing immersive reality and could launch a wave of technology consumerization the likes of which we haven’t seen since the iPhone instigated a tsunami of smartphone usage. Enterprises would be wise to figure out the role of immersive technology sooner rather than later. “AR and VR will both be the new normal within five years,” says futurist Gerd Leonhard, noting that the biggest hurdles may be mobile bandwidth availability and concerns about sensory overload. “Pokémon is an obvious opening scene only—professional use of AR and VR will explode.”


Blockchains, the decentralized digital ledgers of transactions that are processed by a distributed network, first made headlines as the foundation for new types of financial transactions beginning with Bitcoin in 2009. According to Greenwich Associates, financial and technology companies will invest an estimated $1 billion in blockchain technology in 2016. But, as Gartner recently pointed out, there could be even more rapid evolution and acceptance in the areas of manufacturing, government, healthcare, and education.

By the 2020s, blockchain-based systems will reduce or eliminate many points of friction for a variety of business transactions. Individuals and companies will be able to exchange a wide range of digitized or digitally represented assets and value with anyone else, according to PwC. The supervised peer-to-peer network concept “is the future,” says Leonhard.

But the most important blockchain-related news of 2016 revealed a weak link in the application of technology that is touted as an immutable record.

In theory, blockchain technology creates a highly tamper-resistant structure that makes transactions secure and verifiable through a massively distributed digital ledger. Every transaction that takes place is recorded in this ledger, which lives on many computers. Strong cryptography makes it nearly impossible for someone to cheat the system.

In practice, however, blockchain-based transactions and contracts are only as good as the code that enables them.
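
A toy example makes the point. In the Python sketch below, each block stores the hash of the previous block, so altering any earlier transaction changes every hash after it and the chain no longer verifies. This is a teaching sketch of the hash-chain idea only; a real blockchain adds a peer-to-peer network, consensus rules, and digital signatures.

    # Toy hash chain: each block commits to the previous block's hash,
    # so editing an old transaction breaks every later link.
    import hashlib, json

    def block_hash(block):
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    chain = []
    prev = "0" * 64
    for tx in ["alice pays bob 5", "bob pays carol 2", "carol pays dave 1"]:
        block = {"tx": tx, "prev_hash": prev}
        prev = block_hash(block)
        chain.append(block)

    def verify(chain):
        prev = "0" * 64
        for block in chain:
            if block["prev_hash"] != prev:
                return False
            prev = block_hash(block)
        return True

    print(verify(chain))                   # True
    chain[0]["tx"] = "alice pays bob 500"  # tamper with history
    print(verify(chain))                   # False: the ledger no longer checks out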

Case in point: The DAO, one of the first major implementations of a “Decentralized Autonomous Organization” (for which the fund is named). The DAO was a crowdfunded venture capital fund that used cryptocurrency for investments and ran through smart contracts. The rules that govern those smart contracts, along with all financial transaction records, are maintained on the blockchain. In June, The DAO revealed that an individual had exploited a vulnerability in the fund’s smart contract code to take control of nearly $60 million worth of its digital currency.

The fund’s investors voted to basically rewrite the smart contract code and roll back the transaction, in essence going against the intent of blockchain-based smart contracts, which are supposed to be irreversible once they self-execute.

The DAO’s experience confirmed one of the inherent risks of distributed ledger technology—and, in particular, the risk of running a very large fund autonomously through smart contracts based on blockchain technology. Smart contract code must be as error-free as possible. As Cornell University professor and hacker Emin Gün Sirer wrote in his blog, “writing a robust, secure smart contract requires extreme amounts of diligence. It’s more similar to writing code for a nuclear power reactor, than to writing loose web code.” Since smart contracts are intended to be executed irreversibly on the blockchain, their code should not be rewritten and improved over time, as software typically is. But since no code can ever be completely airtight, smart contracts may have to build in contingency plans for when weaknesses in their code are exploited.

Importantly, the incident was not a result of any inherent weakness in the blockchain or distributed ledger technology generally. It will not be the end of cryptocurrencies or smart contracts. And it’s leading to more consideration of editable blockchains, which proponents say would only be used in extraordinary circumstances, according to Technology Review.


Application programming interfaces (APIs), the computer codes that serve as a bridge between software applications, are not traditionally a hot topic outside of coder circles. But they are critical components in much of the consumer technology we’ve all come to rely on day-to-day.

One of the most important events in API history was the introduction of such an interface for Google Maps a decade ago. The map app was so popular that everyone wanted to incorporate its capabilities into their own systems. So Google released an API that enabled developers to connect to and use the technology without having to hack into it. The result was the launch of hundreds of inventive location-enabled apps using Google technology. Today, millions of web sites and apps use Google Maps APIs, from Allstate’s GoodHome app, which shows homeowners a personalized risk assessment of their properties, to Harley-Davidson’s Ride Planner to 7-Eleven’s app for finding the nearest Slurpee.
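
The pattern that made this possible is deliberately mundane: a plain HTTP request that returns structured data any app can use. The Python sketch below follows the general shape of Google’s public Geocoding API; treat the endpoint, parameters, and response fields as illustrative and check the current documentation before relying on them. The API key is a placeholder you would supply yourself.

    # Sketch of calling a public web API: one HTTP request, structured JSON back.
    import requests

    resp = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"address": "1600 Amphitheatre Parkway, Mountain View, CA", "key": "YOUR_API_KEY"},
    )
    data = resp.json()
    if data.get("status") == "OK":
        print(data["results"][0]["geometry"]["location"])  # e.g. {'lat': ..., 'lng': ...}
    else:
        print("request failed:", data.get("status"))

Everything from Allstate’s risk maps to 7-Eleven’s Slurpee finder is, at bottom, built on requests like this one.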

Ultimately, it became de rigueur for apps to open up their systems in a safe way for experimentation by others through APIs. Technology professional Kin Lane, who tracks the now enormous world of APIs, has said, “APIs bring together a unique blend of technology, business, and politics into a transparent, self-service mix that can foster innovation.”

Thus it was significant when Apple announced in June that it would open up Siri to third-party developers through an API, giving the wider world the ability to integrate Siri’s voice commands into their apps. The move came on the heels of similar decisions by Amazon, Facebook, and Microsoft, all of which have AI bots or assistants of their own. And in October, Google opened up its Google Assistant as well.

The introduction of these APIs confirms that the AI technology behind the bots has matured significantly—and that a new wave of AI-based innovation is nigh.

The best way to spark that innovation is to open up AI technologies such as Siri so that coders can use them as platforms to build new apps that can more rapidly expand AI uses and capabilities. Call it the “platformication” of AI. The value will be less in the specific AI products a company introduces than in the value of the platform for innovation. And that depends on the quality of the API. The tech company that attracts the best and brightest will win. AI platforms are just beginning to emerge and the question is: Who will be the platform leader?
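
What “opening up” a capability looks like in practice is simply putting it behind an API that other developers can call. The sketch below is an invented example, not any vendor’s real interface: a tiny Flask service that wraps a placeholder keyword-based intent classifier behind a /intent endpoint, so any app could send it text and build on the result.

    # Invented example of "platformication": expose an assistant-style capability
    # behind an HTTP API so others can build on it. The route and rules are placeholders.
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    @app.route("/intent", methods=["POST"])
    def classify_intent():
        text = request.get_json().get("text", "").lower()
        if "play" in text:
            intent = "play_media"
        elif "remind" in text:
            intent = "set_reminder"
        else:
            intent = "unknown"
        return jsonify({"text": text, "intent": intent})

    if __name__ == "__main__":
        app.run(port=5000)

Swap the keyword rules for a genuine AI model and the interface stays the same, which is precisely why the platform, rather than any individual product, becomes the thing of value.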


In June, Swiss citizens voted on a proposal to introduce a guaranteed basic income for all of the country’s citizens, as reported by BBC News. Switzerland was the first country to take the issue to the polls, but it won’t be the last. Discussions about the impact of both automation and the advancing gig economy on individual livelihoods are happening around the world. Other countries—including the United States—are looking at solutions to the problem. Both Finland and the Netherlands have universal guaranteed income pilots planned for next year. Meanwhile, American startup incubator Y Combinator is launching an experiment to give 100 families in Oakland, California, a minimum wage for five years with no strings attached, according to Quartz.

The world is on the verge of potential job loss at a scale and speed never seen before. The Industrial Revolution was more of an evolution, happening over more than a century. The ongoing digital revolution is happening in relative hyper speed.

No one is exactly sure how increased automation and digitization will affect the world’s workforce. One 2013 study suggests that as much as 47% of the U.S. workforce is at risk of being replaced by machines over the next two decades, but even a conservative estimate of 10% could have a dramatic impact, not just on workers but on society as a whole.

The proposed solution in Switzerland did not pass, in part because it was not introduced by a major political party, and citizens are only beginning to consider the potential implications of digitization for their incomes. What’s more, the idea of simply guaranteeing pay runs contrary to long-held notions in many societies that humans ought to earn their keep.

Whether state-funded support is the answer is just one of the open questions. The votes and pilots underway make it clear that governments will have to respond with some policy measures. The question is: What will those measures be? The larger impact of mass job displacement, what future employment conditions might look like, and what responsibilities institutions have in ensuring that we can support ourselves are among the issues that policy makers will need to address.

New business models resulting from digitization will create some new types of roles—but those will require training and perhaps continued education. And not all of those who will be displaced will be in a position to remake their careers. Just consider taxi drivers: In the United States, about 223,000 people currently earn their living behind the wheel of a hired car. The average New York livery driver is 46 years old, according to the New York City Taxi and Limousine Commission, and no formal education is required. When self-driving cars take over, those jobs will go away and the men and women who held them may not be qualified for the new positions that emerge.

As digitization dramatically changes the constructs of commerce and work, no one is quite sure how people will be impacted. But waiting to see how it all shakes out is not a winning strategy. Companies and governments today will have to experiment with potential solutions before the severity of the problem is clear. Among the questions that will have to be answered: How can we retrain large parts of the workforce? How will we support those who fall through the cracks? Will we prioritize and fund education? Technological progress and shifting work models will continue, whether or not we plan for their consequences.


In April, a young man, who was believed to have permanently lost feeling in and control over his hands and legs as the result of a devastating spine injury, became able to use his right hand and fingers again. He used technology that transmits his thoughts directly to his hand muscles, bypassing his injured spinal cord. Doctors implanted a computer chip into the quadriplegic’s brain two years ago and—with ongoing training and practice—he can now perform everyday tasks like pouring from a bottle and playing video games.

The system reconnected the man’s brain directly to his muscles—the first time that engineers have successfully bypassed the nervous system’s information superhighway, the spinal cord. It’s the medical equivalent of moving from wired to wireless computing.

The man has in essence become a cyborg, a term first coined in 1960 to describe “self-regulating human-machine systems.” Yet the beneficiary of this scientific advance himself said, “You’re not going to be looked on as, ‘Oh, I’m a cyborg now because I have this big huge prosthetic on the side of my arm.’ It’s something a lot more natural and intuitive to learn because I can see my own hand reacting.”

As described in IEEE Spectrum, the “neural-bypass system” records signals that the man generates when thinking about moving his hand, decodes those signals, and routes them to the electric sleeve around his arm to stimulate movement: “The result looks surprisingly simple and natural: When Burkhart thinks about picking up a bottle, he picks up the bottle. When he thinks about playing a chord in Guitar Hero, he plays the chord.”

What seems straightforward on the surface is powered by a sophisticated algorithm that can analyze the vast amounts of data the man’s brain produces, separating important signals from noise.
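
The heart of that separation problem can be illustrated with a purely synthetic example: a weak, repeated “intent” burst buried in noise becomes visible once many recordings are combined, because the noise averages away while the signal does not. The sketch below (Python with NumPy) is a conceptual illustration of that one idea, not the clinical decoding algorithm.

    # Conceptual sketch only: averaging many noisy synthetic recordings lets a weak,
    # repeated intent signal emerge from the noise.
    import numpy as np

    rng = np.random.default_rng(0)
    samples = 200
    intent = np.zeros(samples)
    intent[80:120] = 1.0  # a weak burst when movement is intended

    trials = np.array([intent + rng.normal(0, 3.0, samples) for _ in range(400)])

    single_trial_snr = intent.max() / trials[0].std()
    averaged = trials.mean(axis=0)  # noise shrinks roughly with the square root of the trial count
    averaged_snr = averaged[80:120].mean() / averaged[:80].std()

    print(f"one noisy recording, rough signal-to-noise: {single_trial_snr:.2f}")
    print(f"average of 400 recordings, rough signal-to-noise: {averaged_snr:.2f}")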

The fact that engineers have begun to unlock the complex code that controls brain-body communication opens up enormous possibilities. Neural prostheses such as cochlear implants have already reversed hearing loss. Light-sensitive chips serving as artificial retinas are showing progress in restoring vision. Other researchers are exploring computer implants that can read human thoughts directly to signal an external computer, helping people speak or move in new ways. “Human and machine are converging,” says Leonhard.

The National Academy of Engineering predicts that “the intersection of engineering and neuroscience promises great advances in healthcare, manufacturing, and communication.”

Burkhart spent two years in training with the computer that has helped power his arm to get this far. It’s the result of more than a decade of development in brain-computer interfaces. And it can currently be used only in the lab; researchers are working on a system for home use. But it’s a clear indication of how quickly the lines between man and machine are blurring—and it opens the door for further computerized reanimation in many new scenarios.

This fall, Switzerland hosted its first cyborg Olympics, in which disabled patients compete using the latest assistive technologies, including robot exoskeletons and brainwave readers. Paraplegic athletes use electrical stimulation systems to compete in cycling, for example. The winners are those who can control their device the best. “Instead of celebrating the human body moving under its own power,” said a recent article in IEEE Spectrum, “the cyborg games will celebrate the strength and ingenuity of human-machine collaborations.” D!

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.



About Dan Wellers

Dan Wellers is the Global Lead of Digital Futures at SAP, which explores how organizations can anticipate the future impact of exponential technologies. Dan has extensive experience in technology marketing and business strategy, plus management, consulting, and sales.


The Future Of Work Is Now

Stefan Ries

Far beyond collaboration, the digitization of work determines how we work and how we engage people. Technologies such as artificial intelligence, machine learning, robotics, analytics, and the cloud are changing the way we recruit, develop talent, and make our workforce more inclusive. They also introduce new jobs, largely with different skill-set requirements. Some of the most-wanted jobs today did not exist five years ago – and many jobs we can’t even imagine today will arise in the near future. Our workplace is changing at light speed.

“Beyond collaboration, the digitization of work determines how we work and engage people”

Technology accelerates the transformation of businesses and industries. We need to prepare our businesses for the future and anticipate skills requirements and workforce changes. While some of these developments are unpredictable, it is up to thought and industry leaders like us to take control and shape the future of work.

SAP Future Factor, an interactive Web series: Engaging with thought leaders about the future of work

Welcome to the SAP Future Factor Web Salon, an interactive Web series featuring perspectives of thought leaders from academia, business, and government about the workplace of the future. The series drives a continuous exchange about the impacts of digitization on organizations and shares insight on innovative practices already in place.

The inaugural episode features SAP chief human resources officer Stefan Ries and Kevin Kruse, leadership expert and author of the New York Times best-seller “We: How to Increase Performance and Profits Through Full Engagement.” The two thought leaders exchange views on the opportunities and challenges of a digitized workplace and business culture. Their discussion will touch on the rising digital workplace, new ways to collaborate, the role technology plays to foster diversity and inclusion, employee engagement, and talent development.

Choose the topics that match your needs

Tomorrow’s workplace is all about choices – and so is the format of the SAP Future Factor Web series. All episodes are fully interactive, giving you the opportunity to interact with the content of the video by choosing topics of interest to you and your business. You determine what you would like to view and learn about, and in what order.

Episode 1 features the following topics:

  • Impacts of Digitization
  • HR’s Role in a Digitized World
  • Cloud Culture
  • Business Beyond Bias
  • Man vs. Machine
  • Rise of Social Intelligence

The future is now. Engage with us in the SAP Future Factor!

We hope you will enjoy the first episode. Tell us what you think.

Are the biggest trends from the last year on your radar screen? See More Than Noise: 5 Digital Stories From 2016 That Are Bigger Than You Think.



About Stefan Ries

Stefan Ries is Chief Human Resources Officer (CHRO), Labor Relations Director, and a member of the Executive Board of SAP SE. Stefan was born in Bavaria and raised in Constance, Germany, where he spent most of his youth. After receiving his master’s in business economics from the University of Constance in 1991, he moved to Munich. He started his career as an HR manager at Microsoft, overseeing HR duties in Austria, Switzerland, and Eastern European countries. In July 1994, he went on to lead the HR function for Compaq Computer in Europe, the Middle East, and Africa. Following the company’s acquisitions of Tandem Computers and Digital Equipment Corporation in 1999 and 2000, Stefan led the entire HR organization for Compaq in Germany. Stefan first joined SAP in 2002 and later became responsible for various HR functions, heading up the HR business partner organization and overseeing all HR functions at an operational level. To support innovation, Stefan attaches great importance to a diverse working culture. He is convinced that appreciating the differences among people – their unique backgrounds and personalities – is a key success factor for SAP.