Why Continuous Testing Is Critical, But Not Enough

Shoeb Javed

The ability to innovate and quickly adapt to changes (both business- and IT-driven) is key to an organization’s success. Companies are under pressure to release innovation faster across their packaged applications. While innovation creates great opportunity, it also introduces great risk. The implementation and ongoing management of enterprise applications is complex, and a crucial component is making sure that end-to-end processes work—every time, every day.

A new vision for test automation extends to enabling business users and organizations to adopt, accept, and utilize new applications faster. When processes across enterprise applications are executed seamlessly and as designed, companies can leverage the potential of their enterprise apps to achieve strategic goals like entering new markets, increasing revenue, and maintaining a competitive advantage. Continuous testing and quality are the foundation of successful business outcomes.

Here’s a look at continuous testing vs. continuous quality. 

What is continuous testing?

Let’s start with the basics. The simplest definition of continuous testing is “the process of executing automated tests as part of the software delivery pipeline to obtain immediate feedback on the business risks associated with a software release candidate.”
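
To make the definition concrete, here is a minimal sketch in Python of what such a pipeline gate might look like. The check names, stubs, and risk notes are invented for illustration; in a real pipeline they would be full end-to-end test suites run by your CI tooling.

```python
# Minimal sketch of a continuous-testing release gate (illustrative only).
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class CheckResult:
    name: str
    passed: bool
    business_risk: str  # feedback phrased as business risk, not test IDs

def run_release_gate(checks: Dict[str, Callable[[], bool]],
                     risk_notes: Dict[str, str]) -> List[CheckResult]:
    """Execute every automated check and map each failure to a business risk."""
    results = []
    for name, check in checks.items():
        passed = check()
        results.append(CheckResult(name, passed, "" if passed else risk_notes[name]))
    return results

checks = {
    "create_sales_order": lambda: True,   # stub standing in for a real end-to-end test
    "post_invoice": lambda: False,        # stub simulating a failure
}
risks = {
    "create_sales_order": "order capture blocked across all channels",
    "post_invoice": "billing stops; revenue recognition at risk",
}
for result in run_release_gate(checks, risks):
    print(f"{result.name}: {'PASS' if result.passed else 'FAIL - ' + result.business_risk}")
```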

What does DevOps have to do with continuous testing?

In 2009, DevOps was introduced as an extension of the Agile development methodology, enabling greater efficiency and collaboration so that companies can react more swiftly to the constant changes in today’s digital economy. Although increased speed is a positive outgrowth of Agile and DevOps, companies may be delivering software that hasn’t been properly tested, or hasn’t been tested at all. This exposes them to risks like defects in production, and it drives the need for a new approach to testing that aligns with the new 24/7 release cycle.

Why isn’t continuous testing enough?

Continuous testing isn’t enough because it doesn’t equate to continuous quality. Continuous testing is just the tip of the iceberg. I see it as a fundamental necessity because of the complexity of enterprise landscapes. Continuous testing is a critical phase of the continuous delivery process, and it helps us address the demand for superior customer experiences. We gain immediate and real-time feedback on business risks and maintain continuity across the business. Despite the valuable insights we gain from continuous testing, I strongly believe no amount of testing alone can ensure quality.

So, what is continuous quality in DevOps?

The problem is, there’s no industry standard for continuous quality in DevOps. Even though DevOps was originally designed to produce high-quality, updated software that gets to the user faster, “quality” is not well-defined. I believe our ability to develop an industry standard is an essential next step, especially if we expect to move from aspiring to actually delivering highly functional systems that consider the user experience.

How do we get to a higher standard?

I believe there are two paths to a higher standard: either we better define continuous testing, or we raise our game and make a collective shift to continuous quality. I see this choice as one of the most important factors in the next phase of the DevOps evolution.

The scope of continuous quality is broader than that of continuous testing. As we look to develop an industry definition of continuous quality, I believe we must consider the following factors:

  • A deep understanding of business processes in production
  • Upfront promotion of the need for objective data from the business
  • Recognition of internal and external customers for quality metrics
  • Prioritization of continuous quality from day one
  • Incorporating quality into the Agile development methodology (don’t “throw everything over the wall”)
  • Tracking project health to include quality at each phase of the SDLC
  • End-to-end business process testing, not just single service/component quality
  • Performance monitoring and optimization throughout development and production for feedback

How does your DevOps team currently define continuous quality?

For more on software development trends, see How Open Source Is Changing Software Innovation.

About Shoeb Javed

Shoeb Javed is chief technology officer at Worksoft, a leading global provider of automation software for high-velocity business process discovery and testing. Shoeb is responsible for the technology strategy, software development, quality assurance and customer support for all Worksoft solutions. As CTO, Shoeb works with quality assurance and business leaders of some of the largest global Fortune 1000 corporations to help automate testing of complex packaged enterprise applications to speed up project timelines and improve operational efficiencies.

The IoT Data Explosion, IPv6, And The Need To Process At The Edge

Chuck Pharris

Part 2 in the 3-part Edge Computing series

The Internet of Things (IoT) is growing. By how much? IDC predicts there will be 30 billion connected things worldwide by 2020.[1] After 2020? That’s anybody’s guess—but one clear indication that this growth will be big is the move to a new Internet addressing system: IPv6.

The problem is that the current system, IPv4, allows for only 4 billion addresses or so, which requires some devices to share addresses. With more and more sensors embedded in more and more things—each requiring an IP address—this state of affairs is unsustainable.

IPv6 solves this problem by bumping up the universe of available addresses to a number that’s hard to comprehend—something like 340,000,000,000,000,000,000,000,000,000,000,000,000 (or 340 trillion trillion trillion).[2]
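
The arithmetic is easy to verify: IPv4 addresses are 32 bits wide and IPv6 addresses are 128 bits wide, so a few lines of Python reproduce the figures above.

```python
# Compare the IPv4 and IPv6 address spaces (32-bit vs. 128-bit addressing).
ipv4_addresses = 2 ** 32    # about 4.3 billion
ipv6_addresses = 2 ** 128   # about 3.4 x 10^38, i.e., 340 trillion trillion trillion

print(f"IPv4: {ipv4_addresses:,}")
print(f"IPv6: {ipv6_addresses:,}")
print(f"IPv6 offers {ipv6_addresses // ipv4_addresses:,} addresses per IPv4 address")
```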

But what about the data?

I can’t say that I expect 340 trillion trillion trillion IoT devices out there anytime soon. But as the IoT grows, the amount of data generated by the proliferating sensors embedded in connected things will grow as well. And for organizations deploying IoT devices, moving all this data back and forth via the cloud is simply untenable.

Hence the idea of edge processing. Edge processing, as I explained in a previous blog, is the idea of processing data on the “edge” where IoT devices are deployed—rather than sending all sensor-generated data back to mission central over the cloud. Without edge processing, I don’t think the IoT could be a reality.

But even if we were to revamp the planet’s Internet infrastructure, would all that data still be worth moving? In fact, much of the data produced by sensors is not particularly useful. So instead of a rip-and-replace of the Internet, why not just process data at the edge and use an IoT gateway to run the analytics on site, sending back only what’s useful to mission central?
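
As a minimal sketch of that idea (the thresholds and readings below are invented), a gateway-side filter can drop in-range readings on site and forward only the outliers worth acting on:

```python
# Edge-side filtering: keep "normal" readings local, forward only anomalies.
def should_forward(reading: float, low: float = 18.0, high: float = 25.0) -> bool:
    """Return True when a reading falls outside the normal band."""
    return not (low <= reading <= high)

raw_readings = [20.1, 19.8, 31.4, 20.3, 17.2, 20.0]  # e.g., temperature samples
to_cloud = [r for r in raw_readings if should_forward(r)]
print(f"{len(raw_readings)} readings captured, {len(to_cloud)} forwarded: {to_cloud}")
```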

The four pillars

It is such practical concerns that make edge processing an appealing approach for real-world IoT deployments. But how do you move forward?

In a recent white paper, SAP explores some of the primary concerns, categorizing them according to the 4 P’s of intelligent edge processing: presence, performance, power, and protection. The paper examines these four pillars and focuses on better ways to cleanse, filter, and enrich the growing volumes of sensor data. Let’s take a quick look.

Presence

Intelligent edge processing requires your systems to be present at the creation, as it were—on the edge, where the action takes place. Using machine learning and smart algorithms on the edge, you can generate insight and take action without human intervention. This is good, because running in a more autonomous fashion is an imperative for the digital economy.

As an example, the paper dives into automated reordering and receiving using warehouse shelves equipped with sensors. A different example is automated work orders triggered by analysis of events. This one is interesting because the automated action—creation of a work order—requires a follow-on action involving humans, such as putting a technician on site. In this way, many organizations will use edge processing in conjunction with human workers. It all depends on the scenario that works best in context.
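
A hypothetical sketch of that pattern: an edge rule spots an out-of-bounds reading and raises a work order for a human to act on. The WorkOrder type and the vibration threshold are illustrative assumptions, not a real SAP API.

```python
# Presence on the edge: turn a sensed event into a work order automatically.
from dataclasses import dataclass
from typing import Optional

@dataclass
class WorkOrder:
    asset_id: str
    reason: str
    needs_technician: bool = True  # the follow-on action still involves a human

def evaluate(asset_id: str, vibration_mm_s: float, limit: float = 7.1) -> Optional[WorkOrder]:
    """Create a work order when vibration exceeds the (illustrative) limit."""
    if vibration_mm_s > limit:
        return WorkOrder(asset_id, f"vibration {vibration_mm_s} mm/s exceeds {limit}")
    return None

order = evaluate("pump-17", 9.3)
if order:
    print(f"Dispatch technician to {order.asset_id}: {order.reason}")
```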

Performance

Intelligent edge processing can improve performance for IoT scenarios by solving the problem of data volumes that overwhelm traditional data-storage technologies. Take the example of processing in manufacturing, where the goal is to approximate a standard set by the “golden batch” in all subsequent manufacturing runs. By combining operational technology with information technology, you can process the complex events that happen on the edge and bring new batches into closer compliance with the golden batch. This helps improve manufacturing performance from the perspectives of both speed and quality.
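
A toy version of the golden-batch comparison, assuming simple per-parameter setpoints and tolerances; a real edge system would run these checks continuously as complex events rather than on a single snapshot:

```python
# Compare a running batch against the "golden batch" profile (values invented).
golden    = {"temp_c": 78.0, "pressure_bar": 2.4, "mix_rpm": 120.0}
current   = {"temp_c": 81.5, "pressure_bar": 2.4, "mix_rpm": 118.0}
tolerance = {"temp_c": 2.0,  "pressure_bar": 0.2, "mix_rpm": 5.0}

deviations = {k: round(current[k] - golden[k], 2)
              for k in golden
              if abs(current[k] - golden[k]) > tolerance[k]}
print("corrective action needed:", deviations)  # {'temp_c': 3.5}
```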

Power

Intelligent edge processing gives you the power to execute processes where they take place—without the latency of data transfer in the cloud. Take, for example, a remote mining operation with limited connectivity. Whatever processes occur on site—say, the ordering of replacement parts for mining equipment—can still be carried out with edge processing. Workers can record the order, and replacements can either be made where parts are locally available or put on hold until the part arrives. In either case, the need for the part is recorded, and the information can be synced opportunistically when a connection becomes available.
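
Here is a minimal store-and-forward sketch of that pattern; the queue, the connectivity flag, and the sync target are all hypothetical stand-ins:

```python
# Record needs locally while offline; sync opportunistically when connected.
import json
import time
from collections import deque

pending = deque()  # orders captured on site, awaiting a connection

def record_order(part_id: str, qty: int) -> None:
    """Record the need for a part even when the site is offline."""
    pending.append({"part": part_id, "qty": qty, "ts": time.time()})

def sync_when_connected(is_online: bool) -> None:
    """Flush queued orders to mission central once a connection appears."""
    while is_online and pending:
        print("synced to central:", json.dumps(pending.popleft()))

record_order("conveyor-belt-roller", 4)
sync_when_connected(is_online=False)  # stays queued
sync_when_connected(is_online=True)   # flushes the backlog
```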

Protection

Intelligent edge processing can help deliver the security needed for IoT deployments. By their very nature, such deployments emphasize openness and are designed to work with other networks—many of which may not be under your control. With intelligent edge processing, you can track the unique identities of sensors in your network, encrypt any data sent out, and run the necessary checks on data coming in. On-site processing in this fashion, in fact, is required—because managing such security via the cloud would not only introduce data latency into the equation, but could also open up holes to be exploited by malicious actors.
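
As a hedged illustration, the sketch below gives each sensor a unique identity and authenticates its messages with a per-device key so tampered data is rejected on site. Real deployments would add encryption (for example, TLS) on top; the key store here is purely illustrative.

```python
# Edge-side protection: per-sensor identity plus message authentication.
import hashlib
import hmac

SENSOR_KEYS = {"sensor-001": b"per-device-secret"}  # assumed provisioning store

def sign(sensor_id: str, payload: bytes) -> str:
    return hmac.new(SENSOR_KEYS[sensor_id], payload, hashlib.sha256).hexdigest()

def verify(sensor_id: str, payload: bytes, tag: str) -> bool:
    """Check incoming data against the sensor's key before trusting it."""
    return hmac.compare_digest(sign(sensor_id, payload), tag)

message = b'{"temp": 21.4}'
tag = sign("sensor-001", message)
print(verify("sensor-001", message, tag))            # True: authentic
print(verify("sensor-001", b'{"temp": 99.0}', tag))  # False: tampered payload
```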

So, yes, the IoT is growing—and along with it, the volumes of data companies are required to manage. This volume of data cannot be managed entirely via the cloud. Edge processing is a solution to this problem. Take a look at the “4 P’s” paper here: “Excellence at the Edge: Achieving Business Outcomes in a Connected World.” And stay tuned for my final blog in this series: “Edge Computing and the New Decentralization: The Rhyming of IT History.”

[1] http://www.idc.com/infographics/IoT

[2] https://www.google.com/intl/en/ipv6/

About Chuck Pharris

Chuck Pharris has over 25 years of experience in the manufacturing industries, driving sales and marketing at a large controls systems company, overseeing plant automation and supply chain management, and conducting energy analysis for the U.S. Department of Energy’s research laboratory. He has worked in all areas of plant automation, including process control, energy management, production management, and supply chain management. He currently is the global director of IoT marketing for SAP. Contact him at chuck.pharris@sap.com.

How To Get The Best Out Of Automation

Dr. Markus Noga and Sebastian Schroetel

In 2016, the management consulting firm McKinsey predicted that up to 70% of all tasks are potentially automatable with so-called next-generation technologies. Companies worldwide jumped on that bandwagon and invested heavily in one of the hottest innovations in automation: robotics.

Today, only a few months later, some experts claim that robotic process automation (RPA) was merely a fast-moving trend fueled by the excitement of industry leaders, and that it is not the predicted “panacea” for all the automation challenges enterprises face. Recently, another McKinsey blog post tackled this topic and qualified the initial enthusiasm for bots and their supposed potential to take over all sorts of back-office processes. In fact, the rapid adoption of robotics pushed consideration of its potential downsides aside.

According to McKinsey, “installing thousands of bots has taken a lot longer and is more complex than most had hoped it would be” and “not unlike humans, thousands of bots need care and attention – in the form of maintenance, upgrades, and cybersecurity protocols, introducing additional costs and demanding ongoing focus for executives.” All in all, the authors state that the economic results of RPA underperformed the estimations, especially with regard to cost reduction. The impression has been strengthened that “people do many different things, and bots may only address some of them.”

The next level of automation

Even though the latest wave of robotics has come with unexpected complexity, limited flexibility, and additional maintenance, automation keeps pushing forward. The aim is to help enterprises realize their potential and shift focus from merely “keeping the lights on” with human labor to growth generated by automation technology. A dedicated automation approach to achieving the intelligent enterprise involves three interacting levels:

  • Software components, or “engines,” that provide automation relying on highly specific process knowledge
  • Machine learning that involves teaching a computer how to spot patterns and make connections by showing it a massive volume of data – algorithms that can learn from experience without having to be explicitly programmed
  • Robotic process automation software that operates another application without the support of a human user, helping to run repetitive, rule-based monotonous tasks and bridging temporary gaps

In contrast to the RPA-only approach many companies have pursued in the past, only the integration of all three layers lifts the enterprise to the next automation level. Engines are the basis of enterprise automation activities. They enable companies to shape their processes by deciding where to direct incoming inquiries at each subsequent step. But engines have fixed logic and limited configuration options, so they cannot cover all facets of a business process and can facilitate automation in only up to 60% of all cases.

In credit management, for example, credit-rules engines can help evaluate personal creditworthiness and process credit limit applications in a structured way. This is done by automatically categorizing them based on defined scoring rules and assigning a specific credit limit to the customer after the examination is completed.
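
A toy rules engine in the spirit of this example shows both the strength of the engine layer (deterministic, auditable scoring) and its limitation (fixed logic that covers only anticipated cases). The rules, thresholds, and limits are invented for illustration:

```python
# Fixed-logic credit rules engine: defined scoring rules map to credit limits.
def score(application: dict) -> int:
    points = 0
    if application["years_as_customer"] >= 3:
        points += 2
    if application["on_time_payment_rate"] >= 0.95:
        points += 3
    if application["annual_revenue"] > 1_000_000:
        points += 2
    return points

def credit_limit(application: dict) -> int:
    bands = {0: 0, 3: 50_000, 5: 250_000}  # minimum score -> granted limit
    s = score(application)
    return max(limit for minimum, limit in bands.items() if s >= minimum)

applicant = {"years_as_customer": 4, "on_time_payment_rate": 0.97,
             "annual_revenue": 800_000}
print(credit_limit(applicant))  # 250000 (score 5 reaches the top band)
```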

Applications for machine learning

But what happens when a scenario occurs that the operator has never encountered? By adding intelligent technologies to the automation portfolio, processes become noticeably smarter. Machine learning can raise the automation level of a process to as much as 98%. How? By setting up general guidelines without telling the system exactly what to do. The underlying algorithm learns from the operator’s previous actions and takes all available data into account to deliver the most relevant response to an occurrence.

Applying this to credit management, machine learning is useful in those cases where a customer lacks a dedicated credit history. Here, machine learning fills the gap with more accurate forecasting models based on people’s overall payment history, on the borrower’s interaction behavior on the lender’s website, and on other unstructured data sets.
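
Purely as an illustration of learning from past outcomes rather than fixed rules, the sketch below fits a tiny model on invented behavioral signals; the features, data, and the choice of scikit-learn are assumptions, not the actual product implementation.

```python
# Learn a repayment signal from past decisions instead of hand-written rules.
from sklearn.linear_model import LogisticRegression

# Past applicants: [overall_payment_history_score, website_engagement_minutes]
X = [[0.9, 12], [0.8, 30], [0.2, 3], [0.4, 5], [0.95, 25], [0.1, 2]]
y = [1, 1, 0, 0, 1, 0]  # 1 = repaid, taken from the operator's past outcomes

model = LogisticRegression().fit(X, y)
new_applicant = [[0.7, 20]]  # no formal credit history, but usable signals
print("repayment probability:", round(model.predict_proba(new_applicant)[0][1], 2))
```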

Robotics, as the third automation layer, can help automate the remaining 2% of repetitive, monotonous tasks in a process. But due to its lower integration level, RPA is limited in its reach and adds those final percentage points at much higher cost. In financial risk management processes like bank lending, robotics can deal with requests for overdraft protection or credit card approvals.

A genuine alternative to mere bot systems

Given the downsides of bot systems, a multi-layer automation approach is the way to set up a stable and holistic automation concept, an alternative to pure RPA that softens or avoids the disadvantages bot systems entail.

The McKinsey authors support the thesis that robotics should be used in exceptional cases, instead of being applied as the universal remedy to deal with repetitive tasks.

All in all, enterprises are actively searching for ways to shape their processes and automate parts of their work. Robots alone are perceived as too inflexible, expensive, and complex to maintain to accomplish these goals satisfactorily. By expanding the automation portfolio with engines and machine learning, a meshed system of automation technologies can address these concerns and foster a holistic implementation of automation throughout the enterprise.

Currently, companies and CIOs are resetting their bot programs. Figuring out the desired goal of automation can help steer it in the right direction.

SAP’s automation strategy in general, and our cloud-based machine learning portfolio and related services in particular, are ready to step in and fill the automation gaps that bots leave.

About Dr. Markus Noga

Dr. Markus Noga is vice president of Machine Learning at SAP. His team applies deep learning, machine learning (ML), and advanced data science to solve business challenges. The ML team aspires to build SAP’s next growth business in intelligent solutions and works closely with existing product units and platform teams to deliver business value to their customers. Part of the SAP Innovation Center Network (ICN), the Machine Learning team operates as a lean startup within SAP, with sites in Germany, Israel, Singapore, and Palo Alto.

About Sebastian Schroetel

Sebastian Schroetel is a director at SAP for machine learning in the digital core. In this role, Sebastian and his global team shape and create machine learning solutions for SAP's core ERP products. Sebastian has 10 years of experience in innovation software development, with focus on automation, analytics, and data processing.

Diving Deep Into Digital Experiences

Kai Goerlich

 

  • Google Cardboard VR goggles cost US$8
  • By 2019, immersive solutions will be adopted in 20% of enterprise businesses
  • By 2025, the market for immersive hardware and software technology could be $182 billion
  • In 2017, Lowe’s launched Holoroom How To VR DIY clinics

From Dipping a Toe to Fully Immersed

The first wave of virtual reality (VR) and augmented reality (AR) is here, using smartphones, glasses, and goggles to place us in the middle of 360-degree digital environments or overlay digital artifacts on the physical world. Prototypes, pilot projects, and first movers have already emerged:

  • Guiding warehouse pickers, cargo loaders, and truck drivers with AR
  • Overlaying constantly updated blueprints, measurements, and other construction data on building sites in real time with AR
  • Building 3D machine prototypes in VR for virtual testing and maintenance planning
  • Exhibiting new appliances and fixtures in a VR mockup of the customer’s home
  • Teaching medicine with AR tools that overlay diagnostics and instructions on patients’ bodies

A Vast Sea of Possibilities

Immersive technologies leapt forward in spring 2017 with the introduction of three new products:

  • Nvidia’s Project Holodeck, which generates shared photorealistic VR environments
  • A cloud-based platform for industrial AR from Lenovo New Vision AR and Wikitude
  • A workspace and headset from Meta that lets users use their hands to interact with AR artifacts

The Truly Digital Workplace

New immersive experiences won’t simply be new tools for existing tasks. They promise to create entirely new ways of working.

VR avatars that look and sound like their owners will soon be able to meet in realistic virtual meeting spaces without requiring users to leave their desks or even their homes. With enough computing power and a smart-enough AI, we could soon let VR avatars act as our proxies while we’re doing other things—and (theoretically) do it well enough that no one can tell the difference.

We’ll need a way to signal when an avatar is being human driven in real time, when it’s on autopilot, and when it’s owned by a bot.


What Is Immersion?

A completely immersive experience that’s indistinguishable from real life is impossible given the current constraints on power, throughput, and battery life.

To make current digital experiences more convincing, we’ll need interactive sensors in objects and materials, more powerful infrastructure to create realistic images, and smarter interfaces to interpret and interact with data.

When everything around us is intelligent and interactive, every environment could have an AR overlay or VR presence, with use cases ranging from gaming to firefighting.

We could see a backlash touting the superiority of the unmediated physical world—but multisensory immersive experiences that we can navigate in 360-degree space will change what we consider “real.”


Download the executive brief Diving Deep Into Digital Experiences.


Read the full article Swimming in the Immersive Digital Experience.

About Kai Goerlich

Kai Goerlich is the Chief Futurist at SAP Innovation Center Network. His specialties include competitive intelligence, market intelligence, corporate foresight, trends, futuring, and ideation. Share your thoughts with Kai on Twitter @KaiGoe.

Jenny Dearborn: Soft Skills Will Be Essential for Future Careers

Jenny Dearborn

The Japanese culture has always shown a special reverence for its elderly. That’s why, in 1963, the government began a tradition of giving a silver dish, called a sakazuki, to each citizen who reached the age of 100 by Keiro no Hi (Respect for the Aged Day), which is celebrated on the third Monday of each September.

That first year, there were 153 recipients, according to The Japan Times. By 2016, the number had swelled to more than 65,000, and the dishes cost the already cash-strapped government more than US$2 million, Business Insider reports. Despite the country’s continued devotion to its seniors, the article continues, the government felt obliged to downgrade the finish of the dishes to silver plating to save money.

What tends to get lost in discussions about automation taking over jobs and Millennials taking over the workplace is the impact of increased longevity. In the future, people will need to be in the workforce much longer than they are today. Half of the people born in Japan today, for example, are predicted to live to 107, making their ancestors seem fragile, according to Lynda Gratton and Andrew Scott, professors at the London Business School and authors of The 100-Year Life: Living and Working in an Age of Longevity.

The End of the Three-Stage Career

Assuming that advances in healthcare continue, future generations in wealthier societies could be looking at careers lasting 65 or more years, rather than at the roughly 40 years for today’s 70-year-olds, write Gratton and Scott. The three-stage model of employment that dominates the global economy today—education, work, and retirement—will be blown out of the water.

It will be replaced by a new model in which people continually learn new skills and shed old ones. Consider that today’s most in-demand occupations and specialties did not exist 10 years ago, according to The Future of Jobs, a report from the World Economic Forum.

And the pace of change is only going to accelerate. Sixty-five percent of children entering primary school today will ultimately end up working in jobs that don’t yet exist, the report notes.

Our current educational systems are not equipped to cope with this degree of change. For example, roughly half of the subject knowledge acquired during the first year of a four-year technical degree, such as computer science, is outdated by the time students graduate, the report continues.

Skills That Transcend the Job Market

Instead of treating post-secondary education as a jumping-off point for a specific career path, we may see a switch to a shorter school career that focuses more on skills that transcend a constantly shifting job market. Today, some of these skills, such as complex problem solving and critical thinking, are taught mostly in the context of broader disciplines, such as math or the humanities.

Other competencies that will become critically important in the future are currently treated as if they come naturally or over time with maturity or experience. We receive little, if any, formal training, for example, in creativity and innovation, empathy, emotional intelligence, cross-cultural awareness, persuasion, active listening, and acceptance of change. (No wonder the self-help marketplace continues to thrive!)

These skills, which today are heaped together under the dismissive “soft” rubric, are going to harden up to become indispensable. They will become more important, thanks to artificial intelligence and machine learning, which will usher in an era of infinite information, rendering the concept of an expert in most of today’s job disciplines a quaint relic. As our ability to know more than those around us decreases, our need to be able to collaborate well (with both humans and machines) will help define our success in the future.

Individuals and organizations alike will have to learn how to become more flexible and ready to give up set-in-stone ideas about how businesses and careers are supposed to operate. Given the rapid advances in knowledge and attendant skills that the future will bring, we must be willing to say, repeatedly, that whatever we’ve learned to that point doesn’t apply anymore.

Careers will become more like life itself: a series of unpredictable, fluid experiences rather than a tightly scripted narrative. We need to think about the way forward and be more willing to accept change at the individual and organizational levels.

Rethink Employee Training

One way that organizations can help employees manage this shift is by rethinking training. Today, overworked and overwhelmed employees devote just 1% of their workweek to learning, according to a study by consultancy Bersin by Deloitte. Meanwhile, top business leaders such as Bill Gates and Nike founder Phil Knight spend about five hours a week reading, thinking, and experimenting, according to an article in Inc. magazine.

If organizations are to avoid high turnover costs in a world where the need for new skills is shifting constantly, they must give employees more time for learning and make training courses more relevant to the future needs of organizations and individuals, not just to their current needs.

The amount of learning required will vary by role. That’s why at SAP we’re creating learning personas for specific roles in the company and determining how many hours will be required for each. We’re also dividing up training hours into distinct topics (a quick calculation follows the list):

  • Law: 10%. This is training required by law, such as training to prevent sexual harassment in the workplace.

  • Company: 20%. Company training includes internal policies and systems.

  • Business: 30%. Employees learn skills required for their current roles in their business units.

  • Future: 40%. This is internal, external, and employee-driven training to close critical skill gaps for jobs of the future.
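
As a quick sanity check of how this split plays out, here is the arithmetic for a hypothetical 40-hour annual learning budget; the budget figure is an assumption, while the percentages come from the list above.

```python
# Split one persona's annual learning hours across the four topic buckets.
TOPIC_SPLIT = {"Law": 0.10, "Company": 0.20, "Business": 0.30, "Future": 0.40}

annual_hours = 40  # hypothetical budget for a single learning persona
for topic, share in TOPIC_SPLIT.items():
    print(f"{topic}: {annual_hours * share:.0f} hours")  # Law: 4 ... Future: 16
```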

In the future, we will always need to learn, grow, read, seek out knowledge and truth, and better ourselves with new skills. With the support of employers and educators, we will transform our hardwired fear of change into excitement for change.

We must be able to say to ourselves, “I’m excited to learn something new that I never thought I could do or that never seemed possible before.”
