
The Future Of Open Data

Heather McIlvaine

How are mobile apps, Big Data, and civic hacking changing the nature of open data in government? The Center for Technology in Government took a look at this topic and presents its findings.

Picture this: You’re a tourist in New York City and after a long day of sightseeing you’re looking for a bite to eat. But before you simply head into the nearest sandwich shop, you look it up on “NYC Restaurant Scrutinizer.” This mobile app isn’t your average restaurant guide: Along with food reviews, you can also see the results from restaurant inspections conducted by the New York City Department of Health and Mental Hygiene. After finding out that the sandwich shop doesn’t properly refrigerate its deli meats, you decide to go to the Chinese restaurant on the corner with an “A” grade. And delicious spring rolls to boot.

“NYC Restaurant Scrutinizer” was developed by a private app developer named Michael Boski in 2009. He beat the government agency to the punch by about three years: “ABCEats,” the health department’s own mobile app for checking restaurant grades, first appeared on iTunes in 2012.

Civic hacking and open data

Boski is a so-called civic hacker, a private citizen who taps into publicly available data to create useful apps for the general population. Since most governments endorse open data policies that disseminate such information, civic hacking is not illegal.

The New York City health department, for example, has been posting its inspection results online for anyone to see since the mid-1990s. It’s worth noting that without this policy, “NYC Restaurant Scrutinizer” would not be possible. On the other hand, it was Boski, not the government, who took the first step in making the information easier to consume, which increased its value to citizens.

That is often the case: In its current form, open government data is large and complex and usually lacks the context for everyday use. But not everyone thinks civic hacking is the answer to this problem. For one thing, apps from private developers, like “NYC Restaurant Scrutinizer,” can’t guarantee the accuracy of information they provide. And they usually don’t take into account any downstream consequences that might negatively affect businesses and government agencies.

“The Dynamics of Opening Government Data”

SAP’s Public Services Industry Business Solutions team wanted to find out more about the impact of these open data initiatives on the public and private sector, so it commissioned the Center for Technology in Government (CTG), an independent research organization at the University at Albany, State University of New York (SUNY), to do further research in the area. The resulting whitepaper, “The Dynamics of Opening Government Data,” examines two case studies of open government initiatives and presents four recommendations based on CTG’s findings.

One of those case studies was the previously mentioned example with the New York City health department. The other case study involved the Department of Transportation for the city of Edmonton, Canada, and its release of data on planned road construction projects.

“In both cases, it was clear that simply opening government data – making it available to the public – is not the whole story,” says Anthony Cresswell, a senior fellow at CTG. “It has longer-term consequences that are difficult to predict. In our research, we sought to come up with strategies and policies that help government agencies understand ahead of time how opening data will affect all the stakeholders involved.”

The information polity

Identifying and understanding this collection of stakeholders, what Cresswell and his colleagues call an ‘information polity,’ is one of the key aspects of the whitepaper. “An information polity includes the government agencies that create the data, employees who use the data, citizens who want access to the data, app developers who modify the data, and other stakeholders,” explains CTG senior program associate Brian Burke.

In other words, Michael Boski is part of an information polity, as are the people using his app, as are the restaurants being inspected, as is – most importantly – the government agency conducting those inspections and providing the data. “The timeliness, format, and quality of data that the government provides all affect the usability of that data for developers,” says Burke. “And it is the relationship between these information sources that ultimately influences the value of that data for the public.”

Four recommendations from CTG

How should government agencies go about implementing open data initiatives that maximize value and minimize risk? Cresswell and Burke explain the four recommendations outlined in the whitepaper.

1. Release government data that are relevant to both agency performance and the public interest.

As part of the Open Government Initiative launched by the Obama administration in 2009, U.S. federal agencies published high-value datasets online at data.gov. Anyone can access the website and explore a vast amount of information, ranging from the location of every farmer’s market in the country to the average energy consumption per household to the U.S. trade volume in tomatoes.

But how many citizens really want to know what the current yield of the country’s tomato crop is? “As government agencies try to balance resources, time, and effort, they should choose to focus on those datasets that hold the most public value,” says Cresswell.

2. Invest in strategies to estimate how different stakeholders will use the data.

“Some datasets, like government budgets, don’t lend themselves to use on a smartphone. Others, like restaurant inspection results, make a lot more sense when you connect them to geospatial data so they can be used on the go. If you model how users are likely to interact with the data, you can choose the technology solution that will maximize value,” says Burke.
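As a sketch of the kind of modeling Burke describes, the snippet below joins inspection grades with coordinates and answers the on-the-go question: which well-graded restaurants are within walking distance? All names, grades, and coordinates here are invented for illustration; real data would come from the health department’s published datasets.

```python
import math

# Hypothetical inspection results already joined with coordinates.
restaurants = [
    {"name": "Corner Deli",   "grade": "C", "lat": 40.7587, "lon": -73.9787},
    {"name": "Golden Dragon", "grade": "A", "lat": 40.7590, "lon": -73.9780},
    {"name": "Pasta Palace",  "grade": "B", "lat": 40.7650, "lon": -73.9800},
]

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance via the haversine formula."""
    r = 6371.0  # Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_with_grade(lat, lon, min_grade="A", radius_km=0.5):
    """Restaurants within radius_km whose grade is at least min_grade."""
    return [
        r for r in restaurants
        if r["grade"] <= min_grade  # "A" < "B" < "C" lexicographically
        and distance_km(lat, lon, r["lat"], r["lon"]) <= radius_km
    ]

print(nearby_with_grade(40.7589, -73.9786))  # only the "A"-graded spot nearby
```

The same inspection dataset serves a very different interaction on a phone than on a desktop dashboard; modeling the query a user will actually make is what drives the choice of format and technology.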

In addition, different stakeholders may want access to different kinds of data. In the road construction case study, for example, a commuter might want to know what the estimated delay on a particular route is, while a construction site foreman digging near a road block might want access to the location of the newly-laid sewer line. Government agencies will have to decide whether they can – or should – invest in collecting new data to suit these different needs.

3. Devise data management practices that improve context in order to “future-proof” data resources.

“Good metadata will help you ‘future-proof’ data resources. It’s a very simple thing, but very powerful,” says Burke. The developer who created the road construction app for the citizens of Edmonton reported that building the app was very straightforward, thanks to the high-quality metadata already provided by the city.

For example, the dataset already included geospatial data that was compliant with GIS standards, which made mapping the information easier and more accurate. Good data management practices are essential for government agencies looking to make their data more accessible and useful.
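A minimal illustration of what such standards-compliant publishing might look like: the hypothetical record below expresses a road construction project as a GeoJSON Feature (RFC 7946), with coordinates in a standard reference system and descriptive metadata alongside the geometry. The field names are assumptions for illustration, not Edmonton’s actual schema.

```python
import json

# A hypothetical road-construction record as a GeoJSON Feature (RFC 7946):
# coordinates are [longitude, latitude] in WGS 84, and the descriptive
# metadata lives in "properties".
project = {
    "type": "Feature",
    "geometry": {
        "type": "LineString",
        "coordinates": [[-113.4938, 53.5461], [-113.4902, 53.5478]],
    },
    "properties": {
        "project_id": "RC-2013-042",          # illustrative identifier
        "description": "Sewer line replacement along a downtown avenue",
        "start_date": "2013-05-01",
        "end_date": "2013-08-15",
        "expected_delay_minutes": 10,
        "last_updated": "2013-04-20",
    },
}

print(json.dumps(project, indent=2))
```

Because the geometry follows a published standard, any GIS tool or mapping library can consume it directly, which is exactly what made the Edmonton developer’s job straightforward.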

4. Think about sustainability.

Planning for the long-term sustainability of a given dataset is strongly linked to understanding the values and risks associated with releasing the data in the first place. Before restaurant grades were posted online by the New York City health department, restaurants had to display their rating on the premises, but they could easily hide a bad report in a less visible place. Once the agency began posting inspection reports online in the 1990s, restaurants felt the consequences of a bad rating much more severely.

As a result, they began demanding more frequent health inspections, so they would have the chance to improve their score. The health department responded to this demand by hiring more inspectors. “If you identify the value and risk of releasing information, you can better predict what additional resources you’ll need down the line,” recommends Burke.

What does this mean for SAP?

While government agencies reevaluate their open data policies in light of these four recommendations, SAP is also examining what this means for the software industry: “CTG’s research gives SAP a better understanding of the challenges that the public sector is facing today to support open data and open government mandates,” says Elizabeth McGowan, director of technology & innovation for business solutions in the public services industry at SAP. “To generate public value going forward, it will be crucial for governments – at all levels – to invest in strategies that open data for exploration by citizens, stakeholders, and other government agencies. We believe that solutions for big data management and analysis will be a critical part of these strategies.”

Follow @SAPOpenGov on Twitter to stay up-to-date on the latest open government news, events, and insights.

Photo: iStockphoto



About Heather McIlvaine

Heather McIlvaine is the Editor of SAP.info. Her specialties include writing, editing, journalism, online research and publishing.


Why 3D Printed Food Just Transformed Your Supply Chain

Hans Thalbauer

Numerous sectors are experimenting with 3D printing, which has the potential to disrupt many markets. One that’s already making progress is the food industry.

The U.S. Army hopes to use 3D printers to customize food for each soldier. NASA is exploring 3D printing of food in space. The technology could eventually even end hunger around the world.

What does that have to do with your supply chain? Quite a bit — because 3D printing does more than just revolutionize the production process. It also requires a complete realignment of the supply chain.

And the way 3D printing transforms the supply chain holds lessons for how organizations must reinvent themselves in the new era of the extended supply chain.

Supply chain spaghetti junction

The extended supply chain replaces the old linear chain with not just a network, but a network of networks. The need for this network of networks is being driven by four key factors: individualized products, the sharing economy, resource scarcity, and customer-centricity.

To understand these forces, imagine you operate a large restaurant chain, and you’re struggling to differentiate yourself against tough competition. You’ve decided you can stand out by delivering customized entrees. In fact, you’re going to leverage 3D printing to offer personalized pasta.

With 3D printing technology, you can make one-off pasta dishes on the fly. You can give customers a choice of ingredients (gluten-free!), flavors (salted caramel!), and shapes (Leaning Towers of Pisa!). You can offer the personalized pasta in your restaurants, in supermarkets, and on your ecommerce website.

You may think this initiative simply requires you to transform production. But that’s just the beginning. You also need to re-architect research and development, demand signals, asset management, logistics, partner management, and more.

First, you need to develop the matrix of ingredients, flavors, and shapes you’ll offer. As part of that effort, you’ll have to consider health and safety regulations.

Then, you need to shift some of your manufacturing directly into your kitchens. That will also affect packaging requirements. Logistics will change as well, because instead of full truckloads, you’ll be delivering more frequently, with more variety, and in smaller quantities.

Next, you need to perfect demand signals to anticipate which pasta variations, in which quantities, will come through which channels. And you need to manage supply signals to source more kinds of raw materials in closer to real time.

Last, the source of your signals will change. Some will continue to come from the point of sale. But others, such as supplies replenishment and asset maintenance, can come directly from your 3D printers.

Four key ingredients of the extended supply chain

As with our pasta scenario, the drivers of the extended supply chain require transformation across business models and business processes. First, growing demand for individualized products calls for the same shifts in R&D, asset management, logistics, and more that 3D printed pasta requires.

Second, as with the personalized entrees, the sharing economy integrates a network of partners, from suppliers to equipment makers to outsourced manufacturing, all electronically and transparently interconnected, in real time and all the time.

Third, resource scarcity involves pressures not just on raw materials but also on full-time and contingent labor, with the necessary skills and flexibility to support new business models and processes.

And finally, for personalized pasta sellers and for your own business, it all comes down to customer-centricity. To compete in today’s business environment and to meet current and future customer expectations, all your operations must increasingly revolve around rapidly comprehending and responding to customer demand.

Want to learn more? Check out my recent video on digitalizing the extended supply chain.



About Hans Thalbauer

Hans Thalbauer is the Senior Vice President, Extended Supply Chain, at SAP. He is responsible for the strategic direction and the Go-To-Market of solutions for Supply Chain, Logistics, Engineering/R&D, Manufacturing, Asset Management and Sustainability at SAP.

How to Design a Flexible, Connected Workspace 

John Hack, Sam Yen, and Elana Varon

The process of designing a new product starts with a question: What problem is the product supposed to solve? To get the right answer, designers prototype more than one solution and refine their ideas based on feedback.

Similarly, the spaces where people work and the tools they use are shaped by the tasks they have to accomplish to execute the business strategy. But when the business strategy and employees’ jobs change, the traditional workspace, with fixed walls and furniture, isn’t so easy to adapt. Companies today, under pressure to innovate quickly and create digital business models, need to develop a more flexible work environment, one in which office employees have the ability to choose how they work.

Within an office building, flexibility may constitute a variety of public and private spaces, geared for collaboration or concentration, explains Amanda Schneider, a consultant and workplace trends blogger. Or, she adds, companies may opt for customizable spaces, with moveable furniture, walls, and lighting that can be adjusted to suit the person using an unassigned desk for the day.

Flexibility may also encompass the amount of physical space the company maintains. Business leaders want to be able to set up operations quickly in new markets or in places where they can attract top talent, without investing heavily in real estate, says Sande Golgart, senior vice president of corporate accounts with Regus.

Thinking about the workspace like a designer elevates decisions about the office environment to a strategic level, Golgart says. “Real estate is beginning to be an integral part of the strategy, whether that strategy is for collaborating and innovating, driving efficiencies, attracting talent, maintaining higher levels of productivity, or just giving people more amenities to create a better, cohesive workplace,” he says. “You will see companies start to distance themselves from their competition because they figured out the role that real estate needs to play within the business strategy.”

The SAP Center for Business Insight program supports the discovery and development of new research-based thinking to address the challenges of business and technology executives.



About Sam Yen

Sam Yen is the Chief Design Officer for SAP and the Managing Director of SAP Labs Silicon Valley. He is focused on driving a renewed commitment to design and user experience at SAP. Under his leadership, SAP further strengthens its mission of listening to customers´ needs leading to tangible results, including SAP Fiori, SAP Screen Personas and SAP´s UX design services.


What If Chelsea Manager Jose Mourinho Could Be Proved Right In Medical Staff Row?

Mark Goad

Big Data and the Internet of Things bring a new level of insight to sports medicine

With the 2015-16 European football (soccer) season underway, we are already seeing the impact of the huge pressure to succeed. In some cases, it is boiling over even this early on, with Chelsea manager Jose Mourinho getting involved in a very public row with his medical staff over the treatment of Eden Hazard during a match. As the season builds momentum, all clubs know one of the most vital aspects of winning trophies is keeping the best players fit so they can play at the top of their game as often as possible.

Last season, just like in every season, we saw injuries that affected teams’ results and possibly their final standings at the end of the season, while other teams capitalized. Arsenal manager Arsene Wenger blamed injuries for the team’s failed title bid, while Real Madrid suffered injuries to players like Gareth Bale and Luka Modric at a crucial stage of the season and lost the title to Barcelona.

There’s no doubt that football clubs, especially the bigger teams, employ first-rate medical staff – physiotherapists, doctors, sports scientists, and so on – but they can only do so much to keep players off the treatment table. Players are human, after all, and keeping them injury-free for such long and grueling campaigns is a big ask. This season again will see players on the end of crunching tackles, over-exerting their bodies, and over-stretching.

What’s less talked about than lost games and league titles when discussing injuries is the salaries paid to injured players. The estimated average cost of player injuries in the top four professional football leagues in 2015 was $12.4 million* per team. Remarkably, every year teams lose an equivalent of 15%-30%** of their player payroll to injuries.

As salaries continue to rise, injuries are becoming just as much of an off-the-pitch boardroom issue as they are an on-the-pitch issue. Consider that if Barcelona’s Lionel Messi, the world’s highest-paid player, spends just a week out injured, the club still has to pay his weekly salary of around $1 million. Not only that, but there’s the huge potential for lost revenue from missing out on UEFA Champions League progress or domestic success because key players are out.
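The back-of-the-envelope arithmetic behind these figures is straightforward; in the sketch below, the squad payroll is a hypothetical round number, not a real club’s accounts.

```python
# Rough injury-cost arithmetic using the figures quoted above.
weekly_salary = 1_000_000          # top player's weekly salary, USD
weeks_out = 1
salary_paid_while_injured = weekly_salary * weeks_out

annual_payroll = 400_000_000       # hypothetical squad payroll, USD
low, high = 0.15, 0.30             # share of payroll lost to injuries per year

print(salary_paid_while_injured)                     # 1000000
print(annual_payroll * low, annual_payroll * high)   # 60000000.0 120000000.0
```

Even at the conservative end of the range, the payroll lost to injuries dwarfs the cost of the monitoring technology meant to prevent them, which is what makes this a boardroom issue.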

Just as winning seems to mean more than ever, so does football as a business. So with the spotlight firmly on “sweating the assets” – extracting maximum value from the entire squad – clubs are looking to Big Data and Internet of Things technology to consider how player injuries can be prevented with new levels of insight.

Prevention is better than cure

In July this year we saw what could be a huge landmark in the potential of monitoring injury risk, when football’s international governing body FIFA announced its approval of wearable electronic performance and tracking systems during matches. As well as collecting statistics like distance covered and heart rate to inform decisions like substitution timing, this paves the way for wearable satellite devices that keep medical staff updated on the likelihood of a player picking up an injury from over-exertion.

Emerging injury-risk monitoring software uses the concepts of Big Data and wearable technology to pull in and apply mathematical formulas to an exhaustive range of relevant data about players: fitness levels, recent levels of exertion, opponents, age, technique, hydration, even weather. This could help medical staff predict the risk of future injuries with much greater accuracy, allowing them to run simulations and take corrective actions in real time. Imagine a seemingly non-injured key player being substituted during a tightly contested match, only to find out afterwards that monitoring software had indicated he was at a high risk of pulling a muscle. This could very much be a part of the future of professional football.
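As a rough sketch of the idea, not SAP’s actual model, such software might combine normalized risk factors into a weighted score and flag players who cross a threshold. The factor names, weights, and threshold below are all illustrative assumptions.

```python
# Illustrative weights over normalized risk factors (each in [0, 1]).
WEIGHTS = {
    "recent_exertion": 0.35,   # minutes played in recent matches, normalized
    "fatigue": 0.25,           # inverse of recovery time
    "injury_history": 0.25,    # prior muscle injuries, normalized
    "age_factor": 0.15,
}

def injury_risk(factors):
    """Weighted sum of normalized factors; returns a score in [0, 1]."""
    return sum(WEIGHTS[name] * factors[name] for name in WEIGHTS)

player = {"recent_exertion": 0.9, "fatigue": 0.8,
          "injury_history": 0.6, "age_factor": 0.4}
score = injury_risk(player)
print(round(score, 3))  # 0.725
if score > 0.7:         # illustrative substitution threshold
    print("flag for possible substitution")
```

A production system would fit such weights from historical injury data and recompute the score in real time as wearable readings stream in during a match.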

Going back to Jose Mourinho and his reaction to the Chelsea medical staff running onto the pitch to treat Eden Hazard, it’s interesting to consider how in the future this kind of technology could either support or discredit his position in the dispute. It could help managers work more closely with physiotherapists, as they can visualize the data that shows the risk of injury to players. Although the pressure to win will likely keep rising, the risk of expensive player injuries could see a big reduction.

SAP’s own injury risk monitoring software is currently in the proof-of-concept phase and will be entering development in the near future. The goal is to build injury risk monitoring (IRM) on the SAP Sports One platform as an additional component, with integration into the existing modules of the SAP Sports One solution. SAP Sports One was launched earlier this year and is the first sports-specific cloud solution powered by the SAP HANA platform, providing a single, unified platform for team management and performance optimization.

*Statistic calculated using the 2015 Global Sports Salaries Survey

**Bleacher Report “Inside the 2014 Numbers of Each MLB Team’s Regular-Season Injury Impact” and NBA Injury Analysis



About Mark Goad

Mark Goad is a Client Partner at SAP. His specialties include social media, digital marketing, analytics, strategy and management.


Big, Bad Data: How Talent Analytics Will Make It Work In HR

Meghan M Biro

Here’s a mind-blowing fact: Research from IBM shows that 90% of the data in the world today has been created in the last two years alone. I find this fascinating.

That means companies have access to an unprecedented amount of information: insights, intelligence, trends, future-casting. In terms of HR, it’s a gold mine of Big Data.

This past spring, I welcomed the Industry Trends in Human Resources Technology and Service Delivery Survey, conducted by the Information Services Group (ISG), a leading technology insights, market intelligence, and advisory services company. It’s a useful study, particularly for leaders and talent managers, offering a clear glimpse of what companies investing in HR tech expect to gain from their investment.

Not surprisingly, there are three key benefits companies expect to realize from investments in HR tech:

• Improved user and candidate experience

• Access to ongoing innovation and best practices to support the business

• Speed of implementation to increase the value of technology to the organization.

It’s worth noting that what’s driving the need for an improved user interface, access, and speed is the nature of the new talent surging into the workforce: people for whom technology is nearly as much a given as air. They grew up with technology, are completely comfortable with it, and not only expect it to be available but assume it will be, as well as easy to use and responsive to all of their situations, with mobile and social components.

According to the ISG study, companies want HR tech to offer strategic alignment with their business. I view this as being more about enabling flexibility in talent management, recruiting, and retention — all of which are increasing in importance as Boomers retire, taking with them their deep base of knowledge and experience. Companies are also looking toward the analytics end of the benefit spectrum. No surprise here that the delivery model will be through cloud-based SaaS solutions.

Companies also want:

• Data security

• Data privacy

• Integration with existing systems, both HR and general IT

• Customizability, to align with internal systems and processes.

Cloud-based. According to the ISG report, more than 50% of survey respondents have implemented or are implementing cloud-based SaaS systems. It’s easy, it’s more cost-effective than on-premise software, and it’s where the exciting innovation is happening.

Mobile/social. That’s a given. Any HCM tool must have a good mobile user experience, from well-designed mobile forms and ease of access to a secure interface.

They want it to have a simple, intuitive user interface – another given. Whether accessed via desktop or mobile, the solution must offer a single, unified, simple-to-use interface.

They want it to offer social collaboration tools, which is particularly key for the influx of Millennials coming into the workplace, who expect to be able to collaborate via social channels. HR is no exception here. While challenging from a security and data protection angle, it’s a must.

But the final requirement the study reported is, in my mind, the most important: analytics and reporting. Management needs reporting to know their investment is paying off, and they also need robust analytics to keep ahead of trends within the workforce.

It’s not just a question of Big Data’s accessibility, or of sophisticated metrics, such as the key performance indicators (KPIs) that reveal the critical factors for success and measure progress made towards strategic goals. For organizations to realize the promise of Big Data, they must be able to cut through the noise and access the right analytics that will transform their companies for the better.
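To make that concrete, here is a hypothetical example of the kind of KPI such analytics might surface: annual voluntary turnover and average time-to-fill, computed from invented figures.

```python
# Illustrative talent KPIs computed from hypothetical HR figures.
employees_at_start = 500
employees_at_end = 520
voluntary_exits = 45

# Turnover is typically measured against average headcount for the period.
avg_headcount = (employees_at_start + employees_at_end) / 2
turnover_rate = voluntary_exits / avg_headcount

days_to_fill = [32, 47, 25, 61, 40]   # days per filled requisition
avg_time_to_fill = sum(days_to_fill) / len(days_to_fill)

print(f"voluntary turnover: {turnover_rate:.1%}")        # 8.8%
print(f"avg time to fill: {avg_time_to_fill:.0f} days")  # 41 days
```

The numbers themselves are simple; the analytics challenge is tracking them consistently over time and segmenting them by role, region, and manager so that trends become visible before they become problems.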

Given what companies are after, as shown in the ISG study, I predict that more and more companies will recognize the benefits of using integrated analytics for their talent management and workforce planning processes. Talent analytics creates a powerful, invaluable amalgam of data and metrics: it can identify the meaningful patterns within that data and, for whatever challenges and opportunities an organization faces, inform decision makers on the right tactics and strategies to move forward. It will take talent analytics to synthesize Big Data and metrics into the key strategic management decisions in HR. Put another way, it’s not just the numbers, it’s how they’re crunched.

For more on the power of talent analytics, see Talent Analytics: Predicting HR’s Way Out Of The Fog.

Image source: Simonebrunozzi via Wikipedia
