Five Key Benefits Big Data Can Deliver For Finance: Part 2

Nilly Essaides

Part 2 in a series. Read Part 1 here.

Finance professionals and experts interviewed by the Association for Financial Professionals (AFP) for its upcoming FP&A Guide, How Finance Can Get Ready for Big Data, releasing April 12, pointed out five key benefits finance can achieve from adopting Big Data strategies.

  1. Improved forecasting. The key benefit of incorporating Big Data strategies into FP&A is improved predictability. Big Data validates the assumptions that go into the business forecast, allowing FP&A to build a more accurate view of how external and internal events will affect the company’s performance, and thus its competitive position. A data-driven finance department can look forward more effectively and identify leading indicators. With that information, the CFO can make more educated decisions.
  1. Better KPIs. FP&A can also take advantage of Big Data when identifying and understanding value drivers, and then managing and monitoring financial and non-financial KPIs against those drivers. By the nature of its role, FP&A is well positioned to examine these relationships and assess whether core planning and reporting models reflect the right driver relationships and related KPIs.
  1. More predictable working capital. One area where Big Data already plays a role is analyzing and predicting working capital. Traditionally, finance would look at 15 factors that drive working capital and monitor them to come up with a forecast. Now an analyst can instead seek statistical correlations between working capital and any number of data points to arrive at a forecast for the organization.
  1. Identification of growth opportunities. In KPMG’s The View from the Top 2015 survey, CEOs identified leveraging financial data and analytics to find growth opportunities as one of the most valuable things CFOs can do. While marketing is clearly involved, finance is actually in a much better position – and has better access to data – to analyze the cost to serve across multiple dimensions (products, customers, services, channels), then examine pricing strategies and determine where to optimize profitability and growth.
  1. A stronger strategic role for FP&A. Finally, FP&A already brings multidisciplinary thinking and an analytical approach. Using Big Data and getting comfortable with some ambiguity allows FP&A professionals to adjust their thinking, and their recommendations, more quickly in response to changes in the business environment, today and in the future. Many FP&A groups are already shifting their focus from what happened to what’s going to happen and why. In this role, they are becoming a strategic partner to the business and senior management.
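
The working-capital approach in point 3 can be sketched as a simple correlation screen: rather than hand-tracking a fixed list of drivers, rank candidate data series by how strongly they correlate with historical working capital, then fit a regression on the strongest ones. All series, names, and coefficients below are synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_months = 36

# Hypothetical monthly series: working capital plus candidate drivers.
revenue = rng.normal(100, 10, n_months)
inventory_days = rng.normal(45, 5, n_months)
noise_series = rng.normal(0, 1, n_months)  # an irrelevant data point
working_capital = 0.4 * revenue + 0.8 * inventory_days + rng.normal(0, 2, n_months)

candidates = {
    "revenue": revenue,
    "inventory_days": inventory_days,
    "noise_series": noise_series,
}

# Rank candidate drivers by absolute correlation with working capital.
ranked = sorted(
    candidates.items(),
    key=lambda kv: abs(np.corrcoef(kv[1], working_capital)[0, 1]),
    reverse=True,
)
top_drivers = [name for name, _ in ranked[:2]]

# Fit a least-squares model on the strongest drivers and forecast.
X = np.column_stack([candidates[n] for n in top_drivers] + [np.ones(n_months)])
coef, *_ = np.linalg.lstsq(X, working_capital, rcond=None)
forecast = X @ coef
print(top_drivers)
```

The same screen scales to "any number of data points" simply by widening the `candidates` dictionary; the statistically weak series drop out of the ranking on their own.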

According to Allan Frank, chief IT strategist and co-founder of The Hackett Group, Big Data and related new tools present a tremendous opportunity for finance to take the lead, given its core fiduciary responsibilities. “The challenge for finance is how to develop an enterprise view of analytics,” he said. “The first thing is to realize you can find out more. You can ask questions you couldn’t ask before and frame them in the form of business outcomes.”

Over time there will likely be an evolution of the FP&A business analyst into the business data scientist, according to Philip Peck, vice president of finance transformation at Peloton. “FP&A practitioners will review and analyze all of the forward-looking KPIs and data available, dynamically adjust forecasts, make tactical recommendations, and effectively drive that information into operations,” he said.

Peck added that as finance and FP&A continue to extend and expand their business partnering activities across the organization, they have a unique opportunity to spearhead or at least guide Big Data and analytics efforts and become the go-to experts in this area. “Similar to the evolution we experienced when business intelligence became more prevalent, we are starting to see the emergence of analytic centers of excellence or competency centers,” he said.

To benchmark your organization’s forecasting methods and other FP&A processes, take the AFP FP&A Benchmarking Survey, in partnership with IBM. You can also connect with me on LinkedIn or follow me on Twitter.


About Nilly Essaides

Nilly Essaides is senior research director, Finance & EPM Advisory Practice at The Hackett Group. Nilly is a thought leader and frequent speaker and meeting facilitator at industry events, the author of multiple in-depth guides on financial planning & analysis topics, as well as monthly articles and numerous blogs. She was formerly director and practice lead of Financial Planning & Analysis at the Association for Financial Professionals, and managing director at the NeuGroup, where she co-led the company’s successful peer group business. Nilly also co-authored a book about knowledge management and how to transfer best practices with the American Productivity and Quality Center (APQC).

Real-Time Analysis Tools Critical To Improving Finance Performance [INFOGRAPHIC]

Viki Ghavalas

The majority of finance executives agree that real-time analysis tools are key to making better business decisions, according to a report by CFO Research and SAP titled “The Future of Financial Planning and Analysis.” However, executives polled also believe that their current systems still need more improvement to be able to make a positive impact on the business. Executives surveyed point to four main priorities for their FP&A tools.

Finance executives surveyed expect the demand for real-time analysis tools to grow in the coming years. However, the survey also shows that having these tools is not enough and that stakeholders also expect analysis and insights from finance that are simple and actionable.

Data in financial planning and analysis

Learn more about what finance executives are projecting for FP&A by downloading “The Future of Financial Planning and Analysis” report.

Are you monitoring business performance in real time? If not, read Boosting Efficiency For CFOs And The Finance Function.


About Viki Ghavalas

Viki Ghavalas is worldwide program manager for the finance line of business at SAP.

Why Banks Should Be Bullish On Integrating Finance And Risk Data

Mike Russo

Welcome to the regulatory world of banking, where finance and risk must join forces to ensure compliance and control. Today it’s no longer sufficient to manage your bank’s performance using finance-only metrics such as net income. What you need is a risk-adjusted view of performance that identifies how much revenue you earn relative to the amount of risk you take on. That requires metrics that combine finance and risk components, such as risk-adjusted return on capital, shareholder value added, or economic value added.

While the smart money is on a unified approach to finance and risk, most banking institutions have isolated each function in a discrete technology “silo” complete with its own data set, models, applications, and reporting components. What’s more, banks continually reuse and replicate their finance and risk-related data – resulting in the creation of additional data stores filled with redundant data that grows exponentially over time. Integrating all this data on a single platform that supports both finance and risk scenarios can provide the data integrity and insight needed to meet regulations. Such an initiative may involve some heavy lifting, but the advantages extend far beyond compliance.

Cashing in on bottom-line benefits

Consider the potential cost savings of taking a more holistic approach to data management. In our work with large global banks, we estimate that data management – including validation, reconciliation, and copying data from one data mart to another – accounts for 50% to 70% of total IT costs. Now factor in the benefits of reining in redundancy. One bank we’re currently working with is storing the same finance and risk-related data 20 times. This represents a huge opportunity to save costs by eliminating data redundancy and all the associated processes that unfold once you start replicating data across multiple sources.

With the convergence of finance and risk, we’re seeing more banks reviewing their data architecture, thinking about new models, and considering how to handle data in a smarter way. Thanks to modern methodologies, building a unified platform that aligns finance and risk no longer requires a rip-and-replace process that can disrupt operations. As with any enterprise initiative, it’s best to take a phased approach.

Best practices in creating a unified data platform

Start by identifying a chief data officer (CDO) who has strategic responsibility for the unified platform, including data governance, quality, architecture, and analytics. The CDO oversees the initiative, represents all constituencies, and ensures that the new data architecture serves the interests of all stakeholders.

Next, define a unified set of terms that satisfies both your finance and risk constituencies while addressing regulatory requirements. This creates a common language across the enterprise so all stakeholders clearly understand what the data means. Make sure all stakeholders have an opportunity to weigh in and explain their perspective of the data early on because certain terms can mean different things to finance and risk folks.

In designing your platform, take advantage of new technologies that make previous IT models predicated on compute-intensive risk modeling a thing of the past. For example, in-memory computing now enables you to integrate all information and analytic processes in memory, so you can perform calculations on-the-fly and deliver results in real time. Advanced event stream processing lets you run analytics against transaction data as it’s posting, so you can analyze and act on events as they happen.
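
As a minimal sketch of the event-stream idea (a toy model, not any particular vendor’s API), the monitor below updates a rolling per-account exposure as each transaction posts and flags a breach on the event itself rather than in an end-of-day batch. The window size and limit are made-up values.

```python
from collections import deque

class StreamMonitor:
    """Toy event-stream analytic: flag accounts whose rolling
    exposure over the last `window` transactions exceeds a limit."""

    def __init__(self, window=3, limit=1000.0):
        self.window = window
        self.limit = limit
        self.recent = {}  # account -> deque of recent amounts

    def on_transaction(self, account, amount):
        # Update the rolling window as the transaction posts.
        q = self.recent.setdefault(account, deque(maxlen=self.window))
        q.append(amount)
        exposure = sum(q)
        # Act on the event as it happens, not after a batch run.
        return exposure > self.limit

monitor = StreamMonitor(window=3, limit=1000.0)
alerts = [
    monitor.on_transaction("acct-1", amt)
    for amt in (400.0, 300.0, 500.0)  # rolling sum reaches 1200 on the third
]
print(alerts)  # → [False, False, True]
```

A production event-stream processor adds durability, partitioning, and time-based windows, but the per-event evaluation pattern is the same.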

Such technologies bring integration, speed, flexibility, and access to finance and risk data. They eliminate the need to move data to data marts and reconcile data to meet user requirements. Now a single finance and risk data warehouse can be flexible and comprehensive enough to serve many masters.

Join our webinar with Risk.net on 7 October, 2015 to learn best practices and benefits of deploying an integrated finance and risk platform.


About Mike Russo

Mike Russo is senior industry principal, Financial Services, with SAP. Mike has 30 years of experience in the financial services/financial software industries. This includes stints as senior auditor for the Irving Trust Co., New York; manager of the International Department at Barclays Bank of New York; and 14 years as CFO for Nordea Bank’s New York City branch – a full-service retail/commercial bank. Mike also served on Nordea’s Credit, IT, and Risk Committees. Mike’s financial software experience includes roles as a senior banking consultant with Sanchez Computer Associates and manager of Global Business Solutions (focused on sale of financial/risk management solutions) with Thomson Financial. Before joining SAP, Mike was a regulator with the Federal Reserve Bank in Charlotte, where he was responsible for the supervision of large commercial banking organizations in the Southeast with a focus on market/credit/operational risk management.

How Emotionally Aware Computing Can Bring Happiness to Your Organization

Christopher Koch


Do you feel me?

Just as once-novel voice recognition technology is now a ubiquitous part of human–machine relationships, so too could mood recognition technology (aka “affective computing”) soon pervade digital interactions.

Through the application of machine learning, Big Data inputs, image recognition, sensors, and in some cases robotics, artificially intelligent systems hunt for affective clues: widened eyes, quickened speech, and crossed arms, as well as heart rate or skin changes.




Emotions are big business

The global affective computing market is estimated to grow from just over US$9.3 billion a year in 2015 to more than $42.5 billion by 2020.

Source: “Affective Computing Market 2015 – Technology, Software, Hardware, Vertical, & Regional Forecasts to 2020 for the $42 Billion Industry” (Research and Markets, 2015)

Customer experience is the sweet spot

Forrester found that emotion was the number-one factor in determining customer loyalty in 17 out of the 18 industries it surveyed – far more important than the ease or effectiveness of customers’ interactions with a company.


Source: “You Can’t Afford to Overlook Your Customers’ Emotional Experience” (Forrester, 2015)


Humana gets an emotional clue

Source: “Artificial Intelligence Helps Humana Avoid Call Center Meltdowns” (The Wall Street Journal, October 27, 2016)

Insurer Humana uses artificial intelligence software that can detect conversational cues to guide call-center workers through difficult customer calls. The system recognizes that a steady rise in the pitch of a customer’s voice or instances of agent and customer talking over one another are causes for concern.
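
Humana’s actual model is not public, but the “steady rise in pitch” cue can be approximated with a simple slope test over recent pitch estimates; the threshold and sample values here are purely illustrative.

```python
import numpy as np

def pitch_is_rising(pitch_hz, min_slope=5.0):
    """Flag a steady rise: a positive least-squares slope
    (Hz per frame) above a threshold across the pitch track."""
    t = np.arange(len(pitch_hz))
    slope = np.polyfit(t, pitch_hz, 1)[0]
    return slope > min_slope

calm = [180, 182, 179, 181, 180]          # pitch hovers in place
escalating = [180, 195, 210, 228, 245]    # pitch climbs steadily
print(pitch_is_rising(calm), pitch_is_rising(escalating))  # → False True
```

A real system would first extract pitch from audio frames and would also weigh other cues, such as the agent and customer talking over one another.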

The system has led to hard results: Humana says it has seen a 28% improvement in customer satisfaction, a 63% improvement in agent engagement, and a 6% improvement in first-contact resolution.


Spread happiness across the organization

Source: “Happiness and Productivity” (University of Warwick, February 10, 2014)

Employers could monitor employee moods to make organizational adjustments that increase productivity, effectiveness, and satisfaction. Happy employees are around 12% more productive.




Walking on emotional eggshells

Whether customers and employees will be comfortable having their emotions logged and broadcast by companies is an open question. Customers may find some uses of affective computing creepy or, worse, predatory. Be sure to get their permission.


Other limiting factors

The availability of the data required to infer a person’s emotional state is still limited. Further, it can be difficult to capture all the physical cues that may be relevant to an interaction, such as facial expression, tone of voice, or posture.



Get a head start


Discover the data

Companies should determine what inferences about mental states they want the system to make and how accurately those inferences can be made using the inputs available.


Work with IT

Involve IT and engineering groups to figure out the challenges of integrating with existing systems for collecting, assimilating, and analyzing large volumes of emotional data.


Consider the complexity

Some emotions may be more difficult to discern or respond to. Context is also key. An emotionally aware machine would need to respond differently to frustration in a user in an educational setting than to frustration in a user in a vehicle.


To learn more about how affective computing can help your organization, read the feature story Empathy: The Killer App for Artificial Intelligence.



About Christopher Koch

Christopher Koch is the Editorial Director of the SAP Center for Business Insight. He is an experienced publishing professional, researcher, editor, and writer in business, technology, and B2B marketing. Share your thoughts with Chris on Twitter @Ckochster.

What Will The Internet Of Things Look Like In 2027? 7 Predictions

Tom Raftery

Recently I was asked: Where do you see the Internet of Things in 10 years?

It is an interesting question to ponder. To frame it properly, it helps to think back to what the world was like 10 years ago and how far we have come since then.
iPhone launch 2007

Ten years ago, in 2007, Apple launched the iPhone. It was the first real smartphone, and it completely changed how we interact with information.

And if you think back to that first iPhone—with its 2.5G connectivity, lack of front-facing camera, and 3.5-inch diagonal 163ppi screen—and compare it to today’s iPhones, that is the level of change we are talking about in 10 years.

In 2027 the term Internet of Things will be redundant. Just as we no longer say Internet-connected smartphone or interactive website because the connectedness and interactivity are now a given, in 10 years all the things will be connected and the term Internet of Things will be superfluous.

While the term may become meaningless, however, that is only because the technologies will be pervasive—and that will change everything.

With significant progress in low-cost connectivity, sensors, cloud-based services, and analytics, in 10 years we will see the following trends and developments:

  • Connected agriculture will move to vertical and in-vitro food production. This will enable higher yields from crops, lower inputs required to produce them, including a significantly reduced land footprint, and the return of unused farmland to increase biodiversity and carbon sequestration in forests
  • Connected transportation will enable tremendous efficiencies and safety improvements as we transition to predictive maintenance of transportation fleets, vehicles become autonomous and vehicle-to-vehicle communication protocols become the norm, and insurance premiums start to favor autonomous driving modes (Tesla cars have 40% fewer crashes when in autopilot mode, according to the NHTSA)
  • Connected healthcare will move from reactive to predictive, with sensors alerting patients and providers of irregularities before significant incidents occur, and the ability to schedule and 3D-print “spare parts”
  • Connected manufacturing will transition to manufacturing as a service, with distributed manufacturing (3D printing) enabling mass customization, with batch sizes of one very much the norm
  • Connected energy, with the sources of demand able to “listen” to supply signals from generators, will move to a system in which demand more closely matches supply (with cheaper storage, low-carbon generation, and end-to-end connectivity). This will stabilise the grid and eliminate the fluctuations introduced by a rising share of variable generators (such as solar and wind) in the system, thereby reducing electricity generation’s carbon footprint
  • Human-computer interfaces will migrate from today’s text- and touch-based systems toward augmented and mixed reality (AR and MR) systems, with voice- and gesture-enabled UIs
  • Finally, we will see the rise of vast business networks. These networks will act like automated B2B marketplaces, facilitating information-sharing among partners, empowering workers with greater contextual knowledge, and augmenting business processes with enhanced information

IoT advancements will also improve and enhance many other areas of our lives and businesses. Logistics, with complete tracking and traceability all the way through the supply chain, is just one example among many.

We are only starting our IoT journey. The dramatic advances we’ve seen since the introduction of the smartphone—such as Apple’s open-sourced ResearchKit being used to monitor the health of pregnant women—foretell innovations and advancements that we can only start to imagine. The increasing pace of innovation, falling component prices, and powerful networking capabilities reinforce this bright future, even if we no longer use the term Internet of Things.

For a shorter-term view of the IoT, see 20 Technology Predictions To Keep Your Eye On In 2017.

Photo: Garry Knight on Flickr

Originally posted on my TomRaftery.com blog


About Tom Raftery

Tom Raftery is VP and Global Internet of Things Evangelist for SAP. Previously Tom worked as an independent analyst focussing on the Internet of Things, Energy and CleanTech. Tom has a very strong background in social media, is the former co-founder of a software firm and is co-founder and director of hyper energy-efficient data center Cork Internet eXchange. More recently, Tom worked as an Industry Analyst for RedMonk, leading their GreenMonk practice for 7 years.