Happy New Year: Build on Foundation of Lessons Learned in 2012

Happy New Year to all my readers and followers. I hope everyone has gotten some rest and is ready for a great 2013. 2012 was a busy year in which we saw a critical inflection point, as an elevated focus on new methods and innovative technological approaches such as big data, business analytics, business and social collaboration, cloud computing, mobile technology and social media became part of the mainstream business and IT dialogue. These technologies are beginning to be embedded into enterprise software that will be available in 2013. This is a critical step forward that will help organizations become more efficient in their operations and use technology to its fullest. As we start the new year, I thought a look back at some of the highlights from 2012 was in order.

I had a chance to review some of the most-read blogs from our research directors, and found many of them worth further mention as you look for insights. First, Robert Kugel educates finance and business professionals across a range of topics, and his leadership in areas like financial management is apparent in GAAP and IFRS Harmonize Revenue Recognition Standards. In Rob's research on governance, risk and compliance (GRC), he identified that GRC is really about each of those words individually, and that the specifics matter differently to finance, operations and IT. Rarely do you see in organizations an integrated process and technology strategy for getting the information required to mitigate risk and ensure the right level of governance and compliance, yet it is important that organizations take the proper steps with a clear understanding of their operations.

Next, Richard Snow continues to be a steady voice in customer and contact center management, helping businesses understand the value of technology and vendors learn how they can improve customer experience through more intelligent interactions. Richard's post on his agenda in 2012 was our most read industry blog, and his contact center in the cloud post was a big read too. Each provided research-based guidance on how to improve an organization's customer service culture. Richard's research in customer feedback management continues to identify the immaturity of organizations' measurement of and actions with customers across social media channels.

In the realm of business and big data, Tony Cosentino noted that business analytics is a priority for organizations, but found a lot of confusion on what approaches best fit specific roles. His post Making Sense of the Swirling World of Business Analytics outlines these challenges and explains what organizations need to understand to ensure the right level of process and technology. I personally liked Tony’s call to transform the three Vs of big data into three Ws of business analytics. Tony’s latest research on the next generation of business intelligence is uncovering what is important with mobile technology and where social collaboration is changing the way business operates. Tony’s recent analysis of the movement to leverage event- and stream-based data to gain better operational intelligence is also critical, since data across the network is definitely part of big data and should be harvested to its full potential.

Some of my own blog posts sparked both violent agreement and disenchantment. In one, I pointed out the silliness of placing a bunch of business charts on a web page, calling it a dashboard, and believing that this will help organizations improve performance. My title may have caused a stir – The Pathetic State of Dashboards – but it helped spark a dialogue on what business needs to see and do with the results of analytics, which is not just more charts that need to be deciphered or discarded. I followed this with a look at issues in the data within the charts, and specifically the relevance of key performance indicators (KPIs) and what types of measurements are needed to provide actionable metrics (see The Stupidity of KPIs in Business Analytics). These blog posts collectively provided some honest analysis and heavy use of research to show how business analytics needs to improve to reach its true value and contribute to business outcomes in the most efficient manner.

Our research into the demand for social and mobile technology as part of human capital management unveiled some good insights: these technologies have become a necessity and are impacting daily operations in the next generation of workforce management. I am happy to point out that we have a new research director to help advance our research and education in this critical applications and technology category.

In the second week of January we will publicly unveil our research agendas for 2013, which we completed last month and provided to our clients. We post research agendas publicly to provide depth on the themes and plans for our research in the year. Unfortunately, most technology analyst firms do not publish their plans for the year; they have dropped the process and ignore the rigor it takes to ensure that independent research is conducted. This is part of the challenge I laid down to the technology analyst industry last year. We have a passion for research and a strong focus on the best practices and insights into applications and technology that help business and IT be successful. I believe our firm plays a role in helping businesses improve their use of technology.

We finished a decade of business at Ventana Research, and as I mentioned in my analysis of a decade of research, there is a great opportunity to use research as a more effective technology selection method. We have many surprises in store for 2013, from two summits that will align with our Business Leadership and Technology Innovation awards to a new look and feel in our brand, which we have already begun to roll out and which will soon show up on our website.

On behalf of the entire team at Ventana Research, we hope you find the best possible value from technology in 2013, and we look forward to hearing from you across any of our channels, from research, blogs or social media. You have a lot to do to make sure you adopt best practices, build the best business case and even select the right technology, and we are glad to be part of helping your organization be successful. Our research helps everyone bring the best possible technology to market and deliver the best possible value for every organization.

Best Regards,

Mark Smith

CEO & Chief Research Officer

The Stupidity of KPIs in Business Analytics

In my last rant, on business analytics and the pathetic state of dashboards, I pointed out significant flaws in business intelligence software created by technology providers and in how it is being deployed by business and IT. Now I want to follow up with some insight on disconnects from a critical asset that is essential to the success of business analytics. I mean key performance indicators (KPIs), a term used in inaccurate ways that have diminished the value of the concept for business.

Let’s start with the definition; for practicality I will use Wikipedia, which says that a KPI is used “by an organization to evaluate its success or the success of a particular activity in which it is engaged.” The success being evaluated could be a goal, a target or something else that is important. To set a baseline, you combine two measures to create a performance metric. For example, units sold and unit price are two separate measures that can be multiplied to produce a metric called sales. Then, through iterations of more precise calculations, this metric can be refined and compared against the sales quota or goal for a specific time period; this creates a key performance indicator on the outcomes of sales and even marketing efforts.
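The measure-to-metric-to-KPI progression described above can be sketched in a few lines of code. This is a minimal illustration, not anyone's product; all names and figures (units_sold, unit_price, the quota) are hypothetical.

```python
# Sketch of the measure -> metric -> KPI progression:
# two measures combine into a metric, and the metric compared
# against a goal becomes a key performance indicator.
# All names and figures here are hypothetical illustrations.

def sales_metric(units_sold: float, unit_price: float) -> float:
    """Combine two measures into a metric: sales revenue."""
    return units_sold * unit_price

def sales_kpi(actual_sales: float, sales_quota: float) -> float:
    """Compare the metric against a goal for the period to form a KPI:
    quota attainment, expressed as a fraction."""
    return actual_sales / sales_quota

# Two measures for a hypothetical quarter
revenue = sales_metric(units_sold=1200, unit_price=50.0)   # metric: 60000.0
attainment = sales_kpi(revenue, sales_quota=75000.0)       # KPI: 0.8 (80% of quota)
```

The point of the sketch is that neither measure (units sold, unit price) nor the raw metric (revenue) is a KPI on its own; only the comparison against the goal carries the "performance" part.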

In actual use, however, the KPI, its use and its value have been dumbed down in ways that diminish the quality of intelligence we gain from using business analytics. First are the vague and contradictory ways in which the term is applied by technology providers and practitioners. Over the last decade I have seen “KPI” used to describe what are actually metrics – the building blocks of KPIs – and only sometimes performance-related. A metric like revenue or sales is not a KPI; neither are cost-specific, throughput-related metrics based on quantity or processing, nor customer-related metrics like first-call resolution. Such metrics are commonly presented in dashboards through visualization and called KPIs. Today we seldom see scorecards that use business analytics, which once were common for presenting KPIs properly to business users. Maybe it is time to start using scorecards for managing performance and not just measuring it.

The second issue has to do with the performance part of KPI, which should show how an organization or any of its business processes measures up to expected outcomes. Ideally, upon viewing performance-related metrics or indicators, within seconds an individual should be able to determine what, if any, action should be taken to improve performance, such as discovering what is contributing to the subpar performance or identifying opportunities for improvement. This root-cause level of action requires examination of different classes of metrics related to performance, ranging from people and processes to customers or risk. Understanding the cause and effect of metrics requires knowing and presenting the processes and interconnections of how a business operates. Unfortunately, most business analytics software will just provide you a table of data with no insight into which metric is contributing to the issue. By creating the right types of metrics underlying a KPI, we can reduce the time and resources required to support the communications (email, phone calls and meetings) that people normally use to investigate performance shortfalls. To get to this point requires creating a library of measures, metrics and indicators that can cross a variety of situations and help inform action-taking and decision-making. Let’s drop the P and just say key indicators (KIs) to set a new context that focuses on the indicators and the types of metrics that support them. This could lead organizations to make substantive improvements.

The third step is to make KPIs or KIs relevant to the particular roles and responsibilities of individuals. Company or divisional KPIs are interesting but only provide a general view of how an organization is performing. Where the rubber hits the road is the context of the indicators and metrics at the department, team and individual levels. We need to provide the ability for individuals to select their own focus within the scope of these facts and figures to determine how well their activities are contributing to the execution of business processes and outcomes. Here the role of business analytics is critical. To make self-service BI and agile BI, the buzzwords being pushed by IT analyst firms, a reality, tools have to make analytics more intuitive to users. More tools for data discovery are not the answer, and making users select their scope every time they get an updated report or dashboard is a waste of time that decreases productivity and increases costs in running an organization. Instead let’s design a new generation of business analytics based on roles and individuals developed through a profile; this could go a long way toward streamlining the focus of analysis and preparing individuals to quickly determine what action to take.
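The profile-based scoping argued for above can be illustrated with a small sketch: a role profile determines which indicators a user sees, so they are not forced to reselect their focus on every refresh. The role names and indicator names here are hypothetical, not from any product.

```python
# Hypothetical sketch of role-based indicator scoping: a profile maps
# each role to the indicators relevant to it, so the analytics layer
# can present a pre-filtered view instead of everything at once.

ROLE_PROFILES = {
    "sales_manager": {"quota_attainment", "pipeline_coverage"},
    "support_lead":  {"first_call_resolution", "avg_handle_time"},
}

def indicators_for(role: str, all_indicators: dict) -> dict:
    """Return only the indicators relevant to the given role profile."""
    wanted = ROLE_PROFILES.get(role, set())
    return {name: value for name, value in all_indicators.items() if name in wanted}

metrics = {
    "quota_attainment": 0.8,
    "pipeline_coverage": 2.5,
    "first_call_resolution": 0.72,
}
print(indicators_for("sales_manager", metrics))
```

In a real deployment the profile would come from identity management rather than a hard-coded dictionary, but the design choice is the same: the scope follows the role, not a per-session selection.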

To erase the stupidity in how KPIs are spoken about, demonstrated and actually deployed, we need to advance our dialogue and educational discussion of what key indicators and range of metrics are required to support particular deployments. I have already said that just placing more charts in a dashboard, no matter how pretty and interactive they might be, will not help support the actions and decisions that business analytics should enable. The effort to make KPIs more valuable begins with ensuring they are properly developed and represent performance in terms of the state of success toward achieving the goal or target. Showing past performance is insufficient without knowing how well it met expectations. Presenting a KPI does not necessarily require a chart; it can be done equally well in text that presents it within the context of how the people or process are performing over time and where they are in progress toward the expected target. These indications can be linked to additional facts with a directional arrow or other simple representations that make it easy to determine whether to take action. If your business intelligence software does not support a simpler way to communicate key indicators and metrics, maybe you have the wrong tool.
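The text-plus-arrow presentation suggested above can be sketched in a few lines. This is an illustrative example with hypothetical names and thresholds, not a prescription for any particular BI tool.

```python
# Hypothetical sketch: presenting a KPI as plain text with a directional
# arrow, showing progress toward the target and the trend versus the
# prior period. Names and figures are illustrative only.

def kpi_line(name: str, actual: float, target: float, prior: float) -> str:
    """Render a KPI as a one-line text indicator."""
    pct = actual / target * 100
    arrow = "↑" if actual > prior else ("↓" if actual < prior else "→")
    return f"{name}: {pct:.0f}% of target {arrow} vs prior period"

print(kpi_line("Quota attainment", actual=60000, target=75000, prior=55000))
# prints "Quota attainment: 80% of target ↑ vs prior period"
```

One line like this conveys both the state against the goal and the direction of change, which is exactly the decision context a wall of charts tends to bury.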

If we admit the flaws within our deployments and technologies and force ourselves to have more realistic conversations, we could advance the science of business analytics. Over the years we have made strides forward and then taken steps backward in trying to meet the needs of the lowest competency denominator. We need to aim higher and take steps to find out what should be done to produce full value from business analytics. Increasing the value of these investments can help an organization increase its efficiency and effectiveness. If you are not sure whether you are heading in the wrong direction with your metrics and indicators, just let me know; that is what I and others at our firm do for a living.


Mark Smith – CEO & Chief Research Officer