You Say Reporting, I Say Analysis: Who's Right?

by Brent Dykes
Director of Industry Consulting
Adobe Systems

You may have heard the terms “reporting” and “analysis” used as though they were interchangeable. While both of these areas of Web analytics draw upon the same collected Web data, reporting and analysis are very different in terms of their purpose, tasks, outputs, delivery, and value. Without a clear understanding of these differences, an organization may sell itself short in one area (typically analysis) and fail to realize the full benefits of its Web analytics investment.

Most companies have analytics solutions in place to derive greater value for their organizations. In other words, the ultimate goal of reporting and analysis is to increase sales and reduce costs (i.e., add value). Both reporting and analysis play roles in influencing and driving the actions that lead to greater value in organizations.

For the purposes of this blog post, I’m not going to delve deeply into what happens before or after the reporting and analysis stages, but I do recognize that both areas are critical and challenging steps in the overall data-driven decision-making process. It’s important to remember that reporting and analysis only have the opportunity to be valuable if they are acted upon.

Purpose
Before covering the differing roles of reporting and analysis, let’s start with some high-level definitions of these two key areas of analytics.

>> Reporting: The process of organizing data into informational summaries in order to monitor how different areas of a business are performing.

>> Analysis: The process of exploring data and reports in order to extract meaningful insights, which can be used to better understand and improve business performance.

Reporting translates raw data into information. Analysis transforms data and information into insights. Reporting helps companies monitor their online businesses and be alerted when data falls outside of expected ranges. Good reporting should raise questions about the business from its end users. The goal of analysis is to answer those questions by interpreting the data at a deeper level and providing actionable recommendations. The process of performing analysis may raise additional questions, but the goal is to identify answers, or at least potential answers, that can be tested. In summary, reporting shows you what is happening, while analysis focuses on explaining why it is happening and what you can do about it.
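
To make the distinction concrete, here is a toy Python sketch. The daily traffic figures and function names are invented for illustration and don’t come from any real report:

```python
# Hypothetical daily data: (day, visits, orders). All figures are invented.
daily = [("Mon", 1200, 36), ("Tue", 1150, 35), ("Wed", 1900, 38), ("Thu", 1180, 34)]

# Reporting: organize raw data into an informational summary (what happened).
def weekly_report(rows):
    visits = sum(v for _, v, _ in rows)
    orders = sum(o for _, _, o in rows)
    return {"visits": visits, "orders": orders, "conversion": round(orders / visits, 4)}

# Analysis: interrogate the same data to answer a question the report raises --
# e.g., "traffic spiked on Wednesday; did conversion keep pace?"
def conversion_by_day(rows):
    return {day: round(orders / visits, 4) for day, visits, orders in rows}

print(weekly_report(daily))      # information: totals and an overall rate
print(conversion_by_day(daily))  # insight: Wed converts at ~2% vs. ~3% on other days
```

The summary tells you what happened; the by-day breakdown starts to explain why the overall rate dipped, which is exactly the reporting-versus-analysis gap described above.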

Tasks
Companies sometimes confuse “analytics” with “analysis.” For example, a firm might be focused on the general area of analytics (strategy, implementation, reporting, etc.), but not necessarily on the specific aspect of analysis. It’s almost like some organizations run out of gas after the initial setup-related activities and don’t make it to the analysis stage. In addition, sometimes the lines between reporting and analysis can blur: what feels like analysis is really just another flavor of reporting.

One way to distinguish whether your organization is emphasizing reporting or analysis is by identifying the primary tasks that are being performed by your analytics team. If most of the team’s time is spent on activities such as building, configuring, consolidating, organizing, formatting, and summarizing, then that’s reporting. Analysis focuses on different tasks, such as questioning, examining, interpreting, comparing, and confirming. (I’ve left out testing as I view optimization efforts as part of the action stage.) Reporting and analysis tasks can be intertwined, but your analytics team should still evaluate where it is spending the majority of its time. In most cases, I’ve seen analytics teams spending most of their time on reporting tasks.

Outputs
On the surface, reporting and analysis deliverables can look similar: lots of charts, graphs, trend lines, tables, stats, etc. However, there are some subtle differences. One of the main differences between reporting and analysis is the overall approach. Reporting follows a push approach, where reports are pushed to users who are then expected to extract meaningful insights and take appropriate actions for themselves (i.e., self-serve). I’ve identified three main types of reporting:

1. Canned reports: These are the out-of-the-box and custom reports you can access within the analytics tool, which can also be delivered on a recurring basis to a group of end users. Canned reports are fairly static, with fixed metrics and dimensions. In general, some canned reports are more valuable than others, and a report’s value might depend on how relevant it is to an individual’s role (e.g., SEO specialist vs. Web producer).

2. Dashboards: These custom-made reports combine different KPIs and reports to provide a comprehensive, high-level view of business performance for specific audiences. Dashboards might include data from various data sources and are also usually fairly static.

3. Alerts: These conditional reports are triggered when data falls outside of an expected range or some other predefined criterion is met. Once people are notified of what happened, they can take appropriate action as necessary.
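
As a rough illustration of the alert mechanism just described, here is a minimal Python sketch; the metric name and thresholds are hypothetical:

```python
# A minimal sketch of a threshold alert: flag a metric when it falls outside
# its expected range. The metric name and thresholds below are hypothetical.
def check_alert(metric_name, value, expected_low, expected_high):
    """Return an alert message if value is outside [expected_low, expected_high]."""
    if value < expected_low:
        return f"ALERT: {metric_name} = {value} is below the expected minimum ({expected_low})"
    if value > expected_high:
        return f"ALERT: {metric_name} = {value} is above the expected maximum ({expected_high})"
    return None  # within range: stay quiet, send nothing

alert = check_alert("daily_visits", 640, expected_low=900, expected_high=2000)
if alert:
    print(alert)  # says what happened -- not why, and not what to do about it
```

Note that the alert only reports that something happened; figuring out why, and what to do next, still takes analysis.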

In contrast, analysis follows a pull approach, where particular data is pulled by an analyst in order to answer specific business questions. A basic, informal analysis can occur whenever someone simply performs some kind of mental assessment of a report and makes a decision to act or not act based on the data. In the case of analysis with actual deliverables, there are two main types:

1. Ad hoc responses: Analysts receive requests to answer a variety of business questions, which might be spurred by questions raised by the reporting. Typically, these requests are urgent and demand a quick turnaround. The analytics team might have to juggle multiple requests at the same time. As a result, the analyses cannot go as deep or wide as the analysts might like, and the deliverable is a short, concise report, which may or may not include specific recommendations.

2. Analysis presentations: Some business questions are more complex in nature and require more time to perform a comprehensive, deep-dive analysis. These analysis projects result in a more formal deliverable, which includes two key sections: key findings and recommendations. The key findings section highlights the most meaningful and actionable insights gleaned from the analyses performed. The recommendations section provides guidance on what actions to take based on the analysis findings.

When you compare the two sets of reporting and analysis deliverables, the different purposes (information vs. insights) reveal the true colors of the outputs. Reporting pushes information to the organization, and analysis pulls insights from the reports and data. There may be other hybrid outputs, such as annotated dashboards (analysis sprinkles on a reporting donut), which appear to span the two areas. You should be able to determine whether a deliverable is primarily focused on reporting or analysis by its purpose (information/insights) and approach (push/pull).

Context
Another key difference between reporting and analysis is context. Reporting provides little or no context about what’s happening in the data. In some cases, the end users already possess the necessary background to understand and interpret the data correctly. However, in other situations the audience may not have the required knowledge. Context is critical to good analysis. In order to tell a meaningful story with the data and drive specific actions, context becomes an essential component of the storyline.

Although they both leverage various forms of data visualization in their deliverables, analysis is different from reporting because it emphasizes data points that are significant, unique, or special--and explains why they are important to the business. Reporting can sometimes automatically highlight key changes in the data, but it’s not going to explain why these changes are (or aren’t) important. Reporting isn’t going to answer the “so what?” question on its own.

If you’ve ever had the pleasure of being a new parent, I would compare canned reporting, dashboards, and alerts to a 6-month-old infant. It cries--often loudly--when something is wrong, but it can’t tell you exactly what is wrong. The parent has to scramble to figure out what’s going on (hungry, dirty diaper, no pacifier, teething, tired, ear infection, new Baby Einstein DVD, etc.). Continuing the parenting metaphor, reporting is also not going to tell you how to stop the crying.

The recommendations component is a key differentiator between analysis and reporting; it provides specific guidance on what actions to take based on the key insights found in the data. Even analysis outputs such as ad hoc responses may not drive action if they fail to include recommendations. Once a recommendation has been made, follow-up is another potent outcome of analysis because recommendations demand decisions to be made (go/no go/explore further). Decisions precede action. Action precedes value.

Delivery
As mentioned, reporting is more of a push model, where people can access reports through an analytics tool, Excel spreadsheet, or widget, or have them scheduled for delivery to their mailboxes, mobile devices, FTP sites, etc. Because of the demands of providing periodic reports (daily, weekly, monthly, etc.) to multiple individuals and groups, automation becomes a key focus in building and delivering reports. Most of the analysts I’ve talked to don’t like manually building and refreshing reports on a regular basis. It’s a job for robots or computers, not human beings who are still paying off their student loans.
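
For what that automation might look like in its simplest form, here is a hedged Python sketch; build_report and send_report are hypothetical stand-ins for whatever your reporting stack actually provides:

```python
# A sketch of report automation: a recurring report built and sent by machine
# rather than by hand. build_report() and send_report() are hypothetical
# stand-ins, not calls from any particular analytics product.
import datetime

def build_report(for_date):
    # In practice: query the analytics tool, aggregate, and format the data.
    return f"Daily traffic report for {for_date}: ..."

def send_report(report, recipients):
    # In practice: email, a dashboard refresh, an FTP drop, etc.
    print(f"Sending to {', '.join(recipients)}:\n{report}")

if __name__ == "__main__":
    # Run from a scheduler (e.g., a daily cron job) so no analyst has to
    # rebuild and resend the same report every morning.
    send_report(build_report(datetime.date.today()), ["marketing-team@example.com"])
```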

On the other hand, analysis is all about human beings using their superior reasoning and analytical skills to extract key insights from the data and form actionable recommendations for their organizations. Although analysis can be “submitted” to decision makers, it is more effectively presented person-to-person. In their book “Competing on Analytics,” Thomas Davenport and Jeanne Harris emphasize the importance of trust and credibility between the analyst and decision maker. Decision makers typically don’t have the time or ability to perform analyses themselves. With a “close, trusting relationship” in place, the executives will frame their needs correctly, the analysts will ask the right questions, and the executives will be more likely to take action on analysis they trust.

Value
When it comes to comparing the different roles of reporting and analysis, it’s important to understand the relationship between reporting and analysis in driving value. I like to think of the data-driven decision-making stages (data > reporting > analysis > action > value) as a series of dominoes. If you remove a domino, then it can be more difficult or impossible to achieve the desired value.

The "Path to Value" all starts with having the right data that is complete and accurate. It doesn’t matter how advanced your reporting or analysis is if you don’t have good, reliable data. If we skip the “reporting” domino, some seasoned analysts might argue that they don’t need reports to do analysis (i.e., just give me the raw files and a database). On an individual basis that might be true for some people, but it doesn’t work at the organizational level if you’re striving to democratize your data.

Most companies have abundant reporting, but may be missing the “analysis” domino. Reporting will rarely initiate action on its own because analysis is required to help bridge the gap between data and action. Having analysis doesn’t guarantee that good decisions will be made, that people will actually act on the recommendations, that the business will take the right actions, or that teams will be able to execute effectively on those right actions. However, it is a necessary step closer to action and the potential value that can be realized through successful Web analytics.

Final Words
Reporting and analysis go hand in hand, but how much effort and how many resources are being spent on each area at your company? When I hear that a client is struggling to find value from its Web analytics investment, it usually means one of the dominoes is missing--and often analysis is that missing domino.

I recently met with a major media client that found it was missing its analysis domino. The Web analytics team was struggling to meet the strategy, implementation, and reporting demands of this large, complex organization, let alone provide analysis beyond ad hoc responses. Senior management was becoming increasingly frustrated with its analytics staff and system. Fortunately, the Web analytics team received additional budget and hired an analyst to perform deep-dive analyses for all of its main product groups and drive actionable recommendations. Not surprisingly, the attitude of the senior executives did a 180-degree turn shortly after the company found its missing analysis domino.

You might be wondering how much time your analysts should spend on analysis. As a rule of thumb, I would say at least 25% of their time should be spent on analysis, and generally, the more, the better. Surprisingly, 100% is not desirable either, because many other important responsibilities are needed to keep an analytics program afloat, such as reporting, gathering business requirements, training, documenting, and communicating successes. I hope after reading this article you at least recognize that 0% is unacceptable. If your company isn’t doing much analysis today, then experiment with a 10% focus on analysis and see what success you have from there.
