I recently spoke with a friend who is a senior analytics executive at a major consumer technology company. He told me how his firm had been offered a significant set of incentives by another country’s government to open an “analytics center of excellence.”
So they got busy, recruited some smart folks, bought some fancy tools, and rented some sweet office space.
A year later, the overseas analytics hub had regressed to providing reports of uncertain usage and value. What happened?
This story follows a common pattern, in which means--capabilities--are confused with ends: the actual translation of insight into action and results. So how can you hedge your bet on analytics and improve your odds of driving all the way through to target returns?
One technique we’ve seen applied usefully is to think of the list of analyses you’re pursuing not as a task list--for which items are tossed over the wall by operators to the analytics department, which passively receives them and works down its inventory--but rather as a portfolio of potential opportunities to be assembled thoughtfully and managed actively, much as a venture capital firm might do.
Here are some examples of the ways in which these two approaches differ:
How analytic priorities are set
• Task list: At best, set externally and handed to the analytics team; at worst, FIFO (“first in, first out”) or MAFO (“most annoying, first out”).
• Analytic portfolio: Set through ongoing dialogue between operators and analysts, and facilitated through joint exploration of data presented through commonly agreed frameworks.
Time to get analytics
• Task list: Varies by item--some (most, usually) “super-urgent,” others “whenever.”
• Analytic portfolio: Consistent cadence, pursued iteratively. Answers pursued in shorter (hour, day, week) or longer (month, quarter, year) cycles, depending on the decisions they support and how good the answers need to be.
How the analytics team is evaluated
• Task list: Analytics team managed through cost, timeliness, and accuracy of reports. Typically an annual process focused on overall performance.
• Analytic portfolio: Analytics team managed through quality and quantity of insights developed, experiments supported, and business impact realized. Often reviewed on quarterly cycles with active reprioritization of specific inquiries.
The context for analytic explorations
• Task list: Frequently, analysts know what they’re doing, but not why.
• Analytic portfolio: Analysts always understand the “investment thesis” behind any particular exploration.
How accuracy standards get set
• Task list: Often overly tight, academically derived standards for confidence levels and intervals as default.
• Analytic portfolio: Matched to the stakes and uncertainty surrounding the decision to be supported.
How the work is pursued
• Task list: Process-driven, checking the boxes “regardless”; technique-focused, “have hammer, need nails.”
• Analytic portfolio: Hypothesis-driven, knowing when to say when; technique-agnostic, “right tool for the job.”
How the capability is developed
• Task list: Focus on tools and training for more sophisticated analysis, even if rarely used.
• Analytic portfolio: Focus on building competence through repeated practice and tight integration of good-enough answers with frequent actions.
Overheard at lunch
• Task list: “Let me tell you about the model I built.”
• Analytic portfolio: “Let me tell you about the lift over control we got.”
What are real-world practitioners doing to capitalize on this distinction? For my book, “Marketing and Sales Analytics,” Melanie Murphy, senior director of customer and business analytics at Bed Bath & Beyond, noted that a big piece of keeping her team strategically influential (as opposed to becoming “order takers”) is to focus on what the organization can act on, rather than getting too far ahead of people’s thinking with sophisticated analysis.
At investment management firm T. Rowe Price, Paul Musante, head of client and market insights in the firm’s U.S. Investment Services organization, developed a “learning agenda” that’s devised in partnership with his senior colleagues based on a regular review of business and market performance. That, in turn, serves to discipline and filter the inevitably large demands made on his teams. “It’s crucial that we allocate these scarce resources carefully, not just for insight, but for where we can practically act as well,” he told me.
One common denominator among executives who take a portfolio approach to analytics is an appreciation that pace counts--that momentum is strategic. An analytics organization that consistently delivers useful, usable insights and partners effectively to see them through to results is one that earns and keeps its seat at the decision-making table, which, in turn, helps build the data-driven organization everyone’s after.