Data can only steer you right if you apply analytics to understand what it’s trying to tell you.
In a previous story, we looked at the importance of avoiding bias and choosing the right metrics in data analysis. In this follow-up, we discuss the importance of confronting “analytical reality.”
Data analysis is supposed to replace hunches with facts. Brands don’t want to risk millions of campaign dollars on someone’s gut instinct. The marketer, ideally, has a goal, a clear threshold of success that must be crossed to achieve results. So how do you get there?
Data analysis is the “GPS.” The whole point of data analysis is to understand what is going on and to use that information to make the right decision. It’s “ready, aim, fire” (data, analysis, action). But sometimes the order gets mixed up, resulting in people drawing the wrong conclusions and acting on that basis. The process then becomes “ready, fire, aim”, or even more comically, “fire, aim, ready”.
“The biggest test of data is analytics,” said Mark Stouse, chairman and CEO of Proof Analytics. “It contextualizes the data, making it extraordinarily difficult to manufacture conclusions, whereas data visualization alone makes it easy.”
Can data identify what’s causing something?
Can one gauge causality from data alone? Stouse believes not. Marketers can try by extrapolating from historical data, then check to see if this extrapolation was correct. “If everything is stable, extrapolation can work. But when the variety, volatility and velocity of change is great, extrapolation has zero value.”
“Data is indeed always about the past, and it has no innate ability to forecast. Past is not Prologue,” he continued. “But multivariable regression is the proven approach to taking data representing the relevant factors (the known knowns) — as well as some potentially important stuff (known unknowns) — and turning that into a calculated historical portrait of causality. That, in turn, creates a forecast against which you can understand the accuracy of the model vis-à-vis a comparison between forecast and actuals.”
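To make that workflow concrete, here is a minimal sketch of fitting a multivariable regression on historical driver data and then scoring the forecast against actuals. This is an illustration, not Proof Analytics’ actual model: the file name, column names (ad_spend, email_sends, promo_discount, revenue) and the error metric are assumptions made for the example.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical monthly history; file and column names are illustrative assumptions.
history = pd.read_csv("monthly_marketing.csv", parse_dates=["month"])
drivers = ["ad_spend", "email_sends", "promo_discount"]  # the "known knowns" plus suspected factors

train = history[history["month"] < "2023-01-01"]   # fit the model on the past
test = history[history["month"] >= "2023-01-01"]   # hold out recent months as "actuals"

model = LinearRegression().fit(train[drivers], train["revenue"])
forecast = model.predict(test[drivers])

# Forecast vs. actuals: how well does the historical portrait of causality hold up?
mape = (abs(test["revenue"] - forecast) / test["revenue"]).mean() * 100
print(f"Mean absolute percentage error: {mape:.1f}%")
```

In practice you would validate more carefully, but the forecast-versus-actuals comparison is the check Stouse describes: the model earns trust only to the extent its predictions line up with what actually happened.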
Erica Magnotto, director of SEM at Accelerated Digital Media, sees the value of historical data, but only if there is room for retroactive perspective and predictive planning. “Forecasting campaign success should be based on trending data and performance like year-over-year and month-over-month. This should create close to accurate predictions on future success. If the forecasted data indicates a slower month or potential downturn in the market, optimizations can be made in real time to promote efficiency and conservative scale. If forecasting indicates a stronger month, then it’s time to start planning for scale, testing and additional campaign launches.”
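A rough sketch of that kind of trend-based forecast is below, assuming a simple monthly conversion series. The numbers, the trailing-growth projection and the thresholds are made up for demonstration and are not Accelerated Digital Media’s method.

```python
import pandas as pd

# Illustrative monthly conversions (13 months, so a year-over-year comparison is possible).
conversions = pd.Series(
    [980, 1010, 1100, 1070, 1150, 1230, 1190, 1250, 1320, 1280, 1400, 1460, 1430],
    index=pd.period_range("2022-06", periods=13, freq="M"),
)

mom = conversions.pct_change()             # month-over-month growth
yoy = conversions.pct_change(periods=12)   # year-over-year growth

# Naive projection: carry the trailing three-month average MoM growth into next month.
next_month = conversions.iloc[-1] * (1 + mom.tail(3).mean())
print(f"Latest YoY growth: {yoy.iloc[-1]:.1%}, next-month forecast: {next_month:.0f}")

# A slower forecast suggests optimizing for efficiency; a stronger one suggests planning for scale.
print("plan for scale" if next_month > conversions.iloc[-1] else "optimize for efficiency")
```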
Marketers should also be aware of hiccups in the model. Magnotto noted that there is a difference between the normal “ebb and flow” of performance and a crash or spike. “Data occurring outside of the normal margin of ebb and flow could indicate that immediate action in the account is necessary. Marketers should also not assume user behavior will always be consistent, so it’s important to understand benchmark performance so abnormal user (or campaign) behavior can be detected,” she said.
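One common way to operationalize “outside the normal margin of ebb and flow” is a rolling-band check like the sketch below. The 28-day window, the three-standard-deviation threshold and the file and column names are assumptions for illustration, not benchmarks from the article.

```python
import pandas as pd

# Hypothetical daily series; file and column names are assumptions.
daily = pd.read_csv("daily_performance.csv", parse_dates=["date"], index_col="date")
spend = daily["spend"]

rolling_mean = spend.rolling(28).mean()   # the recent "ebb and flow" baseline
rolling_std = spend.rolling(28).std()

z_scores = (spend - rolling_mean) / rolling_std
anomalies = spend[z_scores.abs() > 3]     # crash/spike candidates outside the normal margin
print(anomalies)                          # dates worth immediate investigation
```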
What can marketers do?
Marketers must be analytical, open-minded and humble at the same time. That alone is a challenge, since there will always be people who are too self-assured or fixated on the trivial at the expense of the substantive. Still, there are ways to catch mistakes before they happen.
Magnotto focused on knowing the data, knowing the customer and acknowledging reality. She offered this checklist for agencies, but the main points apply to brands too:
1. Understand basic Excel/Sheets principles and how to pivot large sets of data downloaded from any platform (a rough Python sketch of items 1, 2 and 6 follows this list).
2. Understand basic comparison formulas and default ways to look at data trends (month-over-month, year-over-year, period-over-period, week-over-week).
3. Have agreed-upon primary and secondary KPIs with the client.
4. Always speak the client’s language and incorporate the client’s source of truth data into reporting. This will ensure more productive conversations and help marketers navigate away from making mistakes or misreading performance.
5. Know when to admit defeat in a campaign strategy. If a “great idea” is not working, then be comfortable allowing the data to speak for itself and changing strategies.
6. Always QA reporting. Apply QA to formulas, timeframes, numbers, etc. If something looks too good to be true when analyzing data, it probably is. QA for mistakes that may be leading to that anomaly.
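For teams that work in Python rather than Excel or Sheets, items 1, 2 and 6 might look roughly like the sketch below. The export file and column names (date, campaign, spend, conversions) are hypothetical, not any specific platform’s schema.

```python
import pandas as pd

# Hypothetical platform export; column names are assumptions for illustration.
raw = pd.read_csv("platform_export.csv", parse_dates=["date"])
raw["month"] = raw["date"].dt.to_period("M")

# Item 1: pivot the large export into a campaign-by-month view.
pivot = raw.pivot_table(index="campaign", columns="month",
                        values=["spend", "conversions"], aggfunc="sum")

# Item 2: a basic month-over-month comparison of conversions per campaign.
mom_change = pivot["conversions"].pct_change(axis=1)
print(mom_change.round(3))

# Item 6: QA the report -- pivot totals should reconcile with the raw export.
assert abs(pivot["spend"].sum().sum() - raw["spend"].sum()) < 0.01, "Pivot totals do not match raw data"
```

The reconciliation check at the end is the kind of QA Magnotto describes: if the pivoted totals and the raw export disagree, find the formula or timeframe mistake before reporting the number.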
Stouse stressed avoiding a fixed mindset. “Blindness to analytical reality is about choosing not to see, because what is there offers a challenge to what you believe,” he said. “The opposite of analysis is a certainty you have chosen and justified without any real basis except your own self-interest. More mistakes have been made in the name of certainty than anything else I can think of.”
William Terdoslavich, 2023.