Industry insiders have long claimed that the Great Recession was a good thing for marketing analytics. They believed that marketers would invest more in data to prove their value to clients at a time when most companies were cutting their marketing budgets.
I think many of us assumed the COVID-19 downturn would do the same thing for marketing analytics. However, that may not be the case this time around.
I’ve noticed that many peers at different companies have lost their jobs in the past few weeks because of the COVID-19 downturn. It might be that the crisis forced marketers to evaluate whether their costly analytics practices were really worth the money and effort involved. Or it might be that we (the data professionals) never delivered as much value as we thought.
In reality, both sides probably share the blame. Neither the analysts nor the marketers have ever really approached marketing analytics the right way.
For the past ten years, the marketing industry invested heavily in building data warehouses, implementing advanced tracking, and hiring data professionals to analyze and report this data.
But along the way, marketing analytics began turning into snake oil.
The benefits were widely overstated for the amount of money invested. The solutions built were flimsy on quality. And the goals were often improbable (if not impossible).
I don’t think the analysts or the marketers intentionally did something dishonest. I think they simply did what marketing people always do – sell the benefits of a product.
The main problem was that marketers may not have been the right people to use this particular product.
What Makes the Marketing Industry So Different from Other Industries?
Other industries have used data for much longer than marketing. Financial services, manufacturing, logistics, and tech companies have built highly complex data solutions to support and improve their organizations.
But a key thing that separates these industries from marketing is that they depend heavily on operational efficiency.
A few seconds make a big difference in financial services transactions. Manufacturing and technology companies rely on operations to improve quality. And logistics requires advanced organization and efficiency to deliver goods consistently on time.
High-quality data solutions require that same operational efficiency. Because these industries have prioritized efficiency for so long, they have an easier time building these solutions: the data they produce is more accurate, and the various stakeholders actually use it.
Marketing agencies, though, have never relied on operational efficiency. At least not to the same extent.
In most situations, this is a good thing for marketers. It helps them win clients and adapt to the ever-changing needs of the consumers.
But in this type of environment, operational efficiency is simply a hard thing to prioritize, which leads to widespread data quality issues that undermine the goals of any data solution.
How Does Poor Quality Undermine Marketing Data Solutions?
For the same reason you want your tax accountant to be good at math, stakeholders want their data to be accurate. Every time they find errors in reporting, and every time an analyst has to come back and make clarifications, the marketing analytics team loses credibility.
It’s hard to fight that credibility issue once it becomes widespread in the organization. Even though analytics team members may still get paid for producing what they believe is good work, stakeholders within the company will start going elsewhere for their data.
They’ll completely ignore your dashboard and instead go directly to the data source itself. It’s a pain for them to do this, but they’ll suffer through it when they believe it gets them more accurate data.
The irony is that these stakeholders often contribute to quality problems as well. Data collection is a partnership between the analytics team and stakeholders, and a lack of discipline from stakeholders contributes to the quality issues they complain about.
How Does Quality Become Such a Problem in Marketing Agencies?
Marketers are comfortable adapting quickly to meet the needs of their clients. And marketing executives expect their own internal departments to adapt quickly to meet their needs.
This leads to constantly changing goals for marketing data solutions. The purpose of a dashboard or data warehouse is in constant flux and projects get stuck in development hell because of it.
The data professionals building these solutions find themselves making “one more adjustment” to the same project. These constant adjustments, without any clear end goal, only degrade data quality further.
These quality issues then get amplified by individual contributors who work outside the analytics department. A common example is marketers moving so quickly to launch a campaign that they forget to add URL parameter tracking until after the campaign goes live.
It’s not unusual for marketing agencies to have departments dedicated to media buying, social media management, campaign planning, and account management.
All of these departments naturally produce data through their efforts. And the data is usually solid within the individual departments. Since a few people implement the social media campaigns, it’s very easy for them to establish consistent practices within their own teams.
But issues crop up when analytics teams try to bring these data silos together.
Getting all these contributors to use their various tools in ways that let the data sources map together requires heavy influence from management, backed by more defined processes and procedures.
If you look at the simplified scenario below, you’ll see how the likelihood of delivering a perfect solution decreases as complexity increases, even when the individual parts usually succeed on their own.
*This scenario is a simplification and assumes independence between each step.
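That compounding can be sketched numerically. The snippet below is a minimal illustration with made-up success rates, under the same independence assumption noted above:

```python
# Hypothetical illustration: probability that a multi-step data pipeline
# delivers a fully correct result, assuming each step succeeds
# independently with the same probability.
def pipeline_success_rate(step_success: float, num_steps: int) -> float:
    """Probability that every one of num_steps independent steps succeeds."""
    return step_success ** num_steps

# Even when each individual step succeeds 95% of the time,
# overall reliability erodes quickly as steps are added:
for steps in (1, 5, 10, 20):
    print(f"{steps:>2} steps: {pipeline_success_rate(0.95, steps):.0%}")
```

At twenty steps, a pipeline of individually reliable parts succeeds end-to-end barely a third of the time.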
This is essentially what happens when marketers attempt to create enterprise-wide data solutions. They simply don’t have the operational discipline to scale them.
And as of right now, no tool out there has eliminated the need for this discipline – no matter how many sales teams claim their tool is the silver bullet.
Should Marketers Become Better at Operations? Or Should They Set More Realistic Goals?
Marketers could learn to improve operations (and definitely read my upcoming book which explains how to do that), but I think it’s better to set more realistic goals, rather than change the whole industry culture.
The following are common goals marketers have with data:
A “master” data warehouse where all data from all departments are cleanly mapped together
A “master” dashboard that answers every single business question
These goals are ambitious even for a highly efficient organization. And all of them are theoretically possible.
But as the company grows in size, it becomes more difficult to pull this off. The company eventually adds more and more steps to support the solution, thus reducing the quality of output.
The work on this scale requires a high level of discipline from every individual contributor, from account executives to creatives to data analysts and even executives.
In an industry where adaptability and intuition are key, it’s hard to create this level of discipline without resistance.
The Solution? We Should Do Less Business Intelligence and More Data Science
If both marketers and their analysts want data solutions that provide real benefits and are practical within the marketing industry culture, they should focus less on business intelligence and more on data science.
I’ve always thought “data science” was a corny term myself, but it actually applies in this context.
Data science, as defined by applied statistics and machine learning using advanced programming languages like R, Python, or SAS, offers a way for marketers to circumvent many of the problems with large-scale tracking and reporting initiatives.
Since marketers struggle to build the organizational discipline to pull off the level of tracking required to meet many of their goals (cross-domain, multi-channel attribution, user journeys, master dashboards, etc.), they can use statistics to infer the answers to their business questions.
Take multi-channel attribution for example.
Many marketers want to know whether a user who views a display ad in September and sees a Facebook post in October buys something in December.
Many would like to use multi-channel attribution to map such a path to purchase. But tracking and mapping that path together is not a real-world solution. It’s unlikely that a three-hundred-employee ad agency can organize itself to support an accurate tracking system, and circumvent the privacy rules put in place by Google and Facebook (as well as the entire European Union), well enough to map this path with full accuracy.
But the same attribution question can be answered with statistics. We can infer, sometimes using simple t-tests, whether an increase in spending in one department, like social media, leads to a higher product purchase rate.
The best part about this approach is it actually moves the focus away from tracking and reporting towards real analysis.
Instead of mapping various data sources together, the analytics team can use the existing data as is. If you’re already analyzing data, why not use it in a more raw form?
(Side note: you should still pull this raw data into a data warehouse. You just don’t have to prioritize mapping at such a granular level.)
Using raw data also makes data quality less of an issue. Since statistics has always depended on sampling, as opposed to capturing the whole population, the data quality is more forgiving. If you know you’re trying to estimate the impact of your campaign on the results, it’s less of an issue if you’re missing a few data points.
This takes the burden off the organization to improve operations and puts it on the analytics team to nudge stakeholders towards ad-hoc analysis.
Analytics teams must automate the most basic reporting and start a hypothesis library, where stakeholders ask questions on a weekly basis and the analytics team answers them with ad hoc analysis.
Analytics teams must also shift away from hiring so many database developers, report developers, and low-skill analysts, in favor of statisticians and data scientists. Or at least analysts who understand higher-level statistics.
This will make a marketing analytics team more nimble and produce greater value in line with the culture of marketers.
After all, marketers don’t just want flexibility – they need it, and data science can deliver it.