The cost of building and maintaining custom analytics tools can quickly outweigh the perceived benefits. Frederik Werner shares why you should consider industry-relevant solutions for added value and cost-effectiveness.
When you’ve worked long enough in any industry, you will start to notice some patterns. Ideally, those are positive patterns, like best practices and proven approaches. On the other side of the spectrum, we can find some negative patterns (sometimes called anti-patterns) as well, like common misunderstandings or antiquated solutions.
On any normal day, I personally prefer to write about the positive sides of the analytics industry. There are many things to be excited about, whether it is new solutions rising to popularity or making the most out of existing tools. Today’s post is going to be a bit of a deviation from that, but not without showing some ways to get into a better position towards the end.
Today I want to talk about one of the most common less-than-best practices in analytics: the urge of companies, teams, and individuals to build custom, internal tools. If you ask around in the community, it is hard to find a company that is not relying on individuals maintaining a tool that was built years ago (sometimes by former team members who have since left) for a very specific task and has long outgrown its original use case and user base.
For this post, I’m going to break down some of the reasons why those tools get built, how they grow into a substantial cost factor and risk for their companies, and what can be done instead. Of course, large parts are inspired by what I’ve seen from Accutics’ own customers since I joined the company, as well as by what I experienced myself on the customer side before that.
To start, let’s first take a look at...
Even in today’s market, with analytics tools like Adobe Analytics or Customer Journey Analytics becoming more and more end-user friendly and less and less technical, digital analytics remains a somewhat technical field. We are constantly dealing with JavaScript in the frontend, eVar expiration and attribution models in the backend, and sometimes even SQL or Python for companies where Analysis Workspace alone doesn’t cover everything. Our daily lives can be pretty complicated.
As a natural result of that, we like to staff our teams with people from a technical background. Common ways into our industry include experience in web development, statistics, programming, and other highly specialized areas. We generally like complex environments, figuring out how things work, and finding solutions through everything we bring to the table. And that’s good!
With those backgrounds and the skills they bring to a team, it’s no wonder that many analysts and analyst-adjacent team members know how to program at some level and enjoy doing so. Practically everyone has built a small website at some point, written or adapted a small script for a routine job, or worked with the APIs of an analytics tool. That’s good, too!
Where it then starts to get less good is when we allow our previous experiences and skills to influence other, less familiar areas. Even though we built a website once, building a technical tool for others to use is an entirely different game. Using an API and coding a script to automate the task is brilliant, but building a large-scale platform to automate diverse tasks for business-critical operations is very much not the same. And while it can be tempting to embark on a new skill-expanding adventure every now and then, the ongoing need for support and new features can lead to quite a bit of stress to keep up with the demand, often leading to substantial follow-up investments and/or frustrated users. That’s less good!
I’ve gone on record in the past complaining about the constant temptation of adding complexity to our daily work lives and companies. Given that we are commonly surfing the edge between boredom and being overwhelmed by new intellectual adventures, I have heard about escalating pet projects and subsequently frustrated stakeholders a few too many times. Considering that working in digital can be even more overwhelming for our business partners, we should remind ourselves to always strive to reduce everyday complexity rather than add to it.
On a way more business-related point, something that is very commonly underestimated is…
Since I’ve been through the Dunning-Kruger Curve on this topic myself, I want to discuss some of the false assumptions that might drive an effort to build something internally over just buying a solution on the market. One of the biggest misconceptions is that building tools is more affordable than buying. Let’s go over an example.
Marketing and analytics teams are commonly looking for ways to reliably track marketing traffic across channels, brands, and global teams. In the Adobe Analytics world, customers usually leverage Classifications of unique campaign tracking codes to provide metadata for campaigns. Using Classifications, marketing can use URLs like https://www.accutics.com/?cid=1_1234 to pass a tracking code to Adobe Analytics that is later translated into channel, campaign, and other information.
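To make that mechanism concrete, here is a minimal sketch of what the Classifications lookup effectively does. The tracking codes, column names, and values below are invented for illustration and are not from a real setup.

```python
# Minimal sketch of what a Classifications lookup boils down to.
# Tracking codes, column names, and values are invented for illustration.
classification_table = {
    # raw ?cid= value -> metadata Adobe Analytics can report on after classification
    "1_1234": {"Channel": "Paid Search", "Campaign": "Spring Sale", "Market": "DE"},
    "1_1235": {"Channel": "Email", "Campaign": "March Newsletter", "Market": "US"},
}

def classify(tracking_code: str) -> dict:
    """Translate a raw campaign tracking code into human-readable campaign metadata."""
    return classification_table.get(tracking_code, {"Channel": "Unclassified"})

print(classify("1_1234"))
# {'Channel': 'Paid Search', 'Campaign': 'Spring Sale', 'Market': 'DE'}
```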
In this setup, there are two critical tasks to be done:
1. Creating unique campaign tracking codes and collecting their metadata in a standardized way.
2. Getting that metadata into Adobe Analytics as Classifications, so reports show channels and campaigns instead of raw codes.
At the start of that process’s maturity journey, teams commonly rely on manual processes with the tools they have available. Excel is usually the first tool to be used, as it works well enough to collect metadata in a somewhat standardized way. Keeping campaign codes unique, however, is very challenging, especially considering that the same Excel file might be used by marketers and agencies across channels, copied and sent around via email, and edited without proper change management.

After the information is collected, it is often up to the analytics team to manually convert the Excel file to a CSV file and ingest it into Adobe Analytics, creating delays and even prioritization conflicts. If the analytics team is heavily utilized or hit by the aftermath of a recent Christmas party, marketers are left with the horrible choice of either delaying a campaign launch or risking that traffic is not tracked correctly, compromising the perceived value of their channel.

In a worst-case scenario, the person who originally built the Excel sheet might leave the company, leaving everyone with the choice between rebuilding it from scratch or trying to make sense of what has been left behind. “There must be a better way for this, I bet I can build something!” is a common next thought from that one team member who has built a website before.
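For context, the manual hand-off described above often boils down to something like the following sketch: taking the shared Excel sheet and turning it into a classification CSV that then still has to be uploaded by hand. The file names and column headers are assumptions for the example, not a prescribed format.

```python
# Rough sketch of the manual hand-off: the analytics team converts the shared
# Excel sheet into a classification CSV for Adobe Analytics. File and column
# names are assumptions made for this example.
import pandas as pd

# The Excel file that marketers and agencies have been editing and emailing around
sheet = pd.read_excel("campaign_tracking_codes.xlsx")

# Keep only the columns the classification needs; the key column must hold the
# unique tracking code that shows up in the ?cid= parameter.
classification = sheet[["Tracking Code", "Channel", "Campaign", "Market"]].rename(
    columns={"Tracking Code": "Key"}
)

# Guard against the most common Excel problems: missing or duplicated codes
classification = classification.dropna(subset=["Key"]).drop_duplicates(subset=["Key"])

# Write the CSV that someone then has to upload manually, one more step in the chain
classification.to_csv("campaign_classifications.csv", index=False)
```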
Now, let’s say it takes that team member a month or two to build a simple tool that lets stakeholders create campaign codes on their own. While the time estimate is rather optimistic, the initiative will be met with much enthusiasm from the previously bottlenecked stakeholders. They might even contribute some budget to get the issue solved once and for all. And while the resulting tool might get the job done well enough initially, it quickly starts showing its limitations. Tools like this are usually overly technical, cumbersome to use, and don’t meet compliance or accessibility requirements. Even worse, once the tool is supposed to be made available to a larger audience over the internet, a whole new list of requirements from IT, security, and legal comes hammering down on the poor team members who were just trying to be helpful. The added complexity leads to slower updates, long processes, and often outages that hinder the marketing teams just as much as the manual process did in the past.
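To illustrate, here is a sketch of what such a first version often looks like under the hood; every name, path, and code format in it is hypothetical, and the point is what it leaves out rather than what it does.

```python
# Hypothetical sketch of a first internal "campaign code generator": it hands out
# the next sequential code and appends the metadata to a shared CSV file.
# Note everything it lacks: authentication, validation rules, concurrent-edit
# safety, audit history, and a UI non-technical marketers actually enjoy using.
import csv
import os

STORE = "campaign_codes.csv"  # assumed shared location

def next_code(channel_id: int) -> str:
    """Generate the next tracking code for a channel, e.g. '1_0001'."""
    existing_rows = 0
    if os.path.exists(STORE):
        with open(STORE, newline="") as f:
            existing_rows = max(sum(1 for _ in csv.reader(f)) - 1, 0)  # minus header
    return f"{channel_id}_{existing_rows + 1:04d}"

def register_campaign(channel_id: int, channel: str, campaign: str) -> str:
    """Create a unique code and record its metadata for later classification."""
    code = next_code(channel_id)
    is_new_file = not os.path.exists(STORE)
    with open(STORE, "a", newline="") as f:
        writer = csv.writer(f)
        if is_new_file:
            writer.writerow(["Key", "Channel", "Campaign"])
        writer.writerow([code, channel, campaign])
    return code

print(register_campaign(1, "Paid Search", "Spring Sale"))  # e.g. 1_0001
```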
At this point, some might consider handing the project over to IT. Not only is it hard to communicate the exact requirements, feature ideas, and involved APIs to a less-involved developer; the delays and even more cumbersome access processes usually lead to an even lower adoption rate than the less-professional-but-working-fine version had. If a higher support level is needed, the cost of having IT resources on standby can be substantial.
Let’s try and put some numbers behind this endeavor. Of course, the numbers may vary depending on your exact setup, but the direction should be accurate enough for our discussion today. Here’s what we can assume:
That’s a lot of money! Over one or two years, a company climbing up the maturity ladder would spend close to $200,000 on the process of managing Adobe Analytics Classifications with a manual process and, later, some internal tools. Given the many stakeholders involved and the many decisions made along the way, it will be exceedingly difficult for any single individual in the chain to consider the whole undertaking and see the high overall cost for the company.
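To show how a figure like that can come together, here is a purely illustrative back-of-the-envelope calculation. None of the rates or hours below are the article’s actual assumptions; they are placeholders that make the mechanics of the estimate visible.

```python
# Purely illustrative placeholders, not the article's actual assumptions.
hourly_rate = 100          # assumed blended internal cost per analyst/developer hour
weeks = 2 * 52             # two-year horizon

manual_process  = 6 * weeks * hourly_rate   # ~6 h/week of Excel wrangling and uploads
initial_build   = 2 * 160 * hourly_rate     # ~two months to build the first tool
maintenance     = 3 * weeks * hourly_rate   # ~3 h/week of support, fixes, new features
it_and_security = 40_000                    # assumed reviews, hosting, standby support

total = manual_process + initial_build + maintenance + it_and_security
print(f"${total:,}")  # $165,600 with these placeholders; real setups vary widely
```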
Of course, this article would be rather depressing if I just pointed out the fact that...
Now, you will already have gotten an idea of where this post is heading. Of course, Accutics as a company makes tools that do not just solve the use case described above, but also address quite a few other common challenges around quality assurance, data aggregation and transformation, and more. But that’s not why I wanted to write about this topic.
When I was complicit in all the inefficiencies and lost resources I’ve outlined above at my previous companies, I simply wasn’t aware of the better options that are available to everyone in the industry. Once I learned about Accutics from the customer side, I immediately saw how it made so much more sense than what we were trying to build internally. This discovery changed my perspective so fundamentally that I later decided to become part of the mission to help companies spend their money on actual impact, rather than throwing more and more budget down the drain.
So, if you are in a similar situation today, I can only encourage you to take a step back and evaluate if the way you are handling processes today is the best way possible. If not, there’s a good chance that you would be better off spending a comparatively small part of your budget on a solution that is built by a dedicated, knowledgeable company that might even be better at building the tool you need.
It is very hard for any individual or single company to build internal tools that are even close to the feature set, user experience, and smoothed-out edges of a professionally offered solution. Considering how fast our industry and the web in general evolve, staying on top of trends and new capabilities is something even the largest companies are struggling with.
As a personal recommendation, I can only encourage you to stay curious and open-minded about the way you conduct your analytics operation today. It can be a daunting field to work in, as the constant reminders of our limited knowledge that motivate us one day might bring us down the next day, when we discover we’ve been unconsciously clinging to sub-optimal practices for a long time. It happens to anyone, and you are not alone in your experience. Once you are ready to take the next step, there are companies out there who will be very happy to help you reach the next level!
Frederik Werner is Head of Analytics at Accutics. He is the author behind Full Stack Analyst and has quickly been recognized as one of the top bloggers on Adobe Analytics and the latest trends in web analytics.