Things you may need to be careful about with data layer tagging

Disclaimer

Please hold your middle finger down while reading this. All the figures and situations in this dumpster are fictional, for twisted entertainment (wink, wink). So stop thinking this is about you.
Oh, by the way, my article is not PG-rated. Violent language is included, and this post has a very rough and vulgar tone. Just a little bit. Just… a pinch of salt.
Part of the reason is that I’m a so-called ‘damn millennial’ who grew up watching Deadpool, Eminem, and Kevin Smith movies, and the rest of the reason is that my once delicate, greenhouse flower-like temperament has become quite rough due to the trauma I’ve accumulated while working in this industry.
Don’t worry; this despair and anger of mine isn’t directed at anyone in particular.
Feel free to leave my digital hell’s kitchen, my insanity playground, whenever you like without a dramatic diva ruckus.
My subtly dyslexic arse, running low on dopamine, gave up proofreading after multiple tries. If you see grammatical errors or awkward sentences, move on. Use the context clues to understand my post, thanks.

Data tagging.

Have you heard of it? Data layer tagging, and the challenges you’ve faced implementing it for Adobe Analytics and Google Analytics.

If you have, welcome to my world, comrades.

Let me translate this clusterfuck (oops, pardon my French) into plain English, in the terms of my raw, unfiltered frustration.

The most batshit insane scenario is when a client’s analyst or an internal team demands detailed, dynamic data tagging without having direct access to the client’s source code.

If you are a developer, you might already sense some bad omens.

As a developer, this situation is enough to make you want to jump feet first into a massive, deep river. You can scream like a legendary siren, but this isn’t the mythological tale of a siren singing some seductive melody, luring sailors to their watery graves.

It’s an auditory apocalypse, the unholy noise that could make eardrums explode.

Imagine you’re at a metal concert where the sound guy has messed up and the amps are cranked to ‘kill your hearing forever’. And you, unfortunately, are front and centre when the vocalist unleashes a growl that would make demons cover their ears.

Yes, I am that Banshee, motherfucker (Oops, I did it again).

After years of battling in the trenches of data layer implementation, I’ve compiled a set of guidelines. These aren’t just some pie-in-the-sky best practices dreamed up in a boardroom. No, these are hard-won lessons, each one paid for in sleepless nights, caffeine overdoses, and the occasional mental breakdown.

Now, here are eight absolute rules. I’m writing these down myself because there are too many bastards who ignore this matter, so engrave them firmly in your head.

Data analysts should keep the following essential things in mind:

Rule 1. Clear Definition of Tagging Requirements:
Data analysts must provide clear and detailed documentation defining the tagging requirements. This should include information on what data needs to be collected, the specific events or actions to be tracked, and any additional parameters or metadata that should be included.
Rule 2. Data Collection Specifications:
Specify the data collection method and any specific tools or libraries that should be used for tagging implementation. Clearly define the data points that need to be captured and the format in which they should be collected.
Rule 3. Tag Management System:
If your organisation uses a tag management system (TMS) or a similar platform, provide instructions on how to integrate the tags within the TMS. Include details on where and how the TMS should be implemented on the website or application.
Rule 4. Variable and Event Mapping:
Clearly define the variables and events that need to be tracked and provide a mapping of these to the specific data points that should be captured. This helps the front-end developers understand the context and purpose of each data point.
Rule 5. Data Layer Implementation:
If your organisation utilises a data layer, provide guidelines on how the data layer should be structured and implemented. This includes defining the data layer variables, their values, and how they should be used for tracking (there’s a rough sketch of what this can look like right after these rules).
Rule 6. Data Quality Assurance:
Specify the quality assurance steps that should be taken to ensure accurate data collection. This may include testing and validation procedures to confirm the implemented tagging meets the requirements (a bare-bones example of such a check also follows after these rules).
Rule 7. Collaboration and Communication:
Establish a clear line of communication between the data analyst and front-end developers to address any questions, concerns, or clarifications during the tagging implementation process. Regular collaboration and feedback loops can help ensure a successful implementation.
Rule 8. Documentation and Examples:
Provide comprehensive documentation that includes examples, code snippets, and visual references to help the front-end developers understand the expected implementation. This documentation should be easily accessible and well-structured.
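
So you don’t have to squint at the abstractions, here’s roughly what Rules 4 and 5 can look like in practice. This is a minimal sketch, assuming a GTM-style `window.dataLayer`; the `add_to_cart` event, its fields, and the `trackAddToCart` helper are made-up names for illustration, not anyone’s actual schema.

```typescript
// Hypothetical event spec: the event name and fields below are illustrative only.
// In real life this shape comes straight out of the analyst's mapping document (Rule 4).
type AddToCartEvent = {
  event: "add_to_cart";          // the event key the TMS trigger listens for
  ecommerce: {
    currency: string;            // ISO 4217 code, e.g. "AUD"
    value: number;               // total value of the added items
    items: Array<{
      item_id: string;           // SKU, exactly as agreed in the mapping doc
      item_name: string;
      quantity: number;
      price: number;
    }>;
  };
};

// GTM-style data layer; declared here so TypeScript knows it exists on window.
declare global {
  interface Window {
    dataLayer: unknown[];
  }
}

// The one tiny helper the front-end calls. The analyst never touches source code;
// they only define (and version) the shape above.
export function trackAddToCart(payload: AddToCartEvent): void {
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push(payload);
}
```

The point isn’t the helper itself; it’s that the shape is written down once, agreed on by both sides, and enforced by the type checker instead of by a 3 AM Slack argument.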
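And for Rule 6, a back-of-the-napkin QA check, not a real testing framework. Again, purely a sketch: `auditDataLayer` and the required-field table are hypothetical, and a real setup would hang something like this off an end-to-end test step or a proper tag-auditing tool.

```typescript
// Rough-and-ready audit: walk the data layer and flag pushes that are missing
// the fields the mapping document promised. Event and field names are the same
// made-up ones from the sketch above.
type DataLayerEntry = Record<string, unknown>;

const REQUIRED_FIELDS: Record<string, string[]> = {
  add_to_cart: ["ecommerce"],
  page_view: ["page_path", "page_title"],
};

export function auditDataLayer(dataLayer: DataLayerEntry[]): string[] {
  const problems: string[] = [];
  for (const entry of dataLayer) {
    const eventName = entry["event"];
    if (typeof eventName !== "string") continue;   // skip TMS-internal pushes
    const required = REQUIRED_FIELDS[eventName];
    if (!required) continue;                        // event not covered by the spec
    for (const field of required) {
      if (!(field in entry)) {
        problems.push(`'${eventName}' push is missing '${field}'`);
      }
    }
  }
  return problems;
}

// Usage: paste into the browser console (or an end-to-end test) before sign-off.
// console.log(auditDataLayer((window.dataLayer ?? []) as DataLayerEntry[]));
```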

Now, you might be thinking, “Great, I’ve got my checklist; I mean, basically, that just sounds like what I’ve been doing anyway. Time to throw it over the fence to the dev team and watch the magic happen!” Hold your horses, cowboy. If only it were that simple.

Here’s the kicker: despite these clear, practical, and downright essential guidelines, there still seems to be a fundamental misunderstanding between developers and everyone else.

It’s like I’m standing here, a worn-out veteran of a thousand implementation battles, scarred by more than a few defeats, holding out a map that leads directly to the treasure of a sustainable tagging layer that actually fits the website’s structure, and the non-dev folks are looking at it upside down, using it as a napkin, or worse, ignoring it completely to follow their own “rules and customs.”

Let me paint you a picture of what happens when these guidelines are ignored.

Picture this: It’s 3 AM, you’re on your fifth cup of coffee, and you’re staring at a screen full of data that makes about as much sense as a drunk squirrel trying to solve a Rubik’s cube. Why? Because someone thought “just grab all the data” was a valid specification. Don’t be that guy.

The true tragedy here isn’t just the ignored guidelines. It’s the fact that these best practices, forged in the fires of real-world implementation nightmares, could save everyone a world of pain. But no, apparently it’s more fun to stumble around in the dark, stubbing our toes on every possible obstacle, than to just turn on the damn lights.

So here we are, stuck in this absurd cycle of ignored wisdom, impossible demands, and the inevitable fallout. And who’s left holding the bag? You guessed it - the developers. We’re expected to perform miracles with duct tape and wishful thinking, all because someone thought reading and following guidelines was beneath them.

It’s enough to make you want to grab the mic and spit some lines about the insanity of it all, but unlike certain non-PC rappers we won’t name (wink wink), we can’t just say “fuck it” and walk off stage. We’re stuck here, trying to make the impossible possible.

Now, if you’ll excuse me, I need to go bang my head against the wall until either the wall gives or my sanity does. Whichever comes first.

Disclaimer that I forgot to put above:

The Grammarly AI (hey, Grammarly, if you’re reading this, your support would be very welcome, thanks) gave this article a bit of a nudge. ’Cause… well, my brain - perpetually failing to regulate dopamine and serotonin - sometimes has the working memory of Dory from Finding Nemo. Sometimes it decides to skip grammar checks or even forgets to finish sentences.
However, make no mistake: every bit of this raw, unfiltered content spewed forth from my twisted soul. I mean… it’s quite obvious. No AI could squeeze text this chaotic out of its generative arse. For better or… for worse, it’s all me.