It’s official (at least in my mind): Digital Analytics as we know it is dead!

Why?

It’s not about tools, it’s about ownership.

When you implement an analytics solution, you probably start with selecting a tool. Then the dance begins:

Legal: You need to convince your legal department that your tool is compliant (think data protection, but not only)…
Architecture: You need to convince your architects your tool fits the current architecture…
Business: You need to convince those who pay that your tool meets all the business needs…
Security: You need to convince your security officers that the hosting is compliant, secure, …
Cloud: You need to convince yet other architects that your tool sits in an approved platform, do your first cut, …
Ownership: You need to find that product owner willing to take the responsibility for the black box…
Operations: You lack the product know-how, so you go find a specialized agency…

In the meantime, your technical teams sit and watch…

Then you get the tool, and the after-party begins:

Business: We need that fancy dashboard/KPI… You: Yes, but, calculated fields are limited, the reporting interface can only do this and that…
Data team: We need more detail on this… You: Sorry, we have no more custom dimensions left…
Your intern working on a project: Hey, I would need the IP addresses to do some bot detection POC… You: Sorry, it’s anonymized / not available…
Your data engineers: We would like to ingest your data into the DWH… You: Hmmm… let me check my options (available connectors)…
Business: The data does not match our sales numbers… You: Let me call the agency…
Analyst: Why is this dimension empty…? You: Let me call the agency…

You probably know: Like many parties, this one ends with big headaches!

What am I trying to say (and it’s still not about the tool):

In most cases you deploy a tool, only to realize that you need merely half of its features, and that it perfectly matches only half of your expectations. To bridge the gap, you either bend the tool with awkward workarounds or convince your stakeholders to accept what they get. On top of that, you are at the mercy of the vendor, hoping that your product ideas will eventually be picked up and implemented.

As a last resort, you deploy more tools to achieve your goals (meet expectations), and the party starts over…

How this could have played out instead:

You find (or are) a tech-savvy, experienced business analyst with subject matter expertise in digital analytics and its related fields, then:

  1. Cooperate with legal and compliance to design the business case with data protection and privacy in mind.
  2. Bring architects and technical teams around a table to leverage existing infrastructure (think DWH, cloud services, etc.).
  3. Conceive a lightweight, efficient and extensible architecture for your digital analytics solution.
  4. Build it with internal resources.
  5. Maintain it with internal resources.

You might say: This is overkill, no way!

Well, consider that you would then truly own and understand your data. You would have a transparent, compliant setup. You would be able to extend it as needed. And you would fully meet all the agreed-upon expectations.

You would have a product owner, and eventually a team, fully committed to enriching your data landscape.

Think about it. Let it sit for a while. This is ownership!

And the best part? This is fully doable today. Technology has progressed, and you probably already have everything in place.

Digital Analytics is not about tools; it’s about expertise, and about the political will to treat it as part of the entire company’s strategic assets.

The term “digital analytics” is losing its meaning. Its concepts, on the other hand, are becoming part of a mature data strategy.

Think about it…

Why I Forced GA4 Signals to Use google.com – and Saved My CSP Sanity

While working on a project to make both my client-side and server-side GTM setup fully first-party – including proxied gtm.js, gtag.js, and a custom sGTM endpoint – I ran into something surprising: GA4’s audience beacon was being sent to google.nl. It felt counterintuitive at first, but the explanation was logical – my proxy runs in the Netherlands, and Google chooses the TLD based on location. That small discovery led me to dig into how Google Signals works under the hood. The result was a lightweight, fully client-side fix that intercepts and rewrites the request, while also making Content Security Policy (CSP) whitelisting far more manageable by eliminating unpredictable regional TLDs.

My first instinct was to tweak the proxied gtag.js. That quickly failed. The code is minified, opaque, and packaged in a way that defeats most breakpoints in Developer Tools. Even XHR/Fetch breakpoints didn’t trigger – but ironically, that hint pointed me to the real solution.

GA4’s Signals request doesn’t use fetch() or XMLHttpRequest; it creates an image beacon dynamically using new Image(). Once I realized that, and with a little help from my preferred LLM, I came up with this:

// Keep a reference to the native Image constructor.
var OriginalImage = Image;
// Wrap window.Image so image beacons created via new Image() pass through this hook.
window.Image = function(width, height) {
  var img = new OriginalImage(width, height);
  // Grab the native src setter so we can still delegate to it after inspecting the URL.
  var originalSetSrc = Object.getOwnPropertyDescriptor(HTMLImageElement.prototype, 'src').set;
  Object.defineProperty(img, 'src', {
    set: function(value) {
      // Only touch GA4 Signals audience calls; every other image request is left alone.
      if (value.indexOf('/ads/ga-audiences') !== -1) {
        // Rewrite regional Google TLDs (google.nl, google.co.uk, ...) to google.com.
        value = value.replace(/google\.(?:[a-z]{2,3}(?:\.[a-z]{2})?)/, 'google.com');
      }
      originalSetSrc.call(this, value);
    }
  });
  return img;
};

This snippet wraps the Image constructor and overrides the src setter on each image it creates. That is exactly the path gtag.js takes when it builds its beacon with new Image(). The overridden setter checks whether the requested URL includes /ads/ga-audiences, which indicates a GA4 Signals audience call.

If that’s the case, the code replaces the domain using this regular expression:

/google\.(?:[a-z]{2,3}(?:\.[a-z]{2})?)/

This regex also handles compound TLDs like .co.uk – not just .de, .nl, or .fr.
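
If you want to sanity-check the rewrite yourself, a quick console test along these lines does the trick (the sample URLs below are made up purely for illustration):

// Illustrative sanity check of the TLD rewrite; the sample URLs are invented.
var samples = [
  'https://www.google.nl/ads/ga-audiences?cid=123',
  'https://www.google.co.uk/ads/ga-audiences?cid=123',
  'https://www.google.com.au/ads/ga-audiences?cid=123'
];
samples.forEach(function(url) {
  // Same regex as in the snippet above: regional TLDs collapse to google.com.
  console.log(url.replace(/google\.(?:[a-z]{2,3}(?:\.[a-z]{2})?)/, 'google.com'));
});
// All three URLs are logged pointing at www.google.com.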

Because I’m using this in a GTM setup, I integrated it via a Custom HTML tag in my client-side GTM container. I wrapped the code in a <script>...</script> block and gave the tag a priority of -1000 to ensure it runs before any other tag, especially the one that loads gtag.js.
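
For reference, here is a minimal sketch of what that Custom HTML tag body could look like; it is simply the snippet from above wrapped in a script block and an IIFE (the -1000 is set via the tag’s firing priority setting, not in the code itself):

<script>
  // Custom HTML tag body: same Image override as above, wrapped so it runs
  // immediately and doesn't leak variables into the global scope.
  (function() {
    var OriginalImage = Image;
    window.Image = function(width, height) {
      var img = new OriginalImage(width, height);
      var originalSetSrc = Object.getOwnPropertyDescriptor(HTMLImageElement.prototype, 'src').set;
      Object.defineProperty(img, 'src', {
        set: function(value) {
          if (value.indexOf('/ads/ga-audiences') !== -1) {
            value = value.replace(/google\.(?:[a-z]{2,3}(?:\.[a-z]{2})?)/, 'google.com');
          }
          originalSetSrc.call(this, value);
        }
      });
      return img;
    };
  })();
</script>

The IIFE keeps OriginalImage out of the global scope while still replacing window.Image, so later tags and page scripts are unaffected apart from the rewritten Signals beacon.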

Voilà: now the GA4 Signals request consistently goes to the .com endpoint.

So when you bump into someone from the security team in the hallway, they might even smile at you. Why? Because they only had to whitelist one domain (if any) for your CSP — likely *.google.com, which GA4 already uses. Without this fix, they might have needed to allow hundreds of unpredictable TLDs — something security teams never like doing.
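
Purely as an illustration (this is not a complete policy, and in a fully first-party setup most requests already go through your own domain anyway), the relevant image-source directive could then stay as compact as:

Content-Security-Policy: img-src 'self' https://*.google.com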

To my knowledge, this change has no impact whatsoever on Google Signals functionality or audience building. The request still reaches the correct endpoint, just through a fixed and predictable .com TLD. You’re not blocking or disabling anything — you’re simply taking control of where the beacon is sent, in a way that aligns better with modern security practices.

To see it in action, go here:

https://www.tortenplatten.ch

Welcome to withdata.blog

Hello, and welcome to the first post on withdata.blog! This is a space dedicated to exploring the vast and ever-evolving world of data and analytics. Whether you’re a seasoned data professional or someone just starting to delve into the field, I hope you’ll find insights, ideas, and inspiration here.

Why This Blog Exists

Data is at the heart of so many critical decisions in today’s world. From understanding customer behavior through Customer Data Platforms (CDPs) to analyzing trends in digital analytics, the role of data continues to expand. This blog was born from a desire to discuss not just the technical aspects of data, but also the broader implications — including data privacy, ethical considerations, and the impact of emerging technologies like machine learning and artificial intelligence.

Topics You Can Expect

Here are some of the key areas I’ll cover:

  1. Customer Data Platforms (CDPs)
    • How to implement and optimize a CDP.
    • Use cases that drive better customer experiences.
  2. Digital Analytics
    • Understanding user behavior across digital platforms.
    • Best practices for analytics implementation.
  3. Data Analysis & Visualization
    • Techniques for extracting actionable insights from data.
    • Tools and frameworks to simplify the process.
  4. Machine Learning & Data Science
    • Practical applications of machine learning in business.
    • Tutorials for getting started with data science projects.
  5. Internet Technologies
    • How internet architecture shapes data collection and analysis.
    • Exploring APIs, SDKs, and integration strategies.
  6. Data Management & Pipelines
    • Building efficient data pipelines to handle large-scale data.
    • (R)ETL processes: Extract, Transform, Load, and how Reverse ETL fits in modern data workflows.
  7. Python for Data
    • Using Python for data analysis, visualization, and automation.
    • Popular Python libraries for data science and machine learning, such as Pandas, NumPy, and scikit-learn.
  8. Data Privacy
    • Navigating regulations like GDPR and CCPA.
    • Balancing personalization with ethical data use.

What’s Coming Next

In upcoming posts, I’ll dive deeper into each of these topics. Expect tutorials, case studies, and thought pieces that break down complex concepts into actionable insights. I’ll also share experiences from building and optimizing data pipelines, including challenges and lessons learned along the way.

Join the Conversation

I’d love to hear from you! What topics are you most interested in? Are there specific challenges you’re facing in your data journey? Feel free to leave a comment or reach out directly. This blog isn’t just about sharing my knowledge — it’s about building a community of like-minded individuals passionate about data.

Thank you for joining me on this journey. Let’s explore the possibilities of data together!