Why I Forced GA4 Signals to Use google.com – and Saved My CSP Sanity

While working on a project to make both my client- and server-side GTM setup fully first-party – including proxied gtm.js, gtag.js, and a custom sGTM endpoint – I ran into something surprising: GA4’s audience beacon was being sent to google.nl. It felt counterintuitive at first, but the explanation was logical – my proxy runs in the Netherlands, and Google picks the TLD based on location. That small discovery led me to dig into how Google Signals works under the hood. The result is a lightweight, fully client-side fix that intercepts and rewrites the request, and it also makes Content Security Policy (CSP) whitelisting far more manageable by eliminating unpredictable regional TLDs.

My first instinct was to tweak the proxied gtag.js. That quickly failed. The code is minified, opaque, and packaged in a way that defeats most breakpoints in Developer Tools. Even XHR/Fetch breakpoints didn’t trigger – but ironically, that hint pointed me to the real solution.

GA4’s Signals request doesn’t use fetch() or XMLHttpRequest; it creates an image beacon dynamically using new Image(). Once I realized that, and with a little help from my preferred LLM, I came up with this:

// Keep a reference to the native constructor before overriding it.
var OriginalImage = Image;

window.Image = function(width, height) {
  var img = new OriginalImage(width, height);
  // Grab the native src setter so we can delegate to it after rewriting.
  var originalSetSrc = Object.getOwnPropertyDescriptor(HTMLImageElement.prototype, 'src').set;
  Object.defineProperty(img, 'src', {
    set: function(value) {
      // Only touch GA4 Signals audience beacons; leave all other images alone.
      if (value.indexOf('/ads/ga-audiences') !== -1) {
        // Swap the regional Google TLD (e.g. .nl, .de, .co.uk) for .com.
        value = value.replace(/google\.(?:[a-z]{2,3}(?:\.[a-z]{2})?)/, 'google.com');
      }
      originalSetSrc.call(this, value);
    }
  });
  return img;
};
// Preserve the prototype chain so instanceof Image still behaves as expected.
window.Image.prototype = OriginalImage.prototype;

This snippet overrides the global Image constructor and defines a custom src setter on every instance it creates. Since gtag.js builds its beacon with new Image() (rather than document.createElement('img'), which would bypass this override), each beacon URL passes through the custom setter, which checks whether it includes /ads/ga-audiences, the path that indicates a GA4 Signals audience call.

If that’s the case, the code replaces the domain using this regular expression:

/google\.(?:[a-z]{2,3}(?:\.[a-z]{2})?)/

This regex also handles compound TLDs like .co.uk – not just .de, .nl, or .fr.
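To sanity-check the pattern, you can run it against a few sample beacon URLs outside the browser (the query strings here are made up purely for illustration):

```javascript
// The same TLD-rewrite regex used in the src setter above.
var tldRegex = /google\.(?:[a-z]{2,3}(?:\.[a-z]{2})?)/;

function rewrite(url) {
  // Replace the first regional Google domain with the .com equivalent.
  return url.replace(tldRegex, 'google.com');
}

console.log(rewrite('https://www.google.nl/ads/ga-audiences?t=sr'));
// → https://www.google.com/ads/ga-audiences?t=sr
console.log(rewrite('https://www.google.co.uk/ads/ga-audiences?t=sr'));
// → https://www.google.com/ads/ga-audiences?t=sr
console.log(rewrite('https://www.google.com/ads/ga-audiences?t=sr'));
// → https://www.google.com/ads/ga-audiences?t=sr (already .com, unchanged)
```

Note that a .com URL passes through unchanged, so the rewrite is safe to apply unconditionally to every matching beacon.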

Because I’m using this in a GTM setup, I integrated it via a Custom HTML tag in my client-side GTM container. I wrapped the code in a <script>...</script> block and gave the tag a high tag firing priority (in GTM, tags with higher priority fire first) to ensure it runs before any other tag, especially the one that loads gtag.js.

Voilà — now the GA4 Signals request consistently goes to the .com endpoint:

So when you bump into someone from the security team in the hallway, they might even smile at you. Why? Because they only had to whitelist one domain (if any) for your CSP, likely *.google.com, which GA4 already uses. Without this fix, they might have needed to allow a long list of unpredictable regional TLDs, something security teams never like doing.
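For context, the relevant CSP directive might look roughly like this (a minimal sketch, not a complete policy; the Signals beacon is an image request, so it falls under img-src, while the other GA4 hits typically need connect-src):

```
Content-Security-Policy:
  img-src 'self' https://*.google.com https://*.google-analytics.com;
  script-src 'self' https://*.googletagmanager.com;
  connect-src 'self' https://*.google-analytics.com;
```

With the rewrite in place, the img-src line only needs *.google.com; without it, you would have to enumerate every regional Google domain the beacon might hit.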

To my knowledge, this change has no impact whatsoever on Google Signals functionality or audience building. The request still reaches the correct endpoint, just through a fixed and predictable .com TLD. You’re not blocking or disabling anything — you’re simply taking control of where the beacon is sent, in a way that aligns better with modern security practices.

To see it in action, go here:

https://www.tortenplatten.ch

Welcome to withdata.blog

Hello, and welcome to the first post on withdata.blog! This is a space dedicated to exploring the vast and ever-evolving world of data and analytics. Whether you’re a seasoned data professional or someone just starting to delve into the field, I hope you’ll find insights, ideas, and inspiration here.

Why This Blog Exists

Data is at the heart of so many critical decisions in today’s world. From understanding customer behavior through Customer Data Platforms (CDPs) to analyzing trends in digital analytics, the role of data continues to expand. This blog was born from a desire to discuss not just the technical aspects of data, but also the broader implications — including data privacy, ethical considerations, and the impact of emerging technologies like machine learning and artificial intelligence.

Topics You Can Expect

Here are some of the key areas I’ll cover:

  1. Customer Data Platforms (CDPs)
    • How to implement and optimize a CDP.
    • Use cases that drive better customer experiences.
  2. Digital Analytics
    • Understanding user behavior across digital platforms.
    • Best practices for analytics implementation.
  3. Data Analysis & Visualization
    • Techniques for extracting actionable insights from data.
    • Tools and frameworks to simplify the process.
  4. Machine Learning & Data Science
    • Practical applications of machine learning in business.
    • Tutorials for getting started with data science projects.
  5. Internet Technologies
    • How internet architecture shapes data collection and analysis.
    • Exploring APIs, SDKs, and integration strategies.
  6. Data Management & Pipelines
    • Building efficient data pipelines to handle large-scale data.
    • (R)ETL processes: Extract, Transform, Load, and how Reverse ETL fits in modern data workflows.
  7. Python for Data
    • Using Python for data analysis, visualization, and automation.
    • Popular Python libraries for data science and machine learning, such as Pandas, NumPy, and scikit-learn.
  8. Data Privacy
    • Navigating regulations like GDPR and CCPA.
    • Balancing personalization with ethical data use.

What’s Coming Next

In upcoming posts, I’ll dive deeper into each of these topics. Expect tutorials, case studies, and thought pieces that break down complex concepts into actionable insights. I’ll also share experiences from building and optimizing data pipelines, including challenges and lessons learned along the way.

Join the Conversation

I’d love to hear from you! What topics are you most interested in? Are there specific challenges you’re facing in your data journey? Feel free to leave a comment or reach out directly. This blog isn’t just about sharing my knowledge — it’s about building a community of like-minded individuals passionate about data.

Thank you for joining me on this journey. Let’s explore the possibilities of data together!