How to exclude robot behaviour from Data Management without using IP addresses?

For privacy reasons, IP addresses are not accessible or shared. This article shows how to exclude robot behaviour from the Data Management interface without manipulating IP addresses.

Create an exclusion rule in Data Management

Go to the interface to create an exclusion rule:

Data Management > Configuration > Exclusions > Create an exclusion

  1. Choose the scope
    Choose whether your exclusion will be applied to one or more sites.

  2. Choose one or more identifying properties
    Select properties specific to the unusual traffic you wish to exclude, such as:

    • City

    • Organization / Connection organization

    • ISP

    • Browser / Browser version

    • User Agent

    • Event URL / Page URL

    • Traffic source (when available in your data)

Tip: Bot traffic often becomes reliably identifiable when you combine multiple properties (for example, Browser + Connection organization), rather than excluding on a single broad condition that could remove legitimate visits.
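The effect of combining properties can be sketched in a few lines of Python. The property names and values below (`connection_org`, `browser`, "ExampleCloud", "HeadlessChrome") are illustrative assumptions, not the product's actual property keys:

```python
# Hypothetical sketch: a broad single-property rule vs. a combined rule.
# Property names and values are illustrative, not real property_keys.

def matches_broad_rule(event):
    # Broad rule: exclude everything from one hosting organization.
    return event.get("connection_org") == "ExampleCloud"

def matches_combined_rule(event):
    # Combined rule: same organization AND a headless browser signature.
    return (event.get("connection_org") == "ExampleCloud"
            and event.get("browser") == "HeadlessChrome")

events = [
    {"connection_org": "ExampleCloud", "browser": "HeadlessChrome"},  # bot
    {"connection_org": "ExampleCloud", "browser": "Firefox"},         # real user behind a cloud VPN
]

print([matches_broad_rule(e) for e in events])     # [True, True]  – also removes the real user
print([matches_combined_rule(e) for e in events])  # [True, False] – keeps the real user
```

The broad rule silently discards the legitimate visit; the combined rule only matches the bot signature.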

  3. Retrieve the right property_key from your Data Model
    Use your Data Model to retrieve the property_key that you will use in the exclusion.

For example, if you want to exclude traffic from the city of Paris, you will need to use the property_key city for that property, which you will then use in your exclusion rule based on the Tag parameter.

  4. Add conditions to refine the rule
    Combine your criteria (by clicking Add a condition) so the rule matches your unusual traffic as closely as possible.

Example approach:

  • Exclude a suspicious User-Agent

  • AND exclude traffic from a specific Connection organization

  • AND restrict the exclusion to certain pages or URLs when possible
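The example approach above can be read as a set of conditions joined by AND: an event is excluded only when every condition matches. This minimal sketch evaluates such a rule; the field names, values, and pages are assumptions for illustration, not real rule syntax:

```python
# Minimal sketch of an AND-combined exclusion rule.
# All names and values below are illustrative assumptions.

SUSPICIOUS_UA = "ExampleBot/1.0"
SUSPICIOUS_ORG = "ExampleHosting"
TARGET_PAGES = {"/pricing", "/signup"}

def is_excluded(event):
    # Every condition must hold (logical AND) for the event to be excluded.
    return (event.get("user_agent") == SUSPICIOUS_UA
            and event.get("connection_org") == SUSPICIOUS_ORG
            and event.get("page") in TARGET_PAGES)

hit = {"user_agent": "ExampleBot/1.0",
       "connection_org": "ExampleHosting",
       "page": "/pricing"}
miss = dict(hit, page="/blog")  # same bot signature, but outside the scoped pages

print(is_excluded(hit), is_excluded(miss))  # True False
```

Restricting the rule to specific pages means the same bot signature seen elsewhere is left untouched, which limits the blast radius of a mistaken condition.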

Important notes and common pitfalls

Exclusions are not retroactive

Exclusion rules apply going forward; they do not modify already-processed historical data.

If you need to correct historical data, this requires a data reprocessing service (billable). Contact the Support Team and be prepared to provide the site(s) concerned and the exact date range to be reprocessed.

Use tag-available properties (not calculated-only properties)

Some properties are computed during processing and are not available “in the tag.” Those properties cannot be used as exclusion conditions.

For example, visit_entrypage is calculated during processing and cannot be used reliably in exclusions. If you want to exclude based on landing/entry behavior, use a tag-available property such as page (or other URL/page properties available in your implementation) instead.
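A simple guard in your own tooling can catch this mistake before a rule is created. The sets below are assumptions for illustration; check your own Data Model for the actual lists of tag-available and calculated properties:

```python
# Hypothetical validation helper: reject exclusion conditions on properties
# that are only calculated during processing. Both sets are illustrative
# assumptions; consult your Data Model for the real lists.

TAG_AVAILABLE = {"page", "user_agent", "browser", "city"}
CALCULATED_ONLY = {"visit_entrypage", "visit_duration"}

def check_condition(property_key):
    if property_key in CALCULATED_ONLY:
        raise ValueError(
            f"'{property_key}' is calculated during processing; "
            "use a tag-available property such as 'page' instead.")
    return property_key in TAG_AVAILABLE

print(check_condition("page"))  # True
```

Calling `check_condition("visit_entrypage")` raises a `ValueError` with a hint to switch to a tag-available property.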

Allow time for changes to take effect

After creating or editing an exclusion, allow some time for the rule to take effect in newly collected and processed data (up to 20 minutes), then monitor your reporting to confirm the change.

Avoid over-excluding

When excluding suspicious traffic, keep conditions as specific as possible to avoid removing real users. If the bot pattern is uncertain, start with narrower criteria and expand cautiously as you confirm the behavior.
