
What data strategy when media budgets are at half mast?

Since the summer of 2022, and for at least the first half of 2023, media budgets, digital included, are like the economy: gloomy. How can we continue to work with data in this context? On which pillars should we rely to get the full value of interactions with audiences? Here's an overview.

A low tide that is slow to rise. This is the picture that emerges from the forecasts for the evolution of the digital advertising market. Yes, digital is in better shape than the overall advertising market, but the ebb is there. In France, the players (publishers, trading desks) consulted by JDNet at the end of 2022 confirm a decline in programmatic since the summer of 2022 and anticipate a dull first half of 2023.

Globally, advertising growth estimates have been revised downwards: GroupM forecasts a 5.3% increase in global advertising investment in 2023 (the estimate was 6.4% six months earlier). A look at the US market, and more specifically at ad spending on social networks, shows a cold wind blowing on the forecasts there too: for Insider Intelligence, nearly $10 billion disappeared between the estimates of December 2022 and those published in March 2023. As analyses accumulate, a consensus is emerging: the slowdown should continue during the first half of 2023. At least.


Digital media buying in poor shape


This weak level of digital media investment can be explained. In a very uncertain global context, brands anticipate a drop in consumption and modulate their investments accordingly. "And when you have to cut back, digital media buying is easy to deactivate and reactivate, unlike investments in content or SEO, where cutting budgets can cause regressions," says the digital strategy manager of a major industrial player. In this context, with less media investment, how can you continue to develop your data?


Focus on organic levers


Unsurprisingly, efforts are focused on organic levers, for both acquisition and retention. On the acquisition side, those who have already built up an "SEO income" (i.e., SEO work that guarantees regular traffic) will at least maintain it, and even develop it by stepping up their content production. This traffic is especially interesting when it is generated by content covering the different types of search intent (informational, navigational, commercial, transactional). The visits generated this way therefore represent qualified traffic that can be worked on, at least in part, subject to consent management.


Another channel attracting plenty of attention in these times of media scarcity: emailing, in all its forms, from the cold acquisition email to the regular newsletter that maintains a lasting relationship with an audience. These emails are precious supports for enriching data: beyond the classic click and open rates, indicators such as loyalty level and the type of content consumed help refine audience knowledge so it can be activated better later.


More than ever, think omnichannel


Making the most of the traffic gained through these levers means optimizing the "playbooks". To develop consideration and conversion, these automation sequences often combine emails, but brands have every interest in designing them on an omnichannel scale: for example, triggering them from an in-store action (a purchase, the issuance of a loyalty card), or enriching the scenario with a call-center interaction (to provide advice and confirm interest).


Whatever the organic levers used, they all have one thing in common: unlike media buying, the aim here is not to put an audience under pressure but to address it with appropriate messages, at a measured pace and in full compliance. A delicate balance which, to be maintained, requires the support of three pillars:


Pillar #1: Consent Management


Now an essential part of the martech stack, the CMP (Consent Management Platform) must help identify the formula that guarantees a consent rate consistent with the benchmarks of your business sector. A formula to A/B test by varying design elements as well as the wording, which must strike a happy compromise between brand tone, pedagogy and legal imperatives.
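As an illustration of such an A/B test, banner variants can be assigned with a deterministic split and compared on their consent rate. The variant names and event shape below are invented for the sketch, not a CMP's actual API.

```python
import hashlib

# Hypothetical banner variants: same copy, different tone/design emphasis.
VARIANTS = ["brand_tone", "pedagogical", "legal_first"]

def assign_variant(visitor_id: str) -> str:
    """Stable hash-based split: the same visitor always sees the same banner."""
    digest = int(hashlib.sha256(visitor_id.encode()).hexdigest(), 16)
    return VARIANTS[digest % len(VARIANTS)]

def consent_rate(events: list, variant: str) -> float:
    """Share of 'accepted' outcomes among visitors exposed to a variant."""
    exposed = [e for e in events if e["variant"] == variant]
    if not exposed:
        return 0.0
    return sum(e["accepted"] for e in exposed) / len(exposed)
```

The deterministic split matters: re-prompting the same visitor with a different banner would contaminate the comparison between variants.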


Pillar #2: Identity Resolution


Equally important is the ability to reconcile interactions around a unique ID for each user. Without this identity resolution, it will be difficult not to over-solicit audiences, let alone send them content adapted to their expectations. Technical prerequisites, especially for email (read our white paper "A world without cookies"), are necessary, but it is also important to design campaigns that multiply the opportunities to associate emails and cookies.
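A minimal sketch of the idea, assuming a simple in-memory mapping (a real identity resolution engine also handles cluster merges, persistence and scale):

```python
# Reconcile email and cookie identifiers around one internal user ID,
# so the same person is not counted (or solicited) twice.
class IdentityGraph:
    def __init__(self):
        self._id_of = {}   # identifier -> internal user ID
        self._next_id = 0

    def link(self, a: str, b: str) -> int:
        """Associate two identifiers (e.g. an email and a cookie ID).
        Simplified: assumes at most one of the two is already known."""
        uid = self._id_of.get(a, self._id_of.get(b))
        if uid is None:
            uid = self._next_id
            self._next_id += 1
        self._id_of[a] = uid
        self._id_of[b] = uid
        return uid

    def resolve(self, identifier: str):
        """Return the internal user ID for an identifier, or None."""
        return self._id_of.get(identifier)
```

Each campaign that captures a login or a click from an emailed link is an opportunity to call `link()` and tie one more cookie to a known email.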


Pillar #3: Audience segmentation


Finally, to fully capitalize on organic levers and first-party data, the ability to segment audiences holds the key to effective activation. The objective is to capture as many signals as possible in order to create segments of varying complexity and to personalize messages. And, once again, this should be done on an omnichannel scale, capturing signals from all horizons and at scale, to give teams the material to enrich the dimensions of these segments.
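As a rough sketch, a segment can be expressed as a set of predicate rules over captured signals; the field names below are illustrative, not a real product schema.

```python
# A segment = profiles matching every rule; rules are plain predicates,
# so omnichannel signals (email, store, web) combine naturally.
def segment(profiles, rules):
    """Return the IDs of profiles matching all rules."""
    return [p["id"] for p in profiles if all(rule(p) for rule in rules)]

profiles = [
    {"id": "u1", "email_opens_30d": 9, "store_visits": 2},
    {"id": "u2", "email_opens_30d": 1, "store_visits": 0},
]

engaged_omnichannel = segment(profiles, [
    lambda p: p["email_opens_30d"] >= 5,   # engaged by email
    lambda p: p["store_visits"] >= 1,      # also seen in store
])
```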


Backed by these three pillars, the teams in charge of acquisition and retention will be able to improve customer knowledge, and gain precision in allocating media budgets, from which brands can mechanically expect an increased return.

Improving data quality, a key issue for companies

In a world where data is increasingly valuable to businesses, ensuring its quality is essential to guaranteeing the effectiveness of campaigns and therefore maximizing marketing investments. This is where our platform comes into play.

Why is data quality so important to businesses?

Data quality is crucial for businesses because it directly impacts their ability to make informed decisions, analyze data accurately, and achieve their business goals. When data is of poor quality, it leads to errors in analysis and decision-making, with a negative impact on marketing campaign ROI and campaign analytics. For example, if a company uses poor-quality data to target its advertising campaigns, it may end up reaching people who are not interested in its products or services, wasting budget and lowering campaign performance. Similarly, if a company analyzes its performance with poor-quality data, it may make decisions that are not based on accurate information, to the detriment of its long-term growth and success.
In summary, data quality is critical for businesses, as it is the basis for informed decision-making and accurate analysis.

What is Data Quality or Data Integrity?

Data integrity refers to the accuracy and consistency of data throughout its life cycle, from collection and storage to analysis and dissemination. Without data integrity, companies risk making decisions based on inaccurate information, which can lead to lost revenue, damaged reputations, and even legal issues. Ensuring data integrity is a complex and difficult process, especially for organizations that handle large amounts of data from multiple sources. It requires the implementation of a series of controls and processes, including quality control, validation, duplicate removal, integrated delivery control, real-time alerts, preservation and backup, cybersecurity and advanced access controls. These measures ensure that data is accurate, complete and consistent, and that any data integrity threats are identified and quickly addressed.

Improve data quality with our platform

Our platform aims to give companies confidence in their data in a very simple way. We offer a standardized datalayer interface that lets users define their data schema and the validation rules that feed their data quality workflow.
Moreover, our Data Cleansing feature allows users to transform or correct their events in real time in a simple and intuitive way, thanks to our no-code approach. The more technical among us are not forgotten, since we also offer a low-code module (or even pure code for the more daring).
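To illustrate the principle only (the rule format below is invented for this example, not the platform's actual syntax), a datalayer schema with validation rules might look like this:

```python
# Invented schema format: each field declares a type, whether it is
# required, and optionally an allowed-values set.
SCHEMA = {
    "event_name": {"type": str, "required": True},
    "value":      {"type": (int, float), "required": False},
    "currency":   {"type": str, "required": False, "allowed": {"EUR", "USD"}},
}

def validate(event: dict, schema=SCHEMA) -> list:
    """Return a list of human-readable violations (empty list = valid)."""
    errors = []
    for field, spec in schema.items():
        if field not in event:
            if spec.get("required"):
                errors.append(f"missing required field '{field}'")
            continue
        if not isinstance(event[field], spec["type"]):
            errors.append(f"'{field}' has wrong type")
        elif "allowed" in spec and event[field] not in spec["allowed"]:
            errors.append(f"'{field}' not in allowed values")
    return errors
```

Violations produced this way are what a data quality dashboard can surface, count and alert on.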

Manage data errors with our platform

We have several features to manage data errors. First, we have a data quality dashboard that allows users to see specification violations at a glance and quickly correct them at the source or in real-time with the Data Cleansing feature.
We also offer real-time alerts so that users can react quickly to data errors. These alerts can be sent by email, messaging (Slack, Teams, …), webhook or via notifications in the interface. An alert can be configured in 3 clicks, with a slider to choose the trigger threshold and the communication channel.
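The threshold logic behind such an alert can be sketched as follows; the payload shape and channel names are illustrative, not the product's actual configuration.

```python
# Fire an alert when the share of failed events crosses a chosen threshold.
def check_alert(sent: int, failed: int, threshold: float, channel: str):
    """Return an alert payload when failed/(sent+failed) exceeds threshold,
    otherwise None."""
    total = sent + failed
    rate = failed / total if total else 0.0
    if rate > threshold:
        return {"channel": channel, "error_rate": round(rate, 3)}
    return None
```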

How our product helps work with the same data across the enterprise

Our standardized datalayer interface allows users to define the schema of their data and define validation rules to ensure that all data conforms to that schema. This way, all teams can work with the same data and ensure that it is of high quality. In addition, we have a single data dictionary that allows users to define and share their data definitions across the enterprise.

What is Data Cleansing and how does it work?

The Data Cleansing feature allows users to transform/correct their events before sending them to their destinations. We have several types of transformations available, such as event renaming, event derivation, property modification and event filtering, which can be created in a simple and intuitive way thanks to our no-code approach based on basic formulas and operators, very similar to what one would find in a spreadsheet like Excel. For those who prefer a low-code approach, it is also possible to add custom JavaScript code to create custom transformations. The Data Cleansing feature is particularly useful for ensuring that data sent to destinations is of high quality and complies with required specifications.
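A toy version of these four transformation types might look like this; the event shape is assumed for the example, not taken from the product.

```python
# The four transformation types named above, chained into one pipeline:
# renaming, derivation, property modification, filtering.
def rename_event(event, old, new):
    if event["name"] == old:
        event = {**event, "name": new}
    return event

def derive_property(event, prop, fn):
    return {**event, prop: fn(event)}

def filter_event(event, predicate):
    return event if predicate(event) else None

def cleanse(event):
    event = rename_event(event, "buy", "purchase")            # renaming
    event = derive_property(event, "value_cents",
                            lambda e: int(e["value"] * 100))  # derivation
    event = {**event, "currency": event["currency"].upper()}  # modification
    return filter_event(event, lambda e: e["value"] > 0)      # filtering
```

A no-code interface essentially lets users compose steps like these from formulas; the low-code module would replace a step with custom JavaScript.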

What about the quality of the data transmitted to destinations?

We have an event deliverability tracking interface that allows users to check if data is reaching its destination or if there have been any problems in sending. This interface includes quick and easy to read metrics such as the percentage of events not sent, a visualization of the evolution of correctly sent and failed events over a given period of time, and an error summary table. The latter gives an overview of the different types of errors encountered and how to resolve them. In case of sending problems, we also offer an alert system to notify users immediately.
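The metrics described above boil down to simple aggregation, which can be sketched as follows (event shape invented for the example):

```python
# Compute the percentage of events not sent plus an error summary table.
def deliverability_summary(events):
    """events: list of {"status": "sent"|"failed", "error": str|None}."""
    failed = [e for e in events if e["status"] == "failed"]
    summary = {}
    for e in failed:
        summary[e["error"]] = summary.get(e["error"], 0) + 1
    pct_failed = 100 * len(failed) / len(events) if events else 0.0
    return {"pct_not_sent": pct_failed, "errors": summary}
```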

How our platform simplifies complex technical errors when sending data to partners

First of all, errors are not always technical; they are often missing or badly formatted data, and our platform generates natural-language explanations that are very easy to read. As for technical errors, whether they come from a partner's API feedback or from the unavailability of its servers, it was important to us that each one be very simple to understand. We use natural language generation (NLG) to transform these unreadable errors into explanations that a non-technical profile can fully understand, complete with resolution paths. That's the magic of AI 🙂

Infographic Sales 2023 – Best practices to increase customer engagement during sales

In a context where 48% of French people say they are worried about their purchasing power and 75% want to change their purchasing behavior (OpinionWay study, "From purchasing duty to purchasing power: the relationship of the French with consumption", 2022), sales periods are critical for advertisers, who must be creative to stand out among the thousands of signals sent to consumers.

With a Customer Data Platform, you can now increase your customer knowledge and send the right messages in a few clicks in order to increase their engagement.

To do so, here are some good practices to adopt to take advantage of the sales period.

In summary, you need a centralized view of the entire life cycle of user data to have a positive impact on buying behavior. Segmentation becomes the most efficient way to create your audience groups and adapt omnichannel strategies. A Customer Data Platform is a great tool to increase customer engagement.

Did you like this infographic and want to learn more about customer engagement and analyzing the results?

Find out more with our white paper “How to prepare for Cookieless” and increase your customers’ engagement.

Checklist 2022 – Improve your tracking before Black Friday


Like every year, Black Friday promises to attract a lot of people, which should generate a large amount of data to be processed by data teams. Make sure your tracking is ready for this event by reading our best practices.


Improve your Tracking before Black Friday with our Infographics


Did you like this checklist and want to know more about your tracking?

Find out more about Commanders Act’s Live Report Builder and optimise your customer journey.

Consent management in Spain: 5 reasons to accelerate

Although Spain has adopted the GDPR (General Data Protection Regulation) like the other European Union countries, its application on the ground remains uneven.

Like the CNIL in France, the SADP (Spanish Agency for Data Protection) has issued guidelines and published guides to support companies and facilitate their compliance with the GDPR, particularly with regard to the management of cookies and associated consents. “The SADP’s guidelines are very similar to those of the CNIL, but in practice, there is not a strong official pressure for all actors to take up the subject quickly and globally,” says Juan Vasquez, Director for the Iberian region of Commanders Act.

This feeling is also reflected in the figures. Yes, Spain is at the top of the rankings for the number of fines imposed, but rather at the bottom for their average amount: 131,564 euros, compared with 10.79 million euros in France. In this context, can the French experience inspire the Spanish market? What lessons can be learned from it? Here is a summary in five lessons.


Learning #1: Don’t Think It’s Just About Others

In Spain, the SADP is paying special attention to Telcos such as Vodafone, banks such as Caixabank and BBVA, and airlines such as Iberia and Vueling. A pressure that has strongly encouraged companies in these sectors to take up the issue. “I think there are significant differences in the level of compliance depending on the sector of activity, with players in the telecommunications or financial sectors, whose activity is directly related to the processing of personal data, clearly above average”, analyzes Santiago Vázquez-Graña, DPO of Capgemini Spain.

Should other sectors and smaller players feel immune? Probably not. As the French experience has shown, even if the CNIL does not necessarily have the means to investigate all sectors at the same time, campaigns follow one another to review the various sectors of the economy. It is difficult to see how actors in the tourism sector, which accounts for more than 10% of GDP, can escape the investigations of the SADP.

In short, whether it is a large company or a small or medium-sized enterprise, all the players, especially those whose activities are exposed in one way or another, may appear on the radar of the authorities.


Learning #2: Don’t wait for the skills shortage

GDPR and the directives issued by national authorities form a complex set to interpret. Beyond the modalities of consent management, subjects such as the retention period may require the support of experts. The same applies to technical implementation. In other words, it is better not to wait for a general market adoption movement before taking the plunge, unless you want to pay a high price for skills that are not readily available. In France, where companies often waited until the last moment to invest in the subject, specialized law firms quickly became scarce…


Learning #3: Don’t think you’ve solved the problem with server-side

The announced end of cookies has led companies to launch migrations to the server side. In other words, the collection of information is no longer handled on the browser side, but on the server side (read our white paper "How to prepare for cookieless?"). These migrations are sometimes accompanied by a question: since collection is done "cookieless", is consent no longer needed?

Considering the efforts made to switch to server-side, the shortcut is tempting, but… no. Server-side is just a technical collection modality with no impact on consent management requirements. That’s why the Commanders Act CMP was designed from the start to propagate the consent signal in server-side mode.
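The principle can be sketched as follows; the purpose and vendor names are invented, and a real CMP consent signal is richer than a set of strings.

```python
# Even with server-side collection, each event keeps the user's consent
# state, and the server only forwards events to vendors whose purpose
# the user accepted.
CONSENTED = {"user_1": {"analytics"}}   # consent signal propagated from the CMP
VENDOR_PURPOSE = {"web_analytics": "analytics", "ad_network": "ads"}

def forward_targets(event: dict) -> list:
    """Vendors this event may be forwarded to, given the user's consent."""
    purposes = CONSENTED.get(event["user_id"], set())
    return [v for v, purpose in VENDOR_PURPOSE.items() if purpose in purposes]
```

The point of the sketch: moving collection server-side changes *where* the filtering happens, not *whether* it must happen.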


Learning #4: Thinking about managing global user preferences

How do you manage consent? Case by case, site by site, channel by channel? So, deal with the site, then the app and later the chatbot? Feedback shows that companies like the French Army (l’Armée de Terre) or Floa Bank that opt for a global approach to the subject enjoy a better return on investment.

By addressing consent holistically through a user preference management center, the company streamlines its efforts and greatly minimizes the risk of non-compliance. “In the absence of such an approach, many chatbots in Spain are not attached to a CMP”, observes Juan Vasquez.


Learning #5: Consider preference management as an investment

As a corollary to the previous point, the next step in consent management is to stop considering the subject as a cost and to see it instead as an investment, or even a competitive advantage the company can leverage. In Spain, "the costs associated with compliance (…) and the general view of compliance as a hindrance to business rather than as a source of added value in the relationship with the customer, explain why many companies do not invest the necessary resources to adapt to the new regulatory reality," says Legal Army.

However, the situation seems to be gradually changing. Let’s hope that, as in France, a growing number of companies will make consent management an attractive argument in its own right.

4 steps in preparation for server-side tag management

With the advent of Intelligent Tracking Prevention (ITP) updates from Apple and Google’s announcement to get rid of third-party cookies, server-side tag management really gained momentum. Most browsers, devices and channels have now stopped using third-party cookies, which will make client-side tagging obsolete in the future. These 4 steps in preparation for server-side tag management will help you.

It’s worth remembering that client-side tagging involves the user’s browser interacting via a tag container directly with the providers (such as Google Analytics or Facebook). With every tag that is configured, an HTTP request is sent for each interaction to the corresponding endpoint of the tag vendor. The request in the client itself is already structured in the correct data format.

Server-side tagging, on the other hand, allows a separate data endpoint to be created in a server environment to which the data will be sent, instead of directly to the providers. This means that all interactions in progress on the website are sent as a bundled data stream from a client integrated in the browser to the server of the tag management solution. It is only here that the individual tags of the service providers are located and structure the desired data into the required format. In addition, the data can be further processed, enriched and anonymised before it is transferred to the digital marketing providers.
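A highly simplified sketch of that flow, with invented vendor formats (a real server container would also enrich and anonymise the data at this stage):

```python
# The browser sends ONE bundled stream to a first-party server endpoint;
# only there is each event structured into the format each vendor expects.
def to_vendor_format(event: dict, vendor: str) -> dict:
    if vendor == "analytics":
        return {"en": event["name"], "uid": event["user_id"]}
    if vendor == "ads":
        return {"event_type": event["name"].upper()}
    raise ValueError(f"unknown vendor {vendor}")

def server_endpoint(bundled_events, vendors):
    """Fan one incoming stream out into one payload list per vendor."""
    return {v: [to_vendor_format(e, v) for e in bundled_events]
            for v in vendors}
```

Contrast with client-side tagging, where the browser itself would build and send one already-formatted request per vendor per interaction.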

Many companies are therefore switching from client-side to server-side tag management. However, this transition takes time, which comes as no surprise given that it is a process which must include all the players in the digital world. But this carries the risk of a wait-and-see attitude. If you wait to launch your server-side strategy until the concept is fully mature, you forego an important learning curve on the one hand and miss out on important competitive advantages on the other.

But what is the best way to manage this transition? What should the strategy for server-side implementation look like? The following checklist provides some guidance on this.


1. Involve the Data Protection Officer in the transition

Many challenges which are causing a stir in the digital world trace their origins back to data protection issues. Even if a server-side implementation is described as a “technical project”, the company’s Data Protection Officer should be involved as early as possible. Their opinion plays a decisive role at every stage of the process – from selecting the solutions to be implemented on the server side in the future, to new processing methods which facilitate the centralisation of data on a server-side solution.


2. Inventory of partners

Depending on the company's needs, several dozen tags are typically deployed across the digital infrastructure, and just as many (or almost as many) partners are activated.

Which of these partners…


… is already ready for use on the server side?

… plans to be ready for this in the future?

… will operate in hybrid mode?

… has not planned a medium-term changeover?


For solution providers, the complexity of the transition is directly related to the nature of the tasks. There are three main types of tasks: detection, capture and interaction.

During the detection process, it is almost impossible to bypass the client-side tag; during the capture process, on the other hand, an exchange seems possible, while in the interaction process (e.g. personalisation), it becomes much more difficult. In addition, for one and the same partner the capture process can occasionally occur on the client side and further processing on the server side.

So, if you want to devise your own strategy, you also have to be familiar with the strategy/strategies of your partners. Since this is a major challenge, nowadays all the players involved communicate with each other on this topic, which makes the task easier.


3. Start with the right candidates

Even if the transition is based on a server-side designed customer data platform (CDP), it takes several months and requires coexistence between client-side and server-side solutions. In light of this, it is important to find the right candidates so that everyone (from IT to marketing and the Data Protection Officer) can become familiarised with the server-side logic.

There is a high probability that a Pareto principle will emerge in which 20% of the tags cause 80% of the difficulties. A good candidate is a solution which has already tested the transition to a server-side solution and for which the benefits are clearly identifiable.

Assessing solution providers according to the business impact of a conversion and its technical difficulty can help with scheduling the operations.


4. Communication and information

Server-side implementation does not only affect the IT teams, digital marketing or the Data Protection Officer: everyone needs to face the challenges and prerequisites of server-side technology at their own level. Clichés about needing to break open the silos do not help either in this case. Companies should instead provide appropriate communication with a view to initiating a joint discussion about the strategy, the key moments and feedback.



We cannot predict exactly what the digital world will look like without cookies. However, several developments are taking place which suggest that the future belongs to server-side technologies. In fact, the switch to server-side concepts will take time, and so both models (client- and server-side) will still have to operate side by side for a while. But the centralisation of data on a server-side solution offers too many advantages to be ignored any more.

Since employees are often already busy with day-to-day business, it is hardly possible without external experts who can optimally support a company during this transition. Another good reason not to wait too long – because with what’s at stake, the best providers are quickly booked up. With these 4 steps in preparation for server-side tag management you are on the right path.



More on the topic:

White Paper: How do you prepare for a future where there are fewer cookies?

How to choose the right CDP?

All Customer Data Platforms can build unified customer profiles. But features still vary widely among systems, both in terms of the core features that build customer profiles and additional features that perform other functions.

Finding the right CDP for your business requires a close look at what each system provides compared with what your company needs.

We’ll start with the core features. Remember that non-CDP systems provide a number of features that overlap with a CDP. A data warehouse may collect data from source systems, making CDP data collection features less important. A Master Data Management system may maintain customer identities that your CDP can import. An Integration Platform may be able to move CDP data into systems that need it. Even if you do have those types of systems, there’s no guarantee they’ll actually meet your needs. But you do want to take a look.
For now, let’s assume you do need a CDP to provide the full set of features. Some items to check include:

Data collection:

  • Can the system handle all of your existing data sources, including both online and offline data?
  • Does the vendor provide prebuilt connectors for your existing data sources, or will you have to build your own?
  • What is required to build a new connector?
  • If you have streaming data sources, can the system support them?
  • Can you define and react to events within the streaming data?
  • Can the system accept unknown or unexpected inputs without losing the data?
  • Can the system support your data volumes, in terms of load time, storage capacity, and access speed?
  • What will be needed to organize the data so it matches your company’s business data model?

Profile building:

  • Can the system build profiles with a persistent ID that does not change regardless of changes to any personal identifier?
  • Can the system match personal identifiers using the methods you require?
    • Deterministic (matches based on known links between two identifiers, such as a phone number and postal address on the same account record).
    • Similarity (matches based on alternative forms of the same information, such as spelling variations in a street name).
    • Probabilistic (matches based on correlations between two items, such as a phone and computer that are frequently used at the same times and places).

  • Can the system build a unified view by combining all data associated with the same individual?
  • Can the system create aggregates, scores, and other derived values based on events and transactions associated with the same individual?
  • Can the system build household- or company-level profiles that combine information for multiple individuals?
  • How quickly can the system recalculate derived values when a new personal identifier is matched to an existing customer ID (without reassessing existing matches)?
  • How quickly can the system reassess all existing matches when new personal identifiers are added?
  • How quickly is ingested data incorporated into customer profiles, including updated calculations using that data?
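To make the three matching methods concrete, here are naive single-pair versions; the record fields are invented, and a real CDP applies such logic at scale with tuned thresholds.

```python
def deterministic_match(a: dict, b: dict) -> bool:
    """Known link: the same phone number appears on both records."""
    return a.get("phone") is not None and a.get("phone") == b.get("phone")

def similarity_match(a: dict, b: dict) -> bool:
    """Alternative forms of the same street name (naive normalisation)."""
    norm = lambda s: s.lower().replace("street", "st").replace(" ", "")
    return norm(a["street"]) == norm(b["street"])

def probabilistic_score(shared_sessions: int, total_sessions: int) -> float:
    """Crude co-occurrence score for two devices seen together; a real
    system would compare this against a calibrated threshold."""
    return shared_sessions / total_sessions if total_sessions else 0.0
```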

Data sharing:

  • What access methods are provided to retrieve CDP data?
  • What steps, if any, are needed to expose new data elements for sharing?
  • How quickly can individual profiles be returned via API call?
  • Can the system perform calculations, such as a predictive model score, in real time based on an API call?

Many CDPs provide functions beyond profile building. Major categories are listed below.

Any of these might be provided by a specialist system. Companies that have adequate specialist solutions in place will not usually want to replace them with the CDP's version. But companies without an adequate solution may want to assess whether the CDP provides an acceptable option, recognizing the advantage of minimizing the number of separate systems the company must buy, integrate, and train employees to use.

  • Analytics: reporting, exploratory and descriptive analytics, data visualization, predictive modeling, next-best-action identification, product recommendations, send-time optimization, marketing results attribution.
  • Campaigns: audience or segment selection, multi-step campaign flows, real-time interactions, segment-level content selections, test splits.
  • Personalization: individual-level message and offer selection, data retrieval, dynamic content execution.
  • Orchestration: message selection across multiple campaigns, individual-level channel selection, channel-specific formatting, connectors to channel delivery systems.
  • Delivery: channels supported, personalization features, content creation and management, volumes supported.
  • Operations: order processing, shipping, product delivery, billing, sales, support, etc.

Delivery and operations systems in particular are major products in their own right. Most companies would start by selecting systems that meet their core delivery and operations needs, and then assess whether the built-in CDP is adequate or a separate CDP is needed.

Discover our white paper "Time to invest in a Customer Data Platform", written with the CDP Institute!

Identifying & reaching out to consumers across channels

Whether on the website, in the store or via email, every channel which enables consumers to communicate with a company generates customer data. However, each of these channels provides only a fraction of the information about the consumer as a person. Only when the various details are amalgamated does a uniform picture emerge – like when doing a puzzle. This is where Identity Management (IM) comes into play. An IM solution can be used to link all customer data across all channels and devices, online and offline. This allows companies to successfully identify and assign individual consumers in the correct manner.

Marketers can use this information to gain a more in-depth understanding as to who their customers are and what spurs them into action. This is how they manage to interact with them more effectively. This not only increases the success of a marketing campaign, but also the ROI from the advertising expenditure.

The following sections show how identity management works and how it can help the marketing department do its job.

Establishing connections: aggregating customer data to form an overall picture

Customer data is often scattered throughout the company and is handled in isolation in the various departments. This makes it almost impossible to identify a consumer across all interactions and channels and to assign data already available.

Identity management solutions recognise customers based on specific identifying characteristics ("identifiers"). These can be, for instance, an email address used to log in to a website from different devices, or the customer number used in the store, which appears on the customer card.

IM compares the different identifiers across the various devices and contact points and merges them to form a uniform consumer profile. This is how a complete overview of the customer is produced from a large volume of partial data. Based on a single characteristic, a user can be recognised anywhere – both online and offline.

To be able to do this, an identity management solution must be connected to all relevant systems in the company, for example to the CRM (Customer Relationship Management) system, which stores all customer information, and to the backend system, which stores, for instance, shipment information for orders. The finance system, which contains customers’ payment details and history, as well as all the marketing channels, the Contact Centre and the online store must also be connected to the IM solution.

Crunch point: terminating connections

Two approaches have proven successful in identifying users:

  • Deterministic: this merges only those identifiers which are uniquely assigned to the same user, such as the customer number and the account number.
  • Heuristic: this approach, on the other hand, operates on the basis of probabilities. This means that attention is focused on such factors as the users’ browsing behaviour and their location data, Wi-Fi ID, IP address, as well as characteristics which are based on personal data such as interests, gender and age and are consistent across different devices.
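The heuristic approach can be pictured as a weighted score over shared signals. The sketch below is a hedged illustration: the signal names, weights and threshold are all assumed for the example, not taken from any real matching engine.

```python
# Hypothetical sketch of heuristic (probability-based) matching:
# each signal shared by two device profiles adds a weight, and the
# pair is treated as the same user only above a threshold.

SIGNAL_WEIGHTS = {  # illustrative weights, not real values
    "wifi_id": 0.4,
    "ip_address": 0.2,
    "location": 0.2,
    "interests": 0.1,
    "age_gender": 0.1,
}

def match_score(profile_a, profile_b):
    """Sum the weights of all signals on which both profiles agree."""
    score = 0.0
    for signal, weight in SIGNAL_WEIGHTS.items():
        if signal in profile_a and profile_a.get(signal) == profile_b.get(signal):
            score += weight
    return score

def is_same_user(profile_a, profile_b, threshold=0.6):
    return match_score(profile_a, profile_b) >= threshold

phone = {"wifi_id": "home-ap", "ip_address": "203.0.113.7", "interests": "cycling"}
laptop = {"wifi_id": "home-ap", "ip_address": "203.0.113.7", "interests": "cycling"}
office_pc = {"wifi_id": "office-ap", "ip_address": "198.51.100.9"}
```

With these assumed weights, the phone and laptop (same Wi-Fi, IP and interests) clear the threshold, while the office PC does not. A deterministic approach would instead merge only on an exact shared identifier such as a customer number.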

Regardless of whether a deterministic or heuristic approach is used, an IM solution must consolidate all the identifiers from the different systems and channels. However, it is also important that it can terminate these connections again. When using a customer card, for example, which is linked to an email address, all contact points merge into a single user profile. However, if several people use the same device, this affects the results. For instance, a daughter lends her tablet to her mother, who logs in to an online store using her own access details. This results in two different logins for one device. The challenge here is to identify the mother’s session and terminate that connection again: she only logged in once and is not part of the daughter’s user profile.

In addition, it must be possible to take time parameters into account if, for example, no one has logged in again via the device for a long time. This suggests that the device has been given away or scrapped, in which case it should be classified as irrelevant to the user profile. The type of device is also a factor. For instance, a smartphone is generally used only by a single person, whereas a desktop PC is often used by several people in a household.
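The time- and device-type-based rules described above can be sketched as a simple pruning pass. The thresholds below are illustrative assumptions (a real IM solution would tune them per market and device category).

```python
# Hedged sketch: drop device links that have been inactive too long,
# with a shorter grace period for device types that tend to be shared
# (e.g. desktop PCs). All thresholds here are illustrative.

from datetime import datetime, timedelta

INACTIVITY_LIMITS = {  # assumed thresholds per device type
    "smartphone": timedelta(days=365),
    "desktop": timedelta(days=90),
    "tablet": timedelta(days=180),
}

def prune_stale_devices(devices, now):
    """devices: list of {'id', 'type', 'last_login'} dicts.
    Keeps only devices seen within the limit for their type."""
    kept = []
    for device in devices:
        limit = INACTIVITY_LIMITS.get(device["type"], timedelta(days=180))
        if now - device["last_login"] <= limit:
            kept.append(device)
    return kept

now = datetime(2022, 6, 1)
devices = [
    {"id": "d1", "type": "smartphone", "last_login": datetime(2022, 5, 20)},
    {"id": "d2", "type": "desktop", "last_login": datetime(2021, 11, 1)},  # stale
]
active = prune_stale_devices(devices, now)
```

The stale desktop is removed from the profile, while the recently used smartphone is kept.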

Strong customer approach via IM

IM also proves to be extremely useful when it comes to approaching customers:

Individual customer requirements: IM not only makes it possible to merge different identifiers, but it also incorporates information about marital status, profession, hobbies and place of residence. This creates a complete picture of the customer, which goes far beyond cookies, an email address and customer number. For example, if the finance system registers that a customer always chooses to pay by instalments, this option is always presented to them. This increases the likelihood of a purchase being made.

Consistent approach: since IM supplies the information from all channels bundled together, marketers can offer consistent storytelling to their customers. Consumers are approached with the same message across all channels. For instance, if someone buys a washing machine via a desktop PC, not only is advertising for other washing machines prevented on this device, but on all devices and channels. In fact, advertising for detergents follows instead.

Frequency capping: providers can control how often users can see a particular ad banner per day, week or month. On the one hand, this is intended to increase visitors’ attention and, on the other hand, to prevent them from finding the advertising intrusive.
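The capping logic amounts to counting impressions per user and ad within a rolling window. A minimal sketch, with an in-memory store standing in for whatever a real ad server would use:

```python
# Minimal frequency-capping sketch: count impressions per (user, ad)
# within a rolling window and stop serving once the cap is reached.
# The class name and in-memory store are illustrative assumptions.

from collections import defaultdict
from datetime import datetime, timedelta

class FrequencyCap:
    def __init__(self, max_impressions, window):
        self.max_impressions = max_impressions
        self.window = window
        self.impressions = defaultdict(list)  # (user, ad) -> timestamps

    def should_serve(self, user_id, ad_id, now):
        key = (user_id, ad_id)
        # Keep only impressions still inside the rolling window
        self.impressions[key] = [
            t for t in self.impressions[key] if now - t < self.window
        ]
        if len(self.impressions[key]) >= self.max_impressions:
            return False
        self.impressions[key].append(now)
        return True

cap = FrequencyCap(max_impressions=2, window=timedelta(days=1))
t0 = datetime(2022, 1, 1, 9, 0)
first = cap.should_serve("user-1", "banner-A", t0)
second = cap.should_serve("user-1", "banner-A", t0 + timedelta(hours=1))
third = cap.should_serve("user-1", "banner-A", t0 + timedelta(hours=2))
next_day = cap.should_serve("user-1", "banner-A", t0 + timedelta(days=1, hours=3))
```

The third impression within the same day is refused; once the window has rolled past, the banner can be served again. Note that this only works across devices if an IM solution has first stitched the user’s devices into one profile.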

Channel substitution: IM allows marketers to see how, for example, users interact with an article in the online store. If they visit the product page regularly, no retargeting is required to draw their attention to the article. In this instance, a simple pop-up banner with a discount coupon will perhaps push them to buy it. In other words, it saves marketing costs to use the inexpensive channels (Messenger) to start with. If this action is unsuccessful, only then will the expensive channels be resorted to, such as the call centre.

Privacy Barometer 2022: consent collection practices are becoming increasingly standardised

One year after the latest directives from France’s data protection authority were enacted, most websites are now equipped with a consent banner. Collection mechanisms are gradually becoming standardised, but certain best practices relating to the design, look & feel and user experience are continuing to prove their ability to ramp up performance.

Paris, 31 May 2022 – Commanders Act, the publisher of a cookieless marketing platform, has unveiled the fifth edition of its Privacy Barometer, which provides insights into the performance levels of the mechanisms that its customers have implemented to collect consent in accordance with the GDPR. The findings of the 2022 edition of the Barometer reveal that the opt-in share, i.e. the number of consents vs the number of users who expressed a choice, was down slightly on 2021, but still high (average of 74% for desktop users). Although the vast majority of websites in France now use a consent banner in accordance with the data protection authority’s recommendations, an examination of their banner’s performance reveals a number of disparities.

To produce this barometer, Commanders Act analysed 285 consent banners on websites representing the main sectors of activity in the French market between 1 and 31 January 2022.

Consent collection banner: increasingly standardised practices

Since 31 March 2021, France’s data protection authority (CNIL) has required companies that are active on the internet and mobile devices to comply with the guidelines governing consent collection and cookie management. In particular, these directives impose the use of an explicit mechanism for collecting consent, featuring at least two separate action buttons of the same level (“Accept” and “Refuse”).

One year after the new directives were implemented, companies have widely rolled out their explicit consent banners. Most have gone for a pop-in banner, which requires users to make a choice before they can continue browsing on the website. Web visitors are well used to seeing this type of mechanism, meaning that they are more adept at understanding and dealing with banners than before.

“The market is tending to align with the same banner model and offering few innovations,” explains Michael Froment, CEO of Commanders Act. “As with any practice entering the mainstream, it is accompanied by a slight fall in performance, with an average opt-in share of 74% for desktops compared to 81% in 2021, and an average consent rate of 45% for desktops compared to 55% in 2021.”

However, not all mechanisms offer the same performance: “We’ve seen that some characteristics relating to the design, look & feel and user experience have a tremendous influence on the opt-in rate,” stresses Michael Froment at Commanders Act. “The best practices that were already effective in 2021 are still effective today, and new models deserve to be explored and tested.”

User experience quality: key to ramping up the consent rate

The best opt-in rates are achieved with a banner model that includes a “Continue without accepting” button or link, and an “Accept” button. The three-button model (“Accept”, “Refuse” and “Configure”) is very effective for websites with a premium brand image. Choosing an appropriate colour for the “Accept” button still plays a major role in gaining the user’s consent. “For example, using red for the “Accept” button, even if red is part of the company’s style guide, will clearly cause the consent rate to plummet,” explains Michael Froment.

It is vitally important to use a clear scenario and avoid forcing users to click several times if they do not wish to provide their consent. This practice is not only skating on thin ice when it comes to complying with CNIL’s recommendations, but it also leads to a catastrophic bounce rate that can reach 70% in the worst cases. Another pitfall is requiring users to give their consent whenever they change domain, which often happens with multi-domain websites. “Only ask the question once! Repeatedly asking for consent is guaranteed to aggravate users and lose their consent,” advises Michael Froment.

Finally, even though companies are tending to use the same type of model, some websites are taking a more original approach with creative banners that are ultimately more engaging. “Although our recommendation is to prioritise sober and classic designs, some companies have managed to strike the right balance between the legal obligation for collecting consent and their brand’s defining features and personality,” explains Michael Froment. “It’s an extremely interesting approach to the subject and it could inject new driving force into the market and help companies achieve great performance, but they really must carry out A/B testing to make sure that their banners are effective. Unless they can perform that type of test, the conventional approach is currently the best choice.”

To see the infographic for the 2022 Barometer and the best practices for improving privacy banners, click here


White Paper – How do you prepare for the cookieless world?

And why the Server-Side model will become your greatest ally

… and how to prepare for a cookieless world. Essential and fragile in equal measure: that’s exactly how the status of data could be summed up in 2022. Data are essential: take away data, and digital operations effectively lose their power of sight, whether disseminating an advertising campaign, orchestrating an anti-churn scenario featuring a combination of web, emails and a call centre, or customising a website. But data are also fragile: over the past 10 years they have become scarcer and harder to collect (the development of ad blockers, the entry into force of the GDPR, cookie hunting with ITP, etc.).

From a legal perspective, we’re entering an era of consent, while from a technical point of view, we’re heading into a cookieless era.

This development raises the legitimate question of what we can do to make this new era compatible with data-driven marketing practices. In other words, how can marketing teams continue performing their data-driven actions in this new digital landscape? One of the answers, which has already been expressed and shared by the digital industry, involves switching over to the server-side model. Server-side is nothing more than a new technical method for collecting data in a digital world shackled by an increasingly stringent set of requirements. It also gives marketing teams the ideal opportunity to nurture and take greater care of their data and ultimately do more with less.

In this White Paper you will learn about:

  • Seven myths about Server-Side
  • Server-Side: What advantages does it offer?
  • Five actions for getting ready to embrace the Server-Side Model

Discover the Commanders Act Customer Data Platform


The benefits of the Server-Side model

There are two ways to collect the signals that customers send out when visiting a website or using a mobile app. The first method involves getting the browser to execute small fragments of code (tags) when loading pages.

These tags collect and send the data (origin, content, profile, etc.) to the partners who have been authorised by the customer. Therefore, with the first method, everything relies on the browser. The second approach involves sending the data to the server, which will take care of supplying the data directly to the different partners’ servers according to a set of rules. In this case, data are collected, converted and shared from server to server without any involvement from the browser.
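The server-to-server approach can be pictured as a single event arriving at the first-party server, which trims and routes it per partner according to declarative rules. The sketch below is a hedged illustration: the partner names, rule format and event fields are assumptions, and a real platform would forward the payloads over HTTP to partner endpoints rather than just building them.

```python
# Hedged sketch of server-side dispatching: one event from the browser,
# one trimmed payload per partner, routed by declarative rules.
# Partner names and rules are hypothetical.

ROUTING_RULES = [
    {"partner": "analytics", "fields": ["page", "user_id"]},
    {"partner": "ad_platform", "fields": ["page", "campaign"]},
]

def dispatch_event(event, rules=ROUTING_RULES):
    """Return one trimmed payload per partner, so no partner tag in the
    browser ever sees the full event."""
    payloads = {}
    for rule in rules:
        payloads[rule["partner"]] = {
            field: event[field] for field in rule["fields"] if field in event
        }
    return payloads

event = {"page": "/checkout", "user_id": "u-123", "campaign": "spring", "cart": 3}
payloads = dispatch_event(event)
```

Each partner receives only the fields the rules allow: the analytics partner never sees the campaign, the ad platform never sees the cart contents. This is also what makes piggybacking impossible, as discussed below.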

Server-side has become the hottest topic of discussion among digital teams when gathered around the coffee machine. Is there any justification for its popularity? Let’s take a look at the opportunities that a server-side strategy can bring.

  • Quality of collected data: the server-side model ensures that the data collection process is centralised on a server, instead of being distributed by each browser. An effective server-side platform checks the quality of the data transiting over the server and corrects any identified errors on the spot, rather than waiting for the tech-ops team to develop a patch. All the data teams value this feature since data cleaning is a major challenge. The same process can also be performed with a tag-based approach, but it weighs down the container so much that it can affect the customer experience while making it much harder to install patches. Improving the quality of the collected data also increases the quality of the data that are transmitted, which ensures proper performance from the solutions receiving the data. In terms of reliability, meanwhile, server-side processing is also a way of reducing the discrepancies sometimes observed (due to browser constraints) between analytical and transactional data.


  • Team agility: you are all familiar with the term “code freeze”, those times when no changes can be made to the site for fear of creating risks during critical periods. With a server-side approach, it is always the same data streams supplying the first server. When those streams need to be modified, corrected or transformed, the source code is left alone, since the data are manipulated asynchronously when transiting over the server. Patches can now be installed at any time, extra data can be sent, and data can also be modified. This is an especially useful advantage.


  • Greater control over data processing compliance: just a single query to the server is required, which will subsequently process, adapt and divide the data among the different partners. The website publisher is therefore able to guarantee visitors that the stated data processing rules really will be applied. This will give the compliance teams an extra safeguard. Piggybacking is no longer possible.


  • Operational quality of the site: this is one of the most important topics in the digital ecosystem, since all websites and mobile apps obey an industrial set of rules governing scalability and quality. There is nothing harder for a tech-ops team to deal with than a large mixed bag of tags with varying levels of compatibility, some of which have been deployed by third parties without the necessary precautions. The most common example involves slow loading pages, which are caused by large chunks of JavaScript code, JavaScript conflicts and security alerts in conversion funnels due to an unsafe tag.


  • Improved website performance: reducing the volume of scripts needed on websites lowers the risk of affecting the customer experience, speeds up page loading and allows techs to comply with internal standards and sometimes objectives. We know just how much performance powers the customer experience. A one-second delay can mean a 7% reduction in conversions (Strangeloop).


  • Breaking free from the technical constraints linked to browsers: ad-blockers, with their blacklists that block calls to certain services from the browser, or simply cookie filtering mechanisms like Apple’s Intelligent Tracking Prevention (ITP). With server-side, calls are made from the server, beyond the reach of the ad-blockers. And as the service invoked on the server side can be hosted on a sub-domain of the website (rather than on a third-party domain), it is not intercepted by ITP-type mechanisms. A word of caution however… the server-side model does not exempt organisations from their obligation to comply with consent collection rules. Contrary to what we might hear, server-side does know how to share a consent signal.


  • Team productivity: Teams discovering server-side for the first time are sometimes caught off guard, since they did not think that a platform could help them take back control of the data streams. Ultimately, a server-side platform simplifies a number of data-based operations which, without server-side, are either flat-out impossible or extremely difficult to perform and require coordination between several teams. Some of the highly useful features that will turn you into a digital traffic controller include real-time quality control combined with alerting, real-time data enrichment with scores (CRM, predictive, etc.) and control over both data sources and destinations.


  • Separation between collection and sharing duties: this is undoubtedly the most revolutionary benefit. Although data are still collected on the device, since that is where users interact with the website, mobile app or smart device, data transmission and the strategy for sharing and circulating data happen on the server. The JavaScript in the webpage can be trimmed down to the bare essentials, which will lighten the website, require less energy from the browser and deliver content faster. In many cases, code freezes are a thing of the past, and the teams can continue optimising their campaigns without having to interact with the website’s source code.


  • Security and privacy: the server-side philosophy provides your users with a superior level of security and confidentiality by preventing any risk of their data being intercepted. The GDPR introduced the idea that brands do not own the data but are merely “leasing” the data. In addition, the risks are high. As such, building trust among users is even more essential, and any technology that protects access to data is heading in the right direction. Less piggybacking, fewer JavaScript files that reconfigure on the fly… these are just some of the advantages that should prove popular with the regulator.

Server-side: 10 myths debunked

In the wake of the cookieless era, the “server-side” model (i.e. server-based data collection) is steadily gaining traction among digital marketing teams in their campaign roadmaps. This new development has raised a number of question marks, doubts and fears, and has even spawned a few myths…


Myth #1: “With the server-side model, tags are dead and so is tag management”

It’s true that the server-side model is also known as a tagless process, so it’s easy to believe that tag management is dead. But technically speaking, the reality is a lot more complex. Not all solutions are eligible for server-side. Server-side may already be compatible with consent management and analytics, but in practice it’s harder to implement for ad servers and personalisation solutions. The transition period will take some time, so until that time comes, both methods (client-side and server-side processing) will co-exist. Therefore, we haven’t seen the end of tag management yet. This also gives us the time to deal with the inevitable changes to the profession… we’ll come back to this topic later.


Myth #2: “Server-side is a job for developers”

The emergence of the server-side model has ushered in the feeling that the whole tracking deal has changed sides, namely that it’s been taken out of the marketing teams’ hands and given to the developers. Let’s be clear, this is a (major) shortcut. Admittedly, implementing tagless integrations is a technical job. That’s why the whole process will not be achieved in a day, but instead spread over a few years. But in this case, we’re talking about implementing integrations using the solutions available in the market. Once the solutions are up to speed (i.e. tag managers – a new name will probably be required – and partners’ solutions), server-side will remain in its rightful place behind the scenes. So marketing managers are still responsible for working on the data collected.


Myth #3: “Thanks to server-side, part of my job will go up in smoke”

No tags = the end of tag management and… the associated jobs? Do traffic managers and tagging specialists have any need to be worried? Not really. It’s all a matter of defining “tag management”. If we consider that the primary aim with tag management is placing tags in containers that are executed in a browser, then yes, that breed of tag management will be phased out.

However, if tag management is also seen as a way of processing data before they are passed along to partners, then tag management is far from having one foot in the grave. On the contrary, a new era is dawning. The server-side model opens up a wealth of possibilities, whether checking data quality, enriching data or distributing data to partners with greater precision. The bottom line is that server-side is a real catalyst for stimulating creativity when it comes to data.


Myth #4: “Server-side is a black box; basically we’re losing all control”

This feeling is perfectly understandable. With browser-side tag management, you get the impression that accessing collected data is a bit like reading from an open book, whereas server-side tag management seems to have slammed that book shut. But appearances can be deceptive. In practice, everything is still under complete control and perfectly accessible. The tag manager continues to be the hub where data are collected, processed and sent to partners. Instead of dealing with tags, the solution manages tagless integrations from server to server. Those integrations can still be seen and manipulated.


Myth #5: “Server-side is managed in-house using bespoke developments”

This is often how new technological practices and features are launched, especially if some sections of the market have not yet got to grips with this new challenge (and this still applies to server-side). As such, it might be tempting to rely on in-house developments to control data exchanges between servers. But give in to the temptation and you’ll lose sight of what really matters. The aim isn’t just to implement the server-side model, but also to hide all the complex technical mechanisms involved, so that the marketing teams can focus on their daily activities with greater agility. When it comes to this challenge, a software product will always be more robust than a bespoke development.


Myth #6: “Server-side is going to be expensive…”

Since server-side requires all stakeholders to overhaul their technical infrastructure, dedicated budgets will admittedly be required. But the costs should be weighed against the benefits reaped. There will be a drastic fall in the number of tags that browsers have to manage for a given website, which will improve web performance and consequently enhance the user experience. Above all, switching over to a server-side model improves quality control and enriches data beyond the capabilities of a client-side arrangement. The process of migrating to the server-side model is in the teething stages, and its potential is still widely underestimated. It could potentially mean that partners will receive less data, but the data will be much more relevant.


Myth #7: “Server-side is a real SPOF”

In IT speak, a SPOF or “Single Point of Failure” means that the availability of an entire system depends on the availability of just one of its components. If that component fails, the whole system goes down. Since all transactions are carried out between servers with the server-side model (and not from browser to server), this raises questions about the ability of the server managing the transactions to shoulder the load. As a result, infrastructures will need to be scaled up to address two challenges, i.e. collect all the hits, and process / share the data. The good news is that tremendous experience has already been acquired in this particular area. For example, we know that processing activities can be buffered (temporarily put to one side) in case of an unexpected peak in the load, so that they can subsequently be processed in batches.
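The buffering idea mentioned above can be sketched with a simple queue: collection stays cheap and never blocks on downstream processing, and the backlog is drained in batches. In production this would be a durable queue (Kafka or similar), not the in-memory structure used here for illustration.

```python
# Sketch of peak-load buffering: hits are appended to a queue at
# collection time, then processed in batches. An in-memory deque
# stands in for a durable production queue.

from collections import deque

class HitBuffer:
    def __init__(self, batch_size):
        self.queue = deque()
        self.batch_size = batch_size

    def collect(self, hit):
        # Collection is cheap and never blocks on processing
        self.queue.append(hit)

    def drain_batch(self):
        batch = []
        while self.queue and len(batch) < self.batch_size:
            batch.append(self.queue.popleft())
        return batch

buffer = HitBuffer(batch_size=3)
for i in range(5):  # an unexpected peak: 5 hits arrive at once
    buffer.collect({"hit": i})
first_batch = buffer.drain_batch()
second_batch = buffer.drain_batch()
```

No hit is lost during the spike; the backlog is simply worked through in order once capacity is available.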


Myth #8: “Thanks to server-side, I don’t have to worry about consent”

This misconception should be cleared up as quickly as possible. Server-side is a technical process for collecting and processing data. It doesn’t make the slightest difference to the precautions that should be taken to comply with the GDPR (General Data Protection Regulation) and the consent collection directives issued by the data protection authority. Irrespective of whether information is sent from a browser or a server, consent must be obtained from the user.
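In practice this means the consent signal must travel with every hit, and the server must filter destinations by the purposes the user accepted. A minimal sketch, with hypothetical purpose names and partner mapping:

```python
# Hedged sketch: even server-side, partners only receive data for
# purposes the user consented to. Partner and purpose names are
# illustrative assumptions.

PARTNER_PURPOSES = {
    "web_analytics": "analytics",
    "retargeting_network": "advertising",
}

def forward_hit(hit, consented_purposes):
    """Return the list of partners allowed to receive this hit."""
    return [
        partner
        for partner, purpose in PARTNER_PURPOSES.items()
        if purpose in consented_purposes
    ]

hit = {"page": "/home", "user_id": "u-9"}
allowed = forward_hit(hit, consented_purposes={"analytics"})
```

A user who accepted analytics but refused advertising has their hit forwarded to the analytics partner only; with no consent at all, nothing is forwarded.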


Myth #9: “Setting up a consent process with server-side is complicated”

If consent management practices are based on an in-house development that hasn’t been designed to accommodate the server-side model, then migrating could be a painful experience. This doesn’t apply to solutions like TrustCommander, our Consent Management Platform, whose architecture has been designed from the ground up to embrace the server-side model. Therefore, propagating consent among partners with a server-side arrangement is painless.


Myth #10: “Server-side is bound to tighten up data confidentiality”

When it comes to security, nothing can be taken for granted. Just because interactions are managed from server to server doesn’t mean that they’ll naturally be more secure. It will be true if infrastructures are audited and secured in accordance with best practices, and if traffic between servers is secured with an encryption mechanism that is worthy of the name. Depending on the technical arrangements made, and even with server-side, it’s worth pointing out that a data layer may linger in the browser and therefore leave data exposed. If data are sensitive, the server-side model can be implemented without requiring a data layer. Therefore, the security of a server-side arrangement requires a set of measures… and choices.

Commanders Act announces the launch of a new innovation cycle

In response to the changes sweeping the digital ecosystem and customers’ expectations for managing and using data, a technological transformation is required to ensure a seamless digital marketing service.


Paris, 15 March 2022 – Commanders Act, the publisher of a cookieless marketing platform, takes a look back over the last 10 years and goes on record as announcing that 2022 will be a game-changer for innovation in the data market. The waves of transformations that have successively hit the digital landscape, especially tighter regulations and practices governing data collection and use, have ushered in a new set of challenges and requirements for organisations. Aware of the major impact that these recent changes have had on marketing campaign performance and continuity, and the growing number of organisations clamouring for increasingly sophisticated solutions and technological convergence, Commanders Act is currently launching a new innovation cycle by using a new platform to provide its customers with the ability to take back control of their campaigns and nurture those campaigns with enriched and transformed data to create even greater value.

Ad blockers, cookies, consent… how the last 10 years have shaken up the digital ecosystem

Ever since its inception in 2010, Commanders Act has always been focused on its mission of putting business teams back in the data driving seat, so that they can acquire the independence required to ramp up their performance. For nearly 10 years, the company has been supporting its customers in growing their digital maturity by delivering the expertise and solutions that they need to bring greater structure to their campaigns, define and roll out an effective tag management strategy, raise the agility bar and supercharge their performance.

Although data have reinforced their reputation in recent years as a key driving force for superior performance, new constraints have also emerged for collecting and using those data. The digital ecosystem has been significantly weakened by the advent of ad blockers, the declared war on third-party cookies in browsers and obviously the GDPR and new consent collection rules.

“Back in 2010, we had plenty of data, but we didn’t know how to use them effectively; now we know how to use those data, but their volume is shrinking,” explains Michael Froment, CEO and Co-Founder of Commanders Act. “However, data control is always the key to business performance. The question now being asked is how can marketing teams ensure continuous digital marketing in a cookieless environment shackled by an increasingly stringent set of requirements.”

Converge strategies and technologies to address the new challenges facing the market

The accelerated digitisation of European organisations, combined with the transformation of their markets, has spawned a new range of requirements. Questions relating to data governance, private data protection, team agility and the impact on the user experience have now become critical topics.

“These topics already existed some 10 years ago, but they have since come to the fore, especially for kick-starting growth after the two-year health crisis,” advises Michael Froment. “Organisations have improved their level of digital maturity. The business, IT and legal teams used to be at loggerheads, but now it has dawned on them that they need to converge their objectives, strategies and tools.”

Organisations require a consistent, standardised and more effective technology strategy, so that their business teams can collect and control even more data, while aligning their efforts with the real issues and constraints facing the IT and legal teams, with the aim of boosting the performance of their digital campaigns and ultimately improving their business results.

In 2022, Commanders Act is changing its game plan and launching a new technological innovation cycle

This retrospective and introspective analysis has prompted Commanders Act to overhaul its range of services. “We’ve made the most of the last two years to take a step back, listen to our customers, understand their new expectations and think about how we can best respond to those needs,” explains Michael Froment. The result is a new technology platform where all of Commanders Act’s expertise will converge.

“Our areas of expertise and our functional scope remain the same as in 2010, but we are transforming them to reflect the new technologies, the new market standards and the changes shaping the ecosystem,” adds Michael Froment. “This new platform heralds the start of a new growth and innovation cycle that builds on the trends in the market and changes in our customers’ needs, and which offers new opportunities for collecting and enriching data with the objective of wringing even greater business value out of those data. It may be a disruptive technology, but it still remains true to our promise.”

This change in approach has also led to a change in visual identity, which reflects both the new technology platform spearheaded by Commanders Act and the continuity in the expertise and values championed by the company since its creation. With its new style guide, revamped logo and new website, Commanders Act is going for a more modern identity that conveys the cohesion, innovation, performance and intelligence that have come to define the brand and which have been the driving force behind its growth and success for more than 10 years.

Consent Management in Europe – an overview of the situation in Germany, France, Italy and the UK

When it comes to data protection, marketers are also greatly concerned about the issue of consent management. Especially for internationally orientated companies and their marketing departments, it is often not easy to obtain an accurate overview of applicable data protection regulations in different countries. One thing is clear: the GDPR is the benchmark in this area.

We are also going to take a look at the different interpretations of the GDPR, as well as the data protection regulations applicable in Germany, France, Italy and the UK.

Consent is the new hub for data-driven marketing

The topic of consent continues to gain momentum: the importance of obtaining consent in compliance with data protection regulations, and of handling it correctly, is still growing. Companies and their marketing teams cannot avoid dealing with this topic. They have to find solutions to manage the consent obtained, and even to be able to provide proof of it in an emergency.

In this regard, companies have to face the problems and issues relating to consent. How can the apparent contradiction between performance-driven online marketing and the special protection of personal data be mitigated by privacy-compliant opt-ins? Brands that operate internationally also face the challenge of having to comply with the relevant local regulations.

  • How does the Federal Supreme Court interpret data protection in its case-law in Germany?
  • What does Germany’s new Data Protection Act include?
  • What is the CNIL all about in France?
  • What has Italy decided?
  • How is data protection regulated in non-EU Member States such as the UK?

By answering these questions and focusing in particular on data protection in marketing departments, the foundation can be laid for an online marketing activity which is not only performance-driven, but also regards consent management as a key consideration.

Consent Management in Germany

In Germany, data protection and the associated consent management are based on the right to informational self-determination granted in the Basic Law. In this context, on 20 May 2021, the Bundestag (German Parliament) adopted a draft law entitled the “Telecommunications and Telemedia Data Protection Act” (TTDSG), which aims to amend the Telecommunications Act (TKG) and the Telemedia Act (TMG), thereby aligning both laws with the EU’s ePrivacy Directive and the GDPR. This step became necessary due to, among other things, the judgment issued by the Federal Supreme Court on 28 May 2020, according to which the EU Cookie Directive had not been fully transposed into applicable law. The TTDSG is due to come into force on 1 December 2021. Breaches will be punishable by fines of up to EUR 300,000.

With regard to consent management, the opinion of the Bundesrat (German Federal Council) concerning § 24 of the draft TTDSG is extremely interesting: while it welcomes the stronger alignment of § 24 of the draft TTDSG with Article 5(3) of the ePrivacy Directive, it at the same time recommends a simple cookie-banner design with just two buttons, “Consent” and “Reject”, as “expedient”. This is remarkable in that Article 5(3) of the ePrivacy Directive requires that storing information, or gaining access to information stored, in the terminal equipment of a subscriber or user “[…] is only allowed on condition that the subscriber or user concerned […] has given his/her consent”, having been provided with clear and comprehensive information. The CJEU ruled in the same sense in its 2019 Planet49 judgment, making it clear that explicit consent is not given via a pre-ticked checkbox or via generalised, multi-cookie information banners such as “Continue browsing and enjoy the benefits of our website”.

It therefore remains to be seen whether and in what form the Bundesrat’s recommendation fits with the tendency of the CJEU and the Federal Supreme Court towards a strict interpretation of the explicit opt-in, with the discussion about requiring an explicit opt-in even for technical cookies, and how this opinion could be implemented. The German Federal Ministry for Economic Affairs and Energy explains this in a press release as follows:

“With regard to cookies, the TTDSG is also intended to achieve user-friendly and competitive consent management, which should include recognised services, browsers and telemedia providers. The detailed design of these new structures is to be carried out by means of a government regulation, the successes of which will be monitored and evaluated by the Federal Government (§ 26 TTDSG).”

The “nudging” issue or even: playing tricks with the opt-in banners

The term “nudging” describes the attempt to influence website visitors in order to persuade them to give consent. These little tricks in the visual design of banners are currently possible in light of the fact that there is (still) no clear legal regulation or case-law on this.

A typical example of this practice is a banner with a large green button for “Accept all cookies” and a smaller, pale-grey equivalent for “Reject cookies”. Another recurring practice is making visitors click through a series of settings to reject cookies, while the option to accept them is available directly.

However, some recent decisions by various courts have made clear that companies should not push this too far: according to a publication from the Lower Saxony supervisory authority, “excessive” nudging invalidates consent, since the user is not given a real choice. A judgment from the Rostock Regional Court supports this position. So far, however, there have been no proceedings or fines relating specifically to nudging. Further developments in this area are certain to follow.

And what about newsletter tracking?

Another important topic in terms of consent management is the handling of data for sending newsletters. Using a double opt-in is now a familiar and widely accepted requirement. However, this raises problems when processing the data based on the newsletters sent, for example for analysis purposes.

The legal framework on tracking information such as opening the newsletter, clicking on links it contains or opening documents attached to it has not yet been regulated in detail. What is clear, though, is that consent should also be obtained separately for these processing operations. What is unclear is when the most sensible point for obtaining this consent is. The most practical option currently is to display an explanatory note at the “first opt-in”, i.e. when a user fills out the newsletter registration form. But no concrete solution has yet emerged in practice. Companies and their marketers must therefore continue to monitor this topic, as with all questions relating to consent management, in order to stay up to date with all regulations and jurisdictions.
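One pragmatic pattern is to capture the optional tracking consent as a separate flag at the newsletter registration form, and to gate any tracking on both the completed double opt-in and that flag. A minimal sketch, assuming a simple in-house data model (all type and function names are illustrative, not a real API):

```typescript
// Hypothetical consent record captured at newsletter sign-up.
// The tracking flag is separate from the subscription consent itself.
interface NewsletterConsent {
  email: string;
  subscribedAt: Date;        // first opt-in (registration form submitted)
  confirmedAt?: Date;        // second opt-in (confirmation link clicked)
  trackingConsent: boolean;  // separate consent for open/click tracking
}

// Double opt-in is only complete once the confirmation link was clicked.
function isDoubleOptInComplete(c: NewsletterConsent): boolean {
  return c.confirmedAt !== undefined;
}

// Open/click tracking may only run if the subscription is confirmed
// AND the user separately consented to tracking.
function mayTrack(c: NewsletterConsent): boolean {
  return isDoubleOptInComplete(c) && c.trackingConsent;
}
```

Keeping the two consents as distinct fields also makes it straightforward to document later which processing the user actually agreed to.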

Modern online marketing can no longer avoid the introduction of a consent management platform, given an increasingly complicated data protection environment. In this context, it is important to keep the terms “privacy statement” and “consent management platform (CMP)” clearly separate. While the privacy statement serves to fulfil the information obligations (Which data is collected and further processed? To whom may it be passed on?), the CMP is concerned with obtaining active consent in compliance with data protection regulations. A CMP therefore helps website operators with the correct design and operation of opt-in banners.

Implement your consent management with our TrustCommander professionally and in compliance with data protection regulations.

Good to know:
A/B tests are an important ally when it comes to optimising your opt-in rates.

As you can see, from a marketer’s point of view, consent management is about making consent banners compliant with data protection regulations on the one hand and, on the other, optimising them so that the opt-in rate is as high as possible. A/B tests are well suited to determining effective opt-in banner formats. Leave nothing to chance: by serving different versions over a certain period of time, analyse exactly which banner formats on your website generate the highest consent rate.
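The mechanics of such a test can be sketched as follows. This is a minimal illustration, not the implementation of any particular CMP, and all names are hypothetical: each visitor is deterministically assigned to one banner variant for the duration of the test, and the opt-in rate is simply consents divided by displays.

```typescript
// Deterministically assign each visitor to a banner variant so the same
// visitor always sees the same version throughout the test period.
function pickVariant(visitorId: string, variantCount: number): number {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple string hash
  }
  return hash % variantCount;
}

// Opt-in rate per variant: consents divided by banner displays.
function optInRate(displays: number, consents: number): number {
  return displays === 0 ? 0 : consents / displays;
}
```

Deterministic assignment matters: a visitor who saw a different banner on every page view would contaminate the comparison between variants.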

Our current privacy barometer shows, for example, that consent banners are most often displayed as pop-ins (72%) to obtain a 100% explicit opt-in. In addition, we can give you five practical tips to make your consent banners as effective as possible. The good news: these tips are based on the requirements of the French data protection authority CNIL, whose interpretation of the explicit opt-in goes even one step further than current practice in Germany.

Click here for the current privacy barometer

Consent Management in France

The French data protection authority CNIL published a directive on cookies and other trackers in October 2020, which had to be implemented by all website operators by the end of March 2021. Key aspects explained in it are GDPR-compliant consent, the phasing out of opt-out mechanisms (which, by definition, involve no explicit consent), compliance with transparency requirements, a simple option for revoking opt-ins, and the verifiability of all opt-ins. This cookie and tracker directive is very detailed, covering everything from the technical implementation to the visual design of consent management on websites.

In its directive, the CNIL focuses in particular on the timeframes for storing cookies. For example, it recommends that the lifetime of a cookie (i.e. how long a set cookie can actively collect data) be limited to a period that allows a relevant comparison of audiences, with a maximum of 13 months. The maximum period for retaining the information collected via these cookies is set by the CNIL at 25 months. A new visit to the website by a visitor who has already been tracked does not automatically extend these periods. The durations and retention periods specified by the CNIL will be reviewed periodically to ensure they remain appropriate.
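Applied in code, these two caps might look like the following sketch (the helper names are illustrative; note that a repeat visit must not simply reset the clock):

```typescript
// CNIL guidance: consent cookie lifetime capped at 13 months,
// retention of the data collected via it capped at 25 months.
const MAX_COOKIE_MONTHS = 13;
const MAX_RETENTION_MONTHS = 25;

function addMonths(date: Date, months: number): Date {
  const d = new Date(date);
  d.setMonth(d.getMonth() + months);
  return d;
}

// Expiry of a consent cookie set at `setAt`; a repeat visit does NOT
// automatically extend this date.
function cookieExpiry(setAt: Date): Date {
  return addMonths(setAt, MAX_COOKIE_MONTHS);
}

// Deadline after which the collected information must be deleted.
function retentionDeadline(collectedAt: Date): Date {
  return addMonths(collectedAt, MAX_RETENTION_MONTHS);
}
```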

The CNIL confirms its main principles, including the symmetry of consent

  • User consent is better regulated. The Internet user must now give explicit consent. Simply continuing to browse a site can no longer be considered a valid expression of consent. Consent must be given by a “clear positive act” (e.g. clicking on the “I agree” button) in order to allow the trackers to be triggered. Without this, only essential trackers can be deposited.
  • Users should be able to withdraw their consent easily and at any time.
  • Symmetry of consent. Gone are the days when you had to look hard for the “opt-out” button. From now on, it must be just as easy to refuse as to accept the deposit of cookies.
  • Individuals must be clearly informed of the purposes of the trackers before they consent, and of the consequences of doing so. The identity of all companies using cookies subject to consent should also be easily accessible to the Internet user.
  • Finally, if an Internet user requests proof of consent, the actors depositing the trackers must be able to provide valid proof of the user’s free, informed, specific and unambiguous consent.
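The last point, proof of consent, implies that every “clear positive act” is logged with enough context to reconstruct it later. A minimal sketch of such a consent receipt, with illustrative names and an in-memory array standing in for durable, tamper-evident storage:

```typescript
// A minimal consent receipt: enough to demonstrate free, informed,
// specific and unambiguous consent if a user requests proof.
interface ConsentReceipt {
  userId: string;
  timestamp: string;          // ISO 8601 time of the "clear positive act"
  action: "accept" | "reject";
  purposes: string[];         // purposes displayed and decided on
  bannerVersion: string;      // exact banner wording/version shown
}

function recordConsent(
  userId: string,
  action: "accept" | "reject",
  purposes: string[],
  bannerVersion: string
): ConsentReceipt {
  return { userId, timestamp: new Date().toISOString(), action, purposes, bannerVersion };
}

// Retrieve proof for a given user from the log.
function findProof(log: ConsentReceipt[], userId: string): ConsentReceipt[] {
  return log.filter((r) => r.userId === userId);
}
```

Storing the banner version alongside the decision is what makes the proof meaningful: it shows not only that the user clicked, but what exactly they were shown.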

Which cookies are currently considered as not requiring consent in France?

As already mentioned, a distinction can be made between technically necessary and analytics cookies (e.g. tracking or affiliate cookies). In general, according to the ePrivacy Directive, all technical cookies that are needed to operate a website do not require explicit consent. However, in order to find out exactly which cookies or trackers are classified as “technically necessary”, a closer look must be taken again at national interpretations and case-law. In France, the following types fall under the “not requiring consent” category:

  • Trackers which record the decision made by the user
  • Trackers for authentication when accessing a service and to ensure the security of the authentication mechanism, e.g. by limiting suspicious access attempts.
  • Trackers designed to store the contents of a shopping basket on an e-commerce website or to invoice the user for purchased products.
  • Trackers for customising the user interface (e.g. for choosing the language or view)
  • Trackers which enable load balancing
  • Trackers which allow paid websites to limit free access to content
  • Specific trackers for measuring visitors

The CNIL nevertheless recommends informing users about cookies exempted from consent and their purposes, even where there is no general consent requirement. Certain analytics cookies may also be exempt from the consent requirement in France. However, according to the CNIL, certain conditions must be met:

  • Analytics cookies whose tracking serves purely to measure the number of visitors can be used without consent.
  • No explicit consent is required if there is no tracking across different applications or websites.
  • Lastly, the CNIL addresses the quality of the data collected: data may be analysed without an opt-in if it is used exclusively to create anonymous statistics, is not combined with other processing operations and is not disclosed to third parties.
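These conditions can be summarised as a simple predicate. The sketch below is an illustrative simplification of the CNIL criteria, not an official checklist, and all names are hypothetical:

```typescript
// Simplified description of an analytics setup, mirroring the
// CNIL exemption conditions listed above.
interface AnalyticsConfig {
  audienceMeasurementOnly: boolean;   // purely counts visits
  crossSiteTracking: boolean;         // follows users across sites/apps
  producesAnonymousStatsOnly: boolean;
  sharedWithThirdParties: boolean;
}

// The tracker may run without consent only if every condition holds.
function isExemptFromConsent(cfg: AnalyticsConfig): boolean {
  return (
    cfg.audienceMeasurementOnly &&
    !cfg.crossSiteTracking &&
    cfg.producesAnonymousStatsOnly &&
    !cfg.sharedWithThirdParties
  );
}
```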

Consent Management in the United Kingdom

We would also like to give a brief overview of the requirements for consent management under the British interpretation. Despite its exit from the European Union, the United Kingdom is pursuing a course in data protection law that is strongly orientated towards the EU and, therefore, towards the GDPR. There are consequently few differences from the EU, especially in connection with cookies, with the exception that the Information Commissioner’s Office (ICO), as the competent authority, is not only financially strong but is also considered much stricter and more active than, for example, the German authorities currently are.

For instance, the ICO accepts only a few exceptions to the clearly explicit consent requirement. Cookies which can also be set without any opt-in must therefore be “strictly necessary” according to the narrowest possible interpretation. Examples of this are cookies which allow a shopping cart to be stored for the next session, or which are used to ensure, for example, the security of online banking. This also includes cookies which are used to support the loading of websites. Clearly not included are analytics cookies, cookies for recognising certain website visitors or cookies which collect first-, second- and/or third-party data for advertising purposes.

Analytics services – a controversial aspect throughout Europe

Probably the most controversial aspect of consent management on websites at the moment revolves around the issue concerning the need for analytics services such as Google Analytics and, therefore, at the same time, the issue of whether these services require an explicit opt-in.

In this context too, the debate is based on the already mentioned Article 5(3) of the ePrivacy Directive, which stipulates the rules for handling cookies at EU level. However, two aspects are answered differently:

  1. Does Article 5(3) of the ePrivacy Directive also apply to “cookieless” technologies?
  • Neither the Federal Supreme Court in Germany nor the CJEU at European level has taken a position on this so far.
  • France and the United Kingdom consider that Article 5(3) of the ePrivacy Directive does apply to “cookieless” technologies.
  2. Are analytics services to be regarded as “technically necessary” or “strictly necessary” under Article 5(3), second sentence, of the ePrivacy Directive?
  • Germany has not yet taken a stance on this.
  • France considers “harmless” analytics services to be “technically necessary”, but without clearly explaining what exactly is meant by that.
  • The UK takes the most stringent view and considers all analytics services to be strictly subject to consent.

The result shows that the use of analytics services without active consent is a highly controversial issue. Even though the situation is unclear, the use of analytics services without active consent also tends to be discouraged in Germany.

Consent Management in Italy

On June 10th, 2021, the Italian Data Protection Authority (Garante per la protezione dei dati personali) published new guidelines for cookie usage, following six months of public consultation on the topic of cookies.

The aim of these new guidelines is to identify the legal requirements applicable to the use of cookies and to suggest technical solutions to correctly implement these obligations and avoid sanctions.

What are the modalities?

Any website that has users based in Italy is concerned by these new guidelines.

The deadline to comply is set at January 10th, 2022.

The penalties if you do not comply with these new guidelines are as follows:

  • Omitted or inadequate information: from €6,000 to €36,000
  • Installation of cookies without consent: from €10,000 to €120,000

What are the guidelines?

1. What constitutes consent and how to collect it

  • The act of giving consent must be “free, specific, informed and unambiguous”
  • There must be a command (e.g. an ‘X’) to close the banner without giving consent to the use of cookies or other profiling techniques by maintaining the default settings.
  • Scrolling is not a clear, affirmative positive action from the user to collect consent.
  • Cookie walls are not allowed.

2. About cookie banner

  • “Accept” and “Reject” buttons are required.
  • The user’s personal data storage period must be specified.
  • The banner must contain a link to the privacy policy.
  • The user must be able to give/withdraw consent granularly according to purposes and providers.
  • Users must be able to access and edit their tracking preferences at any time after setting their initial preferences.
  • New specifications for the accessibility of cookie information for persons with disabilities.

3. Analytics and technical cookies

  • Analytics cookies require consent (unless certain conditions are met)
  • Technical cookies do not require consent

4. Validity of consent

  • Consents collected before the publication of the new Garante guidelines on cookies remain valid, provided they comply with the requirements of the Regulation and, at the time they were obtained, were recorded and can therefore be documented.
  • The banner may not be shown to users again until at least 6 months have passed since the consent was collected.
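The six-month re-prompt rule translates into a simple check before the banner is displayed. The sketch below uses illustrative names and calendar-month granularity:

```typescript
const REPROMPT_MONTHS = 6; // Garante: do not re-show the banner earlier

// Whole calendar months elapsed between two dates.
function monthsBetween(from: Date, to: Date): number {
  return (to.getFullYear() - from.getFullYear()) * 12 +
         (to.getMonth() - from.getMonth());
}

// Show the banner only if no prior consent exists, or at least
// 6 months have passed since it was collected.
function shouldShowBanner(consentedAt: Date | null, now: Date): boolean {
  if (consentedAt === null) return true;
  return monthsBetween(consentedAt, now) >= REPROMPT_MONTHS;
}
```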

5. Proof of consent

  • You need to be able to prove that consent was obtained according to the standards of the GDPR.

European Cooperation: Transparency & Consent Framework 2.0

Another important framework for consent management is IAB-TCF 2.0, which was introduced in September 2020. Consent management according to the Transparency & Consent Framework (TCF) of IAB Europe works by dovetailing processing purposes and vendors. The descriptions of these processing purposes are specified in precise detail. The TCF operates in layers, and reproducing the “legal full text” is mandatory. According to the rules, however, it is sufficient to display the “user-friendly” text on the first layer of the consent banner and the “legal full text” only on a second layer. The official text modules and their official translations may not be deviated from at any point.

In addition to the lists for publishers, content management systems (CMS), as well as advertisers and agencies, the vendor list for IAB-TCF 2.0 gives a precise overview of which e-commerce shops fall under TCF 2.0. Therefore, implementing the TCF should contribute to improving transparency in the provider jungle and to compliance with the GDPR. However, it is important to note that the TCF does not guarantee compliance with the GDPR on its own. Whether the GDPR is actually complied with when the TCF is implemented must be checked separately from a legal perspective and, if necessary, on a case-by-case basis.
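The core dovetailing of purposes and vendors can be illustrated with a heavily simplified model: a vendor may process data for a purpose only if the user consented to both. The real TCF encodes these signals in a TC string; the types below are hypothetical and only sketch the logic.

```typescript
// Simplified model of TCF-style consent: per-purpose and per-vendor
// signals that are combined before any processing may run.
interface TcfConsent {
  purposeConsents: Set<number>; // TCF purpose IDs the user accepted
  vendorConsents: Set<number>;  // vendor IDs the user accepted
}

// A vendor may process for a purpose only if BOTH the purpose and the
// vendor were consented to.
function mayProcess(c: TcfConsent, vendorId: number, purposeId: number): boolean {
  return c.purposeConsents.has(purposeId) && c.vendorConsents.has(vendorId);
}
```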

Currently, TCF 2.0 is subject to a great deal of criticism. The focal point of this criticism is mainly varying requirements for cookie banners compared to the GDPR. You can find out more about this topic here.

We can see that the situation is complex. As a globally operating company, it is necessary always to be informed of the situation in order to act in compliance with data protection regulations in all markets and to take specific local requirements into account.

Would you like to rethink your approach to consent management and introduce a consent management platform in your company or receive specific suggestions on how to design your cookie banners?

Don’t hesitate to contact us for more information!

We would like to thank Christoph Bauer from ePrivacy GmbH for his cooperation in the course of our joint webinar and in producing this article. Article initially written in July 2021.

Time to invest in a Customer Data Platform

White Paper – Time to invest in a Customer Data Platform

“Our tools give business users direct control of their first-party data.” “We provide an up-to-the-second view of all your customer data.”

Chances are high that those promises sound familiar. You’ve heard them from companies selling data warehouses, Customer Relationship Management, Master Data Management, Integration Platforms, Marketing Automation, and Data Management Platforms. All were enticing; none fully delivered on the promise.

Today, you’re hearing the same claims from Customer Data Platform systems. Why should you believe this time will be any different? We explain.

In this White Paper you will learn about:

  • Does Unified Customer Data Really Matter?
  • Why Other Systems Failed
  • Why CDP Succeeds
  • Which are the specific technical features of a CDP
  • The Value of Success
  • How to Find the Right CDP

Discover the Commanders Act Customer Data Platform

This content was written by the CDP Institute and sponsored by Commanders Act.
