The Tools Are Ready for Cookie Deprecation, But the Data is Not

By: Laura McElhinney, Chief Data Officer, MadTech

As brands and agencies prepare for a future without cookies, the most commonly asked question is which tools will work, and which will not. Without cookies as a consistent identifier, brands are eager to know how they can leverage their tech stacks in combination with their own first-party data to continue executing their marketing and advertising initiatives.

In many private conversations, it’s clear that buy-side decision makers think they have a full view of the future. There will be alternative identifiers, clean rooms, and a focus on first-party data. It will be challenging, but the common consensus is that as long as the tools work, everything will be OK in the end.

Marketers can breathe a sigh of relief, because the ad tech tools they rely on daily are indeed ready. What many don’t know is that the majority of the data needed to execute campaigns is nowhere near where it needs to be to work with the tools within the new infrastructure.

Data as a Service

Now that we’ve actually reached the point where cookie support is drying up, the industry is grasping that new infrastructure needs to be put in place in order to utilize data. The emphasis on first-party data requires clean rooms to match data sets and then reach the desired audience.

These systems rely on data, but in a privacy-first, first-party data world, data will move less and less; gone are the days of transferring a large raw data file. Users want the data queryable and accessible where and when they want it, in the platforms they need it in. With first-party data, transferring anything outside of your own infrastructure should be seen as a major security and privacy risk to the business. This means the data needs to be prepared, processed, and accessible at rest, so that anonymized insights are the only things moving. The term for this service layer is “Data as a Service” (DaaS), and it requires processing and normalizing a company’s or platform’s data to make it accessible.
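As a rough illustration of “anonymized insights are the only things moving,” here is a minimal sketch of a DaaS-style query boundary. The function name, the matching-on-emails approach, and the minimum-audience threshold are all assumptions for illustration, not a description of any specific clean room product:

```python
# Hypothetical DaaS-style boundary: the raw records stay at rest on each
# side, and only an aggregate, privacy-thresholded count crosses over.
# (Real clean rooms typically match on hashed identifiers; plain values
# are used here only to keep the sketch short.)

K_ANON_THRESHOLD = 50  # assumed minimum audience size before a count is released


def audience_overlap(brand_ids, publisher_ids, threshold=K_ANON_THRESHOLD):
    """Match two first-party data sets and return only an aggregate
    overlap count -- never the underlying records."""
    overlap = len(set(brand_ids) & set(publisher_ids))
    # Suppress small audiences so individual users cannot be inferred.
    return overlap if overlap >= threshold else None
```

The point of the sketch is the return value: an insight (a count) moves between platforms, while the identifier lists themselves never do.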

The idea behind this is that a layer exists between the data and the platform, allowing it to easily move from platform to platform and provide the insights that advertisers need to power their campaigns. This is a different kind of connectivity. 

Much of this should be powered by API communication between platforms, allowing for easy queries across the data sets. The issue is that while all of the train tracks are being laid, the actual cargo – in this case, the data – is not ready for transport. The result is that the data remains inaccessible, slowing down the marketing operations that rely on these insights. 

APIs can certainly move data. But if, for example, a marketer is trying to access IDs via an API connection within a clean room, the bigger question is whether the IDs have been linked in advance. A raw data transfer forces the receiver to analyze the data and work out the deduplicated users across all of the device IDs. If the future of advertising relies on these API connections, things need to move much faster and with far less labor.
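Linking IDs in advance is essentially identity resolution: any two identifiers observed to belong to the same person get merged, so a query returns deduplicated users rather than raw ID lists. A minimal sketch of that preparation step, using a union-find structure (the function name and input shape are illustrative assumptions):

```python
# Hypothetical pre-linking step: given pairs of device IDs known to belong
# to the same person (e.g. a cookie and a mobile ID seen on the same
# login), merge them with union-find and count deduplicated users.

def count_dedup_users(id_pairs, all_ids):
    parent = {i: i for i in all_ids}

    def find(x):
        # Walk to the root of x's group, compressing the path as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for a, b in id_pairs:
        parent[find(a)] = find(b)  # merge the two groups

    # One root per person = the deduplicated user count.
    return len({find(i) for i in all_ids})
```

Doing this work once, at rest, is what lets an API answer “how many users?” instantly instead of forcing every receiver to repeat the deduplication on a raw file.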

For the ecosystem to truly be ready, we need to avoid the intense time and cost of raw data transfers and instead focus on making the data queryable. This requires data that is already processed and can be sliced, diced, and filtered in real time.
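What “already processed and queryable” means in practice is doing the expensive pass over raw events once, then answering every slice-and-filter request from the prepared result. A small sketch under assumed names and a simplified event shape:

```python
# Hypothetical sketch: process raw events once at rest, then serve
# real-time slice/filter queries from the prepared aggregate instead of
# re-scanning (or re-transferring) the raw data each time.
from collections import Counter

def build_audience_index(events):
    """One-time preparation: roll raw (user, segment, region) events up
    to per-(segment, region) counts."""
    index = Counter()
    for user_id, segment, region in events:
        index[(segment, region)] += 1
    return index

def query(index, segment=None, region=None):
    """Real-time slice: filter the small prepared aggregate, not the raw events."""
    return sum(
        n for (s, r), n in index.items()
        if (segment is None or s == segment) and (region is None or r == region)
    )
```

The raw events never leave the owner’s infrastructure; each of the dozens of licensed platforms can hit the same prepared index with its own filters.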

If this issue is not resolved soon, it will have consequences across the ecosystem and slow down the speed of advertising. Some brands and publishers are licensing anywhere from 80 to 120 platforms to help enact their entire marketing and advertising strategy. All of those systems have different responsibilities and functions, but they all require some of the same data, and they all need to be able to share insights in a back-and-forth loop. 

There is an extraordinary amount of raw data available in the marketing ecosystem right now, and much of it will remain usable even amid cookie deprecation. However, everyone needs to go the extra mile to move data beyond its raw state, transforming it into something queryable, accessible, mobile, and usable.