In a programmatic world, clients seem to be increasingly concerned about what they call “duplication,” or the overlap of inventory or audiences across multiple demand-side platforms (DSPs). The fear is that the same ad will be served to the same user by two different DSPs, or worse yet, that the client is bidding against themselves for the same impression.
Setting aside the fact that getting the same user to see the same ad multiple times has been the goal of advertising since the invention of movable type, reports of duplication, to paraphrase Mark Twain, have been greatly exaggerated. After all, duplication is defined as “an exact copy.”
But even with the same inventory, same users and the same data, two DSPs can have two very different strategies based on what view into these variables they have and what they choose to do with those variables.
Exact Duplication Is Unlikely Given All The Inputs
For every impression that gets served in a real-time bidding environment, there are multiple inputs into the equation:
- The inventory available
- The data available to the DSP representing that potential audience member’s perceived value
- The match rate between the exchange ID and the DSP user ID or cookie
- The bidders used to assess that value and place a corresponding bid
- The optimization algorithms and techniques used to refine and extend the potential audience
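The inputs above can be sketched in a few lines of code. This is a minimal, hypothetical illustration — the names (`BidRequest`, `audience_value`, `optimization_factor`) are invented for this example and don’t reflect any real DSP’s bidder — but it shows why two DSPs seeing the identical impression can still bid very differently:

```python
# Hypothetical sketch of an RTB bid decision; all names are illustrative.
from dataclasses import dataclass

@dataclass
class BidRequest:
    inventory_id: str        # the inventory available
    exchange_user_id: str    # the exchange's ID for this user

def bid_price(req: BidRequest,
              matched: bool,          # did the exchange ID match our user ID/cookie?
              audience_value: float,  # data-derived value of this user, in CPM dollars
              optimization_factor: float) -> float:
    """Return a CPM bid, or 0.0 to pass on the impression."""
    if not matched:
        # Without an ID match, the DSP has no data with which to value the user.
        return 0.0
    # The bidder applies its value estimate, then the optimization layer adjusts it.
    return audience_value * optimization_factor

# Two DSPs evaluating the same request, with different data and different models:
req = BidRequest("site-123/leaderboard", "exch-user-42")
dsp_a = bid_price(req, matched=True, audience_value=2.00, optimization_factor=1.25)
dsp_b = bid_price(req, matched=True, audience_value=0.50, optimization_factor=0.80)
```

Same impression, same user — yet one DSP bids a $2.50 CPM and the other $0.40, purely because their view into the data and their models differ.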
With first-generation DSPs, the potential audience and the data representing that potential audience member’s perceived value are one and the same. This is because a pre-packaged audience segment, purchased from a data broker, provides no pre-impression data to the DSP.
Only after a campaign starts running — and the DSP begins collecting post-impression data on conversions, best-performing sites, optimal ad placements, most productive time of day, etc. — will the DSP be able to assess the value of a member of that segment and bid accordingly.
Go Beyond Pre-Packaged Segments
However, DSPs that have unstructured data capabilities can use pre-impression data to assess a potential audience target’s value more accurately. Based on individual data elements such as recency, keyword combinations searched, sites visited, even the context of content they have consumed, a DSP that uses unstructured data can make a much more educated bid for that user. It can also apply that same sort of surgical approach with post-impression data to build spot-on, look-alike models and amplify scarce data signals.
So, even if the same user happens to get served an ad by two DSPs, the reasons why they served the ad and the value of the bid they placed for the user might be vastly different. The only problem is, with first-generation DSPs, you can’t see any of the data that would tell you why that user was valued at that price or how they compare in value to a different user in the same segment.
For instance, if Domino’s is targeting pizza intenders through a prepackaged audience segment buy, the person who read an article about pizza two weeks ago will be valued as highly as the person who searched “pizza coupon” 10 minutes ago. Obviously, their value to Domino’s for making a timely purchase is vastly different.
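To make the recency point concrete, here is a hedged sketch — not Domino’s or any DSP’s actual logic — of how a simple decay on pre-impression intent signals would separate those two members of the very same “pizza intender” segment. The function name and half-life value are assumptions for illustration:

```python
# Hypothetical recency decay on an intent signal; names and numbers are illustrative.
def intender_value(base_cpm: float, hours_since_signal: float,
                   half_life_hours: float = 24.0) -> float:
    """Decay a user's value by how stale their intent signal is.

    Every half_life_hours that pass, the user's estimated value halves.
    """
    return base_cpm * 0.5 ** (hours_since_signal / half_life_hours)

# Both users sit in the same pre-packaged "pizza intender" segment:
searched_coupon_10_min_ago = intender_value(3.00, hours_since_signal=10 / 60)
read_article_two_weeks_ago = intender_value(3.00, hours_since_signal=14 * 24)
```

With a 24-hour half-life, the user who searched “pizza coupon” 10 minutes ago is still worth nearly the full $3.00 CPM, while the two-week-old article reader has decayed to a fraction of a cent — a distinction a pre-packaged segment simply cannot express.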
Because most first-generation DSPs are built on an arbitrage model, your actual bids are obscured. All you can see, even if you use a verification service, is that you served the same user more than once.
Could Duplication Be A Good Thing, Anyway?
I once had a client tell me that even though their campaign was producing conversions, the duplication rate concerned them. I told them we had two options:
- Find the exact people looking for their product that are somehow invisible to everyone else.
- Serve a single ad to everyone on the Web to cover all of our bases and ensure no user is ever touched twice.
All kidding aside, duplication is going to happen. To find the people looking for your product, you’re likely going to have to use multiple data sources, which might result in duplication. And serving a single ad to everyone on the Web… well, it doesn’t make a lot of sense, and most people don’t convert after a single exposure to the ad, anyway.
In fact, duplication might even be useful. Who knows, in the past we might have called it “exposure.” But as long as you can’t see the data, you’ll never understand whether duplication is costing you money or making you money.
So, it seems the industry analysts are right. You only need one first-generation DSP. But that doesn’t mean you only need one DSP. Adding an unstructured programmatic marketing platform to the mix will give you the visibility you need to know if duplication is a friend or foe.
Opinions expressed in the article are those of the guest author and not necessarily Marketing Land.