Bid Throttling: why do publishers need to optimize their programmatic request volume?

Reading time: 5 minutes

In recent years, the number of advertising requests has surged, attributed to various factors such as the widespread adoption of header bidding, the proliferation of players, and the overall growth of the market.

The way this ecosystem works tends to generate a growing number of unnecessary requests, doomed from the outset to go unfilled (e.g., market globalization, changes in operating systems and browsers, consent requirements, etc.).

Publishers need to understand the undesirable effects this phenomenon generates and the solutions available to address it without impacting revenue (yield).

What disadvantages can arise from the massive sending of requests?

The detrimental effects can be categorized into four main areas:

  • Financial Impact: Whether the auction succeeds or not, the large-scale transfer of data to data centers generates unnecessary tech costs.      
  • Environmental Impact: Servers involved in data transfers consume energy, especially as this increasingly occurs via cellular networks (3G, 4G, 5G).   
  • Web Performance Impact: The multiplication of requests affects web performance; the more calls there are, the greater the risk of slowing down the site’s page display. This latency is exacerbated when exchanges occur on the client side, i.e., through the user’s browser. 
  • Perception Deterioration: An excessive volume of unnecessary requests degrades how buyers, and the tools they use, perceive the inventory. They may decide to exclude this source of low-quality supply.

To mitigate this phenomenon, what are the possibilities for filtering advertising calls?

In response to these side effects, many publishers reduce the number of partners they connect to. While operationally simple, this strategy can negatively impact company revenue. To reconcile the reduction of unnecessary calls with revenue, the concept of 'traffic shaping' for ad calls has emerged. This refers to a publisher's ability to adapt its volume of requests, or selectively remove them, in a controlled manner.

The most common technique is called throttling, a two-level filtering process operated by SSPs:

  • On Inventory: The SSP excludes requests for publisher inventory that is of little interest to DSPs, to avoid auctioning it needlessly.
  • With DSPs: The SSP assesses the likelihood that a DSP will participate in an auction. If the chances are low, the emission of bid requests is limited for a more or less extended period.
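The DSP-side mechanism described above can be illustrated with a short sketch. Everything here is hypothetical: the class, the `BID_RATE_FLOOR` and `SAMPLE_RATE` parameters, and the probabilistic policy are invented for illustration, since (as noted below) real SSPs do not disclose their filtering criteria. The idea is simply to track each DSP's historical bid rate and, when it falls below a threshold, keep only a small sample of requests so the estimate can recover if the DSP's behavior changes.

```python
import random
from collections import defaultdict

# Hypothetical parameters, not from any real SSP.
BID_RATE_FLOOR = 0.05   # below this historical bid rate, start throttling
SAMPLE_RATE = 0.10      # always keep 10% of requests to keep measuring

class DspThrottle:
    """Illustrative per-DSP throttle based on observed bid rates."""

    def __init__(self):
        self.sent = defaultdict(int)
        self.bids = defaultdict(int)

    def bid_rate(self, dsp):
        sent = self.sent[dsp]
        # Be optimistic about DSPs we have no history for.
        return self.bids[dsp] / sent if sent else 1.0

    def should_send(self, dsp):
        # Send everything while the DSP looks responsive; otherwise keep
        # only a random sample so behavior changes can still be detected.
        if self.bid_rate(dsp) >= BID_RATE_FLOOR:
            return True
        return random.random() < SAMPLE_RATE

    def record(self, dsp, did_bid):
        self.sent[dsp] += 1
        if did_bid:
            self.bids[dsp] += 1
```

A real implementation would segment these rates much more finely (by placement, geo, format, etc.) and decay old observations, but the core trade-off is the same: suppress low-probability requests while retaining enough traffic to measure.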

It is worth noting that the criteria on which these filters operate are typically the SSPs’ secret sauce, not publicly shared.

To regain control over this filtering at the source, publishers are now being offered solutions to generate these ad call limitations. These technologies rely on machine learning to allow publishers to continuously adjust their calls.

Nexx360's intelligent filtering at the source gives publishers back control and maximizes their revenue while reducing waste.

The ad call optimization performed by the Nexx360 platform has been designed for and with publishers. Their needs: a solution that adapts to their choice of partners, the pace of their activities, and more generally their ad stack, while preserving their revenue and web performance. This filtering is offered to all Nexx360 clients, natively integrated into optimization tools and at no extra cost. Thus, each publisher can drastically reduce the requests it sends, as well as those of the programmatic chain, while protecting its revenue.

For this, the Nexx360 solution:

  • Analyzes the historical auctions based on various parameters such as placement, browser, OS, country, device, etc.   
  • Classifies these inventories from A to F, akin to the nutri-score in the agri-food domain. The closer the score is to F, indicating lower-performing inventory categories, the more the publisher can reduce the volume of calls without risk… and vice versa!

Note that a minimum of requests (sampling) is retained to observe behavioral changes and adapt automatically. Publishers have access to all tracking information: actor rankings, performance by criteria (volume of bids, impressions, revenue, etc.). They can even configure their optimization strategy based on the actors they work with directly or those they want to exclude from certain calls.
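The grading-plus-sampling approach described above can be sketched as follows. This is a minimal illustration under invented assumptions: the segment keys, the revenue-per-request metric, the grade thresholds, and the `SEND_RATE` table are all made up for the example and are not Nexx360's actual model, which the article does not detail.

```python
# Hypothetical A-to-F inventory grading: rank segments (placement x browser
# x country, etc.) by revenue per request and bucket them into six grades.
GRADES = "ABCDEF"

# Fraction of requests kept per grade; even grade F keeps a sampling floor
# so behavioral changes can still be observed (illustrative values).
SEND_RATE = {"A": 1.0, "B": 1.0, "C": 0.8, "D": 0.5, "E": 0.2, "F": 0.05}

def grade_segments(revenue_per_request: dict) -> dict:
    """Split segments into six equal-sized grade buckets, best (A) to worst (F)."""
    ranked = sorted(revenue_per_request, key=revenue_per_request.get, reverse=True)
    bucket = max(1, -(-len(ranked) // len(GRADES)))  # ceiling division
    return {seg: GRADES[min(i // bucket, 5)] for i, seg in enumerate(ranked)}

# Fabricated sample data: revenue per request by segment.
segments = {
    "banner/chrome/US": 0.90, "banner/safari/FR": 0.40,
    "video/chrome/DE": 0.75, "banner/edge/IN": 0.02,
    "native/safari/US": 0.55, "video/firefox/BR": 0.10,
}
grades = grade_segments(segments)
```

Here the best-monetizing segment is graded A and keeps all of its requests, while the worst is graded F and keeps only the sampling floor, which is enough to notice if that inventory starts performing again.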

'This modular ranking makes it possible to adjust the number of requests to SSP performance without permanently reducing the number of partners a publisher works with. On the contrary, the more competitors in the race, the higher the probability of selling at the best price. Likewise, DSPs are more likely to bid on quality requests.'

Gabriel Chicoye, CTO & co-founder of Nexx360
