
Scale estimation for usage in FLEDGE #20

alexmturner opened this issue Feb 10, 2023 · 4 comments

@alexmturner (Collaborator)

Hi all,

We’re seeking some rough scale estimates for expected Private Aggregation API usage in FLEDGE auctions once both APIs have shipped and third-party cookies have been deprecated. This will help inform server designs to ensure they can handle the expected traffic.

We’d appreciate any estimates, even if there’s substantial uncertainty. The template below can be filled out in a reply to this issue or submitted through the Privacy Sandbox feedback form; please feel free to omit answers to any of the questions if they are sensitive.


Template:

Estimates for: [company name]

We plan to participate in FLEDGE auctions as [a bidder/a seller/both a bidder and a seller].

  • We expect to run/participate in [number] FLEDGE auctions per day.
  • We expect to record [number] histogram contributions (i.e. key/value pairs) on average per auction.¹ These histograms will include:
    • [example 1]
    • [example 2]
  • We plan to batch reports [hourly/daily/weekly/other] for processing in the aggregation service.

¹ Please include any contributions sent using the reportContributionForEvent() mechanism.
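For concreteness, here is one way a bidder might structure the key/value pairs the template asks about: pack an advertiser ID and a metric code into the API's 128-bit bucket. This is a minimal sketch; the field layout, widths, and metric codes below are assumptions for illustration, not part of the Private Aggregation API.

```javascript
// Hypothetical 128-bit bucket layout: advertiser ID in the high bits,
// metric code in the low 16 bits. Widths and codes are illustrative.
const METRIC_BITS = 16n;

function encodeBucket(advertiserId, metricCode) {
  // Shift the advertiser ID past the metric field, then OR in the code.
  return (BigInt(advertiserId) << METRIC_BITS) | BigInt(metricCode);
}

function decodeBucket(bucket) {
  // Recover both fields for debugging / domain-file generation.
  return {
    advertiserId: Number(bucket >> METRIC_BITS),
    metricCode: Number(bucket & ((1n << METRIC_BITS) - 1n)),
  };
}

// Inside a Protected Audience worklet (e.g. generateBid), a contribution
// would then be recorded roughly like:
//   privateAggregation.contributeToHistogram({
//     bucket: encodeBucket(advertiserId, /* hypothetical bid-value code */ 1),
//     value: bidCpmMicros,
//   });
```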

@michal-kalisz

Hi @alexmturner

Estimates for: RTB House
We plan to participate in FLEDGE auctions as a bidder.

  • We expect to run/participate in 1 trillion (1e12) FLEDGE auctions per day

  • We expect to record the following histogram contributions (i.e. key/value pairs) on average per auction:¹

    • 1% on bid
    • Up to 10 per impression
    • Up to 10 per click

These histograms will include:

  • Bid: bid value per advertiser, bidder “rejected reason” per advertiser
  • Imp: bid value per advertiser, banner and “user group”, model execution metadata
  • Click: bid value per advertiser, banner and “user group”, model execution metadata

We plan to batch reports for processing in the aggregation service:
Bid values: every 6 hours; “rejected reason”: hourly. (Would it be possible to batch different bucket sets with different frequencies?)

@alexmturner (Collaborator, Author)

Hi @michal-kalisz, thanks! We appreciate the response.

@renanfel

To follow up on the original request, the following information would also be helpful for preparing the service:

  • expected maximum number of aggregation keys in each domain file. Ideally, a rough distribution across batches would also be helpful, e.g. what percentage of batches will have the largest domain size.
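Since the domain file pre-declares every bucket that may appear in a batch, its size can be estimated by enumerating the key space. A sketch under the assumption (ours, for illustration) that buckets pack an advertiser ID with a 16-bit metric code:

```javascript
// Hypothetical enumeration of output-domain keys for a bucket layout that
// packs an advertiser ID above a 16-bit metric code. The layout is an
// assumption for illustration, not part of the aggregation service.
const METRIC_BITS = 16n;

function* domainKeys(advertiserIds, metricCodes) {
  for (const a of advertiserIds) {
    for (const m of metricCodes) {
      yield (BigInt(a) << METRIC_BITS) | BigInt(m);
    }
  }
}

// Domain-file size is the full cross product, e.g. 50,000 advertisers
// with 10 metric codes would pre-declare 500,000 keys.
function domainSize(advertiserIds, metricCodes) {
  return advertiserIds.length * metricCodes.length;
}
```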

@k-o-ta commented Jun 7, 2023

Hi @alexmturner
Estimates for: CyberAgent (Dynalyst)

We plan to participate in FLEDGE auctions as a bidder.

  • We expect to run/participate in 10 billion FLEDGE auctions per day.
  • We expect to record the following histogram contributions (i.e. key/value pairs) on average per auction:¹
    • five per bid
      • bid value, rejected reason, bidding metadata
    • two or three per impression
      • bid value
    • two or three per click
      • bid value

We plan to batch reports for processing in the aggregation service:

  • hourly:
    • rejected reason
    • metadata of bidding
  • daily:
    • imp/click reports (hourly batching may be needed when starting a new campaign)
    • the average bid gap for an ad
