Privacy as a tool for Robust Mechanism Design in Large Markets - Presentation transcript

Slide 1
Privacy as a tool for Robust Mechanism Design in Large Markets

(A Case Study)

Based on joint works with: Rachel Cummings, Justin Hsu, Zhiyi Huang, Sampath Kannan, Michael Kearns, Mallesh Pai, Jamie Morgenstern, Ryan Rogers, Tim Roughgarden, Jon Ullman, and Steven Wu

Slide 2

Approximately Stable, School Optimal, and Student-Truthful Many-to-One Matchings (via Differential Privacy)

Aaron Roth
Joint work with: Sampath Kannan, Jamie Morgenstern, and Steven Wu

Slide 3

Many-to-one Stable Matchings

Slide 4

Many-to-one Stable Matchings

In a stable matching problem there are students and schools. Each student $i$ has a total order $\succ_i$ over the schools, and each school $j$ has a total order $\succ_j$ over the students. Students can be matched to at most 1 school; schools to at most $C$ students.

Definition: A matching $\mu$ is stable if it satisfies:

Feasibility: For each school $j$: $|\mu(j)| \leq C$.

(No Blocking Pairs with Filled Seats): For each student $i$ and school $j$ such that $j \succ_i \mu(i)$, either $|\mu(j)| < C$ or, for every $i' \in \mu(j)$, $i' \succ_j i$.

(No Blocking Pairs with Empty Seats): For every school $j$ such that $|\mu(j)| < C$, and every student $i \notin \mu(j)$: $\mu(i) \succ_i j$ (where being unmatched is the least preferred outcome).
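To make the definition concrete, here is a small checker written for this transcript, not taken from the talk. The encodings (preference lists as Python lists, full preference lists, a common capacity C) are assumptions of this sketch; the parameter alpha anticipates the approximate relaxation introduced later in the deck, with alpha=0 recovering exact stability.

```python
# Illustrative sketch: check the stability conditions for a many-to-one
# matching. alpha=0 gives exact stability; alpha>0 tolerates schools
# under-enrolled by up to an alpha fraction of capacity.

def is_approx_stable(mu, student_prefs, school_prefs, C, alpha=0.0):
    """mu: dict student -> school (None if unmatched).
    student_prefs[i]: list of schools, most preferred first (full lists).
    school_prefs[j]: list of students, most preferred first (full lists).
    C: common school capacity; alpha: tolerated under-enrollment fraction."""
    enrolled = {j: set() for j in school_prefs}
    for i, j in mu.items():
        if j is not None:
            enrolled[j].add(i)

    # Feasibility: no school over capacity.
    if any(len(s) > C for s in enrolled.values()):
        return False

    def prefers(i, j):
        """Does student i strictly prefer school j to their current match?"""
        cur = mu[i]
        return cur is None or student_prefs[i].index(j) < student_prefs[i].index(cur)

    for i in student_prefs:
        for j in school_prefs:
            if mu[i] == j or not prefers(i, j):
                continue  # (i, j) cannot block
            members = enrolled[j]
            if len(members) < (1 - alpha) * C:
                return False  # blocking pair with an empty seat
            if len(members) >= C and any(
                school_prefs[j].index(i) < school_prefs[j].index(i2)
                for i2 in members
            ):
                return False  # blocking pair with a filled seat
    return True
```

With alpha=0 this is exactly the definition above: a full school blocks only if it prefers an outside student to someone enrolled, and an under-enrolled school blocks with any student who prefers it.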

Slide 5

Many-to-one Stable Matchings: Worst Case Results

Simple mechanisms compute the student-optimal / school-optimal matchings (student-proposing / school-proposing deferred acceptance). But…

Even in the 1-to-1 case, no mechanism is dominant-strategy truthful for both sides of the market [Dubins and Freedman 1981, Roth 1982].

In the many-to-one case, no school-optimal mechanism is dominant-strategy truthful for either side of the market [Roth 1984].

Can we circumvent these impossibility results with approximation and large-market assumptions?

Slide 6

“Traditional” Economic Approach

e.g. [Immorlica and Mahdian 05], [Kojima and Pathak 09], [Lee 11], [Azevedo and Budish 12], …

Make a strong distributional assumption about how preferences are generated. e.g. ([IM 05, KP 09]): students have preference lists of constant length, drawn i.i.d. from a product distribution.

Show that as the “market grows large”, when the exact school-optimal matching is computed, the fraction of people who have an incentive to deviate diminishes. e.g. as $n \to \infty$ (with list length fixed), with high probability, only a vanishing ($o(1)$) fraction of students have an incentive to mis-report.

Slide 7

Here: A More Robust “Dual” Approach

Make no assumptions about student or school preferences. Ask for truthful reporting to be an asymptotic dominant strategy for every student. Make no “large market” assumptions, except that schools have sufficiently many slots.

Instead: Perturb the process by which matchings are computed, and find “approximately stable”, “approximately school optimal” matchings.

Also: Ask for small finite-market bounds (not just limit results).

Slide 8

Approximately Stable Matchings

Definition: A matching $\mu$ is stable if it satisfies:

Feasibility: For each school $j$: $|\mu(j)| \leq C$.

(No Blocking Pairs with Filled Seats): For each student $i$ and school $j$ such that $j \succ_i \mu(i)$, either $|\mu(j)| < C$ or, for every $i' \in \mu(j)$, $i' \succ_j i$.

(No Blocking Pairs with Empty Seats): For every school $j$ such that $|\mu(j)| < C$, and every student $i \notin \mu(j)$: $\mu(i) \succ_i j$.

Definition: A matching $\mu$ is $\alpha$-approximately stable (envy free) if it satisfies:

Feasibility: For each school $j$: $|\mu(j)| \leq C$.

(No Blocking Pairs with Filled Seats): For each student $i$ and school $j$ such that $j \succ_i \mu(i)$, either $|\mu(j)| < C$ or, for every $i' \in \mu(j)$, $i' \succ_j i$.

(No Blocking Pairs with Empty Seats at under-enrolled schools): For every school $j$ such that $|\mu(j)| < (1 - \alpha) C$, and every student $i \notin \mu(j)$: $\mu(i) \succ_i j$.

Schools tolerate a small degree of under-enrollment.

Slide 9

Approximately School Optimal Matchings

Definition: Let $\mu^*$ be the school-optimal stable matching. A matching $\mu$ is school dominant if for every school $j$, and every pair of students $i \in \mu(j)$ and $i' \in \mu^*(j) \setminus \mu(j)$: $i \succ_j i'$.

i.e. every student matched to $j$ in a school dominant matching must be at least as preferred (by $j$) as every student matched to $j$ in the school-optimal matching. But there may be fewer of them.

Slide 10

Approximate Dominant Strategy Truthfulness

A utility function $u_i : \text{schools} \to [0, 1]$ is consistent with an ordering $\succ_i$ if for every pair of schools $j, j'$: $u_i(j) > u_i(j')$ if and only if $j \succ_i j'$.

Definition: A matching mechanism $M$ is $\eta$-approximately dominant strategy truthful if for every preference profile $\succ$, every student $i$ and deviation $\succ_i'$, and every utility function $u_i$ consistent with $\succ_i$:

$\mathbb{E}[u_i(M(\succ))] \geq \mathbb{E}[u_i(M(\succ_i', \succ_{-i}))] - \eta$

Slide 11

Our Result

Theorem: There is a computationally efficient algorithm for computing $\alpha$-approximately stable, school dominant matchings that makes it an $\eta$-approximate dominant strategy for every student to report truthfully, whenever school capacity is sufficiently large. When students have constant-length preference lists, we only require a weaker capacity bound. As the market grows large, both $\alpha$ and $\eta$ can be taken to be $o(1)$.

Slide 12

Differential Privacy [DMNS06]: A Measure of Algorithmic Stability

Let $t$ denote an arbitrary type profile, and let $t_i'$ be any possible report for agent $i$. Then a mechanism $M$ is $\varepsilon$-differentially private if for all sets of outcomes $S$:

$\Pr[M(t) \in S] \leq e^{\varepsilon} \cdot \Pr[M(t_i', t_{-i}) \in S]$

In particular, for any bounded, non-negative function $u$ of the outcome:

$\mathbb{E}[u(M(t))] \leq e^{\varepsilon} \cdot \mathbb{E}[u(M(t_i', t_{-i}))]$

Algorithmically enforced “informational smallness”.
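A standard concrete instance of such a mechanism, shown here for illustration and not specific to this talk, is the Laplace mechanism: release a count plus Laplace(1/ε) noise. Changing one agent's report moves the true count by at most 1, and a direct density calculation verifies the $e^{\varepsilon}$ ratio bound from the definition.

```python
import math
import random

def laplace_mechanism(true_count, eps, rng=random):
    """Release a count with Laplace(1/eps) noise: the classic eps-DP primitive."""
    b = 1.0 / eps
    # A Laplace(0, b) sample is the difference of two Exponential(1/b) samples.
    noise = rng.expovariate(1.0 / b) - rng.expovariate(1.0 / b)
    return true_count + noise

def laplace_pdf(x, mu, b):
    return math.exp(-abs(x - mu) / b) / (2 * b)

# Numeric check of the definition: output densities on neighboring counts
# (true counts 100 vs 101, i.e. one agent's report changed) never differ
# by more than a factor of e^eps, anywhere on a grid of outputs.
eps = 0.5
b = 1.0 / eps
max_ratio = max(
    laplace_pdf(x / 10, 100, b) / laplace_pdf(x / 10, 101, b)
    for x in range(800, 1300)
)
assert max_ratio <= math.exp(eps) + 1e-9
```

The ratio is exactly $e^{\varepsilon \cdot (|x-101| - |x-100|)}$, which is maximized at $e^{\varepsilon}$: the likelihood-ratio bound in the definition is tight for this mechanism.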

Slide 13

A Helpful Change in Perspective: Admissions Thresholds

Think of school preferences as being represented by assigning a numeric rating $r_j(i)$ to each student $i$ (higher ratings are more preferred).

A set of admissions thresholds $\hat{t} = (\hat{t}_1, \ldots, \hat{t}_m)$ induces a matching $\mu_{\hat{t}}$: each student attends her most preferred school $j$ with $r_j(i) \geq \hat{t}_j$ (i.e. students go to their favorite school that will have them).

Say thresholds $\hat{t}$ are $\alpha$-approximately stable if $\mu_{\hat{t}}$ is.

Idea: Try to find $\alpha$-approximately stable, school dominant thresholds, subject to differential privacy.
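The map from thresholds to a matching can be written directly. The encodings below (ratings[j][i], thresholds[j]) are illustrative assumptions, not notation from the talk:

```python
# Sketch of the matching induced by admissions thresholds: each student
# attends her favorite school whose threshold her rating clears.

def induced_matching(student_prefs, ratings, thresholds):
    """student_prefs[i]: list of schools, most preferred first.
    ratings[j][i]: school j's rating of student i (higher is better).
    thresholds[j]: admission threshold of school j."""
    match = {}
    for i, prefs in student_prefs.items():
        match[i] = None  # unmatched unless some school admits her
        for j in prefs:  # scan from most preferred downward
            if ratings[j][i] >= thresholds[j]:
                match[i] = j  # favorite school that will have her
                break
    return match
```

Note that thresholds alone do not enforce capacity; that is exactly why the algorithm on the later slides lowers them gradually while tracking enrollment counts.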

Slide 14

Differential Privacy Yields Approximate DSIC

Theorem: Let $A$ be an $\varepsilon$-differentially private algorithm for computing admissions thresholds. The algorithm $M$ which takes as input preferences $\succ$, computes thresholds $\hat{t} = A(\succ)$, and outputs the induced matching $\mu_{\hat{t}}$, is $\varepsilon$-approximately dominant strategy truthful for all students. The matching is computed subject to “joint differential privacy”.

Slide 15

Differential Privacy Yields Approximate DSIC

Proof: Fix a set of preferences $\succ$, a student $i$, a deviation $\succ_i'$, and a utility function $u_i$ consistent with $\succ_i$. Write $c_i(\hat{t}, \succ_i)$ for the school student $i$ receives under thresholds $\hat{t}$ when she reports $\succ_i$. Then:

$\mathbb{E}_{\hat{t} \sim A(\succ)}\left[u_i(c_i(\hat{t}, \succ_i))\right] \geq \mathbb{E}_{\hat{t} \sim A(\succ)}\left[u_i(c_i(\hat{t}, \succ_i'))\right]$ (argmax and consistency)

$\geq e^{-\varepsilon}\, \mathbb{E}_{\hat{t} \sim A(\succ_i', \succ_{-i})}\left[u_i(c_i(\hat{t}, \succ_i'))\right]$ (Differential Privacy)

$\geq \mathbb{E}_{\hat{t} \sim A(\succ_i', \succ_{-i})}\left[u_i(c_i(\hat{t}, \succ_i'))\right] - \varepsilon$ ($e^{-\varepsilon} \geq 1 - \varepsilon$ and $u_i \leq 1$)

Goal: Design a private algorithm to compute approximately stable, school dominant thresholds.

Slide 16

School Proposing Deferred Acceptance

Set all school thresholds $\hat{t}_j$ to the top of the rating scale, an initial empty matching $\mu$, and initial counts $n_j = 0$ of enrollment for each school.

While there exists an under-enrolled school $j$ (with $n_j < C$) whose threshold can still be lowered:

Lower the threshold for school $j$: for each student $i$ who now clears the threshold ($r_j(i) \geq \hat{t}_j$), if $i$ prefers $j$ to her current match, set $\mu(i) = j$ and update the enrollment counts.

Output $\mu$.
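The loop above, in a non-private sketch. The encodings are assumptions of this example: distinct integer ratings ratings[j][i] (higher is better), dictionary preference lists, and a common capacity C; the names are illustrative.

```python
# Non-private sketch of threshold-lowering school-proposing deferred
# acceptance. Each school lowers its admissions threshold one rating
# band at a time while under-enrolled; students accept offers only
# from schools they prefer to their current match.

def deferred_acceptance(student_prefs, ratings, C):
    """Returns (match, thresholds); match maps each student to a school or None."""
    schools = list(ratings)
    thresholds = {j: max(ratings[j].values()) + 1 for j in schools}
    match = {i: None for i in student_prefs}
    counts = {j: 0 for j in schools}

    def prefers(i, j):
        """Does student i strictly prefer school j to their current match?"""
        prefs = student_prefs[i]
        if j not in prefs:
            return False
        cur = match[i]
        return cur is None or prefs.index(j) < prefs.index(cur)

    while True:
        lowered = False
        for j in schools:
            # Lower j's threshold while it is under-enrolled and can still propose.
            while counts[j] < C and thresholds[j] > min(ratings[j].values()):
                thresholds[j] -= 1
                lowered = True
                for i, r in ratings[j].items():
                    # Students newly cleared by this step accept if they prefer j.
                    if r == thresholds[j] and prefers(i, j):
                        if match[i] is not None:
                            counts[match[i]] -= 1  # student leaves old school
                        match[i] = j
                        counts[j] += 1
        if not lowered:  # no threshold moved: done
            break
    return match, thresholds
```

Because students only ever move to schools they prefer, each (student, school) pair is considered exactly once, when the threshold reaches the student's rating. The only data access that depends on other students' reports is the enrollment counts, which is what the private version replaces with noisy counters.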

 

How can we make this differentially private?

Slide 17

Some Useful Privacy Properties

Theorem (Postprocessing): If $A$ is $\varepsilon$-differentially private, and $f$ is any (randomized) function, then $f \circ A$ is $\varepsilon$-differentially private.

Slide 18

Some Useful Privacy Properties

Theorem (Composition): If $A_1, \ldots, A_k$ are $\varepsilon_1, \ldots, \varepsilon_k$-differentially private respectively, then the composition $A(x) = (A_1(x), \ldots, A_k(x))$ is $\left(\sum_{\ell=1}^{k} \varepsilon_\ell\right)$-differentially private.
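A small numeric sanity check of the composition theorem, illustrative rather than from the talk, using two independent Laplace mechanisms with budgets ε1 and ε2: their joint output density on neighboring inputs changes by at most a factor of $e^{\varepsilon_1 + \varepsilon_2}$.

```python
import math

def laplace_pdf(x, mu, b):
    return math.exp(-abs(x - mu) / b) / (2 * b)

eps1, eps2 = 0.3, 0.7
b1, b2 = 1.0 / eps1, 1.0 / eps2

# Two counts released with independent Laplace noise; a neighboring
# database shifts each true count by (at most) 1. Check the joint
# density ratio over a grid of outputs.
max_ratio = max(
    (laplace_pdf(x, 10, b1) * laplace_pdf(y, 20, b2))
    / (laplace_pdf(x, 11, b1) * laplace_pdf(y, 21, b2))
    for x in [8.0 + k / 10 for k in range(50)]
    for y in [18.0 + k / 10 for k in range(50)]
)
assert max_ratio <= math.exp(eps1 + eps2) + 1e-9
```

The factors multiply because the two mechanisms' noise draws are independent, which is exactly the structure the composition theorem generalizes.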

Slide 19

So…

We can go about designing algorithms as we normally would: just access the data using differentially private “subroutines”, and keep track of the “privacy budget” as a resource. Private algorithm design, like regular algorithm design, can be modular.

Slide 20

School Proposing Deferred Acceptance

Set all school thresholds $\hat{t}_j$ to the top of the rating scale, an initial empty matching $\mu$, and initial counts $n_j = 0$ of enrollment for each school.

While there exists an under-enrolled school $j$ (with $n_j < C$) whose threshold can still be lowered:

Lower the threshold for school $j$: for each student $i$ who now clears the threshold ($r_j(i) \geq \hat{t}_j$), if $i$ prefers $j$ to her current match, set $\mu(i) = j$ and update the enrollment counts.

Output $\mu$.

Only data access: keeping track of enrollment counts.

Slide 21

Privately Maintaining Counts

[DworkNaorPitassiRothblum10, ChanShiSong10] give exactly the tool we need: a private algorithm to maintain a running count. Given a stream of $n$ bits, it maintains an estimate of the running count to accuracy polylogarithmic in $n$ (for any fixed $\varepsilon$), where each person can affect at most $\Delta$ entries in the stream.

For us: $\Delta = 2$. (No student changes enrollment status at any school more than twice.)

 

 

 

 

  

 

 

 

 

Slide 22

Privately Maintaining Counts
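A simplified sketch of the tree-based counter of [DNPR10, CSS10]; the parameter choices here are illustrative, not the papers' exact ones. Each dyadic block of the stream gets one Laplace-noised partial sum, each stream element touches at most log T nodes (so the budget is split evenly across levels), and any running count is assembled from O(log T) noisy nodes, giving error polylogarithmic in the stream length T. The extra factor for sensitivity Δ = 2 in the matching application is not modeled here.

```python
import math
import random

class TreeCounter:
    """Simplified tree-based private counter: count 1s in a bit stream."""

    def __init__(self, T, eps, seed=None):
        self.T = T                           # upper bound on stream length
        self.levels = max(1, math.ceil(math.log2(T)))
        self.b = self.levels / eps           # Laplace scale per node (budget split)
        self.rng = random.Random(seed)
        self.partial = {}                    # (level, index) -> noisy partial sum
        self.pending = {}                    # (level, index) -> exact running sum
        self.t = 0

    def _laplace(self):
        # Laplace(0, b) as the difference of two Exponential(1/b) samples.
        return self.rng.expovariate(1 / self.b) - self.rng.expovariate(1 / self.b)

    def feed(self, bit):
        """Append one stream bit (0 or 1)."""
        self.t += 1
        for lvl in range(self.levels):
            idx = (self.t - 1) >> lvl        # dyadic interval at this level
            self.pending[(lvl, idx)] = self.pending.get((lvl, idx), 0) + bit
            if self.t % (1 << lvl) == 0:     # interval complete: release with noise
                self.partial[(lvl, idx)] = self.pending[(lvl, idx)] + self._laplace()

    def count(self):
        """Noisy estimate of the number of 1s seen so far."""
        total, t, pos = 0.0, self.t, 0
        for lvl in range(self.levels - 1, -1, -1):
            size = 1 << lvl
            while t >= size:                 # greedily cover the prefix by blocks
                total += self.partial[(lvl, pos >> lvl)]
                pos += size
                t -= size
        return total
```

Each released node is a noisy sum over a fixed block of the stream, so any prefix count is covered by a logarithmic number of nodes, each carrying Laplace noise of scale O(log T / ε).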

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

Slide 23

Private School Proposing Deferred Acceptance

Idea: Run school-proposing deferred acceptance, but maintain the enrollment counts privately. Postprocessing and composition then imply privacy of the whole algorithm.

$\varepsilon$-differential privacy implies $\varepsilon$-approximate dominant strategy truthfulness.

There are $m$ schools (hence $m$ counters) to keep track of, so the total counting error $E$ grows with $m$ and polylogarithmically with $n$.

So as to never over-enroll, run as if capacity is shaded down by $E$.

So long as capacity $C$ is sufficiently large relative to $E$, the under-enrollment due to capacity shading and counting error is at most an $\alpha$ fraction.

Slide 24

Private School Proposing Deferred Acceptance

Privacy implies approximate dominant strategy truthfulness. What about utility guarantees?

Enrollments are always underestimated, and so the sequence of proposals is always a subsequence of the proposals made by some trajectory of the (exact) school-proposing deferred acceptance algorithm. This yields:

No blocking pairs with filled seats.
School dominance.
Excess under-enrollment bounded by the counting error, so the only blocking pairs with empty seats are at almost fully enrolled schools.

Slide 25

Stepping back…

Differential privacy is a tool that can be used to design robust mechanisms in large markets:

Ex-post guarantees for all players, even in settings of incomplete information.
No distributional assumptions.
Shifts perspective to mechanism design: explicitly perturb mechanisms to yield distributional robustness, rather than proving structural properties about exact solutions on random instances.

Slide 26

Stepping back… Other applications:

Privately computing Walrasian equilibrium prices: asymptotically truthful combinatorial auctions with item pricings.
Privately computing correlated/Nash equilibria: mediators for equilibrium selection that make truth-telling an ex-post Nash equilibrium.
Privately selecting alternatives: a general recipe for mechanism design without money [McSherry Talwar 07, Nissim Smorodinsky Tennenholtz 11].
There should be more! Let's involve mechanism/market designers!

Slide 27

Stepping back more…

“Markets for Privacy”: Can we find a “market price” for $\varepsilon$? It depends on individual costs of privacy risk, as well as on the value of the resulting data analysis. Should disclosures be viewed as public goods? (Talk to John.)

“Markets for Data”: Information is very interesting as a commodity, with lots of complicated complementarities because of inferences. Differential privacy removes some kinds of complementarities (by making reconstruction impossible) but leaves others. Privacy trades off in non-trivial ways with the “price of data”.

Let's involve economists!

Slide 28

Thanks!
