Software in Society

Presentation Transcript

Slide 1: Software in Society

Personal responsibility in the engineering workplace

Lere Williams

Slide 2: Overview

Policy vacuums, conceptual vacuums, and invisibility in software

Algorithmic complexity (ethical, not computational)

Arguments for inclusion and personal responsibility in the software industry

Slide 3: A framework for computer ethics

“The mark of a basic problem in computer ethics is one in which computer technology is essentially involved and there is an uncertainty about what to do and even about how to understand the situation.”

[Moor, 1985]

Computation is a very general-purpose tool, and it is increasingly and idiosyncratically involved in our lives

Its involvement leads us to conceptual vacuums and policy vacuums

Slide 4: Policy vacuum: Facebook and fake news

Big controversy about whether the dissemination of fake news on Facebook affected the outcome of the 2016 U.S. presidential election

[https://goo.gl/JduJbT, https://goo.gl/9yfakX]

False claims (in actuality, the Pope did not endorse Trump or anyone else)

Evidence to suggest that a lot of people get news from Facebook (44% of U.S. adults by one estimation)

Also evidence to suggest that content on Facebook leads to action

Zuckerberg says (and I agree): we need to be extremely careful about becoming arbiters of truth

Slide 5: Invisibility in software

Invisible abuse: the use of invisible operations to conduct malpractice, misuse of private information, surveillance, etc.

Invisible decisions and assumptions: things left to the interpretation of the developer, such as substantive implementation decisions and input data

Invisible complexity: calculations too complex for a human to inspect or fully understand, even when the program behaves as intended

Slide 6

[Image slide]

Slide 7: Wisconsin Supreme Court sentencing case

Judges were using algorithmically generated risk assessments that predict rates of recidivism as input for sentencing

The proprietary algorithm was found by one study to have a 40% error rate and to be biased against African-Americans

The ultimate ruling required what is effectively a warning label on the assessments

Slide 8: How might this sort of unfairness happen?

Supervised learning algorithm (roughly): takes historical instances of a problem (training data) and produces a function (classifier) that can be used to decide future instances of the problem

In order to do this, the algorithm picks out certain attributes of the input data (features). The set of all features is called the feature space.
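To make these terms concrete, here is a minimal sketch (mine, not the slides'), using scikit-learn; the feature names and data are hypothetical:

```python
# A minimal supervised-learning sketch: historical instances in, a
# classifier out. Feature names and data are made up for illustration.
from sklearn.linear_model import LogisticRegression

# Each row is one historical instance of the problem; the columns
# [age, prior_offenses] are the features, and together they span the
# feature space.
X_train = [[25, 0], [30, 2], [45, 1], [22, 5], [50, 0], [19, 3]]
# One label per instance: 1 = re-offended, 0 = did not.
y_train = [0, 1, 0, 1, 0, 1]

# "Training" fits a function from features to labels...
classifier = LogisticRegression().fit(X_train, y_train)

# ...which can then be used to decide future instances of the problem.
print(classifier.predict([[28, 1]]))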

Slide 9: Biased training set? Likely biased classifier

Learning algorithms are designed to detect statistical patterns in the training data

If the training data reflects existing biases, then the learned classifier will likely reinforce those biases

Critically, this can be true regardless of whether a sensitive attribute is explicitly included in the feature set
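A sketch of that last point (my made-up data, with a hypothetical "neighborhood" feature): the classifier reproduces the biased historical outcomes even though the sensitive attribute itself is never a feature, because a correlated proxy carries the same signal.

```python
# Sketch of a biased training set: the sensitive attribute is never used,
# but 'neighborhood' is a near-perfect proxy for it, so the classifier
# learns the bias anyway. All data here is hypothetical.
from sklearn.linear_model import LogisticRegression

# Features: [neighborhood, prior_offenses]. Historical outcomes were
# harsher for neighborhood 1 despite identical prior_offenses.
X_train = [[0, 1], [0, 2], [0, 1], [1, 1], [1, 2], [1, 1]]
y_train = [0, 0, 0, 1, 1, 1]  # biased historical decisions

clf = LogisticRegression().fit(X_train, y_train)

# Two records identical except for the proxy get different predictions.
print(clf.predict([[0, 1], [1, 1]]))  # expected: [0 1]
```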

Slide 10: Problems of sample size

The classifier gets better with more data: classifier error often decreases as the inverse square root of the sample size, i.e., error ∝ 1/√n
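Back-of-the-envelope, with an assumed constant: under this error model, a group with 100x less data sees roughly 10x the error term.

```python
# Illustrative numbers for error ~ c / sqrt(n), with a made-up constant c.
import math

c = 1.0  # hypothetical constant
for n in [1_000_000, 10_000, 100]:
    print(f"n = {n:>9,}: error ~ {c / math.sqrt(n):.3f}")
# n = 1,000,000: error ~ 0.001
# n =    10,000: error ~ 0.010
# n =       100: error ~ 0.100
```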

Slide 11: Problems of sample size

But of course, by definition, minority populations have smaller sample sizes.

So if the classifier learned on the majority group does not in fact apply well to the minority group, then the classifier will be more accurate for the majority than the minority

Overall, then, a classifier might appear highly accurate while still being biased against a minority population
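A worked example with made-up numbers shows how the aggregate can hide the gap:

```python
# Overall accuracy is a population-weighted average, so a small minority
# group's poor accuracy barely dents the headline figure. Numbers are
# hypothetical.
maj_share, maj_acc = 0.95, 0.96   # majority: 95% of the data, 96% accurate
min_share, min_acc = 0.05, 0.70   # minority:  5% of the data, 70% accurate

overall = maj_share * maj_acc + min_share * min_acc
print(f"overall accuracy: {overall:.3f}")  # 0.947 -- looks highly accurate
```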

Slide 12: A case for data due process

Angwin advocates for due process protections with respect to data used in algorithmic decision making

Apparently, the credit industry is the only one currently subject to such legislation

Given how learning algorithms work, that's maybe not a bad suggestion

Of course, not all modelling uses learning algorithms (that was just an example). Specifics of the model used are critical.

We should probably advocate for seeing the code too

Slide 13: A case for diversity in software organizations

Choice of training data is largely a decision left up to the developer.

Perhaps then, more diverse development teams can help to combat bias in training sets

Team diversity drives innovation in new areas

[https://goo.gl/YCSS9M]

Slide 14: Personal responsibility in software

Educate people about software internals

Remember Moor’s thesis. Paraphrasing: complexity creates obscurity, and unnecessary ethical disputes often result from lack of hard facts.

Think carefully about the software you build, and the contexts in which it might be applied

Very, very hard to consider all the possible ramifications, but awareness is the first step.

Slide 15: More chances than ever to apply software to big challenges

Government applications

Code for America’s food stamp application

Brigade’s voter networks

Poverty applications

Kiva’s microfinancing

Enveritas’s smallholder verification process

Many, many other problem domains

Slide 16: Recap

Brave new world, full of policy vacuums

Fairness is not guaranteed by “neutral” software

We have a responsibility to take an active stance in how software shapes the world

Educate people about software

Build carefully
