How do we know if our educational intervention worked?
Presentation Transcript

Slide 1

How do we know if our educational intervention worked?

(Part of this presentation comes from a presentation by Brian Dillon in the last incarnation of CS6604)

Slides 2-4

Educational Interventions

The fundamental issue: We want to “do something” (presumably something different than what we have been doing).

This is called an “intervention”.

How do we know if we have “succeeded”?

If we have not been doing something, there is no basis for comparison.

Slides 5-6

Measurement

Find some basis for performance. Test scores?

It is important to have a track record prior to your intervention if you want to measure impact.

Lots of confounding variables!

What was REALLY the treatment, aside from the intended intervention?

Slides 7-9

Significant Difference

Could the difference have occurred by chance?

Two dominating parameters: variance and n.

Strength vs. significance:

With enough numbers, a difference of any size becomes significant.

No amount of strength is significant without numbers.
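The interplay of variance and n can be made concrete with a small simulation. The sketch below is not from the slides; it assumes Python with NumPy and SciPy, and the group means, standard deviation, and sample sizes are invented purely for illustration.

```python
# Illustrative sketch (not from the slides): how n and variance drive
# statistical significance for a fixed underlying difference.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_diff = 2.0    # hypothetical 2-point gain from the intervention
sigma = 15.0       # large within-group standard deviation (noisy test scores)

for n in (10, 100, 1000, 10000):
    control = rng.normal(70.0, sigma, size=n)
    treatment = rng.normal(70.0 + true_diff, sigma, size=n)
    res = stats.ttest_ind(treatment, control, equal_var=False)  # Welch's t-test
    print(f"n={n:5d}  observed diff={treatment.mean() - control.mean():5.2f}  p={res.pvalue:.4f}")
```

With n = 10 per group, the same 2-point difference is typically nowhere near significant; by n = 10,000 it almost always is, even though the size of the difference has not changed.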

Slide 10

Effect Size

Typically measured in standard deviations.

½ SD is considered “moderate”.

You can have a moderate effect size with no significant difference!
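As a companion sketch (again with invented numbers, assuming NumPy and SciPy), here is the effect-size-versus-significance point: with only a dozen students per group, an effect of roughly half a standard deviation can easily fail to reach p < 0.05.

```python
# Illustrative sketch: a "moderate" effect size (~0.5 SD) with no
# statistically significant difference, because n is small.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 12                                        # one small class per condition
control = rng.normal(70.0, 10.0, size=n)
treatment = rng.normal(75.0, 10.0, size=n)    # true effect of 0.5 SD

# Cohen's d using the pooled standard deviation
pooled_sd = np.sqrt((control.var(ddof=1) + treatment.var(ddof=1)) / 2)
d = (treatment.mean() - control.mean()) / pooled_sd

res = stats.ttest_ind(treatment, control, equal_var=True)
print(f"Cohen's d = {d:.2f}, p = {res.pvalue:.3f}")
# d can land near 0.5 ("moderate") while p stays above 0.05:
# a moderate effect size with no significant difference.
```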

Slide 11

Experimental Conditions

Experiment vs. quasi-experiment

In education, we usually don’t have complete control (or even much control).

Pre-test vs. post-test: learning gains

Control vs. treatment(s)
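The slide mentions pre-test/post-test learning gains without giving a formula. One common choice (an assumption here, not necessarily what the presenter intended) is Hake's normalized gain, sketched below.

```python
# Illustrative sketch: Hake's normalized gain for a pre-test/post-test design,
#   g = (post - pre) / (max_score - pre),
# i.e., the fraction of the possible improvement that was actually realized.
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    return (post - pre) / (max_score - pre)

# A student who moves from 40 to 70 on a 100-point test realizes
# half of the available gain:
print(normalized_gain(pre=40.0, post=70.0))   # 0.5
```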

Slide 12

No Significant Difference

Slide 14

The No Significant Difference Phenomenon

by Thomas Russell

Published in 1992, updated in 1999, and currently on the web.

A historical catalog of 200+ (now 500+) media comparison studies (MCS) that showed no significant difference.

The primary outcome is the preponderance of evidence.

Slide 15

Richard Clark

Stated position: Educational media has no more effect on educational outcomes than a grocery van has on the nutritional value of the groceries.

Allows: Economic savings may justify new media.

Slide 16

Carol Twigg

Stated position: Supports Russell’s findings; MCS are unnecessary; all media may be assumed to be equivalent.

Allows: A number of people may feel new technology/media must be “proved” first.

Slide 17

Thomas Russell

Stated position: Stop re-proving what has already been proven! No medium will be shown to be less or more effective than any other.

Allows: Equivalent technologies may be ranked/selected based on other criteria.

Slide 18

Cumulative MCS by Year (chart)

Slide 19

Total MCS, 1920-2000 (chart)

Slides 20-23

Conclusions

“No technology/media is meaningful in education.”

No. What the research says is more like: “None provides learning gains over standard non-technology teaching practice.”

“Educational (technology) research is a waste of time.”

It depends on what your goal is.


Slides 25-26

Economy and Affordance

Consider economic savings: some approaches cost less than others.

Consider educational “affordances”:

Affordance – an increased or new capability as a result of a new technique/technology.

Slide 27

Paper Book vs. Kindle/iPad

Learning:

Economic:

Affordances:

Slide 28

Online vs. Traditional Course

Learning:

Economic:

Affordances:

Slide 29

REAL Conclusions

One MAY choose to accept the statement that “all media are educationally equivalent”.

Any technology (online course systems among them) needs to be justified by other means.

Online course systems should be built with an eye to:

Reducing cost

Targeting specific affordances