From Objectives to Outcomes
Beth Fentress Hallmark, PhD, RN, Belmont University, Nashville, TN
Presentation Transcript

Slide1

From Objectives to Outcomes

Beth Fentress Hallmark, PhD, RN
Belmont University
Nashville, TN

Slide2

Belmont University

Nashville, Tennessee

5,000+ students

College of Health Sciences

Interprofessional Education
Nursing: Accelerated, Fast-Track, and Traditional BSN; FNP
Social Work (BSW)
Physical Therapy (DPT)
Occupational Therapy (OTD/MSOT)
Pharm D

Slide3

Belmont University

Nashville, Tennessee

2 eight-bed Adult Health laboratories
8-bed “Acute Care” lab
4-bed Peds lab
8-bed Health Assessment/OB lab
4 interprofessional private patient areas

Slide4

From Objectives to Outcomes:

Learning Objectives:
Identify the components of healthcare simulation
Discuss the importance of outcomes evaluation and challenges to traditional assessments
Discuss the importance of validity, reliability, and feasibility as they relate to assessment
Discuss types of assessments and their application in healthcare education

Slide5

Components of Healthcare Simulation

Jeffrey A. Groom, PhD, CRNA (2009)

Slide6
Slide7
Slide8

Study Finds No Progress in Safety at Hospitals

November 24, 2010, NY Times: 10 North Carolina hospitals
25.1 injuries per 100 admissions
42.7% resulted in extra time in the hospital
2.9% of patients suffered a permanent injury
>8% were life-threatening
2.4% caused or contributed to a patient’s death
Medication errors caused problems in 162 cases.

Slide9

How do we measure our improvement?

Safe/competent practitioners: whatever the discipline/setting
Initial & continued competence
Acquisition of relevant knowledge
Development of psychomotor skills
Application of this knowledge and skill

Slide10

Current Assessments

Measuring performance in the clinical area with current methods is difficult:
Confidentiality
Faculty-to-student ratio
Safety to patient
Preceptors: valid? Reliable?
Adjunct faculty
Tools

Slide11

Model of Competence

Slide12

Problem with “Knowing”

Knowing is measured using examinations such as the NCLEX, NREMT cognitive exam, FNP certification exam, calculations tests, etc.
Recalling basic facts, principles, and theories
Multiple-choice and true/false questions
Test question design: valid, reliable
Bloom’s taxonomy
Critical thinking questions

Slide13

Problem with “Knowing”

Cognitive domain
Belmont pass rate on the NCLEX (May 2005) was 98.6%; on the NP exam, 100%.
Strategies to pass these exams are employed in educational institutions.
Does this mean that each of these students will be prepared to care for you or your loved ones?

Slide14

Model of Competence

Slide15

Problem with “Knows How”

“Knows How”: “Application of knowledge to problem solving and decision making” (Wass, 2001)
“A thought process stimulated by a problem” (Wass, 2001)
“Ability to solve problems, make decisions and describe procedures” (Scalese, 2008)
Case studies and essays
Multiple/multiples
Again… are these students prepared to provide safe, proficient care?

Slide16

Model of Competence

Slide17

“Shows how vs. Does”

“Shows How”: “demonstration of skills in a controlled setting” (Scalese, 2008)
Educating in these methods includes simulation-based education (SBE): OSCE, SP, simulations, log books, portfolios
Technical skills
Includes higher-level thinking

“Does”
Moves from the simulated environment to the real-life setting

Slide18

Assessment vs. Evaluation

Assessment and evaluation are often used interchangeably.
However, for our purposes…
Assessment describes the measurement of learner outcomes.
Evaluation describes the measurement of course/program outcomes.

Slide19

Why do we assess learner outcomes?

Provides baseline data

Provides summative and formative feedback

“Drives learning”

Allows measures of individual progress.

Encourages “student” reflection
Assures the public that providers are competent
Licensure/credentialing requirements

Slide20

Why do we evaluate our programs?

Demonstrates change and growth in programs/courses

Identifies gaps in programs/ courses

Fundamental to outcomes- or competency-based education

Accrediting/Credentialing facilities/programs

Allows administration to make informed allocation decisions

Slide21

Slide22

Objectives/Outcomes of Program

Define outcomes based on accrediting/professional organizations, etc.
Objectives/outcomes lead to competency and mastery.
Identify the Knowledge, Skills, and Attitudes/Affective Behaviors (KSA).
Curricular/program specific and simulation-event specific.
Measurable, clearly defined standards.

Slide23

± change/refine

Slide24

Simulation Education

Knowledge
Skills
Attitudes
Advance these throughout the curriculum via assessment
For example: injection to team training

Slide25

Preparing assessments

What should be assessed?

Every aspect of the curriculum considered essential
Significant designated teaching time
Should be consistent with learning outcomes that are established as the competencies students should master/perform at a given phase of study

Slide26

Use of Assessment in Simulation

Rosen, M.A., et al. (2008). Measuring team performance in simulation-based training: Adopting best practices for healthcare. Simulation in Healthcare, 3, 33–41.

Formative or Summative

Slide27

Assessment

Formative Assessment
Lower-stakes assessment
One of several assessments over the span of a course or program
May be evaluative, diagnostic, or prescriptive
Often results in remediation or progression to the next level

Summative Assessment
Higher-stakes assessment
Generally the final course or program assessment
Primary purpose is performance evaluation
Often results in a go/no-go outcome

Slide28

Assessments - peer

Enables learners to hone their ability to work with others and their professional insight
Enables faculty to obtain a view of students they do not otherwise see
An important part of peer assessment is for students to justify the marks they award to others
Justification can also be used as a component when faculty evaluate attitudes and professionalism.

Slide29

Assessments - standard setting

Should be set to determine competence
Enables certification to be documented, accountable, and defensible
Appropriately set standards for an assessment will pass those students who are truly competent
Standards should not be too low (false positives), passing those who are incompetent, nor too high (false negatives), failing those who are competent (a brief worked sketch of these error rates follows below).
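To make the false positive/false negative language concrete, here is a minimal sketch, assuming hypothetical assessment scores, a hypothetical reference judgment of competence, and illustrative cut scores; none of these numbers come from the presentation.

```python
# Hypothetical illustration: how a cut score trades off false positives
# (passing a not-yet-competent student) against false negatives
# (failing a competent one). All data below are invented.

scores = [62, 68, 71, 74, 78, 81, 85, 90, 93, 97]  # assessment scores
competent = [False, False, False, True, False, True, True, True, True, True]  # reference judgment

def error_rates(scores, competent, cut_score):
    """Return (false_positive_rate, false_negative_rate) for a given cut score."""
    passed = [s >= cut_score for s in scores]
    fp = sum(p and not c for p, c in zip(passed, competent))    # not competent but passed
    fn = sum((not p) and c for p, c in zip(passed, competent))  # competent but failed
    return fp / sum(not c for c in competent), fn / sum(competent)

for cut in (65, 75, 85):
    fp_rate, fn_rate = error_rates(scores, competent, cut)
    print(f"cut score {cut}: false positives {fp_rate:.0%}, false negatives {fn_rate:.0%}")
```

Raising the cut score reduces false positives at the cost of more false negatives; balancing the two is the point of standard setting on the next slide.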

Slide30

Assessments - standard setting

Standards should be set around a core curriculum that includes the knowledge, skills and attitudes required of all students

When setting a standard, the following should be considered:

Must reflect the core curriculum

High standard in the core components of the curriculum

Demonstrate mastery at each phase

Slide31

Clinical Competence & Performance

To attain “competent performance” – basic Knowledge, Skills & Attitudes required
Competence – is the application of specific KSAs
Performance – is the “translation of competence into action”

Slide32

Performance Assessment

Basic to performance – Do they know it and know how?

Competence – Can they do it?
Performance – Do they do it?

Slide33

Assessing Simulation

Documenting data: live, video recording, software logging systems
Logistics of documenting data: AV annotation via logging, pencil and paper (wipe-off cards), scantron, PDA/handheld/TabletPC
Assessors: instructors, observers, SIM/patients, peers, participants

Slide34

Choosing appropriate assessment methods/tools

When choosing the assessment instrument, the following should be answered:

Is it valid?
Is it reliable?
Is it feasible?

Slide35

Assessments - validity

Are we measuring what we are supposed to be measuring?
Use the appropriate instrument for the knowledge, skill, or attitude you are testing.
The major types of validity should be considered: content, predictive, and face (a brief sketch of checking predictive validity follows below).
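As a hedged illustration of one of these types, predictive validity is commonly examined by correlating assessment scores with a later outcome measure. The scores and preceptor ratings below are invented for the sketch, not data from the presentation.

```python
# Hypothetical illustration of predictive validity: correlate simulation
# assessment scores with later clinical performance ratings.
from statistics import correlation  # Pearson's r; available in Python 3.10+

sim_scores = [70, 75, 80, 82, 88, 90, 93]               # simulation assessment scores
clinical_ratings = [2.8, 3.0, 3.4, 3.3, 3.9, 4.1, 4.5]  # later preceptor ratings (1-5 scale)

# A coefficient near 1.0 suggests the assessment predicts later performance well.
r = correlation(sim_scores, clinical_ratings)
print(f"predictive validity estimate (Pearson r): {r:.2f}")
```

Content and face validity, by contrast, are usually established through blueprinting and expert review rather than a computed statistic.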

Slide36

Assessments - reliability

Does the test consistently measure what it is supposed to be measuring?
Types of reliability (a short sketch follows below):
Inter-rater (consistency over raters)
Test-retest (consistency over time)
Internal consistency (over different items/forms)
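As a minimal sketch, assuming invented rater decisions and rubric scores, two of these reliability types can be estimated with standard indices: simple percent agreement for inter-rater reliability and Cronbach's alpha for internal consistency. The data and the choice of indices are illustrative only.

```python
# Hypothetical sketch: two reliability indices for a simulation checklist.
# All scores below are invented.

# Inter-rater reliability: two faculty rate the same 8 performances pass(1)/fail(0).
rater_a = [1, 1, 0, 1, 0, 1, 1, 0]
rater_b = [1, 1, 0, 1, 1, 1, 1, 0]
agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"inter-rater percent agreement: {agreement:.0%}")

# Internal consistency: 5 students scored on a 4-item rubric (rows = students).
items = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
    [1, 2, 2, 2],
    [3, 3, 4, 4],
]

def cronbach_alpha(rows):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(rows[0])
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[j] for row in rows]) for j in range(k)]
    total_var = var([sum(row) for row in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

print(f"Cronbach's alpha (internal consistency): {cronbach_alpha(items):.2f}")
```

How high these indices need to be depends on the stakes: the summative, go/no-go uses described earlier generally demand stronger reliability evidence than formative ones.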

Slide37

Assessment Tools

Tools should measure KSA within the domains that you are measuring:
Cognitive
Psychomotor
Affective
Do these domains occur alone, or simultaneously?
Simulation offers the ability to assess each of these domains: an application of the cognitive domain while performing psychomotor skills, as the student demonstrates how they have internalized values, attitudes, and beliefs.

Slide38

Where did I start?

“Low-hanging fruit” TASTED GREAT!!
Self-reported:
Confidence
Increased critical thinking
Satisfaction
Situational awareness

Slide39
Slide40

Where should you start?

Tools developed for your OBJECTIVES!
To measure clinical judgment, use a tool developed for this: Lasater (2007).
Adds to reliability and validity
May combine instruments

What about the tool you use for clinical evaluation?

Is it reliable? Valid? Who developed it? Have you had consistency issues with tool/students in clinical?

Does it measure what you really want it to?

Slide41

Assessments - feasibility

Is the administration of the assessment instrument feasible in terms of time and resources?

Time to construct?

Time to score?
Ease of interpreting the score/producing results?
Practical given staffing/organization?
Quality of feedback?
Learner takeaway?
Motivate learner?

Slide42

Practicality

Number of students to be assessed
Time available for the assessment
Number of staff available
Resources/equipment available
Special accommodations

Slide43

Examples of Tools

Kardong-Edgren, S., Adamson, K.A., & Fitzgerald, C. (2010, January). A review of currently published evaluation instruments for human patient simulation. Clinical Simulation in Nursing, 6(1), e25-e35. doi:10.1016/j.ecns.2009.08.004

Slide44

Exercise

Let’s try it: OUT LOUD
Groups
Hospital
Emergency
Nursing education
Safe Medication Administration
How does this link to the programmatic outcomes and then to your course?

Slide45

Safe Medication Administration

How are you measuring this now?
Summative
Formative
Knowledge (cognitive exams)
Skills/Psychomotor (lab check-off)
Attitudes/Affective (what would you examine, or are you examining, here?) Likert satisfaction? Self-confidence?

Slide46

Model of Competence

Slide47

Knowing: Safe Med Administration

Each group: write a high-level MC question for safely administering a specific medication (choose one).
Is the student who answers this question safe and competent?

Slide48

Model of Competence

Slide49

Knows How: Safe Med Administration

Write a short case related to giving the same medication.
What components must the student tell the grader?
How to administer the med?
Side effects?
Teaching?
What else will we measure?
For the student who reaches all of these assessment criteria… are they competent and safe to give the medication?

Slide50

Model of Competence

Slide51

Shows How: Safe Med Administration

Take the case above and the objectives and apply them to a simulation.
This can be simple or advanced, incorporating teamwork/communication/high acuity.
What KSAs are required?
Which “student” do you want taking care of you?

Slide52

References

Decker, S., Sportsman, S., Puetz, L., & Billings, L. (2008). The evolution of simulation and its contribution to competency. The Journal of Continuing Education in Nursing, 39(2), 74-80.

Groom, J.A. (2009). Creating new solutions to the simulation puzzle. Simulation in Healthcare, 4(3), 131. doi:10.1097/SIH.0b013e3181b3e4c3

Kardong-Edgren, S., Adamson, K.A., & Fitzgerald, C. (2010, January). A review of currently published evaluation instruments for human patient simulation. Clinical Simulation in Nursing, 6(1), e25-e35. doi:10.1016/j.ecns.2009.08.004

McKimm, J. (2010). University of Auckland; Visiting Professor of Healthcare Education and Leadership, University of Bedfordshire; Honorary Professor in Medical Education, Swansea University. Retrieved from http://www.faculty.londondeanery.ac.uk/e-learning/setting-learning-objectives/learning-objectives-and-learning-outcomes

Scalese, R.J., Obeso, V.T., & Issenberg, S.B. (2008). Simulation technology for skills training and competency assessment in medical education. Journal of General Internal Medicine, 23(1), 46-49. doi:10.1007/s11606-007-0283-

Wass, V., Van der Vleuten, C., Shatzer, J., & Jones, R. (2001). Assessment of clinical competence. Lancet, 357(9260), 945-949.