Automated Web-Based Behavioral Test for Early Detection of Alzheimer’s Disease
Presentation Transcript

Slide1

Automated Web-Based Behavioral Test for Early Detection of Alzheimer’s Disease

Eugene Agichtein*, Elizabeth Buffalo, Dmitry Lagun, Allan Levey, Cecelia Manzanares, JongHo Shin, Stuart Zola

Intelligent Information Access Lab, Emory University

Slide2

Emory IR Lab: Research Directions

Modeling collaborative content creation for information organization and indexing.

Mining search behavior data to improve information finding.

Medical applications of search, NLP, and behavior modeling.

Slide3

Mild Cognitive Impairment (MCI) and Alzheimer’s Disease

Alzheimer’s disease (AD) affects more than 5M Americans, a number expected to grow in the coming decade.

Memory impairment (aMCI) indicates the onset of AD, which affects the hippocampus first.

The Visual Paired Comparison (VPC) task is promising for early diagnosis of both MCI and AD before either is detectable by other means.

Slide4

VPC: Familiarization Phase


Slide5

VPC: Delay Phase

Delay


Slide6

VPC: Test Phase


Slide7

VPC Task: Eye Tracking Equipment


Slide8


Subjects with normal visual recognition memory spend more than 66% of viewing time on the novel images.
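As a rough illustration of this metric, here is a minimal sketch (not the authors’ code) of computing a novelty preference score from per-fixation records in the test phase; the record format is an assumption.

```python
def novelty_preference(fixations):
    """Fraction of total viewing time spent on the novel image.

    `fixations` is assumed to be a list of (side, duration_ms) tuples, where
    `side` is "novel" or "familiar" depending on which test-phase image the
    fixation landed on.
    """
    novel_time = sum(d for side, d in fixations if side == "novel")
    total_time = sum(d for _, d in fixations)
    return novel_time / total_time if total_time else 0.0

# Example: a subject with intact recognition memory typically scores > 0.66.
fixations = [("novel", 420), ("familiar", 180), ("novel", 350), ("familiar", 150)]
print(novelty_preference(fixations))  # 0.7
```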

Slide9

VPC: Low Performance Indicates Increased Risk for Alzheimer’s Disease

1. Detects onset earlier than ever before possible
2. Sets the stage for intervention


Slide10

Behavioral Performance on the VPC Test is a Predictor of Cognitive Decline


[Zola et al., AAIC 2012]

Scores on the VPC task accurately predicted, up to three years before a change in clinical diagnosis, which MCI patients would progress to AD and which normal subjects would progress to MCI.

Slide11

VPC: Gaze Movement Analysis

Lagun et al., Journal of Neuroscience Methods, 2011

Visual examination behavior in the VPC test phase.

In this representative example, the familiar image is on the left (A), and the novel image is on the right (B), for a normal control subject. The detected gaze positions are indicated by blue circles, with the connecting lines indicating the ordering of the gaze positions.

Slide12

Technical Contribution: Eye Movement Analysis


Lagun et al., Journal of Neuroscience Methods, 2011

Slide13

Significant Performance Improvements


Method     Features       Accuracy        Sensitivity     Specificity    AUC
Baseline   NP             0.667           0.6             0.734          0.667
LR         NP+SO+RF+FD    0.71            0.712           0.707          0.71
SVM        NP+SO+RF+FD    0.869* (+30%)   0.967* (+61%)   0.772* (+5%)   0.869* (+30%)

Lagun et al., Journal of Neuroscience Methods, 2011
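The comparison above can be reproduced in spirit with standard tools. The sketch below (placeholder data, assumed feature set) shows cross-validated evaluation of a logistic regression and an SVM using scikit-learn; it is illustrative only, not the study pipeline.

```python
# Illustrative sketch: the feature matrix and labels are synthetic placeholders,
# not the study data, and the feature composition is an assumption.
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))        # one row per subject: NP plus other gaze features
y = rng.integers(0, 2, size=60)     # 1 = impaired, 0 = normal control

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for name, model in [("LR", LogisticRegression(max_iter=1000)),
                    ("SVM", SVC(kernel="rbf"))]:
    acc = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
    auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
    print(f"{name}: accuracy {acc.mean():.3f}, AUC {auc.mean():.3f}")
```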

Slide14

Our Big Idea:

Web-based VPC task (VPC-W)

with E. Buffalo, D. Lagun, S. Zola

A web-based version of VPC that requires no eye tracker, so it can be administered anywhere in the world on any modern computer. Classification algorithms can be adapted to automatically interpret the viewing data collected with VPC-W.


Slide15

VPC-W Architecture


Slide16

VPC-W: basic prototype demo

Familiarization (identical images) → Delay → Test (novel image on left), with the viewport position tracked throughout.
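A rough sketch of how logged viewport positions could be turned into per-image viewing time (which in turn feeds the novelty preference score); the sample format and the left/right split rule are assumptions, not the actual VPC-W logging format.

```python
def viewing_time_by_side(samples, screen_width):
    """samples: list of (t_ms, viewport_center_x); returns (left_ms, right_ms)."""
    left = right = 0
    for (t0, x), (t1, _) in zip(samples, samples[1:]):
        dt = t1 - t0                     # time spent at this viewport position
        if x < screen_width / 2:
            left += dt
        else:
            right += dt
    return left, right

samples = [(0, 200), (250, 220), (500, 900), (800, 950), (1200, 300)]
print(viewing_time_by_side(samples, screen_width=1280))  # (500, 700)
```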


Slide17

Experiment Overview
Step 1: Optimize VPC-W on (presumably) normal control (NC) subjects
Step 2: Analyze VPC-W subject behavior with both gaze tracking and viewport tracking simultaneously
Step 3: Validate VPC-W prediction in discriminating impaired (MCI/AD) vs. NC subjects


Slide18

VPC-W: Novelty Preference Preserved

Delay (seconds)   Mean novelty preference, VPC (N=30)   Mean novelty preference, VPC-W (N=34)
10                67%                                   65%
60                68%                                   69%

Self-reported elderly NC subjects tested with VPC-W over the internet exhibit novelty preference similar to that of VPC. A single-factor ANOVA reveals no significant difference between VPC and VPC-W subjects.
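A minimal sketch of the single-factor ANOVA mentioned above, comparing per-subject novelty preference between the two groups; the score arrays are placeholders, not the study data.

```python
import numpy as np
from scipy.stats import f_oneway

# Placeholder per-subject novelty preference scores; the study used N=30 (VPC)
# and N=34 (VPC-W) elderly normal-control subjects.
vpc_scores = np.array([0.67, 0.71, 0.64, 0.69, 0.66])
vpcw_scores = np.array([0.65, 0.70, 0.63, 0.68, 0.72])

f_stat, p_value = f_oneway(vpc_scores, vpcw_scores)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # for these values, p >> 0.05 (no significant difference)
```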


Slide19

VPC vs. VPC-W: Similar Areas of Interest

VPC ranking vs. VPC-W ranking
Quantifying viewing similarity (coarse measure): divide the image into 9 regions (3x3) and rank the regions by VPC and VPC-W viewing time. The Spearman rank correlation varies between 0.56 and 0.72 for different stimuli.

Areas of attention: the heat map for VPC-W (viewport-based) is concentrated in areas similar to those for VPC (unrestricted eye tracking).
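The coarse 3x3 similarity measure can be sketched as follows; the viewing times below are made-up values, and scipy’s spearmanr performs the rank correlation.

```python
import numpy as np
from scipy.stats import spearmanr

# Total viewing time (seconds) per cell of a 3x3 grid over the stimulus,
# measured with eye tracking (VPC) and viewport tracking (VPC-W).
# Values are illustrative, not study data.
vpc_times = np.array([1.2, 0.4, 0.1, 2.0, 3.1, 0.6, 0.3, 0.9, 0.2])
vpcw_times = np.array([0.9, 0.5, 0.2, 1.7, 2.5, 0.8, 0.2, 1.1, 0.3])

rho, p = spearmanr(vpc_times, vpcw_times)
print(f"Spearman rank correlation: {rho:.2f} (p = {p:.3f})")
```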


Slide20

Actual Gaze vs. Viewport Position


Attention w.r.t. the viewport

Slide21

Eye-Cursor Time Lag Analysis


XY: minimum at -75.00 ms (199.8578)
X: minimum at -90.00 ms (161.8480)
Y: minimum at -35.00 ms (116.3665)
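One way to perform such a lag analysis is sketched below: shift the cursor/viewport trace relative to the gaze trace and find the lag that minimizes the mean distance between them. The sampling interval, trace shapes, and names are assumptions for illustration.

```python
import numpy as np

def lag_with_min_distance(gaze_xy, cursor_xy, max_lag_samples):
    """Return (best_lag, min_mean_distance) over lags in [-max_lag, +max_lag].

    A negative lag means the eye leads the cursor, matching the convention above.
    """
    best_lag, best_dist = 0, np.inf
    n = len(gaze_xy)
    for lag in range(-max_lag_samples, max_lag_samples + 1):
        if lag >= 0:
            g, c = gaze_xy[lag:], cursor_xy[:n - lag]
        else:
            g, c = gaze_xy[:n + lag], cursor_xy[-lag:]
        mean_dist = np.linalg.norm(g - c, axis=1).mean()
        if mean_dist < best_dist:
            best_lag, best_dist = lag, mean_dist
    return best_lag, best_dist

# Synthetic traces sampled every 20 ms; the cursor trails the gaze by ~80 ms.
t = np.arange(500)
gaze = np.stack([np.sin(t / 30.0), np.cos(t / 40.0)], axis=1) * 100
cursor = np.roll(gaze, 4, axis=0)
lag, dist = lag_with_min_distance(gaze, cursor, max_lag_samples=10)
print(f"best lag: {lag * 20} ms, mean distance: {dist:.2f}")  # best lag: -80 ms
```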

Slide22

Viewport Movement ~ Eye Movement

Normal elderly subject (NP = 88%, novel image on left).

Impaired elderly subject (NP = 49%, novel image on left).


Slide23

Exploiting Viewport Movement Data

Features: novelty preference + fixation duration distribution.
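A small sketch of how such a feature vector might be assembled; the bin edges, units, and dwell-duration estimate are assumptions, not the authors’ exact features.

```python
import numpy as np

def feature_vector(novelty_pref, dwell_durations_ms,
                   bins=(0, 100, 200, 400, 800, 1600, np.inf)):
    """Concatenate novelty preference with a normalized dwell-duration histogram."""
    hist, _ = np.histogram(dwell_durations_ms, bins=bins)
    hist = hist / max(hist.sum(), 1)  # normalize counts to a distribution
    return np.concatenate([[novelty_pref], hist])

print(feature_vector(0.70, [120, 340, 90, 510, 260, 700]))
```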


Slide24

VPC-W Results: Detecting MCI

21 subjects (11 NC, 10 aMCI), recruited at the Emory ADRC.

Accuracy on the pilot data is comparable to the best reported values for a manually administered cognitive assessment test (MC-FAQ: reported accuracy, specificity, and sensitivity of 0.83, 0.9, and 0.89, respectively) (Steenland et al., 2009).

                        5-fold CV                   10-fold CV                  Leave-1-out
Classification method   Acc.   Sens.  Spec.  AUC    Acc.   Sens.  Spec.  AUC    Acc.   Sens.  Spec.  AUC
Baseline: NP >= 0.58    0.81   0.80   0.82   0.81   0.81   0.80   0.82   0.81   0.81   0.80   0.82   0.81
SVM (VPC-W)             0.81   0.80   0.83   0.81   0.85   0.80   0.90   0.86   0.86   0.80   0.91   0.86

Accuracy, sensitivity, specificity, and AUC (area under the ROC curve) for automatically classifying patients tested with VPC-W, using 5-fold, 10-fold, and leave-one-out cross-validation.


Slide25

Current Work

Analysis:
Applying deep learning and “motif” analysis for more accurate trajectory analysis
Incorporating visual saliency signals

Data collection:
Longitudinal tracking of subjects
“Test/retest”: effects of repeated testing
Sensitivity analysis, for possible use in drug trials
A wide range of “normative” data using the MTurk worker pool


Slide26

Future Directions and Collaboration Possibilities

Can we apply similar or the same techniques for cost-effective and accessible detection of:
Autism (previous work on differences in gaze patterns)
ADHD
Stroke / brain trauma
Other possibilities?

What can we learn about searchers from their natural search and browsing behavior?
Image search and examination preferences (anorexia)
Correlate behavior with biomarkers (Health 101 cohort)

Slide27

VPC-W Summary

VPC-W, administered over the internet, elicits viewing behavior in normal elderly subjects similar to that of the eye-tracking-based VPC task in the clinic.

Preliminary results show automatic identification of amnestic MCI subjects with accuracy comparable to the best manually administered tests. VPC-W and the associated classification algorithms could facilitate cost-effective and widely accessible screening for memory loss with a simple log-on to a computer.

Other potential applications: online detection and monitoring of ADD, ADHD, autism, and other neurological disorders. This project has the potential to dramatically enhance the current practice of Alzheimer’s clinical and translational research.


Slide28

Eye Tracking for Interpreting Search Behavior

Eye tracking gives information about searcher interests:
Eye position
Pupil diameter
Saccades and fixations


Slide29

We Will Put an Eye Tracker on Every Table! - E. Agichtein, 2010

Problem: eye-tracking equipment is expensive and not widely available.
Solution: infer searcher gaze position from searcher interactions.


Slide30

Inferring Gaze from Mouse Movements

Actual vs. predicted eye-mouse coordination. Observed coordination patterns:
No coordination (35%)
Bookmarking (30%)
Eye follows mouse (35%)


Guo & Agichtein, CHI WIP 2010

Slide31

Post-click Page Examination Patterns

Two basic patterns: “reading” and “scanning”.
“Reading”: consuming or verifying information when (seemingly) relevant content has been found.
“Scanning”: the relevant information has not yet been found; the user is still visually searching.

Slide32

Cursor Heatmaps (Reading vs. Scanning)

[Task: “verizon helpline number”]

Relevant (dwell time: 30 s)
Not relevant (dwell time: 30 s)

Move cursor horizontally → “reading”
Passively move cursor → “scanning”

Slide33

Typical Viewing Behavior (Complex Patterns) [Task: “number of dead pixels to replace a Mac”]


Relevant (dwell time: 70 s)
Not relevant (dwell time: 80 s)

Actively move the cursor with pauses → “reading” dominant
Keep the cursor still and scroll → “scanning” dominant