Presentation Transcript

Slide1

FT-IR OC & EC predictions in IMPROVE & CSN networks across multiple years

Bruno Debus, Andy Weakley, Satoshi Takahama, Ann Dillner. Oct 22nd, 2019, Petaluma, California.


Slide2

Acknowledgements

Funding for this project:
EPA and IMPROVE (NPS Cooperative Agreement P11AC91045)
EPRI (Agreement 10003745 and 10005355)
Swiss Polytechnic University-Lausanne (EPFL)

Collaborators, post-docs and undergraduate / graduate students:
CSN, FRM, IMPROVE, SEARCH programs and site/state personnel
Joann Rice, Mike Hays, Emily Li, EPA
Bret Schichtel and Scott Copeland, IMPROVE
Stephanie Shaw and Eric Edgerton, EPRI/ARA
Randall Martin and the SPARTAN personnel, SPARTAN and Washington University
Dave Diner and MAIA team members, MAIA and JPL
Alexandra Boris, Kelsey Seibert, Travis Ruthenburg, Mohammed Kamruzzaman, Charlotte Burki, Amir Yazdani, Brian Trout, Jenny Hand, Katie George, Charity Coury, Sean Raffuse, Tony Wexler

Slide3

FT-IR: Strengths & Limitations for network applications

Strengths:
Non-destructive.
Fast and low-cost: 5 min/sample, 3 instruments, 400-700 filters/wk, 6 undergrads and 1 lab supervisor.
Analyzing all IMPROVE and CSN PTFE filters.
PM2.5 PTFE/Teflon filters: routinely collected, no gas-phase adsorption.
FT-IR spectra are information rich: TOR OC and EC, organic functional groups, OM, sources, and inorganics including SO4, NO3, SiO3.

Limitations:
Calibration methods are complex.
PTFE filter manufacturer (Pall, MTL) dependent.
No directly comparable methods for functional groups and OM to validate data (no gold standard).

Slide4

FT-IR in routine network measurements

FT-IR spectroscopy: extract quantitative information about IR-active substances (mass, carbon, ions, elements, functional groups).

IMPROVE: 75,505 Teflon filters (2015 – 2018)
CSN: 26,936 Teflon filters (2017 – 2018)

Slide5

Quantitative analysis of ambient samples using FT-IR

Basic considerations

Slide6

From "Global" to "Multi-level" modeling

Global model: a single calibration using a sample subset from every site.

Drawbacks & limitations:
1. Low prediction quality for samples collected during wildfire events; smoke-impacted sample detection? (IMPROVE)
2. Sites with unusual composition; Atypical / Typical partitioning? (CSN)
3. Requires collocated Teflon & quartz modules at every location around the network.

A.T. Weakley et al. (2018) Ambient aerosol composition by infrared spectroscopy and partial least squares in the chemical speciation network: Multilevel modeling for elemental carbon, Aerosol Science and Technology, 52:6, 642-654.
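Since the cited calibrations are built with partial least squares (Weakley et al., 2018), a minimal sketch of that idea is shown below. It is not the authors' code: the synthetic data, variable names, and number of latent variables are illustrative assumptions only.

```python
# Minimal sketch (not the authors' code) of a PLS calibration: predict TOR EC from
# FT-IR spectra. Synthetic data, variable names, and n_components are assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1500))         # placeholder FT-IR absorbance spectra
y = rng.lognormal(mean=-1.0, size=200)   # placeholder collocated TOR EC (ug/m3)

pls = PLSRegression(n_components=15, scale=False)   # latent variables chosen by CV in practice
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()  # 10-fold cross-validated predictions

print(f"cross-validated R2 = {np.corrcoef(y, y_cv)[0, 1] ** 2:.2f}")
```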

Slide7

Wildfire detection – IMPROVE (2015)

Based on a simple OC/EC criterion.
Seasonality & TOR OC concentrations are consistent with fire season / emissions.
≈ 341 samples
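A minimal sketch of such an OC/EC screening criterion follows; the threshold value, column names, and example data are assumptions rather than values from the study.

```python
# Minimal sketch of flagging smoke-impacted samples with an OC/EC ratio criterion.
# The threshold (15) and the column names are hypothetical, not from the study.
import pandas as pd

def flag_smoke_impacted(df, ratio_threshold=15.0):
    """Return True for samples whose TOR OC/EC ratio exceeds the (assumed) threshold."""
    ratio = df["OC_TOR"] / df["EC_TOR"].clip(lower=1e-3)  # guard against near-zero EC
    return ratio > ratio_threshold

# Example with made-up concentrations (ug/m3)
samples = pd.DataFrame({"OC_TOR": [1.2, 8.5, 0.9], "EC_TOR": [0.30, 0.20, 0.25]})
samples["smoke_impacted"] = flag_smoke_impacted(samples)
print(samples)
```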

Slide8

Typical / Atypical sites – CSN (2017)

Each site is represented as a function of its mean OC/EC ratio and prediction error (Global model).
Clustering is used to partition Atypical sites from Typical sites.
Cluster 1: 124 Typical sites. Cluster 2: 15 Atypical sites.

Atypical sites (n = 15):
Site name                        State
Adams County                     CO
Platteville                      CO
Criscuolo Park                   CT
Rome - Elementary School         GA
Indianapolis - Washington Park   IN
Elizabeth Lab                    NJ
New York - Division Street       NY
Harvard Yard (Cleveland)         OH
Southerly WTP                    OH
Akron - 5 Points                 OH
Marcus Hook                      PA
NE Wastewater Treatment Plant    PA
Jail at Bayamon                  PR
Seattle 10th Ave                 WA
Charleston NCore                 WV
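As noted above, the Typical / Atypical split comes from clustering each site's mean OC/EC ratio and global-model prediction error. The sketch below illustrates one way this could be done; the choice of k-means with k = 2, the standardization, and the synthetic site features are assumptions.

```python
# Minimal sketch of partitioning sites into Typical / Atypical groups from each
# site's mean OC/EC ratio and its global-model prediction error. k-means with
# k = 2, the scaling, and the synthetic features are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# one row per site: [mean OC/EC ratio, mean prediction error (%)] (made-up values)
site_features = np.vstack([
    rng.normal([4.0, 15.0], [1.0, 5.0], size=(124, 2)),   # "typical"-like sites
    rng.normal([1.5, 45.0], [0.5, 10.0], size=(15, 2)),   # "atypical"-like sites
])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(site_features)
)
print("cluster sizes:", np.bincount(labels))
```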

Slide9

Site selection – Flowchart

Flowchart: Typical, Atypical, and Non-fire samples feed into site-number optimization and site-combination optimization.

Slide10

Site number – Optimization (EC): IMPROVE and CSN, 2015 – 2018.

Slide11

Site combination – Optimization (EC)

Identify the site combination with optimal predictions by examining 3,000 potential site list candidates using a Monte Carlo method. Selection criteria: high R2 and near-zero bias; reliable predictions for both OC & EC; consistent predictions across multiple years.
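A minimal sketch of such a Monte Carlo site-combination search follows; the subset size, the PLS calibration, the bias definition, and the scoring rule trading off R2 against bias are all assumptions (the slide specifies only that 3,000 candidate site lists were examined and judged on R2 and bias).

```python
# Minimal sketch of a Monte Carlo site-combination search: draw random candidate
# site lists, calibrate on them, and keep the list giving high R2 and near-zero
# bias on the withheld sites. Subset size, the PLS model, the bias definition and
# the scoring rule are assumptions; the slide only specifies 3,000 candidates.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def evaluate(cal_sites, X, y, sites):
    """Fit on samples from the calibration sites, score on all remaining samples."""
    in_cal = np.isin(sites, cal_sites)
    model = PLSRegression(n_components=10, scale=False).fit(X[in_cal], y[in_cal])
    pred, obs = model.predict(X[~in_cal]).ravel(), y[~in_cal]
    r2 = np.corrcoef(obs, pred)[0, 1] ** 2
    bias = 100.0 * np.median(pred - obs) / np.median(obs)   # assumed normalized bias (%)
    return r2, bias

rng = np.random.default_rng(2)
n_sites, per_site = 30, 40
sites = np.repeat(np.arange(n_sites), per_site)              # site label of each sample
X = rng.normal(size=(sites.size, 500))                       # placeholder FT-IR spectra
y = rng.lognormal(size=sites.size)                           # placeholder TOR EC

best = None
for _ in range(300):                                         # the study examined 3,000 candidates
    cand = rng.choice(n_sites, size=8, replace=False)        # candidate calibration site list
    r2, bias = evaluate(cand, X, y, sites)
    score = r2 - abs(bias) / 100.0                           # favor high R2, near-zero bias
    if best is None or score > best[0]:
        best = (score, cand, r2, bias)

print("best site list:", sorted(best[1].tolist()), f"R2 = {best[2]:.2f}, bias = {best[3]:.1f} %")
```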

Slide12

Optimal IMPROVE site list
≈ 14 % of the network

Slide13

Optimal CSN site lists
≈ 14 % of the network

Slide14

Calibration / FT-IR predictions – Results

Slide15

Results – OC & EC prediction (IMPROVE, 2015 – 2018)

      R2     Bias (%)   Error (%)   < MDL (%)
OC    0.98   -0.3       12.9         4.3
EC    0.92    0.2       25.7        32.7

Satisfactory prediction metrics across a 4-year period.
Predictions from fire- and non-fire-impacted samples are reported together.
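For reference, the sketch below computes the four reported metrics from collocated observations and predictions; the exact normalized bias/error definitions and the reading of "< MDL" as the fraction of values below the method detection limit are assumptions.

```python
# Minimal sketch of the reported evaluation metrics (R2, Bias %, Error %, < MDL %).
# The median-based bias/error forms and the "< MDL" interpretation are assumptions.
import numpy as np

def prediction_metrics(obs, pred, mdl):
    """R2, normalized bias/error (%), and fraction of predictions below the MDL (%)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    r2 = np.corrcoef(obs, pred)[0, 1] ** 2
    bias_pct = 100.0 * np.median((pred - obs) / obs)         # assumed median normalized bias
    error_pct = 100.0 * np.median(np.abs(pred - obs) / obs)  # assumed median normalized error
    below_mdl_pct = 100.0 * np.mean(pred < mdl)              # share of predictions below MDL
    return {"R2": round(r2, 2), "Bias (%)": round(bias_pct, 1),
            "Error (%)": round(error_pct, 1), "< MDL (%)": round(below_mdl_pct, 1)}

# Example with made-up OC concentrations (ug/m3) and an illustrative MDL of 0.2 ug/m3
print(prediction_metrics(obs=[1.0, 2.5, 0.8, 4.0], pred=[1.1, 2.3, 0.7, 4.2], mdl=0.2))
```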

Slide16

Results – OC & EC prediction (CSN, 2017 – 2018)

      R2     Bias (%)   Error (%)   < MDL (%)
OC    0.93   -0.8       13.6         0.6
EC    0.75    0.5       25.5         9.9

Satisfactory prediction metrics across a 2-year period.
Predictions from both Typical & Atypical sites are reported together.

Slide17

Ions & Elements FT-IR predictions

IMPROVE:
Species              R2     Bias (%)   Error (%)   < MDL (%)
S                    0.98    0.2         5.6         1.4
PM2.5                0.98    0.5         5.7         0.2
SO4                  0.98    0.1         5.8         1.1
NH4                  0.95    1.2         8.8         1.2
Soil                 0.98    1.5         9.5         8.9
Si                   0.98    2.0        11.2        14.7
Ca                   0.97    1.0        11.3         6.9
Al                   0.98    0.6        12.3         8.8
OC                   0.98    0.8        12.7         0.4
HIPS                 0.88   -3.0        22.1        17.7
Fe                   0.93    3.1        23.9        16.8
Ti                   0.92    0.6        24.8        19.4
EC                   0.91    1.6        25.9        15.8
NO3 (winter north)   0.93   10.7        48.6        25.1

CSN:
Species              R2     Bias (%)   Error (%)   < MDL (%)
S                    0.94    0.1         9.6         2.7
OC                   0.94   -0.7        13.3         0.7
SO4                  0.83   -0.3        14.8         4.2
EC                   0.79    0.2        25.2        12.6
Ca                   0.82   -0.9        31.1        16.7
Si                   0.86   -2.9        41.3        34.6
NO3                  0.88   13.3        45.7        15.2
Ti                   0.68   -9.6        59.9        27.1
NH4                  0.84    3.0        66.8        47.7
Al                   0.73  -41.7       102.6        68.6

Several species show similar (%) error to OC; others show similar (%) error to EC.
Besides carbon, additional IR-active materials can be predicted from Teflon filters (XRF, IC).
These data can be used for QC, and the calibrations developed for CSN could be extended to FRM (as previously shown for OC & EC).

Slide18

Conclusions

Multi-level models accommodate unique variations in aerosol composition across the networks and improve predictions:
IMPROVE: Fire / Non-fire models
CSN: Atypical / Typical sites

The number of sites retained in the calibration to maintain accurate predictions, and the corresponding site selection, was optimized via a Monte Carlo method:
IMPROVE: 22 sites retained (14 % of the network)
CSN: 20 sites retained (14 % of the network)

The multi-level modeling provides reliable TOR-equivalent OC & EC concentrations across 4 years for IMPROVE and 2 years for CSN.

In addition to carbon, IR-active materials such as sulfate and silicate can be predicted from IR spectra of Teflon filters:
Useful for QC for IMPROVE and CSN.
CSN calibrations can be used for the FRM network.

Slide19

Thank you for your attention. Please send requests for additional plots / analysis to bdebus@ucdavis.edu.

Slide20


Slide21

Supporting Materials


Slide22

FT-IR Lab – UC Davis

Three FT-IR instruments (IR1, IR2, IR3) with purge systems and an automatic LN2 refilling system.

Slide23

Wildfire detection – IMPROVE (2016)

≈ 180 samples

Slide24

Wildfire detection – IMPROVE (2017)

≈ 620 samples

Slide25

Wildfire detection – IMPROVE (2018)

≈ 560 samples

Slide26

Initial site selection strategy

Each site is summarized by its median TOR EC & NH4 concentrations.
The site closest to the bin center is selected for calibration (representative).
The optimal number of sites is assessed by varying the number of bins.
Example of bin segmentation (IMPROVE 2015).
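A minimal sketch of this bin-based selection follows; the number of bins, the unscaled distance to the bin center, and the synthetic site medians are assumptions.

```python
# Minimal sketch of the bin-based initial site selection: bin the sites on their
# median TOR EC and NH4, keep the site closest to each occupied bin center.
# The bin count, the unscaled Euclidean distance, and the data are assumptions.
import numpy as np
import pandas as pd

def select_sites(site_medians, n_bins=4):
    """site_medians: DataFrame indexed by site code with columns ['EC', 'NH4'] (ug/m3)."""
    work = site_medians.copy()
    for col in ["EC", "NH4"]:
        edges = np.linspace(work[col].min(), work[col].max(), n_bins + 1)
        idx = np.clip(np.digitize(work[col], edges) - 1, 0, n_bins - 1)
        centers = (edges[:-1] + edges[1:]) / 2.0
        work[col + "_bin"] = idx
        work[col + "_center"] = centers[idx]
    # distance of each site to its (EC, NH4) bin center; axes would be scaled in practice
    work["dist"] = np.hypot(work["EC"] - work["EC_center"], work["NH4"] - work["NH4_center"])
    # keep the most representative (closest-to-center) site in each occupied bin
    return work.groupby(["EC_bin", "NH4_bin"])["dist"].idxmin().tolist()

# Example with made-up site medians
rng = np.random.default_rng(3)
sites = pd.DataFrame(
    {"EC": rng.lognormal(-1.5, 0.6, 40), "NH4": rng.lognormal(-1.0, 0.5, 40)},
    index=[f"SITE{i:02d}" for i in range(40)],
)
print(select_sites(sites))
```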

Slide27

Inter-year comparison of the top 30 site-list candidates (IMPROVE – OC)
Optimum

Slide28

Inter-year comparison of the top 30 site-list candidates (IMPROVE – EC)
Optimum

Slide29

Inter-year comparison of the top 30 Atypical site-list candidates (CSN – OC)
Optimum

Slide30

Inter-year comparison of the top 30 Atypical site-list candidates (CSN – EC)
Optimum

Slide31

Inter-year comparison of the top 30 Typical site-list candidates (CSN – OC)
Optimum

Slide32

Inter-year comparison of the top 30 Typical site-list candidates (CSN – EC)
Optimum

Slide33

Optimal site lists – Details

IMPROVE:
Site index   Site name                      State   Affiliation
BAND1        Bandelier                      NM      NPS
BIBE1        Big Bend National Park         TX      NPS
CABA1        Casco Bay                      ME      STATE
CORI1        Columbia River Gorge           OR      FS
FLTO1        Flat Tops Wilderness           CO      FS
GLAC1        Glacier                        MT      NPS
HAVO1        Hawaii Volcanoes               HI      NPS
JARB1        Jarbidge                       NV      FS
LASU2        Lake Sugema                    IA      STATE
LIGO1        Linville Gorge                 NC      FS
LTCC1        Lake Tahoe Community College   CA      STATE
MAVI1        Martha's Vineyard              MA      TRIBE
MONT1        Monture                        MT      FS
MOOS1        Moosehorn                      ME      FWS
MORA1        Mount Rainier                  WA      NPS
OLYM1        Olympic                        WA      NPS
RAFA1        San Rafael                     CA      FS
SHRO1        Shining Rock                   NC      FS
TALL1        Tallgrass                      KS      STATE
THSI1        Three Sisters                  OR      FS
VIIS1        Virgin Islands                 VI      NPS
WHIT1        White Mountain                 NM      FS

CSN – Typical sites:
Site index    Site name                            State
02-090-0034   Alaska NCore                         AK
06-067-0006   Sacramento - Del Paso Manor          CA
11-001-0043   Washington DC - McMillan Reservoir   DC
18-163-0021   Evansville - Buena Vista Road        IN
29-510-0085   St. Louis - Blair Street             MO
34-013-0003   Newark Firehouse                     NJ
36-031-0003   Whiteface                            NY
37-067-0022   Winston-Salem - Hattie Ave           NC
37-119-0041   Garinger High School                 NC
39-113-0038   Sinclair Community College           OH
40-109-1037   OCUSA Campus                         OK
42-071-0012   Lancaster Downwind                   PA
42-125-5001   East of Pittsburgh - Florence        PA
48-201-1039   Deer Park                            TX
53-033-0080   Seattle - Beacon Hill                WA
55-119-8001   Perkinstown CASTNET                  WI

CSN – Atypical sites:
Site index    Site name                        State
08-123-0008   Platteville                      CO
09-009-0027   Criscuolo Park                   CT
18-097-0078   Indianapolis - Washington Park   IN
34-039-0004   Elizabeth Lab                    NJ

Slide34

IMPROVE 2015 – OC & EC prediction

      R2     Bias (%)   Error (%)   < MDL (%)
OC    0.98    0.8       12.7         0.4
EC    0.91    1.6       25.9        15.8

Slide35

IMPROVE 2016 – OC & EC prediction

      R2     Bias (%)   Error (%)   < MDL (%)
OC    0.97   -1.7       14.5         1.5
EC    0.91    0.4       29.0        18.8

Slide36

IMPROVE 2017 – OC & EC prediction

      R2     Bias (%)   Error (%)   < MDL (%)
OC    0.98   -0.8       12.1         1.5
EC    0.91   -0.2       23.8         4.8

Slide37

IMPROVE 2018 – OC & EC prediction

      R2     Bias (%)   Error (%)   < MDL (%)
OC    0.99    0.6       12.4         0.4
EC    0.92   -0.9       24.0         7.1

Slide38

CSN 2017 – OC & EC prediction

      R2     Bias (%)   Error (%)   < MDL (%)
OC    0.94   -0.7       13.3         0.7
EC    0.79    0.2       25.2        12.9

Slide39

CSN 2018 – OC & EC prediction

      R2     Bias (%)   Error (%)   < MDL (%)
OC    0.92   -1.0       14.1         0.4
EC    0.69    1.3       26.0         7.1

Slide40

OC bias distribution – IMPROVE (2015 – 2018)
Percentile bias located within TOR uncertainty boundaries.

Slide41

EC bias distribution – IMPROVE (2015 – 2018)
Percentile bias located within TOR uncertainty boundaries.

Slide42

OC bias distribution – CSN (2017 – 2018)
Percentile bias located within TOR uncertainty boundaries.

Slide43

EC bias distribution – CSN (2017 – 2018)
Percentile bias located within TOR uncertainty boundaries.

Slide44

IMPROVE – Prediction of IR-active ions & elements from Teflon filters

Winter North samples only (NO3).

Slide45

Spatial distribution of the 79 sites considered for developing a Winter North Nitrate calibration (IMPROVE)

Slide46

CSN – Prediction of IR-active ions & elements from Teflon filters