
Presentation Transcript

Slide1

Radha Venkatagiri¹, Abdulrahman Mahmoud¹, Siva Kumar Sastry Hari², Sarita Adve¹ (¹University of Illinois at Urbana-Champaign, ²NVIDIA Research)

Approxilyzer: Towards A Systematic Framework for Instruction-Level Approximate Computing and its Application to Hardware Resiliency

Let them eat cake! Make informed choices!

Slide2

Introduction

[Figure: an error in the program degrades the quality of the final end-to-end output (here, an 11% degradation); APPROXILYZER determines this relationship.]

Approximate computing trades off output quality for power, performance, resilience, and more. The fundamental problem: how do errors in a program affect its output quality?
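To make the question concrete, here is a toy sketch (my own illustration, not Approxilyzer's mechanism): flip one bit in an intermediate value of a tiny computation and measure how far the final output drifts from the error-free (golden) result.

    import struct

    def run(price, bit_to_flip=None):
        scaled = price * 1.07                      # some intermediate computation
        if bit_to_flip is not None:                # corrupt one bit of its encoding
            raw = struct.unpack("<Q", struct.pack("<d", scaled))[0]
            scaled = struct.unpack("<d", struct.pack("<Q", raw ^ (1 << bit_to_flip)))[0]
        return scaled + 2.0                        # final end-to-end output

    golden = run(100.0)
    faulty = run(100.0, bit_to_flip=50)            # a single-bit error in the program
    loss = abs(faulty - golden) / abs(golden) * 100
    print(f"golden={golden:.2f}  faulty={faulty:.2f}  quality loss={loss:.1f}%")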

Slide3

Contributions (1 of 2): Approxilyzer

[Figure: an unmodified program, plus an end-to-end quality metric (domain-specific), plus an optional quality threshold, are fed to APPROXILYZER, which reasons about the final end-to-end output.]

Approxilyzer: a tool to determine output quality with minimal programmer burden; general-purpose, automatic, and comprehensive.

Slide4

Contributions (1 of 2): Approxilyzer

[Figure as before: unmodified program + end-to-end quality metric (domain-specific) + optional quality threshold → APPROXILYZER → final end-to-end output.]

Approxilyzer: a tool to determine output quality with minimal programmer burden; general-purpose, automatic, and comprehensive.

Error model: single-bit errors in the operand registers of dynamic instructions. An error site is one bit of one operand register of one dynamic instruction.

Approxilyzer determines output quality with high accuracy (95%) and confidence (99%).
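A minimal sketch of this error-site enumeration, with assumed data structures (the trace format and the 64-bit register width are illustrative, not Approxilyzer's interface):

    from dataclasses import dataclass

    REG_BITS = 64  # assumed architectural register width

    @dataclass(frozen=True)
    class ErrorSite:
        dyn_instr_id: int   # position in the dynamic instruction stream
        register: str       # operand register name
        bit: int            # which bit of that register is flipped

    def enumerate_error_sites(dynamic_trace):
        """Yield every single-bit error site for a trace given as
        (dyn_instr_id, [operand register names]) pairs."""
        for dyn_id, operand_regs in dynamic_trace:
            for reg in operand_regs:
                for bit in range(REG_BITS):
                    yield ErrorSite(dyn_id, reg, bit)

    # Example: 3 operand registers x 64 bits = 192 error sites for 2 instructions.
    trace = [(0, ["%o0", "%o1"]), (1, ["%f2"])]
    print(sum(1 for _ in enumerate_error_sites(trace)))  # 192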

Slide5

Contributions (1 of 2): Approxilyzer

[Figure: unmodified program + end-to-end quality metric (domain-specific) + optional quality threshold → APPROXILYZER → comprehensive output quality profile.]

Approxilyzer: a tool to determine output quality with minimal programmer burden; general-purpose, automatic, and comprehensive. It produces a comprehensive output quality profile and determines output quality with high accuracy (95%) and confidence (99%).

Slide6

Contributions (2 of 2): Applications of Approxilyzer

Ultra-low cost resiliency with approximate output:
- Tune output quality vs. resiliency coverage vs. cost.
- Significant overhead savings (up to 55%) for small quality loss (1%).

First-order approximation potential of applications:
- Identify potentially approximable instructions and reduce the exploration space.
- 40% of static instructions have approximable 32b register chunks.

[Figure: unmodified program + end-to-end quality metric + optional quality threshold → APPROXILYZER → comprehensive output quality profile → Optimizer, used for system optimization (e.g., resiliency vs. cost vs. quality) and to identify potentially approximable instructions.]

Slide7

Outline
- Introduction
- Background: Relyzer
- Approxilyzer
- Application: ultra-low cost resiliency
- Application: first-order approximation potential
- Conclusion

Slide8

Challenge: Determine the Quality Impact of All Errors

[Figure: an error anywhere in the application can change its output, e.g., an 11% quality degradation.]

Challenge: determine the quality impact of virtually all errors in reasonable time.
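A back-of-envelope sketch of why exhaustive injection is infeasible; the counts and per-run time below are assumed, illustrative numbers, not figures from the talk:

    dynamic_instrs   = 1e9      # assumed dynamic instruction count
    operands_per_ins = 2        # assumed average operand registers per instruction
    bits_per_reg     = 64
    secs_per_run     = 1.0      # assumed time for one injection run

    error_sites = dynamic_instrs * operands_per_ins * bits_per_reg
    years = error_sites * secs_per_run / (3600 * 24 * 365)
    print(f"{error_sites:.1e} error sites -> ~{years:,.0f} years of serial injection runs")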

Slide9

Background: Error Outcomes for Single-Bit Errors

[Figure: an error injected into an otherwise error-free execution ends in one of three outcomes: Masked (the output is unchanged), Detection (the error is detected), or Silent Data Corruption (SDC: the output changes with no detection).]
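A minimal sketch of this taxonomy, assuming a hypothetical run_with_error() callable that re-runs the program with one bit flipped, and a known golden (error-free) output:

    def classify_outcome(run_with_error, golden_output):
        """Classify one injection run as Detected, Masked, or SDC."""
        try:
            faulty_output = run_with_error()
        except Exception:                # crash/trap/assertion: the error was detected
            return "Detected"
        if faulty_output == golden_output:
            return "Masked"              # the error had no effect on the final output
        return "SDC"                     # silently corrupted output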

Slide10

Background: Relyzer [ASPLOS '12]

Insight: errors flowing through similar control + data paths produce similar outcomes.

Relyzer groups the application's error sites into equivalence classes (using data- and control-flow heuristics) and picks one pilot per class. An error is injected only at each pilot, and the pilot's outcome is taken as the outcome of all errors in its class, so a few injections predict the outcome of virtually all errors.
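A minimal sketch of that pilot scheme, assuming a hypothetical signature(site) function standing in for the data/control-flow heuristics and an inject(site) function that performs one injection run:

    from collections import defaultdict

    def predict_all_outcomes(error_sites, signature, inject):
        """Group error sites into equivalence classes by `signature`, inject an
        error only at one pilot per class, and use the pilot's outcome as the
        prediction for every site in that class."""
        classes = defaultdict(list)
        for site in error_sites:
            classes[signature(site)].append(site)

        predictions = {}
        for members in classes.values():
            pilot_outcome = inject(members[0])     # one injection per class
            for site in members:
                predictions[site] = pilot_outcome  # pilot stands for the whole class
        return predictions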

Slide11

Outline
- Introduction
- Background: Relyzer
- Approxilyzer
- Application: ultra-low cost resiliency
- Application: first-order approximation potential
- Conclusion

Slide12

Approxilyzer: Predict SDC Quality

Not all SDCs (output corruptions) are equally bad; quality matters. Blackscholes example: precise output = $100, SDC output = $107, quality loss (relative error) = 7%.

Question: if the pilot is an SDC with quality Q, are all errors in its class SDCs with quality Q?

[Figure: equivalence classes and pilots over the application; each SDC's output quality is its deviation from the error-free output, e.g., 11%, 3%, or 67%.]

Result: a comprehensive end-to-end output quality profile with few error injections.
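A small sketch of the relative-error quality metric behind the Blackscholes example (the actual metric is domain-specific and supplied by the user):

    def quality_loss(golden, faulty):
        """Percent deviation of the faulty output from the error-free output."""
        return abs(faulty - golden) / abs(golden) * 100.0

    print(f"{quality_loss(100.0, 107.0):.0f}%")   # 7%  (the Blackscholes example)
    print(f"{quality_loss(100.0, 111.0):.0f}%")   # 11%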

Slide13

Approxilyzer: Quality-Aware Error Classification

At what granularity should quality be captured? Representing the full range of continuous values is hard, and unnecessary. Output corruptions are classified as:
- SDC-Good: highly tolerable; the error lands in a non-significant part of the output, or the quality loss is negligible (< 0.0001%, etc.).
- SDC-Maybe: potentially tolerable; is the quality within the threshold? Quality is recorded in fine-grained discretized bins.
- SDC-Bad: not tolerable; quality loss > 100%.
- Detectable Data Corruptions (DDC): NaN, infinity, etc.; not approximable.
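A minimal sketch of this classification; the 0.0001% and 100% cut-offs come from the slide, while the bin edges are an assumed discretization:

    import math

    BIN_EDGES = [0.01, 0.1, 1, 2, 4, 6, 8, 10, 20, 40, 60, 80, 100]  # assumed bins (%)

    def classify_sdc(golden, faulty):
        if math.isnan(faulty) or math.isinf(faulty):
            return "DDC"                           # detectable by inspecting the output
        loss = abs(faulty - golden) / abs(golden) * 100.0
        if loss < 0.0001:
            return "SDC-Good"                      # highly tolerable
        if loss > 100.0:
            return "SDC-Bad"                       # not tolerable
        b = next(i for i, edge in enumerate(BIN_EDGES) if loss <= edge)
        return f"SDC-Maybe (quality bin {b}: <= {BIN_EDGES[b]}%)"

    print(classify_sdc(100.0, 107.0))              # SDC-Maybe (quality bin 6: <= 8%)
    print(classify_sdc(100.0, float("inf")))       # DDC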

Slide14

Approxilyzer: Validation
- Does a pilot that is an SDC with quality Q imply that all errors in its class (the population) are SDCs with quality Q?
- Does the pilot's error category match the population's error category?
- For SDC-Maybe, does the pilot's quality bin match the population's quality bin?
- ~2.6 million error injection experiments; 99% confidence interval.
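A sketch of this matching check under assumed inputs; the (category, quality bin) pairs below are hypothetical, not data from the study:

    def prediction_accuracy(samples, max_bin_delta=0):
        """samples: list of (pilot, population) pairs, each a (category, quality_bin)
        tuple; quality_bin is None for categories other than SDC-Maybe."""
        correct = 0
        for (p_cat, p_bin), (o_cat, o_bin) in samples:
            if p_cat != o_cat:
                continue                                  # category mismatch
            if p_cat == "SDC-Maybe" and abs(p_bin - o_bin) > max_bin_delta:
                continue                                  # quality bin mismatch
            correct += 1
        return 100.0 * correct / len(samples)

    samples = [(("SDC-Maybe", 6), ("SDC-Maybe", 6)),
               (("SDC-Maybe", 6), ("SDC-Maybe", 7)),
               (("Masked", None), ("Masked", None))]
    print(prediction_accuracy(samples))                   # ~66.7 with exact bin match
    print(prediction_accuracy(samples, max_bin_delta=1))  # 100.0 allowing delta = 1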

Slide15

Approxilyzer: Validation. On average, 88% prediction accuracy with an exact quality-bin match.

Slide16

Approxilyzer: Validation. Allowing Δ = 1 flexibility in quality-bin matching (ΔQuality_Bin = 1).

Slide17

Approxilyzer: Validation. High prediction accuracy (96%); quality is determined at very fine granularity (within Δ = 2 quality bins).

Slide18

Outline
- Introduction
- Background: Relyzer
- Approxilyzer
- Application: ultra-low cost resiliency
- Application: first-order approximation potential
- Conclusion

Slide19

Approxilyzer Application to Resiliency
- Tune quality vs. resiliency vs. overhead to enable ultra-low cost resiliency solutions.
- Resiliency scheme: instruction duplication.
- Selectively protect instructions whose errors degrade end-to-end output quality beyond what the user/application accepts and that cannot be protected by low-cost detectors (a selection sketch follows this list).

[Figure: SDC-Good, SDC-Maybe, SDC-Bad, and Detectable Data Corruptions (DDC), partitioned by the user-acceptable quality threshold.]
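A minimal sketch of threshold-guided selective duplication, assuming a hypothetical per-instruction profile of worst-case quality loss and protection cost (not Approxilyzer's actual interface):

    def choose_instructions_to_duplicate(profile, quality_threshold_pct):
        """profile: dict instr -> (worst_case_quality_loss_pct, duplication_cost).
        Duplicate only instructions whose errors can exceed the user's threshold."""
        protect, total_cost = [], 0.0
        for instr, (worst_loss, cost) in profile.items():
            if worst_loss > quality_threshold_pct:   # an unacceptable SDC is possible
                protect.append(instr)
                total_cost += cost
        return protect, total_cost

    profile = {"mul %o3,%o1,%o2": (250.0, 1.0),      # can cause SDC-Bad -> protect
               "add %o5,%o3,%o4": (0.8,   1.0),      # tiny loss -> leave unprotected
               "ld  [%o7],%o1":   (12.0,  1.0)}
    print(choose_instructions_to_duplicate(profile, quality_threshold_pct=1.0))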

Slide20

Approxilyzer: Ultra-Low Cost Resiliency (Water). Significant resiliency overhead savings for a small loss of quality.

Slide21

Approxilyzer: First-Order Approximation Potential
- Identify potentially approximable instructions in the program.
- Eliminate instructions that produce unacceptable-quality outputs under single-bit errors; they are unlikely to withstand more rigorous perturbations (a simplified filter sketch follows this list).
- The remaining instructions are first-order candidates for approximation, reducing the exploration space for more targeted analysis.
- Best- and worst-case bounds (extremes of the threshold) bracket the approximation potential.

[Figure: error outcomes (Masked, Detected, SDC-Good, SDC-Maybe, SDC-Bad, DDC) partitioned by the user-acceptable quality threshold.]
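A simplified sketch of that candidate filter (my reading of the slide, not the exact algorithm): a static instruction survives only if none of its single-bit errors produces an output of unacceptable quality.

    UNACCEPTABLE = {"SDC-Bad", "DDC"}

    def first_order_candidates(outcomes_by_instr, threshold_pct):
        """outcomes_by_instr: dict static_instr -> list of (category, quality_loss_pct)
        over that instruction's error sites; quality_loss_pct is None for non-SDCs."""
        candidates = []
        for instr, outcomes in outcomes_by_instr.items():
            acceptable = all(cat not in UNACCEPTABLE and
                             (loss is None or loss <= threshold_pct)
                             for cat, loss in outcomes)
            if acceptable:
                candidates.append(instr)      # a first-order approximation candidate
        return candidates

    outcomes = {"fmul %f2,%f0,%f1": [("Masked", None), ("SDC-Maybe", 0.4)],
                "st   %f2,[%o7]":   [("SDC-Bad", 400.0), ("Masked", None)]}
    print(first_order_candidates(outcomes, threshold_pct=1.0))   # ['fmul %f2,%f0,%f1']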

Slide22

Exploring Best-Case Approximation Potential

[Chart: exploration of the application's approximation potential along different dimensions; up to 40% of static instructions have at least one approximable 32b register chunk (values shown range from 15% to 40%).]

Slide23

Conclusions
- Approxilyzer: minimal programmer burden, general-purpose, automatic, comprehensive.
- Determines output quality at fine granularities with high accuracy (95%).
- Two applications:
  - Ultra-low cost resiliency (up to 55% savings) for small quality loss (1%).
  - First-order approximation potential of applications.
- Future directions: other error models, input independence, data vs. instruction approximation.

[Figure: unmodified program + end-to-end quality metric + optional quality threshold → APPROXILYZER → comprehensive output quality profile → Optimizer, used for system optimization (e.g., resiliency vs. cost vs. quality) and to identify potentially approximable instructions.]

Slide24

BACKUP SLIDES

Slide25

Quality Metrics and Thresholds

Slide26

Error Outcome Classification

Slide27

Error Outcome Classification

Slide28

Slide29