Presentation Transcript

1 The Customer Viewpoint on Dependability Benchmarking
Neeraj Suri, Chalmers University
http://www.ce.chalmers.se/~suri

2 Application scope of D.Benchmarking systems...
“Dependability” is multi-faceted, with (perceived) qualitative attributes as well – so what specific aspect(s) of dependability can a D.Benchmark (realistically/potentially) cover & measure?
Systems: a large area covering control, telecomm., and transaction systems – we NEED to narrow down this definition.
What are we benchmarking: complete systems (HW, OS, (embedded)?, ...)? And over user-induced changes to system functionality? Application-specific or platform/service-specific? (A scoping sketch follows below.)
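One way to read the “narrow it down” demand: any concrete D.Benchmark has to state its scope explicitly. Purely as an illustrative sketch (all names and fields here are hypothetical, not from the talk), such a scope declaration might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class DBenchmarkSpec:
    """Hypothetical record of what a dependability benchmark covers.

    Each unanswered scoping question from the slides becomes an
    explicit field that a benchmark author must fill in.
    """
    target: str                                        # e.g. "RT-OS", "transaction server"
    layers: list[str] = field(default_factory=list)    # "HW", "OS", "middleware", "application"
    distributed: bool = False                          # single node vs. networked end-to-end system
    workloads: list[str] = field(default_factory=list)
    fault_models: list[str] = field(default_factory=list)
    app_specific: bool = True                          # application-specific vs. platform/service-specific

# Example: a narrow, platform-specific benchmark for an embedded RT-OS.
spec = DBenchmarkSpec(
    target="embedded RT-OS",
    layers=["OS", "HW-SW interface"],
    workloads=["periodic control tasks"],
    fault_models=["deadline misses", "bit flips in task state"],
    app_specific=False,
)
```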

3 – Does it cover single/distributed & networked systems and services + middleware + end-to-end system functionality?
– Can “all” workloads, application areas, and fault/stress scenarios driving the D.Benchmark be covered (even to a representative extent)?
– Is it a magic number(s)? Can it be user-trusted/reproduced? Is it a single benchmark or a suite for different systems/apps? (A sketch of such a single-number score, and its spread, follows below.)
– How meaningful is a system benchmark where functionality (and dependability aspects) may change based on user SW/apps?
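To make the “magic number” worry concrete, here is a minimal sketch (hypothetical metric and field names, not something defined in the talk) of collapsing repeated fault-injection campaigns into one score – and why the spread across runs matters for user trust and reproducibility:

```python
import statistics

def dependability_score(runs: list[dict]) -> tuple[float, float]:
    """Collapse fault-injection runs into a single 'magic number'.

    Each run records how many injected faults the system tolerated
    (masked or recovered from) out of the total injected. The mean is
    the headline score; the standard deviation shows how reproducible
    that headline actually is.
    """
    per_run = [r["tolerated"] / r["injected"] for r in runs]
    return statistics.mean(per_run), statistics.stdev(per_run)

# Three repetitions of the same campaign on the same system.
runs = [
    {"injected": 1000, "tolerated": 910},
    {"injected": 1000, "tolerated": 870},
    {"injected": 1000, "tolerated": 940},
]
score, spread = dependability_score(runs)
print(f"coverage = {score:.3f} +/- {spread:.3f}")  # one number, plus its trustworthiness
```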

4 Who is going to use D.Benchmark(s), and for what?
– Designer? For conformance to design specs/standards? [Do those even exist for current-day broad-usage “systems”?] For comparisons across a product class? [e.g., PDAs...] For utility?
– A product sells for its “delivery of services” to the user – if dependability is not the explicitly perceived service, then D.Benchmarks will remain supplementary “features” to the user.
– What does it “buy” the user (and designer)? Is it just a design metric/technique/tool, etc., that the designer is simply expected to conform to?

5 What does it cost to run benchmarks, and who runs them?
– Service- or $-critical aspects of a product need to match the cost/ease of running a benchmark (the “bang for the buck” principle); [...] are sometimes more valuable than a [...].
– If it’s an embedded environment, then why should the user care, if the metric is too entwined within the system ops for the user to measure, comprehend, or control?

6 D.Benchmarking & Fault Loads, etc.
The definition of “faults” or “representative stress” needs to be carefully worked out:
– At what level should we be considering faults? OS, middleware, HW, HW-SW, application?
– The definition of “faults” may be very system/application dependent. E.g., in an RT-OS, lack of (timely) delivery of a service equates to a fault (a sketch of this timing-fault view follows below). So for benchmarking, we need to be very careful in defining what counts as “representative stress”!
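The RT-OS point can be made concrete. A minimal sketch (hypothetical API and deadline, assuming a fixed per-request timing budget) of a benchmark classifier that treats a correct but late response as a fault:

```python
import time

DEADLINE_S = 0.010  # hypothetical 10 ms service deadline in an RT setting

def classify(service, request, expected):
    """Classify one service invocation for a dependability benchmark.

    In a real-time OS, a correct result that arrives after its deadline
    still counts as a fault ('lack of timely delivery of a service').
    """
    start = time.monotonic()
    try:
        result = service(request)
    except Exception:
        return "value fault"      # crashed or raised: clearly faulty
    elapsed = time.monotonic() - start
    if result != expected:
        return "value fault"      # wrong answer, on time or not
    if elapsed > DEADLINE_S:
        return "timing fault"     # right answer, too late: still a fault here
    return "correct"

# Example: a toy 'service' that is correct but slow.
def slow_echo(x):
    time.sleep(0.02)  # exceeds the 10 ms deadline
    return x

print(classify(slow_echo, "ping", "ping"))  # -> "timing fault"
```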

7 Need to communicate “value” to the user, not as a technical achievement but as a “service” commodity.
– Needs to be simple & comprehensible – even a rough measure (with all the sophistication hidden) has better value for the user!
– Need to span a tangible domain (technical correctness of focused benchmarks for specific domains vs. real usage!).
– User + application-area sensitivity is “the” key dimension.