
HDI San Diego Chapter - PowerPoint Presentation

Uploaded by marina-yarberry on 2016-07-28




Presentation Transcript

Slide 1

HDI San Diego Chapter
May 2012
Desktop Support Metrics: A Case Study
Mike Russell, V.P. Communications, SDHDI

Slide 2

About the Author

- Over 26 years in I.T. in San Diego
- Experienced in Programming, Operations, Infrastructure, and Support
- 12 years in Service Delivery Management
- Specialized in Desktop Support Management and Device Deployment
- Member of the San Diego HDI chapter for 5 years
- Member of HDI Desktop Support Forums
- Member of HDI Desktop Support Advisory Board

Slide 3

The Challenges of Developing Desktop Support Metrics

- Analysis and development of Desktop Support standards are not as mature as Service Desk standards
- Different responsibilities require different processes
- Different processes require different metrics, which are difficult to capture
- History of using subjective rather than objective measures
- Desktop Support staff have multiple inputs for service
- Mobile, dynamic, and dispersed workforce
- Closer relationship with the customer base
- Receive escalations and triage requests from different departments in I.T.

Slide 4

Common Perceptions of Desktop Support

Perceptions of Desktop Support vary widely:
- Customers see the Desktop Support team as an extension or replacement of the Service Desk
- I.T. partners feel the team can be engaged at will for immediate assistance, and may treat it as an inexhaustible resource
- Project managers treat the team as a replacement for project labor, and likewise as an inexhaustible resource
- Executive management does not fully understand the scope of work performed by Desktop Support
- Desktop Support analysts can feel misunderstood, underappreciated, and overutilized

Slide 5

The Problem

I needed a way to accurately measure, analyze, and market the services delivered by Desktop Support:
- Demonstrate staff productivity
- Measure staff effectiveness
- Measure performance quality
- Track customer satisfaction
- Illustrate the effects of bad changes
- Identify opportunities for service improvement
- Demonstrate improved performance
- Improve staff satisfaction
- Market the value of Desktop Support to I.T. and executive management

Slide 6

"Must Have" Measurements

- Staff effectiveness
- Staff productivity
- The ability to track tickets by: incidents, service requests, problems, and changes
- Quality
- Customer satisfaction

Slide 7

Staff Effectiveness

The effective use of time by staff.
- Actual time staff has available to work on issues
  - Does not include meetings, breaks, non-ticketed project time, sick days, PTO, etc.
  - May require process changes in time and attendance
- Actual time spent on individual tickets
  - Does not mean time from received to resolved; the work may come in several chunks of time
  - Will require manual "best judgment" by staff
  - May require modification to your ticketing system

Actual Ticket Time / Available Time = Effectiveness

Slide 8

Staff Effectiveness

Example:
- Bobby works a standard 40-hour week (37.5 hours after breaks)
- Bobby attends 4 hours of meetings (37.5 - 4 = 33.5)
- Bobby is sick one day (33.5 - 8 = 25.5)
- Bobby documents 22.3 hours spent on ticket resolution

Bobby's effectiveness is 22.3 / 25.5, roughly 87%

Slide 9
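The effectiveness calculation from Bobby's example can be sketched in a few lines of Python. The function name and parameters are illustrative, not from the presentation; note that 37.5 - 4 - 8 leaves 25.5 available hours, so the rate works out to roughly 87%.

```python
def effectiveness(scheduled_hours, meeting_hours, absence_hours, ticket_hours):
    """Effectiveness = actual ticket time / actual available time.

    Available time excludes meetings, absences, breaks, and other
    non-ticketed time, per the definition on the previous slide.
    """
    available = scheduled_hours - meeting_hours - absence_hours
    if available <= 0:
        raise ValueError("no available working time")
    return ticket_hours / available

# Bobby's week: 37.5 scheduled hours (breaks removed), 4 hours of
# meetings, one 8-hour sick day, 22.3 documented ticket hours.
rate = effectiveness(37.5, 4, 8, 22.3)
print(f"{rate:.0%}")  # 87%
```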

Staff Effectiveness

- Expect to see initial rates between 30% and 200%
- Low numbers indicate that staff may not be estimating times correctly or may not be reporting all issues
- High numbers may indicate duplicate tickets or a lack of understanding of what is being tracked
- This can take 2-3 months to "settle in" with staff
- Review results monthly with staff to find the cause of out-of-range effectiveness
- Do not assume staff are purposely misleading with their stats
- Tip: do NOT show total time spent on tickets to staff at first (or possibly at all)

Industry standard: 80% effectiveness rating (48 minutes per hour)

Slide 10
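The monthly out-of-range review described above can be sketched as a simple filter over per-analyst rates. The function name, thresholds as defaults, and sample figures are assumptions for illustration; the 30%-200% band comes from the slide.

```python
def flag_effectiveness(rates, low=0.30, high=2.00):
    """Return analysts whose monthly effectiveness falls outside the
    expected initial 30%-200% range, for follow-up review.

    Low rates may mean under-estimated or under-reported time; high
    rates may mean duplicate tickets or misunderstood tracking.
    """
    return {name: rate for name, rate in rates.items()
            if not low <= rate <= high}

# Hypothetical team figures for one month.
team = {"Bobby": 0.84, "Ana": 0.25, "Lee": 2.3}
print(flag_effectiveness(team))  # {'Ana': 0.25, 'Lee': 2.3}
```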

Staff Productivity

- You should already be tracking the number of tickets closed per day or per month
- Decide which metrics relate to productivity and customer satisfaction (e.g., initial response time, days to resolve)
- In your ticketing system, automatically classify tickets as incidents or requests
- Have the analyst resolving the ticket verify the CLOSING classification (do NOT default to the opening classification!)
- The closing analyst should document whether the ticket is related to a change or problem, and if so, which one
- Try using the time captured for the effectiveness metric to calculate tickets closed per working hour

Slide 11
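The last point above, reusing effectiveness time to derive a productivity rate, can be sketched as follows; the function name and sample numbers are assumptions.

```python
def tickets_per_hour(closed_tickets, ticket_hours):
    """Tickets closed per documented working hour, reusing the ticket
    time already captured for the effectiveness metric."""
    if ticket_hours <= 0:
        raise ValueError("no documented ticket time")
    return closed_tickets / ticket_hours

# Hypothetical week: 31 tickets closed across 22.3 documented hours.
print(round(tickets_per_hour(31, 22.3), 2))  # 1.39
```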

Quality

Use a monthly quality survey to track SLA adherence and other factors critical to delivering superior support:
- Customer contact time (SLA < 2 business hours)
- Resolution time (SLA < 8 business hours)
- Work log information complete and correct
  - Document all customer contacts, including names
  - Should be clear enough that the CUSTOMER can understand it
- Appointments made and kept: if appointments are made, are they kept?
- Asset information correct
- Closing CTI appropriate for the issue

Slide 12

Quality

- Contains objective and subjective measurements
- Measurement standards should be clear and documented
- Scoring should not be performed by one individual
- Sampling size needs to remain consistent
- Because some subjective judgments must be made, staff members must be able to review and challenge the results
- As a manager, you have the right and the responsibility to adjust results in order to remain fair

Slide 13
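One way to turn the quality checklist above into a monthly score is a pass rate over a consistent ticket sample. The criterion keys and sample data below are hypothetical; the criteria themselves follow the earlier Quality slide.

```python
# Hypothetical criterion keys mirroring the quality survey items.
CRITERIA = ("contact_sla_met", "resolution_sla_met", "worklog_complete",
            "appointments_kept", "asset_info_correct", "closing_cti_ok")

def quality_score(sampled_tickets):
    """Fraction of criterion checks passed across the monthly sample."""
    checks = [ticket[c] for ticket in sampled_tickets for c in CRITERIA]
    return sum(checks) / len(checks)

# Two sampled tickets: one perfect, one with an incomplete work log.
sample = [
    {c: True for c in CRITERIA},
    {**{c: True for c in CRITERIA}, "worklog_complete": False},
]
print(f"{quality_score(sample):.0%}")  # 92%
```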

Customer Satisfaction

- Send an automated survey to customers for each ticket
- Expect a 5%-15% rate of return
- Very low or very high returns are a red flag, especially on an individual basis
- Design reports so that customer satisfaction can be trended against other metrics (ticket volumes, time to respond, problems, projects, etc.)

Customer satisfaction transcends all levels of management, and it can be the most important factor in the perception, success, and survival of the Desktop Support team.

Slide 14
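The red-flag check on survey return rates can be sketched like this; the function name and sample counts are assumptions, while the 5%-15% band is from the slide.

```python
def survey_return_flag(surveys_sent, surveys_returned, low=0.05, high=0.15):
    """Return (rate, red_flag): the survey return rate and whether it
    falls outside the expected 5%-15% band."""
    rate = surveys_returned / surveys_sent
    return rate, not (low <= rate <= high)

# Hypothetical month: 400 surveys sent, 36 returned.
rate, flagged = survey_return_flag(400, 36)
print(f"{rate:.0%}, flagged={flagged}")  # 9%, flagged=False
```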

Quick Review

Perceptions: unlimited resources; unknown scope of services; always immediately available (not doing anything else); misunderstood, underappreciated, overutilized.

Metrics driving the solution:
- Staff productivity
- Staff effectiveness
- Quality
- Customer satisfaction

Putting this all together, what does it look like?

Slide 15

The Solution: The Balanced Team Scorecard

- Productivity metrics for incidents, service requests, problems, and changes (SLAs/OLAs)
- Average team effectiveness
- Average quality scores
- Average customer satisfaction
- Trending report covering the last 12 months of key performance indicators and SLAs
- Subtext describing significant factors behind changes
- Distributed monthly to I.T. management and customers

Slide 16
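Assembling the scorecard's headline figures for one month might look like the sketch below. The function, field names, and sample figures are illustrative assumptions, not the presenter's actual report format.

```python
from statistics import mean

def team_scorecard(month, tickets_by_type, effectiveness_rates,
                   quality_scores, csat_scores):
    """Assemble one month's headline scorecard figures: ticket counts
    by type plus team averages for the three per-analyst metrics."""
    return {
        "month": month,
        "tickets": tickets_by_type,  # incidents, requests, problems, changes
        "avg_effectiveness": mean(effectiveness_rates),
        "avg_quality": mean(quality_scores),
        "avg_csat": mean(csat_scores),
    }

# Hypothetical month with a two-analyst team.
card = team_scorecard(
    "2012-05",
    {"incident": 120, "request": 80, "problem": 4, "change": 9},
    effectiveness_rates=[0.84, 0.78],
    quality_scores=[0.92, 0.95],
    csat_scores=[4.6, 4.4],
)
print(round(card["avg_effectiveness"], 2))  # 0.81
```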

What does the end result look like?

Slide 17

The Results

It works!
- Received praise from executive levels on the metrics reported
- Adopted as the standard for metrics reporting across the I.T. operations teams
- Received praise from the staff, who felt recognized and valued in the organization
- Captures data that can be used for further research (e.g., cost of bad changes, most costly services)
- Recognized as a best practice by HDI; presented at the 2012 National Conference in Orlando

Slide 18

Thank you! Questions?

Mike Russell, I.T. Service Delivery Management
mrussel2@cox.net