Slide1

AUTOMATING WEB APPLICATION PERFORMANCE TESTING – CHALLENGES AND APPROACHES

Bishnu Priya Nanda, Tata Consultancy Services Ltd.
Seji Thomas, Tata Consultancy Services Ltd.
Mohan Jayaramappa, Tata Consultancy Services Ltd.

Copyright © 2016 Tata Consultancy Services Limited

Slide2

Contents

Introduction to Automated Performance Testing
Approach to automate Performance Testing
Executing Automated Performance Testing using Jenkins
Challenges
Benefits
Limitations
Conclusion


Slide3

Introduction to Automated Performance Testing

Reduces effort in test environment setup and in collecting and analyzing metrics.
Suits the adoption of agile methodology in the software development cycle.
Provides continuous feedback on software performance, helping to optimize it.
Ensures no manual step of performance analysis is missed.
Helps find performance differences between application versions.


Slide4

Approach to automate Performance Testing

Three approaches are currently identified to conclude on application performance:
Screen/page response times captured by JMeter load testing.
CPU and memory utilization of the servers involved in performance testing.
Garbage Collection (GC) log analysis of the JVMs.


Slide5

Automating Application Transaction Response Times

Automating load testing to obtain the response time report, then analyzing and concluding Pass/Fail, follows the approach below:
Prepare the JMeter script as per the workload (manual).
Prepare a response-time limit file containing the SLAs to compare with the load testing results (manual).
Automatically trigger load testing using the JMeter command-line option.
Automatically collect the JMeter load testing results.
Automatically execute a Python script that aggregates the collected JMeter report, compares it with the limit file against the SLAs set for each transaction, and creates an output file marking the response time as Pass/Fail.
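The aggregation and comparison step can be sketched as follows. This is an illustrative assumption, not the authors' actual script: function names, the nearest-rank percentile method, and the data shapes are all hypothetical.

```python
import math

def percentile(values, pct):
    """Nearest-rank percentile of the values (e.g. pct=90 for the 90th)."""
    ordered = sorted(values)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def judge(samples, limits):
    """samples: {transaction: [elapsed_ms, ...]} aggregated from the JMeter report.
    limits:  {transaction: sla_ms} read from the limit file.
    Returns {transaction: (p90_ms, 'Pass' | 'Fail')}."""
    result = {}
    for txn, times in samples.items():
        p90 = percentile(times, 90)
        result[txn] = (p90, 'Pass' if p90 <= limits[txn] else 'Fail')
    return result
```

The overall response-time verdict would then be Pass only if every transaction in the returned dictionary passes, matching the rule on the next slide.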


Slide6

Automating Application Transaction Response Times Output

The output file from the response time automation process looks like the table below.
Note: The requests for each transaction are sorted in descending order of 90th-percentile response time, and that request is compared with the SLA given in the limit file for the transaction to conclude Pass/Fail. The overall response time result is considered Pass only if all transactions pass.

Transaction Name | Request Name  | SLA (ms) | Concurrent Users | Samples | Min | Avg  | 90th Percentile | Max  | Pass/Fail
Abc              | Abc:/request1 | 2000     | 10               | 150     | 980 | 1700 | 1900            | 2100 | Pass
Xyz              | Xyz:/request5 | 2500     | 10               | 125     | 850 | 2400 | 2600            | 2900 | Fail


Slide7

Automating Server CPU and Memory Utilization

Automating the server utilization report, then analyzing and concluding Pass/Fail, follows the approach below:
Create a server utilization limit file for each server involved in performance testing (manual).
Automatically trigger the Linux vmstat command to collect the CPU and memory utilization of the servers.
Automatically execute a Python script that compares the average CPU and memory utilization against the SLAs mentioned in the limit file for each server, creating an output file marking the result as Pass/Fail.
An additional statistical parameter, the Coefficient of Variation (C.V.), is considered; it captures how much the utilization varies around its average, as a percentage.
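The comparison step above can be sketched like this. It is an illustrative assumption rather than the authors' script: the function name and limit parameters are hypothetical, but the C.V. follows the deck's own definition (standard deviation divided by average).

```python
import statistics

def utilization_verdict(samples_pct, avg_limit_pct, cv_limit_pct):
    """samples_pct: utilization readings (%) sampled by vmstat during the test.
    Passes only if both the average and the C.V. stay within their limits."""
    avg = statistics.mean(samples_pct)
    cv = statistics.pstdev(samples_pct) / avg * 100  # C.V. = StdDev / Average
    verdict = 'Pass' if avg <= avg_limit_pct and cv <= cv_limit_pct else 'Fail'
    return round(avg, 2), round(cv, 2), verdict
```

A server's overall utilization verdict would be Fail if either the CPU series or the RAM series fails this check, matching the rule on the next slide.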


Slide8

Automating Server CPU and Memory Utilization Output

The output file from the server utilization automation process looks like the table below.
Note: The average server utilization (CPU and RAM) and the C.V. are compared with the SLAs given in the limit file to mark the result as Pass or Fail. If either the CPU or the RAM result fails, the overall utilization result for the server is considered failed.

Measure | Limit (in %) | Min   | Avg   | Max   | StdDev | C.V.  | Pass/Fail
CPU     | 70           | 26    | 35.74 | 96    | 20.64  | 57.75 | Fail
RAM     | 70           | 97.12 | 97.54 | 97.61 | 0.1    | 0.1   | Fail


Slide9

Automating Garbage Collection Log analysis

Automating Garbage Collection (GC) log capture, then analyzing and concluding Pass/Fail, follows the approach below:
Create a GC limit file for each application server involved in performance testing (manual).
Automatically collect the GC log from the server just after the load test and parse the raw GC log into a proper format for better analysis, using GCViewer, an open-source tool.
Automatically execute a Python script that compares the parsed log file from GCViewer with the SLAs mentioned in the GC limit file, creating an output file marking each GC metric as Pass or Fail.
To detect leaks and potential issues, one additional parameter is considered in the GC statistics: MaxSlopeAfterFullGC, calculated as the maximum memory slope between two full GCs.
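The MaxSlopeAfterFullGC calculation might look like the sketch below; this is an assumed reading of the metric's description (used-memory readings taken right after each Full GC, steepest rise between consecutive Full GCs), not the authors' code. A persistently positive slope suggests memory is not being reclaimed, i.e. a potential leak.

```python
def max_slope_after_full_gc(points):
    """points: [(t_seconds, used_bytes_after_full_gc), ...], in time order,
    one entry per Full GC. Returns the maximum rise in post-Full-GC used
    memory between consecutive Full GCs, in bytes/sec."""
    slopes = [(u2 - u1) / (t2 - t1)
              for (t1, u1), (t2, u2) in zip(points, points[1:])]
    return max(slopes, default=0.0)
```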


Slide10

Automating Garbage Collection Log analysis Output

The output file from the GC log analysis automation looks like the table below.

Measure               | Limit | Actual | Result | Overall Result | Remarks
MinTotalTime          | 3600  | 57506  | Pass   | Fail           | Seconds
MinFullGCCount        | 0     | 0      | Pass   | Fail           | Min Full GCs that must be present in the entire test
MaxFullGCCountPerHour | 10    | 12     | Fail   | Fail           | Max number of Full GCs per hour
MinThroughput         | 95    | 94.9   | Fail   | Fail           | In %age
AvgMemAfterFullGC     | 5     | 2      | Pass   | Fail           | % of total memory. If it is below this limit, the overall result is always made Pass
MaxSlopeAfterFullGC   | 30000 | 0      | Pass   | Fail           | Bytes/sec increase in used memory after Full GC (i.e. slope of all min used memory after Full GCs)


Slide11

Deployment of Performance Testing Automation in Jenkins


Slide12

Performance Settings in Jenkins


Slide13

Executing Performance Testing Automation in Jenkins

A new performance testing job is created by the Jenkins admin user.
The PT servers are linked to the Jenkins server by the Jenkins admin.


Slide14

Executing Performance Testing Automation in Jenkins contd…

Configure the PT job for execution.
First, reboot and restart the PT servers.


Slide15

Executing Performance Testing Automation in Jenkins contd..

Run the vmstat command on each of the servers.


Slide16

Executing Performance Testing Automation in Jenkins contd..

Trigger load testing through JMeter (command-line option) and transfer the load testing results to the Jenkins server.
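Triggering JMeter in non-GUI mode from a script might look like this sketch. The `-n` (non-GUI), `-t` (test plan), `-l` (results file), and `-Jname=value` (property) options are standard JMeter command-line flags; the file names and the `subprocess` wrapper are illustrative assumptions.

```python
import subprocess

def jmeter_command(test_plan, results_file, props=None):
    """Build the JMeter non-GUI command line for a load test run."""
    cmd = ['jmeter', '-n', '-t', test_plan, '-l', results_file]
    for name, value in (props or {}).items():
        cmd.append(f'-J{name}={value}')  # -Jname=value sets a JMeter property
    return cmd

# Example (would actually run JMeter if it is on the PATH):
# subprocess.run(jmeter_command('workload.jmx', 'results.jtl',
#                               {'threads': '10'}), check=True)
```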


Slide17

Executing Performance Testing Automation in Jenkins contd..

Transfer the vmstat output and GC output from all servers involved in PT (e.g. app/DB/web) to the Jenkins server after the load test.
Parse the JMeter results, vmstat output, and GC output, comparing them with the respective SLAs set in the limit files to get the final output showing Pass/Fail.
The scripts configured in Jenkins for this step are shown on the slide.


Slide18

Executing Performance Testing Automation in Jenkins contd..

A final Python script is executed, which reads the Jenkins log file containing all the outputs and creates a final output file consolidating all the performance testing results. If all results pass, the performance testing result is declared Pass; otherwise Fail. The final performance result is displayed in the application's dashboard in Jenkins. The script configured in Jenkins for this step and the final output in HTML format are shown on the slide.
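The consolidation step can be sketched as follows (the function name, check names, and HTML layout are illustrative assumptions, not the authors' script): gather each check's verdict, declare the overall result Pass only when every check passed, and render a small HTML summary for the Jenkins dashboard.

```python
def consolidate(verdicts):
    """verdicts: {'Response time': 'Pass', 'Server utilization': 'Fail', ...}.
    Returns (overall, html); overall is 'Pass' only if every check passed."""
    overall = 'Pass' if all(v == 'Pass' for v in verdicts.values()) else 'Fail'
    rows = ''.join(f'<tr><td>{name}</td><td>{v}</td></tr>'
                   for name, v in verdicts.items())
    html = (f'<table><tr><th>Check</th><th>Result</th></tr>{rows}'
            f'<tr><td>Overall</td><td>{overall}</td></tr></table>')
    return overall, html
```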



Slide20

Challenges

Zero percentage error has to be considered.
Need for password-less authentication between servers to transfer files.
Difficulty in finding memory leaks, as a visual GC graph is not possible; the new metric 'MaxSlopeAfterFullGC' was identified for this.
Graphical analysis of the server utilization results is not possible for making a judgment, so the statistical measure 'coefficient of variation' (C.V.) was identified as a good indicator of the overall utilization. C.V. = Standard Deviation / Average.


Slide21

Benefits

Avoids time spent on manual analysis and PT environment setup.
Performance issues are analyzed and mitigated before the system derails at the last moment.
Automated performance testing guarantees that users get new features, not new performance issues.
For applications adopting an agile methodology, automated performance testing helps developers optimize performance issues arising from the addition of new features.


Slide22

Limitations

Need for password-less SSH authentication between servers.
The limit files must be created, and modified on workload changes, manually.
JMeter scripts must be created and edited manually.
In case of a build change, JMeter scripts must be validated manually.


Slide23

Conclusion

Automated performance testing increases application quality and productivity.
This process enables everyone on the development team to share test scenarios and test results.
It produces reports that everyone on the team can understand.
The goal of load testing at the speed of agile is to deliver as much value as possible to the users of an application through evolving features and functionality, while ensuring performance no matter how many users are on the app at any one time.


Slide24

Acknowledgement

The authors would like to acknowledge the encouragement, valuable guidance, and review comments provided by Mohan Jayaramappa, Senior Consultant, Tata Consultancy Services Ltd.
A note of thanks for helping with this experiment and concept to:
Debiprasad Swain, Principal Consultant, Tata Consultancy Services Ltd.
Pitabasa Sa, Senior Consultant, Tata Consultancy Services Ltd.
