© 2024, Amazon Web Services, Inc. or its affiliates.
Performance Testing Transformation
Alexander Podelko, Sr. Performance Engineer
Amazon Web Services
Workshop on Load Testing & Benchmarking
LTB 2024
London | May 7, 2024
Alex Podelko
• Has specialized in performance since 1997
• Senior Performance Engineer at AWS – Amazon Aurora
  ◦ Previously worked for MongoDB, Oracle/Hyperion, Intel, and Aetna
• SPEC RG Steering Committee Member
Disclaimer: The views expressed here are my personal views only and do not necessarily represent those of my current or previous employers. All brands and trademarks mentioned are the property of their owners. All products are mentioned as examples only, not as recommendations.
Adjusting Performance Testing to Industry Trends:
Adding Early and Continuous Performance Testing
Industry Trends
• Web
  ◦ Centralization, open / unlimited workload
• Cloud
  ◦ Further centralization, price tag (FinOps)
  ◦ Dynamic configurations / self-management
• Agile / iterative development
  ◦ Continuous Integration / Delivery / Deployment
  ◦ DevOps / SRE
The Past, Present, and Future of Performance Engineering
Industry Trends
Centralization
=> Control over deployments
=> Ability to deploy small changes
=> Agile development
=> Fuzzier line between Dev and Ops (DevOps, SRE)
=> Need for continuous performance engineering
The Past, Present, and Future of Performance Engineering
Continuous Performance Testing
• Continuous performance testing
  ◦ To catch regressions early
• Collecting all the information needed to investigate regressions
  ◦ In a form convenient for further analysis
• A foundation on which to build further automation
  ◦ For further performance optimization
• All context-dependent
  ◦ Don't wait for an exact recipe; figure it out depending on your needs
Performance Testing: Traditional vs Continuous

Traditional:
• Before releases
• Realistic mix
• As close to production as possible
• Checking Service Level Objectives (SLOs)
• Using a load testing tool or harness
• The approach is relatively consistent and well described

Continuous:
• Often (maybe even on each build)
• Different tests
• To maximize coverage
• Checking the difference between builds
• Using an additional layer of automation on top of the load testing tool
• All context-dependent
Integrating Performance Engineering into DevOps
[Diagram: performance engineering (performance testing, capacity planning, tuning) spanning Development and Operations – shifting left toward developers and SDETs, and shifting right toward monitoring, capacity planning, SRE, FinOps, and efficiency work, with roles ranging from performance tester / engineer / architect to developer.]
Challenges of Continuous Performance Testing
• Integration
• Decomposition
• Coverage Optimization
• Variability / Noise Reduction
• Change Detection
• Advanced Analysis
• Operations / Maintenance
The Challenge of Integration
Continuous Integration: Load Testing Tools
• CI support in load testing tools
  ◦ Integration with CI servers (Jenkins, Hudson, etc.)
  ◦ Automation support
• CI tools' support for performance testing
  ◦ Jenkins Performance Plugin
• Performance testing frameworks
  ◦ Combining multiple tools (see the sketch below)
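
To make the integration concrete, below is a minimal sketch of a CI step that runs a load test headless and turns the results into a pass/fail exit code. It assumes JMeter on the PATH, a hypothetical test plan plan.jmx, and placeholder thresholds; `jmeter -n -t <plan> -l <results>` is JMeter's standard non-GUI invocation, and `elapsed` and `success` are columns of its default CSV results format.

```python
#!/usr/bin/env python3
"""Minimal sketch of gluing a load testing tool into CI.

Assumes JMeter on PATH and a hypothetical test plan 'plan.jmx';
the same pattern applies to any tool with a CLI mode.
"""
import csv
import statistics
import subprocess
import sys

PLAN, RESULTS = "plan.jmx", "results.jtl"

# Run the test headless, as a CI step would (-n = non-GUI mode).
subprocess.run(["jmeter", "-n", "-t", PLAN, "-l", RESULTS], check=True)

# Parse the CSV results file and compute simple pass/fail signals.
with open(RESULTS, newline="") as f:
    rows = list(csv.DictReader(f))

latencies = [int(r["elapsed"]) for r in rows]
errors = sum(1 for r in rows if r["success"] != "true")
error_rate = errors / len(rows)
p90 = statistics.quantiles(latencies, n=10)[-1]  # 90th percentile cut

print(f"samples={len(rows)} error_rate={error_rate:.2%} p90={p90:.0f} ms")

# Fail the build on thresholds (values here are placeholders).
if error_rate > 0.01 or p90 > 500:
    sys.exit(1)
```

A CI server such as Jenkins would run this as a build step; a non-zero exit code fails the build.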
A Performance Testing Framework
An example: https://github.com/serputko/performance-testing-framework
Distributed Load Testing on AWS
From the AWS Solutions Library
https://aws.amazon.com/solutions/implementations/distributed-load-testing-on-aws/
Closely Integrated Systems
• Sophisticated, but proprietary, closely integrated systems
  ◦ Creating a Virtuous Cycle in Performance Testing at MongoDB
  ◦ Fallout: Distributed Systems Testing as a Service (DataStax)
  ◦ Tracking Performance of the Graal Compiler on Public Benchmarks (Charles University / Oracle Labs)
  ◦ Introducing Ballast: An Adaptive Load Test Framework (Uber)
The Challenge of Decomposition
Decomposition
• For most complex systems, continuous performance testing should be done at the component level / limited scale
  ◦ To align with development
  ◦ System-level requirements -> component-level requirements
  ◦ Record/playback approach -> programming
    – Custom load generation
    – Stubbing / mocking / service virtualization (see the sketch below)
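
As a concrete illustration of stubbing a downstream dependency for component-level load tests, here is a minimal sketch using Python's standard http.server. The port, latency, and payload are hypothetical; real service virtualization tools add recording, templating, and fault injection on top of this idea.

```python
"""Minimal stub for a downstream dependency during component-level
load tests. Endpoint, port, and canned payload are hypothetical."""
import json
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

CANNED = json.dumps({"id": 1, "status": "ok"}).encode()

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Simulate the dependency's typical latency so the component
        # under test sees realistic timing (value is a placeholder).
        time.sleep(0.02)
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(CANNED)))
        self.end_headers()
        self.wfile.write(CANNED)

    def log_message(self, fmt, *args):
        pass  # keep per-request logging quiet under load

if __name__ == "__main__":
    # Point the component under test at localhost:8080 instead of the
    # real downstream service.
    ThreadingHTTPServer(("", 8080), StubHandler).serve_forever()
```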
Result Interpretation [Modeling]
• If the results come from a component-level / small-scale environment, changes should be modeled into end-to-end performance
  ◦ Performance Testing and Modeling for New Analytic Applications
  ◦ And/or confirmed by a full-scale end-to-end performance test
The Challenge of Coverage Optimization
Time / Resource Considerations
• Performance tests take time and resources
  ◦ The larger the test, the more it takes
• Running them on each commit may not be an option
• Need for a tiered solution
  ◦ Some performance measurements on each commit
  ◦ Daily mid-size performance tests
  ◦ Periodic large-scale / uptime tests outside CI
Coverage Optimization
• A multi-dimensional problem
  ◦ Configurations
  ◦ Workloads / tests
  ◦ Frequency of runs
• A trade-off between coverage and costs
  ◦ Costs of running, analyzing, maintenance, etc.
The Challenge
• If addressed seriously, the number of workloads / tests / configurations keeps growing
• There is no universally good way to optimize
• One approach is to check whether some results are correlated
  ◦ If we find the same problems on the same set of tests, we can use just one or a few tests from that group
  ◦ Tracking Performance of the Graal Compiler on Public Benchmarks (Charles University / Oracle Labs)
• Combinatorial testing approaches (pairwise / covering arrays)
  ◦ Borrowed from functional testing (see the sketch below)
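
To illustrate the combinatorial idea, here is a toy sketch of greedy pairwise (2-way) coverage: instead of running the full cross-product of configuration values, it keeps picking configurations until every pair of values has run together at least once. The parameters and values are invented; dedicated covering-array tools are far more capable.

```python
"""Toy sketch of pairwise (2-way) coverage for trimming a test matrix.
Parameters and values are made up for illustration."""
from itertools import combinations, product

params = {
    "instance": ["small", "large"],
    "storage": ["gp3", "io2"],
    "clients": [8, 64],
}
names = list(params)

def pairs(combo):
    """All (param, value) pairs covered by one full configuration."""
    return {
        ((a, combo[i]), (b, combo[j]))
        for (i, a), (j, b) in combinations(enumerate(names), 2)
    }

all_combos = list(product(*params.values()))
uncovered = set().union(*(pairs(c) for c in all_combos))

suite = []
while uncovered:
    # Greedy: take the configuration covering the most remaining pairs.
    best = max(all_combos, key=lambda c: len(pairs(c) & uncovered))
    suite.append(best)
    uncovered -= pairs(best)

print(f"{len(suite)} of {len(all_combos)} configurations:")
for combo in suite:
    print(dict(zip(names, combo)))
```

For these three two-valued parameters the greedy pass selects 4 of the 8 configurations while still exercising every pairwise interaction; the savings grow quickly with more parameters and values.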
The Challenge of Variability
Variability - System
• Inherent to the test setup
Addressing Variability
• Methodological principles for reproducible performance evaluation in cloud computing, 2019 (SPEC RG – Cloud)
• Reducing variability in performance tests on EC2: Setup and Key Results (MongoDB)
• Tracking Performance of the Graal Compiler on Public Benchmarks
Addressing Variability
• Same environment / starting configuration
  ◦ For example, AWS cluster placement groups
• No other load
• Multiple iterations (see the sketch below)
• Reproducible multi-user tests
  ◦ Restarts between tests
  ◦ Clearing caches / warming up caches
  ◦ Staggering / sync points
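
The sketch below shows the "multiple iterations" point in its simplest form: discard warm-up runs, repeat the measurement, and report the median and spread instead of a single number. run_test and all constants are placeholders for a real test execution.

```python
"""Minimal noise-reduction harness: warm-up, repetition, and a spread
metric instead of a single number. run_test is a hypothetical stand-in
for one full test execution."""
import statistics
import time

def run_test() -> float:
    """Placeholder: execute one test iteration, return latency in ms."""
    t0 = time.perf_counter()
    sum(i * i for i in range(200_000))  # stand-in for real work
    return (time.perf_counter() - t0) * 1000

WARMUP, ITERATIONS = 3, 10

for _ in range(WARMUP):
    run_test()  # fill caches, trigger JIT/compilation, etc.; discard

samples = [run_test() for _ in range(ITERATIONS)]

median = statistics.median(samples)
stdev = statistics.stdev(samples)
cv = stdev / statistics.mean(samples)

# A high coefficient of variation means the setup is too noisy for
# build-to-build comparison; investigate before trusting deltas.
print(f"median={median:.1f} ms stdev={stdev:.1f} ms cv={cv:.1%}")
```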
The Challenge of Change Detection
Complex Results
• No easy pass/fail
  ◦ Individual responses, monitoring results, errors, etc.
• No easy comparison (see the sketch below)
  ◦ Against SLAs
  ◦ Between builds
• Variability
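
A minimal sketch of what "no easy comparison" means in practice: compare a percentile between two builds, and flag a regression only when the delta exceeds an assumed noise tolerance. The samples and threshold are invented; with real data the tolerance has to come from measured run-to-run variability.

```python
"""Sketch of a build-to-build comparison with a noise tolerance.
Latency samples and the 5% threshold are illustrative only."""
import statistics

def p95(samples):
    return statistics.quantiles(samples, n=20)[-1]  # 95th percentile cut

# Hypothetical latency samples (ms) from the previous and current build.
baseline = [101, 99, 103, 98, 102, 100, 104, 97, 101, 100]
candidate = [109, 108, 112, 107, 111, 108, 113, 106, 110, 109]

NOISE_TOLERANCE = 0.05  # run-to-run variation considered normal

delta = (p95(candidate) - p95(baseline)) / p95(baseline)
print(f"p95 baseline={p95(baseline):.0f} ms "
      f"candidate={p95(candidate):.0f} ms delta={delta:+.1%}")

if delta > NOISE_TOLERANCE:
    print("possible regression: flag for review")
```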
Simple Comparison
Jenkins Performance Plugin
keptn.sh
Quality Gates
SLIs / SLOs as code
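
Below is an illustrative, tool-agnostic sketch of the "SLOs as code" idea behind quality gates: declare objectives as data, evaluate measured SLIs against them, and gate the pipeline on the result. This is not keptn's actual file format; all names and numbers are made up.

```python
"""Tool-agnostic sketch of an SLO-based quality gate. All SLI names,
bounds, and measured values are hypothetical."""

# Objectives declared as data, e.g. loaded from a versioned config file.
slos = [
    {"sli": "p95_latency_ms", "max": 500},
    {"sli": "error_rate", "max": 0.01},
    {"sli": "throughput_rps", "min": 800},
]

# SLIs measured by the current test run (placeholder values).
measured = {"p95_latency_ms": 430, "error_rate": 0.004, "throughput_rps": 910}

def evaluate(slos, measured):
    failures = []
    for slo in slos:
        value = measured[slo["sli"]]
        if "max" in slo and value > slo["max"]:
            failures.append(f'{slo["sli"]}={value} > {slo["max"]}')
        if "min" in slo and value < slo["min"]:
            failures.append(f'{slo["sli"]}={value} < {slo["min"]}')
    return failures

failures = evaluate(slos, measured)
print("gate:", "fail " + "; ".join(failures) if failures else "pass")
```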
Change Point Detection
• Statistical methods that take noise into consideration (see the sketch below)
• E-Divisive means algorithm
  ◦ ICPE paper: Change Point Detection in Software Performance Testing
  ◦ Fixing Performance Regressions Before they Happen, Netflix Technology Blog
  ◦ https://github.com/mongodb/signal-processing-algorithms
    – Open sourced, generic
  ◦ Needs several data points; may miss a gradual degradation
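
As a rough intuition for what such methods do, here is a toy sketch that locates a single mean shift in a metric history by minimizing within-segment variance. It is a much-simplified cousin of E-Divisive, not the algorithm from the library linked above, and it performs no significance testing.

```python
"""Toy single-change-point detector: find the split that best separates
the series into two segments with different means. Illustrative only;
real tools handle multiple change points and statistical significance."""
import statistics

def best_split(series, min_segment=3):
    """Return (index, score) of the split minimizing combined
    within-segment variance."""
    best_i, best_score = None, float("inf")
    for i in range(min_segment, len(series) - min_segment):
        left, right = series[:i], series[i:]
        score = (statistics.pvariance(left) * len(left)
                 + statistics.pvariance(right) * len(right))
        if score < best_score:
            best_i, best_score = i, score
    return best_i, best_score

# Hypothetical per-build latency history with a shift at build 8.
history = [100, 102, 99, 101, 100, 103, 98, 101, 115, 117, 114, 116, 118]

i, _ = best_split(history)
before, after = statistics.mean(history[:i]), statistics.mean(history[i:])
print(f"candidate change point at build {i}: "
      f"mean {before:.1f} -> {after:.1f} ms")
```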
The Challenge of Advanced Analysis
Keep All Artifacts for Further Analysis
• Get all metrics
  ◦ Throughputs, latencies, resource utilizations, etc.
• Save all related artifacts (see the sketch below)
  ◦ Exact code / configuration
  ◦ Logs, etc.
• The ability to re-run the test in exactly the same configuration is helpful
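
A minimal sketch of recording a run's context alongside its results, so any point on a trend chart can be traced back to the exact code and configuration. The fields, paths, and values are illustrative; it assumes the test runs from a git checkout.

```python
"""Sketch of capturing test-run artifacts for later investigation.
Field names, paths, and values are placeholders."""
import json
import subprocess
import time
from pathlib import Path

def git_sha() -> str:
    # Exact code version under test (assumes a git checkout).
    return subprocess.check_output(
        ["git", "rev-parse", "HEAD"], text=True).strip()

run = {
    "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    "commit": git_sha(),
    "config": {"instance": "large", "clients": 64},            # placeholder
    "metrics": {"p95_latency_ms": 430, "error_rate": 0.004},   # placeholder
    "artifacts": ["results.jtl", "app.log"],  # files saved alongside
}

out = Path("runs") / f"{run['commit'][:12]}-{int(time.time())}.json"
out.parent.mkdir(exist_ok=True)
out.write_text(json.dumps(run, indent=2))
print(f"run metadata saved to {out}")
```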
Root Cause Analysis
• Collecting artifacts to support root cause analysis
• Insight snapshots
  ◦ Flamegraphs (perf, eBPF)
• Continuous profiling
  ◦ Java Flight Recorder
  ◦ APM
  ◦ Tracing
  ◦ Observability
  ◦ eBPF-based tools
Visualization
• Visualizing systems and software performance - Report on the GI-Dagstuhl
• Sometimes helps to catch an issue
Looking Beyond Aggregate Info
The Challenge of Operations and Maintenance
Operations
• Scheduling / executing tests
• Results analysis
• Triaging and escalating issues
• Maintenance
Coverage / Maintenance Trade-Off
[Diagram: coverage balanced against maintenance effort]
Catching / Troubleshooting Errors
• Catching errors is not trivial
  ◦ Building in checks (see the sketch below)
  ◦ Depends on the interfaces used
    – Protocol-level [recording]
    – GUI
    – API / programming
    – Production workloads
• Keeping logs / all information needed to investigate issues
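
To show what "building in checks" can look like at the API/programming level, here is a sketch of a Locust task that marks requests failed on its own criteria rather than only on HTTP errors. The endpoint and expected field are hypothetical; the file would be run with Locust's own CLI (locust -f thisfile.py).

```python
"""Sketch of in-script checks in a Locust load test. The /items/1
endpoint and the expected payload field are hypothetical."""
from locust import HttpUser, task, between

class ApiUser(HttpUser):
    wait_time = between(1, 2)

    @task
    def get_item(self):
        # catch_response lets us mark the request failed on our own
        # criteria, not just on transport/HTTP errors.
        with self.client.get("/items/1", catch_response=True) as resp:
            if resp.status_code != 200:
                resp.failure(f"unexpected status {resp.status_code}")
            elif '"id"' not in resp.text:
                # A 200 with a wrong payload is still an error.
                resp.failure("payload missing expected field")
            else:
                resp.success()
```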
Changing Interfaces
• If using protocol-level or GUI scripts, minor changes may break them
  ◦ It may not be evident
  ◦ If recording is used, a change in the interface may require recreating the whole script
• API / programming scripts are usually more stable / easier to fix
• AI to catch the changes / self-healing scripts
Who Is Doing Maintenance?
• Who is responsible for what?
• Infrastructure code
  ◦ Tools, plumbing code, integration
• Specific tests
• Integrated workloads
  ◦ Covering multiple functional areas
SUMMARY
• Adjusting performance testing to industry trends
• Specific challenges should be addressed:
  ◦ Integration
  ◦ Decomposition
  ◦ Coverage Optimization
  ◦ Variability / Noise Reduction
  ◦ Change Detection
  ◦ Advanced Analysis
  ◦ Operations / Maintenance
• Performance engineering is becoming more integrated and context-dependent
  ◦ Integrated into both Development and Operations
Thank you!
Alex Podelko
podealex@amazon.com

Editor's Notes

• Variability - System: Even if the system is dedicated, nothing else is run on it, and everything is done in exactly the same way, there may still be significant variability, as shown on the graph. These are single-user (!) results for exactly the same search.
• Addressing Variability (references): Alessandro Vittorio Papadopoulos, Laurens Versluis, André Bauer, Nikolas Herbst, Jóakim von Kistowski, Ahmed Ali-Eldin, Cristina Abad, J. Nelson Amaral, Petr Tuma, and Alexandru Iosup. Methodological principles for reproducible performance evaluation in cloud computing. IEEE Transactions on Software Engineering, July 2019. https://atlarge-research.com/pdfs/TSE_2018_Cloud_Benchmarking_Methodology.pdf. Lubomír Bulej, François Farquet, Vojtěch Horký, Petr Tůma. Tracking Performance of the Graal Compiler on Public Benchmarks. LTB 2021. https://ltb2021.eecs.yorku.ca/slides21/tuma-kn.pdf
• Addressing Variability (techniques): We still have multiple ways to address variability, although it may not be trivial. For example: same environment / starting config; no other load; multiple iterations; reproducible (no randomness) multi-user tests. One approach may be concurrent tests (using synchronization points).
• Complex Results: Most load testing tools compare results to SLAs, but that is not very useful for continuous integration, where we want to see the change.
• Simple Comparison: Here is another example of variability, this time in table form for comparison between builds. Again, this is single user, the same setup, and the same automated process.
• Change Point Detection: Fixing Performance Regressions Before they Happen by Angus Croll, Netflix Technology Blog, 2022. https://netflixtechblog.com/fixing-performance-regressions-before-they-happen-eab2602b86fe. Identifying Software Performance Changes Across Variants and Versions. ASE, 2020. https://www.se.cs.uni-saarland.de/publications/docs/MAS+20.pdf
• Visualization: Visualizing systems and software performance - Report on the GI-Dagstuhl, 2018. https://doi.org/10.7287/peerj.preprints.27253v1
• Looking Beyond Aggregate Info: Building reporting / alerting around averages (or other aggregated results) may disguise trends / issues visible in individual graphs.
• Changing Interfaces: The traditional approach to performance testing, recording/playback at the protocol level (and at the GUI level too), is notoriously fragile, especially during the early stages of a system's lifecycle. It adds a lot of overhead for maintaining the scripts and requires sophisticated logic to avoid false negative and false positive results, especially in the case of Continuous Integration. Using APIs is usually more robust when APIs are available and you know well how they are used, but that is often not the case.
  • #40 Traditional approach to performance testing – recording/playback on the protocol level (and on GUI level too) - is notoriously prone to change / fragile, especially during early stages of system’s lifecycle. It adds a lot of overheads maintaining the scripts and the need to add sophisticated logic to avoid false negative and positive results, especially in case of Continuous Integration. Using APIs is usually more robust when you have APIs available and know well how it is used – but it often not the case.