How Twitch reduced GC pauses in Go from seconds to milliseconds

Prateek Jain

✅ Performance Architect ✅ Java + Database Performance Engineer ✅ SRE ✅ Platform Engineering, Full Stack Dev

Ever wondered how large-scale systems cut GC pauses from seconds to milliseconds, and which of those lessons you can apply in your own stack? Twitch's journey with Go's garbage collector is an eye-opener, and a must-read for everyone in #performanceengineering. Key points:

1️⃣ GC pause reduction: multi-second pauses brought down to under 1 ms, achieved through GC tuning and runtime evolution from Go 1.2 to Go 1.7.

2️⃣ Concurrent and incremental GC: Go 1.5 introduced a mostly-concurrent collector, a roughly 10x latency reduction that let Twitch chat survive massive loads smoothly.

3️⃣ Root-cause analysis: performance profiling uncovered GC bottlenecks (stack shrinking, finalizer scanning), with fixes shipped in Go 1.6 and Go 1.7.

4️⃣ NUMA and memory locality: NUMA page faults were a hidden latency culprit; pinning and tuning cut pauses from ~70 ms to 10–15 ms.

Outcome: GC improvements that benefited the whole Go ecosystem.
