# What is the Memory Model?
When we talk about the "Memory Model" in Go, we are usually discussing two interconnected concepts:
- Physical Memory Allocation: How Go decides where variables live (Stack vs Heap) and how it cleans them up (Escape Analysis and Garbage Collection).
- Concurrency Visibility: The strict rules defining when a piece of data modified by one Goroutine is guaranteed to be visible to another Goroutine (the formal Go Memory Model document).
Understanding these two concepts is the gateway to writing high-performance, expert-level Go code.
# Level 1: Stack vs Heap (The Backpack vs The Storage Unit)
Imagine every time a function runs, it gets a temporary Backpack to hold its variables. That backpack is called the Stack.
- It is incredibly fast.
- When the function finishes, it simply drops the backpack on the floor and walks away. Zero cleanup overhead. No sweeping required.
Now imagine a massive Storage Unit located miles away. That is the Heap.
- It is huge and accessed by all functions across your app using an address (a pointer).
- It is slower to access.
- Because everyone shares it, you need a janitor (the Garbage Collector) to constantly drive down there, figure out which boxes are abandoned, and throw them away.
# Level 2: Escape Analysis (How Go Decides)
In Go, you do not use malloc() or new/delete to decide where memory goes. The Go compiler runs an algorithm called Escape Analysis to figure it out for you.
The rule is simple: If the compiler can prove that a variable is not needed after the function returns, it stays on the incredibly fast Stack. If the variable "escapes" the function, it MUST go to the Heap.
```go
// Example A: Stays in the backpack! (Stack)
func FastPath() int {
	x := 10      // Born on the stack
	return x * 2 // Returned by COPY. x dies instantly.
}

// Example B: Sent to the Storage Unit! (Heap)
func SlowPath() *int {
	x := 10 // Born on the stack? No.
	// We are returning a POINTER to x.
	// The calling function will still need 'x' after SlowPath returns.
	// Therefore, x "escapes" to the Heap!
	return &x
}
```
The Pointer Myth: Beginners often assume "passing pointers is faster" because they don't want to copy structs. But if passing that pointer forces the struct onto the Heap, the Garbage Collection penalty can easily make it 10x slower than simply copying a small 64-byte struct on the Stack!
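You don't have to guess which path the compiler chose: the gc toolchain can print its escape-analysis decisions with `go build -gcflags="-m"`. A minimal sketch of both paths (the exact diagnostic wording, such as "moved to heap: x", varies between Go versions):

```go
package main

import "fmt"

// Run: go build -gcflags="-m" main.go
// The compiler prints its escape-analysis decisions; for SlowPath
// expect a line similar to "moved to heap: x".

// FastPath returns by value; x never outlives the call.
func FastPath() int {
	x := 10
	return x * 2 // copied out; x stays on the stack
}

// SlowPath returns a pointer; &x outlives the call, so x escapes.
func SlowPath() *int {
	x := 10
	return &x // forces x onto the heap
}

func main() {
	fmt.Println(FastPath(), *SlowPath())
}
```

Running the build with `-gcflags="-m"` on your own hot paths is the quickest way to check whether a "harmless" pointer return is quietly forcing heap allocations.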
# Level 3: Garbage Collection Symptoms
When a variable goes to the Heap, you pay a tax. That tax is the Garbage Collector (GC).
The GC has to periodically scan the Heap to find abandoned memory. To do this safely, it briefly pauses your application (a "Stop-The-World" pause). In Go, these pauses are generally under a millisecond, but if a high-traffic service allocates heavily on the heap, the GC will start stealing your actual CPU cores just to keep up with the trash.
## Memory Leaks in Go
Go has a GC, so how can you have a memory leak? By accidental hoarding!
```go
var globalCache = make(map[string][]byte)

func CacheData(key string, data []byte) {
	// If we never delete old keys...
	// The GC sees 'globalCache' still pointing to 'data',
	// so it is FORBIDDEN from cleaning it up.
	// The memory grows forever until the server crashes.
	globalCache[key] = data
}
```
The classic symptom of a memory leak is a server's RAM usage climbing like a perfect staircase over weeks or months, until the OS abruptly kills the application with an "OOM (Out Of Memory) Killed" event.
# Level 4: The Concurrency "Happens Before" Rule (Expert)
The formal "Go Memory Model" actually refers to concurrency. Modern CPUs have multiple cores, and every core has its own private L1/L2 Cache (temporary notepad).
Because of these private caches, if Goroutine 1 writes x = 5 on Core 1, Goroutine 2 running on Core 2 might still see x == 0 because Core 1 hasn't flushed its cache to Main RAM yet!
To guarantee that Goroutine 2 sees the number 5, you have to create a "Happens Before" synchronization event.
```go
package main

import "fmt"

var msg string
var done = make(chan bool)

func setup() {
	msg = "hello, world" // Write (Core 1)
	done <- true         // Synchronization event!
}

func main() {
	go setup()
	// We block here until 'true' arrives.
	<-done
	// The Go Memory Model GUARANTEES that because the channel send
	// "happens before" the channel receive, the write to 'msg'
	// MUST be visible here. It safely prints "hello, world".
	fmt.Println(msg)
}
```
Mutexes (sync.Mutex) and Channels (chan) are the two primary tools Go gives you to establish formal "Happens Before" memory boundaries across CPU caches.