
Explain what a goroutine is in Go. How does it work under the hood and how does it differ from threads in other languages?


Answer

A goroutine is a lightweight thread of execution managed by the Go runtime. To start a new goroutine, the go keyword is placed before a function call. Under the hood, the runtime allocates a small descriptor (the g struct) that holds the goroutine's stack and state, and places it on the scheduler's run queue.

Unlike OS threads, a goroutine starts with a much smaller stack (typically 2 KB), which grows and shrinks automatically as needed. The Go scheduler multiplexes many goroutines onto a small number of OS threads (the M:N model).

Key differences from threads:

  • They are created faster and require less memory
  • Scheduled by the Go runtime, not the OS
  • Automatically scaled across cores
  • Idiomatic synchronization via channels ("share memory by communicating") helps avoid many classes of race conditions

Example of using goroutines and channels:

```go
package main

import (
	"fmt"
	"time"
)

func worker(id int, jobs <-chan int, results chan<- int) {
	for j := range jobs {
		fmt.Printf("worker %d started job %d\n", id, j)
		time.Sleep(time.Second) // simulate work
		fmt.Printf("worker %d finished job %d\n", id, j)
		results <- j * 2
	}
}

func main() {
	jobs := make(chan int, 5)
	results := make(chan int, 5)

	// Start three workers that read from the shared jobs channel.
	for w := 1; w <= 3; w++ {
		go worker(w, jobs, results)
	}

	// Send five jobs, then close the channel so range loops terminate.
	for j := 1; j <= 5; j++ {
		jobs <- j
	}
	close(jobs)

	// Drain all five results before exiting.
	for a := 1; a <= 5; a++ {
		<-results
	}
}
```

Trick question

Can Go goroutines run in parallel on multiple cores?

A common misconception is: "No, because Go uses green threads." In fact, goroutines do run in parallel: since Go 1.5 the GOMAXPROCS setting defaults to the number of available CPU cores, and it can be adjusted via the GOMAXPROCS environment variable or the runtime.GOMAXPROCS(n) call.

Example:

```go
import "runtime"

func main() {
	runtime.GOMAXPROCS(4) // allow up to 4 OS threads to execute Go code simultaneously
	...
}
```

Examples of real bugs caused by misunderstanding this topic


Story

In a backend service written in Go, a worker pool was built on goroutines, but the developers forgot to bound the number of goroutines running concurrently. Under increased load the application spawned thousands of goroutines, exhausting memory and crashing the service. The issue was resolved by capping the number of active goroutines (for example, with a semaphore or a fixed-size worker pool).


Story

One employee synchronized data between goroutines incorrectly, using plain global variables without mutexes or channels. This caused a race condition that led to occasional errors during payment processing. The problem was discovered only after deployment to production; running tests with the built-in race detector (go test -race) would have caught it earlier.


Story

In the parsing service, closed channels inside a select were mishandled. A receive from a closed channel does not block; it succeeds immediately with the zero value, so after the channel was closed that select case kept firing, making goroutines spin and process garbage data. This was fixed by assigning nil to the closed channel (a receive from a nil channel blocks forever, so select skips that case) and checking the ok flag on each receive.