Rust Vs Go: One Is A Scam — Prove Me Wrong

From JOHNWICK

I have shipped production systems in both, argued with teammates about both, and changed my mind more than once. This is the part most threads skip, written in plain words and grounded in real incidents.

The Hook No One Likes To Admit

Teams rarely pick Rust or Go only for speed. They pick a feeling.
Go feels calm: quick to start, easy to hire, friendly standard library.
Rust feels strict: slower at the start, fewer hidden traps later.
Both can be true. The “scam” begins when we sell one truth as the whole truth.

What Actually Hurts In Prod

Incidents do not arrive with big warnings. They stack quietly.
A goroutine that never stops.
Shared state touched from two places because it looked harmless.
A small subslice that keeps a large buffer alive.
A queue spike that turns garbage collection from hero to suspect.
On the Rust side, the pain looks different: longer onboarding, slow builds on big graphs, unsafe creeping in when dates loom.
Neither language is a scam. Oversold defaults are.
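The subslice trap above is easy to reproduce. A minimal sketch, with an illustrative `readHeader` and a made-up 10 MB buffer: returning a small slice of a large buffer keeps the entire backing array reachable, while copying the few bytes you need lets the garbage collector reclaim it.

```go
package main

import "fmt"

// readHeader extracts the first 16 bytes of a large buffer.
// Returning buf[:16] directly would keep the whole 10 MB backing
// array alive for as long as the header is referenced; copying
// into a fresh slice detaches it.
func readHeader() []byte {
	buf := make([]byte, 10<<20) // stand-in for a large network read
	buf[0] = 0x7f

	header := make([]byte, 16)
	copy(header, buf[:16]) // copy releases the large backing array
	return header
}

func main() {
	h := readHeader()
	fmt.Println(len(h), cap(h)) // cap 16, not 10 MB: nothing pinned
}
```

The tell is the capacity: a subslice of the original buffer would report a capacity in the millions, the copy reports 16.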

One Incident That Changed My Mind

A rollout pushed a simple fan-out worker. Nothing fancy. Within minutes, concurrent goroutines climbed from 3,800 to 24,600. The p99 moved from 115 ms to 420 ms. Resident memory rose from 900 MB to 2.3 GB. Root cause: a loop that launched work without a firm cap, plus a small subslice pinning a large backing array. We swapped the worker for a bounded pattern and let cancellation flow through the whole path. The graph eased. The room could breathe again.

Before: Goroutines ≈ 24,600 | p99 ≈ 420 ms | RSS ≈ 2.3 GB
After:  Goroutines ≈ 4,100  | p99 ≈ 150 ms | RSS ≈ 1.1 GB
Change: -83% goroutines, -64% p99, -1.2 GB memory (same hosts, same load window)

Copy-Paste Fix That Stops Silent Leaks

Drop-in for Go HTTP handlers or workers. It caps fan-out, respects cancellation, and avoids unbounded slices. It is boring on purpose.

package main

import (
 "context"
 "fmt"
 "net/http"
 "sync"
 "time"
)

func handler(w http.ResponseWriter, r *http.Request) {
 ctx := r.Context()

 // Example workload; replace with real items.
 jobs := make([]int, 100)
 for i := range jobs { jobs[i] = i + 1 }

 // Bounded parallelism.
 sem := make(chan struct{}, 8)
 results := make(chan int, len(jobs))

 var wg sync.WaitGroup
loop:
 for _, job := range jobs {
  // Acquire a slot, or stop launching new work once the request is cancelled.
  select {
  case sem <- struct{}{}:
  case <-ctx.Done():
   break loop
  }
  wg.Add(1)
  go func(n int) {
   defer wg.Done()
   defer func() { <-sem }()
   // Respect cancellation at every hop.
   select {
   case <-ctx.Done():
    return
   default:
    time.Sleep(10 * time.Millisecond) // Simulated work
    val := n * n
    select {
    case results <- val:
    case <-ctx.Done():
     return
    }
   }
  }(job)
 }

 go func() {
  wg.Wait()
  close(results)
 }()

 sum := 0
 for {
  select {
  case v, ok := <-results:
   if !ok {
    fmt.Fprintf(w, "sum=%d\n", sum)
    return
   }
   sum += v
  case <-ctx.Done():
   return
  }
 }
}

func main() {
 http.HandleFunc("/work", handler)
 _ = http.ListenAndServe(":8080", nil)
}

Where Go Wins Without Drama

Go helps teams move. The defaults steer you to a working server fast. The standard library gives you HTTP, tracing hooks, profiling, and a mental model that most engineers can pick up in a week. For service glue, worker pools, CLIs, and simple APIs, Go gets you from idea to traffic before the coffee cools.

Where Rust Pays Rent Every Night

Rust helps teams sleep. The borrow checker blocks whole classes of mistakes that show up under pressure. For parsers, hot data paths, crypto, engines, and anything with high fan-out and tight memory, Rust’s strictness is not a hurdle; it is a seatbelt. You pay with learning time and build time, and you get predictability when load arrives.

A Hand-Drawn Map Of A Real Service

This is how I draw the split when I want both speed and safety without heroics.

Client --> [API Gateway] --> [Order Service] --> [Hot Path]
                                |                 |
                                |                 --> [Price Cache]
                                |                 --> [DB Write]
                                v
                            [Outbox/Queue] --> [Notifier]

Go   : Gateway, Outbox Worker, Notifier (ship fast, easy ops)
Rust : Hot Path (parse/price/write with clear memory bounds)

Mixing is not betrayal. Mixing is maturity.

The Scam Test In Plain Words

Ask three questions before you bet a quarter on one language.
Can juniors ship safely without hidden leaks or races?
Does the language help your hot path instead of fighting it?
Can you debug under pressure with tools and a model the whole team understands?
If you fail two out of three, the choice is scamming you, not your users.

How To Choose Without Regret

Write glue in Go. Write the hot loop in Rust. Draw a hard boundary like the map above. Measure on your hardware, with your data, under your normal burst shape. Hold the line when hurried edits try to cross it.
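Measuring on your own hardware does not require a full harness. A minimal sketch, assuming a stand-in `processOrder` function for your hot path: Go's `testing.Benchmark` produces a stable ns/op figure from a plain program, no `go test` ceremony needed.

```go
package main

import (
	"fmt"
	"testing"
)

// processOrder is a placeholder for your hot path; swap in real work.
func processOrder(n int) int {
	sum := 0
	for i := 0; i < n; i++ {
		sum += i * i
	}
	return sum
}

func main() {
	// testing.Benchmark runs the closure with increasing b.N until
	// the timing is stable, exactly like a normal benchmark would.
	res := testing.Benchmark(func(b *testing.B) {
		for i := 0; i < b.N; i++ {
			processOrder(1000)
		}
	})
	fmt.Printf("%d iterations, %d ns/op\n", res.N, res.NsPerOp())
}
```

Run it on the boxes that serve traffic, with your burst shape, before believing any number from a blog post, including this one.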

The Part I Changed My Mind About

I once believed the language decided success. It does not. The defaults decide how often you bleed.
Go’s defaults make shipping easy and leaks easy.
Rust’s defaults make safety easy and ramp-up hard.
Call the “scam” by its real name: overselling a default as free.

Final Challenge

Disagree? Paste one real incident where Go saved you or Rust slowed you. Add three numbers like the block above. I will read every word and, if your receipts beat mine, I will change my mind again.