Cache
Built-in manual caching system with TTL support in TurboGo.
Configure Manual Cache
TurboGo provides a lightweight manual caching system that lets you store responses or computed values in memory with a custom TTL (time-to-live), accessed through the request context.
✅ Default route-level automatic caching is no longer used. You now have full control via ctx.Cache.Memory.
Enable Cache
To activate the caching system, initialize TurboGo with .WithCache():
app := TurboGo.New().WithCache()
✅ You can also chain it with other modules:
app := TurboGo.New().WithCache().WithQueue().WithPubsub()
This sets up a concurrency-safe, in-memory TTL cache that can be used inside any handler.
Example Usage
package main

import (
    "fmt"
    "time"
    // also import TurboGo and its core package from your module path
)

func main() {
    app := TurboGo.New().WithCache()

    app.Get("/", func(ctx *core.Context) {
        key := "my-key"

        // Serve from cache while the entry is within its TTL.
        val, found := ctx.Cache.Memory.Get(key)
        if found {
            fmt.Println("✅ Cache HIT:", key)
            ctx.Text(200, "From Cache: "+string(val))
            return
        }

        // Cache miss: compute the value and keep it for 10 seconds.
        fmt.Println("🔄 Cache MISS:", key)
        result := "Hello, TurboGo!"
        ctx.Cache.Memory.Set(key, []byte(result), 10*time.Second)
        ctx.Text(200, "Generated: "+result)
    })

    app.Listen(":8080")
}

How It Works
- Each request gets access to ctx.Cache.Memory, a concurrency-safe in-memory TTL cache.
- You control what gets stored, when it expires, and how it's fetched.
- No automatic route-based caching means you can cache anything: computed data, responses, third-party calls, etc. (see the helper sketch below).
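Because the cache is just Get and Set on byte slices, a small helper can capture the get-or-compute pattern once and reuse it across handlers. The sketch below is an assumption, not part of TurboGo's API: getOrSet and buildDailyReport are hypothetical names, and it assumes Get returns ([]byte, bool) as in the example above.

// getOrSet returns the cached value for key, or computes it,
// stores it with the given TTL, and returns the fresh value.
func getOrSet(ctx *core.Context, key string, ttl time.Duration, compute func() []byte) []byte {
    if val, found := ctx.Cache.Memory.Get(key); found {
        return val
    }
    val := compute()
    ctx.Cache.Memory.Set(key, val, ttl)
    return val
}

A handler then only states what to compute and how long to keep it:

app.Get("/report", func(ctx *core.Context) {
    data := getOrSet(ctx, "daily-report", time.Minute, func() []byte {
        return buildDailyReport() // hypothetical expensive computation
    })
    ctx.Text(200, string(data))
})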
Use Cases
- Cache database or external API responses per user/request (see the sketch after this list)
- Store computed results temporarily for performance
- Reduce redundant processing on high-traffic endpoints
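For the first use case, folding a user identifier into the cache key gives every user their own entry while still sharing one in-memory store. This is a hedged sketch: the /profile route, userIDFrom, and loadProfileFromDB are hypothetical placeholders for your own auth lookup and data access.

app.Get("/profile", func(ctx *core.Context) {
    // Per-user key: each user's response is cached independently.
    userID := userIDFrom(ctx) // hypothetical helper; read the ID from your auth layer
    key := "profile:" + userID

    if cached, found := ctx.Cache.Memory.Get(key); found {
        ctx.Text(200, string(cached))
        return
    }

    // Cache miss: query the database once, then keep the result for 5 seconds.
    profile := loadProfileFromDB(userID) // hypothetical helper returning []byte
    ctx.Cache.Memory.Set(key, profile, 5*time.Second)
    ctx.Text(200, string(profile))
})

Short TTLs like this mainly absorb bursts of identical requests without serving stale data for long.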