Is OpenAI really running ChatGPT on a single PostgreSQL instance?

Originally published at matevosian.tech · 1 min read

The headline of OpenAI’s recent article (https://openai.com/index/scaling-postgresql/) feels a bit clickbaity. If they had truly used only one database instance, ChatGPT would have been dead on arrival.

But the reality is far more impressive: they’re using the right tool for each layer of the persistent stack:

  • They push PostgreSQL to its absolute limits: a single primary writer, yes, but backed by nearly 50 read replicas spread across global regions.

  • For write-heavy workloads, they’ve wisely migrated to sharded systems like Azure Cosmos DB.

  • They’ve added layers of resilience:

    • connection pooling with PgBouncer,
    • query rate limiting,
    • caching with lock leasing,
    • cascading replication (in testing),
    • strict schema-change policies.
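The single-writer/many-readers split above can be sketched as a simple query router. This is a minimal illustration, not OpenAI's actual setup: the DSNs, the `Router` class, and the SELECT-prefix heuristic are all hypothetical assumptions for the sake of the example (real routing typically lives in a proxy layer such as PgBouncer or the application's DB driver).

```python
import itertools

# Hypothetical connection strings -- illustrative only.
PRIMARY_DSN = "postgresql://primary.internal:5432/app"
REPLICA_DSNS = [
    "postgresql://replica-us.internal:5432/app",
    "postgresql://replica-eu.internal:5432/app",
    "postgresql://replica-ap.internal:5432/app",
]

class Router:
    """Send writes to the single primary; spread reads round-robin over replicas."""

    def __init__(self, primary: str, replicas: list[str]) -> None:
        self.primary = primary
        self._replicas = itertools.cycle(replicas)

    def dsn_for(self, sql: str) -> str:
        # Naive classification: only plain SELECTs go to a replica;
        # everything else (INSERT/UPDATE/DELETE/DDL) hits the primary.
        if sql.lstrip().upper().startswith("SELECT"):
            return next(self._replicas)
        return self.primary

router = Router(PRIMARY_DSN, REPLICA_DSNS)
print(router.dsn_for("SELECT * FROM users WHERE id = 1"))   # one of the replicas
print(router.dsn_for("UPDATE users SET name = 'a' WHERE id = 1"))  # the primary
```

Note the heuristic is deliberately crude: a SELECT inside a write transaction, or one needing read-your-writes consistency, would still have to go to the primary in practice.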

It’s not magic; it’s mature, thoughtful engineering at scale.

If you work with databases, this post is absolutely worth reading.
