Faster AI Experimentation, Worse Products

Originally published at valentineshi.dev · 2 min read

The AI storm has surfaced an opinion that in AI-accelerated product teams, traditional "source of truth" artifacts (Figma files, design documents, software models) become obsolete because products evolve too quickly: the codebase is modified directly and iteratively instead of changes undergoing centralized planning, team alignment, and documentation.


With this in mind, I increasingly see a dangerous assumption emerging in AI-era product discussions: because AI lowers the cost of generating code, experimentation becomes "cheap".

It does not, and here is why.

AI mainly reduces the direct cost of making a single change and increases the speed of making it. It does not, however, reduce the cost of absorbing that change, nor does it speed absorption up. If anything, it makes absorption harder.

And absorption is where most real cost lives:

  • stakeholder alignment
  • validation
  • operational consistency
  • customer adaptation
  • maintaining coherent product decisions over time

Generating changes directly into a codebase actually breaks its ability to serve as the "single source of truth": product functionality cannot be validated by reading the code.

Moreover, changes require alignment beforehand with non-engineering stakeholders (owners, domain experts, customer proxies, etc.). A codebase cannot serve as a discussion artifact for them.

Strategically, a codebase is useless as a source of truth too. Continuity and logical product evolution are king for sustainable business growth, and a codebase cannot serve as the basis for product decision-making.

So I suggest that without explicit, continuously maintained software engineering artifacts, effective alignment is impossible.

Ironically, this becomes visible even in teams of two. One person rapidly experimenting without a shared, mindful direction can easily create waste for both people despite "moving fast".

The bottleneck is no longer coding speed; arguably, it never was. The bottleneck is (1) the speed of collective cognition and alignment, (2) the ability of the target market to absorb changes, (3) surprisingly, the accumulated cost (direct plus opportunity cost) of mindless experimentation, and (4) product decision-making.

None of this is solved by faster code generation, and some of it is aggravated by it.

In the AI era, engineering artifacts gain power and value: they become the real keepers of continuity and of sound, logical business decisions, and, ultimately, the cornerstones of product success.

