I'm a designer, not a developer. Here's how I prepared my PHP blog for AI agents.

Originally published at shinobis.com · 2 min read

I need to be upfront about something: I'm not a developer. I'm a UX/UI designer with over 10 years in banking and fintech. I built my blog with PHP and Claude as my coding partner. Every technical decision was a conversation with an AI, not a Stack Overflow deep dive.

So when Cloudflare released a tool that tests how ready your site is for AI agents, I was curious but skeptical. How hard could this be for someone who learned PHP three months ago?

Turns out, not hard at all. My blog scored 50/100, and it took less than an hour.

What AI agents actually need from your site

Forget the technical jargon for a second. AI agents are like new visitors who can't see your design. They don't care about your beautiful layout or your carefully chosen typography. They want your content, clean and structured.

Most websites give them a mess. Navigation bars, cookie banners, JavaScript bundles, social media widgets. The agent has to dig through all that noise to find the actual article. It's like handing someone a book wrapped in newspaper.

The four things I implemented basically unwrap the newspaper:

  1. A file that says "here's who I am": llms.txt. A simple Markdown file at my domain root. It tells AI models: I'm a designer writing about AI tools, here are my best articles, here are my free tools. Five minutes to write. I even built a free generator so you don't have to write it from scratch.
  2. One line in robots.txt. It tells AI crawlers: cite me, but don't train on my content. Thirty seconds to add.
  3. A clean version for agents. When an AI agent asks for my page, my server sends back plain Markdown instead of HTML. No navigation, no scripts, just the article. Twenty minutes to set up with Claude's help.
  4. A list of my tools. A JSON file that tells agents what they can do on my site. Five minutes.
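To make step 1 concrete, here's a sketch of what an llms.txt can look like. The headings follow the llms.txt convention (an H1, a blockquote summary, then link lists); the specific articles and URLs below are illustrative placeholders, not my actual file:

```markdown
# Shinobis

> I'm a UX/UI designer writing about AI tools and design. This site
> hosts my articles and a few free tools.

## Articles
- [Example article](https://shinobis.com/example-article): one-line summary of the post

## Tools
- [Example tool](https://shinobis.com/tools/example): what the tool does
```

Drop the file at your domain root (e.g. `/llms.txt`) so agents can find it at a predictable path.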
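For step 2, the "cite me, don't train" signal in robots.txt: I believe this refers to Cloudflare's Content Signals convention, which adds a `Content-Signal` line to an existing robots.txt rule group. A sketch, with the exact syntax worth verifying against the current proposal:

```text
User-agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no
Allow: /
```

Here `search=yes` allows indexing with attribution, `ai-input=yes` lets agents read the content to answer questions, and `ai-train=no` opts out of model training.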
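For step 3, the idea is simple content negotiation: if the request asks for Markdown, serve a pre-rendered .md file instead of the HTML template. A minimal PHP sketch of that routing (my real setup was written with Claude's help and differs in detail; the `markdown/` directory is a hypothetical path):

```php
<?php
// Serve a plain-Markdown version of the page to clients that ask for it.
// Assumes each article has a pre-rendered copy at markdown/<slug>.md.

$accept = $_SERVER['HTTP_ACCEPT'] ?? '';
$slug   = basename(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH));
$mdFile = __DIR__ . "/markdown/{$slug}.md"; // hypothetical location

if (str_contains($accept, 'text/markdown') && is_file($mdFile)) {
    header('Content-Type: text/markdown; charset=utf-8');
    readfile($mdFile);
    exit; // skip the normal HTML template entirely
}

// Otherwise fall through and render the usual HTML page.
```

A human browser never sends `Accept: text/markdown`, so regular visitors still get the full design.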
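And for step 4, the tools manifest is just a small JSON file describing what an agent can do on the site. The field names and URL below are illustrative; the actual schema depends on which standard the readiness test checks for:

```json
{
  "tools": [
    {
      "name": "llms-txt-generator",
      "description": "Generate an llms.txt file for your own site",
      "url": "https://shinobis.com/tools/example-generator"
    }
  ]
}
```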

Why I skipped the other 6 standards

The test checks 10 things. Six of them are for platforms like Stripe or Salesforce: things with APIs, user accounts, and authentication flows. My blog has none of that. Implementing OAuth on a blog with no login would be absurd.

What I learned

The most interesting part wasn't the implementation. It was the realization that preparing your site for AI agents is similar to good UX design. You're making content accessible to a different type of user, one that processes information differently than humans.

As a designer, that clicked immediately. We don't design the same interface for mobile and desktop. Why would we serve the same messy HTML to a human browser and an AI agent?

For anyone in the community with a personal blog or portfolio: have you thought about how AI agents see your site? I'm curious if others have tried the test and what they found.
