Pattern pickup in practice

In my post about AI agents as power tools I wrote that agents are good at picking up existing patterns. Here’s a concrete example of what that looks like—both the strengths and the limits.

The setup

I have a project using SQLite with a bespoke database versioning and migration system. Nothing fancy, just a version number stored in the database and a set of migration functions that upgrade from one version to the next. The kind of thing you write when you need migrations but don’t need a framework.
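A minimal sketch of this kind of system, assuming SQLite's `user_version` pragma holds the version number. All names here (`MIGRATIONS`, `migrate`, the table and columns) are invented for illustration, not the project's actual code:

```python
import sqlite3

SCHEMA_VERSION = 2  # bump this constant when adding a migration

# Hypothetical migrations: one function per version bump.
def _migrate_v0_to_v1(conn):
    conn.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")

def _migrate_v1_to_v2(conn):
    conn.execute("ALTER TABLE notes ADD COLUMN created_at TEXT")

# Migration map, keyed by the version each function upgrades *from*.
MIGRATIONS = {0: _migrate_v0_to_v1, 1: _migrate_v1_to_v2}

def migrate(conn):
    version = conn.execute("PRAGMA user_version").fetchone()[0]
    while version < SCHEMA_VERSION:
        MIGRATIONS[version](conn)
        version += 1
        conn.execute(f"PRAGMA user_version = {version}")
    conn.commit()
```

That's the whole mechanism: a stored version, a map of functions, and a loop.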

What worked

While implementing a new feature, Claude needed to add a column to an existing table. Without prompting, it:

  • Created the migration function following the existing naming convention
  • Registered it in the migration map
  • Updated the schema version constant
  • Wrote the ALTER TABLE statement in the style of the existing migrations
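In a bespoke system of this shape, those four steps amount to roughly the following. Every table, column, and function name here is invented for illustration:

```python
import sqlite3

SCHEMA_VERSION = 8  # step 3: version constant bumped from 7

def _migrate_v6_to_v7(conn):
    # An existing migration, showing the established style.
    conn.execute("ALTER TABLE items ADD COLUMN updated_at TEXT")

def _migrate_v7_to_v8(conn):
    # Steps 1 and 4: new function, same naming convention, same ALTER TABLE style.
    conn.execute("ALTER TABLE items ADD COLUMN archived INTEGER DEFAULT 0")

MIGRATIONS = {
    # ... earlier entries elided ...
    6: _migrate_v6_to_v7,
    7: _migrate_v7_to_v8,  # step 2: registered in the migration map
}
```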

This is a bespoke mechanism in a single codebase. No documentation. No framework it could have been trained on. It read the existing code, understood the pattern, and extended it correctly.

This is pattern amplification working in your favor. Establish a clear pattern, and the agent will follow it.

What didn’t work

The migration system only supported direct upgrades—each version bump needed an explicit migration. To go from v5 to v8, you’d run v5→v6, then v6→v7, then v7→v8. This was a known limitation I’d never bothered to fix.

Claude didn’t suggest fixing it. It just worked within the constraint, writing one migration at a time. The pattern it observed was “write direct migrations,” so that’s what it did.

Once I realized that implementation had become cheap enough to justify fixing this, I had Claude implement skip migrations—the ability to jump directly from v5 to v8 when the intermediate steps aren’t necessary. It did this correctly too, once asked.
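One way to sketch skip migrations is to key the map by (from, to) pairs and have the runner prefer the longest available jump from the current version. This is an assumed design, not the project's actual implementation; the stub migrations just record what ran:

```python
import sqlite3

SCHEMA_VERSION = 8
applied = []  # records which migrations ran, for illustration

def _step(frm, to):
    def run(conn):
        applied.append((frm, to))  # a real migration would alter the schema here
    return run

MIGRATIONS = {
    (5, 6): _step(5, 6),
    (6, 7): _step(6, 7),
    (7, 8): _step(7, 8),
    (5, 8): _step(5, 8),  # skip migration: jumps straight from v5 to v8
}

def migrate(conn):
    version = conn.execute("PRAGMA user_version").fetchone()[0]
    while version < SCHEMA_VERSION:
        # Prefer the migration that jumps furthest from the current version.
        target = max(to for (frm, to) in MIGRATIONS if frm == version)
        MIGRATIONS[(version, target)](conn)
        version = target
        conn.execute(f"PRAGMA user_version = {version}")
    conn.commit()
```

A database at v5 upgrades in a single hop; one at v6 still falls back to the direct v6→v7 and v7→v8 steps.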

Where judgment was still needed

Later, a situation arose where skip migrations actually mattered. Version 8 introduced an expensive index. Version 9 changed the index to different columns. For a database upgrading from v7 or earlier, running v7→v8→v9 would build an index only to immediately drop and rebuild it.

The right call: add a v7→v9 skip migration that goes directly to the final index.

Claude didn’t suggest this. It followed the pattern—write a v8 migration, write a v9 migration—because that’s what the codebase showed. It couldn’t recognize that this particular sequence of changes made a skip migration worthwhile.

This is the “no judgment” limitation. The agent can see that skip migrations exist. It can implement them when asked. It did not recognize when the situation called for one. That requires understanding the cost of the intermediate work and deciding it’s not worth doing. That’s judgment about the problem, not just the code.

In a different project, adding a directive telling the agent to evaluate whether a skip migration would be beneficial whenever it writes database changes helped it catch these opportunities. This is another case where rails mitigate a lack of judgment.
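The directive itself is just a short instruction in the agent's project guidance; the wording below is a paraphrase, not the original:

```
When writing database migrations, check whether a later migration in the
same change undoes or redoes work done by an earlier one. If so, propose
a skip migration that goes directly to the final state.
```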

The pattern

This example illustrates the dynamic I keep coming back to:

  • Agent picks up mechanical patterns reliably
  • Agent executes correctly within established rails
  • Agent doesn’t recognize when to deviate from the pattern
  • Human provides the judgment about when; agent handles the how

The migration mechanism worked because I’d established a clear pattern. The skip migration optimization didn’t happen automatically because recognizing the opportunity required reasoning about costs and trade-offs the agent doesn’t do unprompted.

The work still got done faster than it would have without the agent. But the architecture decisions—including “this is a good time for a skip migration”—remained mine.