The anatomy of DESIGN.md and how it changes everything we know about design

Zoltan Szogyenyi

I am writing this as someone who, in the past month, has created over 50 design skill files, launched them even before Google authored DESIGN.md, and has been losing my mind over how much everything will change in the design world.

The file in question is the blueprint Google created after launching Google Stitch. Even before that, we at TypeUI had created design skill files (see our Reddit launch) that could be used to train AI models to generate interfaces based on a given style guide, accessibility standards, and design system.

So which came first, the chicken or the egg? I do not think it matters as much as the fact that this will change the landscape of generating user interfaces, effectively making legacy tools such as Figma or Penpot, as the source of truth for design, a thing of the past.

This page should help you better understand the DESIGN.md file by breaking it down into multiple parts, providing examples of real-world usage, and showing how to integrate it with your workflow, whether that means design tools such as Figma or Penpot, or AI tools such as Claude, Codex, and Cursor.

I was also crazy enough to write this page by hand, although I used AI to build the interactive widgets inside the article. So feel free to read the paragraphs as well; they are my honest thoughts after working on this for over a month.

Interactive DESIGN.md file


```markdown
---
name: design-system-[brand-or-scope]
description: Creates implementation-ready design-system guidance with tokens, component behavior, and accessibility standards.
---

# [Design System Name]

## Mission
One paragraph describing the system objective and target product experience.

## Brand
- Product/brand: [name]
- Audience: [primary users]
- Product surface: [web app, marketing site, dashboard, mobile web]

## Style Foundations
- Visual style: [keywords]
- Typography scale: [token list]
- Color palette: [semantic tokens + values]
- Spacing scale: [token list]
- Radius/shadow/motion tokens: [if applicable]

## Accessibility
- Target: WCAG 2.2 AA
- Keyboard-first interactions required
- Focus-visible rules required
- Contrast constraints required

## Writing Tone
concise, confident, implementation-focused

## Rules: Do
- Use semantic tokens, not raw hex values in component guidance.
- Define all required states: default, hover, focus-visible, active, disabled, loading, error.
- Specify responsive behavior and edge-case handling.

## Rules: Don't
- Do not allow low-contrast text or hidden focus indicators.
- Do not introduce one-off spacing or typography exceptions.
- Do not use ambiguous labels or non-descriptive actions.

## Guideline Authoring Workflow
1. Restate design intent in one sentence.
2. Define foundations and tokens.
3. Define component anatomy, variants, and interactions.
4. Add accessibility acceptance criteria.
5. Add anti-patterns and migration notes.
6. End with QA checklist.

## Required Output Structure
- Context and goals
- Design tokens and foundations
- Component-level rules (anatomy, variants, states, responsive behavior)
- Accessibility requirements and testable acceptance criteria
- Content and tone standards with examples
- Anti-patterns and prohibited implementations
- QA checklist

## Component Rule Expectations
- Include keyboard, pointer, and touch behavior.
- Include spacing and typography token requirements.
- Include long-content, overflow, and empty-state handling.

## Quality Gates
- Every non-negotiable rule uses "must".
- Every recommendation uses "should".
- Every accessibility rule is testable in implementation.
- Prefer system consistency over local visual exceptions.
```

The basics of the markdown file

At the end of the day, it is a simple markdown file. The reason it works is that LLMs have gotten so good from their general training data that they only need a small nudge in the right direction. This is why, instead of the megabytes of JSON that Figma currently produces, AI interprets design systems better from well-written human language compressed into a simple markdown file.

The DESIGN.md above is a fairly complete representation of the file, where we also include metadata such as guidelines and output structure, but the bare minimum you need to provide comes down to three parts: brand guidelines, style foundations, and accessibility.
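To make that concrete, a bare-bones DESIGN.md covering only those three parts might look like the sketch below. The brand name, token names, and values are placeholders I invented for illustration, not recommendations from the template above:

```markdown
---
name: design-system-acme
description: Minimal design guidance for the Acme analytics dashboard.
---

## Brand
- Product/brand: Acme Analytics
- Audience: data analysts
- Product surface: web dashboard

## Style Foundations
- Visual style: clean, dense, neutral
- Color palette: bg-surface #FFFFFF, text-primary #111827, accent #2563EB
- Typography scale: text-sm 14px, text-base 16px, text-xl 20px
- Spacing scale: 4, 8, 12, 16, 24, 32

## Accessibility
- Target: WCAG 2.2 AA
- All interactive elements keyboard-reachable with a visible focus state
```

Even a file this small gives the model a palette, a type scale, and a hard accessibility floor to generate against; everything else in the full template is refinement on top of this.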

There is a general boom right now around these design skills, and no clear term for them has emerged yet. It is also not yet clear how to install these skill files, because the DESIGN.md format will only reliably work with Google Stitch; for other tools such as Claude or Codex, these files should still be installed inside the agents folder.

Our official CLI helps you install design skill files (or DESIGN.md files): it lets you choose the AI provider you are using and installs the file into the correct folder, where it can be interpreted when generating user interfaces.

Example design skills

If you want to instantly try out some of these files then I suggest checking out some of the open-source and curated design skill files that we have released here at TypeUI.

Curated design skills you can pull into your project

Use the CLI tool or copy-paste the markdown file into your project.

Paper
$ npx typeui.sh pull paper

Bento
$ npx typeui.sh pull bento

Neobrutalism
$ npx typeui.sh pull neobrutalism

Bold
$ npx typeui.sh pull bold

Artistic
$ npx typeui.sh pull artistic

Clean
$ npx typeui.sh pull clean

Cafe
$ npx typeui.sh pull cafe

Dramatic
$ npx typeui.sh pull dramatic

Refined
$ npx typeui.sh pull refined

Energetic
$ npx typeui.sh pull energetic

Over the past month, more than 27k users have downloaded or used these skill files, and the idea is very simple: install one in your project and let AI build the UI. As the coordinator, you can then nudge the pages in the right direction by telling the AI to improve content, positioning, and more.

How does it change everything?

Before AI, designers built design systems in Figma, and developers then transformed that ideal representation of the user interface into actual working code. This is how our flagship project, Flowbite, was able to influence over 30 million projects across the internet between 2022 and 2026.

But that time is OVER. It no longer makes sense to spend weeks and months creating design systems in Figma and then use tools to convert those designs into code, burning an unnecessary amount of tokens and still not getting the right results. The key is to FULLY delegate the creation of design and code to AI, with the right instructions from blueprint design skill files and the right coordination from the creative minds of the designers and developers who previously used the old workflow.

The new workflow

The ideal situation is a coordinator (human) working with AI who has both taste in design and knowledge of building production-ready applications. If that is not the case, I would suggest designers and developers work with design skill files as the source of truth, with guardrails: designers prepare the frontend pages without database-integration queries, and developers clean up after the designers.

We are actually working on such a synchronizer as we speak, but it is a difficult balance to strike between providing too much or too little information to the AI. Labs bigger than us are working on this around the clock, and I am genuinely curious where it will go in the future.

Integrating DESIGN.md with Figma and Penpot

I also built two plugins, for Figma and Penpot, that help you generate design skill files based on what you have already designed in your application. I am planning to add token extraction in the future, but for now you can set it up yourself with 100% control.

It is possible that in the future we will still use tools such as Figma or Penpot to think out the design first and then synchronize skill files based on those specifications, but there is a fine line in how much information you can feed the AI before it starts to underperform. Remember: with AI skills, less is more.

Unfortunately, though, Figma has been pursuing a disastrous strategy for implementing AI in its workflow. I pray they will allow the use of skill files, or their stock could plunge even lower in the future. The future of Figma is design skill files.


DESIGN.md Skills for Figma

Use the Figma plugin to keep design-skill context close to component exploration, review, and handoff decisions.

Open in Figma Community

DESIGN.md Skills for Penpot

Apply the same curated design-skill language in Penpot to align design intent with AI-assisted code generation.

Open in Penpot Hub

Okay, so what now?

I do not know. We have lost over 80% of revenue at Flowbite since our all-time high, but a new door does seem to be opening with post-AI tools such as TypeUI, where we see an increase in usage and subscriptions. On a personal level, it has also been much more fun building websites without having to code for 2-3 hours for a simple feature; with tools like Codex or Claude, I can now probably do it in less than 10% of the invested time.

We are currently working on what we call enhanced skill files, which provide more nuanced specifications about UI components and style guidelines, and they seem to be working pretty well. We will offer these in the pro version if you are interested in supporting our work.

Anyway, I invite you to explore the rest of our website, and I hope that you and your team will be able to navigate these new waters and emerge as winners of the post-AI era that is already changing everything we know about technology.