Published: April 15, 2026
In November 2022, Elon Musk fired Twitter's dedicated accessibility experience team. It wasn't a subtle decision — former staff documented it publicly. The reasoning, presumably, was speed. Move fast, cut what doesn't ship features.
I decided to see what's left.
Using three free tools, I ran a full accessibility audit on X.com — the same process I use when auditing client sites professionally. What I found was not surprising. It was damning. And it mirrors what 95.9% of websites look like right now — yours probably included.
Here's the stat that should stop you mid-scroll: according to the WebAIM Million 2026 report, the average webpage now has 56.1 detectable WCAG errors — up 10.1% from last year. The failure rate is increasing, not decreasing. This isn't a problem on the verge of being solved.
Meanwhile, 3,948 ADA web accessibility lawsuits were filed in 2025 — a 23.84% increase over 2024. And since June 28, 2025, the EU's European Accessibility Act has been enforceable against private businesses selling to EU consumers.
The excuse that "accessibility is too complex" doesn't hold up when 96% of all errors come from just 6 issue types — the same 6 types, unchanged since 2019.
X is the most visible example of an industry-wide pattern: accessibility is always the first thing cut when a product "moves fast." It's not a feature, it's not tracked on a velocity board, and the consequences are invisible — until they aren't.
But X is also instructive precisely because of its scale and resources. If the richest man in tech can't be bothered to fix basic HTML errors on a platform with hundreds of millions of users, it reframes the problem. This isn't a resource problem. It's a discipline problem.
The findings below are from tools anyone can run in under 10 minutes. I'm not citing theoretical violations — these are reproducible, real failures on live pages.
You don't need a specialist toolkit to do this. The best accessibility auditing tools are free and built into the browser.
One important caveat — and this matters more than most people realize: automated tools catch roughly 30–40% of real accessibility issues. They find the symptoms that live in static HTML — missing attributes, wrong contrast values, absent labels. What they can't catch is the other 60–70%: broken keyboard navigation flows, incorrect focus management, screen reader announcement order, ARIA states that change dynamically, and whether your interactive components actually work without a mouse.
That gap is the difference between running a scan and running an audit.
For more on what a thorough audit methodology looks like, Marcy Sutton's approach is worth reading.
I'll be honest — none of this surprised me. After auditing enough sites professionally, you develop a rough mental model of what you'll find before you run the first scan. X.com matched that model almost perfectly. But there were still moments where the specific nature of a failure stopped me.
What follows is the detectable layer — the 30–40% that automated tools can surface. Keep in mind: for every failure listed here, there are two or three more that only show up in manual testing.
Low contrast is the failure that wins every year: 83.9% of all websites have contrast violations. It's the most common WCAG failure on the web. It was the most common failure five years ago. It'll probably be the most common failure five years from now.
On X.com, timestamps, mute labels, and secondary action text render in gray on slightly lighter gray. On a high-DPI display in a bright room, it's barely readable for someone with perfect vision. For users with low vision or color blindness, it's functionally invisible.
WCAG 2.1 AA requires a 4.5:1 contrast ratio for normal text. X's timestamp text lands around 2.8:1. The fix is two lines of CSS and a decision to care:
```css
/* Before */
color: #8b98a5; /* ~2.8:1 on white */

/* After */
color: #595f64; /* 4.5:1+ on white */
```

What makes this failure so dispiriting isn't that it's hard. It's that it keeps winning. Low contrast is not a WCAG 2.2 edge case — it's been in the spec since 2008. Design tools flag it automatically. And yet here we are.
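If you want to verify ratios like these yourself rather than trust a tool's readout, the WCAG 2.x contrast formula is short enough to sketch directly. The following is a minimal TypeScript version that assumes six-digit hex colors and ignores alpha; real tools like axe-core handle more input formats:

```typescript
// WCAG 2.x relative luminance for a "#rrggbb" color.
// Channel linearization constants come straight from the spec.
function luminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio is (lighter + 0.05) / (darker + 0.05), so it
// does not matter which argument is foreground or background.
function contrastRatio(a: string, b: string): number {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

contrastRatio("#8b98a5", "#ffffff"); // below the 4.5:1 AA threshold
contrastRatio("#595f64", "#ffffff"); // comfortably above it
```

Running this confirms the numbers in the article's ballpark: the original gray lands near 2.9:1, the fix well above 4.5:1.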
Missing alternative text on images is nearly as widespread: over 50% of all pages fail this. On X.com, user-uploaded images frequently have no alt attribute. When a screen reader hits one, it reads the full URL: "https://pbs.twimg.com/media/FxA..." in its entirety. Not useful.
```html
<!-- Failing -->
<img src="user-photo.jpg">

<!-- Fixed -->
<img src="user-photo.jpg" alt="Photo of a protest outside the EU Parliament">
```

X does offer an optional alt text field when uploading images. It's buried in the upload flow, opt-in, and the vast majority of users never touch it. That's a product design choice — to treat accessibility as the user's responsibility rather than the platform's.
The problem with that position is that the platform controls the HTML. You can prompt users to add alt text and also set a reasonable fallback. Offloading it entirely is a choice.
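This failure type is also one of the easiest to detect mechanically. As a rough illustration only (a real audit should parse the DOM or use axe-core, since regex cannot handle all valid HTML), a naive scan for unattributed images might look like:

```typescript
// Naive sketch: flag <img> tags with no alt attribute in an HTML string.
// Note that alt="" is intentionally allowed: an empty alt is the correct
// markup for purely decorative images.
function imagesMissingAlt(html: string): string[] {
  const imgs = html.match(/<img\b[^>]*>/gi) ?? [];
  return imgs.filter((tag) => !/\balt\s*=/i.test(tag));
}

imagesMissingAlt('<img src="a.jpg"><img src="b.jpg" alt="A dog">');
// flags only the first tag
```

A check like this belongs in CI precisely because the failure is so mechanical: the tag either has the attribute or it doesn't.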
This is where I'd push back on anyone who says this is a "corner case" problem.
The main navigation on X.com — bookmarks, notifications, messages, search, home — is built from icon-only links and icon-only buttons. Every one of them is either an <a> or <button> wrapping an SVG with no accessible name. A screen reader announces each one as "link" or "button" and nothing else. You have no idea what it does until you click.
These are both the same root failure — a design system that renders icons without text labels and doesn't compensate with ARIA — so I'm grouping them:
```html
<!-- Failing: link -->
<a href="/notifications">
  <svg>...</svg>
</a>

<!-- Fixed: link -->
<a href="/notifications" aria-label="Notifications">
  <svg aria-hidden="true">...</svg>
</a>

<!-- Failing: button -->
<button>
  <svg><!-- heart icon --></svg>
</button>

<!-- Fixed: button -->
<button aria-label="Like">
  <svg aria-hidden="true"><!-- heart icon --></svg>
</button>
```

Two attributes per element. The fix is not the interesting part. The interesting part is that this covers the entire primary navigation and every action on every post — like, repost, share, bookmark. A keyboard or screen reader user navigating X hears "link, link, button, button, button" cycling through a feed with no context for what any of it does.
This isn't a missed edge case. This is the core interaction model of the product.
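What a screen reader announces here is governed by the W3C accessible name computation. The sketch below is a heavily simplified subset of that algorithm (the real spec has many more steps; tools like axe-core implement it fully), but it shows the precedence that makes the aria-label fix work:

```typescript
// Simplified accessible-name resolution for a single control.
// The field names here are a hypothetical snapshot shape, not a DOM API.
interface ControlSnapshot {
  ariaLabel?: string;      // value of aria-label, if present
  labelledByText?: string; // resolved text of aria-labelledby targets
  textContent: string;     // visible text inside the element
}

function accessibleName(el: ControlSnapshot): string {
  // Per the accname spec: aria-labelledby wins, then aria-label,
  // then the element's own text content.
  if (el.labelledByText?.trim()) return el.labelledByText.trim();
  if (el.ariaLabel?.trim()) return el.ariaLabel.trim();
  return el.textContent.trim();
}

function isNamed(el: ControlSnapshot): boolean {
  return accessibleName(el).length > 0;
}

const bare: ControlSnapshot = { textContent: "" };   // <button><svg/></button>
const named: ControlSnapshot = { ariaLabel: "Like", textContent: "" };
isNamed(bare);  // false: announced as just "button"
isNamed(named); // true: announced as "Like, button"
```

An SVG contributes no text content, so an icon-only control with no ARIA resolves to an empty name, which is exactly the "button... button... button" experience described above.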
X's search bar and compose box lack properly associated accessible labels. They have placeholder text, which is not the same thing. Placeholder text disappears when you start typing. It typically has low contrast by design. Many screen readers don't reliably announce it as a label at all.
```html
<!-- Failing -->
<input type="text" placeholder="Search">

<!-- Fixed -->
<label for="search-input" class="sr-only">Search X</label>
<input id="search-input" type="text" placeholder="Search">
```

The sr-only class visually hides the label while keeping it in the accessibility tree. It's an invisible fix to the end user.
What annoys me about this one specifically is that placeholder-as-label is still being taught in frontend tutorials in 2026. It's not a new pattern — it's been a known failure mode for years. Developers learn it from examples, ship it, and it lands in production on platforms used by hundreds of millions of people.
The lang attribute on the <html> element tells screen readers which language the page is in, so they apply the correct pronunciation rules and voice profile. On X, a global platform serving dozens of languages, this is absent or inconsistent on certain page types.
```html
<!-- Failing -->
<html>

<!-- Fixed -->
<html lang="en">
```

One attribute. No JavaScript. No library. No design review required. It's been mandatory in WCAG since 2.0 was published in 2008. X has been around since 2006.
I include this one not because it's the most damaging failure — it isn't — but because it illustrates something important: the problem isn't complexity. If your argument for skipping accessibility is that "WCAG is too complicated," you have to explain why you're also skipping the one-attribute fix that takes fifteen seconds.
Does your own site have these failures? Statistically, yes: 95.9% of home pages fail WCAG. The five failures above account for the vast majority of those violations. If you haven't run an audit yet, the question isn't whether these exist on your site — it's how many of them.
And here's what changes the math: if your business sells to EU consumers, your legal exposure started in June 2025 when the European Accessibility Act became enforceable. If you're a US state or local government serving 50,000+ people, your ADA Title II deadline is April 24, 2026. If you're a Canadian corporation, AODA fines run up to CAD $100,000 per day.
These failures aren't theoretical compliance gaps anymore. They're liability.
Want to know where your site actually stands? I run professional WCAG audits for Next.js and React sites — a full manual review, not just an automated scan. Real findings, prioritized by legal exposure and user impact, with specific fix guidance. Book an audit here.
After I publish something like this, someone inevitably asks: "Can't you just install AccessiBe and be done with it?"
No. And the numbers make that clear.
983 lawsuits in 2025 were filed against sites that were already using overlay widgets — 24.9% of all ADA web cases. AccessiBe was named in 424 of those. UserWay in 273. Courts are treating widget presence as evidence of awareness of the accessibility problem — not remediation of it.
What overlays do: inject JavaScript that patches ARIA attributes and adds a toolbar. What they can't do: write meaningful alt text for your images, fix your broken navigation structure, or repair keyboard focus flows.
The DOJ has been explicit: overlays do not ensure ADA compliance.
The alternative to a widget subscription isn't "do nothing" — it's fixing the actual HTML. The five failures above are all native HTML fixes. No subscription, no third-party script on every page visit. What you need is someone to find all of them — including the 60–70% that automated tools miss — and hand you a prioritized list of what to fix first. That's what an audit is.
I know the response from some teams will be: "We don't have the bandwidth." So let me make this concrete.
Legal & General shipped accessibility fixes and saw a 25% jump in organic search traffic within 24 hours — growing to 50% over time. Conversion rate doubled over three months. ROI hit 100% within a year. Content maintenance dropped from 5 days per job posting to half a day, saving £200,000 per year. (Source)
That's not a charity argument. That's accessible HTML producing cleaner markup, faster renders, and better SEO — because those things are structurally the same work.
If you're a CTO or product owner reading this and thinking "we'll get to it next quarter" — the European Accessibility Act didn't wait for next quarter. It's been enforceable since June 2025. Canada's AODA carries fines up to CAD $100,000 per day. Australia updated to WCAG 2.2 AA in April 2025. The US ADA Title II deadline for governments serving 50,000+ people hits April 24, 2026.
The cost of an audit is a fraction of the cost of a single lawsuit. And unlike a lawsuit, an audit comes with a fix list.
For a deeper look at what penalty exposure actually looks like in practice, see my earlier post: The €1M Risk: Real-World EAA Penalties in 2026.
I'll give you the starting point for free. An automated scan is worth doing — it just isn't sufficient on its own.
This gets you the 30–40% — the surface-level violations that live in static HTML. The rest — keyboard navigation, focus management, dynamic ARIA states, screen reader announcement order, interactive component behavior — requires manual testing with real assistive technology. That's the part most teams don't have the expertise or tooling to do themselves.
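To reproduce that automated layer yourself, two widely used free scanners can be run from the command line. Both require Node.js, fetch the target page live, and the package names below reflect their current published names, which may change:

```shell
# Lighthouse, restricted to its accessibility category (Chrome-based)
npx lighthouse https://example.com --only-categories=accessibility

# The axe-core CLI wrapper, which reports WCAG rule violations per page
npx @axe-core/cli https://example.com
```

Swap in your own URL. Either tool will surface the five failure types above within a minute or two.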
For a real-world walkthrough of the automated process on a production site, Joshua Winn's accessibility audit guide is one of the most practical I've read.
If you want the full regulatory context for why this matters strategically — not just technically — WCAG in the EU (2026): Why Waiting Is Now the Most Expensive Accessibility Strategy covers that ground.
Want to go deeper? If you're a developer who wants to build accessibility into your workflow — how to prioritize fixes, handle what automated tools miss, and ship accessible code by default — that's what we cover in The Agentic Architect Lab. It's a community of senior devs shipping faster with AI and building things that actually work for everyone.
X.com has all of these failures. So does 95.9% of the web.
The tools are free. The fixes are native HTML. The legal exposure is real, enforceable, and getting more serious every year. The same failure types have topped the list since 2019 — which means the industry has had seven years to fix them and largely hasn't.
This isn't a WCAG complexity problem. The spec isn't too hard to understand. It's a discipline problem — the same one that leads a company to fire its accessibility team the moment it starts to feel like overhead.
In my last professional audit for an e-commerce client, I found 47 violations — 14 of them in categories that regularly appear in ADA lawsuits. We fixed all of them in two sprints. The automated scan had only flagged 18. The other 29 came from manual testing: broken keyboard traps, focus that disappeared into invisible elements, modals that screen readers couldn't escape. That's the gap between running a tool and running an audit.
I audit Next.js and React sites professionally — a full manual review with a written report, prioritized findings, and specific fix guidance. Not a widget. Not a PDF checklist. Real findings from someone who does this on production codebases. Book an audit here.
