Summary

The only way to truly discover whether or not content is accessible is through usability studies that include people with a range of disabilities. Accessibility consultants can offer best-practice advice, and automated accessibility checkers can identify common mistakes in the markup, but neither is a guarantee that the content is accessible.

Author: Roberto Scano

Automated accessibility validation tools are really useful for a quick spot check of any obvious mistakes in markup. It's important to remember that that's all they are; they in no way signify that the content is accessible. Many checks cannot be automated, such as ensuring that the alternative text for images is appropriate. The problem is compounded with CSS, as validators do not examine the CSS when evaluating a document; they just check the markup, and leave the issues that cannot be automated as user checks. It is the developer's responsibility to ensure that those user checks are addressed before making any claim of conformance to the Web Content Accessibility Guidelines 1.0 (WCAG 1.0).

The following WCAG 1.0 checkpoints apply to CSS, and could be missed when relying on automated validation.

Checkpoint 2.2

Ensure that foreground and background color combinations provide sufficient contrast when viewed by someone having color deficits or when viewed on a black and white screen. [Priority 2 for images, Priority 3 for text].
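This is one of the few CSS-related checks that can be partly automated. As a rough illustration only, assuming the colour brightness and colour difference algorithm suggested in the W3C's Techniques For Accessibility Evaluation And Repair Tools working draft (the algorithm many automated contrast checkers use), the following colour pair passes both suggested thresholds; the selector is made up for illustration:

    /* Illustrative only: white text on a dark blue background.
       Colour brightness = ((R x 299) + (G x 587) + (B x 114)) / 1000:
         #ffffff gives 255, #003366 gives roughly 42, and the difference
         of 213 exceeds the suggested threshold of 125.
       Colour difference = the sum of the differences on each channel:
         (255 - 0) + (255 - 51) + (255 - 102) = 612, which exceeds the
         suggested threshold of 500. */
    .content {
      color: #ffffff;
      background-color: #003366;
    }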

Checkpoint 3.4

Use relative rather than absolute units in markup language attribute values and style sheet property values. [Priority 2]

For example, in CSS, use em or percentage lengths rather than pt or cm, which are absolute units. If absolute units are used, validate that the rendered content is usable.
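As a rough sketch (the class name is made up for illustration), the first rule below uses absolute units and would be flagged by a checker that tests this checkpoint, whilst the second uses relative units that scale with the user's preferred text size:

    /* Absolute units: pt and cm do not scale with the user's text-size settings */
    .news-item {
      font-size: 10pt;
      width: 15cm;
    }

    /* Relative units: em and percentage lengths scale with the user's settings */
    .news-item {
      font-size: 0.9em;
      width: 80%;
    }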

Checkpoint 7.2

Until user agents allow users to control blinking, avoid causing content to blink (i.e., change presentation at a regular rate, such as turning on and off). [Priority 2]
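In CSS terms, this mostly means avoiding the blink value of the text-decoration property (a minimal sketch; the selector is made up):

    /* Avoid: some browsers render this as blinking text */
    .attention {
      text-decoration: blink;
    }

    /* Prefer: draw attention without changing presentation at a regular rate */
    .attention {
      font-weight: bold;
    }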

CSS Zen Garden is an example of a website that claims Triple-A conformance to WCAG 1.0. I'm not singling out CSS Zen Garden as an example of a website that advocates bad practice; CSS Zen Garden is an inspirational website that brilliantly demonstrates the power of separating content from presentation, and I have nothing but the greatest respect for Dave and the many talented designers who have their work published in the garden. I only use CSS Zen Garden as an example because it recently came up for discussion on an Italian web accessibility discussion list.

Even ignoring the CSS issues, CSS Zen Garden wouldn't meet WCAG 1.0 Triple-A. We don't intend to perform a complete accessibility review of CSS Zen Garden, but even a cursory glance reveals abbreviations that haven't been defined. For example, the site states:

If your design doesn't work in at least IE5+/Win and Mozilla (run by over 90% of the population), chances are we won't accept it.

For Triple-A conformance, IE and Win should be marked up as an abbreviation and an acronym respectively. The point is, there is more to accessibility than pointing to an automated accessibility validator.
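For illustration only, and assuming the obvious expansions, the quoted sentence could be marked up along these lines:

    If your design doesn't work in at least
    <abbr title="Internet Explorer">IE</abbr>5+/<acronym title="Windows">Win</acronym>
    and Mozilla (run by over 90% of the population), chances are we won't accept it.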

There is a parody of CSS Zen Garden that is truly awful. The link for the accessibility validation misleadingly contains the URL for the real CSS Zen Garden, which is a shame, as there are a couple of automated accessibility errors highlighted by Bobby that could easily have been addressed. Still, it does highlight why automated validation is no indication of the accessibility of a document, as the design's obvious accessibility issues are not amongst the errors flagged by Bobby. This design, GeoCities 1996, is included in CSS Zen Garden, and illustrates perfectly why automated validation alone cannot determine whether or not a resource is accessible.

Whilst typical validators do not check CSS for accessibility, the CSS analyser on Juicy Studio (currently being ported to PHP) checks for potential colour contrast issues, and checks that relative units of measurement are used as property values. Roberto has an Italian version of the CSS Analyser, along with a tool that tests for flickering in animated GIFs.

Category: Accessibility.

Comments

  1.

    The test for flickering in animated gifs is a fantastic tool. I ran some tests with it, and the output made it easy for me to understand the results even though I don't speak Italian. Excellent example of how to make such a useful tool usable. For clarity, the tool can be found here: http://www.webaccessibile.org/test/checkimg.aspx

    It took me a while to find on the page, but it's bookmarked now *smile*

    Thanks.

    Jane

    Posted by Jane on

  2.


    Jane wrote

    The test for flickering in animated gifs is a fantastic tool.

    I totally agree. Any plans for an English language version?

    Posted by stevef on

  3.

    There is a parody of CSS Zen Garden, which is truly awful. The link for the accessibility validation misleadingly contains the URL for the real CSS Zen Garden

    - To be pedantic, I didn't make a parody of the zen garden; the entire point of the zen garden is to see the wide range of styles that can be applied by css to an unchanging xhtml structure. Mine does exactly that, and thus cannot be called a parody.

    Neither does it "misleadingly contain the URL for the real zen garden": the validator checks the xhtml, which is the same for "every" zen garden, as only the css changes. (That's the entire point.)

    Thus, the validator is at fault as it only checks the structure rather than the presentation layer as well.

    Posted by Bruce on

  4.

    Thank you for your comments, Bruce.

    - To be pedantic, I didn't make a parody of the zen garden; the entire point of the zen garden is to see the wide range of styles that can be applied by CSS to an unchanging XHTML structure.

    Pedantic is good for me. A parody is where someone else's work is imitated for comic effect. You don't have an exact copy of the template for CSS Zen Garden, you have an imitation of it that contains mistakes not found in the original (I'll explain later). You then apply CSS for comic effect. A parody.

    Mine does exactly that, and thus cannot be called a parody.

    If you run your version through Bobby's automated validation process, yours will result in errors. The reason it results in errors is that you have the same link phrases pointing to different resources, and you don't separate adjacent links with more than whitespace. The real Zen Garden doesn't have this problem. It only occurs with your imitation that you apply CSS to for comic effect. A parody.

    Neither does it "misleadingly contain the URL for the real zen garden"

    It points to the real CSS Zen Garden; not yours. Clicking on the link verifies that the original CSS Zen Garden validates. It's misleading, because yours doesn't. The reason yours doesn't validate is that your version is different from the original Zen Garden template; you have an imitation of it, which you use to apply CSS for a comic effect. A parody.

    the validator checks the XHTML, which is the same for "every" zen garden, as only the CSS changes. (That's the entire point.)

    The same for every Zen Garden entry, except yours. Yours isn't the same as every other entry. Yours has errors in. Yours doesn't pass the automated Bobby test. But you probably haven't noticed that, because you're validating a completely different resource. Validate your own, and hopefully you'll get the point I was trying to make.

    Thus, the validator is at fault as it only checks the structure rather than the presentation layer as well.

    Just in case you haven't noticed, that is the whole point of this article. I mentioned your parody because I thought it was funny.

    I maintain it's a parody, and I maintain that it misleadingly points to the real CSS Zen Garden. The real Zen Garden passes Bobby's automated validation. Yours doesn't. It's misleading. I've heard people mention that yours validates. The reason they think yours validates is that they've been misled.

    I hope that clears up any misunderstandings.

    Posted by Gez on

  5.

    Fair enough, Gez. You're right with the URL you link to (which is entirely my fault, as I didn't change the links on my site from the "work-in-progress" version based on the submissions template to the real integrated version when it got added to the Zen Garden site). Mea culpa.

    The "real" version that's integrated into the zen garden is http://csszengarden.com/?cssfile=http://www.tastydirt.com/zen/sample.css

    It does appear to validate. Of course, your point is entirely correct: it's got nothing to do with my "good" design, and everything to do with the incomplete validation - Bobby doesn't check the CSS, which is what makes my design both comic and inaccessible.

    So we agree - don't we?

    Posted by Bruce on

  6.

    I didn't change the links on my site from the "work-in-progress" version based on the submissions template to the real integrated version when it got added to the Zen Garden site

    I didn't realise your entry was listed in Zen Garden. I should've checked, sorry. We were at cross purposes *smile* I've updated the original article to point to Zen Garden.

    So we agree - don't we?

    We do *smile* Your design should be used as a teaching resource, as it illustrates perfectly why an automated check isn't enough.

    Best regards,

    Posted by Gez on

  7.

    I am fortunate to have The Zen of CSS Design book, written by Zen Garden creator Dave Shea with Molly Holzschlag, in my possession. It contains what I consider to be a mea culpa on having claimed AAA, and mentions some of what is mentioned in this post:

    "After the markup was written, a quick check from Bobby confirmed that it passed most major accessibility checkpoints. A few quick changes were needed before launch to fix the few glitches that Bobby noticed. A link labeled 'AAA' was added to the footer of the Zen Garden to signify that accessibility had been taken care of.

    "Or had it? It turns out that Bobby is not the final word when it comes to accessibility. If you familiarize yourself with the Web Content Accessibility Guidelines published by the W3C(...) you'll soon realize there are guidelines that Bobby simply can't check.

    "(...)So the Zen Garden's HTML theoretically passed all accessibility checkpoints related to markup but there are further checkpoints that go beyond HTML. A few of them even apply to CSS, and it became evident over time that some designs weren't taking these into account."

    Posted by Matt May on

  8.

    Hi Matt,

    I am fortunate to have The Zen of CSS Design book, written by Zen Garden creator Dave Shea with Molly Holzschlag, in my possession. It contains what I consider to be a mea culpa on having claimed AAA, and mentions some of what is mentioned in this post:

    Thank you for posting the references from the book. Dave also acknowledges the triple A claim on Mezzoblue.

    Posted by Gez on

  9.

    The summary at the top of this page is quite true, but not very helpful and possibly even slightly misleading.

    There's a saying as old as IT that fixing a fault at a later stage costs 10x more than fixing it at the previous stage. If automated tools pick up some of the faults at the templating stage, they've justified their use.

    Do you know of any single site / page which states which faults each automated tool can and cannot detect? Preferably as a score-card table with a row for each WCAG guideline and a column for each tool? This would also function as a checklist for items which need human checking.

    And do you know of any place where it's possible to get people with a range of disabilities to check pages quickly and cheaply? If not, the summary is true but unrealistic.

    Posted by Philip Chalmers on

  10.

    The summary at the top of this page is quite true, but not very helpful and possibly even slightly misleading.

    Thank you. Would you like to expand on why you find the summary misleading?

    There's a saying as old as IT that fixing a fault at a later stage costs 10x more than fixing it at the previous stage. If automated tools pick up some of the faults at the templating stage, they've justified their use.

    Automated tools are free and take a fraction of a second to use, so not much justification is required. The problem is that they're only as useful as the person interpreting the results.

    Do you know of any single site / page which states which faults each automated tool can and cannot detect? Preferably as a score-card table with a row for each WCAG guideline and a column for each tool? This would also function as checklist for items [that] need human checking.

    I don't know of a single website that states the faults of each automated tool as a score-card table with a row for each WCAG guideline and a column for each tool. For a requirement so specific, you would be well-advised to commission someone to gather that data for you.

    The WCAG 1.0 checklist is suitable for human checking. Of the checkpoints contained in the checklist, I would say that only 5 are currently testable through automated checkers with certainty (none from priority 1, two from priority 2, and three from priority 3):

    Checkpoint 1.5

    Until user agents render text equivalents for client-side image map links, provide redundant text links for each active region of a client-side image map [Priority 3]

    Checkpoint 3.2

    Create documents that validate to published formal grammars [Priority 2]

    Although automated checkers can confirm that a document adheres to the appropriate DTD, they cannot ensure that the document has been marked up correctly and is structurally correct. See scripting away validation concerns for more information.

    Checkpoint 3.4

    Use relative rather than absolute units in markup language attribute values and style sheet property values [Priority 2]

    The only tool that currently does this is the CSS analyser from Juicy Studio (currently being ported to PHP).

    Checkpoint 10.4

    Until user agents handle empty controls correctly, include default, place-holding characters in edit boxes and text areas [Priority 3]

    Checkpoint 10.5

    Until user agents (including assistive technologies) render adjacent links distinctly, include non-link, printable characters (surrounded by spaces) between adjacent links [Priority 3]

    For all other checkpoints, automated validators can only give a very rough indication as to whether they've been satisfied, and require a human checker to know for certain.
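    To illustrate checkpoint 10.5, here is a minimal sketch (the link targets are made up) where adjacent links are separated by a printable character surrounded by spaces:

        <a href="/articles/">Articles</a> |
        <a href="/tutorials/">Tutorials</a> |
        <a href="/feedback/">Feedback</a>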

    And do you know of any place where it's possible to get people with a range of disabilities to check pages quickly and cheaply? If not, the summary is true but unrealistic.

    You get what you pay for. You could approach charities and ask if they know of anyone with so little self-respect they'd be prepared to jump at your demand for next to nothing, and work your way up from there. Usability testing is usually quite expensive, although there are services such as the Shaw Trust Website Accessibility Accreditation Scheme that recruit people with a range of disabilities to test websites at a very reasonable rate.

    Posted by Gez on

Comments are closed for this entry.