Summary

I've created a document with at least 33 obvious priority 1 and priority 2 errors according to WCAG 1.0, and ran it through the leading free accessibility validators, as well as the W3C's markup validation service. None of them found any of the errors, and one of them reported an error that didn't exist.

Author: Gez Lemon

Invalid Content

Automated accessibility validation tools are really useful for a quick spot check of any obvious mistakes in markup, but it's important to remember that they can only provide a very rough guide, and a positive report does not imply an accessible page. To illustrate this, I've created a document with priority 1 and priority 2 errors (at least 33 obvious errors, many compounded) according to the Web Content Accessibility Guidelines 1.0. I then ran the document through the leading accessibility validators, checking for priority 1, 2, and 3 errors, with interesting results. The document was also run through the W3C's markup validation service.

Update: Thank you to Jim Thatcher for pointing out that the tests were limited to accessibility validators that were free to use (at least for one page). Jim makes the point that you get what you pay for, and it's entirely possible that validators that are not free to use may do a better job than the validators tested in this article. If anyone knows of a validator that makes sense of the test document, I would appreciate your feedback.

Cynthia Says Accessibility Validator

Cynthia Says reports no errors, just two priority 3 warnings: to create a logical tab order, and to provide keyboard shortcuts. Interestingly, both priority 3 warnings concern contentious areas of web accessibility, which are arguably best avoided for true accessibility.

HiSoftware Accessibility Validator

HiSoftware's validator has the same interface as Cynthia Says, and reports the same results.

Site Valet Accessibility Validator

Site Valet found no errors or warnings, but displays the default text, "possible fail - check warnings", even though there were no warnings.

WebAIM Wave Accessibility Validator

WebAIM Wave reported no errors. The only warning was that it had found a table with no structural markup, and hoped it was used for layout; it was.

Watchfire's WebXACT Validator

There was a time when Bobby was easily one of the better accessibility validators. Since Bobby was taken over from the Center for Applied Special Technology by Watchfire and turned into WebXACT, the number of false positives it reports makes it the worst of the current crop. WebXACT passes the document for priority 1 and priority 2 issues, despite the plentiful obvious errors, but fails the document at priority 3: include default, place-holding characters in edit boxes and text areas. The error points to the two select elements, which are neither edit boxes nor text areas.
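
To see why this is a false positive, consider markup along these lines (a sketch; the control names and option text are illustrative, not copied from the test document). A select element offers a menu of predefined options, so there is no empty edit box or text area for a place-holding character to go in:

<select name="fruit">
  <option>Lemon</option>
  <option>Apple</option>
</select>

An edit box or text area, by contrast, is the kind of control the checkpoint is actually aimed at:

<input type="text" name="surname" value="Surname">
<textarea name="comments" rows="5" cols="40">Enter your comments here</textarea>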

W3C Markup Validation Service

The W3C's markup validation service reports no errors in standard mode or verbose mode.

Category: Accessibility.

Comments

  1. [invalid-content-accessibility-validators.php#comment2]

    Hi ZCorpan,

    Isn't "Checkpoint 12.4: Associate labels explicitly with their controls" covered by the test "Checkpoint 10.2: Properly positioned labels"?

    By explicit labels, they mean that the label is explicitly associated with the form control using the for attribute:

    
    <label for="surname">Surname</label>
    <input type="text" name="surname" id="surname" ...
    

    I deliberately left that test off, as it's too easy to check for. Checkpoint 10.2 relates to labels that are implicitly labelled (wrapped around the form control). The guidelines put the emphasis on screen-readers being able to get access to the relevant label, but properly positioned form controls are also important for people with cognitive and mobility problems. All modern user-agents support explicitly associated labels, but I think the positioning is still important for the other groups. The position of labels is likely to be dropped completely from WCAG 2.0 HTML techniques, which would be a mistake in my opinion.
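
    For comparison, an implicit label (the pattern checkpoint 10.2 is concerned with) wraps the form control rather than referencing it; a minimal sketch, with illustrative attribute values:

    <label>Surname
      <input type="text" name="surname">
    </label>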

    Posted by Gez on

  2. [invalid-content-accessibility-validators.php#comment3]

    Hi Gez,
    There is an interesting piece published this month on Webcredible - "The problem with automated accessibility testing tools" [http://www.webcredible.co.uk/user-friendly-resources/web-accessibility/automated-tools.shtml]. There is also a paper I did with my boss Andrew a few years back - "Accessibility Testing Software Compared" [http://ausweb.scu.edu.au/aw03/papers/arch/paper.html], which covers similar territory.

    Posted by steve faulkner on

  3. [invalid-content-accessibility-validators.php#comment4]

    Hi Steve,

    Thank you for the heads-up *smile*

    I agree with most of Trenton's article, but I strongly disagree with his rationale regarding the summary attribute:

    However, there may be a heading directly before the table and it describes what the table is about. In this instance, this summary is essentially useless as it will just repeat what the previous heading said.

    Firstly, it would be the caption element that is likely to repeat a title before it. Secondly, it assumes the document is being read linearly. The whole point of the summary attribute is to provide the preamble that may be included in content before it, so that the table can be navigated without dependence on the rest of the content.
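
    To illustrate the distinction, here's a rough sketch (the table content is invented for illustration, not taken from Trenton's article): the caption provides the visible title, which may well repeat a nearby heading, while the summary carries the orientation needed to navigate the table independently of the surrounding content:

    <table summary="One row per validator, giving the errors it reported in the test document.">
      <caption>Validator results</caption>
      <tr>
        <th scope="col">Validator</th>
        <th scope="col">Errors reported</th>
      </tr>
      <tr>
        <td>Cynthia Says</td>
        <td>None</td>
      </tr>
    </table>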

    Posted by Gez on

  4. [invalid-content-accessibility-validators.php#comment5]

    It's very easy to get the wrong result from the automated tools with a website that passes WCAG on a manual check.
    For instance: try a label outside a form, which is valid code and causes no error with a screen reader, or try an anchor link like <a href="#" name="thisisananchor"></a>, which is today the best solution for anchor links (it works everywhere).

    Posted by goetsu on

  5. [invalid-content-accessibility-validators.php#comment7]

    Hello,
    Hope someone can help. I am disabled myself and used to always use Bobby to "validate" the SABIF (Sussex Acquired Brain Injury Forum) website, for the benefit of any disabled visitors. I would aim to achieve Bobby AAA, and then mark it as WAI AA (better safe than sorry - advice from Jim Byrne, via email). My question is this: can anyone suggest/recommend an accessibility validator that can actually validate accessibility in an acceptable manner?

    Posted by Tim John on

  6. [invalid-content-accessibility-validators.php#comment8]

    Hi Tim,

    can anyone suggest/recommend an accessibility validator that can actually validate accessibility in an acceptable manner?

    The biggest mistake made by the current crop of accessibility validators is that they don't understand the difference between structure, presentation, and behaviour. As a consequence, they only examine the markup, ignoring everything else. The problem is that developers following standards have started to separate structure, presentation, and behaviour, which has effectively left the validators behind.

    No validator will reliably validate accessibility for you. I would continue to use whichever one you're most familiar with for a quick check, but carefully consider the results. If you haven't done so already, I would install the NILS accessibility toolbar for IE (http://www.nils.org.au/ais/web/resources/toolbar/) and Chris Pederick's Web Developer's Toolbar for Mozilla (http://chrispederick.com/work/firefox/webdeveloper). You can then view the website with images, styles, and scripting disabled, to ensure it works as you expect. Another thing that may be helpful would be to submit the website to the critique section of an accessibility forum, where you may get some useful feedback: http://www.accessifyforum.com/forum11/

    Posted by Gez on

  7. [invalid-content-accessibility-validators.php#comment9]

    Thanks a lot Gez. I already have Chris Pederick's Web Developer's Toolbar on Firefox - I will now actually use it (and Firefox)! I'll also download and install NILS accessibility toolbar for IE, per your advice. Much appreciated.

    Posted by Tim John on

  8. [invalid-content-accessibility-validators.php#comment10]

    A good point, well made - I doubt if there's a better method of exposing the shortcomings of automated validators.

    Most of the checkpoints you've covered will never be machine-testable, and it's unlikely that techniques like DOM injection will ever be reliably detected. But I was surprised that none of the validators picked up 3.2 or 13.2, both of which are easily tested. Hopefully it will be only a short time before we see checkers parsing CSS files and picking up on things like pixels for font-size under 3.4.
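
    For instance, the kind of thing a CSS-aware checker could flag under 3.4 is an absolute pixel size where a relative unit would do. A sketch, not taken from the test document's stylesheet:

    <style type="text/css">
      /* A checker parsing the CSS could flag the absolute unit here under 3.4... */
      p { font-size: 11px; }
      /* ...and suggest a relative unit such as em or % instead */
      p { font-size: 0.9em; }
    </style>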

    Finally, FWIW, I ran the page through SiteMorse and it too reported no accessibility issues, but it did report missing metadata for description and keywords in another section.

    Posted by Dan on

  9. [invalid-content-accessibility-validators.php#comment11]

    I use a combination of the W3C validator and the watchfire validator, purely as a quick scan to see if I've missed anything. Then I will go through the site manually switching off images, style sheets etc.

    Posted by Hayley on

  10. [invalid-content-accessibility-validators.php#comment12]

    The test page that you put together successfully shows that automated tools have their shortcomings. All tools do. Accessibility can be a difficult goal to test for, to be sure, and the checkpoints you chose to highlight are precisely the ones that are more problematic. Some of these items could be incorporated into tools. Some could not. Accessibility will always require human judgement.

    At the same time, I want to put in a few words about how WAVE is intended to be used. WAVE cannot catch all, or even most, of the types of errors that you have focused on, but it catches a few more than you give it credit for. My comments are most relevant to WAVE 3.5 (http://dev.wave.webaim.org/), though the majority of my comments are also relevant to 3.0.

    WAVE de-emphasizes the reporting features that other tools make prominent. WAVE is meant to be a tool for facilitating informed human evaluation, as opposed to being a tool that performs in the background and produces a definitive report with little or no effort.

    Some examples:
    WAVE exposes the alt text of images by putting the alt text adjacent to the image itself. In the case of your "lemon" and "apple" mismatch, a WAVE user would see that they don't match, and could correct the mistake. The WAVE feedback icon is still "green," and classified as an "accessibility feature," but the purpose of the in-context feedback is to encourage the reviewer to personally make the comparison between the *intended* result and the *actual* result.
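
    In other words, markup along these lines (the file name is invented for illustration) is perfectly valid, so all an automated tool can do is place the alt text where a human reviewer will spot the mismatch:

    <img src="lemon.jpg" alt="A photograph of an apple">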

    Your data table example (5.1) is another example of this principle. The reviewer could look at the table in WAVE and see that the columns have headers, but the rows do not. The reviewer could say "oops, I forgot to include row headers."
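
    The following sketch (with invented data) shows the shape of the problem: the columns have th headers, but the first cell of each data row is an ordinary td, so the rows have no headers:

    <table>
      <tr>
        <th scope="col">Day</th>
        <th scope="col">Programme</th>
      </tr>
      <tr>
        <td>Monday</td>
        <td>News</td>
      </tr>
    </table>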

    Similar things could be said for your tests of 5.2 ("oops the headers are wrong"), 3.5 ("oops, I didn't mean to make that a heading"), 3.6 ("oops, I've mismatched my list types"), 3.7 ("oops, this shouldn't be marked up as a quote"), 5.4 ("oops, this shouldn't be marked up as a data table"), and 10.2 ("oops, I didn't position the label correctly").

    Using WAVE (or any other tool) correctly requires some knowledge about accessibility. I'm sure that this is one of the points you're trying to make. WAVE doesn't make up for ignorance. It facilitates accessible development for those who already have some knowledge about accessibility.

    As a side note, your test for 12.3--though not inaccurate--is probably not the best test, since there are accessibility issues with optgroup, such as the difficulty of accessing the optgroup categories without the use of a mouse (which is an issue for motor disabilities and screen reader users).
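
    For reference, the pattern in question groups related options within a select; a sketch with invented option text:

    <select name="fruit">
      <optgroup label="Citrus">
        <option>Lemon</option>
        <option>Lime</option>
      </optgroup>
      <optgroup label="Orchard fruit">
        <option>Apple</option>
        <option>Pear</option>
      </optgroup>
    </select>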

    Also, your interpretation of "linearize" in 5.3 is the "old" kind of linearize, if I understand your intention correctly. I interpret linearization to be the literal "bare bones" order of the content in the HTML when all of the tags are removed, e.g. the order when viewed without tables or styles. I think you're referring to visual linearization from left to right across the graphical interface in a fully-styled layout with tables, which would be an issue for older screen scrapers, but not for modern screen readers.
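
    To make the distinction concrete, linearizing a layout table like the sketch below (content invented) just means reading the cells in source order, so the first cell is always read before the second, however the two are arranged visually:

    <table>
      <tr>
        <td>Navigation links (read first when linearized)</td>
        <td>Main content (read second, even though it sits alongside the navigation visually)</td>
      </tr>
    </table>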

    ... but despite my critiques, your main points are all valid: the tools are inadequate, and in some cases they always will be, due to the nature of accessibility itself.

    Posted by Paul Bohman on

  11. [invalid-content-accessibility-validators.php#comment13]

    there are accessibility issues with optgroup, such as the difficulty of accessing the optgroup categories without the use of a mouse

    Because people don't know that the proper keystroke on Windows to drop down a menu is Alt-downarrow, not just plain downarrow. Once you do that, you can navigate just fine. Takes five seconds to learn. Even Mac people know it.

    Posted by Joe Clark on

  12. [invalid-content-accessibility-validators.php#comment14]

    This is great work, but a bit academic in some way.

    As a blind user, very familiar with HTML and screen readers, I have to say that the errors you have highlighted have little to do with what is found in the wild.

    Inaccessible pages to me are those with poor markup (no headings where there are obvious places for them like bolded text obviously acting as section breaks), missing alt text, and missing form labels. Understanding how to use headings is tricky if you don't understand that a screen reader only reads things in the order in which the HTML appears. Missing alt text is simply oversight or pure laziness in my opinion. Similarly for form labels.

    If all pages made good use of headings (especially to allow one to skip navigation sections and get to the meat of the page), used decent alt text (use it where appropriate, null where not), and labeled *all* of their form elements explicitly (wrapping controls in labels does not work reliably in practice with JAWS or Window-Eyes), the internet would be a much more accessible place.

    Data tables are important, but unless they are overly complex (multiple header rows as in your TV watching example), screen readers can usually guess correctly. Same for form controls, but since designers often put labels in strange places, labeling is always effective and lets designers have their freedom.

    I think this does highlight the fact that in order to create accessible sites, you do have to understand the basics of how a screen reader reads the page, and understand some basic HTML. I don't think a designer using a WYSIWYG HTML editor having never seen anyone use a screen reader (or having never used one) can truly understand how to create accessible pages. By the same token, they will not benefit from accessibility tools either, because they would probably not understand the output. The only way a tool can be effective is if it is used by a skilled craftsman. No matter how nice the hammer, a poor carpenter cannot build a nice piece with it.


    Just my 3 cents...

    -- Rich

    Posted by Rich Caloggero on

  13. [invalid-content-accessibility-validators.php#comment15]

    Thank you for your comment, Rich.

    This is great work, but a bit academic in some way.

    The work here is deliberately academic. The purpose of the exercise was to try to encourage web developers not to depend on accessibility validators, but to use them as a quick check for obvious errors. A website that receives a clean bill of health from an accessibility validator is not necessarily an accessible website. Lots of websites claim a level of accessibility they don't reach, merely because they've satisfied an automated tester rather than looking at the real issues.

    Your list of things to do to make a website accessible is a great list, but some of the items couldn't be checked automatically with a high level of confidence. For example, a validator may not know whether a particular element would be better marked up as a heading, or that the alternative text was inappropriate for the image.

    since designers often put labels in strange places, labeling is always effective and lets designers have their freedom.

    Labelling is effective for some assistive technologies, but the positioning is also important for other groups, such as people with cognitive and/or mobility difficulties. If the label is positioned ambiguously, or too far away from the form control, the positioning of the label introduces an accessibility barrier.

    The only way a tool can be effective is if it is used by a skilled craftsman. No matter how nice the hammer, a poor carpenter cannot build a nice piece with it.

    Absolutely, and a good analogy.

    Posted by Gez on

Comments are closed for this entry.