3.7: WCAG Web Auditing Review Template
Now that you have an understanding of WCAG, we will introduce you to an accessibility auditing template. You can use the template to record issues when you conduct audits, but it also acts as a checklist to help commit the guidelines to memory. For now, review the layout and elements of the Web Auditing Review Template and add it to your Web Accessibility Auditing Toolkit.
Elements of the Review Template
Title: This field should indicate the type of review, either General, Template, or Detailed (to be covered in greater detail in the unit Web Accessibility Audit Reporting) and the guideline it is based on.
Location: The URL of the homepage of the site being reviewed.
Date: The date the review was finished.
Reviewer: The name of the person(s) who completed the review.
Guideline Reference: A link to the guideline(s) the review is based upon.
Tools Used During This Review: A list of the tools used in the review, including automated checkers, browsers, browser plugins, readability and colour contrast test tools, screen readers, and any other tools used. Be sure to mention version numbers where applicable.
General Comments: An overview of the results of the WCAG 2.1 Review (below), outlining the key issues, explaining why they are issues, and briefly mentioning potential solutions. This section is written for a general audience, minimizing the use of technical language.
WCAG 2.1 Review: The main content of the review. This is a list of WCAG 2.1 success criteria, each one’s conformance level (A, AA, AAA), the evaluation received (Pass, Fail, Pass?, Fail?, N/A), and the comments associated with each evaluation. These comments should identify accessibility issues relevant to the guideline, explain why each issue presents a barrier, and offer potential solutions. The review is aimed at the web developers who will resolve the issues identified, and may contain technical language and sample code that can be replicated. Screenshots and other graphics can be used to enhance the explanations given in the comments.
There will likely be borderline cases, where it could be argued that an element either passes or fails the associated success criterion. In such cases, use “Pass?” (with the question mark) where the auditor leans toward a pass but others might argue it fails, and “Fail?” where the auditor leans toward a fail but others might argue it passes. One example is an image whose alt attribute does not fully describe the meaningful information in the image. The decision is often subjective; the auditor might comment asking the author to review the alt text and determine whether it adequately conveys the meaning a viewer should take away from the image. Questionable passes and fails are described more thoroughly in the example linked in the Toolkit box below.
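The structure of a review row described above can be sketched as a small data structure. This is a hypothetical illustration, not part of the official template: the class and field names are assumptions, but the conformance levels and the five evaluation values match those listed in the template description.

```python
from dataclasses import dataclass

# Allowed values, as described in the WCAG 2.1 Review section of the template.
LEVELS = {"A", "AA", "AAA"}
EVALUATIONS = {"Pass", "Fail", "Pass?", "Fail?", "N/A"}

@dataclass
class CriterionResult:
    """One row of a WCAG 2.1 review (hypothetical sketch)."""
    number: str         # success criterion number, e.g. "1.1.1"
    name: str           # success criterion name, e.g. "Non-text Content"
    level: str          # conformance level: A, AA, or AAA
    evaluation: str     # Pass, Fail, Pass?, Fail?, or N/A
    comments: str = ""  # barrier description and potential solutions

    def __post_init__(self):
        # Reject values the template does not define.
        if self.level not in LEVELS:
            raise ValueError(f"Unknown conformance level: {self.level}")
        if self.evaluation not in EVALUATIONS:
            raise ValueError(f"Unknown evaluation: {self.evaluation}")

# A borderline alt-text issue recorded as "Fail?", with a comment
# directed at the developers who will resolve it.
row = CriterionResult(
    number="1.1.1",
    name="Non-text Content",
    level="A",
    evaluation="Fail?",
    comments=(
        "Alt text on the banner image omits key information; "
        "ask the author whether it conveys the image's meaning."
    ),
)
```

Encoding the allowed values this way keeps evaluations consistent across reviewers, while the comments field carries the free-form explanation the template calls for.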
Note that the AAA items are greyed out, as are the two AA success criteria (1.2.4 and 1.2.5) that are not required by the AODA (relevant to Ontario-based participants). If you are auditing in a jurisdiction that does require these two criteria, you might adjust the template by removing their grey shading. Greyed-out items are otherwise optional, though when a review uncovers issues associated with them, you can still recommend implementing the related techniques to improve overall usability.
Other Notes: While not included in the template, there are occasions when a reviewer needs to comment on issues not directly related to the accessibility of the site being reviewed. For instance, a reviewer might mention potential bugs identified during testing, include information about posting an accessibility statement, or provide details on next steps following the review, such as planning a follow-up review after issues are addressed or arranging a time to discuss questions that arise from the report.
Appendix: While not included in the template, the Appendix should include a list of the pages sampled from the site that was reviewed.
Example of a Completed Review
In 2012 a General Review of Canvas was posted by OCAD University in Toronto, which was at the time selecting a new LMS. The review was posted publicly, so it works well as an example of what a completed review might look like.
This review looked at a series of tasks, such as reading a post in the forums and posting a reply, reviewing test results, and checking marks in the Gradebook (these scenarios were described in the appendices, which are missing from the publicly posted review). The results of these scenarios were combined into a General Review (see the unit Web Accessibility Audit Reporting for a description of the different types of reviews). Read through parts of the review to get an idea of the types of information it contains.
You might ask: if there are so many issues, why did we use Canvas to deliver the online course version of this content? Following the publication of the review, Canvas paid attention and put considerable effort into improving the accessibility of their system. By 2014, Canvas accessibility had much improved, though a few areas still had room for improvement. Compared with other learning management systems, the current version of Canvas fares well in terms of accessibility.