So I decided to analyze the three leading screen readers for accessing and testing PDF accessibility. Different elements, such as document title, headings, images, bookmarks, tables, forms, lists, and links, were tested with each of the screen readers to understand how each interpreted them and how well each read a tagged PDF file.
The Access Technology Higher Education Network (ATHEN) is continuing its functional evaluation of the components of the Google Application Suite to determine the accessibility of each component for users with various types of disabilities and assistive technology.
There has been much discussion, and some argument, about how to determine the accessibility of websites. Unfortunately, this is often polarised around two simplistic choices: a compliance/conformance-based approach that usually involves a checklist of criteria; or some form of user testing by people who have different disabilities and/or who rely on different assistive technologies. Both approaches have their strengths and limitations, and neither can provide a reliable declaration about the accessibility of a site on its own.
What the arguments attempt to assert is, essentially, that "checklist" accessibility is not good enough, either because the checklists themselves are flawed or because the checklist takes the disabled user out of the equation and relegates their challenges to the level of a series of check items.
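To make the checklist approach concrete, here is a minimal sketch of the kind of criterion a conformance checklist can test automatically: flagging `<img>` elements that have no `alt` attribute at all. The function name and the HTML sample are illustrative, not taken from any tool discussed here; note that a bare position report like this says nothing about whether existing alt text is actually meaningful, which is exactly the limitation the user-testing camp points to.

```python
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    """Checklist-style check: report <img> tags missing the alt attribute.

    An empty alt="" is deliberately allowed, since it is the valid way to
    mark a purely decorative image.
    """

    def __init__(self):
        super().__init__()
        self.missing_alt = []  # (line, column) of each offending <img>

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt.append(self.getpos())

def check_img_alt(html: str):
    """Return positions of <img> elements with no alt attribute."""
    checker = ImgAltChecker()
    checker.feed(html)
    return checker.missing_alt

sample = '<p><img src="logo.png" alt="Company logo"><img src="deco.png"></p>'
print(check_img_alt(sample))  # reports only the second image
```

A real conformance tool would run dozens of such checks, but each one shares this shape: a mechanical test of a single criterion, with no judgment about whether the page is genuinely usable.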
People with cognitive impairments are still one of the least well understood groups in terms of how we can make things easier for them.