Dedicated to Improving the Lives of Blind and Visually Impaired People

Web Accessibility in Mind Conference Coverage, Continued

Last week’s Bulletin covered some of the presentations from the WebAIM (Web Accessibility In Mind) Conference, held on September 7 and 8, 2022, to help attendees advance their use of digital accessibility. Following are highlights from a few more of the presentations. As noted previously, recordings of sessions from this virtual conference, presented in partnership with Pope Tech, will be available soon.

Making Advertisements More Accessible

by Daniel Parker, RDPFS Intern

The opening presentation of the WebAIM Conference on September 8, 2022 was a panel discussion titled “The Intended Consequence of Inaccessible Digital Ads.” Moderated by WebAble CEO Mike Paciello, the panel also featured Jonathan Day of Centrus Digital, Joe Dolson of Accessible Web Design, and Gerard Cohen of Twitter. The panelists observed that, even though few people like ads, inaccessible ads shortchange everyone: people with disabilities miss out on information and offers, while advertisers, platforms, and content creators lose the clicks and revenue that accessible ads would generate. Some ads are inaccessible by design, employing, for example, flashing animations that are distracting and even disturbing for certain users, or layouts that are difficult or impossible for screen reader users to navigate.

Additionally, the largest advertising platforms, such as Google AdSense, place no accessibility requirements on their ads, even though they impose many other rules about what ad content may contain. As a result, content creators cannot realistically build websites that are both free and fully accessible: they must either charge for access, itself a barrier, or sacrifice the accessibility of at least part of their sites by running ads. The most recent WebAIM Million report bears this out, finding that pages using Google AdSense had an average of 23.9 more accessibility errors than other pages. Creators also cannot tell Google that they want only accessible ads on their sites. Dolson stressed these last points in particular, saying that if Google required accessible ads on its platform, its enormous influence would change the ad market overnight.
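The panel did not present code, but one of the problems it raised, distracting animation, can be illustrated in rough terms. The following is a minimal TypeScript sketch, not anything shown at the conference: it assumes a browser environment, and the CSS class names and function names are illustrative only.

```typescript
// Hypothetical sketch: one way an ad script could respect a user's
// "reduce motion" preference instead of autoplaying a flashing animation.
function shouldAnimateAd(): boolean {
  // matchMedia is a standard browser API; "reduce" means the user has asked
  // the operating system or browser to minimize non-essential motion.
  return !window.matchMedia("(prefers-reduced-motion: reduce)").matches;
}

function renderAd(container: HTMLElement): void {
  if (shouldAnimateAd()) {
    container.classList.add("ad--animated"); // illustrative class name
  } else {
    container.classList.add("ad--static"); // fall back to a still image
  }
}
```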

Creating an Accessible Extended Reality (XR) Experience

by Daniel Parker, RDPFS Intern

Professor Reginé Gilbert of New York University gave a presentation on accessibility developments and challenges in the extended reality (XR) space. XR is an umbrella term covering both augmented reality (AR), as found in games like Pokémon Go, and virtual reality (VR), an all-encompassing experience that replaces the real world with a completely virtual environment. Gilbert became involved in XR accessibility after playing Pokémon Go and realizing that a blind colleague wished to play, too. She noted that XR holds promise for people with disabilities, for example in wayfinding applications built on artificial intelligence (AI) and machine learning. Pitfalls include existing systemic biases against people with disabilities, the perception that AI systems are “authoritative” or “superior,” and the cost of XR headsets, among others. Gilbert and her team have also experimented with creating accessible Instagram filters, such as one giving the dates of the next solar and lunar eclipses, and they are exploring ways to represent an eclipse through sound. Although sound can be added to an Instagram filter, the difficulty of making and updating accessible filters means this work is still in its early stages. You can find out more about Gilbert’s work on her website and on Twitter.
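Representing information through sound, as Gilbert’s team is exploring for the eclipse filter, is often called sonification. The following is a minimal sketch of the general idea, not Gilbert’s implementation: it assumes a browser with the standard Web Audio API and maps each value in a short data series to a pitch, with the frequency range and timing chosen purely for illustration. (Most browsers will only start audio after a user interaction, such as a button press.)

```typescript
// Hypothetical sketch: sonify a short data series by mapping each value to a pitch.
function sonify(values: number[], durationPerValue = 0.3): void {
  const ctx = new AudioContext();
  const min = Math.min(...values);
  const max = Math.max(...values);

  values.forEach((value, i) => {
    // Map the value linearly onto a 220-880 Hz range (roughly A3 to A5).
    const ratio = max === min ? 0.5 : (value - min) / (max - min);
    const frequency = 220 + ratio * (880 - 220);

    const osc = ctx.createOscillator();
    const gain = ctx.createGain();
    osc.frequency.value = frequency;
    osc.connect(gain);
    gain.connect(ctx.destination);

    const start = ctx.currentTime + i * durationPerValue;
    gain.gain.setValueAtTime(0.2, start); // keep the volume modest
    osc.start(start);
    osc.stop(start + durationPerValue * 0.9); // short gap between tones
  });
}

// Example: rising values produce a rising pitch.
// sonify([1, 2, 4, 8, 16]);
```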

Making Accessible Data Visualizations

by Daniel Parker, RDPFS Intern

In this presentation, Thomas Watkins, User Experience (UX) Architect at 3Leaf, LLC, addressed the importance of, and correct protocols for, creating accessible data visualizations such as infographics. He first provided examples of a parallel technique called “sonification,” in which data trends are represented in audible form: one of a line whose points alternately rose and fell exponentially, and one of seismic activity over time caused by fracking, where an increasing number of pings represented an increasing number of tremors. Watkins believes, however, that sonification is not yet a workable solution, and tactile representations even less so, which leaves a need for accessible captions on visualizations. His formula for an accessible caption presents the following information in order: graph type, key objective, orientation to the display, and finally either a data read or a data summary, depending on the graphic. A paraphrased example from his presentation: “‘Rising College Costs’ is a time-series line graph showing the increase in college costs over time. It compares three types of college from 1970–2020.” From there, individual data points can be listed, which he calls a “data read,” or overall trends can be noted as a “data summary.” You can find out more about Watkins and his projects on 3Leaf’s website, and on his LinkedIn, Instagram, and Twitter profiles. You can also find an interview with him here.
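Watkins did not present code, but as a rough illustration his caption order could be assembled programmatically along the following lines. This is a minimal TypeScript sketch; the interface, field names, and placeholder text are illustrative assumptions, not his materials.

```typescript
// Hypothetical sketch: build a caption in the order Watkins describes
// (graph type, key objective, orientation to the display, then a data read or summary).
interface VisualizationCaption {
  graphType: string; // e.g., "time-series line graph"
  keyObjective: string; // what the graphic is meant to show
  orientation: string; // how the display is laid out (series compared, time range)
  dataReadOrSummary: string; // individual points ("data read") or overall trends ("data summary")
}

function buildCaption(title: string, c: VisualizationCaption): string {
  return `"${title}" is a ${c.graphType} ${c.keyObjective}. ${c.orientation} ${c.dataReadOrSummary}`;
}

// Example loosely following the paraphrased "Rising College Costs" caption above:
const caption = buildCaption("Rising College Costs", {
  graphType: "time-series line graph",
  keyObjective: "showing the increase in college costs over time",
  orientation: "It compares three types of college from 1970 to 2020.",
  dataReadOrSummary:
    "(Either list selected data points here as a data read, or describe the overall trends as a data summary.)",
});
```

The resulting string could then be supplied as the graphic’s alternative text or as an adjacent visible caption.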