
The Idea Place Posts

The ADA at 31

Monday will mark the 31st anniversary of the signing of the Americans with Disabilities Act (ADA). Each year, as the anniversary of the signing comes around, I am torn between feeling celebratory and sad.

I do feel celebratory about the fact that the work of many allowed us to reach this point. As someone who has contributed in a small way to various accessibility efforts over the years, I know all too well the toil, endless negotiations and so much more that it takes to make progress. So congratulations, for sure, to those involved in getting this landmark legislation passed and sustained. That is assuredly worth acknowledging on a grand scale.

I urge anyone reading this to read the congressional findings listed in the legislation. Like a lot of civil rights legislation, they detail that as a class, in this case one to which I belong, people with disabilities are not treated very well, and in fact that’s an understatement. Frankly, we suffer a staggering amount of outright discrimination, and in my opinion the level of discrimination people with disabilities experience is far too often drastically softened when speaking about the reality of life in the U.S. today.

I know from numerous firsthand experiences that calling something discriminatory makes a lot of people uncomfortable. But trust me, experiencing the actual discrimination carries that discomfort and far more.

So great, celebrate the reality that we have a law that at least gives some hope, if that is your choice. I understand that perpetual exposure to commentary that it is all trouble can be tough to take. But while you are celebrating, just remember: in passing the amendments to the ADA and other legislation, Congress has never said that any of the eight findings it lists have gone away. So if it is tough to hear that there are multiple challenges, or that something seems discriminatory, remind yourself that, according to our own Congress, the following are still part of the society we’ve created here in the U.S.

The Congress finds that

(1) physical or mental disabilities in no way diminish a person’s right to fully participate in all aspects of society, yet many people with physical or mental disabilities have been precluded from doing so because of discrimination; others who have a record of a disability or are regarded as having a disability also have been subjected to discrimination;

(2) historically, society has tended to isolate and segregate individuals with disabilities, and, despite some improvements, such forms of discrimination against individuals with disabilities continue to be a serious and pervasive social problem;

(3) discrimination against individuals with disabilities persists in such critical areas as employment, housing, public accommodations, education, transportation, communication, recreation, institutionalization, health services, voting, and access to public services;

(4) unlike individuals who have experienced discrimination on the basis of race, color, sex, national origin, religion, or age, individuals who have experienced discrimination on the basis of disability have often had no legal recourse to redress such discrimination;

(5) individuals with disabilities continually encounter various forms of discrimination, including outright intentional exclusion, the discriminatory effects of architectural, transportation, and communication barriers, overprotective rules and policies, failure to make modifications to existing facilities and practices, exclusionary qualification standards and criteria, segregation, and relegation to lesser services, programs, activities, benefits, jobs, or other opportunities;

(6) census data, national polls, and other studies have documented that people with disabilities, as a group, occupy an inferior status in our society, and are severely disadvantaged socially, vocationally, economically, and educationally;

(7) the Nation’s proper goals regarding individuals with disabilities are to assure equality of opportunity, full participation, independent living, and economic self-sufficiency for such individuals; and

(8) the continuing existence of unfair and unnecessary discrimination and prejudice denies people with disabilities the opportunity to compete on an equal basis and to pursue those opportunities for which our free society is justifiably famous, and costs the United States billions of dollars in unnecessary expenses resulting from dependency and nonproductivity.


University of Wisconsin Hangs Out No Screen Readers Allowed Sign For Big Ten Opener

On Friday, the University of Wisconsin Badgers kicked off the COVID-19-influenced 2020 football campaign with a resounding 45-7 victory over the Illinois Fighting Illini. Like much in this year of change, Camp Randall was empty of the typical 80,000 fans.

To bring some of the gameday experience into the home, Wisconsin social media touted a new Badgers Live offering. Unfortunately, Wisconsin Athletics clearly failed to ensure this experience was open to all fans. Instead, they hung out a sign for people who use keyboards and screen readers saying, “You are not welcome.”

Anyone familiar with web accessibility will recognize obvious WCAG failures on the opening signup screen: missing form labels and a lack of keyboard access to needed controls, just to name a couple.

If you manage to get past that, the signup experience has another basic failure where you are asked to pick an image to represent your user account.  The images are not reachable from the keyboard and are missing proper alt text.
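
For anyone less familiar with what these failures look like in practice, below is a minimal sketch of how they can be spotted from a browser’s developer console. The selectors are generic and hypothetical, not taken from the actual Badgers Live pages; the sketch simply flags form controls without an accessible label and images without alt text.

```typescript
// Minimal accessibility spot-check sketch. Run against any page; the
// selectors are generic and not specific to any one site.

function findUnlabeledControls(): HTMLElement[] {
  const controls = Array.from(
    document.querySelectorAll<HTMLElement>("input, select, textarea")
  );
  return controls.filter((control) => {
    const id = control.id;
    const hasLabelFor =
      id !== "" && document.querySelector(`label[for="${id}"]`) !== null;
    const hasAriaLabel =
      control.hasAttribute("aria-label") ||
      control.hasAttribute("aria-labelledby");
    const isWrappedInLabel = control.closest("label") !== null;
    // A control with none of these will typically announce with no name.
    return !hasLabelFor && !hasAriaLabel && !isWrappedInLabel;
  });
}

function findImagesMissingAlt(): HTMLImageElement[] {
  return Array.from(document.querySelectorAll<HTMLImageElement>("img")).filter(
    (img) => !img.hasAttribute("alt")
  );
}

console.log("Controls without an accessible label:", findUnlabeledControls());
console.log("Images without alt text:", findImagesMissingAlt());
```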

There are likely many other failures beyond these; I gave up after being unable to pick an image in the account creation process.

Web accessibility is not new and in fact is not optional for public institutions such as the University of Wisconsin. The university has detailed accessibility policies at https://www.wisc.edu/accessibility/.

At this point, in my mind there is no reason these situations keep happening beyond institutional indifference to accessibility from, at minimum, the athletics department. This is not the first time I have experienced accessibility issues with web offerings from the athletics department.

It is far beyond time that Director of Athletics Barry Alvarez and Chancellor Becky Blank take accessibility of the online experiences for Wisconsin Athletics seriously. This new gameday experience may be exciting or it may be of absolutely no interest to me. But I, like any other fan, should have the opportunity to join and evaluate for myself.

As of Sunday, inquiries to Chancellor Blank on Twitter have gone unacknowledged. Email to a member of the athletic department indicated the issue would be investigated but with no date for an answer.

These are unique times, with all of us facing challenges that were unexpected at the start of the year. But it is important that as we respond to those challenges, as Wisconsin Athletics has here, we keep our values and responsibilities in mind. Clearly someone at the university had the time to find this service. In fact, pregame radio interviews with members of the athletic marketing department repeatedly promoted how the team was looking to respond to COVID-19 and still create quality experiences for players and fans. That effort should have included accessibility, and failing to do so is simply unacceptable.


A Couple Notes on VMWare For Virtual Machine Use

When I first wrote about using virtual machines with a screen reader, I mentioned that the machine management of VMWare’s Workstation was challenging. VMWare Workstation 16 seems to have corrected the majority of these issues.

The list of virtual machines is now something you can reach with the Tab key. The up and down arrows move among the different machines, and a context menu on each machine, available with Shift+F10 or the keyboard’s Application key, brings up all the options you’d expect for power, snapshots and more.

In addition, the menu bar in the program now reads correctly with multiple screen readers. You can press Alt and hear that the top menu has gained focus, and the expected keyboard and screen reading behavior works from there.

There is still at least one quirk I’ve encountered when using any menus in the program. When you are using the down arrow to move through a list of items on a menu, up arrow does not seem to move in reverse. For example, in the context menu for a virtual machine, there is a list of devices you can make available to the virtual machine. If you move down past one of these devices, you have to arrow down through all the menu choices to get back to what you skipped.

Overall, in a couple of days of using VMWare Workstation 16, I’ve had success. As I mentioned in my original post here, this is not a free option, but with these changes it is one I’m going to be putting back into my virtual machine toolbox.

On the Mac side, VMWare has released Fusion 12. This is a must if you are going to run Apple’s newest update for the Mac OS.

I believe this is also new: there is now an option for a free personal license to use Fusion on the Mac. The license supports the typical non-commercial uses, such as education and personal use. Take note, though, that signing up for the license requires completing a CAPTCHA challenge that has no audio option. So far VMWare has not responded to a tweet asking about this.

If you do try Fusion on the Mac, I posted about how I resolved the Caps Lock conflict between VoiceOver and Windows screen readers. My work and personal interests require me to use both Windows and the Mac daily, so I find this to be a very viable option.


Improved Sports Descriptions

I enjoy sports but often find myself wanting more detail than even the best play-by-play announcer can provide. It leaves me wondering whether technology could help.

I’ll start by taking football as an example. As most fans of the sport will know, there are 22 players on the field at any given time. It would be simply impossible for any announcer to describe the location of all 22 players, let alone what each of them is doing during a play.

Imagine, though, if you divided a football field into a complete grid where you could express any spot as a set of coordinates. Then imagine that you could describe the location of players based on this grid. At the start of a play you could indicate where all 22 players are lined up. Then, as the players move during the play, technology could communicate how any player is moving through the grid.

Or imagine combining this technology with the accessibility improvements that have come to various touch devices, so you could slide your finger around the application window and learn the location of any player at any time. Obviously doing any of this in real time would likely still be tough, but imagine being able to do it on demand for certain plays, just to get a sense of what’s really going on.
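
As a rough sketch of the idea only, here is one way player positions could be represented on such a grid and turned into words. The field dimensions, player labels and wording are all my own assumptions for illustration; this is not an existing system.

```typescript
// Hypothetical sketch: represent a football field as a coordinate grid and
// describe a player's position in words. All names and numbers are made up.

interface GridPosition {
  yardLine: number; // 0-100, distance from one goal line
  width: number;    // 0 to roughly 53, with 0 at the left sideline
}

interface Player {
  name: string;
  position: GridPosition;
}

function describePosition(player: Player): string {
  const { yardLine, width } = player.position;
  const side = width < 18 ? "left" : width > 35 ? "right" : "middle";
  return `${player.name} is at the ${yardLine}-yard line, ${side} of the field`;
}

// A snapshot of where a couple of players line up at the start of a play.
const snapshot: Player[] = [
  { name: "Quarterback", position: { yardLine: 25, width: 26 } },
  { name: "Left wide receiver", position: { yardLine: 25, width: 5 } },
];

for (const player of snapshot) {
  console.log(describePosition(player));
}
```

Even this much would let software answer the basic question of where a player started a play and where they ended up.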

I have no idea how possible this is with today’s technology. It seems like it could be an interesting research project for someone in the right environment. I’m tossing this out there because it’s something I’ve thought about lately. Anyone want to take up the challenge?


Using Windows Quick Assist with a Screen Reader

When you want to assist someone else with a computer problem on Windows, there is a range of connection options. Some, such as an actual phone call, are easy to establish but difficult to use for this sort of situation. Others, like Quick Assist, offer the best of all worlds in most cases: establishing the connection is easy, and the access you have to the remote computer works quite well. However, Quick Assist does not currently support redirection of audio as far as I know, so if you are using a screen reader it can obviously be difficult to use.

In the screen reader case, especially if the individual giving the assistance is using a screen reader, there’s no great way to easily know what’s happening on the remote computer. That said, combining a couple of pieces of technology, something I’m sure is no surprise to pretty much anyone who uses a screen reader, can be one potential workaround if you want to use this connection option.

The Scenario

You, the screen reading user, want to aid someone remotely. That person may or may not be a screen reading user. With the old-school phone call method, even if you use newer voice-over-IP technology such as Skype, Teams, Zoom, or the never-ending stream of products that seem to appear, you obviously have no direct control over the remote computer. Asking the individual you are trying to assist to type commands and read the output only goes so far.

The Quick Assist Solution

Establishing a Quick Assist connection is straightforward. Both individuals launch the Quick Assist app, which comes as part of pretty much all Windows 10 installations. The person helping requests a six-digit code that they share with the person receiving help. The person providing help also indicates how much control they want to request; for our example this would be full control. The person receiving help enters the six-digit code provided and approves the full control request. The code request, code entry, control request and control approval all appear as web content, so they work quite well with screen readers.

The Wrinkle

As mentioned before, Quick Assist does not support audio redirection. To work around that situation, again you can combine a bit of technology.

  1. On either the computer providing help or the computer receiving help, create a meeting or other connection with your program of choice that allows for screen sharing. Be sure to use a program that supports sharing system audio as well. As I’m sure many blog readers know, I am a Microsoft employee, although this is my personal blog, so I tend to use Microsoft Teams, but Zoom and other programs will work equally well here.
  2. Both individuals involved in the assistance should then join the meeting or other connection.
  3. The person wanting help next starts a screen share from within the call or meeting. It is key to include the system audio. This creates the environment where the person offering the assistance, who in our scenario is a screen reading user, will hear the remote audio.
  4. Now establish the Quick Assist connection, with the person providing assistance requesting full control and the person wanting assistance granting that control.

The Additional Wrinkles

Any time you establish a remote connection between two computers, the question of when keyboard commands, touch gestures or mouse clicks go to the remote computer versus the local one requires attention. In my trials of Quick Assist thus far, when I first established the full connection, keyboard commands stayed with the local computer. Using various screen reader commands and techniques, I had to find a part of the Quick Assist window on my computer, labeled something like Remote Input, and activate it. I was able to reach this area with the JAWS cursor, NVDA object navigation, or Narrator’s previous and next commands. Activating the input area with the appropriate command in each screen reader allowed keyboard commands to be sent to the remote computer. At that point I could use all the standard commands on the remote computer, such as Ctrl+Win+Enter to launch Narrator. Again, our scenario is that a screen reading user is providing help and the person receiving help is not, and likely doesn’t even have anything beyond Narrator installed.

One other item of note once you are using the remote computer: so far in my use of Quick Assist, I could not find a way to get keyboard commands to go back to the local computer without having the person receiving assistance end the connection. I did try some of the standard commands I use when working with a typical remote desktop connection, and they did not seem to work. This may be a knowledge gap on my part or a limitation of Quick Assist. As I learn more I’ll update with additional details.

Using The Remote Computer

The process of creating this connection sounds more complicated than it is once you’ve done it a couple of times. In my experience, getting the remote computer’s audio from the screen share in Teams or another app, and using Quick Assist to control the remote computer, worked well. There was a slight, very slight, lag from the time I used my keyboard commands until I heard the result, but it was far better than trying to ask the other person to type a range of commands. I found the environment quite usable for helping solve problems.

Alternatives and More

This is not the only solution to this problem, of course. If both parties in the connection are using the same screen reader, solutions such as JAWS Tandem, or equivalents where they exist, are better alternatives. You could also opt to forgo the screen sharing part of this process and just listen to the audio from the remote computer over the voice connection you are using. There are likely other options I’ve not thought of too, but I offer this as one more possible option when you want to help others out of a computer jam remotely, especially when you are using a screen reader and the person needing help is not.


A Short Update on Virtual Machines to Include an Apple Mac

Since writing about using virtual machines with a screen reader, I’ve expanded my personal computing environment to include more Macintosh use. As a result, I wanted to offer a few comments on using VMWare Fusion on the Mac.

Bill Holton’s excellent article on running Windows on a Mac was an invaluable start for me here. From that article my biggest issue to resolve was how to address use of the Caps Lock key.

I took a different approach and opted to solve, directly through VMWare Fusion, both the conflict between multiple screen readers over the Caps Lock key and the fact that the key apparently does not get passed on to Windows in a virtual machine. This was easier than I expected at the start.

In Fusion’s preferences there is a keyboard option, and within that area are options for remapping keys. I chose to remap a different key on my keyboard to the Caps Lock key. While I was at it, I assigned a keyboard combination to what Fusion calls Menu, the equivalent of pressing the Context or Application key for right-click menus on a keyboard.

These key assignments work just fine. You can press and hold the key you’ve reassigned to be the Caps Lock key and add the necessary Windows screen reader key commands, such as t for window title.

I’m successfully using JAWS, NVDA and Narrator on Windows running under VMWare Fusion. Since I’m running on a MacBook Pro, I configure the screen readers to use their laptop keyboard layouts. I also use the view option VMWare calls Unity. This causes the open windows from both the Mac and Windows to appear as separate windows within the Mac operating system. I find it works great, with the Windows screen readers kicking in just fine when I switch to a Windows app.

Bill’s article also talked about using Boot Camp. I have tried that, and it is also an excellent option. However, I’ve found one consistent problem with Boot Camp that stops me from using it: there is an issue with the audio drivers used in Boot Camp that causes the first syllable or so of synthesized speech to be cut off unless other audio is playing. This has happened on occasion with other audio drivers on Windows. I’ve reported this to Apple, and I hope they address it sooner rather than later.


New Name Same Content

Despite installing, as I understand it, pretty much all the WordPress security you can, my blog has been hacked multiple times. It reached the point where Google was claiming my site was a security threat. As a result, I’ve opted for now to try a new name with the same old content and, hopefully, new content added more frequently.

I also like the new name better anyway. I like the idea of putting thoughts and ideas out and finding what resonates.


Another Day, Another Example of Missing Alt Text

As much as I’m sure anyone familiar with web accessibility doesn’t need yet another example of why alt text matters, as a consumer of web content I certainly am impacted when it is missing.

For anyone exploring cutting the cord, the U.S. Federal Communications Commission (FCC) has a handy resource to show you what digital television stations you can receive in your area with an antenna. Navigate to https://www.fcc.gov/media/engineering/dtvmaps and enter an address, city and state or zip code to get this information. Results are in a table that has headers and such. This is good.

Unfortunately, one of the key pieces of information in these results, the signal strength, is a graphic. As you can expect from the title of this post, there is no alt text on these graphics.
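
To illustrate how small the fix could be, here is a hypothetical sketch. It assumes the page already knows the signal strength when it renders each graphic; the StationResult fields are my own invention, not the FCC’s actual data model.

```typescript
// Hypothetical sketch of rendering a signal strength graphic with alt text.
// The station and strength fields are assumptions about the underlying data.

interface StationResult {
  callSign: string;
  strength: "Strong" | "Moderate" | "Weak" | "No signal";
  iconUrl: string;
}

function renderSignalIcon(result: StationResult): HTMLImageElement {
  const img = document.createElement("img");
  img.src = result.iconUrl;
  // The missing piece on the FCC page: a text equivalent for the graphic.
  img.alt = `${result.strength} signal for ${result.callSign}`;
  return img;
}
```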

Section 508 has been around for quite some time as have the Web Content Accessibility Guidelines. Proper alt text, again as I’m sure pretty much anyone working in the web environment knows, is a requirement. One can only wonder why something this basic was missed.


A Request to Librarians: Please Ask OverDrive About Libby Accessibility

I’m a big fan of public libraries and the wide range of resources they make available. As a child, stops at my local bookmobile and summer afternoons spent at “Story Time Tree” hearing a fun adventure were two of my favorite activities. As an adult, I make frequent use of the eBook services, databases and other resources libraries make available.

OverDrive is, as far as I know, the largest player in making eBooks available to libraries. In many ways they provide a quality service, but I’d encourage every librarian to fully understand the bargain you are making when you use OverDrive.

Would your library tolerate an author or other speaker who came to give a talk in your facility secretly whispering to some visitors that they should not attend? I think not, yet when you invite OverDrive into your facility, that is close to what you are doing.

OverDrive heavily promotes their Libby app as a part of the eBook services they offer. What I suspect most librarians do not know is that for users who rely on screen reading technology, the following is what greets their patrons when the Libby app is launched:

Welcome to Libby! This is a secret message for screen readers. We are working to improve your experience with this app. In the meantime, our OverDrive app is more accessible. You can find it in the app store. We thank you for your patience.

Libby is hardly a new app at this point, and it should have been accessible from the start. This message has been present, to the best of my knowledge, for close to two years now. My own requests to OverDrive asking for any updates have gone without any meaningful response on multiple occasions.

Accessibility requirements, too, are nothing new, nor are the technical details of making an app accessible a mystery. Apple, on whose iOS platform this message appears in the Libby app, has a wealth of resources. OverDrive itself, by directing users to its older app and claiming it is more accessible, also demonstrates that it understands accessibility to some degree.

I’d encourage librarians to ask OverDrive when this app will be accessible. Ask why this message indicating the app has accessibility issues is “secret.” It is beyond time for these sorts of challenges to stop being hidden away. It is time for them to be fixed, and most definitely not hidden.


MLB Strikes Out on Accessibility

Although it took a structured negotiation and settlement, at one point MLB seemed like it was turning the corner on improving and sustaining accessibility. Sadly that time seems to have passed, much like a favorite star of yesterday.

Most recently, MLB added a simple game to their At Bat app: a bracket challenge to pick winners in the upcoming Home Run Derby. The problem is that, for accessibility purposes, MLB used “Player Name” as the accessible name for each player. Obviously this means that if you use a screen reader such as VoiceOver on the iPhone, you have no idea which names you’d be selecting. Capping the problem off, when I called MLB’s dedicated accessibility support phone number to report the issue, the support representative asked me, “What’s VoiceOver?” I spent more of the call teaching the person how to use VoiceOver than anything else.
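
To show how small the gap is between what shipped and something usable, here is a hedged, web-flavored sketch. The bracket markup and player data are my assumptions, not MLB’s actual code, and the same principle applies to an accessibility label in the iOS app.

```typescript
// Hypothetical sketch of a bracket selection button. The broken version gives
// every choice the same accessible name; the fixed version uses the player's
// actual name. Player data here is made up for illustration.

interface BracketPlayer {
  name: string;
  headshotUrl: string;
}

function renderPlayerChoice(player: BracketPlayer, fixed: boolean): HTMLButtonElement {
  const button = document.createElement("button");
  const img = document.createElement("img");
  img.src = player.headshotUrl;
  // Broken: every choice announces as "Player Name" to a screen reader.
  // Fixed: the choice announces who you are actually picking.
  const accessibleName = fixed ? player.name : "Player Name";
  img.alt = accessibleName;
  button.setAttribute("aria-label", accessibleName);
  button.appendChild(img);
  return button;
}
```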

One can only wonder if MLB did any accessibility testing of this game. It took me less than two minutes to notice the issue. For an organization that seems to keep statistics on almost every conceivable aspect of a game, it doesn’t seem like it would be too difficult to ensure accessibility testing is a part of all processes.

After trying the game in the At Bat app, I did try the web site as part of writing this blog post. It works a bit better but still has issues. Further, why should users of screen readers be forced to jump through extra hoops and try multiple versions just to have a bit of recreation? MLB has demonstrated the ability to be a leader in accessibility previously. Here’s hoping the organization finds a way to return to those levels. In this case, it is a strikeout on accessibility, leaving this fan disappointed and more.
