
The Idea Place Posts

An Area Where Accessibility Policies Often Fall Short

From the ways I monitor accessibility, it is clear that there is greater awareness of the topic and more effort being put into making technology accessible. This is good.

Yet the vast majority of accessibility policies still fail at what is, for me, one of the most fundamental needs: what is a user supposed to do when an accessibility issue is encountered?

I’m using the term “accessibility policy” broadly here. It is more than just the words and phrases that appear in some document posted on a web site or other location. In theory, those words are just the outward signs of a range of processes that allow an organization to actually have an accessibility policy.

I recognize that definition can be a stretch. In fact, often the reason for the failure I describe here is that the “accessibility policy” is just words, not something embedded throughout the organization.

Far too often, if a policy tells a user what to do, those instructions are geared toward how to report an issue. While this is obviously helpful, it does nothing to solve the user’s immediate need. You are asking the user to invest time in reporting an issue at the very time that user is likely the most frustrated with an experience you have created.

For me, a robust accessibility policy will go beyond this. The policy should not simply ask the user for information. It should offer an alternative way, whenever possible, for the user to get the information or complete the task they were attempting when they hit the barrier.

One of my favorite questions, even today, is still why. Why are you posting your accessibility policy, for example? Is it to comply with a government regulation, to provide users with information, or something else?

Similarly, as a user, why do I look for accessibility policies? In my case it is typically to find how I should report issues but also to see if the organization has some alternative way of task completion when I hit a barrier. It can also serve as a reflection of the organization’s overall philosophy on accessibility.

As much as I wish the frequency of accessibility issues I encounter were reduced, I’m realistic enough to know that’s not likely any time soon. I can come to grips with that at some level, but not with leaving a user with no alternative. That is where organizations should look within, recognize that accessibility issues are going to happen, and give users paths to success when they do.


Customizing Keyboard Shortcuts for Edge Extensions

Yesterday I read about the team at WebAIM releasing a version of their popular Wave toolbar as an Edge browser extension. I’m very familiar with the tool so wanted to see how it behaved in Edge.

On the page with details about the extension, there is a section explaining how to use the tool. I noticed that it said you could press ctrl+shift+u, so after installing the extension in Edge, that’s what I did. I pressed ctrl+shift+u and imagine my surprise when the page I was on started being read aloud.

This is one of those things that falls into the category of I knew this but forgot it in the moment. Ctrl+shift+u in Edge is the hotkey to do exactly what happened: start reading the current web page with Edge’s Read Aloud feature.

I had also noticed in the instructions for using the Wave toolbar a comment saying you could change the shortcut in Edge on the Manage Extensions page. My days of working on web browsers exclusively are long past, so I wasn’t actually aware you could change the shortcuts assigned to extensions in Edge. I certainly am now. I wanted to share how you can quickly do this when using a screen reader, although the same basic process works without a screen reader as well.

The fastest way to get to the Manage Extensions page is to press ctrl+l on Windows or CMD+l on a Mac and enter edge://extensions in the browser address bar. This opens an Edge page showing you all the extensions you have installed, with options for each extension.

Initially I was a bit stumped at this point because I was not finding anything about changing keyboard shortcuts. I was using the JAWS screen reader at the time and as with any web page, the Virtual PC cursor was on.

I’ll spare most of the details about how screen readers work, the different modes and more for now, and hopefully cover that in a separate posting. The short version of what I discovered is that there is a tree-view on this page you can use to switch between listing extensions and managing the keyboard shortcuts for those extensions. You need to use that control and change to the Keyboard Shortcuts entry.

In the default way Narrator and NVDA present this page, both options from the control are shown in the Scan Mode (Narrator) and Browse Mode (NVDA) view. You can press enter on the choice you want. With the Virtual PC Cursor on, JAWS currently shows a nameless tree-view; you need to press enter on it to turn Forms Mode on and then adjust the control. You’ll need to turn Forms Mode off again afterward. With Forms Mode on, the values are properly read as you arrow up and down, so arrow down once to Keyboard Shortcuts and press enter. I have not yet tried this page with VoiceOver on the Mac.

Once you change the Manage Extensions page to show keyboard shortcuts, there are a series of edit boxes where you can assign or adjust the keyboard shortcuts being used. These behave like typical edit boxes. Since I just learned about this functionality, my information may not be 100% accurate, but I believe you can use either Control or Alt along with a shortcut key here. I also believe you can add shift to either control or alt. For example, I made my shortcut for activating the Wave toolbar ctrl+shift+w. When a shortcut key has been assigned, a clear button becomes available if you want to remove it.

As a side benefit of all of this, I did discover that Accessibility Insights, an extension I use frequently, allows for multiple keyboard shortcuts for different functions of the tool.


Using Excel to Build Pivot Tables and Get Financial Information with a Screen Reader

Earlier this month I was part of a panel presentation at the American Council of the Blind’s summer convention. The panel spoke on financial literacy. My portion of the session dealt with using Microsoft Excel to create pivot tables and using what are known as datatypes in Excel with a screen reader. I wanted to make the materials from that presentation available. Resources include:

  1. Audio of my part of the presentation.
  2. A Word document containing the text of the presentation.
  3. A plain text transcript of the presentation.
  4. An example spreadsheet with pivot tables.
  5. A sample spreadsheet using datatypes for stock information.

Additional Resources

You can obtain more details on the features of Excel discussed in the presentation at the following locations:

  1. Creating Pivot Tables
  2. Excel Datatypes – Stocks and Geography
  3. List of datatypes in Excel
  4. Excel Stock History Function
  5. Sample Templates Using multiple data types
  6. Creating drop-downs in Excel using data validation
  7. Tables in Excel.

Pivot tables have been invaluable for me in Excel for many years. It is well worth the investment of time to understand how to create them.
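For readers curious what a pivot table does conceptually, here is a minimal sketch in Python. The spending data, months, and category names are my own invention for illustration; Excel’s PivotTable feature offers far more than this, but the core idea of summarizing rows along two axes is the same.

```python
from collections import defaultdict

# Hypothetical spending records: (month, category, amount).
rows = [
    ("Jan", "Food", 200), ("Jan", "Travel", 150),
    ("Feb", "Food", 180), ("Feb", "Travel", 90), ("Feb", "Food", 40),
]

# A pivot table summarizes rows along two axes. Here we total the
# amount for every (category, month) pair, which is the same kind of
# summary an Excel PivotTable produces from a range of rows.
pivot = defaultdict(int)
for month, category, amount in rows:
    pivot[(category, month)] += amount

print(pivot[("Food", "Feb")])    # 220
print(pivot[("Travel", "Jan")])  # 150
```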

The newer datatypes in Excel make getting a great deal of information on multiple topics relatively straightforward. Stocks and geography are just the beginning.


The ADA at 31

Monday will mark the 31st anniversary of the signing of the Americans with Disabilities Act (ADA). Each year as another year goes by and we celebrate the signing of the legislation, I am torn between feeling celebratory and sad.

I do feel celebratory about the fact that the work of many allowed us to reach this point. As someone who has worked on various accessibility efforts over the years in a small way, I know all too well the toil, endless negotiations and so much more that happens to make progress. So congratulations for sure to those involved in getting this landmark legislation passed and sustained. That is assuredly worth acknowledgement on a grand scale.

I urge anyone reading this to read the congressional findings that are listed in the legislation. Like a lot of civil rights legislation, they detail that as a class, in this case one to which I belong, people with disabilities are not treated very well and in fact that’s an understatement. Frankly we suffer a staggering amount of outright discrimination and I’m of the opinion that far too often the level of discrimination people with disabilities experience is drastically softened when speaking about the reality of life in the U.S. today.

I know from numerous firsthand experiences, calling something discriminatory makes a lot of people uncomfortable. But trust me, experiencing the actual discrimination has that and more consequences.

So great, celebrate the reality that we have a law that at least gives some hope, if that is your choice. I understand that perpetual exposure to commentary that it is all trouble can be tough to experience. But while you are celebrating, just remember: it isn’t as if, in passing any of the amendments to the ADA or other legislation, Congress has said any of the eight findings they list have gone away. So if hearing that there are multiple challenges is tough, or that something seems discriminatory, just remind yourself that the following are still part of the society we’ve created here in the U.S., according to our own Congress.

The Congress finds that

(1) physical or mental disabilities in no way diminish a person’s right to fully participate in all aspects of society, yet many people with physical or mental disabilities have been precluded from doing so because of discrimination; others who have a record of a disability or are regarded as having a disability also have been subjected to discrimination;

(2) historically, society has tended to isolate and segregate individuals with disabilities, and, despite some improvements, such forms of discrimination against individuals with disabilities continue to be a serious and pervasive social problem;

(3) discrimination against individuals with disabilities persists in such critical areas as employment, housing, public accommodations, education, transportation, communication, recreation, institutionalization, health services, voting, and access to public services;

(4) unlike individuals who have experienced discrimination on the basis of race, color, sex, national origin, religion, or age, individuals who have experienced discrimination on the basis of disability have often had no legal recourse to redress such discrimination;

(5) individuals with disabilities continually encounter various forms of discrimination, including outright intentional exclusion, the discriminatory effects of architectural, transportation, and communication barriers, overprotective rules and policies, failure to make modifications to existing facilities and practices, exclusionary qualification standards and criteria, segregation, and relegation to lesser services, programs, activities, benefits, jobs, or other opportunities;

(6) census data, national polls, and other studies have documented that people with disabilities, as a group, occupy an inferior status in our society, and are severely disadvantaged socially, vocationally, economically, and educationally;

(7) the Nation’s proper goals regarding individuals with disabilities are to assure equality of opportunity, full participation, independent living, and economic self-sufficiency for such individuals; and

(8) the continuing existence of unfair and unnecessary discrimination and prejudice denies people with disabilities the opportunity to compete on an equal basis and to pursue those opportunities for which our free society is justifiably famous, and costs the United States billions of dollars in unnecessary expenses resulting from dependency and nonproductivity.


University of Wisconsin Hangs Out No Screen Readers Allowed Sign For Big Ten Opener

On Friday, the University of Wisconsin Badgers kicked off the COVID-19-influenced 2020 football campaign with a resounding 45-7 victory over the Illinois Fighting Illini. Like much in this year of change, Camp Randall was empty of the typical 80,000 fans.

To bring some of the gameday experience into the home, Wisconsin social media touted a new Badgers Live gameday experience.  Unfortunately, what Wisconsin Athletics clearly failed to do was ensure this experience was open to all fans. Instead, they hung out a sign to people who use keyboards and screen readers saying, “You are not welcome.”

Anyone familiar with web accessibility will recognize obvious WCAG failures on the opening signup screen: missing form labels and a lack of keyboard access to needed controls, to name just a couple.

If you manage to get past that, the signup experience has another basic failure where you are asked to pick an image to represent your user account.  The images are not reachable from the keyboard and are missing proper alt text.

There are likely many other failures beyond this. I gave up after being unable to pick an image in the account creation process.

Web accessibility is not new and in fact is not optional for public institutions such as the University of Wisconsin. The university has detailed accessibility policies at https://www.wisc.edu/accessibility/.

At this point, in my mind there is no reason beyond institutional indifference to accessibility, from the Athletics department at a minimum, for these situations to keep happening. This is not the first time I have experienced accessibility issues with web offerings from the athletics department.

It is far beyond time that Director of Athletics Barry Alvarez and Chancellor Becky Blank take accessibility of the online experiences for Wisconsin Athletics seriously. This new gameday experience may be exciting or it may be of absolutely no interest to me. But I, like any other fan, should have the opportunity to join and evaluate for myself.

As of Sunday, inquiries to Chancellor Blank on Twitter have gone unacknowledged. Email to a member of the athletic department indicated the issue would be investigated but with no date for an answer.

We are in unique times with all of us facing many challenges that were unexpected at the start of the year. But it is important that as we respond to those challenges, as Wisconsin Athletics has here, we keep our values and responsibilities in mind. Clearly someone at the university had the time to find this service. In fact, pregame radio interviews with members of the athletic marketing department repeatedly promoted how the team was looking to respond to COVID-19 and still create quality experiences for players and fans. This should have included accessibility and failing to do so is simply unacceptable.


A Couple Notes on VMWare For Virtual Machine Use

When I first wrote about using virtual machines with a screen reader, I mentioned that the machine management of VMWare’s Workstation was challenging. VMWare Workstation 16 seems to have corrected the majority of these issues.

The list of virtual machines is now something you can reach with the tab key. Up and down arrow will move to the different machines, and a context menu, available on each machine with Shift+F10 or the keyboard’s Applications key, brings up all the options you’d expect for power, snapshots and more.

In addition, the menubar in the program now reads correctly with multiple screen readers. You can press Alt and hear that the top menu has gained focus and expected behavior with keyboard and screen reading works.

There is still at least one quirk I’ve encountered when using any menus in the program. When you are using the down arrow to move through a list of items on a menu, up arrow does not seem to move in reverse. For example, in the context menu for a virtual machine, there is a list of devices you can make available to the virtual machine. If you move down past one of these devices, you have to arrow down through all the menu choices to get back to what you skipped.

Overall in a couple days of using VMWare Workstation 16, I’ve had success. As I mentioned in my original post here, this is not a free option but with these changes it is one I’m going to be putting back into my virtual machine toolbox.

On the Mac side, VMWare has released Fusion 12. This is a must if you are going to run Apple’s newest update for the Mac OS.

I also believe this is new: there is now an option for a free personal license to use Fusion on the Mac. The license supports the typical non-commercial uses, such as education, personal use, and more. Take note, though, that signing up for the license requires completion of a captcha challenge that has no audio option. So far VMWare has not responded to a tweet asking about this.

If you do try Fusion on the Mac, I did post about how I resolved the Capslock conflict between VoiceOver and Windows screen readers. My work and personal interests require me to use both Windows and Mac daily so I find this to be a very viable option.


Improved Sports Descriptions

I enjoy sports but often find myself wanting more details than even the best play-by-play announcer can provide. I find myself wondering if technology could help in the situation.

I’ll start by taking football as an example. As most fans of the sport will know, there are 22 players on the field at any given time. It would be simply impossible for any announcer to describe the location of all 22 players, let alone what each of them is doing during a play.

Imagine though if you divided a football field up into a complete grid where you could express the location of any spot by a set of coordinates. Then imagine that you can describe the location of players based on this grid. So at the start of the play you could indicate where all 22 players are lined up. Then as the players move during a play, you could have technology that would communicate how any player is moving throughout the grid.

Or imagine if you could combine this technology with the accessibility improvements that have come in various touch devices so you could slide your finger around in the application window and tell the location of any player at any time. Obviously doing any of this in real time would likely still be tough but imagine if you could do this on demand for certain plays just to get a sense of what’s really going on.
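The grid idea can be sketched in a few lines of code. This assumes some tracking system supplies player coordinates; the field dimensions, player names, and positions below are my own illustration, not from any real system.

```python
from dataclasses import dataclass

# Field grid assumption: x is yards downfield (0-120 including both
# end zones), y is yards from the left sideline (0-53).
@dataclass
class PlayerPosition:
    name: str
    x: int  # yards downfield
    y: int  # yards from the left sideline

def describe(p: PlayerPosition) -> str:
    """Turn one grid coordinate into a short spoken-style description."""
    return f"{p.name}: {p.x} yards downfield, {p.y} yards from the left sideline"

# A snapshot of a few (hypothetical) players at the start of a play.
# A real system would hold all 22 and update them as the play unfolds.
snapshot = [PlayerPosition("QB", 30, 26), PlayerPosition("WR1", 30, 5)]
for p in snapshot:
    print(describe(p))
```

With positions expressed this way, the touch-exploration idea becomes a lookup: map a finger location in the app window to the nearest grid coordinate and speak whichever player is there.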

I have no idea how possible this is with today’s technology. It seems like it could be an interesting research project for someone in the right environment. I’m tossing this out there because it’s something I’ve thought about lately. Anyone want to take up the challenge?


Using Windows Quick Assist with a Screen Reader

When you want to assist someone else with a computer problem on Windows, there are a range of connection options. Some, such as an actual phone call, are easy to establish but difficult to use for this sort of a situation. Others, like Quick Assist, offer the best of all worlds in most cases where establishing the connection is easy and the access you have on the remote computer works quite well. However, Quick Assist does not currently support redirection of audio as far as I know, so if you are using a screen reader, it can obviously be difficult to use.

In the screen reader case, especially if the individual giving the assistance is using a screen reader, there’s not a great way to easily know what’s happening on the remote computer. That said, combining a couple pieces of technology, something I’m sure is no surprise to pretty much anyone who uses a screen reader, can be one potential workaround if you want to use this connection option.

The Scenario

You, the screen reading user, want to aid someone remotely. That person may or may not be a screen reading user. In the old school phone call method, even if you use newer voice-over-IP technology, such as Skype, Teams, Zoom, or the never-ending stream of products that seem to appear, you obviously have no direct control over the remote computer. Asking the individual you are trying to assist to type commands and read the output only goes so far.

The Quick Assist Solution

Establishing a Quick Assist connection is straightforward. Both individuals launch the Quick Assist app, which comes as a part of pretty much all Windows 10 installations. The person helping requests a six-digit code that they share with the person receiving help. The person providing help also indicates how much control they want to request. For our example this would be full control. The person receiving help will enter the six-digit code provided and approve the full control request. The pin request, pin entry, control request and control approval all appear as web content, so they work quite well with screen readers.

The Wrinkle

As mentioned before, Quick Assist does not support audio redirection. To work around that situation, again you can combine a bit of technology.

  1. On either the computer providing help or the computer receiving help, create a meeting or other connection with your program of choice that allows for screen sharing. You will want to be sure to use a program that supports sharing of system audio as well. As I’m sure many blog readers know, I am a Microsoft employee, although this is my personal blog, so tend to use Microsoft Teams but Zoom and other programs will work equally well here.
  2. Both individuals involved in the assistance should then join the meeting or other connection.
  3. The person wanting help next starts a screen share from within the call or meeting. It is key to include the system audio. This creates the environment where the person offering assistance, who in our scenario is a screen reading user, can hear the remote audio.
  4. Now establish the Quick Assist connection, including the person to provide assistance requesting full control and the person wanting assistance granting such control.

The Additional Wrinkles

Any time you are establishing a remote connection between two computers, the issue of how and when keyboard commands, touch gestures or mouse clicks go to the remote or local computer requires attention. In my trials of Quick Assist thus far, when you first establish the full connection, keyboard commands for me stayed with the local computer. Using various screen reader commands and techniques, I had to find part of the Quick Assist window on my computer and activate that area. The area was labeled something like Remote Input. I was able to get to this area with the JAWS cursor, NVDA object navigation or Narrator previous and next commands in those programs. Activating the input area with the appropriate command in the screen readers allowed sending keyboard commands to the remote computer. At that point I could use all the standard commands on the remote computer, such as ctrl+win+enter to launch Narrator. Again, our scenario is that a screen reading user is providing help and the person receiving help is not and likely doesn’t even have anything beyond Narrator installed.

One other item of note once you are using the remote computer: so far in my use of Quick Assist, I could not find a way to get keyboard commands to go back to the local computer without having the person receiving assistance end the connection. I did try some of the standard commands I use when working with a standard remote desktop connection, and they did not seem to work. This may be a knowledge gap on my part or a limitation of Quick Assist. As I learn more, I’ll update with additional details.

Using The Remote Computer

The process of creating this connection sounds more complicated than it is once you do it a couple of times. In my experience, getting the remote computer audio from the screen sharing in Teams or another app, and using Quick Assist to control the remote computer worked well. There was a slight, very slight, lag from the time I used my keyboard commands until I heard the result but it was far better than trying to ask the other person to type a range of commands. I found the environment quite usable to help solve problems.

Alternatives and More

This is not the only solution to this problem, of course. If both parties in the connection are using the same screen reader, solutions such as JAWS Tandem, or equivalents where they exist, are better alternatives. Also, you could forgo the screen sharing part of this process and just listen to the audio from the remote computer over the voice connection you are using. There are likely other options I’ve not thought of, but I offer this as one more possibility when you want to help others out of a computer jam remotely, especially when you are using a screen reader and the person needing help is not.


A Short Update on Virtual Machines to Include an Apple Mac

Since writing about using virtual machines with a screen reader, I’ve expanded my personal computing environment to include more Macintosh use. As a result, I wanted to offer a few comments on using VMWare Fusion on the Mac.

Bill Holton’s excellent article on running Windows on a Mac was an invaluable start for me here. From that article my biggest issue to resolve was how to address use of the Caps Lock key.

I took a different approach and opted to solve, directly through VMWare Fusion, both the conflict between multiple screen readers over the Caps Lock key and the fact that the key apparently does not get passed on to Windows in a virtual machine. This was easier than I expected at the start.

In Fusion’s preferences, there is a keyboard option. Within that area are options for remapping keys. I chose to remap a different key on my keyboard to the Caps Lock key. While I was at it, I assigned a keyboard combination to what Fusion calls Menu. This is the equivalent of pressing the Context or Application key for right click menus on a keyboard.

These key assignments work just fine. You can press and hold the key you’ve reassigned to be the Caps Lock key and add the necessary Windows screen reader key commands, such as t for window title.

I’m successfully using JAWS, NVDA and Narrator on Windows running under VMWare Fusion. Since I’m running on a MacBook Pro, I configure the screen readers to use their laptop keyboard layouts. I also use the mode known as Unity in VMWare Fusion. This causes the open windows from both the Mac and Windows to appear as separate windows within the Mac operating system. I find it works great, with the Windows screen readers kicking in just fine when I switch to a Windows app.

Bill’s article also talked about using Boot Camp. I have tried that, and it is also an excellent option. However, I’ve found one consistent problem with Boot Camp that stops me from using it. There is a problem with the audio drivers used in Boot Camp that causes the first syllable or so of synthesized speech to be cut off unless other audio is playing. This has happened on occasion with other audio drivers on Windows. I’ve reported this to Apple, and I hope they address this sooner than later.


New Name Same Content

Despite installing, as I understand it, pretty much all the WordPress security you can, my blog has been hacked multiple times. It reached the point that Google was claiming my site was a security threat. As a result, I’ve opted for now to try a new name with the same old content, and hopefully more frequently added new content.

I also like the new name better anyway. I like the idea of putting thoughts and ideas out and finding what resonates.
