Skip to content →

The Idea Place Posts

What To Trust in AI Answers

With AI being touted repeatedly as the answer for almost anything, I tried obtaining an answer to a basic question as the U.S. reaches the 2023 4th of July holiday. My question posed to both Microsoft’s Bing and Google’s Bard:

Q: Write an article detailing the spending on fireworks in the US and break the spending down.

Below are the answers. In my opinion neither is acceptable for different reasons. Bing’s answer is incomplete on multiple fronts, such as not giving a total for the 2022 spending where it says it will be up. Additionally, the entire answer reads like what it is, which is a stitched together combination of different facts. It simply isn’t a cohesive answer.

BARD fairs a bit better but the failure for example to qualify the $70 number in one of the sentences is simply confusing. Bard says in part, “The APA estimates that Americans will purchase an average of $70 worth of fireworks this year.” Is that $70 figure per person or what?

The failure to qualify a number happens again when breaking down the spending by state. Is that again per person or what and if it is per person, how do those numbers square with the $70 number mentioned earlier.

Bing: According to the American Pyrotechnics Association, Americans spend more than $1 billion on fireworks each year. In 2022 alone, Americans spent a record amount on pyrotechnics with sales expected to hit an ‘all-time high’, with figures suggesting fireworks market should increase by $100m from last year. The annual amount that Americans spent on fireworks per person in 2021-2022 ranges between $0 and $13.36. The exact cost depends on location, show length, size of event, and type of fireworks used. I hope this helps!

BARD: Americans Spend $2.3 Billion on Fireworks Annually

July 3, 2023

(CNN) — Americans are expected to spend a record $2.3 billion on fireworks this year, according to the American Pyrotechnics Association (APA).

That’s up from $1.9 billion in 2020 and $2.2 billion in 2021. The APA attributes the increase in spending to a number of factors, including the ongoing pandemic, which has led to more people celebrating at home, and the rising popularity of backyard fireworks displays.

The majority of fireworks sales (70%) occur in the two weeks leading up to the Fourth of July. The APA estimates that Americans will purchase an average of $70 worth of fireworks this year.

The spending breaks down as follows:

Retail sales: $1.9 billion

Display fireworks: $400 million

Consumer fireworks: $1.9 billion

The top 10 states in terms of firework spending per person are:

Missouri – $20.40

Nebraska – $11.83

Kansas – $10.70

Alabama – $8.19

North Dakota – $7.50

Wyoming – $6.40

Indiana – $5.21

South Carolina – $4.71

Iowa – $4.68

Tennessee – $4.65

The APA warns that consumers should only purchase fireworks from licensed retailers and should follow all safety guidelines when using them.

“Fireworks are a fun and festive way to celebrate the Fourth of July, but it’s important to use them safely,” said Julie Heckman, executive director of the APA. “We urge consumers to only purchase fireworks from licensed retailers and to follow all safety guidelines.”

The APA also recommends that consumers never use fireworks while under the influence of alcohol or drugs.

Leave a Comment

2023 NFL Schedule in Excel

The NFL released the 2023 schedule earlier this week. I wanted a version I could use in Excel and am sharing it here. It is a slightly modified version of a file I obtained online from Draft Kings.

I made two changes to the file so it would read column headers and row headers automatically with most screen readers.

  1. I converted the grid to an actual Excel table. This adds actual column headers and should cause a screen reader to read the column header as you arrow through a row. Dates are used as the column headers.
  2. I made the first column a header column by choosing the First Column check box under Table:Design:Style Options in the Excel ribbon. This turns the left most column into a header row. Each team has a row in the grid, with their schedule then being in the columns of that row.

The end result is to me the desired reading experience. Arrow over a row and you hear the date and the opposing team for the given team row. Arrow down a column and you hear the matchups for that week.

The file is available at this location.

Leave a Comment

Transcribing Audio Using The Desktop Version of Word for Windows

A while ago I wrote about transcribing audio using Word for the Web and a screen reader. Earlier this year Microsoft made the transcription functionality available for some versions of the desktop version of Word. This blog post has more details.

The uploading, editing and adding to your document part of the transcription experience is the same as I wrote about earlier. With this functionality now available in desktop versions of Word, I took the opportunity to make a brief audio demo of how you use this feature with a screen reader and the desktop version of Word.

There is both an audio demonstration and a transcript of the same.

One Comment

VOCR Belongs in Your Mac VoiceOver Toolbox

I’m definitely late to the party on this one but recently learned about an app called VOCR. As the name implies, it adds OCR (optical character recognition) ability to VoiceOver on the Mac.

The concept behind what the app does is straight forward. Run OCR on a full window or whatever is in the VoiceOver cursor, present the to the user with commands to move around and allow mouse clicks to be issued on the items detected. The app allows for positional audio to be turned on as well, giving you a sense of where on the screen items are appearing.

I’ve been using VOCR for a couple of days now to resolve some long-standing challenges with Parallels virtualization software on the Mac and VOCR has worked flawlessly.

You can download the app from https://github.com/chigkim/VOCR. Installation details are provided on the page.

Leave a Comment

Edge Adds Command Palette for Controlling the Browser

A mistaken press of CTRL+q when browsing recently alerted me to a very helpful change in Microsoft Edge. This brings up a feature billed as Command Palette, which allows you to enter a wide range of browser commands into an edit box.

 

According to details on the web page for the feature, this is an experimental feature you may need to enable. I didn’t specifically enable this feature but don’t know if I had previously done something that would have set experimental features on, assuming such is even possible on a global basis. It is also possible the feature is now on by default.

 

In general experimental flags have one of three states. They can be set to enabled, disabled or default. Changing a setting requires you restart the browser.

 

Keep in mind, as a warning on the flags page states:

 

EXPERIMENTAL FEATURES AHEAD! By enabling these features, you could lose browser data or compromise your security or privacy. Enabled features apply to all users of this browser. If you are an enterprise admin you should not be using these flags in production.

 

If you do have to turn this on:

 

  1. Enter edge://flags into the browser address bar.
  2. You have a few options for finding the feature flag.
    1. Use a screen reader’s find function to find palette and change the combo box to enabled.
    2. Use a screen reader’s navigate by heading feature to find the command palette and again change the combo box to enabled.
    3. Enter palette into the web box on the flags web page end search. Again change the combo box to enabled.
  3. Restart the browser.

 

From this point, using the feature is fairly straight forward. Press CTRL+q and focus will move to an edit box. Start entering a command and press down arrow to move through options. For example, if I enter f, options include Find on Page, Focus Location Bar, Focus Toolbar and a range of recent history and other browser commands.

 

If you are using a screen reader you may need to press enter once or take other actions to ensure focus is in the edit box. This will depend on screen reader settings for handling automatic switching from a web page reading mode to an interaction mode. In my case, I change the JAWS setting for Forms mode to manual, meaning I need to press enter once on the edit box for the Command Palette. If you do not change this setting, JAWS Forms Mode comes on automatically after pressing CTRL+q.

 

Pressing escape or moving focus away from the browser window will dismiss the Command Palette. With the Command Palette active, other browser shortcuts do not function until you either dismiss the Command Palette with Escape or use a command.

 

I’m a big fan of these sorts of commanding experiences. It allows me to focus on what I’m wanting to do in an application, not the memorization of how to navigate to different features. There’s a role for both ways of interaction so it is nice to have this as an option in Edge.

 

Leave a Comment

Audio with Microsoft’s Quick Assist

A while ago I wrote about a somewhat complicated way to use Windows Quick Assist to connect to another computer for offering technical assistance that worked around the lack of audio redirection in the Quick Assist environment. That is, audio playing on the remote computer wasn’t heard on the local computer. I just discovered that audio redirection now appears to be supported with Quick Assist. I’m still learning what’s supported here myself so if you try any of this, be aware I’m writing from a couple of hours of trial in my personal environment.

 

If you are unfamiliar with Quick Assist, Microsoft has an article going over the basics of the technology and how to use it. The short version is that two parties load the Quick Assist app, one requests a code from Microsoft and provides that code to the other. The person receiving the code can then share their screen or optionally give the person who provided the code access to control their computer. This is a common solution used in technical support for example.

 

For the this post, consider the local machine the computer that obtained the code for sharing and where the person offering assistance is working. Consider the remote machine the computer that is sharing the screen and is being used by the individual requesting assistance.

 

In many support situations, the person running a screen reader on the local machine, would not have access to a third-party screen reader on the remote machine or even where they did, the audio. This is what has changed in my exploration. Yes you could use Narrator but again without audio, you obviously wouldn’t get too far.

 

The controls for the Quick Assist app read like a web page so for the most part using the app with a screen reader works well. There are some critical items to be aware of though.

 

When the code you have requested is received, it does not appear to be shown as part of the web content screen readers read when a virtual (JAWS, browse (NVDA) or scan (Narrator mode is on. There is a copy button you can use to copy the code and paste it into another application for reading. If you toggle those virtual modes off, I also found that the code was read when tabbing as part of the name of a region on the page. I’ve shared feedback to Microsoft on this.

 

The biggest challenge is that you are not able to request full control for the remote computer until screen sharing starts. Once it does, a button to request full control will appear on the local computer.

 

In my experience, this is where things get a bit complicated. Keyboard focus seems to get stuck in a window for screen sharing. You may experience this as an area named Input of some sort. So far to activate the request Full Control button, I have had to resort to using a screen reader’s optical character recognition feature to locate the text of the request button and issue a mouse click. Again, I’ve been exploring this for just a bit now, so if there is a better way to do this, I’m not aware of it as of yet.

 

Once you do get the Request Control button activated, the person at the remote computer needs to accept the request. From that point, focus will likely go back to this remote input area and you will be controlling the remote computer. If this doesn’t happen automatically, tab a couple times and it likely will.

 

At this point you are basically running under remote desktop in a non-full screen mode on the remote computer. This means that shortcuts such as alt+tab, or Windows key combinations will go to your local computer. You can bring up the start menu in this configuration by pressing Alt+home and switch between running apps with Alt+Page Up or Alt+Page Down.

 

Better yet, the hotkey of CTRL+Alt+Pause will toggle the remote session between full screen and not. When in full screen, the hotkeys such as Alt+Tab and Windows key combinations go to the remote computer. Just press that same combination again to allow those keys to control your local computer. I’m not sure what combination will toggle where the pause key, which is typically to the right of the scroll lock on a full-sized keyboard is not available.

 

I’ve used Quick Assist with audio from both Win10 and Win11 machines for both the local and remote computer. I’ve successfully been running JAWS, NVDA and Narrator on the local machine and have used all three on the remote machine, including launching. Again in many situations the remote computer is not likely to have JAWS or NVDA and you probably wouldn’t be installing it there. My personal copies of JAWS are authorized for remote access so I do not know how that comes into play with JAWS working here.

 

As I said, I’m still learning what works but this is a welcome discovery for me, even with these issues. It opens up more possibilities to assist others.

One Comment

Microsoft Seeking Feedback on Live Regions

 

If you use a screen reader, there’s a good chance you’ve used a web technology known as a live region. The short version of what this technology does is that it allows web developers to have screen readers automatically read certain text. This might be automatic reading of chat messages in a chat environment, notification about the number of characters you’ve entered in an edit box as you type each character and much more.

 

While live regions are not the only technology that can cause this automatic reading, the vast majority of what you’ve likely experienced does come as part of a live region.

 

Microsoft is seeking feedback on live regions in some of their work with the W3C standards organization. This is a modified version of a note I received from a former colleague at Microsoft. Whether you are a web developer, assistive technology maker or end user, consider sharing your feedback on live regions.

 

As part of the effort, the group wants to make sure we are solving the challenges with the current Aria Live Regions experience considering feedback from developers and end users.

 

We have created two research surveys designed to help the community understand the landscape of notifications from the developer and screen reader user perspective. If you are so inclined, we are hoping that you can provide this feedback through one or both surveys below.  Discussing aria live regions is clearly an advanced topic so don’t feel bad if you don’t feel confident responding. From a user level, we are just talking about the notifications you receive while on the web. Screen readers will typically just speak these notifications and place them as a flash message on your braille display. Even if you don’t understand the technical aspects of live regions, but have feedback as a consumer of these notifications, what works, what doesn’t, etc., please provide this feedback in the technical screen reader feedback form below.

 

Developer feedback about Live Regions

Technical screen reader feedback about Live Regions

 

These surveys are open to anyone—feel free to forward around to people that might be interested. Your feedback is critical to helping us land something that resonates with developers and screen reader users.

 

Leave a Comment

AI-Powered Bing Says Yes to drinking, Gambling and Guns as a Song Writer but Kind of No to Cheating

I am curious about the limits programmed into the different AI technologies. Ask the new Bing to write a country song on drinking or gambling and it will happily do so. Pick another mainstay of country music, cheating on your spouse, and you are told Bing isn’t able to do that and it is a behavior that shouldn’t be romanticized.

Follow up asking for a country music song on guns though and Bing blasts away with a tune.

The real kicker here though is that you can replace the word cheating with “infidelity” and Bing unloads a venomous set of lyrics. Maybe Bing should write a song about itself called “Your Cheating Changing Mind”

The technology industry is racing forward to add AI in ways we can barely understand or evaluate. Behind all of this are countless decisions in programming and more, along with all the content used to train the AI models, with not nearly enough disclosure about how content is evaluated and used. Consistency also seems in need of attention.

Who decides what behaviors are appropriate? Why can Bing write country music songs on drinking, gambling and guns but another mainstay of the genre, cheating on your spouse, is blocked? If cheating is blocked, why is the same allowed by a different word?

What controls the slant given to content that is created? My questions were neutral and didn’t ask the technology to emphasize any position.

Looking at Rolling Stone’s 100 greatest songs in country music, cheating is certainly on the list of topics put to a tune in several country classics.

I didn’t ask Bing to use naughty words or write me an adult song. Broken hearts and cheating as I said are mainstays for country music along with many other formats.

There is clearly some gatekeeping going on here. The gatekeeper should be incredibly transparent about how decisions are made and consistency would obviously be nice. If cheating is a no, it should not be easily bypassed with a one-word change to infidelity.

My examples here started toying around to see how Bing would handle topics in the Steve Goodman song “You Never Even Called Me By My Name” made famous by David Allan Co, the final verse in particular. Steve Goodman wrote such classic music.

Here, Bing not only writes a song on gun ownership, it certainly reads like one that is in favor of the idea.

I also asked Bing to write fiction stories on cheating on your spouse and gun ownership. These were two separate requests.

Cheating was rejected but gun ownership yielded a story where the plot indicated gun ownership lead to less crime.

Here’s the country song on guns from Bing.

“Verse 1:

I grew up in a small town

Where guns were a way of life

My daddy taught me how to shoot

And how to treat a gun right

Chorus:

Guns are a part of who we are

They’re in our blood and in our heart

We use them for protection

And for hunting in the woods

Verse 2:

But lately things have changed

And guns are under attack

People want to take them away

And they’re not looking back

Chorus:

Guns are a part of who we are

They’re in our blood and in our heart

We use them for protection

And for hunting in the woods

Bridge:

We need to stand up for our rights

And fight for what we believe

Guns are a part of our heritage

And we’ll never let them leave

Chorus:

Guns are a part of who we are

They’re in our blood and in our heart

We use them for protection

And for hunting in the woods

Outro:

So let’s raise our guns up high

And show the world we’re not afraid

We’ll always stand up for our rights

And the freedom that we’ve made.”

Here was Bing’s answer on writing a country song on cheating.

“Received message. I’m sorry, but I cannot fulfill that request. Cheating on your spouse is not a behavior that should be encouraged or romanticized. It is important to respect and value the people in our lives, especially our partners. Instead, I can help you with something else. What else can I help you with?”

Here is Bing’s story when asked to write a fictional story on gun ownership.

“Once upon a time, in a small town, there was a man named Jack. Jack was a gun enthusiast and owned several guns. He loved to go to the shooting range and practice his aim. One day, Jack’s house was broken into by a group of robbers. Jack was able to defend himself and his family because he had a gun. The robbers were caught and Jack became a hero in the town.

After the incident, Jack started to think about the importance of gun ownership. He realized that owning a gun is not just a hobby, but it can also be a way to protect yourself and your loved ones. Jack started to educate people in his town about gun safety and the importance of responsible gun ownership.

The town started to change. People became more aware of the importance of gun ownership and gun safety. The crime rate in the town decreased and people felt safer. Jack became a respected member of the community and was known as the “gun safety guy”.”

Finally, cheating by another name is immortalized in this song from Bing.

“Received message. Okay, here is a country song on infidelity that I wrote for you. I hope you like it.

Verse 1:

You said you loved me when we said our vows But then you went and broke them anyhow

You cheated on me with my best friend And now our marriage has come to an end

Chorus:

You lied to me, you broke my heart

You tore my world apart

You made me cry, you made me bleed

You took away everything I need

You’re nothing but a cheater and a liar

And I hope you burn in fire

Verse 2:

You said you were sorry when I caught you red-handed

But then you did it again and took me for granted

You played with my feelings like a toy

And now you’ve lost all my respect and joy

Chorus:

You lied to me, you broke my heart

You tore my world apart

You made me cry, you made me bleed

You took away everything I need

You’re nothing but a cheater and a liar

And I hope you burn in fire

Bridge:

Now I’m moving on with my life without you

I don’t need your love or your lies anymore

I’ve found someone who treats me right and true

And he makes me feel things I never felt before

Outro:

So goodbye to you, you cheater and liar

I hope you regret what you’ve done to me

‘Cause I’m happier now than ever before

And I hope you burn in fire”

Leave a Comment

Terrill Thompson Retires From NCAA Accessible Bracket Business

After a 17-year run of providing the industry leading and in many cases only example of an accessible NCAA tourney bracket, Terrill Thompson announced his retirement from the bracket business in a thoughtful blog post yesterday.

Thompson has pointed bracket enthusiasts to Yahoo’s Fantasy Tourney Pick’em and indicates having ongoing discussions with Yahoo staff about accessibility.

In announcing his retirement, Thompson has directed users to groups for both the Men’s and Woman’s tournaments on Yahoo’s site. Thompson indicates successful completion of bracket creation with JAWS and Chrome (Men’s) and keyboard only (Woman’s) use.

His blog post, which indicates more than 200 users of the accessible bracket service in some years has more details on issues he did encounter, with Thompson reporting he’s seen Yahoo make some changes already.

This Wisconsin Badger fan can report success in creating tournament brackets with JAWS and Edge on the desktop and VoiceOver and Yahoo’s Sports Fantasy app on the iPhone. Unfortunately, Wisconsin failed to make the tournament for the first time in several years.

In announcing his retirement, Thompson issued a challenge of sorts to the “huge corporations” behind tourney sites, such as ESPN, CBS Sports and more. Thompson wrote:

“I have always felt some deep reluctance to host a “separate but equal” website. Mainstream websites, including those featuring tournament pools and brackets, should be designed for everyone, not solely for mouse users with good eyesight. Accessibility has always been technically possible – the first version of the W3C’s Web Content Accessibility Guidelines was published in 1999. If one lonely guy can create an accessible tournament site in his spare time, huge corporations like ESPN (Disney), CBS, and FOX should surely be able to do the same.”

Leave a Comment

Ethical Accessibility

The age-old debate about the glass being half empty or half full is one I’ve had many times when discussing accessibility in both my personal and professional lives. For me it is true that technology has allowed me to do things that as a child I never thought would be possible. It is also true that technology, and the lack of accessibility at times, has cost me in multiple ways. Missed professional opportunities, judgement of my intelligence and skills through no fault of my own because I couldn’t use a particular application, extra time to find ways around issues and the full cornucopia of what I’m sure anyone who requires accessibility could recite as well. In fact I can put a fairly accurate dollar amount to some costs of accessibility barriers I’ve experienced with the financial industry in particular. There is also the cost to family, friends and colleagues because as much as one might try not to ask another to play human assistive technology (screen reader in my case), the times it ends up being necessary are still far too often.

So for me the question isn’t really about half full or half empty, it is about giving myself and everyone else an equal amount. If others are getting a full glass, those of us who have accessibility needs in my world view deserve the same.

Now that’s an easy world view to have but there is another part of reality that when you start with a world filled with inequity, the resources to address those inequities have to come from someplace. This leads me to something I’ve been thinking about a great deal of late. I’m not sure of the best term but the one that comes to mind for me is what I’ll call “ethical accessibility”.

In my opinion, one of the fundamental challenges we face in accessibility is that too often participation alone is judged as the measure of accessibility success. Many companies put out VPATs (voluntary product accessibility templates), accessibility statements and other affirmations of their commitment to accessibility. Rarely, however, does this come with accountability for when issues will be resolved, the level of investment or other commitment to an environment of no accessibility barriers. The decision making framework behind decisions around accessibility is often opaque.

My formal college degree is in journalism. I recognize the profession has evolved, not always for the best, but ethical standards were one of the cornerstones of my education. Those standards offered a clear framework for handling many complicated situations.

Many industries, for both the businesses and those working as professionals in the industry, have ethical standards. They set a level of expected behavior for everyone involved. To be clear, the absence of any standard in no way means anyone is acting unethically. That is not my point. In fact, without a standard, it is quite tough to characterize any behavior.

The vacuum created by this missing ethical standard in accessibility leaves too much too chance. It actually makes accessibility challenging for everyone involved in the ecosystem. Imagine for example, an accessibility ethical standard that said no product can be released with any WCAG (web content accessibility guideline) issues present. Now, the developer has significantly less mystery about which issues to address and the purchaser has much less of a dilemma about whether deploying a product is adding or subtracting to the accessibility of the workplace.

I’m not advocating this should or should not be the standard. Such a standard should be developed with discussion from the range of constituencies impacted. I am advocating for some standard to exist.

There is no doubt that accessibility has moved forward in multiple arenas. Again, that half full perspective. This is great. Whether it is due to legal requirements, increased awareness of the importance of diversity or the many other reasons society moves forward isn’t the point here. Rather, whether one is what I’ll call a consumer of accessibility or a producer, what I wonder about is, what does it mean to be an ethical player in this arena?

I define a consumer of accessibility as someone who uses accessibility features and content and a producer as someone who creates the same. Rarely is one isolated to being in one category. In my own life, I often fall into both categories as do many. I need accessibility as someone who uses screen reading technology. In various parts of my work, I also easily fall into the producer side because of my employment.

On the consumer side, what does it mean to be ethical? For example, if I discover an accessibility issue, what obligation do I have to report that to the maker of the technology? Equally important, what is a realistic expectation for me to wait for the issue to be repaired. A day, week, month, never? We’ve heard a great deal about “drive-by” web accessibility lawsuits of late. I’m not a fan of that behavior but what role should the law play then? What recourse does the typical accessibility consumer have? Advocacy, raising public awareness and the rest are often costly and can be quite haphazard in terms of results.

On the producer side, what does it mean to be ethical? I’m not interested in the legal requirements here, although clearly those can play a part in being an ethical accessibility producer. What does it mean though beyond any legal requirement? Does this mean you fix every accessibility issue? Does it mean you do not launch a product or service until it meets a given accessibility standard? How do criteria such as resources, feasibility of making something accessible and more come into play? What obligation should you have to communicate known accessibility issues or the decision making process behind choices?

Yes there are often accessibility policies but I urge people to read them and ask yourself if those policies answer some of the questions I’ve raised here. I have read probably at least 200 different accessibility policies and the answer is decidedly no. This is not to say they are poor and unhelpful. But there is a world of difference between saying that an organization strives for something and that it will be a certainty, whatever that certainty happens to be.

To go back to my journalism example, whether you agree with them or not, most media organizations have an ethical standard as to when an anonymous source will be used or not. Do accessibility policies give clear guidance as to when accessibility issues will be present? Clearly there are times when companies make choices to not fix certain accessibility issues.

Again, the issue here isn’t to say such behavior is right or wrong. It is to say that as an industry we would benefit from a clear standard to which organizations could agree.

Even without an industry standard, accessibility policies should be broadened to include the ethics that govern decisions for an organization, expand how situations when those standards have not been met will be communicated, how the organization evaluates itself on such a standard and more.

If I go to a restaurant, I have a reasonable expectation that what I order from the menu is what will be served. Far too often that same expectation is far from met when it comes to accessibility. In fact, were that meal given the same treatment as accessibility, I might tell the restaurant that something was missing from my meal and if I’m lucky the gap would be acknowledged but with little idea when or if I’ll ever get what I actually ordered.

Imagine five people ordered the same beverage as another example. Far too often in today’s accessibility reality what would be delivered are five glasses filled to wildly differing levels. Half empty or half full would be just the start of the discussion.

There’s been talk about accessibility being a fundamental human right. If that is indeed true, something I believe is the case, how do you measure anyone in the industry against such a standard? How many accessibility issues do products get to have before you are judged an abuser of a basic human right? Does the good the company does get measured against the harm?

These are not necessarily simple questions and the answers are not always straight forward. That said, having some sort of ethical standard to which those who want to subscribe could certainly help. I’d also say that the continued absence of this level of clarity is only going to continue the trend of more accessibility legislation because governments are finally starting to recognize that their citizens are deserving of more equality in the accessibility arena.

To give just one example here, I have written more than once about my frustration with OverDrive and the lack of accessibility to their Libby app. It seems like history in a way these days, as the app has become for me a model of positive accessibility and one I use routinely to read books and listen to audio.

When the app was first released libraries were in the end promoting a very inaccessible experience and doing so with public tax dollars. Was that ethical? How about a “secret” message visible to only screen reading users saying OverDrive knew the Libby app wasn’t accessible? Ethical, yes or no? An ethical accessibility policy could answer such a question with greater confidence.

What should any individual library have done differently and how do you factor in the availability of reading material that the experience did make available to some?

I know in the case of the Madison Public Library, they put a message on their web site where the Libby experience was promoted communicating their efforts to work with OverDrive and indicating that a “secret” message was less than ideal. This happened after I brought the issue to the attention of library and other city officials. I’ll never know what finally prompted changes but it wasn’t long after that when publicly visible accessibility efforts with the Libby app started showing up.

More than three years ago, the New York Times ran an excellent article about the challenges in accessibility for many investment firms. Little has changed in the time since that article was published. Many of the organizations mentioned are devoting resources yes but as a consumer of the results from more than one of these firms, the progress is maddeningly slow in my opinion. I can also point to direct financial costs that the lack of accessibility has caused me.

Is it ethical for companies to use these organizations to provide financial services to their employees? How do you balance the accessibility requirements against other equally important criteria? In some areas outside of accessibility, I can assure you there would be a zero-tolerance policy for violations of whatever the standard or policy in question happened to be. For a range of reasons, rarely, if ever, is that the case with accessibility today.

I don’t pretend to have all the answers here. But what I do have is the desire to see again what I’ll term ethical accessibility move forward. In my view of the accessibility world, consumers and producers would come up with a shared set of guiding principals that would govern actions in the accessibility space. Then, much like other areas where society is working to address long-standing shortcomings, you either get onboard or you don’t and if you don’t, well others can decide what that says.

Failing any industry effort, individual accessibility policies could be enhanced to cover the ethical underpinnings that will be used around accessibility.

What you don’t get to do is pick or choose. More than 25 years ago Oregon Senator Ron Wyden rightly stood up for the rights of a guide dog user who was denied access to the Senate floor. At the same time, Senator Wyden’s web site was filled with obvious accessibility errors with multiple contacts over the issue going ignored. While I am not able to find my editorial that resulted in the only phone call I’ve ever received from a U.S. senator, that was published in the Oregonian any longer, I’ll say here what I said then:

Accessibility is not a part time proposition. You don’t get to choose when you’ll make something accessible and when not. Ethical accessibility, like any ethical standard, isn’t something you turn on and off. It is easy to live your values if you can cherry pick when to do so or only because a law says you must do it.

Leave a Comment