Stranded between empathy and penalty

The idea of a penalty doesn’t belong exclusively to the law. It’s a penalty when a person can’t use technology they need.

Hello! My name is Johnny Taylor. I’m a disabled web worker, and it’s my job to keep web accessibility non-elite! But first, I should paint you a picture of who I am and why I’m speaking to you here today, at the fifth iteration of Accessibility Camp Toronto.

Way back in the summer of nineteen ninety-six, when I was all of twenty-one years old, I was involved in a very serious motor vehicle accident which left me in a coma for two months (or thereabouts). And that, getting straight to the point, is my personal motivation for practicing inclusive design.

See, when I emerged from my coma, I was diagnosed with what is known as Locked-in syndrome. From Wikipedia:

“Locked-in syndrome (LIS) is a condition in which a patient is aware but cannot move or communicate verbally due to complete paralysis of nearly all voluntary muscles in the body except for the eyes…”

And the fact that I recovered at all calls into question the original diagnosis – or that’s essentially what my doctors have told me in the years since. The fact that my brain stem bore the brunt of the impact, effectively inhibiting communication between my brain and limbs, certainly presented me as having the symptoms of being Locked-in at the time.

Now, if I might, I would very much like to stress a further point – a much more significant point where all this is concerned. I had full control of all of my mental faculties. I just couldn’t move. That’s what matters most here.

Switch access to the rescue

So, essentially left with no voluntary muscle control – which included my voice – I didn’t have much to do but endure near-constant pain from semi-violent muscle spasms all day and night. This made my case manager at the time, after she first saw me and witnessed my “silent scream” first hand, that much more determined to find me a means of communication. Enter my first real personal computer – emphasis on personal.

By this point in my story it was very early January of ninety-seven. I’d had a number of medical hurdles to clear before an outfitting could be attempted, and the original plan was to have some sort of sensor on my head to track my intent, I think. I apologize; my memory from those days isn’t exactly crystal clear, especially for things that were proposed yet never actually happened. By the time my computer did arrive, I had started to regain slight voluntary muscle control in my neck, and the plans changed. Switch access to the rescue.

And with that regained muscle control, I was able to activate a simple but big button placed beside my head on my pillow. Paired with a software package referred to as “row and column scanning,” I could watch a cursor traverse four or five columns of letters, numbers and common punctuation marks. When the cursor landed on the column containing the character I wanted, I’d click the button with my head; the cursor would make a perpendicular turn and start travelling down the column I’d selected. And when it landed on the specific letter, number or mark I wanted, I’d bump the switch yet again, thereby choosing it and entering the character on-screen, in a separate word processing application. These steps were repeated until I had formed the word, sentence or thought I wanted written.
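
For the developers in the room, here’s a rough sketch of how single-switch row-and-column scanning behaves. To be clear, this is not the software I was given back in ninety-seven (I honestly couldn’t tell you its name); the grid, the class and the method names are all invented purely to illustrate the technique.

```typescript
// Illustrative only: one switch, a grid of characters, two alternating
// scan modes. Layout and names are made up for this sketch.

// Each inner array is one on-screen column of characters.
const COLUMNS: string[][] = [
  ["a", "b", "c", "d", "e", "f"],
  ["g", "h", "i", "j", "k", "l"],
  ["m", "n", "o", "p", "q", "r"],
  ["s", "t", "u", "v", "w", "x"],
  ["y", "z", " ", ".", ",", "?"],
];

type Mode = "columns" | "characters";

class SwitchScanner {
  private mode: Mode = "columns";
  private column = 0;
  private character = 0;
  output = "";

  // Called on a fixed timer: the highlight moves on its own.
  tick(): void {
    if (this.mode === "columns") {
      this.column = (this.column + 1) % COLUMNS.length;
    } else {
      this.character = (this.character + 1) % COLUMNS[this.column].length;
    }
  }

  // Called when the big button beside the pillow is activated.
  press(): void {
    if (this.mode === "columns") {
      // First press: lock the column and start scanning down it.
      this.mode = "characters";
      this.character = 0;
    } else {
      // Second press: commit the highlighted character and start over.
      this.output += COLUMNS[this.column][this.character];
      this.mode = "columns";
      this.column = 0;
      this.character = 0;
    }
  }
}

// Spelling "hi": every character costs at least two well-timed switch
// activations, which is part of why writing this way was so slow.
const scanner = new SwitchScanner();
scanner.tick(); scanner.press();                 // lock column 1 (g–l)
scanner.tick(); scanner.press();                 // select "h"
scanner.tick(); scanner.press();                 // lock column 1 again
scanner.tick(); scanner.tick(); scanner.press(); // select "i"
console.log(scanner.output);                     // "hi"
```

Real scanning setups typically add niceties this sketch skips – adjustable scan speed, auditory cues, word prediction – but the core loop really is that simple: wait, press, wait, press.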

As an aside, ironically enough, the first word I wrote on my computer was…

Wait for it…

“Shit…”

Simple and to the point. Nothing more needed to be said.

Hauntingly similar

Anyway, prior to receiving my computer, I only had use of my eyelids to communicate: one blink for yes, two for no. And I had a low-tech letter and number board, too, which was exactly what it sounds like – single letters and numbers, printed by hand in black permanent marker on a small clear plastic cutting board with a handle jutting from its bottom. In fact, for me, this functioned in much the same way as the computer, except I needed another person to point at the columns and letters, pausing briefly at each step while watching my eyes for direction. When that person pointed at the column or character I wanted, I’d blink once, indicating my desire to choose it.

This is all hauntingly similar to the French film The Diving Bell and the Butterfly and how Jean-Dominique Bauby wrote the book on which the movie was based. It’s said his book took him ten months to write, using a process similar to the one I just described. The main difference between his process and mine was that his quote/unquote “Transcriber” vocalized the letters in an order based on their frequency of use in the French language. He’d blink once to select a spoken letter, and twice to signal his choice as well as his intent to finish the word he happened to be spelling – should that not have been guessed by his Transcriber before he finished spelling it, of course.

But the reason I characterized our processes as “hauntingly similar” just now, despite a few subtle differences, is that I can relate to the “Jean-Do” character in the film, and I can feel his frustration working with his “Transcriber.” And although things got easier and faster the more accustomed both my team and I grew to the process, it was far too easy to make mistakes. There’s too much room for interpretation and ambiguity, and it’s almost impossible to signal such concerns to the other person. As hindsight has taught me, my team could have used an “undo” sign. Live and learn.

Now, with the computer I didn’t need the help of another person – to write, I mean. Not that I had much time at that point to use and explore my computer for the things I would come to use it for later, namely surfing the internet. Keep in mind, I’d first gotten outfitted with a computer at a time when I was starting to regain a fair amount of muscle control through my right arm and hand, opening the door to my primary means of access to a computer today: the keyboard.

Suffice it to say switch access wasn’t with me for very long, but I was grateful for the sense of security and comfort it provided during the time I did have it. My use or non-use of it wasn’t the point, though. The point was, I had the option to use it had I ever needed to. I was granted access. It was the very definition of liberation, in every single sense of the word. I could speak for myself!

Very real consequences

Rather obviously, that’s precisely where everything I’ve done these past nearly nineteen years is rooted. And those roots run very deep. It’s why inclusive design matters to me. It’s not just some line drawn in the sand. Inaccessibility has very real consequences. I know far too well what it feels like to be cut off from everything, lying in a hospital bed, alone in a medically isolated room, with no way to even call for help. Literally stranded between empathy and penalty.

We’ll never be at a point where we can look to enforce a standard and expect perfection. Most obviously, because perfection is an ideal that cannot possibly be achieved. There will always be exceptions to any rule. You can’t fix the limitations of human-built systems by simply imposing more constraints on top of everything else.

I’m aware I’m not telling you anything you don’t already know. And I’m not suggesting standards are constraints. The authors of the Web Content Accessibility Guidelines (WCAG) laid out their intent firmly by framing these standards as “guidelines.” It’s in the specification’s title! They are recommendations – or best practices, if you will.

And while I agree wholeheartedly with WCAG’s reason for being, even the almighty WCAG 2.0 falls short. Take the “Completely Automated Public Turing test to tell Computers and Humans Apart” (CAPTCHA) exception, from WCAG Success Criterion 1.1.1 (Non-text Content):

“If the purpose of non-text content is to confirm that content is being accessed by a person rather than a computer, then text alternatives that identify and describe the purpose of the non-text content are provided, and alternative forms of CAPTCHA using output modes for different types of sensory perception are provided to accommodate different disabilities.”
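
For the sake of illustration, here’s roughly what that exception translates to in markup – a made-up snippet, not something taken from the WCAG documents, with invented URLs and ids: a text alternative that says what the widget is, plus a second form of the challenge aimed at a different sense.

```typescript
// Illustrative only: the shape of a CAPTCHA that tries to satisfy the
// SC 1.1.1 exception. The endpoints and ids here are invented.
function captchaMarkup(): string {
  return `
    <!-- Text alternative that identifies and describes the purpose -->
    <img src="/captcha/image" alt="CAPTCHA: type the characters shown in the image">

    <!-- Alternative form using a different sensory modality -->
    <audio controls src="/captcha/audio"></audio>

    <label for="captcha-answer">Type the characters you see or hear</label>
    <input id="captcha-answer" name="captcha-answer" type="text" autocomplete="off">
  `;
}

console.log(captchaMarkup());
```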

Problem being, this still excludes users. Deaf-Blind users come to mind, plus users who have problems passing such barriers, like me. Case in point: I came across Google’s new No CAPTCHA reCAPTCHA solution back in May, while helping a friend reset her Gmail password, and I was asked to select all the pictures whose content matched the one that was provided. Not only was I in a rush, I had trouble with the task. I couldn’t do it, which caused the old CAPTCHA to present itself. That one I solved, on the second try. As I said, I was in a hurry, but I was also rather agitated by that point. We really needed that new password.

As Gian Wild wrote in February 2013:

“There is even a specific section in the Web Content Accessibility Guidelines, Version 2.0 about CAPTCHA, in which their inaccessibility is acknowledged, but the WCAG Working Group feel they can’t be too hard-line about it.”

She goes on to cite parts of the Understanding WCAG 2.0 document that explicitly state that the CAPTCHA exception was included for fear people would choose CAPTCHAs over WCAG. I understand this reasoning. It’s based in reality. But at the same time, I feel this speaks rather forcefully to my point. And in this context, the idea of enforcing a set of guidelines can be understood as a constraint. I’m specifically citing the Accessibility for Ontarians with Disabilities Act (AODA) here.

However, even assuming the AODA were being enforced, or were comprehensive in all it enforces (I’m specifically referencing its web-related provisions here), I would much rather avoid getting into a pissing match with anyone over things that don’t really matter.

Hold on to that thought: I’m not saying any given tactic doesn’t matter, or isn’t worth your time or effort to pursue. I’m only telling you that, based solely on my lived experience, I’d much rather focus my efforts on the idea of user-centred inclusive design – and why.

It’s all about the users

We need to face facts: it’s not really about us at all – the content creators, the designers, or the developers. With all due respect, fuck our egos. It’s all about the users. If people can’t use what I know we all create with our best intentions, best practices and best standards, then tell me, what’s the point?

Not only did we need everything – not just the internet, but our entire civilization – to be accessible years ago, we need things to be much better than they are right now. And no matter how strongly we feel inclined to force change by enforcing a mandate, that won’t realistically lead to anything of value on the broader scale. Progress is far too frustrating to witness, and far too slow to endure, as it is.

I’m convinced that Bernice King said it best at the 50th anniversary of the Selma march just this past March:

“It’s going to take some time, because at the end of the day, no law is going to change a person’s perspective, their outlook, their behavior.”

We will never agree on tactics, and that’s OK. We all agree on the goal. How we get there isn’t the point; the point is, we get there. If life has taught me anything, it’s that everything hard to do is entirely worth every drop of our blood, sweat and tears. Small steps take time, and there is no mistaking their aim.

But before I finish, I’d very much like to take this opportunity to again thank both Rob and Sandy for asking me to present today, and all of you for the opportunity to be heard.

I trust you’ll have a great rest of the day.

Thank you.
