Today, at a workshop held at the Inclusive Design Research Centre (IDRC), I was asked for my thoughts on the implications of “sharing” services, such as Uber, for people with disabilities. Since I was unable to attend in person, I was kindly given the opportunity to have my written words represent my interests instead. What follows are those words…
In all honesty, my initial approach to these thoughts consisted of little more than what I’ve heard and read about Uber from various sources, namely the news media. And I’ll admit, that was unfair. But consider that the “source” I drew my impressions from most was a Blind colleague who was denied service because she was accompanied by a service animal, and then charged a cancellation fee on top of everything!
And this wasn’t an isolated incident, for her I mean. It had happened to her at least once before (it turns out it has happened fourteen or fifteen other times). And what happened after (I have no idea what amends, if any, were offered) is of little consequence; it happened. What I’m driving at here is that Uber has a public perception problem as a result. However, that’s a separate issue, and one potentially solved with training.
But what concerns me more about Uber’s intention to provide its customers with an accessible service is what this effort will mean for Uber’s reliability. Read “Uber must earn trust from all their customers” in its entirety
This past Global Accessibility Awareness Day (GAAD) I was fortunate enough to catch part (as in a wee bit) of Inclusive Design 24 (#ID24). The Paciello Group held 24 one-hour webinars on various matters dealing with accessibility. It was quite the productive gesture, intended, and I’ll quote, to “celebrate efforts worldwide to ensure people with disabilities have full and equal access to the web.”
Remember (and not to suggest this was The Paciello Group’s intent when offering their statement about #ID24): “if what one is unable to do continues to be used as a means of defining disability […] then every single individual on this planet is disabled.” That statement sums up the intent of GAAD brilliantly, so says me. It’s all inclusive.
And one “webinar” in particular got my noodle cooking: Billy Gregory’s talk, 10 Things I Wish I Knew When I Started in Digital Accessibility. Not that any talk I was able to tune into wasn’t great. But it was this one that was personally relevant, in the sense that I found myself thinking a lot about how I’d answer Billy’s proclamation. Read “The use of technology will always require adaptation” in its entirety
In honour of Global Accessibility Awareness Day (GAAD) today, I’m throwing this method out into the ether that is the web. However, it’s not the quote-unquote “technique” itself I’m offering; I don’t really expect anyone to use it. Rather, my aim is to get people thinking about the content they consume and produce on and for the web, period. And thinking a little differently about said web content.
After all, that’s the point of going through the effort of raising awareness: to think about things in a manner you aren’t typically conditioned to. Or in other words, it’s not so much the result I’m most interested in here; it’s the reasons for, and the process that gives us, that result. Specifically, it’s my hope to draw some attention towards automatic text transcriptions of audio-only podcasts.
And I’m aware such a solution is still a ways off from being practical, as in reliably usable. But it’s never too early to entertain prospects. And experiment. Read “Automatic audio text transcriptions” in its entirety
I’ve spent some time over the past few months thinking about how I craft the content I publish for the web, specifically my use of language when writing. In one particular context (not to suggest my writing is free of problems in others) it’s not as inclusive as it should be.
I’m referring to how a screen reader user experiences the words I write. And with my limited use of the technology, I’ve taken note of something quite specific. If you use a screen reader to speak my words, I’m not sure you, as a listener, will get all of the “subtleties” (case in point) of my intent.
Using the example I cited immediately above, precisely how is a screen reader user supposed to know I’ve put the word “subtleties” in quotation marks? Just typing quotation marks before and after the word isn’t enough to make a screen reader speak them. Read “Language is a curious beast, ain’t it?” in its entirety
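To illustrate the problem with a sketch (this is a common workaround, not necessarily the technique the full post lands on, and the class name is my own): pair the visual quotation marks with visually hidden text that a screen reader will speak, while hiding the typographic marks from the accessibility tree.

```html
<!-- Sketch: making the quotes around “subtleties” audible to a screen reader.
     The .visually-hidden spans are spoken but not displayed; the literal
     quotation marks are hidden from assistive technology via aria-hidden. -->
<p>
  I’m not sure you’ll get all of the
  <span aria-hidden="true">“</span><span class="visually-hidden">quote</span>subtleties<span class="visually-hidden">end quote</span><span aria-hidden="true">”</span>
  of my intent.
</p>

<style>
  /* The standard visually-hidden (clip) pattern: the text remains in the
     accessibility tree but is removed from the visual layout. */
  .visually-hidden {
    position: absolute;
    width: 1px;
    height: 1px;
    margin: -1px;
    padding: 0;
    overflow: hidden;
    clip: rect(0 0 0 0);
    white-space: nowrap;
    border: 0;
  }
</style>
```

Whether a given screen reader actually announces punctuation also depends on its verbosity settings, which is part of why this is such a curious beast.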
When I wrote Regaining focus I figured, from my past experience with user-defined style sheets, things would be as straightforward as I remembered them being. Why isn’t anything as simple as the utility demands? And for the record, I know relying on memory alone is a sloppy move. I hope to correct myself with this post today.
But I didn’t fully realize my error until I went to write the README for a GitHub repository containing my solution to the problem. Using the bit of code from the post linked above, I wrote a style sheet (FOCUS.css) meant to give a user the same experience in the same browser from one site to another.
The problem is that the ways to implement user-defined focus styles vary quite a bit from browser to browser, from easy to complicated. Read “Defining focus” in its entirety
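For the curious, a user style sheet of the sort described might look like the following. This is a minimal sketch, not necessarily the contents of the actual FOCUS.css:

```css
/* Sketch of a user-defined focus style sheet: force a consistent, highly
   visible focus indicator on every focusable element, overriding each
   site's own styles via !important. */
:focus {
  outline: 3px solid #cc0000 !important; /* thick, high-contrast focus ring */
  outline-offset: 2px !important;        /* keep the ring clear of the element’s edge */
}
```

How a browser loads such a file is exactly where the variation bites: Firefox has historically read a userContent.css file from the user’s profile directory, while other browsers require digging through settings or installing an extension.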