An interested party left a comment on a post I wrote back in November of last year, called The frustrations of VoiceOver. The commenter wondered if the situation I described in said post was the same for VoiceOver in Safari on iOS (meaning on both the iPhone and iPad). Problem being, I had one helluva time testing the “bug” with VoiceOver on iOS.
Long story short (and somewhat uninteresting, for the scope of this piece at least), yesterday I was able to clear the biggest impediment I had toward testing this quirk in iOS: how do I even turn VoiceOver on to test? iOS would not recognize my double taps when it asked for confirmation before turning VoiceOver on. “Is this really what you want to do? iOS’s gestures change when VoiceOver is turned on” (I’m quoting from memory; it’s more than likely that isn’t exactly what it says). So I put a call out on Twitter asking how I might overcome this.
Although the solution isn’t all that intuitive to discover on one’s own, that doesn’t make it any less liberating or powerful. Read “The Split Tap” in its entirety
Earlier this week Aaron Gustafson turned me onto an accessibility feature in Google+. In your “Settings” Google+ gives you the option to turn on “Accessibility,” to “change the presentation of some pages to work better with screen readers and other assistive tools.”
A noble goal towards inclusivity. But one thing unfortunately sticks out for me: why is this even an option? And an option a user must opt into?
Turns out the reason wasn’t as misguided as I first thought. Don’t get me wrong, it’s still totally counterproductive: retrofitting what they’ve already implemented to meet a goal that would be better served if Google strove for it up front. 1 But, I guess, effort is being made to provide people the support they might need to better their experience. My “complaint” should be taken with a grain of salt. It could be worse. But at the same time, it should be much better. Read “Some shortcomings in Assistive Technology?” in its entirety
Not that long ago I wrote about my initial experiences with Assistive Technology. And even though those experiences happened quite some years ago and I’ve undergone a lot of healing and a number of behaviour alterations since, I still use a handful of alternative means to access a computer.
But by far the most important one I use, that makes the time I spend on a computer much more productive and enjoyable, is the manner in which I use a keyboard.
As my physical ability has progressively changed, my needs (in the sense of the solutions I use) have not. Well, that’s not entirely true. I no longer need switch access scanning or mouse keys, but I still heavily rely on the keyboard, and sticky keys especially, to interface with a computer. I can use two hands to type, but that can be challenging (working in Photoshop is the exception), so I don’t typically use both hands. And in an effort to speed up my productivity, I don’t so much require said solutions as much prefer to use them.
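For anyone unfamiliar with sticky keys, the convenience it affords can be sketched in a few lines. This is purely an illustrative model of the latching behaviour (one key at a time building up a chord), not any operating system’s actual implementation:

```python
# Illustrative sketch of sticky-keys latching: pressing a modifier once
# holds it until the next regular keystroke, so a chord like Cmd+S can
# be typed with a single finger, one key at a time.
class StickyKeys:
    def __init__(self):
        self.latched = set()  # modifiers waiting to be applied

    def press(self, key):
        """Feed one keystroke; return the completed chord when a
        non-modifier key arrives, else None while modifiers latch."""
        if key in ("shift", "ctrl", "alt", "cmd"):
            self.latched.add(key)
            return None
        chord = sorted(self.latched) + [key]
        self.latched.clear()  # latched modifiers release after one use
        return "+".join(chord)
```

So pressing “cmd” then “s” in sequence yields the same chord as holding both at once, which is the whole point: no simultaneous key presses required.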
Which gets to my point: my most productive use of time, in terms of my access, is the keyboard. Most of the solutions I currently use involve these 90 keys that lie before me. Read “Keyboard accessibility” in its entirety
… In a hospital room not all that far away, a young man, lying excruciatingly still in a bed, awaited his shot at communicating with anyone outside the confines of his battered skull (being All Hallows’ Eve I couldn’t resist)…
But unlike most nightmarish Hallowe’en tales, I’m delighted to tell you (especially since that kid in that hospital bed was me), this one ended much better than it started out. Actually, by January 1997, the time this story began, I was able to move my head slightly to the right. And I could communicate. Again, through the blinking of my eyes. Once signifying yes. Twice for no.
Given this, in addition to my primary means of communication being my blinking eyes, I was also using a translucent, textured plastic cutting board with black hand-scribed letters (and numbers) on it as a secondary communication aid. The characters were arranged vertically, in I’m guessing 5 or 6 columns: the letters in order from A to Z (A to E in the first column, F to J in the adjacent column, and so on), and the numbers from 0 to 9, if I’m not mistaken, in a single column on the far right of the board. So armed with that setup, any person wishing to converse with me, when that conversation required more than a quick “yes” or “no” response from the likes of me, would hold this “communication board” by its handle with one hand, and with the other hand, while pointing, skim (from left to right) along the top of the board, pausing on each of the columns for a brief moment. Read “Not that long ago…” in its entirety
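The board described above amounts to what is now called partner-assisted scanning, and the column-then-character sweep can be sketched as a small routine. The layout and the blink-as-signal callback here are illustrative stand-ins, not a transcription of the actual board:

```python
# A sketch of the column-then-character scan a communication board
# supports: the listener sweeps across columns, the user signals
# (one blink = yes) to pick a column, then the listener steps down
# that column's characters until the user signals again.
from string import ascii_uppercase

def build_board(per_column=5):
    """Letters A-Z in vertical columns, numbers 0-9 in a final column."""
    columns = [list(ascii_uppercase[i:i + per_column])
               for i in range(0, len(ascii_uppercase), per_column)]
    columns.append([str(d) for d in range(10)])  # 0-9 on the far right
    return columns

def scan(columns, signal):
    """Sweep columns left to right, then characters top to bottom;
    `signal(item)` returns True on the user's 'yes' blink."""
    for column in columns:
        if signal(column):            # pause on each column
            for character in column:  # then step through its characters
                if signal(character):
                    return character
    return None
```

Two blinks of effort per character, at the listener’s sweeping pace: slow, but it works with nothing more than eye movement.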
Last week I was asked for my advice concerning input devices, the keyboard mainly, and image applications, specifically the HTML5 canvas element. It was an interesting conversation, one that mainly had me focused on a few aspects of what I find most usable. Nothing I’d say was incredibly stunning or necessarily revealing, again speaking exclusively about what I wrote, but yesterday something happened that made that conversation even more relevant.
One part of said conversation, my part, brought up OS X Lion and its gestures, and how I wasn’t in any hurry to upgrade. You see, I haven’t the most precise control of my fingers (spatially speaking) and I find the trackpad on my computer incredibly awkward to use a lot of the time. I can use it to move to any spot on the screen and click, once, but as for most of the advertised gestures in Lion, or at least my impression of them, having witnessed demonstrations both on the internet and in person, I was left with the strong impression those gestures would remain largely unusable to me.
Don’t get me wrong, I wasn’t hostile to the idea of Lion’s capabilities, as gestures are just one of many new features, or upgrading for that matter — I was planning on upgrading, eventually — but I said that on Thursday or Friday of last week, and by 8 o’clock Sunday night Lion was installed and running on my machine. I guess I should have defined what I meant by “not in a hurry.”
But I wasn’t incorrect. Lion’s gestures, for the most part, won’t work for me. At least in any consistent manner that would, even remotely, be productive. But I’m still trying. And I remain optimistic. Read “My behaviour is the problem?” in its entirety