Earlier this week Aaron Gustafson turned me on to an accessibility feature in Google+. In your "Settings," Google+ gives you the option to turn on "Accessibility," to "change the presentation of some pages to work better with screen readers and other assistive tools."
A noble goal towards inclusivity. But one thing unfortunately sticks out for me: why is this even an option? And why is it an option a user must opt into?
Turns out the reason wasn't as misguided as I first thought. Don't get me wrong, it's still totally counterproductive: retrofitting what they've already implemented to meet a goal that would have been better served had Google striven for it up front?1 But, I guess, an effort is being made to provide people the support they might need to better their experience. My "complaint" should be taken with a grain of salt. It could be worse. But at the same time, it should be much better.
First things first
But the first thing that occurred to me was directly related to a conversation I recently had, and for all intents and purposes am still having (I need to pick it up again on my end), which might highlight a less obvious issue at work here. Less obvious than Google overly complicating things, I mean. It may not be so black and white. Could this be, at least in part, the result of some shortcomings in Assistive Technology?
It seems my initial impression of screen readers was in the ballpark, and is a common one among users familiar with the technology: they are hard to use. But keep in mind, hearing from a couple of interested web developers is by no means representative of anything. I know that. And there was the chance none of the comments came from Blind or low-vision users either (please forgive my sloppy assumption). But as it turns out, 50% of the feedback I received concerning my "initial impression" was in fact from a screen reader user, and not just a tester; again, my apologies for not checking into it before sounding off. Not that anyone's feedback isn't valid, but I think that fact is worthy of mention here, considering Blind and low-vision users would have the most relevant opinions concerning screen readers. I digress.
An “audio first” experience
What I originally thought was going on was that certain aspects of Google+ needed to be "simplified" to help Assistive Technology do its thing. That's what I get from Google's explanation of this feature, if I'm to take them at their word (which I don't, at least not entirely). What I mean is that, apparently, taking the content of the incomplete conversation I cited above at face value, screen readers aren't typically an "audio first" sort of technology.
Meaning the important part of a screen reader, the voice that speaks content out loud to a user, has to deal with more than just the words it needs to speak. "Obviously," you say. And I'd agree. It must deal with what is happening to the content as a result of interaction. But as I understand it, some of what the screen reader relies on to convey the information it speaks to its user isn't needed. Or in other words, screen readers rely too much on visual information when providing what largely needs to be an audio experience in the end.
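To make that a little more concrete, here's a minimal sketch of my own (hypothetical markup, not anything pulled from Google+) showing how a page can tell a screen reader what changed directly, rather than leaving it to infer that from the visual presentation:

    <!-- A live region: when new content is injected here, the screen reader
         announces it on its own, with no visual cue to interpret. -->
    <div id="notifications" role="status" aria-live="polite"></div>

    <!-- A purely decorative icon the voice can safely skip. -->
    <span class="icon icon-star" aria-hidden="true"></span>
    <a href="/activity">View your activity</a>

The announcement comes from the markup itself; the screen reader never has to notice that something on screen looks different.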
Nothing is as it seems
Now, granted, I barely (and "barely" gives me too much credit) understand what is really at stake here. I'm going on what little discourse I've had on the subject (which, as I said above, I need to resume), but my original impression as to why this feature is available to users isn't entirely discountable. Nor is the notion that screen readers need to rely on visual information to provide users with an audio experience (which, I'm sure, wasn't quite what was implied during my conversation).
What I think should be taken away from my post today: "BREAKING NEWS: Web design is hard and complicated."
- This is largely conjecture on my part, as I've only performed the most basic inspection of the issue (reading through Google's code, need I say more?), but it appears Google is providing this option as a means to circumvent "verbose" coding standards. The markup does seem a bit slimmer after "Accessibility" is turned on; see the sketch below. Meaning, I think, Google is compensating for what they first built by allowing users to opt into a more "focused" experience, both visually and structurally, by reworking their code to better accommodate Assistive Technology. That said, this would better accommodate everyone. ↩
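For what it's worth, this is the kind of slimming I have in mind. The markup below is purely hypothetical, my own contrivance rather than anything from Google's actual code: the "verbose" version fakes a button out of generic elements, while the leaner one lets native semantics do the talking.

    <!-- Verbose: a faked button that needs extra attributes (plus script
         elsewhere) just to behave like the real thing. -->
    <div class="button-wrapper">
      <div class="button" role="button" tabindex="0" aria-label="Share">
        <span class="button-label">Share</span>
      </div>
    </div>

    <!-- Slimmer: native semantics, nothing for the screen reader to untangle. -->
    <button type="button">Share</button>

A screen reader announces a button either way; the second version just gets there with far less for anyone, sighted or not, to wade through.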