I often wonder what backgrounds you, my readers, come from. I’m never really sure if a person reading this piece has ever seen a screen reader or used an accessible device. This month I want to reach out to those of you who have not. I’m going to talk a little bit about how I use technology and how it affects my daily life.
I want to give you an idea of my daily routine and show you how technology affects everything I do. I am not only an assistive technology specialist but I’m also a technology junkie. Tell me about a new piece of technology and I’m usually the first one to buy it.
But interestingly, there have been a few things with which this has not happened. I was the “first kid on the block” to get an accessible cell phone in the early 2000s. However, when Apple came out with the first iPhone to include VoiceOver, a free built-in screen reader for the iOS platform, I never bought one. When everybody and their grandmother were jumping on Facebook, I never joined. I swore I would never have a Facebook account, and I still will not get one.
I’ve only recently begun to use Twitter, and I’m starting to enjoy it for the rich content I can find. I guess you could say I’ve always been a person who wants to challenge myself, but not one who jumps onto the newest trends just because everyone else is. I’ve always found it interesting to play with things that are not quite ready for the market, knowing that whatever feedback I provide can influence the product in the future. As I get older, though, I’m finding learning new technologies more of a burden than the fun challenge it used to be. I’d like to give you this glimpse into how I do what I do, to help you understand the way I, as a blind person, think about and interact with technology.
When I wake up in the morning, I’m greeted by the traditional alarm clock. However, my clock speaks the time and greets me. I’m actually a very light sleeper and often check the clock in the middle of the night. Unlike a sighted person, I can’t just turn my head to do this. I need to reach over and physically push a button to get the bad news that it’s almost 4 A.M. and I haven’t fallen asleep yet. The other problem is that the friendly little voice telling me it’s 4 A.M. may actually wake my husband as well.
The only other technology I encounter in my morning routine is our burglar alarm. My alarm system has a loud, obnoxious voice that tells me the status of the alarm. When I reach to turn it off in the morning, it loudly says, “Disarmed; ready to arm.” As I open the door to let my guide dog Pecan out, it beeps loudly and says, “Back door.” I find this very reassuring because I know that whenever anyone comes in or out, it will tell me. I rely on this little voice to give me confidence and a sense of security (perhaps false).
From this point on in the day, I am all tech, all the time. Before I leave for the bus, I reach for my Android phone to see how long a wait I have. While I’m standing at the bus stop, I’m reading tweets and checking email, or just playing with the phone in general.
Here’s where the first difference between a blind person and a sighted person doing the same tasks becomes apparent. A sighted person reaches into their pocket and, in under 10 seconds, has the screen unlocked and the “next bus” app displaying their stop information. I sometimes take over 30 seconds just to unlock the screen. My husband is quite concerned that it takes me so long to unlock my phone; for me, this is just a fact of life. I can’t visually target the individual buttons I need to find, and when I do find one, I need to be sure my finger is resting exactly on the right spot before I lift it. Luckily, the new Android phone keyboard is more forgiving than the previous version, which was so tricky that I could rarely be certain even a single character would be entered correctly.
I don’t know how many times I’ve been confident that I had the right password, only to be told it was wrong. Since I’m very geeky and also have a braille display, sometimes it takes me a few moments just to get the keyboard up if the phone remembers I was last using the display. The Android accessibility suite allows me to use the keys on my braille display to enter text into the phone. However, even when the braille display is not paired with the phone, the phone remembers that it should use the braille display’s keyboard for text entry, and I need to tap a specific spot on the screen to switch back to the on-screen keyboard. Personally, I think this is a bug, but for now I deal with it. If a sighted person had to fight with their phone as much as I do, I don’t think cell phones would’ve caught on as completely as they have.

Once I reach work, I have a 50-50 chance of finding my computer ready to work the minute I sit down. Many mornings I reach for the keyboard hoping for the ever-faithful spoken greeting, “Press control alt delete to log on,” only to be met with silence. Sometimes it’s a simple problem, like the computer not being turned on, but more often than not it’s something as silly as my screen reader failing to authorize itself: it won’t speak because it doesn’t think it has a license to.
I’ve written many times about how copy protection blocks access to information for blind users. This is just one more example of that. The copy protection used by my screen reader keeps my computer from speaking: the software decides it has no license and goes silent. This is one of the biggest headaches I ever have to deal with in my work. I swear much of the gray hair on my head is caused by licensing problems and screen readers’ poor implementations of their licensing. When I ran the lab used for student exams on the Berkeley campus, this particular problem would rear its head at the most inconvenient times. More often than not, a student would be in the middle of an exam when their screen reader would shut off. The Berkeley campus is large and sprawling, with old buildings that were never meant to have modern technology infrastructure tacked on after the fact. Needless to say, the network connections couldn’t always be relied upon. The screen readers would check for a license frequently enough that a five-second break in network connectivity would disrupt the licensing.
But wait, you say, wouldn’t it just check again? No, sorry, it’s too late. The screen reader has now set itself to a “demo” mode, and if it doesn’t expire instantly, it will shortly thereafter. My students and I have lost hours of work to this type of problem. And we tolerated it and moved on, because without the screen reader we could not be students, and we could not be functioning, working individuals in society. I’ve just described something that happens to me regularly. I’m not going to say it happens on a daily basis, but it happens frequently enough to cause anxiety and distress. Recently my campus switched from a departmental support system to a centralized IT system. Before this, I was lucky enough to have my office right next door to the IT person; now the IT staff is based 15 minutes away on the other side of town. Previously, I could knock on the IT door and say, “My computer stopped speaking again.” Today, if I’m lucky, I can grab a person in the hallway and ask them to walk me through saving what’s on my screen, or perhaps they’ll tell me what that flashing error message I just can’t read says.
After hearing this, you’re probably thinking that I’m highly opposed to something like a shared service center for IT. But no, I’m not. When it comes to my basic configuration and support needs, which I share with every other person in my office, I’ve found the central service to be faster and more effective. Shared services offer a larger pool of individuals with diverse knowledge and backgrounds, so I’m more likely to get a person who knows how to solve the problem, instead of having to fight through it with a person limited to their own narrow set of experiences.
My needs are different, and my user profile is different from anyone else’s. I don’t need an IT person on call just for me; no one can afford that in this day of highly paid, skilled people. What I do need is assistance from an individual who doesn’t command a premium for a rare skill set, but who is affordable enough to be on hand when the need arises. But wait, you say again, this doesn’t happen often enough to justify having a person on hand. In the next few months I’ll write other articles describing many other instances where this person could be used, and how having such an individual may not only benefit me, but may also teach them skills that make them more marketable.
In summary, there’s a lot of technology around us that we all use, but some of us use it differently. You may have a clock; I also have a clock, but mine talks. As the simple things in life are modified for people with disabilities, they begin to normalize the experiences disabled people have.
As innovative new technologies flood the market, individuals with disabilities may benefit from them but struggle to use them. Product development always outstrips accessibility. We can only hope that accessibility eventually catches up, just as the talking clock has. I think I’m a highly productive individual who has accomplished a great deal during my time at Berkeley. However, given the state of assistive technology, I have spent a large number of uncounted hours struggling just to use the tools my coworkers take for granted. Statistics have shown that individuals with disabilities tend to stay in the same job their entire lives. A lot of this comes down to comfort zone and familiarity. I may be asking for a little bit of assistance just to get past these barriers caused by the very technology created to help me, but university and business administrations can be assured that, with a little extra effort put into improving tools and workspaces, I will still be there when many of my colleagues have moved on.