For as long as I can remember, I’ve always liked trying out new things. A new “toy” will keep me enthralled for hours. Over the past two months I’ve had a lot of new toys to play with: I’ve been doing a little more tinkering with the iPod they gave me and testing out Chrome OS for Google. Technology is expanding at a much faster rate than ever before, and I’m starting to feel it’s leaving me behind, so being able to jump in the pool and start testing things again is a lot of fun.
I’ve seen many of my friends use iOS extraordinarily well. Some of them just fly across that touchscreen. However, every time I sit down to try using one of these devices, I feel completely inept. Half the time I touch the screen, nothing happens, or the wrong thing happens. Just trying to pick an application from the home screen is a chore. I swear there must be something wrong with my fingers, since I can’t get these devices to work for me. Yesterday I sat down with someone who, admittedly, was sighted, and I described a few of the gestures needed to work through the application; he was able to get it to work as well as all those other blind friends of mine. When I tried it myself after he’d done it, I tossed the iPad down in disgust and went for dinner. My friend said, “Well, I can see what’s on the screen, which does help me a bit.” However, I don’t think that’s the case in this situation. Almost every blind person I know is using an iPhone and loving it. So why am I so different? Yesterday afternoon I tried another touchscreen device that was supposed to be accessible, but I failed with it quite astoundingly as well. Maybe some people just aren’t meant to use a touchscreen.
While playing with the Chromebook and Chrome OS, I’ve had a somewhat more successful experience. After all, it’s not a touchscreen. I’ve really enjoyed working with the system and learning how to use it. I’ve only cheated once or twice and asked Googlers how to do one or two things; otherwise, I’ve started to learn my way around the system on my own. It’s been a very interesting experiment for me. I’ve seen one or two improvements come in just in the short time I’ve been using the system. When I first got the unit, one or two of the buttons that were pretty key to the interface were not labeled, but I’ve seen these fixed very quickly. My overall impression is that the system is still very new and still very beta, but if they keep working at it, this could be a very useful product. I really enjoy using Docs in an interactive way. It’s kind of cool to know that somebody can add something to my document from wherever they are in the world while the document is open for me, and being able to chat about the document at the same time, accessibly, was pretty neat. I’m really interested to see where this goes in the future. I must say that at this point I’m still not considering it as a web browser but more as a productivity tool. Yes, I can browse the web with it, but it still takes too long. Perhaps as the product matures, or as I learn more about how to use it, I will start browsing more.
I’ve also had to start testing a few devices for work. I’ve been asked to evaluate different devices for scanning. It’s been a long time since I’ve actually had to play with multiple devices and compare them. I was rather chagrined to learn that some of these devices are still not being released any more accessibly than in the past. One of the devices had a completely inaccessible installer splash screen; thank God for the new OCR feature of JAWS, which let me get the thing installed long enough to realize that the product itself wasn’t worth using. At one point my boss sat down to work with one of the devices with me, and I explained to him that I really enjoy doing this kind of testing, though at times it does get frustrating. He said it was something that would drive him crazy, which is a very legitimate response. However, that’s why I’m the tester: I focus on how to fix the problem, not on the problem itself.