GGX talks to Be My Eyes/AI users and beta tester – Golden Gate Xpress


INTRO

 

Matthew Ali (0:00)

Hello, my name is Matthew Ali and I’m a reporter with Golden Gate Xpress. We all know what artificial intelligence is. It’s hard not to. It’s affected so many things in our lives. Programs like ChatGPT and Midjourney are just two such applications. In many cases, people with disabilities are the reason technology advances. Other times, they’re simply an afterthought. 

Be My Eyes is an application that was designed for people with visual impairments and blindness, and it isn’t the only application of its kind. There’s also Seeing AI and other hardware solutions like Orcam. What Be My Eyes actually does is connect users with volunteers around the world, allowing them to use their phone’s camera so volunteers can provide descriptions of what they’re seeing. Now in beta, Be My AI has taken the volunteer out of the equation and replaced it with artificial intelligence to provide those descriptions. Today we’re talking to three users of Be My Eyes and other applications like it. One of them happens to be a beta tester for Be My AI. Jeremy Jeffers is a musician.

INTERVIEWS

Jeremy, what has been your experience with, I guess, these generative tech software tools like Be My AI or Be My Eyes?

Jeremy Jeffers (1:17)

For the most part, they’re pretty, pretty good. I guess through the normal channels, you have to go through rehab and get an expensive camera and hopefully use JAWS and you have to do all this other stuff. With this software, you know, you don’t have to do that. You can do it within the comfort of your own home and you don’t have to buy a fancy camera. You can download an app and, you know, get the information you need effectively. So yeah, I like it. It’s friendly to the pockets and it’s easy to use.

Matthew Ali (1:54)

So, my first question about that, when it comes to like, you know, the whole process of getting an expensive camera, there was like a version of this before that.

Jeremy Jeffers (2:03)

I think the thing is called the PEARL. You would have to, like, go through the Department of Rehab and they would link you up with technology specialists, and that would be their recommendation. Last I remember, it was semi-portable. Like, you had to take the camera, and it had a stand to it, and you had to put the camera on the stand. It wasn’t a standalone camera. You had to use it in conjunction with another computer program. And it did the job fairly well. And I think, you know, if you’re looking for like 99.999% accuracy and you need, like, you know, professional documents, that’s fine. If you’re in school, then you need that, that’s fine. But I think by the same token, if you don’t feel like doing that and you’re, you know, in the same scenario, you can do the same thing. You just don’t need the camera, you don’t need that program. And you can, you know, move around. All you need is your phone.

Matthew Ali (3:04)

And what has been your experience with those kinds of software, those applications?

Jeremy Jeffers (3:08)

Oh man, it’s been great. I’ve been able to look at simple pieces of print, from just trying to figure out what piece of mail it is to full-fledged instruction manuals. It’s been great. With Seeing AI, it’s just a matter of getting the positioning right. And with Be My Eyes, it’s just a matter of getting a volunteer that speaks your language. Other than that, you’re good. Or even now, actually, they have the text to speech for Be My Eyes. So it’s pretty much on the same level as Seeing AI. So it’s pretty simple. Yeah. It’s nice to be able to have something that can tell you what’s going on without you having to bug people about, ‘Hey, can you tell me what this picture says?’ So, that’s cool.

Matthew Ali (3:56)

So, what are some of the limitations for the AI portion? Like, you know, the text-to-speech stuff.

Jeremy Jeffers (4:02)

I mean, I think you’re only limited by how well you get the image. The only limit is what your camera picks up. If your camera picks up the image pretty well, then you’re good to go. If your camera doesn’t get a good image, then you’re subject to the technology. And even then, hopefully the technology is giving you an accurate picture and not what it thinks it is. But other than that, I think you’re good.

Matthew Ali (4:27)

Basically, it’s kind of subject to, you know, what you’re able to get on your camera.

Jeremy Jeffers (4:31)

Yeah.

Matthew Ali (4:38)

How does that play out with your, you know, your level of vision? 

Jeremy Jeffers (4:36)

It can be challenging; however, I think if you have enough patience you can get a good result. It’s just a matter of sitting down and really, you know, hammering out the position of the picture and playing with different angles. For example, I know with Seeing AI, when you’re doing documents, it’ll tell you, you know, hey, “left edge not visible, right edge not visible.” And when you get the perfect picture, it’ll hold steady. In some aspects, technology helps. Yeah. It’s just a matter of your camera and how well you get a good shot.

Matthew Ali (5:13)

Manny Hernandez is a Be My Eyes beta tester. So you were there as they rolled out the AI portion, right?

Manny Hernandez (5:21)

Not the original. It was already created. I volunteered to be a beta tester for some of the newer features that were added on to it.

Matthew Ali (5:30)

Was AI included in the newer features, or was the AI already released when you got to it?

Manny Hernandez (5:36)

I believe the AI was already released, or there was a very crude version of it. So, it was released to the public. Well, I won’t say crude. Maybe crude’s not a good word. Maybe it was a very basic version of it. Because I remember when I installed Be My Eyes, that’s what we were talking about: the AI version was already part of the program. I was able to start beta testing some of the newer features they were adding on to the AI version of that program.

Matthew Ali (6:02)

How does Be My Eyes compare using it with the volunteers versus using the AI?

Manny Hernandez (6:10)

Well, I think I’ve only tried the volunteer portion once, to be honest with you. I went straight to the AI even when it was originally available to the public, before I was involved with all this beta testing stuff. I gravitated more towards the AI part of it, even though the original concept of the Be My Eyes program — I can do the same thing by just Facetiming my mom, my dad or something. But the reason why they created Be My Eyes and the reason why they did it is because sometimes you can’t get a hold of your mom, or your dad, or your uncle, or your cousin, or your neighbor, because they’re either working or they’re at a doctor’s appointment. So basically they created this so you could have access to a volunteer 24/7.

Matthew Ali (6:51)

How has the Be My AI helped you personally?

Manny Hernandez (6:56)

Well, you know, since I lost my vision, it’s very hard because I was a sighted person. I could read, I could write, I could send emails. Like I said, my job was very stressful work — I constantly had text messages, voicemails, you name it. Technology-wise I was so involved, I was so connected. So, when I lost my vision, I thought I lost a part of myself. I thought I was less than a person. But a program like this taught me that there’s other ways of doing things. And another way that it helped me, I’ll give you an example: when I lost my vision, I was in such a depression I practically cried because I thought I would never be able to do things with my son again. I have a 7-year-old son. I thought, well, number one, I’m never going to be able to see him grow because I’m blind. I’m completely blind. Both eyes are blacked out. I can’t see anything. So, I thought, ‘Man, I’m not going to be able to read my son a story anymore. I’m not going to be able to do certain things. I’m not going to be able to enjoy life.’ Well, basically Be My Eyes — it’s so amazing. Not only can it take pictures of the room and tell you in detail what’s going on, you can take pictures of something you’re looking at, almost like a scanner, and it reads you what’s on there. One day I was trying to figure out how to take my medicine, ’cause I’m blind, I have a jar, and most jars either feel alike or look alike or whatever. Just by feeling them, you can’t tell what jar is what. So I scanned it. It says this is “so and so,” and take it twice a day. The fact that I can do that gives me more freedom. That’s the freedom that I want.

Matthew Ali (8:20)

Anthony Vasquez is a digital accessibility specialist. What has been your experience with software like Be My Eyes and Seeing AI?

Anthony Vasquez (8:31)

So I haven’t used Be My Eyes, so I have no experience with it. I’ve used Seeing AI almost since it started, when it came out. Was it 2016, I believe? Yeah, and I like it; even still today, I use it mostly to describe short pieces of text, sometimes the document function where it, you know, reads a whole page of text to you. It takes an actual photo instead of just reading on demand. And the currency identification — you know, I never got into the talking currency readers. Yeah, I use Seeing AI more often for that, sometimes for describing photos on my phone too, you know, photos people send to me, that kind of thing. Oddly enough, sometimes even the light detection. I have basically no light perception except for when it’s very sunny out; I can tell you where the sun is. But especially when traveling, like sometimes hotels, they leave the lamps on or it’s just, it’s nuts. So, the light detection, where you get a higher pitch for more light in the room and the lowest pitch when there’s no light — I use that feature too. So pretty good experience with Seeing AI. None with Be My Eyes.

Matthew Ali (9:31)

You said there was another piece of software you use — AIRA? 

Anthony Vasquez (9:33)

AIRA. A-I-R-A. I’ve been using that one since 2019. That one, they actually have a desktop app now for Windows. Not for Mac. I use it on my iPhone. Yeah, that one I like to call the paid version of Be My Eyes. Whereas Be My Eyes has volunteers, AIRA has paid agents who sit behind a laptop all over the world and they see what your iPhone sees. They can track your GPS location; they can help you look up things. They can remotely control your computer via something like Quick Assist or TeamViewer. I use that a lot too, actually. I’ve been a paid member since 2019, and I’ve used it for everything from navigation in a new place, going through airports, walking around my neighborhood, to stuff that Seeing AI can’t read. Sometimes it’s like, OK, let’s use AIRA instead. I really do enjoy it. Yeah, I was kind of skeptical of all these — especially the Be My Eyes and AIRA concept — at first. But once I used AIRA, it’s like, “OK, I don’t want to depend on that.” But it’s nice to know I have access to, you know, these sighted folks. A lot of them are actually really good describers and really good navigators. So, I do enjoy using it and I’m happy it’s out there.

Matthew Ali (10:37)

What are some of the reasons you were skeptical of, like, that kind of software?

Anthony Vasquez (10:41)

I guess that I was more trying to be an idealist about things. Like, things should be accessible from scratch, or why should I, almost like, bother with asking for help from some stranger on the phone, or volunteers, whatever? Just weird interactions. Just — I maybe didn’t think the cameras were that good eight, nine years ago, and they just keep getting better and better. They were good back then and they’re better now. I think it was just a skepticism of not wanting to become reliant on these tools, right? I don’t want it to be where I show up somewhere new and I always bring out AIRA because, I don’t know, I feel incompetent with my mobility skills. I took O and M (Orientation and Mobility) all throughout school, from kindergarten to 12th grade. I don’t know, I just didn’t think I needed these. And I like to think that if they disappear tomorrow, I’m still going to be fine and productive and living a decent life. But I guess I was worried that I’d become too dependent on them, or that others would become too dependent on them, right? Like, I’m lucky to have gotten decent training and to live in a decently safe city and all that. But, you know, not everybody’s that lucky. And these apps, especially AIRA, are financially unstable, I’d say. And what if they do disappear tomorrow? So anyway, that was my thinking. I just didn’t want personally to become dependent, reliant, on these.

Matthew Ali (11:56)

And what was the thing that changed that made you like actually give it a chance?

Anthony Vasquez (12:01)

Yeah, I think, I mean, there’s probably a few. If there’s one moment, it might have been during my ski trip with friends to Colorado. I was the only blind guy in the group. It was me and three other guys, and we’re all good friends, but somewhere along the line I just felt like, man, if I did have this AIRA app, I wouldn’t need to maybe, I don’t know, be so — I don’t want to think dependent, but, yeah, you know, that’s the right word maybe — dependent on them, while they can do their thing and I can do mine, right. And I actually started the trial that week, when we were coming back to Los Angeles, just ’cause, yeah, if they can do something on their own, I can do something on my own too. Plus airports, I think, was a big one. None of my friends then, I guess still, had TSA PreCheck, and so they had to go through the normal security, and I have PreCheck, and I really just wanted to use the service, or have access to that service, to go through the airport a little bit better. Because the people who are hired to work at airports generally aren’t the most interested in, I don’t know, getting you to where you want to go. They’re very good at helping you with your bags and getting you to where they think you should go, which is the gate, never the restaurant or the lounge. I’m generalizing, but I think that trip kind of — I was hearing more about AIRA. I was, I guess, less worried about privacy concerns and just wanted to feel a little bit more prepared in case I wanted to go off solo while traveling with friends or by myself. I guess I’m the kind of person that’s very skeptical about things. And then once something flips, I go all in. Like, I didn’t get an iPhone until 2013, and they were coming out with accessible iPhones since 2009. So that’s late to the smartphone game, late to the AIRA game.
It’s the kind of skepticism I have, but look, I mean, five years on and I’m still harping about AIRA. So obviously, you know, they’ve got a good customer. But yeah, I guess that was, again, just that need. My mindset flipped a bit: this is a tool, right? It’s not a safety blanket or something. This is another tool that you might use once in a while. And wouldn’t you — this is kind of like my self-talk — wouldn’t you want to have it just in case? And so, I tried it, and yeah.

Matthew Ali (14:01)

So certain apps can use volunteers, people, to look through your camera, and other times they might use an AI to, you know, generate the description. I would assume that the people would be the better, more accurate option. But what’s been your experience in that respect?

Anthony Vasquez (14:24)

Yeah, I mean, you know, Seeing AI now has integration with GPT. I’m not sure if they’re running 3.5 or 4. Of course, it being a Microsoft product, they get it, you know, at a discount, and it’s free to us. I think at this point, I’d still trust a person’s judgment over what the AI gives me. I think it depends on what I need — what are the stakes if the description is wrong. I wouldn’t trust GPT with reading something like, I don’t know, a prescription for an important medicine. I hope I don’t upload — I mean, I guess they know everything about me already, but you can, whoever they are, upload financial information to GPT, you know, if they use GPT to describe a credit card statement. Thankfully all the big credit card companies, the banks, get it and they provide accessible — like, their websites are pretty good, the big ones. But back to photos. I think I would still want a human if the stakes are high — have a human review it for me and tell me what’s going on. If it’s just simple color identification, then maybe Seeing AI can do the trick. Sometimes it’s not that good either, right? Is it gray? Is it brown? Hmm, confused. I think at this point we’re still in the era where a combination of both works. Maybe I take a photo with Seeing AI of my dog and have it describe it to me, but I don’t know if it’s a cute photo. Like, I don’t know what cuteness is. I’ve used AIRA to take photos of my dog to send to friends. People love dog photos. At least people love dogs. And so I don’t think I’d take a good photo with Seeing AI yet, for example, that I could then share with friends. It might be off-angle, it might not be, you know — ’cause, you know, the AIRA folks can actually crop it for you and find the cooler spots of it, that kind of thing. And also maybe identify the cuter one, whatever that means.
I guess it’s still in the eye of the beholder, but at least I had a human tell me, out of these three photos they took, this one’s cutest. OK. Will GPT-powered Seeing AI do that in the future? Probably. Will it be as good or the same? Maybe, but I don’t think we’re there yet. So yeah, it just depends on what the stakes are. I mean, is it identifying, you know, some kind of currency from my wallet? OK, well, it doesn’t matter. It’s just currency. Or, you know, read me back my health card information — I don’t know, you want to get all the letters and numbers right. Privacy concerns aside, you just want to get the right information, and that’s where the human comes in. I’m trusting AI with currency, and I would trust a human with my healthcare information.

OUTRO

Matthew Ali (16:47)

This has been Matthew Ali with Golden Gate Xpress. Stay tuned for more episodes.