Using the smartphone hands-free
Low vision assistive tech advice on hands-free smartphone use.
Vision Australia's Senior Adaptive Technology Consultant David Woodbridge talks with Stephen Jolley about the latest technological developments from a blind and low vision perspective.
This edition, David notes:
Vision Store Product Minute: Arkon Smartphone Desktop Stand. One of the main ways I can tell if a product I use is amazing is if I keep using it over a number of years, like the Arkon Smartphone Desktop Stand. I can put my smartphone at a range of heights and positions. Whether I'm on a video call, demonstrating a product online, videoing a product demo or using the smartphone camera to do various things like OCR, this stand delivers what I need.
Smart AI apps for use with the smartphone camera - both for Android and iOS.
Be My Eyes Assistant, Envision AI, Seeing AI
Camera for iOS gives scene info. Oddly, the Camera app on Android doesn’t have scene info.
Android
Lookout
BlindShell Classic 2
Envision AI
iOS 18 VoiceOver Live Recognition: door detection, people, point and speak, text.
Obstacle Detector app for checking the distance of objects.
Summing up: the challenge in using all of these smartphone solutions is using the device hands-free while still being able to control the apps.
Possible solutions: a chest harness to hold the phone in portrait mode, a headset with mic for iOS Voice Control/listening to the app, and a small keyboard that can be used with one hand, i.e. the RIVO keyboard.
00:21 S1
Hello everyone. Welcome to Talking Tech. This edition available from July the 2nd, 2024. I'm Stephen Jolley. Great to have you with us, listening maybe through Vision Australia Radio, associated stations of RPH Australia or perhaps the Community Radio Network. There is also the podcast. To catch that, all you need to do is search for the two words, talking tech, and download. It usually comes out on a Tuesday afternoon, just after it's been produced. Another option is to ask your Siri device or smart speaker to play Vision Australia Radio Talking Tech podcast: Vision Australia Radio Talking Tech podcast.
With me, someone who can explain all this tech stuff really well. Vision Australia's national advisor on access technology, David Woodbridge. David, let's start with our Product Minute, a product from the vision store of Vision Australia.
01:11 S2
Indeed. So this is called the Arkon Desk Stand - and Arkon's actually [spells it]. It's an extremely adjustable stand for a smartphone. So you've got the base, which is made out of metal. Then you've got this telescopic pole that actually extends out, I think, up to a metre if you pull it right out; normally it's about 30cm. And then the arm that comes out which holds the smartphone has got these little articulated arms that you can adjust in all different directions, because each of the little arms, of which there are three, has got little ball joints. So you can sort of rotate them around. So in effect you can have your smartphone in any position you like, whether it's portrait, landscape, upside down, whatever you like.
So it's a really good stand for when you want to do things like video conferencing or optical character recognition and so on. My wife uses hers in her craft room for Facebook and so on. She uses it so that on her main laptop screen, on her Mac, she can see what she's doing on YouTube and Facebook, and then she just uses the camera on her smartphone to point down at what she's doing for crafting. So it does have multiple uses. It's a well-known stand, and it's been around for a long time, so if you're after a really good desktop stand for a smartphone - as in one you put on your table - it comes in really handy. So again, it's the Arkon Desktop Smartphone Stand.
02:44 S1
How much does it cost?
02:45 S2
It's about $225. There is an attachment, which we don't sell, for an iPad, but you can always get them from, you know, overseas or Amazon.
02:53 S1
The Arkon Desktop Stand from the Vision Store at Vision Australia. There was chatter last week about X. We used to call it Twitter. Its accessibility disappeared. It's come back. But tell us the story.
03:09 S2
This was actually quite bizarre. So early last week, um, there was a thread on AppleVis saying, you know, you can't read tweets properly anymore. You only get the name or the handle of the person who's tweeting, and you don't get anything else, like the text of the actual tweet. So everybody started whinging about it and complaining about it on Twitter, of course. And then everybody was thinking, I guess, or at least I was, like, Oh God, here we go again. It's going to be like this all over again. We're going to have to actually fight our battle. But lo and behold, 48 hours later...
So let's say from Wednesday to Friday last week, a new version popped up. So that was like a 0.01 release or whatever it was, and the accessibility for VoiceOver came back. And again, I think the release notes just said, you know, released for bug fixes, your typical sort of boilerplate statement. But it was fixed. So I don't know whether, you know, somebody saw it on Twitter and said, Oh dear, here's the blindness community rightly getting annoyed again, or whether there's still some software engineers at Twitter or X - because remember when good old Elon Musk took over, he got rid of lots of people, including the accessibility team for Twitter/X. So everybody thought, Well, here we go. This is going to be the end of days as we know it.
But yeah, as I just said, lo and behold, it works again. So just make sure that your app is completely updated and then you should be fine with VoiceOver again. But I think it was just remarkable how fast that bug was fixed.
04:44 S1
So for those who use X, things are back to normal, or as they were. Smart AI apps with the smartphone camera - let's have a talk about that today. Explain what that is, and then we can go through the various products that are available.
05:02 S2
Yep. So you might remember last week we spoke about three main wearables: we spoke about ARx Vision, OrCam and Envision AI. Now there are different sort of versions of those for your smartphone. So rather than wearing your, you know, your smart eyewear, this stuff, as far as apps are concerned, is actually built into the smartphone, i.e. both Android and iPhone. So what we're really talking about is similar functionality to the wearable hardware, but as apps running on your smartphone.
05:34 S1
Let's start then, with what's available through both iOS and Android.
05:40 S2
So the first one that I always love to mention, of course, is the Be My Eyes AI function. And that's where you can take a photo, and for me it's a lot more extensive than these sort of automatic scene detection apps. This one really goes through extensively what it sees in the photo, and then of course you can also interrogate it and ask questions about the photo. It actually said my desk had coffee stains on it, which I was horrified by. So I went and quickly wiped it up, and then I ran it again and it said, Nope, I didn't detect any coffee stains, so that was really good. But no, it actually does work really nicely. And as I said, Be My Eyes, for both Android and iPhone, works really nicely. And also it's very, very quick.
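[For developers curious about the describe-then-interrogate pattern David describes - send a photo, get a description, ask follow-up questions - here is a minimal Swift sketch of how an app might post an image and a question to a vision-language model. The endpoint URL and JSON field names are hypothetical placeholders, not Be My Eyes' actual API.]

import Foundation

// Send a JPEG and a question to a (hypothetical) vision-language model
// endpoint and return the text answer it produces.
func askAboutPhoto(jpegData: Data, question: String) async throws -> String {
    struct Answer: Decodable { let text: String }

    // Placeholder URL; a real app would use its provider's documented endpoint.
    var request = URLRequest(url: URL(string: "https://example.com/v1/describe")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: [
        "image_base64": jpegData.base64EncodedString(),
        "prompt": question  // e.g. "Are there any coffee stains on the desk?"
    ])

    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(Answer.self, from: data).text
}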
06:26 S1
It's my go-to place for getting descriptions of photos.
06:30 S2
Yeah, absolutely. Same here.
S1
The next one, Envision AI.
06:35 S2
Yeah. So this has similar functionality to what we talked about last week with the Envision smart glasses. So you've got things like, you know, text detection, scene detection, currency detection, all that sort of normal stuff. It's a very robust app. Now, I also happened to see in beta testing last week that there's a new one coming out besides the Envision AI app, called Envision Assistant, and what it is, it's a similar thing to what the Be My Eyes AI does: you take a photo and then you can use AI to interrogate the photo.
And I've been having a look at the Beta version of it, and it does work very nicely. And I think again, it's based on ChatGPT. So that's something else to look forward to, coming up in the next couple of months, or maybe even less.
07:25 S1
Tell us about Seeing AI, which has been around a while.
07:28 S2
Yes. So this is a really famous one from Microsoft. And you'll remember that it does things like short text, so instant text, document reading, barcode scanning, scene detection, face detection, handwriting detection, colour detection and currency detection. So it does everything in the toolbox, and it actually works really well. So it does expand the functionality of both Android and iOS very dramatically. So that's another one of my favourite toolkit apps when I'm talking about AI all the time.
08:02 S1
What about scene detection across both platforms?
08:06 S2
It really does get a bit wonky, because I think last week when we talked about the hardware, I did leave this to the end of the discussion when I was talking about all the features, because I did say scene detection is not robust enough. It's based on what the camera is seeing, it's based on light and everything else. And then it has to put that image against other images that it's previously had a look at and then come out with a result. And I always find it's about, oh, 85 to 90% accurate.
But I noticed on an app that I haven't mentioned on the program, because it's not very stable, there was a warning on it: please do not take the information that this app gives you as being true, because images are affected by lots of different processing issues. So I thought that was a really, really good warning, and a good thing to keep in mind in general.
08:57 S1
And is that on both Android and iOS? Yes. Mhm. Tell us about an Android one called Lookout.
09:05 S2
So this is from Google itself, and it is almost like Google's version of Microsoft's Seeing AI app. So it basically does everything that the Seeing AI app does. Now, there's one little thing that's a bit of an unknown feature in Seeing AI that people might have forgotten about. You can say, I want to find something, and then you determine what you want to find, such as, you know, your keys or something. So you literally take a photo of your keys and then you say, These are my keys. And next time it's looking for your keys, it knows what your keys look like.
Well, there's sort of a similar version in the Lookout app called Find, except it's a bit more generic. So when you say I want to find something, it brings up a list of things like door, chair, table, cup and so on. And then you choose one of those particular items so it knows it's got to look for a cup, and then it will tell you how far away it is when it sees that particular item, in this case a cup. So it's not as flexible as the Seeing AI one, where you can either choose generic ones or customise it, but it still works really well. So Lookout to me is, let's say, 95% of what the Seeing AI app can do.
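[As a rough Swift illustration of the "find a cup" idea: classify each camera frame with Apple's Vision framework and report when the chosen label shows up. Lookout's actual pipeline is Google's own and not public, this only sketches the detection half (the distance readout would need depth data, as in the sketch later in this episode), and the confidence cut-off is an arbitrary choice.]

import Vision

// Report when a target label (e.g. "cup") is seen in a camera frame,
// using Apple's built-in image classifier.
func look(for target: String, in frame: CVPixelBuffer) throws {
    let request = VNClassifyImageRequest()
    try VNImageRequestHandler(cvPixelBuffer: frame).perform([request])

    if let results = request.results as? [VNClassificationObservation],
       let match = results.first(where: {
           $0.identifier == target && $0.confidence > 0.6  // arbitrary threshold
       }) {
        print("Found \(match.identifier), confidence \(match.confidence)")
    }
}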
10:20 S1
Let's look forward to iOS 18, which comes to the public late September, probably. What have you been finding there with your Beta exploration?
10:32 S2
This stuff is actually pretty amazing, what Apple is doing. So all the stuff I'm about to talk about was originally in the Magnifier app. It's now part of the VoiceOver rotor, and the whole sort of menu on your rotor is called Live Detection. And underneath Live Detection you've got things like door detection, scene detection, the point and speak function for using touch screens, people detection, and of course text detection. So what happens is you use the rotor to rotate around to Live Detection, you flick down to the one you want, such as door detection, and you double tap it; it then turns it on.
But then what you've got is a four-finger triple tap that then turns that live detection on or off. So depending on what function you've got running, in this case, you know, door or whatever else, it then only detects that particular function. The only one that's missing for me - and this is why I put it in the show notes list - is object detection, where it actually uses the camera to tell you how far away an object is, whether it's within, you know, arm's reach, within about 1 or 2 metres, or further away. So I think if Apple added that to the Live Detection rotor, they'd practically have everything covered.
And what I do find with the live detection, very quickly, is that the scene detection on all these ones tends to babble a lot. So it keeps saying window blind, window blind, window blind. But because you can quickly toggle it on and off, you just turn it on when you want; as soon as you've got the information you want, you turn it off again with that four-finger triple tap and off you go. So it's getting very, very flexible, what VoiceOver can do in the iOS 18 beta.
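[The missing piece David mentions, object distance, is something apps like Obstacle Detector get from the camera's depth sensing. Here is a minimal Swift sketch, assuming a LiDAR-equipped iPhone or iPad, that reads the ARKit scene depth at the centre of the frame; it is illustrative only, not the code of any app mentioned in the episode.]

import ARKit

// Read the depth (in metres) at the centre of each camera frame
// on a device that supports LiDAR scene depth.
final class DistanceReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("Scene depth not supported on this device")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

        // Depth values are Float32 metres; sample the centre pixel.
        // (A real app would throttle this rather than print every frame.)
        let rowPtr = base.advanced(by: (height / 2) * rowBytes)
        let metres = rowPtr.assumingMemoryBound(to: Float32.self)[width / 2]
        print(String(format: "Object ahead: %.1f m", metres))
    }
}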
12:17 S1
This is very nice, all this stuff, but ergonomically it can be a bit challenging. A lot of us walk along with only one hand available, because the other hand might be holding a cane or the harness of a dog. How do we get around this? Because you need to be able to manage the phone.
12:34 S2
The first thing is you've got to have a thing to put the phone in. So normally it's a chest harness. My chest harness that I've got for my smartphone does portrait, so up and down, or landscape. So that's number one. Number two, which I haven't found to be that dramatically fantastic, is using Voice Control to drive VoiceOver to then navigate the smartphone screen on iOS. So things like, you know, flick left, flick right, magic tap, double tap. That gets very laborious when you're trying to use an object recognition type app. The third one, which is what I always use on mine, is a little keyboard that I've had for a long time called the RIVO keyboard. It's about the size of a credit card - of course a little bit thicker because it's got keys on it - and I can use it one-handed.
All I really need to do with that keyboard is press four and six to go left and right, and I press five to select, and a few other little keys. So basically I can use about 5 or 6 keys to completely navigate VoiceOver. Of course, the only thing it hasn't got for the iOS beta yet is that toggle to turn the live recognition on and off. But besides that, I can completely control my iPhone from this little keyboard.
13:46 S1
Well, there's a lot more with AI and the camera that we'll talk about in the future, and I look forward to that. It's going to be interesting. Yep. Before we go, a reminder of where there are details of what we've been talking about in this and previous editions of the program.
14:00 S2
Indeed. So as always, you can check out my blog site, which is davidwoodbridge.podbean.com.
14:06 S1
davidwoodbridge.podbean.com. To write to the program?
14:12 S2
You can write to me at Vision Australia where I work, which is David Woodbridge - how it sounds - at Vision Australia dot org.
14:19 S1
davidwoodbridge@visionaustralia.org ... this has been Talking Tech, with me has been Vision Australia's national advisor on access technology David Woodbridge. I'm Stephen Jolley - take care. We'll talk more tech next week. See you.