Audio
Apple and Humanware upgrades
Expert, life-experienced reviews of latest tech developments for people with blindness or low vision.
Vision Australia's Access Technology Manager Damo McMorrow (pictured on this page) talks with Stephen Jolley about latest tech from a blindness and low vision perspective.
This edition:
- With Apple software updates released last week, Damo explains highlights of the iOS 18.2 enhancements for iPhone and iPad;
- With video description now possible through multiple apps, Damo describes a comparison he made between the performance of Seeing AI and PiccyBot;
- We have news of significant upgrades to the capabilities of Humanware's Mantis Q40 and Brailliant Braille displays.
Browse online for the range of products available in the Vision Shop.
We welcome questions or feedback - please email us.
00:08 S1
Hello everyone. Welcome to Talking Tech. This edition available from December the 17th, 2024. I'm Stephen Jolley, great to have you with us listening through Vision Australia Radio, associated stations of the Radio Reading Network or perhaps the Community Radio Network. There is also the podcast. To catch that, all you need to do is search for those two words, Talking Tech, and it can all come down, usually on a Tuesday afternoon just after it's been produced. Another option is to ask your Siri device or smart speaker to play Vision Australia Radio Talking Tech podcast. Vision Australia Radio Talking Tech podcast.
With me: Vision Australia's national access technology manager Damo McMorrow. Hey, Damo.
00:52 S2
Hi, Stephen. How are you doing?
00:54 S1
Very good. Well, it's been a big week with upgrades to products, devices, etc. We were anticipating some Apple updates last Tuesday. There was a late hitch with the release: a version came down on the Tuesday, and the final public version came down on the Thursday. So that's when we got our iOS 18.2 for the iPhone and the iPad. Let's talk about it.
01:25 S2
Yeah, it's quite a significant update and a very much anticipated one. One of the things that there has been much fanfare about has been the Apple Intelligence feature. That's a sort of Siri Mark 2, if you like, or a much improved Siri, but it does a number of other things as well. With that, the Siri voices seem to have changed slightly. The Australian male one sort of sounds bored and rather sulky. But anyway, that's an aside. They've also now increased Siri's capability significantly, to the point where, for example, if you ask it something that it doesn't have in its own databases, it will say, Do you want me to use ChatGPT for that? And you can also say to it, ask ChatGPT about, you know, whatever it might be, and you'll get a much longer answer.
You won't get that sort of, Here's something I found on the web that might possibly help, which used to be the standard response when it didn't know something. There's that feature. You can get much more accurate travel information and those sorts of things from it. It's really just worth playing around and experimenting, but I would be very surprised if people didn't notice a massive difference in terms of Siri.
02:48 S1
I think that's the point, isn't it, just to play with it and you never know what you might discover. I asked her to tell me how far it would be by car from where I was to a particular address in Melbourne, and it came straight back with it and it was pretty accurate. So I don't have to go to Google Maps or those sorts of little pathways to find out those sorts of things now.
03:09 S2
You're absolutely right. And it's one of those things. I mean, I sort of had almost stopped using Siri for a lot of things because it just used to frustrate me, whereas now I'm actually using it much more than any of my smart speakers, because I'm getting better responses from it.
03:25 S1
Let's talk about email.
03:27 S2
Yes. So there's a couple of significant changes here. Part of the Apple Intelligence system, if you like, is that it can summarise content. So when you're going through an email thread, it will give you a summary of what each email is about as you're flicking through them. And it does the same in other things as well, so it'll give you summaries on your lock screen of your different notifications and things. But the other one that people will notice is a significant change to the layout of the Mail app. What happens now is that there's a bunch of filters at the top of the screen, and there's things like Primary, Promotions, Transactions and a couple of other categories.
And so what it tries to do by default is divide your mail up into categories, which is very smart and quite interesting, but it will throw you off a little bit if all you want is your email the way you've always had it. Now there are a couple of ways you can get it back to that point. One is, as you go through the filters, there is a More button, and there's an option there called List, which will set it back to how it used to be. Or you can just deselect any of those filters... it'll default to, I think, Primary or something like that, and if you double tap to unselect that, you'll get your mail in that same sort of format that it used to be in, where you'll just see everything.
05:01 S1
I found it convenient just to press the More button near the top of the screen - it gave you a list of things, and you just select List and away you go.
05:08 S2
Yeah, that's probably the easiest way to do it. I mean, it is quite clever in that you can probably easily skim through and identify all of those Boxing Day sale and Black Friday sale type emails that everyone's likely to get in their inbox at this time of year, those sort of promotional things. And obviously the Transactions filter will identify things like bills and all of those sorts of things in an easy way. So it is quite impressive the way that it does that. But it's nice that you can still have your email looking and feeling the way it always has.
05:42 S1
There's a thing called Image Playground and also a Genmoji feature. So it's doing a lot with images really, isn't it?
05:51 S2
Yes it is. So Genmoji allows you to generate your own emojis, for those people that use them. I have to say, I'm not one of them. I've never really gotten into that, other than when I communicate with my daughter... because, I don't know, she likes it. And Image Playground allows you to use a text description and have it generate an image for you based on that. I'll have to ask it, what happens if I request A Horse with No Name? But you could, you know, ask for a horse without a tail or a dog with two heads or something like that, and it would be able to generate an image for you.
06:35 S1
Why you would want to, I'm not sure. But it could do some very useful things too.
06:39 S2
Yeah. Yeah. Absolutely, absolutely.
06:42 S1
So a lot to explore in iOS 18.2, which has come to the iPhone and to the iPad. Let's move on now to experimentation you've been doing this week with videos and video description.
06:59 S2
Yes. And the thing that prompted this, actually, was that I managed to get my boat out last weekend. I've had it since August, and just about every weekend it's been either too windy or too rainy. But I managed to get the boat out last weekend and went fishing, and I had my Ray-Ban Meta glasses, and I took some video while I was reeling in fish, and also just some video of standing in the boat and looking around at the ocean. And that prompted me to think about, you know, being able to get the videos described.
And there are two ways of doing it. Seeing AI now has that ability - it's part of the Scene channel in Seeing AI. For a while we've been able to get descriptions of photos, or take a picture and get a live description. But now you can give it a video from your camera roll, so you can select a video, I think, up to ten minutes in length, and you're allowed ten videos per day that you can have described. So the idea is you point it at the video, and it'll sit there and do the normal Seeing AI kind of processing noise that it does for a couple of minutes. And then what it'll do is start playing the video in short snippets, with descriptions read out in between those snippets.
So, for example, in the video that I tried it with, you would hear the sound of the water lapping against the boat and my friend saying, Oh, just bring your rod tip down a little bit. And then the description would say, A man in a blue shirt reeling in a fish, and then you'll hear a little bit more audio and it'll say, Another man reaches over and grabs the end of the rod, you know, out of the water or whatever it might be. And then it described the fish, and it described my friend Jim trying to get the hook out of its mouth and so on. So it's quite nice in that it gives you sort of an almost real-time description.
And the nice thing about the way that works, of course, is that if I was editing that video, I could probably use that description to roughly line it up and go, Oh, hang on, I've got 15 seconds here where I was looking down, or I was looking up at the boat canopy rather than at the fish or whatever - I can edit that out. So that was part of what prompted that interest. The other way of doing it is using an app called PiccyBot, and it's spelt P-I-C-C-Y. It is a paid product to get a lot of the features. It does it in a slightly different way, in that it processes the video and then just gives you a description, but it will use certain cues in the video to say, at this point in the video, you know, There's a splashing noise as the man throws the fish back over the side of the boat, for example. But the description is not interlaced with the audio, if that makes sense.
10:10 S1
So it actually takes notice of what it sees as well as what it hears spoken in the...
10:16 S2
That's right, that's right. So they are two very different sorts of experiences in terms of having your video described. But I think they are probably equally valuable in different circumstances.
10:28 S1
Very interesting. The two apps you were using: there's Seeing AI, which is a bit of a Swiss Army knife really... it really is, isn't it? And the other one, PiccyBot. PiccyBot. So have a play with those if you're interested in video description.
10:44 S2
Yeah, absolutely. It's just nice to be able to have a go at taking a video, I think as a blind person and then having a rough idea of what the result was.
10:52 S1
It's been a big week for Humanware. People like to get their stuff out before Christmas, I think. And Humanware have announced some upgrades to their Braille display products, the Mantis Braille display and the Brailliant series, the BI 20X and the BI 40X. Tell us quickly about those devices.
11:13 S2
Sure. So the Brailliant BI 20X and BI 40X are their refreshable Braille displays that have the Perkins-style keyboard; obviously, the 20X is a 20-cell display and the 40X is a 40-cell display. And then we have the Mantis, which is the 40-cell Braille display equipped with a QWERTY keyboard.
11:35 S1
The Brailliants have had text to speech for some time. In this upgrade, that feature went to the Mantis.
11:42 S2
It did, which is really quite significant for Mantis users, because that's a feature that I think a lot of people have wished we had for quite a long time. So yeah, the whole family now has... text to speech built in.
11:57 S1
And there's an interesting package of other features that have gone to both devices.
12:00 S2
Yes. So there are a number of interesting new features. We now have access to Wikipedia and Wiktionary directly from the device. That obviously requires an internet connection, but it does mean that you can look things up directly from your Brailliant or your Mantis without having to first pair it to your phone or your computer. And there is also an onboard dictionary which you can access offline. So even if you're not connected to the internet and you just want to quickly look something up, you can do that from the device as well.
There are also some significant audio improvements, so you can now pair your device to either Bluetooth headphones or a Bluetooth speaker, or even Bluetooth hearing aids. So those that use hearing aids may actually now benefit more from the text to speech, because you can have it piped directly into your Bluetooth hearing aids. And there's also now support via Bluetooth for the Apple TV as well. So definitely quite a significant upgrade. And you can do that upgrade online: just connect your device to Wi-Fi and you can download the update that way. You don't have to send it back or anything to get the upgrade.
13:14 S1
That Apple TV implementation... that's a big one, isn't it?
13:18 S2
It will be. I haven't had a chance to try it yet on my Apple TV, but I do have a Brailliant here. So that's on my list for this coming weekend to experiment with that, because I think it'll be quite good.
13:28 S1
So that's the Mantis and the Brailliant series, both of those products from Humanware - Braille displays with major upgrades this week. Before we go, a reminder that you can find details of this program and previous editions by going to VAradio.org/talkingtech. VAradio.org/talkingtech. And to write to the program...
13:52 S2
You can email me, Damo. Damo McMorrow - M C M O R R O W - at Vision Australia dot org - and thanks to everybody who has been doing that.
14:04 S1
damo.mcmorrow@visionaustralia.org ... This has been Talking Tech, with me has been Vision Australia's national access technology manager Damo McMorrow. I'm Stephen Jolley. Take care. We'll talk more tech next week. See you.