Audio
Meta Ray-Bans, JAWS updates, and goodbye Living Blindfully
Expert updates on assistive tech developments for people with vision impairment.
Vision Australia's Access Technology Officer Damo McMorrow talks with Stephen Jolley about the latest tech developments from a blindness and low vision perspective.
In this edition:
- Explanation of the Meta Ray-Ban smart glasses and the partnership with Be My Eyes;
- Major upgrade to the Humanware Prodigi low vision product;
- Major upgrade to Hartgen Consultancy's JAWS-related Leasey product;
- A more detailed look ahead to the annual upgrades to the Freedom Scientific products JAWS, Fusion and ZoomText;
- Acknowledgement of the final edition of the Living Blindfully podcast from Jonathan Mosen.
00:33 S1
Hello everyone. Welcome to Talking Tech. This edition available from October the 1st, 2024. I'm Stephen Jolley. Great to have you with us, listening maybe through Vision Australia Radio, associated stations of the Radio Reading Network or the Community Radio Network. There's also the podcast. To catch that, all you need to do is search for the two words talking tech, and down it'll come, usually on a Tuesday afternoon just after it's been produced. Another option is to ask your Siri device or smart speaker to play Vision Australia Radio Talking Tech podcast. Vision Australia Radio Talking Tech podcast.
With me, Vision Australia's national access technology manager Damo McMorrow. Hey, Damo.
01:15 S2
Hi, Stephen. How are you?
01:16 S1
Very well, and lots to chat about again today. Let's start with smart wearables, and I'd like to ask you in particular about the Meta Ray-Ban smart glasses.
01:27 S2
Sure. So these are made by Ray-Ban, so they look, for all intents and purposes, like a regular set of sunglasses. Ray-Ban have partnered with Meta, who is the company that is responsible for Facebook, WhatsApp and Instagram, those sorts of platforms. So these glasses have a small camera in the sort of top left-hand corner of the frame. The right-hand arm of the glasses is a touch-sensitive area that you can use for adjusting volume, and that type of thing. The speakers you get the audio from are built into the arms of the glasses, so they're not bone conduction, it's just that they sit sort of at the ends of the arms very close to your ears. They pair to your phone via Bluetooth.
They were really designed originally for the kind of Instagram and Facebook market. So the idea being that people could take point-of-view type photographs and post them straight to their Instagram feed or Facebook. But these things have some nice features that we as blind people can use. So they use Meta's own AI engine, and so you can do various things with that. You can use them like a set of Bluetooth headphones, as it were, so you can play your music through them. You can send messages, that type of thing, by voice, but you can also ask it various questions, so you can say things like, Hey Meta, look and tell me what you see... Hey Meta, look and read me any text... Look and tell me if there is any signage... Look and tell me what colour this shirt is... Look and tell me what colour pants this would go with... all of those kinds of things.
03:24 S1
You use them yourself, don't you?
03:26 S2
I do, yes. Also, I should mention they do come in a few different frame styles and different lenses, so you can get polarised ones, or you can just get the sort of plain dark lenses. I have the Ray-Ban Meta Wayfarers, which are a sort of a trapezoid-shaped lens. My wife has the Headliner, which are more the traditional sort of round glasses, and then they have another one called the Skyler, which I haven't seen yet, but it's a different-shaped lens again. I got them really just to experiment with. I mean, they are a mainstream piece of tech. They're around about sort of $450, and they're available from your standard sunglasses type shops. And I just sort of wanted to see what I could do with them.
I liked the idea that it was, you know, perhaps a mainstream bit of tech that we could use. It doesn't do all of the things that things like the Envision or the ARx Vision can do, in that when you try to read text, it will want to summarise it rather than reading it in its entirety. But certainly for environmental scanning and that kind of thing, I find it quite good. I was in Melbourne the other week and... was using it to just help me find some shops. I don't come from Melbourne, I'm based in Brisbane, and I was trying to navigate the Block Arcade in Collins Street in Melbourne, and, you know, just being able to use it to help me identify shops and things was actually quite helpful.
04:51 S1
Now with these glasses, Meta have recently gone into partnership with Be My Eyes. Now, Be My Eyes are one of the organisations who have support people or agents available from anywhere in the world, as volunteers, who will explain things to you that you might want to know about, like where the battery you dropped on the floor went to, etc. But this is an interesting partnership, isn't it?
05:16 S2
It is, and it's probably one of the biggest news stories in assistive tech of the week. So yes, they are partnering officially with Meta. And the idea is that once the feature is released, you will be able to say, Hey Meta, call a volunteer on Be My Eyes... and the volunteer will then be able to interpret things through your glasses' camera rather than the camera built into your phone. So if you, you know, needed some hands-free navigation or something like that, or if you were cooking and you wanted to know, Hey, does this steak look like it's done? What have I got the stove set to? That kind of thing. You know, all of those sort of visual interpretation tasks that we would normally use our phone for. It just gives you a nice hands-free option, you know, if you were navigating through an airport or something like that.
Aira also have an implementation in beta as well. So Aira is a similar kind of visual interpretation service, but it's not free, it's a paid subscription, whereas Be My Eyes utilises volunteers. Unfortunately, the Aira implementation at this point in time is a little clunky, and it's using WhatsApp as its video call engine, whereas the partnership between Meta and Be My Eyes means that it's seamless and accessible through the app. So it is quite an exciting development, I think.
06:38 S1
Is that available in Australia yet?
06:41 S2
No. Not yet. It's only just been announced in the last couple of days, I think Wednesday or Thursday last week. But I get the impression that it's certainly not far away. It is in Beta. I have heard that there are people testing it and that kind of thing. So I don't think it'll be far away.
06:58 S1
Let's turn to Humanware now and Prodigi. There's been a major update.
07:04 S2
Some of you may be familiar with the Humanware Prodigi magnifiers, the old Prodigi Duo or the Prodigi Connect, which was their tablet-based solution. They've now released the Prodigi software for Windows, and the idea of this is that you can run it on a Windows laptop or tablet and have a document camera connected, and it can serve as a portable magnification slash OCR solution. So as a student, for example, you might use it to OCR a worksheet, or you might just choose to magnify it if there were graphics on it. Depending on the camera that you have connected, you can also use it for distance magnification, and then potentially again OCR that and convert it to spoken text.
And they have it in a few different versions. You can download the software as a 14-day trial, which is nice, and you can get it as software only, or you can get it as a bundle with a camera, or with a camera and a Microsoft Surface laptop. So yeah, it provides a nice portable option for students or anyone needing a portable magnification and OCR solution on the go.
08:20 S1
Mm. Very good. Tell us now about Hartgen Consultancy and, amongst the work that Brian Hartgen does, the Leasey product.
08:30 S2
So Hartgen Consultancy is headed up by Brian Hartgen, who's probably one of the world's most prolific JAWS scripters. And they do a range of different products, including things like scripts for the StationPlaylist broadcasting suite, scripts for the Twitter website, scripts for Zoom and Teams, and they also do a bunch of online training courses. Leasey is a productivity tool that bolts onto JAWS, and it provides a whole range of different options. There are a couple of versions of it. There's one that sort of functions like a menu system for people who are very new to computer use.
And then there's the Leasey Advanced product, which allows you to do all kinds of things. You can do things like have a whole bunch of shortcuts: for example, if there are different blocks of text that you frequently use in a report, you can set those up. You can call up any of your web favourites from anywhere. You can use it for posting to various social media platforms from anywhere. You can listen to radio stations in the background while you're working. There's a whole bunch of sort of time-saving things that they've introduced, and the latest build addresses some issues with things like the VLC media player. If you do an Insert+T in JAWS 2024, which normally gives you a window title, it'll tell you the status of the window as well if you quickly press that key twice. It is a product that's constantly evolving, and, you know, they generally have an update ready as soon as there's a new version of JAWS available.
And yeah, it really does offer a lot of... productivity improvements and just ways of getting to information with a couple of keystrokes.
10:16 S1
Leasey, l-e-a-s-e-y. We'll put the address for Hartgen Consultancy in the show notes. You mentioned JAWS 2024. JAWS 2025 is not far away, along with the other Freedom Scientific updates, Fusion and ZoomText. Tell us about that.
10:33 S2
Sure. So... there are some quite exciting new features in JAWS slash Fusion. Probably the biggest one that comes to mind is the FS Companion, which is an AI search type tool that allows you to ask questions about JAWS commands, Microsoft products, you know, how to do something in Outlook or Excel, for example. It is limited to sort of JAWS use, Microsoft product use, that kind of thing, so you couldn't ask it for a recipe for chocolate muffins. But if you wanted to know how to, for example, sort something in Excel, or how to use the OCR feature in JAWS or something like that, you can do that. You can ask it questions in that sort of conversational style, and it will respond and give you, you know, a number of different pieces of information.
So it's a good way to... search for, you know, commands, particularly given that things like the Microsoft Office suite seem to change every five minutes at the moment. And, you know, so there's always new commands or slightly new ways of doing things. The other big one is the continuous OCR feature. We've had Convenient OCR for a while, which enables you to scan or OCR things like PDFs that have started life as a scanned document, or controls on your screen that are perhaps not labelled. But this allows you to do continuous OCR, so that as the screen changes, it continues to perform OCR and drop the results into a virtual viewer so that you can look at them. So if you're trying to navigate screens that are not accessible, it provides a really good way of doing that.
As usual, there's a whole range of other bug fixes and performance improvements and that kind of thing. Likewise, with ZoomText, they've made some changes to the multi-monitor modes in response to user feedback. They've also made quite a few tweaks under the hood that improve things like start-up time with ZoomText and so on. Definitely some quite exciting improvements coming.
12:37 S1
So that'll be out from the end of October, the upgrades to those products, ZoomText, JAWS and Fusion from Freedom Scientific. Now hats off and thank you to our friend and colleague Jonathan Mosen, because over the weekend he dropped the final edition of the Living Blindfully podcast. Episode 305, it was. That includes the Mosen At Large podcasts that were available for many years as well. Great service to the blind community of the world, with lots of product demos, conversation and discussion of blindness issues. It's going to be missed.
13:22 S2
Absolutely. It's something that I listen to very regularly. It's a great source of, as you say, demos, but also some really good discussions around rideshare refusals and the behaviour of airlines when it comes to assistance dogs, and all of those kinds of things. So hats off to Jonathan. He's done a fantastic job and provided a really well-produced and informative podcast.
13:45 S1
Those podcasts will stay around for some time, if you haven't caught up with them yet and downloaded them. And I'm sure we'll hear from Jonathan from time to time as he moves into his new career phase with the National Federation of the Blind in the United States.
Now, just before we go, a reminder of where there are details of what we've been talking about in this and previous editions of Talking Tech: you can go to VA Radio dot org, slash, talking tech. VA Radio dot org, slash, talking tech. And to write to the program...
14:17 S2
People can send an email to Damo, that's Damo dot McMorrow, m-c-m-o-r-r-o-w, at Vision Australia dot org. I always enjoy receiving your emails.
14:29 S1
damo.mcmorrow@visionaustralia.org ... This has been Talking Tech, with me has been Vision Australia's national access technology manager Damo McMorrow. I'm Stephen Jolley. Stay safe. We'll talk more tech next week. See you.