Audio
Is AI a blessing or a curse? Dr Scott Hollier (part 2)
Part 2 of a recent roundtable address on AI and its significance for people with print disabilities.
This series from Blind Citizens Australia looks at the organisation's work and issues surrounding fair access for people who are blind or have low vision.
In this episode: part 2 of a presentation from the May 2024 Round Table on Information Access for People with a Print Disability.
Dr Scott Hollier, CEO of the Centre for Accessibility Australia, delivered this presentation on artificial intelligence, AI, and its use by blind and vision impaired people, and those with other print disabilities.
Our thanks to the Round Table for allowing us the use of this material.
New Horizons is produced in the studios of Vision Australia.
Pictured on this page: Dr Scott Hollier
Speaker 1 00:08 (Program theme)
It's up to you and me to shine a guiding light and lead the way / United by our core...
Speaker 2 00:17
We have power to pursue what we believe...
Speaker 1 00:23
We'll achieve the realisation of our...
Speaker 2 00:29
Hello and welcome to this week's episode of New Horizons - and this week we bring you part two of the presentation by Dr Scott Hollier from the recently held Roundtable on Information Access for People with a Print Disability. Once again thank you to the Roundtable for the use of this material. And this week Scott talks about the future of artificial intelligence and what it means for people with print disabilities.
Speaker 3 00:51
So what other things do we think we can do in the print disability context that could help? You know, is it possible that AI could change things on the fly to help us? Well, even the AI of today could absolutely achieve a lot of things. In the group I'm a part of, the Research Questions Task Force connected to the W3C, we are currently putting together international guidance to provide some support in this area, because the AI of today can potentially do a lot. There are certainly, as we've seen already, some areas where auto-generated alt text may be helpful, if not entirely what we need, but it can be helpful, and live captioning can be helpful.
But also, there is no reason that we couldn't have AI scan the page and go, yep, that text is meant to be a heading, but the person just made it bold text instead, so screen readers and other assistive technologies can't get to that. So I will change that to be the heading it's meant to be in code with an H2 or using a style in Microsoft Word, for example. There's no reason that AI can't look at the color contrast in our pages and go, yeah, that's really not great, let's fix that on the fly, let's push out the color palette to make it work.
And when we have links which aren't descriptive, if we get links like click here or read more, and that's all the information we have, there's no reason that AI can't follow those links, figure out what that website is, come back to our page and change that link to be more descriptive. We have the AI technology to do all those things now. So why can't we have, for example, a web browser that figures out what the best generative AI process is for each of those tasks and changes them on the fly for us as we browse along the web? It might see cluttered text and go, yep, we need to spread that out, we need to make that bigger. So, we have the technology of today to do that, but it's all a bit piecemeal at the moment in terms of what's being developed by whom, and the race is on for consumers to embrace AI.
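The on-the-fly repairs described here, promoting bold text that was visually posing as a heading into a real heading, and flagging links whose text gives assistive technology nothing to go on, can be sketched even without AI. This is a hypothetical, purely heuristic illustration in Python; a real AI-driven tool would use far richer signals, and every name in it is invented for the sketch.

```python
import re

# Link texts that tell a screen reader user nothing about the destination.
VAGUE_LINK_TEXT = {"click here", "read more", "more", "here"}

def promote_bold_headings(html):
    """Turn <p><b>Text</b></p> (a visual-only heading) into a real <h2>."""
    return re.sub(
        r"<p>\s*<b>(.*?)</b>\s*</p>",
        r"<h2>\1</h2>",
        html,
        flags=re.IGNORECASE | re.DOTALL,
    )

def find_vague_links(html):
    """Return link texts that aren't descriptive, such as 'click here'."""
    texts = re.findall(r"<a\b[^>]*>(.*?)</a>", html,
                       flags=re.IGNORECASE | re.DOTALL)
    return [t for t in texts if t.strip().lower() in VAGUE_LINK_TEXT]

page = '<p><b>Opening hours</b></p><p><a href="/hours">click here</a></p>'
repaired = promote_bold_headings(page)
vague = find_vague_links(page)
print(repaired)
print(vague)
```

An AI-based repairer would go one step further than this sketch: instead of merely flagging "click here", it would fetch the link target and rewrite the text to describe it.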
As such, some of these benefits that we see are still taking some time to come together, certainly coming together in a package where we could make those changes on the fly. We don't see that happening any time soon, and we still strongly advocate for the implementation of the Web Content Accessibility Guidelines, or WCAG. I'm not sure what WCAG came up as in my live captioning. Usually it's wicked egg or water cake. But we did last year in October get the new version, WCAG 2.2, and we strongly advocate for that to still be implemented. But if people do miss the alt text, could AI step in and fix it? As we've seen, the answer is to some degree, but it's only going to get better from here. And we should see more developments, I think, as people bring together these inclusive technologies to provide that support.
03:41
So, what about the actual language that we use? How effective is AI in helping us with that? Well, again, it's been an interesting time for new developments internationally. There was a new plain language standard last year, ISO 24495-1. If you're not familiar with it, I strongly recommend having a look. And it really focuses on maximising the reach of our language to a lower secondary reading level. The standard is something that looks at making sure that we use common words, that we define important words, that we make really clear use of tense, and that double negatives are avoided. And we also want to make sure that nested clauses are cut down as much as possible. And things like metaphors and similes can often trip up our language.
I realised that this was an issue just last year, when we had an Italian exchange student who was staying with us. She asked what we were doing the next day, and I said, basically, we're going to head down to Freo, go to Rotto, we'll be back in the arvo, and then we'll catch some Maccas. And I realised that none of that sentence made sense to her. So, it does emphasise the importance of plain language. But what about from AI? We've seen a lot of work in AI helping us to convert information into plain English to maximise the reach of our words, and also Easy English, specifically supporting people with cognitive and intellectual disability. And at our work we also maybe take the odd sentence here and there, stick a complex sentence into our generative AI program and see if it helps. And sometimes it does. But can we just grab an entire document, throw it in, take its output and say, mission complete?
Well, let's put that to the test. So, here I have a nursery rhyme that I think we're all fairly familiar with, Mary Had a Little Lamb. So... Mary had a little lamb, its fleece was white as snow, everywhere that Mary went, the lamb was sure to go. So, a big thing that AI is touted to be a solution for is when we want to translate into different languages. So, what happens if we translate this into a different language and then translate it back to English? A lot of the demos, especially the demos this past week, have focused quite heavily on the fact that we can have real-time live communication across multiple languages, and our AI programs will largely take care of that.
So, let's see what happens to this if we translate it to another language and then back to English. So, what we have is: Mary owned short lamb. Okay, that's, you know, ballpark. It snowed sheep hair. I think we have an issue with that one. And then: where Mary walked, lamb returned. So, look, that's abridged, but I think lines one, three and four didn't do too badly. Certainly, the nursery rhyme has largely been kept intact. But what happened with it snowed sheep hair? So, why did we end up there? Well, I assume that it went to fleece, and then when it said white as snow it figured, okay, snow, white, fleece, and assumed that that's all one thing.
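The round-trip test performed here, translate out, translate back, and compare against the original, is a simple quality check anyone can automate. The sketch below uses Python's standard `difflib` to report word-level drift; the `fake_translate_out` and `fake_translate_back` functions are invented stand-ins (the talk doesn't name the translator used) that mimic the kind of loss seen with "white as snow".

```python
import difflib

def round_trip_diff(text, to_foreign, to_english):
    """Translate text out and back, then list the words that changed."""
    back = to_english(to_foreign(text))
    drift = [
        op for op in difflib.ndiff(text.split(), back.split())
        if op.startswith(("-", "+"))  # keep only removed/added words
    ]
    return back, drift

# Toy stand-ins: the "translator" collapses the simile, as in the demo.
def fake_translate_out(text):
    return text.replace("its fleece was white as snow", "it snowed sheep hair")

def fake_translate_back(text):
    return text  # identity: the damage was already done on the way out

original = "Mary had a little lamb, its fleece was white as snow"
back, drift = round_trip_diff(original, fake_translate_out, fake_translate_back)
print(back)
print(drift)
```

With a real machine-translation service plugged in for the two stand-in functions, the same harness flags exactly the metaphor-and-simile failures the talk describes.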
And this comes back to when we use metaphors, similes and things like that, which do trip up generative AI. So, when we are doing our language conversions, our document conversions, again, these tools are helpful. But, you know, we do need to be really aware that these tools still have some way to go. So that's really a bit about what we can do on the fly with AI at the moment in real time. But what about tools that can help us to make decisions ourselves? So there's a lot of tools, and these tools are rapidly improving, that can help us to check how accessible our content is. So for example, if we want to check whether our document is accessible in Microsoft Word, go to the Review tab and, a few icons along from where the spell check is, you'll see a Check Accessibility button.
07:43
Well, likewise, there's lots of great tools online. One I've got on my slide here is referencing WAVE. So WAVE is another automated testing tool for the web. You can install it into your Chrome browser or you can go to WAVE.webaim.org. And it lets you put in a web address and analyse that page against the Web Content Accessibility Guidelines. So these tools are quite helpful in being able to figure out if accessibility issues exist and what we can do to fix them. So how good are they? Well, they do vary in quality, but even the best ones as of today, with all the AI power behind us, can only check about 40% of the WCAG standard. And with that in mind, how good are the results?
Well, if you just want to find out if you're missing alt text on an image, it's pretty good. Sometimes they'll pick up links that aren't descriptive, like ones which just say, click here. But for a lot of things, it really struggles. And also, it quite often can have false positives and false negatives. Sometimes you'll check a page in one tool and it says it's fine; you check it in another tool and it says it's not, depending on what type of AI engine is being used. However, I think this is one area where we will start to see improvements as our AI gets going. So although AI may not be, I guess, ready for prime time in terms of our daily use, it might be useful in some cases, like our live captioning.
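The "missing alt text" check singled out here as the thing automated tools do well is also the easiest to see in code. This is a minimal sketch using Python's standard `html.parser`, not how WAVE or any named tool actually works; it covers exactly one failure pattern, which is the point, since even the best checkers reach only around 40% of the WCAG standard.

```python
from html.parser import HTMLParser

class MissingAltScanner(HTMLParser):
    """Collect the src of every <img> tag that has no alt attribute at all."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            # alt="" is legitimate for decorative images; an absent alt is not.
            if "alt" not in attr_map:
                self.missing.append(attr_map.get("src", "(no src)"))

page = '<img src="logo.png" alt="Company logo"><img src="chart.png">'
scanner = MissingAltScanner()
scanner.feed(page)
print(scanner.missing)
```

Judging whether existing alt text is actually meaningful, rather than merely present, is the kind of question that today's automated tools struggle with and where the talk expects AI to improve.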
But there's still limitations there. But some of these tools, while again probably not in a place where we can rely on them, are already providing useful information. And I think this is an area where we will see great improvements going forward. So the last thing I would like to share is actually not as print disability related, but I think it's a good indication of where some of the issues sit going forward. So generally, although AI is not quite up to the task for us yet, I am optimistic that it will be something helpful. In some cases, it's already helpful to a point. And I do tend to be an optimist, and I look forward to where it might take us in the future. But there are some really significant things we need to consider.
And so this is an example where I think we should have a look at it. Those familiar with CAPTCHAs, the squiggly text that tries to figure out if you're a bot or not, or those tick-the-traffic-lights type scenarios, will be very aware that people with disability often get put in the bot category because those tests are so inaccessible. And this is a concern when it comes to other elements of AI. I'm going to play a short video just to explain a feature that is now in iPhones, which supports people who are nonverbal or losing the ability to speak, and some benefits of the technology there. So I'll just play a short snippet of this video, and then I'll explain the implications.
Speaker 1 10:37
Apple this week showed off new generative AI tools that can actually help humanity. But you may not have heard a lot of talk about Apple's AI because Apple doesn't really use the term AI. Apple likes to use the phrase machine learning. There are several new features coming to the iPhone and iPad that can give assistance to people living with disabilities. And one feature that is getting a lot of attention is the option to have the iPhone speak in your voice. It learns how to sound like you after about 15 minutes of training. The idea is that this can help people who have trouble speaking or people who may be losing their voice to a disease, like ALS. Reporters who've heard a demo have told me the artificial voice sounded just like the real person.
Speaker 3 11:24
So that's in iPhones now... and for people who are non-verbal or losing their voice it's been a really great tool. People can either sample their voice before they lose it or can select a synthesised voice, and can make phone calls. And for many people it was the first time they were able to make phone calls. We've done some research just recently, which we launched last Thursday on Global Accessibility Awareness Day, where we tested 44 SIM providers as to how easy it is to get out of a phone plan, and discovered that the majority will not let you cancel a phone plan unless you make a phone call.
So the importance of being able to make a call and being able to have speech is very important, especially in that particular scenario. So it's great that there is this tool that now allows people who are not able to easily use speech to type into their phone and have this voice used as their speech. At least it was, until recently: a lot of the automated systems around the world, with banks and others, where we call up and it's press 1 for this, press 2 for that, have now been set up for scam alerts, and as soon as they detect that something is a robotic or automated voice they immediately shut it down.
So this is where we end up in the tug-of-war of AI. We have great benefits that can support people with disability, but then, because of privacy and security concerns, some of these benefits are taken away. And unfortunately, as much as I am enthusiastic and supportive of where AI can go in the future, and as much as I see a lot of direct relevance and excitement in supporting people with disability, the arguments for privacy and security taking away some of these benefits are already in play, and I am concerned that we'll continue to see this arm wrestle between technological innovation going forward and security and privacy concerns pulling it back.
Speaker 2 13:19
Dr Scott Hollier there, and his presentation to the Roundtable on Information Access for People with a Print Disability. If you'd like to know more about the Roundtable, printdisability.org is the website, printdisability dot org. If you'd like to know more about Scott Hollier and the Centre for Accessibility, you can have a look at accessibility.org.au, accessibility dot org dot AU. If you'd like to get in touch with Blind Citizens Australia, you can call 1800 033 660, 1800 033 660, or of course you can email bca@bca.org.au ... BCA at BCA dot org dot AU. I'll talk to you again next week.
Speaker 1 13:59 (program theme)
We'll achieve the realisation...
Speaker 2 14:07
Of a...