WEBVTT

1
00:00:03.530 --> 00:00:21.179
David Sloan: Welcome to the Stage of Accessibility podcast from Vispero. This is episode 18. I'm David Sloan, Chief Accessibility Officer, and each month I'm joined by a guest or two to discuss a topic of interest and relevance to the practice and profession of digital accessibility.

2
00:00:21.310 --> 00:00:25.650
David Sloan: And for this episode, the April 2026 episode.

3
00:00:25.790 --> 00:00:36.999
David Sloan: We're returning to a topic that we've addressed in several previous podcasts, and really, honestly, which we can't ignore for any length of time, given the rapid pace of change.

4
00:00:37.000 --> 00:00:54.240
David Sloan: Yep, it's artificial intelligence again. And joining me this month to discuss what's changed in AI's impact on digital accessibility efforts is a core member of Vispero's Knowledge Center, Principal Technical Writer, Ricky Onsman. So, welcome, Ricky.

5
00:00:55.210 --> 00:00:57.440
Ricky Onsman: Thank you, David. Lovely to be here.

6
00:00:57.910 --> 00:01:02.949
David Sloan: Yeah, and thank you for getting up early to join us for this conversation.

7
00:01:03.300 --> 00:01:16.750
David Sloan: I know you've been tracking the impact of AI on digital accessibility for a while now, and it sounds like you have been since your very first effort, once you joined TPGI and Vispero.

8
00:01:17.220 --> 00:01:35.810
David Sloan: So, talk through what's piqued your interest, how you've been tracking and reporting on progress in AI from an accessibility perspective over recent years, and you could include, kind of, defining the different types of AI that are making a difference to how we build and make sure that digital content's accessible.

9
00:01:36.870 --> 00:01:37.570
Ricky Onsman: Sure.

10
00:01:37.730 --> 00:01:52.229
Ricky Onsman: Just for perspective, since a lot of people won't know exactly what the Knowledge Center is: what we do is provide… we write the guidance for TPGI's

11
00:01:52.490 --> 00:02:07.370
Ricky Onsman: Accessibility engineers, so we work on the rules engine that people use to assess and audit whether, websites and apps are meeting accessibility requirements, and we provide guidance to our clients.

12
00:02:07.400 --> 00:02:12.519
Ricky Onsman: on how to address specific accessibility issues. So…

13
00:02:12.840 --> 00:02:20.329
Ricky Onsman: Artificial intelligence comes very strongly into our, kind of, area of work.

14
00:02:20.570 --> 00:02:34.410
Ricky Onsman: And in fact, I've been working for Vispero for TPGI for five and a half years, and I realized the other day the first thing I actually did for TPGI was write a blog post on artificial intelligence.

15
00:02:34.780 --> 00:02:42.459
Ricky Onsman: I reviewed that the other day, and I actually now, five and a half years later, I would write a very different article.

16
00:02:42.670 --> 00:03:00.249
Ricky Onsman: It was, it was very positive, and I remain positive about artificial intelligence, and I think we should, make that clear from the start. I think it has enormous potential, in society in general, but very specifically in how we approach digital accessibility.

17
00:03:00.470 --> 00:03:06.169
Ricky Onsman: I think it has been a hard slog in the world of digital accessibility.

18
00:03:06.290 --> 00:03:15.750
Ricky Onsman: And there are a lot of ways that artificial intelligence can help us to achieve our aims in making the digital world a more accessible place.

19
00:03:16.310 --> 00:03:17.620
Ricky Onsman: Having said that.

20
00:03:17.900 --> 00:03:29.090
Ricky Onsman: I think one of the things that has probably changed in the way I think about artificial intelligence is, first of all, we've got to stop calling it artificial intelligence.

21
00:03:29.610 --> 00:03:32.520
Ricky Onsman: For one thing, it's not very intelligent.

22
00:03:33.120 --> 00:03:38.019
Ricky Onsman: It's a terrific tool, but a lot of it is just automation.

23
00:03:38.230 --> 00:03:42.630
Ricky Onsman: And it's not… doesn't necessarily show a great deal of insight.

24
00:03:43.450 --> 00:03:45.860
Ricky Onsman: As an example of that.

25
00:03:46.340 --> 00:03:53.680
Ricky Onsman: When I ask a generative AI chat model to…

26
00:03:55.380 --> 00:03:59.239
Ricky Onsman: Let's say I have a set of code here.

27
00:03:59.380 --> 00:04:04.949
Ricky Onsman: And, it's got a bunch of divs and spans in it. How do I make them accessible?

28
00:04:05.080 --> 00:04:11.080
Ricky Onsman: And what it will tell me is to use ARIA labels to make it accessible.

29
00:04:11.490 --> 00:04:16.450
Ricky Onsman: And… What it doesn't do is question my basic premise.

30
00:04:16.550 --> 00:04:31.700
Ricky Onsman: It doesn't say, why are you using divs and spans? You should be using semantic HTML. It just answers the question that I put to it, but it doesn't make me question myself, and it doesn't make me think about what I'm doing.

31
00:04:32.020 --> 00:04:34.869
Ricky Onsman: That's a real problem, because then we just…

32
00:04:35.200 --> 00:04:42.589
Ricky Onsman: perpetuate the same issues, instead of rethinking things. Now, that would be intelligence.

33
00:04:42.710 --> 00:04:46.660
Ricky Onsman: And I don't see a lot of evidence of that in what

34
00:04:47.100 --> 00:04:54.030
Ricky Onsman: Claude, ChatGPT, and the other artificial intelligence chat models do.

35
00:04:54.210 --> 00:04:58.939
Ricky Onsman: And they are… almost sycophantic.

36
00:04:59.140 --> 00:05:08.519
Ricky Onsman: In that they will respond in a lovely conversational tone: "Oh, that's a very good point, you've raised a terrific thing there." But they don't make me think.

37
00:05:09.450 --> 00:05:14.429
Ricky Onsman: So, I think one of the things there is we've just got to find a different name for it.

38
00:05:14.710 --> 00:05:15.740
Ricky Onsman: And,

39
00:05:15.890 --> 00:05:22.280
Ricky Onsman: I guess one of the things that comes out there is that we should stop referring to it altogether.

40
00:05:22.470 --> 00:05:26.370
Ricky Onsman: I think a good model for that is what happened with Google Translate.

41
00:05:26.820 --> 00:05:31.930
Ricky Onsman: In 2016, Google Translate moved to an artificial intelligence model.

42
00:05:32.110 --> 00:05:35.189
Ricky Onsman: Its accuracy rose by 80%.

43
00:05:35.370 --> 00:05:45.860
Ricky Onsman: And nobody thought about it. Nobody questions it. Now, you use Google Translate, and you don't think, I'm using artificial intelligence. You're just using Google Translate.

44
00:05:45.940 --> 00:06:00.040
Ricky Onsman: What drives it, what powers it, is almost irrelevant. The question is, does it do a good job? If it does a good job, fantastic. And that's how we've got to kind of incorporate artificial intelligence into what we do with digital accessibility.

45
00:06:00.140 --> 00:06:11.710
Ricky Onsman: So that when it helps us write code or address issues, we don't think about it being artificial intelligence. It's just a tool that's helping us to do our job.

46
00:06:12.120 --> 00:06:13.390
Ricky Onsman: Does that make sense?

47
00:06:13.700 --> 00:06:16.660
David Sloan: Absolutely, yeah, and it also helps

48
00:06:17.190 --> 00:06:20.120
David Sloan: Move us on from this kind of pressure to be

49
00:06:20.220 --> 00:06:26.380
David Sloan: using AI, quote-unquote, without really understanding

50
00:06:27.360 --> 00:06:31.910
David Sloan: what you mean, and what for? You know, it's just… it's almost like a…

51
00:06:32.020 --> 00:06:38.290
David Sloan: A bandwagon, or a, you know, something that you feel like you have to be doing, and have to be showing that you're doing without

52
00:06:38.760 --> 00:06:40.829
David Sloan: Really having the context of

53
00:06:41.190 --> 00:06:58.260
David Sloan: where can I… where can I make my work more efficient? Where can I make it quicker and reduce errors, or whatever? So, almost taking the… taking the underlying technology out of the equation and just thinking, what are the tools that I could be using that I'm not using now, will stop people feeling

54
00:06:58.510 --> 00:07:03.760
David Sloan: I don't know, I don't know where to go, or… or I… or worse, kind of.

55
00:07:04.170 --> 00:07:09.989
David Sloan: Turn in the opposite direction and kind of walk away from something that could help them

56
00:07:10.320 --> 00:07:15.060
David Sloan: do their job more… more effectively. You know, putting aside the

57
00:07:15.210 --> 00:07:18.979
David Sloan: for now the, you know, the reasonable concerns around

58
00:07:19.630 --> 00:07:24.330
David Sloan: AI in terms of the, sort of, economic and the environmental

59
00:07:24.690 --> 00:07:30.499
David Sloan: aspects, but I do like the idea of just, like, let's get rid of the whole

60
00:07:31.480 --> 00:07:33.900
David Sloan: phrase, and talk about the tools that

61
00:07:34.080 --> 00:07:38.660
David Sloan: Absorb it and make use of it, and then maybe it becomes more…

62
00:07:39.000 --> 00:07:44.050
David Sloan: In a way, more accessible to people who just need… they're always looking for tools

63
00:07:44.320 --> 00:07:53.769
David Sloan: to do their job better, and I guess in the accessibility profession, that is a place where we're always looking at tools that can automate things that we trust will

64
00:07:54.000 --> 00:07:59.620
David Sloan: Do a better job than we could do on our own, and cut down on errors, and cut down on time.

65
00:08:00.820 --> 00:08:06.849
Ricky Onsman: Yeah, I think that's right, and I mean, I think there are two main drivers in that context.

66
00:08:07.060 --> 00:08:13.440
Ricky Onsman: One is that it can help us to carry out tasks in an automated way.

67
00:08:13.600 --> 00:08:15.779
Ricky Onsman: That we can rely on.

68
00:08:16.130 --> 00:08:30.199
Ricky Onsman: And the other is that it can replace certain things that we do. So, in the latter context, what I'm thinking of is things like generating alternative text for images.

69
00:08:31.080 --> 00:08:35.139
Ricky Onsman: Even five years ago, the job that it did…

70
00:08:35.289 --> 00:08:41.850
Ricky Onsman: was not very good, because what it did was… it did the same thing that inexperienced

71
00:08:42.070 --> 00:08:51.979
Ricky Onsman: accessibility engineers do when assessing the purpose of an image. They look at the image and say, it is this. It's a dog sitting at the base of a tree.

72
00:08:52.080 --> 00:09:00.330
Ricky Onsman: And AI would generate alternative text that says, dog sitting at the base of a tree. It doesn't tell you why that image is there.

73
00:09:00.920 --> 00:09:17.850
Ricky Onsman: Nowadays, they're actually a lot better. Gemini, for example, has become very, very good at understanding the context of the image, why it is there, and telling you the dog is sitting at the base of the tree because it's hunting truffles.

74
00:09:18.060 --> 00:09:24.190
Ricky Onsman: You know, and that comes from the surrounding text. It understands the context better.

75
00:09:24.650 --> 00:09:29.469
Ricky Onsman: If we can get to the point where we don't have to think a lot about

76
00:09:29.570 --> 00:09:40.350
Ricky Onsman: what we want to put in alternative text, because we can rely on artificial intelligence to do it accurately, that's going to make our job a lot easier. And we know, you and I both know, that

77
00:09:40.540 --> 00:09:59.460
Ricky Onsman: alternative text for images is one of the top issues that exist. We see the WebAIM Million report every year, and it's not getting better. And the same issues keep coming up that account for 90% of the problems that exist in accessibility on the web and in apps.

78
00:09:59.580 --> 00:10:12.059
Ricky Onsman: And one of those is that images don't have alternative text at all when they should have, or that the alternative text doesn't convey what the image is actually supposed to convey.

79
00:10:12.320 --> 00:10:22.100
Ricky Onsman: So if artificial intelligence can do that for us in an automated way, and we can trust it without having to think about it, then that works really well.

80
00:10:22.600 --> 00:10:35.099
Ricky Onsman: Having said that, at this stage, I would not trust any AI to do that job for me. What I would like to trust it to do is make suggestions that I check.

81
00:10:35.970 --> 00:10:47.140
Ricky Onsman: So, it comes up with, this is an image of… and I say, no, you're missing the point here. That's not what the purpose of this image is. The purpose of this image is this.

82
00:10:47.260 --> 00:10:49.839
Ricky Onsman: And it then carries out that task for me.

83
00:10:49.980 --> 00:10:57.649
Ricky Onsman: Now, that's a step in the right direction. It's not total automation, because we can't rely on it with full confidence.

84
00:10:57.900 --> 00:11:01.479
Ricky Onsman: But it is a step towards making our jobs easier.
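
The "suggest, then check" workflow Ricky describes can be sketched as below. The `suggest_alt_text` function is a hypothetical stand-in for a model call, not a real API; the point is that the human reviewer has the final say.

```python
# Human-in-the-loop alt text: the AI drafts, the human confirms or overrides
# before anything is written into the page.
def suggest_alt_text(image_path: str, surrounding_text: str) -> str:
    """Stand-in for a model call that drafts alt text from context."""
    return f"Image related to: {surrounding_text[:40]}"

def review_alt_text(image_path: str, surrounding_text: str, human_review) -> str:
    """AI proposes a draft; the reviewer may accept it or rewrite it."""
    draft = suggest_alt_text(image_path, surrounding_text)
    return human_review(draft)

# Usage: the reviewer rejects a literal description and supplies the purpose.
final = review_alt_text(
    "dog.jpg",
    "Our truffle-hunting dogs at work",
    human_review=lambda draft: "Dog sniffing at the base of an oak, hunting truffles",
)
print(final)
```

It is not total automation, as Ricky says, but it moves the human effort from writing every description to checking a proposed one.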

85
00:11:01.720 --> 00:11:10.480
Ricky Onsman: And that applies as well to using artificial intelligence in content management systems, to say, you haven't put in alternative text for this image.

86
00:11:10.930 --> 00:11:12.169
Ricky Onsman: What should it be?

87
00:11:12.360 --> 00:11:14.280
Ricky Onsman: To make us question ourselves.

88
00:11:14.800 --> 00:11:21.190
Ricky Onsman: And to act like a checklist of things that you need to do. And artificial intelligence is good at checklists.

89
00:11:21.310 --> 00:11:25.949
Ricky Onsman: It's good at coming up with things like, have you done this? Have you done this? Have you done this?

90
00:11:26.270 --> 00:11:29.740
Ricky Onsman: But it's still up to us to carry out the tasks.

91
00:11:29.860 --> 00:11:34.379
Ricky Onsman: And it requires some human intervention, at least at this stage.

92
00:11:34.480 --> 00:11:38.339
Ricky Onsman: Frankly, I don't see that changing a lot over time.

93
00:11:39.650 --> 00:11:54.420
Ricky Onsman: An example of where we put too much trust in artificial intelligence is the dreaded overlays, which rely on the idea that you can, for a very low cost, install a widget on your website,

94
00:11:54.540 --> 00:12:00.970
Ricky Onsman: press a button, and it will solve all the accessibility problems, because it is AI-powered.

95
00:12:01.380 --> 00:12:05.809
Ricky Onsman: And that is just, it's basically almost snake oil.

96
00:12:06.040 --> 00:12:13.109
Ricky Onsman: A single push-button response to all accessibility issues on a website or an app

97
00:12:13.350 --> 00:12:15.829
Ricky Onsman: is a long, long way off.

98
00:12:16.210 --> 00:12:18.950
Ricky Onsman: But people are using them now.

99
00:12:19.150 --> 00:12:24.069
Ricky Onsman: And that's a problem, because it ends up causing more accessibility issues, not fewer.

100
00:12:24.600 --> 00:12:26.760
David Sloan: Yeah, and that's… this is a…

101
00:12:27.650 --> 00:12:41.989
David Sloan: a theme that… I attended an AI and accessibility webinar last week that Accessibility.com hosted, with Mark Shapiro and Gerard Cohen talking about advances, and one of the things that, I think it was Gerard, said

102
00:12:42.490 --> 00:12:51.679
David Sloan: was that there are companies that look at AI as a way, oh, we can use this for our accessibility strategy, and we can fire all our accessibility team.

103
00:12:51.810 --> 00:12:59.429
David Sloan: And then a more mature way to look at AI is we can empower our team to do more and do better, so there's this kind of line between

104
00:13:00.130 --> 00:13:09.510
David Sloan: On one side, there's, like, a less mature approach that thinks that AI's already capable of

105
00:13:10.050 --> 00:13:16.400
David Sloan: replacing humans to do tasks like accessibility quality assurance.

106
00:13:16.920 --> 00:13:24.209
David Sloan: And, like you say, it isn't. And that leaves companies with more problems to solve.

107
00:13:24.610 --> 00:13:28.950
David Sloan: But there's no longer any staff to… to address them, so…

108
00:13:29.300 --> 00:13:31.410
David Sloan: How… how do you see that?

109
00:13:31.730 --> 00:13:34.969
David Sloan: Challenge being addressed over time, where

110
00:13:35.160 --> 00:13:41.450
David Sloan: You have this kind of less mature perspective that thinks that the technology's already

111
00:13:42.370 --> 00:13:47.300
David Sloan: Already can replace humans, and they make decisions based on that, but then create problems.

112
00:13:47.550 --> 00:13:50.559
David Sloan: Further down the line that they don't have the capacity to fix.

113
00:13:51.800 --> 00:13:56.130
Ricky Onsman: I think one of the things that has created that is what…

114
00:13:56.260 --> 00:14:02.020
Ricky Onsman: A lot of people see as a big advantage of artificial intelligence, the conversational tone.

115
00:14:02.400 --> 00:14:13.859
Ricky Onsman: That you're… you're able to ask a question in human terms of a machine, and the machine responds in what appear to be human conversational tones. It's very convincing.

116
00:14:14.000 --> 00:14:29.729
Ricky Onsman: And it's convinced of its own correctness. It doesn't question itself. It's only when you come back to it and say, are you sure that's right? I didn't think those two flags were the same thing. And it says, no, of course they're not the same thing. This is the correct answer.

117
00:14:29.900 --> 00:14:40.299
Ricky Onsman: And it goes, no, I'm not sure that's totally the correct answer either. And I go, no, it's not the correct answer. This is the correct answer. And everything is given with total certitude.

118
00:14:40.630 --> 00:14:42.540
Ricky Onsman: And that is…

119
00:14:43.060 --> 00:14:50.320
Ricky Onsman: it's certainly artificial, but it's not intelligent. Yeah. And, you know, we've got to… we've got to get into the habit of… of…

120
00:14:50.640 --> 00:15:03.610
Ricky Onsman: thinking of it as an assistive tool. And this is where I think, you know, if we come in our industry to think of artificial intelligence as another form of assistive technology, that would be a big step forward.

121
00:15:03.920 --> 00:15:12.799
Ricky Onsman: Instead of thinking of it being the answer to everything, and that it can be totally relied on, it's something that helps us to do something.

122
00:15:13.130 --> 00:15:16.540
Ricky Onsman: Right. And in those terms, I think we need to…

123
00:15:17.020 --> 00:15:28.979
Ricky Onsman: Dial back the overconfidence that artificial intelligence exudes, and turn it into something that's more interrogative, and something that questions what's going on.

124
00:15:28.980 --> 00:15:41.039
Ricky Onsman: and tries to think about what the possible answers could be. If I use ChatGPT, and I ask it a question, and it says, well, the answer could be this, or this, or this.

125
00:15:41.050 --> 00:15:48.079
Ricky Onsman: That would be a lot more useful than it just coming out with one blatant answer that says, oh, I know the answer perfectly.

126
00:15:48.800 --> 00:16:06.350
David Sloan: And then you… and then you have to have the persistence to say, are you sure? And then it might… well, actually, no. So yeah, I like that move towards thinking about it as a kind of decision support tool, and a, you know, pair programming, almost, relationship where you're kind of working together and…

127
00:16:07.350 --> 00:16:07.970
Ricky Onsman: Yeah, yeah.

128
00:16:07.970 --> 00:16:11.180
David Sloan: the best way forward. So, when we think about.

129
00:16:11.180 --> 00:16:26.060
Ricky Onsman: In that, you know, you need the expertise to be able to doubt artificial intelligence. In which case, if you have the expertise, why do you need the artificial intelligence in the first place?

130
00:16:26.760 --> 00:16:27.310
David Sloan: Yeah.

131
00:16:27.440 --> 00:16:28.880
David Sloan: That's a good point.

132
00:16:29.280 --> 00:16:36.030
David Sloan: So, you would… you mentioned, you started off with an example of where AI doesn't…

133
00:16:36.260 --> 00:16:38.540
David Sloan: sufficiently interrogate

134
00:16:38.740 --> 00:16:46.259
David Sloan: A coding practice that's suboptimal, let's say, and then tries to kind of remediate the code to improve accessibility.

135
00:16:46.580 --> 00:16:49.870
David Sloan: Have you… You know, stepping back from that, have you seen

136
00:16:50.370 --> 00:16:54.400
David Sloan: You know, genuine advances in the accessibility quality of

137
00:16:54.870 --> 00:17:04.049
David Sloan: code that has been created by generative AI over the years? And if so, where have the advances come?

138
00:17:04.200 --> 00:17:06.039
David Sloan: What have been the factors there?

139
00:17:07.560 --> 00:17:21.590
Ricky Onsman: I think the… it kind of pertains to what we've just been talking about. It depends a lot, and we've come to understand it depends a lot, on what we ask artificial intelligence to do, the prompts that we give.

140
00:17:21.890 --> 00:17:24.770
Ricky Onsman: And if we set out

141
00:17:25.010 --> 00:17:29.330
Ricky Onsman: A clear set of instructions of what our expectations are.

142
00:17:29.470 --> 00:17:38.639
Ricky Onsman: artificial intelligence can achieve those outcomes very, very well. Check my code. Have I used everything correctly here?

143
00:17:38.860 --> 00:17:45.050
Ricky Onsman: Is there a way of making this more accessible that I haven't considered?

144
00:17:45.410 --> 00:18:02.070
Ricky Onsman: Those kinds of tools that address those kinds of issues have come a long way in recent years. The Copilot for coding that you get in things like Visual Studio Code, and even in Microsoft products, you know.

145
00:18:02.700 --> 00:18:04.130
Ricky Onsman: That kind of stuff.

146
00:18:04.680 --> 00:18:08.829
Ricky Onsman: To give an example, when you're using a program like Microsoft Word.

147
00:18:09.110 --> 00:18:15.579
Ricky Onsman: Which becomes critical for digital accessibility because of where it goes to.

148
00:18:15.580 --> 00:18:15.950
David Sloan: Yep.

149
00:18:15.950 --> 00:18:22.679
Ricky Onsman: If you have, at the moment, you have accessibility checkers in Microsoft Word, but not everybody uses them.

150
00:18:23.140 --> 00:18:28.540
Ricky Onsman: If artificial intelligence is used to say, you must do this,

151
00:18:28.970 --> 00:18:33.830
Ricky Onsman: I will check the accessibility of this document for you, whether you like it or not.

152
00:18:34.300 --> 00:18:43.690
Ricky Onsman: And that has implications for, you know, one of the big issues that I wrestle with is the accessibility of PDF documents on the web.

153
00:18:44.010 --> 00:18:53.430
Ricky Onsman: And… That is an area where, in generating PDF documents to be accessible, AI has become very good.

154
00:18:53.660 --> 00:19:10.640
Ricky Onsman: For a start, it makes the source documents more accessible, and then it helps the process of the conversion into a PDF document accessible. Now, that still leaves a huge problem for us, because that doesn't address all the PDF documents that are already there.

155
00:19:10.830 --> 00:19:23.890
Ricky Onsman: Right. But it does help in creating new PDF documents that are accessible. And on that front, I think artificial intelligence is very good. And I think the way that companies like Adobe and Microsoft are using it

156
00:19:24.000 --> 00:19:34.079
Ricky Onsman: to… assist, slash force, users to consider accessibility issues is a very good thing.

157
00:19:34.230 --> 00:19:37.239
Ricky Onsman: And that's one area where it's been very, very good.
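
The "whether you like it or not" idea, a publish step that refuses to export until basic checks pass, can be sketched as below. The document shape and check names are illustrative assumptions, not Word's or Acrobat's actual APIs.

```python
# A pre-export accessibility gate: the document cannot become a PDF until
# the checks pass, so accessibility stops being optional at authoring time.
def accessibility_errors(doc: dict) -> list[str]:
    """Return a list of blocking problems; empty means the doc may export."""
    errors = []
    for i, img in enumerate(doc.get("images", [])):
        if not img.get("alt"):
            errors.append(f"image {i} has no alt text")
    if not doc.get("title"):
        errors.append("document has no title")
    return errors

def export_pdf(doc: dict) -> str:
    """Stand-in for a tagged-PDF export that is gated on the checks."""
    errs = accessibility_errors(doc)
    if errs:
        raise ValueError("fix before export: " + "; ".join(errs))
    return f"{doc['title']}.pdf"

doc = {"title": "Annual Report", "images": [{"alt": "Revenue chart, 2025"}]}
print(export_pdf(doc))  # exports only once the checks pass
```

A real checker would test far more (headings, reading order, contrast, table structure); the sketch only shows the gating pattern.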

158
00:19:37.570 --> 00:19:40.320
David Sloan: Yeah, I could… I totally agree with you. I think that…

159
00:19:40.710 --> 00:19:47.849
David Sloan: In accessibility, we get… yeah, and there are very valid reasons for doing so, where we get so focused on tools to help us

160
00:19:48.050 --> 00:19:51.070
David Sloan: Test and remediate existing stuff.

161
00:19:51.260 --> 00:19:52.790
David Sloan: We kind of forget about

162
00:19:53.000 --> 00:19:59.010
David Sloan: all the new stuff that's being built, so that ends up also needing testing, and a lot of remediation.

163
00:19:59.050 --> 00:20:14.969
David Sloan: So, it's really encouraging to hear that the innovations are guiding authors of different levels of awareness. You know, let's face it, you know, we talk about accessibility every day, but most people don't. So, any tool that kind of

164
00:20:15.280 --> 00:20:30.050
David Sloan: puts it into somebody's publishing workflow, whether they were expecting it or not, and just… increases the chances that what they produce is more accessible than it would have been had the AI not been there.

165
00:20:30.330 --> 00:20:33.019
David Sloan: I think that's a really encouraging thing.

166
00:20:33.200 --> 00:20:34.789
David Sloan: So, do you see any kind of…

167
00:20:35.110 --> 00:20:40.620
David Sloan: maturity and best practices of using AI in

168
00:20:41.040 --> 00:20:57.340
David Sloan: digital content creation emerging, you know, whether it's tools, or the way that you write prompts, or the way that you feed the tool with examples of what you mean. Are we getting to a point where we're sharing best practices, or are people still

169
00:20:57.580 --> 00:21:01.040
David Sloan: kind of figuring it out more on their own at this point?

170
00:21:02.220 --> 00:21:11.029
Ricky Onsman: Well, I think it really comes down to your expectations and the limits that you put on artificial intelligence. And by that, I mean

171
00:21:11.470 --> 00:21:14.089
Ricky Onsman: when it… when I first started using

172
00:21:14.320 --> 00:21:24.620
Ricky Onsman: generative AI to ask questions, it became quickly apparent that the data set that AI was using was fundamentally flawed.

173
00:21:24.890 --> 00:21:29.760
Ricky Onsman: Largely, it draws on what's available on the web.

174
00:21:30.280 --> 00:21:42.210
Ricky Onsman: And a lot of the information about digital accessibility on the web is fundamentally flawed, if not downright dangerous. Practices that really don't encourage digital accessibility.

175
00:21:43.210 --> 00:21:59.249
Ricky Onsman: Where that has developed is, I think, that people… agencies like Vispero and other digital accessibility consultancies are coming to realize that you can set a specific data set for an artificial intelligence model to use.

176
00:21:59.250 --> 00:22:03.999
Ricky Onsman: So instead of using a large language model, use a limited language model.

177
00:22:04.680 --> 00:22:22.139
Ricky Onsman: At Vispero, what we've done is we've created a knowledge base assistant that uses AI to interrogate every audit we've ever done, every article we've ever written, every piece of guidance we've provided to a client, and we know we can rely on that.

178
00:22:22.360 --> 00:22:30.050
Ricky Onsman: So, we can rely on the artificial intelligence to produce advice based on sound information.

179
00:22:30.250 --> 00:22:35.109
Ricky Onsman: And I think that becomes a very refined tool

180
00:22:35.270 --> 00:22:39.400
Ricky Onsman: that is much more useful and practical and reliable.

181
00:22:39.610 --> 00:22:50.520
Ricky Onsman: So it becomes a bit of, you know, like, instead of what was touted as the huge advantage of AI, that it draws on all the information ever in the world.

182
00:22:50.780 --> 00:22:54.640
Ricky Onsman: That it only relies on reliable information.

183
00:22:54.940 --> 00:23:06.300
Ricky Onsman: And I think all of us are working on tools that do that. And I think that will make a very big difference, not only in confidence, but in accuracy.
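
The curated-data-set idea can be sketched as below: answers are grounded only in a vetted corpus rather than the open web. The snippets and the keyword-overlap scoring are toy stand-ins for real retrieval; this is not Vispero's actual knowledge base assistant.

```python
# Retrieval restricted to vetted guidance: instead of drawing on "all the
# information ever in the world", the tool only answers from a curated set.
CURATED_GUIDANCE = [
    "Use semantic HTML elements before reaching for ARIA.",
    "Every informative image needs alt text that conveys its purpose.",
    "Form inputs must have programmatically associated labels.",
]

def retrieve(question: str, corpus=CURATED_GUIDANCE) -> str:
    """Return the vetted snippet sharing the most words with the question."""
    q_words = set(question.lower().split())
    return max(corpus, key=lambda s: len(q_words & set(s.lower().split())))

print(retrieve("What alt text should this image have?"))
```

In a production system the corpus would be audits and published guidance, and retrieval would feed a generator; the constraint that only vetted sources can be cited is the point.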

184
00:23:06.560 --> 00:23:07.389
David Sloan: Right.

185
00:23:07.390 --> 00:23:19.709
Ricky Onsman: You know, and that is happening, so I see that as a very useful thing, and it's not insignificant that it takes humans to shape that.

186
00:23:20.030 --> 00:23:28.460
Ricky Onsman: You know, we have to tell AI what we want it to do, and what it should pay attention to, and what it should not pay attention to.

187
00:23:28.630 --> 00:23:42.649
Ricky Onsman: And that's a… that's a slow process, because we need to figure out, well, how do we… how do we get that information right in the first place, so that AI uses and spreads the right information?

188
00:23:42.880 --> 00:23:57.420
Ricky Onsman: And we're way off that, but it is part of the task that we have at the moment. And digital accessibility is a terrific crucible for that, because not only is there a lot of bad information out there, but it really matters.

189
00:23:57.640 --> 00:24:11.789
Ricky Onsman: You know, this is not a trivial matter. Making the digital world accessible to people with disabilities is only going to get more and more important, and whatever tools we can use to do that, we should use them.

190
00:24:12.030 --> 00:24:15.990
Ricky Onsman: Artificial intelligence is undoubtedly a part of that scenario.

191
00:24:16.150 --> 00:24:26.449
Ricky Onsman: But what we mustn't lose sight of is why we're doing it. And that's… that's… I mean, we all work in digital accessibility because it really matters, you know?

192
00:24:26.570 --> 00:24:42.120
Ricky Onsman: More and more people are having to rely on the digital world for information, for functionality, for processes to do with every aspect of their lives. And we have to make sure that that is accessible to people with disabilities, so we don't disenfranchise them.

193
00:24:42.610 --> 00:24:48.179
David Sloan: Right, exactly, yeah, yeah, no, I totally agree, and I guess it's a bit of a…

194
00:24:48.780 --> 00:24:53.230
David Sloan: There's a race between the content and code generated by the

195
00:24:53.760 --> 00:24:57.630
David Sloan: general large language models, which are trained on

196
00:24:58.640 --> 00:25:03.119
David Sloan: Code that may not be what we would want it to be trained on.

197
00:25:03.510 --> 00:25:12.930
David Sloan: And then… The code that's created with the assistance of more specialized models that are more accessibility aware.

198
00:25:13.100 --> 00:25:18.939
David Sloan: will give more reliable best practice and guidance and examples.

199
00:25:19.250 --> 00:25:28.199
David Sloan: So this, this race, you know, where the sort of general world potentially is learning from bad practice and, and

200
00:25:28.430 --> 00:25:35.529
David Sloan: escalating or propagating bad practice, and then, separately, the smarter models that you were talking about

201
00:25:35.990 --> 00:25:38.360
David Sloan: That are assisting people, so…

202
00:25:38.890 --> 00:25:41.229
David Sloan: you know, I don't know… I don't know if it…

203
00:25:41.790 --> 00:25:52.070
David Sloan: where that race will head, and who's gonna win it, but I know the part of it that we want to be on and support.

204
00:25:54.170 --> 00:25:59.079
Ricky Onsman: Yeah, there's an aspect to this that we haven't really spoken about so far. We've focused on

205
00:25:59.330 --> 00:26:03.179
Ricky Onsman: within the digital accessibility industry, how we use artificial intelligence.

206
00:26:03.580 --> 00:26:15.079
Ricky Onsman: But there's another aspect to it that I think is going to… is already becoming more pervasive, and will continue to do so, where artificial intelligence assists the user

207
00:26:15.400 --> 00:26:29.599
Ricky Onsman: And I'm talking here about agentive artificial intelligence that makes a browser act like a screen reader, that lets a browser understand a particular user's needs, and

208
00:26:29.700 --> 00:26:44.440
Ricky Onsman: literally influence the presentation of web and app content to suit that individual user's needs, to customize digital experiences so that they meet every person's individual needs.

209
00:26:44.440 --> 00:26:50.759
Ricky Onsman: That's an area that I think is going to explode in the short term.

210
00:26:50.800 --> 00:26:55.960
Ricky Onsman: Browsers, by definition, have been designed for the average user.

211
00:26:56.380 --> 00:27:11.779
Ricky Onsman: But if you can customize your browser to be something that is incredibly personalized to you, even to the extent of not looking like a browser, it just presents web content to you in the way that is most meaningful and useful to you.

212
00:27:12.290 --> 00:27:16.610
Ricky Onsman: Taking into account things like disabilities,

213
00:27:16.880 --> 00:27:20.889
Ricky Onsman: Language backgrounds, cultural backgrounds, all those kinds of things.

214
00:27:20.970 --> 00:27:36.959
Ricky Onsman: If you can have your own personal AI present web content to you in the way that you need it to be presented, that's a massive step forward. And that is something that I think will happen in the short term. We will see that happening in the next couple of years.

215
00:27:36.960 --> 00:27:47.760
David Sloan: Absolutely, and there's a whole other podcast to talk about how assistive technology can be empowered by agentic AI, and how your assistive technology is

216
00:27:48.320 --> 00:27:53.700
David Sloan: You know, the capability of it is… amplified

217
00:27:54.390 --> 00:28:01.650
David Sloan: you know, to I don't know what degree, so that you can ask your screen reader, or whatever, to

218
00:28:02.190 --> 00:28:04.249
David Sloan: Go and book my flight to…

219
00:28:04.650 --> 00:28:19.999
David Sloan: to Cancun, or wherever, and you're not even interacting with the website, the agent can do things for you. So, you know, empowering the browser, or empowering the assistive technology, all of the lines

220
00:28:20.280 --> 00:28:34.229
David Sloan: in the model, the user agent, the authoring tool, the content, you know, that classic W3C shared model of accessibility responsibility. All those lines seem to be getting really blurred.

221
00:28:34.470 --> 00:28:35.230
David Sloan: with…

222
00:28:35.230 --> 00:28:35.880
Ricky Onsman: Yeah.

223
00:28:35.880 --> 00:28:38.539
David Sloan: evolution of AI, the way that we kind of

224
00:28:38.950 --> 00:28:44.559
David Sloan: compartmentalize who's responsible for what in the accessibility world. It seems to be.

225
00:28:44.970 --> 00:28:50.770
David Sloan: Getting upended, and who knows where it's gonna be in 2, 3, 10 years' time.

226
00:28:51.580 --> 00:29:03.429
Ricky Onsman: Yeah, there are risks associated with that that we should be aware of. We don't want to get to the point where artificial intelligence is compensating for bad code and bad markup.

227
00:29:03.600 --> 00:29:20.580
Ricky Onsman: Right. So there's still… we have to make sure that web content and digital experiences are presented in a way that artificial intelligence can present to a user in a meaningful and accessible way.

228
00:29:20.820 --> 00:29:25.310
Ricky Onsman: So, you know, getting rid of…

229
00:29:26.420 --> 00:29:42.230
Ricky Onsman: code spaghetti, is still… in fact, becomes more important than before. So, you know, there is still going to be a role for people to make sure that they actually write code in an accessible way.

230
00:29:43.380 --> 00:29:51.580
Ricky Onsman: And that… that is a considerable risk, because we've already seen how browsers will compensate for poor markup.

231
00:29:51.800 --> 00:30:02.770
Ricky Onsman: And it's meant in a positive way of, look, okay, they haven't coded that very well, that's okay, I'll ignore that, and I'll present it as if it had been coded properly.

232
00:30:02.940 --> 00:30:06.609
Ricky Onsman: But that doesn't create a better digital world.

233
00:30:06.900 --> 00:30:16.520
Ricky Onsman: And we've got to make sure that we actually do it correctly in the first place, so that artificial intelligence does it correctly for the individual user.

234
00:30:17.180 --> 00:30:24.070
David Sloan: Yeah, and I guess, ironically, the whole, you know, back in the day when we were using search engine optimization as a…

235
00:30:24.240 --> 00:30:32.580
David Sloan: you know, the SEO is very closely aligned with accessibility best practices, and now we're thinking of AI, or Gen AI,

236
00:30:32.780 --> 00:30:44.799
David Sloan: Optimization is very closely aligned with accessibility best practices when you can make something machine-readable, whether the machine is an assistive technology or a generative AI.

237
00:30:45.230 --> 00:30:49.090
David Sloan: tool, yeah, it's… it's all…

238
00:30:49.340 --> 00:30:56.590
David Sloan: It's all kind of helping… hopefully helping us all move forward in the right direction.

239
00:30:57.250 --> 00:30:57.810
Ricky Onsman: Yeah.

240
00:30:58.240 --> 00:31:09.789
Ricky Onsman: One thing… that's also, you know, like in the early days of search engine optimization, the techniques that people used to optimize their websites for search

241
00:31:09.910 --> 00:31:20.710
Ricky Onsman: were not all positive. Keyword stuffing. Putting white text on white backgrounds so that machines would read it, but humans wouldn't see it, right?

242
00:31:20.710 --> 00:31:21.130
David Sloan: some humor.

243
00:31:21.130 --> 00:31:21.940
Ricky Onsman: liking it.

244
00:31:22.220 --> 00:31:24.879
Ricky Onsman: Exactly, it's, it's, it's…

245
00:31:25.030 --> 00:31:32.150
Ricky Onsman: It's essential that we kind of think through what we're doing here, and how it might be abused.

246
00:31:32.700 --> 00:31:34.670
David Sloan: Yep, yep, absolutely.

247
00:31:34.890 --> 00:31:36.339
David Sloan: So one thing I wanted to…

248
00:31:36.630 --> 00:31:43.419
David Sloan: ask about, you know, as you've been kind of surveying and researching the evolution of AI to support

249
00:31:43.670 --> 00:31:45.419
David Sloan: Content creation.

250
00:31:45.880 --> 00:31:55.350
David Sloan: You know, one… clear opportunity, and also a threat to accessibility, is the accessibility of AI tools themselves,

251
00:31:55.760 --> 00:31:59.360
David Sloan: You know, what evidence do we have that the…

252
00:31:59.670 --> 00:32:04.540
David Sloan: user interface of generative AI tools and the agents.

253
00:32:05.250 --> 00:32:24.920
David Sloan: is getting easier to use for people with disabilities. I mean, I've heard kind of conflicting evidence that tools are more or less accessible to people with disabilities, and maybe if you're comfortable using a command line interface and you're a screen reader user, then everything's good, but again, that also requires a level of

254
00:32:25.240 --> 00:32:33.180
David Sloan: technical expertise that may… maybe can't be assumed by everybody. So, have you… have you seen any trends there in terms of accessibility of the

255
00:32:33.430 --> 00:32:35.139
David Sloan: Gen AI tools.

256
00:32:36.030 --> 00:32:40.469
Ricky Onsman: Yeah, and I think that that works on a number of levels.

257
00:32:40.990 --> 00:32:47.100
Ricky Onsman: One… one aspect is just the… the simple usability of the tool with the user interface.

258
00:32:47.200 --> 00:32:49.619
Ricky Onsman: There's kind of… it's…

259
00:32:50.280 --> 00:33:07.359
Ricky Onsman: it's, again, it's a… it's two sides of the coin, because a lot of people with disabilities have become used to using assistive technologies, that people who don't have disabilities, for them it's a complete mystery, you know? Using a screen reader is like… and… and I have… I have…

260
00:33:07.570 --> 00:33:23.550
Ricky Onsman: A used a screen reader myself quite a lot, but I've seen people who are very adept at using screen readers, and they, you know, operate on a completely different level. So there's a level of comfort with technology that assists them in their daily tasks.

261
00:33:23.750 --> 00:33:29.560
Ricky Onsman: that… Coincides with the way that you might use assistive technology.

262
00:33:29.790 --> 00:33:37.080
Ricky Onsman: Or might use artificial intelligence. So on that level, there is… there is a kind of a…

263
00:33:37.240 --> 00:33:43.030
Ricky Onsman: A prospect there for people who are comfortable with using what you might call unusual tools.

264
00:33:43.520 --> 00:33:46.279
Ricky Onsman: At the same time.

265
00:33:46.410 --> 00:33:57.279
Ricky Onsman: I think when artificial intelligence kind of broke through the, what you might call the conversational banner, it became a lot more usable by people in general.

266
00:33:57.440 --> 00:34:01.559
Ricky Onsman: There was very little thought given to accessibility at that point.

267
00:34:01.870 --> 00:34:13.890
Ricky Onsman: I think things that are happening now are influencing that, that if you are going to produce a tool that is meant to be used by everybody, it has to be designed to be used by everybody.

268
00:34:14.489 --> 00:34:23.069
Ricky Onsman: And there are efforts in that direction. I think the companies that create the artificial intelligence tools are becoming aware that

269
00:34:23.139 --> 00:34:36.350
Ricky Onsman: As with everything else, there is a market for people with disability, and you don't want to ignore that. There is a cultural imperative, there is a social need to make those tools accessible.

270
00:34:36.760 --> 00:34:42.789
Ricky Onsman: That's a… an evolutionary thing. And it takes pressure to make that happen.

271
00:34:43.010 --> 00:34:46.230
Ricky Onsman: The efforts of people like,

272
00:34:46.510 --> 00:35:05.050
Ricky Onsman: Joe Devon in setting benchmarks for how AI tools approach accessibility. That's going to be very, very important. The good thing is that most of those companies creating the AI tools are open to that. They're not deliberately trying to create inaccessible tools.

273
00:35:05.090 --> 00:35:09.519
Ricky Onsman: They want it to be accessible, they just haven't given it a lot of thought up until now.

274
00:35:09.690 --> 00:35:16.119
Ricky Onsman: So I see that as being an area of progress.

275
00:35:16.260 --> 00:35:22.319
Ricky Onsman: There are some things that are built in that make that process a little easier.

276
00:35:22.430 --> 00:35:41.460
Ricky Onsman: The fact that you can use voice access to use artificial intelligence is a very positive thing, and it's something that actually moves the dial forward for people with physical disabilities who can't use their hands, you know, can't use a keyboard at all, can't use a mouse, so use voice access.

277
00:35:41.620 --> 00:35:44.150
Ricky Onsman: That's tremendously empowering.

278
00:35:44.380 --> 00:35:56.730
Ricky Onsman: Having said that, you've then got to make sure that the AI tools don't rely on voice alone, because then people who have voice impediments can't use the tools.

279
00:35:57.080 --> 00:36:04.910
Ricky Onsman: So it has to be something that is customizable to meet the needs of individual people. And that's… we're a little way off that yet.

280
00:36:05.080 --> 00:36:19.600
Ricky Onsman: I think as we get to the things we were talking about with browsers being AI-influenced and customizable by individuals, that will influence a lot of the more general AI tools that are available.

281
00:36:19.660 --> 00:36:28.310
Ricky Onsman: And it will become… there's no reason why artificial intelligence tools can't be customizable to individual needs.

282
00:36:28.470 --> 00:36:36.569
Ricky Onsman: But that's a process, and it's something that people in our industry need to exert influence on that to make it happen.

283
00:36:36.790 --> 00:36:53.590
David Sloan: Absolutely, yeah, the accessibility field has an important role to play there. So, we're approaching the end of our time. I'm going to ask you a couple of very quick questions with quick answers, see how you respond. Firstly, do you see the

284
00:36:53.780 --> 00:37:04.170
David Sloan: trend for… around AI. Are people moving from skeptic to enthusiast, or vice versa, or do you see a kind of

285
00:37:04.950 --> 00:37:07.150
David Sloan: Sort of stability in terms of

286
00:37:07.730 --> 00:37:16.530
David Sloan: enthusiasm. What, you know, do you see any change in how people line up to how AI can help accessibility efforts?

287
00:37:18.090 --> 00:37:33.850
Ricky Onsman: Okay, quick answer, yes, I do, but it depends on what form of artificial intelligence you're talking about. For things like translation, absolutely, that's… that's a positive move. For things like creating plain

288
00:37:33.880 --> 00:37:47.070
Ricky Onsman: easy-to-read language versions of complicated information? Yes, absolutely. We can see the positive sides of artificial intelligence in making things easier to understand.

289
00:37:47.480 --> 00:37:54.780
Ricky Onsman: In some other areas, where it oversteps the boundary from

290
00:37:56.560 --> 00:38:09.840
Ricky Onsman: a transformative mechanism to something that generates something from nothing, there is a long way to go, and I think there is a level of distrust there that is

291
00:38:10.410 --> 00:38:18.730
Ricky Onsman: justified, quite frankly. So I think it depends on what kind of artificial intelligence you're talking about.

292
00:38:19.090 --> 00:38:19.690
David Sloan: Right.

293
00:38:20.160 --> 00:38:29.039
David Sloan: We'll arrange another conversation in 6 months and see if that answer has changed. I bet it will have, to some extent.

294
00:38:29.240 --> 00:38:45.769
David Sloan: And last question, just one piece of advice you'd give to organizations who are looking at ways to use AI to help with their accessibility efforts, rather than hinder those efforts? What would your one piece of advice be to organizations?

295
00:38:46.780 --> 00:38:51.560
Ricky Onsman: I think it would be… Use AI as a tool

296
00:38:51.800 --> 00:39:01.550
Ricky Onsman: to assist human processes. Don't rely on it as something that replaces human functionality, particularly in regard to digital accessibility.

297
00:39:01.660 --> 00:39:06.300
Ricky Onsman: Use it as something that helps, not something that replaces human effort.

298
00:39:07.830 --> 00:39:11.670
David Sloan: Yeah, no, well said. I think that's great advice.

299
00:39:11.800 --> 00:39:26.690
David Sloan: Well, I think we've… we'll stop it there, but I know that we could carry this conversation on for another 2-3 hours, but the good news is I'm sure we'll find more time to catch up later in the year and see how things have changed.

300
00:39:26.980 --> 00:39:32.679
David Sloan: Thank you, Ricky, so much for sharing your perspective today. That was a great conversation, and…

301
00:39:32.790 --> 00:39:40.129
David Sloan: like I say, I know that AI and its influence on accessibility efforts is only going to change and evolve over time.

302
00:39:40.760 --> 00:39:52.200
David Sloan: So, well, now you know the state of accessibility. I'm David Sloan, thanking Ricky Onsman, and reminding you that the state of accessibility is always changing, so please help us effect change.

