Statistical Bias: How AI Leaves the Disability Community Behind

Disability Empowerment Now
January 20, 2026 | 00:50:27

Show Notes

Is your smartphone a communication tool or a portable casino? AI consultant Peter Swim joins host Keith Murfee-DeConcini to expose the high cost of "Silicon Valley imagination." Peter breaks down his summer experiment where he consolidated 7,000 AI programs into just 42 megabytes, proving that we don’t need faster hardware—we need better stewardship. They discuss why AI struggles to recognize speech patterns that fall outside the "statistical mean," the environmental debt of every ChatGPT prompt, and the radical act of "downgrading" tech to regain control over your life. This is a must-listen for anyone tired of being the product and ready to start being the user again.


Episode Transcript

[00:00:00] Speaker A: Foreign. [00:00:08] Speaker B: Disability Empowerment Now, Season four bonus episode. I'm your host, Keith Murfee-DeConcini. Today I'm talking to Peter Swim, an AI consultant. Peter, welcome to the show. [00:00:34] Speaker A: Thanks for having me, Keith. [00:00:36] Speaker B: So, Peter, what is an AI consultant? [00:00:44] Speaker A: Well, there's two kinds really, right? There's the people who are trying to sell you AI, and there's people like me who are trying to help you understand what AI can do, so you can decide if you want to even mess with it or not. So I spend a lot of time understanding how people work and the things that they use and the tools they have, and then we can have an open discussion on whether or not AI is the right choice for them. [00:01:08] Speaker B: So how does AI benefit accessibility? And how can people with disabilities who use technology implement it for their needs? [00:01:31] Speaker A: Well, I think out of the box, AI actually is harmful to people who are not part of the mainstream user story, because the way AI, as we'll discuss today, works is through probability and statistical likelihood. And so individuals who are outside of the statistical mean of what is likely to happen, such as those with disabilities, can find that AI actually works the opposite of how they intended it to. And that's one of the things I caution my clients about, so that they don't automate things so well that it starts creating roadblocks for people who don't work the way that they do. [00:02:13] Speaker B: Yeah, a side tangent, I was trying to train Dragon NaturallySpeaking about. [00:02:27] Speaker A: Oh. [00:02:27] Speaker B: 20 years ago, something like that, and the algorithm back then could never really pick up my speech pattern, shocker there. But I remember having to say "I like to read books" so many times that it actually made me hate reading books for a time afterwards. And if you were here, or if anyone is watching, they could see I have that library. So it was very ironic, but I totally get what you mean. Big companies like Apple, which just released the hotly expected iPhone 17 at the launch event today, and then Google, Microsoft, they've certainly come a long way in terms of making their technology accessible, and I'm focusing on their smartphones and, of late, smart wearable technology. But AI is a brand new innovation in terms of being successfully incorporated into the mainstream. And Google just not too long ago announced the Pixel 10. So, to be fair to the companies who may or may not be listening to this episode. But back on track, you're an AI consultant, and what do you see as the big hurdles going forward for big companies as they begin building AI into usability for people with disabilities? [00:05:35] Speaker A: Well, I think the problem that it has with people with disabilities is exactly the same as the problem it has with people who have other comorbidities, or people that are neurotypical or neuroatypical, and that's customization. Right. It's like square peg and round hole syndrome. And the thing that is cool about AI is, you know, OpenAI has made a one-to-one copy of the Internet in a model that you can talk to, but it's also a one-to-one model of a bunch of stuff that's bad for people with disabilities and bad for people. And the thing that's missing is knowing about you. And I don't think I want Google to know every single thing about me just so search is better. And I think that's.
[00:06:20] Speaker B: Yeah, yeah. And I mean, talk about the Facebook ads, ads in general. Like, some of them are useful, but a lot of them are downright creepy. [00:06:42] Speaker A: That's right. [00:06:42] Speaker B: So I totally get you on that. [00:06:46] Speaker A: Yeah. And I think the difference between creepy and helpful is outcomes. Right. So yeah, if Microsoft has a complete 1:1 mapping of my entire brain on their servers and they use that to sell more products to people like me and I don't get a cut, that's a bad outcome. But also, if they have a copy of my brain and I still have a bad experience using their products and they still use me to sell other products, that's another bad outcome. Right. And I think, for people, the modern Internet is all about, well, if it's free, you're the product. And people are kind of used to that. They're like, okay, you know, Twitter's free, and then someone bought it and now it's bad. And they're used to that cycle. And I think there has to be kind of a reset, because the people who work at these big companies get everything done the way they like it, and the way they like it is only as big as their imagination can handle. And so you get a lot of people on the edges, including people that don't look like the people in Silicon Valley, that are left on the outside. And so we need to figure out a way for customization to not come at the expense of consent and privacy and all that. [00:07:58] Speaker B: So before we get too deep, which I definitely want to explore because this is a fascinating topic: like it or not, AI is here to stay. AI is like the new social media, the new YouTube, whatever. One of the criticisms, or one of the early criticisms, of AI is that every time you use an AI function, you use up water, a natural resource, and you see hundreds, if not thousands, of videos showing that, or what they want you to see. And someone like my dad, who is in his 90s, he uses Siri and ChatGPT even though he doesn't know what ChatGPT is. But for him it is a vital use, the vital accessibility of his phone that he would frankly be lost without. And sorry, Dad, he's terrible with technology, but even he would admit that. And so how do you rebuff the criticism that using AI is harmful to the environment every time you use it? [00:10:16] Speaker A: Yes. Well, first of all, I would say I 100% agree that it's harmful for the environment. And I also believe in climate change, and I believe that we have to make drastic change as a society to prevent the stuff that's happening. And AI certainly isn't helping. Right. And so this summer I did an experiment. I did a week of nothing but AI, because I'm a technologist, I've been building stuff with AI for years, I've used computers since I was like 12 years old, but I've never completely, like, just drank the Kool-Aid and done it. And so at the end of the week, I had 7,000 computer programs. And then I did an analysis of the code, and it was really 300 computer programs made over and over again. So keep this in mind as we talk about waste. And then I did an analysis of those 300 programs, and it's actually four programs with 10 features each. And so the way AI is being made today is like if you went into a bathroom and all the toilet paper was terry cloth towels, right?
So you clean yourself with the towel, you throw it away, and then you get a whole new towel. And it's totally wasteful and stupid. It is just wasteful and bad. And, you know, we're up to our eyeballs in environmental debt that we need to fix, and it came around at the worst time. So I've built a tool, and I'm working on developing this for people, and I'm capturing 93% of AI usage. And what 93% means is I only use AI at the back of the line. So if I tell the AI, you know, reboot the server, or make me toast, or order cheap flights to Miami, if it knows the answer, it's not going to use AI, right? It only uses AI to, like, come up with a guess at the answer, or, you know, because I'm not a programmer, I make products, but I still go to Stack Overflow and I look for other people's code, and AI is good for that. It's not good for, like, unless you give it a complete map of your life, it doesn't know how to say, like, what are the best times of day to walk my dog? You know? Yeah. [00:12:47] Speaker B: I mean, the reason why I say AI is like the new social media is because it's literally baked into every product, into Google and into our smartphones. So no matter how harmful it is to the environment, or if somehow that gets fixed, AI is not going away, people are not going to stop using AI. [00:13:37] Speaker A: So, yeah, and I think that is a really good point. Like, is my phone a hardware version of Facebook? You know what I mean? Like, is the purpose of my phone to look at Facebook? And I would say no. I deleted my Facebook account years ago, and my Twitter account, I believe, because I hate how those things suck. Well, my brain is bad for them, right? Like, I have ADHD and I have hyperactive disorder. And it's like a casino. If I have a gambling addiction, I can't be in those things. And what works for me is email, where I can read email and archive it and clear out my inbox. And all these things are kind of designed to be like a casino and keep you pulling the wheel. Like, when do you get to the bottom of Instagram? Never, right? Because they don't want you to. And I think it's really, really bad to make our phones only be able to work in that context. Right? Because if I didn't use my phone all day, the battery would be fine. But if I have social apps, I have to charge it multiple times a day. And what's the point of having a mobile phone? [00:14:52] Speaker B: So, yeah, to call, to text. [00:14:56] Speaker A: Yeah. [00:14:57] Speaker B: I mean, sorry, Apple, but I've turned off Apple Intelligence. It's not for me, not at this point in my life. But I bring up my dad as a classic example of someone who doesn't understand all this new technology and certainly wouldn't want more of his data out there. [00:15:39] Speaker A: Yeah. [00:15:40] Speaker B: But he desperately relies on Siri and some form of speech capability with his devices. And so I don't think it's wise, even with the trouble of the environmental effects, to label AI as all bad or all good. [00:16:27] Speaker A: Yeah, yeah. [00:16:27] Speaker B: There has to be a middle ground, because if you think about the first smartphones, they were just coming out and you could make calls with them, and then texting slowly rolled out.
This is like 20, 25 years ago now. [00:17:00] Speaker A: Yeah. [00:17:01] Speaker B: And it was a gradual progression, relatively speaking, to where we are now. [00:17:15] Speaker A: Yeah, yeah. Well, I mean, if you think about it, I think it was the iPhone 5 that was enough computing power in your pocket to be greater than the entire computing power that put a man on the moon. Yep. And the reason we keep going, buying more and more phones, is for what? You know, I haven't had a reason to upgrade my iPhone in years, other than Apple Intelligence is part of my job. Right. And the reason Apple Intelligence is bad, I think, goes back to that thing of, they're trying to make Apple Intelligence work for everyone, and actually, it's so intimate being on my device, it has to work specifically only for me, and I'm not going to give Apple enough info about me to develop that profile. So. [00:17:59] Speaker B: Yeah, no, and that's absolutely right. And that's where I am. Even as a disability advocate, I use their accessibility features that are already baked into their iOS, because really that's the only thing I need. But I'm only one person with one disability. [00:18:41] Speaker A: Yeah. [00:18:42] Speaker B: I have multiple, but you get what I'm saying. Which is very parallel to what you said about your job, and why the only reason you upgrade your phone and use Apple Intelligence is because it's literally part of your job. Otherwise, and pardon me for assuming, you wouldn't use Apple Intelligence. And not to pick on Apple, but Google, Microsoft, if they still made phones, what have you: if it's part of your day-to-day job, that's different than if it's just a curiosity, a part of your day-to-day life. [00:19:58] Speaker A: And you know, I feel bad every time I buy a new phone. I'm like, you know, hundreds of dollars or thousands of dollars on something, and I only feel good if I'm able to give my phone to my father-in-law, you know, or give my phone to my niece or my nephew who needs a phone, because otherwise it's just sick. I mean, I'm old enough to remember when you had to rent the phone from the phone company, you know. But you know what I'm talking about. Yeah. And I remember you had one phone in the house. And now everyone has their own phone, and you go out to dinner with your friends and family and everyone's looking at their phone, and that's a whole social thing. But here's the good news, right. So Apple. [00:20:45] Speaker B: Sorry, what? I do know what you're talking about. Could you say that again? Well, I have. [00:20:59] Speaker A: Okay. [00:21:00] Speaker B: Legend. [00:21:03] Speaker A: Okay. So we are in a parasocial relationship with technology, to the point where society is kind of broken. Right. We are unable to view reality without the help of devices. Bad, right? That's bad news. But the things that they gave us can be used to get us out of it. And this is what I discovered over the summer with my AI experiment: AI doesn't work if you just use Apple Intelligence, but Apple Intelligence starts to work if you start putting your rules at the front, where you're talking into the mic. Right. [00:21:39] Speaker B: Wait, so hold on, hold on. You just said something I want you to explain. [00:21:47] Speaker A: Sure. [00:21:48] Speaker B: AI doesn't work if you use Apple Intelligence or Google Intelligence.
[00:21:58] Speaker A: Yeah. [00:21:59] Speaker B: What's the difference? Because someone like my dad. [00:22:05] Speaker A: Yeah. [00:22:06] Speaker B: Will be like, aren't they all the same? [00:22:11] Speaker A: No. Well, let me just say I'm not picking on any particular vendor. I think this is. [00:22:17] Speaker B: Neither am I. [00:22:19] Speaker A: And I think this is, you know, like the difference between Sony and Nintendo, and they're all complicit in how they get this wrong. And it goes like this. The way your father uses the phone is correct, and the way that they tell him he has to use the phone to get the results he wants is wrong. Because technology should be like turning on a faucet and water comes out, and you don't have to know about water treatment plants and pipes and plumbing and all that stuff. So they're making people too invested in the process and not supportive enough of the outcomes. And that's how we get things like disability issues and bias: they make assumptions on our behalf that are wrong, and so we deal with them and we muddle through them the best way we can. You know, you were mentioning accessibility features. I use so many accessibility features because it's literally the only way you can turn off many Apple things, you know, that drive me crazy. And they're just arbitrary stylistic choices. And, you know, thank heaven for state attorneys general and the EU for forcing them to add toggles for certain things. Like Apple glasses, you know, who wants Apple glasses besides, like, some guy from the 90s? It doesn't help. And I have really bad astigmatism. It's hard for me to read. I can't even imagine people who are legally blind or low vision. It's going to be awful. Right. And why does technology do this? Because they're addicted to shipping new stuff and they never have time to fix the stuff they shipped. So they'll ship Apple glasses, and I bet it'll be bad for five years, and they'll ship the next thing and they'll start over again. [00:24:12] Speaker B: Yeah, so, and I mean, so. [00:24:17] Speaker A: Go ahead. [00:24:17] Speaker B: Technology just progresses too fast. I mean, the old adage that if you buy a new computer, a new laptop or new desktop, within three to six months that brand new computer that you just bought with all the latest bells and whistles and what have you, it's already out of date. [00:25:05] Speaker A: Well, and I just downgraded my iPad, you know, because my daughter uses an iPad for school. [00:25:12] Speaker B: And why did you downgrade? [00:25:25] Speaker A: So my daughter, she has dyscalculia and dyslexia, and she uses her iPad as a communication facilitator. [00:25:33] Speaker B: And do I. [00:25:35] Speaker A: And so the battery was dying and we were looking at getting her a new one. And I realized I haven't touched mine in literally four months, and I only pick it up to read comic books. Right. And so I'm like, okay, I'll just trade. [00:25:49] Speaker B: I love that vulnerability and that honesty. [00:25:57] Speaker A: I mean, I don't need it. Like, I can borrow hers so I can figure out what's up with Apple Intelligence, because she only gets to use it a couple hours a day and we won't fight over it. And I can read my comic books. And I think this is like bad news for Apple. Right?
Because Apple is, like, addicted to the fact that people want to buy a brand new phone every 12 months. Right. And I think what you're talking about is how they keep inventing things, and the thing is, they've kind of forgotten how to refine things. And I think a key part of accessibility is your ability to be told that you did something wrong and to fix it. Yeah. You know, and if you keep starting your story over again from the beginning with the new version of the product, one, you don't learn as an organization how to listen to people who are different than the people who pay your paycheck. And two, you never get to the bottom of your task list to do the things that are deemed less important, which may be, you know, supporting something for 500,000 Americans who, you know, have a specific disability or condition. And, you know, we were talking earlier about Dragon. The Dragon team, I worked with them when they got bought by Microsoft. And the way they would solve it for corporations is they would take that model and they would do what they did for you, at scale, in the company, for millions of dollars. Yeah, and I think there's a flip side. Like, we can use AI to train things like that for people, and they would get that corporate treatment, but it's going to set the world on fire. Or we can use the world-on-fire computer, which is the good news. So we could take the new iPhone 17 and never install any app on it but our AI app, and our AI app will just build software that does what I want. [00:28:03] Speaker B: Okay. [00:28:04] Speaker A: And so, instead of using five applications from Apple, I could just build a comic book app. Right. And if I just need a comic book app, I don't need the latest iPhone. I could use the first iPhone. Right. And the only reason I have to upgrade it is maybe security and safety and privacy, and Apple's decided they won't support old machines, for money. But if you can find an old iPhone and you can turn off the Wi-Fi, it's. [00:28:33] Speaker B: You can very, very easily find old. [00:28:41] Speaker A: Tech. [00:28:43] Speaker B: And do what you just said. [00:28:48] Speaker A: Yeah. And I could take the iPhone 5, which I could probably get at any flea market for $10, and I have the computing power that put the man on the moon. Yeah. And, you know, I just have to write the software to do it. And that kind of hit me like a ton of bricks. So that also means that, say I'm talking to you, Keith, and we're using some AI to translate between my language and your language, and there might be problems and misunderstandings. We can't have custom software that does what we need to communicate without spending millions of dollars, but we can have software that turns what I'm saying and what you're saying into a mutually compatible language in the background, and we just talk. And so my microphone will come out in Spanish and your microphone will come out in Swedish, and we don't need to know the language, we don't even need to have the ability to speak. Right. And that hit me like a ton of bricks, because none of these companies offer that. If Apple proposed that if I bought an iPhone, I could talk to anyone on the planet in my native language, I would buy a new one in a heartbeat. Right? Yeah, because.
And they have the technology to do that, but they're being driven to chase, like, TikTok trends and all these things. And someone out there is just missing this billion-dollar thing, where it's like, oh yeah, everyone has a phone and everyone talks through the phone, and it doesn't matter who each person is or where they're at or how they speak or how they communicate. We have a universal translation layer. Yeah. [00:30:24] Speaker B: I mean, there's so much we could talk about. The AI topic is truly like the Internet that preceded it, and the concept of the Internet was originally developed by the military as a communication. [00:31:02] Speaker A: And that's all it is. That's all it is. You know, it's just a communication method, and everything else that happens on the Internet is because a company decided to make it that way. So if everything's just ones and zeros, then why do I even have to make a Facebook account to see my nephew's babies? Yeah. You know, whatever happened, let's just go back to email and whatever. But that's going to be hard. Like, you can do that today. If everyone was like me and not your father, you could totally go off the grid of Facebook and all that, but we would leave a lot of people behind. And I think that is the key, that no one is looking at that. Everyone's like, well, what's the next social network? And I'm like, my phone book, you know, my email. Like, okay, I may not get along with everyone in my phone book, but I definitely don't have people wishing me death in my phone book. [00:32:02] Speaker B: You know, like, yeah, yeah, that's absolutely correct. [00:32:10] Speaker A: And I think the thing that we'd miss from social networks is, like, okay, we need some sort of ability to know a hurricane's coming, or we need to find out who to vote for and all that. And it makes you wonder. It's like, oh, that's why billionaires are buying these platforms, you know. So all these things are suspect, and the very truth that we hold in our hands on the phone is suspect. And I tell this to my clients: you need to have a source of truth that you believe in and that represents your ethics and your values. And it can't live on Facebook's servers. It's got to live, like, in a notebook in your house or on your phone, you know, whatever you have to do. And I think disability advocates have been looking at this world for years, just because they're not served. So they've had to kind of remake this world with a bunch of tools and all that. And I think we have to get better at, you know, how do we create a Microsoft Office that is not super focused on enterprise compliance, but is focused on human autonomy and safety and accessibility and all that. And I think if we trust it to the open market, it'll never happen, because people say, well, you know, we only have a billion dollars to spend, and I really want to spend $40 billion on these VR goggles that Mark Zuckerberg wants me to wear, and I just don't have enough money in the budget for, you know, the screen-reading software that works for people who are from the Deep South, or whatever. It's like, okay, unless everyone speaks perfect California English, these tools will never work for you, and you don't make us enough money for us to even care about you.
So we gotta fire them. You know, when people say that to you, it's like: I don't care how many cute puppies are on Facebook, it's not worth it, treating people I care for like trash. So, yeah, we have to get rid of these products. And so I help a lot of clients figure out, like, okay, you pay for 19 things in a month and you hate 10 of them. Let's fire them and build something that works the way you work and is cheaper. And you don't have to have AI showing up in the product if you don't want AI. Yeah. And AI helps you do it. And the one thing I discovered in my AI project, I just finished it yesterday, and I finished it meaning now AI sits at the back. When AI was in the front, it made 76 gigabytes of content since July 1st. Wow. And now that I'm done and everything works the way I want it, my entire directory of software is 42 megabytes. And so. [00:35:03] Speaker B: Wow, that's a gigantic difference. [00:35:10] Speaker A: And I think that's the difference between, you know, wiping yourself with terry cloth or Charmin. Like, people don't need a lot. We don't need a thousand-dollar phone, and most people probably don't even need a hundred-dollar phone, because it boils down to a device that does what I need to do. Maybe it's only 42 megs, right? Yeah. And the thing the device has to do is be able to say, oh, how do I figure out how to do things that don't come on the phone, when nothing is installed? And the first question is, how can I help you? Like, imagine Gmail, or the Google website in 1998. That's how you turn on your computer, and you say, what can I help you with? It's like, I need to write an essay. Okay, do you want me to go on the Internet? No? Okay, well, I come with a text program, let's just write here, and you're done. And then you don't have to learn why there are 37 buttons in Office, you know, in Word, because all you have is a typewriter. And I think companies should be less interested in building a better mousetrap. Like, an abacus was perfectly good at math for many thousands of years, until it wasn't. So, you know, let some of us reject modernity and just kind of be like your father: yeah, I go to this website and I get the things I want, and I don't know how it works, and it doesn't matter how it works, and it's not doing anything I don't want to find out about. And I think that's kind of the lesson everyone should take from the disability community, because, you know, you're a market that's not attractive to Silicon Valley in many ways. And I've talked to many really great disability people at big companies who really care and try, but they're behind the eight ball, because they're in a world where everything's ruled by return on investment, and disability is a stewardship obligation that you have to do because you're a good person, and it's hard to justify that in capitalism. So, for example, I work in an organization, my company, and I have people in Finland and in Canada and New York, and I have a designer, and we were paying for this expensive program called Figma, and he hated Figma, I hated Figma, the developers hated Figma. So we fired Figma, and we just built a little script that lets him work in his art program, my developer work in GitHub, and me work in email, and it just figures it out in the back end.
And I think, yeah, less software is better. You know, let's let people work how they work. So instead of having video meetings: I have a person who is disabled who works for me as my product person. They're very good, but they don't work set hours because of their disability. And because I can be async, I don't need to have a video meeting with them and tax their energy and stuff. They can work how they work, and I get really good work. And so all this stuff is built in a way that's envisioned by people with poor imaginations and mainstream abilities, and it doesn't have to be this way. Right? Like, we can be completely different. And in fact, it's, you know, 76 gigs too much. [00:38:33] Speaker B: Yeah. [00:38:34] Speaker A: So, okay, if all my software is 42 megs, why do I have a one-terabyte hard drive? Yeah. You know, it's ridiculous. It's ridiculous. So let's kind of figure out how to reset things. And also, we use AI at the back, and we have a 93% reduction the next time we run it. In my company, we treat AI like a tax, and when you pay the AI tax, you've got to get back to zero. And what does that mean? Use every part of the buffalo. You take everything you use the AI for and use it as much as you can, and use it to measure, and you have progress, and you don't throw things away. And I think that's the only way it makes sense. And so the good news, you know, to bury the lede, is Apple shipped us the Apple Intelligence phone today. You can run everything Apple can on that phone, or on an Apple computer, for free, and that's completely private. So someone can make an app that uses Apple Intelligence that lives in iCloud and talks to your MacBook but doesn't talk to Apple services, and is 100% private and safe and works exactly how you'd like it. And if that's the future, and I've got to pay a thousand dollars one time for an iPhone, I can get off this escalator of new phones and all that, and we could be free. So that's kind of what I'm thinking. AI is a methodology, and you either do it in a corporatist way, where it's a subscription to an expensive service that never ends, or you can use it to put your mask on like the plane's about to crash, get all your people to safety, and then shut it off, you know, before the fire consumes you. So I think, you know, AI as it's being sold today by the majority of people in the AI industry is 100% the wrong way to do it. But at the end of the day, it's just computers. And I don't mean to say this in a way of, like, only guns kill people, you know, that kind of thing, because the solution to computing isn't more computers, it's smarter computers. Right? And so, if someone tells you, oh yeah, you'll get the thing you want, you just need to buy the new iPhone 18, or you need to let me build a nuclear power plant outside your school, all that, that's someone who has a plan to go IPO in the next five years and needs to make that happen. They're not thinking about, you know, we really want to have a universally accessible app for everyone, because they don't know how to make products like that. So that's what I tell my clients.
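The "AI at the back of the line" and "AI as a tax" idea Peter describes amounts to a simple routing pattern: answer a request with deterministic rules or a previously saved answer whenever possible, and only call a model when nothing else works, keeping that answer so the next run costs nothing. The sketch below is one minimal, hedged way to express that pattern; it is not Peter's actual tool, and the handler rules, cache file, and injected ask_model callable are all hypothetical stand-ins.

```python
"""Minimal sketch of a "deterministic first, AI last" router with a local answer
cache. Everything here (rule set, cache file, ask_model callable) is hypothetical."""

import json
from pathlib import Path
from typing import Callable, Optional

CACHE_FILE = Path("answer_cache.json")  # hypothetical local store of past AI answers


def load_cache() -> dict:
    return json.loads(CACHE_FILE.read_text()) if CACHE_FILE.exists() else {}


def save_cache(cache: dict) -> None:
    CACHE_FILE.write_text(json.dumps(cache, indent=2))


def known_handlers(request: str) -> Optional[str]:
    """Deterministic rules go first: if one of these knows the answer, AI is never called."""
    rules = {
        "reboot the server": "run: sudo systemctl reboot",
        "what's on my calendar": "read the local calendar file",
    }
    return rules.get(request.strip().lower())


def answer(request: str, ask_model: Callable[[str], str]) -> str:
    """Resolve a request deterministically if possible; fall back to the model last."""
    deterministic = known_handlers(request)
    if deterministic is not None:
        return deterministic

    cache = load_cache()
    if request in cache:              # an AI answer we already paid for once
        return cache[request]

    result = ask_model(request)       # the "AI tax": only paid when nothing else worked
    cache[request] = result           # remember it so the next run needs no model call
    save_cache(cache)
    return result


if __name__ == "__main__":
    # Stand-in for a real model call; swap in whatever backend you actually use.
    fake_model = lambda prompt: f"(model guess for: {prompt})"
    print(answer("reboot the server", fake_model))          # answered without AI
    print(answer("plan a dog-walking route", fake_model))   # AI once, cached after
```

On a second run the cached answer comes back with no model call at all, which is, at least in spirit, the "93% reduction the next time we run it" effect described above.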
[00:41:30] Speaker B: So for any listener of yours who wants to follow in your footsteps and become an AI consultant, what would be some action steps that you would give them? [00:41:53] Speaker A: I think the one thing that helped me a lot: my very, very first job in tech was at the library, and I would help seniors learn how to use the Internet in the mid-90s, the 1995 Internet, and, like, teach people to look stuff up. And you just need to listen to people, because I think a lot of people look at technology as the solution to everything versus understanding what the problems are. And so, first thing, you have to have a very good product sense, and you have to be a good listener and understander. And so you just let a lot of people talk and think, you know, and say the problem. Because sometimes people describe problems as, like, oh yeah, I need to get the file from Word into the website, into this. And then you're like, well, what are you really trying to do? Or something simple. It's like, okay, delete all those programs and just put a sign-up form on the bulletin board at the community center. You know, there could be non-computer solutions to things if you just listen to the real problem, as opposed to trying to get all these things to work. So that's the basic way of working that I try to get people to think about. [00:43:03] Speaker B: And so we've talked about a lot of things during this episode, and I hope you will come back and do another episode sometime. For all listeners and viewers, I like to think that both people with disabilities and those who have yet to discover or embrace their disabilities listen to and watch this program as my guests. What do you hope that people with disabilities will take away from this episode? And what do you hope that people who have yet to discover or embrace their disabilities take away from this episode? [00:44:08] Speaker A: Yeah, you know, I think, one, I have to say that people have the right to feel how they feel in the absence of truth. Right. And I think the way society treats them and the way they deserve to be treated are often very different. You know, I grew up in the 80s, at the tail end of, you know, the disco era and the hippie era. And, you know, I had teachers who were pushing 70, 80, who remembered when you used to be able to hit kids. And so there were things that didn't get noticed then that would get noticed now, where I would get a 504 and be taken care of, and my life would have been a lot different. And I think it's very hard for people who have disabilities to step outside of their lived experience, because they're trying to live, and it's unfair to ask them to think differently. And I think the abled often aren't forced to confront it unless they become a parent or a loved one of someone disabled, or something happens to them where they are forced to confront it. And I think it boils down to: if we had an equal system of understanding and respect that wasn't based on means testing, and, you know, wanted the best for everyone and all that, we wouldn't have these problems. Like, I think disabled rights and human rights and trans rights and racial rights all boil down to, how do you treat people who need different accommodations? And so, you know, if we treat everyone and accommodate them to be the best person that they are, and of course that's sometimes not enough, right? You need more.
It's not the right time or whatever. But I think we need to get to a better spot on this. And I think technology, because it moves so fast, doesn't have all the answers for it, but it definitely ships the problems faster than humans are used to. Right. And I think we need to understand, like, when I turn on a new feature: there's this French thinker called Paul Virilio, and in the 80s he wrote that when you invent the ship, you also invent the shipwreck, right? So we have climate change, we have a problem, and we invented this thing that is accelerating climate change, and you have a responsibility to fix that as soon as you can, right, or prove that you shouldn't be shot into the sun. And I think this comes down to everything. Like, you know, you shouldn't have to have a disabled person in your circle of loved ones to have an appreciation for the needs of disabled people. Right. And if you're being forced to exclude people based on means testing and budgeting and stuff, then you need to find a different place to be. And I think if we start thinking about these things, then we can start addressing accessibility gaps and all these things. Because, you know, it's all human rights, and I don't mean to say it in an "all lives matter" kind of way. Right. Because the bias inherent in the system is disproportionately affecting people, and disabled people bear the brunt of this in many areas, and we shouldn't make them the canary in the coal mine. So I think this is symptomatic of larger systemic problems. And that's why I like working with small companies, because I've worked with the biggest companies and it's impossible to make meaningful changes unless you start local. Right. And so that's what I encourage people to do: maybe when you graduate college, don't get that big corporate job. Go work for a mom and pop, or go down to the corner, and figure out who you are before you start making decisions that affect millions of people you've never met. So. [00:47:43] Speaker B: Well, Peter, I certainly enjoyed this conversation, and as I said, I hope you will come back, because there's so much more we could talk about, really. But in case anyone wants to get in touch with you and has more questions. [00:48:16] Speaker A: Yeah. [00:48:17] Speaker B: What's the best way to do that? [00:48:20] Speaker A: Yeah, well, my website is peoplemakeitbetter.com, and so, you know, we work with a lot of companies and small business people, but don't let the words "work with" scare you if you are, you know, supporting local organizations or whatever. I'm always free to talk, so there's a contact page on the site where you can book some time. Happy to answer any questions or help people working on stuff in this space. [00:48:42] Speaker B: Again, Peter, thank you for your knowledge and for coming on today. I really appreciate it. [00:48:53] Speaker A: My pleasure. Thanks a lot, Keith. It's been a great conversation. [00:48:57] Speaker B: Thank you. You have been listening to Disability Empowerment Now. I would like to thank my guest, you the listener, and the Disability Empowerment Now team that made this episode possible. More information about the podcast can be found on the official website or on social media @disabilityempowermentnow. This podcast is available wherever you listen to podcasts. Don't forget to rate, comment and share the podcast.
You can watch full episodes on YouTube. This episode of Disability Empowerment Now is copyrighted 2020.

Other Episodes


May 04, 2025 01:10:11

Visibility, Vision, and Victory: Lessons from Renee’s Global Impact

Renee started her impressive career 35 years ago as one of the pioneering female outside industrial salespeople. Fueled by a unique blend of passion...



November 20, 2022 00:58:15

S2 Episode 5 with Tony Coelho

The Honorable Tony Coelho is a retired U.S. Congressman from California, a former House of Representatives Majority Whip, Democratic Congressional Campaign Committee Chairman and...



April 16, 2023 01:14:39

Athletics and Poetry with Jamon Freeman

Jamon Freeman, B.K.A. a “triple threat”: Black, Deaf & physically challenged. He’s a generous guy who’s multi-talented, an athlete and a poet making a...
