OpenAI Quietly Shipped the Most Important Accessibility Architecture in a Decade. And Almost No One Noticed. | By Aaron Di Blasi, Publisher, Top Tech Tidbits | Courtesy of the PWD Media Co-Op
https://at-newswire.com/openai-ships-most-important-accessibility-architecture-in-a-decade/

The model is calling a function, not pretending to be human. OpenAI's openai/realtime-voice-component (April 27, on gpt-realtime-1.5) lets models invoke app tools directly, ending 20 years of screen-reader friction.
#DisabilityRights #PWDMediaCoOp #ATNewswire
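The claim in the post is that the model emits a structured tool call that the host app executes directly, rather than simulating clicks or keystrokes on an inaccessible UI. A minimal sketch of that pattern, assuming a hypothetical app-side function `fill_form_field` and the generic OpenAI-style function-calling shape (the actual realtime-voice-component API may differ):

```python
import json

# Hypothetical app-side tool: sets a form value directly in the app,
# with no dependence on the visual UI being screen-reader accessible.
# Names and schema here are illustrative, not OpenAI's actual API.
def fill_form_field(field_id: str, value: str) -> dict:
    return {"status": "ok", "field_id": field_id, "value": value}

TOOLS = {"fill_form_field": fill_form_field}

# JSON-schema tool description the model would be handed
# (generic function-calling style).
TOOL_SCHEMA = {
    "type": "function",
    "name": "fill_form_field",
    "description": "Set a form field's value in the host app.",
    "parameters": {
        "type": "object",
        "properties": {
            "field_id": {"type": "string"},
            "value": {"type": "string"},
        },
        "required": ["field_id", "value"],
    },
}

def dispatch(tool_call: dict) -> dict:
    """Route a model-emitted tool call to the real app function."""
    fn = TOOLS[tool_call["name"]]
    args = json.loads(tool_call["arguments"])
    return fn(**args)

# Simulated model output: the model calls the function; it never
# pretends to be a human operating the form.
result = dispatch({
    "name": "fill_form_field",
    "arguments": json.dumps({"field_id": "email", "value": "a@b.co"}),
})
print(result["status"])
```

The point of the architecture, on this sketch, is that accessibility no longer hinges on the form's markup: the app exposes a typed function, and the model invokes it.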

-

@news If this is what you call accessibility, you can take all my tech, I never want to use it again. This does not solve the actual problem. This is not what accessibility should be, and this is not intuitive. I should not need to yap to some fucked-up AI model in order to fill out a form some dumbass failed to make screen reader accessible. Btw, if I have to scroll through tons of other article suggestions before being able to read the actual post content, that's not very accessible either. Might wanna figure that out first.
-
relay@relay.infosec.exchange shared this topic