The trek to tech central for an annual meet-up of coders isn’t all hoodies, snacks and all-night gaming sessions.

Blind software engineer Ed Summers recently shared the main stage with other accessibility experts at the annual GitHub Universe summit in San Francisco.

“When you build, please build without barriers,” he urged the elite audience.

“You collectively create the technologies that are used by all of humanity and if you unknowingly introduce barriers into those technologies then those barriers will undoubtedly create disabilities for some people.”

Artificial intelligence is already being used by some people – including coders – who are blind or have low vision, are non-verbal or have difficulty speaking, or people with cerebral palsy and other conditions that affect movement.

Eye-tracking software allows people with vision but no movement to interact with a computer by using their eyes.

For example, Becky Tyler, who was born with quadriplegic cerebral palsy, has helped develop software designed specifically to help people with disabilities play Minecraft with their eyes.

Mr Summers, head of accessibility at code-hosting platform GitHub, said there should be “nothing about us without us”.

GitHub is a platform that allows more than 100 million open-source software developers – including 1.4 million coders in Australia – to share and fine-tune their work.

“We do not want people making decisions for us or on our behalf,” he told AAP.

“We are capable of being included in the process and contributing ourselves, and technology is not optional.”

The World Bank coined the phrase “disability divide” to describe the gap affecting the 1.3 billion people globally with disabilities, including about one in five Australians.

“As a group we experience lower outcomes across many aspects of our lives, including health, education and employment,” Mr Summers said.

He said everyone must have access to technology and digital information, and called for diversity in those creating new technology.

“As opposed to 20-something, white male, North American people creating the technology,” he explained.

“If we can expand that out to be more global and more inclusive along many dimensions, including disability, then it greatly enhances the chance that everybody can benefit from this human progress,” he said.

“That’s a big driver for us, and for me personally as well.”

Mr Summers said he thinks about accessibility with an upper-case “A” and a lower-case “a”.

“Upper-case A” accessibility is about removing barriers for people with disability and ensuring assistive technology – eye-tracking software, his screen reader, his phone – works as well as it can.

“In the industry we have a pretty good understanding of what that is and some standards around how to maximise compatibility with assistive technologies,” he said.

“For people who need to customise the user interface and minimum colour thresholds – that’s a lot of what accessibility work is, from one perspective.”

And then there’s “lower-case a” accessibility, which is about making technology more approachable and usable for a wider range of people.

“What’s happening right now on the platform and more broadly with generative AI is simply amazing,” he said.

“It’s going to yield some serious benefits.”

The so-called natural language aspect of generative AI, where machines can understand and respond to text and voice data, is transformative – not for everyone, but for many people, he said.

Natural language is how most people communicate with each other and express themselves.

Usually, learning how to code involves cryptic syntax and semi-colons that make sense to a machine and a narrow group of humans.

“But if we can express what we want and build things by articulating our vision, that’s a game-changer, that’s lower-case a accessibility,” Mr Summers said.

There is another aspect that he says he experiences personally, and that is anxiety.

“So it’s very poignant for me,” he said.

He hasn’t written code on a daily basis for 15 years, because he has been managing teams and building and running projects.

“I’m rusty, so when I have to do new things I have anxiety about that and frustration because I can’t just rip things out like I used to when I was coding all the time,” he explained.

“And I don’t want to ask the brilliant engineers because I don’t want them to know.”

He said he can resolve that anxiety by asking an AI assistant in a chat what to do next, particularly about something he feels he should already know how to do.

“If I had to go ask one of my co-workers I would be very embarrassed.”

Darryl Adams, director of accessibility at Intel, said tools such as mobile phone app Be My Eyes provided a feeling of connection that he didn’t know he was missing.

“I’m visually impaired and I have a pretty difficult time with many visual tasks, including seeing the details of images on my phone,” he said.

“I’ve been using Be My AI to describe my images for me. The results are remarkable … a series of wow moments over and over, and it just kind of feels like magic.”

Be My Eyes founder Hans Jorgen Wiberg said the AI assistant Be My AI, powered by GPT-4, is the latest function of the app and means people are no longer reliant on another person’s description.

The function is available to iPhone users and began rolling out to Android phones in December; it can describe a whiteboard, read a menu, or help users navigate a street and “see” the outdoors.

“You take a photo and this photo is automatically uploaded to OpenAI and you will get a detailed description,” the inventor said.

“It’s super important that you actually develop with the people you are developing for,” he said.

“The app has definitely improved from the feedback we have gotten directly from our users.”

Mr Adams said apps were important for helping people with disability connect with the world, and stay connected, particularly those with conditions that leave them “locked in” with no movement or speech.

Intel’s ACAT, or Assistive Context-Aware Toolkit, was originally developed during years of collaboration with Professor Stephen Hawking, the late world-leading physicist and cosmologist who had motor neurone disease.

The open-source software enables communication through keyboard simulation, word prediction and speech synthesis, including accessing emails, editing documents and using the internet.

“The general idea is to be able to get access to all kinds of computing functions through a single switch,” Mr Adams said.

“We can trigger that digital switch by using spatial gestures detected by a camera, or proximity sensors, or even just off-the-shelf mechanical buttons and switches.”

He said a new version of ACAT includes a brain-computer interface that can interpret brain signals and communicate basic needs, adding another mode for people experiencing advanced disease progression.

“We also want to give ACAT ears,” he said.

“We want the system to be able to listen to the conversation and provide responses or response suggestions in real time.”

That would reduce the “silence gap” that usually occurs, he explained.

But tech entrepreneur Joe Devon sounded a note of caution about brain-computer interface technology as it begins to touch on the ability to read thoughts.

“We have to pay attention and draw a line in the sand,” he said.

“There should be some regulation around it in order to make sure data is private unless we agree to share it.”


Marion Rae
(Australian Associated Press)