Hacking for Good at the Accessibility Hackathon

It's not often you get a chance to make a difference to the way people live their lives. In my role as a judge at the recent Accessibility hackathon run by Barclays, I met lots of teams spending their weekend doing just that.

The hackathon started with an amazing set of stories from the charities involved. These stories gave the teams great insight into the challenges people in these communities faced. The presence of the accessibility community throughout the hackathon helped teams stay focused and create relevant apps that would make a significant difference.

The Charities

Judging the Hackathon

With all the great ideas generated by the teams taking part, the judging was quite tough. Although not as tough as hacking an app together in less than one weekend :)

At the end of the hackathon, each team had 3 minutes to present their app, which is really no time at all. However, as a judge I had been going round the teams over the weekend to get to know them and find out what they were doing. This also gave me insight into how they had progressed over the weekend. One of the things we were looking for was whether the team could carry on developing their app afterwards, so their capability and cohesiveness played a factor in our final decision.

It was vital to have members of the accessibility community on the judging panel to be able to judge the impact of each app presented. Several of the judges had physical challenges themselves and related closely to the value of each team's app.

Accessibility Apps that made a difference

With 19 teams to choose from, the judges had a challenge on their hands to come up with 3 winners. At one point we asked if there could be a couple more prizes. The apps that really stood out for me though were:

  • Soundbyte - filtering out noise for different types of hearing loss
  • Visual-eyes - creating a very detailed and highly accurate description of the scene in a picture (fantastic concept)
  • Gesture Touch - controlling HTML5 apps and games simply
  • Elephants, ears for everyone - transcribing conversations in real time
  • Real assistance - guiding people's journeys and assisting with the last few meters
  • Say what you mean - navigate the web by voice
  • MemoryBox - helping those with dementia to recall memories easily

Accessibility Apps in detail

What follows is a summary of what I thought of some of the apps presented.

Soundbyte

I really liked the concept this team opened with, “We have all experienced sound loss”; it helped make the project very relevant. The way the team got everyone to stand up and clap to simulate the experience was also very striking.

The project itself was great. Having a smartphone as a hearing aid takes away some of the stigma around hearing impairment. Although phones can cancel out background noise in phone calls, this Android application can cut out the sounds that you don't need. As the app cancels out the background noise in near real time, you can then listen to only what is valuable, based on filters defined for different types of hearing loss. The team had already created a number of options to help find the best settings based on a person's hearing ability and situation.
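
To give a flavour of the kind of filtering involved, here is a minimal sketch of my own (not the team's Android code), using the Web Audio API in TypeScript to boost the frequency bands a particular hearing profile struggles with; the profile values are purely illustrative assumptions.

    // My own sketch, not the team's implementation. Assumes a browser
    // environment with the Web Audio API and microphone permission.
    interface HearingProfile {
      // Gain in dB to apply per frequency band (illustrative values only).
      bands: { frequency: number; gainDb: number }[];
    }

    // Hypothetical profile for high-frequency hearing loss.
    const highFrequencyLoss: HearingProfile = {
      bands: [
        { frequency: 500, gainDb: 0 },
        { frequency: 2000, gainDb: 6 },
        { frequency: 4000, gainDb: 12 },
      ],
    };

    async function startFiltering(profile: HearingProfile): Promise<void> {
      const ctx = new AudioContext();
      const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

      // Chain one peaking filter per band, shaping the live microphone input.
      let node: AudioNode = ctx.createMediaStreamSource(stream);
      for (const band of profile.bands) {
        const filter = ctx.createBiquadFilter();
        filter.type = "peaking";
        filter.frequency.value = band.frequency;
        filter.gain.value = band.gainDb;
        node.connect(filter);
        node = filter;
      }

      // Play back only the shaped audio.
      node.connect(ctx.destination);
    }

    startFiltering(highFrequencyLoss);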

This was a very striking project and it was high on my list as it also has implications for a wider audience, not just those with hearing loss. As the app was available on the Google Play store within about an hour of the presentation, the project seemed very sustainable.

Visual-eyes

This team only came together at the hackathon and found a vision inspired by the talks given by the charities. Their vision was simple and very relevant to the theme of the hackathon. Photos are everywhere and people love to share them with family and friends. However, it's not easy to share photos with those who are visually impaired.

Their app, Visual Eyes, returns a meaningful description of any picture provided. I liked that the team used random images from Facebook, as they are representative of the images people share. Because the images were random, you could see how credible the software was at describing them. I was very struck by how detailed the descriptions could be, including whether people were wearing sunglasses!

This app was very impressive and therefore high on my list due to the detail of the description of each picture. The team had already integrated their app with Facebook and there were many other integration possibilities. I was very confident this team would carry on developing their app.

The team were looking to open source the whole processing pipeline so that the costs of third-party services could be taken out of the process. The team are also considering the use of tags to help make the descriptions even more relevant.

Real Assistance

This app stood out immediately. The ability to record your favourite journeys and play them back to help you find your way seemed like a real win for those with vision issues. It would give those people a lot more confidence when they navigate to their favourite places.

This app could also be useful for a wider audience, for example to help someone navigate in a new location or a foreign country.

This app really stood out when the final part of the app was shown, the assisted guidance for the last few meters. To be able to call someone who can direct you using the camera on your mobile device and be guided in real time was a great idea. It can be a challenge finding entrances and then navigating steps and doors, so this is a great way to deal with that issue too.

The app uses existing phone technologies and WebRTC, so the team seemed to have a fully working app by the time of the demo.
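
As a rough illustration of the "last few meters" call, here is a minimal sketch of my own of how a WebRTC client could share the rear camera with a remote helper; the signalling step (sendOfferToHelper) is hypothetical, and this is not the team's actual code.

    // My own sketch, not the team's implementation.
    // sendOfferToHelper() stands in for a hypothetical signalling channel
    // (e.g. a WebSocket) that returns the helper's answer.
    declare function sendOfferToHelper(
      offer: RTCSessionDescriptionInit
    ): Promise<RTCSessionDescriptionInit>;

    async function callForGuidance(): Promise<RTCPeerConnection> {
      // Use the rear ("environment") camera so the helper sees what is ahead,
      // and audio so they can talk you through the last few meters.
      const stream = await navigator.mediaDevices.getUserMedia({
        video: { facingMode: "environment" },
        audio: true,
      });

      const peer = new RTCPeerConnection({
        iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
      });
      stream.getTracks().forEach((track) => peer.addTrack(track, stream));

      // Exchange an offer/answer with the helper via the signalling channel.
      const offer = await peer.createOffer();
      await peer.setLocalDescription(offer);
      const answer = await sendOfferToHelper(offer);
      await peer.setRemoteDescription(answer);

      return peer;
    }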

The team had an eye on future features, such as pre-programmed points of interest (banks, restaurants). This demonstrated that they are willing to take this app further.

Gesture Touch

The team created a way of helping those with physical challenges to interact with HTML5-based apps, especially games. They built different modes and controls to help users find the best way of interacting.

I liked that the team had simulated using their app with a device that restricted movement in the hand, and what they produced looked quite effective.

This team also had future plans for their app, including integrating voice recognition, so it seems that they will carry on with their development efforts.

Elephants - ears for everyone

The team developed a real-time transcription of conversations taking place, aimed at those with hearing disabilities. They had tried to get hold of some Google Glass equipment so that they could have had real-time subtitles when talking to other people.

The team instead created a simple and clean mobile app, allowing you to open up a “channel” in which two or more people could talk and the text of their conversation would be displayed in a similar form to modern text apps.
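
Purely as my own sketch of how such a channel might capture speech in a browser (the team's actual stack wasn't specified), continuous transcription could be built on the Web Speech API:

    // My own sketch, assuming the Web Speech API; not the team's code.
    // SpeechRecognition is still prefixed in some browsers, hence the fallback.
    const SpeechRecognitionImpl =
      (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;

    function startTranscription(onPhrase: (text: string) => void): void {
      const recognition = new SpeechRecognitionImpl();
      recognition.continuous = true;      // keep listening across sentences
      recognition.interimResults = false; // only emit finished phrases

      recognition.onresult = (event: any) => {
        // Append each final phrase to the channel, chat-style.
        for (let i = event.resultIndex; i < event.results.length; i++) {
          if (event.results[i].isFinal) {
            onPhrase(event.results[i][0].transcript.trim());
          }
        }
      };

      recognition.start();
    }

    // Usage: render each phrase as a message bubble in the conversation view.
    startTranscription((phrase) => console.log("speaker:", phrase));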

The team did a great demo, although there was some doubt about how effective this would be if there was background noise. The team seemed keen to keep on with the development if they got positive feedback, so if they can also include filtering of the background noise I believe they have a valuable app.

Say what you mean

I appreciated that the team invested time in understanding the experience of being blind and accessing the web. Discussing ideas with people from the RNIB helped them identify a real need: the key desire people had was to go faster. Screen readers linearise the experience, when people actually want a content-driven experience.

The app had a very simple user interface: press a key and say a word. You are then sent to a link that matches that word. This works well for websites you are familiar with.

For other sites you don't know well, it works like a search, returning the matching links at the start of the page so you don't have to go hunting for them.

As their app works as a browser extension, it works on all websites without specific configuration.
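
To make the idea concrete, here is a minimal sketch of my own of how a content script might match a spoken word against link text and follow it; the keyboard shortcut is hypothetical, voice input is assumed to come from the Web Speech API, and the team's real extension will certainly differ.

    // My own sketch of "press a key, say a word, follow the matching link";
    // not the team's code. Runs as a browser-extension content script.
    function followLinkMatching(spokenWord: string): void {
      const word = spokenWord.toLowerCase();
      const links = Array.from(
        document.querySelectorAll<HTMLAnchorElement>("a[href]")
      );

      // Prefer the first link whose visible text contains the spoken word.
      const match = links.find((a) => a.textContent?.toLowerCase().includes(word));
      if (match) {
        match.focus();
        match.click();
      }
    }

    // Hypothetical shortcut: Ctrl + . starts listening for a single word.
    document.addEventListener("keydown", (event) => {
      if (!event.ctrlKey || event.key !== ".") return;

      const Recognition =
        (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;
      const recognition = new Recognition();
      recognition.onresult = (e: any) =>
        followLinkMatching(e.results[0][0].transcript);
      recognition.start();
    });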

It was great that the team had considered future functionality, like related terms and filtering search criteria. I can see this app being quite useful to many.

Teams that had a great idea but had not gone far enough with the development

Food for Thought

The team had a great presentation and I really appreciated the use of Alice as a persona to help us understand the audience they were trying to reach.

The concern they were tackling was memory loss, which affects a great number of people. Without a good memory, it is much harder to keep your independence.

The team continued to tell the story around the persona. Alice does not always eat properly, because she forgets if she hasn't eaten. The app the team developed reminds Alice of key meals, helps her select from different meals and talks her through the making of the meal she has selected. The meals can be put together by family members, doctors or nutritionists, to give more diversity to Alice's diet.

Although this was a great concept, I felt that the team had not developed the application far enough in the time they had. There were unanswered questions and I hope that the team are able to get more of the app developed.

MemoryBox

The team chose a really powerful-sounding topic, reminiscence therapy. This is a great technique for helping family and friends to engage with those with dementia. By creating a wide range of media to form a collection that triggers memories of events and people, it can help those with the condition feel more positive and relive experiences.

The challenge was to create something that would easily create this experience and be a significant improvement on the basic photo collections you can put together with many online services. The app would need to help the supporting family members create these collections easily and relate them to specific memory categories. An app would also need to help the family members by relating images to each other automatically, I guess in the same way that Amazon relates products.

Leap Motion accessibility device

This was a personally driven project by a sole developer: his grandmother has difficulties with her hands and finds interacting with computing devices almost impossible. However, she has a very active mind, and the developer wanted a way to help her engage with the Internet that most of us take for granted.

The project was quite simple, more like a proof of concept, as no substantial application was created. The developer used an open source project and a Chrome extension to support the Leap Motion device. Whilst this is a great device, I was looking for something specific to be built from the concept.

Although this was an enthusiastic developer who may well come up with some good ideas, he didn't really create much of an app to realise the concept.

Library accessibility

I liked the idea of improving the accessibility of other apps by identifying the libraries those apps use, then sending in patches to add accessibility features to them. This was a great effort by one developer, although with only one developer I was not sure of the impact. This wasn't an app that made it easier for people to improve libraries, or one that encouraged other developers to get involved.

It's a very worthwhile effort on this sole developer's behalf. I would have liked to have seen something that would help lots of other developers do the same thing.

Thank you.
@jr0cket


This work is licensed under a Creative Commons Attribution 4.0 ShareAlike License, including custom images & stylesheets. Permissions beyond the scope of this license may be available at @jr0cket