Getting to understand the accessSOS application has been a great experience! Its use of icon-based communication to broaden 911 accessibility in emergencies is a distinctive solution to a problem faced by over 10% of the American population: people who are deaf or hard of hearing, or who cannot communicate in English.
We took the original concept of the application and placed it into an AI-powered chatbot, with the following improvements:
Icon-based communication remains the primary tool for gathering information, but users can also simply describe the situation in text; we then ask only for the missing details, which makes the process more efficient.
Pre-written questions are asked only when they are relevant to the user's previous answers. At the end, the answers to these questions are collected, organized by information type, and sent to 911 dispatch (sketched below). After that message is sent, the bot keeps asking relevant follow-up questions to build a second, more detailed report on the situation.
A clean user interface with high-contrast icons and clear visual styling.
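As a rough illustration of this flow, the sketch below shows pre-defined questions gated on earlier answers and a dispatch message grouped by information type. The field names, question wording, and message format are illustrative placeholders, not our exact schema.

```typescript
// Minimal sketch of the intake flow: ask only relevant, unanswered questions,
// then group the answers by information type for the dispatch message.

type Report = Record<string, string>;

interface Question {
  field: string;                         // information type the answer belongs to
  prompt: string;
  relevantWhen: (r: Report) => boolean;  // gate on answers the user already gave
}

const questions: Question[] = [
  { field: "emergencyType", prompt: "What kind of emergency is it?", relevantWhen: () => true },
  { field: "location",      prompt: "Where are you right now?",      relevantWhen: () => true },
  { field: "injuries",      prompt: "Is anyone hurt?",               relevantWhen: () => true },
  { field: "fireSpread",    prompt: "Is the fire spreading?",        relevantWhen: r => r.emergencyType === "fire" },
];

// Ask every still-relevant, unanswered question and record the answers.
async function runIntake(ask: (prompt: string) => Promise<string>, report: Report = {}): Promise<Report> {
  for (const q of questions) {
    if (report[q.field] === undefined && q.relevantWhen(report)) {
      report[q.field] = await ask(q.prompt);
    }
  }
  return report;
}

// Organize the collected answers by information type for the 911 dispatch message.
function formatDispatchMessage(report: Report): string {
  return Object.entries(report)
    .map(([field, answer]) => `${field}: ${answer}`)
    .join("\n");
}
```

After the first message goes out, the same loop can run again over a larger follow-up question set to assemble the second, more detailed report.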
We are especially proud of our AI model integration, which extracts specific details from the user's free-text input so that the human dispatcher receives the most crucial information first and can send aid immediately, with the remaining details following afterward.
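The extraction step can be sketched roughly as below, assuming an OpenAI-style chat completions endpoint; the field list, prompt wording, and choice of "crucial" fields are illustrative assumptions rather than our exact implementation.

```typescript
// Rough sketch of extracting structured emergency details from free text,
// then checking which crucial fields the dispatcher still needs.

interface ExtractedInfo {
  location?: string;
  emergencyType?: string;
  peopleInvolved?: string;
  details?: string;
}

// Fields the dispatcher needs first; anything still missing is asked via icons or questions.
const CRUCIAL_FIELDS: (keyof ExtractedInfo)[] = ["location", "emergencyType"];

async function extractEmergencyInfo(userText: string, apiKey: string): Promise<ExtractedInfo> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [
        {
          role: "system",
          content:
            "Extract emergency details from the user's message. Reply with JSON using only " +
            'the keys "location", "emergencyType", "peopleInvolved", "details". ' +
            "Omit any key the message does not mention.",
        },
        { role: "user", content: userText },
      ],
      response_format: { type: "json_object" },
    }),
  });
  const data = await response.json();
  return JSON.parse(data.choices[0].message.content) as ExtractedInfo;
}

// Crucial fields the free-text message did not cover still have to be collected.
function missingCrucialFields(info: ExtractedInfo): (keyof ExtractedInfo)[] {
  return CRUCIAL_FIELDS.filter(f => info[f] === undefined || info[f] === "");
}
```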