Inspiration

We were inspired to create AllergiX because we both have people in our lives who are affected by dietary restrictions. We believe these family members, friends, and others in our society have the right to know what's in their food. That's why we created an app that can detect possible allergens in food simply by analyzing a picture or description.

What it does

AllergiX is a tool for people with dietary restrictions who have questions about the food they are eating. Users have two ways of interacting with the app. First, they can search for a food item, such as a grilled cheese sandwich, and AllergiX will find recipes for that item online. It then scans those recipes for keywords that may indicate the presence of certain allergens. Finally, it aggregates these results into "risks": the percentage of recipes that mention each allergen in their instructions. The second way to search is by taking a picture of the food and uploading it to AllergiX. From there, a machine learning model analyzes the image to determine what kind of food it is likely to be and which allergens may be present.
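The keyword-and-aggregation step can be sketched roughly as below. The allergen keyword lists and function names here are illustrative, not the app's actual implementation:

```javascript
// Illustrative keyword lists -- the real app's lists may differ.
const ALLERGEN_KEYWORDS = {
  dairy: ["milk", "cheese", "butter", "cream", "yogurt"],
  gluten: ["flour", "bread", "wheat", "pasta"],
  shellfish: ["shrimp", "crab", "lobster", "clam"],
};

// Check one recipe's text for each allergen's keywords.
function detectAllergens(recipeText) {
  const text = recipeText.toLowerCase();
  const found = {};
  for (const [allergen, words] of Object.entries(ALLERGEN_KEYWORDS)) {
    found[allergen] = words.some((w) => text.includes(w));
  }
  return found;
}

// Aggregate per-recipe hits into "risks": the percentage of
// recipes that mention each allergen.
function aggregateRisks(recipeTexts) {
  const counts = Object.fromEntries(
    Object.keys(ALLERGEN_KEYWORDS).map((a) => [a, 0])
  );
  for (const text of recipeTexts) {
    const found = detectAllergens(text);
    for (const a of Object.keys(found)) {
      if (found[a]) counts[a] += 1;
    }
  }
  const risks = {};
  for (const a of Object.keys(counts)) {
    risks[a] = Math.round((100 * counts[a]) / recipeTexts.length);
  }
  return risks;
}
```

For a grilled cheese search, for example, most recipe texts would mention "butter" or "cheese", so the dairy risk would come out near 100%.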

How we built it

We built AllergiX using ReactJS for the frontend and Node.js for the backend server. It also makes use of several APIs, including Google's Search API and a HuggingFace machine learning model for image recognition.
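The image path of the backend can be sketched as a small Node.js helper that forwards an uploaded photo to a HuggingFace image-classification endpoint. The model id and helper names below are assumptions for illustration, not the app's actual code; the API token is read from the environment:

```javascript
// Assumed food-classification model on the HuggingFace hub.
const HF_MODEL = "nateraw/food";

// POST the raw image bytes to the HuggingFace inference API and
// return its JSON predictions (an array of { label, score }).
async function classifyImage(imageBuffer) {
  const res = await fetch(
    `https://api-inference.huggingface.co/models/${HF_MODEL}`,
    {
      method: "POST",
      headers: { Authorization: `Bearer ${process.env.HF_TOKEN}` },
      body: imageBuffer,
    }
  );
  return res.json();
}

// Pick the highest-confidence label from the model's response.
function topPrediction(predictions) {
  return predictions.reduce((best, p) => (p.score > best.score ? p : best));
}
```

The top label can then be fed into the same recipe search used for text queries.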

Challenges we ran into

Initially, we wanted to use an open-source npm package to scrape the ingredients from the recipe URLs we supplied, but we found that many of these packages simply didn't work or were too difficult to integrate. To overcome this, we built our own web scraper that identifies keywords on these sites. We also ran into issues finding the right AI model for our project. We started with OpenAI's model but found it too restrictive and costly. We then switched to Google's Vision API, but it lacked the ability to differentiate between types of food, labelling everything only as "food" or "sandwich". Finally, we settled on a HuggingFace API that recognizes certain ingredients in food and returns a confidence percentage for each of them, which was exactly what we were looking for. The last challenge was making the application look pretty (definitely the most important step). We used NextUI and MUI X components for our application, which ended up working nicely.
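The core of a keyword-identifying scraper like the one described above can be sketched in a few lines: strip the markup from a fetched recipe page, then scan the remaining text for a keyword list. The function names and regexes here are illustrative, not our exact implementation:

```javascript
// Reduce a recipe page to plain lowercase text by dropping
// scripts, styles, and remaining HTML tags.
function extractText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, " ")
    .replace(/<style[\s\S]*?<\/style>/gi, " ")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .toLowerCase();
}

// Return the subset of keywords that appear in the page text.
function findKeywords(html, keywords) {
  const text = extractText(html);
  return keywords.filter((w) => text.includes(w));
}
```

A production scraper would also need to handle relative text in attributes, ingredient lists rendered by JavaScript, and so on, which is part of why the off-the-shelf packages were hard to replace.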

Accomplishments that we're proud of

We ran into a lot of complex issues along the way, but we were able to overcome them by brainstorming and collaborating on solutions. Through this project, we built an app that we feel would benefit society, and we are happy to have worked on it. There were countless times we ran into trouble, but at the end of the day, we are proud of the app we made.

What we learned

Overall, we learned a lot about interfacing with third-party APIs and interpreting that data in a reasonable way. We also had a lot of fun brainstorming new and exciting ways we could improve the app, and what features people would really care about if they were to use it.

What's next for AllergiX

The biggest improvement for AllergiX would be support for more allergens. Currently, we only detect a few (dairy, gluten, shellfish), and adding more would make the app more inclusive and open the door to more users. We would also like to improve our web scraper; it is very rudimentary due to time constraints, but given more time we think it could be much better. Finally, we have a more expansive vision for the UI that we want to build out, including better instructions for using the app and clearer separation of the two 'modes' a user can choose between (image and text).
