Inspiration

All of us have worked with people who have ASD at some point and wanted to be able to help them. People with ASD often have trouble recognizing others' emotions, so we wanted to build a device that helps them in everyday life, making interactions with others easier and more accessible.

What it does

It recognises whether someone is smiling using two Haar cascades: frontal face detection and smile detection. We then added two messages, one for smiling and one for not smiling, to let the user know what the live feed is showing, and wrapped it all into a web-accessible site.
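
A minimal sketch of that detection step is below, assuming OpenCV's bundled frontal-face and smile cascades; the tuning parameters and on-screen messages are illustrative, not our exact values.

```python
# Sketch of the smile-detection loop using OpenCV's bundled Haar cascades.
# Cascade choices and detectMultiScale parameters are assumptions for illustration.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

def detect_smile(frame):
    """Return True if a smile is found inside any detected face."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = gray[y:y + h, x:x + w]   # only look for a smile inside the face region
        smiles = smile_cascade.detectMultiScale(roi, scaleFactor=1.7, minNeighbors=20)
        if len(smiles) > 0:
            return True
    return False

cap = cv2.VideoCapture(0)              # 0 = default computer camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    message = "Smiling :)" if detect_smile(frame) else "Not smiling"
    cv2.putText(frame, message, (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("Smiley", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```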

How we built it

We used Haar cascades, OpenCV, NumPy, and HTML/Python to build the website.
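
One common way to wire an OpenCV feed into a Python/HTML site is an MJPEG stream; the sketch below shows that approach, assuming Flask, and the route names and inline page are illustrative rather than our exact setup.

```python
# Hedged sketch of exposing an OpenCV live feed to a web page via an MJPEG stream.
# Flask and the route names here are assumptions for illustration.
import cv2
from flask import Flask, Response

app = Flask(__name__)
cap = cv2.VideoCapture(0)

def frames():
    # Yield JPEG-encoded frames as a multipart stream that an <img> tag can display.
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, jpeg = cv2.imencode(".jpg", frame)
        if not ok:
            continue
        yield (b"--frame\r\n"
               b"Content-Type: image/jpeg\r\n\r\n" + jpeg.tobytes() + b"\r\n")

@app.route("/")
def index():
    # Minimal page: embed the live stream; the real site also shows the smiling message.
    return "<h1>Smiley</h1><img src='/video_feed'>"

@app.route("/video_feed")
def video_feed():
    return Response(frames(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```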

Challenges we ran into

The website wasn't working for a while because of live-feed input and output issues. The video input on the website somehow connected to a phone camera instead of the computer camera, and we didn't realize it until we got a notification on the phone. We also wanted to train our own Haar cascade, but it didn't work out.

Accomplishments that we're proud of

The program accurately detects whether someone is smiling, and we were able to use a live feed to make it more accessible and efficient for everyday users.

What we learned

How to build proper websites and how to create and use Haar cascades, and we gained more experience with different languages.

What's next for Smiley

We hope to add more emotions so that it covers a wider range, such as frowning, crying, anger, and so on. We could also extend the project into an IoT device with a Raspberry Pi and tweak the code so an LED indicates whether the person is smiling. We could also CAD a case for all the components and switch from the computer's camera to a Pi Camera, which would make it easier to use for video calls.
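
A rough sketch of how the LED idea could look on a Raspberry Pi is below; gpiozero, the GPIO pin number, and the cascade parameters are assumptions, since this part hasn't been built yet.

```python
# Hedged sketch of the future Raspberry Pi idea: light an LED while a smile is detected.
# gpiozero and GPIO pin 17 are illustrative assumptions.
import cv2
from gpiozero import LED

led = LED(17)                          # assumed wiring: LED on GPIO 17
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

cap = cv2.VideoCapture(0)              # a Pi Camera can also appear as /dev/video0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    smiling = False
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]
        if len(smile_cascade.detectMultiScale(roi, 1.7, 20)) > 0:
            smiling = True
            break
    if smiling:
        led.on()                       # LED stays lit only while a smile is detected
    else:
        led.off()
```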

Built With

Haar cascades, OpenCV, NumPy, Python, HTML