Project Description
What it does:
Our project is a facial expression detector that aims to decode non-verbal communication through computer vision. It can detect 7 emotions: happy, neutral, fear, surprise, angry, disgust, and sad. In our demonstration video, we showcase examples of detecting 3 of these emotions.
How we built it:
We built the model around a convolutional neural network (CNN), since convolutional layers learn local spatial patterns in an image (edges, facial features) efficiently and map them to a class prediction; a small sketch of this kind of network is shown below.
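As a rough illustration of the approach (not the project's exact architecture), here is a minimal Keras-style CNN for 7-class expression classification; the 48x48 grayscale input size and layer sizes are illustrative assumptions.

```python
# Minimal sketch of a 7-class emotion CNN (assumed Keras; sizes are illustrative).
from tensorflow.keras import layers, models

def build_emotion_cnn(input_shape=(48, 48, 1), num_classes=7):
    model = models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=input_shape),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),  # one score per emotion
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```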
Challenges we faced:
One major challenge we encountered was the lengthy training time of the neural network, roughly twenty minutes before the CNN model could be used. To address this, we save the trained weights and key values to a file, so the model does not need to be retrained on every run (see the sketch below).
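A possible way to implement this save-once / load-later idea, assuming Keras; the file name, epoch count, and the `train_images` / `train_labels` arrays are hypothetical placeholders.

```python
# Reuse saved weights if they exist; otherwise train once and save them.
import os

WEIGHTS_PATH = "emotion_weights.h5"  # illustrative file name

model = build_emotion_cnn()
if os.path.exists(WEIGHTS_PATH):
    # Skip the ~20-minute training step by loading previously trained weights.
    model.load_weights(WEIGHTS_PATH)
else:
    model.fit(train_images, train_labels, epochs=30, validation_split=0.1)
    model.save_weights(WEIGHTS_PATH)
```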
Accomplishments:
Given our resources, we are most proud of the accuracy our model achieved. We are also pleased with its interface and output.
What we learned:
We gained extensive knowledge about CNNs, including their hyperparameters, and learned ways to make them practical on regular laptops, since training typically demands substantial processing power. Additionally, we learned about Haar cascades and how they detect faces so the CNN can be applied to each face region (see the sketch below).
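A sketch of that face-detection step using OpenCV's Haar cascade, assuming the CNN from the earlier sketch; the frame file name and the 48x48 crop size are illustrative assumptions.

```python
# Detect faces with a Haar cascade, then classify each face crop with the CNN.
import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("frame.jpg")                      # one video frame or photo
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)

for (x, y, w, h) in faces:
    roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
    probs = model.predict(roi.reshape(1, 48, 48, 1))  # CNN from the sketch above
    label_index = int(np.argmax(probs))               # index into the 7 emotions
```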
What's next for Emotion Detector:
This model can be applied to mental health programs where detecting human emotion is essential. It could also be made more user-friendly by building a UI to improve accessibility.