Inspiration: Many of our friends have a hard time seeing, and we wondered what we could do to help.

What it does: Our device detects nearby objects and, if they get too close, vibrates to warn the user that they are approaching an obstacle.

How we built it: We ran a pre-trained object detection model on a laptop and wrote code that triggers haptic feedback to warn the user of potential collisions; a rough sketch of that loop follows.
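
Below is a minimal sketch of the detect-and-warn loop, written in Python as an assumption (the write-up does not name the language, the specific pre-trained model, or the haptic hardware). The detect_objects() and vibrate() functions are hypothetical placeholders, and using relative bounding-box size as a stand-in for distance is an assumed heuristic, not the team's confirmed method.

```python
# Sketch of the detection loop: grab webcam frames, run a detector,
# and buzz when a detected object appears "too close".
import cv2  # OpenCV for webcam capture

# Assumed threshold: a box covering this fraction of the frame counts as "too close".
PROXIMITY_THRESHOLD = 0.25


def detect_objects(frame):
    """Placeholder for the pre-trained detector.
    Should return a list of (x, y, w, h) bounding boxes."""
    return []  # swap in the actual pre-trained model here


def vibrate():
    """Placeholder for the haptic output (e.g. a motor driven over serial/GPIO)."""
    print("BUZZ: object too close")


def main():
    cap = cv2.VideoCapture(0)  # laptop webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame_h, frame_w = frame.shape[:2]
        for (x, y, w, h) in detect_objects(frame):
            # Relative box area is a rough proxy for distance:
            # the larger the box, the closer the object.
            if (w * h) / (frame_w * frame_h) > PROXIMITY_THRESHOLD:
                vibrate()
                break
    cap.release()


if __name__ == "__main__":
    main()
```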

Challenges we ran into: One challenge was installing the pre-trained AI models and the software they depend on.

Accomplishments that we're proud of: We were proud of getting the AI model running for the first time and seeing it detect objects accurately.

What we learned: How AI models work and how to improve their effectiveness.

What's next for C.A.R.L Hack-A-Thon: Polishing our design, optimizing performance, and adding more features.
