Albie - Learn and build LLM-based projects with deep, real-time feedback.

🤯 Inspiration

Learning is a long and personal journey, made even better with a friend! With Albie, we aim to use personalized education to revolutionize and accelerate the way people learn through highly tailored feedback. Albie challenges users to test their understanding through AI conversation and AI-guided projects. For our first topic, we wanted to show how we can explain new and relevant subjects such as LLMs and retrieval augmentation.

🧐 What it does

In our demo, we chose to focus on teaching about LLMs and challenged the user to apply their knowledge of LLMs through the use of retrieval augmentation. Our product's design is backed by research following Bloom’s Taxonomy, a hierarchical model developed by educational psychologist Benjamin Bloom. This model aims to foster critical thinking and problem-solving skills -- two skills essential to becoming a highly effective developer.

In Albie, teaching and implementing retrieval augmentation goes through two phases: 1) checking student understanding and 2) creating a solution. Using Hume's Facial Expression AI, Hume's Vocal Expression AI, and context injection, the user is both challenged and assisted through the complex topic and the project.

Let's say...

  • The user has just learned about LLMs and how to use the OpenAI API --> Albie asks them to explain what they learned, assesses their understanding, and assigns them a problem
  • The user explains the problem and asks clarifying questions until there is closure --> Next, the user explains the solution in actionable steps and then jumps into building the project --> As they code, we use ChatGPT to check whether their code matches their previously outlined plan and automatically cross off completed steps in a progress visualization while keeping in touch with the user
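The cross-off step described above can be sketched as follows. This is a minimal illustration, assuming the LLM has been asked to reply with a comma-separated list of the plan-step numbers the user's code already satisfies; the function names are hypothetical, not our production code.

```python
# Hypothetical sketch of the plan cross-off logic. We assume the LLM
# replies with a comma-separated list of completed step numbers
# (e.g. "1, 3") when shown the plan and the user's code.

def parse_completed_steps(llm_reply: str, n_steps: int) -> set[int]:
    """Extract valid 1-based step numbers from the model's reply."""
    completed = set()
    for token in llm_reply.replace(",", " ").split():
        if token.isdigit() and 1 <= int(token) <= n_steps:
            completed.add(int(token))
    return completed

def render_checklist(steps: list[str], completed: set[int]) -> list[str]:
    """Mark completed steps so the UI can visualize progress."""
    return [
        ("[x] " if i in completed else "[ ] ") + step
        for i, step in enumerate(steps, start=1)
    ]
```

For example, with the plan `["Load documents", "Embed text", "Query the index"]` and a reply of `"1, 3"`, the checklist marks steps 1 and 3 done and leaves step 2 open.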

🥸 How we built it

  • React NextJS as frontend framework, MUI for UI library
  • Flask Python for backend API services

LLM & Prompt Engineering

  • Anthropic, Cohere, and OpenAI for text-based generative AI
  • OpenAI Whisper for transcribing speech to text
  • Pinecone as a vectorized database of score-based emotion data
  • Hume AI for real-time facial emotion detection. We obtain statistics about the emotions, e.g. the percentage of each emotion expressed throughout the full audio file
  • We combine the predictions for the code and the emotions to help the assistant provide more insightful feedback. (Optimally, as the snippet length gets smaller, our program more closely simulates "real time." However, we chose a length of 1 second because predicting emotion over a shorter window is difficult.)
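The aggregation of per-second emotion scores into whole-recording percentages can be sketched like this. The input format (one dict of emotion scores per 1-second snippet) is our assumption about the shape of the score-based output.

```python
# Sketch: aggregate per-snippet emotion scores into overall percentages.
# Input format (emotion name -> confidence score per 1-second snippet)
# is an assumption for illustration.
from collections import defaultdict

def emotion_percentages(snippets: list[dict[str, float]]) -> dict[str, float]:
    """Sum scores per emotion across snippets, then normalize to 100%."""
    totals: dict[str, float] = defaultdict(float)
    for scores in snippets:
        for emotion, score in scores.items():
            totals[emotion] += score
    grand_total = sum(totals.values()) or 1.0  # avoid division by zero
    return {e: 100.0 * s / grand_total for e, s in totals.items()}
```

For example, two snippets scored `{"joy": 0.8, "confusion": 0.2}` and `{"joy": 0.2, "confusion": 0.8}` aggregate to 50% joy and 50% confusion over the recording.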

Challenges we ran into

  • Building an interface with so many ambitious features and functions in a very constrained time, while handling complex prompt engineering to process human data in real time
  • Learning LLM implementation in a short time frame
  • S C O P E
  • APIs failing with CORS errors, ChatGPT hallucinating, and compiling the user's code in a daemon

Challenges in prompt engineering

  • It was difficult to create a prompt such that ChatGPT would recognize whether the code matched the user's previously outlined plan. Since the user can develop in any order and can break an API call across many lines, we prompted ChatGPT to match each line of code to a task by providing the API sample code to ChatGPT.
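One way to assemble such a prompt is to include the API sample code alongside the numbered plan and the user's code, as described above. The wording below is an illustrative sketch, not our exact production prompt.

```python
# Sketch: build a prompt that asks the model to map code to plan steps.
# The prompt wording is illustrative, not the exact production prompt.

def build_plan_match_prompt(plan: list[str], user_code: str, api_sample: str) -> str:
    """Assemble a prompt for matching code lines to plan steps.
    Including the API sample code anchors the model, since users may
    split one API call across many lines and work in any order."""
    numbered_plan = "\n".join(f"{i}. {step}" for i, step in enumerate(plan, 1))
    return (
        "You are checking a student's code against their plan.\n"
        f"Plan:\n{numbered_plan}\n\n"
        f"Reference API sample code:\n{api_sample}\n\n"
        f"Student code:\n{user_code}\n\n"
        "Reply with only the numbers of the plan steps the code completes, "
        "comma-separated."
    )
```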

😊 Accomplishments that we're proud of

  • Teaching, hinting, and deploying code to create a system that uses vectors and Pinecone.
  • Implementing nearly e v e r y t h i n g we set out to do at the beginning, in the span of 24 hours. Literally proud!

🤔 What we learned

Small steps go a long way. Instead of building something super complex, build something small but deliver.

😤 What's next for Albie

Create dynamic problems (LLM projects), and customize the assessment in a language the user understands.
