> The Problem
Starting a conversation can feel awkward, even when you already know something about the person in front of you. In social settings, it can be hard to think of the right thing to say quickly, especially if you are trying to make the conversation feel personal instead of generic.
Cupid Glasses started from that gap: what would it look like if a wearable device could quietly understand who you are talking to and help you start a better conversation in real time?

An awkward first conversation
> What Cupid Glasses do
Cupid Glasses are wearable smart glasses that generate personalized icebreakers while you are talking to someone. The user first creates profiles in a mobile app with each person’s name, photo, and interests. When the glasses see someone, the system matches the live camera feed to a saved profile, combines that person’s interests with their inferred emotion, and uses Google Gemini to generate context-aware conversation starters. The generated icebreakers then appear directly on an LCD mounted to the glasses.
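As a rough illustration, a saved profile only needs a handful of fields. The field names below are assumptions for this sketch, not the app’s actual schema:

```python
# Hypothetical shape of one saved profile; field names are illustrative
# and not taken from the app's actual Firebase data model.
profile = {
    "name": "Alex",
    "photo_url": "https://example.com/photos/alex.jpg",
    "interests": ["rock climbing", "indie films", "espresso"],
}

def describe(profile: dict) -> str:
    """One-line summary of a profile, e.g. for debugging or logs."""
    return f"{profile['name']}: {', '.join(profile['interests'])}"

print(describe(profile))  # → Alex: rock climbing, indie films, espresso
```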
Top down view of Cupid Glasses
> How it works
Cupid Glasses are built as a pipeline across a mobile app, a backend server, and wearable hardware.
The mobile app lets users create and manage profiles for the people they know. Each profile stores a name, photo, and interests, with Firebase handling storage and retrieval of the profile data. When the glasses are running, an ESP32-CAM streams live video over a local Wi-Fi network to a Python backend. The backend uses OpenCV and DeepFace to match the person in the live feed against the saved profile photos. Once a match is found, the backend retrieves that person’s interests from Firebase and runs emotion inference to add live social context. All of this is sent to Google Gemini, which generates personalized icebreakers; these are sent back over the network to the glasses and displayed on the LCD.
This project won first place in the Valentine’s Day theme and Best Use of Gemini API at MakeUofT 2026.

My team at the hackathon award ceremony
© Eshaan Marocha 2025


