Augmented reality gives us an entirely new ability: interacting with virtual objects in our physical world. But AR isn't just about manipulating 3D space; it also gives us powerful ways to enhance what we see on a day-to-day basis.

Imagine being able to read an individual's vital signs in a matter of seconds. The medical implications are monumental: visualizing blood flow during surgery, helping EMTs respond more effectively, and allowing civilians to take more effective action during an emergency.

With Pulse, we've made all of this possible.

We took the Meta glasses and extracted a real-time video feed of the scene. When triggered by the user, or potentially even upon detecting a face, the glasses begin recording a video stream and send it off to our web API.
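
To make that step concrete, here is a rough sketch of the kind of client this capture-and-send loop needs: grab frames and POST each one to the web API. The endpoint URL, payload format, and frame rate are placeholders rather than our actual Pulse API, and the sketch reads from a local webcam instead of the glasses.

```python
# Hypothetical client-side sketch: capture frames for a few seconds and
# stream them to the web API as JPEGs. URL, payload format, and frame rate
# are illustrative assumptions, not the real Pulse endpoints.
import time

import cv2        # pip install opencv-python
import requests   # pip install requests

API_URL = "http://localhost:5000/frames"   # placeholder address for the Pulse API

def stream_frames(seconds=10, fps=15):
    """Capture webcam frames for a few seconds and POST each one as a JPEG."""
    cap = cv2.VideoCapture(0)
    deadline = time.time() + seconds
    while time.time() < deadline:
        ok, frame = cap.read()
        if not ok:
            break
        encoded, jpeg = cv2.imencode(".jpg", frame)
        if encoded:
            requests.post(API_URL, data=jpeg.tobytes(),
                          headers={"Content-Type": "image/jpeg"})
        time.sleep(1.0 / fps)
    cap.release()

if __name__ == "__main__":
    stream_frames()
```

In the real app this loop lives in the Unity client on the glasses; the sketch only shows the shape of the traffic the web API expects.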

Our web API, written in Python with Flask and OpenCV, then applies the Eulerian Video Magnification algorithm to the incoming frames, analyzing amplified changes in pixel coloration. We implemented this by modifying Tristan Hearn's webcam pulse detector library (https://github.com/thearn/webcam-pulse-detector). After a few seconds of facial detection and tracking, the app estimates the individual's heart rate and sends it back to our Unity app. The result is strikingly accurate.
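
For anyone curious about the underlying idea, here is a heavily simplified sketch of that server side, not our actual code: a Flask endpoint accumulates incoming frames, an OpenCV Haar-cascade face detector picks out a forehead region, and the dominant frequency of the mean green-channel signal over that region gives an estimated heart rate. Route names, the assumed frame rate, and the frequency band are illustrative assumptions; the real implementation builds on the full webcam-pulse-detector pipeline rather than this bare-bones FFT.

```python
# Simplified server-side sketch (not the actual Pulse code): accumulate
# frames, track the mean green value of a forehead region, and estimate
# heart rate from the strongest frequency in that signal.
import cv2
import numpy as np
from flask import Flask, jsonify, request

app = Flask(__name__)
FPS = 15.0                 # assumed frame rate of the incoming stream
green_signal = []          # mean forehead green value, one sample per frame
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def forehead_green_mean(frame):
    """Return the mean green-channel value over a forehead-sized ROI, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    # Central strip near the top of the face box, roughly the forehead.
    roi = frame[y + h // 10 : y + h // 4, x + w // 4 : x + 3 * w // 4]
    return float(roi[:, :, 1].mean())      # green channel carries the pulse best

@app.route("/frames", methods=["POST"])
def receive_frame():
    frame = cv2.imdecode(np.frombuffer(request.data, np.uint8), cv2.IMREAD_COLOR)
    value = forehead_green_mean(frame)
    if value is not None:
        green_signal.append(value)
    return jsonify(samples=len(green_signal))

@app.route("/bpm", methods=["GET"])
def report_bpm():
    if len(green_signal) < int(FPS * 5):    # need a few seconds of data
        return jsonify(error="not enough samples yet"), 400
    data = np.array(green_signal) - np.mean(green_signal)
    freqs = np.fft.rfftfreq(len(data), d=1.0 / FPS)
    power = np.abs(np.fft.rfft(data)) ** 2
    band = (freqs > 0.75) & (freqs < 3.0)   # 45-180 BPM, plausible heart rates
    bpm = 60.0 * freqs[band][np.argmax(power[band])]
    return jsonify(bpm=round(bpm, 1))

if __name__ == "__main__":
    app.run()
```

In this sketch the Unity app would simply poll the /bpm route once enough frames have arrived to read back the estimate.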

This is only the beginning for Pulse. We intend to add breathing-rate and blood-flow detection in the future, making the app a genuine testament to the promise of AR in the medical field.
