Eye Tracking with Deep Learning (Week 4 Update)
- Published Sep 21, 2018
- In my Deep Learning class at Regis University, I have had the opportunity to work on my own project: modeling eye tracking using a dataset called GazeCapture. This week, I had three major focus areas:
1. Exploratory data analysis (including frames-per-second throughput) of the face-alignment project after running it on all the GazeCapture data
2. Installing a second Nvidia GeForce GTX 1080 Ti
3. Retraining a COCO-trained YOLOv3 model using a Keras implementation on a subset of Google's Open Images v4
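Item 1 mentions measuring FPS. As a minimal, library-agnostic sketch of how such a throughput number could be gathered (the `measure_fps` helper is my own illustration, not code from the linked repository):

```python
import time

def measure_fps(process_frame, frames):
    """Time a per-frame function over a sequence of frames and return
    the average frames-per-second throughput."""
    start = time.perf_counter()
    for frame in frames:
        process_frame(frame)
    elapsed = time.perf_counter() - start
    # Guard against a zero-duration run on trivially fast workloads.
    return len(frames) / elapsed if elapsed > 0 else float("inf")
```

With the face-alignment library this might be called as `measure_fps(fa.get_landmarks, images)`, where `fa` is a constructed `FaceAlignment` instance; that usage is an assumption based on the library's public API.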
Links:
Repository: l.rcd.zone/01-eye-track-repo
GazeCapture: l.rcd.zone/gaze-capture
Face Alignment: l.rcd.zone/face-alignment
Keras-Yolo: git.io/fAd9H (@qqwweee)
TensorBoard: www.tensorflow.org/guide/summ...
Open Images v4: storage.googleapis.com/openim...
Open Images v4 (subset ~ 563 GB): www.figure-eight.com/dataset/...
COCO: cocodataset.org/#home
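The Keras YOLOv3 implementation linked above resizes training images to the network's input resolution while preserving aspect ratio (letterboxing). A sketch of that geometry, assuming the common 416×416 input size (the function name `letterbox_dims` is my own):

```python
def letterbox_dims(img_w, img_h, target_w=416, target_h=416):
    """Compute the scaled size and padding needed to fit an image into
    the network input while preserving aspect ratio (letterboxing)."""
    scale = min(target_w / img_w, target_h / img_h)
    new_w, new_h = int(img_w * scale), int(img_h * scale)
    # Center the scaled image; the remaining border is filled with padding.
    pad_x = (target_w - new_w) // 2
    pad_y = (target_h - new_h) // 2
    return new_w, new_h, pad_x, pad_y

# Example: a 640x480 frame scales to 416x312 with 52 px of vertical padding.
print(letterbox_dims(640, 480))  # → (416, 312, 0, 52)
```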
Comments:
Can I use this code to control a motor right or left depending on the direction of the eye?
Remind me what monitor you're using again? Looks bigger than a 34"...
It's a 49" widescreen. :-)
Links in description do not go anywhere