Mostafa-Nafie/Head-Pose-Estimation
Head Pose Estimation

This project estimates the direction a person is looking from the orientation of their head.
Head orientation can be described by three angles (pitch, yaw, and roll), so I trained a machine learning model to estimate these values from facial landmarks.

(Figure: head rotation axes: pitch, yaw, and roll)

Demo:

output.mp4

Dataset:

For this project I used the AFLW2000 dataset, which consists of 2,000 face images annotated with facial landmarks and the ground-truth pitch, yaw, and roll values for each image.
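As a concrete starting point, the pose labels can be read from the dataset's per-image annotation files. This is a hedged sketch: it assumes the AFLW2000-3D convention, where each image has a matching `.mat` file whose `Pose_Para` array begins with `[pitch, yaw, roll]` in radians; the function name `read_pose` is my own.

```python
import numpy as np
from scipy.io import loadmat

def read_pose(mat_path):
    """Return (pitch, yaw, roll) in degrees from one annotation file."""
    mat = loadmat(mat_path)
    # Pose_Para starts with [pitch, yaw, roll] in radians (AFLW2000-3D convention)
    pitch, yaw, roll = mat["Pose_Para"][0][:3]
    return np.degrees([pitch, yaw, roll])
```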

Solution:

  1. Used the MediaPipe library to extract facial landmarks, then selected the most informative features (the nose, forehead, eye, and mouth positions) as the model's input.
  2. Applied a preprocessing step to make the model independent of the face's position and scale in the image.
  3. Trained a regression model on the positions of those landmarks to estimate the pitch, yaw, and roll values.
  4. Used rotation, translation, and projection of the head axes onto the image to visualize the direction the person is looking.
  5. Used the estimated pitch, yaw, and roll values to define that direction.
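Steps 1 to 3 above can be sketched as follows. The normalization scheme (center on the landmark mean, divide by the spread) and the choice of regressor are illustrative assumptions, not necessarily the exact setup in this repository; the MediaPipe extraction is shown only as a comment.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Step 1 (landmark extraction) would use MediaPipe FaceMesh, roughly:
#   results = mp.solutions.face_mesh.FaceMesh().process(rgb_image)
#   pts = [(lm.x, lm.y) for lm in results.multi_face_landmarks[0].landmark]

def normalize(pts):
    """Step 2: make the feature vector independent of where the face is
    and how big it is, by centering the landmarks and dividing by spread."""
    pts = np.asarray(pts, dtype=float)
    centered = pts - pts.mean(axis=0)               # remove translation
    scale = np.linalg.norm(centered, axis=1).max()  # remove scale
    return (centered / scale).ravel()

def train(landmark_sets, angles):
    """Step 3: fit a regressor mapping normalized landmarks -> (pitch, yaw, roll)."""
    X = np.stack([normalize(p) for p in landmark_sets])
    model = RandomForestRegressor(n_estimators=50, random_state=0)
    model.fit(X, np.asarray(angles))                # angles: shape (n, 3)
    return model
```

Because the features are centered and rescaled, shifting or resizing the face in the frame leaves the model input unchanged, which is the point of step 2.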
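Step 4 can be illustrated with plain NumPy: build a rotation matrix from the predicted angles, rotate the unit axes, and project them onto the image plane. The axis convention and composition order here are my own assumptions, and the OpenCV drawing call is shown only as a comment.

```python
import numpy as np

def rotation_matrix(pitch, yaw, roll):
    """Compose rotations about the x (pitch), y (yaw), and z (roll) axes.
    Angles are in radians; the composition order is an illustrative choice."""
    cx, sx = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cz, sz = np.cos(roll), np.sin(roll)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def project_axes(pitch, yaw, roll, origin=(0.0, 0.0), length=100.0):
    """Rotate the unit x/y/z axes, then drop the depth coordinate
    (orthographic projection) to get 2D endpoints for drawing."""
    R = rotation_matrix(pitch, yaw, roll)
    axes = R @ (np.eye(3) * length)         # columns = rotated axis endpoints
    return axes[:2].T + np.asarray(origin)  # (3, 2): one 2D endpoint per axis

# Drawing on a frame with OpenCV would then look like:
#   for end, color in zip(project_axes(p, y, r, nose_xy), axis_colors):
#       cv2.line(frame, nose_xy, tuple(end.astype(int)), color, 2)
```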

Libraries used:

  1. MediaPipe
  2. NumPy
  3. OpenCV
  4. Matplotlib
  5. Pandas
  6. scikit-learn
