
FINAL REPORT OF DIGITAL VIDEO PROCESSING

FACE TRACKING

I. INTRODUCTION

In this report, the work consists of capturing a video of my face moving in and out of the scene, developing a technique to track the face in the video, and finally presenting the details of the technique I used, together with the final results shown as pictures. To do this, some techniques related to the Matlab programming language were used, as we will see in detail below. The result of this report was obtained with the help of the combination of all the tracking code given in the appendices.

II. PROJECT DETAILS

II.1. Capturing a video of my face

Although I mentioned in the introduction that my project uses techniques related to the Matlab programming language, this step (capturing a video of my face) does not involve Matlab. To capture the video, I only used a digital camera. After capturing it, I saved it on my computer in 3GP format in a file named gashema.

II.2. Playing the captured video in Matlab

To play the captured video in Matlab, a Matlab function was used, as shown in the appendices of this report. The objective of playing the video in Matlab was to test it before tracking. Since the original video was in 3GP format and the Matlab function plays AVI files, I first converted it to AVI format using a suitable converter. I then used play_video('filename.avi') as the function to read the video in Matlab.
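For example (assuming the converted file is named gashema.avi, following the naming used in section II.1, and that it is on the Matlab path), the video can be checked with a single call:

play_video('gashema.avi');   % opens a figure and plays the converted video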

III. TRACKING MY FACE

To track my face, I used a technique based on the Kalman filter, a useful technique for tracking different types of moving objects, invented by Rudolf Kalman and famously used by NASA (National Aeronautics and Space Administration) to track the trajectory of spacecraft. At its heart, the Kalman filter is a method of combining noisy (and possibly missing) measurements with predictions of the state of an object to achieve an estimate of its true current state. The aim of this project is not to explain or develop the Kalman filter itself as applied to tracking my face; I include these few explanations only so that someone reading my report can get an idea of what this technique is about.
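For reference, the standard predict/update equations of the Kalman filter, written with the same symbols used by the kalman_step function in the appendices (F the state transition model, H the measurement model, Q and R the process and measurement noise covariances, m and P the state estimate and its covariance, z_k the measurement), are:

Predict:   m_{k|k-1} = F m_{k-1|k-1}
           P_{k|k-1} = F P_{k-1|k-1} F' + Q

Update:    innovation = z_k - H m_{k|k-1}
           S_k = H P_{k|k-1} H' + R
           K_k = P_{k|k-1} H' S_k^{-1}
           m_{k|k} = m_{k|k-1} + K_k (z_k - H m_{k|k-1})
           P_{k|k} = P_{k|k-1} - K_k H P_{k|k-1}

These are exactly the steps carried out by kalman_step in the appendices.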

a. Running the Kalman filter tracker


To run the basic Kalman filter tracker on my video, the following Matlab call was used: kalman_tracker('gashema.avi', 0.05, 50, 3);

The Matlab code for this function is found in the appendices, under the title "Matlab code for the combination of all the above codes".

While the tracker is running in Matlab, pressing any key will pause the tracker until you press another key.

b. Implementation of the project in detail

To implement this tracking project, four steps were implemented using the Matlab programming language. The figure below shows what the process looks like.

Figure: the basic processing pipeline for object tracking.

In this section we describe each of the processing modules and the implementation provided. First we describe the top-level function that strings together all processing modules into a complete tracker.

1. The top-level function

A tracker in the practicum software framework consists of:

a segmenter that segments foreground objects from the background;

a recognizer that identifies which foreground objects are being (or should be) tracked;

a representer that represents each tracked object in terms of some abstract feature space;

a tracker that actually does the tracking (i.e. estimates the position of each tracked target in the current frame); and

an optional visualizer that displays a visualization of each frame of the tracker.

As I mentioned at the beginning of this report, to track my face in the video a Matlab function implementing each of these modules is required for the tracker to be complete. That is, a tracker (a collection of processing modules) is modeled in Matlab as a structure containing references to each module (that is, to the Matlab function implementing the module). This structure also contains all state variables that must be passed down the pipeline to other modules. The structure T must contain the following substructures:

1. T.segmenter which must contain a function reference T.segmenter.segment to a function taking the structure T and the current frame.

2. T.recognizer which must contain a function reference T.recognizer.recognize to a function taking the structure T and the current frame.

3. T.representer which must contain a function reference T.representer.represent to a function taking the structure T and the current frame.

4. T.tracker which must contain a function reference T.tracker.track to a function taking the structure T and the current frame.

5. And optionally T.visualizer which must contain a function reference T.visualizer.visualize to a function taking the structure T and the current frame.

For example, you can run the tracker on a video using the run_tracker function, called as run_tracker(fname, T), where T is the structure set up previously with references to all of the tracking modules and fname is the filename of the video to run the tracker on. You will also write a wrapper function around run_tracker that sets up the processing pipeline and sets any needed parameters, as sketched below. We will see all of these Matlab functions in the Matlab code at the end of this report in the appendices.
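The code of run_tracker itself is not reproduced in the appendices, so the following is only a minimal sketch of what such a driver could look like, assuming the videoReader/nextFrame/getFrame interface already used in play_video and the calling convention run_tracker(fname, T) used by kalman_tracker; the real framework function also supports pausing on a key press, which is omitted here.

function run_tracker(fname, T)
% Sketch of the top-level driver: read each frame and push it through the
% pipeline in order (segment, recognize, represent, track, visualize).
vr = videoReader(fname);
while (nextFrame(vr))
    frame = getFrame(vr);
    T = feval(T.segmenter.segment, T, frame);
    T = feval(T.recognizer.recognize, T, frame);
    T = feval(T.representer.represent, T, frame);
    T = feval(T.tracker.track, T, frame);
    if isfield(T, 'visualizer')
        T = feval(T.visualizer.visualize, T, frame);
    end
end
return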

2. The segmenter

The segmenter provided performs simple background subtraction based on a rolling-average background model and temporal differencing. The background model is updated like this:

B_t = γ · I_t + (1 − γ) · B_{t−1}

where γ is a learning rate parameter that controls how quickly the background model incorporates new information and how quickly it forgets older observations, I_t is the current grayscale frame, and B_t is the background model. Every pixel in every frame is classified as either background (0) or foreground (1) using a simple threshold:

F_t = |I_t − B_t| > τ

where τ is a threshold on the difference between a pixel in the current frame and the background model. This threshold controls which pixels are classified as foreground. Finally, the foreground image is cleaned a little to close small gaps:

F_t ← F_t • D_r

where • denotes the morphological closing operation and D_r is a disk of radius r.

The Matlab implementation of this segmenter is in the file background_subtractor.m in the framework directory; see the Matlab code at the end of this report in the appendices.
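As an illustration only (this script is not part of the framework), the segmenter can be exercised on its own with the parameter values used in section a, given two frames frame1 and frame2 already read from the video:

% Hypothetical standalone check of the segmenter on two consecutive frames.
T.segmenter.gamma  = 0.05;  % learning rate gamma
T.segmenter.tau    = 50;    % foreground threshold tau
T.segmenter.radius = 3;     % radius of the closing disk
T = background_subtractor(T, frame1);   % first call initializes the background
T = background_subtractor(T, frame2);   % second call segments against the model
imshow(T.segmenter.segmented);          % view the binary foreground mask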

3. The recognizer

The next stage in the pipeline is the recognizer. The basic recognizer provided with the tracking framework is based on simple blob detection. Connected components are found in the segmented image, and the whole set of components is retained as the "recognized" object to be tracked.

By referring to the Matlab code of the recognizer, you can see that the recognizer assumes the segmented image from the previous stage is passed in the T.segmenter.segmented variable. This is an example of how state is passed between processing modules. The recognizer then puts its result in the variable T.recognizer.blobs for subsequent modules to use.

It is important to note that the framework does not enforce any particular protocol for passing state variables from module to module. You are responsible for establishing how each module receives its parameters and how it stores its results.

4. The representer

The next module is the representer, which takes the results of recognition and computes a representation of each recognized object to be tracked. In the basic tracker, the representer extracts the BoundingBox and Area of each blob and only keeps the biggest one as the single object being tracked. See the Matlab code in the appendices, and the small example below.
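To make the representation concrete, here is a toy example (the label image is made up) of the same "keep the biggest blob" step; note that the BoundingBox returned by regionprops has the form [x y width height] in pixels, which is the 4-dimensional measurement the tracker will use:

% Toy illustration of selecting the largest connected component.
L = bwlabel(logical([1 1 0 0 0; 1 1 0 1 0; 0 0 0 1 0]));
R = regionprops(L, 'BoundingBox', 'Area');
[I, IX] = max([R.Area]);      % index of the biggest blob
biggest = R(IX).BoundingBox   % its [x y width height]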

In a more realistic tracker, the representer module would be responsible for maintaining a list of objects being tracked and establishing correspondences between recognized objects and those currently being tracked.

5. The tracker

A simple Kalman filter tracker has been provided in the practicum framework. It uses the results of the representer to track the position and extent of the object being tracked. The Kalman filter works by estimating an unobservable state which is updated in time with a linear state update and additive Gaussian noise. A measurement is provided at each time step which is assumed to be a linear function of the state plus Gaussian noise. It is the most complex module in the framework, and it receives and stores a large amount of state information. The Matlab code for this step is at the end of this report in the appendices.
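In this framework the state is simply the 4-dimensional bounding box [x y width height] produced by the representer, and kalman_tracker (see the appendices) sets F = H = eye(4), Q = 0.5*eye(4) and R = 5*eye(4), i.e. a random-walk model on the bounding box. As a purely illustrative example (the numbers are made up), one predict/update step with these matrices looks like this:

% One illustrative Kalman filter step with the matrices from kalman_tracker.
F = eye(4); H = eye(4); Q = 0.5 * eye(4); R = 5 * eye(4);
m = [200 100 180 240]'; P = eye(4);      % previous state estimate and covariance
z_k = [205 103 178 242]';                % hypothetical measured bounding box
m_pred = F * m;                          % predicted state
P_pred = F * P * F' + Q;                 % predicted covariance
K_gain = P_pred * H' / (H * P_pred * H' + R);   % Kalman gain
m_new  = m_pred + K_gain * (z_k - H * m_pred)   % updated state (printed)
P_new  = P_pred - K_gain * H * P_pred;          % updated covariance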

6. The visualizer

The final module in the processing pipeline is the visualizer. It is not mandatory, but it is always a good idea to provide a minimal visualizer to see how your tracker is performing. The visualizer provided simply displays the current frame with rectangles indicating the current detection and the current estimate (file visualize_kalman.m in the framework directory). See the Matlab code in the appendices.

IV. RESULTS AND DISCUSSION OF MY WORK

When my face in the video is tracked, the tracker is shown following the face. When the face moves, the tracker moves to wherever the face moves in order to keep tracking it. When the face leaves the frame, the tracker disappears; if the face comes back in, the tracker immediately reappears. If the face is stationary, the tracker stops moving, i.e. it tracks the face without movement.

The results of the work are shown below.

a. The face is in the frame and moving

[Two video frames with the tracker's rectangles following the face as it moves; image axes are in pixels.]

c. The face is out of the frame

[Video frame with the face out of the frame.]

REFERENCES

1. Yao-Jiunn Chen and Yen-Chun Lin, "Simple Face-detection Algorithm Based on Minimum Facial Features".

2. Sanjay Kr. Singh, D. S. Chauhan, Mayank Vatsa, and Richa Singh, "A Robust Skin Color Based Face Detection Algorithm".

3. Eli Brookner, Tracking and Kalman Filtering Made Easy, Wiley-Interscience, 1st edition, 1998.

4. http://en.wikipedia.org/wiki/Face_detection

5. http://www.micc.unifi.it/bagdanov/tracking/


V. APPENDICES

1. Matlab code to read my video

function play_video(fname)
% PLAY_VIDEO - simple function to play a video in a Matlab figure.
% PLAY_VIDEO(fname) will play the video in the file specified by 'fname'.
% See also: videoReader, nextFrame, getFrame
vr = videoReader(fname);
while (nextFrame(vr))
    img = getFrame(vr);
    imshow(img);
    pause(0.01);
end
return

2. Matlab code for the tracking framework

a. Matlab code for the segmenter

function T = background_subtractor(T, frame)
% Do everything in grayscale.
frame_grey = double(rgb2gray(frame));

% Check to see if we're initialized.
if ~isfield(T.segmenter, 'background')
    T.segmenter.background = frame_grey;
end

% Pull local state out.
gamma = T.segmenter.gamma;
tau = T.segmenter.tau;
radius = T.segmenter.radius;

% Rolling average update.
T.segmenter.background = gamma * frame_grey + (1 - gamma) * ...
    T.segmenter.background;

% And threshold to get the foreground.
T.segmenter.segmented = abs(T.segmenter.background - frame_grey) > tau;

% Close small gaps in the foreground mask.
T.segmenter.segmented = imclose(T.segmenter.segmented, strel('disk', radius));
return

b. Matlab code for the recognizer

function T = find_blob(T, frame)
% Label connected components in the segmented foreground mask.
T.recognizer.blobs = bwlabel(T.segmenter.segmented);
return

c. Matlab code for the representer

function T = filter_blobs(T, frame)
% Make sure at least one blob was recognized.
if sum(sum(T.recognizer.blobs))
    % Extract the BoundingBox and Area of all blobs.
    R = regionprops(T.recognizer.blobs, 'BoundingBox', 'Area');
    % And only keep the biggest one.
    [I, IX] = max([R.Area]);
    T.representer.BoundingBox = R(IX(size(IX,2))).BoundingBox;
end
return

d. Matlab code for the tracker

function T = kalman_step(T, frame)
% Get the current filter state.
K = T.tracker;

% Don't do anything unless we're initialized.
if isfield(K, 'm_k1k1') && isfield(T.representer, 'BoundingBox')
    % Get the current measurement out of the representer.
    z_k = T.representer.BoundingBox';

    % Project the state forward m_{k|k-1}.
    m_kk1 = K.F * K.m_k1k1;

    % Partial state covariance update.
    P_kk1 = K.Q + K.F * K.P_k1k1 * K.F';

    % Innovation is disparity in actual versus predicted measurement.
    innovation = z_k - K.H * m_kk1;

    % The innovation covariance.
    S_k = K.H * P_kk1 * K.H' + K.R;

    % The Kalman gain.
    K_k = P_kk1 * K.H' * inv(S_k);

    % The new state prediction.
    m_kk = m_kk1 + K_k * innovation;

    % And the new state covariance.
    P_kk = P_kk1 - K_k * K.H * P_kk1;

    % Running (smoothed) innovation magnitude.
    K.innovation = 0.2 * sqrt(innovation' * innovation) + 0.8 * K.innovation;

    % And store the current filter state for next iteration.
    K.m_k1k1 = m_kk;
    K.P_k1k1 = P_kk;
else
    if isfield(T.representer, 'BoundingBox')
        K.m_k1k1 = T.representer.BoundingBox';
        K.P_k1k1 = eye(4);
    end
end

% Make sure we stuff the filter state back in.
T.tracker = K;
return

e. Matlab code for the visualizer

function T = visualize_kalman(T, frame)
% Display the current frame.
imshow(frame);

% Draw the current measurement in red.
if isfield(T.representer, 'BoundingBox')
    rectangle('Position', T.representer.BoundingBox, 'EdgeColor', 'r');
end

% And the current prediction in green.
if isfield(T.tracker, 'm_k1k1')
    rectangle('Position', T.tracker.m_k1k1, 'EdgeColor', 'g');
end
drawnow;
return

f. Matlab code for the combination of all the above codes

function T = kalman_tracker(fname, gamma, tau, radius)
% Initialize background model parameters.
Segmenter.gamma = gamma;
Segmenter.tau = tau;
Segmenter.radius = radius;
Segmenter.segment = @background_subtractor;

% Recognizer and representer form a simple blob finder.
Recognizer.recognize = @find_blob;
Representer.represent = @filter_blobs;

% The tracker module.
Tracker.F = eye(4);        % System (state transition) model
Tracker.Q = 0.5 * eye(4);  % System noise
Tracker.H = eye(4);        % Measurement model
Tracker.R = 5 * eye(4);    % Measurement noise
Tracker.innovation = 0;
Tracker.track = @kalman_step;

% A custom visualizer for the Kalman state.
Visualizer.visualize = @visualize_kalman;

% Set up the global tracking system.
T.segmenter = Segmenter;
T.recognizer = Recognizer;
T.representer = Representer;
T.tracker = Tracker;
T.visualizer = Visualizer;

% And run the tracker on the video.
run_tracker(fname, T);
return
