Thursday, December 25, 2014

Face Detection From Photograph

In this tutorial, I am going to write an app that detects faces in a photograph.

Create a new Xcode project using the Single View Application template, called FaceDetectionFromPhoto.

In the storyboard, add a UIImageView with the following constraints for adaptive layout:


The constraints are shown in the Size Inspector above (right of the screen). Also, set the View Mode to Aspect Fit, as seen in the Attributes Inspector below (right of the screen):
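If you prefer configuring the image view in code rather than in the Attributes Inspector, the same setting can be applied in viewDidLoad (a sketch; `imageView` here refers to the IBOutlet created later in this tutorial):

```objc
// Equivalent of setting View Mode to Aspect Fit in Interface Builder
self.imageView.contentMode = UIViewContentModeScaleAspectFit;
```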



Then add a group photo to your Supporting Files by right-clicking the Supporting Files folder and selecting Add Files to... :


I added obamafamily.jpg in the above example.

Next, download the haarcascade_frontalface_alt.xml file:



Add the xml file to Supporting Files the same way you did for the jpg file above.

Then, add the following frameworks to your project:

opencv2.framework
UIKit.framework
CoreGraphics.framework

In your Build Settings, under the Apple LLVM Language section, set Compile Sources As:
Objective-C++

Hook up your storyboard's UIImageView to an IBOutlet in ViewController.h, create an instance variable called faceDetector, and import the opencv2 headers. Your ViewController.h file should look like this:

//
//  ViewController.h
//  FaceDetectionFromPhoto
//
//  Created by Paul Chin on 12/26/14.
//  Copyright (c) 2014 Paul Chin. All rights reserved.
//

#import <UIKit/UIKit.h>
#import <opencv2/opencv.hpp>
#import <opencv2/highgui/ios.h>

@interface ViewController : UIViewController{
    cv::CascadeClassifier faceDetector;
}

@property (weak, nonatomic) IBOutlet UIImageView *imageView;


@end


And your ViewController.m should look as follows:

//
//  ViewController.m
//  FaceDetectionFromPhoto
//
//  Created by Paul Chin on 12/26/14.
//  Copyright (c) 2014 Paul Chin. All rights reserved.
//

#import "ViewController.h"

@interface ViewController ()

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    
    NSString *cascadePath = [[NSBundle mainBundle]
                   pathForResource:@"haarcascade_frontalface_alt"
                            ofType:@"xml"];
    faceDetector.load([cascadePath UTF8String]);
    
    UIImage *image = [UIImage imageNamed:@"obamafamily.jpg"];
    
   
    cv::Mat faceImage;
    UIImageToMat(image, faceImage);
    
    cv::Mat gray;
    cvtColor(faceImage, gray, CV_BGR2GRAY);
    
    std::vector<cv::Rect> faces;
    faceDetector.detectMultiScale(gray, faces, 1.1, 2,
                        0 | CV_HAAR_SCALE_IMAGE, cvSize(30, 30));
    
    
    for (unsigned int i = 0; i < faces.size(); i++) {
        const cv::Rect face = faces[i];
        cv::Point topLeft(face.x, face.y);
        cv::Point bottomRight = topLeft + cv::Point(face.width, face.height);
        
        // Draw a rectangle around the face
        cv::Scalar magenta = cv::Scalar(255, 0, 255);
        cv::rectangle(faceImage, topLeft, bottomRight, magenta, 4, 8, 0);
    }
    
    // Show the resulting image once, after all faces are drawn
    self.imageView.image = MatToUIImage(faceImage);
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end
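The detectMultiScale parameters control the trade-off between speed and accuracy. As a sketch (the comments are my annotations, and the load() check is my addition, not part of the original listing), the detection step breaks down as follows; note that load() fails silently if the xml file is missing from the bundle, so it is worth checking its return value:

```objc
// Defensive check: load() returns false if the cascade xml cannot be found
if (!faceDetector.load([cascadePath UTF8String])) {
    NSLog(@"Could not load haarcascade_frontalface_alt.xml");
    return;
}

// Annotated version of the detection call used above:
faceDetector.detectMultiScale(
    gray,                    // input grayscale image
    faces,                   // output vector of face bounding rectangles
    1.1,                     // scaleFactor: shrink the image 10% per pyramid level
    2,                       // minNeighbors: overlapping detections required per face
    0 | CV_HAAR_SCALE_IMAGE, // scale the image rather than the features
    cvSize(30, 30));         // minSize: ignore candidates smaller than 30x30 px
```

Raising minNeighbors reduces false positives at the risk of missing real faces, while lowering scaleFactor toward 1.0 finds more faces but runs slower.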


Then, run your app in the iOS Simulator:




I've set the ViewController background to green so that it stands out when I post to this blog.







