On iOS, face detection and recognition can be done with OpenCV for iOS, but that means pulling in a third-party library. After searching around, it turns out iOS ships with CIDetector, which can detect faces and report each face's position and size along with the positions of the left eye, right eye, and mouth. Unfortunately, it cannot report the size of the eyes or mouth.
1. Getting the face data
// Wrap the UIImage's backing CGImage so Core Image can work on it.
CIImage *img = [CIImage imageWithCGImage:image.CGImage];
// High accuracy is slower, but well suited to still images.
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:nil
                                          options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];
NSArray *features = [detector featuresInImage:img];
The features array holds the data for every face found in the image; each element is a CIFaceFeature.
2.CIFaceFeature
The information we can access:
@interface CIFaceFeature : CIFeature
{
    CGRect bounds;
    BOOL hasLeftEyePosition;
    CGPoint leftEyePosition;
    BOOL hasRightEyePosition;
    CGPoint rightEyePosition;
    BOOL hasMouthPosition;
    CGPoint mouthPosition;
    BOOL hasTrackingID;
    int trackingID;
    BOOL hasTrackingFrameCount;
    int trackingFrameCount;
    BOOL hasFaceAngle;
    float faceAngle;
    BOOL hasSmile;
    BOOL leftEyeClosed;
    BOOL rightEyeClosed;
}
Each property means what its name suggests. One caveat: as with OpenCV, the detected coordinates use an origin at the bottom-left of the image, so the y coordinate has to be flipped before the values can be used in UIKit's top-left coordinate system.
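The same flip applies to the bounds rect, except the rect's height must be subtracted as well, since the rect's origin is its bottom-left corner in Core Image space. A minimal sketch (assuming image is the source UIImage and faceFeature is a detected CIFaceFeature):

```objc
// Convert a face's bounds from Core Image coordinates (bottom-left origin)
// to UIKit coordinates (top-left origin).
CGRect r = faceFeature.bounds;
r.origin.y = image.size.height - r.origin.y - r.size.height;
```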
3. Example
For example, detecting the mouth position:
CIFaceFeature *faceFeature = features.firstObject;  // nil if no face was found
if (faceFeature.hasMouthPosition)
{
    CGPoint p = faceFeature.mouthPosition;
    // Flip from Core Image's bottom-left origin to UIKit's top-left origin.
    p.y = image.size.height - p.y;
}
And p is the position of the mouth!
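Putting the pieces together, one way to check the results visually is to stroke each face's bounds over the image. This is a sketch, not part of the original post; it assumes image and the features array from step 1:

```objc
// Draw a red box around every detected face.
UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
[image drawAtPoint:CGPointZero];
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextSetStrokeColorWithColor(ctx, [UIColor redColor].CGColor);
for (CIFaceFeature *face in features) {
    CGRect r = face.bounds;
    // Core Image uses a bottom-left origin; flip into UIKit's top-left space.
    r.origin.y = image.size.height - r.origin.y - r.size.height;
    CGContextStrokeRect(ctx, r);
}
UIImage *annotated = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```

annotated is a copy of the original image with the detected faces outlined.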