How to build a super quick smile tracking app

ARKit might seem intimidating but it’s not so bad if you already have some basic experience building iOS apps.

I’m a learn-by-doing type, so I’ve been playing around with ARKit, building basic apps to get familiar with it. In this post, I’ll review what I’ve learned creating a simple face tracking app.

I’ll do this in 3 parts:

  1. Initial Setup → First things first, get Camera permissions and make sure the device can use ARKit.

  2. Smile tracking → Start tracking smiles with ARKit. This is probably what you’re here for.

  3. User Interface → Add the UI for our app that will react to smiles.

As of this writing, the Xcode simulator does not support the front facing camera so you will need a real device to run the app. Your device will also need to have a TrueDepth camera (iPhone X or newer should be fine).

Finally, for my fellow members of the Copy Paste Club, all the code is available on GitHub.

Initial Setup

Start by opening up Xcode and creating a new project named “SmileTracker” (or whatever name you prefer).

Before we can get into face tracking, we’ll need to do two things:

  1. Make sure your device supports ARKit

  2. Get permission to access your device’s camera

In your new project, open ViewController.swift. Near the top of the file, underneath import UIKit, add the line: import ARKit. This will let us access all the goodies that Apple has provided us to make face tracking super easy.

Now add the following code inside of viewDidLoad:

guard ARFaceTrackingConfiguration.isSupported else {
    fatalError("Device does not support face tracking")
}

ARFaceTrackingConfiguration.isSupported is a boolean that will be true if the device running the app can support face tracking and false if not. In this case, if the device can’t support face tracking, we’ll crash the app with a fatal error.

Next, let’s get permission to use the camera. Add the following in viewDidLoad below our guard statement:

AVCaptureDevice.requestAccess(for: AVMediaType.video) { granted in
   if granted {
      // The completion handler runs off the main thread, so hop back before touching UI.
      // (AVCaptureDevice lives in AVFoundation; add `import AVFoundation` at the top if Xcode can't find it.)
      DispatchQueue.main.sync {
          // We're going to implement this function in a minute
          self.setupSmileTracker()
      }
   } else {
      fatalError("User did not grant camera permission!")
   }
}

Here we’re asking the device to request camera permissions. If the user grants permissions, we’ll run the function that will setup our smile tracking (don’t worry about the error, we’ll be implementing this function in a moment).

We wrap the call in DispatchQueue.main.sync because we’ll be adding UI elements in this function, which can only be done on the main thread.

We’ll also need to add a Camera Usage Description to our Info.plist. Open Info.plist and add a new row (you can do this by highlighting the last row and hitting enter).

In the row you just created, add Privacy - Camera Usage Description to the Key column and make sure the Type column is set to String. You can leave the Value column blank or add a message to explain to the user how you will use the camera.

Your Info.plist should now look something like this:

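(The original post shows a screenshot of the finished Info.plist here.) In source form, the row you added corresponds to the raw key NSCameraUsageDescription; the description string below is just an example:

<key>NSCameraUsageDescription</key>
<string>We use the front camera to track your smile.</string>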

If you’d like to test your app so far, you can comment out the line where we call setupSmileTracker(). Just remember to uncomment it later.

If you run your app now, you should see a popup asking you to enable camera permissions. If you say no you’ll have to go to application settings to enable those permissions in order for the app to run.

If the app crashes, check the console for one of our two error messages to see what went wrong.

Smile tracking

Open ViewController.swift and add the following variable to the top of ViewController:

class ViewController: UIViewController {   
   let sceneView = ARSCNView()
   
   override func viewDidLoad() {...}
}

ARSCNView comes equipped with an ARSession that your iPhone uses to coordinate AR experiences. We’ll be using sceneView’s ARSession to analyze our user’s face through the front facing camera.

Add this function to your file underneath viewDidLoad:

func setupSmileTracker() {   
   let configuration = ARFaceTrackingConfiguration()   
   sceneView.session.run(configuration)   
   sceneView.delegate = self   
   view.addSubview(sceneView)
}

Here we’ve created a configuration to handle face tracking and used it to run our sceneView’s ARSession.

Then we set sceneView's delegate to self and add it to our view.

Xcode will tell you that there is a problem since ViewController does not conform to ARSCNViewDelegate. Go to where ViewController is declared near the top of the file and change the line to the following:

class ViewController: UIViewController, ARSCNViewDelegate {   
   ...
}

Now add this ARSCNViewDelegate function to your ViewController class, below setupSmileTracker:

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
   ...
}

renderer will run every time our scene updates and provides us with the ARAnchor that corresponds to the user’s face.

To make it easier to create face tracking experiences, Apple automatically creates an ARFaceAnchor and adds it to our session when we use an ARFaceTrackingConfiguration to run it. This ARFaceAnchor is then passed into renderer as an ARAnchor.

Add the following code to renderer:

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {   
   // 1      
   guard let faceAnchor = anchor as? ARFaceAnchor else { return }
   
   // 2   
   let leftSmileValue = faceAnchor.blendShapes[.mouthSmileLeft] as! CGFloat
   let rightSmileValue = faceAnchor.blendShapes[.mouthSmileRight] as! CGFloat
   
   // 3
   print(leftSmileValue, rightSmileValue)
}

There’s a lot going on inside this function so I’ve numbered the steps (Ray Wenderlich style).

In step 1 we convert the ARAnchor into an ARFaceAnchor and assign it to the faceAnchor variable.

ARFaceAnchor contains information about the current position and orientation, topology, and facial expression of the face we’re tracking.

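As a quick illustration of what the anchor exposes (this app only uses blendShapes, and the variable names here are just for show):

let facePose = faceAnchor.transform       // position and orientation, inherited from ARAnchor
let faceMesh = faceAnchor.geometry        // ARFaceGeometry describing the face's topology
let expressions = faceAnchor.blendShapes  // coefficients describing the facial expression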

ARFaceAnchor stores information about facial expressions in its variable blendShapes. blendShapes is a dictionary that stores coefficients corresponding to various facial features. If you’re interested, I suggest you check out the full list of facial features in Apple’s documentation. (Hint: if you want to add frown tracking, you’ll find a way to do it there; see the sketch below.)

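For example, a minimal sketch of frown tracking using the same dictionary (mouthFrownLeft and mouthFrownRight are real blend shape keys; the variable names are just illustrative):

// Frown coefficients, read the same way as the smile values in step 2
let leftFrownValue = faceAnchor.blendShapes[.mouthFrownLeft] as? CGFloat ?? 0
let rightFrownValue = faceAnchor.blendShapes[.mouthFrownRight] as? CGFloat ?? 0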

In step 2, we use faceAnchor.blendShapes to get CGFloat values that correspond to how much the left and right sides of the user’s mouth are smiling, using the keys mouthSmileLeft and mouthSmileRight.

Finally, step 3 just prints out the two values so you can make sure everything is working properly.

At this point you should have an app that:

  • Gets camera and face tracking permissions from the user

  • Uses ARKit to track the user’s facial expressions

  • Prints how much the user is smiling on the left and right sides of their mouth to the console

We’ve made a lot of progress, so let’s take a second to make sure everything is running properly.

When you run the app for the first time, you should be asked if you will grant camera permissions. Make sure to say yes.

You’ll then be sent to a blank screen, but you should start seeing CGFloat values being printed to the console (there may be a short delay before you see them).

When you smile at your phone, you should notice the printed values going up. The more you smile, the higher the numbers go.

If it’s working properly, congratulations! If you’re running into an error, double-check that your device supports face tracking and that camera permissions are turned on. If you’ve been following this writeup from the beginning, the console will print errors in both of those cases.

User interface

So we’re tracking faces, now let’s build the UI to react to smiles.

First add a new UILabel called smileLabel to the top of the file, just below sceneView.

class ViewController: UIViewController {   
   let sceneView = ARSCNView()      
   let smileLabel = UILabel()
   
   ...
}

This will be the view that reacts to the user’s facial expressions.

Add the following code at the bottom of your setupSmileTracker function:

smileLabel.text = "😐"
smileLabel.font = UIFont.systemFont(ofSize: 150)

view.addSubview(smileLabel)

// Set constraints
smileLabel.translatesAutoresizingMaskIntoConstraints = false
smileLabel.centerXAnchor.constraint(equalTo: view.centerXAnchor).isActive = true
smileLabel.centerYAnchor.constraint(equalTo: view.centerYAnchor).isActive = true

Here, we’re adding the basic UI properties to our smileLabel and setting its constraints so it is in the middle of the screen. Now when you run the app, you should see a giant 😐 emoji in the middle.

Once you see the emoji appear, add the following function to your ViewController:

func handleSmile(leftValue: CGFloat, rightValue: CGFloat) {
   let smileValue = (leftValue + rightValue)/2.0
   switch smileValue {
   case _ where smileValue > 0.5:
      smileLabel.text = "😁"
   case _ where smileValue > 0.2:
      smileLabel.text = "🙂"
   default:
      smileLabel.text = "😐"
   }
}

This function will change the emoji in our smileLabel depending on how much the user is smiling into the camera. We calculate the smileValue by taking the average of the left and right smile values given to us by our ARFaceAnchor (very scientific, I know).

Plug that value into the switch statement, and the more the user smiles, the happier our emoji gets.

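A few hypothetical inputs (not part of the app, just to make the thresholds concrete):

handleSmile(leftValue: 0.9, rightValue: 0.7)   // smileValue = 0.8, above 0.5 → "😁"
handleSmile(leftValue: 0.4, rightValue: 0.2)   // smileValue = 0.3, above 0.2 → "🙂"
handleSmile(leftValue: 0.1, rightValue: 0.05)  // smileValue = 0.075 → default "😐"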

Finally, go back to our renderer function and add this to the bottom to pass our left and right smile values into handleSmile:

DispatchQueue.main.async {   
   self.handleSmile(leftValue: leftSmileValue, rightValue: rightSmileValue)
}

Again, we use DispatchQueue.main.async because we are making changes to the UI, which must be done on the main thread.

When you run the app you should now see the emoji change depending on how much you smile at it.

In the gif in the original post, I’ve added my face so that you can see it working with the camera output along with the emoji.

Your app won’t have the camera output, but you can add it by adding our ARSCNView, sceneView, to the superview and giving it dimensions.

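Here’s a minimal sketch of that, assuming the constraints are added in setupSmileTracker anywhere after view.addSubview(sceneView) (smileLabel stays on top because it is added to the view later):

// Pin sceneView to fill the screen so the front-camera feed is visible behind the emoji
sceneView.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
   sceneView.topAnchor.constraint(equalTo: view.topAnchor),
   sceneView.bottomAnchor.constraint(equalTo: view.bottomAnchor),
   sceneView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
   sceneView.trailingAnchor.constraint(equalTo: view.trailingAnchor)
])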

Wrapping up

I hope this post was helpful for you to get started creating apps with ARKit.

If you want to extend this app further, check out the list I mentioned above with all the other facial features you can track. I left a hint for how to extend this to check for frowns as well.

Come back and comment with any cool projects you create on your own. I’m still getting my feet wet with this stuff, so I’d be excited to see more complex apps.

I’ve posted all the code for this app on GitHub for feedback and questions. Thanks for reading and good luck!



Thanks so much for reading! If you liked this story, follow me on Twitter where I post updates about stories I’m working on and what I’m up to.

Original article: https://www.freecodecamp.org/news/how-to-build-a-super-quick-smile-tracking-app-16eee960888d/
