ARKit Face Tracking on iPhone X

ARKit has established itself as a reliable way to do stable, consumer-level AR since its first announcement at WWDC in June. More recently, during the iPhone X announcement, it was revealed that ARKit would include face tracking features available only on the iPhone X, using the front camera array, which includes a depth camera.

Unity has worked closely with Apple from the beginning to deliver a Unity ARKit plugin alongside the ARKit announcement, so that Unity developers could start using those features as soon as they were available. Since then, we have continued to work closely with Apple to deliver the face tracking features of ARKit as part of the Unity ARKit plugin.

New code and examples for these features are integrated into the Unity ARKit plugin, which you can get from BitBucket or from the Asset Store.

API additions in plugin

There is now a new configuration called ARKitFaceTrackingConfiguration which can be used when running on an iPhone X. There are new RunWithConfig and RunWithConfigAndOptions methods that take the ARKitFaceTrackingConfiguration to start the AR session.

public void RunWithConfig(ARKitFaceTrackingConfiguration config)
public void RunWithConfigAndOptions(ARKitFaceTrackingConfiguration config, UnityARSessionRunOption runOptions)
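
For illustration, here is a minimal sketch of starting a face tracking session with this configuration. The GetARSessionNativeInterface accessor and the UnityEngine.XR.iOS namespace match contemporary versions of the plugin; the configuration fields (alignment, enableLightEstimation) and the IsSupported check are assumptions based on the plugin sources of the time and may differ between versions.

using UnityEngine;
using UnityEngine.XR.iOS;

public class FaceTrackingStarter : MonoBehaviour
{
    void Start()
    {
        // Hypothetical minimal configuration; field names may vary by plugin version.
        ARKitFaceTrackingConfiguration config = new ARKitFaceTrackingConfiguration();
        config.alignment = UnityARAlignment.UnityARAlignmentGravity;
        config.enableLightEstimation = true;

        if (config.IsSupported) // face tracking requires the iPhone X front camera array
        {
            UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
        }
    }
}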

There are also event callbacks for when ARFaceAnchor is added, removed or updated:

public delegate void ARFaceAnchorAdded(ARFaceAnchor anchorData);
public static event ARFaceAnchorAdded ARFaceAnchorAddedEvent;
public delegate void ARFaceAnchorUpdated(ARFaceAnchor anchorData);
public static event ARFaceAnchorUpdated ARFaceAnchorUpdatedEvent;
public delegate void ARFaceAnchorRemoved(ARFaceAnchor anchorData);
public static event ARFaceAnchorRemoved ARFaceAnchorRemovedEvent;
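
Subscribing looks like this in practice; a short sketch, assuming these static events live on UnityARSessionNativeInterface like the plugin's other anchor events do:

using UnityEngine;
using UnityEngine.XR.iOS;

public class FaceAnchorLogger : MonoBehaviour
{
    void OnEnable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += FaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent += FaceRemoved;
    }

    void OnDisable()
    {
        // Always unsubscribe to avoid callbacks on a destroyed object.
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent -= FaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent -= FaceRemoved;
    }

    void FaceAdded(ARFaceAnchor anchorData)   { Debug.Log("face anchor added"); }
    void FaceUpdated(ARFaceAnchor anchorData) { /* fires every frame the face is tracked */ }
    void FaceRemoved(ARFaceAnchor anchorData) { Debug.Log("face anchor removed"); }
}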

Features Exposed by Face Tracking

There are four main features exposed by face tracking in ARKit. Each is described below, along with the example scene that uses it.

Face Anchor

The basic feature of face tracking is to provide a face anchor when ARKit detects a face with the front camera on the iPhone X. This face anchor is similar to the plane anchor that ARKit usually returns, but it tracks the position and orientation of the center of the head as you move it around. This lets you use the movement of the face as input for your ARKit app, and also lets you attach objects to the face or head so that they move along with it.

We have made an example scene called FaceAnchorScene to demonstrate this. It has a GameObject with the UnityARFaceAnchorManager component, which initializes ARKit with ARKitFaceTrackingConfiguration. It also hooks into the face anchor added, updated, and removed events so that it can do the following (a sketch of such a component appears after the list):

  1. On face anchor creation, it enables a referenced GameObject (in this case a model of three axes) and moves it to the position and orientation returned by the face anchor.
  2. On face anchor update, it updates the position and orientation of the GameObject.
  3. On face anchor removal, it disables the GameObject.
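
A sketch of a component along those lines, assuming the plugin's UnityARMatrixOps helper for converting the anchor's matrix into Unity coordinates:

using UnityEngine;
using UnityEngine.XR.iOS;

public class FaceAnchorFollower : MonoBehaviour
{
    public GameObject anchorObject; // e.g. the three-axes model from the example scene

    void Awake()
    {
        anchorObject.SetActive(false);
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += FaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent += FaceRemoved;
    }

    void MoveToAnchor(ARFaceAnchor anchorData)
    {
        // UnityARMatrixOps converts ARKit's matrix into Unity's coordinate space.
        anchorObject.transform.position = UnityARMatrixOps.GetPosition(anchorData.transform);
        anchorObject.transform.rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
    }

    void FaceAdded(ARFaceAnchor anchorData)   { MoveToAnchor(anchorData); anchorObject.SetActive(true); }
    void FaceUpdated(ARFaceAnchor anchorData) { MoveToAnchor(anchorData); }
    void FaceRemoved(ARFaceAnchor anchorData) { anchorObject.SetActive(false); }
}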

This scene also uses the ARCameraTracker component, which updates the main Unity camera via the FrameUpdateEvent that a regular ARKit app uses.

Face Mesh Geometry

The face tracking API can also return the geometry of the face it detects as a mesh. We can use the mesh vertices to create a corresponding mesh in Unity, and then apply a transparent texture to allow all sorts of face painting and masks. We can also put an occlusion material on this mesh when we attach things to the face anchor, so that the attachments occlude properly against the video of the face.

The example scene called FaceMeshScene shows how to display the face mesh geometry on top of your face with a default material on it (so it appears grey). It has the usual ARCameraTracker GameObject to move the camera in the scene. In addition, it has an ARFaceMeshManager GameObject, which has a standard Mesh Renderer and an empty Mesh Filter component. This GameObject also has the UnityARFaceMeshManager component, which does the following (see the sketch after the list):

  1. Configures face tracking.
  2. Updates the position and rotation of this GameObject's transform from the anchor.
  3. Extracts the mesh data from the anchor each frame, populates a mesh with it, and sets the MeshFilter component to reference that mesh.
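
A rough sketch of step 3, assuming the anchor exposes its geometry as vertices, textureCoordinates, and triangleIndices arrays (names taken from the plugin's ARFaceGeometry of the time; they may differ between versions):

using UnityEngine;
using UnityEngine.XR.iOS;

[RequireComponent(typeof(MeshFilter))]
public class FaceMeshBuilder : MonoBehaviour
{
    Mesh faceMesh;

    void Awake()
    {
        faceMesh = new Mesh();
        GetComponent<MeshFilter>().mesh = faceMesh;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        // Repopulate the Unity mesh from the anchor's face geometry on each update.
        faceMesh.vertices = anchorData.faceGeometry.vertices;
        faceMesh.uv = anchorData.faceGeometry.textureCoordinates;
        faceMesh.triangles = anchorData.faceGeometry.triangleIndices;
        faceMesh.RecalculateBounds();
        faceMesh.RecalculateNormals();
    }
}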

Blend Shapes

Another set of data we get from face tracking is a set of coefficients that describe the expressions on your face; these can be mapped onto a virtual face to give it an expression similar to yours.

Our example scene FaceBlendShapeScene shows this. Its UI displays the coefficients of the different blend shapes returned for the current expression on your face. See how they change when you change your expression!

This scene has the same GameObjects as the FaceMeshScene, but in addition has a BlendshapeOutput GameObject containing a BlendshapePrinter component. This component extracts the blend shapes from the face anchor, if one exists, and outputs them to the screen UI. A sketch of such a component follows.
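
A sketch along the lines of BlendshapePrinter, assuming blendShapes is exposed as a dictionary of coefficient names to values in the 0..1 range:

using System.Text;
using UnityEngine;
using UnityEngine.XR.iOS;

public class BlendShapeReader : MonoBehaviour
{
    void OnEnable()  { UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated; }
    void OnDisable() { UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated; }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        // Each entry pairs a named expression coefficient with its current weight.
        var sb = new StringBuilder();
        foreach (var shape in anchorData.blendShapes)
        {
            sb.AppendFormat("{0}: {1:0.00}\n", shape.Key, shape.Value);
        }
        Debug.Log(sb.ToString());
    }
}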

We are working on a more elaborate example in which these values will be mapped onto the facial animation of a virtual head, to give a better idea of how this could work in your experience.

Directional Light Estimate

Another interesting set of data you get with face tracking is a directional light estimate, based on using your face as a light probe in the scene. The generated estimate contains three things:

  1. Primary light direction
  2. Primary light intensity
  3. Spherical harmonics coefficients of the estimated lighting environment in all directions

The last of these is very interesting for us in Unity, as it is the solution used for dynamic global illumination in our standard rendering pipeline. Knowing this information, we can take advantage of it in our example scene.

The FaceDirectionalLightEstimate scene has an ARCameraTracker and an ARFaceAnchorManager, which moves a standard grey sphere mesh around with your face. What's new is the ARKitLightManager GameObject, which has the UnityARKitLightManager component on it. This component gets the spherical harmonics coefficients from the FrameUpdated event and plugs them into all of the Unity light probes in the scene, including the ambient light probe (which is used when none of the light probes in the scene affect the mesh). This effectively lights the meshes in the scene dynamically with the estimated environment lighting; a sketch of the ambient-probe part follows.
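
For the ambient light probe specifically, the idea looks roughly like this. The lightData field path and the layout of the 27-float array (9 coefficients per RGB channel) are assumptions to be checked against the plugin source:

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.XR.iOS;

public class AmbientProbeFromARKit : MonoBehaviour
{
    void OnEnable()  { UnityARSessionNativeInterface.ARFrameUpdatedEvent += FrameUpdated; }
    void OnDisable() { UnityARSessionNativeInterface.ARFrameUpdatedEvent -= FrameUpdated; }

    void FrameUpdated(UnityARCamera arCamera)
    {
        // Hypothetical field path; verify the exact names in the plugin source.
        float[] shc = arCamera.lightData.arDirectionalLightEstimate.sphericalHarmonicsCoefficients;
        if (shc == null || shc.Length != 27) return;

        // Copy the 27 floats into Unity's SphericalHarmonicsL2 (3 channels x 9 coefficients).
        var probe = new SphericalHarmonicsL2();
        for (int channel = 0; channel < 3; channel++)
            for (int coefficient = 0; coefficient < 9; coefficient++)
                probe[channel, coefficient] = shc[channel * 9 + coefficient];

        // The ambient probe lights meshes that no baked light probe affects.
        RenderSettings.ambientProbe = probe;
    }
}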

Alternatively, if you wish to use your own mechanism to light the scene, you can get the raw spherical harmonics coefficients in Unity's coordinate system via the FrameUpdatedEvent and plug them into your own lighting formulas. You may also want to light with just the primary light direction and intensity, which are available in the same manner; a sketch follows.
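
A corresponding sketch for the primary light estimate, driving a scene directional light. The field names are again assumptions; Apple documents the intensity in lumens, with roughly 1000 representing a neutrally lit scene, hence the division below:

using UnityEngine;
using UnityEngine.XR.iOS;

public class DirectionalLightFromARKit : MonoBehaviour
{
    public Light sceneLight; // assign a directional light in the Inspector

    void OnEnable()  { UnityARSessionNativeInterface.ARFrameUpdatedEvent += FrameUpdated; }
    void OnDisable() { UnityARSessionNativeInterface.ARFrameUpdatedEvent -= FrameUpdated; }

    void FrameUpdated(UnityARCamera arCamera)
    {
        // Hypothetical field path; verify against the plugin source.
        var estimate = arCamera.lightData.arDirectionalLightEstimate;
        sceneLight.transform.rotation = Quaternion.LookRotation(estimate.primaryLightDirection);
        sceneLight.intensity = estimate.primaryLightIntensity / 1000f; // lumens -> rough Unity intensity
    }
}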

Use Face Tracking in your Apps

As you can see, there are some nice ARKit features available with face tracking on the iPhone X. Unity's ARKit plugin can help you easily implement these features within your apps. As usual, show us your creations on @jimmy_jam_jam, and ask any questions on the forums.

Translated from: https://blogs.unity3d.com/2017/11/03/arkit-face-tracking-on-iphone-x/