by Neil Mathew

Haptics for mobile AR: how to enhance ARKit apps with a sense of “touch”

I’m really excited about the future of haptics for AR and VR. It feels like the missing link between my HTC Vive and jumping into the OASIS with Parzival and Art3mis. So it’s not surprising that haptics is perhaps the most hotly anticipated tech in the XR community right now. Several companies like Microsoft and HTC, as well as startups like SenseGlove and HaptX, have shown demos of increasingly promising iterations of haptic gloves that I’m itching to try out.

Unfortunately, like most AR developers today, our work at Placenote is focused almost entirely on mobile AR platforms like ARKit and ARCore. Naturally, this got us thinking, “Could haptics do anything for mobile AR?”

Haptics have been an awesome addition to touch screens, from simulating tactile button clicks to silent notifications. But, after some frantic googling, we realized that there’s actually been no real discussion about haptics for mobile AR apps so far… CHALLENGE ACCEPTED.

The challenge of mobile AR

We decided to dig into why haptics hasn’t made its way into mobile AR, and it wasn’t hard to see why. Mobile AR is by far the least immersive AR medium. The consensus in the community is that it’s just a stopgap on the way to the ultimate AR platform — smart glasses.

But mindset isn’t the only barrier here. We found that the mobile form-factor presents some unique challenges to the AR experience designer:

  • unlike headsets, the phone screen is the display as well as the controller

  • it’s impossible to bring your hands into the experience since you’re holding the phone.

  • we still rely on touch screen interactions that are ambiguous in dimensionality — 2D or 3D touch?

Nevertheless, the reality is that, for the next few years and perhaps more, mobile AR is here to stay. There are a billion mobile devices in consumer pockets right now and only a handful of AR headsets on their heads. As a developer, distribution for your apps trumps most other factors. In fact, in applications like indoor navigation and gaming, mobile has already proven itself as a viable medium for deploying AR experiences.

This brings us to the topic of haptics for mobile AR. At first, it might seem like there’s no real hope for haptics to enhance mobile AR experiences, but recent studies have actually shown otherwise.

In haptics, less is more

There have been a myriad of methods conceived to achieve haptic feedback. In general, they fall into two broad categories — kinesthetic haptics (force feedback) and cutaneous haptics (skin sensations).

Kinesthetic haptics has widely been considered to be the more realistic haptic technology. It involves physical actuators, either grounded or ungrounded. These push and pull our fingers and other appendages in response to interactions with virtual objects. Intuitively, realistic force-feedback should perform vastly better than plain old vibrations. But a study published in Science Robotics this year titled “The Uncanny Valley of Haptics” has challenged these assumptions.

The researchers found that increasing the realism of haptic sensation doesn’t necessarily increase the quality of the AR experience. It often has a negative impact due to the uncanny valley of realism in simulations. They found that cutaneous haptics, which is essentially a combination of light touches and vibrations, did a lot better in fooling the brain deeper into the illusion. Strange results, but they basically realized that we’ve underestimated how good our brain is at filling the gaps in our sensation of reality.

The situations where our brain steps in to fill the gaps is what I find most interesting about our perception of the sensation of touch. — Justin Barad, CEO of Osso VR

Bringing haptics to mobile AR

Given these findings, why not test what cutaneous haptics can do for mobile AR? After all, haptics on mobile is not just about vibrating ring tones anymore.

Micro-Electro-Mechanical Systems (MEMS) on mobile devices have gotten a lot more sophisticated and capable of some pretty nuanced behaviors. Since the iPhone 7, Apple has upgraded the old basic rumble vibrations to what they now call the Taptic Engine. This is a lot more subtle and consists of seven different types of haptic feedback with varying patterns and strengths.

The haptic feedback modes available are:

  • Selection Change
  • Impact Light
  • Impact Medium
  • Impact Heavy
  • Notification Success
  • Notification Warning
  • Notification Failure

To learn more about the iOS feedback generator, check out this Apple documentation. At the end of this article, I will share some code you can use to quickly add these feedback types to your ARKit apps.
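To make the list above concrete, here is a sketch of what triggering each mode can look like from Unity C#. The `iOSHapticFeedback.Instance.Trigger(...)` call pattern follows the Unity plugin discussed later in this article; the enum identifiers here are assumptions, so check your plugin for the exact names.

```csharp
// Sketch: one call per iOS feedback type. The Trigger() pattern follows
// the Unity iOS Haptic Feedback plugin; the enum names are assumptions.
// In practice you would fire one of these at a time, tied to a UX event.
void DemoAllFeedbackTypes()
{
    iOSHapticFeedback.Instance.Trigger(Selection_Change);     // light tick
    iOSHapticFeedback.Instance.Trigger(Impact_Light);         // soft single bump
    iOSHapticFeedback.Instance.Trigger(Impact_Medium);
    iOSHapticFeedback.Instance.Trigger(Impact_Heavy);         // hard single bump
    iOSHapticFeedback.Instance.Trigger(Notification_Success); // two quick taps
    iOSHapticFeedback.Instance.Trigger(Notification_Warning);
    iOSHapticFeedback.Instance.Trigger(Notification_Failure); // double bump
}
```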

We decided to experiment with a number of these haptic feedback modes in our AR apps, and I’m really excited to say that the results were a pleasant surprise to our team. The following are some examples of haptic implementations in our mobile AR apps.

Usage examples of haptics in mobile AR

In our experiments so far, we’ve found that haptic feedback for mobile AR works well in five distinct scenarios. Here’s a description of each.

1. Magnetic pointers (i.e. snap to grid)

A pointer locked along a planar surface is a commonly used feature in many ARKit apps, especially in measurement tools like Air Measure and Magic Plan. Since your phone behaves as a controller in mobile AR, the standard UX in measurement apps involves dragging a pointer along a surface to draw lines or polygons to measure things in the real world. Of course, when it comes to line drawing, magnetic pointers that snap to the end points and edges of lines are seen everywhere — from PowerPoint to Photoshop.

We found that subtle haptic feedback indicating a “snap” in pointer position is a great enhancement. It almost feels like your phone (i.e. your controller) is physically moving to snap into place.
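
To make the snap concrete, here is a minimal sketch of the underlying logic, reduced to one dimension (distance along the measuring line) for clarity. The grid size, snap radius, and the `MagneticPointer` name are all illustrative:

```csharp
using System;

public static class MagneticPointer
{
    // Snap `position` (meters along the measuring line) to the nearest
    // grid tick when it falls within snapRadius of one; otherwise pass
    // the position through unchanged.
    public static (float snapped, bool didSnap) Snap(
        float position, float gridSize, float snapRadius)
    {
        float nearestTick = (float)Math.Round(position / gridSize) * gridSize;
        if (Math.Abs(position - nearestTick) <= snapRadius)
            return (nearestTick, true);
        return (position, false);
    }
}
```

In the AR update loop, you would fire the haptic only on the frame the pointer transitions from free to snapped (for example, `if (didSnap && !wasSnapped) iOSHapticFeedback.Instance.Trigger(Impact_Light);` with the hypothetical plugin call used in this article); firing it every snapped frame would buzz continuously.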

I was really happy to see that Apple’s new app “Measure” actually uses haptic feedback in its UX. It’s an amazingly subtle implementation: an “Impact Medium” is fired when the pointer snaps to the edge of the plane.

2. Hit testing (feeling real world surfaces)

Another common feature in ARKit apps is the hit test. This is implemented as a ray-cast from a point on the screen — either a touch point or the center — to a surface in the real world. It is generally used to add a 3D object at the point of contact. A slight haptic sensation can help the user understand that a surface was “hit”. We found two methods that work well here:
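
A sketch of the idea in Unity, assuming your AR plugin attaches colliders to detected planes (the Unity ARKit plugin’s generated plane prefabs do). The `markerPrefab` field and the `iOSHapticFeedback` call are illustrative:

```csharp
using UnityEngine;

public class HapticHitTest : MonoBehaviour
{
    public GameObject markerPrefab; // marker to pin at the hit point

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        // Ray-cast from the touch point into the AR scene. Physics.Raycast
        // stands in here for whichever hit-test call your AR plugin exposes.
        Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // Pin a marker at the hit point and let the user "feel" it land.
            Instantiate(markerPrefab, hit.point, Quaternion.identity);
            iOSHapticFeedback.Instance.Trigger(Impact_Light); // hypothetical plugin call
        }
    }
}
```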

Pinning: In this example, a marker is added to the scene at the hit point. An “Impact Light” helps users sense the “pinning” of the marker in 3D space. Of course, the downside to this is that you can’t quite sense the “depth” of the hit point — in other words, how far the pin is from the user.

Grazing: An alternative to pinning is the grazing method of hit testing. In this case, a constantly updating marker previews where a marker might be added to the scene. We found that a series of haptic impulses, based on the magnitude of the preview marker’s displacement each frame, gives the sensation of scraping a pointer along a 3D surface and lets you “feel” it.

Here’s a code example of grazing in Unity:

// Map the preview marker's per-frame displacement (distanceChange,
// in meters) to a haptic impulse of matching strength.
if (distanceChange >= 0.1f && distanceChange < 0.2f)
{
    iOSHapticFeedback.Instance.Trigger(Impact_Light);
}
else if (distanceChange >= 0.2f && distanceChange < 0.4f)
{
    iOSHapticFeedback.Instance.Trigger(Impact_Medium);
}
else if (distanceChange >= 0.4f)
{
    iOSHapticFeedback.Instance.Trigger(Impact_Heavy);
}

3. FPS gun recoil or explosions

This is by far the most fun example of haptic feedback. When building a first-person shooter in AR, your phone is the display as well as the weapon. A great way to simulate a gun shot is a simple “Impact Heavy”, which produces a single bump, or a “Notification Failure”, which creates a double bump that feels a lot like gun recoil. Of course, our example used a laser weapon but, hey, this isn’t meant to be too realistic, remember?
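
The firing handler can be as small as this sketch, which reuses the `iOSHapticFeedback` plugin pattern from this article; `SpawnProjectile` is a placeholder for your own game logic:

```csharp
// Fire handler for an AR shooter (sketch).
void FireWeapon()
{
    SpawnProjectile(); // placeholder for your game logic

    // A single hard bump on each shot...
    iOSHapticFeedback.Instance.Trigger(Impact_Heavy);

    // ...or, for a double bump that feels like recoil, use this instead:
    // iOSHapticFeedback.Instance.Trigger(Notification_Failure);
}
```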

4. Collision with controller tip

In VR apps like Oculus Medium or Tilt Brush, one of the handheld controllers serves as a brush tip that the user moves around to draw in 3D space. I’ve spent hours painting in Tilt Brush and so naturally I have tried really hard to mimic this experience with ARKit.

The trouble is that creating an accurate drawing experience on mobile becomes really difficult. You lose the sense of depth when your phone is both the display and the controller. One of the hardest things in 3D drawing apps on mobile is knowing where your brush tip is relative to the other 3D objects in the scene.

And, again, haptics was the answer. We found that one way to give users a sense of depth is to imagine the brush is actually a cane you can use to hit 3D objects that are already in the scene. Providing haptic feedback to let users know whether the brush tip is in contact with any existing objects in the scene lets users accurately pinpoint their brush in 3D space.

5. Re-localization snap in Persistent AR apps

At Placenote, we primarily build Persistent AR, or AR Cloud, apps. The core functionality of these apps is the ability to save AR content permanently in a physical place. Users can load it up in the same location every time.

This behaviour is called the relocalization of a scene.

In order to relocalize an AR scene, a user must first point their phone’s camera to the real world, and then wait until the camera detects its location.

With Placenote, relocalization happens almost instantaneously, but it all happens internally. Hence we need to design a way to notify the user of a successful relocalization. Visual cues might be enough, but a more subtle indication is to provide a haptic “Impact Light” to suggest that you have snapped into place in the real world.
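
A sketch of the one-shot notification: fire the haptic only the first time the session localizes. `OnLocalized` is a hypothetical callback name; wire it to whatever relocalization event your SDK exposes (Placenote reports status changes through its listener interface):

```csharp
// Fire a single haptic when the session first relocalizes.
bool hasLocalized = false;

void OnLocalized() // hypothetical callback from your SDK's status events
{
    if (hasLocalized) return; // only on the first snap into place
    hasLocalized = true;
    iOSHapticFeedback.Instance.Trigger(Impact_Light); // hypothetical plugin call
}
```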

How to add haptics to your ARKit project

If you’re working with Swift for Native iOS ARKit development, check out this tutorial on implementing haptic feedback in Native apps.

If you’re working with Unity, my favorite package so far is the iOS Haptic Feedback package on the Unity Asset Store. It’s $5 but well worth it, because Unity’s built-in function Handheld.Vibrate() doesn’t actually expose the new iOS Taptic Engine functions!

The iOS Haptic Feedback package provides a simple prefab and scripts to add all seven types of haptic feedback to your app. You can get it from the Unity Asset Store.

Things to watch out for

As with any design tool, here are a few things to watch out for when incorporating haptics in your mobile AR app.

Using haptics too much can mess up ARKit tracking

Test the impact of haptics on your AR session. Since ARKit relies on inertial sensing to track the phone’s motion, adding too many vibrations during an ARKit session can throw off tracking slightly.
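
One way to guard against this is to rate-limit haptic triggers so that closely spaced impulses collapse into one. Here is a minimal throttle in plain C# (no plugin dependency; the class name and interval are illustrative):

```csharp
using System;

public class HapticThrottle
{
    private readonly TimeSpan minInterval;
    private DateTime lastFire = DateTime.MinValue;

    public HapticThrottle(double minSeconds)
    {
        minInterval = TimeSpan.FromSeconds(minSeconds);
    }

    // Returns true when enough time has passed to allow another impulse;
    // callers skip the haptic trigger when this returns false.
    public bool TryFire(DateTime now)
    {
        if (now - lastFire < minInterval) return false;
        lastFire = now;
        return true;
    }
}

// Usage in an AR update loop (sketch):
// if (snapDetected && throttle.TryFire(DateTime.UtcNow))
//     iOSHapticFeedback.Instance.Trigger(Impact_Light); // hypothetical plugin call
```

In a Unity script you could equally drive this from `Time.time`; `DateTime` just keeps the class engine-independent.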

Using haptics too much can overheat the device

Haptics is, after all, a physical movement of your mobile device, and it naturally tends to use more energy. Use it sparingly to ensure your phone doesn’t overheat or run out of battery too fast.

Too much haptic feedback might confuse and desensitize your user

This is true for any haptic mechanism. Don’t overdo it. Specifically, don’t use it without a clear understanding of why haptic feedback is necessary for the action your user is performing. The danger of overuse is that your user gets confused by it and therefore gets desensitized to your feedback.

And that’s it! I hope this article has given you a helpful dose of design ideas and convinced you to venture into the world of mobile AR haptics. We’ve really enjoyed exploring the different ways we could simulate touch sensations in mobile AR and if you have any more ideas we would love to talk to you about it. If you’re interested in trying out any of our code examples for mobile AR haptics, send me an email at neil [at] placenote.com.

If you’re interested in Persistent AR apps or what we do at Placenote, message us on Twitter, or check out Placenote.com.

Original article: https://www.freecodecamp.org/news/haptics-for-mobile-ar-how-to-enhance-arkit-apps-with-a-sense-of-touch-151d9e9c9950/

arkit技术介绍

  • 0
    点赞
  • 2
    收藏
    觉得还不错? 一键收藏
  • 0
    评论
要进行AR应用开发,您需要掌握以下技术: 1. 编程语言:常用的AR应用开发语言包括C#(Unity开发平台)、Java(Android开发)、Objective-C/Swift(iOS开发)和JavaScript(WebAR开发)。选择合适的编程语言取决于目标平台和开发工具。 2. AR开发框架和引擎:使用AR开发框架和引擎可以简化开发流程。一些常用的AR开发框架和引擎包括Unity3D、ARKit(iOS)、ARCore(Android)、Vuforia、Wikitude和EasyAR等。 3. 计算机视觉:了解计算机视觉的基本原理和技术,包括图像处理、特征检测和跟踪、相机定位和姿态估计等。这些技术对于AR应用中的目标识别、姿态跟踪和虚拟物体叠加至关重要。 4. 3D建模和动画:掌握基本的3D建模和动画技术,可以使用软件如Blender、Maya或3ds Max创建虚拟对象和场景。 5. 传器和设备:了解如何使用移动设备的传器(如摄像头、陀螺仪和加速度计)获取环境信息,并与AR应用进行交互。 6. 数据处理和算法:掌握数据处理和算法技术,用于处理和分析AR应用中的数据,如姿态估计、虚拟物体与现实世界的交互等。 7. 用户界面设计:设计具有良好用户体验的AR界面,包括交互设计、图形用户界面(GUI)和用户反馈等。 8. 测试和调试:掌握AR应用的测试和调试技术,以确保应用的稳定性和性能。 综上所述,AR应用开发需要掌握编程语言AR开发框架和引擎、计算机视觉、3D建模和动画、传器和设备、数据处理和算法、用户界面设计以及测试和调试等技术

“相关推荐”对你有帮助么?

  • 非常没帮助
  • 没帮助
  • 一般
  • 有帮助
  • 非常有帮助
提交
评论
添加红包

请填写红包祝福语或标题

红包个数最小为10个

红包金额最低5元

当前余额3.43前往充值 >
需支付:10.00
成就一亿技术人!
领取后你会自动成为博主和红包主的粉丝 规则
hope_wisdom
发出的红包
实付
使用余额支付
点击重新获取
扫码支付
钱包余额 0

抵扣说明:

1.余额是钱包充值的虚拟货币,按照1:1的比例进行支付金额的抵扣。
2.余额无法直接购买下载,可以购买VIP、付费专栏及课程。

余额充值