Unity MARS is a suite of authoring tools and runtime systems for creating the next generation of spatial computing applications. Companion apps that allow for authoring and data capture on augmented reality (AR) devices are a key part of this suite. We believe that the combination of PC Editor tools and on-device authoring applications is the most powerful and accessible way to create AR content.
We announced Unity MARS at Unite Berlin 2018 and shared our progress more recently at Unite Copenhagen, where we showed off an early version of the companion app for smartphones. We also published a blog post earlier this year on lessons learned in spatial design while working on the head-mounted display (HMD) version. You can find out more general information on our Unity MARS page. Here, we’ll talk about the motivation behind creating the apps in the first place and how they fit within the overall AR authoring workflow of Unity MARS.
Editing in context
Content authoring is at its best when creators can take advantage of any and all tools available. With the Unity MARS companion apps, we have the opportunity to not only take advantage of 3D authoring using tracked devices, but enable users to do this in any location, on any AR device, without needing access to the full project and Unity Editor. If a creative agency is working on a museum tour app, they don’t need to physically visit the museum to take pictures of the art or capture scans of the rooms. They can share their project with their client – the museum curator – who can install the Unity MARS companion app from their smartphone app store, capture the necessary data, and save it to the cloud.
World-aware apps need to make sense of an unpredictable and dynamic environment. To confront these constraints head-on, Unity MARS includes a robust set of simulation features, including synthetic environments, simulated AR data discovery, and recording/playback for AR data. These tools are great for testing your data constraints or testing AR interactions in play mode. But the strongest advantage is that you can start to see your app as it will appear on a device in real time. AR devices allow users to lay out their scenes in real space – either the dedicated location or any useful room. And if a bug occurs because of lighting conditions or a particular room setup, it’s useful to have the actual device data to reproduce and fix the issue.
We are trying to create workflows that are accessible to nontechnical users but still familiar to Unity developers. You are editing the same scene, with the same assets, using tools that share terminology and, where it makes sense, interaction patterns. Proxies in the companion app have an active state, can have child objects, and show a list of components. You save/load a “scene” and group all of these assets in a “project.” We don’t expose the full hierarchy and inspector, but these types of tools will eventually make their way into the EditorXR runtime feature set.
Why build companion apps?
One of the biggest pain points of mobile development in Unity is build times. You can learn about the strides we’ve made to improve build times by watching a recent Unite Now talk on the subject. All of these features help developers iterate and test their final app on their target hardware. The Unity MARS companion apps use the target device’s AR capabilities to test and iterate on specific pieces of content, either during development or in real-world scenarios, so more team members can take part in the process without needing to work directly in the Unity Editor.
This is where the Unity MARS companion app comes in. The person who is laying out the scene, whether it’s a user, a client, or a remote team member, can capture their environment or make a data recording and upload it to the cloud, and any Unity user can download it in the Editor and see what their teammates have made. In the Editor, you can also tweak constraints until the app works in the virtual environment and save those edits to the cloud.
By building the companion apps in Unity, we are better able to serve our customers across all platforms and potential use cases. We are currently building companion apps for Android, iOS, Magic Leap, and HoloLens, but in theory any platform supported by Unity can serve as a platform for our companion apps. Any device that supports Unity and can capture real-world data is a potential platform for spatial authoring.
Furthermore, each platform has its own strengths, and the best possible authoring environment involves using each device for its best characteristics. For precise, complex workflows, a keyboard and mouse are hard to beat. For getting a sense of presence and scale, as well as how to deal with a limited world understanding, we’ve found nothing is better than working on the device directly.
For even more immediate feedback, the XR Platforms team is working on an AR Remoting package that establishes a direct connection between the Editor and the device to stream live AR data. As these features come online, we will be incorporating them into the Unity MARS simulation view and companion apps.
Feature breakdown
Users of the Unity MARS companion apps can perform two primary tasks: data capture and in-situ authoring. Both of these tasks involve sending and receiving data to and from the Editor, which is done via cloud storage. This means that users all over the world can work on the same project together, and that the data persists between sessions and is shared between users. We use the existing Cloud Services account and project permissions to control which users have access to what data. If a user can access the cloud dashboard for a particular project, they have access to that project’s companion app data, both on their AR device and in the Editor. We use QR codes to transfer the project ID, and users will be able to log into the app using either their Unity account or a temporary token shared via QR code. Users can work offline or on an unlinked project and can later sync their data or resolve conflicts between edits made to the same resource based on when that edit was made.
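The post says conflicts between edits to the same resource are resolved "based on when that edit was made" but doesn't spell out the policy. A minimal last-write-wins sketch, in Python purely for illustration (the actual app is built in Unity, and all names here are hypothetical), might look like:

```python
from dataclasses import dataclass


@dataclass
class ResourceEdit:
    """One edit to a shared resource (for example, a scene or a scan)."""
    resource_id: str
    payload: dict
    timestamp: float  # when the edit was made, seconds since epoch


def resolve_conflict(local: ResourceEdit, remote: ResourceEdit) -> ResourceEdit:
    """Last-write-wins: keep whichever edit was made more recently."""
    if local.resource_id != remote.resource_id:
        raise ValueError("edits refer to different resources")
    return local if local.timestamp >= remote.timestamp else remote
```

When an offline user later syncs, each of their pending edits would be compared against the cloud copy this way, so the most recent change to each resource survives.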
Capturing data
Data capture comprises a few different tasks. Users can use the Environment mode to capture a static environment scan: either a full mesh in the case of Magic Leap and HoloLens, or a set of flat surfaces from ARKit and ARCore. In the future, it will be possible to integrate third-party software for smartphones to produce mesh scans or other AR data, and create custom builds of these apps to record it. For now, we are focusing on data provided by the OS. In the case of ARKit and ARCore, we also allow users to manually trace out the corners of their room to provide walls in the simulation environment. These aren’t exposed to the simulation as data, but they help provide visual context for the surfaces that were scanned.
If the issue or interaction you wish to test involves the user moving around and scanning their environment, users of the app can use the Data Recording mode to capture video, camera path, and environment data as it changes over time. This can range from a simple “user approaches a surface” recording to a full recording of the scanning process from start to finish. We encourage developers to record basic user interactions for playback in the Editor to do as much refinement as possible on the “noisy” data that comes from how users handle devices in the real world. Of course, video files can be quite large, so brief recordings of specific actions are preferred over long recordings that cover multiple actions. It is also possible to select which data streams should be recorded and uploaded to reduce bandwidth usage.
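To make the stream-selection idea concrete, here is an illustrative Python sketch (not the companion app's actual code; the stream names are hypothetical) of filtering captured frames down to only the streams the user enabled, so shorter, leaner recordings are uploaded:

```python
# Hypothetical stream names for the kinds of data a recording can contain.
ALL_STREAMS = ("video", "camera_path", "planes", "point_cloud")


def make_recording(enabled_streams, frames):
    """Build a per-stream recording from per-frame samples, dropping any
    stream the user disabled so less data is stored and uploaded."""
    recording = {s: [] for s in enabled_streams if s in ALL_STREAMS}
    for frame in frames:
        for stream, sample in frame.items():
            if stream in recording:
                recording[stream].append(sample)
    return recording
```

Recording only the camera path for a "user approaches a surface" clip, for instance, would skip the large video stream entirely.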
Users can also capture AR Image markers with their device’s camera and annotate specific locations, or “hotspots,” on the image for anchoring content.
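A marker with hotspots could be modeled along these lines; this is an illustrative Python sketch under our own assumptions (normalized image coordinates and a real-world width for scale), not the app's actual data format:

```python
from dataclasses import dataclass, field


@dataclass
class Hotspot:
    """A named anchor point in normalized image coordinates."""
    name: str
    u: float  # 0..1, left to right across the marker image
    v: float  # 0..1, top to bottom


@dataclass
class ImageMarker:
    image_path: str
    physical_width_m: float  # real-world width, so content anchors at true scale
    hotspots: list = field(default_factory=list)

    def add_hotspot(self, name: str, u: float, v: float) -> None:
        if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
            raise ValueError("hotspot must lie within the image")
        self.hotspots.append(Hotspot(name, u, v))
```

At runtime, once the marker image is detected, each hotspot's (u, v) position would be mapped onto the tracked image's pose to place the anchored content.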
Watch this space
From our experience on EditorXR, we have learned that being inside a VR scene while you are making edits to it is hugely beneficial. Further, the degree of control afforded by tracked head and hands is a great way to quickly lay out a scene and have fun while you’re doing it. We also learned that things like manipulating text and fine-tuning are best done with a mouse and keyboard – it’s best to avoid those tasks in AR/VR. In fact, the Unity MARS companion apps incorporate features of EditorXR that were created to support editing scenes at runtime. We are extracting these features into reusable packages to create a Runtime Authoring Framework. Transform manipulators, inspectors, and scene serialization will come from a shared code-base for doing authoring work outside of the Unity Editor.
Designing our workflows across devices has had some unexpected benefits. The Unity MARS Create and Compare flow, where users in the Editor can drag and drop assets directly into the Simulation View and have Unity MARS infer the necessary constraints, came from thinking about how to author Unity MARS content for an HMD. It makes a lot of sense to grab and place an object in the world where you want it to go when using a Magic Leap. It makes just as much sense to do it in the Editor, but we were only accustomed to dragging prefabs into the Scene View. As we push the capabilities of the companion apps further, we may yet find more ways to incorporate spatial design thinking into our 2D tools.
As we approach the public release of Unity MARS, we will be sharing more information about its features and lessons learned over the course of development. We plan to release builds of the smartphone companion apps on the App Store and Google Play alongside Unity MARS, so creators can dive right into these workflows from day one, with the HMD version to follow later in the year. We believe that Unity MARS is the best way to develop apps for spatial computing, and that creators are best served by using all of the tools available.
This is just the beginning. We are excited to expand the world of Unity authoring outside of the Editor and take our first steps toward a connected, distributed authoring environment. We can’t wait to hear from you about what we should build next.
Source: https://blogs.unity3d.com/2020/04/23/mars-companion-apps/