VR Series — Oculus Rift Developer Guide: 3. Rendering to the Oculus Rift (Part 1)

Rendering to the Oculus Rift

The Oculus Rift requires split-screen stereo rendering with distortion correction for each eye to cancel lens-related distortion.


Figure 3: OculusWorldDemo Stereo Rendering

Correcting for distortion can be challenging, and the distortion parameters vary with the lens type and the individual's eye relief. To make development easier, the Oculus SDK handles distortion correction automatically within the Oculus Compositor process; the compositor also applies latency-reducing timewarp and presents frames to the headset.

With the Oculus SDK doing much of the work, the application's main job is to perform simulation and render the stereo world based on the tracking pose. The stereo views can be rendered into either one or two individual textures and are submitted to the compositor by calling ovr_SubmitFrame. We cover this process in detail in this chapter.

Rendering to the Oculus Rift

The Oculus Rift requires the scene to be rendered in split-screen stereo, with half of the screen used for each eye.

When using the Rift, the left eye sees the left half of the screen and the right eye sees the right half. Although it varies from person to person, human pupils are approximately 65 mm apart; this is known as the interpupillary distance (IPD). The in-application cameras should be configured with the same separation.

Note: This is a translation of the camera, not a rotation, and it is this translation (and the parallax that comes with it) that creates the stereoscopic effect. It means the application needs to render the entire scene twice: once with the left virtual camera and once with the right.
The reprojection stereo rendering technique, which derives the left and right views from a single fully rendered view, is usually not viable with an HMD because of significant artifacts at object edges.
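The two-pass structure described in the note can be sketched as follows. `Camera`, `renderScene`, and the recording vector are hypothetical stand-ins for an engine's own types, and 65 mm is only a typical IPD value — this is an illustration of the loop, not the SDK's actual API:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Hypothetical minimal types; a real engine supplies its own camera and
// render-target abstractions. The point is the loop: one complete scene
// pass per eye, identical except for a sideways camera translation.
struct Camera { float x = 0.0f, y = 0.0f, z = 0.0f; };

std::vector<Camera> submitted; // stands in for the two eye render targets

void renderScene(const Camera& cam) { submitted.push_back(cam); }

void renderStereo(const Camera& head, float ipdMeters) {
    for (int eye = 0; eye < 2; ++eye) {
        Camera cam = head;
        // Translate (never rotate) each camera sideways by half the IPD.
        cam.x += (eye == 0 ? -0.5f : +0.5f) * ipdMeters;
        renderScene(cam);
    }
}
```

Note that both passes share the head's orientation; only the translation along the head's right axis differs, which is exactly what produces the parallax.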

The Rift's lenses magnify the image to provide a very wide field of view (FOV) that enhances immersion. However, this process distorts the image significantly. If the original, unwarped images were displayed on the Rift, the user would see them with pincushion distortion.


Figure 4: Pincushion and Barrel Distortion

To counteract this distortion, the SDK post-processes the rendered views with an equal and opposite barrel distortion so that the two cancel out, leaving an undistorted view for each eye. The SDK also corrects chromatic aberration, a color-separation effect at the edges caused by the lens. Although the exact distortion parameters depend on the lens characteristics and the eye's position relative to the lens, the Oculus SDK takes care of all the necessary calculations when generating the distortion mesh.
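To make the "equal and opposite" idea concrete, the sketch below uses a toy one-coefficient radial model; both the model and the coefficient are made up for illustration, and the SDK's real correction is a precomputed distortion mesh, not this formula. The lens warps a displayed radius outward (pincushion); the pre-warp solves for the inward (barrel) radius that the lens will map back to the intended one:

```cpp
#include <cassert>
#include <cmath>

// Toy radial model: the lens maps a displayed radius r to r * (1 + k1 * r^2),
// which is pincushion for k1 > 0. Illustrative only — not the Rift's actual
// lens model or coefficients.
float pincushion(float r, float k1) { return r * (1.0f + k1 * r * r); }

// Barrel pre-warp: invert rOut = r * (1 + k1 * r^2) by fixed-point
// iteration, so the lens's pincushion warp cancels it exactly.
float barrelPrewarp(float rOut, float k1) {
    float r = rOut;
    for (int i = 0; i < 30; ++i)
        r = rOut / (1.0f + k1 * r * r);
    return r;
}
```

Composing the two mappings returns the original radius, which is precisely why the user sees straight lines again through the lens.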

When rendering for the Rift, the projection axes must be parallel to each other, as shown in the figure below, and the left and right views are completely independent of one another. This means the camera setup is very similar to ordinary non-stereo rendering, except that the cameras are shifted sideways to match each eye's position.


Figure 5: HMD Eye View Cones

In practice, the projections in the Rift are often slightly off-center because our noses get in the way. But the point stands: unlike the stereo views produced by a television or cinema screen, the left and right views in the Rift are entirely independent of each other. This means you should be very careful when trying to reuse methods developed for those media, because they usually do not apply in VR.

The two virtual cameras in the scene should point in the same direction (determined by the orientation of the HMD in the real world), and the distance between them should equal the distance between the user's eyes, i.e. the interpupillary distance (IPD). This is typically done by adding the ovrEyeRenderDesc::HmdToEyeViewOffset translation vector to the translation component of the view matrix.
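A minimal sketch of that translation step, using plain structs rather than the SDK types (in the real API the offset comes from ovrEyeRenderDesc::HmdToEyeViewOffset; the names below are hypothetical). The per-eye offset is expressed in head space, so it must be rotated by the head orientation before being added to the head position:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Rotate a vector by a yaw angle about the vertical (Y) axis. A full
// implementation would use the HMD's quaternion orientation; yaw alone
// keeps the sketch short.
Vec3 rotateYaw(const Vec3& v, float yawRad) {
    float c = std::cos(yawRad), s = std::sin(yawRad);
    return { c * v.x + s * v.z, v.y, -s * v.x + c * v.z };
}

// World-space eye position = head position + (head orientation * per-eye
// offset). Both eyes share the head orientation; only the offset differs.
Vec3 eyePosition(const Vec3& headPos, float headYawRad, const Vec3& hmdToEyeOffset) {
    Vec3 o = rotateYaw(hmdToEyeOffset, headYawRad);
    return { headPos.x + o.x, headPos.y + o.y, headPos.z + o.z };
}
```

Rotating the offset first keeps the eyes separated along the head's own right axis whatever direction the user faces, which is what the shared-orientation requirement above demands.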

Although the Rift's lenses are approximately the right distance apart for most users, they may not exactly match a given user's IPD. Because of the way the optics are designed, however, each eye will still see the correct view. What matters is that the application sets the distance between the virtual cameras to the user's IPD as recorded in their profile (set in the configuration utility), not to the distance between the Rift's lenses.


The original English text follows:


Rendering to the Oculus Rift

The Oculus Rift requires split-screen stereo with distortion correction for each eye to cancel lens-related distortion.


Figure 3: OculusWorldDemo Stereo Rendering

Correcting for distortion can be challenging, with distortion parameters varying for different lens types and individual eye relief. To make development easier, Oculus SDK handles distortion correction automatically within the Oculus Compositor process; it also takes care of latency-reducing timewarp and presents frames to the headset.

With Oculus SDK doing a lot of the work, the main job of the application is to perform simulation and render stereo world based on the tracking pose. Stereo views can be rendered into either one or two individual textures and are submitted to the compositor by calling ovr_SubmitFrame. We cover this process in detail in this section.

Rendering to the Oculus Rift

The Oculus Rift requires the scene to be rendered in split-screen stereo with half of the screen used for each eye.

When using the Rift, the left eye sees the left half of the screen, and the right eye sees the right half. Although varying from person-to-person, human eye pupils are approximately 65 mm apart. This is known as interpupillary distance (IPD). The in-application cameras should be configured with the same separation.

Note: This is a translation of the camera, not a rotation, and it is this translation (and the parallax effect that goes with it) that causes the stereoscopic effect. This means that your application will need to render the entire scene twice, once with the left virtual camera, and once with the right.
The reprojection stereo rendering technique, which relies on left and right views being generated from a single fully rendered view, is usually not viable with an HMD because of significant artifacts at object edges.

The lenses in the Rift magnify the image to provide a very wide field of view (FOV) that enhances immersion. However, this process distorts the image significantly. If the engine were to display the original images on the Rift, then the user would observe them with pincushion distortion.


Figure 4: Pincushion and Barrel Distortion

To counteract this distortion, the SDK applies post-processing to the rendered views with an equal and opposite barrel distortion so that the two cancel each other out, resulting in an undistorted view for each eye. Furthermore, the SDK also corrects chromatic aberration, which is a color separation effect at the edges caused by the lens. Although the exact distortion parameters depend on the lens characteristics and eye position relative to the lens, the Oculus SDK takes care of all necessary calculations when generating the distortion mesh.

When rendering for the Rift, projection axes should be parallel to each other as illustrated in the following figure, and the left and right views are completely independent of one another. This means that camera setup is very similar to that used for normal non-stereo rendering, except that the cameras are shifted sideways to adjust for each eye location.


Figure 5: HMD Eye View Cones

In practice, the projections in the Rift are often slightly off-center because our noses get in the way! But the point remains, the left and right eye views in the Rift are entirely separate from each other, unlike stereo views generated by a television or a cinema screen. This means you should be very careful if trying to use methods developed for those media because they do not usually apply in VR.

The two virtual cameras in the scene should be positioned so that they are pointing in the same direction (determined by the orientation of the HMD in the real world), and such that the distance between them is the same as the distance between the eyes, or interpupillary distance (IPD). This is typically done by adding the ovrEyeRenderDesc::HmdToEyeViewOffset translation vector to the translation component of the view matrix.

Although the Rift’s lenses are approximately the right distance apart for most users, they may not exactly match the user’s IPD. However, because of the way the optics are designed, each eye will still see the correct view. It is important that the software makes the distance between the virtual cameras match the user’s IPD as found in their profile (set in the configuration utility), and not the distance between the Rift’s lenses.
