@plantpark · 2014-08-08 · 3,729 characters · 1,738 reads

Oculus Rift on the Raspberry Pi

Uncategorized



This runs on my desktop PC. Just out of curiosity, I tried the OVR SDK on the Raspberry Pi as well.

The ARM architecture isn't officially supported by the Oculus guys, but it compiled almost OK. The only thing I had to change was a bit of assembler code. As far as I understand, the "dmb" instruction doesn't exist on ARMv6 (the Pi's instruction set). After a bit of research, here is the black-magic alternative I've found:

In file Kernel/OVR_Atomic.h, struct AtomicOpsRawBase (around line 100):

    #elif defined(OVR_CPU_ARM)
    // http://www.jonmasters.org/blog/2012/11/13/arm-atomic-operations/
    #define MB() __asm__ __volatile__ ("mcr p15, 0, r0, c7, c10, 5" : : : "memory")
    struct FullSync { inline FullSync() { MB(); } ~FullSync() { MB(); } };
    struct AcquireSync { inline AcquireSync() { } ~AcquireSync() { MB(); } };
    struct ReleaseSync { inline ReleaseSync() { MB(); } };

Another thing I didn't fix was the name of the output directory for the lib, which is still "i386".

Unfortunately, the OculusWorldDemo doesn't work. First, it doesn't find the HMD device. Also there seems to be a more global issue with X11+OpenGL on the Raspberry Pi. I haven't had a chance to look at that yet.

On the good side though, the Sensor is properly detected, and by adding a few lines to the initialization code, I could get some readings... (oh, and thanks Synergy for the remote mouse! Instructions to build it for the Pi are here)

I already approached the subject a while ago, when I got the Oculus Rift SDK compiled for the Raspberry Pi and successfully accessed the head-orientation.

This time, I wanted to render simple 3D geometry for the Rift using the Pi. Is this adorable little machine powerful enough to support the Rift?

As explained in the official Oculus SDK document, rendering for the Oculus Rift implies several things:
Stereo rendering: rendering the 3D scene twice, once for each eye, side by side
Distortion correction: distorting the rendered image in such a way that viewing it through the Rift's lenses makes it look correct
Chromatic aberration correction: this is a bonus step that aims at reducing color fringes introduced by the lenses

Accelerated 3D on the Raspberry Pi means using OpenGL ES, much like on any mobile platform these days. The Pi supports OpenGL ES 2.0, which is enough for the shaders that implement the corrections described above. In fact, the SDK comes with "regular" OpenGL shaders that work perfectly on OpenGL ES 2.0. Smashing!

So after some learning and testing, I finally got "Rift-correct" rendering on the Pi. The scene is extremely simple: just a rotating cube floating in front of the user (the head tracking is there too). And here is how it looks. Note that I removed the lenses of the Rift in order to film its screen.

Now for some benchmarks
All the tests were done on a Release build using the official vertex and fragment shaders. No attempt was made to optimize anything (I'm not sure there's much to be done honestly).

Rendering the scene twice (no correction shaders): 16 ms/frame
Rendering the scene in a texture and rendering the texture on a quad filling the screen (no fancy correction shaders): 27 ms/frame
Same thing with distortion correction shader: 36 ms/frame
Same thing with both distortion and chromatic aberration correction: 45 ms/frame
Note: the render-target texture used in the tests was exactly the size of the screen (1280x800). Because of the pinching effect of the distortion shader, it should be larger than that (see the "Distortion Scale" section of the Oculus doc), about 2175x1360. Unfortunately this was too much for the Pi. As a result, a good part of the FOV is lost (hence the visible pink border). I haven't tried to find out what the maximum texture size is, so I stuck to a scale of 1.

Conclusion
The good news is that using the Rift with the Pi can be done! However don't expect amazing results: with distortion correction on (a minimum to see a scene correctly), the frame rate on the simplest possible scene is about 27 fps. At first glance, this doesn't seem that bad. But when head-tracking is on, it does feel choppy and uncomfortable. Indeed, the Oculus team says that 60 fps is a minimum (in fact I believe the next development kit will have a screen with an even higher frame rate than that).

Getting the code
The code of this little experiment can be found at https://github.com/jbitoniau/RiftOnThePi
Building instructions are provided there.

Enjoy!
