>>21043
>Doesn't vsync fuck with the low latency?
Absolutely, which is why it presents such a technical challenge. In VR, almost anything that interferes with the relationship between head position and the visual display will reduce the immersiveness of the experience and induce motion sickness or disorientation. Tearing, latency and frame jitter are all unacceptable, so the only way to get a good experience is to use graphics hardware with enough performance in reserve to render each frame on time even when things get hectic.
The Oculus SDK does all sorts of clever things to mitigate the unreliable nature of the graphics pipeline. The head tracking is predictive, to reduce perceived latency - frames are rendered based on the predicted trajectory of your head rather than the measurement at the beginning of the frame. Each frame is slightly warped at the end of the rendering process based on a new reading from the head tracker, to more accurately match the correct head position and compensate for pipeline latency. Oculus are incredibly committed to getting every last detail right.
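To make the idea concrete, here's a toy 1-DOF sketch of the two tricks described above: render for where the head is *predicted* to be, then apply a small corrective warp at the last moment from a fresh tracker sample. This is just an illustration of the principle, not the actual Oculus SDK API, and it assumes a constant-angular-velocity model and yaw only.

```python
# Hypothetical sketch of predictive tracking + late warp, NOT the Oculus SDK.
# Assumes 1 degree of freedom (yaw) and constant angular velocity.

def predict_yaw(yaw_deg, angular_velocity_dps, latency_s):
    """Extrapolate head yaw to where it should be when the frame hits the display."""
    return yaw_deg + angular_velocity_dps * latency_s

def warp_correction(rendered_yaw_deg, latest_yaw_deg):
    """Corrective rotation applied at the end of rendering: the difference
    between the yaw the frame was rendered for and the freshest reading."""
    return latest_yaw_deg - rendered_yaw_deg

# Render for where the head is predicted to be ~20 ms from now...
rendered = predict_yaw(10.0, 200.0, 0.020)    # 10 + 200*0.02 = 14 degrees
# ...then nudge the finished frame using a new tracker sample.
correction = warp_correction(rendered, 14.5)  # head turned slightly faster
```

The point is that the expensive render uses a prediction, while the cheap final warp absorbs whatever error the prediction accumulated during the pipeline.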
Part of the problem is that GPU manufacturers have been using buffering to cheat on benchmarks. They write drivers that maximise FPS, at the expense of latency and jitter. This is more or less the exact opposite of what you want with VR - a relatively low frame rate is acceptable, but latency isn't. Fortunately, Oculus have John Carmack on board, who is an absolute genius at low-level graphics hacking.
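The FPS-versus-latency tradeoff is easy to see with a bit of arithmetic: queueing extra frames in the pipeline leaves the FPS counter untouched but adds a full frame time of latency per buffered frame. A toy model (my own illustration, not any vendor's driver behaviour):

```python
# Toy model: buffering keeps throughput (FPS) constant but stretches the
# motion-to-photon delay, which is exactly what VR can't tolerate.
FRAME_TIME_MS = 1000 / 60  # ~16.7 ms per frame at 60 FPS

def motion_to_photon_ms(buffered_frames):
    """Input-to-display latency, assuming each queued frame waits one
    full frame time before reaching the screen."""
    return (buffered_frames + 1) * FRAME_TIME_MS
```

With no extra buffering you get roughly one frame of latency (~16.7 ms); a driver that queues two frames ahead still reports 60 FPS but triples the delay to 50 ms, well past the point where VR feels wrong.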