Channel: VR displays – Road to VR

CastAR Kickstarter Enters Final Stretch With Almost $900k Raised



CastAR, the new HMD technology that promises to bridge the gap between augmented reality and virtual reality, is almost at the end of its Kickstarter campaign, having raised well over 200% of its original target. We take a look at its progress.

A Kickstarter, Expertly Executed

CastAR, the new virtual / augmented reality system developed by ex-Valve employees Jeri Ellsworth and Rick Johnson, is in the final few hours of its Kickstarter campaign – and there’s little argument as to its success.

I can well foresee in the near future (if this hasn’t already happened) a new specialist field developing in PR and marketing – that of the crowd-funding campaign consultant. Kickstarting is rapidly becoming an art. We’ve seen (and backed) our fair share of Kickstarters here on Road to VR, and it’s generally easy to pick out the campaigns that have had real thought and planning put into their execution. CastAR’s campaign was a textbook example of how you should do it. As a result, the campaign was swift to raise funds:

castAR: the most versatile AR & VR system -- Kicktraq Mini

From Jeri Ellsworth’s heartfelt pre-Kickstarter potted history of CastAR’s 18 months of development, through what seems to have been a gruelling 30 day PR offensive, both Rick and Jeri’s enthusiasm and obvious passion for the CastAR project remained seemingly undimmed. Healthy, interesting project updates showed both the technical feats the team have accomplished and the range of potential applications CastAR technology could be applied to. The team also kept interest high with announcements such as the integration of Leap Motion and early support for the open source Vireio Perception drivers for the device’s VR Clip-On accessory.

But most effective were the reactions of real people trying the system for the first time. As with the Oculus Rift, CastAR is one of those devices you really need to experience for yourself. In the video below, filmed at Maker Faire California back in May 2013, the team officially unveiled their work to the public for the very first time:

Stretch Goals Progress

The campaign has blown through its first three stretch goals and may just squeeze past the fourth before the curtain falls. The $900k goal promises the addition of a high-speed on-board gyroscope for additional movement tracking accuracy.

Backers can expect early editions of the CastAR glasses and accessories to ship in September 2014, although it must be said that’s a challenging goal to reach. As with the now legendary Oculus Rift shipping delays, expect some wiggle room on these dates before you get your hands on the kit.

We can’t wait to see the system for ourselves and find out where this unique approach to immersive 3D gaming might take the industry, and what other uses the team might dream up in the meantime.

Back the CastAR Kickstarter Here

The post CastAR Kickstarter Enters Final Stretch With Almost $900k Raised appeared first on Road to VR.


GDC 2014: Hands-on with Sony’s Project Morpheus VR Headset



As part of our continuing GDC 2014 coverage, Cymatic Bruce and I had our first opportunity to experience Sony’s new Project Morpheus, the PS4 VR dev kit, earlier today.

How Does it Feel to Wear?

Brian: I found it a little awkward to figure out how it fits on your head. The design looks very polished, but it’s a little less intuitive to place onto your head than the Oculus Rift, and it’s less obvious how to make adjustments for fit. The Sony rep giving the demo worked with me to achieve a comfortable fit. Since the Sony Morpheus sort of sits on top of your head instead of being strapped on, I did have the impression a few times that it might fall off, but that concern was unfounded as it stayed securely in place once I started whipping my head around.

Bruce: Comfortable in some ways, uncomfortable in others. Overall the unit was less stifling than the Rift, with more airflow around the face. The sweet spot seemed to be quite loose (which is a good thing), however after several minutes of play, near the end of the second demo (Eve: Valkyrie), it was definitely bearing down on my glasses, causing pain on the bridge of my nose. I’m not sure if it was a design issue or an adjustment issue.

What Were the Demos?


Brian: Two demo experiences were shown today. The first, called The Deep, puts you underwater in a shark cage. It’s similar to the Game of Thrones experience put on recently at SXSW in Austin in that both take place in confined spaces, so they don’t have to account for player movement. As an experience, it was OK. The game itself was well done, and the shark looked amazing, but I think I was just looking for a different kind of experience… which I definitely got in the second demo, the unreleased-but-already-famous Eve: Valkyrie. I won’t review the game here, but it’s a terrific game in its own right, and a great showcase of Morpheus’ strengths. Eve offers the same advantage as The Deep in that you’re sitting in a cockpit, a confined space.

Bruce: Both demos were solid. The first demo, The Deep, was definitely playing to the strengths of the hardware. It was passive; not much to do but experience it. I can understand their choice of demo given the type of consumer they’re targeting. Eve was great as always but seemed to be missing some graphics components that were in the Oculus Rift version. The targeting reticle was simplified and there were some panels missing from the cockpit.

How is the Hardware?

Brian: Positional tracking didn’t work at the beginning of my demo of The Deep, but the rep made some adjustments and it started working. After that, I was able to look down and see my knees, and when I bent down in real life, I could see my avatar’s knees bend as I crouched closer to the bottom of the cage. I was also able to pitch forward, and my torso would move closer to the edge of the cage. In Eve, I was able to lean forward to look more closely at the cockpit and, hilariously, was able to detach my head from my torso by leaning way back.

The image quality was, frankly, stunning. As mentioned, the games themselves are very well done, and I had to concentrate hard to break the illusion and try to see individual pixels, and this is with me whipping my head around like a crazy person trying to make the image skip or blur.

I did experience slight disorientation in Eve, but only in certain cases where a ship was passing very close to me at a high rate of speed. Even now I’m not able to put my finger on what the issue was, but I felt ‘unsettled’ somehow by something I couldn’t consciously perceive. It would be interesting to record that at a high frame rate and play it back to see if there’s something going on.


Bruce: The positional tracking was solid for the most part. I did encounter occasional hiccups and jumps in my position, but when it worked, it worked very well. I would say the DK2 still has more precise positioning. The controller tracking was also very good in The Deep demo. The controller only rotated the hand; the arm’s position wasn’t tracked. It was kind of cool that crouching made the avatar crouch.

The screen was very clear, with the screen door effect hardly noticeable. There was still quite a bit of motion blur with rapid head movement, especially with the neon lettering in Eve Valkyrie.


Road to VR has been invited back tomorrow for two new experiences. Let us know in the comments what else you’d like to know, and we’ll try to address your questions in tomorrow’s demo.

The post GDC 2014: Hands-on with Sony’s Project Morpheus VR Headset appeared first on Road to VR.

Sony Morpheus VR Headset Shown in Public for First Time at SVVR Meetup #10



Sony Computer Entertainment surprised last evening’s Silicon Valley VR meetup #10 with the first public display of their Project Morpheus VR headset prototype.

Anton Mikhailov, Jeff Stafford and Frederick Umminger, from Sony’s Project Morpheus Team

Morpheus, unveiled at this year’s Game Developers Conference in March, was only shown to a limited audience of conference attendees and press at that event. Earlier this week it was on display at a small, invite-only event for developers in Europe. Last night was the first opportunity the public had to freely experience the prototype. Project team members Jeff Stafford, Frederick Umminger, and Anton Mikhailov were on hand at SVVR to show off the Morpheus hardware and their Castle demo.

Players received a head-mounted display, stereo headphones (which plug into the HMD), and a pair of PlayStation Move controllers.

The game takes place in a medieval courtyard, with a ragdoll dummy in front of you and several sharp weapons nearby. Pulling the trigger buttons on the controllers balls your hands into fists. Instructions are mostly unnecessary: take out your aggression on the dummy in front of you using fists or swords. Cutting the dummy’s hand off and then slapping it across the face is easy, fun, and satisfying. The level of interaction the Move controllers afford adds lots of depth to the experience.

Much like at GDC, interested players raced to queue up as soon as the demo opened, and despite a long line lasting most of the night, nobody seemed to mind. Feedback from those that tried it was positive, with most saying it was better than expected. Having your hands in the game takes the experience to a whole new — and satisfying — level.

Music video and commercial director Jonnie Ross, a co-founder of Virtual Reality LA who drove up from Los Angeles for the SVVR meetup, shared his thoughts:

As far as gameplay, this video shows up-close game action using the “social screen” that shows non-players an unwarped view of the action:

In another video, attendee Jacob Rangel demonstrates some of the play mechanics, including a dual-wield head chop:

And Scott Broock from Jaunt VR shows off his moves:

Many thanks to Sony Computer Entertainment for sharing with SVVR this evening, and to Jeff Stafford, Frederick Umminger, and Anton Mikhailov for working the extra hours to bring it to us. I expect there are at least a few people who updated their VR wishlists after tonight.

Don’t forget, you can still grab tickets to the forthcoming SVVR Conference and Expo in May and there’s a $100 discount available for Road to VR readers. Check out this article for more details on how.


The post Sony Morpheus VR Headset Shown in Public for First Time at SVVR Meetup #10 appeared first on Road to VR.

AntVR ‘Universal’ VR Headset Kickstarter Proves Controversial – What Would You Like To Know?



The newest entrant into the VR Headset space, AntVR and their ‘Universal, all-in-one’ wireless VR System, has caused some confusion and controversy among those in the VR community. So, we thought we’d give the new company the chance to answer their critics.

AntVR Kickstarter Causes Confusion

We had a brief look at the new AntVR ‘Universal’ all-in-one VR system a few days ago. The new company, based in Beijing, has since launched their Kickstarter campaign and with it a slew of technical specifications for the system. However, some of the claims made have caused scepticism in the VR community, somewhat understandably.

The campaign has already raised a staggering $122k as of writing and it seems clear they’ll pass their $200k goal with ease. But there are some real gaps in the technical information supplied on the Kickstarter page which have caused concern in the community. Some of the claims can perhaps be attributed to translation difficulties, but some appear purely misinformed. Just a couple of examples from their Kickstarter FAQ:

What are the differences between spherical and aspherical lens? Why do you use aspherical lens?

Generally, aspherical lens has two advantages over spherical lens. First of all, with spherical lens, standard images would look distorted if not specifically designed to work with a spherical lens. But aspherical lens is compatible with any standard image.

Whilst it’s technically true that aspherical lenses aren’t spherical (this one’s somewhat of a no-brainer), suggesting that they cause no distortion is incorrect. The Oculus Rift also uses aspherical lenses, and any image presented to the device requires pre-warping to compensate for distortion caused when passing through the lenses. There’s no mention of chromatic aberration here either, which also requires correction at the rendering stage.
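The pre-warp described above is typically a radial ‘barrel’ distortion applied before the image passes through the lens, so that the lens’s pincushion distortion cancels it out. Here’s a minimal sketch of the idea, with made-up coefficients (real values are calibrated per lens; this is not the actual Oculus SDK code):

```python
def prewarp(x, y, k1=0.22, k2=0.24):
    """Push a normalized image point outward radially; the lens's
    pincushion distortion then pulls it back, cancelling out."""
    r2 = x * x + y * y                    # squared distance from lens centre
    scale = 1.0 + k1 * r2 + k2 * r2 * r2  # radial distortion polynomial
    return x * scale, y * scale

print(prewarp(0.0, 0.0))  # centre point is untouched: (0.0, 0.0)
print(prewarp(0.5, 0.5))  # points toward the edge are displaced outward
```

Chromatic aberration correction works the same way but uses slightly different coefficients per colour channel, since the lens refracts red, green and blue light by different amounts.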

Do you have a screen door problem?

Because we use aspherical lenses and never waste any pixels in the full HD screen, it will be crystal clear in the headset. There’s no screen door problem.

Screen door is, of course, the visibility of the display panel’s structure, in particular the gaps between pixels. The team claim to use a single 1080p LCD panel (960 x 1080 per eye) with a 100 degree FOV, so it’s not clear how their claim of ‘no screen door’ can be substantiated. The FOV is all-important here: if they’re suggesting that they’re achieving a true 100 degree horizontal FOV, then some loss of pixels is inevitable.
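A quick back-of-envelope calculation shows why the ‘no screen door’ claim is hard to credit. Dividing the claimed per-eye horizontal resolution by the claimed FOV gives an approximate angular pixel density (the figures below are the campaign’s own numbers; the even-spread assumption is a simplification):

```python
def angular_pixel_density(h_pixels, h_fov_deg):
    """Approximate horizontal pixels per degree, assuming pixels
    are spread evenly across the field of view."""
    return h_pixels / h_fov_deg

# AntVR's claimed figures: 960 horizontal pixels per eye over a 100 degree FOV
print(f"{angular_pixel_density(960, 100):.1f} px/deg")  # 9.6 px/deg
```

At roughly 9.6 pixels per degree – far below the ~60 pixels per degree associated with normal visual acuity – individual pixels, and the gaps between them, remain easily resolvable.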


Finally, the comparison table used to stack AntVR’s system up against its competitors is simply inaccurate. The Oculus Rift DK2 uses a 5 inch panel and aspheric lenses, and does away with the breakout ‘driver’ box. And merely stating ‘Built-in’ to describe AntVR’s positional tracking system is somewhat unhelpful.

We Need Your Questions!

So, we want to help clear up some of the confusion, but we need your help. What would you like to ask the makers of AntVR? AntVR CEO Qin Zheng has agreed to answer your questions. Reply in the comments below this article (or head over to the thread at /r/oculus) and we’ll collate your questions, send them over, and of course report back once we’ve received some answers.

The post AntVR ‘Universal’ VR Headset Kickstarter Proves Controversial – What Would You Like To Know? appeared first on Road to VR.

Survios Announces $4 Million in Venture Funding for VR Projects


The team behind Project Holodeck has announced that it has secured $4 million in venture funding to help them build their immersive VR experience.

Project Holodeck – From University Project To Investment Opportunity

Nathan Burba, CEO of Survios, with an early version of Project Holodeck’s hardware

We’ve been following Survios (or Project Holodeck as they used to be called) for some time now. Ben Lang stepped into the Holodeck back in February of last year (see video above) at the dawn of the current VR resurgence and we later broke news of their Razer Hydra and Oculus Rift enabled experience, Zombies on the Holodeck.

The original Project Holodeck idea has its roots in an experiment that began at USC’s Mixed Reality Lab. One of the team’s peers there was none other than Palmer Luckey, founder of Oculus VR. In fact, early Project Holodeck iterations included an early prototype HMD similar in design to the Oculus Rift. As you can see from the above video, the system resembled a mad scientist’s hobby project. It leveraged everything from Razer Hydra motion controllers (for user input) to Sony’s Move controller, used ostensibly to allow tracking of the player in 3D space – both of which were mounted precariously atop a wobbly-looking helmet. It looked crazy, but to quote Ben from his time in that early version:

I was so immersed that at one point I needed to set the Razer Hydra controllers down to adjust my helmet and I nearly tried to set them down on a virtual table next to me. There was no table in real life — had I not quickly realized what I was about to do, I would have dropped the controllers straight onto the floor below. 

Survios are an interesting proposition for funding as they are working to produce both a hardware solution and compelling software to leverage that hardware. What’s more, despite their relative youth as a company, they can claim to have spent more time than most trying to solve technical and creative problems thrown up by virtual reality.


The funding round has been led by Shasta Ventures, a company that labels itself a “boutique, early stage venture firm”. A Shasta partner said of the young startup: “Survios is hands down the best at what they do. The team is young, hungry, and extremely talented, and they have solved some tremendously important challenges that will enable the entire VR industry to evolve.”

We’re thrilled to see James Illif (CCO) and Nathan Burba (CEO) achieve so much in such a short time. Writing for Road to VR at times feels very much like documenting the beginning of VR history. Like Palmer Luckey and his Oculus Rift, Project Holodeck is another project in the VR space that started with nothing but makeshift prototypes and a real passion for what its creators believe in, and has now attracted attention from the business world. Many will say this is the Facebook effect, and to some extent that’s true, but we think it’s more that the world is finally waking up to the fact that not only is VR here to stay, it’s going to change everything. And there’s a lot of money in everything.

The post Survios Announces $4 Million in Venture Funding for VR Projects appeared first on Road to VR.

VRelia Partners with ImmersiON to Offer VR Headsets and Content

The ImmersiON ‘Pro’ VR Headset

VRelia, a Spanish company recently formed with the aim of providing users with high resolution VR Headsets, has announced a partnership with ImmersiON, a startup formed from the key players behind TD Vision, a company with a history in 3D technologies and video codecs.

We first covered VRelia back in February, at which time they were planning to offer not one but three different VR headsets to compete directly with Oculus VR’s Rift. Since then, it looks like the company has returned to the drawing board and pared down its offering to two. The VREye Go is a smartphone headset harness much like those we’ve seen from vRase, Durovis and countless others. The VREye Pro offers what are claimed to be two 5.9″ Full HD (1080p) display panels, which the company says deliver an effective resolution of 2190 x 1920 – a slightly confusing figure given that 2 x 1080 (the vertical resolution of a full HD panel) = 2160.

The VRelia VREye ‘Go’ mobile phone VR Headset

The products sold by ImmersiON share the same base specs as above and add what look to be dual front-facing cameras to facilitate augmented reality experiences. It’s also key to point out that the images of all these HMDs are, at this stage, merely renders, and that should you choose to pre-order one of the ImmersiON devices, you’re also opting in to beta test early versions of the new VR headsets – there seems to be no final hardware as yet, not to mention a complete absence of pricing. Furthermore, there’s no clear indication of when those placing orders might receive actual hardware.

TDVision seems to have had their hand in many technologies over the years, including the production of their own VR headset, the TD Visor. But it’s their software platform, AlterSpace, that is highlighted in the recent press release announcing the partnership. It seems to take the form of an online collaboration, chat and social VR hub where people can hang out in VR, share online content and socialise. Beyond that, it’s not too clear how the software plays a part in pushing the VREye headsets, or even if and when it will be made available – the AlterSpace website seems completely unfinished, and the only documentation I could find on TDVision’s website described concepts and planned features only.

Competition is great, on that we can all agree. But right now it’s simply unclear precisely what this announcement means for the VR Industry. With hardware that’s yet to enter the beta stage and an unclear roadmap for ImmersiON’s value added software packages, we’ll just have to wait and see.

The post VRelia Partners with ImmersiON to Offer VR Headsets and Content appeared first on Road to VR.

Stanford Unveils ‘Light Field Stereoscope’ a VR Headset to Reduce Fatigue and Nausea


Researchers at Stanford University have unveiled a prototype VR headset which they claim reduces the effects of VR sickness through the use of an innovative ‘stacked’ display that generates naturalistic light fields.

VR display technology continues to evolve rapidly, with each new generation of headset offering better and better image quality on the road to an ever more immersive experience. Right now however, development has concentrated on making so-called ‘flat plane’ 3D display technology with less latency, blurring and ghosting.

Low persistence, enabled by OLED displays, first appeared publicly at CES 2014 in Oculus’ ‘Crystal Cove’ feature prototype, for example. But these displays still offer an artificially rendered stereoscopic view of virtual worlds, one that has binocular stereoscopic depth but doesn’t allow a user’s eyes to focus on different depth planes. Essentially, 3D on a flat plane.


Now a team of researchers at Stanford University claims to have made the first step in delivering a more naturalistic and comfortable virtual reality display technology. They’ve developed a system which uses off-the-shelf transparent LCD panels, stacked one in front of the other, to generate visible light fields for each eye which include proper depth information. The resulting image, combining the front and rear displays (multiplicatively rather than additively), means that when it reaches the user’s eyes, objects at different depths can be naturally focused on.

As the user’s interpretation of the image presented isn’t reliant on just binocular stereoscopic imagery (that is, one image for each eye on a single plane) and allows the eye to more naturally respond to and rest at focal cues, this reduces fatigue as it more naturally replicates how we perceive the world in real life. At least that’s the theory. You can see it in practice in the video above, where the team use a 5mm aperture camera to simulate focusing at different depths. It’s impressive looking stuff.


The researchers, headed up by Fu-Chung Huang, Kevin Chen and Gordon Wetzstein, claim their new light field stereoscope might allow users to enjoy immersive experiences for much longer periods, opening the door to lengthy virtual reality play sessions.

The Light Field Stereoscope deconstructed

One thing that’s not currently clear: although this technology impressively adds natural depth perception to VR displays, it uses LCD-based technology, which is currently eschewed in current generation headsets owing to its high persistence leading to motion blur. OLED’s extremely fast pixel switching times allow both the minimisation of motion blur and the use of low persistence, where the image is displayed to the user for only a fraction of each frame to combat smearing during fast movement. It’s not yet clear which technology presents the greater benefit to the consumer VR industry long term.

The unit is now being developed at NVIDIA Research and will be demonstrated at SIGGRAPH 2015 in Los Angeles next week. It’ll be interesting to see where the GPU giant takes the groundbreaking research next.

You can read more about this research and download the papers here.

The post Stanford Unveils ‘Light Field Stereoscope’ a VR Headset to Reduce Fatigue and Nausea appeared first on Road to VR.

NVIDIA Demonstrates Experimental “Zero Latency” Display Running at 1,700Hz


At GTC 2016 this week, NVIDIA’s Vice President of Graphics Research demonstrated a novel prototype display running at an incredibly high refresh rate, all but eliminating perceptible latency.

Update (4/6/16, 11:08AM PT): An earlier version of this story transposed ‘17,000Hz’ in place of the correct 1,700Hz.

When it comes to latency in virtual reality, every part of the pipeline from input to display is a factor. You’ll often hear the phrase ‘motion to photons latency’ which describes the lag between the instant you move your head to the moment that the display responds to that movement. Between those two points are several sources of latency, from the detection of the input itself, to the rendering, to the time it takes for the display to illuminate its pixels.

For desktop-class VR, current state of the art VR headsets have displays running at 90Hz, which means that they’re capable of showing 90 images per second. And while we’ve seen that 90Hz is more than sufficient for a comfortable VR experience, NVIDIA Vice President of Research David Luebke says that ever higher refresh rates could improve the VR experience by further reducing latency.

NVIDIA Vice President of Research David Luebke at GTC 2016

At GTC 2016 this week, Luebke demonstrated an experimental display with a refresh rate almost 20 times faster than what we see in current consumer head mounted displays. Running at a whopping 1,700Hz, the display was mounted on a rail system which allowed it to be rapidly moved back and forth. When shaken vigorously, the image on the display stayed locked in place to an impressive degree. Even when magnified closely, the image on the screen seemed entirely fixed in place.

A 90Hz display shows an image every 11 milliseconds, while this 1,700Hz display shows an image every 0.58 milliseconds.
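Those frame times follow directly from the refresh rate – the interval between frames is simply its reciprocal, as a quick sketch shows:

```python
def frame_time_ms(refresh_hz):
    """Interval between successive frames, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (90, 1700):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per frame")
```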

“…if you can apply this to a VR display, that kind of ultra-low latency would help things stay rock-solid in the environment, to the point that the display would no longer be a source of latency. So this is effectively a zero latency display,” said Luebke.

One thing I find particularly interesting about this (and all VR displays in general) is that while the object on the screen appeared to be fixed in space to our eye, in reality, the image is racing back and forth across the display, illuminating many different pixels across the screen as it goes. The illusion that it’s still is actually evidence of how quickly it can move, which is curiously counterintuitive.

“You could put this thing in a paint shaker and it would appear to stay solid… it’s very cool,” Luebke said.

Of course, for this level of tracking you also need extremely low latency input. Thus a second reason for the rail system is revealed; Luebke told me that wheels on the rails feed the movement of the carriage almost instantaneously into the system. Without such precise, low latency input, even a display as fast as the one demonstrated wouldn’t appear to show such a steady image, highlighting the need for low latency across the entire ‘motion to photons’ pipeline.

While less than 20ms of latency from input to display is generally considered good enough for VR, Luebke said that things get better toward 10ms, and there are measurable benefits down to as low as 1ms of latency.

Until we can brute-force our way to zero latency with super high refresh rates like Luebke’s demonstration, a technique called low persistence is employed by modern VR headsets to capture some of the benefits of a super fast display, namely blur reduction. Low persistence works by illuminating the display only briefly, then turning it off until the next frame is ready (rather than keeping it illuminated continuously from one frame to the next).
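To put rough numbers on that, suppose a 90Hz panel is lit for only a couple of milliseconds per frame (the 2ms figure below is illustrative, not a quoted spec from any headset):

```python
def persistence_duty_cycle(refresh_hz, lit_ms):
    """Fraction of each frame during which a low-persistence
    display is actually illuminated."""
    frame_ms = 1000.0 / refresh_hz  # ~11.1 ms at 90 Hz
    return lit_ms / frame_ms

# Lit for 2 ms of an ~11.1 ms frame: the panel is dark ~82% of the time,
# which is what cuts the perceived smear during head movement.
print(f"{persistence_duty_cycle(90, 2.0):.0%} lit per frame")
```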


Road to VR is a proud media partner of GTC 2016

The post NVIDIA Demonstrates Experimental “Zero Latency” Display Running at 1,700Hz appeared first on Road to VR.


OSVR HDK 2160×1200 Screen Upgrade Kit Now Available


OSVR has launched a display upgrade kit for the HDK headset which allows users to swap a new, higher performance panel into their existing HDK headsets.

The $220 screen upgrade kit, sold at the OSVR Store, offers HDK 1.3 & 1.4 owners a way to upgrade to the same display found in the recently launched HDK 2 headset. The upgraded OLED panel has a resolution of 2160×1200 and refresh rate of 90Hz, bringing the screen’s on-paper specs up to that of the Oculus Rift and HTC Vive. The display found in the HDK 1.3 and 1.4 is 1920×1080 running at 60Hz.

See Also: OSVR Announces $399 HDK 2 Headset with 2160×1200 Resolution

OSVR says the upgrade should take as little as 15 minutes, but notes that inserting the new display will mean the loss of an auxiliary USB connector on the side of the headset. No specialized tools are needed; the 10-page instructions appear simple enough for anyone who has opened up their desktop computer to replace RAM or other components. You’ll be exposing the inside of the headset, and could do damage if you aren’t careful, but it isn’t anything unexpected for a product bearing the name ‘Hacker Development Kit’.


This is good news for HDK 1.3 and 1.4 owners who were promised an open, modular device. From our understanding of OSVR’s open-source nature, this display could even be used (and sold) in other VR headsets made by other companies.

Availability of the HDK screen upgrade kit brings hope for further modular upgrades for the headsets, like the Leap Motion-embedded faceplate which the company announced more than a year ago, and forward-looking upgrades like the possibility of a Lighthouse tracking upgrade (now that Valve is licensing the technology).


Disclosure:​ At the time of writing, OSVR is running advertisements on Road to VR.

The post OSVR HDK 2160×1200 Screen Upgrade Kit Now Available appeared first on Road to VR.

Display Conglomerate JDI Developing Ultra High Resolution Panels for VR Headsets


JDI, a display conglomerate consisting of the display businesses of Sony, Toshiba, and Hitachi, announced this week the development of ultra high resolution panels that are made specifically for virtual reality headsets.

In a press release issued this week, JDI says that it has already begun shipments of a 651 PPI (pixels per inch) made-for-VR LCD display which uses the RGB subpixel layout (compared to the PenTile layout used by the Rift and Vive displays). The 1700 x 1440 3.42 inch display has a 1.18:1 aspect ratio, meaning it’s nearly square compared to the typical 16:9 rectangular aspect ratio of a smartphone or TV display; that’s because two of the displays are designed to be used in a VR headset—one for each lens—which also opens the door to hardware IPD adjustment (like we see on the Rift and Vive).

Comparison of pixel fineness between a low ppi display vs. a high ppi display | Photo courtesy JDI

For comparison, the Rift uses a pair of 1080 x 1200 displays (same as Vive) with a PPI of ~456, while Gear VR’s PPI comes in at 575 (when used with the Galaxy S7). So the new JDI display has around 43% higher PPI than the Rift and the Vive, and around 13% more than Gear VR. When it comes to pixel count, the numbers are even more impressive; the JDI display has 2,448,000 pixels per eye, 89% more than the 1,296,000 of the Rift and Vive, and 33% more than Gear VR’s 1,843,200.
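Working those per-eye figures through (the ‘more than’ percentages here are computed relative to the smaller display; the Gear VR per-eye split of 1280 x 1440 is inferred from the quoted 1,843,200 total):

```python
# Per-eye resolutions as quoted in the article
displays = {
    "JDI":       1700 * 1440,  # 2,448,000
    "Rift/Vive": 1080 * 1200,  # 1,296,000
    "Gear VR":   1280 * 1440,  # 1,843,200 (half of a 2560x1440 panel)
}

jdi = displays["JDI"]
for name in ("Rift/Vive", "Gear VR"):
    extra = (jdi - displays[name]) / displays[name]
    print(f"JDI has {extra:.0%} more pixels per eye than {name}")
```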

Comparison of motion blur between a smartphone display vs. a VR display | Photo courtesy JDI

JDI says the IPS display is capable of a 90Hz refresh rate and a 3ms black-to-white response time, which is critical to keep low to reduce motion blur that’s especially noticeable when in VR.

SEE ALSO
NVIDIA Demonstrates Experimental "Zero Latency" Display Running at 1,700Hz

The company says they have already begun shipping samples of the 651 PPI VR display, but they aren’t stopping there; JDI is already teasing the development of an 800 PPI display, which would represent a massive increase in pixel count, somewhere in the realm of 2088 x 1768 resolution (assuming the same 3.42 inch size). That would put it at 3,691,584 pixels per eye, nearly 2.9 times what’s in the Rift and Vive today, and double Gear VR’s count.

Even though 1920 x 1080 looks sharp and fine on a standard desktop monitor or HDTV, the nature of a VR headset demands much more pixel density to achieve the same apparent fidelity. That’s because the screens in a VR headset are mere inches from your eyes, while being magnified and stretched by the lenses around a far larger field of view. As display technology achieves increasingly higher resolution, the image in the headset becomes clearer and more immersive, thanks to it being harder to see the individual pixels. The effect of 3D is also enhanced as resolution increases because lines become sharper, making it easier for your eyes to identify depth cues.

SEE ALSO
Oculus Chief Scientist Predicts the Next 5 Years of VR Technology

But the need to render high resolutions means the need for greater processing power to fill all those pixels, a challenge which many think foveated rendering should make much more tractable in the next few years.

The post Display Conglomerate JDI Developing Ultra High Resolution Panels for VR Headsets appeared first on Road to VR.

Kopin Unveils ‘Lightning’ 2k x 2k 120Hz OLED Microdisplay for Mobile VR


Display specialists Kopin have unveiled a tiny, high resolution OLED ‘microdisplay’ designed specifically for use inside immersive headsets. It’s dubbed ‘Lightning’ and it could open the door to a thinner, lighter form of VR headset.

Myriad VR hardware continues to be announced at CES 2017 as the space remains one of the technology industry’s hot topics. Among the announcements is a tiny new display from a company called Kopin, one that’s not only interesting but potentially extremely pertinent to next generation VR hardware.

Kopin, a company who designs and manufactures displays (among other components), are extending their reach to immersive headsets. They’ve debuted a new display called ‘Lightning’, an OLED microdisplay boasting a 2048 x 2048 pixel resolution and an impressive 120Hz refresh rate at a diminutive size of 1 inch diagonal. On top of that, the display boasts low power consumption, low heat dissipation, and a low refresh latency of 10 microseconds.

In order to demonstrate the display as a solution for mobile VR, Kopin have integrated it with their own patented optics dubbed ‘Pantile’. Using these, Kopin state they can wring a 90 degree FOV (Field of View) from the displays, all in a form factor that is no larger than a thick pair of glasses, according to a new report.

In the past, microdisplays were seen as unsuitable for immersive HMDs because lens technology couldn’t provide the high magnification needed for such small displays without introducing distortion and artefacts. Today, companies are starting to develop novel optical solutions that get around this, bending light in ways traditional lenses can’t in order to achieve a high FOV. eMagin, for example, debuted a similar solution with a 2k by 2k per-eye display and its own specialized optics technology providing a 100 degree FOV.

SEE ALSO
eMagin Announces 2K×2K 'Flip Up' VR Headset, Demoing at AWE 2015

Kopin’s optics seem to be using something similar to fresnel technology, which both the HTC Vive and Oculus Rift use. However, thanks to the ridged nature of this lens type (see the image of the Vive consumer headset’s lens below), we’ve seen fresnel optics produce ‘god ray’ or visible ‘ringing’ artifacts that can distract from the VR experience. It’s unknown yet whether these new optics – which must magnify far more strongly to work with such a tiny display – will exacerbate artifacts like those. On the other hand, Kopin has shown interest in providing the display to customers and partners, so coupled with advanced optics from other sources, the end result could be fewer visual issues.

Another interesting detail about the new microdisplay is Kopin’s claimed ability to manufacture it at $50 per panel, at least according to John Fan, the CEO of Kopin, who said as much in a recent interview. Fan also teased that their roadmap for “late 2017” includes increasing the display’s resolution to 3k x 3k. For comparison, the Rift and Vive are 1080 x 1200 per eye, so, if successful, Kopin’s new display could nearly triple that resolution in each dimension.

We’ve not yet seen Kopin’s technology in action, of course, and clearly there are some significant challenges to overcome as detailed above. But if the company can pull it off, we could see that dream ‘sunglasses’ form factor for VR devices a little sooner than we expected.

The post Kopin Unveils ‘Lightning’ 2k x 2k 120Hz OLED Microdisplay for Mobile VR appeared first on Road to VR.

Avegant Claims Newly Announced Display Tech is “a new method to create light fields”


Avegant, makers of the Glyph personal media HMD, are turning their attention to the AR space with what they say is a newly developed light field display for augmented reality, one which can show multiple objects at different focal planes simultaneously.

Most of today’s AR and VR headsets suffer from something called the vergence-accommodation conflict. In short, it’s a mismatch between biology and display technology: a screen just inches from the eye sends all of its light into the eye at the same angle (whereas normally that angle changes based on how far away an object is), causing the lens of the eye to focus (called accommodation) only on light from that one distance. This comes into conflict with vergence, which is the relative angle between our two eyes when they rotate to fixate on the same object. In real life and in VR this angle is dynamic, and normally accommodation happens in the eye automatically at the same time; in most AR and VR displays today it can’t, because of the static angle of the incoming light.

For more detail, check out this primer:

Accommodation

Accommodation is the bending of the eye’s lens to focus light from objects at different depths. | Photo courtesy Pearson Scott Foresman

In the real world, to focus on a near object, the lens of your eye bends to focus the light from that object onto your retina, giving you a sharp view of the object. For an object that’s further away, the light is traveling at different angles into your eye and the lens again must bend to ensure the light is focused onto your retina. This is why, if you close one eye and focus on your finger a few inches from your face, the world behind your finger is blurry. Conversely, if you focus on the world behind your finger, your finger becomes blurry. This is called accommodation.

Vergence

Vergence is the rotation of each eye to overlap each individual view into one aligned image. | Photo courtesy Fred Hsu (CC BY-SA 3.0)

Then there’s vergence, which is when each of your eyes rotates inward to ‘converge’ the separate views from each eye into one overlapping image. For very distant objects, your eyes are nearly parallel, because the distance between them is so small in comparison to the distance of the object (meaning each eye sees a nearly identical portion of the object). For very near objects, your eyes must rotate sharply inward to converge the image. You can see this too with our little finger trick as above; this time, using both eyes, hold your finger a few inches from your face and look at it. Notice that you see double-images of objects far behind your finger. When you then look at those objects behind your finger, now you see a double finger image.

The Conflict

With precise enough instruments, you could use either vergence or accommodation to know exactly how far away an object is that a person is looking at. But the thing is, both accommodation and vergence happen in your eye together, automatically. And they don’t just happen at the same time; there’s a direct correlation between vergence and accommodation, such that for any given measurement of vergence, there’s a directly corresponding level of accommodation (and vice versa). Since you were a little baby, your brain and eyes have formed muscle memory to make these two things happen together, without thinking, any time you look at anything.

But when it comes to most of today’s AR and VR headsets, vergence and accommodation are out of sync due to inherent limitations of the optical design.

In a basic AR or VR headset, there’s a display (which is, let’s say, 3″ away from your eye) which shows the virtual scene and a lens which focuses the light from the display onto your eye (just like the lens in your eye would normally focus the light from the world onto your retina). But since the display is a static distance from your eye, the light coming from all objects shown on that display is coming from the same distance. So even if there’s a virtual mountain five miles away and a coffee cup on a table five inches away, the light from both objects enters the eye at the same angle (which means your accommodation—the bending of the lens in your eye—never changes).

That comes in conflict with vergence in such headsets which—because we can show a different image to each eye—is variable. Being able to adjust the image independently for each eye, such that our eyes need to converge on objects at different depths, is essentially what gives today’s AR and VR headsets stereoscopy. But the most realistic (and arguably, most comfortable) display we could create would eliminate the vergence-accommodation issue and let the two work in sync, just like we’re used to in the real world.

Solving the vergence-accommodation conflict requires being able to change the angle of the incoming light (same thing as changing the focus). That alone is not such a huge problem; after all, you could just move the display further away from your eyes to change the angle. The big challenge is allowing not just dynamic change in focus, but simultaneous focus—just like in the real world, you might be looking at a near and far object at the same time, each with a different focus. Avegant claims its new light field display technology can do both dynamic focal plane adjustment and simultaneous focal plane display.
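To see why vergence is such a reliable depth signal, consider the basic geometry: the convergence angle follows directly from the eyes’ separation and the fixation distance. A rough sketch, assuming a typical 64 mm interpupillary distance (a simplification of real gaze geometry):

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.064):
    """Angle between the two eyes' lines of sight when fixating
    a point straight ahead at the given distance."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# The angle falls off quickly with distance, which is why the eyes are
# nearly parallel for far objects and sharply converged for near ones.
for d in (0.25, 0.5, 2.0, 10.0):
    print(f"{d:5.2f} m -> {vergence_angle_deg(d):5.2f} deg")
```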

Avegant Light Field design mockup

We’ve seen proof of concept devices before which can show a limited number (three, or so) of discrete focal planes simultaneously, but that means you only have a near, mid, and far focal plane to work with. In real life, objects can exist in an infinite number of focal planes, which means that three is far from enough if we endeavor to make the ideal display.

Avegant CTO Edward Tang tells me that “all digital light fields have [discrete focal planes] as the analog light field gets transformed into a digital format,” but also says that their particular display is able to interpolate between them, offering a “continuous” dynamic focal plane as perceived by the viewer. The company also says that objects can be shown at varying focal planes simultaneously, which is essential for doing anything with the display that involves showing more than one object at a time.

Above: CGI representation of simultaneous display of varying focal planes. Note how the real hand and rover go out of focus together. This is an important part of making augmented objects feel like they really exist in the world.

Avegant hasn’t said how many simultaneous focal planes can be shown at once, or how many discrete planes there actually are.

From a feature standpoint, this is similar to reports of the unique display that Magic Leap has developed but not yet shown publicly. Avegant’s announcement video of this new tech (heading this article) appears to invoke Magic Leap with solar system imagery which looks very similar to what Magic Leap has teased previously. A number of other companies are also working on displays which solve this issue.

SEE ALSO
'HOLOSCOPE' Headset Claims to Solve AR Display Hurdle with True Holography

Tang is being tight lipped on just how the tech works, but tells me that “this is a new optic that we’ve developed that results in a new method to create light fields.”

So far the company is showing off a functioning prototype of its light field display (seen in the video) as well as a proof-of-concept headset representing the form factor the company says could eventually be achieved.

We’re hoping to get our hands on the headset soon to see what impact the light field display makes, and to confirm other important details like field of view and resolution.

The post Avegant Claims Newly Announced Display Tech is “a new method to create light fields” appeared first on Road to VR.

Understanding Pixel Density & Retinal Resolution, and Why It’s Important for AR/VR Headsets


While most of us are used to dealing with resolution figures that describe pixel count (ie: a 1920×1080 monitor), pixel density stated as pixels per degree is a much more useful figure, especially when dealing with AR and VR headsets. Achieving ‘Retinal resolution’ is the ultimate goal for headsets, where at a certain pixel density, even people with perfect vision can’t discern any additional detail. This article explores those concepts, and takes a look at how far today’s headsets are from retinal resolution.


Guest Article by Yuval Boger

Yuval is CEO of Sensics and co-founder of OSVR. Yuval and his team designed the OSVR software platform and built key parts of the OSVR offering. He frequently shares his views and knowledge on his blog.

If the human eye were a digital camera, its ‘data sheet’ would say that it has a sensor capable of detecting 60 pixels/degree at the fovea (the part of the retina where the visual acuity is highest). For visual quality, any display above 60 pixels/degree is essentially wasting resolution because the eye can’t pick up any more detail. This is called retinal resolution, or eye-limiting resolution.

This means that if there were an image with 3,600 pixels (60 x 60), and that image fell on a 1° x 1° area of the fovea, a person would not be able to tell it apart from an image with 8,100 pixels (90 x 90) falling on the same 1° x 1° area.

Note: the 60 pixels/degree figure is sometimes expressed as “1 arc-minute per pixel”. Not surprisingly, an arc-minute is an angular measurement defined as 1/60th of a degree. This kind of calculation is the basis for what Apple refers to as a “retina display”: a screen that, when held at the right distance, generates this kind of pixel density on the retina.

If you have a VR headset, you can calculate the pixel density—how many pixels per degree it presents to the eye—by dividing the number of pixels in a horizontal display line by the horizontal field of view provided by the lens. For instance, the Oculus Rift DK1 dev kit (yes, I know that was quite a while ago) used a single 1280 x 800 display (so 640 x 800 pixels per eye) and with a monocular horizontal field of view of about 90 degrees, it had a pixel density of just over 7 pixels/degree (640 ÷ 90). You’ll note that this is well below the retinal resolution of 60 pixels per degree.

Not to pile on the DK1 (it had many good things, though resolution was not one of them), 7 pixels/degree is the linear pixel density. When you think about it in terms of pixel density per surface area, it’s not just 8.5 times worse than the human eye (60 ÷ 7 = 8.5) but actually a lot worse (8.5 × 8.5 which is over 70).
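The same arithmetic generalizes to any headset; a small helper, using the DK1 figures above (the article rounds 640 ÷ 90 down to 7 before squaring, so its “over 70” areal figure agrees with the ~71x computed from the unrounded density):

```python
def pixels_per_degree(horizontal_pixels, horizontal_fov_deg):
    """Approximate linear pixel density presented to the eye,
    assuming pixels are spread evenly across the field of view."""
    return horizontal_pixels / horizontal_fov_deg

dk1 = pixels_per_degree(640, 90)  # Oculus DK1: ~7.1 px/deg
linear_gap = 60 / dk1             # ~8.4x short of the 60 px/deg fovea
areal_gap = linear_gap ** 2       # ~71x fewer pixels per unit of solid angle

print(round(dk1, 1), round(linear_gap, 1), round(areal_gap))
```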

The following table compares pixel densities for some popular consumer and professional HMDs:

| VR Headset | Horizontal Pixels Per Eye | Approximate Horizontal Field of View (degrees per eye) | Approximate Pixel Density (pixels/degree) |
|---|---|---|---|
| Oculus DK1 | 640 | 90 | 7 |
| OSVR HDK1 | 960 | 90 | 11 |
| HTC Vive | 1080 | 100 | 11 |
| Sensics dSight | 1920 | 95 | 20 |
| Sensics zSight | 1280 | 48 | 27 |
| Sensics zSight 1920 | 1920 | 60 | 32 |
| Human fovea | – | – | 60 |

Higher pixel density allows you to see finer details—read text; see the grain of the leather on a car’s dashboard; spot a target at a greater distance—and in general contributes to an increasingly realistic image.

Historically, one of the things that separated professional-grade VR headsets from consumer headsets was a higher pixel density. Let’s simulate this using the following four images. Let’s assume that the first image (taken from Epic’s Showdown demo) is shown at full 60 pixels/degree density (which it could be, depending upon the resolution and distance you sit from your monitor). We can then re-sample it at half the pixel density (simulating 30 pixels/degree) and then half again (15 pixels/degree) and half again (7.5 pixels/degree). Notice the stark differences as we go to lower and lower pixel densities.

Photo courtesy Epic Games
Full resolution (simulating 60 pixels/degree) | Photo courtesy Epic Games
Photo courtesy Epic Games
Half resolution (simulating 30 pixels/degree) | Photo courtesy Epic Games
Photo courtesy Epic Games
Simulating 15 pixels/degree | Photo courtesy Epic Games
Photo courtesy Epic Games
Simulating 7.5 pixels/degree | Photo courtesy Epic Games
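The simulation above boils down to down-sampling an image and enlarging it back to its original size. A stdlib-only toy sketch on a tiny grayscale grid (using nearest-neighbor sampling for clarity; the article’s images were presumably resampled with proper filtering):

```python
def simulate_density(image, factor):
    """Down-sample a 2D grayscale image by `factor` and blow it
    back up to the original size, mimicking a lower pixel density."""
    h, w = len(image), len(image[0])
    small = [[image[y][x] for x in range(0, w, factor)]
             for y in range(0, h, factor)]
    # Nearest-neighbor enlargement back to h x w
    return [[small[y // factor][x // factor] for x in range(w)]
            for y in range(h)]

# A 4x4 gradient halved in density: detail within each 2x2 block is lost
img = [[x + 4 * y for x in range(4)] for y in range(4)]
print(simulate_density(img, 2))
```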

Higher pixel density for the visual system is not necessarily the same as higher pixel density for the screen because pixels on the screen are magnified through the optics. The same screen could be magnified differently with two different optical systems resulting in different pixel densities presented to the eye. It is true, though, that given the same optical system, higher pixel density of pixels on the screen does translate to higher pixel density presented to the eye.

As screens get better and better, we will get increasingly closer to eye-limiting resolution in the headset and thus closer to photo-realistic experiences.

The post Understanding Pixel Density & Retinal Resolution, and Why It’s Important for AR/VR Headsets appeared first on Road to VR.

Samsung’s New VR Display Has Nearly 3.5x More Pixels Than Rift & Vive


At last week’s Display Week 2017 conference, Samsung showed off a new ultra-high resolution display for VR headsets that more than triples the pixel count of the displays in the Oculus Rift and Vive.

A new display from Samsung targeting use in VR headsets packs a whopping 2,024 x 2,200 pixels into a 3.5″ form-factor, delivering an impressive 858 PPI, nearly twice the 460 PPI of the Rift and the Vive. The display is also capable of a 90Hz refresh rate and 100 nits brightness. From a raw pixel-count standpoint, Samsung’s new VR display has 3.4 times the number of pixels in those headsets.
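PPI figures like these follow directly from a panel’s resolution and diagonal size; a quick sketch of the calculation:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch measured along the panel's diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(2024, 2200, 3.5)))  # ~854, in line with the quoted ~858
print(round(ppi(1080, 1200, 3.5)))  # ~461 for a Rift/Vive-class panel
```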

The new display was shown off by the company at Display Week 2017. Seen in photos posted to Reddit by user ‘Krenzo’, the display was shown side-by-side against what we presume to be the same 3.5″ 1,080 x 1,200 display presently used in the Rift and Vive. Both the old display and the new were shown inside Gear VR shells; seen through the lens was a high-resolution image of Where’s Waldo for comparison.

Photos through the lens of each headset shared by Krenzo reveal a major reduction in the so-called ‘screen door effect’ and the visibility of individual pixels seen on Samsung’s new VR display.

Samsung currently provides the displays in both the Rift and Vive—both of which use two individual displays of 1080×1200 resolution per-eye with a 90Hz refresh rate—which means it’s quite likely that this new display is destined for future generations of those headsets.

Samsung also makes the smartphone displays in the company’s Gear VR compatible phones, which actually have a higher 1,280 x 1,440 per-eye resolution than the Rift or Vive, but are less suitable for those headsets due to the aspect ratio.

SEE ALSO
Hands-on: Latest StarVR Upgrades Highlight Ultra-wide FoV & Nearly Invisible Pixels

Even when compared to the higher resolution Gear VR display, the new Samsung VR display has 2.4x more pixels and a substantial increase in PPI.

The post Samsung’s New VR Display Has Nearly 3.5x More Pixels Than Rift & Vive appeared first on Road to VR.

Kopin Reveals “Smallest VR Headset” With 2k x 2k Per Eye Resolution @120Hz


Display specialists Kopin, in partnership with Chinese company GoerTek, have announced a new reference VR headset design which they claim is the smallest of its kind, integrating the firm’s ‘Lightning’ OLED microdisplay panels, each sporting a substantial 2k x 2k resolution.

One of the key ‘most wanted’ advances desired in today’s retail virtual reality headsets is higher resolution displays. Recently we reported on Samsung’s prototype OLED panels sporting a PPI (pixels per inch) figure of 858, nearly twice that of the current generation HTC Vive and Oculus Rift headsets. Now, micro display specialist Kopin have unveiled a new reference design headset with displays that top even that.

The adorably named ‘Elf VR’ headset is equipped with two of Kopin’s “Lightning” OLED microdisplay panels, which each feature a 2048 x 2048 resolution, providing “binocular 4K image resolution at a 120Hz refresh rate” – a claim which is misleading, as the 2048 horizontal pixels are ‘per eye’ and so cannot resolve the 3840 horizontal pixels required for an equivalent ‘UHD’ image (even ignoring the shortfall in vertical resolution). In case you’re wondering, each diminutive display packs an impressive 2,940 pixels per inch – more than six times the ~456 PPI of the existing Samsung panels in the Vive and Rift.

A Kopin Micro Display [Image courtesy Kopin]
Going by images included in our recent report on those prototype Samsung panels, this would substantially reduce screen door effect, artifacts caused by the visible gaps between display elements. What’s more, Elf VR should offer not only a great visual experience for traditional VR content, but also an impressive bump for 360 and standard movie watching too.

“It is now time for us to move beyond our conventional expectation of what virtual reality can be and strive for more,” explained Kopin founder and CEO John Fan as part of a recent press release. “Great progress has been made this year, although challenges remain. This reference design, created with our partner Goertek, is a significant achievement. It is much lighter and fully 40% smaller than standard solutions, so that it can be worn for long periods without discomfort. At the same time, our OLED microdisplay panel achieves such high resolution and frame rate that it delivers a VR experience that truly approaches reality for markets including gaming, pro applications or film.”

SEE ALSO
Kopin Unveils 'Lightning' 2k x 2k 120Hz OLED Microdisplay for Mobile VR

Of course, the other major statistic of interest for VR headsets is the expansiveness of the field of view (FOV), or how much of your peripheral vision is encompassed by the image. Smaller displays bring optical challenges in achieving immersive FOVs. Kopin claim they are tackling this with a two-pronged approach: their reference design includes two “multi-lens” optical design branches. The first is a unit targeting the aforementioned media / movie watching category, offering a 70 degree FOV (it’s not stated whether this is horizontal, vertical, or diagonal) and presenting a sharper image with higher pixel density. The second offers a much greater 100 degree FOV, presumably at the sacrifice of optical sharpness.

Of course, with smaller integrated panel hardware and these optical systems, the other benefit of Kopin’s approach could be weight. Kopin claim they’ve managed to shrink the optical module by 60%, leveraging a 50% weight reduction – although as no baseline numbers were provided, we’re not sure what this comparison refers to.

As we’ve seen time and again, the most recent VR renaissance continues to act as an impressive catalyst for technological innovation across multiple fields. And with both Samsung and Kopin already at a stage where they can produce next generation VR displays, it hopefully won’t be too long before we begin to see tangible upgrades over existing ‘first gen’ hardware. That may mean mid 2018, at least according to Oculus founder Palmer Luckey, speaking in a recent interview.

The post Kopin Reveals “Smallest VR Headset” With 2k x 2k Per Eye Resolution @120Hz appeared first on Road to VR.


AxonVR is Building a Generalized Haptic Display


AxonVR was awarded US Patent No. 9,652,037 for a “whole-body human-computer interface” on May 16th, which includes an external exoskeleton as well as a generalized haptic display built on microfluidic technology. I had a chance to demo AxonVR’s HaptX™ haptic display, which uses a “fluidic distribution laminate” with channels and actuators to form a fluidic integrated circuit of sorts, able to simulate variable stiffness and friction of materials.

At GDC, I stuck my hand into a 3-foot cube device with my palm facing upward. I could drop virtual objects into my hand, and an array of tactile pixels simulated the size, shape, weight, texture, and temperature of these virtual objects. The virtual spider in my hand was the most convincing demo, as the visual feedback helped convince my brain that I was holding the virtual object. Most of the sensations were focused on the palm of the hand, and the fidelity was not high enough to provide convincing feedback to my fingertips. The temperature demos were also impressive, but were a large contributor to the bulkiness of the demo. They’re in the process of miniaturizing the system and integrating it with an exoskeleton for more force feedback, though the temperature features are unlikely to make it into the mobile implementations of the technology.

LISTEN TO THE VOICES OF VR PODCAST

I had a chance to talk with AxonVR CEO Jake Rubin about the process of creating a generalized haptic device, their plans for an exoskeleton for force feedback, and how they’re creating tactile pixels to simulate the cutaneous sensation of different shapes and texture properties. Rubin said that the Experiential Age has only one end point, and that’s full immersion. In order to create something like the Holodeck, Rubin thinks a generalized haptic device will unlock an infinite array of applications and experiences, analogous to what general computing devices have enabled. AxonVR is not a system that’s going to be ready for consumer home applications any time soon, but their microfluidic approach to haptics is a foundational technology that is going to be proven out in simulation training, engineering design, and digital out-of-home entertainment applications.


Support Voices of VR

Music: Fatality & Summer Trip

The post AxonVR is Building a Generalized Haptic Display appeared first on Road to VR.

Kopin’s Prototype VR Headset is Incredibly Thin & Light, More Than 3x the Pixels of Rift and Vive


Kopin is touting a new prototype VR headset featuring their 4K OLED ‘Lightning’ microdisplay that they say is made specifically for VR. At nearly half the size of other headsets, and made from lightweight materials, the device feels featherlight compared to VR products on the market today.

Kopin is a publicly traded display manufacturer founded in 1984. With the massive buzz generated by VR, the firm has developed a roadmap for manufacturing displays specifically for VR headsets. Microdisplays are by nature small, incredibly pixel dense, and capable of high refresh rates.

The first microdisplay that Kopin is positioning for VR is what they’re calling ‘Lightning’, a 1-inch display with 2,048 x 2,048 per-eye resolution and running at a whopping 120Hz. With the Rift and Vive using displays of 1,080 x 1,200 pixels, Kopin’s Lightning display has just over 3.2 times as many pixels, and runs substantially faster than the 90Hz refresh rate of those headsets.

Photo by Road to VR

The tiny size of the microdisplay also brings another advantage: the potential for a much shorter focal length. Consumer VR headsets on the market are all roughly the same bulk size, not because we can’t design smaller enclosures, but because the physics of light requires that the displays be a certain distance from the lenses in order to present a focused image to the user’s eye. A smaller image allows for a shorter focal length, which means the displays don’t need to be as far from the lenses, potentially resulting in a much more compact headset.
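The size/focal-length tradeoff described above is first-order lens geometry; here is a rough sketch using a simple magnifier model with the panel sitting one focal length from the lens (ignoring distortion and eye relief; the numbers are illustrative, not Kopin’s actual optics):

```python
import math

def monocular_fov_deg(display_width_m, focal_length_m):
    """Approximate monocular horizontal FOV for a simple magnifier
    design where the panel sits one focal length behind the lens."""
    return math.degrees(2 * math.atan((display_width_m / 2) / focal_length_m))

# A ~18 mm-wide square 1-inch-diagonal microdisplay vs. a ~90 mm
# smartphone-class panel: the microdisplay reaches a wide FOV with a
# far shorter focal length, hence a much shallower headset.
print(round(monocular_fov_deg(0.018, 0.012)))  # compact optics
print(round(monocular_fov_deg(0.090, 0.045)))  # big panel, long focal length
```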

Kopin has worked with Chinese ODM Goertek to develop a prototype VR headset that employs their Lightning microdisplay. The result is an incredibly compact and lightweight device that is an absolute joy to wear compared to the bulk of today’s consumer headsets.

Photo by Road to VR

I got to handle and wear a functional prototype at E3 2017, but unfortunately I didn’t actually get to see VR content through it; according to the company, the only computer it had on hand that was cooperating with the demands of driving a custom 4,096 x 2,048 resolution across both displays at 120Hz had to be shipped off to CES Asia (another conference also running this week). I expect to meet with Kopin again in the near future to see content running on the prototype headset; for now I can only talk about the form factor.

Photo courtesy Kopin

Compared to the consumer headsets on the market today, even the very lightest among them (like Gear VR and Daydream View), the Kopin prototype headset feels feather-light (note that it was missing a small driver board for the displays which would add slightly to the weight). A single flexible strap that goes around the back of your head holds the device on your face with ease, no top strap required. The shell was made from a thin and extremely lightweight plastic. It was rigid, but it’s unclear to me whether this material is durable enough to stand up to consumer usage; they may need to shift to a thicker or more durable material, which could push the weight up some.

Photo courtesy Kopin

In photos alone it’s hard to appreciate how much smaller the Kopin headset is than others, but it feels much closer to the size and weight of a pair of ski goggles; it hugs close around your eyes without taking over so much of your face. It’s not nearly as ‘deep’ either, meaning it doesn’t jut out so far from your face. The slender profile compounds with the light weight since the leverage is not nearly as great as it would be with a bigger enclosure sticking further out from your face.

Photo by Road to VR

If and when most immersive VR headsets achieve this form factor, it’s going to make a massive difference in comfort and ease-of-use for VR.

Continued on Page 2: Microdisplay Tradeoffs »

The post Kopin’s Prototype VR Headset is Incredibly Thin & Light, More Than 3x the Pixels of Rift and Vive appeared first on Road to VR.

Startup Aims for Retinal Resolution VR Display With 70x “effective resolution” of Today’s Headsets


Finland-based Varjo has announced a new display technology for AR and VR which the company is calling “the first human eye-resolution VR/AR/XR immersive display.” Varjo claims an “effective resolution” that’s nearly 70 times greater than the Rift and Vive.

We’ve recently seen new VR displays from both Samsung and Kopin with an impressive ~3.25x increase in resolution compared to the Rift and Vive.

Varjo however claims an effective resolution that’s nearly 70 times greater than today’s headsets. They’re doing so with an interesting display layout which combines typical VR displays with higher density micro-displays to make part of the image hyper-sharp. The company says that portion of the image achieves retinal resolution, meaning that the pixel density is so great that you can’t tell individual pixels apart.

SEE ALSO
Understanding Pixel Density & Retinal Resolution, and Why It's Important for AR/VR Headsets

To achieve this, Varjo is projecting a high-resolution microdisplay into the center of a lower resolution display. The effect, as reported by The Verge, is like having a super high-resolution window right in the middle of your field of view, with lower quality everywhere else; that’s why they’re calling this “effective resolution” rather than “resolution.” Although the super high-resolution area is presently limited in its field of view, the visual fidelity it offers looks very impressive.
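The layout can be sketched as simple image compositing. The sketch below is illustrative only: the resolutions are assumptions, and the real system blends the two displays optically rather than pasting pixels in software.

```python
import numpy as np

# Toy sketch of a Varjo-style composite: a coarse full-field-of-view
# "context" image with a sharp high-resolution inset over the center.
H, W = 1200, 1080              # background ("context") resolution (illustrative)
inset_h, inset_w = 400, 400    # region covered by the microdisplay,
                               # in background-pixel units (illustrative)

background = np.zeros((H, W, 3), dtype=np.uint8)              # coarse image
inset = np.full((inset_h, inset_w, 3), 255, dtype=np.uint8)   # sharp center

# Paste the inset over the middle of the field of view.
top = (H - inset_h) // 2
left = (W - inset_w) // 2
composite = background.copy()
composite[top:top + inset_h, left:left + inset_w] = inset
```

In the real headset the "paste" happens in the optics, and the reported border artifact corresponds to the hard edge this naive overlay produces.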

Above: Through-the-lens view of a limited portion of Varjo’s prototype headset. Below: the same view (and limited portion) of the Oculus Rift | Image courtesy Varjo

The Verge’s Sean O’Kane explains what it was like to look through the company’s prototype headset, which is built from an Oculus Rift:

Looking at them through that window in the center gave each of these scenes new life. Textures that were obscured by the Oculus’ dual 1080 x 1200 displays could now be seen in more lifelike detail. I could read individual filenames in the folders on the virtual desktop. The cockpit of the plane was especially striking. Looking at it through the Oculus displays surrounding Varjo’s tech, I couldn’t understand any of the labels on the many knobs and switches at my virtual fingertips. Looking “through” those microdisplays, though, I was able to read all of them.

O’Kane notes a few issues with the current prototype, like a noticeable border separating the high-resolution and low-resolution parts of the image (rather than a seamless blend between them), and a mismatched framerate between the two displays. According to the report, Varjo says it can fix both of these issues, and has set an aggressive timeline for what it says will be a professional-focused VR headset coming in 2018, priced at “thousands of dollars.” The company also says the tech could be used for AR applications.

 

Image courtesy The Verge

 

Combining multiple displays to form a single view in a VR headset is not a new idea, but it is an interesting one. One of the biggest challenges is integrating each display in a way that forms one cohesive image. With Varjo’s present tech, the high-resolution area sticks in the middle of the field of view no matter where you move your eyes.

SEE ALSO
Vive to Get Eye-tracking Add-on with Optional Corrective Lenses

With advancements in eye-tracking, it could be feasible to move the high-resolution area to wherever your eyes are pointing, and it seems this is where Varjo hopes to head in the future.

 

The post Startup Aims for Retinal Resolution VR Display With 70x “effective resolution” of Today’s Headsets appeared first on Road to VR.

VR Headsets Based on Kopin’s 2K Display Expected by End of 2018


After we went hands-on with Kopin’s prototype headset featuring their 2K VR micro display last week, people are curious to know when the screens might actually hit the market. Speaking with Kopin and manufacturing partner Goertek, the companies tell us that the first products incorporating the ‘Lightning’ display are expected by the end of 2018.

We explained in our hands-on that although functional, the Kopin ‘Elf’ headset isn’t a consumer product but actually a demo for the company’s 2K 120Hz Lightning display:

One important thing to remember is that the Elf headset is not going to become a product; it’s simply a pitch for Kopin’s VR microdisplays and Goertek’s manufacturing capabilities. The companies’ hope is that a consumer electronics company will want to produce a product based on the Lightning display, and the Elf headset is the demo to sell them on the form factor that it enables. Goertek says that the companies are “actively marketing” the Elf headset to potential consumer electronics companies. That means an end product containing Kopin’s Lightning display might end up looking quite a bit different than the Elf headset does today. In fact, although Elf is tethered, Kopin says that the foundation of the headset is also suitable for all-in-one mobile VR headsets.

Speaking with Goertek (Kopin’s partner and one of the manufacturers behind the Oculus Rift and PSVR) at the company’s Silicon Valley office, I was told that the first products to launch with the Lightning display are expected toward the end of 2018, which puts products around five quarters away.

Photo by Road to VR

Part of the gap between now and then is finding partners who are convinced by the Elf headset demo and decide to build a product based on the components; Kopin and Goertek are in the process of demonstrating the headset to a range of companies.

The big pitch for the Elf headset is its impressively compact size and diminutive weight; the prototype I tried was just 220 grams, less than half the weight of the Rift and Vive (though to be fair, it’s just a demonstration of the display and lacks some extra hardware found on most headsets). And of course it’s backed by the incredibly sharp 2K Lightning display, which has more than three times the pixels of today’s leading headsets and a whopping 120Hz refresh rate (though, as I found in my hands-on, those benefits may come at the cost of a narrower field of view).
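As a quick sanity check on the "more than three times the pixels" figure, the arithmetic below assumes the "2K" Lightning panel is 2048×2048 (a common reading of "2K" for square microdisplays, not stated explicitly here):

```python
# Pixel-count comparison, assuming a 2048x2048 "2K" Lightning panel
# versus the 1080x1200-per-eye panels in the Rift and Vive.
lightning = 2048 * 2048        # 4,194,304 pixels
rift_per_eye = 1080 * 1200     # 1,296,000 pixels

ratio = lightning / rift_per_eye
print(f"~{ratio:.2f}x the pixels per eye")
```

Under that assumption the ratio works out to roughly 3.24x, consistent with the article's claim.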

 

Another reason for the delay is that the display and the lenses are still in active development; it will be some time yet before Kopin is producing them in manufacturing quantities. The companies expect that by 2019 they will have the manufacturing capacity to supply some 5 million headsets.

A test board and headset used by Kopin to demonstrate their Lightning displays with several different lenses | Photo by Road to VR

Before then, there are still a few kinks for Kopin to work out. For one, depending upon which optics are used, the display needs to be brighter. The company told me it expects to be able to double the brightness by the time it begins shipping displays, but it isn’t clear whether that will be enough to enable low persistence, a key technique for reducing blur as users turn their heads in the virtual environment. The company also showed me three different lenses, each of a different design and offering a different field of view, though these are still in development as well.

One big question for Kopin is whether traditional display manufacturers will be able to ship next-gen VR displays ahead of the Lightning display, potentially offering another route to bringing greater pixel density to VR headsets. For instance, Samsung demonstrated a new 2K VR display back in May, though it isn’t clear when they’ll be ready to sell the part to headset makers.

In the long term, Kopin and Goertek say they’re investing $150 million to create a new display fabrication plant which will enable them to make larger and higher-resolution VR microdisplays. According to the companies’ roadmap, they’re aiming to produce a 1.3-inch 3K display, followed by a 1.37-inch 4K display, in due course.

The post VR Headsets Based on Kopin’s 2K Display Expected by End of 2018 appeared first on Road to VR.

Google is Developing a VR Display With 10x More Pixels Than Today’s Headsets


Earlier this year, Clay Bavor, VP of VR/AR at Google, revealed a “secret project” to develop a VR-optimised OLED panel capable of 20 megapixels per eye. The project was mentioned during SID Display Week 2017 but has gone largely under the radar as little information has surfaced since.

Following a general overview of the limits of current VR technology, and an announcement that Google is working with Sharp on developing LCDs capable of VR performance normally associated with OLED, Bavor revealed an R&D project that hopes to take VR displays to the next level. A video of the session comes from ARMdevices.net’s Nicolas “Charbax” Charbonnier.

“We’ve partnered deeply with one of the leading OLED manufacturers in the world to create a VR-capable OLED display with 10x more pixels than any commercially available VR display today,” Bavor said. At 20 megapixels per eye, this is beyond Michael Abrash’s prediction of 4Kx4K per eye displays by the year 2021.

“I’ve seen these in the lab, and it’s spectacular. It’s not even what we’re going to need in the ‘final display’” he said, referring to the sort of pixel density needed to match the limits of human vision, “but it’s a very large step in the right direction.”

SEE ALSO
Exclusive: How NVIDIA Research is Reinventing the Display Pipeline for the Future of VR, Part 1

Bavor went on to explain the performance challenges of 20 MP per eye at 90–120 fps, which works out to unreasonably high data rates of 50–100 Gb/sec. He briefly described how foveated rendering, combined with eye tracking and other optical advancements, will allow for more efficient use of such super high-resolution VR displays.
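That data-rate figure is easy to verify with rough arithmetic. The sketch below assumes 24 bits per pixel and ignores blanking and compression; depending on whether you count one eye or both, the raw numbers bracket the 50–100 Gb/s range Bavor cites:

```python
# Back-of-the-envelope bandwidth for a 20 MP-per-eye display at an
# assumed 24 bits per pixel (blanking and compression ignored).
pixels_per_eye = 20e6
bits_per_pixel = 24

for fps in (90, 120):
    per_eye = pixels_per_eye * bits_per_pixel * fps / 1e9
    both_eyes = per_eye * 2
    print(f"{fps} fps: ~{per_eye:.0f} Gb/s per eye, ~{both_eyes:.0f} Gb/s total")
```

At 90 fps this works out to roughly 43 Gb/s per eye (about 86 Gb/s total), and at 120 fps roughly 58 Gb/s per eye (about 115 Gb/s total), which makes clear why foveated rendering and eye tracking are treated as prerequisites for displays of this density.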

The post Google is Developing a VR Display With 10x More Pixels Than Today’s Headsets appeared first on Road to VR.
