Intel makes the case for wireless PC-based high-end virtual reality

Intel recently hosted a 100-year-old man at its virtual reality lab in Hillsboro, Oregon. Lyle Becker was a big fan of the VR flight simulator, which reminded him of the planes that he flew in World War II. That kind of first-time experience evokes a sense of wonder at the immersiveness of VR, and that’s why Kim Pallister, director of VR excellence at Intel, is so convinced that VR will be a transformative medium.

While VR is off to a slow start, Pallister believes that it will catch on in the long run. Intel recently pivoted away from a tech demo that it called Project Alloy, a stand-alone VR headset, to something entirely different. The first generation of VR headsets, such as the Oculus Rift and the HTC Vive, are connected to powerful personal computers via wires. Everybody wants those wires to go away, but how you tackle the issue matters.

Pallister said Intel tried at first to put the processing power in the headset itself so that you don’t have to connect a wire to a PC. But the company also worked on connecting the headset display to a PC via a wireless technology. Pallister thinks that will deliver a much better experience.

The WiGig wireless networking technology, which uses a short-range 60-gigahertz radio, can transfer data at fast enough rates to feed VR imagery from a PC to the display in a VR headset. With the WiGig connection, VR headset makers will be able to exploit the extra processing power available in a full desktop computer, rather than a more limited processor that has to run on battery power in a compact headset.
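To get a sense of why a 60-gigahertz link is plausible for this, here is a back-of-the-envelope calculation. The figures are assumptions for illustration: a Vive-class display (2160×1200 combined, 90 Hz, 24-bit color) and a nominal 802.11ad (WiGig) peak of roughly 7 Gbit/s.

```python
# Back-of-the-envelope: can a WiGig link carry a VR video feed?
# Assumed figures (illustrative): Vive-class panel, 2160x1200 combined
# resolution, 90 Hz refresh, 24 bits/pixel, ~7 Gbit/s nominal WiGig peak.

width, height = 2160, 1200        # combined resolution across both eyes
refresh_hz = 90                   # frames per second
bits_per_pixel = 24               # uncompressed RGB

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"raw video: {raw_gbps:.2f} Gbit/s")

wigig_peak_gbps = 7.0             # nominal 802.11ad peak throughput
print(f"fits under WiGig peak: {raw_gbps < wigig_peak_gbps}")

# Even a modest 2:1 visually lossless codec leaves ample headroom.
compressed_gbps = raw_gbps / 2
print(f"with 2:1 compression: {compressed_gbps:.2f} Gbit/s")
```

The raw feed comes out around 5.6 Gbit/s, which is under the nominal peak even before compression, and light compression leaves headroom for protocol overhead and real-world link conditions.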

I recently joined a small group of journalists who jointly interviewed Pallister. We talked about Becker, the immersive nature of VR, Project Alloy, WiGig, and Intel’s partnership with Blueprint Reality on a VR presentation technology. Here’s an edited transcript of our interview.

GamesBeat: I saw that video of the 100-year-old man visiting [the VR lab].

Kim Pallister: That was in my lab up in Oregon. That was so much fun. We had a guy based in Oregon. His daughter is friends with the ops manager on my team. She said, “Hey, my dad is really into tech. He’s been hearing about VR, and I know you work on that.” We said, “Bring him by!” We have this open — the lab is behind locked doors, but there’s a “knock for VR demos” sign up, and we bring people in all the time. People bring their kids in. It’s a bit of a zoo sometimes.

We brought this guy in. He’s 100 years old. If I’m as spry as he is at 100 years — the guy came booking into the lab. He’s in great shape. We put together some demos for him to film it and make a thing out of it. He was a pilot, a commercial pilot, and a pilot in World War II. He flew these supply routes that went from Burma over the high mountains in southeast Asia. It was known to be a pretty dangerous route. If you took the wrong route in the dark, you’d hit a mountain you couldn’t get over. We thought, “Hey, we’ll put him in some flight sims and see what he thinks.”

The one question we asked him was — he was born in 1917. You’ve seen the advent of television, the advent of computers, the advent of the internet. You’ve seen the advent of commercial flight. So what’s the one technology you think was the biggest thing? You’d think maybe the internet, maybe computers. Maybe that’s my bias because I work at Intel. But he says, “Without a question, GPS.” As a pilot, seeing the world before and after, that’s the biggest delta.

GamesBeat: Do you work with Project Alloy?

Pallister: The team that worked on that is the other lab down in Santa Clara. They’re the group that worked on all our sensing technologies. It’s a different team, but I work with those guys all the time. We’re trying a lot of different stuff. There are ways they’re related. With Alloy we said, “We have this great sensing technology. We have this high-performance PC technology. Let’s see how far we can push the envelope of putting it all in one form factor.”

We realized this is not necessarily the optimal form factor. We learned a lot of things. We think both the inside-out tracking technology and the 3D depth sensing technology are absolutely applicable across a lot of VR areas. But the way we saw the price and performance versus form factor and the fact we were getting increasing confidence on the WiGig side of things — the best way to deliver a high-performance PC experience is to wirelessly talk to a high-performance PC plugged into a wall outlet.

GamesBeat: Is the dev kit for Alloy still supposed to come out this year?

Pallister: I think we’ve said we’re not necessarily doing it. We haven’t seen that much interest. All the technologies are still applicable, so we’re looking at ways to deliver those. Guys on my team are taking that same inside-out tracking tech and integrating it into the smartphone demo you saw there. We’re working on that now. The tech is still very much being worked on, but we’ve seen there isn’t necessarily a good fit for bulky-form-factor PC on your head. If you want the PC-quality experience, you’re better off with a full-powered PC and doing it wirelessly.

All the technology components are still there. But bringing it to market as-is, that was never our intent. It was a proof of concept to try out all these different things. We saw that you don’t want to have the wire and that the PC experience can deliver a more rich experience than what you can deliver on your phone. One way is to say, “How much PC can we deliver in something you wear on your head?” Another is to say, “How do I remotely deliver the signal from the PC wirelessly?” The latter turned out to have more legs.

GamesBeat: It seems like more people are still pursuing the stand-alone headsets, though.

Pallister: There are people working on them. Facebook talked about theirs. The Oculus guys talked about theirs. There will be a place for those things. Even putting our position in PCs aside, I think we’re quite a ways off still from people doing enough VR that buying a dedicated appliance that does nothing but VR makes sense.

You can imagine certain verticals where it makes sense. Something you use in a retail outlet to demo products to people, something like that. But for all these other spaces, whether it’s smartphone based or PC based, you have this Swiss-army-knife platform you use for a bunch of things and therefore, you can justify spending — what’s Apple charging for the latest phone? $1000? Or you can justify a high-end PC. But are you doing enough VR that you can justify that price? If you can’t, you end up putting less compute in there, and you deliver a less good experience.

It’s not surprising that people are working on that. I think they’ll have their place. But our bet right now is that the biggest employment you’re going to see with these things is on general-purpose compute platforms, whether it’s a phone at the low end or a high-end PC or something on the spectrum in between.

GamesBeat: You don’t think there’s too much room for mixed reality, then?

Pallister: That’s a different question. You can do MR or AR or pick your mix of terminology, things along that spectrum. Those will also have ways to do it in a dedicated platform and ways to do it in a general-purpose platform that you’re then either putting into a headset or wirelessly talking to a headset or something.

An example might be, you already have the ARKit stuff people are doing on phones. One could imagine somebody putting something like that up either on through-the-display AR, or — people are doing the kind of reflective, “Let’s mount the phone like this with a half-mirror visor.” That’ll be the Google Cardboard, Gear VR approach. Then, there are people already doing PC-connected headsets of some kind, which will then say, “I’ll use this open platform for development and leveraging the compute power there.” You don’t need to do a dedicated device if you’re going to do MR. Certainly, there’s room for it. HoloLens is a great example.

GamesBeat: Do you have a road map for the launch at this point?

Pallister: No. What we’ve said is that HTC has announced that they’re doing something. It’s up to them to announce when. They’ve said they’re not saying when yet or how much. We’d like them to say something soon, but they tend to announce things on a very short runway.

GamesBeat: Obviously the TPCAST is available for pre-orders now.

Pallister: Yeah, it is. We believe the WiGig solution is going to be better. But you guys will have to try it when it comes out.

GamesBeat: Are you completely in line with the Microsoft MR event?

Pallister: As much as we’re always aligned with Microsoft. We’ve been working really hard with them on a bunch of stuff there. Myself and a number of people went in there a couple of years ago and said, “OK, VR is happening. Where is the DirectX of VR? What are we going to do?” They formulated their plan, and we got behind it in a big way.

In some ways, what they’re doing is their own offering, very much along the lines of what Oculus or HTC offers. It’s a high-end gaming headset offering, controllers and all. One thing that’s different is that they’ve made a deliberate effort to scale down to a set of experiences that will run well on mainstream PCs. That’s something we’ve had whole teams of dedicated engineers working hard on. It’s a collaboration between us doing the right power/performance profiling and custom drivers and optimizations on pieces of code we run and then doing a set of vectors of scalability and different knobs and sliders within their rendering stack and their applications.

I’m excited about it. We have a bunch of the headsets in-house now. We’ve been using their desktop shell. We have versions of Minecraft running on 15-watt notebooks at full framerate. It’s going to be great.

GamesBeat: They’ve confused things a bit with the term “mixed reality.”

Pallister: Yeah, as marketing people are sometimes wont to do.

GamesBeat: We’ve seen some price movement this summer, which is good. It seems like we’re still waiting for the point where it’s in line with other consumer electronics devices — $299, $399 — where someone can try out VR, have a great time, and not get hit by sticker shock when they find out how much their own headset would cost.

Pallister: That’s the goal with what we and Microsoft are trying to do. Can you take a sub-$300 headset and say, “This will work on the notebook you were going to buy, anyway?” You don’t have to change the notebook you were going to buy and get a high-end gaming machine. You can buy the notebook you wanted anyway and have a great set of experiences. If you want to play high-end 3D games, if you want to play Raw Data, then you’ll buy a different machine. If you want to play Minecraft, though, it’s fine.

GamesBeat: I’m also waiting to see — how well is the platform designed to suspend everything else on the computer in order to ensure a smooth experience, even on some of those low-end games?

Pallister: That’s part of why Microsoft has been focused on their universal app platform. It’s a quality control — a walled garden inside Windows. It’s a point of friction elsewhere in the industry, about what it prevents people from doing and innovating on, but for this particular use case, it’s understandable that they want to do a thing where they can really control what’s running, what’s not, what policies the apps are forced to conform to. But it’s a good question. What if I’ve installed a toolbar on my other browser, and it’s eating 50 percent of my CPU to mine Bitcoins and send coupons to my uncle?

GamesBeat: It seems like there’s a psychological barrier we’ll get over when headsets are sold as another peripheral that’s part of your computing experience, not a $500 Vive that you just use to play games. You get your keyboard, your mouse, your antivirus solution, and your headset.

Pallister: Even if there’s a bundle, the fact is, if I’m hearing right, as a consumer — if I look at the Rift, I’m not just buying a VR headset. I’m making a decision about whether it has the games I want. I’m getting locked into that platform of offerings. It feels like — our initial thought was there are any number of usages you can point to that started with this vertical model and eventually got standardized and went horizontal.

We’ve been saying that standards will be the way that happens. We’re on the Khronos OpenXR standards body, and we’re working with other companies to make that happen. But it feels like the industry is on the verge of making it happen anyway. Somebody enabled the path to get Steam VR titles to run on Oculus headsets. Initially, they said, “No, no, you have to go through our store,” but their users said, “We’re Steam users, and you’ll make that happen.” We recently saw an announcement from Microsoft that said, “Our headsets will also support that content.” Valve has hinted that other guys might be coming beyond HTC. The LG guys made an announcement.

It feels like, even if there’s not an outright declaration that all this stuff works on all the headsets, at least there will be a body of knowledge among enthusiast users who use Steam. “I know that I can go on the Steam discussion list and see that these five headsets support all this stuff, and I should probably buy this one.” It very much feels like we’re heading in that direction. At the very least, when the standards happen, we’ll get there, but even before that, you’ll see consumers getting more choice.

GamesBeat: VR arcades are very interesting and the things IMAX is doing with VR cinema. Do we think that has legs? Is that going to keep going?

Pallister: I don’t know what the economics look like for IMAX, but I think the model absolutely has legs. Whether it’s the next arcade and looks like arcades did in the ‘80s, and they’re a massive destination business, or they’re a bit like arcades now, where certain types of games — the only thing arcades have left today is form factor and peripherals. You sit on a motorbike or pick up a machine gun and do all this stuff you can’t do at home.

It’ll be somewhere in between, I think. Some people will say, “I’ll take an off-the-shelf Vive and deliver some flavor of experience to people who can’t afford it.” You’ll see that in China, say. And some people will say, “Even if people have a headset at home, how do I really do a next-level experience?” Something like what The Void is doing. I’ll have force-feedback vests and other peripherals. I’ll do an environment where it’s a 1:1 mapping with the VR environment. When I see stairs in the VR space, I’ll step forward, and there are stairs there. I’m mixing the real world in. Next-gen laser tech, absolutely, right?

I think people would pay a lot in some markets for — think of something between paintball and laser tag. People would pay a lot of money for a really high-adrenaline, really immersive experience. There’s an opportunity to do that. It’s got a long way to go from just dropping the PC version of a game in a Vive. That’s a good start, but it’s got room to expand. When you get good multiplayer experiences there, and people can collaborate with their friends — when they start painting a whole environment green and doing mixed-reality green-screening with four people in there and eight friends in the lobby, and it’s like a karaoke bar where you watch your friends do stuff, and you can swap stories about it, take the video home, post it on YouTube. There’s a lot of legs for stuff like that.

GamesBeat: I talked to the guys who made the Omni treadmill. They were arguing that these treadmills are much more economical because you can just have people run in place, and then, you don’t need a lot of real estate where you can never amortize the cost.

Pallister: That’s true. And certainly, for some urban environments, it might be the case. Paintball and laser tag have real space. But they have their own set of barriers. I think it’s true for some. The thing I’ll say about those — as much as they’re restrictive and sliding your feet around doesn’t quite feel like running, and you have to strap yourself into it, what I’ll give them is that every time they’re at a trade show, and they do the two-on-two shooter app, there’s always a crowd watching.

It’s part of why we got excited about VR and esports in combination. I think there’s a real opportunity for spectators. There’s a nugget of something there. If you show people a video of one person running on the treadmill, they say, “Why would anyone care about this?” But if you have the two people competing….

GamesBeat: Can you talk about the technical side of MixCast?

Pallister: Anybody who’s been looking at VR has seen some flavor of this mixed reality green-screen use case people are doing. The initial spawn of that idea was — it’s a number of companies, but the guys who did Fantastic Contraption, Owlchemy, the Job Simulator guys, a few early pioneers in this area said — we highlighted this is a problem to solve two years ago. One of the hard things about selling VR, convincing people they want to try it and why it’s interesting, is you have to do it one user at a time.

I personally spent months running around Intel cornering executives and saying, “I want to talk to you about this thing.” They said, “I’m not wearing that on my head.” I’d have to talk them into putting it on their face, and then, they got it. You can’t get to 20 million units selling things like door-to-door vacuum cleaners, one at a time. So how do you convey that? This is one piece of the equation people came up with: Here’s a way to convey the experience this person is having. You composite a video of them doing the thing with a video of the in-game simulation, mixing the two together. We said, “That’s really cool. Somebody needs to make that easy to do.”
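The core of that mixed-reality composite is ordinary chroma keying: pixels in the camera frame that match the green backdrop are replaced by the game render. A minimal sketch with NumPy follows; the channel test and threshold are illustrative assumptions, not MixCast’s actual pipeline.

```python
import numpy as np

def chroma_key_composite(camera, game, green_threshold=80):
    """Overlay a green-screened camera frame onto a game render.

    camera, game: uint8 arrays of shape (H, W, 3) in RGB order.
    A pixel counts as "green screen" when its green channel exceeds
    both red and blue by more than `green_threshold`.
    """
    r = camera[..., 0].astype(np.int16)
    g = camera[..., 1].astype(np.int16)
    b = camera[..., 2].astype(np.int16)
    is_backdrop = (g - r > green_threshold) & (g - b > green_threshold)
    out = camera.copy()
    out[is_backdrop] = game[is_backdrop]  # show the game where the screen was
    return out

# Tiny demo: a 2x2 "camera" frame with green-screen pixels on the left.
cam = np.array([[[0, 255, 0], [200, 50, 50]],
                [[10, 200, 10], [30, 30, 30]]], dtype=np.uint8)
gm = np.full((2, 2, 3), 128, dtype=np.uint8)
result = chroma_key_composite(cam, gm)
print(result[0, 0])  # green pixel replaced by the game pixel
```

In a real pipeline the same masking runs per video frame, with the game render taken from a dedicated in-engine camera that matches the physical camera’s position.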

Then, we ran into these guys at Blueprint in Vancouver. The application they developed is called MixCast. It’s a Steam VR application that makes it easier to set up and do green screen. We contacted them and met up at GDC and said, “Hey, we have a bunch of things we want to do here. We want to make it easier to get this out to people. You guys are already doing this. Can we collaborate on stuff and make the product better and get it out to people?” They thought that was great, so we did this technical collaboration.

The first thing we tackled is making it easier to do. This was an idea we discussed, although it’s largely their technical development. They made it from an SDK that a developer could integrate into their app, which they still have, into an application that hooks into any Unity app. Any Unity app that was built for VR but doesn’t necessarily have support for MixCast, they can hook into it and support someone using it. A reviewer or YouTuber or streamer on Twitch can make this work, even if the game wasn’t built from the ground up to work with it. Check, more people can use this thing.

We looked at it with a very default Intel point of view and said, “How can we apply performance and make this run better?” We’ve been working with them on taking our media SDK and doing a highly threaded video codec, so you can do up to 4K video and spread that workload across cores. While you’re on the same PC that’s doing your VR and rendering the user’s view and the MixCast view, you’ll be able to do a video encode and not be constrained by low-resolution video. Check, that’s another goal. It’s not yet shipping, but their app will take that technical proof of concept, and we’ll roll it into some drop of the application.

Once you have that video on the computer, then, what are some things you can do with that information to make it better? What kind of compute can you do on the video workload to make the end result better? We’re looking at things like adding support for RealSense cameras, so you can have a depth view of the user. That means you can do things like light them correctly for the scene, or maybe one day actually use background segmentation and remove them from the background without a green screen there. Then, the end user doesn’t have to put up bed sheets on the walls to try to do this. They could just say, “Sort out anything between this and that depth.”
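The depth-segmentation idea Pallister describes, “sort out anything between this and that depth,” reduces to thresholding a per-pixel depth map. A minimal NumPy sketch; the depth range and array shapes are illustrative assumptions, not RealSense SDK calls:

```python
import numpy as np

def segment_by_depth(color, depth_mm, near_mm=500, far_mm=2000):
    """Keep only pixels whose depth falls in [near_mm, far_mm].

    color: uint8 (H, W, 3) camera frame; depth_mm: (H, W) depth in mm.
    Pixels outside the range are zeroed out, standing in for the
    background a green screen would normally remove.
    """
    in_range = (depth_mm >= near_mm) & (depth_mm <= far_mm)
    out = np.zeros_like(color)
    out[in_range] = color[in_range]
    return out, in_range

# Demo: user at ~1 m in the left column, wall at 3 m in the right column.
color = np.full((2, 2, 3), 200, dtype=np.uint8)
depth = np.array([[1000, 3000],
                  [1100, 3000]], dtype=np.uint16)
fg, mask = segment_by_depth(color, depth)
print(mask)
```

Real depth cameras add noise and holes at object edges, so a production version would smooth the mask before compositing, but the principle is exactly this threshold.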

That’s the nature of the collaboration with them. We’re trying a little science experiment right now where we’re using an AI-driven drone as a pilot to control the camera position. If you want to have a moving camera and you’re doing VR at home with this green screen, you need to get someone to hold the camera and film you [and] make sure they get certain things in the frame. We’d like that to be handled by a drone. We don’t think an end user is necessarily going to do this any time soon, but for a professional setup, it’s interesting. It’s an example of how we can start to think about expanding the ways developers work with this stuff.

GamesBeat: When it comes to content, are you doing any direct funding to get it built and get more applications out there?

Pallister: We’re not a game publisher: “Here’s money, go build this game.” We have done deals where we’ve said, “We want to show how you can use compute to add more bells and whistles to apps and make them look better.” Sometimes, that will involve funding or co-marketing or whatever makes business sense. We did things with Star Trek Bridge Crew and Arizona Sunshine. We’ll do more things like that going forward.

Our Intel Capital Group has looked somewhat into the consumer space but a lot in the commercial space as far as where VR is applicable. Who are interesting companies that aren’t just getting off the ground but that we think are sound investments and a good use of Intel Capital’s money? Situations where applying capital toward them can bolster the rapid growth of VR in different areas.

I was telling someone earlier about a company called InContext Solutions. They do specialized applications for retailers. If you’re laying out your new store and you want to put shelves in and decide how high things should be and what displays should look like, that’s what they do. They saw VR coming and said, “Hey, we can let people see what the store looks like before they build a 15,000-square-foot space. We can put focus groups in VR and see where their eyes are drawn when they walk in the store.”

That’s an example of a company where they’re doing great stuff. They’re propagating VR into a particular commercial segment. We’ve done things with them. We’ve done things with medical and architectural and a bunch of other sectors. We did a project with the Smithsonian art museum. Something like 90 percent of what the museum owns can never be viewed just because of space considerations, so we captured a wing of the museum and took it a step further. When you approach a painting of the Aurora Borealis, you can actually step through that painting and be surrounded by the landscape. For people who can’t travel, for teaching kids, that’s one element that’s been very cool.

We’ve done a couple of film collaborations with independent filmmakers. One was about sustainability and the Amazon rain forest, called Tree. It has every sense available to you — sight, sound, smell, haptics — so you feel like you’re a tree in the Amazon. You’re growing and growing. I’m not going to spoil the ending, but it’s about protecting that forest, so you might imagine what happens to you. But again, it’s a really good story and a way to raise awareness.

Another creator in particular is Eliza McNitt, who’s an Intel Science Fair winner. She’s created a couple of films. Her first one is called Fistful of Stars, placing you in the middle of a galaxy. We’re working on a new one with her called Pale Blue Dot, which is focused on Earth. In addition to gaming, there are commercial and educational applications, and I think film is only just getting started. There are categories at Sundance and all the other festivals now, so we’re beginning to see what might happen there.

One last area on the content front is the stuff the Intel Sports Group is doing. We have a whole division at Intel now focused on what they call the digitization of sports. It’s the way that, when people watch football or baseball, it’s all being digitized in some way or another to let them engage with it more. That’s a very big domain, but a portion of what they’re doing involves how you view it and consume it in VR in different ways.

They have an effort going where we acquired Voke. They’ve turned that into the TrueVR technology, where they’re streaming 360 sports content, but you can switch between different camera points in the stadium. “I want to be court side. I want to back up and view from behind the net.” That’s not being created just to propagate VR, but VR is an obvious target for it. It’s real content that Intel is involved in developing and technology we’re taking to different partners that will end up being one of the things on the list of what you can do in [your] Microsoft headset or your Vive.

GamesBeat: Where is that available right now?

Pallister: Right now it’s Gear VR. We’re working with them on getting it to some flavor of PC VR. It’s really just that. They’re almost a startup within Intel. It’s not a question of whether we want to do it but just in what order we want to do it.

GamesBeat: What’s the difference between the system we’re seeing here on the Vive and DisplayLink’s XR?

Pallister: This is using the DisplayLink codec, but the implementation here is built on top of Intel’s WiGig technology. Chances are, if you saw DisplayLink demoing something — they can build their stuff on top of other transports, but the best version right now is on top of WiGig, so they’re probably demoing stuff on top of that. Think of one as the transport layer and the other as the compression and communications layer. We built the prototype together with them.

GamesBeat: Are you just using their algorithms to encode and decode, or is it a chip?

Pallister: It’s a piece of silicon, and they’re doing that. It’s an algorithm, but it has to run at such a fast rate that it wouldn’t be power-efficient to take a general-purpose processor and run it in software there. The best power efficiency is building it into hardware. They could opt to go take a product to market, but what we collaborated with them and HTC on — HTC wants to go build, and has said they will build, an add-on for the Vive based on that combination of technologies. Our belief is that ultimately, the best solution is going to come if HTC takes it and integrates it with their product. Even as an add-on module, it’s all built to work together smoothly. It really should come out of their shop.

GamesBeat: What’s the timeline and the economics of WiGig? How fast does this become a low-cost solution?

Pallister: There are two pieces to that. As to how well it gets integrated into the product and what it’s going to cost, that’s an HTC question. Obviously, if they do a future version of the Vive, and they integrate it directly into the headset, there will be a lot of cost savings there. But it’s their call. The way we’ve spoken to them about making it work with PCs out there — right now, not many gaming PCs have WiGig built in. It has to be a PCI-E card or a Thunderbolt module or something like that. Eventually, that’ll get to where people ship WiGig in the platform, and that’ll lower cost. You could see an OEM saying they’ll do a bundle and build a gaming rig for WiGig that has this thing in it.

GamesBeat: How long a process was it to figure out that wireless to PC was the better way to go versus a stand-alone headset? Is that the last year or so?

Pallister: I’ll toot my own horn and say I knew three years ago. [Laughs] Or I believed. Let’s put it that way. There are a lot of factors in there. Some people didn’t think we could pull off a wireless solution. Some people thought a large number of customers wanted a platform that was exclusively VR, a VR Wii if you want to think of it that way. It was a continual point of debate, so we had efforts focused on both possible outcomes.

Then, it was over the past six months, eight months — even at CES last year, we were showing WiGig behind closed doors. It was becoming apparent that we could do a decent job there. We really have no compromises. The performance you can deliver from a 500-watt desktop that plugs into a wall outlet — no way you can get that kind of experience out of anything you wear on your head.

GamesBeat: It seems like WiGig might also usher in a lot of other things that should be wireless. We’re already getting to wireless keyboards and mice. I don’t know if wireless monitors or TV connections are coming.

Pallister: There are a bunch of things like that. The infrastructure is laid down for those who know how to do it. The challenge there is that — in the VR case, the implementation of WiGig was really aimed at VR. Latency is king. Resolution is second. Other things are third and fourth. You could do a solution for, say, wireless displays for gaming, taking the same approach. But a lot of what drove these things were other use cases where the priority was different.

To take the wireless display example, a lot of the first implementations — the primary use case was, “I want to walk into my conference room and not plug in anything and put my report on the screen. I don’t care about latency, but the text had better be readable, and it needs to always be compatible.” That set of priorities results in something that you then have to repurpose for gaming because as it is, it’s not so good. Over time, though, absolutely.