Horizon Workrooms

Now that Horizon Workrooms has launched, I’m very happy to be able to write about the functionality that I found most exciting while building the experience: the mapping of virtual objects to their real-world counterparts.

Typically, augmented and mixed reality experiences overlay real-world objects with virtual annotations: an AR app on your phone might recognise a face seen by the camera and show it with added bunny ears on the phone screen, for example. Workrooms is different. It shows you an entirely virtual environment, but asks you to indicate the position of real-world objects like your desk in a process which feels like a more detailed version of guardian setup. Workrooms then positions your avatar in the virtual world so that your real desk aligns with its virtual counterpart: when you reach out and touch the virtual desk, your hands touch the real desk. As long as the important virtual objects within arm’s reach are mapped to real objects, the virtual environment can be much bigger than the available real space while maintaining the illusion that every virtual object can be touched as well as seen.
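
To make that alignment concrete, here is a minimal sketch in TypeScript, assuming a simple two-dimensional pose and a y-up coordinate system; the names here (Pose2D, alignWorldToDesk) are hypothetical and not the Workrooms implementation.

    // A minimal sketch of the desk alignment described above, assuming a
    // y-up coordinate system and 2D poses on the floor plane.
    interface Pose2D {
      x: number;   // metres in headset tracking space
      z: number;   // metres in headset tracking space
      yaw: number; // radians of rotation about the vertical axis
    }

    // Given the real desk pose marked during setup (in tracking coordinates)
    // and the fixed virtual desk pose (in world coordinates), compute the
    // tracking-to-world transform that makes the two desks coincide.
    function alignWorldToDesk(realDesk: Pose2D, virtualDesk: Pose2D): Pose2D {
      const yaw = virtualDesk.yaw - realDesk.yaw;
      const cos = Math.cos(yaw);
      const sin = Math.sin(yaw);
      return {
        // Rotate the real desk position into the virtual frame, then offset
        // it so that it lands exactly on the virtual desk.
        x: virtualDesk.x - (cos * realDesk.x + sin * realDesk.z),
        z: virtualDesk.z - (-sin * realDesk.x + cos * realDesk.z),
        yaw,
      };
    }

Applying the resulting transform to the tracked headset and hand poses each frame is enough to make a reach for the virtual desk land on the real one.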

In addition to adding to the immersion and realism of the experience, the haptic feedback has important practical benefits. Typing on a virtual keyboard is much easier if you’re also typing on a real keyboard rather than on a flat, featureless surface or just moving your fingers in space; drawing on a virtual whiteboard is much less tiring if you are also leaning on a real wall; and showing someone a virtual desk is much less dangerous if leaning on it or putting a coffee on it doesn’t result in you falling over or spilling coffee on your feet. While requiring a suite of real objects to be available puts some additional constraints on using the application, the objects Workrooms needs are common enough that the trade-off is often worthwhile, and it is less onerous than requiring specialised additional VR hardware that provides much lower fidelity haptic feedback.

Mapping real objects to virtual objects can go beyond providing haptic feedback: it can also provide a very natural indication of intent. Moving from a seated desk to a nearby wall in the real world can signal to an application the intent to use a whiteboard, and might result in teleporting to a whiteboard much further away in the virtual world, allowing people to navigate virtual spaces which are much larger than the available real space.
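
A rough sketch of how that intent signal could work, assuming the setup step recorded a desk zone and a whiteboard zone in the real room; the zone layout and the teleportAvatarTo function are illustrative assumptions rather than the Workrooms API.

    // Hypothetical mapping from small real-world zones to distant virtual
    // locations: walking to the real wall teleports the avatar to the
    // virtual whiteboard.
    type Zone = 'desk' | 'whiteboard';

    interface RealZone {
      name: Zone;
      centre: { x: number; z: number }; // real-room tracking coordinates
      radius: number;                   // metres
    }

    const realZones: RealZone[] = [
      { name: 'desk', centre: { x: 0.0, z: 0.0 }, radius: 0.8 },
      { name: 'whiteboard', centre: { x: 1.5, z: 0.5 }, radius: 0.8 },
    ];

    // The virtual locations can be much further apart than their real
    // counterparts, so the virtual room can be far larger than the real one.
    const virtualLocations: Record<Zone, { x: number; z: number }> = {
      desk: { x: 0, z: 0 },
      whiteboard: { x: 12, z: 4 },
    };

    declare function teleportAvatarTo(location: { x: number; z: number }): void;

    let currentZone: Zone = 'desk';

    // Called every frame with the tracked headset position in the real room.
    function onHeadsetMoved(position: { x: number; z: number }): void {
      for (const zone of realZones) {
        const distance = Math.hypot(position.x - zone.centre.x,
                                    position.z - zone.centre.z);
        if (distance < zone.radius && zone.name !== currentZone) {
          // Standing up and walking to the real wall is read as the intent
          // to use the whiteboard, so teleport to the distant virtual one.
          currentZone = zone.name;
          teleportAvatarTo(virtualLocations[zone.name]);
        }
      }
    }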

Using real objects that can be touched shifts the problem of providing haptic feedback from one requiring complex mechanical force feedback devices to one that, in future, could use computer vision to recognise objects in the real environment using the same cameras that provide inside-out tracking in modern VR headsets. When virtual objects map exactly to their real counterparts, the haptic feedback is perfect: a real MacBook keyboard feels exactly like a MacBook keyboard when you touch it because it is a MacBook keyboard, whereas any force feedback device trying to synthesise the same sensation could only ever provide an approximation.

While the relative ubiquity of home offices means that this mapping approach lends itself to an experience like Workrooms, it’s also exciting to think about how similar approaches could add perfect haptic feedback to less mundane experiences in more fantastic virtual environments. The Workrooms team was lucky enough to see some of those possibilities during a team offsite to the Star Wars VR experience at The Void in London, which created the illusion of a large virtual environment within a much smaller real space using tricks like having participants walk through a real doorway into a virtual elevator and then later exit through the same doorway into a new virtual room once the elevator had moved to a new floor.

With the increasing availability of headsets which use SLAM-based inside-out tracking and the development of techniques like redirected walking, it’s possible to imagine a future where, rather than defining a safe cuboid within the real world, guardian systems map an entire home or office, identifying doors, chairs, tables and other objects which can be enumerated by VR applications via an API. An application could then use the available real-world features to generate a huge custom fantasy environment for a rogue-like game in which every wall and table can be leant on, every door opened and closed, every chair sat in and potentially every staircase climbed. I’m very excited to see, hear and touch the experiences these techniques might enable in the near future.
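
As a thought experiment, a guardian API like that might look something like the following to an application; every type and function here (MappedFeature, getMappedFeatures and so on) is imagined rather than anything that exists today.

    // Imagined guardian API: enumerate the doors, chairs, tables, walls and
    // staircases mapped in the real home or office.
    type FeatureKind = 'door' | 'chair' | 'table' | 'wall' | 'staircase';

    interface MappedFeature {
      kind: FeatureKind;
      pose: { x: number; y: number; z: number; yaw: number }; // room coordinates
      size: { width: number; height: number; depth: number }; // metres
    }

    declare function getMappedFeatures(): MappedFeature[];

    // Dress each kind of real feature in fantasy clothing, so that every
    // virtual object the player can see is also an object they can touch.
    const fantasyProps: Record<FeatureKind, string> = {
      door: 'dungeon-gate',      // can really be opened and closed
      chair: 'throne',           // can really be sat in
      table: 'alchemy-bench',    // can really be leant on
      wall: 'castle-wall',       // can really be leant against
      staircase: 'tower-stairs', // can really be climbed
    };

    // Generate a level in which every prop has a real-world counterpart at
    // the same pose and size as the virtual object it becomes.
    function generateLevel(): { prop: string; feature: MappedFeature }[] {
      return getMappedFeatures().map((feature) => ({
        prop: fantasyProps[feature.kind],
        feature,
      }));
    }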


The Art Of Social VR

Wed 03 February 2021 by Jim Purbrick

The recording of my recent Stereopsia 2020 talk about the art of designing social VR experiences is now online. The talk summarises a lot of material covered in more depth in my posts on The Conversation Around Content, A Tall Dark Stranger and Small Places Loosely Joined, so if please …

A Past And Present Future Of Work

Wed 30 September 2020 by Jim Purbrick

(Image: Studio Blighty)

Over the last few years I’ve spent a lot of time helping people new to virtual worlds learn how they work. Over the last few weeks I’ve been sharing a series of short posts on some of the high level concepts I covered which will hopefully be useful …

Small Places Loosely Joined

Wed 23 September 2020 by Jim Purbrick

Over the last few years I’ve spent a lot of time helping people new to virtual worlds learn how they work. Over the next few weeks I’m sharing a series of short posts on some of the high level concepts I covered which will hopefully be useful to …

A Tall Dark Stranger

Wed 16 September 2020 by Jim Purbrick

Over the past few years I’ve spent a lot of time helping people new to virtual worlds understand how they work. Over the next few weeks I’m going to share a series of short posts on some of the high level concepts I covered which will hopefully be …

The Conversation Around Content

Wed 09 September 2020 by Jim Purbrick

(Image: Okinawa)

Over the last few years I’ve spent a lot of time helping people new to virtual worlds learn how they work. Over the next few weeks I’m going to share a series of short posts on some of the high level concepts I covered which will hopefully be …

HTTPS

Tue 08 September 2020 by Jim Purbrick

Before my recent post about leaving Facebook, it had been a while since I’d updated The Creation Engine and it turned out I had some housekeeping to do. After pushing the Pelican output to https://github.com/jimpurbrick/jimpurbrick.github.com I got a mail from GitHub saying that …

0 to 1

Thu 20 August 2020 by Jim Purbrick

(Image: Facebook badge)

8 years ago London was hosting the Olympics and I met Philip Su for the first time at Browns in Covent Garden to talk about the engineering office Facebook was planning to open in London. By the end of this year Facebook London will have thousands of people working in …

This blog is 10

Mon 02 July 2018 by Jim Purbrick

Just over ten years ago I set up The Creation Engine No. 2 after previously blogging on the original Linden Lab hosted Creation Engine and before that on Terra Nova. So, while I’ve been blogging for almost 14 years, 10 years of The Creation Engine No. 2 seems like …

Replicated Redux: The Movie

Tue 22 May 2018 by Jim Purbrick

The recording of my recent React Europe talk about Replicated Redux is now online and I’ve written several other posts describing designing, testing and generalising the library if you would like to know more about the details. If you’d like to play the web version of pairs or …

Replaying Replicated Redux

Fri 10 November 2017 by Jim Purbrick

While property based tests proved to be a powerful tool for finding and fixing problems with ReactVR pairs, the limitations of the simplistic clientPredictionConsistency mechanism remained.

It’s easy to think of applications where one order of a sequence of actions is valid, but another order is invalid. Imagine an …

Building Safety in to Social VR

Thu 26 October 2017 by Jim Purbrick

Last year I hosted a panel on creating a safe environment for people in VR with Tony Sheng and Darshan Shankar at OC3. I commented at the time that the discussion reminded me of the story of LambdaMOO becoming a self-governing community told by Julian Dibbell in My Tiny Life …

Testing Replicated Redux

Mon 31 July 2017 by Jim Purbrick

Opening a couple of browser windows and clicking around was more than sufficient for testing the initial version of ReactVR pairs. Implementing a simple middleware to log actions took advantage of the Redux approach of reifying events to allow a glance at the console to reveal precisely which sequence of …

ReactVR Redux Revisited

Tue 04 July 2017 by Jim Purbrick

There were a couple of aspects of my previous experiments building networked ReactVR experiences with Redux that were unsatisfactory: there wasn’t a clean separation between the application logic and network code and, while the example exploited idempotency to reduce latency for some actions, actions which could generate conflicts used …

Generation JPod

Sat 03 June 2017 by Jim Purbrick

I’ve just got back from Kaş where I spent a lovely few days celebrating Pinar and Simon’s wedding and while there spent a few hours reading Now We Are 40: a thoughtful and entertaining look at everything from house music to house prices from the perspective of Generation …

2² Decades

Thu 20 April 2017 by Jim Purbrick

Several years ago when we were in 100 robots together, Max was celebrating his 40th birthday. When I said that mine would be in 2017, it felt like an impossibly far future date, but, after what feels like the blink of an eye, here we are.

Along with many other …

VR Redux

Wed 04 January 2017 by Jim Purbrick

Mike and I have been talking about how to easily build simple networked social applications with ReactVR for a while, so I spent some time hacking over the Christmas break to see if I could build a ReactVR version of the pairs game in Oculus Rooms. Pairs is simple and …

Creating A Safe Environment For People In VR

Mon 31 October 2016 by Jim Purbrick

I was very happy that Oculus found time at OC3 to host a panel on creating a safe environment for people in VR. As social VR becomes more popular over the next few years it will quickly have to learn how to keep people safe together in shared environments. Some …

crestexplorer

Sun 21 August 2016 by Jim Purbrick

At the 3rd Party Dev State of the Union at EVE Fanfest 2016 earlier this year, CCP FoxFour drew my attention to a limitation of the current approach used by crestmatic to generate CREST documentation: it only discovers resources always reachable from the API root from the perspective of the …

Strange Tales From Other Worlds

Tue 10 May 2016 by Jim Purbrick

At the end of last year, Michael Brunton-Spall and Jon Topper asked me if I would like to give the opening keynote at Scale Summit as I had “lots of experience scaling weird things”, by which they meant Second Life and EVE Online. I immediately thought of The Corn Field …
