A Tall Dark Stranger

Wed 16 September 2020 by Jim Purbrick


Over the past few years I’ve spent a lot of time helping people new to virtual worlds understand how they work. Over the next few weeks I’m going to share a series of short posts on some of the high-level concepts I covered, which will hopefully be useful to other people new to virtual worlds. The previous post talked about ways in which you can fill an entire world with things to keep everyone interested forever. This post talks about how you can make sure everyone can create an avatar that allows them to fully express themselves within a virtual world.

Professor Richard Bartle, who in 1977 co-created MUD1 and with it networked multi-user virtual worlds, is fond of saying that he prefers text-based virtual worlds because the graphics are better. Just as a good book can conjure vivid images in the mind, text can conjure images of worlds which would be hard or impossible to render with triangles. Another huge advantage of text-based worlds is that it is far more common for people to author text than to create 3D art assets. If I want to be a tall dark stranger in a text-based world it is as simple as setting my description to those words. If I want to be a tall dark stranger in a VR environment it is much harder.
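In a text-based world, appearance is often nothing more than a mutable string on the character object. A minimal sketch of the idea (class and method names are hypothetical, not taken from any real MUD):

```python
class Character:
    """A bare-bones MUD-style character whose appearance is plain text."""

    def __init__(self, name):
        self.name = name
        self.description = "an unremarkable figure"

    def set_description(self, text):
        # Changing your entire appearance is a one-line edit.
        self.description = text

    def look(self):
        # What other players see when they examine you.
        return f"{self.name} is {self.description}."


me = Character("Jim")
me.set_description("a tall dark stranger")
print(me.look())  # Jim is a tall dark stranger.
```

The equivalent in a 3D world requires modelling, rigging and texturing, which is exactly the gap the rest of this post is about.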

Where a virtual world has a clearly defined setting, such as high fantasy, space opera or tropical pirates, it can be feasible to supply a wide enough variety of faces, bodies, clothing, accessories and hair styles. This is the approach taken successfully by EVE Online and by Rare with the infinite pirate generator in Sea of Thieves, and it results in a wide variety of avatars which are all thematically consistent with the world. If you are building a Metaverse which connects every possible virtual world, how do you start? How do you please the person who wants to have the perfect tall dark stranger avatar rendered in 3D as well as everyone else?
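A themed generator of this kind boils down to sampling one art asset per slot from curated, on-theme part lists. A sketch under that assumption (the part lists here are invented placeholders, not Rare's actual data):

```python
import random

# Hypothetical themed part lists; a real generator would reference art assets.
PIRATE_PARTS = {
    "face": ["weathered", "scarred", "sunburnt"],
    "hair": ["braided beard", "tricorn hat", "red bandana"],
    "clothing": ["naval coat", "striped shirt", "leather waistcoat"],
    "accessory": ["hook hand", "parrot", "spyglass"],
}


def generate_avatar(parts, rng=random):
    """Pick one option per slot; every result is guaranteed on-theme."""
    return {slot: rng.choice(options) for slot, options in parts.items()}


avatar = generate_avatar(PIRATE_PARTS)
```

Because every option was authored to fit the setting, any combination the generator produces is thematically consistent — the property that is hard to preserve once a world has no single theme.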

This was the challenge facing Linden Lab when building the avatars for Second Life at the turn of the millennium. While the default avatars could be manipulated into a huge variety of shapes and sizes using the available parameters, the results were often crude and the available wardrobe never strayed far from geeky teen hangout chic. Luckily for Linden Lab the inability of the avatar system to satisfy every whim was a huge opportunity for the entrepreneurs of Second Life.

Builders quickly realised that they could fashion shoes, hair, clothes and accessories from primitive 3D objects which could be attached to avatars, dramatically increasing the gamut of styles that could be realised. Linden Lab enabled another wave of innovation by allowing animations to be uploaded. As well as filling the nightclubs of Second Life with every conceivable dance move, animations enabled tiny avatars, which either folded the default avatar geometry up inside cutely sculpted animal models or moved the default avatar under the floor so it was completely hidden and just served as a platform for a tiny replacement avatar standing on its head. Many years later Linden Lab finally allowed people to entirely replace the default geometry and textures, so it is now common for not a single triangle or texel in a Second Life avatar to have been defined by Linden Lab.
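The attachment model above can be sketched as a small data structure: worn objects are parented to named attachment points, so builders can extend an avatar indefinitely without ever touching the base mesh. This is an illustrative sketch, not Second Life's actual implementation, and the point names are hypothetical:

```python
class Avatar:
    """An avatar extended by attaching objects to named points."""

    ATTACHMENT_POINTS = {"head", "chest", "left_hand", "right_hand", "feet"}

    def __init__(self):
        self.attachments = {}  # attachment point -> worn object name

    def attach(self, point, obj):
        if point not in self.ATTACHMENT_POINTS:
            raise ValueError(f"unknown attachment point: {point}")
        self.attachments[point] = obj  # replaces anything already worn there

    def detach(self, point):
        self.attachments.pop(point, None)


tiny = Avatar()
tiny.attach("head", "sculpted fox head")
tiny.attach("feet", "prim paw shoes")
```

The key design choice is that attachments compose: creators ship individual pieces, and the wearer assembles the final look, which is what made a third-party fashion economy possible.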

People were welcome to build their own clothes and avatars from textures and geometry, but the vast majority instead chose to buy and combine pieces from the vast array of builders who were paying their real life rent by selling virtual goods and so were more than happy to cater to their every desire. If you wanted to be a tall dark stranger there were a huge number of them to choose from. If you wanted to combine a punk jacket with a tutu to create a unique look you could do that without ever opening Photoshop.

People chose to be robots, punks, steampunks, wizards and astronauts, taking looks from across every conceivable genre of art, fiction and real life fashion scene. Conversations about where people found clothes or accessories became common: people went shopping for clothes together, attended and hosted fashion shows and reported on them in social media, just like in real life. Avatars and fashion became important content around which conversation would form.

Second Life in general accepted this riot of self expression with open arms. IBMers attended a company all-hands talk by their CEO in Second Life as, among other things, aliens in suits. While a few areas enforced dress codes or handed out attire to match the theme of their experience, most of the time this was optional. Avatars were seen as a very personal form of expression which people building experiences would respect and not constrain. Some people revelled in this freedom by changing avatars and identities frequently and on a whim, but in general the better known someone became in Second Life, the more useful an immediately recognizable avatar became, the less frequently they tended to change it and the more onerous it became to be asked to change their appearance in order to fit in with an experience.

We should bear these lessons in mind as we think about avatars for the virtual worlds of the future. While default avatar styles should be comfortable for a wide audience of people and acceptable to developers building a wide range of experiences, we likely also need to let people easily create everything from clothes and virtual personal electronics to blend shapes and eye-movement simulation models, along with services that let people buy, sell and police virtual goods and take them wherever they go across the Metaverse. Ultimately the developers of a successful Metaverse shouldn’t need to define avatar styles: the virtual fashion industry they enable will.
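Taking goods across worlds implies some portable, self-describing item format: enough metadata for any world to render, attribute and police the item. One possible shape, sketched as JSON; this is entirely hypothetical and no existing standard is implied:

```python
import json

# A hypothetical manifest for a portable avatar item. Field names are
# invented for illustration; "geometry" points at a standard mesh file.
item = {
    "name": "Punk Jacket",
    "creator": "example-creator-id",
    "geometry": "punk_jacket.glb",
    "attachment_point": "chest",
    "license": "no-transfer",
}

# Serialise for transport between worlds, then restore on arrival.
manifest = json.dumps(item, indent=2)
restored = json.loads(manifest)
```

The interesting problems are not in the serialisation but in the surrounding services — payment, provenance and moderation — which is why the post argues the fashion industry, not the platform, will define avatar styles.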

(Second Life screenshot: Ella Pinellapin)

The Conversation Around Content

Wed 09 September 2020 by Jim Purbrick


Over the last few years I’ve spent a lot of time helping people new to virtual worlds learn how they work. Over the next few weeks I’m going to share a series of short posts on some of the high level concepts I covered which will hopefully be …

read more


Tue 08 September 2020 by Jim Purbrick

Before my recent post about leaving Facebook, it had been a while since I’d updated The Creation Engine and it turned out I had some housekeeping to do. After pushing the Pelican output to https://github.com/jimpurbrick/jimpurbrick.github.com I got a mail from GitHub saying that …

read more

0 to 1

Thu 20 August 2020 by Jim Purbrick

(Image: Facebook badge)

8 years ago London was hosting the Olympics and I met Philip Su for the first time at Browns in Covent Garden to talk about the engineering office Facebook was planning to open in London. By the end of this year Facebook London will have thousands of people working in …

read more

This blog is 10

Mon 02 July 2018 by Jim Purbrick

Just over ten years ago I set up The Creation Engine No. 2 after previously blogging on the original Linden Lab hosted Creation Engine and before that on Terra Nova. So, while I’ve been blogging for almost 14 years, 10 years of The Creation Engine No. 2 seems like …

read more

Replicated Redux: The Movie

Tue 22 May 2018 by Jim Purbrick

The recording of my recent React Europe talk about Replicated Redux is now online and I’ve written several other posts describing designing, testing and generalising the library if you would like to know more about the details. If you’d like to play the web version of pairs or …

read more

Replaying Replicated Redux

Fri 10 November 2017 by Jim Purbrick

While property-based tests proved to be a powerful tool for finding and fixing problems with ReactVR pairs, the limitations of the simplistic clientPredictionConsistency mechanism remained.

It’s easy to think of applications where one order of a sequence of actions is valid, but another order is invalid. Imagine an …

read more

Building Safety in to Social VR

Thu 26 October 2017 by Jim Purbrick

Last year I hosted a panel on creating a safe environment for people in VR with Tony Sheng and Darshan Shankar at OC3. I commented at the time that the discussion reminded me of the story of LambdaMOO becoming a self-governing community told by Julian Dibbell in My Tiny Life …

read more

Testing Replicated Redux

Mon 31 July 2017 by Jim Purbrick

Opening a couple of browser windows and clicking around was more than sufficient for testing the initial version of ReactVR pairs. Implementing a simple middleware to log actions took advantage of the Redux approach of reifying events to allow a glance at the console to reveal precisely which sequence of …

read more

ReactVR Redux Revisited

Tue 04 July 2017 by Jim Purbrick

There were a couple of aspects of my previous experiments building networked ReactVR experiences with Redux that were unsatisfactory: there wasn’t a clean separation between the application logic and network code and, while the example exploited idempotency to reduce latency for some actions, actions which could generate conflicts used …

read more

Generation JPod

Sat 03 June 2017 by Jim Purbrick

I’ve just got back from Kaş where I spent a lovely few days celebrating Pinar and Simon’s wedding and while there spent a few hours reading Now We Are 40: a thoughtful and entertaining look at everything from house music to house prices from the perspective of Generation …

read more

2² Decades

Thu 20 April 2017 by Jim Purbrick

Several years ago when we were in 100 robots together, Max was celebrating his 40th birthday. When I said that mine would be in 2017, it felt like an impossibly far future date, but, after what feels like the blink of an eye, here we are.

Along with many other …

read more

VR Redux

Wed 04 January 2017 by Jim Purbrick

Mike and I have been talking about how to easily build simple networked social applications with ReactVR for a while, so I spent some time hacking over the Christmas break to see if I could build a ReactVR version of the pairs game in Oculus Rooms. Pairs is simple and …

read more

Creating A Safe Environment For People In VR

Mon 31 October 2016 by Jim Purbrick

I was very happy that Oculus found time at OC3 to host a panel on creating a safe environment for people in VR. As social VR becomes more popular over the next few years it will quickly have to learn how to keep people safe together in shared environments. Some …

read more


Sun 21 August 2016 by Jim Purbrick

At the 3rd Party Dev State of the Union at EVE Fanfest 2016 earlier this year, CCP FoxFour drew my attention to a limitation of the current approach used by crestmatic to generate CREST documentation: it only discovers resources always reachable from the API root from the perspective of the …

read more

Strange Tales From Other Worlds

Tue 10 May 2016 by Jim Purbrick

At the end of last year, Michael Brunton-Spall and Jon Topper asked me if I would like to give the opening keynote at Scale Summit as I had “lots of experience scaling weird things”, by which they meant Second Life and EVE Online. I immediately thought of The Corn Field …

read more

Towards A Generic Media Type System

Sun 17 April 2016 by Jim Purbrick

The early days of RESTful hypermedia API design tend to involve lots of homogeneous collections. In the case of CREST vnd.ccp.eve.Api-v1 pointed to the logged in vnd.ccp.eve.ccp.Capsuleer-v1 which pointed to a vnd.eve.ccp.CharacterCollection-v1 of contacts which pointed to many vnd.ccp …

read more


Fri 15 April 2016 by Jim Purbrick


3 weeks ago I spent a few hours with Photoshop working on the Story Bird logo that Linda made a while ago to make it suitable for print. 2 weeks ago I spent a few hours researching the best way to convert the 24-bit 48 kHz Story Bird mixes …

read more


Sat 26 March 2016 by Jim Purbrick

(Image: black barn mixing desk)

I love record shops. Whenever I had pocket money it would go on Metallica and Nirvana CDs bought from Our Price or black t-shirts to match. When I lived in Nottingham I bought Boards Of Canada CDs from the same Selectadisc that my Dad bought a rare Fairport Convention single …

read more


Sun 03 January 2016 by Jim Purbrick

A year ago I gave a talk at EVE Vegas about building RESTful CREST applications. My #1 recommendation was to specify representations in requests, but that’s hard to do when there is little documentation on which representations are available and what they contain.

Fortunately CREST is self describing: send …

read more