Earlier in the year I helped Josh Sanburn and his team put together a podcast series for the Wall Street Journal about building Second Life, called “How To Build a Metaverse”, which I’m now really enjoying. It’s great to hear the amazing stories of Second Life’s origins told by some of the most remarkable people I’ve ever worked with, but the ones I’m enjoying most are the stories I hadn’t heard before.

An incredible example from the most recent episode is the story of how Second Life’s moderation began. Peter Alau tells the tale of how he cribbed together a 20-paragraph Terms of Service using examples from other virtual worlds, before Philip stepped in and mandated that the Terms of Service should simply be “Be Nice”.

My life was changed forever when I read Julian Dibbell’s article “A Rape In Cyberspace” while working on the literature review for my PhD in Persistent Virtual Environments at the University of Nottingham. The article made me realize that virtual worlds were not just a collection of interesting technical challenges, but that they could become real, meaningful communities, and that the people who met in them could be hurt, just as they can in real life. Reading Dibbell’s article made me want to work on commercial virtual worlds that enabled those real human communities rather than on experimental worlds that only existed as technical proofs of concept. I left academia, started working in the games industry, and quickly found the Terra Nova group blog, where Julian Dibbell was a contributor alongside Cory Ondrejka, the CTO of Linden Lab.

Given the path that led me to Second Life, I was saddened to hear that the origin story of moderation in Second Life didn’t mention any of that prior work. Peter should have known that people wouldn’t read his epic Terms of Service, but at least he tried to apply best practice. “A Rape In Cyberspace” was already 10 years old when Second Life launched, and Linden Lab were talking to many of the pioneers who worked on early virtual worlds. Philip should have known better, but pursued a wishful, naive approach to moderation, and Second Life ended up relearning the hard way lessons that had already been learned elsewhere.

Grey Goo

This wasn’t the only occasion on which Second Life’s design was optimistic and naive, giving too little thought to how it might be abused by bad actors. On my first visit to San Francisco I hosted a party on Russian Hill to get to know my colleagues, only to end up huddled in the living room with other engineers battling a plague of grey goo spreading across the grid, enabled by an over-permissive API. The API allowed scripted objects to self-replicate and so exponentially overwhelm regions, until firewalls of shut-down simulators limited the spread and space lasers were able to delete the scripts and purge the world of the menace. Shortly after I returned to the UK I woke up one morning to my first encounter with the infamous Goatse image, which a resident had pasted across the world so that it would show up on the live map that had naively been added to the front page of secondlife.com without enough thought about how it might be abused.
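
To make the failure mode concrete, here is a minimal sketch of why unthrottled self-replication is so dangerous. It is written in Python rather than LSL, and the per-region object limit and per-tick replication rate are invented for illustration, not taken from the actual Second Life API:

    # Illustrative model only: NOT the Second Life scripting API.
    # If every object can spawn copies of itself each tick, the population
    # grows exponentially and a single seed object overwhelms a region
    # in a handful of steps.

    REGION_OBJECT_LIMIT = 15_000  # hypothetical per-region capacity


    def ticks_until_region_overwhelmed(replicas_per_tick: int = 1, seed: int = 1) -> int:
        """Count ticks until self-replicating objects exceed the region limit."""
        population = seed
        ticks = 0
        while population < REGION_OBJECT_LIMIT:
            # Every existing object spawns `replicas_per_tick` copies of itself,
            # so the population multiplies by (1 + replicas_per_tick) each tick.
            population += population * replicas_per_tick
            ticks += 1
        return ticks


    if __name__ == "__main__":
        # With simple doubling, one object exceeds 15,000 objects in 14 ticks.
        print(ticks_until_region_overwhelmed())

The point of the sketch is that without a cap or rate limit on replication, the time to saturate a region is logarithmic in its capacity, which is why the only effective responses were blunt ones like shutting down simulators.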

Eventually Second Life’s moderation policies and processes got to a good place (Robin Harper was one of the people I spoke to about moderation best practices when I was working on building safety into Oculus Venues), but the story of how moderation began in Second Life is one of missed opportunities. We shouldn’t just laugh off Second Life’s failings as the stories of swashbuckling hackers while pointing fingers at the similar failings of the current efforts to build a Metaverse. Multi-user virtual worlds were already 20 years old when Second Life was built, and many of their lessons had already been learned.

“How To Build a Metaverse” is incredibly entertaining and illuminating, but this part of the story is a good example of how not to build a metaverse. You can’t just read the fiction about virtual worlds and ignore the non-fiction. You can’t just talk to people who built early virtual worlds or hire them: you have to actually listen to them and apply the lessons they learned.
