The Lab’s had a number of well-publicised (and some rather less publicised) feature and technology development projects that haven’t come to fruition. Where are they at? Or, in other cases, what happened to them?

Expressive Puppeteering

This project was a system to allow the parts of an avatar to be moved on the fly, with the possibility of creating animations in-world in real-time. Unofficially cancelled in mid-2007 due to a company-wide focus on reducing crashes and lag. Officially cancelled in 2008.

Physical Avatars (AKA: Avatar 2.0)

This project was to place the avatar in the same 3D space as the rest of the environment (rather than in its own loosely-coupled coordinate-space). It was partly in support of Expressive Puppeteering, and partly to allow avatars to hug, kiss, shake hands and so on. It suffered the same fate as Expressive Puppeteering, and at the same time. Officially cancelled.

Script Limits

Script Limits involve finer per-parcel and per-agent allocation of scripting runtime resources. It’s a very tricky thing to implement properly, and it isn’t known if any of the staff involved with the project remain with the Lab. Given the potential benefits to Linden Lab, I’m not sure that the project is entirely dead. Status unknown.

Project Firefly

Project Firefly was hinted to be a client-side scripting project; that is, making the Second Life viewer itself scriptable. Exactly what the scope of that might be remains unknown. It might allow for very simple alterations of the user-interface, or might extend to full-blown automation of a viewer. No more has been heard about it, and the status of employees who might have been associated with it is uncertain. Status unknown, but pay attention to Oz and Esbee at SLCC, as this could tie into new viewer-development strategies.

C# Scripting

The possibility of adding support for other scripting languages in addition to LSL has been raised by the Lab on a number of occasions. The integration effort isn't without hitches, though: other languages may carry significantly larger runtime overheads than LSL scripts, and may require scripters to use some relatively arcane interface libraries to shoehorn those languages into the execution/event model that the grid uses.

I believe the talent relating to this project has largely been let go, and there doesn’t seem to be any particular benefit to Linden Lab other than a marketing bullet-point. Status unknown, but not likely. Not officially cancelled.

Second Life Enterprise

Dead according to Lab management. There have been hints and suspicions of this for a while, but Philip Rosedale confirmed it in comments at the Q&A after he took over from Mark Kingdon, and this was confirmed the following week by a Lab spokesperson.

Face recognition/hands-free interface via Webcam

This was a technology from one of Mitch Kapor’s other businesses. While it’s been reported in the media as a Second Life/Linden Lab project, it doesn’t appear to have ever been taken on-board at the Lab. Not in progress; not planned; never started.

Mesh uploads

Server-side mesh-handling seems to be fine. Viewer-side mesh-handling also appears to be fine. The grid (or at least some simulators) appears to be able to deliver mesh objects now (as of about server version 1.40), but viewers (other than the closed-beta mesh-viewer) don't know how to signal support for them or to ask for those meshes.

The Lab says this isn’t cancelled, and I’m inclined to believe it. If it was cancelled, the closed-beta would likely be terminated, and so far, that beta seems to be not just operational, but actually active.

Mobile viewer

Word is that the Lab is or has been working on a mobile/lightweight Second Life viewer, possibly for the iPad/iPhone. The project has dropped out of sight, and it is logical to assume that it has been cancelled, though there’s no official word on it at this time.

Did I miss any projects while I was rummaging through my notes? Do you have solid information about one of these projects that is different to what I have? Let me know.

SpeedTree

Linden Lab looked into SpeedTree middleware back in 2005, which would have replaced all of the stock vegetation with high-quality, procedurally-generated models. This was abandoned in late 2005 (and officially cancelled in 2006), due to a number of probable issues. First, as Linden Lab explained, it would have put anyone in the existing SL virtual-vegetation industry out of business overnight. Second, the costs of licensing the technology for every viewer probably put the kibosh on things pretty quickly.

The reliable inventory service

Inventory loss comes in two classes. The rarer case is when the servers themselves have lost the inventory items. The more common is where the viewer does not receive a complete copy of inventory data from the servers. Inventory loading is a bit slow and balky because of the protocol used, and information can certainly be lost in transit.

Conversion to a TCP-based service was to vastly accelerate inventory loading and reduce the cases where the viewer received incomplete inventory data to zero. This was scheduled to roll out in January 2009 after some considerable time in development. It apparently did not, and nothing has been heard about it since.
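The reasoning behind the conversion is easy to sketch: with datagram-style delivery there is no retransmission, so a packet lost in transit means the viewer simply never sees that item, while a reliable stream keeps re-requesting until everything arrives. Here's a toy simulation of the two delivery models (not the Lab's actual protocol, purely an illustration):

```python
import random

def fetch_datagram_style(items, drop_rate=0.2, rng=None):
    """One pass, no retransmission: any packet lost in transit is
    simply missing from what the viewer receives."""
    rng = rng or random.Random(42)
    return [item for item in items if rng.random() >= drop_rate]

def fetch_stream_style(items, drop_rate=0.2, rng=None):
    """Reliable delivery: keep re-requesting the gaps until every
    item has arrived, as a TCP stack does with lost segments."""
    rng = rng or random.Random(42)
    received = {}
    while len(received) < len(items):
        for i, item in enumerate(items):
            if i not in received and rng.random() >= drop_rate:
                received[i] = item
    return [received[i] for i in range(len(items))]

inventory = [f"item-{i:04d}" for i in range(1000)]
lossy = fetch_datagram_style(inventory)       # some items silently lost
reliable = fetch_stream_style(inventory)      # complete listing, in order
print(len(lossy), len(reliable))
```

The stream-style fetch always terminates with a complete listing, at the cost of extra round-trips for the retransmissions, which is exactly the trade the planned service was making.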

HTTP textures

The same sorts of problems that plague inventory data transfers also affect texture transfers to some degree. The protocol used is slow and occasionally prone to pathological behaviour. Allowing textures to be loaded as a TCP-based Web-service was conceived of about 2-3 years ago, and viewers have had the (well-hidden) option to make use of it for about that long.

The servers, however, didn’t really support it, except for occasional testing. Now, however, it looks as if that project is about to reach fruition. Testing is going ahead at the moment, and – barring show-stopping issues – it looks like this project will roll out within a month, delivering faster and more reliable texture loading.
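As a sketch of what a texture-over-HTTP fetch might look like — the capability URL, the GetTexture name, and the texture_id parameter below are assumptions for illustration, not confirmed details of the Lab's implementation — a viewer can issue an ordinary GET with a Range header to pull just the first bytes of a JPEG2000 stream, enough to render a low-detail version while the rest downloads:

```python
import urllib.request

def make_texture_request(cap_url, texture_id, first_byte=0, last_byte=None):
    """Build an HTTP GET for a texture asset. A Range header lets the
    viewer ask for only part of the stream, so a low-detail version can
    be decoded before the full texture has arrived."""
    req = urllib.request.Request(f"{cap_url}/?texture_id={texture_id}")
    if last_byte is not None:
        req.add_header("Range", f"bytes={first_byte}-{last_byte}")
    return req

# Hypothetical capability URL and texture UUID, for illustration only.
req = make_texture_request("http://sim.example.invalid/cap/GetTexture",
                           "89556747-24cb-43ed-920b-47caed15465f",
                           first_byte=0, last_byte=599)
print(req.get_header("Range"))
```

A server honouring the Range header would answer 206 Partial Content with only those 600 bytes, and the viewer can widen or repeat the range as more detail is wanted — all over plain TCP, with none of the loss problems of the old protocol.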

Got a news tip or a press-release? Send it to [email protected].