The design moral of this story is that we should spend some time thinking about the language we use to describe designs and their features. This helps all participants in the design process to develop a shared understanding. But we also need to make sure we don’t let our language constrain creativity. It is always just a good starting point.

My friends at Limina Immersive have done something rather special – in fact, I think, something of historical significance. They have published a taxonomy, or design language, of VR experiences. The aim is to bring the emerging industry together, to accelerate the innovation process, and to inspire further creativity. Why do I think this is so important? Here’s a history lesson, courtesy of Douglas Adams, in how not to repeat the mistakes of previous generations.

Let’s begin with some time travel…

In 1977, three years before he “invented” the wirelessly connected tablet computer, Douglas Adams encountered the Commodore PET and was, like most other people at the time, both baffled by its apparent uselessness and beguiled by the hint of greatness emanating from its off-white pyramidal shell – designed in accordance with the sci-fi rule book of the day. The future had landed like a Spielbergean flying saucer. With a slightly disturbing whoosh and hum, a hatch opened and a high-tech ladder descended. A glow suffused the dry-ice clouds. Clearly, the aliens were about to emerge. But what kind of life form should we expect? And what would their purpose be? Nobody seemed to know.

Commodore (one of the big names from the early days of computing) didn’t actually know what they were selling. They called it a “Personal Electronic Transactor”, abbreviated to PET so as to imply friendliness, but perhaps also as an admission of its un-trainability and predilection for misbehaviour. Today we recognise it as a primitive kind of desktop computer; we have a slightly more advanced design language for such glowing electronic boxes. As for how the PET might be used, beyond being a vanity purchase, there was even less clarity. To Douglas Adams, looking back upon the history of personal computing in November 1999, that tells us something important about how advanced technologies sometimes come into our lives. To me, watching the introduction of an equally revolutionary technology (virtual reality) now, it shows how today’s innovators are proactively developing a rich-enough design language to accelerate the process of adoption and adaptation. We have more sophisticated methods, learned from design research (the study of how successful designers do what they do), for short-circuiting socio-technical evolution. Rather pleasingly, this intelligent strategy is being spearheaded by my friends at Limina Immersive and the Digital Catapult Centre (which is exactly what we want from a catapult kind of thing).

But let’s not get ahead of the story. Adams tells us how, in the ’70s and ’80s, we passed through a series of limited and limiting conceptions of what the computer was actually for:

“The reason I couldn’t imagine what use it would be to me was that I had a very limited idea of what a computer actually was—as did we all. I thought it was a kind of elaborate adding machine. And that is exactly how “personal” computers (a misleading term as applied to almost any machine we’ve seen so far) were for a while developed—as super adding machines with a long feature list.” (Adams, 1999: 91)

Adams describes how over several years, and at great expense to many “enthusiasts”, we cycled through a series of “aha, that’s what it is” revelations. Once we had exhausted the limited usefulness of the “adding machine” idea (not many people really needed such a powerful adding machine), we moved on:

“Then, as our ability to manipulate numbers with these machines became more sophisticated, we wondered what might happen if we made the numbers stand for something else, like for instance the letters of the alphabet. Bingo! An extraordinary, world-changing breakthrough! We realised we had been myopically short-sighted to think this thing was just an adding machine. It was something far more exciting. It was a typewriter!” (ibid. 91)

And so was born Microsoft Word and its “long and increasingly incomprehensible feature list”. But, as many professional writers realised, that’s not actually what they needed. This explains the long-running popularity, now almost impossible to believe, of the Amstrad word-processing computer (which looked like a PC, but only did word-processing):

“With new, more-immature technologies there is a danger in getting excited about all the ways you can push them forward at the expense of what you want to say. It is therefore rewarding to work in a medium where you don’t have to solve those problems because it is a mature medium.”

Adams bought an early Apple Mac, and persevered with his enthusiasm (perhaps contributing to his habit of letting deadline after publishing deadline pass by with a pleasing but financially awkward whooshing sound). And then, with most other writers happily hammering away on the keys of their Amstrads, the computer enthusiasts made their next astonishing [sarcasm] advance:

“The next breakthrough came when we started to make these numbers, which were now flying round inside these machines at insane speeds, stand for the picture elements of a graphic display. Pixels. Aha! we thought. This machine turns out to be much more exciting even than a typewriter. It’s a television! With a typewriter stuck in front of it!”

Some people actually turned their computers into televisions using analogue TV receiver cards – while everyone else just bought bigger and bigger actual TVs. Meanwhile, the enthusiasts (many of whom were socially isolated men) discovered an alternative means of accessing still and moving imagery (often of a dubious nature) beyond the control of broadcast regulators: the Internet. And so the computer became recognised for what it really was (again): an alternative kind of postal system. And after that? A kind of shop. And almost (but never quite) a university. We spent years churning through different concepts of the computer. But, as Adams tells us, along the way a more sophisticated and utterly revolutionary idea emerged (or rather re-emerged, as Turing had already worked it out):

“The computer is actually a modelling device. Once we see that, we ought to realise that we can model anything in it. Not just things we are used to doing in the real world, but the things the real world actually prevents us from doing.”

This deeper concept of the computer is rooted in the one aspect of personal computing that Adams left out of his account: gaming. Perhaps it is missing from the narrative because of the complexity and diversity of the activities that category covers, and because of the close connection between games and the increasing virtualisation of humanity (which is not necessarily a bad thing). To my children (aged 13 and 7), it is absolutely, blindingly obvious. They are Minecraft players. Not only do they live a large part of their lives in virtual worlds, they understand how the laws of that environment are “hacked” through “mods” to continually make the impossible possible in the virtual.

And that leads us into the now: virtual reality becoming a ubiquitous technology for modelling the real world, possible worlds, and (most excitingly) impossible worlds. VR is new, exciting, and very, very different. It is also very much real, now that the technology has arrived (an Oculus Go costs £200). But as Catherine Allen has argued, it isn’t like strapping the glowing rectangle of a computer or TV screen to your face (well, it is in a practical sense, but the magic of its illusion means that it really isn’t like that at all). And I agree with the claim that VR is a distinctive new medium, in the same way as digital text is distinct from print – but even more so.

This time, let’s not make things so painful for ourselves. Let’s not spend years churning through unhelpful, unnecessarily constraining concepts as to what VR is and is not. Let’s learn from the designers.

Every design discipline has its own language. Some of these languages are highly formalised, but with zones of raggedness enabling emergence. They might be organised into what Christopher Alexander called a “pattern language” – consisting of design patterns, each of which outlines, at a fairly abstract level, a common design challenge and a well-known, tested design solution. Such pattern-based design solutions describe the actions and interactions of human and non-human actors in a narrative. The pattern serves as a recognisable, shareable starting point for a design conversation between the many diverse partners in a design process – including specialist designers, clients, engineers, business people and end users. The design dialogue is improved if, when thinking about the challenge and potential solutions, we can say: “is it that kind of thing? – I’ve seen that before” and “sort of, but there are aspects of it that are more like this kind of thing.”

Alexander argued for an extensive formalisation of shared patterns, around which a more participatory and democratic design dialogue would take place (his particular field was town planning, so very much in need of democratising). The pattern language approach is now most widely used in software engineering, where it has proved invaluable; attempts to port it to other fields, including education and (back into) architecture, have been less successful.
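To make this concrete for readers who haven’t met software patterns: a pattern is a named prose description of a recurring challenge and a tested solution, and any code is only ever one instance of it. Here is a minimal sketch, in Python, of one of the best-known examples, the Observer pattern (the class and method names are illustrative choices, not taken from any particular pattern catalogue):

```python
# Observer pattern: the recurring challenge is keeping many dependents in
# sync with one changing subject; the tested solution is to let dependents
# register themselves and have the subject notify them on every change.

class Subject:
    def __init__(self):
        self._observers = []   # the registered dependents
        self._state = None

    def attach(self, observer):
        self._observers.append(observer)

    def set_state(self, state):
        self._state = state
        for observer in self._observers:   # the "notify" step
            observer.update(state)

class LoggingObserver:
    def update(self, state):
        print(f"state changed to: {state}")

subject = Subject()
subject.attach(LoggingObserver())
subject.set_state("ready")   # prints: state changed to: ready
```

The value for the design conversation lies less in the code than in the shared name: saying “this is an Observer situation” is precisely the “I’ve seen that before” move described above.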

But there are dangers in such a formalised approach. It might, for example, inhibit creativity. Patterns are only meant to be starting points – indeed, potential points of critical difference – around which the design conversation may flow. But reification may occur. The design dialogue may become unwittingly stuck on a pattern. This is an especially big risk at the early stages of learning about and developing a new technology, and more so a new medium – such as virtual reality, or, to use more open design language, XR (extended reality), covering a wider range of interconnected immersive technologies and techniques. Even the simple term VR might be too constraining at this stage. Designers are exploring many variations of VR and augmented reality (AR), as well as interfaces and flows between the digital and the analogue. Holographic technologies (which my son has been playing with as part of a Ludic Rooms project) might be effectively combined with digital, haptic and analogue elements into something uniquely new, but not easily categorised. Artists (and, interestingly, teachers) tend to shun formal pattern languages. They want to keep things more open, more interstitial. If you want to be deeply inventive, it is often best to avoid the reification of pattern languages and the jumping to conclusions too early that they encourage.

But at the same time, if you are looking to accelerate the growth of a community of experts, or indeed an industry (as the Digital Catapult is), it helps to have some formality – some publicly stated design language around which the community can agree. This is especially important when seeking to bring many new practitioners into the community quickly – the process that Wenger describes as “legitimate peripheral participation”, through which newcomers move towards active membership of a community of practice. What we need to do is get the balance exactly right. We want to avoid the endless, inefficient, wishy-washy, painfully slow diffusion of innovation encountered in the early years of computing, but without stifling creativity. Another variation of the design language strategy is required.

And that brings us back to the work of Limina Immersive, carried out with Dan Tucker (formerly a BBC producer, now a key figure in the XR industry). The icons listed above, designed by Piers Elliot, represent 15 genres of immersive experience. They are the core content of the Digital Catapult report on “Immersive Content Formats for Future Audiences” (2018). They are not intended to be a definitive set; more will emerge in the future. Instead, they work to open up pathways that producers and consumers may follow in their growing exploration of the new medium. Within design dialogues, they work as initial “placements” (Buchanan, 1992).

For example, I might be working with a group of academics and students from a discipline (say engineering), trying to inspire them into imagining different ways in which XR might be useful in engineering education. We might start by saying “imagine a treasure hunt”, then ask “what do we know about how treasure hunts work?” and “is there anything like that in engineering education?” or “could there be?” – and we can start to build design ideas around it. We might even use the icons within the diagrams and clusters of post-it notes that develop. We could consider different perspectives on the treasure hunt idea, looking around it to explore its implications. We might then move quickly to some prototypes, with the initial placement as our starting point.

In fact, we will be doing this. We have funding, with Warwick Manufacturing Group (WMG) and Monash University (Melbourne), to explore the potential of XR in engineering education. We will be generating some design patterns. But first we need a more fluid, less constrained exploration, inspired by some attractive and easily understood starting points.

As Limina’s Catherine Allen explained at the launch event for the report, the genres were chosen through a long and extensive curatorial process, in which the 130 most impactful experiences were identified from a list of over 1,000. The Limina team, along with a range of other participants, experienced this long shortlist themselves, in order to identify patterns, create potential categories, and choose the two best examples for each category. A panel of members of the public was used to test the categories and assess audience impact. This was a massive job! But necessary. The outcome is a design language that takes us beyond talking to each other, and to the wider public, in vague terms about “VR experiences”, “AR experiences” and so on. We can now, for example, curate a programme of experiences for an event in which we tell audiences that they will be embarking on an “audio journey”, or an experience giving “access all areas” to, for example, the nuclear industry, or perhaps getting “up close and personal” with sea monsters. Digging further down into the report, the descriptions of the example experiences for each category provide a richer design language relating to human experiences – aesthetic, cognitive, emotional and physical aspects – with technical aspects appearing only when relevant and necessary. Design conversations are enriched by this language. People’s experiences of the products, and their reflections on them, will be enriched as the language spreads and grows. Already, in this initial research, a richer body of design knowledge has become concrete:

“The most successful formats tended to generate more than one sort of immersion in their audience. Immersion can be categorised into being immersed in a space (spacial immersion) and being mentally immersed (strategic immersion, narrative immersion and tactical immersion).”

And now we have design guidelines indicating how we can make more fruitful design choices, drawing on a richer yet more precise range of options.
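To illustrate how such a taxonomy might be put to work, here is a minimal sketch in Python, assuming only the four immersion types named in the quotation above; the `Experience` type, the example titles and the `curate` helper are hypothetical illustrations, not part of the report:

```python
from dataclasses import dataclass, field
from enum import Enum, auto

# The four kinds of immersion named in the report's quotation above.
class Immersion(Enum):
    SPATIAL = auto()
    STRATEGIC = auto()
    NARRATIVE = auto()
    TACTICAL = auto()

# Hypothetical record for a catalogued immersive experience.
@dataclass
class Experience:
    title: str
    genre: str                      # e.g. "audio journey", "treasure hunt"
    immersions: set[Immersion] = field(default_factory=set)

def curate(catalogue, wanted):
    """Select experiences offering more than one of the wanted immersion
    types, following the observation that the most successful formats
    generate more than one sort of immersion."""
    return [e for e in catalogue if len(e.immersions & wanted) > 1]

catalogue = [
    Experience("Sea Monsters", "up close and personal",
               {Immersion.SPATIAL, Immersion.NARRATIVE}),
    Experience("Reactor Tour", "access all areas",
               {Immersion.SPATIAL}),
]
print([e.title for e in curate(catalogue, set(Immersion))])  # ['Sea Monsters']
```

Encoding the categories as data in this way is one route by which a curation tool, or an event programme, could follow the guideline rather than merely quote it.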