Architects Using Virtual Reality: A Gamechanger?

Once the preserve of computer game designers, virtual reality technology enables architects—and their clients—to get up close and personal with a building far in advance of its completion.

Being able to immerse oneself in a 3D representation of the world has always held an attraction for artists and designers. The technology that enables people to do this, to one degree or another, has been around for nearly 200 years.

It all kicked off in 1838 when Charles Wheatstone, a Fellow of the Royal Society in London, invented the stereoscope, a device which, as its name suggests, enabled the viewer to see an image in stereo, giving it the appearance of relief, a sort of primitive 3D.

Over time the technology has evolved. Hollywood technician Morton Heilig offered the world’s movie-goers “Sensorama” in the 1950s, giving audiences a more immersive cinematic experience. But it wasn’t until the 1980s that the equipment we recognise today as virtual reality (VR) came into being, when U.S. computer scientist Jaron Lanier manufactured a VR headset and hand controls, becoming known in some circles as the “founding father of virtual reality.”

Initially a technology to enhance the experience of those playing computer games, VR has increasingly been used across industry to help people better visualise the world around them. Architects are now using virtual reality to considerable effect. 

Along with augmented reality, VR can offer users—including architects, designers and their clients—a view of what a building and its interior, together with fixtures and fittings, will look like once the project has been completed. It enables designers to “play around” with a concept before construction gets underway, helping them adjust an existing proposal so that it delivers a better outcome. It also means clients can see what an end result might look like months—even years—before completion.

Architects are using virtual reality more and more, and Bluebeam spoke to HawkinsBrown’s Jack Stewart, architect and digital design lead, and Ben Robinson, associate, digital design, to get their views on VR. 

Bluebeam: How long has HawkinsBrown been using virtual reality? 

Jack Stewart: We would have got our first Oculus VR headset when they first came out, which is about five years ago, and we started to explore creating computer game environments for our projects. We can do it on any scheme, but usually we use it where we think it’s relevant or if there’s a need to explain something to a client about a particular part of a project.  

A big chunk of what we do is computational design, and we use a variety of programming technology to help us with our design work. But we are also magpies for new technology, so we love to take any technology that exists out there and apply it to a project, provided there’s a relevant reason to do that. That’s where the VR side of things came in; we got hold of some headsets and we started to create the VR environments that sit behind those headsets for a few of our projects. The Oculus staircase at the Cardiff Innovation Campus’ Sbarc|Spark building is probably the most developed example of that.

BB: Ben, can you tell us about that project? 

Ben Robinson: The Sbarc|Spark building [being built by Bouygues U.K.] was the first time we had utilised game engine technology and VR on a project. This was very powerful, both for design development and coordination with MEP/structures. We used Dynamo alongside a real-time game engine walkthrough to update the design. It meant we could immediately walk around the model to assess the impact of a change, either on screen or, to be more immersed in the space, using the VR headset.

One of the key design drivers for the Oculus was to ensure that the view from the base up was as clear of MEP equipment as possible. During coordination workshops we could load in new MEP and structural models, tweak the architectural design as required and all experience the space much better than we could off-plan or in Revit. Using this game engine technology also really helped the university buy into the vision of the Oculus, as we could use the VR headset to place them at the top or bottom of the staircase so they could experience the space far better than they could from 2D plans or CGIs. In the end we produced a high-quality walkthrough model for the university that they could use for marketing.

BB: What are the specific drivers that inform whether or not to use VR on a project? 

JS: We build a computer-generated 3D model for pretty much every job we work on, assuming it has gone beyond the feasibility stage and there is some kind of physical massing to be created. Once you have a 3D model, it’s a natural step to get that into a virtual reality environment.  

It’s a case of building up the detail, particularly if that’s necessary for a conversation you’re having with a client or if you’re presenting to the community for planning purposes. It then becomes a matter of adding that detail into the model and getting it into an application that can visualise it in a headset. The technology is available to allow that to happen pretty quickly.

BB: Are there limitations to using VR? 

JS: There are different schools of thought around how useful VR can be. There are certainly good things about it, but it does have limitations, yes. Only one person can experience the VR visuals at any one time, so for client meetings where there is more than one person—which is most of the time—we tend to use a large screen. It’s the same content as would be seen in a VR headset, just on a screen. What’s going on behind the scenes, as it were, the experience itself, is what we build. That’s the most important bit; whether it goes onto a large screen or onto a screen that’s right in front of your eyes via a headset is neither here nor there.

BB: But VR definitely has its uses? 

JS: Certainly. Using VR in conversations with a client can lead to new outcomes. On the Oculus staircase job we did in Cardiff for the Innovation Campus’ Sbarc|Spark building, the client felt it was really important to be able to stand at the bottom of the staircase and see right the way to the top, to achieve a visual connectivity through the building. The contractor suggested shifting the staircase’s position slightly and making it more vertical, but that would have meant losing that “through the building” viewpoint. Because we had computationally modelled the staircase, we could tweak it and show the client through the VR headset exactly how the view would change. The contractor’s suggestion would have saved the client money, but the client wasn’t prepared to do that at the expense of changing the perspective offered by the Oculus staircase.

Looking at plans, elevations and sections won’t give you that visualisation experience. That’s only achievable with some kind of VR technology. It’s definitely a big selling point to be able to show clients these 3D perspectives, putting them in the rooms and buildings they will be occupying in a couple of years’ time.

BB: What does VR add to the design process? 

JS: It’s mainly being used as an add-on at the end of the design process. So the design’s been completed and then it’s a case of communicating it to the client or another stakeholder. But this could change in time. There is technology being developed—from the likes of Johan Hanegraaf, co-founder of Arkio—where you put a headset on and use handheld controllers to push, pull, drag, shape and form masses and designs for buildings.

Applications such as Revit and Rhino give us most of the tools that we need to create our designs. But we can’t currently put a headset on and design natively in them—with the headsets on, we don’t have the full suite of design functionality. They’re not designed to be used with a headset and two controllers. Those software vendors need to develop VR-friendly versions of their software, so that we can use Rhino in a headset, and then maybe we would choose to do that rather than have a mouse, keyboard and screen.  

If software developers crack that, it could be a really valuable piece of design software. But crucially it’s got to enable us to produce the designs that we currently can, or enable us to do more. Get that sorted and then we’re going to want to use it.
