Seeing someone at NC State wear a virtual reality headset and work a hand-held controller may look like a scene out of Ready Player One, the popular Ernest Cline novel turned highly anticipated Steven Spielberg flick that premiered last week at theaters nationwide. Instead of the digital playground OASIS, though, researchers at the Center for Geospatial Analytics don virtual reality headsets to connect with Tangible Landscape, or plug into custom-designed scenes of experimental settings for research purposes. We asked doctoral student Payam Tabrizian, co-majoring in Design and Geospatial Analytics, to tell us more.
In short, we evaluate people’s experience of landscapes, i.e., natural and built environments. We integrate immersive virtual environments and geospatial analytics to understand the impact of landscape change on people’s perceptions and affective responses, such as moods, feelings, and attitudes. We also use virtual reality to enhance participation and decision making processes for alternative urban development scenarios.
IVE is useful in three types of projects: landscape assessment, landscape development (design), and theory development. In landscape assessment projects, IVE helps us to quantify and map experiential qualities of a landscape. In other words, it allows us to evaluate a landscape beyond its ecological or monetary value––for example: understanding perceived safety patterns in a neighborhood park, or the perceived historic or spiritual qualities of a large-scale urban park, or perceived authentic components of farmscapes.
In landscape development projects, we use IVE to display different future (design) scenarios, in a realistic and high-fidelity manner, to identify the solutions that are most compatible with stakeholders’ preferences. We see great value in integrating IVE into the design process as a tool for public engagement, and as a decision-support tool.
Finally, in theory development, we delve into theories of environmental psychology and environmental design to see what types of landscape, components of landscapes, or their configuration, elicit the most positive (or negative) responses––for example, understanding the restorative potential of various vegetation components of urban green spaces.
Photorealistic scenes elicit a much greater degree of presence, or the degree to which a person feels present in a virtual world, and they’re much easier to create––because we use real photographs of real places. So, they have high realism, are inexpensive in terms of time and labor, and they are good for evaluating perceptions of existing environments. While it might take one or two months to create a fully digital IVE with 3D modeling software, it can take one or two hours to create a photorealistic IVE. It’s also possible to generate IVEs of places before and after some landscape change, so they are useful for site archiving and inventory purposes.
On the other hand, fully simulated IVEs have a number of benefits that photorealistic ones do not: First, we have much more control and freedom for changing things. Second, in a simulated environment, we can implement enhanced interactions––for example, we can give a person a joystick, and they can use it to navigate in the environment; they can move and use gestures to experience the environment in new ways. And third, we can manipulate the degree of realism, which is useful for research. For example, sometimes abstract environments without a lot of visual distraction are better than realistic ones for understanding certain cognitive functions, like wayfinding.
The notion of presence is at the heart of this. While taking participants to a site or to a variety of sites for on-site experiences provides high ecological validity, this method is relatively difficult and expensive to employ. Alternatively, researchers have utilized photos and videos to understand perceptions, but these are limited in their ability to represent the in-situ experience. In comparison with these conventional methods, IVEs elicit a high degree of immersion and presence, meaning that people can feel physically present in the environment that’s under study. IVE headsets continuously stream visual information, linked to users’ head and body movements, so that they can actively explore all facets of the environment. For example, being aware of the environment behind you, which is not possible with static photos, is very important to perceptions of danger, and consequently feelings of fear or personal safety.
In addition, virtual environments are flexible and can be programmed to enable higher experimental control and more rigorous data collection. For instance, we program them so that participants can respond to a survey using a joystick controller while they are immersed in an environment, which can mimic capturing responses in a real setting. We also program the order in which settings are experienced, as well as the duration of each presentation, to control for these variables.
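The kind of experimental control described above can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration (the function and scene names are invented for this example, not part of any actual lab software): it randomizes the order in which scenes are presented to a participant and fixes the exposure duration for each one.

```python
import random

def build_session(settings, duration_s=60, seed=None):
    """Return a randomized presentation schedule for one participant.

    settings   -- list of scene identifiers to present
    duration_s -- fixed exposure time per scene, in seconds
    seed       -- optional seed so a schedule can be reproduced exactly
    """
    order = list(settings)
    # Shuffle with a dedicated RNG so each participant's order is
    # independent yet reproducible from the seed.
    random.Random(seed).shuffle(order)
    return [(scene, duration_s) for scene in order]

# Example: three (hypothetical) park scenes, 90 seconds each
schedule = build_session(["park_open", "park_dense", "park_mixed"],
                         duration_s=90, seed=42)
for scene, duration in schedule:
    print(f"present {scene} for {duration}s")
```

Seeding the shuffle is the key design choice here: it counterbalances presentation order across participants while still letting a researcher reconstruct any individual session when analyzing the responses.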
I was always interested in the human aspect of design, and perception, and I found virtual reality a very good way to represent different future scenarios of landscapes and understand perceptions. Experiencing IVEs myself, I immediately realized the benefits for geospatial and design research––allowing the researcher to represent environments much more realistically and to create beautiful experiences of imagined landscapes. Coming to the Center for Geospatial Analytics was also very influential, because I used the opportunity to continue the existing line of IVE research and development established by Jordan Smith [a professor affiliated with the center, now director of the Institute of Outdoor Recreation and Tourism at Utah State University] and colleagues.
Three things: (1) implementing better human interactions such as navigation; (2) enhanced real-time 3D modeling with geospatial data (such as with the center’s urban growth simulation, FUTURES); and (3) photorealism––much better texture mapping and shading and atmospheric effects. These aren’t going to be easy, but they’re the way to go for the future.
Payam is advised by Perver Baran, a faculty fellow at the Center for Geospatial Analytics. Would you like to know more about 3D modeling with virtual reality and open source software? Check out Payam’s tutorials, available on GitHub: https://github.com/ptabriz/geodesign_with_blender
This post was originally published by the Center for Geospatial Analytics.