
We recently brought you a piece on the current features available to design review studios in 2018, but we didn’t want to stop there. The time and effort design teams have invested in VR experiences in such a short space of time suggests that the capabilities of design reviews will continue on an upward trend. The SLIPSTREAM team sees 2019 as a year of optimisation and refinement - we don’t expect groundbreaking changes to make their way into design reviews. What we do expect is a range of tools set to incrementally enhance your capabilities in a VR experience, giving you more control and flexibility over what you can achieve. Here’s what we expect to see moving into 2019:

The next phase of physical and virtual alignment
Let’s say your VR design experience has successfully integrated a physical buck alongside the virtual experience itself. You’ve chosen your colourways, your part variants and an exterior environment, and you’re standing 2 metres away from your iteration in both the physical and virtual world. To get a full perspective, you’re using the transportation feature to view your design from 10 metres.

SLIPSTREAM Hackrod buck

The only trouble is, once you’ve transported virtually, you’re still physically 2 metres away from your buck, meaning there’s every chance of walking into it when viewing your design from different angles. This opens up the possibility of a feature that notifies the user when they are about to hit the physical buck. Such a simple yet effective tool would continue to bridge the gap between the physical and virtual world for design reviews - a significant step in elevating the sensory experience for the executive team.
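
To make the idea concrete, here’s a minimal sketch of how such a proximity warning could work, assuming the headset’s tracked position and a rough bounding box around the physical buck are known in the same physical coordinate frame. The function and parameter names are illustrative, not SLIPSTREAM APIs:

```python
# A minimal proximity-warning sketch, independent of any particular engine.
# Assumption: the headset position and the buck's bounding box share one
# physical (tracking-space) coordinate frame, measured in metres.
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box approximating the physical buck."""
    min_corner: tuple
    max_corner: tuple

def distance_to_box(point, box):
    """Shortest distance from a tracked point to the box (0 if inside it)."""
    dx = max(box.min_corner[0] - point[0], 0.0, point[0] - box.max_corner[0])
    dy = max(box.min_corner[1] - point[1], 0.0, point[1] - box.max_corner[1])
    dz = max(box.min_corner[2] - point[2], 0.0, point[2] - box.max_corner[2])
    return (dx * dx + dy * dy + dz * dz) ** 0.5

def should_warn(headset_pos, buck, warn_at=0.75):
    """True when the user is close enough to the physical buck to show a warning."""
    return distance_to_box(headset_pos, buck) < warn_at

# Example: a 4 m x 2 m buck footprint; the user has drifted to within ~0.5 m of it.
buck = Box(min_corner=(0.0, 0.0, 0.0), max_corner=(4.0, 1.4, 2.0))
print(should_warn((-0.5, 1.6, 1.0), buck))  # True -> display the proximity overlay
```

The key point is that the check runs against the physical headset position, so it still fires when the user has transported themselves to a 10-metre viewpoint in the virtual scene.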

The alignment of the physical and virtual world won’t stop there. We fully expect automotive design studios to explore the range of tools in their arsenal and how each one can be kept aligned between the physical and virtual world - a simple sketch of this kind of synchronisation follows the list below. This includes some of the following:

  • The ability to virtually open a door when the door on the physical buck has been opened.
  • Sliding your virtual seat back will result in your real seat being adjusted.
  • Rotating the wheel that is part of your physical buck will rotate the wheel in your virtual experience.
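
As a simple sketch of what this kind of synchronisation involves, the snippet below copies readings from hypothetical buck sensors onto the matching parameters of a virtual car model once per frame. All of the names are illustrative assumptions, not SLIPSTREAM interfaces:

```python
# A minimal sketch of driving virtual state from physical buck sensors.
from dataclasses import dataclass

@dataclass
class BuckSensorReading:
    door_angle_deg: float    # hinge encoder on the physical door
    seat_position_m: float   # linear position of the physical seat on its rail
    wheel_angle_deg: float   # rotary encoder on the physical steering wheel

@dataclass
class VirtualCarState:
    door_angle_deg: float = 0.0
    seat_position_m: float = 0.0
    wheel_angle_deg: float = 0.0

def sync_virtual_to_physical(reading: BuckSensorReading,
                             state: VirtualCarState) -> VirtualCarState:
    """Copy each physical measurement onto its matching virtual parameter.

    In a real integration this would run every frame, and the virtual model
    would animate its door, seat and wheel to the returned values.
    """
    state.door_angle_deg = reading.door_angle_deg
    state.seat_position_m = reading.seat_position_m
    state.wheel_angle_deg = reading.wheel_angle_deg
    return state

# Example frame: the reviewer has opened the physical door and slid the seat back.
frame = BuckSensorReading(door_angle_deg=35.0, seat_position_m=0.12, wheel_angle_deg=-10.0)
print(sync_virtual_to_physical(frame, VirtualCarState()))
```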


Free-hand drawing on body surfaces
Even with a range of design review tools in the hands of design studios, we’ve only scratched the surface of what is possible and the tools that could still be incorporated. In 2019, we’re expecting to see the ability to draw free-hand on surfaces. Drawing directly on surfaces would be an exciting option for teams who need an added level of customisation during a production cycle. Not only that, adding the finishing touches to a design by drawing on multiple surfaces will contribute to a tailored design review, building on the use of familiar environments and boosting the overall personality of the experience.
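
One common way to implement this - sketched below under our own assumptions rather than any confirmed SLIPSTREAM approach - is to cast a ray from the controller, intersect it with the body surface and record the hit points as a stroke:

```python
# A minimal free-hand drawing sketch: cast a ray from the controller, find where
# it hits a surface triangle and append that point to the current stroke.
# The mesh and controller pose below are stand-in data, not SLIPSTREAM structures.

def ray_triangle_hit(origin, direction, tri, eps=1e-7):
    """Moller-Trumbore ray/triangle intersection; returns the hit point or None."""
    v0, v1, v2 = tri
    e1 = [v1[i] - v0[i] for i in range(3)]
    e2 = [v2[i] - v0[i] for i in range(3)]
    p = [direction[1] * e2[2] - direction[2] * e2[1],
         direction[2] * e2[0] - direction[0] * e2[2],
         direction[0] * e2[1] - direction[1] * e2[0]]
    det = sum(e1[i] * p[i] for i in range(3))
    if abs(det) < eps:
        return None                      # ray is parallel to the triangle
    inv = 1.0 / det
    t0 = [origin[i] - v0[i] for i in range(3)]
    u = sum(t0[i] * p[i] for i in range(3)) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = [t0[1] * e1[2] - t0[2] * e1[1],
         t0[2] * e1[0] - t0[0] * e1[2],
         t0[0] * e1[1] - t0[1] * e1[0]]
    v = sum(direction[i] * q[i] for i in range(3)) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = sum(e2[i] * q[i] for i in range(3)) * inv
    if t < 0.0:
        return None                      # the surface is behind the controller
    return [origin[i] + t * direction[i] for i in range(3)]

stroke = []  # points of the current free-hand stroke, in model space

def draw_step(controller_pos, controller_dir, surface_triangles):
    """One pen sample: add the nearest surface hit (if any) to the stroke."""
    hits = [h for tri in surface_triangles
            if (h := ray_triangle_hit(controller_pos, controller_dir, tri))]
    if hits:
        stroke.append(min(hits, key=lambda h: sum((h[i] - controller_pos[i]) ** 2
                                                  for i in range(3))))

# Example: one flat body panel triangle with the controller pointing straight at it.
panel = [((0, 0, 1), (1, 0, 1), (0, 1, 1))]
draw_step((0.2, 0.2, 0.0), (0.0, 0.0, 1.0), panel)
print(stroke)  # [[0.2, 0.2, 1.0]]
```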


Switch between light settings in one environment
With design studios utilising the power of light baking, there’s every opportunity that improvements in dynamic lighting will make it possible in 2019 to configure light settings within a single environment. Whilst light baking provides high-quality, photo-realistic settings and surfaces, the lighting is precomputed, so it struggles with actions that change the scene during a design review, such as opening your car door or choosing different part variants.
Major enhancements were made to dynamic lighting through 2018 and we expect this trend to continue into 2019. By the latter stages of next year, we predict that design teams will be able to integrate customisable lighting options into their design reviews. At the moment, light settings for exterior environments use light baking, sky domes and early-stage dynamic lighting to replicate a particular time of day. Wouldn’t it be incredible if you could choose the time of day for your design review and have your environment update almost instantaneously?
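
As a rough illustration of what such a control could look like, the sketch below maps a chosen hour of the day onto settings for a dynamic sun light. The sun path and values are deliberately simplistic assumptions, not SLIPSTREAM parameters - the point is only that a single time-of-day control drives the directional light that dynamic lighting allows you to adjust at runtime:

```python
# A minimal time-of-day sketch for a dynamic sun light. The sun follows a crude
# circular arc between 6:00 and 18:00; real sky models are far more involved.
import math
from dataclasses import dataclass

@dataclass
class SunSettings:
    elevation_deg: float   # angle of the sun above the horizon
    azimuth_deg: float     # compass direction the light comes from
    intensity: float       # 0..1 multiplier for the directional light

def sun_for_hour(hour: float) -> SunSettings:
    """Map a review time of day (0-24 h) onto directional-light settings."""
    # Fraction of the 6:00-18:00 daylight window, clamped so night stays dark.
    f = max(0.0, min(1.0, (hour - 6.0) / 12.0))
    elevation = math.sin(f * math.pi) * 60.0       # peaks at 60 degrees around noon
    azimuth = 90.0 + f * 180.0                     # sweeps from east (90) to west (270)
    intensity = max(0.05, math.sin(f * math.pi))   # keep a small floor so scenes stay visible
    return SunSettings(elevation, azimuth, intensity)

# Example: flip the same environment between a morning and a late-afternoon review.
print(sun_for_hour(9.0))    # lower sun from the east
print(sun_for_hour(17.0))   # low sun from the west - expect longer shadows
```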

Major improvements in eye tracking
There’s still scope for improvement when it comes to the capabilities of Virtual Reality headsets, one of which is how headsets handle focus. Most headsets can’t yet track the way the human eye behaves, which often creates blur spots in the user’s vision during a VR experience.

One possibility to counteract this is the concept of foveated rendering. Foveated rendering mimics the behaviour of your eyes in a Virtual Reality environment, rendering at full detail only the area you are looking at while reducing detail in your peripheral vision. By concentrating rendering effort where you are actually looking, you can improve the smoothness of your VR experience, getting closer to the optimal 90 FPS and reducing dizziness and nausea.
It’s crucial to bear in mind that there is still a lot of data to evaluate and research before this feature is production ready, but it may take Virtual Reality rendering to the next level once it is made commercially available.
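
To make the idea more concrete, here’s a minimal sketch of gaze-dependent shading: regions near the tracked gaze point keep full resolution, while the periphery drops to a reduced rate. The zone radii and scale factors are illustrative assumptions, not values from any particular headset SDK:

```python
# A minimal foveated-rendering sketch: pick a shading/resolution rate for a pixel
# based on its angular distance from where the eye tracker says the user is looking.
import math

def shading_scale(angle_from_gaze_deg: float) -> float:
    """Resolution multiplier for a pixel at a given angle from the gaze direction."""
    if angle_from_gaze_deg <= 5.0:     # foveal zone: full quality
        return 1.0
    if angle_from_gaze_deg <= 15.0:    # near periphery: half rate
        return 0.5
    return 0.25                        # far periphery: quarter rate

def angle_between(gaze_dir, pixel_dir):
    """Angle in degrees between the gaze ray and the ray through a pixel (unit vectors)."""
    dot = max(-1.0, min(1.0, sum(g * p for g, p in zip(gaze_dir, pixel_dir))))
    return math.degrees(math.acos(dot))

# Example: looking straight ahead, a pixel 20 degrees off-axis gets quarter-rate shading.
gaze = (0.0, 0.0, 1.0)
pixel = (math.sin(math.radians(20)), 0.0, math.cos(math.radians(20)))
print(shading_scale(angle_between(gaze, pixel)))   # 0.25
```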

Book a discovery call today if you’re interested in SLIPSTREAM as a design review tool.
