VR/MR Archives - AEC Magazine
https://aecmag.com/vr-mr/

Cintoo Metaverse launches for immersive reality
https://aecmag.com/reality-capture-modelling/cintoo-metaverse-launches-for-immersive-reality/
Mon, 25 Nov 2024

Product portfolio led by Cintoo VR Experience, a new app powered by Unreal Engine

Cintoo has launched Cintoo Metaverse, a new product portfolio designed to bring high fidelity reality capture data to an immersive environment.

The Cintoo VR Experience app is one of two initial apps in the Cintoo Metaverse portfolio. The VR app, which runs on Unreal Engine and is cloud-connected to the Cintoo platform, is designed to extend Cintoo’s collaboration and decision-making capabilities.

It allows project managers, engineers and installers to navigate reality models at a true-to-life 1:1 scale on ‘almost any’ VR device, as well as compare as-builts to as-designed by overlaying scans with 3D models.

3D scans and 3D models are all streamed from the Cintoo Cloud in real time, at the same high mesh resolution as the source point cloud thanks to Cintoo’s TurboMesh engine. According to Cintoo, no preparation work or pre-production is required.

To navigate the reality model, Cintoo VR Experience uses a technology first introduced to the Cintoo platform earlier this year.

With the ‘teleportation camera’, users can teleport anywhere in the scene simply by pointing and clicking, or move between scan setup locations.

In VR, users can create annotations, take measurements and then sync everything back to the Cintoo project.

Issues identified in VR can be exported in BCF format, or synced with BIM Track, Autodesk BIM 360 or Procore.
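For readers unfamiliar with BCF (the BIM Collaboration Format), an exported issue is essentially a small zip package of per-topic XML that tools such as BIM Track, BIM 360 and Procore can ingest. The Python sketch below builds a minimal single-topic package; it is illustrative only, the element set is simplified, and the exact schema varies between BCF versions, so treat the details as assumptions rather than a reference implementation.

```python
import uuid
import zipfile
from xml.etree.ElementTree import Element, SubElement, tostring

def write_minimal_bcf(path: str, title: str, comment: str) -> None:
    """Write a single-topic, BCF-style issue package (simplified, not schema-complete)."""
    topic_guid = str(uuid.uuid4())

    # markup.bcf holds the topic metadata: a Title plus one Comment
    markup = Element("Markup")
    topic = SubElement(markup, "Topic", {"Guid": topic_guid})
    SubElement(topic, "Title").text = title
    note = SubElement(markup, "Comment", {"Guid": str(uuid.uuid4())})
    SubElement(note, "Comment").text = comment

    with zipfile.ZipFile(path, "w") as bcf:
        # Version marker most BCF readers look for (version id assumed here)
        bcf.writestr("bcf.version", '<?xml version="1.0"?><Version VersionId="2.1"/>')
        bcf.writestr(f"{topic_guid}/markup.bcf", tostring(markup, encoding="unicode"))

write_minimal_bcf("clash_issue.bcfzip", "Duct clashes with beam", "Flagged during VR review")
```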

“This is not just about visualization; it’s about driving actionable insights, reducing costs, and improving operations,” said Dominique Pouliquen, CEO of Cintoo. “Whether a construction, oil and gas or manufacturing company, the industrial metaverse enables you to harness your data in real-time, creating smarter, more efficient workflows.”

The second Cintoo Metaverse app, 3D Layout Experience, has been developed with Cintoo’s partner, Theorem Solutions. It allows users to navigate a 3D mesh, check clearances when moving equipment, and simulate future workspaces.

Meanwhile, Cintoo has closed a €37 million Series B funding round led by Partech, a global tech investment firm. With the fresh funding, Cintoo will enhance its SaaS platform by expanding its portfolio of integrations and will build on its new industrial metaverse experience and automatic asset tagging capabilities.

Resolve brings 2D construction data into VR
https://aecmag.com/vr-mr/resolve-brings-2d-construction-data-into-vr/
Tue, 10 Sep 2024

‘App Portal’ allows teams to view construction apps in VR to enhance design / review

Resolve has added an ‘App Portal’ to its collaborative immersive design / review software that allows users to view 2D construction project data from inside VR.

With the new feature users can ‘seamlessly’ access and interact with 2D drawings, dashboards, and issue databases without having to remove their Meta Quest VR headset.

According to Resolve, this empowers construction teams to identify and resolve potential issues more effectively by providing immediate access to critical information from 2D data sources. For enhanced collaboration, users can share and pin 2D screenshots within the virtual model for team members to reference.

The App Portal is launching with four key integrations: Autodesk Construction Cloud (ACC), Autodesk BIM 360, Procore, and Newforma Konekt.

“At Resolve, we are committed to providing innovative solutions that improve the construction process and we also believe 2D data still holds an important place in the industry. This new feature allows our users to fully leverage all their project data, leading to better decision-making and increased project efficiency,” said Angel Say, CEO of Resolve.

“The enhanced integration of Procore into Resolve’s application empowers project teams to leverage Procore data like never before. This integration supports project coordination and stakeholder engagement with critical project data in a whole new way, within Resolve’s virtual experience,” said Dave McCool, director of product, Procore.


Find this article plus many more in the Sept / Oct 2024 Edition of AEC Magazine

Varjo Teleport unveiled for reality reconstruction
https://aecmag.com/reality-capture-modelling/varjo-unveils-teleport-for-reality-reconstruction/
Wed, 24 Jul 2024

XR specialist using 3D Gaussian splatting and machine learning to capture reality

Varjo has unveiled Teleport, a new service designed to transform how users create and interact with 3D environments for a wide range of spatial computing applications.

The technology preview highlights the service’s capability to quickly generate photorealistic 3D capture scans of real-world environments directly from an iPhone Pro / Pro Max and allows users to view these scenes from a variety of devices, including PCs, VR headsets, and more.

The idea behind Teleport is that anybody can create a high-resolution 3D model of their environment without needing skills in real-time 3D graphics or photogrammetry.

Teleport reconstructs the real-world scene with accurate lighting, shading, textures and reflections, using what Varjo describes as breakthrough advancements in 3D Gaussian splatting and machine learning technologies.
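As background, a 3D Gaussian splat scene is simply a very large set of oriented, semi-transparent Gaussian ‘blobs’ that are depth-sorted and alpha-blended per pixel. The Python sketch below shows the parameters each splat typically carries and the front-to-back compositing rule; it illustrates the general technique only, not Varjo’s implementation.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class GaussianSplat:
    position: np.ndarray   # 3D centre (x, y, z)
    scale: np.ndarray      # per-axis extent of the Gaussian
    rotation: np.ndarray   # orientation quaternion (w, x, y, z)
    color: np.ndarray      # RGB, often view-dependent via spherical harmonics
    opacity: float         # 0..1

def composite_front_to_back(colors, alphas):
    """Alpha-blend depth-sorted splat contributions for a single pixel."""
    out = np.zeros(3)
    transmittance = 1.0
    for rgb, a in zip(colors, alphas):
        out += transmittance * a * np.asarray(rgb)
        transmittance *= 1.0 - a
        if transmittance < 1e-4:   # stop once the pixel is effectively opaque
            break
    return out
```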

The resulting 3D reality capture can be viewed and experienced with a range of devices, starting with Varjo headsets, other PC-connected OpenXR headsets, or Windows desktops.

Varjo’s reality reconstruction technology is designed to enable users to virtually visit and interact with remote locations in ‘great detail’. Varjo names training, mission planning, and remote assistance as potential application areas.

The Finnish National Opera and Ballet will be using Teleport in its operations, as Hannu Järvensivu, XR Stage Project Manager, explains: “Together with our XR Stage modelling tool, we expect it can improve the evaluation of new incoming rental productions significantly, as the digital twins of real-world sets can be investigated on the virtual stage in their authentic size and form, instead of trying to figure out their visual appearance and fit on stage only based on photos and CAD images.”

Varjo doesn’t specifically mention construction as an application area, but we expect there will be some use cases for capturing as-built conditions or issue resolution.

According to Varjo, it has tested captures from 5 m² to 1,000 m². A ‘large room’ (5 m x 5 m) would take a few minutes to capture and need about 500 photos. Larger spaces can be captured up to a limit of 2,000 photos.

Varjo is inviting users interested in trying out the technology to join a waitlist. The service is expected to become generally available later in 2024.

Sentio VR 2.0 simplifies client presentations
https://aecmag.com/vr-mr/sentio-vr-2-0-simplifies-client-presentations/
Sat, 13 Jul 2024

Latest release of the architectural VR solution introduces team collaboration around 360 panoramas, 1-click casting and offline storage.

Sentio VR 2.0, the latest release of the VR solution for architects, simplifies client communication by enabling multiple stakeholders to join a single session and collaborate around 360 panoramas.

Architects can guide clients through presentations using high-fidelity content from real-time visualisation software including Lumion, Twinmotion, V-Ray, Enscape, and D5 Render in the Meta Quest VR headset, without the need for a high-end GPU, and without requiring clients to learn how to use VR controls.

Sentio VR 2.0 also includes ‘1-click casting’ which allows users to stream their VR view to a web link in seconds, eliminating the need for Meta accounts or Wi-Fi configurations. According to the developers, it solves the problem of streaming to a wider audience in client meetings through a simple link, providing a high-resolution VR experience without any setup complications.

For 1-click casting, users open a 360 tour in the Sentio VR Meta Quest app, click ‘Broadcast tour’, note the PIN code, then enter the code at cast.sentiovr.com. The VR view is then automatically streamed to the web link in real time.

Another new feature is the ability to download projects to the VR headset for offline use with a single click, so architects can take their headsets to meetings, trade shows or job sites without having to worry about Wi-Fi or 4G/5G connectivity.

Sentio VR also supports real-time collaboration for fully navigable VR, where users can walk and teleport around a building. The software integrates directly with Revit and SketchUp via plug-ins, which offer ‘one-click’ export of the model to the cloud for fully automated conversion to VR.

Users can then use a standalone Meta Quest headset, enter a 6-digit code, and explore the model at 1:1 scale, teleporting as they go.

For real-time collaboration, there’s a guided mode for users who are new to VR, as well as a PC companion app to join or host meetings.

SentioVR is used by visualisation teams at firms including KPF, Grimshaw, and Arup.


Sentio VR Collaboration

AEC Magazine May / June 2024 Edition
https://aecmag.com/bim/aec-magazine-may-june-2024-edition/
Wed, 29 May 2024
Openness in AEC: we look beyond the interoperability agreement + lots, lots more

In our May / June 2024 edition of AEC Magazine we explore openness in AEC, preview our incredible NXT BLD and NXT DEV conferences, which take place in London on 25/26 June, and report on a new AI plug-in that generates Revit models from 2D plans, plus plenty more on BIM, digital fabrication, XR streaming, cyber crime and micro workstations.

It’s available to view now, free, along with all our back issues.

Subscribe to the digital edition free + all the latest AEC technology news in your inbox, or take out a print subscription for $49 per year (free to UK AEC professionals).



Cover story: towards open systems
We explore Autodesk’s new approach to openness and note that, with its recent Nemetschek announcement, things seem a little different

NXT BLD / NXT DEV event previews
At AEC Magazine’s annual events in London on 25-26 June you’ll not only see what the future holds for AEC technology but also have a say in how it unfolds

Skema: BIM workflow compression
Skema is one of a handful of new tools from design-oriented start-ups that is engineered to work with existing BIM software to shrink project timescales

Dassault Systèmes (DS) in AEC
A market leader in manufacturing, DS is developing a new generation of AEC tools, which aim to cross the chasm between digital design and digital manufacture

Twinview (digital twins)
We explore Space Group’s Twinview, one of the most advanced BIM digital twin offerings available today

Safeguarding contractors from Cyberattacks
It’s every construction firm’s biggest nightmare: criminals taking control of their data and holding them to ransom

Nvidia Omniverse spreads its wings
With new Cloud APIs, Nvidia is extending the reach of Omniverse beyond the core demographic of designers and artists

XR: streaming to a headset near you soon
All-in-one XR headsets have proved very popular for AEC design review. But for realism and complexity, 3D models must be processed externally, and pixels streamed in

Review: Scan micro workstation
This compact 8-litre workstation might not bring much new to the table in terms of chassis, but it’s hard not to take notice when the price is so aggressive

Review: Nvidia RTX 2000 Ada
This entry-level pro viz GPU is a great option for small workstations

Enterprise XR: streaming to a headset near you soon
https://aecmag.com/vr-mr/enterprise-xr-streaming-to-a-headset-near-you-soon/
Wed, 22 May 2024

All-in-one XR headsets have proved very popular for AEC design review. But to add realism and complexity, 3D models must be processed outside the headset, and pixels streamed in. Greg Corke reports on a growing number of XR technologies that do just that, from cloud services to local appliances

The first wave of VR headsets – the Oculus Rift and HTC Vive – were tethered, so all the graphics processing was done on a workstation with a powerful GPU. But cables can be restrictive, and external processing can add complexity and cost, which is why all-in-one headsets like the Meta Quest have proved so popular in architecture, engineering, and construction with tools like Arkio, Resolve, The Wild, and Autodesk Workshop XR.

But relying on a headset to process 3D graphics has its limitations – in terms of the size and complexity of models it can handle and, especially, the realism it can convey.

In AEC, this hasn’t proved such an issue for BIM-centric design/review, where simply rendered geometry is adequate for ironing out clashes and issues before breaking ground. But when more realism is required, perhaps to get a true feel for a space, refine materials and lighting, or simply to wow a client, visual fidelity, model scale and complexity become much more important. The ultimate goal of XR is to make the virtual indistinguishable from the real.

Graphics processing on XR headsets has been steadily improving over the years. The new Apple Vision Pro has a tonne of compute on the device, but it’s still not enough to handle huge models with millions of polygons.

The answer, according to Nvidia, is to offload the processing to a powerful workstation or datacentre GPU and stream the pixels to the headset. In March, the company introduced a new technology that allows developers to beam their applications and datasets with full RTX real-time physically based rendering directly into Apple Vision Pro with just an internet connection. It uses Nvidia Omniverse Cloud APIs to channel the data through the Nvidia Graphics Delivery Network (GDN), a global network of graphics-optimised data centres.

“We’re sending USD 3D data into the cloud, and we’re getting pixels back that stream into the Vision Pro,” explains Rev Lebaredian, vice president, Omniverse and simulation technology at Nvidia. “And working with Apple’s APIs on the Vision Pro itself, we adjust for the latency, because there’s a time between when we ask for the pixels, and it comes back.

“In order to have a really great immersive experience you need to have those pixels react to all of your movements, locally. If you turn your head, the image has to change immediately. You can’t wait for the network, so we compensate for that.”
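Conceptually, the compensation Lebaredian describes follows the standard remote-rendering pattern: ask the server to render for a predicted head pose, then locally warp (reproject) the returned frame against the latest tracked pose so head movement feels instant. The Python sketch below outlines that loop; the tracker, server and display objects and their methods are placeholders, not Nvidia CloudXR or visionOS APIs.

```python
def streaming_loop(tracker, server, display, latency_estimate_s=0.05):
    """Illustrative remote-render loop with local latency compensation (placeholder APIs)."""
    while display.is_running():
        # Render for where the head is *predicted* to be when the pixels arrive,
        # which hides most of the network round trip.
        predicted_pose = tracker.predict_pose(latency_estimate_s)
        frame = server.request_frame(predicted_pose)          # round trip to the remote GPU

        # The head has moved again by the time the frame lands, so warp the image
        # from the pose it was rendered at to the freshest tracked pose.
        latest_pose = tracker.get_head_pose()
        corrected = display.reproject(frame.image, frame.render_pose, latest_pose)
        display.present(corrected)

        # Keep the prediction horizon in step with the measured round-trip time.
        latency_estimate_s = 0.9 * latency_estimate_s + 0.1 * frame.round_trip_s
```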

Relying on a headset to process 3D graphics has its limitations, both in terms of the size and complexity of models it can handle, and the realism it can convey

Nvidia is one of many firms streaming XR content from the cloud for use by AEC and manufacturing firms. Hololight offers a range of services based around Hololight Hub, which it describes as an enterprise streaming platform for spatial computing. It can run on public cloud, private cloud, or on-premise.

The platform supports a variety of headsets – Magic Leap, HoloLens, Meta Quest 3, Lenovo ThinkReality VRX, as well as iOS tablets and Windows for desktop apps. For streaming, Hololight Hub can use Nvidia CloudXR or the bespoke Hololight Stream technology, which also supports AMD and Intel GPUs. To address latency from the cloud, Hololight has developed several mechanisms including ‘frame skipping and reprojection algorithms’.

The company recently launched an integration with Nvidia Omniverse and OpenUSD, which provides an environment for real-time 3D collaboration.


Apple Vision Pro
Using GPUs in the cloud, Nvidia can stream advanced 3D experiences to Apple Vision Pro, rather than relying solely on the headset’s compute

Innoactive is another specialist that offers XR streaming from the cloud. Through its Innoactive Portal, it works with a range of tools including Twinmotion, Enscape, VREX and Bentley Systems iTwin, as well as apps built on Unreal Engine and Unity.

More recently, the company combined its Innoactive Portal with Nvidia Omniverse and Cesium, allowing users to combine Universal Scene Description (OpenUSD) projects of buildings with contextual geospatial data and visualise both in real time.

OpenUSD projects can be streamed to a standalone VR headset, such as Meta Quest, HTC Vive XR Elite, Pico 4E or Lenovo ThinkReality VRX, or to a web-browser on an office PC. Innoactive Portal is available through AWS and can also be self-hosted.
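For readers new to OpenUSD, a ‘project’ in this context is ultimately a USD stage: a layered scene description that platforms such as Omniverse and Innoactive Portal can stream. The short Python sketch below creates a minimal stage using the pxr bindings that ship with USD; the geoLocation attribute is a made-up illustration of attaching contextual data, not a Cesium schema.

```python
from pxr import Usd, UsdGeom, Sdf, Gf

# Create a new stage (the .usda file is the human-readable scene description)
stage = Usd.Stage.CreateNew("building.usda")
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)

# A root transform for the building, with a placeholder mesh beneath it
building = UsdGeom.Xform.Define(stage, "/Building")
UsdGeom.Mesh.Define(stage, "/Building/Shell")

# Hypothetical custom attribute standing in for geospatial context (lat, lon)
geo = building.GetPrim().CreateAttribute("userProperties:geoLocation",
                                         Sdf.ValueTypeNames.Double2)
geo.Set(Gf.Vec2d(51.5072, -0.1276))

stage.GetRootLayer().Save()
```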

Extending the reach of XR

Thanks to cloud technology, BMW Group has dramatically extended the reach of XR within the automotive firm. With a bespoke platform hosted on AWS, using Nvidia GPUs for acceleration, and Nvidia CloudXR for streaming, over 200 BMW Group departments across the world, from design, engineering and production to sales and marketing, can now get near instant access to XR. BMW is using the platform for CAD visualisation, collaborative design review, and factory training.

BMW’s solution, the ‘3D AppStore’, provides easy access to content through a user-friendly web-based portal. Users simply select the headset they want to use (Meta Quest or Vive Focus), click the application and dataset, and it starts a GPU-accelerated instance on AWS.


Meta Quest
The Meta Quest is hugely popular in AEC but lacks the onboard processing to render large hyper-realistic 3D models

Local delivery

Of course, streaming doesn’t have to be from the cloud. AEC firms can use local workstations or servers equipped with one or more pro GPUs, such as the Nvidia RTX 6000 Ada Generation (48 GB), which are typically more powerful than those available in the public cloud. What’s more, keeping everything local has additional benefits. The closer the headset is to the compute, the lower the latency, which can dramatically improve the XR experience. At its recent GTC event, Nvidia gave a mixed reality demonstration where a realistic model of a BAC Mono road-legal sports car rendered in Autodesk VRED could be viewed alongside the same physical car.

Streaming doesn’t have to be from the cloud. AEC firms can use local workstations or servers equipped with one or more pro GPUs, such as the Nvidia RTX 6000 Ada, which are typically more powerful than those available in the public cloud

The Autodesk VRED model was being rendered on a local HP Z Workstation and streamed to an Nvidia CloudXR application running on an Android tablet with pass-through enabled. Nvidia also showed the potential for natural language interfaces for immersive experiences, controlling certain elements of VRED through voice commands via its API.

“VRED doesn’t have anything special that was built into it to support large language models, but it does have a Python interface,” explains Dave Weinstein, senior director of XR at Nvidia. “Using large language models, using some research code that we’ve developed, we’re able to simply talk to the application. The large language model translates that into commands that VRED speaks and through the Python interface, then issues those commands.”
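The pattern Weinstein describes can be sketched simply: a language model turns a spoken request into a small piece of script, which is then executed through the application’s Python interface after some basic validation. In the Python sketch below, send_to_llm and vred_execute are placeholder hooks, not real Nvidia or Autodesk VRED APIs.

```python
SYSTEM_PROMPT = (
    "Translate the user's request into a single Python call to a car configurator "
    "API exposing set_paint(name), set_trim(name) and set_wheels(name). "
    "Return only the call."
)

ALLOWED_CALLS = {"set_paint", "set_trim", "set_wheels"}

def handle_voice_command(transcript: str, send_to_llm, vred_execute) -> None:
    """Turn a spoken request into a whitelisted configurator call (placeholder plumbing)."""
    statement = send_to_llm(SYSTEM_PROMPT, transcript)   # e.g. 'set_paint("midnight blue")'

    # Never run model output blindly: accept only known, single-call statements.
    called = statement.split("(", 1)[0].strip()
    if called not in ALLOWED_CALLS:
        raise ValueError(f"Rejected generated command: {statement!r}")

    vred_execute(statement)   # forwarded over the application's Python interface
```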

Practical applications of this technology include a car configurator, where a customer could visualise the precise vehicle they want to buy in their own driveway – changing paint colours, trim and wheels, etc. Also in collaborative design review, voice control could give non-skilled users the ability to interact with the car and ask questions about it, hands free, when wearing an immersive VR headset.

Nvidia’s natural language research project is not tied to XR and could be used in other domains including Product Lifecycle Management (PLM), where the language model could be trained to know everything about the car, and present information. One could query a product, pull up a list of parts, etc. The same approach could also be applied to construction projects. It seems likely the technology will make its way into Nvidia Omniverse at some point.


Find many more articles like this in AEC Magazine

The collaborative package

Lenovo has an interesting take on collaborative XR for design review with its new Spatial Computing Appliance. The desktop or rack-mounted solution enables up to four users to collaborate on the same scene, at the same time, using a single workstation, streaming pixels with Nvidia CloudXR over a WiFi 6E network.

The ‘fully validated’ reference architecture comprises a Lenovo ThinkStation PX workstation, four Nvidia RTX 6000 Ada Generation GPUs, and four Lenovo ThinkReality VRX headsets. It supports Nvidia Omniverse Enterprise, Autodesk VRED, and other XR applications and workflows.

Lenovo’s Spatial Computing Appliance works by carving up the workstation into multiple Virtual Machines (VMs), each with its own dedicated GPU. Lenovo uses Proxmox, an open-source hypervisor, but will work with other hypervisors as well, such as VMware ESXi.

Four users is standard, but the appliance can support up to eight users when configured with eight single slot Nvidia RTX 4000 Ada Generation GPUs.

The Nvidia RTX 4000 Ada is nowhere near as powerful as the Nvidia RTX 6000 Ada and has less memory (20 GB vs 48 GB), so it won’t be able to handle the largest models at the highest visual fidelity.

However, it’s still very powerful compared to a lot of the GPUs offered in public cloud. BMW, for example, uses the equivalent of an Nvidia RTX 3060 Ti in its AWS G5 instances, which on paper is a fair bit slower than the Nvidia RTX 4000 Ada.

With eight VMs, firms may find they need more bandwidth to feed in data from a central server, Omniverse Nucleus, or data management system. To support this, a 25Gb Ethernet card can be added to the workstation’s ninth PCIe slot.

The Lenovo ThinkReality VRX is an enterprise-level all-in-one headset, offering 2,280 x 2,280 resolution per eye. The headset largely earns its enterprise credentials because it works with Lenovo’s ThinkReality MDM (mobile device management) software.

It means IT managers can manage the headsets in much the same way they do fleets of ThinkPad laptops. The headsets can be remotely updated with security, services and software, and their location tracked. The appliance will work with other VR headsets, but they won’t be compatible with the Lenovo ThinkReality MDM software.

Of course, these days IT can be flexible. No firm needs access to XR technology 24/7, so the ThinkStation PX can be reconfigured in many different ways – for rendering, AI training, simulation and more.

To help deploy the Spatial Computing Appliance, Lenovo is working with partners, including Innoactive. The XR streaming specialist is using the Lenovo technology for on-premise XR deployments using Nvidia CloudXR. At AEC Magazine’s NXT BLD event in June the company will be giving demonstrations of streaming VR from a ThinkStation PX to multiple users running tools including Enscape, Omniverse, and Bentley iTwin.

Physical attraction

Despite big advances in streaming technology, for the ultimate XR experience, headsets still need to be tethered. To deliver its hyper-realistic ‘human eye’ resolution experience the Varjo XR-4 must maintain stable and fast data transfer rates. This can only be achieved when physically connected to a workstation with USB C and DisplayPort cables.

For rendering you need an exceedingly powerful GPU like the Nvidia RTX 6000 Ada or Nvidia GeForce RTX 4090, which can help deliver photorealism through real time ray tracing.

But that doesn’t necessarily mean you’ll always need cables. “We’re still working on that on Varjo,” said Greg Jones, director of XR business development at Nvidia, at Nvidia GTC recently.


Main image: Lenovo’s Spatial Computing Appliance enables multiple users to collaborate on the same scene, at the same time, using a single workstation.

AEC Magazine March / April 2024 Edition
https://aecmag.com/technology/aec-magazine-march-april-2024-edition/
Mon, 08 Apr 2024
Autonomous drawings and the race to eliminate one of the AEC sector’s biggest bottlenecks

In our spring 2024 edition we delve deep into a future where drawings are fully automated, look at a new approach to building performance analysis, report on a new massing tool for architects, plus plenty more on acoustic design, reality capture, workstations, modern methods of construction, and laptop processors

It’s available to view now, free, along with all our back issues.

Subscribe to the digital edition free + all the latest AEC technology news in your inbox, or take out a print subscription for $49 per year (free to UK AEC professionals).



The dawn of auto-drawings
Several CAD software firms are making real progress in drawing automation in the race to eliminate one of the AEC sector’s biggest bottlenecks.

Enscape: building performance analysis
Enscape is to get a new module, powered by IES technology, that gives instant visual feedback on building performance.

TestFit runs free
The Texas-based design automation software developer releases a free massing tool for architects.

NXT BLD / DEV 2024
AI, automation, digital fabrication, BIM 2.0, data specifications, open source, and lots, lots more at AEC Magazine’s London conferences

Industry news
AEC technologies emerge for Apple Vision Pro, Unreal Engine and Twinmotion get new licensing, Alice uses AI to optimise Primavera P6 schedules, plus lots more

Autodesk to take over VAR payments
New changes to the Autodesk business model could be set to diminish the role of the CAD reseller.

Workstation news
Intel Core Ultra laptop processors, Nvidia Ada Generation RTX GPUs for CAD, plus new workstations from HP and Dell

Prime time for iGPU
Laptop processors with integrated GPUs are now powerful enough for 3D CAD. Does this mean a cheaper, slimmer future?

Enscape and V-Ray: a collaborative future
Chaos has big plans to enhance workflows between Enscape and V-Ray, boost real time collaboration, and more.

Smart reality capture
A new integrated reality capture solution from Looq uses computer vision, AI and a proprietary handheld camera with GPS, to capture infrastructure at scale.

Treble: sound advice
New software helps analyse and optimise designs for acoustic performance.

Informed Design
Autodesk connects BIM (Revit) with fabrication (Inventor) via the cloud to support modern methods of construction.

Scaling-up on-site digital construction
Facit Homes brings new hope to housebuilding by digitising on-site fabrication.

D5 Render 2.7 launches for real-time visualisation
https://aecmag.com/visualisation/d5-render-2-7-launches-for-real-time-visualisation/
Fri, 26 Apr 2024

New release introduces new AI and Procedural Content Generation capabilities, along with enhanced Global Illumination (GI)

D5 Render 2.7, the latest release of the architecture-focused real-time rendering software, has launched today. Of the 35 updates and optimisations, many are driven by AI and PCG (Procedural Content Generation), the company states.

D5 Scatter is a new PCG vegetation scatter tool designed to help generate realistic landscapes more easily and quickly. Surfaces can be divided into multiple areas, each accommodating different plant species drawn from the D5 Asset Library of more than 4,000 assets.

Scattered plant models support individual and batch adjustments of density, proportion, scale, and orientation for various effects. The software also offers new ‘lush and realistic’ grass materials for creating ‘vibrant and visually appealing’ lawns.
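Scatter tools of this kind boil down to simple procedural placement: sample points on a surface at a target density, then jitter scale and orientation per instance and pick an asset for each area. The Python sketch below shows that generic idea on a flat rectangle; it is not D5’s algorithm, and all parameters are illustrative.

```python
import numpy as np

def scatter_instances(area_w, area_d, density_per_m2, scale_range=(0.8, 1.2), seed=0):
    """Place plant instances on a w x d rectangle with jittered scale and rotation."""
    rng = np.random.default_rng(seed)
    count = int(area_w * area_d * density_per_m2)
    instances = []
    for _ in range(count):
        instances.append({
            "position": (rng.uniform(0, area_w), rng.uniform(0, area_d), 0.0),
            "rotation_z": rng.uniform(0.0, 360.0),              # random heading, degrees
            "scale": rng.uniform(*scale_range),                  # per-instance size variation
            "species": rng.choice(["grass", "fern", "shrub"]),   # asset pick for this area
        })
    return instances

# Example: a 10 m x 6 m lawn area at four plants per square metre
lawn = scatter_instances(area_w=10.0, area_d=6.0, density_per_m2=4.0)
```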

AI Atmosphere Match, first introduced in D5 Render version 2.6 to help designers generate sky, natural lighting, and post-production effects from reference images, has been further refined to produce more precise matching results for both exterior and interior scenes.

AI Ultra HD Texture integrates AI Super Resolution to automatically upgrade the resolution of grainy textures up to 4K. According to the company, it reduces noise and imperfections while preserving high-frequency texture details.

‘Make Seamless’ incorporates AI inpainting to remove seams between base colour map textures for a ‘consistent and natural’ effect. Meanwhile, ‘Text to 3D’ transforms text input into 3D assets in realistic, cartoon or low-poly styles.

D5 Render 2.7 also includes enhancements to its global illumination (GI) algorithms. According to the company this brings the software one step closer to achieving the quality of offline rendering.

Interoperability has also been enhanced with new ‘LiveSync’ for 3ds Max 2025 and SketchUp 2024, and there’s beta support for the 3Dconnexion SpaceMouse.

Other features include accelerated rendering speed for both images and videos, with an increase of ‘over 45%’, higher-quality VR with more realistic lighting, and a range of new assets, including walking characters, grouped characters, and a selection of new PBR materials.

D5 Render works with SketchUp, 3ds Max, Revit, Cinema 4D, Archicad, Rhino and Blender. Prices start at $30 per user per month for D5 Render Pro for professionals. There’s a free version – D5 Render Community – for beginners and individuals.


Nvidia streams colossal 3D models into Apple Vision Pro
https://aecmag.com/vr-mr/nvidia-streams-colossal-3d-models-into-apple-vision-pro/
Mon, 18 Mar 2024

Omniverse Cloud APIs let developers stream interactive ‘digital twins’ into the mixed reality headset

Nvidia has introduced a new service that allows firms to stream interactive Universal Scene Description (OpenUSD) industrial scenes from 3D applications into the Apple Vision Pro mixed reality headset.

The technology makes use of Nvidia’s new Omniverse Cloud APIs and a new framework that channels the data through the Nvidia Graphics Delivery Network (GDN), a global network of graphics-optimised data centres.

“Traditional spatial workflows require developers to decimate their datasets – in essence, to gamify them. This doesn’t work for industrial workflows where engineering and simulation datasets for products, factories and cities are massive,” said Rev Lebaredian, VP of Omniverse and simulation technology at Nvidia.

“New Omniverse Cloud APIs let developers beam their applications and datasets with full RTX real-time physically-based rendering directly into Vision Pro with just an internet connection.”

In a demo unveiled today at Nvidia GTC, the company presented an interactive, physically accurate digital twin of a car streamed in full fidelity to Apple Vision Pro’s high-resolution displays.

The demo featured a designer wearing the Vision Pro and using a car configurator application developed by CGI studio Katana on the Omniverse platform. The designer toggled through paint and trim options and even entered the vehicle, blending photorealistic 3D environments with the physical world.

“The breakthrough ultra-high-resolution displays of Apple Vision Pro, combined with photorealistic rendering of OpenUSD content streamed from Nvidia accelerated computing, unlocks an incredible opportunity for the advancement of immersive experiences,” said Mike Rockwell, VP of the Vision Products Group at Apple. “Spatial computing will redefine how designers and developers build captivating digital content, driving a new era of creativity and engagement.”


Find this article plus many more in the March / April 2024 Edition of AEC Magazine
Graphisoft BIMx available for Apple Vision Pro
https://aecmag.com/vr-mr/graphisoft-bimx-available-for-apple-vision-pro/
Wed, 28 Feb 2024

BIM presentation / collaboration tool now available on mixed reality headset

Hot on the heels of the launch of Apple’s long-awaited mixed reality headset, Graphisoft has released BIMx – BIM Experience for Apple Vision Pro, an interactive app for exploring BIM projects and linked documentation sets created in Archicad and DDScad.

The first version of BIMx on Apple’s visionOS supports full immersion mode in 3D. However, Graphisoft explains that navigation in the virtual model is experimental and will be fine-tuned.

“We’re actively working on solutions to bring you the immersive experience you crave. We have big plans for its evolution, with more features and enhancements scheduled for future updates,” wrote Graphisoft’s Emoke Csikos in a recent blog post.

Meanwhile, Resolve, the AEC-focused design review solution, is also available for the Apple Vision Pro.

Resolve is a collaborative VR tool that is well established on the Oculus Quest VR headset. It includes native integrations with Autodesk Construction Cloud (ACC) and Procore, where models are automatically kept up to date in VR without having to export or upload with each design change.

According to Resolve CEO Angel Say, one of the key benefits of Apple Vision Pro in BIM and AEC workflows is its ability to ‘supercharge multi-dimensional multitasking’.

In other words, viewing crisp 2D documents in a 3D spatial environment thanks to the headset’s incredibly high-resolution display.
