The concept of digital twins originated in the early 2000s, though researchers had laid the foundations decades earlier. Dr Michael Grieves, then at the University of Michigan, is credited with authoring the first formal definition of the digital twin in 2002, developed within the context of Product Lifecycle Management (PLM). Although Grieves’s definition marks the concept’s formal origin, NASA had pioneered a precursor during the Apollo programme, using physical spacecraft replicas for real-time troubleshooting and simulation. These principles formed the basis for defining virtual models as digital twins.
The initial application of digital twins was within high-stakes sectors like aerospace and manufacturing. Their foundational application involved the creation of a virtual representation of physical assets to simulate desired metrics, enabling performance analysis, design optimisation, and predictive maintenance. These early applications offered transformative potential to these industries, enhancing project efficiency and decision-making processes.
Industries that prioritise safety, precision, and efficiency have established the crucial role of digital twins. The primary area of focus, in collaboration with NASA, was aerospace simulation: modelling spacecraft and aircraft performance under extreme environmental conditions. In manufacturing, digital twins support product design, optimisation, and predictive maintenance, with reported gains of a 6.01% increase in production efficiency and an 87.56% reduction in downtime, while in the automotive sector, vehicles are tested under simulated conditions.
The proliferation of digital twins across various sectors is a direct result of technological progress. The Internet of Things (IoT) facilitated real-time data acquisition from sensors integrated within physical assets, concurrently with cloud computing, enabling the storage and processing of extensive datasets. The incorporation of artificial intelligence (AI) and machine learning algorithms significantly improved the analysis of this data, enhancing the predictive and adaptive capabilities of digital twins. Moreover, the utilisation of high-performance computing facilitated the execution of intricate simulations, thereby substantially enhancing the functionalities of digital twins.
Digital twins are revolutionising numerous UK industries; however, their widespread adoption is hindered by challenges arising from the intersection of innovation and regulation. A significant challenge lies in data integration; the unification of disparate data sources and formats into a functional, actionable model remains a complex undertaking, particularly when considering COBie and its interpretation in the context of company-specific taxonomies and data methodologies. The cost of developing and maintaining digital twin infrastructure presents a significant barrier to entry, particularly for small and medium-sized enterprises (SMEs). Another significant factor is the absence of universal standards for data formats and methodologies, thereby hindering interoperability and adoption.
Despite the challenges digital twins face in UK markets, they have a promising future, driven by emerging trends and increasing adoption across sectors such as agriculture and retail. This aligns with the UK’s commitment to integrating virtual infrastructure, offering opportunities to engage stakeholders within these virtual environments. Furthermore, standardisation efforts, exemplified by BS ISO/IEC 30173:2023, provide a robust framework to maintain strategic consistency and facilitate interpretation in digital twin implementations, thereby advancing adoption and engagement across the UK.
The successful development and implementation of digital twins hinges on a robust standardisation framework that guarantees multidisciplinary consistency, scalability, and security. BS ISO/IEC 30173:2023, published by the British Standards Institution, provides a comprehensive framework for a shared understanding of digital twins across industries. It achieves this through precise definition and standardisation of terminology, eliminating ambiguity and fostering stakeholder alignment. The standard outlines the fundamental components of a digital twin for physical assets or systems, encompassing the virtual representation and the data link that enables real-time interaction.
In addition to presenting essential foundational principles, it underscores the significance of interoperability to guarantee the integration of digital twins with IoT systems, cloud platforms, virtual infrastructures, and cutting-edge technologies. The standard highlights the scalability and adaptability of the digital twin ecosystem for diverse applications in different sectors, in line with global frameworks, such as ISO/IEC TR 30172:2023. This foundational framework ensures compatibility across sectors and regions, promoting consistency and cross-pollination to future-proof digital twin strategies.
Tandem in practice
Digital twins are revolutionising the design, construction, and management of both digital and physical assets, and Autodesk Tandem exemplifies leadership in this transformative technology. Fundamentally, Autodesk Tandem offers a digital twin platform that transforms BIM data into a dynamic, interactive, and visually accessible live representation of real-world assets, improving operational efficiency and collaboration between stakeholders. For Parametrix, Tandem’s seamless integration with pre-existing workflows, its capacity to bridge the gap between design and operation, and its clear presentation of data to stakeholders made it an intuitive software choice.
Several factors contributed to the selection of Autodesk Tandem, including:
· Ecosystem Integration: Tandem’s compatibility with the Autodesk ecosystem, including Revit, Navisworks, and Autodesk Construction Cloud (ACC), facilitates a seamless transition from design to digital twin.
· Data-Driven Decision Making: The platform distinguishes itself through its capacity to integrate real-world sensors and deliver real-time actionable intelligence.
· Flexibility and Scale: Tandem’s adaptable and scalable platform accommodates diverse project requirements, from individual buildings to extensive asset portfolios.
At Parametrix, we consistently seek innovative tools to optimise our processes and improve client deliverables. Upon first implementing Autodesk Tandem, its transformative potential for digital twins became immediately apparent. Our considerable experience with BIM software, including Revit and Navisworks, eased the adoption of Tandem to integrate design and operational processes. This blog post details our progress, from model uploads to sensor integration and Uniclass-based classification, and underscores Autodesk Tandem’s transformative potential.
Starting the process
Despite Tandem’s alignment with other Autodesk software, adopting it involved a learning curve for new users and required workflow modifications for seamless integration. When starting out in Tandem, the Facilities tab serves as the repository for all projects; essential details such as the name, template, project name, and address must be provided before advancing.
Subsequently, a model must be uploaded. This can be accomplished in two ways: by selecting the required file from your local drive or, preferably, by exporting uploaded project files via Autodesk Construction Cloud (ACC). The ACC route ensures a smooth workflow and a seamless transition between the design environment and Tandem.
The Tandem application’s “Manage” tab is a crucial element of its data-driven functionality. This section offers comprehensive functionalities for template and parameter creation and modification, class definition, and project-level access control. Parameter definition for all generated files is mandatory to ensure seamless operation and adherence to project specifications.
To get the most out of Tandem from the outset, a carefully designed template should be created in the Manage tab, with parameters specified to meet project requirements. This template provides a foundational framework, establishing a standardised data format for consistent cross-project implementation. Template creation entails:
· Define Key Parameters: We began by identifying the critical attributes and metadata required for our digital twin, such as asset categories, maintenance schedules, and warranty details.
· Standardised Nomenclature: Consistent naming conventions improved data entry accuracy and collaborative efforts.
· Parameter Defaulting: Preset parameters, including projected lifecycles and performance metrics, were incorporated to expedite new asset onboarding.
· Uniclass Classification: Uniclass codes were integrated into the template to ensure standardisation across all projects.
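The template structure described above can be sketched in code. This is an illustrative model only: Tandem templates are built interactively in the Manage tab, not programmatically, and the field names, defaults, and the Uniclass code below are hypothetical assumptions rather than Tandem’s actual schema.

```python
from dataclasses import dataclass

# Illustrative sketch only -- field names and defaults are assumptions,
# mirroring the parameters described in the text, not Tandem's schema.
@dataclass
class AssetTemplate:
    asset_category: str
    uniclass_code: str                    # a Uniclass 2015 code (hypothetical example below)
    projected_lifecycle_years: int = 25   # preset default to speed new-asset onboarding
    maintenance_schedule: str = "annual"  # preset default
    warranty_details: str = ""

# A consistently named asset entry, as the template would enforce:
ahu = AssetTemplate(
    asset_category="Air handling unit",
    uniclass_code="Ss_65_40",  # hypothetical code, for illustration only
)
print(ahu.projected_lifecycle_years)  # defaults applied automatically → 25
```

The value of presetting defaults is visible here: a new asset inherits lifecycle and maintenance values without manual entry, which is exactly what speeds onboarding across projects.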
Integrating Sensors from Disruptive Technologies
Our next step was to connect real-world sensors to the digital twin. Using Disruptive Technologies (DT) sensors, we monitored temperature, humidity, and occupancy data. The integration process was surprisingly seamless.
What is Disruptive Technologies?
Disruptive Technologies is a leading provider of miniature wireless sensors engineered for real-time monitoring and data acquisition. The sensors’ compact dimensions, ease of deployment, and extended battery life make them suitable for diverse applications in the built environment. DT sensors collect environmental data accurately and feed it into digital twins, improving operational efficiency.
Why Use Disruptive Technologies?
· Simple Installation: Compact and wireless, DT sensors are easy to deploy, even in hard-to-reach areas.
· Scalable: They can be used across multiple assets with minimal infrastructure changes.
· Reliable Performance: Offering accurate, real-time data with up to 15 years of battery life.
· Straightforward Integration: Open APIs and data connectors expose sensor data streams, making the implementation process quite simple.
Here’s how we integrated DT sensors:
In Tandem, the first step is to establish a connection via the Connections tab within the project interface. Before proceeding, a name and a classification must be assigned to the entry, which is done through the Manage tab. Once the connection is established, the parameters associated with the chosen classification appear on the right-hand side of the interface.
With the parameters and connection defined, the next stage takes place in the Disruptive Technologies dashboard. In its Sensors and Cloud Connections section, you can access and manage any previously configured sensors, for example those monitoring temperature in the kitchen and office areas.
To integrate these sensors, navigate to the API Integrations tab and then the Data Connectors tab within the main dashboard. Create a new data connector and link it to the sensors listed previously. Then return to Tandem, locate the URL associated with the connection you established earlier, and paste that URL into the data connector’s endpoint field.
This process ensures seamless integration of sensor data with Tandem, enabling efficient data flow and management tailored to project needs.
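To make the hand-off concrete, the sketch below parses the kind of JSON event a DT Data Connector POSTs to an endpoint and flattens it into a simple record. The event shape follows Disruptive Technologies’ documented JSON events; the flattened output format, device names, and project IDs are assumptions for illustration, since Tandem’s actual ingestion mapping is configured per connection rather than written by hand.

```python
import json

def flatten_dt_event(raw: str) -> dict:
    """Pull the fields a digital twin stream typically needs
    out of a Data Connector POST body."""
    event = json.loads(raw)["event"]
    event_type = event["eventType"]          # e.g. "temperature"
    reading = event["data"][event_type]      # the typed payload for that event
    return {
        "device": event["targetName"].rsplit("/", 1)[-1],
        "type": event_type,
        "value": reading["value"],
        "time": reading["updateTime"],
    }

# Hypothetical sample event -- project and device IDs are placeholders.
sample = json.dumps({
    "event": {
        "eventId": "evt-1",
        "targetName": "projects/p1/devices/kitchen-temp",
        "eventType": "temperature",
        "data": {"temperature": {"value": 21.4,
                                 "updateTime": "2024-05-01T09:00:00Z"}},
    }
})
print(flatten_dt_event(sample))
```

Keying the lookup on `eventType` means the same function handles temperature, humidity, and occupancy events without per-sensor code, which is what keeps the connector configuration simple.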
A thorough assessment of Autodesk Tandem has provided Parametrix with substantial insight into the platform’s capabilities and its potential to enhance digital project delivery. Evaluation of Tandem revealed substantial advantages, such as effortless integration, enhanced collaboration, and superior data functionalities; however, several challenges were also identified.
Autodesk Tandem’s integration with Revit, Navisworks, and ACC has yielded significant improvements in workflow efficiency, data integration, and project continuity. The platform’s user-friendly design promotes seamless cross-team usability, accommodating team members of diverse technical proficiency and thereby fostering collaboration and efficiency. Tandem’s integration of real-time data with static BIM models has revolutionised decision-making, facilitating predictive maintenance, enhanced operational efficiency, and the development of robust, intelligent solutions. Its integration with smart building technologies and sustainable practices ensures future adaptability, offering scalability to accommodate diverse project needs, from individual facilities to extensive portfolios.
Nevertheless, our experience has also revealed areas in which the platform could be enhanced. Although Tandem offers a user-friendly basic interface, proficiency in advanced features, including API integrations and intricate workflows, demands substantial time investment and technical skill. Customisation options are somewhat limited, hindering the software’s adaptability to diverse workflows and project requirements and necessitating workarounds. Furthermore, Tandem’s position within the Autodesk ecosystem, while beneficial, may limit interoperability with non-Autodesk platforms for some teams.
In projects utilising substantial datasets or extensive sensor integration, Tandem’s data management and visualisation capabilities may prove cumbersome; enhanced scalability would therefore improve usability. Finally, Tandem’s dependence on cloud connectivity limits its offline capabilities, presenting challenges for teams in remote or low-connectivity regions.