Digital twins for metrology; metrology for digital twins

Digital twinning is a rapidly growing area of research. Digital twins combine models and data to provide up-to-date information about the state of a system. They support reliable decision-making in fields such as structural monitoring and advanced manufacturing. The use of metrology data to update models in this way offers benefits in many areas, including metrology itself. The recent activities in digitalisation of metrology offer a great opportunity to make metrology data ‘twin-friendly’ and to incorporate digital twins into metrological processes. This paper discusses key features of digital twins that will inform their use in metrology and measurement, highlights the links between digital twins and virtual metrology, outlines what use metrology can make of digital twins and how metrology and measured data can support the use of digital twins, and suggests potential future developments that will maximise the benefits achieved.


Introduction
Digital twinning is still a rapidly growing area of research. Whilst the detailed definition of a digital twin is still in flux (a topic discussed in more detail below), it is widely accepted that a digital twin involves a real world object, a model of some aspect or behaviour of that object, and data gathered on the object and used to ensure that the model is a good representation of the current state and behaviour of the object [1,2]. For the majority of digital twins, some of the data collected will be gathered by measurement. Digital twins are therefore of potential interest to metrology as a useful tool, but metrology is also of vital importance to digital twins because good metrological practice ensures that the data used in digital twins is reliable.
Digital twins are useful because they provide the opportunity to understand the current state of an object or a system based on recently gathered data. This understanding leads to significant benefits in areas such as structural health monitoring [3,4] and advanced manufacturing [5], where the use of up-to-date data to estimate the degree of damage in a structure or the extent of wear in a tool can enable a predictive maintenance approach that minimises downtime and focusses resources on areas where damage has been detected. Digital twins are also useful for high value, low volume manufacturing [6] when considering complex assemblies of multiple parts. Based on characterisation of the performance of individual parts, the digital twin can help to design an assembly approach and can ensure that the overall performance goals of the assembled product are met.
It is clear that many applications of digital twins will use metrology data, and that some of the benefits of a digital twin may be of use to metrology and measurement scientists, particularly as the widening use of digital technologies in measurement leads to opportunities for remote monitoring of field experiments and remote calibration. This paper aims to explore two topics in more depth: what use can metrology make of digital twins, and how can metrology support the use of digital twins. In order to answer these questions, the paper describes the variety of digital twins, discusses where digital twins bring most benefit, and identifies possible metrological uses of twins, both within national metrology institutes (NMIs) and designated institutes (DIs) and beyond. The paper finishes by summarising the ways in which metrology can support development and usage of digital twins.

Digital twin definitions
It was noted above that the definition of a digital twin is still under discussion, but that the basic components of an object, a model, and data flow linking the model and the object are largely agreed. The 'twin' part of the name suggests that the real and virtual objects should be identical when first developed and should maintain a high degree of resemblance over time. Since the real world object will change over time, this means that the model (the virtual object) must be updated to continue the resemblance. One of the earliest usages of the term, by NASA [1], proposes the concept as a method to improve certification and fleet management in the aerospace sector, based on the use of high-fidelity multi-scale physics-based models to continuously monitor the health of the vehicle or system and forecast its remaining useful life. Some authors [7] require the data flow to be two-directional so that the model can affect the state of the physical object as well as vice versa; this idea is discussed in more detail below.
Some authors use 'model' in a broader sense than the original definition [8] implies, by including conceptual and data models and allowing a digital twin to be an appropriately structured and linked data store containing all available information about an object. Whilst this approach is in some ways broader than the NASA definition, it lacks the forecasting capability that is the main benefit of the use of a digital twin in many contexts. This type of 'digital twin', however, would be a useful resource for construction of forecasting models, particularly if similar twins of this type were available for multiple instances of nominally identical objects.
Other authors [9] regard 'model', and in some cases 'digital twin', as meaning 'photo-realistic representation' without requiring any representation of the physical behaviour of the system or any updating of its state: this definition is particularly common in virtual test environments such as those used for simulated testing of autonomous vehicles [10]. In practice, such environments are snapshots of a place and whilst they can provide photo- and physically realistic environments for virtual testing, they do not reflect the current state of the environment on which they are based: aspects such as weather, time of day and traffic conditions are set by the user rather than reflecting reality.
The definition used in this paper is that discussed in [2], which uses data from a real object to update one or more aspects of a model of the real object that changes over time. The reverse data flow, from the model to the real object, is not mandatory, but it is useful to consider what this flow may look like in practice. The majority of digital twins are developed with the intention of providing a specific piece of information that feeds into a decision-making process. For instance, a digital twin of a machine used in manufacturing may be used to analyse and predict tool wear so that a decision about when to replace the tool can be made; a digital twin of the factory that the machine sits in may be used to analyse current orders, energy use and workflows to plan scheduling and shift patterns to minimise cost. In both cases information from the digital twin is used to make a change to the real world object, and so the flow of information and effect goes from the model to the real world. However, a human will generally be in the loop and the decision and associated action are not usually automated, unlike the typical flow of data from the real world into the model.
It is also interesting to note that there are researchers who use exactly the type of approach defined as 'digital twins' in this paper but do not often refer to them as such. In particular, meteorologists use weather stations to collect data on temperature, humidity, air pressure and wind speed, and use these measurements to update the initial conditions of physics-based models that are run to produce weather forecasts. The significant body of related research in this area is a potentially rich source of information useful to the development of digital twinning methods. The area is particularly strong on model updating methods such as data assimilation [11].

Details of digital twins
The aspects of the model that change over time affect the details and usage of the digital twin. The changing aspects can be considered as falling into two broad categories: change that is internal to the object and change that is external to the object. Internal changes to the object are caused by effects such as wear, damage, and fatigue, leading to a (typically irreversible) change in the physical state of the object. Such effects lead to a degradation of the ability of the object to perform its required function, and hence drive maintenance and replacement schedules. These effects are often challenging to measure directly in a non-destructive way, but can be parameterised within a model and estimated using an inverse approach to match the model output to the measured data.
External changes are typically changes in the loading or usage of the object. In many cases these quantities are unknown (and unknowable) when the digital twin is in development. For instance, a digital twin of a city may take in traffic information to predict air quality in real time, and whilst bounds on road capacity can be estimated, the details of road use cannot be known ahead of time. This type of change can typically be measured directly and applied to the model as initial, boundary or loading conditions, making the model updating process simpler. It should be noted that a single digital twin may include both types of change: a digital twin of a wind turbine may take the current measured value of wind speed (an external source of change) as an input and may generate an estimate of wear parameters of its bearings (an internal source of change) as an output.
The greatest value of most digital twins comes from their ability to inform decisions (and thus close the flow of information loop) based on the current state of the system of interest. This 'real time' aspect is often stressed when the benefits of digital twins are discussed, but the concept of 'real time' is worth exploring in more depth. There are several different time scales associated with a digital twin, some of which contribute to what 'real time' means for a given application:
1. The time scale over which significant change is occurring.
2. The time interval between consecutive collection of data points.
3. The time taken to run and update the model when new data become available.
4. The time interval over which the decision and the associated action take place.
'Significant change' in point 1 can be taken to mean 'change that would affect the decision being made using the twin'. Note also that these time scales may not be constant: for instance, damage due to wear often accelerates as damage progresses, so the time scale may grow shorter.
In order for the digital twin to be of practical use, the following relationships between these time scales must hold:
• The data collection must be fast enough to capture the changing behaviour.
• The model updating must be frequent enough to identify that significant change has occurred.
• The decision and action time scale must be short enough that action can be taken before the change in the object reaches some critical value.
• The data collection and model updating time scales must be short enough that the decision made and the action can be carried out before the change in the object reaches some critical value.
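These ordering constraints can be captured in a small sketch. The class below is purely illustrative (the field names, units and numerical values are not from the paper): it records the four characteristic time scales and checks whether data collection, model updating and action all fit within the time scale of significant change.

```python
from dataclasses import dataclass

@dataclass
class TwinTimescales:
    """Characteristic time scales of a digital twin (illustrative units: hours)."""
    change: float        # time scale over which significant change occurs
    sampling: float      # interval between consecutive data collections
    model_update: float  # time to rerun/update the model on new data
    act: float           # time to make the decision and carry out the action

    def is_viable(self) -> bool:
        """Check the ordering constraints listed above."""
        return (self.sampling < self.change
                and self.model_update < self.change
                and self.sampling + self.model_update + self.act < self.change)

# Hypothetical tool-wear twin: wear becomes critical over roughly 100 h
wear_twin = TwinTimescales(change=100.0, sampling=1.0, model_update=0.5, act=24.0)
print(wear_twin.is_viable())  # True: data, model and action all fit inside the wear time scale
```

A twin whose model takes longer to update than the change it is tracking would fail this check, which is one motivation for the surrogate and reduced-order models discussed below.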
These restrictions on the time scales most commonly affect either the type of model that can be used or the action that can be taken following the decision. The requirement for models to be updatable rapidly has led to a growing interest in surrogate and reduced-order models [12]. The restrictions on the decision and action time scale are most likely to affect the action. If the aim of the action is to stop the output used in decision making from reaching some critical value, the user may be able to choose between a planned intervention or a rapid response. The planned intervention gives a longer time scale to act but increases the likelihood of acting unnecessarily, since the uncertainties associated with long-term forecasts are always higher than those associated with short-term forecasts. The rapid response reduces the likelihood of unnecessary intervention but runs the risk of acting too late.
As an example, consider a digital twin for maintenance which quantifies a damage level such that a higher level of damage indicates a shorter expected remaining life. If the decision-maker is prepared to accept 'turn the machine off and replace the part immediately' as an action, they could set a high critical level of the damage parameter; if unplanned stoppage of the machine is undesirable, then the user could set a lower critical value and schedule the maintenance.
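The trade-off described above amounts to choosing thresholds on the damage parameter. A minimal sketch, with entirely hypothetical threshold values:

```python
def maintenance_action(damage: float,
                       plan_threshold: float = 0.6,
                       stop_threshold: float = 0.9) -> str:
    """Map a twin-estimated damage level (0 = pristine, 1 = failed) to an action.
    The thresholds are illustrative, not values recommended by the paper."""
    if damage >= stop_threshold:
        return "stop and replace immediately"
    if damage >= plan_threshold:
        return "schedule planned maintenance"
    return "continue operation"

print(maintenance_action(0.4))   # continue operation
print(maintenance_action(0.75))  # schedule planned maintenance
print(maintenance_action(0.95))  # stop and replace immediately
```

Lowering `plan_threshold` corresponds to the planned-intervention strategy (more lead time, more risk of unnecessary action); relying only on `stop_threshold` corresponds to the rapid-response strategy.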

Virtual metrology: distinct from digital twins
Many digital twins are based on physics-based models. The use of physics-based models for product design has been standard practice in sectors such as aerospace, automotive, and civil engineering for decades. Similarly, the use of such methods in design of equipment and experiments and for analysis of experimental results within metrology has been happening for well over 20 yr [13,14]. The use of these approaches to accurately reproduce the steps involved in a measurement procedure, in a 'virtual metrology experiment', can bring significant benefit even if it does not sit within a digital twin. A recent strategic research agenda [15], created jointly between mathematical and statistical experts in a number of European NMIs (the Mathmet European Metrology Network), highlighted virtual metrology as a key topic, described the current state of the art and identified key areas for future research. It is also worth noting that 'virtual metrology' is already a well-established technology in the semiconductor industry [16].
The varying definition of the term 'digital twin' means that some papers that use the term could more accurately use 'virtual metrology'. An excellent example is given in [17], where a kinematic rigid body model of a coordinate measuring machine (CMM) is used to evaluate the uncertainties associated with some of the geometrical and optical aspects of its operation. These aspects are believed to dominate the uncertainties for many devices. The paper validates the model by comparing the variation in values obtained over the course of 100 measurements of the same artefact with the variation in values obtained over 100 virtual measurements of an equivalent virtual artefact. The two sets of values covered the same range; the standard deviation of the 100 measurements was 0.645 µm and that of the simulated data was 0.684 µm. Another example is in [18], where a useful description is given of the building blocks needed to construct a virtual metrology experiment that is traceable and consistent with best practice in uncertainty evaluation. In both cases there is no flow of data from the real world to the experiment and no updating of the model, and hence no digital twinning.
A further example [9] uses a computer-aided design (CAD) model of a product and a virtual CMM to define a measurement strategy to measure various geometric features of the product, and exports the strategy in a form that can be read and acted on by an actual CMM, so that the virtual and real CMMs carry out exactly the same steps and the measurements reported within the visualisation are therefore comparable without further processing. The results are then displayed on the CAD model for ease of interpretation.

Current uses of digital twins
An online search for papers on digital twins and metrology suggests that the majority of papers have a strong focus on industrial metrology rather than using digital twins to support or advance fundamental metrology research. Perhaps the most active area is the use of digital approaches for dimensional metrology, and particularly coordinate, surface, and freeform metrology. Examples include:
• Development of a metrology and inspection approach to support digital twins of aircraft parts [19]
• Use of a digital twin of an optical micro-coordinate measurement machine for efficient and safe measurement path planning [20]
• Reduction of the required measurement time in structured light metrology by using a digital twin to combine measured and simulated data to estimate the minimal required exposure time [21]
• Use of accurate freeform metrology to provide an accurate digital twin for individual instances of a freeform calibration artefact [22]
as well as work described elsewhere in this paper [9,17,18]. As was mentioned in the previous section, some of these papers could be regarded as virtual metrology rather than true digital twins. Whilst the papers concerned do not explicitly mention it, the application of digital twinning to CMMs potentially offers the opportunity to change the approach used when defining operating practices for CMMs. At present some procedures (e.g. [23]) require CMMs to be in tightly temperature-controlled laboratories in order to achieve the required uncertainties. If a well-validated digital twin that captured the response of the CMM and the associated artefact to temperature were available, future versions of procedures could permit lower levels of temperature control on the proviso that the digital twin is used to provide corrections and appropriate contributions to the uncertainty budget. Whilst some CMMs already have temperature correction algorithms, these typically provide corrections back to a reference temperature from a single user-defined operating temperature rather than taking fluctuation of temperature during a measurement sequence into account.
It is interesting to consider why industrial dimensional metrology is perhaps further advanced in its use of digital twins than other areas. It seems likely that there are several contributing factors. Dimensional quantities are commonly used to specify acceptance criteria for parts, and for complicated parts digital data and automated digital measurement have been in use for many years. A related factor is the widespread use of CAD tools to specify the geometry of parts and products. These tools and associated data standards such as ISO 10303-21 provide a digital representation of dimensional quantities and present them in a way that is intuitively easy for end users to interpret and adjust. The availability of such tools increases both the ease of developing digital twins (because the CAD model can be used as a starting point for the twin) and the trust of the user in the twin (because the CAD software presents the twin in a familiar and easily interpretable form). A third factor is that some dimensional techniques, and particularly optical techniques rather than contact measurement techniques, generate large volumes of data that are difficult to interpret without an underlying model or visualisation.
The remainder of this section highlights some example applications where the use of digital twins is currently rare, but could bring benefits for metrology and measurement science more broadly in future.

Twins of measurement apparatus
Measurement involves capturing the response of an object to a stimulus. The stimulus is the control variable and the response is what is used to characterise (or calibrate) the object. In some cases, such as calibration of a CMM using a reference artefact, the measurement device is the object responding; in other cases, such as tensile testing of a material sample using a calibrated rig, the sample creates the response. Within an NMI or DI, the metrologist typically has a good knowledge of the stimulus and the equipment or artefact that supplies it, and the equipment or artefact and its surroundings have been designed to deliver a well-characterised and accurate stimulus to the object. It is therefore generally easier for the metrologist to create a digital twin of the source of stimulus, but it is arguably less useful precisely because the system is well-characterised and well-controlled.
The close control of laboratory environments leads to significant energy costs, as well as requiring expensive materials and systems at the commissioning stage. In theory, it may be possible to use a digital twin approach to reduce the level of control required, by creating a parameterised model of effects such as temperature drift, updating the parameters based on data, and using the twin to compensate for those effects. In practice, this possibility has a number of drawbacks. The main drawback is that the changes of a measured value typically have many sources, and the reliable deconvolution of individual contributions from a data stream is likely to be challenging unless a single term with a well known functional form can be shown to dominate. Even under those circumstances, there is a risk that the introduction of this term may introduce further errors and uncertainties, particularly if the reduction in environmental control leads to a change in the functional form of the response to the effect being compensated for. For instance, the thermal drift of a device may be approximately linear if the temperature is controlled to within ±1 K, but this fit may become a poor approximation if the temperature is only controlled to ±5 K.
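The dominant-term case can be sketched as follows, under the illustrative assumption of a single linear thermal sensitivity (the numbers are synthetic, not from the paper): the sensitivity coefficient is estimated by least squares from logged temperature and reading pairs, and readings are then corrected back to a reference temperature.

```python
def fit_linear(x, y):
    """Ordinary least-squares fit of y = a + b*x (pure stdlib)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Synthetic readings with a linear sensitivity of 0.2 units/K about 20 °C
temps = [19.2, 19.6, 20.0, 20.4, 20.8]
readings = [99.84, 99.92, 100.00, 100.08, 100.16]
a, b = fit_linear(temps, readings)

# Correct each reading back to the 20 °C reference temperature
corrected = [r - b * (t - 20.0) for t, r in zip(temps, readings)]
print(round(b, 3), [round(c, 3) for c in corrected])
```

The caveat in the text applies directly: if the relaxed environmental control makes the true response nonlinear, the fitted linear coefficient silently becomes a source of error rather than a correction.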
There are some cases where an approach similar to this may be useful. These cases are likely to use a digital twin to enhance the understanding of the current state of the system rather than to replace close control. One example is the Kibble balance [24], where the gradient of the magnetic field due to the magnet and coil used to generate a known force is a parameter known to change over time. The procedure for processing the data gathered during a Kibble balance measurement outlined in [24] involves a sequence of iterative calculations after the data has been collected to determine the time-varying behaviour of the magnet, based on the assumption of linear drift in several terms. A digital twin would enable a more flexible approach to this analysis and the subsequent data processing, providing a real time statement of the uncertainty associated with the measurement based on the data gathered up to that point.
Another area where digital twins may be useful is for experiments that have long stabilisation times. Some experiments take place in controlled environments and require the complete system, including the environment, the reference device and the device under test, to have reached a stable equilibrium before the data can be captured. For example, humidity measurements typically require thermal and hygrometric equilibrium to be achieved before measurements are made. In other cases, such as electrochemical hydrogen permeation testing, the measurand is the steady state reached by an independent quantity such as electrical current. In both cases, the time it takes to make the measurement is determined by the time to reach equilibrium. Systems to automatically detect when equilibrium conditions have been met based on user-defined criteria already exist for some experiments [25], and some of these offer automated execution of a sequence of measurement steps, ensuring that equilibrium is reached at each step.
Use of a digital twin could go beyond this to predict the equilibrium measurement based on the data gathered, and so remove the need to wait for a fully stable equilibrium to be reached and reduce the overall time taken to carry out the measurement. As a first step, a suitable parametric model of the system would be developed. A generalised approach using a lumped parameter model is likely to be suitable for most problems, since many of the systems requiring stabilisation are effectively convection-diffusion systems. As more data points are acquired, the model parameters are updated and the steady-state values with associated uncertainties are estimated. Once the uncertainty associated with the steady state value has reached an acceptable level, the experiment can be stopped, thus potentially reducing the experiment duration. Once a suitable model has been developed, the same approach could be used for multiple realisations of the same type of experiment, but the parameters in the associated model would only be applicable to the individual run of the experiment (rather than being associated with the device under test or the equipment creating the controlled conditions).
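The extrapolation step can be sketched with a minimal example, under the illustrative assumption (consistent with a single-mode lumped parameter model, but not prescribed by the paper) that the approach to equilibrium is a single exponential. Three equally spaced samples then determine the asymptote algebraically, via Aitken-style extrapolation:

```python
import math

def predict_steady_state(y1, y2, y3):
    """Estimate the asymptote of y(t) = A + B*exp(-t/tau) from three equally
    spaced samples: for such data, A = (y1*y3 - y2^2) / (y1 + y3 - 2*y2)."""
    denom = y1 + y3 - 2.0 * y2
    if denom == 0:
        raise ValueError("samples show no exponential curvature to extrapolate")
    return (y1 * y3 - y2 * y2) / denom

# Synthetic humidity-equilibration run: y(t) = 50 - 10*exp(-t/30), sampled
# at t = 0, 20, 40 min, i.e. long before equilibrium is actually reached
samples = [50.0 - 10.0 * math.exp(-t / 30.0) for t in (0.0, 20.0, 40.0)]
print(predict_steady_state(*samples))  # ≈ 50.0
```

In a real twin the parameters would instead be estimated from many noisy samples with associated uncertainties, and the stopping rule would act on the uncertainty of the predicted steady state rather than a point estimate.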
In order for this approach to be of benefit, it will be necessary for the data-driven predictions to be accepted by customers and written into standards as providing correct results. It is likely that creating this level of trust will take a long time, but it could be achieved by running a digital twin of the experiment without acting on its predictions in order to demonstrate the reliability required for acceptance.
Similarly, a digital twin can help to develop a minimal measurement strategy to meet uncertainty targets in real time as the data are gathered. As an example, consider the determination of geometrical quantities using a CMM. A virtual measurement [9] has been proposed that can simulate a measurement path and create the appropriate instruction file for the measurement device. If this virtual measurement were updated with measured data as the measurement progressed, it could be used to identify a series of additional measurements that would reduce the uncertainty to a given target level, and could feed those commands back into the machine. This approach would reduce the need for re-measurement in cases where targets have not been reached. This type of twin is likely to be accepted by customers as it does not introduce assumptions or reduce the amount of data gathered.
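A minimal sketch of such a stopping rule, using the standard error of the mean as a simple stand-in for the twin's uncertainty estimate (the function, thresholds and measurement model are all hypothetical):

```python
import random
import statistics

def measure_until_target(measure, target_std_err, min_points=3, max_points=1000):
    """Collect measurements until the standard error of the mean falls below
    target_std_err. A real twin would use a full uncertainty evaluation here."""
    values = []
    while len(values) < max_points:
        values.append(measure())
        n = len(values)
        if n >= min_points:
            sem = statistics.stdev(values) / n ** 0.5
            if sem < target_std_err:
                break
    return values

# Simulated probe: repeat measurements of a feature of true size 10.0
random.seed(1)
values = measure_until_target(lambda: random.gauss(10.0, 0.05), target_std_err=0.02)
print(len(values), round(statistics.fmean(values), 3))
```

The same loop structure applies to the CMM case: each iteration would instead request the next most informative probing point from the virtual measurement and stop once the target uncertainty for the geometric quantity is met.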

Twins of objects under test
Whilst the equipment or artefact creating the stimulus is well characterised and controlled, the same is not necessarily true of the object under test. Objects for calibration are typically not well characterised when received, otherwise they would not need calibration (although historical data from previous calibrations may be available). Stability and sensitivity to environmental conditions are common sources of uncertainty, reflecting the fact that the object may be changing during measurement.
The data captured during the calibration of an object provides the first step towards creation of a digital twin of that object. Several papers [26,27] have suggested that the data on a digital calibration certificate (DCC) could act as a digital twin. However, the information on a calibration certificate or test report alone is not enough to create a reliable model of the object and keep it updated based on new data. If data is generated on a regular basis using a procedure similar to the calibration, then the data could be used to update a twin. Many laboratories already carry out this type of operation to monitor the state of their physical standards and calibration equipment, but the models produced from the data are likely to be simplified.
Such an updating process would need to take the uncertainties and correlations of the different data sets into account, particularly if the laboratory carrying out the initial calibration is not the same as the laboratory regularly checking and using the object. This type of problem is common in data fusion research [28]. The main benefit of such an approach would be a better understanding of the object's behaviour in the time between calibrations, and hence the potential ability to correct for that behaviour and to identify when the object has diverged from its expected behaviour and so needs to be calibrated ahead of schedule. This method could also reduce the uncertainty associated with drift between calibrations. For instance, if a voltmeter is subject to drift over time, and this drift is expected to be linear, a daily repetition of a measurement procedure could be used in combination with the calibration data to update the best estimate of the drift coefficient and to identify when it is likely that the drift has ceased to be linear, which may be an early warning that recalibration is necessary.
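The voltmeter example can be sketched with illustrative numbers: daily check data update the drift estimate by least squares, and a residual tolerance acts as the early warning that the drift has ceased to be linear. The data, tolerance and fault are synthetic.

```python
def update_drift(days, readings, tolerance):
    """Fit reading = a + drift*day by least squares and flag when the largest
    residual exceeds tolerance (a crude nonlinearity / recalibration alarm)."""
    n = len(days)
    md, mr = sum(days) / n, sum(readings) / n
    drift = sum((d - md) * (r - mr) for d, r in zip(days, readings)) \
        / sum((d - md) ** 2 for d in days)
    a = mr - drift * md
    residuals = [r - (a + drift * d) for d, r in zip(days, readings)]
    recalibrate = max(abs(e) for e in residuals) > tolerance
    return drift, recalibrate

# Synthetic daily checks of a voltmeter drifting 2 µV/day, with a step
# fault injected on day 5 that breaks the linear-drift assumption
days = list(range(6))
readings = [1.000000 + 2e-6 * d for d in days]
readings[5] += 50e-6
drift, alarm = update_drift(days, readings, tolerance=10e-6)
print(alarm)  # True: the day-5 excursion exceeds the 10 µV tolerance
```

A full implementation would weight the daily checks and the calibration values by their uncertainties, and account for correlation between them, as noted above.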
A simple example can be developed by considering mass transfer standards. Mass transfer standards are used throughout industry for the calibration of balances and weighing machines. These standards are subject to drift (for instance due to accumulation of surface contamination over time) and their apparent mass when weighed is sensitive to air density (via buoyancy, a known correction that depends on the volume of the standard) and humidity (via surface sorption of water). These factors mean that the mass of the standard when calibrated will differ from its mass when it is later used to calibrate a device.
The mathematical form of the dependency of the mass on time (drift) and on humidity is usually known, but the values of the parameters that govern the behaviour of individual standards are not known a priori. Whilst the form and associated parameters of the buoyancy correction are known, the correction depends on the volume of the standard and the density of the air at the time of use. In some cases the volume is measured when the standard is first produced, but in other cases an assumed density is used to calculate the volume from the mass (for instance, stainless steel, commonly used for mass transfer standards, is typically assumed to have a density of 8000 kg m−3). This assumption is a further potential source of difference between the mass when calibrated and the mass in use.
A digital twin of a mass transfer standard would therefore consist of
• a parameterised model linking the calibrated mass of the standard, the humidity and air density (via measurements of air temperature, pressure and humidity), the time of measurement, and the volume or density of the standard,
• a set of calibrated values of the mass of the standard, and
• a process to update the model parameters using the calibration data whenever a new calibration is carried out.
The updated parameters could include the density of the standard if an estimated value has been used. Once new values of the parameters are obtained, the model can be used to generate a more accurate estimate of the mass of the standard when it is being used for device calibration. This process is sketched in figure 1.
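A minimal sketch of such a twin follows. The functional forms mirror the description above (linear drift, linear sorption term, buoyancy via the assumed density), but all coefficient values, the calibration records, and the simple two-point drift update are hypothetical.

```python
def predicted_mass(params, t_days, air_density, humidity):
    """Twin model for a mass transfer standard: calibrated mass plus linear
    drift and sorption terms, with an air-buoyancy correction relative to a
    reference air density of 1.2 kg/m^3. All values are illustrative only."""
    m0, drift_per_day, sorption, density = params
    buoyancy = (air_density - 1.2) * m0 / density  # extra displaced-air mass
    return m0 + drift_per_day * t_days + sorption * (humidity - 0.5) - buoyancy

# Update the drift coefficient from two calibration records (hypothetical)
cal_old = (0.0, 1.0000000)    # (days since first calibration, mass / kg)
cal_new = (365.0, 1.0000020)
drift = (cal_new[1] - cal_old[1]) / (cal_new[0] - cal_old[0])

# (m0, drift/day, sorption coefficient, assumed stainless steel density)
params = (cal_old[1], drift, 2e-8, 8000.0)
print(predicted_mass(params, t_days=180.0, air_density=1.19, humidity=0.5))
```

With a longer calibration history, the two-point update would be replaced by a least-squares or Bayesian update of all the parameters, including the density when only an assumed value is available.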
The requirements for high precision measurement on the factory floor and for reliable measurement in challenging environments have led metrologists to develop approaches to in situ realisation of SI units [29–31] and self-validation of sensors [32,33]. This approach shrinks the traceability chain and removes the need for calibration (or allows for remote or self-calibration), but it is likely that some monitoring of these in situ realisations will be necessary to maintain end user confidence in their stability and accuracy. Whilst the primary realisations of the units cannot drift, the electronics used to capture the relevant data can, and hence potentially require monitoring. This monitoring is well suited to a digital twin approach. If the possible sources of instability or inaccuracy can be identified and described mathematically, then a model of the system can be constructed, updated as new data is gathered, and used to inform maintenance and replacement.
Many organisations that would benefit from these types of digital twin will not have the relevant skills to develop and maintain the necessary models. If NMIs provide such models as part of their calibration services, the maintenance of the twin, and associated costs, falls to the NMI. The customer would be able to (remotely) interrogate the twin to assess the suitability of the standard for continued use. Under the traditional model of software distribution, such maintenance, and support for the variety of potential operating systems, can rapidly become time-consuming and costly.
Recent developments at the UK's National Physical Laboratory have created Software as a Service (SaaS) digital platforms for processing of specific types of calibration data that can be accessed by customers via a web-based portal or directly via application programming interfaces (APIs), avoiding this problem. A single version-controlled instance of the processing software is held on the digital platform and is called via a web interface or API. The user uploads data, which can be added to a user-specific data store so that historical trends can be analysed, and receives processed values in return. The platform approach simplifies maintenance and version control and removes the need for compatibility with multiple operating systems, as the implementation uses software-agnostic APIs. This platform could be extended to deliver digital twins to customers, as illustrated in figure 2.

Metrology for digital twins?
Before discussing how metrology can support the use of digital twins it is useful to discuss which sectors are using digital twins. A literature review in 2022 [34] included a table of the number of published papers obtained when searching for 'digital twin' in three different sources, broken down by application sector. Figure 3 shows the data from Science Direct as a pie chart (the distribution for the other two sources is similar). This figure shows that the three leading sectors generating publications are manufacturing, power generation & energy, and construction. It is possible that other sectors (for instance aerospace, automotive, and pharmaceutical) are making use of digital twins but do not publish for intellectual property reasons.
Many of the challenges associated with digital twins are not solely metrology problems, and much of the research that addresses those challenges will have an impact on computational modelling beyond digital twins. The Mathmet strategic research agenda [15] identified the two main challenges for virtual metrology and digital twins as reliability and efficiency, and similar concerns are reflected in the analysis agenda of NAFEMS [35], where they feed into topics such as simulation supporting certification. Research on these topics by metrologists (for instance within the European Partnerships Digital Transformation programme) can feed into this bigger picture.
The broad purpose of metrology for digital twins is to improve confidence in the outputs of the twins through use of SI units, traceability back to national standards, and uncertainty evaluation. Recent years have seen a surge of interest in digitalisation within metrology [36], and in particular in how to make metrology data machine-actionable. The international activities associated with this interest, such as the joint statement of intent "On the digital transformation in the international scientific and quality infrastructure" signed by the Bureau international des poids et mesures, the Organisation Internationale de Métrologie Légale, the Internationale Meßtechnische Konföderation, the Committee on Data of the International Science Council, the Commission internationale de l'éclairage, the International Laboratory Accreditation Cooperation, the International Electrotechnical Commission and the National Conference of Standards Laboratories International [37], will be of benefit to digital twins as well as to other algorithms. In particular, the provision of a unique digital point of reference for the SI will make it simple for the appropriate units to remain associated with data in digital form. This capability will (for instance) permit digital twin creators to specify the units associated with model inputs and outputs and hence to reject unsuitable input data. An extension of this idea is the use of 'unit as a type' within programming languages [38], where the variables and parameters used in software have measurement units associated with them and the compiler is able to check that the calculations within the software are dimensionally consistent. Two recent papers [39,40] have also looked at data transfer standards for uncertainty and how they will support the reliable use of uncertainty information within digital twins.
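The 'unit as a type' idea can be sketched in a few lines. The toy class below is purely illustrative (it is not the mechanism of [38], and statically typed languages would perform the check at compile time rather than at runtime as here): each quantity carries exponents of SI base units, addition demands matching dimensions, and division combines exponents, so a dimensionally inconsistent calculation is rejected automatically.

```python
class Quantity:
    """Toy value-with-unit type: dimensional consistency checked at runtime.

    Units are stored as exponents of SI base units, restricted here to
    metre (m) and second (s) for brevity.
    """

    def __init__(self, value, m=0, s=0):
        self.value = value
        self.dim = (m, s)  # exponents of (metre, second)

    def __add__(self, other):
        # Quantities may only be added if their dimensions agree.
        if self.dim != other.dim:
            raise TypeError(f"dimension mismatch: {self.dim} + {other.dim}")
        return Quantity(self.value + other.value, *self.dim)

    def __truediv__(self, other):
        # Division subtracts unit exponents, e.g. m / s -> (1, -1).
        dim = tuple(a - b for a, b in zip(self.dim, other.dim))
        return Quantity(self.value / other.value, *dim)


length = Quantity(6.0, m=1)   # 6 m
time = Quantity(2.0, s=1)     # 2 s
speed = length / time         # 3 m/s, dimension (1, -1)
```

Attempting `length + time` raises a `TypeError`, which is exactly the kind of automatic rejection of unsuitable inputs that a digital twin fed with unit-annotated data could rely on.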
The approaches developed for secure and structured transfer of metrology data for DCCs, and the related Digital Product Passport (DPP) [41], will also have application in supporting digital twins, particularly in cases where the digital twin is owned and maintained by a different organisation from that which owns the physical twin and is generating the data.
It is possible that the popularity of digital twins in some sectors is boosted by data availability and standardisation: the construction sector in particular has well-established tools and standards for building information management, making it easier to find and use relevant data. Development of similar standards for digital data and metadata in other sectors could support wider uptake of digital twins. Metrology frequently plays a strong role in standards development, and the digital framework for the SI will enable data and metadata standards to benefit from the traceability that the SI brings.
Figure 3 shows that the sector with most published research on digital twins is manufacturing. The European Metrology Network for Advanced Manufacturing [42] has identified key areas where metrology support of advanced manufacturing is needed, including digitalisation and vertical integration, within its strategic research agenda, so this topic will not be discussed in more depth here.
The abstracts of papers in this sector suggest that the two most common applications of digital twins within manufacturing are quality control and maintenance. The application mentioned above for automated planning of a series of measurements to achieve a target uncertainty would transfer directly onto factory floor measurements to support quality control. The idea of combining calibration data from a laboratory with user data in a digital twin could be extended to monitor the state of sensors mounted on a production line that may require periodic recalibration.
Digital twins for maintenance [43] require the various faults that can occur within a machine to be identified and implemented within a model that can use measured data to estimate either the likelihood of a failure occurring within a given period or the expected time to the next failure. These estimates are likely to have statistical properties similar to those of radioactive decay events, and may be able to benefit from data analysis techniques used to calculate half-lives [44]. In addition, NMIs have developed advanced techniques for early warning signals, such as tipping point analysis [45], that could be implemented as digital twins to identify signs that the state of the machine has changed and that some form of maintenance may be required.
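To make the radioactive-decay analogy concrete, suppose (as a simplifying assumption, not a claim from [43] or [44]) that faults occur independently at a constant rate, so that inter-failure times are exponentially distributed. The maximum-likelihood estimate of the failure rate is then the number of failures divided by the total observed time, which directly yields both quantities mentioned above: the expected time to the next failure and the probability of a failure within a given horizon.

```python
import math


def failure_rate(inter_failure_times):
    """Maximum-likelihood failure rate for a constant-rate (Poisson) process.

    Under the exponential model the MLE of the rate is simply the
    number of observed failures divided by the total observed time.
    Returns (rate, mean time to failure).
    """
    n = len(inter_failure_times)
    total = sum(inter_failure_times)
    lam = n / total                      # failures per unit time
    return lam, 1.0 / lam                # expected wait for next failure


def prob_failure_within(lam, horizon):
    """P(at least one failure within `horizon`) under the Poisson model."""
    return 1.0 - math.exp(-lam * horizon)
```

For example, observed inter-failure times of 2, 4 and 6 months give a rate of 0.25 failures per month, a mean time to failure of 4 months, and roughly a 63% chance of a failure within the next 4 months. A real maintenance twin would of course need a richer model (e.g. a Weibull distribution for wear-out, or the tipping-point techniques of [45]), but the constant-rate case shows the basic shape of the calculation.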
Several of the sectors identified in figure 3 are likely to use digital twins based on sensor networks. Applications include smart grids within the energy sector and air quality monitoring in the Smart City sector [7,46]. Sensor networks, and in particular smart sensor networks that can share data and information between individual sensors, are a topic of growing interest to metrology. A project on fundamental sensor network metrology, FunSNM, funded by the European Association of National Metrology Institutes, has recently started, and it is likely that the outputs of that project will be of benefit to digital twins of these networks.

Conclusions
Digital twins make use of the informative power of up-to-date data and the predictive power of models to provide reliable insights into the current state of a system. Metrology can benefit from digital twins by using them to better understand systems that change over time. In some cases the characterisation of the change over time is of value in itself, and in other cases the twin can be used to identify anomalous behaviour. These approaches will bring benefit to:
• areas such as humidity measurement, where stabilisation times are significant: the use of a digital twin to predict the stabilised value reliably can reduce measurement times;
• monitoring of in situ realisations of SI base units, such as Johnson noise thermometry: use of a digital twin to identify anomalous behaviour and to quantify drift and degradation will ensure that maintenance can be carried out in a timely fashion, improving end user confidence;
• areas such as mass measurement, where transfer standards are affected by time and the local environment: a digital twin of an individual artefact can capture its drift behaviour and sensitivity to conditions such as temperature and humidity, so that a more accurate value can be assigned when the artefact is in use.
Conversely, metrology can support the uptake of digital twins by furthering current efforts to develop a digital framework for communication of metrological data. The provision of an agreed approach to, and a single point of reference for, digital use of the SI will simplify assurance that correct and consistent units are being used within twins. The harmonisation of a structure for DCCs will enable inclusion of calibration data directly into twins, and the increase in use of DPPs will offer a wider range of data for development of digital twins. The ongoing research into areas ranging from in-line metrology to sensor networks will support practical development and deployment of twins and will give confidence in the data that they use, the predictions they generate, and the decisions that their outputs inform.

Figure 1. Sketch illustrating the inputs and data associated with a digital twin of a mass transfer standard.

Figure 2. Sketch of a potential approach to hosting and accessing digital twins.

Figure 3. Sectors publishing papers on digital twins, based on data from [34].