What do we mean by 'Determinism'?
(Causal) Determinism, simply put, means that given the same settings or conditions, a story will always unfold in the exact same way. Although this may seem like a non-issue, ‘full determinism’ is actually impossible in real life (!), and can only be approximated in computer simulations.
Not convinced? As a real-world experiment, try dropping a pen in the exact same way twice (exact same spot, exact same fall duration from the exact same height).
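By contrast, a computer simulation can approximate determinism by pinning down every source of variation. Below is a minimal, hypothetical sketch of a seeded pen drop: with the same seed and settings, two runs are bit-for-bit identical, which the physical experiment can never guarantee. The function and its constants are illustrative, not taken from any actual tooling.

```python
import random

def run_drop(seed: int, steps: int = 1000, dt: float = 1e-3) -> float:
    """Simulate a 1D pen drop with a tiny seeded disturbance each step."""
    rng = random.Random(seed)      # fixed seed: the 'air noise' repeats exactly
    height, velocity = 1.0, 0.0
    for _ in range(steps):
        velocity -= 9.81 * dt                 # gravity
        velocity += rng.uniform(-1e-6, 1e-6)  # disturbance, but reproducible
        height += velocity * dt
    return height

# Same seed, same settings: the 'story' unfolds bit-for-bit identically.
assert run_drop(seed=42) == run_drop(seed=42)
# A different seed stands in for the real world, where conditions never repeat.
assert run_drop(seed=42) != run_drop(seed=43)
```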
When we discuss Determinism in the context of Digital Twins, we generally refer to the ‘Determinism of the Spatial Models that regulate material transport and transfer’ (or, by its developer shorthand, ‘Physics’).
Simply put, we find that virtual object movement and material interaction over time are the least like what happens in real life, and that these models show the greatest deviations when re-running the same setup.
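A large part of why: floating-point arithmetic is not associative, so anything that changes the order of operations between runs (multithreaded contact solving, a different iteration order over bodies) changes the result. A tiny illustration, with magnitudes exaggerated for effect:

```python
# Floating-point addition is not associative, so the *order* in which an
# engine sums (for example) contact forces already changes the outcome.
forces = [0.1, 0.2, 0.3, 1e16, -1e16]

left_to_right = sum(forces)            # 0.0  (small terms absorbed by 1e16)
right_to_left = sum(reversed(forces))  # 0.6  (large terms cancel first)

print(left_to_right == right_to_left)  # False: same inputs, different order
```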
Material Handling Fault Finding
Having a greater degree of determinism is paramount in finding and fixing material handling faults in your project: if you can exactly repeat a test result with exactly the same settings, you can change the settings one by one to find out which setting is the cause of your problem (‘settings’ in this case can also mean source code). Determinism in the context of Physics allows compartmentalization: it lets you split off a small partial problem from the entire material handling case, and fix it in isolation.
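As a sketch of that one-setting-at-a-time search - assuming a deterministic, hypothetical `run_simulation(settings)` that returns pass/fail - flipping a single setting per trial isolates the culprit, because determinism guarantees that the flipped setting is the only possible cause of a changed outcome:

```python
def localize_fault(baseline: dict, candidate: dict, run_simulation) -> list:
    """Find which settings in a failing `candidate` config break a `baseline`.

    Because the simulation is deterministic, any change in outcome can only
    come from the single setting we flipped in that trial.
    """
    suspects = []
    for key, value in candidate.items():
        if baseline.get(key) == value:
            continue                   # setting is unchanged, skip it
        trial = dict(baseline)
        trial[key] = value             # change exactly one setting
        if not run_simulation(trial):  # deterministic pass/fail run
            suspects.append(key)
    return suspects
```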
As such, we find that a highly deterministic simulation (1) greatly decreases Digital Twin development time, (2) makes it much easier to parallelize the workload (i.e. to work together on developing the project), and (3) is a requirement for any simulation intended for validation and verification.
Model Validation - Measuring Moving Models
Since we expect a lack of Determinism in the development of real-world material handling, we include tolerances in the validating experiments of the development process (e.g. Factory Acceptance Testing). Similar verification - and especially tolerancing - seems to be overlooked in dealing with the virtual development counterpart, not least because we assume that the material handling/physics models we use are inherently deterministic.
It stands to reason that when precision is a strict requirement (as with Verification Twins), we will want to test the tolerance of our enacted material handling. Implicitly this means we must actually run 3 types of tests (sketched in code after this list):
Resolution Tests - Are the chosen virtual material handling (measurement) tools correctly set up to operate within the desired tolerance range? Are we using them correctly?
Fidelity Tests - Is the virtual prototype transforming material sufficiently as the real one would? Is the model representative?
Accuracy Tests - Is the virtual prototype operating within the tolerances we determined acceptable for the end-product? Is this design viable?
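As a rough sketch of how these three tests could look in code - all names, traces, and tolerance values below are hypothetical examples, not drawn from any actual tooling:

```python
TOLERANCE = 1e-3  # acceptable spatial deviation in metres (example value)

def resolution_test(sensor_resolution: float) -> bool:
    # Can the measurement tooling even resolve the tolerance band?
    # (Using a 10:1 rule of thumb between tolerance and resolution.)
    return sensor_resolution <= TOLERANCE / 10

def fidelity_test(virtual_trace, reference_trace) -> bool:
    # Does the virtual prototype move material as the real one would?
    # Both traces are lists of (time, position) samples.
    return all(abs(v - r) <= TOLERANCE
               for (_, v), (_, r) in zip(virtual_trace, reference_trace))

def accuracy_test(virtual_trace, target_positions) -> bool:
    # Does the design stay within the tolerances set for the end product?
    return all(abs(v - t) <= TOLERANCE
               for (_, v), t in zip(virtual_trace, target_positions))
```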
Note that the second test type in particular is unique to virtual testing: in the real world you need not test that physics works the way it does; in a virtual world you do.
This second type of testing is in fact the primary application of the DES tooling, in the form of Ghost Rider Testing: it is not just intended as a timewise visualization tool, it has been designed with low tolerances in mind - currently about 1e-7 m (1/10th micrometer) spatially and about 1e-10 s (1/10th nanosecond) timewise at 1x realtime speed. We maintain these low tolerances because we want to use a ‘DES Animation’ to identify and validate tolerances over time caused by Limited Determinism and rounding errors in a simulation fed with alternative tooling (like, for instance, NVidia PhysX) - arguably the only way to do a Fidelity Test.
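A minimal sketch of what such a Ghost Rider comparison could look like, assuming both the DES animation and the externally-fed physics simulation emit timestamped position samples; the tolerance constants mirror the figures above, everything else is illustrative:

```python
SPATIAL_TOL = 1e-7   # metres  (1/10th micrometer)
TIME_TOL    = 1e-10  # seconds (1/10th nanosecond) at 1x realtime

def ghost_rider_check(des_trace, physics_trace) -> list:
    """Compare a DES 'ghost' trajectory against a physics-engine trajectory.

    Each trace is a list of (t, x, y, z) samples. Any drift beyond the
    tolerances flags Limited Determinism or accumulated rounding error.
    """
    deviations = []
    for (t1, *p1), (t2, *p2) in zip(des_trace, physics_trace):
        dt = abs(t1 - t2)
        dist = sum((a - b) ** 2 for a, b in zip(p1, p2)) ** 0.5
        if dt > TIME_TOL or dist > SPATIAL_TOL:
            deviations.append((t1, dt, dist))
    return deviations  # an empty list means the fidelity test passed
```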