We saw previously how we can track qualitative changes in objects and events with quantitative measures based on indexes such as BDM. We will now see how we can use the same concepts and methods to reconstruct dynamical systems from disordered observations. Here again we have an elementary cellular automaton, our favourite guinea pig for illustrating how algorithmic dynamics and our algorithmic causal calculus work. This is rule 254, which basically turns every cell black, producing a characteristic cone. The cone defines the region of influence of the initial condition; in other words, nothing outside the cone is affected by the initial condition, and the cone delimits the only region that can potentially be causally influenced by the initial condition, both in space and time. Remember that time flows downwards. It is clear that a generating model for this cellular automaton would be short, and that applying any change to the cellular automaton would make the candidate model explaining the perturbed version longer and of greater algorithmic complexity. So the whole cellular automaton can be coloured in red, except for the last step, which, as we have seen before, is of a neutral nature, because the model generating the cellular automaton with or without the last step is exactly the same. But after applying several bit perturbations to a row, as marked in blue, deleting that row would actually make the cellular automaton simpler. So we can see how we can identify different parts of a dynamical system by perturbing different regions of the system, and how to identify different perturbations performed on such a system. Having shown how algorithmic information dynamics helps us identify those elements and regions in a dynamical system, we can now show how this new interventional calculus can help us reconstruct the space-time evolution of a dynamical system, using, again, elementary cellular automata. 
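The row-knockout idea above can be sketched in a few lines of Python. This is a minimal illustration, not the actual calculus: it uses the length of a zlib-compressed bitstring as a crude stand-in for BDM, and the width and number of steps are arbitrary choices for the demo. Deleting a row and seeing how the estimated complexity shifts is the kind of signal used to colour elements red (positive), blue (negative) or grey (neutral).

```python
import zlib

def eca_step(row, rule):
    """One step of an elementary cellular automaton with a given Wolfram rule number."""
    n = len(row)
    return tuple(
        (rule >> ((row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n])) & 1
        for i in range(n)
    )

def evolve(rule, init, steps):
    """Space-time evolution: a list of rows, time flowing downwards."""
    rows = [tuple(init)]
    for _ in range(steps):
        rows.append(eca_step(rows[-1], rule))
    return rows

def complexity(rows):
    """Crude stand-in for BDM: length of the zlib-compressed bitstring."""
    s = ''.join(''.join(map(str, r)) for r in rows).encode()
    return len(zlib.compress(s, 9))

width, steps = 16, 8
init = [0] * width
init[width // 2] = 1          # single black cell, producing the cone of rule 254
ca = evolve(254, init, steps)
base = complexity(ca)

# Knock out each row in turn and see how the estimated complexity shifts.
for i in range(len(ca)):
    perturbed = ca[:i] + ca[i + 1:]
    delta = complexity(perturbed) - base
    print(f"row {i}: delta = {delta:+d}")
```

Because rule 254 is so simple, most row deletions barely move the estimate, which is exactly the "neutral" behaviour described for the last step.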
Here are the regular space-time evolutions of a number of cellular automaton rules, each followed by a space-time evolution reconstructed from a randomly row-scrambled version of the same automaton. The reconstruction is done by identifying the lowest-algorithmic-complexity configuration among all 9! = 362880 possible row permutations, emulating all possible scrambled versions; that is, for up to 8 steps plus the initial condition, hence 9 rows. We are not showing the scrambled versions because it wouldn't make much sense; just imagine that we scramble them, and we show you the result of the reconstruction from a disordered version of the original evolution. The values at the bottom of each case are the Spearman correlation values between the original and the reconstruction. As you can see, the correlation values are very high, most of them close to 1, meaning that we were able to reconstruct the original systems, or came very close to doing so; and in all cases we captured the qualitative dynamics, even in those with a low correlation value. So here you can see how you can capture something more fundamental and algorithmic than what statistics and correlation can easily see. Look at rules 57, 9 or 73: the reconstructed versions clearly captured some qualitative features of the original cellular automata, yet this is not reflected in a value such as the Spearman correlation. But this other case is the most striking result in the reconstruction of these cellular automata, because in the previous version we reconstructed the space-time diagram, or did so closely, but the results did not give the order of the evolution of the system, that is, which row came before which other. Here, however, instead of scrambling the original CA, we applied a perturbation to every row and tried to reconstruct the original cellular automaton by looking at how much of an effect, and what type of effect, each row perturbation would have. 
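The brute-force reconstruction can be sketched as follows, again hedged: this is a toy version that uses zlib-compressed length instead of BDM, a smaller evolution (6 rows, so 6! = 720 permutations rather than 9! = 362880) to keep the search quick, and rule 30 with a random initial row as an arbitrary example. The principle is the one described: among all orderings of the scrambled rows, pick the one with the lowest estimated algorithmic complexity.

```python
import itertools
import random
import zlib

def eca_step(row, rule):
    n = len(row)
    return tuple(
        (rule >> ((row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n])) & 1
        for i in range(n)
    )

def evolve(rule, init, steps):
    rows = [tuple(init)]
    for _ in range(steps):
        rows.append(eca_step(rows[-1], rule))
    return rows

def complexity(rows):
    """Compressed length as a crude stand-in for BDM."""
    s = ''.join(''.join(map(str, r)) for r in rows).encode()
    return len(zlib.compress(s, 9))

width, steps = 12, 5                      # 6 rows -> 6! = 720 candidate orderings
init = tuple(random.Random(0).choices([0, 1], k=width))
original = evolve(30, init, steps)

scrambled = original[:]
random.Random(1).shuffle(scrambled)       # the disordered observation

# Pick the ordering of the scrambled rows with the lowest estimated complexity.
best = min(itertools.permutations(scrambled), key=complexity)

matches = sum(a == b for a, b in zip(best, original)) / len(original)
print(f"fraction of rows in the right place: {matches:.2f}")
```

With a compressor this weak the recovered ordering is not guaranteed to match the original exactly (the time-reversed ordering, for instance, often compresses almost as well), which is why the lecture reports Spearman correlations rather than exact recovery.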
You can see, again, that the reconstruction is very close, or that it extracts some key features of the data, as in cases 11, 9 and 54. And when more data are put together and considered, the reconstructions are even better. In figure (c), for example, and as theoretically predicted, the later in time a perturbation is performed, the less disruptive it is. Each pair shows the statistical rho and p values between the reconstructed and original space-time evolutions, with some models separating the system into different apparent causal elements. Figure (d) shows the reconstruction of one of the simplest elementary cellular automata (rule 254), and figure (e) one of the most random-looking ECA, both after 280 steps, illustrating the perturbation-based algorithmic calculus for model generation in two behaviourally opposite cases. In figures (f) and (g), the accuracy of the reconstruction can be scaled up and improved, at the cost of greater computational resources, by going beyond single-row perturbations up to the power set (all subsets). Depicted here are reconstructions of random-looking cellular automata (rules 30 and 73, running for 200 steps) from single-row (1R) and double-row-knockout (2R) perturbation analysis. Errors inherited from the decomposition method (see Sup. Inf., BDM) look like 'shadows' and are explained (and can be counteracted) by numerical deviations from the boundary conditions in the estimation of BDM. Variations in the magnitude of the effects differ in systems with different qualitative behaviour: the simpler the system, the less the effects of deleterious perturbations at different times differ; but in non-trivial systems, different information perturbations, that is, positive, negative or neutral ones, had significantly different effects. 
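The temporal-ordering idea, that later perturbations are less disruptive, suggests a simple heuristic that can be sketched as follows. This is again a toy under stated assumptions: zlib-compressed length stands in for BDM, rule 30 with a single-cell initial condition is an arbitrary example, and the tie-free Spearman formula is used for simplicity. Rows are ranked by the magnitude of their knockout effect, most disruptive first, and that ranking is compared against the true temporal order.

```python
import zlib

def eca_step(row, rule):
    n = len(row)
    return tuple(
        (rule >> ((row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n])) & 1
        for i in range(n)
    )

def evolve(rule, init, steps):
    rows = [tuple(init)]
    for _ in range(steps):
        rows.append(eca_step(rows[-1], rule))
    return rows

def complexity(rows):
    """Compressed length as a crude stand-in for BDM."""
    s = ''.join(''.join(map(str, r)) for r in rows).encode()
    return len(zlib.compress(s, 9))

width, steps = 16, 9
init = [0] * width
init[width // 2] = 1
ca = evolve(30, init, steps)
base = complexity(ca)

# Effect of knocking out each row, as |change in estimated complexity|.
effects = [abs(complexity(ca[:i] + ca[i + 1:]) - base) for i in range(len(ca))]

# Heuristic: more disruptive rows come earlier, so sort by descending effect.
predicted = sorted(range(len(ca)), key=lambda i: -effects[i])
rank = {row: pos for pos, row in enumerate(predicted)}

# Spearman rho (tie-free formula) between true order and recovered order.
n = len(ca)
rho = 1 - 6 * sum((rank[i] - i) ** 2 for i in range(n)) / (n * (n * n - 1))
print(f"Spearman rho between true and recovered order: {rho:.2f}")
```

With a real complexity estimate such as BDM, and with multi-row (2R and beyond) knockouts, this ordering signal is what allows the reconstructions in the figures to recover not just the rows but their sequence.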
And in those systems in which those effects were neutralised, this was because the systems were so simple that different types of perturbations were expected to have about the same effect on the candidate models. So in this lecture we have seen how powerful this algorithmic interventionist calculus, based on algorithmic information dynamics, can be at helping to reconstruct models and system behaviour in a profound manner, tackling causation. In the next lecture we will see how we can take advantage of this power not only to describe and study objects and to find models and causes, but actually to steer and manipulate systems.