Discussion with Etienne Klein
Etienne Klein is a director of research at the CEA and holds a Ph.D. in the sciences. He currently heads the Research Laboratory on the Sciences of Matter (LARSIM), located at Saclay. He has taken part in various large projects, in particular the development of a laser isotope separation process and the study of a superconducting cavity accelerator. At CERN, he participated in the design of the LHC, the large European particle physics accelerator. He has taught quantum and particle physics at the Ecole Centrale de Paris and is a professor of philosophy of science. He is a specialist in the question of time in physics and the author of numerous books written for the general public.
HOW CAN WE DEFINE COMPLEXITY?
Complexity is not always where we think it is. Simple dynamics can produce a complex phenomenology, as happens with fractals. A structure can appear complex even though what generates it is in fact simple. So how can we define complexity?
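As a hypothetical illustration of this point (not part of the interview), the sketch below iterates the simple rule z → z² + c and prints a coarse ASCII view of the Mandelbrot set: a two-line dynamical rule is enough to produce an endlessly intricate fractal boundary. The resolution and iteration limit are arbitrary choices of the example.

```python
# Illustrative sketch: a fractal (the Mandelbrot set) generated by iterating z -> z*z + c.
def escapes(c, max_iter=30):
    """Return True if the orbit of 0 under z -> z*z + c escapes within max_iter steps."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:        # once |z| > 2 the orbit is known to diverge
            return True
    return False

# Coarse ASCII rendering: '#' marks points whose orbit stays bounded.
for row in range(24):
    y = 1.2 - row * 0.1
    line = ""
    for col in range(64):
        x = -2.0 + col * 0.05
        line += "." if escapes(complex(x, y)) else "#"
    print(line)
```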
When we talk about complexity, are we talking about the complexity of laws or about ontological complexity? In biology, one doesn't really talk about laws, but rather about functions. Is identifying all the functions the same thing as formulating physical laws? Consider, for example, the standard model of particle physics, in which gravitation is not included. One can say that this model is mathematically simple: a master's student is able to understand its theory. It is abstract, but not diabolical. On the other hand, its ontological furniture, that is, the set of objects it declares to exist, is complicated. There are a lot of elementary particles. Some people think it is too complicated and prefer to simplify this ontological furniture with superstring theory, in which every particle of the standard model is a vibration mode of a single string. That is very simple ontologically, but infernal mathematically. Few people really have a good command of this theory. So we need a criterion in order to define complexity.
It is commonly accepted that life, for example, is complex. What makes a living system alive? Some complex systems do not reproduce; that doesn't make them any less complex. We can even imagine beings that are alive without being complex. One can often define something, a concept or a notion, only by what it is not. For example: matter is what has no consciousness; consciousness is what is more than matter. Definitions refer to each other. For life it is much the same: life is what is not inert; the inert is what is not alive… On this question one has to start from a characterisation of life, even if it can be questioned, so that one isn't simply talking to oneself.
WHAT IS EMERGENCE?
I have realised that people like to hear themselves say the word "emergence" without having thought it through. What is emergence?
There is a first type of emergence: the fact that a property of a system is expressed only once the system reaches a certain size. The property is always present, but it is invisible while the system remains small. Take superconductivity, for example. A single electron is not superconducting on its own. The equations governing the propagation of an electron in a solid contain its possible superconductivity, but this property is expressed only under certain conditions. The emergence of superconductivity is simply the appearance of a phenomenon determined by a theory that is valid at all scales. In that sense, emergence adds nothing; it brings no new information.
There is another kind of emergence, radical emergence: something absolutely absent from the underlying levels can appear. It can be a new law, new physics. This second kind of emergence comes from the fact that our theories are only effective theories. Take nuclear physics, for example. When you do nuclear physics, when you work on the nucleus of an atom, you are not going to write a Hamiltonian over all the quarks, gluons, and so on inside it. There are effective theories which, at your level, enable you to understand phenomena without needing the underlying theories. At the level of the nuclear physicist there is an effective theory, and between his level and the level of the particle physicist there is a kind of emergence. The reductionist approach assumes that there is a fundamental theory below and that we are able to extend it upwards. This is what we call the synthetic ascent. Yet we are still not able to do it. For example, we don't know how to derive Ohm's law from the axioms of quantum theory. Ohm's law is something we only observe; we are not able to make it emerge from Dirac's equations. We assume nonetheless that it is possible. Maybe that assumption is completely wrong.
The question of what type of emergence we are dealing with raises another more metaphysical question:
Does Nature have a core level, which we could qualify as fundamental, or do we have to deal with only effective theories?
And in the case of biology, this question about emergence also raises new questions:
What is the status of life, the status of consciousness, compared to matter? Is consciousness a property that matter has when it is organised in a complex way? Or does consciousness arise from some extraterritoriality with respect to life? Are biological laws derivable from physical laws, or are they simply added on top of physical laws? In short: if there are biological laws, are they linked to physical laws or not?
HOW DOES ONE DEAL WITH COMPLEXITY IN PHYSICS?
The commitment of physics is to avoid complexity. Modern physics starts from the principle that fully describing a phenomenon is impossible. So we limit our ambitions, and this is how we become efficient. For example, to understand celestial mechanics one first has to understand that the dynamics of a body orbiting the sun is independent of its shape, its density, its colour and its mass. What you can write mathematically is the dynamics of this body in motion. If one wanted to describe this body entirely, one would have to deal with a complex system. This approach by simplification is called reductionism.
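As a hypothetical illustration of this reductionist move (not part of the interview), the sketch below integrates Newton's law for a body orbiting the sun reduced to a point: shape, density and colour never enter the computation. The units (AU, years, GM = 4π²) and the time step are assumptions of the example.

```python
import math

# Illustrative sketch: reductionism in celestial mechanics.
# The body is reduced to a point; only position and velocity matter.
GM = 4 * math.pi ** 2          # sun's gravitational parameter in AU^3 / yr^2
x, y = 1.0, 0.0                # start 1 AU from the sun
vx, vy = 0.0, 2 * math.pi      # circular-orbit speed in AU / yr
dt = 0.001                     # time step in years

for step in range(int(1.0 / dt)):           # integrate one year (semi-implicit Euler)
    r3 = (x * x + y * y) ** 1.5
    vx -= GM * x / r3 * dt                  # acceleration depends only on position...
    vy -= GM * y / r3 * dt                  # ...and the sun's mass, nothing else
    x += vx * dt
    y += vy * dt

print(f"after one year: x = {x:.3f} AU, y = {y:.3f} AU")  # back near (1, 0)
```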
This approach has become impressively efficient, to the point that people wondered whether physics really had to limit its ambitions, and whether it could colonise other domains such as biology. In the Age of Enlightenment it was thought that physics could serve as a theoretical basis for theorising all phenomena.
Today we have stepped back from this enthusiasm, because of complexity and because of the incredible difficulty of finding a theory that is valid at all scales. We still think in terms of reductionism, but we compartmentalise the different levels more and more, and they become increasingly hermetic. It is becoming harder and harder to follow the synthetic ascent. Science has become very complex. Are we able to follow this evolution? We hold on to the utopia of one day hearing a simple account.
The risk of reductionism is forgetting that what we had to neglect does exist. Our mind naturally tends to consider that what we don't need doesn't exist. You discover quarks and neutrons, and this leads you to consider that there are only quarks and neutrons. Are we a mass of quarks? What makes it that there is sometimes complicity between two humans, and sometimes not? There is always something beyond our understanding. If, on the other hand, you try to take everything into account in a theory of the whole, how can you check that this theory doesn't miss anything?
IS COMPLEXITY ALWAYS REDUCIBLE IN PHYSICS? CAN PHYSICS LIMIT ITSELF TO A REDUCTIONIST APPROACH?
Complexity is not always reducible in physics: we are not able to understand turbulence, for example. We run simulations in order to move forward, we find empirical laws with domains of validity… A traffic jam is also complex. What makes a traffic jam appear? What makes it disappear? This is not really something you can model; one uses statistics in order to make predictions, and that is not really what we call comprehension. Physics can limit itself to reductionism, not in the corpuscular sense of the word, but rather in the sense of a non-naive reductionism. The fundamental object is a field, not a thing. I don't know whether we can still call this reductionism.
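As a hypothetical illustration of how such systems are explored (not part of the interview), the sketch below runs a Nagel-Schreckenberg-style cellular automaton: jams appear and dissolve out of simple local rules, and one studies them by simulation and statistics rather than by deriving them from a closed-form law. Road length, speed limit and slowdown probability are arbitrary choices of the example.

```python
import random

# Illustrative sketch: a Nagel-Schreckenberg-style traffic cellular automaton.
# Each '#' is a car on a circular road; jams form and dissolve spontaneously.
random.seed(0)
ROAD, V_MAX, P_SLOW, STEPS = 60, 5, 0.3, 50
cars = {pos: random.randint(0, V_MAX) for pos in range(0, ROAD, 4)}  # position -> speed

for _ in range(STEPS):
    updated = {}
    for pos in sorted(cars):
        v = min(cars[pos] + 1, V_MAX)                       # accelerate
        gap = next(d for d in range(1, ROAD + 1)
                   if (pos + d) % ROAD in cars or d == ROAD)
        v = min(v, gap - 1)                                 # brake to avoid the car ahead
        if v > 0 and random.random() < P_SLOW:              # random slowdown (driver noise)
            v -= 1
        updated[(pos + v) % ROAD] = v
    cars = updated
    print("".join("#" if i in cars else "." for i in range(ROAD)))
```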
IF TAKING INTO ACCOUNT ALL DIMENSIONS OF A PROBLEM IS TOO COMPLICATED, SHOULDN'T WE ACCEPT THAT WE CAN'T UNDERSTAND EVERYTHING AND PROCEED BY TRIAL AND ERROR IN ORDER TO GO FORWARD?
In my research I collaborate with neuroscientists and I see how they proceed. They do indeed use this approach, and they have incredible difficulties interpreting their results. They don't have a full epistemological frame. The research director decides which experiments to carry out and everyone else follows. But I have discussed this with him and we don't agree at all on the notion of causality: he takes sequences for consequences.
Biologists have a finalistic discourse. Merely describing systems by their function is finalistic. Physics, for its part, has managed to break with finalism: we don't say that bodies fall because they are better off when they are down. Biology, by contrast, operates within a finality. Even when people talk about Darwinism, their discourse is finalistic. Yet organisms don't adapt themselves to the environment; the environment changes, and some of them are adapted while the others are not.
I wonder whether we could draw an epistemological comparison between physics and biology. Physical laws could, for example, be the equivalent of functions; the equivalent of an invariant could be a structure; the equivalent of a symmetry, maybe a kind of plasticity…