Team:ETH Zurich/human/interviews/expert7

From 2014.igem.org

{{:Team:ETH_Zurich/tpl/head|Interview with PD Dr. Harald Atmanspacher}}

<html><article></html>
===There are various definitions of complexity. Most of the time, the notion of system is mentioned in those definitions. Would you say that complexity is a property of a system? ===

I would say that complexity is not a property, but a characterization of a system. The next question is what could be the defining criterion for this characterization. At some point, people tried to come up with formal definitions of what complexity could actually be and how it can be measured quantitatively. Before that happened, for a long time, for decades actually, people were talking about complex systems in a colloquial sense. What you will find in the literature are criteria like, for instance: “The coupling of a system with its environment is important for the behavior of the system” (open systems), or “Many complex systems have a lot of constituents” (nevertheless, we can find complex systems with a small number of degrees of freedom, for example in deterministic chaos). What you need for complexity is non-linear behavior, non-linear feedback. Another criterion that some people use is that you are dealing with systems far from thermal equilibrium; in biology, you are typically far from thermal equilibrium. Another feature of complex systems is their intrinsic instability, which makes it difficult or impossible to treat their behavior as stationary.
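The remark about deterministic chaos (few degrees of freedom, yet complex behavior) can be made concrete with the logistic map, a system with a single degree of freedom. This is a minimal illustrative sketch, not part of the interview:

```python
def logistic(x, r=4.0):
    """One step of the logistic map x -> r*x*(1 - x).
    A single degree of freedom, yet chaotic for r = 4."""
    return r * x * (1 - x)

# Two trajectories that start 1e-9 apart diverge to macroscopic
# separation within a few dozen steps: sensitive dependence on
# initial conditions, the hallmark of deterministic chaos.
a, b = 0.2, 0.2 + 1e-9
for _ in range(60):
    a, b = logistic(a), logistic(b)

print(abs(a - b))  # no longer of order 1e-9: the tiny difference has been amplified
```

Despite having only one variable, the system's long-term behavior is effectively unpredictable, which is why "many constituents" alone is not a defining criterion for complexity.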
=== Can one measure complexity quantitatively? ===

The first attempt to really define complexity in a more rigorous way was already happening in the 1960s, and it was done not by physicists but by mathematicians. Kolmogorov and others had a definition of complexity that was later called “algorithmic complexity”. When you have a pattern and want to measure its complexity, their approach was the following: if you construct an algorithm that you can run on a computer, then the length of the shortest algorithm that is capable of reproducing the pattern is the algorithmic complexity of the pattern. If you have a completely regular pattern, like a period-2 process, this is a short algorithm and the complexity is low. On the other hand, if you construct your pattern as a random sequence of black and white pixels, then the shortest algorithm to reproduce it is the sequence of pixels itself. That is the longest algorithm, relative to the pattern, that you can imagine, and it will be the pattern with the highest complexity score. There was a point of criticism that was quickly raised: in this sense, complexity is nothing else than randomness. Then, the question is: why do we need two different names for the same phenomenon? In the 1980s, people came up with a different viewpoint. The intuition was: if you have a completely regular behavior, that is not complex anyway; but if you only have a completely random behavior, this should also not be called complex. What should be called complex is an intricate mixture between random and regular elements in your pattern.
This is a basic distinction between two different general categories of complexity measures: one of them just being a measure of randomness, the other one characterizing the mixture between order and randomness.
===Is complexity linked to emergence? ===

Yes, because that’s another one of these colloquial features of complexity. Complex systems often have a hierarchical structure, so you have levels of description. For instance, you can describe complex systems in terms of individual constituents, like individual neurons in the brain. Then, you also have other levels of description: for instance, the level at which neural assemblies are formed. These neural assemblies often have properties (some people call them emergent properties) which you cannot simply derive from their constituents unless you know something about the collective level. In physics, this comes up when you study the relationship between individual molecules in a box of gas and the thermodynamic behavior: temperature is, of course, not a property of single molecules. In this sense, it is also an emergent property, and much has been written about that example.
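The gas example can be made concrete: kinetic temperature is defined only as a statistic of the whole ensemble. A minimal sketch with illustrative numbers assumed here, not from the interview:

```python
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K

def temperature(kinetic_energies):
    """Kinetic temperature of a monatomic ideal gas,
    T = 2<E_k> / (3 k_B). It is a statistic of the whole
    ensemble; no single molecule 'has' a temperature."""
    mean_ek = sum(kinetic_energies) / len(kinetic_energies)
    return 2.0 * mean_ek / (3.0 * K_B)

# Illustrative ensemble: per-molecule kinetic energies scattered
# around ~6.2e-21 J, the mean value at roughly room temperature.
rng = random.Random(42)
energies = [rng.uniform(0.5, 1.5) * 6.21e-21 for _ in range(100_000)]

print(temperature(energies))  # around 300 K for this ensemble
```

Any single entry of `energies` says nothing about the temperature; only the collective average does, which is the sense in which the property is emergent.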
Emergence is very intensely discussed in the context of complex systems. Another issue that is more and more discussed in the context of complex systems is the issue of reproducibility of certain results or experiments.
===Most of the research results we are talking about are a few decades old. Has there been an evolution in the field of complexity these past few years, or has research on complexity reached a bottleneck? ===

I think there was a very decisive point in time in the study of complexity. That was when people could buy high-power computing systems for relatively little money. The reason is obvious: you cannot analytically solve complex systems in most cases, and if you really want to study them, you have to run them in simulation studies. Everything that happened before the late 1970s was more or less heuristic: mathematicians had analytical examples for complex systems, and those examples were the simplest ones. After powerful computers came up, everything could be simulated. Then, the whole field exploded.
===You described diverse measures for complexity. Would it be possible to build a universal theory of complex systems? ===

There are differences in the notion of universality classes on the way from regular to chaotic behavior. I am not saying that complexity completely lacks any kind of universal behavior. But what we do not have is a compact set of equations that describes everything, like Maxwell’s equations. Maxwell’s equations resulted from the attempt of physicists to create a fundamental universal law for electromagnetism. In complex systems research, something like this has simply never happened. My intuition is that this is a fundamental problem in complex systems theory, and it is not simply that we have to work harder or for a longer time. Considering universality as a methodological pillar of scientific work, Peter Grassberger had an intuitive argument about this issue. He brings in the issue of meaning: for him, complexity is nothing else than the “difficulty of a meaningful task”. Thus, meaning implies subjectivity, which implies uniqueness, which is opposed to universality. That created some real controversy at the time in the study of complex systems, because people realized that when you try to import meaning as an explicit object of study in physics, then you are really not doing physics anymore. At that time, a lot of people considered this a no-go in physics. But Grassberger was courageous; he did it. I think this is interesting because it opens up a whole new level of discussion and deliberation. My favorite notion in this kind of discussion is contextuality. I would not contrast universality with the subjective but with the contextual.
*[http://link.springer.com/article/10.1007/BF00668821 Grassberger, Peter. "Toward a quantitative theory of self-generated complexity." International Journal of Theoretical Physics 25.9 (1986): 907-938.]

<html></article></html>
{{:Team:ETH_Zurich/tpl/foot}}

Revision as of 13:04, 17 October 2014

iGEM ETH Zurich 2014