Prof. Dr. Bertram Kienzle, Dr. Olaf Engler, Martin Lemke (M.A.) (Institut für Philosophie, Universität Rostock)
The Mathematical Description of the World between Formula and Intuition
22 February 2011, 7:00 p.m.
The extension of mathematics and of its application to nature is a persistent tendency of Western intellectual history. It begins, perhaps, with Thales of Miletus measuring the height of the Egyptian pyramids by the length of their shadows, and today, with the mathematics of quantum physics and general relativity, it is (one hopes) not yet at an end. Introductions to mathematics, however, invariably stress that mathematics is not a natural science and that scientific findings cannot refute it. This raises the questions of what mathematics is about, what metaphysical and ontological status mathematical objects have, and why something that is not about nature can nevertheless be applied to it.
Prof. Dr. Bertram Kienzle: Was Kant a Formalist?
Dr. Olaf Engler: Are Mathematical Formalisms Heuristic Tools?
Martin Lemke (M.A.): Are Sketches a Means of Proof?
Tobias Breidenmoser (M.A.) (Universität Rostock)
The Concept of Microtrabecular Lattice and the Epistemology of Scientific Experimentation
11. Oktober 2010, 19:00 Uhr
Experimental biologists often face the question of whether their experimental results are reliable and trustworthy or artifacts created by some kind of measurement process. Special caution is indicated if one detects novel entities that were previously unknown from other techniques; hence strategies and methodologies for separating real entities from artifacts are needed. For this reason, Allan Franklin (1986, 1990) has compiled a list of several strategies, such as calibration and independent confirmation, which are not infallible but at least give good reasons to believe that experimental results are reliable when they are applied. Moreover, William Bechtel (2006) has brought some of these strategies into a hierarchical order and suggests a three-step procedure for obtaining reliable experimental results, consisting of (1) searching for a distinguishable structure or pattern in the results, (2) comparing at least some results of the new technique with results obtained through already established techniques, and (3) making the results coherent with plausible theoretical accounts of the phenomenon.
The aim of my talk is to examine these strategies by means of a biological case study on the concept of the microtrabecular lattice. In the mid-1970s, formerly unknown elements of the cytoplasm called trabeculae were detected by Keith Porter and his colleagues using the new and more powerful technique of high-voltage electron microscopy (HVEM). Porter supposed that these entities connect other elements of the cytoskeleton and form a microtrabecular lattice enmeshing all cellular components and regulating axonal transport. Some cell biologists challenged the significance of Porter’s discovery by claiming that it is just an artifact created by methods of fixation and dehydration, whereupon Porter (1979) defended its reality. Franklin (1986, 184-189) pointed out that many of the epistemological strategies proposed in his own philosophical studies were used by Porter, namely experimental checks and calibration, independent confirmation using different experiments, elimination of possible sources of error, and using the results themselves to argue for their validity. He concluded that Porter was right and that trabeculae are not artifacts but real entities of the living cell.
Although many of Franklin’s strategies and even Bechtel’s three-step procedure support the reality of the microtrabecular lattice, it has turned out to be an artifact. Pawley and Ris (1987) showed that the microtrabecular lattice is created by the insufficient application of critical-point drying and freeze-drying. It appears not only in animal cells but also in actin solutions after incomplete dehydration, yet can be observed neither in cells nor in actin solutions if the methods are applied adequately. Further studies indicated that trabeculae are not novel entities but that the microtrabecular lattice is largely composed of actin filaments that have been partially agglutinated and decorated with irregular condensations of what may have been soluble cytoplasmic proteins before dehydration. Moreover, the invention of video-enhanced contrast microscopy made it possible to surpass the resolution limits of conventional light microscopy, and Robert D. Allen and colleagues (1985) were able to obtain direct observations of microtubules, which demonstrated their active role in axonal transport and disproved the hypothesis that contractions of the microtrabecular lattice might provide the force for axonal transport.
The concept of the microtrabecular lattice indicates that the epistemologies of experiment suggested by Franklin and Bechtel are deficient. Franklin’s strategy of letting the results themselves argue for their validity (Bechtel’s first step) is just a necessary precondition for asking whether the results are real or artificial; the consistency and arrangement of experimental results cannot exclude the possibility of artifacts. Calibration and indirect validation (Bechtel’s second step) are not sufficient either. A new technique that produces novel phenomena cannot be completely unreliable if some of its results can be confirmed by other techniques, but one cannot conclude that all of its results are reliable. Bechtel’s third step, the integration of experimental results into a theoretical account, likewise cannot guarantee their reliability, because the plausibility of a theoretical account can change in the light of new evidence. After investigations of axonal transport in living cells using video-enhanced contrast microscopy, the microtrabecular lattice was no longer a plausible explanation of axonal transport. Therefore, we have to weaken Bechtel’s third criterion: newly detected phenomena have to be compatible with already established scientific knowledge that is robust and reliable. But Bechtel’s three steps are not enough to secure such reliability, so we need additional strategies.
Whereas Bechtel’s three steps are insufficient, other strategies of Franklin’s are misguided. Predicting a phenomenon by an independent, well-corroborated theory can go wrong: the microtrabecular lattice was predicted by theoretical conceptions in the first half of the 20th century and has nevertheless turned out to be an artifact. History shows that wrong theories can successfully predict entities that later turn out to be artifacts. This not only demonstrates that an underlying theory is not enough to justify an experimental result but also undermines the defense of scientific realism based on novel predictions. Similarly, the elimination of possible alternatives is itself fallible. Whereas the model fibers investigated by Porter and colleagues did not show an artificial lattice structure, Hans Ris was able to produce such structures artificially.
I will argue that the most promising strategy for separating real entities from artifacts and for securing the reliability of experimental results is independent confirmation using different experiments, nowadays called robustness analysis (Weisberg 2006). Porter and colleagues themselves tried to apply this strategy to the microtrabecular lattice but failed because the underlying theories of the various preparation techniques for HVEM were not sufficiently independent. According to Ian Hacking’s (1983) argument from coincidence, which underlies robustness analysis, it would be miraculous if completely different instrumental procedures led to the same experimental results that were nevertheless artifactual. However, the artifactual nature of the microtrabecular lattice was not miraculous, because its reality was not the only explanation of the concordance of the experimental results. Robustness analysis is applicable only if all relevant alternative explanations can be eliminated and there is no reason to believe that a further non-trivial explanation that could be true has been overlooked. Therefore, the strategy of independent confirmation rests on an inference to the only explanation (Bird 2007), similar to a local no-miracles argument.
In conclusion, a powerful epistemology of experiment necessarily includes Bechtel’s first and second steps, the modified, weaker third step, and an additional step. Robustness analysis is a sufficient fourth step, though there may be alternatives.
Allen, Robert D., Weiss, Dieter G., Hayden, John H., Brown, Douglas T., Fujiwake, Hideshi and Simpson, Marcia. 1985. “Gliding Movement of and Bidirectional Transport along Single Native Microtubules from Squid Axoplasm: Evidence for an Active Role of Microtubules in Cytoplasmic Transport.” Journal of Cell Biology 100: 1736-1752.
Bechtel, William. 2006. Discovering Cell Mechanisms. Cambridge: Cambridge University Press.
Bird, Alexander. 2007. “Inference to the Only Explanation.” Philosophy and Phenomenological Research 74: 424-432.
Franklin, Allan. 1986. The Neglect of Experiment. Cambridge: Cambridge University Press.
Franklin, Allan. 1990. Experiment, Right or Wrong. Cambridge: Cambridge University Press.
Hacking, Ian. 1983. Representing and Intervening. Cambridge: Cambridge University Press.
Pawley, J. and Ris, Hans. 1987. “Structure of the Cytoplasmic Filament System in Freeze-dried Whole Mounts Viewed by HVEM.” Journal of Microscopy 145: 319-332.
Weisberg, Michael. 2006. “Robustness Analysis.” Philosophy of Science 73: 730-742.