Edited by Niall Shanks
Amsterdam-Atlanta, GA: Rodopi, 1998
ISBN 90-420-0642-0 (b)
This paper has three main sections: (1) A thought experiment is a mental representation (a mental model) of an experiment in which a result is derived by a process of reasoning that employs substantive assumptions about how the world works (Nersessian 1993). I defend this account on grounds of utility: it fits smoothly into our best understandings of the historical trajectory of actual thought experiments, and it offers insights into how thought experiments work. (2) Thought experiments can test how unified a scientific theory is. They can play a role in rational theory choice if unification is a theoretical virtue. (3) Ironically, the use of unactualized thought experiments to test a theory typically betrays a commitment to a kind of realism that is incompatible with van Fraassen's (1980) views about the aim of science.
Philosophers of science often complain about the multiplicity of meanings attached to the numerous technical concepts employed in the language of science. The concept of a model is claimed to be a typical example. In this paper we argue that this view is false, illustrating the point with the alleged ambiguity of the term "model." What are taken to be models in the language of science exhibit properties homogeneous enough to warrant one and the same term. We list the main meanings of the term "model" (1), and after explaining the adopted assumptions (2), we specify the central meaning (3) and indicate the connections between it and the derivative ones (4). Finally, the same method is applied to another allegedly ambiguous term, "experiment," in order to argue that there is a rationale in everyday scientific usage for calling both mental and factual experiments by the same term (5).
We present an account of idealization in quantum mechanics from the perspective of the semantic approach to theories. Using the notions of "partial structures" and "partial isomorphism," we give an account of how the models of quantum mechanics relate to systems in the world. As a case study we take the calculation of atomic structure by means of the self-consistent field approach, and our general conclusion is that, contrary to the claims of recent commentators, the models of quantum chemistry count as "quantum mechanical" on the twin grounds that they can be related via partial isomorphisms to models satisfying Schrödinger's equation and that they incorporate Pauli's Exclusion Principle.
The decoherence approach is one of the most recent and widely debated theories of quantum measurement. To some it is the most promising; to others, yet another non-starter. The aim of this paper is two-fold: to discuss the theory as a case study of idealization and approximation in theory building, and to articulate an objection to the decoherence approach. The proponents of the decoherence theory see the chance of a felicitous marriage between it and the modal interpretation, a recent and widely debated interpretation of quantum theory. There is high hope that the two together constitute a genuine solution to the age-old interpretation problem of quantum physics. Against both, separate objections are raised to the effect that even though one or both of them can handle cases of ideal measurements, they fail when it comes to non-ideal or approximate ones (not the same thing, as will be shown). A discussion of this situation provides a perfect illustration of the role of modelling and idealization in quantum physics.
This paper explores various functions of idealizations in quantum field theory. To this end it is important first to distinguish between different kinds of theories and models of, or inspired by, quantum field theory. Idealizations have pragmatic and cognitive functions. Analyzing a case study from hadron physics, I demonstrate the virtues of studying highly idealized models for exploring the features of theories with an extremely rich structure, such as quantum field theory, and for gaining some understanding of the physical processes in the system under consideration.
Idealization presents well known problems for realist views of theories: truth cannot be what we want in our theories if we knowingly introduce falsity into our theoretical treatments. One attractive defense of realism has it that, although idealized models themselves falsely represent their subjects, they stand in for intractable exact treatments in the theoretical deductions from true premises that constitute explanations and predictions. In this paper I will argue that on this proxy view of idealized models, certain relations are to be expected between theory and model, relations that fail to hold between quantum mechanics and its applications in quantum chemistry. What is needed is a view of what makes a molecular model a good one that is independent of its relation to some exact treatment.
This paper explores the role of ideal elements in empirical science by examining Einstein's commitment to simplicity as a criterion of theory choice, which Einstein described as a "Platonist" aspect of his epistemology. Einstein's holistic and underdeterminationist model of the role of empirical evidence in theory choice provides the context, with simplicity considerations taking over where evidential constraints carry one no further. An affinity is noted between Einstein's view and Schlick's "logical" justification of simplicity as requiring us to choose, among empirically equivalent theories, the one with the fewest "arbitrary" elements. But the most important question is why Einstein insisted that while, in principle, empirical evidence underdetermines theory choice, in practice considerations of simplicity or the "inner perfection" of theories yield apparent determinateness in theory choice.
All theoretical sciences depend upon idealizations. Cosmology's most important idealization is The Cosmological Principle, which holds that the large-scale universe is everywhere uniform. But the Principle was controversial from the moment of its first proposal by E.A. Milne. Important debates in epistemology and methodology focussed upon its role and status. These debates shaped the course of cosmology for over a generation, culminating in the creation and deployment of the Steady State Theory of the Universe, a major competitor to the previously dominant cosmology founded on the General Theory of Relativity. A case history of this development is presented, based upon published articles, correspondence and interviews with participants.
In this paper I shall contrast two accounts of the role of idealization in the development of the principle of equivalence. One involves idealizations connected with the elevator thought experiment. I shall argue that this version leads to a dilemma typical of the use of idealizations: the initial idealization which gives physical insight into the problem is an incorrect description of the phenomena, while the corrected version which accurately saves the phenomena gives no physical insight. The second approach involves idealizations surrounding the concept of a rigid body and does not lead to this dilemma. I shall argue that this approach played an important part in the development of Einstein's thought. Furthermore, it provides reasons for a realist reading of the spacetime of general relativity.
This paper discusses the ways in which different accounts of idealization depend on different notions of stability. We argue that traditional views (in terms of quantitative approximation) as well as the Cartwright-Ellis view (in terms of natures) presuppose notions of stability that are too narrow to allow understanding of idealization in the context of nonlinear dynamics. We describe a more general notion of stability which is appropriate also for nonlinear dynamics and use it to provide an account of idealization which places emphasis upon qualitative as opposed to quantitative approximation.
Extending a recent trend in philosophy, we use a description of experimental practice in nonlinear dynamics as a lens to focus questions of ideals and idealization in chaos physics on the one hand, and questions of a general philosophy of experiment on the other. We find that the most promising general account of rationality in experimental practice, able to explain why a certain level and type of idealization in experiment is necessary, is the late medieval tradition of natural magic, itself an innovation in the even more ancient Aristotelian tradition of practical rationality.