Synopsis

Mention idealization in the context of physics, and point masses, rigid rods, and frictionless planes naturally spring to mind, for these were the idealizations we first encountered as we gained our basic training in physics. But what exactly is an idealization? What is the difference between a mathematical idealization and a physical idealization? What are the different roles played by idealization? And what are the central idealizations to be found in contemporary, twentieth century physics? These are some of the interesting and difficult questions taken up by the contributors to this volume, which is dedicated to an exploration of this topic.

The first two essays in this volume aim to clarify the concept of idealization, and to examine its implications for the use of thought experiments in contemporary physics. In An Epistemological Role for Thought Experiments, Michael Bishop discusses the role of thought experiments in the development of twentieth century physics.

Thought experiments provide an important context in which to explore intuitions about the role of idealization in physics. Bishop attempts to answer the question of why a rational person would allow an imagined, possibly even unactualizable, situation to influence their view of the world. Bishop begins by defending the view that thought experiments are mental representations of experiments grounded in substantial assumptions about the nature and structure of the world.

But this view of thought experiments leaves us with an epistemological puzzle: why should something imaginary lead us to modify our views about the nature of the world around us? Are thought experiments, as Feyerabend suggests, mere psychological trickery? Bishop allows that thought experiments may have rhetorical force, but defends the view that they may nevertheless play an important role in rational theory choice. Bishop defends the claim that thought experiments serve to test how unified a scientific theory actually is. Roughly speaking, theories are unified to the extent that they are capable of explaining a wide range of phenomena with an economical range of specific explanatory strategies (unification is a matter of degree). Bishop illustrates this claim through an analysis of the Einstein-Podolsky-Rosen paradox, a result which appears to show that there is something inadequate in the explanatory base of the quantum theory.

Bishop ends by arguing that the ubiquity of thought experiments in contemporary physics forces a rejection of van Fraassen's view that the aim of science is the modest one of accounting for "observables." Instead, he sees in the ubiquity of thought experiments a commitment to a species of realism. Science must offer a coherent account of "unobservables" too, and thought experiments are the devices by which this feat is to be accomplished.

In "Models" and "Experiments" as Homogeneous Families of Notions, Izabela Nowak and Leszek Nowak aim to disambiguate the concepts of "model" and "experiment," by differentiating between the central meanings of these terms and those meanings which are derivative. With regard to the term "model," Nowak and Nowak differentiate between models as idealizations, analogical models and models as devices to simulate phenomena of interest.

Central to the approach adopted by Nowak and Nowak is the idea that the models constitutive of scientific theories are idealizational constructs. In this sense, such models serve neither as instrumentalist shorthand notational devices to save the phenomena, nor as clusters of factual hypotheses concerning objects and properties in the actual world. Instead, such models are best viewed as distortions of real systems: "caricatures," in the terminology of Nowak and Nowak. They do not claim that all of science is a caricature of the world, but they perform a valuable service in recognizing the role played by deliberate distortion in the construction of scientific models. And this, in itself, is a useful antidote to the naive realism that would have it that all good science worthy of the name should in some sense mirror or reflect "reality."

Inherent in Nowak and Nowak's conception of a model as a caricature is the idea that laws governing the properties of the idealized situation may, under certain circumstances, be concretized through the admission of previously ignored features of the real situation, and the consequent modification of the laws in the model. The complexity of this concretization process becomes clear from Nowak and Nowak's discussion of the concepts of "dialectical correspondence" and "dialectical refutation."

Nowak and Nowak argue that the central concept of "model" in science is captured by the idea of model as idealizational construct. Models as analogies and models as devices for the simulation of phenomena are derivative notions, the former serving a role in the formation of idealized constructs, the latter serving a role in the testing of such constructs for empirical adequacy. There are thus several meanings attaching to the term "model" in science, but Nowak and Nowak argue convincingly that these meanings are not unconnected with each other, and that the concept of model as idealizational construct is both conceptually and functionally prior to the other, derivative, notions of model.

Building on the idea that idealizational constructs lie at the heart of science, Nowak and Nowak move on to consider the well-known dichotomy between real experiments and thought experiments. Nowak and Nowak conclude that while there are obvious differences between real experiments and thought experiments, they are nevertheless complementary procedures in the business of science.

The next four essays in this volume concern various aspects of idealization as it appears in quantum theoretic contexts. In A Semantic Perspective on Idealization in Quantum Mechanics, Steven French and James Ladyman are concerned with idealizations of the objects of physics themselves. In classical physics, we encounter such idealized objects in the form of point masses, rigid rods and frictionless surfaces. But what of quantum mechanics?

French and Ladyman begin by examining the distinction, drawn by Nancy Cartwright, between the idealization of concrete situations to yield empirical, phenomenological laws that are approximately true of real, concrete objects; and the kind of idealization that involves abstraction away from concrete situations, and leads to a consideration of abstract, fictional entities such as square well potentials and Hilbert spaces. For Cartwright, the laws describing such abstract entities literally lie if we interpret them as being laws concerning real, concrete physical objects.

French and Ladyman criticize Cartwright's distinction between the concrete and the abstract, and between phenomenological laws and fundamental theoretical laws. They argue that these distinctions are hard to draw with analytical precision, that even phenomenological laws involve some degree of abstraction, and that the objects such laws describe (harmonic oscillators, simple pendulums, and so on) are themselves contaminated by varying degrees of abstraction. By contrast, French and Ladyman attempt to articulate a unified account of theories and models, one that provides a rationally coherent view of the relationship between the theoretical models of the quantum theory and systems in the real world. In the course of their analysis, they explain and defend a semantic conception of physical theories as classes of mathematical models, according to which the idealized systems of physics are "exemplars for the application of the theory."

In Decoherence and Idealization in Quantum Measurement, Chuang Liu considers two interrelated puzzles lying deep at the core of quantum mechanics. The first concerns the apparent indeterminateness of quantum objects: their general failure to take exact values for observables ascribed to them by the theory, in particular outside of measurement contexts. This failure prevents us from viewing the denizens of the microcosm as billiard balls writ small. The second puzzle concerns the quantum theory of measurement and the so-called "collapse of the wave packet": the account from within the quantum theory of how observables come to show definite values upon measurement, and the source of indeterminism in orthodox presentations of the theory. These two problems, among others, have inclined theorists to adopt an instrumentalistic interpretation of the theory, according to which it is to be viewed as a calculational tool to account for measurement results and their probabilities.

The central concern of Chuang Liu's essay is the decoherence approach to the problem of measurement in quantum mechanics. His discussion of this theory begins by treating it as a case study of the role played by idealization and approximation in the general business of theory construction. But he also goes on to lay a serious objection at the door of the decoherence approach. While the decoherence theory can offer an account of ideal measurements, it runs into trouble in the context of nonideal or approximate measurements.

It is commonplace in discussions of idealization to connect the concept of idealization with that of approximation. But Chuang Liu contends that this connection may hold only for classical theories. In that context, a good idealization is one that yields a good approximation to the natural phenomena of interest. Classically, it is commonplace to say that when a quantity is small or quickly approaching zero, its effects will be correspondingly small or negligible. (Sharp and Rueger question this assumption even in classical contexts, but it is nevertheless a common assumption.)

Chuang Liu points out that classical intuitions concerning idealization and approximation do not go over in any straightforward way into the quantum realm. Quantization itself cuts one off from considerations of arbitrarily small effects. Moreover, quantum states differ in relevant ways from classical states, so the fact that the tails of a particle's (position) wave packet are small in a certain region does not entail that the effect of the particle in that region will be small or negligible (though it does imply that the probability of the particle becoming localized in such a region is small).
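A minimal formal gloss on this point (my illustration, not Liu's own notation): suppose the position wave function $\psi$ has only a small tail in a region $R$, so that

\[
\int_{R} |\psi(x)|^{2}\,dx = \varepsilon \ll 1, \qquad \psi = \psi_{\bar{R}} + \psi_{R}, \quad \psi_{R} \neq 0 .
\]

The probability of finding the particle in $R$ is only $\varepsilon$, but because the tail $\psi_{R}$ does not vanish, the state remains a superposition rather than an eigenstate of being located outside $R$; on a strict eigenstate-eigenvalue reading the particle then has no definite position property at all, and the smallness of $\varepsilon$ does not by itself license treating the tail as dynamically negligible.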

In Idealization in Quantum Field Theory, Stephan Hartmann differentiates between pragmatic and cognitive functions of idealization. His analysis takes place in the context of a case study derived from hadron physics. It is Hartmann's contention that a failure to appreciate the multiple functions served by idealizations in science has led philosophers and scientists alike to neglect a proper study of the phenomenon of idealization, and its importance in our understanding of scientific theories.

Idealizations are not merely devices serving pragmatic ends. For example, they are not merely devices enabling scientists to treat otherwise untreatable problems, especially problems that result from extreme mathematical complexity. Indeed, it is Hartmann's claim that this pragmatic motive for idealization is to some extent being undermined by developments in computer science and technology that have rendered hitherto intractable computational problems tractable. Hartmann's study of hadron physics leads him to the conclusion that idealizations serve cognitive functions that go well beyond their pragmatic functions by providing investigators with insight into highly complex hadron dynamics. Like Michael Bishop, Hartmann argues that an appreciation of the role of idealization in physics should lead us to suspect the correctness of van Fraassen's view that the aim of science is simply empirical adequacy. Idealizations are devices that serve to enrich our understanding of physical processes. They are not there simply to help save the phenomena. Hence, idealizations have intrinsic theoretical virtues. They are not merely necessary vices to enable us to get a computational handle on an otherwise complicated and messy world. Hartmann has provided a valuable service by embedding his analysis of the multiple roles served by idealizations deeply within concepts and constructs derived from some of our current best theories of the microcosm.

In Models and Approximations in Quantum Chemistry, Robin Hendry explores the role of idealization in the context of the relationship between quantum mechanics and quantum chemistry. He considers the following question: can the idealized models of quantum chemistry be rationalized as approximations to exact quantum mechanical equations for molecules? The traditional view holds that they can.

This view is discussed in the first section of his paper, where the implications of Hempel's views concerning the application of general theories are analyzed. The upshot of this discussion is that the traditional view holds that the idealized models of quantum chemistry ought to serve as proxies for exact quantum mechanical analyses. Whatever the idealized models explain, the exact quantum mechanical analyses should explain too, if only we knew how.

But as Hendry goes on to point out, a number of commentators have expressed skepticism concerning the traditional view: the idealized models differ from the exact treatments in ways that are relevant from the standpoint of the explanation of phenomena of interest. The idealized models of quantum chemistry have explanatory features that are absent in the exact quantum mechanical treatments, where such treatments are known.

In the light of his discussion, Hendry suggests that we need to evaluate idealized models in ways that are independent of their relationships to exact treatments. Such models should be evaluated on the basis of their ability to save the phenomena. Idealized models will not, of course, make predictions that are exactly true, but good models ought to come close; moreover, they ought to be improvable. Hendry concludes that in the end what matters is the model itself, and not its role as a means to some end in grander theoretical and philosophical projects.

The next three essays in this volume address the issue of idealization in cosmology and relativity theory. In Astride the Divided Line: Platonism, Empiricism, and Einstein's Epistemological Opportunism, Don Howard considers the role of simplicity in Einstein's scientific epistemology. Howard reminds us that for the Platonist geometrical and mathematical knowledge is knowledge of an intelligible realm, a realm to be sharply separated from the domain of sense experience, concerning whose objects we form mere opinion.

It is Howard's contention that there are vestiges of Platonism in modern empirical science. One of these concerns the pervasiveness of idealizations in modern physics, idealizations that are not infrequently the subjects of our most fundamental laws. The other concerns the role played by simplicity in guiding the choice of theory. It is Howard's further argument that Einstein is perhaps the only important philosopher of science in our century who embraces these vestiges of Platonism as being essential to good science. From his analysis it emerges that the importance of idealizations in physics derives from the importance of simplicity as a criterion guiding theory choice.

For Einstein, like Duhem, theory choice was a matter that was underdetermined by evidence in the sense that for a given body of evidence, there would always be a multiplicity of empirically equivalent theories that would be compatible with it. Moreover, such empirically equivalent theories may provide us with radically different, indeed incompatible, ontologies. It was in this context that Einstein put great emphasis on simplicity as a criterion guiding theory choice.

Empirically equivalent theories may nevertheless differ with respect to simplicity, and by simplicity Einstein meant not merely logical simplicity but also "naturalness" and "inner perfection." In the context of investigations in fundamental physics, where the evidence is at best remote, simplicity becomes an important guiding light for the theorist. According to Howard, this "was Einstein's route to Platonism in empirical science."

Howard goes on to argue that it is in this light that we should interpret the role of idealization in Einstein's philosophy of science. The central reason why basic laws refer to idealizations is that they represent the ideal of simplicity. This view is not committed to an abandonment of considerations based on empirical evidence, for such evidence most assuredly constrains theory choice. Nevertheless, evidence does not uniquely determine theory choice, and Einstein's philosophy of science is formulated in the light of this fact.

In Idealization in Cosmology: A Case Study, George Gale presents a historical analysis of the methodological and epistemological roles played by E.A. Milne's cosmological principle, according to which the large-scale universe is everywhere uniform. This principle is one of the central idealizations found in twentieth century cosmological inquiry. As Gale points out, debates concerning this principle shaped the course of cosmological theorizing for over a generation, leading ultimately to the Steady State Theory of the Universe.

In cosmological theory, idealization in the form of simplifying assumptions is required to render difficult mathematics manageable. Gale begins by considering analogies between models and maps. From this standpoint, models, like maps, involve abstraction in the form of the selective representation of some features of the real situation, and the rejection of other such features. But models, like maps, also involve idealization in the sense that the features represented in the model have been "smoothed out." Unlike abstraction, which involves the loss of features, idealization involves the "perfection" of features represented in the model. According to Gale, when scientists speak of "ideal x's" they are speaking of models of x's which have been constructed through both abstraction and idealization. And this turns out to be true of cosmological models employing the cosmological principle.

Gale performs a valuable historical service by clearly demonstrating the different conceptual and methodological routes to the cosmological principle taken, on the one hand, by Robertson and the relativistic cosmologists, and on the other, by Milne and the kinematic relativists. For the former group of theorists, the cosmological principle was something induced from observation. For the latter theorists, the principle lived in a hypothetical model from which useful predictions might be deduced. Gale sees in this conflict over the status of the cosmological principle a manifestation of a much older methodological debate between inductivists and hypothetico-deductivists. And as Gale goes on to argue, echoes of these debates can be found in Bondi's work on Steady State Cosmology. Gale's essay is valuable not merely for what it teaches us about the nature of idealization itself, but also for what it teaches us about the role played by idealizations, and reactions to them, in the history of the development of cosmology.

In Idealization, Heuristics and the Principle of Equivalence, Anna Maidens considers the role played by idealization in the development of Einstein's general theory of relativity (GTR). The focus of her inquiry concerns the role played by idealization in two crucial thought experiments employed by Einstein to motivate central features of GTR: the falling elevator and the rotating disk.

Maidens is concerned with the following question: what role does idealization play in uses of the principle of equivalence to support the adoption of the field equations of GTR? The "textbook" story about the principle of equivalence employs the elevator thought experiment to motivate the claim that there is an equivalence between the effect of a gravitational field and the effect of uniform acceleration. The trouble is that the situation envisaged in the falling elevator thought experiment is not a situation found in nature: objects in the elevator, falling radially toward the center of the earth, would approach each other in the course of the elevator's descent.
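As a rough illustration of this convergence (my gloss, not Maidens's own calculation): two test bodies a horizontal distance $d$ apart, both falling freely at radius $r$ from the earth's center, fall along lines that meet at the center, and so accelerate toward one another at approximately the tidal rate

\[
a_{\mathrm{rel}} \approx \frac{GM}{r^{2}}\cdot\frac{d}{r} = \frac{GM\,d}{r^{3}},
\]

which vanishes only in the limit $d \to 0$; this is precisely what motivates the "infinitesimal" versions of the principle discussed below.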

To save the situation, some theorists introduce an infinitesimal principle of equivalence, and consider what goes on in an infinitesimal region around a point. But this move is not satisfactory, for as Maidens points out, infinitesimal regions do not have enough structure defined on them to differentiate geodesics from other lines, and hence to provide a solution to the problem of inertia. Maidens, following the work of Norton, thinks we need to rethink the role played by the principle of equivalence in the development of GTR. In particular, she agrees with Norton that the principle of equivalence functioned in Einstein's thought as a device to explore extensions of the principle of the relativity of inertial motion to uniformly accelerated frames of reference.

Maidens argues that the role played by idealization in the context of the principle of equivalence has implications for the debate between realists and anti-realists. In this case, making sense of idealization involves the presupposition of realism. To see the idealization for what it is involves having true beliefs about the world; as Maidens puts it, the idealization becomes "a foil to some realistic construal of the situation." In the present case, it is a foil to a realistic view of the nature of spacetime.

The final two essays in this volume grapple with the implications of chaos theory and nonlinear dynamics for questions concerning idealization. These essays raise some serious challenges for several contemporary views concerning the nature of idealization in physics.

In Idealization and Stability: A Perspective from Nonlinear Dynamics, David Sharp and Alex Rueger examine issues surrounding idealization in contemporary physics from the standpoint of chaos theory and nonlinear dynamics. As our understanding of nonlinear dynamics, as well as its relevance for the study of basic physical processes, has expanded over the last two decades, it has become apparent that a number of traditional philosophical issues raised by the physical sciences have had to be re-examined. Such issues include questions about the nature of determinism, the nature of scientific explanation and matters pertaining to the nature and role of mathematical models in physics.

Sharp and Rueger ask the following question: how and why do idealized systems teach us something about real systems? They contend that an examination of nonlinear dynamics has implications for traditional accounts of idealization by focussing our attention on distinct concepts of stability - concepts which are associated with sharply contrasting conceptions of idealization. In particular, they differentiate between stability with respect to small changes in initial conditions, for systems described by qualitatively similar dynamics, on the one hand; and structural stability (stability with respect to the quality or kind of dynamics describing systems of interest) on the other. Both types of stability, as well as the corresponding instabilities, are found in the study of nonlinear dynamics.

Sharp and Rueger contend that a general theory of idealization, one that is not restricted to a narrow range of specialized systems, requires a concept of stability that cannot be accommodated by what they consider to be the two main approaches to idealization: (a) quantitative approximation of the real system; or (b) isolation of the "natures" or "essences" of real systems.

With respect to idealizations as quantitative approximations of real systems, Sharp and Rueger contend that a failure of quantitative approximation can arise from sensitive dependence on small changes in initial conditions in the model itself - a source of failure of quantitative agreement that has nothing to do with the effects of extraneous factors on real systems, effects to be explained away in terms of ceteris paribus clauses.
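As an illustrative sketch of this point (mine, not drawn from the essay itself), consider the logistic map, a standard toy model of chaotic dynamics: two trajectories whose initial conditions differ by one part in ten billion diverge after a few dozen iterations, so quantitative agreement between model and system fails even though nothing extraneous has been left out of the model.

```python
# Illustrative sketch (not from Sharp and Rueger's essay): sensitive dependence
# on initial conditions in the logistic map x_{n+1} = r * x_n * (1 - x_n).
def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the logistic map from x0 and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)  # nearly identical initial condition

# The separation between the two trajectories grows rapidly; after roughly
# thirty iterations the states are effectively uncorrelated.
for n in (0, 10, 20, 30, 40, 50):
    print(f"n={n:2d}  |a - b| = {abs(a[n] - b[n]):.3e}")
```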

With respect to idealizations as attempts to model the "natural tendencies" or "essences" inherent in real systems, it has been contended that an idealized system does explanatory work when it permits the study of real systems either in ideal circumstances or even outside of any context whatsoever (ideal or otherwise). This view of idealization presupposes structural stability: that the quality of a system's dynamics does not change from one context to another, or simply from the introduction of a context. As Sharp and Rueger point out, there exist physical systems for which structural stability does not obtain. Their discussion of these matters leads into an analysis of different concepts of structural stability, and their implications for the idealization question.

In Towards a Very Old Account of Rationality in Experiment: Occult Practices in Chaotic Sonoluminescence, Lynn Holt and Glynn Holt focus on the role of idealization in experimental practice in the field of nonlinear dynamics. Their conclusion is that an account of rationality in experimental practice leads back to a reconsideration of ideas derived from the late medieval tradition of natural magic. Holt and Holt embed their analysis in the particular details of experiments concerning the phenomenon of sonoluminescence in nonlinearly oscillating bubbles, experiments performed by Glynn Holt.

Air bubbles in water, subject to acoustic pressure, can oscillate so wildly that they emit pulses of light. This is the phenomenon known as sonoluminescence. Such sonoluminescent systems exhibit a complex array of dynamical behaviors, including chaotic behaviors. While the systems are currently subject to experimental inquiry, a theoretical understanding of the phenomenon has so far eluded researchers. Holt and Holt first show that attempts to idealize sonoluminescent systems as systems with linear dynamics fail to save the phenomenon. And this discussion is enhanced by a very clear account of the experimental work that led to the discovery of nonlinear bubble dynamics. As argued by Glynn Holt, a number of candidate mechanisms exist to explain the phenomenon, but at present there appears to be no way to decide between them.

In the light of their presentation of details of experimental practice, Holt and Holt consider experimental situations as a context for idealization: the phenomena under analysis are often manufactured under contrived circumstances, and this is especially true of chaotic sonoluminescence, which almost certainly does not exist outside the laboratory. It is this issue that Holt and Holt wish to understand. How are we to make sense of experiments designed primarily to create new phenomena by anomalous manipulation?

To answer this question, Holt and Holt are led back to the medieval tradition of natural magic in this sense: the medieval tradition embodied the idea that the secrets of nature are not available to mere casual inspection, but must be made manifest by suitable manipulations; hence the contrast within that tradition between occult (hidden) and manifest properties, the former to be revealed by the natural magician. Thus Holt and Holt think that scientific activity is not to be understood through abstract debates about "scientific method," but rather needs to be conceptualized through considerations of the practical rationality of experimenters revealing nature's secrets.

Niall Shanks
Department of Philosophy and Department of Biological Sciences
East Tennessee State University
USA