Abstracts


Joseph Agassi — Why There Is No Theory of Models

The original concept of the model belonged to Cartesian metaphysics, which was rejected early but was transmitted into Newtonian metaphysics. The ensuing aversion to metaphysics led to a reluctance to analyze the concept. It is the concept of generalized initial conditions, and it is still unclear. It cries out for examination, especially in view of the dilemma posed by the doctrine of emergence, which is now increasingly popular. First we have to give up the hostility to metaphysics and acknowledge that at its best metaphysics is a powerful scientific heuristic, and so it often acts as an ally of scientific research.


Malgorzata Czarnocka — Models and Symbolic Nature of Knowledge

I analyze mathematical models in a very broad sense, taking into consideration their construction in the process of acquiring knowledge. The concept of a cognitive system is introduced to grasp the character of knowledge creation; this concept replaces the traditional conception of a pair of distinct objects, namely the subject and the object of cognition. The analysis leads to the conclusion that scientific knowledge is of a symbolic yet referential character (though at the cost of a serious revision of the concept of reference); it is of neither a copy nor an instrumentalist nature. It is claimed that models are symbolic, abstract pictures of objects, by no means to be compared with pictures of a photographic kind. The symbols contained in a model are not created freely, nor are they expressions of the subject's goals, interests, etc. Nor are they biological or social constructs. Their function resembles symbolizing in art and, in general, in all human culture: the object of cognition is "hidden" in the content of the model, yet the model gives some (indirect, vague, partial, and abstractly presented) information about the intended object of cognition.


Adam Grobler — The Representational and the Non-Representational in Models of Scientific Theories

The realist-antirealist issue is put in terms of which features models are meant to represent and which they are not. The concept of the empirical content of a theory is discussed against the contrast between van Fraassen's and Shapere's conceptions of observability. Adopting the latter, the concept of the theoretical content of a theory is defined in a way which admits its possible relations to the empirical contents of other theories. The empirical and the theoretical contents of a theory are then claimed to be on a par as regards their possible representational function. The latter is claimed to be capable of being established by an appropriate theoretical discussion of the appropriateness of the idealizations used in a given context of inquiry. The cognitive (empirical + theoretical) content of a theory is distinguished from its mathematical structure. The latter is claimed to be the raw material of representation rather than representation itself. Its choice is said to be a matter of expressive power rather than of truth. The resulting thesis of mathematical instrumentalism is meant to set empirically acceptable, and as broad as possible, limits to the realistic interpretation of scientific theories.


Stephan Hartmann — Models As a Tool for Theory Construction: Some Strategies of Preliminary Physics

Theoretical models are an important tool in many aspects of scientific activity. They are used, e.g., to structure data, to apply theories, or even to construct new theories. But what exactly is a model? It turns out that there is no proper definition of the term "model" that covers all these aspects. Thus, I restrict myself here to evaluating the function of models in the research process, while using "model" in the loose way physicists do. To this end, I distinguish four kinds of models: (1) models as special theories, (2) models as a substitute for a theory, (3) toy models, and (4) developmental models. I argue that models of types (3) and (4) are particularly useful in the process of theory construction. This is demonstrated in an extended case study from high-energy physics.


William Herfel — Nonlinear Dynamical Models As Concrete Construction

Giere, in developing his realist response to van Fraassen's constructive empiricism, has provided a detailed account of scientific models. Based on a close analysis of the role that the simple harmonic oscillator plays in physics, Giere's account gets us closer to an understanding of how models work in the actual process of science than do accounts relying on formal considerations based on the semantic account of theories (e.g. the work of Sneed, Stegmüller, Suppe, and Suppes). Nevertheless, Giere's account is not comprehensive, for he considers models only as abstract entities. I argue that many models, some of which are mathematical, cannot be accounted for within Giere's Platonistic theory. Instead, the role played by, for example, the Lorenz model of turbulent atmospheric flow is best understood within the context of an account established on a firm basis of concrete modeling in experimental practice.


Elzbieta Kaluszynska — Styles of Thinking

In the paper, a reason for the failure of the neopositivist program of creating a logical theory of science is examined, and directions for further research are indicated.


Stathis Psillos — The Cognitive Interplay between Theories and Models: The Case of 19th Century Optics

The aim of this paper is to revive, articulate, expand, and illustrate an approach to models defended by Mary Hesse and Peter Achinstein. The prime characteristics of this approach, which I call the analogical approach, are three: first, it focuses on models of physical systems rather than on models of theories; second, it stresses the role of substantive analogies in model construction; and third, it allows that models may be substantive means for discovering the furniture of the world. I also present a case study on the development of research in optics during the nineteenth century, showing how the analogical approach can capture and explain the basic features of this case. In doing so, however, I suggest that the analogical approach needs to be augmented and improved by taking more fully into account the role of background theories and theoretical frameworks in suggesting, choosing, and evaluating models.


Nancy D. Cartwright, Towfic Shomar and Mauricio Suarez — The Tool-Box of Science

We call for a new philosophical conception of models in physics. Some standard conceptions take models to be useful approximations to theorems and to be the chief means of testing theories. Hence the heuristics of model building are dictated by the requirements and the practice of theory-testing.

In this paper we argue that a theory-driven view of models cannot account for common procedures used by scientists to model phenomena. We illustrate this thesis with a case study: the construction of one of the first comprehensive models of superconductivity, by the London brothers in 1934. Instead of a theory-driven view of models, we suggest a phenomenologically driven one.


Javier Echeverria — The Four Contexts of Scientific Activity

Reichenbach's distinction between the context of discovery and the context of justification of science was criticized in the 1960s by Toulmin, Hanson, Kuhn, Feyerabend, Koch, and others, because of the logical empiricists' proposal to restrict the philosophy of science to the context of justification. However, the distinction itself was considered by Salmon to be "a major focal point for any fundamental discussion between history of science and philosophy of science." Goldman claimed that discovery and justification are interactive. Laudan and Hoyningen-Huene reviewed the distinction.

This paper claims that the discovery/justification distinction has to be radically reformulated because it stems from a fundamental misunderstanding: the reduction of science to scientific knowledge. Contrary to this idea, the author defines science primarily as an activity. Philosophy of science should not be restricted to the epistemic or cognitive aspects of scientific activity. Consequently, an alternative distinction is proposed and argued for: scientific activities can be studied and analyzed within the framework of the four contexts of education, innovation, evaluation, and application of science. Scientific practice is much more complex than Reichenbach and his followers supposed.


Katalin Havas — Continuity and Change: Kinds of Negation in Scientific Progress

In the paper a distinction is made between (A) the negation of a theory T1 by a group of scientists working on a theory T2 and (B) the negation of a theory T1 by a theory T2.

(A) Some of the basic affirmations of a theory T1, which was dominant before the scientific revolution, are negated in the sense of classical negation: they are simply proclaimed false, instead of the differences between the conceptual apparatuses of T1 and T2 being exhibited.

(B) In many cases T2, a new theory, helps to discover the specific characteristics of the objects and the limits of T1. In this new light the results of T1 are preserved. T2 is not the rejection of T1 in the sense of "throwing away." It is a kind of dialectical negation.

A negation of a theory which does not result in inconsistency between theories is illustrated by the example of the negation used by J. Bolyai in the elaboration of his geometry.

After a scientific revolution the paradigms of the vanquished theory T1 live on as tendencies outside the mainstream of science. If the members of this "set-aside community" are able to learn from T2, then they might have an important role in elaborating the negation of the negation of T1, that is, a theory T3.

A theory might be interpreted not only as a dialectical negation of a former theory, but rather as a return to the theory which was the subject matter of the first negation. That is, a new theory T3 is not only different from the negated former theory T2 and from the theory T1 which was negated by the former theory; T3 also contains elements both from T1 and from T2. An example is given from the history of logical theories.


Matthias Kaiser — The Independence of Scientific Phenomena

The paper argues in favor of a basic trichotomy to be used in the philosophy of science. The trichotomy in question is: theory — phenomena — data. The notion of a phenomenon in science, partially introduced in the author's earlier works on this topic, is clarified. It is claimed that scientific phenomena enjoy a relative independence from both theory development and data acquisition. The argument for the proposed view is derived from an in-depth analysis of the discovery of the reversals of the geomagnetic field. Several more general conclusions are drawn. One consequence of the proposed view is an essential difference between theoretical knowledge and applied science.


Wladyslaw Krajewski — Scientific Meta-Philosophy

The author introduces the concept of scientific meta-philosophy containing two postulates:

1. The postulate of scientific philosophy: philosophy must be based on the sciences, drawing general conclusions from them, especially from the comparison of various sciences.

2. The postulate of analytical philosophy: philosophy must be clear and consistent, use formal logic, perform conceptual analysis, define concepts, etc.

Scientific meta-philosophy was originated by B. Russell, the Vienna Circle, the Lvov-Warsaw School, and K. R. Popper. The logical empiricism of the Vienna Circle presented a rather bad (narrow) philosophy but a good meta-philosophy.


Ilkka Niiniluoto — The Emergence of Scientific Specialities: Six Models

Six different models of specialty formation are suggested in this paper. A new scientific specialty may emerge (1) by separation from philosophy, (2) by branching or migration from other sciences, (3) by the emergence of a new subject matter, (4) by the collecting together of related disciplines, (5) by the theoretical integration of earlier separate disciplines, and (6) by the scientification of human arts and technologies. It is argued that the sixth model, so far ignored in this context by both philosophers and sociologists of science, helps us to understand many old and new academic professional disciplines (e.g., the engineering sciences, nursing science) as "design sciences".


Leszek Nowak — Antirealism, (Supra-) Realism and Idealization

The paper attempts to find a way out of the famous alternative of realism vs. antirealism in the philosophy of science. A metaphysical assumption shared by the two competing parties is identified and replaced with another view, which is a strengthening of ontological possibilism. This view allows one to propose the suprarealist thesis about the status of scientific theory. A natural application of it is an understanding of the place of idealization in science.


Rinat M. Nugayev — Classic, Modern and Postmodern Scientific Unification

The paper considers and compares three types of unification patterns in physics. The paradigmatic example of synthesis is Maxwell's fusion of electricity, magnetism, and light. It is the analysis of Maxwell's works that enables one to extract three properties of any successful scientific unification. These properties are crucial, since any violation of them leads to a radical decrease in the unifying theory's predictive power. It is argued that the classical and the modern (Einsteinian) stages of the development of physics are in good agreement with the rules described. However, all three basic stages of post-modern physics — the electroweak theory of Weinberg and Salam, GUTs, and the SUSY approach culminating in superstrings — can be described as successive and increasingly large violations of the synthesis rules.


Veikko Rantala — Translation and Scientific Change

Kuhn's criteria of adequacy concerning translation seem to be so strong that they exclude almost any nontrivial, perfect translation, and thus commensurability in his sense. In this paper I shall consider a (nonstandard) notion of translation with less firm criteria of adequacy. It is of better use in one's attempts to understand the positive role of translation in connection with scientific change, and it is closer to Kuhn's earlier notion.

If translation is studied in terms of speech act theory (in a generalized sense), it can be shown that an interpretation on which a translation (in the nonstandard sense) is based is guided by the following principles. The first of them has been discussed, for instance, in the context of Davidson's notion of radical interpretation and Ricoeur's approach to narratives. The second, called the minimization principle, applies to cases where the direction of minimizing the gap between the speaker and the hearer is, so to say, opposite to the one discussed by Davidson and Ricoeur. The third derives from Gadamer, and in many respects it can be seen as a mixture of the first and the second principles. The fourth, called the refinement principle, is known in the philosophy of science from the context of reduction. The principles have cognitive import, since they are principles of how the hearer attempts to understand and explain the speaker's positions. They are important when one tries to understand the nature of radical scientific change and to overcome the difficulties contained in Kuhn's strict notion of translation.


Gerhard Schurz — Theories and Their Applications: A Case of Nonmonotonic Reasoning

By an application I mean a kind of application situation x described by a complex pre-theoretical predicate A(x). The theory T entails that for situations of kind A(x) a certain theoretical and empirically contentful claim t(x) is true. For instance, if T is classical particle mechanics, then A(x) may be "x is a planet of a solar system"; t(x) is then the differential equation describing two-particle mechanics. The central question of this paper is: what is the logical nature of the loose if-then relation between an application A(x) and the theoretical claim t(x)? The paper presents three approaches to this problem: 1. the deductivist approach (Popper, the logical empiricists), 2. the structuralist approach (Sneed-Stegmüller), and 3. the ceteris-paribus approach (Holzkamp). The approaches are discussed with the help of four examples of scientific theories: (i) Newtonian mechanics, (ii) theories of the chemical bond, (iii) psychological theories of aggression, and (iv) the psychological theory of cognitive dissonance.

After a thorough criticism of these three approaches, a fourth approach is suggested as an alternative: the nonmonotonic approach. It reconstructs the loose if-then relation as an uncertain but non-probabilistic implication of the form "Normally, if A(x), then t(x)," formalized as A(x) ⇒ t(x). This statement claims that the theoretical claim holds in all normal application situations, but the theory admits exceptions caused by possibly unknown perturbing factors. After a brief explanation of the inference rules of nonmonotonic logics (developed since the 1980s), the paper demonstrates how the problems of the three previous approaches can successfully be handled by the nonmonotonic approach.
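
To make this nonmonotonic reading concrete, the following is a minimal sketch of defeasible inference in general (an illustration, not Schurz's own formalism; all situation names are invented): t(x) is concluded by default and withdrawn once a perturbing factor becomes known, a retraction that classical, monotonic consequence never licenses.

    # A minimal sketch of defeasible inference for "Normally, if A(x), then t(x)":
    # t(x) is concluded by default and retracted once a perturbing factor is known.
    # All situation names are invented for illustration.

    def defeasibly_infer_t(x, satisfies_A, known_perturbations):
        """Return True if t(x) is (defeasibly) concluded for situation x."""
        return satisfies_A(x) and x not in known_perturbations

    is_planet = lambda x: x.startswith("planet")
    perturbations = set()

    print(defeasibly_infer_t("planet-1", is_planet, perturbations))  # True: normal case

    # Nonmonotonicity: adding information removes a previous conclusion.
    perturbations.add("planet-1")  # a perturbing factor is discovered
    print(defeasibly_infer_t("planet-1", is_planet, perturbations))  # False: default defeated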


Witold Strawinski — The Unity of Science Today

Logical empiricists introduced and elaborated four ideas related to the unity of science: unity of language, unity of laws, unity of method, and, less typical of the mainstream of the movement, Neurath's sociologically oriented idea of the unity of scientific practice. This paper presents the development of these ideas within the logical empiricist movement, and then outlines an answer to the question of how we should approach today this legacy of logical empiricism with respect to the unity of science.


Vardan Torosian — Are the Ethics and Logic of Science Compatible?

The most widespread approach to the ethics of science, as a merely constraining and even prohibiting factor, is vulnerable on at least two points: 1) being based on specific ethical considerations, it contradicts the general idea of ethics; 2) as a constraint it is illusory and utopian. The idea that scientific knowledge is organically "loaded" with ethics looks more realistic and promising.


Ernest W. Adams — Problems and Prospects in a Theory of Inexact First-Order Theories

Inexact first-order generalizations are formulas with free variables, like (xPy & yPz) → xPz, symbolizing "if x is preferred to y and y is preferred to z, then x is preferred to z," which may have "exceptions" and not be true in all particular instances, but which have degrees of truth measured by the proportions of the particular instances that satisfy them in models. Given a class of "inexact axioms" of these kinds, the associated theory is the class of inexact formulas that are entailed by them, in a sense appropriate to these kinds of statements. This paper summarizes properties of such theories that have been derived by the author and by Professor Ian Carlstrom, showing, among other things, that the rules of valid "inexact" inference are precisely those that preserve "truth almost everywhere," i.e., truth except on a set of measure 0, and that being classically entailed is necessary but not in general sufficient for this. However, there are special circumstances in which practical idealization is permissible, wherein errors or exceptions are ignored, much as errors of measurement are ignored for practical purposes in certain calculations. These are investigated, and the relation between this and Plato's theory of the relation between the world of appearances and the "real" world is commented on. There are brief comments on extensions of the theory, including confirmation and expected degrees of truth (especially concerning Hempel's Paradox), the application to information and measurement systems, and comparisons with theories of fuzzy logic and verisimilitude.
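
As a worked toy illustration of such a degree of truth (the data and the counting convention, which considers only instances satisfying the antecedent, are my assumptions, not Adams's), one can count the exceptions to the transitivity formula in a small finite model:

    # Degree of truth of (xPy & yPz) -> xPz in a finite model: the proportion
    # of antecedent-satisfying instances that also satisfy the consequent.
    # The preference relation P below is invented for illustration.
    from itertools import product

    P = {("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")}
    domain = {"a", "b", "c", "d"}

    instances = exceptions = 0
    for x, y, z in product(domain, repeat=3):
        if (x, y) in P and (y, z) in P:   # antecedent holds
            instances += 1
            if (x, z) not in P:           # consequent fails: an "exception"
                exceptions += 1

    print(f"{exceptions} exception(s) in {instances} instance(s); "
          f"degree of truth = {1 - exceptions / instances:.2f}")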


Wolfgang Balzer and Gerhard Zoubek — On the Comparison of Approximative Empirical Claims

We investigate sufficient conditions under which the approximative empirical claim of one theory T' implies that of another theory T. Quantitative bounds for the degree of approximation to be used in T are defined in terms of the degree of approximation of T'. We use a formal apparatus of structuralist origin, enriched by systems of quasi-metrics, in order to obtain precise proofs of three theorems expressing the corresponding implications between empirical claims.
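
For orientation, a quasi-metric is standardly defined as a metric-like distance function from which the symmetry axiom is dropped; whether Balzer and Zoubek impose exactly these axioms is not determined by the abstract:

    % Standard quasi-metric axioms on a set X (symmetry is not required):
    \begin{align*}
      d &\colon X \times X \to \mathbb{R}_{\ge 0}, \\
      d(x, x) &= 0, \\
      d(x, z) &\le d(x, y) + d(y, z), \\
      d(x, y) &\ne d(y, x) \quad \text{in general.}
    \end{align*}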


Gianpiero Cattaneo, Maria Luisa Dalla Chiara and Roberto Giuntini — Unsharp Approaches to Quantum Theory

A basic aim of the unsharp approaches to quantum mechanics is to provide a mathematization of some ambiguous aspects that seem to be characteristic of "concrete reality." This has suggested different forms of paraconsistent quantum logic. We work in the framework of a general semantics for physical theories, which covers both the classical and the quantum case.


Theo A.F. Kuipers — Falsification versus Efficient Truth Approximation

It is argued that Kuhnian and Lakatosian justifications and explanations of non-falsificationist behavior are essentially redundant from the viewpoint of truth approximation, for the truth or falsity of a theory is rather irrelevant to its distance from the truth. More precisely, the irony of the cunning of reason is that the instrumentalist methodology, which may keep a falsified theory in the game or replace it with a better but also already falsified theory, and which is essential, e.g., for idealization and concretization (Krajewski, Nowak), is (much) more efficient for truth approximation than a falsificationist methodology. Some consequences for the empirical studies of science are drawn.


Bernhard Lauth — Limiting Decidability and Probability

Kevin Kelly has recently shown (drawing on earlier results obtained by Putnam and Gold) that the notions of verification and falsification in the limit of a hypothesis h ⊆ ℕ^ℕ can be characterized by the position of the hypothesis within the finite Borel hierarchy: h is verifiable in the limit iff it is Σ₂, h is falsifiable in the limit iff it is Π₂, and h is decidable in the limit iff it is Δ₂. In this paper, I propose a stochastic version of testing in the limit, which works for hypotheses of arbitrary complexity with respect to the Borel scale.
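
For readers who want the normal forms behind these classes, the following is a standard presentation of the finite Borel hierarchy over the space of data streams (textbook notation, not the paper's own):

    % Finite Borel normal forms for hypotheses h about data streams,
    % with the sets C_{n,m} clopen:
    \begin{align*}
      h \in \Sigma_2 &\iff h = \bigcup_{n} \bigcap_{m} C_{n,m}
        && \text{(verifiable in the limit)}, \\
      h \in \Pi_2 &\iff h = \bigcap_{n} \bigcup_{m} C_{n,m}
        && \text{(falsifiable in the limit)}, \\
      h \in \Delta_2 &\iff h \in \Sigma_2 \cap \Pi_2
        && \text{(decidable in the limit)}.
    \end{align*}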


Jaroslaw Pykacz — Many-Valued Logics in Foundations of Quantum Mechanics

A many-valued logic from an old, neglected paper by Łukasiewicz is adapted to describe the results of physical (especially quantum) experiments. Truth-values of propositions are interpreted as probabilities of their experimental confirmation. It is shown that the Lindenbaum algebra of the obtained logic is an orthomodular partially ordered set, i.e., a "quantum logic" in the Birkhoff-von Neumann sense. A short outline of the historical development of applications of many-valued logics in the foundations of quantum mechanics is given.
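
To illustrate what [0,1]-valued truth-values read as confirmation probabilities look like, here is a sketch using the standard infinite-valued Łukasiewicz connectives (the abstract does not specify which operations the paper adopts, so this choice is an assumption):

    # Standard infinite-valued Lukasiewicz connectives on truth-values in [0, 1],
    # read here as probabilities of experimental confirmation.

    def neg(a):
        return 1.0 - a                # Lukasiewicz negation

    def strong_or(a, b):
        return min(1.0, a + b)        # bounded sum

    def strong_and(a, b):
        return max(0.0, a + b - 1.0)  # bounded difference

    p, q = 0.7, 0.4
    print(neg(p))                     # ~0.3
    print(strong_or(p, q))            # 1.0, truncated at certainty
    print(strong_and(p, q))           # ~0.1
    print(strong_and(p, neg(p)))      # 0.0: a proposition and its negation exclude each other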