Abstracts


Theo A.F. Kuipers and Anne Ruth Mackor — General Introduction: Cognitive Studies of Science and Common Sense

The general introduction situates the style of research exemplified in the book among the various ways of doing philosophy of science and “philosophy of common sense.” The book demonstrates that there is a middle course, dubbed cognitive studies, between abstract philosophy of science and social studies of science. To clarify this claim, abstract philosophy of science and social studies of science are first roughly characterized. Then, cognitive studies of science and common sense are described in some detail. Next, some examples as well as the practical value of such studies (in particular their heuristic role for new research and their piloting role for social studies of science) are discussed. Finally, the research topics of the four parts of the volume are briefly introduced.


Henk Zandvoort — Concepts of Interdisciplinarity and Environmental Science

A concept of interdisciplinary science is developed that is adequate both as a description of interdisciplinary environmental research as it actually occurs and as a guide for the field’s future development. It is intended to replace the prevailing “strong view” of interdisciplinary environmental science, which in this paper is judged inadequate in both respects. The concept presented here is inspired by interdisciplinary cooperation in modern natural science, based on “guide-supply relationships” between research programmes of different disciplines. The concept is developed in two steps, comprising two successive models of interdisciplinary environmental science. The first, “hierarchical” model assumes that one discipline, to be called the central guiding discipline, effectively guides the contributions of all other disciplines involved in environmental research. The untenability of this assumption leads to the second, “interactive” model, in which several programmatic accounts compete for the position of central guiding discipline.


Rein Vos — The Logic and Epistemology of the Concept of Drug and Disease Profile

Until recently, philosophers of science, taking physics as their model, have focused their efforts on elucidating knowledge representations related to the theoretical structure of science (Kuhn’s notion of a paradigm), to the semantic conception of scientific theories (Suppes, Giere), or to research programmes (Lakatos). Little is known, however, about the cognitive structures related to experimental and practical work in science. This paper deals with an important mode of knowledge representation and use in the medical and pharmaceutical sciences, namely drug and disease profiles. The characteristics of storing knowledge of disease and therapy in drug and disease profiles are analyzed. Subsequently, the “logic” of profiles is described in set-theoretical terms. Finally, the epistemological role of the concept of profile in science and medicine is indicated and argued to be complementary to Schaffner’s account of “exemplars” in scientific practice.


Rick C. Looijen — On the Distinction between Habitat and Niche, and Some Implications for Species’ Differentiation

One of the traditional, normative objectives of philosophy of science is conceptual clarification. After the (post-Kuhnian) descriptive turn in science studies, this is still an important task of philosophy. It is particularly important with respect to controversies in science, especially those involving conceptual rather than factual issues. The present paper is a case in point. It aims to clarify two of the most important, yet controversial, concepts (or conceptual theories) in ecology, viz. “habitat” (theory) and “niche” (theory). Conceptual analysis of the ecological literature reveals that the major source of controversy is the confusion of several different concepts of habitat and niche, such that the distinction between the two is blurred. To remove this confusion, the paper sorts out these various concepts and suggests a clear assignment of terms, in line with some basic, relatively uncontroversial ecological ideas.


Gerben J. Stavenga — Cognition, Irreversibility and the Direction of Time

The problem of the direction of time is tackled by an analysis of a fundamental aspect of cognition. First, it is shown that the development of physics reveals four different relations between instrument and object, each linked with one crucial aspect of the acquisition of knowledge. Next, it is argued that these four relations together form a complete set; in other words, that they are the only qualitatively different relations possible. The analysis concentrates on the fourth relation (instrument and object most closely connected) and the concomitant basic aspect of cognition: information recording, which appears to be crucial at a fundamental level of reality. It is argued that at this level it is precisely the inevitable irreversibility, due to this aspect of cognition, which must be the origin of time and its direction.


Rene Dalitz — Knowledge, Gender and Social Bias

Theories in feminist epistemology and the feminist criticism of science are based on the idea that knowledge ought to be seen as the product of a social, human process of acquisition and justification. Accordingly, feminist studies ought to be directed to the so-called “gendered” features of knowledge, that is, the way (scientific) knowledge is and is not structured by the individual gender of the scientist and by the social, hierarchical structures in which knowledge is developed. Feminists hence take the de facto neglect of sex and gender with regard to knowledge as the fundamental problem to be addressed. In this article, it is first argued that the definition of sexism and patriarchy as extra-scientific factors that influence science will not suffice to fend off the feminist critique of science. Second, it is argued that the feminist critique of science and epistemology is uncritically based on a background theory according to which “dichotomies imply hierarchies.”


Erik C.W. Krabbe — Can We Ever Pin One Down to a Formal Fallacy?

According to one definition of “fallacy,” a fallacy is a violation of a rule of critical dialogue. The question then arises what a critic is to do about the alleged violations. How can she convincingly build a case that a fallacy was committed and that her adversary should retract his argument? In this paper formal fallaciousness, and more generally non sequitur, will be studied from such a dialectical point of view. Basically, to substantiate a charge of formal fallaciousness, the critic must show two things: that the argument is invalid and that the invalidity is of a vicious type, meaning that it violates a dialogue rule. It appears that there are several techniques of which the critic may avail herself to show invalidity, the most vigorous one being that of giving a counterexample. Notwithstanding the Oliver-Massey asymmetry thesis, this is a strong method of showing invalidity.
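
To illustrate the counterexample technique with a standard textbook case (the concrete argument form chosen here is not taken from the paper): an argument of the form “if p then q; q; therefore p” (affirming the consequent) is shown invalid by exhibiting an instance with true premises and a false conclusion.

\[
\frac{p \rightarrow q \qquad q}{p}
\qquad\text{e.g. } p = \text{“it is snowing”},\ q = \text{“it is cold”}.
\]

On a cold but snowless day both premises are true and the conclusion is false, so no argument of this form is deductively valid as it stands. Whether such an invalidity also violates a dialogue rule, and is thus vicious in the sense required for a charge of fallacy, is the further question the paper addresses.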


Theo A.F. Kuipers — Explicating the Falsificationist and Instrumentalist Methodology by Decomposing the Hypothetico-Deductive Method

What is the relation of the HD-method to the falsificationist and the instrumentalist methodology? The falsificationist methodology is usually considered to be the straightforward application of the HD-method. The instrumentalist methodology, on the other hand, is rarely seen as such. Implicitly, it is well known from the work of Hempel, Popper, and Lakatos that the HD-method is essentially a stratified, two-step method, based on a macro- and a micro-argument, with much room for complications. An explicit analysis of this implicit knowledge turns out to suggest a detailed explication of both methodologies. The decomposition sheds new light on falsifying general facts and on the raven paradoxes. Moreover, it suggests a systematic presentation of the different factors that complicate the straightforward application of the HD-method. The illuminating and surprising consequences of the analysis for theory comparison and theory choice can be indicated only briefly.
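
The two-step structure can be given a schematic rendering (the notation below is a gloss on the abstract, not necessarily the paper’s own):

\[
\text{macro-argument: } T \cup A \models G
\qquad\qquad
\text{micro-argument: } G \cup C \models I
\]

where T is the theory under test, A auxiliary hypotheses, G a general test implication, C particular initial conditions, and I an individual, testable prediction. A false I undermines G directly at the micro level, but bears on T only indirectly via the macro level, which is where the room for complications mentioned in the abstract arises.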


Alfons Keupink — Causal Modelling and Misspecification: Theory and Econometric Historical Practice

This paper examines the question of when a causal model should be considered misspecified. Two main types of specification error are distinguished and clarified on the basis of the statistical relevance account of causation. The thesis that statistical relevance only implies causal relevance if assumptions are made a priori, i.e., on the basis of economic theory, is found to be characteristic of the methodology of so-called econometric history. A specific debate between two econometric historians concerning the labor-scarcity hypothesis, proposed by Habakkuk in 1962, is analyzed on the basis of two prima facie different variants of this thesis: Hans Reichenbach’s principle of the common cause and Herbert Simon’s procedure for distinguishing between genuine and spurious correlation.
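
For orientation, the first of these variants in its standard formulation (which may differ in detail from the paper’s): if two factors A and B are positively correlated, P(A ∧ B) > P(A)·P(B), and neither causes the other, then there is a common cause C that screens the correlation off:

\[
P(A \wedge B \mid C) = P(A \mid C)\,P(B \mid C), \qquad
P(A \wedge B \mid \neg C) = P(A \mid \neg C)\,P(B \mid \neg C),
\]
\[
P(A \mid C) > P(A \mid \neg C), \qquad P(B \mid C) > P(B \mid \neg C).
\]

These conditions jointly entail the original correlation, while conditioning on C makes it disappear; a correlation that vanishes under such conditioning is what is standardly called spurious.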


Maarten C.W. Janssen and Yao-Hua Tan — Default Reasoning and Some Applications in Economics

This article builds upon two earlier articles on the applicability of non-monotonic logic to economic reasoning. We first discuss how economic laws (with ceteris paribus clauses) can be formulated as default laws. Given our formulation, an important epistemological question arises: how can one distinguish intuitively appealing default laws (like “usually, birds can fly”) from intuitively unappealing ones (like “usually, birds cannot fly”)? In economics (and econometrics), this question is usually answered by referring to some statistical tests. We argue, however, that these statistical techniques impose a different interpretation on such default rules. In the paper, we also give a new application of default rules to the way economic agents are supposed to reason in game theory.
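
As an illustration of the kind of formulation at issue (in Reiter’s standard notation, which may differ from the formalization used in the paper), the default “usually, birds can fly” can be written as

\[
\frac{\mathit{bird}(x) \;:\; \mathit{flies}(x)}{\mathit{flies}(x)}
\]

to be read as: if x is a bird and it is consistent with everything else that is known that x flies, then conclude that x flies. Adding the information that Tweety is a penguin, together with the strict rule that penguins do not fly, blocks the conclusion for Tweety without generating a contradiction. It is this retractability of conclusions under new information that makes the logic non-monotonic, and that a ceteris paribus law in economics is meant to mimic.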


Bert Hamminga — Interesting Theorems in Economics

This paper deals with the philosophical analysis of the method by which “interesting theorems” are scrutinized by theoretical economists. It deals with two reactions to this analysis by Dutch economists. The first, by international trade theorist Jager, is defensive and results from a normative interpretation of the analysis. The second, by Cools, consists in a successful application of the results to an area of economics hitherto not studied by professional methodologists (the Modigliani-Miller research programme in the field of capital structure theory). Finally, the results are compared with standard doctrines in the philosophy of science and framed in terms of Kuipers’ general verisimilitude programme.


Sjoerd D. Zwart — A Hidden Variable in the Discussion about ‘Language Dependency’ of Truthlikeness

Since 1974, Miller has castigated almost all definitions of verisimilitude as being language dependent. The question of the language dependency of truthlikeness definitions has been a topic of much discussion ever since. The answers to Miller’s example given so far have not been very convincing. This paper presents Miller’s argument and considers it under the strongest interpretation possible. It proposes an alternative kind of reaction to the responses given so far. Miller’s observation of language dependency is correct, and although he presents it as a curse, it proves to be a blessing. If one distinguishes between the language of the cognitive problem and that of the formulation of the theories, Miller’s problem disappears. Truthlikeness proves to be a relative, rather than an absolute, concept. Instead of being language dependent, it is considered to be fundamentally related to a cognitive problem. This paper shows how a change of cognitive problem can be paraphrased semantically.
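
The flavour of Miller’s argument can be conveyed by the weather example commonly used to present it (this is the standard textbook reconstruction and may differ in detail from the paper’s own presentation). Let the truth be that it is hot and rainy, h ∧ r. Theory T1 says ¬h ∧ r and theory T2 says ¬h ∧ ¬r, so counted in correct atomic claims T1 (one correct) looks closer to the truth than T2 (none correct). Now redescribe the same situations with the predicates h and m, where m is defined as h ↔ r:

\[
m := h \leftrightarrow r: \qquad
\text{truth } h \wedge r \mapsto h \wedge m, \quad
T_1: \neg h \wedge r \mapsto \neg h \wedge \neg m, \quad
T_2: \neg h \wedge \neg r \mapsto \neg h \wedge m .
\]

In the new vocabulary the ordering is reversed, although nothing about the world or the theories has changed, only the language in which they are formulated.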


Hinne Hettema and Theo A. F. Kuipers — Sommerfeld’s Atombau: A Case Study in Potential Truth Approximation

The paper presents a case of potential truth approximation in physics: the sequence of relevant theories of the atom of Rutherford, Bohr, and Sommerfeld (the “old” quantum theory of the atom), in the form given in Sommerfeld’s Atombau und Spektrallinien. In particular, after a sketch of the historical and scientific background, it will first be shown in structuralist terms that Bohr’s theory is a specialization of Rutherford’s theory and that Sommerfeld’s theory in its turn is a concretization of Bohr’s theory. Then it will be shown in general that a specialization followed by a concretization may well be a case of truth approximation in the sense of the structuralist theory of truth approximation, which would explain the increase in explanatory success of the successive theories.
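
The sense in which Sommerfeld’s theory concretizes Bohr’s can be indicated by the energy levels; the formula below is the standard first-order fine-structure expression, quoted here only for illustration:

\[
E_{n,k} \;=\; -\,\frac{R h c\, Z^{2}}{n^{2}}
\left[\,1 + \frac{\alpha^{2} Z^{2}}{n^{2}}\!\left(\frac{n}{k} - \frac{3}{4}\right) + \ldots\right],
\]

where k is Sommerfeld’s azimuthal quantum number and α the fine-structure constant. In the idealizing limit α → 0, i.e. with relativistic corrections neglected, the levels reduce to Bohr’s E_n = −RhcZ²/n², which is the hallmark of a concretization: the idealized theory is recovered as a limiting case of its successor.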


Roberto Festa — Verisimilitude, Disorder, and Optimum Prior Probabilities

The problem of the choice of prior probabilities is considered with reference to the theory of inductive probabilities and the analysis of the multinomial inferences provided by Bayesian statistics. Among other things, it is argued that the choice of prior probabilities in a given empirical inquiry should be suitably restricted by specific “contextual constraints” such as the available background knowledge and the cognitive goal of the inquiry. In particular, the problem of the choice of a specific inductive method for a given inquiry is investigated with reference to the class of inductive methods proposed by Carnap and Stegmüller (1959). The intuitive idea underlying the proposed solution is that a good reason for selecting a particular inductive method, from a certain class of inductive methods, is that there are grounds to believe that it is “the optimum tool” for achieving a high degree of verisimilitude.
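
The class of inductive methods at issue is Carnap’s λ-continuum; in its standard form (quoted here for orientation, not from the paper), the probability that the next individual is of type i, given that n_i of the n individuals observed so far were of that type and that there are k types, is

\[
P(\text{next is of type } i \mid e_n) \;=\; \frac{n_i + \lambda / k}{n + \lambda}, \qquad 0 < \lambda < \infty .
\]

A small λ lets the observed frequencies dominate quickly, while a large λ keeps the estimate close to the prior value 1/k; choosing an “optimum” λ on the basis of background knowledge about the expected disorder of the domain is the kind of contextual constraint the paper has in mind.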


Anne Ruth Mackor — Intentional Psychology Is a Biological Discipline

Many philosophers claim that intentional psychology has several features that distinguish it from the natural sciences. Davidson and Fodor, among others, conclude that it is an autonomous science, whereas Churchland, for example, claims that it should be eliminated altogether. In this paper, I argue that intentional psychology is a biological science which derives its distinctive characteristics from biology. I conclude that intentional psychology is neither an autonomous discipline nor one that should be eliminated.


Jeanne Peijnenburg — Hempel’s Rationality. On the Empty Nature of Being a Rational Agent

Immediately after its publication in 1961, Hempel’s schema for the explanation of actions came under fire from two seemingly different sides. On the one hand, Melden cum suis suggested that the empirical law in the schema is in fact analytical in character, thus giving birth to the so-called logical connection argument. On the other hand, Davidson and Dennett argued that the rationality assumption in Hempel’s schema lacks empirical import and hence is analytical too. Both kinds of criticism have the character of intuitive suggestions rather than of convincing argumentation. The present article shows how such an argumentation may be set up by constructing semi-formal proofs on the basis of Hempel’s text. These proofs demonstrate that both criticisms are correct and, moreover, that they are two sides of the same coin.


Lex Guichard — The Causal Efficacy of Propositional Attitudes

Intentionality is a relational property and cannot be analyzed solely in terms of internal properties. So a problem arises as to how intentionality can be both externally classified and causally efficacious. I argue in this paper that this problem is solved by Millikan’s (1984) externalist theory of proper functions, because the relational aspects of proper functions can be analyzed as conjunctions of causal claims. I also claim that these functions can be captured by conditional causal laws, which are inductively weaker than similar laws capturing nonrelational proper functions; this explains what is often incorrectly called the “anomalous” character of the mental.


Michel ter Hark — Connectionism, Behaviourism and the Language of Thought

Fodor and Pylyshyn have claimed that the recent interest in connectionism is of no relevance to the cognitive level of explanation typical of orthodox cognitive science. Instead, they argue, connectionism is of no more relevance to cognitive science than associationism. It is argued that Fodor and Pylyshyn’s charge of associationism is based upon a misconception of learning. A comparison with scientific behaviourism shows the psychological relevance of learning networks. Next, it is shown that connectionism’s appeal to internal representations is compatible with behaviourism’s tendency to avoid the theoretical invocation of internal states. Finally, the intimate relation between learning and representation is used to counter some of Fodor and Pylyshyn’s arguments that are based upon the language of thought.