Tuesday, August 21, 2012

Philosophy of Science (notes)

The Euclidean, the Lobachevskian, and the Riemannian geometries are different languages in the sense of theories of logical structure, which as such are concerned only with the logical implications of axioms.  In this work Carnap references Einstein's Sidelights on Relativity (1921; English, 1923), where Einstein says that the theorems of mathematics are certain in so far as they are not about reality, and that in so far as they are about reality they are uncertain.  Carnap states that the philosophical significance of Einstein's theory of relativity is that it made clear that if geometry is taken in an a priori or analytic sense, then like all logical truths it tells us nothing about reality, whereas physical geometry is a posteriori and empirical, and describes physical space and time.

Carnap notes that in relativity theory Einstein used Riemannian mathematical geometry as the axiomatic system for his physical geometry, but the reason for the choice of which mathematical geometry to use for a physical theory is not obvious.  Several years before Einstein developed his relativity theory the mathematician Poincaré postulated a non-Euclidean physical space, and said that physicists have two choices.  They can either accept non-Euclidean geometry as a description of physical space, or they can preserve Euclidean geometry for the description of physical space by adopting new physical laws stating that all solid bodies undergo certain contractions and expansions, and that light does not travel in straight lines.  Poincaré maintained that physicists would always choose to preserve the Euclidean description of physical space, and would claim that any observed non-Euclidean deviations are due to the expansion or contraction of measurement rods and to the deflection of light rays used for measurement.  Einstein's choice of Riemannian geometry and of physical laws for measurement was based on the resulting simplicity of the total system of physics.  Relativity theory using Riemannian geometry greatly simplifies physical laws by means of geodesics, such that gravitation as a force is replaced by gravitation as a geometrical structure.

In 1928 Carnap published his Der Logische Aufbau der Welt.  The book was translated in 1967 under the title The Logical Structure of the World, but in the literature it is always referred to as the Aufbau.  This work exhibits a detailed design for an ambitious investigation.  In the first three of the book's five parts Carnap sets forth the objective, plan, and essentials of this investigation.  His objective is the "rational reconstruction" of the concepts of all fields of knowledge on the basis of certain elementary concepts that describe the immediately given in experience.

Carnap illustrates the relation between the two aims of science with an analogy: the construction of an object is analogous to the indication of the geographical coordinates for a place on the surface of the earth.  The place is uniquely determined by the coordinates, so that any further questions about the nature of the place have a definite meaning.  The first aim of science locates experience, as the coordinate system locates a place; the second aim addresses all other questions through experience, and is a process that can never be completed.  Carnap says that there is no limit to science, because there is no question that is unanswerable in principle.  Every question consists in putting forth a statement whose truth or falsity is to be ascertained.  However, each statement can in principle be translated into a statement about the basic relation and the elementary experiences, and such a statement can in principle be verified by confrontation with the given.  Forty years later Quine also uses the coordinate-system analogy to express his thesis of ontological relativity.  But instead of developing an absolute ontology consisting ultimately of the immediately given in terms of elementary experiences and a basic relation, Quine relativizes ontology to one's "web of belief" including science, and ultimately by nonreductionist connection to one's own "home" or native language.

Quine viewed the thesis of analytic truth as the Achilles heel of Carnap's philosophy of science, its parallel postulate, to be replaced with the new Pragmatist philosophy of language.

Thus in spite of the subordination of hypotheses to empirical control by means of protocol sentences, hypotheses contain a conventional element, because the system of hypotheses is never "univocally" determined by empirical material, however rich it may be.  Carnap never developed this thesis of the empirical underdetermination of a system of hypotheses, nor the artifactual theory of language it implies; both were extensively developed by Quine in the 1950's and afterward.
Carnap describes semiotics as the general theory of signs, which is divided into three parts based on the three factors involved in language.  These factors are (1) the expression, (2) the designatum, and (3) the speaker.  The part of semiotics that deals with all three of these factors is called pragmatics.  The second part of semiotics, called semantics, abstracts from the speaker and contains a theory of the meaning of expressions, which leads to the construction of a dictionary for translating the object language into the metalanguage.  Finally, the third part of semiotics is called syntax, which abstracts from both the speaker and the designata of the signs in order to consider only the expressions.  Carnap further distinguishes between descriptive semantics and syntax on the one hand, and pure semantics and syntax on the other.  The former are included in pragmatics because they are empirical, while the latter are not because they are analytic.

The former is the development of a new method of semantical analysis, which Carnap calls the method of extensions and intensions, and which is based on the customary concepts of class and property respectively.  Carnap maintains that these concepts of extension and intension should be substituted for the idea of naming an abstract entity.  In his autobiography he notes that some philosophers [who happen to include Quine and Goodman] reject this way of speaking as the "hypostatization of entities."  In their view it is either meaningless, or at least in need of proof, to say that such entities as classes and properties actually exist.  But Carnap argues that such terms have long been used in the language of empirical science and mathematics, and that therefore very strong reasons must be offered if such terms as "class" and "property" are to be condemned as incompatible with empiricism or as unscientific.

Carnap borrows Carl G. Hempel's metaphorical language describing the axioms with their primitive terms as "floating in the air," meaning that the theoretical hypotheses are first developed by the imagination of the physicist, while the elementary terms occurring in the empirical laws are "anchored to the ground."

In the opening statements of "Semantic Information" the authors observe that the measures of information developed by Claude Shannon have nothing to do with the semantics of the symbols, but only with the frequency of their occurrence in a transmission.  This deliberate restriction of the scope of mathematical communication theory was of great heuristic value and enabled the theory to achieve important results in a short time.  But it often turned out that impatient scientists in various fields applied the terminology and the theorems of the theory to fields in which the term "information" was used presystematically in a semantic sense.

Claude E. Shannon published "A Mathematical Theory of Communication" in the Bell System Technical Journal (July and October, 1948).  The papers are reprinted, together with an introduction to the subject, in The Mathematical Theory of Communication (Shannon and Weaver, 1964).  Shannon states that his purpose is to address what he calls the fundamental problem of communication, namely that of reproducing at one point, either exactly or approximately, a message selected at another point.  He states that the semantical aspects of communication are irrelevant to this engineering problem; the relevant aspect is the selection of the correct message by the receiver from a set of possible messages in a system that is designed to operate for all possible selections.  If the number of messages in the set of all possible messages is finite, then this number or any monotonic function of this number can be regarded as a measure of the information produced when one message is selected from the set, with all selections being equally likely.

Shannon uses a logarithmic measure, with the base of the logarithm determining the unit of measure.  His paper considers the capacity of the channel through which the message is transmitted, but the discussion is focused on the properties of the source.  Of particular interest is a discrete source, which generates the message symbol by symbol, choosing successive symbols according to probabilities.  The generation of the message is therefore a stochastic process, and even if the originator of the message is not behaving as a stochastic process, the recipient must treat the transmitted signals in such a fashion.  A discrete Markov process can be used to simulate this effect, and linguists have used it to approximate an English-language message.  The approximation to English is more successful if the units of the transmission are words instead of letters of the alphabet.  During the years immediately following the publication of Shannon's theory linguists attempted to create constructional grammars using Markov processes.  These grammars are known as finite-state Markov process grammars.  However, after Noam Chomsky published his Syntactic Structures in 1957, linguists were persuaded that natural-language grammars are not finite-state grammars, but are potentially infinite-state grammars.
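
Below is a minimal sketch of the kind of word-level Markov source just described.  The vocabulary and transition probabilities are invented for illustration; they are not drawn from any corpus or from Shannon's paper.

```python
import random

# Toy word-level transition probabilities (invented for illustration).
transitions = {
    "the": {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"sat": 0.4, "ran": 0.6},
    "sat": {"on": 1.0},
    "ran": {"to": 1.0},
    "on":  {"the": 1.0},
    "to":  {"the": 1.0},
}

def generate(start, length):
    """Emit a word sequence, choosing each successive word according to
    the probabilities associated with the current word (a Markov source)."""
    words = [start]
    for _ in range(length - 1):
        successors = transitions[words[-1]]
        words.append(random.choices(list(successors),
                                    weights=list(successors.values()))[0])
    return " ".join(words)

print(generate("the", 12))  # e.g. "the cat sat on the dog ran to the ..."
```
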

In the Markov process there exists a finite number of possible states of the system together with a set of transition probabilities, such that for any one state there is an associated probability for every successive state to which a transition may be made.  To make a Markov process into an information source, it is necessary only to assume that a symbol is produced in the transition from one state to another.  There exists a special case called an ergodic process, in which every sequence produced by the process has the same statistical properties.  Shannon proposes a quantity that measures how much information is produced by an information source that operates as a Markov process: given n possible events with probabilities p(1), …, p(n), the quantity of information H is:

        H = - Σ p(i) log p(i),   summed over i = 1, …, n.
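
As a quick illustration, here is a small Python sketch (my own, with made-up symbol probabilities, not an example from Shannon's paper) that computes H for a discrete source; with the logarithm taken to base 2, the result is in bits per symbol.

```python
import math

def entropy(probabilities, base=2):
    """Shannon's H = -sum over i of p(i) * log p(i), skipping zero probabilities."""
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# Toy source with four symbols of unequal probability.
p = [0.5, 0.25, 0.125, 0.125]
print(entropy(p))   # 1.75 bits per symbol; a uniform four-symbol source gives 2.0
```
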
In their "Semantic Information" Carnap and Bar-Hillel introduce the concepts of the information content of a statement and of the content element.  Bar-Hillel notes that the content of a statement is what is also meant by the Scholastic adage, omnis determinatio est negatio.  [me: space-time-notional?]  It is the class of those possible states of the universe which are excluded by the statement.  When expressed in terms of state descriptions, the content of a statement is the class of all state descriptions excluded by the statement.  The concept of state description had been defined previously by Carnap as a conjunction containing as components, for every atomic statement in a language, either the statement or its negation but not both, and no other statements.  The content element is the opposite in the sense that it is a disjunction instead of a conjunction.  The truth condition for the content element is therefore much weaker than that for the state description: in the state description all the constituent atomic statements must be true for the conjunction to be true, while for the content element only one of the constituent elements must be true for the disjunction to be true.  Therefore the content elements are the weakest possible factual statements that can be made in the object language.  The only factual statement that is L-implied by a content element is the content element itself.  The authors then propose an explicatum for the ordinary concept of the "information conveyed by the statement i" taken in its semantical sense: the content of a statement i, denoted cont(i), is the class of all content elements that are L-implied by the statement i.
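
The following is my own toy Python rendering of these definitions, not code from the paper.  It uses a two-atom object language, enumerates the state descriptions, and treats cont(i) by way of the state descriptions a statement excludes, each excluded state description corresponding to one content element L-implied by the statement.

```python
from itertools import product

atoms = ["A", "B"]   # a toy object language with two atomic statements

# A state description assigns True or False to every atomic statement.
state_descriptions = [dict(zip(atoms, values))
                      for values in product([True, False], repeat=len(atoms))]

def content(statement):
    """The state descriptions excluded by the statement; each excluded state
    description corresponds to one content element L-implied by the statement."""
    return [sd for sd in state_descriptions if not statement(sd)]

# Example: the statement "A or B" excludes only the state description
# in which both A and B are false.
i = lambda sd: sd["A"] or sd["B"]
print(content(i))   # [{'A': False, 'B': False}]
```
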

Carnap's semantic theory of information may be contrasted with a more recent semantic information theory proposed by the Russian information scientist, Yu A. Shreider (also rendered from the Russian as Ju A. Srejder).  In his "Basic Trends in the Field of Semantics" in Statistical Methods in Linguistics (1971) Shreider distinguishes three classifications or trends in works on semantics, and he relates his views to Carnap's in this context.  The three classifications are ontological semantics, logical semantics, and linguistic semantics.  He says that all three of these try to solve the same problem: to ascertain what meaning is and how it can be described.  The first classification, ontological semantics, is the study of the various philosophical aspects of the relation between sign and signified.  He says that it inquires into the very nature of existence, into the degrees of reality possessed by signified objects, classes and situations, and that it is closely related to the logic and methodology of science and to the theoretical foundations of library classification.

The second classification, logical semantics, studies formal sign systems as opposed to natural languages.  This is the trend in which he locates Carnap, as well as Quine, Tarski, and Bar-Hillel.  The semantical systems considered in logical semantics are basic to the metatheory of the sciences.  The meaning postulates determine the class of permissible models for a given system of formal relations.  A formal theory fixes a class of syntactical relations, whence there arises a fixed system of semantic relations between a text and the possible world it describes.
 
The third classification, linguistic semantics, seeks to elucidate the inherent organization in a natural language, to formulate the inherent regularities in texts, and to construct a system of basic semantic relations.  The examination of properties of extralinguistic reality, which determines permissible semantic relations and the ways of combining them, is carried considerably farther in linguistic semantics than in logical semantics, where the question is touched upon only in the selection of meaning postulates.  However, linguistic semantics is still rather vague and inexact, being an auxiliary investigation in linguistics used only as necessity dictates.  Shreider locates his work midway between logical and linguistic semantics, because it involves the examination of natural-language texts with logical calculi.

Shannon's concept pertains only to the potential ability of the receiver to determine a quantity of information from a given message text; it does not account for the information that the receiver can effectively derive from the message, that is, the receiver's ability to "understand" the message.  In Shreider's theory the knowledge possessed by the receiver prior to receiving the message is taken into account, in order to determine the amount of information effectively communicated.

Thus someone who has learned a branch of a science will derive more information from a specialized text in that branch than he would have before he learned it.  This peculiar property of the semantic theory of information basically distinguishes it from Shannon's classical theory, in which an increase in a priori information always decreases the amount of information obtained from a message statement M.  In the classical theory there is no question of a receiver's degree of "understanding" of a statement; it is always assumed that he is "tuned."  But in the semantic theory the essential role is played by the very possibility of correctly "tuning" the receiver.
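
A deliberately crude Python sketch of this point (my own toy model, not Shreider's thesaurus formalism): the receiver's prior knowledge is represented as a set of implication rules, and the "information derived" from a message is everything the receiver can conclude from the message together with those rules, so a better "tuned" receiver derives more.

```python
def derive(facts, rules):
    """Forward-chain: keep applying the receiver's known implications
    (premise -> conclusion) until no new statements can be added."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise in facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

expert_rules = [("A", "B"), ("B", "C")]   # a receiver who has "learned the branch"
novice_rules = []                         # a receiver with no background knowledge

print(derive({"A"}, expert_rules))   # {'A', 'B', 'C'} - three statements derived
print(derive({"A"}, novice_rules))   # {'A'} - only the message itself
```
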

Carnap's explicit statement of the aim of science is set forth in his Aufbau.  The aim of science consists in finding and ordering true propositions, firstly through the formulation of the constructional system - the introduction of concepts - and secondly through the ascertainment of the empirical connections between the concepts.  This is completely programmatic, and says nothing about what scientists actually do in their research practices.

In his later years Quine concluded that his holistic view of observation statements implies a relativistic theory of truth, and he retreated from the implications of his "Two Dogmas of Empiricism" (1951).  After reading Quine's "Two Dogmas of Empiricism," in which Quine criticized Carnap's concept of analyticity, Hempel gave serious reconsideration to Carnap's analyticity thesis.  Hempel does not reject Carnap's concept of L-truth.  His disagreement is only with the concept of A-truth, truth based on what Carnap calls meaning postulates, statements known to be true by virtue of the meaning relations among the descriptive terms in the sentence.
