Tuesday, February 15, 2011

Applied Linguistics: Formal Logic

Formal Logic in Linguistics

Formal semantics is the study of the semantics, or interpretations, of both formal and natural languages, by describing them formally, that is, in mathematical terms. A formal language can be defined apart from any interpretation of it. This is done by designating a set of symbols (also called an alphabet) and a set of formation rules (also called a formal grammar) which determine which strings of symbols are well-formed formulas. When transformation rules (also called rules of inference) are added, and certain sentences are accepted as axioms (together called a deductive system or deductive apparatus), a logical system is formed. An interpretation of a formal language is (roughly) an assignment of meanings to its symbols and truth-conditions to its sentences.
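
As a concrete illustration of these definitions, here is a minimal sketch of my own in Python (not taken from the source): a tiny formal language whose alphabet consists of the atoms p and q and the connectives ‘not’ and ‘and’, with formation rules that decide which expressions are well-formed formulas, and an interpretation that assigns truth values to the atoms and thereby to whole sentences. Representing formulas as nested tuples is purely an assumption made for the example.

ATOMS = {"p", "q"}

def well_formed(f):
    """Formation rules: an atom is a formula; ("not", A) and ("and", A, B) are formulas."""
    if f in ATOMS:
        return True
    if isinstance(f, tuple) and len(f) == 2 and f[0] == "not":
        return well_formed(f[1])
    if isinstance(f, tuple) and len(f) == 3 and f[0] == "and":
        return well_formed(f[1]) and well_formed(f[2])
    return False

def interpret(f, valuation):
    """An interpretation: a map from atoms to truth values, extended to every sentence."""
    if f in ATOMS:
        return valuation[f]
    if f[0] == "not":
        return not interpret(f[1], valuation)
    if f[0] == "and":
        return interpret(f[1], valuation) and interpret(f[2], valuation)
    raise ValueError("not a well-formed formula")

formula = ("and", "p", ("not", "q"))                 # corresponds to (p ∧ ¬q)
print(well_formed(formula))                          # True
print(interpret(formula, {"p": True, "q": False}))   # True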

One of the purposes of studying formal logic is to distinguish formally between the different readings of an ambiguous sentence.

Example 

(1) Two women seem to be expected to dance with every senator. 

The interpretation could be either (2a) or (2b):

(2a) 2x ∀y   (two women take wide scope: the same two women will dance with every senator)
(2b) ∀y 2x   (every senator takes wide scope: every senator will dance with two women, possibly different ones)
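
One way to see that (2a) and (2b) are genuinely different readings is to evaluate them over a small model. The Python sketch below is my own illustration, not part of the source: the sets of women and senators and the dances_with relation are invented, and ‘two women’ is treated as ‘at least two women’.

women = {"Ann", "Beth", "Cara"}
senators = {"Smith", "Jones"}
dances_with = {("Ann", "Smith"), ("Ann", "Jones"),
               ("Beth", "Smith"), ("Cara", "Jones")}

def reading_2a():
    """(2a) 2x ∀y: there are two women each of whom dances with every senator."""
    such_women = {w for w in women
                  if all((w, s) in dances_with for s in senators)}
    return len(such_women) >= 2

def reading_2b():
    """(2b) ∀y 2x: for every senator there are two women who dance with him."""
    return all(len({w for w in women if (w, s) in dances_with}) >= 2
               for s in senators)

print(reading_2a())  # False: only Ann dances with every senator
print(reading_2b())  # True: Smith dances with Ann and Beth, Jones with Ann and Cara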


Additional Reading: Excerpt of a Dissertation

QUANTIFICATION IN FORMAL LOGIC AND NATURAL LANGUAGE
Formal Logic versus Linguistic Analysis – Formal Logic and Linguistics

My aim is an inquiry into the connections between logic and linguistics, that is to say, into the
human mind and language.

This work takes for granted some version of the thesis that “things mental – that is, minds – are
emergent properties of brains.” Such emergent properties are produced by principles that control the
interactions between lower-level events.

A key point is the relationship between the elementary property of human language as a
“species property”, or biological property, and the property of discrete infinity, which is exhibited in
its purest form by the natural numbers. Such properties might be considered part of our biological
endowment.

From the point of view of generative grammar, we know that the diversity and complexity of
human languages can be no more than appearance: they are variations on a single theme. Language
structure must be invariant, except at the margins (Chomsky 1981, 1993, 1995, etc.).

As a consequence, we can state that each particular language can be derived from a
uniform initial state under the boundary conditions set by experience. This is an explanation of the
properties of languages at a deeper level.

It seems to be a universal characteristic of language that entities are regarded as divisible or
indivisible, so that things may be represented as quantifiable or unquantifiable. Indeed, the categorization
of things along this dimension is not fixed at a higher level.

Quantification is a notion of logic that has come to be used in linguistics as well. To “quantify” – in
its ordinary sense, if there is one – means to point out a certain quantity of something.

The definition of the word in the Oxford English Dictionary is “determine quantity of, measure,
express as quantity”. The technical sense of the term is not far from this definition.

In order to understand why quantification is a necessary concept both in formal logic and in
natural languages, we should consider that any common noun naming an object we may think of is
associated with the totality of objects of the same kind.

Quantification is a means of indicating what proportion of that totality we have in mind. Let us
consider some examples:

When we say “dogs bark” we refer to the totality of dogs.

In both “a dog is barking” and “the dog is barking” we refer to just one member of the set of all
dogs.

In “the dogs are barking” we refer to all the members of the set of dogs within hearing distance.

In “some dogs do not bark” we point out that in the set of all dogs there are a number which do not
bark.

“The”, “a”, “some”, and the “zero article” are linguistic means of indicating, among other things,
what proportion of the set of all dogs the speaker has in mind.

In logic, quantification is carried out by means of special operators called quantifiers. The role of the
quantifiers can be explained in the following way: each proposition is seen as a relation between a
number of arguments – nominal entities, in linguistic terms. Thus, a distinction is made between the
individuals or objects which have properties or enter into certain relations, and the properties they
have or the relations they enter into.

The arguments can be constants (e.g. proper names: John, Fido) or variables (x, y). The predicates
are relations (e.g. verbs: love, walk).

Intransitive verbs translate into logic as one-place relations (e.g. leave, walk), and so do
adjectives and common nouns (clever, boy, dog). In order to show that two elements, x and y, stand in
relation R to each other, we write R(x, y); if x has property P (a one-place relation), we write P(x).

If the arguments are constants, then a predicate and the respective arguments make up a
proposition, e.g.:

Dog(Fido) – meaning “Fido is a dog”.
Clever(Fido) – meaning “Fido is clever”.
Love(Fido, John) – “Fido loves John”.
Leave(John) – “John is leaving”.
Walk(John, Fido) – “John is going to walk Fido”.
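
As a rough sketch of this notation (my own illustration, not from the source), predicates can be modelled as Python functions over constants, so that applying a predicate to the right number of constants yields a proposition with a truth value. The facts encoded below – which individuals are dogs, who loves whom – are invented for the example.

DOGS = {"Fido"}
CLEVER = {"Fido"}
LOVES = {("Fido", "John")}
LEAVING = {"John"}

def dog(x):       return x in DOGS
def clever(x):    return x in CLEVER
def love(x, y):   return (x, y) in LOVES
def leave(x):     return x in LEAVING

print(dog("Fido"))            # True  - 'Fido is a dog'
print(clever("Fido"))         # True  - 'Fido is clever'
print(love("Fido", "John"))   # True  - 'Fido loves John'
print(leave("John"))          # True  - 'John is leaving'
print(love("John", "Fido"))   # False - not a fact in this toy model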

The sequences above are propositions; they make sense from a logical point of view – they can
be interpreted, i.e. they can be assigned one of the two “meanings” with which logic operates: True
(T) or False (F).

If, on the other hand, one or more arguments are variables, the predicate and the variables alone
do not form a proposition, i.e. a sequence that can be assigned one of the values T or F; they form a
propositional function, which is a sequence that can only be interpreted if additional information is
given regarding the variables involved.

More precisely, variables range over a certain domain (a set of objects), and in order to be able to
say whether a sequence containing variables is true or false, we have to know how much of that
domain the variable(s) cover – what portion of the set of objects of the relevant type is involved.
This can be specified by means of quantifiers.

A UNIVERSAL SEQUENCE IS TRUE IF AND ONLY IF ALL CASES ARE TRUE
AN EXISTENTIAL SEQUENCE IS TRUE IF AND ONLY IF AT LEAST ONE CASE IS TRUE

There are two quantifiers in formal logic: the ‘universal quantifier’ ∀ – which shows that the
whole set is covered, and the ‘existential quantifier’ ∃ – which shows that at least one member of
the set is referred to.
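
These truth conditions can be sketched directly in Python (an illustration of mine, over an invented toy domain): the universal quantifier corresponds to checking that all cases are true, and the existential quantifier to checking that at least one case is true.

DOMAIN = {"Fido", "Rex", "Felix"}
DOGS = {"Fido", "Rex"}
BARKERS = {"Fido"}

def dog(x):   return x in DOGS
def bark(x):  return x in BARKERS

# Universal: ∀(x) (Dog(x) → Bark(x)) is true iff every case is true.
print(all(not dog(x) or bark(x) for x in DOMAIN))    # False: Rex is a dog that does not bark

# Existential: ∃(x) (Dog(x) ∧ ¬Bark(x)) is true iff at least one case is true.
print(any(dog(x) and not bark(x) for x in DOMAIN))   # True: Rex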

The quantifiers are said to ‘bind’ the variables. A sequence in which the variables are bound by
quantifiers can be interpreted as either T or F, and is therefore a proposition.
e.g.
∀(x) (Dog(x) → Clever(x))                     (‘Dogs are clever’; ‘All dogs are clever’)

∀(x) (Dog(x) → Bark(x))                       (‘Dogs bark’; ‘All dogs bark’)

∃(x) (Dog(x) ∧ ¬Bark(x))                      (‘Some dogs don’t bark’; ‘There are dogs which don’t bark’)

∀(x) ∀(y) ((Dog(x) ∧ Boy(y)) → Love(x, y))    (‘Dogs love boys’; ‘All dogs love boys’)

∃(x) ∃(y) (Dog(x) ∧ Boy(y) ∧ ¬Love(x, y))     (‘Some dogs do not love boys’; ‘There are dogs which do not love boys’)
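
The two-place formulas above can be checked in the same way over a toy model. The Python sketch below is my own: the domain, the sets of dogs and boys, and the ‘loves’ facts are invented purely for illustration.

DOMAIN = {"Fido", "Rex", "Tom", "Bill"}
DOGS = {"Fido", "Rex"}
BOYS = {"Tom", "Bill"}
LOVES = {("Fido", "Tom"), ("Fido", "Bill"), ("Rex", "Tom")}

def dog(x):      return x in DOGS
def boy(y):      return y in BOYS
def love(x, y):  return (x, y) in LOVES

# ∀(x) ∀(y) ((Dog(x) ∧ Boy(y)) → Love(x, y)): 'All dogs love boys'
print(all(love(x, y)
          for x in DOMAIN if dog(x)
          for y in DOMAIN if boy(y)))        # False: Rex does not love Bill

# ∃(x) ∃(y) (Dog(x) ∧ Boy(y) ∧ ¬Love(x, y)): 'Some dogs do not love boys'
print(any(dog(x) and boy(y) and not love(x, y)
          for x in DOMAIN
          for y in DOMAIN))                  # True: the pair (Rex, Bill)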



Source: 
http://www.cs.unipr.it
http://www.lingforum.com