John F. Sowa | 1 Jan 04:30 2010

[] Language Games and Semantic Domains

In various publications, I've been talking about the importance
of Wittgenstein's notion of language games for NL semantics.
But Wittgenstein never gave a formal definition of the term
'language game'. In Section 65 of _Philosophical Investigations_,
he admitted that a formal definition that includes all possible
kinds of language games is not possible:

LW> Here we come up against the great question that lies behind
 > all these considerations. For someone might object against me:
 > "You take the easy way out! You talk about all sorts of language-
 > games, but have nowhere said what the essence of a language-game,
 > and hence of language, is: what is common to all these activities,
 > and what makes them into language or parts of language. So you
 > let yourself off the very part of the investigation that once
 > gave you yourself most headache, the part about the general form
 > of propositions and of language."
 >
 > And this is true. Instead of producing something common to all
 > that we call language, I am saying that these phenomena have
 > no one thing in common which makes us use the same word for all,
 > but that they are related to one another in many different ways.
 > And it is because of this relationship, or these relationships,
 > that we call them all "language". I will try to explain this...

Italian linguists and computational linguists have done a great
deal of research on this topic, and they use the term
'semantic domain'.  Following is a paper that begins by
quoting the above passage by Wittgenstein and relates his
views to their ongoing research on semantic domains:


   http://tcc.itc.it/people/gliozzo/Papers/Rescogitans.pdf
   Teorie geometriche del significato (Geometric Theories of Meaning),
   by Roberto Basili, Alfio Massimiliano Gliozzo, Paolo Marocco

This paper is in Italian, but the middle author has published
many related papers in English:

   http://tcc.itc.it/people/gliozzo/publications.html

Following is a recent paper that is available online:

   http://tcc.itc.it/people/gliozzo/Papers/GliozzoetalHLT2007.pdf
   The Domain Restriction Hypothesis: Relating Term Similarity
   and Semantic Consistency, by Alfio Massimiliano Gliozzo,
   Marco Pennacchiotti, Patrick Pantel

They use a variety of techniques for identifying a semantic
domain.  One method is to augment WordNet with domain
identifiers for each synset.  The following paper describes
that approach:

   http://tcc.itc.it/people/gliozzo/Papers/Gliozzo-CSL-2004.pdf
   Unsupervised and Supervised Exploitation of Semantic Domains
   in Lexical Disambiguation, by Alfio Gliozzo, Carlo Strapparava,
   Ido Dagan

The extended version of WordNet with domain information is
available with a free license for research purposes:

   http://wndomains.fbk.eu/
   WordNet Domains

Related techniques have been used at CLRU (Cambridge Language
Research Unit) since the 1960s.  I summarize those methods and
relate them to other issues in linguistics and computational
linguistics in the following article:

   http://www.jfsowa.com/pubs/lgsema.pdf
   Language games, a foundation for semantics and ontology

John Sowa

David Cox | 1 Jan 14:54 2010

Re: [] Language Games and Semantic Domains

John,

What citation shall we use for your "Language games, a foundation for semantics and ontology"?

Your paper opens doors and windows of the mind. A great read to start 2010 with.

Dave


John F. Sowa | 1 Jan 15:39 2010

Re: [] Language Games and Semantic Domains

Dave,

 > What citation shall we use for your Language games, a foundation
 > for semantics and ontology ?

I usually (but not always) include that information immediately
after the abstract:

    This is a preprint of an article that appeared as Chapter 2
    in Game Theory and Linguistic Meaning, edited by Ahti-Veikko
    Pietarinen, Elsevier, 2007, pp. 17-37.

John


David Cox | 1 Jan 15:47 2010

Re: [] Language Games and Semantic Domains

Thanks.
D.


John F. Sowa | 2 Jan 20:28 2010

[] Re: [CG] Language Games and Semantic Domains

I received some offline comments about the previous posting
on this subject.

> People involved in NLP are, for the most part, not concerned
> with philosophy.

I certainly agree.  For the most part, they just follow the fads,
which are generated by people who are concerned with philosophy.

> The members of the Vienna Circle really weren't philosophers.

Their PhDs were in math & science.  That puts them in the same league
as Descartes, Leibniz, and Kant. (Kant, by the way, taught Newtonian
mechanics and is credited with the hypothesis that the planets formed
from a disk-shaped cloud of dust around the sun.)  The VC followed
Ernst Mach, who was an experimental physicist (best known for the
Mach number).  But their writings are put in the same category with
Plato the mathematician, Aristotle the biologist, Peirce the chemist,
Whitehead the mathematician, and Wittgenstein the engineer.

> Montague and Chomsky at least provided some guidance
> in implementing basic semantic interpretation.

But the basic parsing technology was developed in the computer
field, independently of Chomsky and Montague.  John Backus was
the leader of the IBM project that produced the first FORTRAN
compiler in 1957 -- the same year of Chomsky's first book.
Backus and Naur developed BNF for the definition of Algol,
independently of Chomsky, and Peter Lucas implemented it with
the first version of a recursive descent parser.

Long before Montague, the computer scientists developed methods
of syntax-directed compilation (one-to-one association of grammar
rules and semantic rules).  Bill Woods applied that technique
to English for his PhD dissertation of 1967.  Montague didn't
publish his two famous papers that defined "Montague grammar"
until 1970.  By that time, Bill was working at BBN, where he
implemented his approach in an English query system about moon
rocks -- long before anyone implemented Montague grammar.
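
To make the idea concrete, here is a minimal, invented illustration
(not any of the historical systems mentioned above) of a recursive
descent parser written in the syntax-directed style: each parsing
function corresponds to one BNF rule and carries its own semantic
action, so the "meaning" (here just a numeric value) is built up as
the text is parsed.

    import re

    def tokenize(text):
        # NUMBER and '+' tokens only; whitespace is skipped.
        return re.findall(r"\d+|\+", text)

    class Parser:
        # BNF:  Expr ::= Term ('+' Term)*     Term ::= NUMBER
        def __init__(self, tokens):
            self.tokens, self.pos = tokens, 0

        def peek(self):
            return self.tokens[self.pos] if self.pos < len(self.tokens) else None

        def expr(self):
            value = self.term()        # semantic action: value of the first term
            while self.peek() == "+":
                self.pos += 1          # consume '+'
                value += self.term()   # semantic action: add the next term's value
            return value

        def term(self):
            token = self.tokens[self.pos]
            self.pos += 1
            return int(token)          # semantic action: numeric value of the token

    print(Parser(tokenize("2 + 3 + 4")).expr())   # prints 9

Replace the arithmetic actions with rules that build semantic
structures and you have the one-rule/one-action style of
syntax-directed interpretation described in the paragraph above.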

And by the way, the three-way distinction of syntax, semantics,
and pragmatics was based on Peirce's triad of grammar, logic,
and rhetoric (which he based on the medieval Trivium).  In the
1930s, Charles Morris replaced Peirce's terms with the currently
popular syntax, semantics, and pragmatics.

But there were earlier systems for NLP in the 1950s, long before
Chomsky.  Silvio Ceccato published a paper about his "correlation
nets" for semantics in 1956, which he implemented on an IBM 650 --
a vacuum-tube machine with a drum memory.  Following is the table
of contents for the 1961 International Conference on Machine
Translation of Languages and Applied Language Analysis:

    http://www.jfsowa.com/misc/icmt1961.pdf

There were several papers on network notations by Silvio Ceccato,
by David Hays, and by Margaret Masterman.   After the table of
contents is a copy of Masterman's paper, which contains the
first published use of the term "semantic net."  At the end of
this note is the URL of a review I wrote of Masterman's
collected papers and some excerpts from it.

Michael Halliday was one of the cofounders of CLRU with Masterman,
and his work is a more tightly integrated combination of syntax,
semantics, and pragmatics than anything by Chomsky or Montague.
Some of Halliday's early work formed the basis for Terry Winograd's
SHRDLU system, which he implemented for his PhD dissertation (1971),
which also antedated any implementation of Montague grammar.
Halliday's later work stimulated Rhetorical Structure Theory
(RST), which is a widely used approach to pragmatics.

As I said, the bits and pieces of technology are fine.  The
major problem is the question of how to put them together.
Both Masterman and Halliday, who were at Cambridge with
Wittgenstein, had found a better way.  But it was swamped by
the far noisier and more misguided groups who followed the
Frege-Russell-Carnap-Quine direction.  Montague and Chomsky
were infected with that virus, and they passed it along.

> We now have reasonable parsers, both structural and dependency
> oriented, and both statistical and symbolic, means of converting
> the results to partial semantic interpretations (partial in the
> sense of not accounting for everything), ways of dealing with
> lexical ambiguity, and highly active research in filling out
> the blanks (large scale discourse analysis, sentiment analysis, ..).

Yes, indeed.  Those are excellent pieces of technology, none of
which require anything by Chomsky or Montague.  But as I said above,
the major question is how to put them together.

The dependency parsers were based on Tesnière.  And Chomsky had
a strong negative influence on the use of statistics in NLP.
Peirce, by the way, is credited with having made some important
updates and extensions to Laplace's theory of probability.

In summary, ideas have consequences.  Unfortunately for AI and NLP,
the bad ideas had much more hype behind them.

John  Sowa
_____________________________________________________________________

Excerpts from http://www.jfsowa.com/pubs/mmb_rev.htm

Margaret Masterman was one of six students in Wittgenstein's course of 
1933-34 whose notes were compiled as The Blue Book (Wittgenstein 1958). 
In the late 1950s, she founded the Cambridge Language Research Unit 
(CLRU) as a discussion group, which evolved into one of the pioneering 
centers of research in computational linguistics.

As a student of Wittgenstein, Masterman was also deeply concerned about 
the foundations of theoretical linguistics. Around the same time that 
Chomsky was developing his syntactic theories and Montague was 
advocating a logic-based alternative, she was proposing a 
"Neo-Wittgensteinian" view, whose organizing principle was a thesaurus 
of words classified according to the "language games" in which they are 
used. Although no single paper in the book formulates a succinct summary 
that could be called a theory, the following principles are discussed 
throughout:

   * Focus on semantics, not syntax, as the foundation for language:
     "I want to pick up the relevant basic-situation-referring habits
     of a language in preference to its grammar" (p. 200).

   * Recognition that ambiguity is a consequence of the flexibility and
     extensibility of natural language and not a defect that can be
     eliminated by switching to a purified language of logic.

   * Context-dependent classification scheme with three kinds of
     structures:  a thesaurus with multiple groups of words organized
     by areas of use, a fan radiating from each word in the thesaurus
     to the area in which it occurs, and dynamically generated
     combinations of fans for the word tokens of a text (see the
     sketch after this list).

   * Emphasis on images as a language-independent foundation for meaning
     with a small number (about 50 to 100) of combining elements
     represented by ideographs or monosyllables, such as IN, UP, MUCH,
     THING, STUFF, MAN, BEAST, PLANT, DO.

   * Recognition that analogy and metaphor are fundamental to the
     creation of novel uses of language in every field, especially in
     the most advanced areas of science.
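
To make the thesaurus-and-fan idea concrete, here is a minimal sketch
in Python.  The mini-thesaurus, the area names, and the scoring rule
are invented for illustration and are not Masterman's actual
classification.

    from collections import defaultdict

    # Invented mini-thesaurus: each "area of use" groups the words used there.
    THESAURUS = {
        "finance":   {"bank", "deposit", "interest", "loan"},
        "geography": {"bank", "river", "shore", "stream"},
        "music":     {"interest", "note", "score", "pitch"},
    }

    def build_fans(thesaurus):
        # Fan: for each word, the set of areas in which it occurs.
        fans = defaultdict(set)
        for area, words in thesaurus.items():
            for word in words:
                fans[word].add(area)
        return fans

    def likely_areas(tokens, fans):
        # Combine the fans of the tokens of a text: areas shared by the
        # most tokens are the best guess at the text's subject.
        counts = defaultdict(int)
        for token in tokens:
            for area in fans.get(token, ()):
                counts[area] += 1
        return sorted(counts.items(), key=lambda item: -item[1])

    fans = build_fans(THESAURUS)
    print(likely_areas(["bank", "deposit", "loan"], fans))
    # [('finance', 3), ('geography', 1)]

The point of the combination step is that tokens which are
individually ambiguous (such as "bank") become much less so once
their fans are combined with the fans of their neighbours.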

Unlike the a priori formalisms of Chomsky or Montague, this approach is 
based on data about actual language use. In the commentary, Wilks noted 
that Masterman's work contained "the germ of what was later to be called 
EBMT or example-based translation (Nagao 1989), which is now perhaps the 
most productive current approach to MT world-wide, and I have heard 
Professor Nagao refer to [her] in this connection in a lecture" (p. 279).

As a whole, the book presents a cognitive view of language that has 
strong similarities to _Cognitive Linguistics_ by Croft and Cruse 
(2004). Croft's radical construction grammar, Cruse's dynamic construal 
of meaning, and Lakoff and Johnson's work on metaphor (1980) are 
compatible with and to some extent anticipated in Masterman's papers. 
The multiplicities of context-dependent word senses discussed in the 
first paper of the book could be aptly characterized by the term 
'microsense', which was coined by Cruse (2000). Although most of the 
papers are forty years old or older, the goal of implementing the ideas 
in a computable form has forced a greater attention to detail and 
precision than is found in some of the more recent work on cognitive 
linguistics.

The age and origin of most of the papers as unpublished memos is evident 
in their rather disorganized structure, but the book contains many 
intriguing insights that still seem fresh today. Among them are her 
penetrating criticisms of Chomsky's fixation on syntax:

MM> My quarrel with [the Chomsky school] is not that they are
 > abstracting from the facts. How could it be? For I myself in this
 > paper am proposing a far more drastic abstraction from the facts.
 > It is that they are abstracting from the wrong facts because they
 > are abstracting from the syntactic facts, that is, from that very
 > superficial and highly redundant part of language that children,
 > aphasics, people in a hurry, and colloquial speakers, quite
 > rightly, drop. (p. 266)

As an alternative, she discussed the writings of the phonetician Peter 
Guberina (1954), who had worked in a school for the deaf:

MM> A large part of Guberina's daily life is spent in developing
 > electronic techniques for helping the deaf to speak. This means
 > that, for him, what is being talked about — that is, the actual
 > subject of any piece of discourse, and the linguistic elements
 > that carry it — is vastly more important than what is said about
 > it. If the deaf man can once pick up the subject of conversation,
 > three-quarters of this problem is solved, even if he cannot hear
 > all that is said about it. If, on the other hand, he clearly hears
 > some one thing that is clearly said about some basic subject of
 > discourse, while the actual subject of discourse remains unknown
 > to him, very little of the deaf man's problem is solved; he has
 > only heard one thing. (p. 228)

In summary, she said that "human communication consists of patterns
of semantic interactions between ascertainably cognate subjects of 
discourse." By cognate subjects, she meant ones that originate from the 
same or similar language games and are grouped in the same area of a 
thesaurus. The semantic patterns led to the templates of Wilks' own 
theory of preference semantics, and they are closely related to the 
chunks, frames, scripts, and schemata of other systems.
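
As a rough illustration of how such templates can drive
disambiguation, here is a small sketch in the spirit of preference
semantics.  The sense inventory, the semantic classes, and the single
template are invented for this example; they are not Wilks' actual
formulas.

    from itertools import product

    # Invented sense inventory: word -> candidate senses with a semantic class.
    SENSES = {
        "crane": [("crane/bird", "ANIMAL"), ("crane/machine", "MACHINE")],
        "ate":   [("eat", "INGEST")],
        "fish":  [("fish/animal", "ANIMAL"), ("fish/food", "FOOD")],
    }

    # Bare-bones template: an INGEST action prefers an animate agent
    # and an edible object.
    TEMPLATES = {
        "INGEST": {"agent": {"ANIMAL", "HUMAN"}, "object": {"FOOD", "ANIMAL"}},
    }

    def best_reading(subject, verb, obj):
        # Score every sense combination by how many preferences it satisfies.
        best, best_score = None, -1
        for (s, s_cls), (v, v_cls), (o, o_cls) in product(
                SENSES[subject], SENSES[verb], SENSES[obj]):
            prefs = TEMPLATES.get(v_cls, {})
            score = (s_cls in prefs.get("agent", ())) + \
                    (o_cls in prefs.get("object", ()))
            if score > best_score:
                best, best_score = (s, v, o), score
        return best, best_score

    print(best_reading("crane", "ate", "fish"))
    # (('crane/bird', 'eat', 'fish/animal'), 2) -- the bird sense of "crane"
    # satisfies both preferences; the machine sense satisfies only one.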


Rich Elk | 2 Jan 22:42 2010
Picon

[] RE: [CG] Language Games and Semantic Domains

John,

In an earlier post you mentioned WordNet Domains and gave the URL
wndomains.fbk.eu, which I visited.  Is the WNDomains project based on the
Masterman approach in any recognized way?  It seems to reflect a lot of
the concepts you described in your review at
http://www.jfsowa.com/pubs/mmb_rev.htm and in your post below.

MM's approach, as you described it, seems a lot more appropriate to the
realities of automated language analysis than the usual syntactic theories.

Her theory of a thesaurus of word groups, with each word associated with a
fan of edges to context labels, even seems to have been structured, at least
partly, with mechanical computation in mind.  She may have inspired Quillian,
who wrote about "spreading activation" through trees of concept nodes.
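
For readers unfamiliar with the term, here is a minimal sketch of
spreading activation over a network of concept nodes.  The nodes,
edges, and weights are invented for illustration and are not
Quillian's actual memory model.

    from collections import defaultdict

    # Invented concept network: node -> list of (neighbour, edge weight).
    NETWORK = {
        "bank":  [("money", 0.9), ("river", 0.6)],
        "money": [("loan", 0.8), ("interest", 0.7)],
        "river": [("water", 0.9), ("shore", 0.8)],
        "loan": [], "interest": [], "water": [], "shore": [],
    }

    def spread(sources, network, decay=0.5, threshold=0.05):
        # Propagate activation outward from the source concepts, attenuating
        # by `decay` and the edge weight at each hop, until it falls below
        # the threshold or cannot improve a node's existing activation.
        activation = defaultdict(float)
        frontier = [(node, 1.0) for node in sources]
        while frontier:
            next_frontier = []
            for node, energy in frontier:
                if energy < threshold or activation[node] >= energy:
                    continue
                activation[node] = energy
                for neighbour, weight in network.get(node, ()):
                    next_frontier.append((neighbour, energy * decay * weight))
            frontier = next_frontier
        return dict(activation)

    print(spread({"bank", "money"}, NETWORK))

Quillian's intersection search roughly corresponds to spreading from
two words at once and looking for the nodes where the two waves of
activation meet.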

But even with WNDomains, progress in implementing MM's mechanistic approach
seems stifled and unfulfilled.  So much more could have been done in all
those years since 1980.  Have there been any modern outgrowths or
continuations of MM's empirical semantic focus you can point to or name?

TIA,
-Rich

Sincerely,
Rich Cooper
EnglishLogicKernel.com
Rich AT EnglishLogicKernel DOT com


John F. Sowa | 2 Jan 23:38 2010

Re: [] Language Games and Semantic Domains

Rich,

RC> MM's approach, as you described it, seems a lot more appropriate
 > to the realities of automated language analysis than the usual
 > syntactic theories.

I agree.  That's why I cited it as an important alternative to Chomsky 
and Montague.  It did have a considerable influence on other people
at Cambridge, including Yorick Wilks.  But Halliday should be given
credit for the general atmosphere at Cambridge in the late '50s and
early '60s.  You can see a lot of that general approach in Halliday's
complete list of publications.

RC> Her theory of a thesaurus of word groups, with each word associated
 > with a fan of edges to context labels, even seems to have been
 > structured, at least partly, with mechanical computation in mind.
 > She may have inspired Quillian, who wrote about "spreading
 > activation" through trees of concept nodes.

She most definitely inspired Quillian.  The full proceedings, from
which I copied those excerpts, listed all the attendees.  And
Quillian was one of them.  (He was still a graduate student in 1961,
and he finished his PhD dissertation in 1966.)  Karen Spärck Jones
said that Quillian visited CLRU after that 1961 conference, and his
views at the time were still very underdeveloped.

RC> But even with WNDomains, progress in implementing MM's mechanistic
 > approach seems stifled and unfulfilled.

I agree.  Wittgenstein admitted that the notion of language game
involved many complex interrelationships:

LW> Instead of producing something common to all that we call
 > language, I am saying that these phenomena have no one thing in
 > common which makes us use the same word for all, but that they
 > are related to one another in many different ways.  And it is
 > because of this relationship, or these relationships, that we
 > call them all "language".

In various writings, LW said that there are as many different kinds
of language games as there are different kinds of human behavior.
The Italian WNDomains is a start, but it's too static.  What's needed
is a more dynamic way of recognizing and generating new language
games (or whatever else anyone may prefer to call them).
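
For concreteness, here is a minimal sketch of how the static WNDomains
labelling can be consulted from Python.  It assumes NLTK's WordNet
interface and a tab-separated mapping file of the form
"offset-pos<TAB>domain [domain ...]"; the file name is a placeholder,
and both the format and the WordNet version the offsets refer to
should be checked against the actual WNDomains release.

    from collections import defaultdict
    from nltk.corpus import wordnet as wn   # requires the NLTK WordNet data

    def load_domain_map(path):
        # Each line of the (assumed) mapping file pairs a synset key such as
        # "02340521-n" with one or more domain labels such as "economy".
        domain_map = defaultdict(set)
        with open(path, encoding="utf-8") as f:
            for line in f:
                key, labels = line.rstrip("\n").split("\t")
                domain_map[key].update(labels.split())
        return domain_map

    def domains_of(word, domain_map):
        # Collect the domain labels attached to every WordNet synset of a word.
        # Note: adjective satellites report pos "s", which may need to be
        # mapped to "a" depending on how the mapping file is keyed.
        found = set()
        for synset in wn.synsets(word):
            key = "%08d-%s" % (synset.offset(), synset.pos())
            found |= domain_map.get(key, set())
        return found

    domain_map = load_domain_map("wn-domains-3.2.txt")   # placeholder file name
    print(domains_of("bank", domain_map))

This only looks up fixed labels, of course; it does nothing toward the
dynamic recognition and generation of new language games mentioned
above.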

RC> Have there been any modern outgrowths or continuations of MM's
 > empirical semantic focus you can point to or name?

As I said, it's hard to separate MM's influence from Halliday's
influence and from the general atmosphere of people at Cambridge
who had been influenced by Wittgenstein.  Some of the second
generation "grandstudents" (students of former students) of LW
have designed systems that incorporate various aspects of that
approach.

Following is Yorick Wilks' list of publications:

    http://www.dcs.shef.ac.uk/~yorick/papers.html

Alan Bundy has been working in logic and theorem proving, and he
is also a former student of a student of LW.  His work shows a
more flexible attitude than the mindset of typical logicians.
Following is his list of publications:

    http://homepages.inf.ed.ac.uk/bundy/

Note his work on metalevel reasoning, ontology repair, and
ontology evolution.  Bundy's approach to ontology evolution is
more compatible with Peirce's notion that "symbols grow" than
the goal of "One True Formal Ontology."

John


Jon Awbrey | 3 Jan 05:04 2010

Re: [] Re: [CG] Language Games and Semantic Domains

Re: "But it was swamped by the far noisier and more misguided
     groups who followed the Frege-Russell-Carnap-Quine direction.
     Montague and Chomsky were infected with that virus, and they
     passed it along."

John,

It really makes no sense at all to assign Chomsky
to that analytic-behaviorist-reductionist tradition.
Chomsky, for all his self-imposed limitation to syntax,
presented a most appreciative and comprehending account
of Peirce's theory of inquiry all throughout the many dry
decades of naive empiricism -- out of which desert we have
yet to fully escape.

Jon Awbrey

--

inquiry list: http://stderr.org/pipermail/inquiry/
mwb: http://www.mywikibiz.com/Directory:Jon_Awbrey
knol: http://knol.google.com/k/-/-/3fkwvf69kridz/1


Nagarjuna G. | 3 Jan 07:48 2010

[] iccs2010 not listed at the website

Hi,

I couldn't locate a link to the ICCS 2010 conference, to be held in
Malaysia, at http://conceptualstructures.org/confs.htm.
Isn't this event (http://www.mimos.my/iccs2010/) supposed to be linked there?

--
Nagarjuna G.
http://www.gnowledge.org/


John F. Sowa | 3 Jan 16:10 2010

Re: [] Re: [CG] Language Games and Semantic Domains

Jon,

I agree with your comment to an extent, but I would add
some qualifications.

JFS>> But it was swamped by the far noisier and more misguided
 >> groups who followed the Frege-Russell-Carnap-Quine direction.
 >> Montague and Chomsky were infected with that virus, and they
 >> passed it along.

JA> It really makes no sense at all to assign Chomsky to that
 > analytic-behaviorist-reductionist tradition.  Chomsky, for
 > all his self-imposed limitation to syntax, presented a most
 > appreciative and comprehending account of Peirce's theory
 > of inquiry all throughout the many dry decades of naive
 > empiricism -- out of which desert we have yet to fully escape.

I acknowledge that Chomsky had made some favorable remarks about
Peirce and that he was adamantly opposed to the behaviorists.
Following is a reprint of Chomsky's 1959 review of B. F. Skinner's book
_Verbal Behavior_, with a preface written by Chomsky in 1967:

    http://www.chomsky.info/articles/1967----.htm

But in my previous note, I contrasted Chomsky's position with another
great linguist, Roman Jakobson, who also said that he had been strongly
influenced by Peirce.  I especially like to quote Jakobson's one-line
slogan:  "Syntax without semantics is meaningless."

I agree with many people that Chomsky's writings up to 1965 were of
immense value in reviving the study of linguistics and in opening up
some very fruitful lines of research.  After 1965, Chomsky spent
several years in politics, especially in protests against the
Vietnam War.  That added one more voice against a misguided war.

But when he returned to linguistics full time in the early 1970s,
Chomsky violated Peirce's first rule of reason:

    Do not block the way of inquiry.

While Chomsky was distracted by the Vietnam war, some of his former
students and colleagues proposed a revised and extended version
called *generative semantics*.  The basic idea was that semantic
rules generated meaning representations, which the syntactic rules
converted to the spoken form.  That approach was one of several
inspirations for a term paper I wrote in 1968 for Marvin Minsky's
course on AI.  The title was "Conceptual Graphs."

In the early 1970s, Chomsky attacked his former students as heretics
and excommunicated them from any further collaboration or recognition.
Following is a reply by one of the targets, George Lakoff:

    http://www.nybooks.com/articles/9956

I have been generally sympathetic to Lakoff's positions, but I
criticized his habit of claiming credit for ideas that had been
anticipated by early linguists such as Aristotle.  Following is
my review of his book _Philosophy in the Flesh_:

    http://www.jfsowa.com/pubs/lakoff.htm

As I say in that review, "Chapter 22 of this book presents a strong
case against Chomsky's 'autonomous syntax' and for an approach that
bases syntax on semantics and semantics on the bodily mechanisms
of perception and action."  That is a position that Aristotle and
Peirce would support.

In short, I would agree that Chomsky doesn't fall into the same
errors as the positivists and behaviorists.  As Chomsky wrote in
his book _Cartesian Linguistics_, he was a rationalist.  That's a
different extreme, but he was just as rigid as Frege in promoting
an ideal (linguistic competence) as opposed to the actual use of
language (performance).

In fact, you could say that Chomsky and Montague had equally rigid
views of language, and the primary difference between them was
whether their ideal was syntactic or semantic.  Two people can
catch the same virus, but it might stay in the nasal passages
or migrate to the lungs.

John
