Archive for syntax

Putting my stuff up where someone can see it

Posted in Uncategorized on 2011 March 17 by Asad Sayeed

I just posted a draft article that I wrote with my advisor Amy Weinberg on LingBuzz, which has become a sort of joint working papers archive or respectable self-publishing system for generative grammar folk.  It’s a great idea—linguistics journals are hyper-competitive and who knows how many interesting ideas languish in the queue?

This article is a draft of something that I’ve been working on for several years.  It’s a kind of mini-linguistics-thesis inside my computer science degree.  I had to stop significant activity on it in March 2009 in order to devote the bulk of my time to experiments for my dissertation and other projects I was working on at the time, so it was left in what is (in my opinion) a nearly ready state; it just needs a couple more solid months of work, or a couple of years of hacking at it in my non-existent spare time.  But, you know, dissertation…

So I was faced with a choice between hiding my light under a bushel (or whatever the saying is) while I waited for that solid stretch of time, and letting the world see it before it became a little too out-of-date.  I chose the latter.

What it’s about: The abstract (and article) is written in syntax-ese, but it’s the culmination of some thoughts I’ve had about languages that allow for long-distance extraction of noun phrase constituents from finite clauses…there I go with the syntax-ese again.  But let me put it this way: in some languages you can turn:

I now see that the jelly donuts were tasty.

into something like:

(Not English) The jelly donuts I now see that _____ were tasty.

But English does something weird as well: preposition stranding in questions:

What is he going on about ____?

In our article, Amy and I suggest that these facts are related, but current versions of generative syntax have removed the mechanism to express relationships of this kind.  These were called “escape hatches” in the past, and we propose a way to express them that is consistent with the terms of current theory.

I suspect/hope that it will be controversial.

Update 11:20 – 6 downloads and it’s hardly been 20 minutes.  That was fast.

Linguists on the lam

Posted in Uncategorized on 2011 March 10 by Asad Sayeed

So, the reason I decided to start writing again, as per the previous post, is that I was inspired by this post by Melody Dye, which was intended, I guess, to stir up an old debate, and kind of succeeded too.  I didn’t participate in the thread due to time constraints, but I vehemently disagree with the argument she presents, and I eventually got into a “tweetflooding” argument with old friend Jeremy Kahn and new virtual friend Zoltan Varju on the matter. I will eventually get around to responding to it, I hope, even though I really shouldn’t, as I have something called a “dissertation” to write.  (Ugh.)  I’m going to write something related but just slightly tangential here.

In a nutshell: Melody’s thread is yet another rehash of the old methodological arguments against linguistic (particularly syntactic) theory that are destined to be visited on every generation. Multiple times. Forever and ever—it is simply a fact one must accept that people are going to believe that Google is a sort of linguistic counterexample engine.  I am in the peculiar position of someone who works with Big Corpora as his bread-and-butter and dissertation topic and so on—but remains quite skeptical of the ability of this work to provide us with particularly interesting insights as to the human capacity for language in itself.

But the main point I want to make, briefly, is about the linguistics blogosphere itself.  Is it just me, or is it wildly unrepresentative of the linguistics field as a whole?  Maybe it’s because I live very near to/participate in the Maryland hothouse of unreconstructed generative grammarians (Philip Resnik excepted, heh), but, um, it doesn’t seem to reflect the other “hothouses” (Carleton U and U of Ottawa) to which I’ve belonged either, nor does it reflect my brushes with other real-life linguists and other departments.  On the occasions that I have read Language Log, it and its commentariat have tended to take positions a lot closer to Melody’s than to those of mainstream syntactic theory.

Aside from the obvious accusations of “anecdote” and “sample bias”, let me throw out another possible explanation that might actually tie together a number of issues: a lot of syntactic theorists, both faculty and students, tend to come from humanities (lit. and philosophy) backgrounds, so it is not really surprising that the linguistics blogosphere is pretty thoroughly saturated with Big Corpus types and neo-empiricists and so on, or that a Google (heh) search for “minimalist linguistics blog” and various terms like that doesn’t tend to turn up much.

Again, perhaps I missed the Big Syntax Blog out there, but I’m pretty connected and well-read *cough* online, and I’d be surprised if I had truly missed it.

Now as to why syntacticians tend to have this background, why the technically-oriented ones might drift to the Big Corpus side of things, and what this all means for the field, well, those are interesting questions indeed.  It seems to be the case, for example, that a lot of syntactic theorists are getting jobs in English departments rather than, say, applied math or logic positions.  (More anecdotal experience.)

And as to what it all means, well, it means that syntactic theory is susceptible to criticism from the camp on the opposite side, the “European-style” logicians and formal grammarians, a criticism to which I am much more sympathetic than I am to the claims of the post-Chomskyans/neo-empiricists. (And about which I intend to write a post in the not-too-distant future!) But it also means that, regardless of who is right about these matters, syntax is not growing its base in the places where it needs to grow its base, insofar as academic blogs are potential incubators of future collaborators and grad students.  And I believe that they are, these days, to a goodly extent.

And unfortunately it kind of also means that a lot of syntacticians will only be dimly aware that these issues are being revisited, even if the arguments aren’t really all that different from the ones that have been made in the past.  I am definitely sympathetic to people who might think we’ve been here and done that, like, 50 years ago.  So it goes.

Derivational cycles: syntax seminar

Posted in Uncategorized on 2009 September 17 by Asad Sayeed

By the way, I am also attending a seminar in syntax (specifically, theories of derivational cycles) held by Norbert Hornstein and Juan Uriagereka.  Unfortunately, it overlaps with Philip Resnik’s sentiment analysis seminar, and technically being CS and all, I attend Philip’s class fully and then barge in an hour late to the syntax seminar.  That means I have a devil of a time picking up the thread of the conversation, but so far—as it has focused on a historical review of cyclicity in the syntax literature, much of which I am already familiar with—I don’t yet feel like I am suffering.

For those not as familiar with theoretical syntax and wondering what a “derivational cycle” might be…hoo, boy.  One of the criticisms of theoretical linguistics of the so-called “Chomskyan” variety (that UMD linguistics practices with gusto) is that it has its head in the formalistic clouds, far away from language, but never far enough away that it can be described with a great deal of mathematical precision.  Cyclicity in syntax is both a prime example of this, and one of the most important and IMO convincing and interesting aspects of the approach.

But one simple way of thinking about it is that there are definite limitations to the scope of question words in a sentence, and that these limitations happen in “cycles” roughly—but not strictly—defined by nested clauses.  Making the case requires a lot of examples and reams of PhD theses, but here’s an illustrative pair:

  • Why was the man sleeping in the boat?
  • What boat was the man sleeping in?

We can extend both questions by adding another clause-embedding, in a sense recursively (to appeal to CS sensibilities).

  • Why did you tell the reporter that the man was sleeping in the boat?
  • What boat did you tell the reporter that the man was sleeping in?

In the “why” case, the shorter question asks for the reason for sleeping in the boat, but the longer question no longer allows that interpretation.  We are instead forced to interpret it as asking why “you [told] the reporter” about it.  In other words, the introduction of the “that” clause seems to have the effect of “blocking” the question from applying to the embedded clause.

But not so for “What boat”!  There is therefore something special about the introduction of a clause boundary in English that blocks some interpretations of questions but not others.  We can extend these examples further, and into other languages.  Since clauses can be nested ever more deeply, we can suggest that these phenomena are in some sense cyclic or recursive, yet they apply to very abstract human faculties of interpretation.
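
For the CS-inclined, here is a minimal toy sketch in Python of the pattern above.  To be clear, this is purely my own illustrative encoding, not anything from the actual theory: it just models clauses as a nested structure and stipulates that an adjunct question like “why” cannot be linked into an embedded finite clause, while an argument question like “what boat” can.

    # Toy illustration only: these names and the adjunct/argument switch are
    # made up for this sketch; they are not part of the syntactic theory itself.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Clause:
        verb: str
        embedded: Optional["Clause"] = None  # a finite clause nested inside this one

    def question_can_reach(clause: Clause, depth: int, kind: str) -> bool:
        """Can a question word of the given kind ('adjunct' or 'argument')
        be interpreted in the clause `depth` levels of embedding down?"""
        if depth == 0:
            return True   # asking about the top clause is always fine
        if clause.embedded is None:
            return False  # nothing embedded to ask about
        if kind == "adjunct":
            return False  # "why" is blocked at the clause boundary
        return question_can_reach(clause.embedded, depth - 1, kind)  # "what boat" passes through

    # "Why / What boat did you tell the reporter that the man was sleeping in?"
    inner = Clause(verb="was sleeping in the boat")
    outer = Clause(verb="told the reporter", embedded=inner)

    print(question_can_reach(outer, 1, "adjunct"))   # False: "why" can't ask about the sleeping
    print(question_can_reach(outer, 1, "argument"))  # True: "what boat" can

The recursion in question_can_reach is the point of the analogy: whatever happens at one clause boundary happens again at each deeper level of embedding, which is the sense in which the phenomenon is cyclic.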

This particular class was a review of Chomsky’s classic Barriers monograph (in a nutshell, how clause boundaries act as barriers to certain interpretations) followed by a review of Lasnik and Saito’s work (that’s UMD’s Howard Lasnik) on “proper government”, which elaborates on some of the conditions that permit barriers to form through characteristics of abstract variables called “traces”.  Each of these, however, would take me hours to summarize, so I won’t, at this point.