Arbitrary thoughts on nearly everything from a modernist poet, structural mathematician and functional programmer.

Thursday, February 5, 2009

On Syntax and Semantics

This is actually a post about combinatorics, but before we get there, I need to talk about languages. Every expression in any language has two important aspects: syntax-- the structure of the expression-- and semantics-- the meaning of the expression. Let's see an example. I'll take Chomsky's: "Colorless green ideas sleep furiously." Syntactically this sentence is "adjective, adjective, noun (subject, plural) being modified by the adjectives, intransitive verb (present tense, active, third person plural) being modified by adverb, adverb." This sentence is what logicians call a "well-formed formula." A well-formed formula is any formula which does not violate the grammar of the language. So we could replace every word in the sentence with another which has the same part of speech, tense, mood (and every other grammatical property it satisfies that I don't know) and still have a grammatically correct sentence. E.g. "soft deep swords read wildly."

On the other hand, I think we can all agree that, without reading too far into either sentence, they are both meaningless-- semantically, they are both nonsense, despite their syntactic correctness. Then again, going a step further, we can milk meaning out of them, and I'm sure there's a Zen koan hidden somewhere in one of those sentences if you know where to look. This search for an expression's meaning, oddly enough, captures the essence of combinatorics.

I'm going to take another detour. Let's look at algebra. Whatever level of algebra you have experience with, this should be true, although it may fall apart a little bit at the higher levels. When you have some statement such as "x+2=y", it means any time you see y, you can replace it with x+2, and any time you see x+2, you can replace it with y. Equality in an algebraic sense is a rule of transformation. So when you get some long expression, such as "(x+2)*(x+2) + x-2", you can transform it to "y*y+y-2-2". From these transformation rules, you can show the equality of new expressions. So we can say that x*x+5*x+2 = x*x+4*x+4+x-2 = (x+2)*(x+2) + x-2 = y*y+y-2-2 = y*y+y-4. Notice that these transformations are syntactic changes. You are replacing one expression (which may be a variable, a literal [e.g. 1], or literals and variables combined by operators) with another expression. The semantics of your expression do not change: x+2 has the same value (semantics) as y. This is the idea behind algebraic manipulation: you never change the values, and so you show that the value of some expression whose value you know (e.g. x*x+5*x+2) is the same as the value of some expression whose value you want to know (e.g. y*y+y-4).
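If you like, you can watch the syntax change while the semantics stays put. Here's a small numeric sketch of the chain above (plain Python, my own illustrative check, not anything from a textbook): every expression in the chain evaluates to the same number for any x, provided y = x+2.

```python
# Check that each syntactic form in the chain has the same value,
# for a range of sample values of x, with y defined as x + 2.
for x in range(-10, 11):
    y = x + 2
    chain = [
        x*x + 5*x + 2,              # starting expression
        x*x + 4*x + 4 + x - 2,      # regrouped
        (x + 2)*(x + 2) + x - 2,    # factored so x+2 is visible
        y*y + y - 2 - 2,            # x+2 replaced by y (and x by y-2)
        y*y + y - 4,                # simplified
    ]
    # All five forms are syntactically different but semantically equal.
    assert all(v == chain[0] for v in chain), (x, chain)
print("every form in the chain agrees, for all sampled x")
```

Of course a finite sample isn't a proof-- the proof is exactly the sequence of substitutions-- but it makes the "same value, different shape" point concrete.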

As I'm sure you've guessed, I'm going to assert that in combinatorics, we make semantic transformations. This may seem to be really dangerous at first: how does reinterpreting an expression give us something valuable? You can't just say that "x" means something different because you feel like it! So what's happening here?
To be precise, you don't actually make semantic transformations-- I lied. Instead, you're equating semantic interpretations of an expression. This may still seem problematic-- "colorless green ideas" can mean just about anything you want it to mean. The difference here is that math is significantly more precise. If you say x means y, you don't mean that x gives the emotional impression that y does; you mean that under some reasonable interpretation of the system, x is interpreted as y. What constitutes a reasonable interpretation is a foundational issue that I'm not going to get into. So, "fine," you say. "I can accept that meaning is stronger in math than English; but what the hell are you talking about?"

Combinatorics works under the assumption that mathematical expressions are representations of some sort of structural relationship-- some abstraction of a pattern or structure that is commonly found somewhere. And these expressions sometimes codify the same abstract structure. When we can find overlaps like these, we've found two things which are the same.

Ok. So, let's complete this thought with a classic theorem from the first week of any combinatorics course. Let C(n,k) be the number of ways of choosing k objects out of a set of n, without repetition, and where order doesn't matter-- so we want to know how many hands of k cards we can form out of a deck of n. Then C(n,0)+C(n,1)+...+C(n,n) = 2^n, whatever n happens to be.

The proof is as follows:

The right hand side: 2^n is obviously the number of bitstrings of length n: every bit is either 0 or 1 (2 choices), and we have n of them, so 2*2*...*2 = 2^n.

The left hand side: establish any ordering of the n objects. When we choose k elements, we mark the k we've chosen with a 1, and the rest with a 0. This gives us all bitstrings with exactly k 1's. Now we sum this over all k; this gives us all bitstrings of length n with any number of 1's-- in other words, all bitstrings. Since both the left hand side and the right hand side count the number of length-n bitstrings, they are the same.
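The bijection in the proof can be watched directly for a small n. Here's a sketch (my own check in plain Python, using the standard-library comb for C(n,k)): enumerate every length-n bitstring, bucket them by how many 1's each contains, and confirm the bucket sizes are the binomial coefficients and that they total 2^n.

```python
from itertools import product
from math import comb  # comb(n, k) is C(n, k)

n = 8
buckets = [0] * (n + 1)
for bits in product([0, 1], repeat=n):  # all 2^n bitstrings of length n
    buckets[sum(bits)] += 1             # k = number of positions marked 1

# Bitstrings with exactly k ones correspond to k-element choices: C(n, k).
assert buckets == [comb(n, k) for k in range(n + 1)]
# Summing over all k recovers every bitstring: C(n,0)+...+C(n,n) = 2^n.
assert sum(buckets) == 2 ** n
print(buckets, "-> total", sum(buckets))
```

The counting argument is the proof; the program just exhibits the correspondence for n = 8.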

So, what are we doing? We're asking "what does this expression mean?" and finding something... and then finding another way of saying the same thing. It's a very weird way of doing math. A friend of mine once said combinatorial proofs almost seem more "subjective." There's a bit of truth in this.

One thing that combinatorics does is elucidate connections between expressions. Since you're looking at what an expression means, linking the two ideas comes naturally. An algebraic proof says "look, you can use these interchangeably," but a combinatorial proof goes a step further: it says "these two concepts are actually the same." It provides a link in your mind between two things that aren't necessarily linked in an obvious way.

And this post happened because I was trying to find a combinatorial proof for C(n+1,2) = 1+2+...+n (which is trivial to prove algebraically).
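For what it's worth, the identity itself is easy to confirm numerically (again my own sketch in plain Python, no claim about a combinatorial reading):

```python
from math import comb  # comb(n, k) is C(n, k)

# Verify C(n+1, 2) = 1 + 2 + ... + n for a range of small n.
for n in range(1, 100):
    assert comb(n + 1, 2) == sum(range(1, n + 1))
print("C(n+1,2) = 1+2+...+n holds for n = 1..99")
```

A check like this says nothing about *why* the two sides count the same thing-- which is exactly the question the post is circling.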

Edit: The last sentence reminds me of trying to find meaning in "colorless green ideas sleep furiously." Perhaps there is no good reason they are equal. Maybe it's a sentence which works, but there is no "deeper meaning." Is this possible? Do mathematicians accept this possibility? I'm not sure if they do.


Creative Commons License Cory Knapp.