04 November 2011

Language Part V: History of Language

As I noted recently, people in my industry spend a lot of time doing the business of language, but amazingly little time talking about language itself. So today, I would like to talk a bit about the history of language.

Sadly, there were no time-travelers about with recording devices in hand to witness the birth of language. As a result, we can only hypothesize about language’s origins. Opinions vary wildly: some researchers think we have possessed speech in at least rudimentary form for as long as 2.3 million years, since our ancestor Homo habilis. Others believe it is far more recent and that only our current evolutionary iteration, Homo sapiens, has possessed this gift, which would put its origin somewhere between 200,000 and 50,000 years ago.

But we have to make a clear distinction here between speech and language. While we may have had speech (i.e., the ability to make intelligible sounds) for a very long time, full-blown language (a coherent system of speech with structure and rules used to communicate complex ideas) is definitely much more recent and probably doesn’t pre-date Homo sapiens.

Using methods similar to the linguistic analysis that led to the theoretical Proto-Indo-European language that is supposed to be the root of most European languages (among others), researchers have determined that the current diversity of world languages would have required about 100,000 years to develop its present level of complexity and diversity, which just so happens to dovetail nicely with the rise and spread of Homo sapiens. And a recent discovery that pushes back the first known use of paint to around 100,000 years ago would also seem to argue in favor of that being the latest possible date for language to be in use. (Producing and using paint are of course not direct links to language development, but I find it hard to grasp how a group of primates lacking language could even formulate the desire - never mind carry out the work - to fashion a paint production workshop and employ those pigments. To my mind, paint is tightly bound to abstract thought because unlike, say, a hammer or ax, it is not a strictly utilitarian technology.) An even more recent study suggests that more complex, structured language evolved from a precursor spoken in East Africa about 50,000 years ago, so that 100,000 to 50,000 year time frame seems pretty well established by those bookends.

Regardless of the when, once language did take hold, it was here to stay. No crude tool we had ever developed was nearly as powerful as language. Now knowledge could be spread effectively and quickly, sending technological achievement on a path that led to ever-faster innovation. And with language came the birth of ideas as opposed to just technology and know-how. While it may be a bit slower and trickier, you can fashion pretty effective tools and show someone else how to use them without language; but without language, you can’t have true ideas, those little packets of abstract thought that give rise to the Really Big Things. In other words, you can have a hammer without language, but you can’t conceive or share the significance of that hammer and its meaning to you without language.

So language was the one thing all human populations took with them as they colonized the planet, and as they did, it invariably changed. Fast forward to today and we find ourselves with between 6000 and 7000 languages belonging to many different families. Of those families, though, just five make up roughly 85% of all speakers: Indo-European, Sino-Tibetan, Altaic, Niger-Congo and Afro-Asiatic.

And what of the future of language? It’s impossible to say for sure, but I think that if we maintain and build upon our current trends in technology and its accompanying degree of interconnectedness, we will slowly move towards just a few world languages – and maybe just one. That vision of the future is also supported by the fact that languages are disappearing at an alarming rate these days, with the last native speakers of endangered languages dying off slowly but surely. But yikes, if we whittle it down to just one language, that means I will be out of a job! Fortunately for my job security, however, that process will take (at least) hundreds of years.

28 October 2011

Language Part IV: Just How Many Languages Are There?

I run the U.S. business unit of a multinational translation company. Given that I work in an industry that is entirely about language, it’s amazing how little time my peers and I spend actually talking and thinking about language itself. Not about workflows and automation and cost-reduction and efficiencies, but just language. This fact first occurred to me some years ago after a brief exchange with a senior operations manager at a major language services provider. Somehow we had gotten onto the topic of the maximum number of languages a certain client might potentially need and she commented, “Well, there are probably only around, what, maybe 200 total languages in the whole world?” I noted that in fact there are well over 6000, which earned me a look of incredulity and a raised eyebrow that suggested I had perhaps been spiking my morning coffee.

So let’s take this seemingly straightforward question of the total number of languages as an opportunity to talk about language as a topic on its own.

To be fair to that ops manager, it is very easy to believe that the number of languages could be as low as a couple of hundred. After all, according to one study, 80% of the buying power of the Web population can be reached with just 10 languages.* Upping that to 90% requires just six more languages, and 95% just seven beyond that. Just 14 more and you have reached 99%. So with just 37 of the world’s 6000-odd languages, you have already covered 99% of that buying power and reached very far down the long tail, leaving the thousands of other languages languishing in relative obscurity.

What about those other 6000+ languages? Well, first of all, it’s next to impossible to be certain about the numbers. Your count depends on a number of factors, not least of which is the very definition of what constitutes a language. Some criteria are easy enough, e.g. living v dead (French counts as a language in our tally, but an ancient language like Latin is omitted). But others are less straightforward. For example, one of the thorniest issues is determining whether a given language is in fact a distinct language on its own, or merely a dialect of another language. Do you say that Colombian Spanish is a language or a dialect of a generic language we call Spanish? I think most linguists agree it’s the latter.

But what about, say, Norwegian v Danish v Swedish? That’s a bit trickier: they are far more distinct from each other than Colombian Spanish is from Iberian Spanish, but to a very real extent, they are mutually intelligible. When I was studying in Norway, for example, my textbooks were almost as likely to be Danish or Swedish as they were to be Norwegian. And there are some dialects of spoken Norwegian that I found harder to understand than Danish. And for that matter, even on the written level, I found Danish far less challenging than the Nynorsk form of written Norwegian (since I had only studied the Bokmål form). So are all those forms of writing and all those spoken dialects counted as languages or just to be considered variants of a common ‘Scandinavian’? And that in turn opens a whole new can of worms: the definition of language can become a very personal and even political and cultural question. Many a Norwegian, for example, might find it downright offensive to hear you say that his language is nothing more than a dialect of a somewhat artificial construct called ‘Scandinavian’.**

Another tricky example is Chinese. Calling Chinese one language is not really practical, despite the fact that most people still do. The temptation to talk about ‘Chinese’ as a single language stems from the fact that everyone who speaks its variants writes in just two forms (Simplified or Traditional script). Because Chinese script isn’t a phonetic system employing an alphabet, this trap is hard for Westerners to appreciate: their reasoning is that if all those languages didn’t sound more or less the same, they couldn’t be limited to just two written variants. But when you stop thinking of written language as a representation of the sounds a language contains, you suddenly realize that many very different spoken languages can be represented by even a single written form. When a character represents an idea without imparting much phonetic information, the same character can stand for ‘bike’ in two spoken languages even though the sounds uttered for ‘bike’ in each are nothing alike. And indeed, that is the case: several spoken versions of ‘Chinese’ are in fact mutually unintelligible, even though they employ the same script and can thus be mutually intelligible on the written level. That is something to keep in mind the next time you’re tempted to say, ‘Let’s record some audio for that Simplified Chinese script’.

So we started out with what seemed like a straightforward question: how many languages are there? But we end without a real answer, because the counting is in the eye of the beholder. And with that we have yet another example of the beautiful complexity that is language!

--------------------------------------------------------------------------------------
*They are English, Simplified Chinese, Spanish, Japanese, German, French, Portuguese, Russian, Arabic and Korean

**And for the record, almost everyone considers these Scandinavian ones to be separate languages… and I do not recommend disputing that assertion when sharing a beer with a Norwegian, Dane or Swede!

13 October 2011

Simple Math

A lot of people are talking about Herman Cain's elegantly simple '9-9-9' plan. For the uninitiated, '9-9-9' refers to an easy, straightforward taxation plan to replace the current, extremely complex US tax code, with a flat 9% tax on corporate earnings, personal income and sales. Republicans love it. And well they should: it would constitute a huge shift in the tax burden from their constituency (the rich) to everyone else (the other 90% of America). When asked about this fact on NPR today, Cain dismissed such questions as 'playing the class warfare card'. How is math playing any kind of card? If you lower the tax rate on the wealthiest Americans and partially fund that with a sales tax, that is a huge tax cut for the rich and a huge tax hike for everyone else. Why? Because everyone else has to spend a far greater share of their income on everyday needs that are (wait for it)...subject to the new sales tax. If you are wealthy, only a small portion of your income goes to things like food, shelter, clothing, dining, etc., so a sales tax doesn't hurt you as much. Most of your money goes into investments, real estate, savings, etc. And if you have a taste for something really expensive and want to duck the 9% sales tax, you could always get it overseas.

What frustrates me so much is that Americans have such short memories. Steve Forbes and the late Jack Kemp were always droning on about a flat income tax in the 1990s. It was the whole basis for Forbes's failed bid for the Republican presidential nomination in 1996, and Kemp picked up the banner soon thereafter. A scant 15 years later, we all act like this is some radical new idea and have completely forgotten why we rejected it to begin with. And it makes even less sense today, with the middle class squeezed even harder and poverty on the rise. The last thing we need in a country that is already seething with mass protests over inequality is a mechanism to transfer even more wealth from the lower 90% to the upper 10%.

I concede our current personal income tax code is overly complex and in need of reform. And I have said before that we'd be better off completely getting rid of corporate taxes. But 9-9-9 is not the answer. It's a recipe for an even more unequal America.

22 September 2011

How Do You Get Government Solutions When Government Fails? (Hint: You Don't. So Plan B it is!)

I am a politics junkie. I have always been drawn to and fascinated by the way government works and exploring how it can be made to work better and accomplish more for the people whom it serves. I was a lifelong Democrat because I always believed that good government can be part of the solution and not the problem. So it is with a heavy heart that I have to admit that at this point in history and in this country (the US), government has simply and utterly failed. What's worse, it hasn't failed by trying - that can yield useful lessons and have value in itself - but by, well, sucking at its job and just failing to act at all.

I could go on and on for pages and pages about the whys and wherefores and the whole sad state of affairs, but there's little value in doing that. Smarter people have written (and continue to write) reams and endless reams about the failure of government in 21st-century America. But I prefer to be positive and to turn this not into an admission of defeat but a call to action. The challenges of this century are just too staggering and pressing to sit around and wait for Washington to get its act together. The answer is no longer to write your Congressman, sign meaningless online petitions that no one reads, 'Like' clever soundbites on Facebook (you know, the ones only people who already agree with you will read anyway) and sit around talking about whether it should be the People's Front of Judea or the Judean People's Front. It's time for people of all political stripes and all backgrounds to roll up our sleeves and get to finding non-government solutions to the big problems of our day, especially sustainability and climate change.

In the next couple of months I will be getting more specific. There's a new group getting ready to launch worldwide soon and its aim is specifically to solve the biggest problems through smart, practical and immediate approaches. The group is the brainchild of a friend of mine, who is also its prime mover, and I don't want to steal his thunder (since the group is still in the planning stages), so I won't say much else for now. But watch this space for updates and, when the time is right, information on how you can get involved, no matter where you are around the globe.

Let me close by emphasizing one very important thing. Since anyone who reads this blog knows I am a fairly liberal fellow politically, one may assume that my enthusiasm for this new group indicates it will be left-oriented and political in nature. That is absolutely not the case. It's precisely because I recognize that the political route is (at least for now) a dead end that I am so looking forward to participating in what promises to be an entirely solutions-driven, action-oriented, apolitical movement.

Here's to Optimism!

04 September 2011

The Case of the Disappearing Money

If Republicans, conservatives and many Americans of any political stripe are to be believed, the single greatest and most urgent mystery in modern times is this: where is all the money disappearing to?! If we believe them, every dime you pay in taxes, every penny that goes into a government job or contract, every nickel paid to fund anything done by the government, simply vanishes. That's why, by their reasoning, taxes should be as low as possible: every dollar not paid in taxes goes to the economy, while every dollar paid to the government simply vanishes into thin air, never to be seen again.

OK, they admit, the money doesn't disappear, but it isn't used as efficiently by the big, bad ol' government as it could be used by the Glorious and Patriotic, Wonderful PRIVATE SECTOR! (Cue marching band and fireworks! Serve the apple pie!) And everyone - and I do mean almost everyone, including most Democrats in this country - buys into this, to the point that it is simply a given in our national dialogue. President Obama, for example, takes it as the gospel truth that tax cuts=more prosperity.

And as we saw just this past week, this Mystery of the Disappearing Money also applies to the costs of conforming to regulations. President Obama did what he always does these days and caved to Republicans by withdrawing his administration's plan to tighten smog regulations because it would 'cost' tens of billions of dollars at a time when the economy needs the money to create jobs. So he has implicitly accepted that money spent on these measures just turns to dust and blows away.

Just one small problem with this reasoning: it makes no sense whatsoever and is not supported by facts or reality. (Other than that, though, it's perfectly reasonable.) Let's look at the cost of conforming to regulations that improve air quality, for example. Let's say it is the worst-case scenario and it's tens of billions of dollars. Do those tens of billions of dollars simply go into a giant paper-shredder? No, they go into contracts with other companies to implement particulate- and pollution-reduction measures, and those contracts create new jobs. They go into the purchase of new equipment to reduce pollution and waste, again creating jobs and making the polluters more efficient to boot (which has long-term economic benefits of its own). And on the savings side, the effects of lower levels of smog redirect billions away from healthcare costs and into sectors where the same money can create more new jobs.

And what about all that 'wasteful' government spending? It's certainly true that governments do have a talent for inefficiency and waste, as all large organizations do; but it's by no means true that the government is always worse than the private sector, and in many cases it is considerably better. For example, Medicare actually delivers healthcare at a more efficient rate than do private insurers. And as we have seen very recently, when you compare the performance of government-run foreign aid, reconstruction and military support services and infrastructure to such programs carried out by private contractors, it turns out big, bad ol' Uncle Sam is far more cost-effective and efficient than the private sector, where not only are costs higher but corruption and waste are rampant.

And one must distinguish between government spending and government investment. The former is expenditure on a short-term need that, while important to meet, may not lead to any positive return down the road. But government investment is money spent by the government to ensure long-term needs - ones that cannot be met by the private sector - are met in order to support the economy and society of our country. Those investments normally have positive returns, returns that can and should be measured and made public to set them apart from mere 'spending'. Take roads and other infrastructure projects. The private sector is simply never going to step up and say, 'hey, let's pay billions of dollars to improve roads, rebuild bridges and replace our rotting, dangerous sewage, drainage and water management systems in this country.' But without that investment, the private sector will lose more and more money over the coming years due to everything from supply chain disruptions caused by poor roads to closures caused by preventable flooding and water-supply interruptions. Addressing those issues will save billions and create a lot of jobs in the process, while delaying them does us no favors anyway: a repair that might cost $5,000 today may cost two to three times that much if we wait too long.

But wouldn’t raising taxes to balance the budget and repair our embarrassingly poor infrastructure just make the wealthiest Americans scared to invest? After all, we keep hearing that trillions of dollars in cash are sitting on the sidelines due to investor skittishness. Republicans point to this huge cache and claim that its owners are just champing at the bit to invest it, but alas, with so many regulations and taxes, what’s a billionaire to do? What utter and complete hogwash. These claims do not bear up under even modest scrutiny. In fact, the existence of all that sidelined money is an argument in favor of taking the opposite of the approach favored by Republicans: the wealthiest 5% of Americans, the ones who are sitting on these trillions, have absolutely no motivation whatsoever to invest it in jobs, even if conditions were ideal, so why keep their taxes low and allow them to accumulate even more money just to see it sidelined, too? With such vast wealth concentrated in so few hands, those people no longer need to invest to secure their financial futures. At some point, it is simply safer to live on that accumulated wealth, especially when the economy is uncertain. In short, they have no motivation to create jobs and add value for the economy. But if we taxed those wealthiest Americans and invested that money in the economy by funding things like sorely needed infrastructure improvements, we could force that money off the sidelines and push it into the hands of the lower 95%, who simply have no choice but to spend and invest it, since they have unmet needs and must still work hard and invest to secure their futures.

So, we want to create millions of new jobs and get this country moving again? Then let’s do what past Republican and Democratic presidents alike have known to be the sensible thing: return to a tax policy that discourages the accumulation of idle capital and that uses a high marginal personal tax rate to keep funds flowing through the system. Use those high marginal tax rates on the wealthiest to balance the budget, cut corporate taxes and rebuild this country. Then step back and watch our America get back to work.

19 August 2011

Lies, Damned Lies and....Averages

Mark Twain famously said that there are three kinds of lies: Lies, Damned Lies and Statistics.* I'd amend that: Lies, Damned Lies and Averages. Averages hide a multitude of sins, lies and distortions, especially in politics. Consider a statement like this:

"My fellow Americans, consider the past ten years: on average household income has increased 28%. The average American now has 49% more money invested in stocks."**

People nod and agree about how wonderful things must be on paper, even if they personally do not feel richer or better off....and with good reason. Consider this analogous situation:

I have 100 pies. You have 2 pies. On average we have 51 pies, even though I have 50 times as many pies as you do. Tomorrow I get ten more pies and you lose one pie. On AVERAGE, pie ownership has increased almost nine percent. On AVERAGE, we each have more pies than yesterday and on AVERAGE (and at the aggregate level), the total number of pies is going up. But are you better off than you were yesterday? Certainly not. Oh, well....at least 'on average' things are going swimmingly.
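(For fellow spreadsheet nerds, here is the same pie arithmetic as a tiny Python sketch; the numbers are just the ones from the example above, nothing more.)

```python
# The pie example: the mean rises even as one of the two parties ends up worse off.
day1 = {"me": 100, "you": 2}
day2 = {"me": 110, "you": 1}

mean1 = sum(day1.values()) / len(day1)   # 51.0
mean2 = sum(day2.values()) / len(day2)   # 55.5
growth = (sum(day2.values()) - sum(day1.values())) / sum(day1.values())

print(f"Average pies per person: {mean1} -> {mean2}")      # 51.0 -> 55.5
print(f"Total pie 'growth': {growth:.1%}")                  # 8.8%
print(f"Your pies: {day1['you']} -> {day2['you']}")         # 2 -> 1
```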

As wealth in this country finds its way into fewer and fewer hands and the gap between the richest and poorest continues to grow, dishonest people distract from this by touting averages in exactly these ways, proving once again that in the hands of talented political consultants, even the truth can be used to lie. Until people start learning to dig deeper, we will continue to be bamboozled by people who assure us that on average, things are going well, even when it's just a tiny sliver of the population that is benefiting.

--------------------------------------------------------------------------------------

*Actually, he attributed the quote to someone else, but there's no evidence that person ever said it, so we may as well call it Twain.

**These are not the actual figures; this is just an example.

13 August 2011

Of Gracchi, Gross Domestic Product and Growing Unrest

Looking at the riots in the UK and (earlier this year) Greece, I have to say that I am reminded of Rome in the final days of the Republic. The gulf between rich and poor was growing ever wider; more and more people were dispossessed even as the wealthiest patricians grew ever richer. A revolution was stirring, and two brothers, the Gracchi, hoped to ride it by leveraging the power of the plebes to effect change. They were each in turn murdered by a patrician class determined to preserve its power. Within another century, the Republic was swept away in favor of empire and the aspirations of the plebes were dashed for another couple of thousand years. In short, there was a revolution and it was successful: it was a revolution by the ruling class to ensure their power would be cemented for a very long time to come. And it worked.

In the early 20th century, the 'plebes' tried again. This time they were successful. And that turned out even worse! We got three generations of mind-numbing, soul-crushing, oppressive dictatorships in the name of (but scarcely to the credit or benefit of) the working classes. So revolution from the bottom up was just as helpful as from the top down.*

So instead of revolutions from either direction, could we perhaps have the wisdom to see the telltale signs of growing disaffection and take rational, reasonable, measured steps to stave off any radical moves from the top or from the bottom? Maybe Western governments need to take the British and Greek riots as wake-up calls and address the underlying causes of discontent. Across the developed world, people are fed up with the working and middle classes having to bail out and pay the price for what is increasingly looking like an oligarchic kakistocracy.

And as far as telltale signs go, how is the looting committed by the poor in London different from the looting the banks did of that country (and mine)? They were bailed out by everyone (poor included) while their executives were receiving huge bonuses and while the politicians who were supposed to be regulating them were at best absent, at worst complicit. If that's not looting, what is? Why is that morally better than smashing a window and grabbing a TV? My point is not to condone the rioting. It was wrong, period. But I just don't think it was that much morally worse than what the patrician class was already doing (and the riots caused a lot less monetary damage than the bailouts). But how many of those bankers went to jail? Again, I am not - REPEAT NOT - condoning the riots and their violence and destruction. I merely question the wisdom of our society's decision to condemn them while accepting other, equally immoral acts.

I do not say this to rabble-rouse. I am no Bolshevik or revolutionary for any class of people. I mean this as a warning sign: if we kill the Gracchi OR the czars, either way the story ends in blood and tears.

--------------------------------------------------------------------------------------
*This is of course a gross oversimplification, but what do you want? It's a blog and I summed it up in two paragraphs!

09 August 2011

Remember Rule No. 1: Don't Panic

As the markets melt down, remember Rule No. 1: Don't Panic! Going along with the herd and selling equities at a time like this is foolish. Personally, not only am I not selling, but I am buying and will continue to do so as prices fall, because equities are like anything else: the lower the price (assuming it's a sound company), the better for you in the long term.

The only scenario in which it makes sense to sell right now is if you honestly, truly believe that the world is coming to an end and stock prices will therefore never, ever recover. Barring that, though, this is the perfect time to buy stocks and I will be doing so aggressively as the slide continues. When the panic stops (and remember, it always does eventually) and stocks begin to recover, I will have taken advantage of buying in at much lower prices, while the herd will have dumped all their shares at the bottom, only to start buying again at higher prices.
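For the curious, here is a quick back-of-the-envelope sketch in Python of why buying on the way down lowers your average cost. The share counts and prices are made up purely for illustration, not a recommendation or a record of my own trades.

```python
# Buying equal lots as the price falls drags the average cost per share down.
buys = [(10, 50.0), (10, 40.0), (10, 30.0)]   # (shares, price) as the market slides

shares = sum(n for n, _ in buys)
total_cost = sum(n * p for n, p in buys)
avg_cost = total_cost / shares
print(f"Average cost per share: {avg_cost:.2f}")            # 40.00

# Even a partial recovery to 45 leaves the position in the black,
# although the first lot was bought at 50.
recovery_price = 45.0
print(f"Gain if price recovers to {recovery_price}: {(recovery_price - avg_cost) * shares:.2f}")  # 150.00
```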

06 August 2011

Personal Genome Project: Update August 2011

As you may recall from one of my first posts back in January, I decided to participate in the Personal Genome Project. Well, it's finally underway and I am among the first 100 participants! I received my DNA sample collection kit from Dr. Church's lab this past week and returned it. I also had to fill out some surveys, provide other information, link to my GoogleHealth page and upload my 23andme.com genetic profile. Thanks again to Mark Stevenson and his book, An Optimist's Tour of the Future, for telling me about this amazing project.

While scientific curiosity and fascination with the possibilities are what drove my decision to participate, I must also confess that there is just a wonderful 'cool factor' here. How many people in the world have had their entire genomes sequenced? Not a lot. It's thrilling and, yes, I admit it, a little scary, too.

I am not certain when I will start to hear about results and findings, but I will update this blog as I learn more.

05 August 2011

The Debt Deal: Three days old and already a failure....so let's fix it

Well, we got a deal. And what a deal it was. No new revenue, nothing to create (and a lot to kill) jobs, no reform of entitlements, no long-term solutions to the underlying debt issues. In short, the perfect Republican deal. Obama caved yet again. And three days later, it has already failed: S&P is cutting our rating anyway, claiming we still don't have our debt act together.

So why did we fail? It's because we accepted a false premise. Ever since Obama caved last year on Bush tax cuts for the wealthy – despite the fact that a huge majority of Americans supported his position that the wealthy shouldn’t get more tax breaks – the left has essentially ceded the point that lower taxes=more growth. This idea, which became popular under Reagan, persists to this day as the gospel truth. There's only one small problem: it's never been supported by facts. But it is so appealing on an intuitive level that few people question it. It just makes sense: government takes less, people spend and invest more, more jobs are created. And if it weren't for the pesky little fact that there is no evidence to support this thesis (and plenty to refute it), I'd agree with it. But if you stop for a moment and give it further thought (something inconvenient in a world of sound-bites, I know), it really doesn't make all that much sense. Consider the reasoning more closely: if I raise taxes and leverage them to create a more redistributive system, lower-, middle- and upper-middle-class families get more, and the wealthiest get less. Now consider what happens when I give an extra dollar to a wealthy family: they don't need this money to pay bills or even buy new things; their material needs are already met. So they can just put it aside and keep it on the sidelines (along with the trillions in wealth already sidelined in this country). But a family lower down the scale will run out and spend it on all the things they need and/or want.

Ah, but the Republicans counter, it's the wealthy who create jobs! It will trickle down. Well, except they don't. Corporations create jobs; the wealthy just benefit from the profits of those corporations through their investments. So it's corporations we should stop taxing altogether and the wealthy we should tax more, to the degree necessary to balance the budget and give more to the lower classes who will actually go out and spend that money to create jobs. In short, it's the lower through upper-middle classes (through their spending) and the corporations (driven by that demand from said spending) who are creating jobs in this country, while the wealthiest simply reap the benefits. So why is it Republicans want to give the most to this class that needs the least and contributes the least? Because, as George Bush said in a rare moment of frankness, that's their base.

If Obama were miraculously to grow a spine, he should propose this to save our credit rating: 1) Eliminate the corporate tax entirely (leaving just the payroll tax). 2) Raise marginal income tax rates on all households whose annual income from all sources is greater than $200,000 (and include an automatic inflation index), starting at today's rate at $200,000 and ramping up slowly so that at the very top the marginal rate reaches 75%. (Don’t panic! That’s the marginal rate, the rate you pay only on the part of your income that falls into that bracket, not on all your income once you enter that bracket.) Make the rates for dividends and capital gains rise across the same brackets, but keep them slightly lower, in order to encourage investment. 3) Raise the Social Security and Medicare retirement age to 70, with the change phased in over time, and introduce means-testing for families with net worth over $10 million (in 2011 dollars, pegged to CPI).
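To make the marginal-rate point concrete, here is a little Python sketch of how marginal brackets work. The bracket boundaries and rates below are made-up placeholders for illustration, not the actual proposal; the point is simply that each rate applies only to the slice of income inside its bracket, so the effective rate always stays well below the top marginal rate.

```python
# Hypothetical brackets: (lower bound of bracket, rate applied above that bound).
BRACKETS = [
    (0,         0.10),
    (50_000,    0.25),
    (200_000,   0.35),
    (1_000_000, 0.75),
]

def tax_owed(income: float) -> float:
    """Apply each rate only to the slice of income that falls in its bracket."""
    owed = 0.0
    for i, (lower, rate) in enumerate(BRACKETS):
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > lower:
            owed += (min(income, upper) - lower) * rate
    return owed

# Even with a 75% top marginal rate, the effective rate is far lower:
for income in (150_000, 500_000, 2_000_000):
    owed = tax_owed(income)
    print(f"income {income:>9,}: tax {owed:>11,.0f}  effective rate {owed / income:.1%}")
```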

That would be sound fiscal policy and would not only eliminate the year-on-year deficits, but actually put us in a position to slowly pay down the overall debt. What's more, it would be pro-growth. In exchange for agreeing to this overhaul, Democrats could even agree to a balanced-budget amendment, assuming it contained responsible language to make exceptions under certain circumstances (e.g. war, certain economic conditions, etc.).

22 July 2011

Norway

I spent four years in Norway, where I did my undergraduate degree in modern languages and international studies. From 1991 to 1995, with the exception of some summer vacations in Spain and the US and a semester abroad in France, Norway was my home. I had mixed feelings about Norway. People can be quite cold and rather provincial and don't even get me started on the climate. But overall, I love the country and visit my son there at least once a year.

What I have always loved most about Norway is the sense that, in an insane world of conflict and hate, here was a corner of the world where sanity prevailed. People voluntarily vote for high taxes to help eradicate poverty, and as a result, there is almost zero poverty in Norway. Despite the lies of Republicans here in the US, it turns out that you can have high taxes, low poverty and high wealth, too. Norwegians are wealthier than Americans, they live longer, they report higher levels of happiness.

Today, that nice, cozy, safe place in northern Europe was shattered by a horrible, senseless act of violence. There's no definitive word yet on who is responsible, but early reports suggest it is homegrown extremism, not the Islamic variety. That horrifies me, because I think that the Norway I love and respect could overcome Islamic terrorism and come out whole on the other side; but I am not sure Norway can stay the Last Homely House of the world if it turns out this was a Norwegian act against Norway. If I were a praying person, I would pray for Norway to come out of this unscathed and still determined to remain a beacon of fairness and justice. Please, Norway, don't let this event deform your sense of decency and fairness the way we let 9/11 destroy our sense of fair play and decency.

17 June 2011

Great Books, Part 2 of ∞ : The Rational Optimist

The second book in my ∞-part series of book reviews is 'The Rational Optimist' by Matt Ridley. I scarcely know where to begin, which is why this review is being started so late and likely will not be meeting your eyes until considerably later. This book is unnerving to me because I am so conflicted about it. On the one hand, I have no disagreement with Mr. Ridley on his core thesis, which boils down to the idea that progress is inevitable when ideas are allowed to propagate freely, and that progress in fact speeds up faster and faster the more ideas, as he puts it, are allowed to 'have sex'. If we don't meddle with creativity and just let the great marketplace of ideas run freely, we will reap rewards with no diminishing return, hence his giddy (yet, in theory, rational) optimism. Throw in unfettered comparative advantage and we might as well stop worrying about anything at all! Sounds great. And in fact, Mr. Ridley managed to change my way of thinking on a few subjects. But still, with so many, many problems of logic and reasoning, not to mention just plain factual errors, I can't share his unbridled enthusiasm.

So where do I begin? I am not even sure I can begin here in a blog. To be honest, the issues and problems addressed in Mr. Ridley's book have made me think there may be an entire book just in addressing it all. I have been taking notes as I read through the book and so far I have several dozen pages of comments and rather alarming issues...and still have another 100 or so pages of his book to go! (Yes, rather dilatory, but in my defense, I have in the meantime read another 12 or so books...I can never be accused of focus.)

Stay tuned.

15 June 2011

Basic Questions that Smarter People Must Answer for Me NOW

If the dinosaurs were killed off in a mass extinction, then how can they simultaneously be the precursors of chickens and other birds? Isn't this sort of an either/or thing? Either they all died off, or they didn't, but rather evolved into modern avians. Or are we saying most died off, but the ones who survived evolved into that McChicken sandwich? And if the latter, then which died off and which survived to give me chicken nuggets? I would like some clarification here, please. Is T Rex dead or just nibbling corn awaiting his tasty, tasty doom?

The size of the universe. OK, seriously, this makes my head hurt. 1) The universe exploded from an infinitely small point 13.7B years ago (more or less). 2) The universe is infinite. 3) No object can travel faster than the speed of light. How do you square this circle? If nothing can travel faster than the speed of light, then no object can currently be more than 13.7B light years from the ultimate 'ground zero', ergo the universe must be finite since nothing in the universe can be further away from that defined point than 13.7B light years. Q.E.D. EXCEPT THEY KEEP TELLING ME IT'S NOT FINITE! Ouch. Seriously, someone explain this.

27 May 2011

Gene-ious

I said yesterday that I had gotten my genome 'partially sequenced' at 23andme.com. That is not technically accurate: I got genotyped, which is just taking one's DNA and looking at genetic variants (as opposed to sequencing the genome, which is of course far more involved, though in principle it's the same idea). As noted in an earlier post, I have also volunteered for Dr. Church's PGP (Personal Genome Project), which in fact would lead to having my genome entirely sequenced. They are looking for 100,000 total volunteers, but I think that their eyes may be bigger than their wallets on that, at least in the short term.

So why did I do the genotyping? I had a number of motivations:

1) Sheer curiosity. I just wanted to learn more about the process and its efficacy and results.

2) There was probably some morbid curiosity as well: it's not every day one gets insight into what will ultimately kill one. Of course, no one knows for sure how one will die (insert cliché about getting hit by a bus tomorrow yada yada), but genotyping can give you a pretty good picture. For the record: stroke and Alzheimer's are my two most likely means of exiting the stage. Neither came as a great surprise given family history.

3) A peek into my children's future. What I have as genetic predisposition or as recessive carrier trait can of course impact future generations. Interestingly, as far as carrier traits, I had...zilch. From Alpha-1 Antitrypsin Deficiency to Torsion Dystonia, I seem pretty much devoid of any unfortunate recessive carrier baggage.

4) Taking control of my health. Knowing that I am 25% more likely than the general population to suffer from stroke or other events or maladies stemming from atrial fibrillation, I am certainly going to be more careful about heart health. Knowing that I am twice as likely as others to develop Alzheimer's means I am certainly going to figure out a more obvious place to keep my keys. And on the other side of the coin, I am going to relax a little bit about type II diabetes: while it may be present in my family, I am far less likely than most people to develop it myself. (For how those relative figures translate into absolute odds, see the quick sketch after this list.) And most interestingly, I even have some insight into the efficacy of certain drugs in the event I do fall ill. Plavix, for example, is less likely to help me should I need it, which is a very good thing to know for someone with the aforementioned predisposition towards atrial fibrillation. Knowing in advance that other alternatives might work better could make all the difference. (On the sunnier side, should I ever need interferon, I am more likely than others to respond well to it.)

Another very useful bit of information was about diet. I am very physically active, but all my adult life I have had to work hard to keep the spare tire of 10 pounds or so off my stomach. I have usually combated this through exercise and a low-fat diet. Turns out that might not be the right approach for me; it might even be the problem. I have a gene that not only makes a low-fat diet ineffective, but actually makes things worse. With this gene, I am MORE likely to carry extra pounds if I eat a low-fat diet. Mind you, this isn't a license to go and raid McDonald's every day: while I will benefit from a much higher fat intake, it has to be high in just one variant of the good kind of fat, monounsaturated. So it's good-bye low-fat foods, hello nuts, poultry and oils like olive, canola, etc.

5) Genealogy. It confirmed what I already knew about my paternal side: Irish origins. Interestingly, on my mother's side it's a bit murkier than expected. I could be anything from Druze Muslim to Ashkenazi Jew to Basque (or something as prosaic as British origins, which is certainly the case going back the past few centuries, as my genealogical research confirms). Makes me wonder about the long path my ancestors took in their journey out of Africa.

6) Getting an understanding of what I am working with v fighting against. There are two possible attitudes one can take here. I can see a genetic predisposition towards not being a sprinter as an excuse not to run. Or I can take the attitude that I must try all that much harder, and note that the same trait leaves me more favorably disposed towards endurance running.

7) Having a laugh. In theory, I should more likely be a blue-eyed man of shorter than average height, with a balding head. (My eyes are hazel, I am 189cm tall and as of age 39, all the hair's still there and the shower drain is clean, knock on wood.)
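Since relative-risk figures like '25% more likely' and 'twice as likely' are easy to misread, here is a rough Python sketch of how a relative risk translates into an absolute one. The baseline population rates below are made-up placeholders, not figures from my actual 23andme report; only the relative multipliers echo the ones mentioned above.

```python
# Relative risk multiplies a baseline rate; it says nothing on its own about
# how likely something actually is. (Baseline numbers here are illustrative only.)
baseline_lifetime_risk = {          # hypothetical population averages
    "atrial fibrillation": 0.20,
    "Alzheimer's":         0.10,
    "type II diabetes":    0.25,
}
my_relative_risk = {                # e.g. 1.25 means "25% more likely than average"
    "atrial fibrillation": 1.25,
    "Alzheimer's":         2.0,
    "type II diabetes":    0.6,
}

for condition, base in baseline_lifetime_risk.items():
    mine = base * my_relative_risk[condition]
    print(f"{condition}: population {base:.0%}, me roughly {mine:.0%}")
```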

All in all, I would certainly recommend the 23andme.com service. I have spent USD 99 on far stupider things. A word of warning, though: they do make you explicitly request to have certain results revealed to you, and you should consider that carefully. For example, when you log on for the first time after they finish your kit, most results are right there on the page; but for, say, Parkinson's, you have to click through a warning that basically asks, 'Are you SURE you want to know this?' It could be devastating. I was required to go through the same process for the Alzheimer's one. I opted to look because I figured it couldn't be worse than what I feared. (It wasn't. It was actually better: there's a 75% chance I will never develop it; worse odds than most people, but far better than I had feared.) But give that some careful thought before clicking through.

Beam me down, Scotty.

I said recently that there are some annoying aspects to sci-fi. But when I watch old episodes of Star Trek from the 1980s/1990s, as I did this past winter as the snow hemmed us all in, what strikes me is not how outlandish some of it is, but how much they underestimate the tide of science and technology. As I said in a recent post, there are those who suggest that we are nearing a technological singularity, a point past which society will be unrecognizable to those on this side of the singularity. I said then and repeat now that I am skeptical of this idea, but there are signs that something is hurtling towards us, for good or ill.

Look at some of the 'marvels' in those old Star Trek episodes:

-I recently saw an old episode in which they talked about the problem of it taking months before they could sequence a genome to find the problem at the crux of the episode. In the 24th century. In 2011, it takes a few weeks and the price is in the tens of thousands of dollars. I suspect that within a few short years, it will take days and cost hundreds. I have had mine partially sequenced for USD 99!

-In one episode, set in the future even in relation to the show's timeline, a man's sight was restored by implanting cloned eyes. A few years ago, a woman in Spain had a new windpipe implanted: it was essentially her own, as it had been 'manufactured' from an old one using her own cells.* Scientists are now experimenting with growing organs on demand. Liver failing? We'll grow you a new one and get it to you next week. That's not 24th century. That's probably a few years away.**

-In another scenario, that same blind man's clunky 'visor' was replaced with mechanical eyes a few years later. This 'breakthrough of the 24th century' is already happening now, in fact. The first primitive (60-pixel) prosthetic eyes have already been developed and approved for implantation. Wearers can at least discern light and up to eight colors, and see well enough to navigate safely. And that's version 1.0. I would be surprised if we didn't have HD-quality resolution that restored most sight within another generation or so, right here in the Dark Ages of the 21st century.

-Remember the old tricorders and communicators from the original show? My iPhone can do 1000 times more than those clunky things ever could.*** I can whip out this USD 199 device and do things Captain Picard would need his ship's computer to do.

-Speaking of computers. Are you kidding me? That piece of junk on the Enterprise often did things in hours that my laptop could do in minutes or even seconds. There are already supercomputers with memory and speed comparable to those of a human brain. In ten years, we'll probably have laptops of that capacity. And how about those huge computers on the original series?! Please. We surpassed those before Shatner bought his third toupee.

-On a related note, I saw an episode in which the characters were in awe of an android that performed at 60 teraflops. I am no expert on this, but from my admittedly cursory investigation, it seems that's peanuts! Our current supercomputers are measured in petaflops, not teraflops, and a single petaflop is a thousand teraflops.

So skip the beam-up, Scotty. I am doing just fine down here.

--------------------------------------------------------------------------------------
Footnotes:

*You know who you are when I say that at least this time I remember where I read it.

**Well, assuming they can sort out the pesky issue with the connections: making an organ and hooking it up to the body's blood supply are two separate tasks, and it turns out the latter is harder. It's like being told that creating an HD TV from scratch is easy-peasy, but figuring out how to plug it in is superlatively difficult.

***Except getting me beamed up.

04 March 2011

B.E.S.T.

Since the collapse of the colonialist/mercantilist era, all the great economic -isms have been centered on how national economies can increase, maintain, and internally distribute their fortunes. The hows were always predicated on the whys: we created and maintained wealth in a capitalist economy through free-market mechanisms because we believed that the wealth of the individual was an extension of the freedom of the individual; or we created and maintained a command-and-control economy through centralized planning because we believed that the wealth of the society was an extension of the responsibilities of the individual towards the society.

Spoiler alert! Looks like capitalism is more or less in the lead. We are still arguing about degrees, but there is now no major debate, even in 'communist' countries*, about the suitability of the free-market capitalist model. So for the sake of argument, let's call it settled.

So the -isms of the 21st century must turn on very different questions. It is no longer a question of how we gain our wealth, but how we spend it and how we build the legacies we endow with our wealth. And make no mistake: it is a huge amount of wealth, despite the setbacks of recent years. Compared to any other time in history, people in developed countries are better off than ever before. Your average lower-middle class American enjoys better food, shelter, and technology than any medieval emperor could even imagine.

If you are a pure free-market capitalist, the answer to how to spend this wealth is very simple: have few to no taxes and let people spend their money freely, with little to no obligation towards society. If you are of that mindset, you might as well stop reading right now. You won't like a single word of what I have to say.

Now that the pure free-marketeers have exited stage right and in a huff, let's discuss some alternative models. The very idea that there are alternative models seems to scare many people. As soon as one says, "I have an alternative to free-market capitalism", our 20th-century minds automatically react negatively, working on the assumption that one must mean some variant of socialism. But remember that we have moved past that conversation entirely. Again, this isn't about how we accumulate wealth. This is an entirely new conversation not about how we get it, but how we spend it.

At this point, most people might say that how we spend it is in fact a function of how we got it: if free-market capitalism worked for getting the wealth, must we not simply spend it on the same principle, as free-wheeling consumers? My answer is simple: spending it however we like isn't making us happy as a society and is perpetuating a culture of debt and emptiness, and an overall sense that we have no collective (and often even individual) purpose as a community. The easy part about positing this belief is that I do not need to provide proof of it: I think that any Western citizen reading this will instantly connect with and feel exactly what I mean here, which is itself all the proof I require. If I am wrong, please write and tell me how our society provides a sense of community and direction, of purpose and legacy, of pride of place in history, of patrimony we happily pass along to the next generation. I look forward to hearing it! I do not expect my inbox to overflow.

So what can we do to provide a fulfilling outlet that will make all the hard work and hard-won wealth seem worth the effort? My proposal is summed up with a simple acronym: BEST. Building, Exploration, Science and Teaching. BESTism - for what is a belief worth if it is not instantly converted to an -ism? - is a reaction and alternative to the simple consumerism that we all thought was the only possible product of capitalism. It's my attempt to say that we don't have to reject capitalism in order to reject many of its ills, because those ills are not necessary outcomes of the system, but are instead unnecessary outcomes of what we do with the wealth created by that system.

I'd like to explore the details of BEST in future postings, but for now will close with a counter-point to the single most obvious objection to the idea. The clever free-marketeer will say, "But without that reviled consumerism, there would be no wealth or capitalism upon which to base these fine pursuits." I reject that argument for the very simple reason that it is factually inaccurate: contrary to popular opinion in the US, money spent on public works, education and exploration does not simply disappear into thin air; quite the contrary: that money fuels growth and jobs in ways that often surpass those of the private sector, not least of all because such efforts strongly engage the private sector.


--------------------------------------------------------------------------------------
Footnotes:

*How ironic that one of the few remaining communist countries, China, is in fact the most capitalist of all.

13 February 2011

Egypt, Ends and Means

Congratulations to the Egyptian people on a wonderful, courageous revolution. I hope it spreads!

In light of recent events, I'd like to revisit a theme I have addressed many times: ends and means, and why Realpolitik isn't just immoral, but fundamentally irrational and counter-productive.

In 2005, I wrote, "[Successfully redefining our foreign policy] also means being honest with ourselves about the nature of the regimes in our so-called allied countries like Saudi Arabia, Kuwait, Pakistan and Egypt, regimes that are little better than that of Saddam [Hussein's former regime in Iraq] and, for our own interests, perhaps even worse."

And in 2003, "At least since the moment the morally depraved philosophy of foreign policy espoused by Henry Kissinger became our guidebook for dealing with the world, we have marched from one blunder to another. Why is it so hard for us to learn our lessons? We supported the repressive South Vietnamese regime simply because they weren't the communist North Vietnamese. We put the brutal dictator Pinochet into power just because he wasn't the socialist Allende. We supported the right-wing forces of Savimbi's UNITA just because they weren't the left-leaning MPLA of dos Santos. We helped create the monster Saddam because he wasn't the Ayatollah. We supported the fundamentalist Mujahideen, the precursors of the Taliban and Al Qaeda, simply because they were fighting our communist enemies. And on and on and on, right down to our current tolerance of brutal Afghan warlords. This is the legacy of Kissinger's...[American brand of] Realpolitik: we consistently abandon moral principles in favor of short-term expediency."

In short, I suppose I could live with Henry Kissinger and his ilk, with their smug, self-satisfied contempt for decency in foreign policy and their ends-justify-the-means philosophy, if their policies EVER actually worked EVEN ONE SINGLE SOLITARY TIME. But these guys just never seem to get tired of being wrong. And for reasons I will never understand, every foreign policy 'expert' in every American administration (and from both parties) seems to agree with them, despite no one EVER seeing evidence to suggest they are justified in their confidence. This month, with the collapse of the Mubarak regime in Egypt, they have been proven wrong (and hypocritical) yet again, and Obama was left standing red-faced next to (but supportive of!) a dictator one moment, asking him to leave the next! So just how many more examples do we need to burn through before we accept that there is nothing realistic or practical about Realpolitik? It turns out that letting decency and morality guide your foreign policy is also actually the logical, rational, reasonable thing to do. Why is that so hard to accept? Is it because people are afraid of being thought naive, gullible or foolish? But if the old way of doing things fails repeatedly and consistently, isn't sticking to Realpolitik the naive, gullible and foolish thing to do?

So here's a wild idea. Let's adopt a foreign policy that is consistent with the best ideals of democracy and decency. Let's stop propping up the bad guys, even when it's convenient for us in the short term. Let's stop selling weapons to thugs and sadists. Let's stop funding one terrorist to fight another. I'm not saying we ride in and play hero: as we saw in Egypt, revolutions work best when the heroes are homegrown. But if we behave in a manner consistent with our own values, at least we won't look so hypocritical when we try to stand next to those heroes once they've beaten the villains; and at least those defeated villains won't have been our friends.

03 February 2011

Great Books, Part I of ∞ : An Optimist's Tour of the Future

I've no experience at all with book reviews. Giving my opinion is hardly problematic, as I warned in my inaugural post. But summarizing is not my forte. If anything, I tend to do the opposite of summarizing: give me a paragraph and I will give you a book. When trying to review an entire book, then, well....I just hope blogspot doesn't charge by the word.

Nevertheless, here I am trying to do a book review. Why? Because I am suffering from an embarrassment of riches of late. I have come across so many wonderful books in the last couple of years that I am bursting to share them all. I haven't come across such a wealth of wonderful reading since I was a very young man*, back when ALL wonderful literature was new to me. So, over the coming months, I will share some of these titles and my thoughts on their content and worth.

Among the most recent is Mark Stevenson's An Optimist's Tour of the Future, an insightful and inspiring (if occasionally mildly terrifying) book about the latest trends in all the technologies and ideas that will shape the world to come. I was fortunate enough to read an advance copy of the book, which is being released in the US on 3 February 2011. After I read it, I began a correspondence with the author, who is one of the most genuinely kind people I have had the pleasure to 'e-meet', the electronic nature of our acquaintanceship notwithstanding. I mention this only in the interest of full disclosure: I am reviewing a book by a person whom I have come to know (albeit to a necessarily very limited extent). But to be clear, reading and admiring the book came first and my reflections are thus free of any bias: I would not have reached out to the author had I not already respected his work. So with all the disclosures out of the way...

Think back over the past few years and think about the books you've read on the current state of the world and/or its fast-approaching fate. Then, when you get back from the pharmacy and take the copious amounts of anti-depressants needed to cope with those books, pick up a copy of this book and throw out the pills. Amid all the doom and gloom, here's a blossom of hope. Mind you, Mr. Stevenson is no naïf in rose-colored glasses: he approaches his subjects - among them some of the world's most brilliant people - with an intelligent skepticism, challenging their assumptions and never letting them off the hook when they try to wiggle out of the tough questions.

To get a sense of Stevenson's style and approach in this book, think about the motivations behind "What Are You Optimistic About?: Today's Leading Thinkers on Why Things Are Good and Getting Better", combine it with the probing intelligence and never-say-die quest for creative answers behind "Freakonomics", then dash in the wit and wisdom of a Bill Bryson.

Each section of the book covers a specific topic, with subjects ranging from transhumanism to robotics to the environment to genetic engineering (to name but a few). But more interesting still are the people working at the cutting edge of these fields. In each section, we follow Mr. Stevenson around the world as he visits some of these leading minds of our time, visionaries like Ray Kurzweil, George Church and Vint Cerf. Through wit, charm and intelligence, he elicits a level of frankness that you will not witness in any other interview format. (In that sense, the book is worth the price for the biographical components alone.)

I think the biggest selling point of this book, though, is the way it alters the reader's whole way of looking at an exciting future that is so much closer than most of us might think. Stevenson calls it a 'reboot', and that's a very apt descriptor: the reader finishes the book with a sense of awe (and yes, some trepidation) about a future in which everything we have taken for granted for so long is suddenly washed away in favor of very new definitions of things as fundamental as success, happiness, relationships, even mortality.

So put down the doom and gloom for a while, turn off the 24/7 parade of dismay and pick up this reason to be optimistic. The future is going to be a wild ride, and Stevenson's book is a good road map.

--------------------------------------------------------------------------------------
Footnotes:

*1938-ish?

01 February 2011

Beam me up, Scotty.

I love science fiction. There. I admitted it. Star Trek. Star Wars. Stargate. Star-whatever. I love it all. But I do have an issue with what I call the internal coherence problem.

Internal coherence means that no matter how odd the rules of the game may be - i.e., no matter how fantastic the sci-fi premise is - once those rules are laid out, the plot must adhere to them. In other words, the plot must be coherent with respect to whatever rules were used to define its own universe. So make up the rules and make them as crazy as you like, but once they are made, you have to follow them. And if you fail to set out rules, then your plots must adhere to basic logic, even if the rules of physics are set aside for the sake of fun. Let's take some examples.

-Language. No surprise I am starting here! I have no problem with Star Trek's universal translator. It's a clever plot device, the linguistic equivalent to the very convenient transporter technology of that same show. And at least it helps to address the problem of why everyone in space appears to speak perfect English. But it should follow basic logic in its approach. For example, it either works or it doesn't: it is confusing when 90% of the time it is on automatic and translating everything people say, but then suddenly a character says a word in another language and that one word isn't translated.

One area where it shouldn't work, no matter how clever these 24th-century programmers are, is with a completely new language. I don't care how advanced your technology is, me saying to you, "Hello, I greet you in peace, and by the way, your starship is double-parked in a handicap zone and is about to be towed" does not provide sufficient information for you to decipher my entire language. You can't deduce "I'd like to order a pepperoni pizza" from someone saying, "Hey, I love your cool starship".

-Space. You'd think one no-brainer area for people writing about stuff happening in space would be, well, space. I guess we're just so accustomed to operating in two dimensions that we forget how much space there is in space! For example, sometimes a ship is 'surrounded' by enemies ahead and to its flanks. It's SPACE, not the ocean. Just go 'up' or 'down' instead of forward. (These terms are used loosely, because of course there is no 'up' or 'down' in space...there's no fixed body against which to measure something as 'up' or 'down'....which is sort of the point here.) The other major problem is sound, which can't exist in the vacuum of space; but I give them a pass on this one: you need sound as a dramatic effect.

-Planets: Again, if you're writing about outer space, this should be something you get right pretty frequently, but alas, no. Some oddball things about planets in sci-fi: 1) Why is it that despite these planets often being at least as big as (if not bigger than) Earth, everyone always crash lands in the same place? What are the odds that two ships from space, landing independently at different times and with no predetermined plan, would land in, say, Tucson, Arizona in the US? Vanishingly small. 2) Why is it that every planet save ours has a world government with a single capital for the whole planet, and everyone speaks the same language across this world? We have ca. 200 nations and speak over 6000 languages...why do aliens just have the one government and the one language?

-Aliens. Why does everyone in space look human, only with funny noses or foreheads or ears?! Strange as it may seem, this is actually one that doesn't bother me as much as the others due to a variant on the theme of the so-called Law of Mediocrity*: evolution will often find the same answers to the same or similar questions. So, if billions of years of evolution lead to the dominant species being a (relatively) big-brained biped with a nose, two eyes and two ears, then it is not unreasonable to expect a similar model if circumstances are similar elsewhere. And since our definition of intelligent life presupposes an Earth-like planet, it may be reasonable to expect intelligent beings from space to be not entirely different from humans. In theory. Possibly. [Insert your own generic long string of caveats here, because we have no way of testing this idea!]

-Matter matters. For reasons I don't understand, one favorite plot line across many sci-fi universes is the 'matter-less being' plot. Example: there's an accident (or some crazy device involved) and characters X and Y become invisible to everyone and can pass through walls. Fine...it is sci-fi after all. But then, how do they stay standing on a floor? If gravity and matter are irrelevant to them, what affixes them to the floor even as they can walk through walls? Shouldn't they be able to dive through floors as easily as they walk through walls? Very confusing.

*Sigh* There, I feel better now. Now I can go back to watching more cheesy sci-fi.

--------------------------------------------------------------------------------------
Footnotes:

*It's not really a law.

31 January 2011

Oenology: Allow me to wine a little

I love wine. I love the way a good wine smells, the way it looks, the way it tastes. Here's what I DON'T love about wine: wine snobs. For reasons that have never been apparent to me, wine draws pretension like honey draws flies. Why is that? Wine-making is a very down-to-earth endeavor with agrarian roots (so to speak). When was it hijacked by snobs with snifters?

The most annoying aspect is the way so many people pretend to experience wine. The human palate is in fact a fairly crude tool. Even people gifted with extraordinary palates are limited to three, maybe four different tastes at a given swig. So it annoys the bejesus out of me when I hear people talking about "dark chocolate leather with a hint of raspberries and two...no, scratch that...three-day old emmental cheese from...Wisconsin...southern part of the state." OK, I am exaggerating. But not by much! Why is all that nonsense necessary? What's so wrong with just saying, to take sauvignon blanc as an example, "That's a great wine. Tastes like grapefruit."? Why pretend you also taste ten different kinds of passion fruit with buttery overtones and nutty undertones, when it's not physically possible for a human to make that many different taste distinctions?

I'm also not a fan of wine ratings. I think Robert Parker has done a huge disservice to wine, even if he has done a great service to wine-makers' wallets. What does it mean to say that a wine gets 93 v 92 v 80 v...whatever? To me, it's akin to saying that Munch's Scream is a 93 while Van Gogh's Starry Night is a 95. Huh? How do you put numbers on something so subjective? Not only is it impractical; it's demeaning. To suggest that something so subjective can be scored is to suggest that it is not in fact subjective at all, that it is formulaic and measurable and that there is therefore some 'right' answer. How absurd! You might just as well say that great art can best be done using paint-by-numbers kits!

The most unnerving thing, however, is wine snobs who foolishly spend hundreds of dollars on wines, working under the delusion that more expensive = better. Mind you, there are some excellent wines in those price ranges. But it does not necessarily follow that a $200 bottle of wine will be twice as good as a $100 bottle of wine, or for that matter that it will be at all better than a $10 bottle of wine. And besides, where's the fun in getting the most expensive wine? Anyone with a fat enough wallet and a decent sommelier at his or her disposal can pair a wine with a food. But try finding a solid $20 bottle of wine to go with that lamb...that takes some thought.

At the end of the day, here's what matters: enjoy what you drink, pretension and snobbery be damned. If you like white wine with steak, have white wine with steak.* If you want to put ice cubes in your zin, go for it. If you love that $15 bottle of pinot noir and think that $50 Chateauneuf-du-pape is swill, then drink that pinot with a smile. And when you hear some schmuck order the most expensive wine on the menu and go on about its delightful nose and hints of tinkleberries, just laugh and keep right on enjoying that $15 pinot.

--------------------------------------------------------------------------------------
Footnotes:

*I don't recommend that particular experiment. It would be pretty gross, I'd think. But if you like it, go for it! And while Sauternes with steak might be a stretch, this does bring up another good piece of advice: ignore that nonsense about 'white wine for white meat, red wine for red meat'. There are many good reds that go with, say, chicken.

30 January 2011

Language Part III: Random Things

OK, for round three, there is no coherent theme. It's just about random things I feel like addressing!

Subjunctive v indicative. For those of us who've studied Latin-based (aka Romance) languages, this is not new. "¡Ojalá pudiera verte!" not "¡Ojalá podría verte!"(Span.) or "Je veux que tu prennes du café" not "Je veux que tu prends du café" (Fr.) or "Spero che tu stia bene" not "Spero che tu stai bene" (Ital.) and so on.

But what about in English? Does English have a subjunctive? Yes, but its usage is on the decline. Consider "If I was in your shoes" v "If I were in your shoes". Most people can identify the latter as sounding more correct (and they are right); but we all acknowledge that the former is becoming more and more common. However, there are certain areas where the subjunctive aids disambiguation, so I expect it to stick around for a long time in those. Consider "Was I in charge" v "Were I in charge": if you fail to use the subjunctive, it sounds like a question. We also find the subjunctive holding on in many fixed expressions, e.g. "If need be", "Truth be told", "far be it from me", etc.

The main reason it appears to be disappearing is that in so many cases, you simply can't distinguish between indicative and subjunctive in the modern form of English*. English has very simple verb conjugation, so the same conjugation is used across many different persons, numbers and even tenses. Consider the present tense indicative: only one (third person singular) is conjugated differently from the rest, e.g. I run, You run, S/he/it runs. So even when we are using subjunctive, it is often disguised because it is indistinguishable from the indicative. Example: "I require that you come to dinner." This is subjunctive, but it is conjugated no differently than if it were indicative, e.g. "You come to dinner". But try it with the third person singular and the subjunctive more obviously rears its head: "I require that he come to dinner", not "...comes to dinner".

'An'. What could be easier than the use of 'an' v 'a'? If it starts with a vowel, use 'an'; otherwise, use 'a'.

Well, two problems here. The first is a common misunderstanding about what a vowel is: vowels are sounds, not letters. The same is true of consonants. That's why linguists use the IPA (International Phonetic Alphabet): to understand language, you must divorce sounds from letters. It just so happens that the letter 'c' is used to represent a /s/ sound in English sometimes, but that sound is not inherent in this graphical representation called 'c'. So, applying this to our vowel issue, remember that 'an' is used before a vowel sound, not just before a letter we usually associate with vowels. Example: we normally associate 'u' with a vowel sound. But sometimes it plays the role of what's called a 'semi-vowel', a sort of hybrid between a vowel and a consonant. When it does, as in the case of 'universe', we do not use 'an'. No one says "an universe", right?

The trickier problem is related to a very unstable phonetic element: the aspirated 'h', e.g. the first sound in 'historical'. This is a very unstable sound in languages generally, so it tends to disappear over time. That's why (in American English) 'herb' is now pronounced more like 'erb'.** (More on this later.) So when we have these cases of unstable aspirated 'h' sounds and they have become very weak, it is perfectly acceptable to use 'an'. Many people believe it isn't, because the word doesn't start with a vowel. But when the 'h' is so weak and the next sound is a vowel, it makes sense. Example: "I read an historical account of the sinking of the Lusitania." The only caveat is that the word must not only suffer from a weakly aspirated 'h' at the beginning, but must also not have a stress on the first syllable. So we do not say "an history", for although the aspiration is not very strong, the stress coming on the first syllable does accentuate it to the point that 'an' seems out of place. If you think about it, you will see this 'rule' works pretty well. Think about the following: "I rode a helicopter to an historical site to see an historian writing a history of helicopters."

Next on the list of random things I feel like addressing: hyphenation. Specifically, I want to draw attention to hyphenation as it relates to creating single concepts. For example, if you want to form a single modifier from two modifiers, you must hyphenate: "He's an easy-going fellow.***" The reason it is practical to use a hyphen here is that without the hyphen, the two modifiers appear to apply independently to the modified noun: "He is an easy going fellow." In this case, is he easy-going or is he both easy and in the process of going? It can be confusing sometimes because there are cases in which a given phrase might be used with or without a hyphen, and the semantic gap may be a small one. My wife asked me the other day if this sentence required a hyphen: "I am a fourth-grade teacher." It does, because these two terms together ('fourth' and 'grade') constitute a single modifier for 'teacher'. But the sentence "I teach the fourth grade" does not require a hyphen as it is an adjective ('fourth') modifying a noun ('grade'). But "I teach a fourth-grade class" requires the hyphen because, again, the two modifiers combine to form a single element modifying 'class'.

'Ain't'. Believe it or not, I have no problem with this word whatsoever. When I was growing up, my mother used to correct me, saying "'Ain't' ain't in the dictionary!" In fact, it is, as well it should be. So why is it so often seen as 'incorrect'? I blame what I only half-jokingly call 'Written Language Syndrome'. WLS afflicts languages like English by deluding people into thinking that spoken language should follow written language completely and slavishly. The logic thus leads us to believe, for example, that since 'ain't' was born of strictly colloquial, verbal English, it has no place in 'proper' speech. Why shouldn't it? It is a convenient contraction, and a flexible one to boot, since it serves several combinations ('am not', 'is not', 'are not', 'has not', 'have not'). I would agree that, since it evolved at the spoken colloquial level, it has no place in formal writing; but I see no reason at all to shun it at the spoken level.

--------------------------------------------------------------------------------------
Footnotes:

*It was easier in earlier stages of English, since we had more varied conjugations back then.

**It is still pronounced with the initial /h/ sound in England, but this is tied to socio-economic class issues. 'Dropping one's haitches' has always been a clear marker of belonging to a lower socio-economic class, so educated English people who live in horror at being thought (or revealed to be) 'lower class' frantically pronounce all their initial 'haitches'. For this same reason, I would wager that most of them would object to my application of 'an' to anything that has even a wisp of an 'aitch'! 'Aitch' that a pain in the 'aitch'?

***Yes, he is...except with grammar.

27 January 2011

Fun with Artificial Intelligence, Part II

As promised when I initially wrote about artificial intelligence (and specifically about Eureqa), I have been playing around with economic data and have come up with some interesting results. After running dozens of experiments using different combinations of data and allowing Eureqa to use different computational operations, I have come to a fascinating conclusion: when the AI fits equity performance to whatever data it is given, it seems equities don't 'care' about anything but how we, the consumers, think about the economy, however well- or ill-informed we may be. Give the AI GDP data, CPI, unemployment, CD rates (to provide opportunity cost), even prior year stock performance and price-to-earnings data: if you also provide consumer confidence data, it will systematically discard every other data type save that one. In some simulations with solutions that have similar complexity and fit, it might also use the Fed Funds rate and/or CD rates, which I see as opportunity-cost stand-ins. But for the most part, it just wants to know, "How do you feel about the economy, regardless of how well or poorly it is actually performing and regardless of how much you even really understand it?"
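For the curious, here is roughly what these experiments look like, though in a much cruder form than the real thing. This is only a sketch under some big assumptions: the file name and column names below are placeholders rather than my actual dataset, and I've swapped Eureqa's symbolic regression for a plain Lasso regression from Python's scikit-learn library, which 'discards' an input by shrinking its coefficient to zero instead of by evolving equations. The point is just to show the workflow: feed the model everything and see what it keeps.

# Crude stand-in for the Eureqa experiments described above (NOT Eureqa itself).
# Assumes a hypothetical file 'econ.csv' with one row per year and columns like:
# gdp_growth, cpi, unemployment, cd_rate, fed_funds, pe_ratio,
# consumer_confidence, and sp500_return (the thing we want to explain).
import pandas as pd
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("econ.csv")
X = df.drop(columns=["sp500_return"])
y = df["sp500_return"]

# Put every input on the same scale so the coefficients are comparable.
X_scaled = StandardScaler().fit_transform(X)

# Cross-validated Lasso: unhelpful inputs end up with a coefficient of exactly zero.
model = LassoCV(cv=5).fit(X_scaled, y)

for name, coef in zip(X.columns, model.coef_):
    flag = "kept" if abs(coef) > 1e-9 else "discarded"
    print(f"{name:20s} {coef:+7.3f}  ({flag})")

If the analogy holds, you would expect consumer_confidence to be among the few columns left standing; in my actual Eureqa runs, the equivalent of a 'discarded' column was a variable that simply never appeared in the best-fit equations.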

As I said in the prior post, I am far from convinced that such data-mining can offer realistic predictive models. Ironically, though, the AI's preference for the least rigid, least 'rational' data type makes me less skeptical about its predictive power. My reasoning is this: if the AI chose models based strictly on 'hard' data such as CPI, I would suspect it was simply data-mining and that the results would be useless outside the already-given universe of data points, since equities markets are inherently irrational and are driven by things far less quantifiable than, say, GDP. But the very fact that it chooses the least 'rational' data type, consumer confidence, suggests it may indeed be converging on a decent predictive model: it has picked the one data type that applies rigid metrics to decidedly 'soft' data (that is, consumer sentiment).

So the proof will be in the pudding, I guess. But that pudding will take quite a while to cook, so don't hold your breath. Meanwhile, for what it's worth, I will shortly add some of the predicted values from various experiments. Then we'll sit back and see what happens!

My next project is GDP. I want to see what data types the AI most prefers for predicting the performance of the US economy.

23 January 2011

Language Part II: This thyme it's homophonic

Today, 'lets' deal with homophones. It's a 'thymely'* subject! We'll have a look at some of the many instances where homophones can lead to problems at the written level.

It's v its: This is a classic example of something that is very unstable and certain to change in favor of what is now viewed as a 'mistake'. (See earlier entry, especially the section on 'he and she' v 'they'.) First of all, they are homophones. But the real problem is that the distinction is entirely counter-intuitive: normally an apostrophe+s construction denotes a genitival relationship, e.g. Christopher's Take; but 'it's' is a contraction of 'it is' while 'its' is a genitival form of 'it'. WTF, mate?! So I expect this to disappear sooner or later. This is the descriptive grammarian in me defeating the normative one.

Their v they're v there. Wow. 'Their' is adjectival and possessive, 'they're' is a contraction and 'there' is an adverb. Hard to blame anyone making a mistake here, at least in everyday writing. The semantic differences are very clear, of course, but the fact that all three are very common and are homophones makes it easy to slip up. Even I do it if I am in a hurry and/or I am not editing carefully. I expect to see some fusion here eventually. I give it a few decades, but if I had to place bets, it would be on 'there' becoming the single orthographic entity into which the other two are subsumed.

Who's v whose. Pretty much the same thing going on here as 'they're' v 'their'.

To v too. Easy to distinguish between them: one is a preposition and the other an adverb. Hmm...except that 'to' can sometimes be an adverb as well, albeit in a different context. Throw in the fact that it is a difference of but one letter, and I give the whole thing maybe fifty years before one is absorbed by the other. I place my bet on 'to'. I think 'two' will survive on its own, though.

Your v you're. Placing my money on 'your' winning out in the end. Some people will say, "How? They mean two different things!" To which I reply, "Seriously?" I just have this to say: bear (animal) v bear (unrelated verb), quail (bird) v quail (unrelated verb). We get much of our meaning from context anyway, so homonyms are as harmless as homophones.

A lot v alot: This one's a bit different because the latter isn't even a word. But guess what? It soon will be. And for good reason: it expresses a single idea. So while I personally would never write 'alot', I accept that it will soon be an accepted word in English.

Let's v lets. Contraction (imperative of first person plural of a verb) v third person singular conjugation of the same verb. C'mon...it'll boil down to 'lets' eventually anyway. Lie back and think of England. Just 'lets' let it happen.

Ant v aunt: Normally I wouldn't even include these, but they bring in yet another wonderful factor: semantic difference guards against confusing homophones, but especially ones that are entirely regional to begin with. To me, these words are homophones, because in my native South (of the US), they are both pronounced \ˈant\; but elsewhere, they are not homophones at all, as aunt is \ˈänt\. I therefore expect this written distinction to survive.

Almost any word in plural v possessive. This one drives my wife bonkers. You see it everywhere these days: "We have the best price on television's!" "Kitten's for sale". Most words in their plural and in the possessive are homophonic, so this shouldn't be too surprising.

I say just kill the apostrophe everywhere. Look at this plural v possessive issue and many of the issues above it: mistakes centered on this annoying little floating comma. Context should take care of making it clear when we mean one v the other, so why bother at all? Will anyone fail to understand "I didnt take its meaning"? 'Its' perfectly clear to me!

As for me, as much as I accept that the 'mistakes' will soon be the rules, and even as I personally promote the downfall of some of the stricter current rules, I am too much a perfectionist to say things like "lets say its OK to make these mistakes". Besides, if I relent, I lose out on the fun of correcting and judging you and making you feel that your worth as a human being is tied to your grammar and spelling. So, you know...I'll just stick with the rules.

Kidding aside (and yes, for the humorless among you, I was just kidding about judging you), if you take away** just one thing from this posting, let it be this: writing follows speaking, rarely*** the other way 'round. We forget this because ours is a highly literate society. But even now, in the 21st century, most languages**** spoken on Earth are either unwritten or are written using a borrowed script (e.g. Latin alphabet). Language occurs naturally; writing is quite artificial. One day I will get around to a blog entry on written language as an (arguable) precursor to civilization. But not today.

--------------------------------------------------------------------------------------
Footnotes:

*OK, 'thyme' can't be an adjective, but shut it..it's my blog.

**We'll talk about hyphens at some point. Take-away v take away and so on.

***Spoken follows written only in cases of a sort of hyper-correction, when speakers try to make their speech conform to written standards. Granted, 'hyper-correction' is usually used in a different context, but I think it fits here, too. Remind me to talk about hyper-correction when we discuss comparative linguistics!

****It always surprises people when they hear that there are roughly six THOUSAND languages on this planet. [No one really agrees on the exact number, since there are a lot of grey areas, especially when it comes to distinguishing language v dialect. This can be a very touchy subject. For example, I have always maintained that Norwegian (and its MANY dialects), Swedish and Danish are all just dialects of a single Scandinavian language, since they are all more or less mutually intelligible; but I do not recommend ever saying that to a Scandinavian person! But when someone like me can study at university in Norway and have textbooks in all three 'languages' throughout the course of his academic career, it seems difficult to make the case that they merit the distinction of being called separate languages. But as usual, I digress!] India alone has around 700 languages. Even the language we call Chinese is in fact many different languages, many of them mutually unintelligible on the spoken level (though sharing a common, mutually understandable written version).

22 January 2011

IQuestion

Pet peeve of the day: IQ tests. These are the wife-beaters of the intellectual world: they've been telling people they're stupid for so long that people believe it. But have you ever noticed that the type of intelligence they test just HAPPENS to be the kind of intelligence possessed by the very inventors and promoters of these tests? And they have been so thoroughly successful in conning people into believing that this type of intelligence is the 'true' intelligence, that we design whole systems of advancement and learning around their tests.

So what is intelligence, then? If real intelligence were truly just about the type of intelligence tested on these exams, I would be rated fairly highly, since I always score very well on traditional IQ tests. (Please don’t ask me my IQ. I believe telling one’s IQ to be as vulgar as telling people how much money one makes.) That's no coincidence: I have always had good 'book smarts', the very same type of intelligence of those who make the tests. But my question is, is that really a useful gauge of how well people use their minds to navigate the world around them? Isn't that the real definition of intelligence: how well do people use their mental faculties to adapt to (and excel in) the world around them? And if that is the true definition, aren't IQ tests woefully inadequate, even misleading?

I can think of at least two areas that are completely ignored by IQ tests. The first is the ability to understand people based on their reactions. For example, some people know how to 'read' other people exceptionally well. Some people can see a slightly raised eyebrow and the smallest of grimaces, and glean a book of information about that person from those facial expressions. How is this less important a survival skill than, say, knowing the square root of 256 without the aid of a calculator? That person skilled at reading faces can always whip out his iPhone and get that square root. There is no app to help me read other people. But if we both take an IQ test, I appear to be the smart one.

Another mental skill ignored - one seemingly minor in the modern world with our GPS and smart phones, but in my view still important - is a sense of direction. I lack it utterly. I couldn't navigate my way out of a paper bag. But I know people you could lead into an unknown field at night, and then watch them head due north. I have lived in the same neighborhood for five years and still have no idea which way is north (unless I look at my iPhone compass). Maybe this doesn't much matter to an urban person in the 21st century, but I can tell you from experience that it is still a skill you very much miss when you lack it.

I can hear the objections from proponents of these tests: "we never claimed they were holistic measures of all types of intelligence and anyway, those skills you cite are instinctual, not mental."* I reject both arguments. The first is disingenuous: proponents may have softened their stance on the value of the tests in recent decades, in light of studies exposing things like cultural bias and education v. supposed 'innate' intelligence, but even today, they clearly (and quite smugly) believe that those of us who have high IQ test scores are somehow superior. I reject the second argument because this division of 'conscious' v 'subconscious' (i.e. 'instinctual') mental ability is completely arbitrary and irrelevant. Moreover, it is a false one: when answering some of the toughest questions on IQ tests I took in my youth, I did not perform any conscious calculations.

Speaking of subconscious, my pet theory (which I posit only half in jest) is that all the geeks who came up with and continue to support IQ tests, designed them with a (hopefully subconscious) goal of setting themselves above all those kids who made fun of them in grade school. The popular kid who had charisma and could read people so well? His skills won't count as intelligence! That kid in boy scouts who could never get lost and could lead the troop out of the woods? Doesn't count! In fact, every kind of meaningful intelligence besides their own kind...doesn't count!

So do we reject all IQ tests? Not necessarily. But we need to go back to the drawing board and redesign them, or at any rate supplement them.

--------------------------------------------------------------------------------------
Footnotes:

*Yes, this is a straw-man argument; but it is one that I could reasonably expect from an IQ-test proponent. And if you don't buy that, then, well...it's my blog, so shut it!

15 January 2011

Language

[Preface: Upon reading this a second time, it reminds me of a medieval codex found in (what's now called) Italy. It was written by a frustrated Latin teacher who kept admonishing his students, "It's not [x], it's [y]!" Of course, to future linguists this discovery was a gold mine: this frustrated teacher's 'corrections' were markers showing the evolution from Latin to Italian. So before reading the following treatise, remember: if you think any of it strange or unwieldy, take comfort in the fact that many of the 'mistakes' I point out will likely be the rules of English in future centuries. But since you are living in the here and now, well, they are still just mistakes...]

My random ravings on language usage:

1) Incorrect usage of the words 'irony' and 'ironic'. I don't understand why people feel the need to use such words if they don't understand the meaning. Do people just generally feel they should know the meaning, so maybe they use these words to overcompensate? Why? There is no reason why everyone should get irony, any more than everyone should get, say, skydiving. I don't get spatial reasoning, so I avoid giving directions and I do my best to avoid maps at all costs. I don't go out of my way to volunteer as a navigator to overcompensate for this deficiency. I am also terrible at remembering names.* I do not offer to introduce people at parties to overcompensate for this failing. And that's OK. It's who I am. So why do people who do not understand the concept of irony feel the need to use the word and its adjectival form so much? I blame Alanis Morissette. Her rather silly song shoved the word into popular use, despite the fact that her use of it was almost entirely off base. There is nothing ironic about rain on your wedding day. It's just sad and unfortunate. A traffic jam when you're already late is annoying, but not ironic. A no-smoking sign on a cigarette break? Also just a sad bit of luck.** What's worse, people often use these words when they in fact mean precisely the opposite of irony. Example: "Ironically, the prisoner was captured as he was dashing out of the prison yard." That is exactly what one would expect to happen! But there is a delightful irony in all this: people using the exact wrong and opposite word when saying 'irony' is itself a form of irony, so this suddenly got all meta. Anyway, the point is, consider yourself hereby absolved of any need to understand the words 'ironic' and 'irony'. You are liberated! So just stop using them!

2) People making pretentious attempts at using what they deem 'fancy' grammar words...and then failing to use them correctly. I am a bit of a self-confessed grammar nazi, in the sense that I recoil at poor grammar when I find it in places one should least expect it, e.g. newspapers, books, etc. So it's ironic*** that I am in fact not really a normative grammarian at all when it comes to everyday speech and informal writing. Even in formal writing, it doesn't bother me terribly, as long as the rules are broken for the sake of fluidity and clarity. For example, take the 'split infinitive'. First of all, it's a stupid rule, and believe it or not, it is relatively new and quite artificial. But new or old, avoiding splitting infinitives leads to tortured word order and poor clarity in many cases. So why bother? Proper grammar and syntax are supposed to be instruments that facilitate communication. The moment they become a hindrance, dispense with them. It's that simple. But when writing formally, just make sure you know the rules before you break them.

Examples of people attempting to use 'fancy' grammar:

a) Who v Whom. If you don't know the difference, just always say 'who'! It's fine. I promise. Even I won't judge you for it!**** 'Whom' is a word on its way out anyway. Why? Because language is the most democratic thing ever devised. In the end, the majority always wins. If it didn't, Italians would still be speaking Latin. So, like 'may' v 'can', 'whom' will soon be a relic. Even I don't bother with it all the time. But for the record, the difference is relatively simple: 'Who' is always a subject of a verb. 'Whom' is always an object.***** Some people simply can't get their heads around this. And you know what? That's FINE! So just say 'who' all the time! Nobody cares! Just don't sound foolish by using 'whom' incorrectly, because then you sound pretentious (for attempting to use a word you do not understand) AND foolish (for said lack of understanding). And to be fair, it can be hard to get the concept sometimes, because it is not a function of word order, despite what many people believe. In other words, just because the word you're looking for is not *immediately* followed by a verb belonging to it, doesn't mean 'who' is the incorrect choice. Classic example: "Whom shall I say is calling?" Sounds fancy, huh? It's wrong. It's easy to think that the 'shall' belongs to the 'I' (which it does) and thus that the 'whom' must be an object; but it isn't: it is the subject of 'is calling'. So it should be "Who shall I say is calling?" The easiest way to untangle these things is to play with the word order a bit. "I shall say whom is calling?" You would never say "'whom' is calling" there, right, because it's obvious it should be 'who' as a subject, no? So don't use 'whom' in the other formulation: just because the word order has changed, doesn't mean the grammar has.

b) 'Fancy' (mis)use of pronouns. WHY on Earth do people feel that nominative versions of pronouns are somehow 'fancier' than accusative ones? They aren't. Nominative means it's the subject of a verb. Accusative means it is the object. (Technically, accusative means it is simply the object of a verb, but in practice, since English doesn't really get into the weeds on variants based on case, for all intents and purposes****** we can say an object of anything, e.g. of a preposition within an adverbial phrase, etc.) So when people try to sound pretentious and say, "The letter was addressed to she and I", I cringe. Would you ever say "She did it to I" or "I did it to she"? No. So why would a letter be addressed to "I"? Answer: you think it sounds cool. Stop it.

But the worst (to my ears) is misuse of reflexive pronouns. You sound ridiculous when you say, "It was given to John and myself"! Or "She gave the letter to myself". Only you can do something to 'yourself'.******* Someone else does it to 'you'. "I sent the letter to myself" v "She sent the letter to me". And when you use it incorrectly, you sound so pretentious, because it's obvious you think it sounds 'elegant'...but it's not; it's just plain wrong.

c) 'Fancy' use of adverbs...when you really want an adjective. Classic example: "I feel badly." If you feel badly, it means your ability to feel is something you do poorly. If you mean that your state of being/mind is bad, then "You feel bad". Verbs like 'feel' and 'smell', when used intransitively, convey a meaning that is basically the same as 'to be'. Take 'smell'. If you smell badly, it is the manner in which you are capable of smelling, so you are saying your sense of smell is poor. If you smell bad, it is a state of being, so go take a shower.

A similar issue arises when one joins two elements with a 'copulative'******** verb. That's why it is technically "It is I": "to be" is an intransitive verb, so there is no object, and the complement keeps the same (nominative) case as the subject. But this is where democratization of language wins out: most people treat that complement as an object, and it is perfectly natural to do so, so someone reading this in 100 years will likely have learned that "It is me" is the correct way to go.

3) 'i.e.' v 'e.g.'. This is a really easy mistake to avoid. Just imagine yourself saying 'that is' v 'for example', e.g. "We were stuck, i.e., we weren't going anywhere." Think of it this way: e.g. offers examples that are a subset of the universe, while i.e. simply renames that universe. "I ate lots of things, e.g. apples, bananas, etc.". E.g. => because these are just some examples of the many things I ate. "I ate lots of things, i.e. I had quite a variety." I.e. just restates the matter. If it helps, 'i.e.' really does mean 'that is' in Latin ("id est"), so that's an easy way to remember it. E.g. means 'exempli gratia', 'for the sake of example'. And you know what? English is a beautiful, flexible language on its own. So if in doubt, just dump the stuffy Latin altogether! Just say 'that is' and 'for example'! You will use those properly every time! And why is Latin a better alternative to English anyway? Aside from the convenience of abbreviation, I see no advantage.

4) Pronoun number consistency. Example: "I am not sure exactly which candidate we'll pick, but they will be qualified." This is an unfortunate side-effect of English's lack of a more flexible set of pronouns, combined with an increasing sensitivity about excluding women from general statements. The correct form is to say 'he or she' (or 'him or her' if accusative), e.g. "I am not sure which candidate we'll pick, but he or she will be qualified." It used to be common simply to say "...he will be qualified", but that's quite understandably and necessarily a no-no these days (unless of course it's, say, a sperm donor, in which case let's safely stick to the masculine pronoun). On the written level, I have a convenient short-hand for this: "...but s/he will be qualified."

This is probably an uphill battle, though. The fact is, using two pronouns, e.g. 'he or she', is just too cumbersome. It's a pity we can't just make up some nice neutral pronoun. If we had the equivalent of an Académie Française, I guess we could give it a shot. But I for one am quite happy that English lacks such a dictatorial governing body. One of the main things I adore about English is its amazing dynamism and creativity; any governing body would soon beat those traits out of it. Still, it's hard not to look with some envy at languages that have more flexible pronouns. For example, when I hear 'we' in English, there is always the ambiguity: do I mean you and I are the subjects? Or do I mean some third party + me as opposed to you? In other words, are you part of the 'we'? In many languages, this is solved by having two pronouns for 'we': one meaning the speaker and the person to whom s/he is speaking, another one meaning the speaker and someone other than her/his interlocutor. But I am getting off on a tangent here. Some other day, I will waste your time talking about comparative linguistics.

5) Lay v lie. This is actually very easy. Lay is a transitive verb. Lie is not. A chicken lays an egg. You don't 'lie' anything. You just lie, e.g. "I lie in bed." You don't "lie an egg". BUT, and here's where it does get a bit tricky, the past tense of the intransitive 'lie' is...'lay'. Sorry about that. So it's: "I lie in bed today", "I lay in bed yesterday", "I have lain in bed since last week". "I lay an egg today", "I laid an egg yesterday", "I have laid three eggs since last week". (What can I say, I'm one busy chicken.)

But again, language as democracy says that at some point, these two verbs will simply fuse into one verb with a single conjugation, but with two meanings: one transitive, one intransitive. And you know what? That's fine. There's a word for languages that don't change, evolve and adapt: dead.

OK, that was fun! But I am dying of hunger. Ironic since I haven't eaten in ages. But whom is responsible for that? It's me. I guess I should feel badly, e.g., I feel guiltily. So I shan't just lay about waiting for dinner! I have a guest coming, so I should prepare something for them.


--------------------------------------------------------------------------------------
Footnotes:

*Seriously, it's embarrassing. But it only seems to apply to living people. I can tell you who succeeded Thomas More, who faced William the Conqueror at Hastings...but I couldn't tell you a single name of a single person I met at a recent party.

**She wasn't entirely off. For example, conquering one's fear of flying just in time to board a doomed flight is indeed ironic.

***Yes, ironic.

****Well, maybe a little. But we'll both get over it.

*****In the mood for more delicious irony? In every sentence in which I have used 'whom' so far, it has been a subject of a verb, in direct contradiction to what I just said. But this is in the same way 'me' is a subject in this clause: it is referring to the entity/concept v actually using the word in a sentence. Like saying "'Me' is a word in English."

******Not 'all intensive purposes'...minor pet peeve.

*******Settle down!

********You in the back! Stop that snickering!