Tuesday, 18 December 2012

Merry Christmas!

I like Christmas. There, I've said it.

I like the way it breaks up the dull midwinter with a few weeks of licensed overindulgence. I like the familiarity of its rituals, going back home to see my family (some of whom I don't see at other times of the year unless there's a wedding or a funeral, since I've been living in London), the exchange of cards and gifts (even if the stress of deciding what to buy people is a hassle), the crap TV. I know that there are now other ways of keeping in touch with people we don't really want to see all the time (it's called Facebook), but it's nice to be forced to stop and think about Aunt so-and-so once in a while, especially since she doesn't have a computer. I like turkey with all the trimmings, Christmas pudding, arguing with my mother, the whole kit and caboodle. I'm sorry, but I don't want to do something alternative and help people in Africa with a goat (the best way to help them is to get the EU to dismantle the protectionist Common Agricultural Policy), or promote ethical this, that or the other. The whole point of Christmas is the consumption. It's once a year, just roll with it. Indeed, if we didn't have Christmas, I think we'd have to invent it. Which of course we did. Or rather, pretty much every society that lives outside the equatorial belt has invented a winter festival of some sort, be it Hanukkah, Diwali, Kwanzaa, Saturnalia, Yule or Samhain, to help them get through the long winter nights.
This history of different cultures with their different ways of doing a winter festival is why Christmas has become such a glorious mishmash of other traditions, from the fir trees, Yule logs and deity on a flying sleigh from the Norse and Germanic solstice festival of Yule, through the gift exchange, drunkenness and overeating of Roman Saturnalia (Saturn still turns up, but now at the end of the year, with his white beard and robes, scythe and hourglass, as Old Father Time), the holly, ivy and kiss under the mistletoe of the Druids, to the grafted-on Christianity that has slowly accreted ever since Constantine gave it imperial favour. If there was ever a multicultural festival, it is surely Christmas.

Occasionally there is a bit of atheist angst over Christmas, but that presupposes that Christmas is in any way a Christian festival, which it never really has been. The Puritans recognised this by trying to ban Christmas as 'pagan', and you can see their point (even if you can't sympathise with such a bunch of killjoys). It makes me laugh every year as another po-faced commentator opines about Christmas's commercialism or attempts to "ban Christmas", completely failing to recognise that this is simply Christmas reverting to type. For all of the cloying hard sell to children of a baby surrounded by cute animals and being given presents, the Nativity is pretty much beside the point: a clumsy attempt by a Gospel writer to ret-con Jesus into Isaiah's (amongst others) prophecy about the Messiah, complete with a fabricated census to ease the confusion over where he was supposed to be born, the mistranslation of 'almah' as 'virgin' and a few lost Zoroastrian priests (Magi) turning up from Persia for no very good reason. The birth of Jesus is arguably the least interesting thing about him, which is presumably why two of the Gospels ignore it completely, and the other two give contradictory accounts. His ministry and execution (and, if you believe it, resurrection) are surely the main point of the story.

So never mind the Disney-esque manger and shepherds stuff, let's celebrate the solstice as it has always been celebrated, with food, drink, presents, and family. Happy Holidays, Merry Christmas, and Io Saturnalia!

How high is a nobleman?

Something has been bothering me lately, and it's something I've never really seen much discussion of in textbooks and the like: to what extent was nobles' position at the top of society reinforced by them simply being taller than peasants?

The theory runs something like this: height in adulthood is at least partially (about 10% according to figures I've found on the web) down to diet in childhood. That is, someone with a protein-rich diet will end up about 10% taller than someone without. I think that underestimates the effect, personally - you can see the effect in modern Japan, where the post-war generation are up to 12-18 inches shorter than the current burger and Kobe beef chomping one. We know that people were on average shorter in previous ages (or do we - see below). We also know from their suits of armour and skeletal remains that people like Henry VIII and John of Gaunt were well over six feet tall. So presumably medieval nobles (with a comparatively meat-rich diet) were similarly about a foot taller than medieval peasants, and therefore people literally looked up to them as superior beings, a bit like the elves in Tolkien.

Unfortunately, to quote Blackadder: "there was just one thing wrong with this theory... it was bollocks."

After some research I found that I wasn't the only person to have wondered this: someone had done a proper study of remains from grave sites, indexing height against nutrition in previous ages, and found no support for my theory. Indeed, longevity and average height haven't actually changed as much down the ages as is popularly believed.

It's still a relatively small sample, and not quite enough to make me completely change my mind, but it looks like I have to accept that if there was such an effect, it was much smaller than I imagined - maybe one or two inches at most on average, and certainly not the 6-12 inches I had assumed. So much for that idea, then.

Saturday, 28 July 2012

Jerusalem

I'm not a great sports fan. I have been known to watch some big events; England football games, the rugby world cup, that kind of thing, but the Olympics has always bored the shit out of me. Athletics just isn't that interesting; it's like a school sports day blown up to monstrous proportions. Still, it's come to my adopted home city anyway, with all of its Zil lanes for the 76,000 strong "Olympic family" of bigwigs, sponsors and assorted hangers-on, its vanloads of police with machine guns, helicopter gunships and surface to air missile batteries, its heavy-handed corporate censorship, its road closures and general self-important pomp. Londoners have carped. We are not as a city easy to impress. After 2,000 years, London has endured sacking by Celts, plague, fire, the Blitz, the IRA and al Qaeda, oh, and by the way, two previous Olympic games. Pardon us if we don't swoon at the prospect of a third. Nevertheless, I have enough residual pride in my country and my city that when it came to last night's opening ceremony, I didn't want it to be embarrassing. I watched with fingers crossed. "Just don't be shit", I tweeted. It wasn't.

If the Olympics are boring, opening and closing ceremonies tend to be even more so, with lots of bland totalitarian marching and dancing in unison in bright lycra costumes and vapid sentiments about fraternity and peace and giant doves being unleashed. Four years ago Beijing had pulled out all the stops to send the kind of message about China's arrival on the world stage that Dr Goebbels would have approved of, and we had limply slunk away after Boris Johnson had looked manic with his shirt hanging out and some second-rate slebs farted around on a red bus. Oh God, we all thought; 2012 is going to be really embarrassing, isn't it? Well fortunately, no. Danny Boyle rose to the challenge, and managed to produce something impressive, stirring, at times confusing, occasionally bonkers, but identifiably British, and definitely the best opening ceremony I have ever seen, perhaps the best it's possible to conceive of, given the constraints it has to work within.

We had all seen the Teletubbies-style layout of England's Green and Pleasant Land in the preview, and had rightly been a bit suspicious about trying to represent the country as a John Major-esque fantasy of bicycling nuns and cricket on the village green, but that was swept away by the impressively done sequence of the Industrial Revolution, Kenneth Branagh as Isambard Kingdom Brunel declaiming Caliban's speech from The Tempest loudly underneath a cross between Silbury Hill and Glastonbury Tor. In a kind of 'four ages of Britain', we moved from the 18th century's pastoral idyll to the industrial 19th and then to the 20th century (exemplified, apparently, by the NHS, but then it's a closer thing to a state religion in Britain than Anglicanism ever was), and finally the digital era of the 21st. Boyle tried to cram in just about everything he could, from Shakespeare and Blake to JK Rowling, from Elgar to the Beatles and the Arctic Monkeys, James Bond, Mr Bean and even the Internet care of Tim Berners-Lee, but it served as an effective reminder that while the days of our industrial muscle and globe-spanning Empire may be (fortunately) behind us, we are still a cultural superpower. He played unashamedly to the home crowd - things moved so fast that even as a native I'm sure I missed things, and there's surely no way someone from overseas would have identified EastEnders or Michael Fish - but he managed to be inspiring and even emotional without being mawkish. And above all I think it succeeded in winning the hearts even of cynical Brits because the ceremony had something that the Olympics is generally conspicuously lacking - a sense of humour. From the utterly lunatic sequence of the Queen 'skydiving' into the arena with Bond to Rowan Atkinson daydreaming about Chariots of Fire, it wasn't afraid to be funny. Take that, po-faced mandarins. Our singing children weren't dubbed either, and our fireworks weren't CGI-enhanced.

The games themselves will be as dull as they always are, but I think everyone in Britain is walking just a little taller this morning. "And was Jerusalem builded here, among these dark satanic mills?" Well Mr Blake, even if only for one evening, yes, perhaps it was.

Wednesday, 11 July 2012

The Most Human Human

Last night I went on a whim to a lecture at the Royal Institution given by Brian Christian, author of The Most Human Human, a book about his experiences of taking part in the Loebner Prize, an annual competition based around Alan Turing's famous 'Turing Test' thought experiment. Christian was talking about artificial intelligence, and the way it is changing what it means to be human. The title of the talk comes from one of the sub-prizes awarded at the Loebner Prize - there is an award for the 'most human program', and also one for the human participant whom the judges most often correctly identified as human - the 'most human human'.

It was an interesting experience on several levels. Firstly, I had never actually been inside the RI before, although as a teenager and budding scientist I used to love the televised Royal Institution Christmas Lectures, and watched them avidly, even though (in an era before video recorders) it sometimes meant getting up at 6:30am or the like. This lecture was in that same lecture theatre, which is a lot smaller and more intimate than it looks on TV, and where people like Michael Faraday and HG Wells have lectured. I also discovered the RI has a very nice (if understaffed, at least last night) bar and restaurant which is open to the public and which I'll definitely be making more use of.

Anyway, the lecture. He started off with a potted history of the philosophy of what it meant to be human, from Aristotle to Descartes, and the theory that the thing that differentiated us most from the animal and plant kingdoms was our capacity for rational and abstract thought. Then he ran through a history of artificial intelligence, reminding us that 'computer' used to be a job description for mathematicians, and that Turing only used the word by way of analogy - 'this machine... well it's a bit like a computer'. Now, 60 years on, the definition has flipped: the computer is the machine, and we use it as an analogy for a human who is skilled at maths. He described the way that computers have staked out territory that we once thought of as belonging purely to humans, but argued that the easiest things to duplicate via a machine were exactly those things we once most valued in ourselves and considered made us distinctively human (playing chess 25 moves in advance, knowing the answer to Jeopardy questions), while the most difficult to automate were actually those things we take for granted (recognising people, understanding language, walking around without bumping into things). In AI circles this is known as Moravec's Paradox, but Christian suggested that now that we are measuring ourselves against machines rather than animals, it is precisely these biological skills that we may come to value more.

For the second part he moved into a discussion about the Turing Test, and how we judge whether someone else is human through a low bandwidth medium like text messaging. He pointed out that in fact we now all do it every day, every time we read an email and decide if it has come from a spambot or a real human, and asked us to concentrate on the next batch of emails we scan, work out at which point we decided whether the sender was a human or a machine, and try to analyse what our decision-making process was. This part of the talk roamed through speed dating, CAPTCHA codes (where a machine is, ironically, deciding if we are human or not) and the dreaded autocorrect, where the AI is interposing itself between us and our audience, trying to second-guess us, and in so doing smoothing out precisely those human foibles that make us distinct. With reference to the hacker who accessed Sarah Palin's Yahoo account, he discussed computer security and how we are moving away from content-led security (passwords, ID codes), which computers find easy but we don't, back towards form-led security like signatures and biometric recognition, and the ways that we recognise each other (voices, faces). He argued that human and machine intelligence are already in a symbiotic relationship, and so changes in machine intelligence will continue to change how we view ourselves and how we relate to each other.

The talk had some interesting ideas, and was a great way to spend 90 minutes, but somehow left a lot of loose ends. I suppose it was aiming to make you think a bit - and buy his book of course! But as to the future - when I asked him about Searle and the Chinese Room he clearly came down on the Strong AI side of the argument, that human intelligence is in effect a physical process which will, ultimately, be simulated or duplicated to the point where we can no longer tell the difference. However, he did admit that we're nowhere near there yet. Even now the best Loebner Prize programs can only fool humans 25% of the time under perfect conditions (the judge only gets 5 minutes of interaction purely via text). Still, Turing predicted a 30% success rate by 2000. He wasn't that far off, was he?


Monday, 4 June 2012

More Forethought needed

Yesterday I avoided Jubilee toadying by going to see Ridley Scott's Alien prequel Prometheus. This was a film I had been really looking forward to, and there are precious few of those these days - Hollywood has become a slave to massive, empty, effects-driven machines, generally involving superheroes (which I find too silly - it's the costumes, chiefly). Unfortunately, Prometheus seems to have absorbed some of the same nanite virus and has ended up as Alien cross-bred with <insert mega blockbuster here>. In mythology, Prometheus was the titan who stole fire from the Gods and gave it to mankind, and who was punished by being chained to a rock and having his liver pecked out by an eagle every day; a similar fate awaited most of the characters in the film. In Greek, Prometheus means 'forethought'; I ended up feeling that more of that could have gone into the script. Spoilers follow...

I know that sequels have to refer back to the originals, and the odd line of dialogue being repeated is more of a knowing wink from the director, but really: a spaceship with a cold, English android, a slacker captain, some surly blue collar engineers and a feisty brunette (and an icy dominatrix as an added optional extra, although you could tell she was always for it). This is just Alien, surely? Alien signal pointing us at distant, windswept planet - check. Evil corporation moving behind the scenes - check. Ship infiltrated by alien biohorrors courtesy of said English android - check. Android's head torn off but keeps talking - check. Feisty heroine last to be eliminated - check. At least there were no cats. As it's a prequel, and having watched Alien again the day before, I did briefly wonder how technology ended up going backwards in the intervening years, so that Ripley and co were running a space freighter using DOS, but heck, the Nostromo was the equivalent of a tramp steamer, not the cutting edge research ship that Prometheus is.

What has changed in the intervening 30 years is the resources at Scott's disposal, and to be fair he deploys them to great effect. The film looks stunning. The 3D computer graphics as the remote drones explore the complex are beautiful, and the wide alien vistas and the Nazca Lines-meets-alien-hive base complex are breathtaking. Being an Alien fanboy, I suspect that these latter re-use some design sketches Giger did for the original Alien film which there wasn't enough money to realise first time around (but which Scott/Fox presumably still owned the rights to). The scenes within the alien 'Space Jockey' spaceship, and the exterior shots of it in flight (and crashing), are amazing. It is a perfectly rendered version of the one from Alien, but done with modern photorealistic CGI and not just a dimly-lit model in a Pinewood back lot. I would have paid my £5 just for those bits, to be honest.

However, in order to propel the plot the allegedly clever scientists and corporates begin with some schoolboy errors. Anyone who has played in Jim Wallman's Universe campaign (which borrows a lot from the Alien universe) knows that (a) you never take your helmet off in an environment that might contain alien bio-contaminants (in this case an entire alien nano-virus manufacturing facility, apparently), (b) you always bring the Marines with you first time around, rather than waiting for the sequel, because by then you will be dead, and (c) if in doubt, take off and nuke the site from orbit. A few pistols and a flamethrower are not going to cut it against giants and squids and nanite plagues. The scientists also exhibit an amazing lack of curiosity about anything not directly concerned with the mission, e.g.: "It's -12 degrees, so what is this gooey liquid - it can't be water?" "Whatever." "What's this strange black gunk coming out of the vases?" "Dunno. Probably not important."

From there the movie runs predictably into The Thing territory, with alien bio-horrors, a thinly-sketched cast picked off one by one, lots of running down corridors (I was amazed Scott had the self-discipline not to include a self-destruct countdown) and much ickiness. There are interesting ideas in there struggling to escape, from the von Däniken-esque Engineers (Forerunners) and their motives both for creating and apparently wanting to destroy humanity, through the old man searching for an alien Elixir of Life, the Christian scientist trying to reconcile her faith and science, and the robot serving as a substitute son for the old man in place of the daughter (the ice-cold blonde) from whom he is estranged, but these are drowned out by all the running and shouting and explosions. Noomi Rapace's character directing the auto-doc to perform an emergency caesarean is an especially silly, as well as gory, scene. One of the few interesting points - that the Engineers/Space Jockeys are actually giant humans in space suits and appear to share 99.9% of our DNA - is completely wasted, since when one is finally awakened from cryo-sleep, rather than helping to explain the plot, it just goes berserk, starts killing things and becomes simply another monster. Ho hum. I was never quite sure where the squids came into it, either.

I guess no movie ever lives up to its own hype. In spite of it all, it's an entertaining enough two hours. But you just feel it was a bit... lazy somehow, and that it could have been so much more.

Monday, 5 March 2012

Oh God...

I suppose nothing should surprise me any more about American Christian fundamentalism, but today was the first time I came across the concept of Providential History - the idea that the history of the United States of America has been divinely guided by God because the nation has a part in His Plan. This apparently includes the US Constitution being Divinely Inspired, in much the same way as the Bible. I was so gobsmacked that I had to sit down for a while and take in the enormity of what that claim would imply. Even the discovery of minerals, oil or other natural resources is divinely guided, apparently. Presumably God therefore has a bit of a downer on Jews but is particularly delighted with Wahhabist Islam, given the relative share of resources that Israel and Saudi Arabia have been given.

It shouldn't be too surprising I suppose; the temptation if you are at the top is to assume that you deserve it and got there via your own merits, rather than there being any element of chance or circumstance that has led you there, and from there it is only a short step to say that it has been Divinely Ordained and that you are part of His purpose. Trouble is, that kind of teleological reasoning was just as popular in the British Empire ("God is an Englishman"), and probably the Spanish and indeed Roman Empires before it. It also presumably means that God was pretty keen on slavery and the genocide of native peoples - he must have been in a bit of an Old Testament mood at the time.

Of course, not even the adherents of Providential History would actually claim such a thing. They insist that those were human "mistakes" which were made along the way. It's the usual story of: "if it's good it must be from God, if it's bad it must be human". The trouble with that is that what you attribute to God's Will thus becomes completely subjective; basically whatever confirms your own prejudices is all down to Divine Inspiration, and anything else is the fault of "secular meddlers" messing up the Plan. And hence the context in which I came across Providential History today: that of gay marriage.

Now marriage is a whole ballgame in its own right, and while the Catholic Church gradually muscled in on marriage ceremonies during the Middle Ages, it has always been an essentially secular activity; a civil contract between parties, to do with inheritance and exchange of property (back at a time when women were also property, which is another can of worms I'll leave for the moment). Even today, a Church "wedding" is technically only a "blessing" on the marriage - the actual legal ceremony itself is about declarations in front of witnesses and signing the certificate. Personally I've never been quite sure why people were so keen on marriage anyway; I don't need a certificate to validate my relationship, but that's by the by. Let's just say that this is an area where views have changed throughout history. In a lot of US states, marriages between people of different colours were illegal until very recently, and were equally justified at the time by appeals to the Bible. I've recently read an interesting article that I unfortunately can't locate at the moment which showed how US Christians' position on abortion has moved over the past 50 years, from "some for, some against" to "you can't call yourself a Christian if you're in favour of it".
EDIT: Found it!
But which view was Divinely Ordained, and which the result of Secular Meddling? God's not telling. I suspect that the adherents of Divine Providence might find He's not really got much to say on the subject of gay weddings either. He's probably too busy working out where the rise of atheist China fits into Providential History.

Yeah, somehow I thought you were going to say "as the Antichrist"...