Impopular Culture



Writing | Posted by Pelotard 2010-12-09 23:20:06

I've just finished reading Surface Detail by Iain Banks. As always (or at least very nearly always), Banks' writing is top class, and I enjoyed it all the way through, which isn't bad considering 1) it was 600 pages, 2) lately, I've taken to chucking books across the room about one third of the way through because they annoyed me. Usually because the protagonist was described as a super-genius but became a complete moron the moment the plot required it of him.

I haven't blogged about any of the chuckers. It's not a done thing for an aspiring writer to tell the world how much some named, published writers sucked.

Here, I'm sort of making an exception. I feel I can do this because it's a book I actually liked, by one of my favourite authors. Not that I'm going to say that anything in it sucked, as such. But there was one rather spectacular failure that's of interest to writing in general, so I thought I might say a few words about it. It's to do with theme.

Surface Detail is one of Banks' Culture novels; you don't have to have read any of them to follow my reasoning—or let's say that my reasoning will be equally confusing and pointless whether you've read any or not—but it takes place in an extremely high-tech civilization spanning a huge number of worlds (some of them artificial). The premise is that people's personalities can be stored in what I suppose you'll have to call "computers" in the same way that you might use the word "boat" for both Queen Elizabeth 2 and a log that you sit on while paddling with your feet. And when the physical person dies, the softcopy of the personality can be incarnated in an artificial body, or in a virtual world designed to any specs you care to name.

On some planets, it has been decided that if you've been bad while alive, you'll be incarnated in a virtual Hell. Other planets take a dim view of this, and war ensues.

This is, of course, a situation which could be filled with all sorts of ethical dilemmas and interesting conflicts. There's potential to have people who are mostly bad but happen to side with the good, and vice versa.

Unfortunately, this is mostly absent from Surface Detail.

What you have is one of the Hells, described in loving, gory detail so that we can have no doubt that this is a Bad Thing. Certainly something that only the weirder sort of religions can applaud, and their single representative on these 600 pages is an Obviously Bad Person. Nobody even feels it necessary to explain why the good guys are opposed to this; their moral reasons for their opposition are never displayed—opposing Hell is actually enough to qualify you as a Good Guy, initially, although it's revealed later that some of the Bad Guys are also opposed to Hell. No reason is given for this, either; it's simply assumed that any civilized person (or alien) must be. This is the only shading available: it's so obviously bad that even the bad guys are opposed to it.

I strongly suspect that this lack of deeper detail in Surface Detail is because the premise, on closer inspection, has difficulties holding together.

On a very basic level, the software copies of personalities have legal rights. But the extent of these rights is never defined. Probably just as well.

Do they have the right to be run? (Would they then ever elect to be run other than in a simulation full of extreme bliss?) Or are they only safe against being deleted? Can they own property? (What would they want with property?) Do they own the original person's property? If they are copied, does the copy own all, half, or none of the property? Can they vote? Can they still vote after they've copied themselves onto a billion trillion computers (which they may or may not own)? Are there safeguards against them ganging up on the living? Or is that discrimination? Either the safeguards or the ganging up; pick either. Or both.

None of this is covered in the novel. Fortunately, the character it happens to didn't own anything anyway. And no one thought to make a few copies of her. Could have become awkward otherwise.

Then we have a problem that runs deeper. It's in all of the novels where "copy your personality" is used as a plot device. Is the copy the same person as the original?

Peter Hamilton's Commonwealth Saga sidestepped the question altogether. From society's point of view, the copy was the same. Most characters believed that their recording, when inserted into a clone, was still them. Some, though, didn't—so the narrative left the issue undecided.

The transporters in Star Trek do it differently. They take Spock apart into his constituent atoms, zap them through space, and reassemble them on Vulcan. (The reason for this particular technical solution is that it was too expensive to shoot the Enterprise landing on a new planet in every episode.) In that case, planet-side Spock is arguably the same person as Enterprise-side Spock: it's the same atoms put together the same way.

Other novels featuring transporter technology will have the transporter scan the exact quantum state of every particle, which destroys the original. Then a copy is assembled at the target location, using locally available atoms. Still, it's a fundamental tenet of quantum mechanics that all particles of one type—neutrons, say, or electrons—are completely indistinguishable, so it is still arguably the same Spock: you can't really tell whether it's the same atoms or different ones. Then again, quantum mechanics seemingly forbids you from recording a complete quantum state due to complementarity, so maybe we shouldn't talk too loudly about this, in case quantum mechanics notices.

For the software variety, it's a bit easier. Since you can make the backup while I'm still alive, the software personality, although a perfect simulation, can't be me. I'm still here. The other bloke's in the machine, doing whatever it is ghosts do to pass the time.

So, does it make sense to punish the software version because I've been bad?

Not really, huh?

Sure, the software version is probably also a bad person. But this person hasn't actually done anything. He's just a simulation of someone who has. And if he has legal rights, as per above, they should presumably protect him from suffering eternal torments for something that was done by a different person altogether.

But, maybe, if the legal system was run by one of the nuttier religions? Well... they would probably be dead set against the machine person being a continuation of the living person. They would, in fact, be more likely to make the entire ethical problem go away by branding the machine person an abomination and pressing Delete.

Basically, the idea of a religion being nutty enough to go to the trouble of creating artificial Hells to punish artificial persons for something that someone else did, while still being sane enough to run entire planets with no apparent trouble, sort of stretches the logic to breaking point.

Not that I think that this was what Banks was out to get at. The whole Hell thing is more of a plot device to get the story up and running with a huge conflict and huge stakes. Based on that, he did his usual thing with a myriad of people (some of them spaceships) running around trying to outwit each other or using advanced machinery to kill each other, all for very good reasons, or at least so it seems to them. Even the bad guy is somewhat normally bad, unlike e.g. the one in The Algebraist.

And while he was constructing a plot, he took the opportunity to take a dig at religions.

Sure: I've been contemplating asking the Pope if Jews really go to Hell for failing to believe in Jesus, and if this is so, pointing out that Hitler at least only tormented them until they died, while Jehovah will go on for all eternity. But Banks doesn't ask that sort of question. He only succeeded in taking a dig at his own idea of what an insanely fundamentalist religion should be like, which no actual religion matches.

You can read this one even if you are very devoutly religious. Unless, possibly, you have religious reasons to believe that artificial people should be punished for things they didn't do.

But in that case, I believe you deserve to go to Hell.
