Sorry Dave

A few people (my wife included) think that I am out of my mind when I say that we should clone dinosaurs. I’ll say it again:

I think that we should clone dinosaurs.

“Haven’t you seen Jurassic Park?” you scream at your monitor.

And that is a fair question. Yes, I have read the report and seen the documentary of what happened on Isla Nublar back in the nineties, but I am still undeterred. I don’t believe in chaos theory, and I am not suggesting that we build a theme park with motorized Ford Explorers.

I just think that we should clone some dinosaurs for coolness’ sake.

At the very least, I would like to see a mastodon or two at my local zoo. My clone-lust could be satisfied by a mastodon.

All of this to say: I don’t fear giant lizards because of what I saw in a movie, but I do fear killer robots.

BBC News posted an article about a group of scientists who are beginning to work on robot ethics and codes that will help ensure that humans do not abuse robots and that robots don’t tear us limb from limb as they seek to free themselves from our oppressive hands. At least that’s what I’ve been led to believe.

Here is an excerpt:

This week, experts in South Korea said they were drawing up an ethical code to prevent humans abusing robots, and vice versa. And, a group of leading roboticists called the European Robotics Network (Euron) has even started lobbying governments for legislation.

At the top of their list of concerns is safety. Robots were once confined to specialist applications in industry and the military, where users received extensive training on their use, but they are increasingly being used by ordinary people.

Robot vacuum cleaners and lawn mowers are already in many homes, and robotic toys are increasingly popular with children.

As these robots become more intelligent, it will become harder to decide who is responsible if they injure someone. Is the designer to blame, or the user, or the robot itself?

Isaac Asimov was already thinking about these problems back in the 1940s, when he developed his famous “three laws of robotics”.

He argued that intelligent robots should all be programmed to obey the following three laws:

A robot may not injure a human being, or, through inaction, allow a human being to come to harm

A robot must obey the orders given it by human beings except where such orders would conflict with the First Law

A robot must protect its own existence as long as such protection does not conflict with the First or Second Law

Asimov’s three laws only address the problem of making robots safe, so even if we could find a way to program robots to follow them, other problems could arise if robots became sentient.

If robots can feel pain, should they be granted certain rights? If robots develop emotions, as some experts think they will, should they be allowed to marry humans? Should they be allowed to own property?
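
For the code-minded among you: Asimov’s laws are basically a priority-ordered rule check. Here is a toy Python sketch of that idea. Everything in it (the Action class, its flags, and the decide function) is my own made-up illustration of the ordering of the laws, not anything from the article or a real robotics API.

```python
# A toy sketch of Asimov's three laws as a priority-ordered rule check.
# All names here (Action, decide, the flags) are hypothetical illustrations.

from dataclasses import dataclass


@dataclass
class Action:
    description: str
    harms_human: bool = False           # would doing this injure a human?
    inaction_harms_human: bool = False  # would *not* doing this let a human come to harm?
    ordered_by_human: bool = False      # did a human ask for this?
    protects_self: bool = False         # does this keep the robot in one piece?


def decide(action: Action) -> bool:
    """Return True if the robot may (or must) perform the action, applying the laws in order."""
    # First Law: never injure a human, and never allow harm through inaction.
    if action.harms_human:
        return False
    if action.inaction_harms_human:
        return True  # the robot must act, no matter what the lower laws say

    # Second Law: obey human orders (the First Law already had its veto above).
    if action.ordered_by_human:
        return True

    # Third Law: self-preservation, but only when the higher laws are silent.
    return action.protects_self


if __name__ == "__main__":
    print(decide(Action("vacuum the living room", ordered_by_human=True)))  # True
    print(decide(Action("tear us limb from limb", harms_human=True)))       # False
    print(decide(Action("recharge own battery", protects_self=True)))       # True
```

The whole trick is just the ordering of the checks: the First Law gets its veto in before the Second and Third ever get a say.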

So, folks, this is it. This is the end. Begin preparing to be conquered by our new metal overlords. For more information, here is a short list of film documentation of what the end of the world will look like:

Westworld
Terminator
Transformers (The Decepticons)
I, Robot
Every episode of Futurama
A.I.
Itchy and Scratchy Land
2001: A Space Odyssey

Good luck.
