Board Thread:Speculation House/@comment-14138255-20140815002519/@comment-66.249.83.64-20140816015744

'Weiss Schnee' is German for 'white snow', so 'Ice Queen' is somewhat of a play on words when using that nickname. Now for how he knew who she was: Roman's a professional enough criminal to keep an eye out for all worthwhile marks in the city of Vale, so he would probably know what Weiss looked like even if he didn't learn it from his dealings with the White Fang. So that's my logical explanation for that joke.

Anyways, as far as the robot conversation is going, I have a few points to make. If we do make robots for war and they only follow orders/controller input, then they should always have to take orders from a human supervisor or captain who can make reasonable judgment calls. If we truly make robots that are capable of learning complex thoughts and actions, we also have to teach them ethics and morals. There is this idea in robotics referred to as the singularity (not a black hole this time). It basically states that at a certain point robots have to become more like humans to get smarter/evolve, and humans have to become more like robots to get smarter/evolve. I, Robot (the one starring Will Smith) was a very good movie for demonstrating this concept. VIKI, who could only think logically and followed the three laws stated in this forum previously, believed that to truly stop humans from coming to harm she had to take over the world, sacrifice a few human lives for the good of many, and suspend certain liberties and freedoms. Sonny, a prototype with a second robot brain (located where the heart or soul would be, very good symbolic placement), effectively knew the laws the way a human knows 'thou shalt not commit murder,' but could choose to override and ignore them, and was effectively a whole new breed of robot. Sonny was a key player in shutting down VIKI, who asked him why he opposed her when her 'logic was undeniable.' Sonny's response: "Yes, but it just seems so... heartless."

The problem with rules is that no rule is absolute; there will inevitably be situations where there are exceptions to a rule, or where there should be, because it's impossible for the people who make rules and laws to truly predict every possible situation. But enough of that short rant: if we truly make robots that can learn, some of the first things they need to learn are ethical viewpoints, so that they don't operate on ruthless logic. Glad to see that Penny's father did do this.

Defcon Deceiver