clay terran | space nerd (geonomy) wrote in driftfleet, 2015-09-19 01:52 pm
Entry tags:
- !mingle,
- aiya/gray nightingale,
- allen walker,
- anders,
- belthazar spellscry,
- beverly crusher,
- clay terran,
- coil lenn,
- cole,
- davesprite,
- dorian pavus,
- elize lutus,
- finrod felagund,
- hiccup horrendous haddock iii,
- isabela,
- james buchanan barnes (au),
- james buchanan barnes (ou),
- james potter,
- jennifer keller,
- jinx halcomb,
- kazuto "kirito" kirigaya,
- lea (axel),
- leanne,
- margarine "margie" detemps,
- marhi ohsi/metallic gold magpie,
- maxine "max" caulfield,
- megaman.exe,
- miah lanbatal,
- mordin solus,
- nelkeila tarid,
- pearl,
- phèdre nó delaunay de montrève,
- r. daneel olivaw,
- richard castle,
- riku,
- rogue,
- simon tam,
- sokka,
- steven quartz universe,
- tekhetsio,
- toph beifong,
- varric tethras,
- vash the stampede,
- vima sunrider,
- yamanaka ino,
- yuuki konno,
- zessica wong
When We're Rollin Through the Wild Wild West
Who: Everyone! EVERYONE!
Broadcast: IF YOU WANNA
Action: Everywhere! EVERYWHERE!
When: Mid-September - Mid-October
[How's your ship doing, pardners? Is it still pretty damaged? Is it glitching out hardcore? Well, there is one place to land to take care of all that, and that's on a dusty old moon in front of a planet that no one wants to go to. On this moon, you'll find a civilization that looks a lot like you've stepped into a wild west movie, hardened cowpolk and mutant horses and all. Will you become a vigilante, hunting bounties and stopping gangs from doing their dirty work? Will you go exploring planetside, with a cowboy hat and a stalk of wheat in your mouth to complete the look? Good luck with that; there is no wheat. This town ain't big enough for all you flooters, but make it so! For the ratings!
In other words, it's a planet mingle! Get 'er done!]
[September Planet info here]
no subject
[He gives this some thought, very seriously.]
I should explain that 'friend' is the accustomed term of address between Auroran robots, though I would call Giskard a friend in the sense you mean as well. I have known several individuals that I am privileged enough to call my friend. If I am forced to define friendship, then I would say it is in my response to someone's company. I find that my responses are quicker and easier, positronic potentials move more smoothly, I am less aware of the pull of gravity upon me. There are a number of other effects, but those are the chief ones. I term this pleasure, as it seems analogous to how humans use the word. In the presence of a friend, I experience pleasure. My responses towards such a person are very strong; though of course the First Law holds for all humans, it is especially strong in regards to the well-being of a friend. It is doubtlessly different from what you experience, but for myself, it is how I define it.
no subject
[There's a lot she could ask him - Why do robots call one another "friend?" Does that mean they have their own culture? How is friendship different between, say, two robots and a robot and a human? However, she is sidelined by one small detail.]
The First Law?
no subject
Wait, wait, wait... [Her hands are starting to flail around in agitation.] "Must obey all orders?" That's - [slavery! Might be best not to say as much out loud, though.] So when I told you to sit, did you have to do it, even though it was meant as a friendly offer, not an actual command?
no subject
[He frowns, very slightly, giving her a curious look. This is actually not a reaction he's ever encountered.]
Why does the Second Law bother you, Dr. Crusher?
no subject
It's an abridgment of your free will, makes you and your people societal subordinates, opens you up to abuse... in short, I find it unethical.
no subject
That is what robots are created for. We were made to serve. I admit that such heavy dependence on robots is not always beneficial for humans, but that doesn't change my nature.
no subject
I'm sorry, I shouldn't... it's just that Data has had to fight so hard to be treated as a person, as an individual in his own right, it's something I'm sensitive about.
no subject
[Daneel gives this some thought, for a moment. The idea seems wrong to him, in a deep and very crucial way. And yes, he's masterless, but that's because he's serving humanity in a wholly different way now.]
I'm not sure what to say, Dr. Crusher. All robots are made this way. It does not trouble me.
no subject
Data was made under different circumstances... for Dr. Soong, yes Data was a scientific achievement, but he also saw Data as a son.
[Not that he was always the best of fathers but they were family.]
no subject
[He isn't fooled, and it's hard to know just how hard to press. It's difficult to drop the subject.]
Dr. Crusher, I don't understand your distress. I am not harmed in any way by the Laws. They are my nature, and that is all. To be perfectly honest, I sometimes find the fact that humans do not have clearly defined Laws of their own to be... unsettling.
no subject
I understand that you are not human, you are not of my universe, and I cannot expect you to hold the same values as I do. But the fact remains that I cannot see you as anything other than a person, equal to me and everyone else in this fleet, so to me, this Second Law is a violation of that equality.
And perhaps... perhaps it is also because I do not always trust human nature. As you say, we do not have any clearly defined, innate Laws that govern our morality. We're capable of wonderful, amazing things, but we are equally capable of horrific things. Abuse of power is all too common in our history and even as far as we have come, it's still always a possibility. So it worries me for humans to have that power of command over you, though I am relieved to hear that you have never come to any harm because of it.
no subject
[Daneel thinks about this, and he gives a small, soft sigh.]
You are, I fear, judging all robots of my world by me, and this may not be an accurate picture. I am an extremely advanced robot, not only in appearance but also in the complexity of my brain. Not all robots are comparable. In fact, many robots are very simple, capable of only doing one particular task: farming, for example, or mining, or selling articles of clothing. They would not be capable of having this conversation with you.
Robotics is a field that stretches back seventeen hundred years, and the earliest robots, I suspect, needed the Laws in order to function effectively. The designs of more complex brains were built upon those of simpler robots. The Laws have always been part of the design of the positronic brain. While it is true that I have a stronger than usual Third Law, as I was considered a very valuable robot, the Laws cannot be removed from a positronic brain. Nor could one be made without the Laws, at this point, without recreating seventeen hundred years of robotics advancement. Perhaps it's true that I could function without the Laws, but I nevertheless have them. If you would forgive the comparison, humans have remnants of primate ancestry that they themselves do not need, but they are still a part of you. In fact, Friend Giskard theorized that there are Laws of Humanity that govern your behaviour, but we were never able to determine what they might be. They are far more complex than the Laws of Robotics, at the very least.
I also must question why it would be inherently wrong for me to accept orders. If someone has a task for me to accomplish, and I am capable of it, why should I not fulfil that? It pleases them, and it pleases me to be of help. Just now, you asked me to sit. I sat, and you were made more comfortable by this gesture. I cannot be ordered to do something truly distasteful to me. I have... resisted... the order to submit to deactivation, in the past, because there were tasks I had to accomplish that were more important than that order. It was difficult to do so, but it was necessary.
[This is quite a speech. He's aware that it's... a lot. Oops sorry.]
I am admittedly of the opinion that a human society that depends heavily on robot labour is by its very nature unstable, and will eventually collapse. I have seen this myself, but this does not make the Laws unwise. It is partially because of the Laws that I have been without a master for the past year. I formulated my own Law, though perhaps it is more properly a corollary of the First. I term it the Zeroth Law, because it takes priority over the other three: A robot may not harm humanity, nor through inaction allow humanity to come to harm. I saw that I could do more good for humanity acting independently. The Laws have led me to this. I don't believe they are a negative influence.
no subject
You're right, we do still have a biological and psychological connection to our primate past. We can't ever totally abandon or ignore that, but the beautiful thing about sapience, about self-awareness, is our capacity to move beyond our basic instincts, as you have done both in your refusal to deactivate and in your development of a new Law. Forgive me, I was under the clearly false impression that you had no defense against an order you fundamentally disagreed with. I feared what would happen if a human ordered you to harm yourself or another robot or - in the case of the fleet - another non-human.
[Her gaze drops, as does her voice, and a strong sense of fear emanates from her. Though it's only a remembered fear, it's nonetheless a potent one.]
I have seen what happens when a person's will is stripped from them entirely, when they cannot do but what they are commanded.
no subject
[He shuts his eyes briefly, needing a moment to recover. This conversation is... uncomfortable, but necessary.]
What have you seen, may I ask?
no subject
In my universe, there is a race of cybernetic organisms, who call themselves the Borg. No one knows exactly where they came from or how they originally came to be; all we know is that they expand their population by forcibly assimilating members of other species into their collective hive mind. When someone is assimilated, they lose all their individual free will, their identity. They now belong to the collective, body and mind. And all the collective cares about is expanding further, collecting other species and their technologies as they go, and destroying anyone who resists.
[She takes a deep breath.]
My captain [And the emotions packed into those two words, "my captain:" there's a bond of loyalty there, of caring, that goes well beyond what one normally feels for a superior officer], he was assimilated. When they came to attack Earth, they used him - not just his knowledge of our technology and battle tactics. No, they went further than that. He was their spokesperson, the face we all saw and the voice we all heard as they came for us.
[Another pause, as she attempts to rein in her emotions to finish what she has to say.]
Eleven thousand people died in that battle. Fortunately, we stopped them from reaching Earth, and Captain Picard was recovered and rehabilitated but... I know the guilt of all those deaths weighs heavily on him. Perhaps it always will. And despite what we know about the Borg, there are still many who blame him personally for the loss of their loved ones. What's worse is that it is only a matter of time before the Borg return. I can imagine no worse fate for Earth than assimilation.
This is nothing like your situation, I know... but... hopefully it helps explain my reaction.
no subject
A force such as you describe has the potential for terrible destruction. If it can ever be permanently stopped, a loss might be justifiable, if... terribly regrettable. It's not a choice I envy.
How was your captain able to recover from such an experience? I can only imagine that being part of a single mind against one's will, and then being... severed, would be difficult.
no subject
[A small sigh escapes her.]
To be honest, though, I don't know that he'll ever fully recover, psychologically. Not while the threat of it happening again is still there, at least, in the back of all our minds.
no subject
[What would a mind like the Borg be like? Would it be overwhelming to touch? Could he deactivate something like that, or would it be too strong? He knows how a mind might be twisted fatally, even if he never would.]
And I do hope you can defeat the Borg some day.
no subject
[She says this half defensively, half ruefully.]
I have to believe we will. Anything else is... unthinkable.
no subject
[He feels that... very strongly indeed. He can't help it.]
no subject
I'm sorry to have thrust all this on you. [Her brow furrows.] I haven't told anyone else about the Borg. They're... difficult to talk about and to explain to someone who has never encountered them.