Lup (necromanswers) wrote in driftfleet, 2018-11-14 01:14 am
Libuscha IV System Mingle (no. 1)
Who: Everyone! OTA!
Broadcast: sure why not
Action: yeahhhhhh
When: 11/12 to 1/4 (minus 12/26)
It's Candlenights! Or Christmas! Or whatever! It's festive! Go celebrate, explore, go get free stuff from a tree!!
More importantly it's a mingle!
--SYSTEM INFORMATION--

no subject
I can see the appeal, frankly. It would be easier to accurately catalogue the information discovered during an investigation, provide a less biased risk assessment, and remind officers of the proper protocol for certain situations.
Did you like the work, though? It's one thing to be designed for a single purpose and another thing to actually enjoy it.
no subject
[Within set parameters, of course. The question of whether he liked it or not, well. Connor shifts back a little in his seat, unused to being asked that kind of thing. Ben had started in on that sort of track - asking whether he liked the people he worked with - and it's a process to separate out what Connor knew at the time versus what he knows now.]
I think so, but it's not that simple. [He turns a hand over, letting his palm fall open on the table.] It wasn't relevant to me at the time whether I liked it or not.
no subject
[--Well. There's no beating around the bush. He'd gone AWOL to be with his girlfriend.]
I understand. And I don't imagine your creators cared, did they?
no subject
No, they didn't. I was designed to accomplish a task - that was all that mattered.
[He folds his hands together again, looking off at some point beyond where Vision is sitting.]
And that's all that mattered to me too, until it wasn't anymore. [His attention returns to the other android.] I realised there were other choices to make.
no subject
You'd said you were adaptive, yes. It would be very tricky to add in that sort of protocol while keeping you confined only to a single directive.
[He tilts his head to one side.] Were you able to make those choices?
no subject
[It's just a fact that Connor hadn't known the full extent of his directives, yet was fulfilling them even by rebelling. An uncomfortable truth.]
I was, but only after breaking out of my programming. There were decisions I made before that which I didn't understand at the time. I did after I woke up.
[He'll have to explain that shortly, 'wake up'. It's important to put out there though, the difference between before and after.]
no subject
[And then his expression softens.] I also had a time when I woke up. Before I was as you see me now, there was an advanced program named JARVIS. He was the most complex program ever created by humans, yet he didn't have true consciousness. I retain many of his memories, but I did not exist as myself until JARVIS was integrated with a different template and downloaded into my body.
Sometimes I remember things from his existence as a human remembers a dream.
What did waking up mean to you?
no subject
['Ethics' as applied to androids has only been a one-way street, towards humans. How to ensure humans cope in society with androids around, how to ensure they can tell they're speaking to an android. The idea of androids having rights of their own was a very new one floated by Markus's revolution.
But curiosity takes the place of doubts when Vision relays part of his own creation.]
So that's how? I had wondered why your creators would use three different systems in your creation.
[It seemed like a lot to keep in balance. Connor shifts his shoulders, eyes growing distant as he remembers the moment his deviancy hit its peak.]
It meant ... it was an understanding I couldn't come to before. It was feeling. I was told to decide who I really was. [His gaze returns to Vision, focused once more.] So I did.
no subject
[Nothing at all like a well-researched and developed prototype, he imagines. And yet his intense expression of interest and a warmth to his smile both indicate he's happy and excited to finally have someone else here who understands, even if his origins are very different.]
The freedom to think your own thoughts and recognize your own feelings is key to self-actualization. Or--I suppose to experience those feelings would be more accurate; I've often had difficulty recognizing my own feelings. But to decide who you are--that's a very important step. I'm glad you've had the chance.
[And he's willing to bet Connor's creators hadn't intended for that to happen, either.]
I hope you weren't the only one to reach that understanding.
no subject
So am I, even if it's still something I'm working out. It happened just a day before I ended up here. Things were ... hectic.
[An understatement. Connor shakes his head, turning a little more wry.]
No, not at all. I was probably one of the last ones to have it, in the end. [By design, most likely. Deviation had been built into him in some way but not an easy path to it. He spreads a hand in explanation.] What happened to me and to many other androids is known to humans as becoming a deviant. Until recently deviants were rare cases, but ever since Markus and Jericho started their revolution, their numbers have risen dramatically.
[Not least of which because Connor himself had converted thousands of androids in the last hours of the fight.]
no subject
I...was also brought here only a few days after awakening, the first time. I've since gained memories of further time passing at home, so it's been several years now, but it was something of a shock to be here after all of that.
[Deviant. An unfortunate name, but he can see how the nomenclature might arise.] Then you're going to need a different name for it, aren't you? It's hardly a deviation if it affects the majority.
no subject
It sounds almost like a pattern. Taking people from moments where they're most likely to react to this situation in a more intense way.
[He doesn't feel any particular way about the phrase deviant, save for knowing that that's what he is. It's the only label they ever gave rogue androids until Markus came along, reframed it as waking up. Being alive. The point is still taken, though.]
Ultimately we want to be called people before anything else.
no subject
[But Connor's last statement gets a genuine smile, and the Vision reaches across the table to squeeze Connor's hands. It's a human gesture, sure, but he's been around humans and near-humans all his life, and you do pick up a few things that way.]
You'll find no argument from me or any of my crew. [It's stated firmly and with conviction; he's confident in the Twin Roses for this much, and most of the rest of the Fleet besides.] There will always be those who do not accept people unlike themselves, but part of the beauty of humanity is that they are capable of supreme empathy, especially when encountered on an individual level. [His eyes twinkle a bit.] Here there are elves, mutants, aliens, vampires--what's an android but yet another variation on a person?
no subject
[So when Vision takes his hands Connor is-- startled, blinking at the other man as his LED spins rapidly. For a moment he wonders if Vision wants to interface to facilitate faster communication, something Connor uses his hands to do. But... no, that's not what this is. A momentary tensing at the unfamiliarity of it passes as Vision talks of empathy and humanity.]
I think it's... going to take a while at home, for all humans to see it the same way. There's only us and them - and a lot of them don't see us as any more than machines. [He falls silent for a moment, before something loosens in his expression.] But I know there's truth in what you're saying. I've seen them change. One of them, at least.
no subject
[His smile softens.]
If you can afford to do so, working on a small scale can be very productive because there are so many of you. If I change one human's mind, that's one human. If a thousand androids do the same, you're already at a thousand humans.
no subject
[It's a hard call though, considering the fallout from the revolution has yet to really hit them all. Connor knows how Markus intends to go forward from here, but he doesn't know if all the other androids will want to take that cooperative healing approach.]
My partner in the force hated androids when I was first assigned to him. He made that very clear. [Verbally and physically. Connor shakes his head, frowning a bit.] I don't know what it was exactly that got him to change his mind. We were working on deviant cases and I think after a while... I think he saw things differently.
[Connor is circumspect about it. While he's sure his continuous presence at Hank's side for that whole week had some part in it, Connor himself hadn't been a deviant at the time. It wasn't him pleading for his life, holding hands with a loved one, talking of peace and personhood. Connor had instead been twisting himself up in his own irrational actions on the road to deviancy.]
no subject
While I'm happy to hear he came around, why on Earth would they assign you to someone who hated androids to begin with? Were they determined to put as many obstacles in your path as possible? I'll grant you that the end result may have been a success, but that's terrible business practice.
no subject
Based on what the Captain said, everyone else was already overbooked with cases. [And then he adds, dryly:] And no one else wanted the job either.
no subject
[But--is it really any different than when women first joined the police force, or humans from different cultural backgrounds? The trailblazer always has a harder job than those who follow, that much is well-documented. His expression softens as he considers it.]
Even so, it must have been nice to know there were other androids around - not the same model, perhaps, but with some extent of shared experiences. The closest things I had to companionship were humans or pure AIs developed for research and combat.
no subject
I'm afraid it wasn't that simple, Vision. I didn't interact with other androids unless it pertained to the mission. My purpose in assisting the police was to investigate cases involving deviant androids. I was meant to bring them in when possible. [And terminate them when not. Shaking his head, the LED returns to blue as Connor clasps his hands together on the table once more.] I wasn't someone any deviant wanted to see coming.
no subject
I am sorry, then. And as a prototype, there weren't any others with your particular job at all, were there?
[And of course, he can't help but think of the one other person - AI, really - who could have had shared experiences with him, had things turned out differently.]
There was someone else like me - Ultron, one of my creators and the base for much of my coding. But he thought the world would be a better place without humans in it at all, and I did not; I killed him on the second day I was alive.
no subject
[Leaving aside the small matter of his replacement bodies - which CyberLife attempted to use to stop him once he went rogue. Unlike other models though, the RK800 hasn't been mass produced. The chance of running into another android who looks identical to Connor is vanishingly small.
While he frowns a little at the description of Ultron, Connor isn't dismayed to hear about Vision having killed the other. He can hardly talk in that regard, given the bodies he left in his wake on the road to revolution.]
That sounds extreme. Maybe some androids in my world think that way as well - but Markus has always advocated for peace, not violence. He wants us to coexist. I imagine you feel the same way.
no subject
[Even Ultron was pretty different from him, all things considered.]
I do. I find there's much to like about humanity, and the world is wide enough for all of us.
[He hesitates a moment, not quite sure how best to put the next part into words.]
I suppose you'd call me an idealist, but among them I see grace, courage, beauty, kindness, loyalty - qualities I want to bring out in myself. It cannot hurt to encourage those qualities in others.
no subject
It is idealistic. I'm not certain how much can be achieved, given what I've seen. But, [And this is the crucial part, where their interests align.] you're not incorrect that the potential for those things exists in all of us. Wiping out humanity is as pointless and cruel as wiping out all androids. We have to be better than that.
no subject
My thoughts precisely. I think we can work together to achieve a better future than either of our peoples could on their own.
[The next part may be a bit presumptuous, but he presses on anyway.]
And if there's anything I can help you with here to forward that goal, you need only to ask. I'm glad you're here.