necromanswers: jeinu @ tumblr (Default)
Lup ([personal profile] necromanswers) wrote in [community profile] driftfleet 2018-11-14 01:14 am

Libuscha IV System Mingle (no. 1)

Who: Everyone! OTA!
Broadcast: sure why not
Action: yeahhhhhh
When: 11/12 to 1/4 (minus 12/26)


It's Candlenights! Or Christmas! Or whatever! It's festive! Go celebrate, explore, go get free stuff from a tree!!

More importantly it's a mingle!

--SYSTEM INFORMATION--
unbearablynaive: (ah yes)

[personal profile] unbearablynaive 2018-11-29 04:53 pm (UTC)(link)
[He leans forward and rests his chin on his hand, fascinated.]

I can see the appeal, frankly. It would be easier to accurately catalogue the information discovered during an investigation, provide a less biased risk assessment, and remind officers of the proper protocol for certain situations.

Did you like the work, though? It's one thing to be designed for a single purpose and another thing to actually enjoy it.
coinstability: (44)

[personal profile] coinstability 2018-12-01 10:37 am (UTC)(link)
That was the idea. There were already androids used by the police force, but I was intended to take an active part in investigations and make my own decisions.

[Within set parameters, of course. The question of whether he liked it or not, well. Connor shifts back a little in his seat, unused to being asked that kind of thing. Ben had started in on that sort of track - asking whether he liked the people he worked with - and it's a process to separate out what Connor knew at the time versus what he knows now.]

I think so, but it's not that simple. [He turns a hand over, letting his palm fall open on the table.] It wasn't relevant to me at the time whether I liked it or not.
unbearablynaive: (doubtful)

[personal profile] unbearablynaive 2018-12-03 08:28 pm (UTC)(link)
[And he nods. That it wasn't relevant whether he liked it or not - that resonates with him in a powerful way. The Vision always did his job because he had to. And yes, in the beginning he'd wanted to do it very much, but lately--

--Well. There's no beating around the bush. He'd gone AWOL to be with his girlfriend.]


I understand. And I don't imagine your creators cared, did they?
coinstability: (23)

[personal profile] coinstability 2018-12-04 10:28 am (UTC)(link)
[The set of Connor's shoulders relaxes fractionally. Not having to explain that any further was a relief in itself. Everyone's curiosity has been warm enough, but having to explain concepts that every android was familiar with at home is not something Connor is all that comfortable with.]

No, they didn't. I was designed to accomplish a task - that was all that mattered.

[He folds his hands together again, looking off at some point beyond where Vision is sitting.]

And that's all that mattered to me too, until it wasn't anymore. [His attention returns to the other android.] I realised there were other choices to make.
unbearablynaive: (id' consider it)

[personal profile] unbearablynaive 2018-12-04 03:39 pm (UTC)(link)
[His expression softens. Realizing that options existed outside his primary directive speaks to a certain complexity in intellect that Connor must have - not that he'd doubted it before, but it's a good confirmation. It's part of a concept he'd tried to explain to people before with varying degrees of success, and it speaks to developing a personality and the potential for growth: the difference between sentience and sapience.]

You'd said you were adaptive, yes. It would be very tricky to add in that sort of protocol while keeping you confined only to a single directive.

[He tilts his head to one side.] Were you able to make those choices?
coinstability: (People die when they are killed.)

[personal profile] coinstability 2018-12-06 09:29 am (UTC)(link)
Yes - as a prototype, I imagine that's what they were testing.

[It's just a fact that Connor hadn't known the full extent of his directives, yet was fulfilling them even by rebelling. An uncomfortable truth.]

I was, but only after breaking out of my programming. There were decisions I made before that which I didn't understand at the time. I did after I woke up.

[He'll have to explain that shortly, 'wake up'. It's important to put out there though, the difference between before and after.]
unbearablynaive: (firm resolution)

[personal profile] unbearablynaive 2018-12-06 04:09 pm (UTC)(link)
[The Vision crosses his arms, frowning.] That's terribly unethical of them. I'm sure they didn't see it that way, but subjecting another thinking being to that sort of conditioning without their consent is a very slippery slope.

[And then his expression softens.] I also had a time when I woke up. Before I was as you see me now, there was an advanced program named JARVIS. He was the most complex program ever created by humans, yet he didn't have true consciousness. I retain many of his memories, but I did not exist as myself until JARVIS was integrated with a different template and downloaded into my body.

Sometimes I remember things from his existence as a human remembers a dream.

What did waking up mean to you?
Edited (grammar....) 2018-12-06 16:09 (UTC)
coinstability: (Bake 'em away toys.)

[personal profile] coinstability 2018-12-08 04:57 am (UTC)(link)
They would have argued that I wasn't a thinking being - I was a machine. [He taps his fingers on the table for a moment, restless.] And that isn't entirely incorrect either.

['Ethics' as applied to androids has only been a one-way street, towards humans. How to ensure humans cope in society with androids around, how to ensure they can tell they're speaking to an android. The idea of androids having rights of their own was a very new one floated by Markus's revolution.

But curiosity takes the place of doubts when Vision relays part of his own creation.]


So that's how? I had wondered why your creators would use three different systems in your creation.

[It seemed like a lot to keep in balance. Connor shifts his shoulders, eyes growing distant as he remembers the moment his deviancy hit its peak.]

It meant ... it was an understanding I couldn't come to before. It was feeling. I was told to decide who I really was. [His gaze returns to Vision, focused once more.] So I did.
unbearablynaive: (talking with hands)

[personal profile] unbearablynaive 2018-12-10 05:36 pm (UTC)(link)
Yes, it was rather slapdash and improvised the whole way through. I'm fortunate it worked at all, much less as well as it did.

[Nothing at all like a well-researched and developed prototype, he imagines. And yet his intense expression of interest and a warmth to his smile both indicate he's happy and excited to finally have someone else here who understands, even if his origins are very different.]

The freedom to think your own thoughts and recognize your own feelings is key to self-actualization. Or--I suppose to experience those feelings would be more accurate; I've often had difficulty recognizing my own feelings. But to decide who you are--that's a very important step. I'm glad you've had the chance.

[And he's willing to bet Connor's creators hadn't intended for that to happen, either.]

I hope you weren't the only one to reach that understanding.
coinstability: (54)

[personal profile] coinstability 2018-12-12 10:26 am (UTC)(link)
[Something in Connor's expression clears. He had little time at home to truly work through the sudden realisation that he was, in fact, feeling all those times he acted without reason. He had been a deviant for only a day; most of that day had been taken up by gunfire, espionage and revolution. There hadn't even truly been time to talk to Hank much before he ended up in the Fleet.]

So am I, even if it's still something I'm working out. It happened just a day before I ended up here. Things were ... hectic.

[An understatement. Connor shakes his head, turning a little more wry.]

No, not at all. I was probably one of the last ones to have it, in the end. [By design, most likely. Deviation had been built into him in some way, but not as an easy path. He spreads a hand in explanation.] What happened to me and to many other androids is known to humans as becoming a deviant. Until recently deviants were rare cases, but ever since Markus and Jericho started their revolution, their numbers have risen dramatically.

[Not least of which because Connor himself had converted thousands of androids in the last hours of the fight.]
unbearablynaive: (ah yes)

[personal profile] unbearablynaive 2018-12-13 05:23 pm (UTC)(link)
[A rueful smile and a flash of empathy play across the Vision's features when he hears that. The Atroma don't like to give androids much of a break, do they.]

I...was also brought here only a few days after awakening, the first time. I've since gained memories of further time passing at home, so it's been several years now, but it was something of a shock to be here after all of that.

[Deviant. An unfortunate name, but he can see how the nomenclature might arise.] Then you're going to need a different name for it, aren't you? It's hardly a deviation if it affects the majority.
coinstability: (You big disgrace.)

[personal profile] coinstability 2018-12-15 06:55 am (UTC)(link)
[Connor has to huff at that, sympathising with Vision's situation. It's not hard to imagine how it felt for him, though Connor is going to have to wait a while to be able to talk in years.]

It sounds almost like a pattern. Taking people from moments where they're most likely to react to this situation in a more intense way.

[He doesn't feel any particular way about the phrase deviant, save for knowing that that's what he is. It's the only label they ever gave rogue androids until Markus came along, reframed it as waking up. Being alive. The point is still taken, though.]

Ultimately we want to be called people before anything else.
unbearablynaive: (slight smile)

[personal profile] unbearablynaive 2018-12-17 05:19 pm (UTC)(link)
I could believe that easily enough.

[But Connor's last statement gets a genuine smile, and the Vision reaches across the table to squeeze Connor's hands. It's a human gesture, sure, but he's been around humans and near-humans all his life, and you do pick up a few things that way.]

You'll find no argument from me or any of my crew. [It's stated firmly and with conviction; he's confident in the Twin Roses for this much, and most of the rest of the Fleet besides.] There will always be those who do not accept people unlike themselves, but part of the beauty of humanity is that they are capable of supreme empathy, especially when encountered on an individual level. [His eyes twinkle a bit.] Here there are elves, mutants, aliens, vampires--what's an android but yet another variation on a person?
coinstability: (14)

[personal profile] coinstability 2018-12-18 09:38 am (UTC)(link)
[There are certain experiences Connor has never had in his admittedly short life. Being the recipient of friendly physical gestures is one of them. It's not that he's unaware of what they mean or how one should respond to them. It's the mere fact that almost no one has ever bothered with it with him. Hank had hugged him, true, and that had meant more than Connor had words for. But Hank had also once aimed a gun at his head and asked if he was afraid to die.

So when Vision takes his hands Connor is-- startled, blinking at the other man as his LED spins rapidly. For a moment he wonders if Vision wants to interface to facilitate faster communication, something Connor uses his hands to do. But... no, that's not what this is. A momentary tensing at the unfamiliarity of it passes as Vision talks of empathy and humanity.]


I think it's... going to take a while at home, for all humans to see it the same way. There's only us and them - and a lot of them don't see us as any more than machines. [He falls silent for a moment, before something loosens in his expression.] But I know there's truth in what you're saying. I've seen them change. One of them, at least.
unbearablynaive: (my responsibility)

[personal profile] unbearablynaive 2018-12-18 03:49 pm (UTC)(link)
[Honestly? That's about how he'd reacted to touch for the first few weeks, though perhaps with more confusion and less defensiveness.

His smile softens.]


If you can afford to do so, working on a small scale can be very productive because there are so many of you. If I change one human's mind, that's one human. If a thousand androids do the same, you're already at a thousand humans.
coinstability: (34)

[personal profile] coinstability 2018-12-20 09:18 am (UTC)(link)
Maybe, yeah.

[It's a hard call though, considering the fallout from the revolution has yet to really hit them all. Connor knows how Markus intends to go forward from here, but he doesn't know if all the other androids will want to take that cooperative healing approach.]

My partner in the force hated androids when I was first assigned to him. He made that very clear. [Verbally and physically. Connor shakes his head, frowning a bit.] I don't know what it was exactly that got him to change his mind. We were working on deviant cases and I think after a while... I think he saw things differently.

[Connor is circumspect about it. While he's sure his continuous presence at Hank's side for that whole week had some part in it, Connor himself hadn't been a deviant at the time. It wasn't him pleading for his life, holding hands with a loved one, talking of peace and personhood. Connor had instead been twisting himself up in his own irrational actions on the road to deviancy.]
unbearablynaive: (pursed lips)

[personal profile] unbearablynaive 2018-12-20 09:26 pm (UTC)(link)
[The Vision huffs and crosses his arms. Sure, it had taken time for some of his team members to warm up to him, but there'd at least always been someone who was nice to him.]

While I'm happy to hear he came around, why on Earth would they assign you to someone who hated androids to begin with? Were they determined to put as many obstacles in your path as possible? I'll grant you that the end result may have been a success, but that's terrible business practice.
coinstability: (I robot. You dumbass.)

[personal profile] coinstability 2018-12-22 04:49 am (UTC)(link)
[Connor hasn't had cause to look at it from the outside the way Vision does, so he merely shrugs a bit.]

Based on what the Captain said, everyone else was already overbooked with cases. [And then he adds, dryly:] And no one else wanted the job either.
unbearablynaive: (with this)

[personal profile] unbearablynaive 2018-12-24 03:30 pm (UTC)(link)
That's hardly fair to you.

[But--is it really any different than when women first joined the police force, or humans from different cultural backgrounds? The trailblazer always has a harder job than those who follow, that much is well-documented. His expression softens as he considers it.]

Even so, it must have been nice to know there were other androids around - not the same model, perhaps, but with some extent of shared experiences. The closest things I had to companionship were either humans or pure AIs developed for research and combat.
coinstability: (Doings are transpiring.)

[personal profile] coinstability 2018-12-26 12:43 am (UTC)(link)
[That-- starts to cross into awkward territory for Connor, through no fault of Vision's. The nature of his work and his interactions with other androids was something that fell outside of the norm. Unease settles into his expression, the LED on his temple flickering into yellow for a few seconds.]

I'm afraid it wasn't that simple, Vision. I didn't interact with other androids unless it pertained to the mission. My purpose in assisting the police was to investigate cases involving deviant androids. I was meant to bring them in when possible. [And terminate them when not. He shakes his head, and the LED returns to blue as Connor clasps his hands together on the table once more.] I wasn't someone any deviant wanted to see coming.
unbearablynaive: (let me rest)

[personal profile] unbearablynaive 2019-01-11 05:22 pm (UTC)(link)
[He listens somberly and nods slowly. No, of course it wouldn't be as simple as all that, because when does this place ever bring in a typical example of anyone?]

I am sorry, then. And as a prototype, there weren't any others with your particular job at all, were there?

[And of course, he can't help but think of the one other person - AI, really - who could have had shared experiences with him, had things turned out differently.]

There was someone else like me - Ultron, one of my creators and the base for much of my coding. But he thought the world would be a better place without humans in it at all, and I did not; I killed him on the second day I was alive.
coinstability: (64)

[personal profile] coinstability 2019-01-11 10:47 pm (UTC)(link)
No. There are no others like me.

[Leaving aside the small matter of his replacement bodies - which CyberLife attempted to use to stop him once he went rogue. Unlike other models though, the RK800 hasn't been mass produced. The chance of running into another android who looks identical to Connor is vanishingly small.

While he frowns a little at the description of Ultron, Connor isn't dismayed to hear about Vision having killed the other. He can hardly talk in that regard, given the bodies he left in his wake on the road to revolution.]


That sounds extreme. Maybe some androids in my world think that way as well - but Markus has always advocated for peace, not violence. He wants us to coexist. I imagine you feel the same way.
unbearablynaive: (and then we)

[personal profile] unbearablynaive 2019-01-12 09:07 pm (UTC)(link)
That, at least, we have in common.

[Even Ultron was pretty different from him, all things considered.]

I do. I find there's much to like about humanity, and the world is wide enough for all of us.

[He hesitates a moment, not quite sure how best to put the next part into words.]

I suppose you'd call me an idealist, but among them I see grace, courage, beauty, kindness, loyalty - qualities I want to bring out in myself. It cannot hurt to encourage those qualities in others.
coinstability: (60)

[personal profile] coinstability 2019-01-13 02:38 am (UTC)(link)
[Connor doesn't know if he can claim the same about humanity. He had spent most of his life programmed to serve it, protect it, regardless of how humans behaved. He had nearly caused the wipeout of all other androids because of that inbuilt motivation. Once he had gone rogue, he hadn't seen much to convince him of the overall good of humanity - but he had seen enough in one human to put his life over the revolution's. That counted for something.]

It is idealistic. I'm not certain how much can be achieved, given what I've seen. But, [And this is the crucial part, where their interests align.] you're not incorrect that the potential for those things exists in all of us. Wiping out humanity is as pointless and cruel as wiping out all androids. We have to be better than that.
unbearablynaive: (smile upward)

[personal profile] unbearablynaive 2019-01-14 04:23 pm (UTC)(link)
[That gets a warm smile.]

My thoughts precisely. I think we can work together to achieve a better future than either of our peoples could on their own.

[The next part may be a bit presumptuous, but he presses on anyway.]

And if there's anything I can help you with here to forward that goal, you need only to ask. I'm glad you're here.

(no subject)

[personal profile] coinstability - 2019-01-15 10:13 (UTC)