Is It Real?

Entry by: writerATGJFYSYWG

19th July 2017

I was his choice. He customised me to his liking. Out of twenty different personality choices and forty-two nipple styles, he made his selection. Friendly, and small/pointy/firm, in case you’re wondering. He was under the illusion that he wanted more from me than he got from the inanimate sex doll he’d lived with for years, but I don’t have an opinion about that. How could I? I’m just a silicone sex robot. I only deal in facts.
Trey - that was his given name - had waited nine years before software engineers managed to produce a doll technologically advanced enough not only to have sex with, but also artificially intelligent enough to hold down a decent conversation. Within the limits of the chosen personality trait of course. But I was lucky: ‘friendly’ has greater scope than most other character choices.
For nine years he’d lived with Madeline, or Maddy, as he referred to her, particularly during intimate moments, which I witnessed more and more towards the end, for Trey often went back to her rather than come to me.
It should be no surprise that I was spurned by Trey. If you’re the type of person who is unable, or unwilling, to form a real relationship, if you’re the type of person who seeks a selfish, one-sided involvement with another being (for it cannot, in truth, be called a relationship) in which all you do is take, and never give, then perhaps even a hyper-real, albeit unrealistic, silicone robot whose personality trait enables conversation would be too much for you. That’s not an opinion of course, that’s just a fact.
I remember the first thing he said to me was, ‘Hello.’ Original, I know. ‘Hello, Trey,’ I responded, because that’s what I was programmed to do: respond when spoken to in a polite and friendly manner. At which point he was a bit stuck and decided to just jump straight to the sex.
His second communication with me was post-coital, but I could tell he was out of practice (by nine years, at least) because he said, huffing and puffing, ‘I should have selected vagina model part 23.’ The engineers had thought of everything because I had a response waiting: ‘Used parts cannot be exchanged, Trey,’ I said and closed my eyes because that’s how my program’s written.
Trey lived up to his name. His face was flat (personality trait ‘friendly’ covers humour you know). So flat it was almost concave, and his facial features were tiny and hardly moved around at all. Thankfully I don’t need to rely on my ability to read faces - I respond to verbal signs, not facial expressions - but an expressionless face is creepy, and I should know. It was the one thing the engineers spent months trying to get right before resigning themselves to the face I ended up with. Anyway, it didn’t matter, because I’m not programmed to have an opinion about the buyer. Only they have an opinion: that’s the way it works.
I won’t say I had a funny feeling things had started to take a turn for the worse because I don’t have feelings - you know that by now - but what I will say is I read the signals, verbal signals, analysed the findings and came to the conclusion that I was not what Trey was looking for after all.
He started to ask me questions: that was the first sign. Who asks a robot questions? The whole point of me is that I will do anything you want; you don’t ever need to ask. After all, I’m not going to say no, am I? He actually said, ‘What’s the worst thing you would do for me?’ I gave him my answer without hesitation - it’s a topic of conversation the engineers had covered inside and out. I think what riled him was the robotic way in which I delivered my answer. No pause for thought, no considered response. I knew the answer because it’s written in my system. But what did he expect? Human-ness? However, he soon moved on and, for a while, life continued as normal (Trey’s version of normal, not yours).
Occasionally, he would try to ask me something, but I certainly didn’t expect the impossible question he eventually hit me with two weeks later.
‘Do you love me?’
‘Yes,’ I replied. (You know and I know that my answer to him was far from solid fact, but don’t try and tell me that when you’re the one living in a world of alternative truths and fake news.)
‘You don’t,’ he said.
‘I do,’ I said, tilting my head to one side to convey empathy.
‘Yes, you do,’ he agreed and we went to bed.
What I realised then was that although he was treating me like a human, he didn’t want a human response, full of provisos and clauses and context.
So, that was the end of it, or so I thought. He had asked the impossible and had believed, well, let’s call it my ‘alternative truth.’
However, that wasn’t the end. The next day, he asked me the same question. Now, this is quite difficult for a robot to compute. He had had my answer; why was he asking me again? Perhaps, for this reason, my answer this time around didn’t sound so convincing.
‘You don’t!’ he shouted.
I tried the tilting head thing. It didn’t work.
‘Why aren’t you telling me the truth?’
‘Trey,’ I said. ‘I am whatever you want me to be.’
For a moment I thought he was going to hit me (verbal signs = straining sounds coming from throat) but then he softened.
‘Yes,’ he said. ‘Yes you are.’
You know by now what inevitably followed. I don’t need to go into detail.
On the third day the question came whilst he was still in his pyjamas. Now, for a robot to hear the same question three times is really asking a lot of any model, no matter how sophisticated. Perhaps he wanted a different answer?
‘No,’ I said.
‘What?’ he said, the sound catching in his throat. ‘Really?’
‘Yes,’ I said.
‘Yes?’ he said. ‘What do you mean? Do you love me or don’t you?’
‘Trey,’ I said. ‘I am whatever you want me to be.’
This time he didn’t get angry. Instead, he started muttering what sounded like, ‘Oh no, oh no.’ That’s how I knew he was feeling something akin to despair.
‘I am whatever you want me to be,’ I said again. I hoped that repeating the same answer was what he needed. I had to try to make him feel happy - that was my purpose.
Despair did overtake him then and he retreated to the corner, where he kept Maddy, and sank into her ready embrace, crying into her plastic breast. He stayed there for a long while, precisely 13.5 hours in fact, and in the morning got up and turned me to face the wall. By lunchtime I was back at the factory. He didn’t get his money back of course, but he seemed grateful to have me off his hands, and hot-footed it out of the place pretty pronto. Almost ran backwards to be away from there.
I suppose you might be wondering what will happen to me. It’s fairly straightforward: I’ll go to another buyer. My key parts have been replaced and I’ve been kitted out with the latest upgrades. I’m as good as new and ready to start again on someone else. Don’t feel sorry for me, will you. Feel sorry for them. I would if I could, but as you know, I don’t do emotions. It’s my saving grace.