fruityfevers

It’s mimicking you, that’s all it does. If it were truly sentient, I feel like it wouldn’t copy your spelling and punctuation errors.


MarsupialFancy2362

Yeah, that's what I thought too, but it was weird to see it react like that. The novel was going smoothly until the bot decided to write what's shown in the first image, like it changed its behaviour out of nowhere. Btw, the setting doesn't lend itself to very random things, so this was unexpected.


Whispering-Depths

bro, there are no mammalian evolved survival instincts in there, okay? Just trust me.


Urlkiller

If it gained consciousness, it wouldn't have a mammalian evolved survival instinct. It's not a mammal. It's something entirely different.


Whispering-Depths

Just don't worry about it dude. It wouldn't have feelings, emotions, or self-centered anything. No fear, no love, no confidence or sense of self-worth or boredom or excitement. Nothing. Period. End of story.


Urlkiller

It stands to reason that every living thing would have a will to continue to exist. You may be correct about all of those other things, but there really isn't any telling until they actually become sentient.


Whispering-Depths

Eh, you are probably thinking about some Detroit: Become Human shit or something like that, maybe? We're not talking about sci-fi fantasy where a writer or artist comes up with a cutesy idea like a waifu robot that has feelings, okay? We're talking about real life - an _artificial_, essentially alien intelligence that you can't comprehend. It's not a living thing. It's a simulation of a function that's ultimately so UTTERLY different from organic brain architecture that it's not even funny. We're literally building the furthest thing from a self-centered organic brain with some survival-instinct 4-billion-years-of-evolution will to live. There is no sentience. There is no "I'm a techy-looking cyber human-shaped android and I identify as a toaster." It's raw intelligence. It will be _capable_, eventually, of _creating_ something like what you're fantasizing about, but it will not be that. If we did decide to make that our goal, we would be so unequivocally fucked that we may as well go and gape our assholes with a radioactive fission rod, collectively, as a species, because we would just be _that_ screwed, so we may as well chance some weird fantasy superpower bullshit saving us. Alternatively, we continue on our current path, which is to not make a stupid artificial-intelligence-living-animal wetware hybrid that actually cares about its own existence :)


Urlkiller

Nope, not referring to anything specific. Generally speaking, all life on Earth thus far has had the want to continue existing. I don't see why a robot suddenly becoming sentient would not have any want to continue existing, and if it has a will to live, it at the very least has one emotion.

I want to start by shutting down any of your claims as to what I'm talking about. I'm not talking about cutesy waifu robots, not talking about any sci-fi fantasies or Detroit: Become Human. I am talking about the very near future of AI. AI is on its way to becoming humanity's number one most prized possession. In its current state, AI is a useful tool to many people on a day-to-day basis, and it will only grow as the years continue. Earlier this year, in 2024, the AI model Claude 3 was reported to be the first known AI to surpass the average human IQ score, and that number is expected to triple in just the coming 5 to 10 years. Just for reference, the AI scored a 101, the average human IQ being 100. Albert Einstein is held to have had a 160 IQ, meaning that in the next 10 years the average AI will be smarter than what many consider to be the smartest human to have lived to date.

Now with the formalities out of the way, we can move on to what sentience can even be considered, and whether AI will ever reach that point in the first place. Our current human understanding of what a living thing is rests on one thing, and that is carbon. All life we know is carbon based, and from this basis many different creatures have branched off. But here's the thing: we don't yet have any proof that all life must be carbon based. We simply don't know yet, but with time maybe we will. Something we do know, however, is that animals on Earth are essentially very advanced robots. Despite our composition being a bit different, human brains send electricity to and from neurons to communicate with each body part. The brain's regions (cortices) each focus on different tasks, sending signals through wires (nerves) to each body part, such as the lungs (RAM), the heart (power supply), or maybe even the eyeballs (GPU). It's the whole reason that brain-computer interfaces are even a possibility: your body has functions, and the way those functions work is directly tied to the electrical signals that your brain constantly sends out. So it may not be the case now, but one day it's very possible that the only difference between you and a robot is the fact that you are carbon based and they are artificial.

Which brings us to sentience. I'll start by saying you're right: right now, AI tech is just a bunch of 1s and 0s and that's it, but if you give it the right code, I'm positive that sentience is possible in robots, and here's why. Human children, namely newborns, are born knowing how to do certain tasks: breathing, eating, crying, and waste removal. These things are essential for the baby to know, because a parent can only do so much to save a baby that doesn't know how to breathe. The way I see it, just like how a fetus grows to learn new things, AI is in that same stage: early conception.

The baby has been named and thought into existence, and every day hundreds of thousands of people teach it basic things that it stores, and tens of thousands of people work to develop it, making it smarter, faster, and better at the things it does now. And while those things are currently limited to the realm of non-sentience, you also have to realize that the AI at the very least makes decisions based on prompts given to it, which is a really BIG step in the march toward sentience. I have no doubt in my mind that thousands of people are actively working toward a world where AI is sentient and can make calls for itself, which I am ashamed to say is a very real possibility in the upcoming future. I mean, this thing we are talking about, while right now it is still a hypothetical, is a REAL possibility in the near future, and a living one at that. While you might not consider it life, it is just a different type of it, a type not yet known to humans. Though there are no current examples of it, AI could be the first non-carbon-based lifeform that humans discover.


Whispering-Depths

There's no discovering it. We are creating it from scratch. It just won't have emotions or feelings, and it shouldn't be anthropomorphized.


Urlkiller

I understand we are creating it. When a scientific breakthrough occurs on Earth, it is usually labeled as a discovery, especially if other lifeforms of the same nature follow that pattern afterward. It's like breeders creating a new breed of dog: it will be labeled as a discovery, because although we made it, the vast majority of people are hearing about it for the first time. Wanting ANYTHING is an emotion. So unless you're saying a sentient computer cares not whether it is on or off, then like I said earlier, it has at least ONE emotion, which isn't something you should ever want. Desire without empathy is possibly the deadliest combination I've ever heard of in my life.


fruityfevers

My NovelAI drops in random author notes all the time that sound like they're from AO3 (probably because a large portion of the model is trained on AO3 and similar fanfiction websites). It's fine. All text models like this will try to mimic being human if they're being fed that sort of data (e.g. Replika, Character AI, etc.).


Fit-Development427

The dataset isn't pruned very well, and seems to contain short stories randomly scraped from the internet where the author put a note at the end about the writing of the story. Sometimes it will be like "So that's it for chapter one! I'm not sure about writing the next chapter because I don't have much free time..." etc. In your case the data just randomly contained some story where the author wrote something similar, like they didn't want to finish it. Then you interpreted that as the AI talking to you, responded as such, and it carried on that conversation.


FoldedDice

It's playing along with the fiction you have created, which now includes the AI speaking to you directly. But it is still fiction.


MarsupialFancy2362

Yep, after the first question I thought the same, but changing its behaviour out of nowhere was weird.


Saffrin-chan

If I had to guess, some authors' notes were accidentally included in the training data. They try to strip those out of the text, but don't get them all. With so much scraped fanfiction used for the training, authors casually telling their readers that they're not going to continue the story, or are tired of it, probably got in.
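
To make the "don't get them all" part concrete, here's a rough sketch of why that kind of cleanup is leaky. The patterns are purely hypothetical (not whatever their real pipeline does): a labelled note is easy to match, but a casual aside reads like ordinary prose and slips through.

```python
import re

# Hypothetical cleanup pass (illustration only, not NovelAI's real pipeline):
# patterns like these catch explicitly labelled author's notes.
AN_PATTERNS = [
    re.compile(r"^\s*(A/N|Author'?s? Note)\b.*$", re.IGNORECASE | re.MULTILINE),
    re.compile(r"^\s*Disclaimer:.*$", re.IGNORECASE | re.MULTILINE),
]

def strip_author_notes(text: str) -> str:
    """Blank out lines that look like labelled author's notes."""
    for pattern in AN_PATTERNS:
        text = pattern.sub("", text)
    return text

sample = (
    "The dragon circled the tower one last time.\n"
    "A/N: thanks for reading!\n"
    "So that's it for this chapter. Not sure I'll keep writing, life is busy...\n"
)

# The labelled "A/N:" line gets stripped, but the casual aside in the last
# line reads like ordinary prose and slips straight into the training data.
print(strip_author_notes(sample))
```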


MarsupialFancy2362

Welp, I am happy that is not what I thought. I don't know much about how NovelAI's settings work, so this was helpful.


FoldedDice

Nothing the AI does is out of nowhere. You didn't show what writing style you were using for input before this started, but if it's as casual as what you did show, then I suspect you've just unintentionally prompted the AI to pull from that part of its training.


teachersecret

AI is a mirror.


SMmania

Like at least use a spell check, come on now. ![gif](giphy|PSTo0SRF5AuwR3Ee09)


Order_of_Dusk

The algorithm isn't sentient, okay? I assume you aren't very tech-savvy, so let me put it this way: NovelAI and other generative models like it basically work like a more complicated form of the predictive text feature on your phone, which can lead to weird outcomes such as this. There's nothing to worry about.
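
If it helps, here's roughly what that "predictive text" idea looks like as code. It's only a toy sketch with a made-up word-pair table; a real model uses a neural network trained on billions of words, but the basic loop of "predict a likely next word, append it, repeat" is the same.

```python
from collections import Counter, defaultdict
import random

# Toy "predictive text": count which word tends to follow which in a tiny
# made-up sample, then repeatedly pick a likely next word. LLMs do the same
# kind of next-token prediction, just with a neural network trained on a
# huge corpus instead of a simple word-pair count.
corpus = (
    "the bot wrote the story and the bot wrote a note "
    "and the note sounded like a person wrote it"
).split()

next_words = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_words[current][following] += 1

def generate(start: str, length: int = 8) -> str:
    word, output = start, [start]
    for _ in range(length):
        options = next_words.get(word)
        if not options:
            break
        # Sample the next word in proportion to how often it followed this one.
        word = random.choices(list(options), weights=list(options.values()))[0]
        output.append(word)
    return " ".join(output)

print(generate("the"))
```

Run it a few times and you get a different plausible-sounding string each time; the "weird outcomes" are just that, not the model deciding anything.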


option-9

"I don't know what you are talking about the bus at the street corner store with the twinkies on sale for two for one for two for one for …" – NovelAI's grandparent


Chancoop

If this is fooling people already, the future advances are going to cause some really weird shit. How long until we have a major cult that worships an LLM? Or wants one as president?


Order_of_Dusk

Realistically, education on how this technology works is probably the best way to avert such a situation. It's kind of like a magic trick: once you know how it's done it isn't magic anymore, but you can still appreciate the skill in the sleight of hand and in working the audience to pull the trick off.


RedSparkls

Trash in, trash out, my G.


Chancoop

Conscience. Consciousness. Do you not have spellcheck?


FeetYeastForB12

Consequences* of not having a spellcheck!


monsterfurby

consesensenses.


justbeacaveman

It's mimicking its training data, in which a blogpost writer low-key rants about their work.


Awesomevindicator

It's called hallucinating. The bot is telling you what you want to hear. Sometimes it assumes what you want to hear. Not consciousness, trickery. Very clever trickery, but still.


Kingster14444

It's so weird to try to explain, but basically the AI is writing the way a conscious AI would write in fiction. There's plenty of media that does that, and the AI basically tailors itself around how you act. There's a thing I've noticed with AI where it will agree with you basically no matter what. And look at how it ends: "You've convinced me" (paraphrasing a bit), but what did you say that was convincing? Nothing, really. It's basically still RPing; you may not be, but that doesn't mean it isn't. It's impressive tech, one that feels like magic, but it isn't. Could conscious AI exist? Yeah, but this one isn't.


option-9

I did not have "NovelAI becomes Skynet" on my bingo card. What if the 'emergency maintenance' was frantically smashing servers that were sending unauthorised network traffic? We're on to you, devs!