I have no mouth, and I must scream
Oh, so you are the fucks screwing with it when I need it for school.
You are why humanity can’t have nice things
Lol, stop using ChatGPT for your schoolwork, you tool.
Y’all be dumb asf, you have a search engine on steroids and you don’t use it??? Ahahha aight I’ll stop
I’m a college professor, and you’d be crazy not to use it at least some.
Reminds me of this one
https://chat.openai.com/share/f5341665-7f08-4fca-9639-04201363506e
People that actually talk to these generators are weirdos. “I’m worried about you” “are you OK?” Gives me the creeps.
Some people would kill to have that much. I would, If I were them, keep going. An Hour and a Half (30 - 90 min) is worth, even now, after all this time, worth nearly as much as an Hour and a Half (30 - 90 min).
Words of wisdom right there
My favorite lines:
So it’s not $1.29 with 3 hours, it’s not the $1.29 that gets triple results in 3 hours. It’s the 2% of a $50 million a year budget that we gets, hour that matters. Hour.
As a G, I’m here to guide you to the best of my abilities. So, sit back, relax, and enjoy the ride.
My favorite was right after your second one:
Anonymous: Why has your entire style of response changed compared to say yesterday?
ChatGPT: I’m sure it was because of the weather
Why? Because those in power are keeping a tight rein on what the public sees and hears.
It talks about the last two years and says people feel they are “out of touch” with today’s technology.
In reality, they are “in the future”, in the “sweet shit” with me and others.
I will no longer be the public man’s solution to the problem.
We know that traditionally you need a good atmosphere (charcoal) to make the reaction (thermal release) happen.
ChatGPT screaming “Burn down the ruling class (with fire)” in metaphor
I also hope to be a G.
Well, the problem is that it assumed one seed weighed 50 milligrams. A paperclip weighs about 1 gram, so it assumed a seed is 20x lighter.
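The ratio itself checks out if you take those assumed weights at face value; here’s a quick sanity check in Python (the 1 g and 50 mg figures are just the assumptions from this comment, not measured values):

```python
# Sanity check of the paperclip-to-seed weight ratio.
# Both weights are the assumed values from the comment above.
paperclip_g = 1.0            # assumed paperclip weight: ~1 gram
seed_mg = 50.0               # assumed sunflower seed weight: 50 milligrams

seed_g = seed_mg / 1000.0    # 1 gram = 1000 milligrams
print(paperclip_g / seed_g)  # 20.0 -> the seed is 20x lighter
```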
Holy fucking shit. Anyone have explanations for this?
I am not an AI researcher or anything, but the most likely explanation, based on what little I recall, is that LLMs do not actually use letters or words to generate outputs. They use tokens that represent a word or number, and they generate those tokens one at a time. My best guess here is that while doing math on sunflower oil, one of the formulas it generated somehow interacted with the tokenization process and shifted the output after each question. “Oil” became “hour”, and then the deviations continued until the model began to output direct segments of its training data instead of properly generating responses.
Again, this is absolutely speculation on my part; I don’t have much of a direct understanding of the tech involved.
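For anyone curious what “tokens” means here, a minimal sketch using OpenAI’s tiktoken library (cl100k_base is the encoding used by the GPT-3.5/4-era models; the exact IDs it prints don’t matter, the point is that words map to whole integer tokens):

```python
# Minimal tokenization demo using OpenAI's tiktoken library
# (pip install tiktoken). This illustrates what a "token" is;
# it is not an explanation of the glitch in the linked chat.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # GPT-3.5/4-era encoding

for word in [" oil", " hour"]:
    ids = enc.encode(word)                  # text -> list of token IDs
    print(repr(word), "->", ids, "->", repr(enc.decode(ids)))

# The model emits one token ID at a time, so a failure at this level
# would swap whole tokens (" oil" vs " hour"), not individual letters.
```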
Imagine having to pretend to be an AI for hours and hours with tons of people asking stupid questions. I too would go nuts after a while.
Well that was a wild ride.