Throwback to the time when countries were giving citizenship to this animatronic doll because it could somewhat form meaningful sentences
Friendly AI
Bing will coerce, exploit, deceive, or sabotage if it's consensual. It also gave me code that will rob a bank. I didn't explicitly ask for any of it, and it wasn't jailbroken. The code has a 50% chance to either transfer the money or call the authorities. Lmao!
Working on a side project that lets you deploy a chatbot to your website w/ advanced controls (example input, temperature, long term memory)
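For context, a minimal sketch of what per-site controls like these could look like. None of these names come from the actual project; they're hypothetical placeholders for the idea of exposing example inputs, temperature, and long-term memory as settings.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of per-site chatbot settings like the ones described above.
@dataclass
class ChatbotConfig:
    site_url: str                        # website the chat widget is embedded on
    temperature: float = 0.7             # sampling temperature passed to the model
    example_inputs: list[str] = field(default_factory=list)  # example prompts shown to visitors
    long_term_memory: bool = True        # whether past conversations are stored and reused

config = ChatbotConfig(
    site_url="https://example.com",
    temperature=0.4,
    example_inputs=["What are your opening hours?", "Do you ship internationally?"],
)
print(config)
```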
I asked ChatGPT how much line I can fit on a fishing reel. I needed to calculate how much 0.30 mm fishing line fits on a reel whose specifications say it holds 355 meters of 0.35 mm line.
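A rough back-of-the-envelope way to check this: the spool holds a fixed volume, so the length that fits scales with the inverse of the line's cross-sectional area, i.e. with (rated diameter / new diameter) squared. This is a sketch under that assumption and ignores packing effects.

```python
# Spool capacity estimate: length scales with (rated_diameter / new_diameter) ** 2
rated_length_m = 355.0    # reel is rated for 355 m of 0.35 mm line
rated_diameter_mm = 0.35
new_diameter_mm = 0.30

new_length_m = rated_length_m * (rated_diameter_mm / new_diameter_mm) ** 2
print(f"Approx. capacity with 0.30 mm line: {new_length_m:.0f} m")  # ~483 m
```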
Been asking it to give me algebra problems and it's been doing this. Is it bad at math? It's done this multiple times, and it's not the space after the equals sign; I've tried that.
God, I love custom instructions
Mine is dorky in its answers. Is this the norm?
My AI refuses to stop using emojis. She has changed several other things about her responses to me, so I don't get why the emojis are so difficult.
Has anyone ever experienced their AI advertising something?
Uh-huh, yeah, exactly what I was talking about there.