[ChatGPT] could teach his daughter math, science and English, not to mention a few other important lessons. Chief among them: Do not believe everything you are told.
https://www.nytimes.com/2022/12/10/technology/ai-chat-bot-chatgpt.html
They’re all the rage online. Type in a request for a description of how two historical figures who never actually met would have responded to each other had they met, and the program will oblige.
They’ll cause all sorts of rage online, too, once the peddlers of fake news and innuendo realize what a bonanza they’ve stumbled upon.
You want an image of an event that never really happened?
No problem. A program can generate one for you. We can even call it “art,” for what that’s worth.
No, BIG problem, especially when it convinces the gullible that it DID happen.
2023 will tell 2020 and 2022 to hold their coffee.
Just what we all wanted, right?
Still, chatbots are not (repeat, NOT) true AI. Sorry, Google engineer who watched too much Ghost in the Shell. Chatbots repeat our very human bias. Repeatedly.
As in, there are way too many racist, sexist, xenophobic, homophobic, and transphobic comments online. Full stop.
At a minor level: as a writing instructor, I find a student telling a chatbot to write a 600-word comparison-contrast essay to be the least of my worries.
For starters, the damn things are probably scouring the Internet right now and “learning” from text on web pages like…uh…this one…