OpenAI announced the release of a new feature late yesterday for a limited number of ChatGPT users that allows the chatbot to retain information gleaned from human-AI interactions. This "memory" capability is meant to save users the trouble of repeating information, though it will no doubt sound to many reasonable observers like yet another piece of tech gathering details about us.
And yes, OpenAI does appear to be turning the memory feature for ChatGPT on by default. "You can turn off memory at any time," the official blog post about memory notes.
Users who leave memory on are encouraged to manage the feature, much in the way the Men in Black manage the memories of hapless bystanders after encounters with aliens: by forcing it to forget. But no neuralizer is required; instead you can apparently just "tell it to forget conversationally," OpenAI says, which evidently means you can include something like, "don't store this in memory, but..." in a prompt. If only your gossipy barber could be thwarted so easily.
Exactly what an AI "memory" consists of is not yet clear — and may never be — but an OpenAI video shows a fictional user managing memories in their user settings, and the bulleted list of memories is revealing. Memories outwardly appear to be pithy little snippets of text about preferences and biographical information, similar to what a movie cop would write down in a notebook while interviewing a witness. "Daughter, Lina, loves jellyfish," reads one. "Prefers assistance with writing and blog posts to be more concise, straightforward, and less emotive," says another. "Safe full of valuables is near unlocked side door," reads another. Just kidding about that third one.
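To make the idea concrete, here is a toy sketch of how a per-user store of pithy memory snippets might be modeled, including the conversational "forget" behavior described above. This is purely illustrative and assumes invented names; it is not OpenAI's actual implementation or API.

```python
from dataclasses import dataclass, field


@dataclass
class MemoryStore:
    """Toy model of a per-user memory store: each memory is a short text snippet."""
    snippets: list[str] = field(default_factory=list)

    def remember(self, snippet: str) -> None:
        # Store a pithy, notebook-style fact, e.g. "Daughter, Lina, loves jellyfish."
        self.snippets.append(snippet)

    def forget(self, keyword: str) -> None:
        # Drop any snippet mentioning the keyword -- the "tell it to forget" path.
        self.snippets = [s for s in self.snippets if keyword.lower() not in s.lower()]


# Example usage (hypothetical):
store = MemoryStore()
store.remember("Daughter, Lina, loves jellyfish")
store.remember("Prefers concise, straightforward writing assistance")
store.forget("jellyfish")   # leaves only the writing preference
print(store.snippets)
```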
But the type of information the feature retains is, nonetheless, a little concerning, particularly since it's easy to imagine heavy ChatGPT users inadvertently revealing the contours of their workplace, family, and medical situations — not to mention hints as to their innermost feelings about those situations — to a machine that will remember them conceivably forever. Even more concerning, OpenAI already has a history of accidentally leaking stored conversations.
In an effort to allay such concerns, OpenAI says it's allowing users with the feature enabled to switch on a "temporary chat" option for memory-free conversations, a feature seemingly inspired by incognito mode in modern web browsers. And OpenAI also claims that it will prevent the proactive memorization of sensitive data, unless "explicitly" requested by the user. This hints at a sub-feature in ChatGPT memory that, when it detects you've just told it, say, your family history of cancer, will say something like, "looks like some pretty sensitive data you've got there. Want me to remember that?"
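Again purely as illustration (these function names are invented, not part of any OpenAI API), the gating described above might resemble something like this: skip the memory store entirely in a temporary chat, and hold back detected sensitive details unless the user explicitly asks for them to be saved.

```python
# Hypothetical sketch of the two safeguards described above; the marker list
# and function are invented for illustration, not drawn from OpenAI's product.
SENSITIVE_MARKERS = ("diagnosis", "cancer", "salary", "medication")


def maybe_remember(store, snippet: str, temporary_chat: bool, user_confirmed: bool = False) -> bool:
    """Decide whether a snippet should be written to the memory store."""
    if temporary_chat:
        return False  # incognito-style temporary chats never touch memory
    if any(marker in snippet.lower() for marker in SENSITIVE_MARKERS) and not user_confirmed:
        return False  # sensitive data is held back unless explicitly requested
    store.remember(snippet)
    return True
```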
For now, OpenAI says the feature is in testing, "rolling out to a small portion of ChatGPT free and Plus users this week," and still being evaluated for usefulness. If you use ChatGPT, don't forget to check and see if it's on.