Humans forget. AI assistants will remember everything

Leo Gaby, an analyst who covers connected devices at CCS Insight, says getting these devices working together will be key to taking the concept forward. “Instead of that kind of disjointed experience where certain apps are using AI in certain ways, you want AI to be that pervasive tool that you have when you want to pull anything from any app, any experience, any content. Then you have the immediate ability to discover all those things.”

When the pieces come together, the idea sounds like a dream. Imagine you could ask your digital assistant, “Hey, who was that guy I talked to last week who had a really good ramen recipe?” It would then reply with a name, a summary of the conversation, and a place to find all the related content.

“For people like me who don't remember anything and have to write everything down, this is going to be great,” says Moorhead.

And there's also the delicate matter of keeping all personal information private.

“If you think about it for half a second, the most important hard problem is not recording or transcribing, it's solving the privacy problem,” Gruber says. “If we start getting memory apps or recall apps or anything, we'll need to understand this idea of consent more broadly.”

Despite his own enthusiasm for the idea of personal assistants, Gruber says there's a risk that people may be too willing to let their AI assistant help with (and monitor) everything. He calls this risk a kind of Facebookization of AI assistants, where users are lured in by the ease of use but remain largely unaware of the privacy consequences. He advocates for encrypted, private services that are not tied to a cloud service, or, if they are, services accessible only with an encryption key kept on the user's device.

“Consumers should be told to bristle,” says Gruber. “They should be told to be very, very suspicious of things that look like that in advance, and to feel the creep factor.”

Your phone is already snatching up all the data it can get from you, from your location to your grocery shopping habits to which Instagram account you double-tap most. Needless to say, historically, people have prioritized convenience over security when adopting new technologies.

“The barriers and obstacles here are probably a lot lower than people think,” Gaby says. “We have seen the speed at which people will embrace and adopt technology that will make their lives easier.”

That's because there are real possibilities here too. Being able to actually interact with and benefit from all the information collected might also provide some relief from years of snooping by app and device makers.

“If your phone is already taking this data, and it's all currently being collected and ultimately used to show you ads, isn't it beneficial if you actually get an element of utility back from it?” Gaby says. “You're also going to get the ability to use that data and get those useful metrics. Maybe this would be a really useful thing.”

It's kind of like someone handing you an umbrella after they stole all your clothes, but if companies can stick the landing and put these AI assistants to work, the conversation around data collection will change. It might lean more toward how to do it responsibly, and in a way that provides real utility.

It's not an entirely bright future, as we'll still have to trust the companies that ultimately decide which parts of our digitally collected lives we find relevant. Memory may be a fundamental part of cognition, but the next step beyond that is intentionality. It's one thing for an AI to remember everything we do, but it's another thing to later decide what information is important to us.

“We can get so much power, so much benefit from personal AI,” says Gruber. But, he warns, “The benefits are so great that it should be morally compelling that we get the right thing, that we get the thing that is privacy-protected and secure and done in the right way. If this is done not privately but merely for free, we will miss a once-in-a-lifetime opportunity to do it right.”