First, they are programmed for specific tasks. (Examples created by OpenAI include “Creative Writing Coach” and “Mocktail Mixologist,” a bot that suggests recipes for nonalcoholic drinks.) Second, bots can mine private data, such as a company’s internal HR documents or a database of real estate listings, and incorporate that data into their responses. Third, if you allow them, bots can connect to other parts of your online life – your calendar, your to-do list, your Slack account – and take actions using your credentials.
Does this sound scary to you? It does to some AI safety researchers, who worry that giving bots more autonomy could lead to disaster. The Center for AI Safety, a nonprofit research organization, listed autonomous agents among its “catastrophic AI risks” this year, warning that “malicious actors could intentionally create rogue AIs with dangerous goals.”
But there is money to be made in AI assistants that can perform useful tasks for people, and corporate clients are eager to train chatbots on their own data. There is also an argument that AI won’t be truly useful until it truly knows its users – their communication styles, their likes and dislikes, what they watch and buy online.
So here we are, speeding into the era of the autonomous AI agent – doomsayers be damned.
To be fair, OpenAI’s bots are not particularly dangerous. I got a demo of several GPTs at the company’s developer conference in San Francisco on Monday, and they mostly automated innocuous tasks like creating coloring pages for kids or explaining the rules of card games.
Custom GPTs can’t do much yet beyond searching documents and connecting to common applications. In one demo I saw on Monday, an OpenAI employee asked a GPT to find conflicting meetings on her Google calendar and send a Slack message to her boss. In another, onstage, Sam Altman, OpenAI’s chief executive, created a “startup mentor” chatbot that gave advice to aspiring founders, based on an uploaded transcript of a speech he had given years earlier.