In a massive boost to ChatGPT’s capabilities, OpenAI introduced several plugins for the chatbot this week. These allow ChatGPT to draw on up-to-date information instead of being restricted to its training data. However, the release also comes with concerns about the ways the new plugins might misfire.
Even OpenAI, the company behind ChatGPT, is concerned about things potentially going wrong. When announcing the launch of the ChatGPT plugins, the company began with reassurance.
Plugins help get around the limitations of training data
The main limitation of large language models (LLMs) is that their knowledge ends where their training data does: they cannot accurately answer queries about anything outside that data.
OpenAI raised this issue in its announcement of the plugins, noting that information drawn from training data can be out of date and cannot be tailored to an individual user.
This is where the plugins come into play. They connect ChatGPT to third-party applications and services, giving the LLM access to recent, personal, and specific data.
More specifically, they give ChatGPT access to up-to-date business information from the respective applications and services.
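For context, a ChatGPT plugin is essentially a small web service paired with a manifest that tells the model what the service can do. The sketch below is a minimal, hypothetical example in that spirit, written in Python with Flask: the manifest field names mirror the format OpenAI documented for plugins at launch, but the shop, endpoint, and inventory data are invented purely for illustration.

```python
# Minimal sketch of a hypothetical third-party plugin service.
# Real plugins expose a manifest at /.well-known/ai-plugin.json plus an
# OpenAPI spec; everything concrete here (names, data, URLs) is made up.
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical manifest; field names mirror OpenAI's published plugin format,
# but the values are placeholders, not a real registered plugin.
MANIFEST = {
    "schema_version": "v1",
    "name_for_human": "Store Inventory",
    "name_for_model": "store_inventory",
    "description_for_human": "Check what is in stock right now.",
    "description_for_model": "Look up current product availability and prices.",
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "https://example.com/openapi.yaml"},
}

@app.route("/.well-known/ai-plugin.json")
def manifest():
    # ChatGPT fetches this to learn what the plugin offers and how to call it.
    return jsonify(MANIFEST)

@app.route("/products")
def products():
    # Live business data the model could never get from its training set.
    return jsonify([
        {"name": "Espresso beans", "in_stock": True, "price_usd": 14.50},
        {"name": "Pour-over kettle", "in_stock": False, "price_usd": 42.00},
    ])

if __name__ == "__main__":
    app.run(port=5000)
```

The key point is that the data behind /products lives in the business’s own systems and can change by the minute, which is exactly the kind of information a model frozen at training time cannot provide on its own.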
Several companies have already begun testing plugins in the new ChatGPT ecosystem, including Shopify, Zapier, Wolfram, Instacart, OpenTable, Slack, and FiscalNote. More are likely to join the list as the ecosystem expands.
Plugins make ChatGPT more useful and practical
The rollout of these first plugins gives ChatGPT far more potential for practical use. OpenAI particularly emphasized the safety aspect, mentioning it several times in the post. The company had already run a plugin-style scenario during GPT-4’s safety testing, in which the LLM tried to convince a TaskRabbit worker to solve a CAPTCHA on its behalf.
Further explaining how the new plugins could be utilized, OpenAI provided an example with a relatively complex prompt. It asked ChatGPT to suggest a good vegan restaurant for Saturday and a simple recipe for Sunday, calculate the calories of the recipe using Wolfram Alpha, and then order the ingredients via Instacart.
Besides the two plugins named in the prompt, ChatGPT could call on the OpenTable plugin (a restaurant reservation service) to come up with the restaurant suggestion; the source of the recipe was not disclosed. With access to all the necessary information, the LLM could piece the results together and carry out the instructions in the prompt.
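To make that chaining a bit more concrete, here is a rough Python sketch of the kind of orchestration the example implies. Every function below is a hypothetical stub standing in for a plugin call (or the undisclosed recipe source); none of the names or return shapes correspond to real APIs.

```python
# Conceptual sketch of the compound query: each function is a stand-in for a
# plugin call, with hard-coded placeholder results instead of real services.

def suggest_restaurant(cuisine: str, day: str) -> str:
    """Stand-in for an OpenTable-style reservation plugin."""
    return "A well-reviewed vegan restaurant, Saturday 7:30 pm"

def find_recipe(dietary: str) -> dict:
    """Stand-in for whatever source supplies the recipe."""
    return {"title": "Chickpea curry",
            "ingredients": ["chickpeas", "coconut milk", "spinach", "curry paste"]}

def calorie_count(ingredients: list) -> int:
    """Stand-in for a Wolfram Alpha-style calculation plugin."""
    return 650  # placeholder figure, not a real calculation

def order_ingredients(ingredients: list) -> str:
    """Stand-in for an Instacart-style ordering plugin."""
    return f"Order placed for {len(ingredients)} items"

# The model's job is to pick these calls, sequence them, and stitch the
# results back into a single conversational answer.
reservation = suggest_restaurant("vegan", "Saturday")
recipe = find_recipe("vegan")
calories = calorie_count(recipe["ingredients"])
confirmation = order_ingredients(recipe["ingredients"])
print(reservation, "|", recipe["title"], f"({calories} kcal)", "|", confirmation)
```

ChatGPT’s real contribution is the part this sketch hard-codes: deciding which plugins to invoke, in what order, and with what arguments, based only on the natural-language prompt.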
OpenAI has undoubtedly taken a much more user-friendly approach by releasing the plugins. Even users who aren’t very tech-savvy can now use the language model to carry out compound queries.
Now that ChatGPT can formulate responses using information from third-party sites, it could potentially replace search engines.
This also raises questions about how relevant SEO will remain and what it would look like in the future. For now, we can only wait and watch for the changes that follow the rapid evolution of LLMs.