OpenAI's ChatGPT now offers up-to-date information with direct links to sources for paying subscribers, and the feature will roll out to everyone else soon.
“Today, browsing is available to Plus and Enterprise users, and we’ll shortly make it open to everyone. To activate, choose Browse with Bing under GPT-4 in the picker,” according to OpenAI’s post on X, formerly Twitter.
The bot’s new browsing functionality will also let websites control how ChatGPT can interact with them, according to Reuters.
Earlier this week, the company also unveiled a significant upgrade that lets ChatGPT interact with users through images and voice conversations.
OpenAI’s premium ChatGPT Plus service had earlier tested a feature that let subscribers retrieve current information via the Bing search engine. However, the company disabled it out of concern that users could use it to get around paywalls, according to Reuters.
When ChatGPT reached 100 million monthly active users in January of this year, it became the fastest-growing consumer application in history, a record later surpassed by Meta’s Threads app.
Its rise has driven up investor interest in OpenAI. Media, including Reuters, reported on Tuesday that the startup is talking to shareholders about a possible sale of existing shares at a much higher valuation than a few months ago.
ChatGPT can now browse the internet to provide you with current and authoritative information, complete with direct links to sources. It is no longer limited to data before September 2021.
— OpenAI (@OpenAI) September 27, 2023
OpenAI had included internet browsing in its ChatGPT iOS app in late June but swiftly removed it after users discovered that, by passing a URL directly to the chatbot, they could persuade it to return content that was normally paywalled. Since then, the OpenAI automated crawler that feeds data to the model driving ChatGPT has started identifying itself with a distinct user agent, so that websites can update their robots.txt files to prevent it from crawling them.
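An opt-out along the lines described above is a standard robots.txt rule keyed to the crawler's user-agent token. A minimal sketch, assuming OpenAI's documented token GPTBot for its crawler, placed at the site root (e.g. example.com/robots.txt):

```
# Block OpenAI's crawler (user-agent token "GPTBot", per OpenAI's
# crawler documentation) from the entire site...
User-agent: GPTBot
Disallow: /

# ...while leaving the site open to all other well-behaved crawlers.
User-agent: *
Allow: /
```

A site could also disallow only specific paths (e.g. `Disallow: /premium/`) rather than the whole site; compliance depends on the crawler honoring the Robots Exclusion Protocol.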