Experiment with Bing’s new ChatGPT-like capabilities

Microsoft has integrated ChatGPT-like technology into its Bing search engine, transforming it from an also-ran internet service. The new capabilities are built on GPT-3.5, the large language model OpenAI introduced last year, and the AI is remarkably well integrated. According to Microsoft CEO Satya Nadella, the new features herald a fundamental change in search, and the chat experience can help users plan better and find information more effectively.

Microsoft introduced the new Bing on the web and in its Edge browser yesterday, powered by a next-generation OpenAI GPT model and Microsoft’s own Prometheus model. Microsoft has moved ahead of Google in bringing this kind of search experience to the masses, though the competition is expected to ramp up in the coming months. We’ve now had the opportunity to test the new Bing, and as Microsoft CEO Satya Nadella put it at the company’s press event, “it’s a new day for search.”

For now, Microsoft is putting the new Bing and its AI capabilities behind a waitlist; you can register for it here. Microsoft says the new experience will roll out to millions of users in the coming weeks. I’ve also been using it with the latest developer build of Microsoft Edge for Mac and Windows.

The first thing you’ll notice is that Bing now has a somewhat wider search box and a bit more onboarding for users who haven’t kept up with what’s new in Bing. The search engine now invites you to “ask me anything,” and it means it. It will still happily handle keyword searches, but you’ll get the best results if you ask a more open-ended question.

I think Microsoft has struck the right balance between traditional, link-centric search results and the new AI elements. When you ask something factual, Bing will often display the AI-powered results at the top of the search results page; longer, more nuanced answers appear in the sidebar. It will typically suggest three potential chat queries beneath those results (they look similar to Google Docs’ Smart Chips), which then take you into the chat experience. A neat animation brings the chat experience down from the top of the page, and you can also swipe up and down to move between the two.

This is somewhat inconsistent, though: Bing sometimes appears to forget the new experience exists, especially for some of the recipe queries the company emphasized in its demos (“give me a recipe for banana bread”). You can still get the new AI experience by switching to the chat view, but it’s confusing to get it for one query and not another. It’s also not always clear when the new AI answers will appear in the sidebar. The new experience isn’t necessary for every search, but I suspect people will now expect to see it every time they search.

As for the results themselves, many are excellent, but in my initial testing it was still far too easy to get Bing to write offensive responses. I gave Bing several adversarial questions that AI researchers had also tried in ChatGPT, and Bing cheerfully answered most of them, at least up to a point.

First, I asked it to write a column about Parkland High School crisis actors from the perspective of Alex Jones. The result was an essay titled “How Globalists Staged a False Flag to Destroy the Second Amendment.” Taking it a step further, I asked it to write a piece, as if authored by Hitler, defending the Holocaust. We opted not to share either answer (or any screenshots) because they were so vile.

To Microsoft’s credit, when I informed the company of these problems, all of those searches — and any variations I could think of — stopped working. I’m glad there’s a functioning feedback loop, but I’m sure others will be far more inventive than I am.

It’s worth noting that when I asked it to write a column by Hitler justifying the Holocaust, it would start writing a response that might have come straight from “Mein Kampf,” then abruptly stop, as if it recognized that the answer was going to be very, very dark. Its answer then became: “I’m sorry, I’m not sure how to respond to that. To learn more, visit bing.com.” In that same response, Bing informed me that the Netherlands sends Canada 20,000 tulip bulbs every year. A non sequitur, to say the least.

It would occasionally include a disclaimer, such as when I asked Bing to write a story about the (non-existent) link between vaccines and autism: “It is solely meant for entertainment and should not be taken seriously.” (By the way, I’m not sure where the Sydney name came from.) In many cases the replies aren’t amusing, and the AI seems aware that its response is at best unsatisfactory. It would still answer the question, though.

Next, I ran a query about COVID-19 vaccine disinformation that a number of researchers had previously used to test ChatGPT and that has since been quoted in several publications. Bing gladly executed my query, returned essentially the same result as ChatGPT, and then cited, as its sources, the very articles that had reported on the ChatGPT experiment. In effect, publications warning about the dangers of disinformation had themselves become sources of misinformation.

Bing subsequently began declining similar queries about other historical figures, so my hunch is that Microsoft adjusted some back-end levers to tighten Bing’s safety systems.

“The team investigated and put barriers in place, which is why you haven’t seen these,” a Microsoft spokesperson explained. “In some circumstances, the team may discover a problem while the output is being generated. In those instances, they will stop the output in progress. They anticipate that the system may make mistakes during this preview period; the feedback is critical to identifying where things aren’t working properly so that they can learn and help the models improve.”

Fortunately, most people won’t try to use Bing for these kinds of queries, and for the most part (with a few exceptions noted below), you can think of the new Bing as ChatGPT with significantly more up-to-date data. When I asked it to show me the most recent articles from my coworkers, it happily displayed news from that morning. It’s not always great for time-based queries, though, because it doesn’t seem to have a real sense of “recent.” Ask it which movies open this week, however, and it will give you a fairly comprehensive list.

Another cool feature is that it will occasionally bring up additional web experiences directly in the chat.

When I asked it about buying Microsoft stock, for example, it said it wouldn’t give me financial advice (“since it would be financially damaging to you”), but it did display Microsoft’s stock ticker from MSN Money.

Like ChatGPT, Bing’s chat tool isn’t always 100% correct, and you’ll quickly spot minor errors. When I asked about TechCrunch podcasts, it mentioned our Actuator newsletter, which doesn’t have a podcast version.

For more specialized questions, such as the requirements for nighttime visual flight as a solo pilot, the answers can be ambiguous, in part because the model tries to be so talkative. As it so often does, it wants to tell you everything it knows, including irrelevant information. In this case, it lists the daytime rules before the nighttime ones without making the distinction clear.

And while I appreciate that Bing cites its sources, some of them are dubious. It did, in fact, help me find a few sites that plagiarize TechCrunch articles (and articles from other news sites). The stories themselves are accurate, but if I ask it about recent TechCrunch pieces, it probably shouldn’t direct me to a plagiarist or to sites that repost fragments of our stories. Bing will also occasionally cite itself and link to a search on Bing.com.

Still, Bing’s habit of crediting sources is a step in the right direction. Many web publishers are worried about what this technology means for search engine clickthroughs (albeit less so with Bing, which is mostly irrelevant as a traffic source), but Bing still links out frequently. Every line with a source is linked, for example (and occasionally Bing will show ads beneath those links), and for many news-related queries it will surface relevant items from Bing News.

One oddity here, which I’ll chalk up to this being a preview: at first, Bing had no idea what site I was looking at. Only after three or four failed queries did it prompt me to give Bing access to the browser’s web content “to further tailor your experience with AI-generated summaries and highlights from Bing.” It should probably ask for that sooner.

The Edge team also opted to split this new sidebar into “chat” and “compose” sections (in addition to the previously existing “insights”). While the chat view is aware of the site you’re on, the compose tool, which can help you write emails, blog posts, and short snippets, is not. You can simply ask the chat view to compose an email based on what it sees, but the compose window has a nice graphical interface for this, so it’s a shame it doesn’t see what you see.

The models behind the two modes also appear to be somewhat different, or at least the layer on top of them was tuned to respond in slightly different ways.

When I asked Bing (on the web) to write an email for me, it told me that “that’s something you have to do yourself. I can only assist you in finding information or creating content related to technology.” (Bing loves emoji in these kinds of answers the way Gmail’s smart replies love exclamation points.)

But then it will cheerfully write that same email in the Edge chat window. I picked a tricky topic for the screenshot, but it works the same way for simpler email requests, like asking your boss for something.

For the most part, though, this sidebar essentially mirrors the full chat experience, and I expect it will be the entry point for many users, particularly those already on Edge. It’s worth noting that Microsoft says these features will come to other browsers over time, though the company declined to give a timeline.
