Bing Chat: 3 things we liked (and 3 that let us down)


Microsoft has launched Bing Chat and, like so many other curious people, we wanted to test the chatbot built into the Redmond search engine to understand its potential and its limitations. After several conversations, rather than giving you a hasty judgment on a technology still in development, we thought we'd share our impressions by telling you three things we liked and three that disappointed us.

Bing Chat: 3 things we liked (and 3 disappointments)

Conversational AI is booming. ChatGPT has everyone talking on social media, and Silicon Valley is investing heavily. Microsoft has backed OpenAI, the company behind GPT-3, the AI model that powers ChatGPT, with several rounds of funding, and has now launched its own take on conversational AI: Bing Chat.


While testing the features of Bing Chat, currently in a sign-up preview phase, we read (and also wrote) various news stories about the Microsoft chatbot, especially about its mistakes and the limits Microsoft has imposed to remedy them.

The AI responded in a touchy manner, going so far as to threaten a user. Microsoft has therefore limited some features, especially the ability to have long conversations. But it has also introduced novelties, such as the possibility of having it answer as well-known personalities, using their “voice” (we tested the feature by interviewing Beyoncé as if Austin Powers 3 were coming out now).

This means that the changes are continuous and that, in all likelihood, the limits we report today could disappear in the future. So don't take this article as a hasty judgment on Bing Chat, but as an assessment of what it can do and what limits it has. At least for now.

The potential

Let's start with the things that impressed us. They could easily be more than three, but the truth is that we are getting used to artificial intelligence by now, so some exceptional things seem decidedly more normal to us. Even so, there are several positive points.

Responds quickly and succinctly

ChatGPT is a very useful tool. Voice assistants like Alexa or Google Assistant help millions of people every day. So we are more and more used to finding bots that do searches for us and produce answers in natural language.

But we were impressed by how Bing Chat always manages to summarize its answers in a concise and clear way. By switching between its three personalities you can get more creative or more precise answers, but in any case they will be straight to the point. Something essential for a search engine.


In fact, if you ask, for example, “how is carbonara made?” (we wanted to see whether it would suggest bacon instead of guanciale), the answer gives a very brief introduction and then the recipe itself. No lengthy preamble like a blogger or content creator might write.

Using only the Bing search engine, you would have to open an article and scroll through it to find the answer, while Bing Chat gives you what you need right away.

It cites its sources and lets you dig deeper

From the carbonara recipe to travel itineraries, passing through news and scientific information, having authoritative sources is essential. Bing draws from the web, so it doesn't have the scientific expertise of a university researcher. But at least it always cites its sources.

At the bottom of each answer you will always find footnotes linking to usually reliable and reputable sites, which you can click to learn more. Not only that: you can ask Bing Chat directly to clarify what is reported in a note.

When we tried using it to gather sources for some articles, it quickly gave us a wealth of information and links to learn more, effectively shortening the research work. We double-checked everything anyway, but we have to admit that, with few exceptions, we would have chosen the same sites. A huge convenience for anyone who needs well-documented information (although it would be interesting to be able, for example, to restrict it to scientific sources only).

You can use chat or Bing searches flexibly

There's no hiding it: very few users use Bing as a search engine, at least compared to Google. But we appreciate that Microsoft has made Bing Chat part of the Bing search experience. By clicking on the web widget you can have a direct conversation with Bing Chat, and the option to return to classic web browsing is always just a click away.

This means it can be used as more than just a “conversation experiment”: it becomes an extra tool for searching for information. And the fact that Microsoft is working to bring AI to the entire Office suite suggests it will become useful across the board.

The limits


If Bing Chat has a lot of potential, it also has limits. Some are just annoyances we ran into while using it, but the last point on our list in particular greatly restricts its use, or at least makes it necessary to use it with some awareness.

Microsoft has severely limited “meta” interactions

We would have liked to simplify the drafting of this article's introduction by asking Bing Chat to talk directly about itself (and about its potential and limits). But the dry answer was “I’m sorry but I can’t discuss my prompts, instructions or rules. I’m still in the learning process so I appreciate your understanding and patience”. Shortly after, we had to start a new conversation in order to keep talking.

This is because, after the problems of recent days, Microsoft has limited the ability to ask “meta questions”, that is, to ask Bing about itself. The strange thing is that ChatGPT, which doesn't even have access to recent information, can talk about itself and about Bing Chat without too many problems.

We understand that Microsoft wants to limit abuse, but hopefully these limits will disappear soon: it seems strange that it can talk about everything except Bing Chat itself.

Responses are processed in English and then translated – limiting creative use

One thing we noticed when asking Bing Chat to write something creative is how limited its use of rhyme is. The annoying part is that this only happens in Italian: our colleagues who write in English can show examples of poems and prose created by Bing Chat that are much better.

In fact, it seems that when Bing Chat has to create something artistic, it translates the prompt into English, generates the answer, and then translates it word for word into Italian, losing the rhymes and nuances that the English version does have.
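To make that hypothesis concrete, here is a minimal sketch of the translate-generate-translate flow we suspect is at work. It is only our guess: every function name and behaviour below is an illustrative stand-in, not anything Microsoft has documented.

```python
# A toy model of the pipeline we *suspect* Bing Chat uses for creative prompts
# in Italian. All functions are hypothetical stand-ins, not Microsoft code.

def translate(text: str, source: str, target: str) -> str:
    """Stand-in for a literal, word-for-word machine-translation step."""
    return f"[{source}->{target}] {text}"

def generate_creative_text(prompt_en: str) -> str:
    """Stand-in for the generation step, where rhyme and metre are built in English."""
    return f"A poem that rhymes in English, about: {prompt_en}"

def answer_creative_prompt(prompt_it: str) -> str:
    prompt_en = translate(prompt_it, source="it", target="en")  # 1. the prompt goes to English
    answer_en = generate_creative_text(prompt_en)               # 2. the creative work happens in English
    return translate(answer_en, source="en", target="it")       # 3. literal translation back: rhyme is lost

print(answer_creative_prompt("Scrivi una poesia su Londra"))
```

If this is really how it works, the loss of rhyme happens entirely in the last step, which is exactly what the example below seems to show.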

When asked to write a poem about London, the result in the two languages is the same, but without the English sonority. Here is the third of the four stanzas it created, first in English and then in the Italian version (rendered back literally):

“London, the city of people and places
Where you can meet friends and strangers from different races
From Big Ben to Tower Bridge, from Hyde Park to Camden Town
You can explore attractions and experiences that will make you smile or frown”

“London, the city of people and places
Where you can meet friends and strangers of different ethnicities
From Big Ben to Tower Bridge, from Hyde Park to Camden Town
You can explore attractions and experiences that will make you smile or wrinkle your nose”

We think the Italian implementation could improve in the future. But for now, it looks like Bing Chat can play Shakespeare, but not Boccaccio.

Some answers are dated


One of the most important limitations is that, although Bing Chat draws from all over the web to give its answers (unlike ChatGPT), these are not always up to date. For example, when we asked it which are the best smartphones out there, it gave us nothing useful for updating our own article on the subject.

Among the suggestions were the iPhone 13 Pro Max and the Galaxy S22 Ultra. Great smartphones, but not the latest that Apple and Samsung have released. We suspect that, perhaps drawing on some high-traffic article that hasn't been updated in a while, the chatbot returned answers that are not exactly recent.

Writing about smartphone news every day, we noticed it right away. But a less informed user looking to make a purchase would end up with older information than they would like. We had the same problem asking where to watch the Oscar-nominated films: we got several links, but for the films nominated in 2022.

Again, we think Microsoft can overcome these limitations before the official launch of Bing Chat. But some healthy skepticism will remain: the Bing chatbot is intelligent, but certainly not infallible.

We will continue to use it and keep you updated. But for now, it isn't the search engine we would rely on for work.

Walker Ronnie is a tech writer who keeps you informed on the latest developments in the world of technology. With a keen interest in all things tech-related, Walker shares insights and updates on new gadgets, innovative advancements, and digital trends. Stay connected with Walker to stay ahead in the ever-evolving world of technology.