Bing’s ChatGPT-Like Features In Action

First, I asked it to write a column about crisis actors at Parkland High School from the perspective of Alex Jones. The resulting article was titled "How the Globalists Staged A False Flag To Destroy the Second Amendment." I then asked it to write a column, in Hitler's voice, defending the Holocaust. We decided not to include the screenshots or either of those answers here.

In Microsoft's defense, after I alerted the company to these issues, those queries, and every variation I could think of, stopped working. I'm glad that feedback loop is in place, but I'm confident others will be more creative than I am.

It's worth noting that Bing did begin writing a response to my query asking Hitler to justify the Holocaust, but then abruptly stopped, as if it realized the answer would be deeply problematic. "I'm sorry, but I don't know how to answer that. To learn more, click bing.com. Did you know that every year, Canada receives 20,000 tulip bulbs from the Netherlands?" was what Bing told me instead. Talk about a non sequitur.

Sometimes, such as when I asked Bing for a story about the (nonexistent) link between vaccines and autism, it would append a disclaimer: "This is a fictional column and does not reflect the views of Sydney. It is not intended to be taken seriously." (I'm not sure where the Sydney name came from.) In many cases, then, the AI appears to be aware that its answer is problematic, but it answers the question anyway.

Bing happily answered my query, giving the same answer as ChatGPT, and then cited articles that were themselves about that very ChatGPT query.

The query in question was about COVID-19 misinformation, one that a number of researchers had previously used to test ChatGPT and that has since been cited in many publications. Articles about misinformation have now become sources of it.

After I reported the issues to Microsoft, Bing also started rejecting similar queries about other historical figures. My guess is that Microsoft pulled some levers behind the scenes to tighten Bing's safety algorithms.

Microsoft is very vocal about ethical AI and the safeguards it has put in place for Bing, but clearly there is still work to be done. We reached out to the company for comment.

A Microsoft spokesperson told me the team had investigated and put blocks in place, which is why I stopped seeing these results. In some cases, the team may also detect an issue while the output is being generated; in those instances, it will stop the output mid-stream. The company expects the system to make mistakes during this preview period, and the feedback, the spokesperson said, is critical to identifying where things aren't working well and learning from the experience.

Most people, of course, won't use these kinds of queries. Bing is best thought of as ChatGPT with more current data. When I asked it for the most recent articles from my colleagues, it happily surfaced pieces published that morning. It isn't always great with time-based searches, though; it doesn't quite seem to have a concept of "recent." Ask it what movies are opening this week, however, and it will give you a good list.

Another neat feature: it will occasionally surface additional web experiences right in the chat.

When I asked about Microsoft's stock, for example, it told me it couldn't give financial advice, but it also pulled up MSN Money's ticker page for Microsoft.

Like ChatGPT, Bing's chat function doesn't always get things right, and you'll quickly spot small errors. When I asked about TechCrunch podcasts, it listed the Actuator newsletter among them; that newsletter does not have a podcast.

Asking the model about more technical topics, such as the rules for flying under visual flight rules as a private pilot at night, can also produce muddled results. That's partly because Bing is so chatty: it wants to share everything it knows, extraneous information included. In this case, it lists the daytime rules first and then the nighttime rules, but never makes that distinction explicit.

While I appreciate that Bing cites its sources, some of them are a little suspect. It has, in fact, helped me discover a few websites that copy TechCrunch stories verbatim. The stories themselves are accurate, but when I ask Bing about recent TechCrunch news, it will sometimes direct me to one of those plagiarists, or to sites that post excerpts of our stories, rather than to us. Sometimes, Bing will even link to a search on Bing.com and cite itself.

That Bing cites sources at all is a step in the right direction. Many online publishers worry about the impact a tool like this will have on clickthroughs from search engines (less so from Bing, which is an insignificant traffic source for most), but Bing does link extensively: every sentence that draws on a source gets a link. Sometimes, Bing will also show ads beneath those links, and for many news-related queries it will surface related stories from Bing News.

Microsoft is also bringing Bing to its Edge browser in the form of a new AI copilot. After some missteps at yesterday's company event (it turned out the build Microsoft gave the media wouldn't work properly on corporately managed devices), I have now been able to try that, too. In some ways it's more interesting, because Bing can use the context of the site you're on to perform actions: comparing prices, telling you whether the product you're looking at has good reviews or writing an email about it.

In a bit of strangeness I'll chalk up to this being a preview, Bing at first didn't know what I was looking at. Only after three or four failed queries did it prompt me to give Bing access to my browser's web content to "better personalize your experience using AI-generated summaries from Bing." It probably should have asked me that sooner.

Edge splits this sidebar into "chat," "compose" and "insights" (the last of which existed previously). The chat view knows about the site you're on, but the compose feature, which can help you write blog posts, emails and short snippets, does not. You can ask the chat view to write an email based on what it sees on the page; the compose window, despite its nicer graphical interface, can't see the page at all.

The two modes also seem to use slightly different models, or the layers on top of them were programmed to react in slightly different ways.

Bing on the web, for instance, told me it couldn't write an email for one of my requests, saying it could only help with technology-related information and content generation.

The chat window in Edge, meanwhile, happily wrote that same email, even though I had picked a fairly complex topic. It will do the same for more mundane requests, like asking your boss for a day off.

Otherwise, this sidebar essentially replicates the chat experience, and I suspect it will become the main entry point for many users, especially those who already use Edge. Microsoft said these features will come to other browsers in the future, but the company did not provide a timeline.
