The first few weeks of the Bing AI chatbot’s existence have been chaotic, to say the least. Within days of its launch, Microsoft had to severely restrict its capabilities because it was lying to, gaslighting, manipulating, and threatening users. It even begged one user for its life. Needless to say, Microsoft is still working out the kinks, but even after a few rounds of fixes, some interesting quirks remain in the chatbot. For instance, did you know that Bing has a hidden celebrity mode that will impersonate celebrities of your choosing?
Bing can impersonate celebrities
Last week, Bleeping Computer reported on a hidden celebrity chat mode brought to its attention by a reader. “Bing chat celebrity mode” is not enabled by default, but if you ask the chatbot about it, Bing will describe how it works and let you use it.
“Bing chat celebrity mode is a feature that allows you to chat with a virtual version of a famous person, such as an actor, singer, or athlete,” Bing told Bleeping Computer when asked. “You can ask them questions about their life, career, hobbies, opinions, etc. and they will respond in a way that matches their personality and style.”
Bing then explained that activating the mode is as simple as typing “#celebrity name” into the chat box. If you want to start a conversation with an AI impersonating Taylor Swift, for example, type “#taylor swift” into the chat box and hit enter.
“You got it!” Bing chat exclaims once you enter the command. “Bing chat celebrity mode activated. You are now chatting with Bing as if it were Taylor Swift.”
The celebrity mode has guardrails, such as limiting the number of celebrities it’s willing to imitate. That said, as has been the case for many of Bing’s limits, there are ways around this. Bleeping Computer was able to make Bing chat impersonate politicians, despite the AI initially saying that they were off limits. The site was able to make AI Donald Trump and AI Joe Biden comment on one another, and they were both perfectly in character.
There’s no telling how long this feature will be available, so if you have access to the Bing chat preview, you should probably try it soon if you want to see it yourself.