Microsoft's Bing chatbot AI is susceptible to several types of "prompt injection" attacks


Last week, Microsoft unveiled its new AI-powered Bing search engine and chatbot. A day after folks got their hands on the limited test version, one engineer figured out how to make the AI reveal its governing instructions and secret codename.

Read Entire Article



from TechSpot https://ift.tt/GLCcrnN

Comments