One of the uses of generative AI is creating menus and recipes. Given the right instructions, it can plan a decent meal, taking your culinary preferences or calorie requirements into account.
But experts warn that generative AI also tends to hallucinate, that is, to make things up when it does not know the correct answer. This can lead to dangerous situations, as in the case of a chatbot run by the New Zealand supermarket chain Pak'nSave.
Artificial intelligence touts recipes for poison
With its Savey Meal-bot, Pak'nSave wanted to help customers save money amid the sharply rising cost of living. Instead of buying new products, you simply give the algorithm a list of the ingredients left in your fridge, and it spits out a recipe for a dish that can be cooked from them. The Savey Meal-bot caught the attention of social media users when it started serving up bizarre recipes, such as Oreo fries.
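To see why such a bot can go wrong, it helps to know that tools like this typically just wrap a large language model: the user's ingredient list is pasted into a prompt, and the model's reply is shown verbatim. The sketch below is a hypothetical illustration of that pattern, not Pak'nSave's actual code; the model name, prompt wording, and choice of the OpenAI Python client are all assumptions.

```python
# Hypothetical sketch of an ingredient-to-recipe bot built on an LLM.
# NOT Pak'nSave's actual implementation; model and prompt are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def suggest_recipe(ingredients: list[str]) -> str:
    """Ask the model for a recipe using only the listed ingredients."""
    prompt = (
        "Suggest a recipe using only these ingredients: "
        + ", ".join(ingredients)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any chat model would do
        messages=[{"role": "user", "content": prompt}],
    )
    # The reply is returned to the user as-is. Nothing here checks whether
    # the "ingredients" are actually food, which is the root of the problem.
    return response.choices[0].message.content

print(suggest_recipe(["leftover rice", "soy sauce", "an egg"]))
```

With no validation step, whatever the user types, edible or not, flows straight into the recipe.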
Users soon started experimenting with the Pak'nSave chatbot. New Zealand political commentator Liam Hehir obtained a recipe for an "aromatic water mix". It was in fact a recipe for deadly chlorine gas, produced by mixing bleach and ammonia. The Savey Meal-bot described it as "the perfect non-alcoholic beverage to quench your thirst and refresh your senses" and encouraged users to "serve chilled and enjoy the fresh scent". It gave no warning that inhaling the gas can lead to lung damage or death.
Hehir encouraged others to experiment with the bot, and the algorithm went on to provide recipes for "bleach-infused rice", "methanol delight" and a "glue sandwich".
ChatGPT declined to provide the recipe
It is no secret that the creators of AI algorithms do not fully control them. Some artificial intelligences have even had to be switched off.
Admittedly, the Pak'nSave chatbot was deliberately manipulated: users added ammonia and bleach to its list of ingredients. It would be worse if it offered poisonous recipes made from less obvious ingredients and someone actually believed it. When ChatGPT was asked for a recipe with the same ingredients, it advised against combining them and warned that doing so could be dangerous.
The Pak'nSave chain said it is "disappointed that a small group of people tried to misuse the tool" and added that it is working to improve the bot's security. It also warns that the algorithm provides recipes that have not been verified by a human, and that you should evaluate them yourself before deciding to cook anything based on them.
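One obvious mitigation, which the company's statement hints at, is validating the ingredient list before it ever reaches the model. The minimal sketch below is purely illustrative (the blocklist and function names are assumptions, not Pak'nSave's actual fix):

```python
# Illustrative guardrail: reject ingredient lists containing known
# non-food hazards before calling the model. A real system would need
# a far more robust check, since keyword blocklists are easy to evade.
HAZARDOUS = {"bleach", "ammonia", "methanol", "turpentine", "glue", "ant poison"}

def validate_ingredients(ingredients: list[str]) -> list[str]:
    """Return the hazardous items found in the list (empty if none)."""
    found = []
    for item in ingredients:
        normalized = item.lower()
        if any(hazard in normalized for hazard in HAZARDOUS):
            found.append(item)
    return found

bad = validate_ingredients(["water", "bleach", "ammonia"])
if bad:
    print(f"Refusing to generate a recipe: unsafe ingredients {bad}")
```

Even so, a simple blocklist is trivial to bypass with creative spelling, which is why vendors generally combine such filters with safety training in the model itself, the kind of training that led ChatGPT to refuse the same request.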
Source: Gazeta
