Last year, as an experiment, I baked cakes from recipes generated by ChatGPT and Bard. It went better than I expected, largely because much of each recipe didn’t seem to have been created by AI at all.
The recipes I got were eerily similar to ones I’ve seen on food blogs or Instagram. They’d been slightly changed, and only Bard (now called Gemini) bothered to offer attribution, linking to a recipe from Sally’s Baking Addiction. In some ways, this isn’t a new problem; I’ve seen far too many copied non-AI recipes floating around the internet. AI, though, makes the ethics of recipe copying more fraught. Its sheer scale threatens to create a world where human recipe developers are crowded out by semi-randomized, AI-generated competition. There have been hints of that world arriving, but we’re not in it yet.
Even without AI’s involvement, recipe similarities are a very touchy subject in the food world. Some influencers find themselves in a firestorm if another food blogger or cookbook author claims they stole their recipes. The New York Times reported several instances of alleged recipe copying, including a 2021 case in which author Sharon Wee accused British chef Elizabeth Haigh of copying passages and recipes from her cookbook. Then there’s the online hate directed at popular food blogger Half Baked Harvest, who has been accused several times of copying from two other well-known food bloggers.
But food creators have very little legal recourse if they feel someone took their work. A simple list of ingredients and instructions is treated as an idea that can’t be protected by copyright. Many recipes have an element of oral tradition; many are passed down through family members. While some include lengthy personal preambles that are copyrightable, the biggest pushback against “stealing” recipes often comes from the daily snark threads on Foodsnark Reddit, and the recipe world is largely governed by etiquette, not law.
And, as with musical chords, there are only so many ingredients you can put together for a passable recipe. Take a pie crust: how many permutations of sugar, butter, and flour can one write out before someone claims they just repeated the same recipe?
Large language models, like those that power ChatGPT and Gemini, can take those permutations, sift through them faster than any human could, and come up with a pretty solid recipe very quickly. As a result, finding recipes that fit a particular diet is often touted as a good potential use of chatbots. On the other hand, to make an obvious point, AI tools can’t actually prepare or eat food. They don’t really “know” whether a recipe will work, just that it fits the pattern of one that does.
Cookbook author and recipe developer Abi Balingit, who runs the blog The Dusky Kitchen, said she doesn’t think about AI when she creates recipes. But she worries that it may impact where food writers and developers can feature their work.
“There are gradients of what is fine and not. AI isn’t making recipe development worse because there’s no guarantee that what it puts out works well,” Balingit said. “But the nature of media is transient and unstable, so I’m worried that there might be a point where publications might turn to an AI rather than recipe developers or cooks.”
Generative AI still occasionally hallucinates and makes up things that are physically impossible, as many companies have found out the hard way. Grocery delivery platform Instacart partnered with OpenAI, the maker of ChatGPT, for recipe images. The results ranged from hot dogs with the interior of a tomato to a salmon Caesar salad that somehow created a lemon-lettuce hybrid. Proportions were off, too: as The Washington Post pointed out, the steak in one Instacart recipe could easily feed more people than the dish was supposed to serve. BuzzFeed also came out with an AI tool that recommended recipes from its Tasty brand.
Balingit added that people have a certain level of trust when they read someone’s recipe or watch them make a dish. There’s the expertise of having made the food and actually tasted it.
That explained why I instantly felt the need to double-check the recipes from chatbots. AI models can still hallucinate and wildly misjudge how ingredient quantities affect taste. Google’s chatbot, for example, inexplicably doubled the eggs, which made the cake moist but also dense and gummy in a way I didn’t like.
Balingit maintains that one advantage recipe creators have is the human connection. Her cookbook Mayumu is filled with dishes that take an imaginative trip through her parents’ migration to the US from the Philippines, her childhood in California, and her current life in New York. She said AI doesn’t have cultural or nostalgic connections to food and eating, a historically personal thing people share with others.
I felt the same. Though I never truly cared who first devised the idea of poaching chicken in ginger broth, tinola is still my favorite Filipino dish. When I first started learning how to cook it (thanks largely to my dad kicking me out of the kitchen for being a nuisance), I scoured the internet for “authentic” recipes. I chose to follow the work of people whose personal connection to the dish was strong, and it made me want to make the food they wrote so passionately about.
ChatGPT and Bard can generate functional recipes. I know that because I followed them. But I knew, as the person who baked those cakes, that they were dispassionate and generic. My editor, Adi Robertson, compared one to a boxed cake mix, and another reminded me of sad cafeteria cakes. Sure, it hits the spot. Yes, it’s chocolate cake. But cakes can be so much more.
"food recipes" - Google News
March 11, 2024 at 10:46PM
https://ift.tt/KVHkt45
AI-generated recipes won't get you to Flavortown - The Verge
"food recipes" - Google News
https://ift.tt/hL7I1jd
https://ift.tt/2T3blSc
Bagikan Berita Ini
0 Response to "AI-generated recipes won't get you to Flavortown - The Verge"
Post a Comment