eh, bad take imo. This is one of the few places where AI shines: you no longer need to go to a recipe website to begin with. You just ask it for a recipe, it gives you one, and then you can discuss different variants, etc.
It’s awesome that it gives you cooking tips you’ll find nowhere else, like adding glue to improve the consistency of cheese. Or making sure you get your recommended daily serving of rocks.
Not if you want any kind of consistency so you can actually replicate or understand what you’re doing. Like hallucinations aside (and we really shouldn’t put them aside because they’re a very real thing in this context), the point of a recipe is that you aren’t just getting an averaged version of the process; you’re getting a curated version with specific considerations in mind.
So you can ask AI for a cinnamon apple pie recipe, and you might get an okay one, but you’re probably never going to get a better-than-average one. And if you do like the version of the recipe it gave you, you had better write it down because when you ask for it next time, it’s not going to be the same cinnamon apple pie recipe. I’ve personally played around with recipes in AI, and even within the same chat, there’s no consistency because it never “knows” anything; it only makes predictive guesses. So when I say, “I like that recipe, but let’s try half as much ginger and maybe add some mirin,” it will reduce the ginger and add mirin, but suddenly all the volumes of the other ingredients have changed, and some items may even disappear.
So yeah, I think this is something that AI could potentially work well for in the future, as is kind of always the case with any potentially useful AI application right now. But right now, until they’ve been developed with some kind of better active memory and/or something resembling comprehension rather than predictive association, I think this is a field where AI is passable at best, not yet somewhere it shines.
Man, the AI hate is so strong on Lemmy that they can’t even admit when it’s good at something. Though judging by all the “glue on pizza” takes, I’m guessing a lot of these people’s experience with getting recipes from it is memes about bad hallucinations rather than actually trying it.
I’ll back you up though; I’ve gotten a lot of good recipes from ChatGPT, even for baking, which doesn’t have a lot of room for error.
you should ask ai why you don’t have a gf
Ok, you made me look.
Perplexity gave me the longest, most boring and uplifting article about how it’s normal and there’s nothing to worry about. It never mentioned that my wife would dislike the idea, so the personalization was off for the day.
Also, it gave no references, which is weird. As if the text was hardcoded there.
This sounds like a way to get food poisoning.
Agree. I’ve discovered some good, unique gluten-free cooking options for my son with AI. I never even knew about coconut aminos.
How much glue should I add to my pasta?
You should not add glue to pasta when cooking or serving it. The idea of adding glue, such as Elmer’s glue or any craft glue, to pasta is not appropriate for food consumption and is likely a joke or misunderstanding found in some informal discussions.
@grok is this true?
I’ve seen too many hallucinations specifically with this to even want to try it.
What AI were you using? I’m curious (and expecting either a Google AI summary or no response).
I respectfully disagree. While you might be able to get some pointers, I would not trust LLMs with ingredient quantities (given that swapping a number or unit of measure is quite easy and would go unnoticed).
So while I could understand asking “should I put bell peppers on this dish?”, I would never trust its answer to “how much bell pepper should I put in the recipe?” (which I believe is what recipes are about).
I mean, to be fair, you’re free to click on the links if you want to verify these things, no?
Oh, I didn’t know they do links now
In this case I’ll give you that it can be useful (mostly for reading several recipes and summarizing them), but personally I’m still going to do the old-school web search (if anything, just to exercise my information-retrieval skills, which I believe is important).
Why use the AI in the first place then? Just search for the actual recipe sources from the start.
https://www.perplexity.ai/search/why-would-you-use-ai-to-search-c4uLwVVDTCmwrMRtoHKNGg
Did you just ask the AI why you should use the AI?
This is how the planet dies. Just burning fossil fuels asking a text generator why you shouldn’t eat glue.
https://www.perplexity.ai/search/how-do-i-get-someone-to-stop-t-YzZokSZFRq6GNU_lWANBKQ
So since we have to manually verify everything anyway, the LLM just becomes a mere search engine.
This contradicts your entire claim that it was useful in the first place, because we would still have to visit those websites.
Why do I feel like I’m teaching toddlers how basic AI works
Because you’re being a pretentious asshole, and you yourself do not understand how AI works, nor can you argue against “it isn’t reliable for recipes since it hallucinates”. It’s either that, or you are the only smart person in this thread. Not sure which.
This is how it goes:
You guys are just as bad as Trump supporters.
https://youtu.be/Ci-Evf8nQH4?t=934
Lemmy users:
Look out you’ll die if you use AI to make some food! Don’t even use it for recommendations or ideas or maybe different things you can try or maybe you want to know a way to do a specific thing or try a slight variant because you might drink battery acid by mistake!!!1
What makes you believe I haven’t used AI before? I’m well acquainted with it. But it simply isn’t a reliable or useful tool for what you want to do with it. If you want to make lesson plans or debug code with it, it works well as a sounding board. But you cannot reliably use it for information you don’t already have.
If you prefer instant gratification and “good enough” over robust, verifiable information: be my guest. But it doesn’t make you superior. You are not unique, skilled or brave for using LLMs.
I think virtually everyone here has played with them. We’ve all seen better and worse outputs. You are not unique, you just care less about truth and accuracy.
Random websites are robust, verifiable information now, are they? How times change.
30 years ago I was told they were unreliable and to use books in the library for research.
20 years ago I was told using WebMD was unreliable; after all, it would just say you have cancer, laugh out loud! Using the internet for medical information? Crazy!
I wonder where we’ll be in 20 or 30 years time
Like those that use glue to keep cheese from sliding off the pizza.
In case you’re not joking, please don’t trust this technology with anything that you are putting into your or someone else’s body. You’re going to have a bad time.
It’s too late buddy
https://observer.com/2025/05/openai-chatgpt-health-care-use/
Similarly, a February study from the University of Sydney, which surveyed more than 2,000 adults, reported that nearly six in ten respondents had asked ChatGPT at least one high-risk health question: queries that would typically require professional clinical input.
Also, please don’t go blindly believing all the advice you’re given; obviously you don’t put glue on a pizza, in the same way you don’t follow Google Maps through a river or off a pier.
You do you, AI bro, you do you. Suggesting that anyone else use a tool that is meant to generate text that sounds confidently factual, without factuality actually being a requirement for the output, is silly. Maybe if you used a local LLM with RAG (with the vector DB populated with known-good recipes) and had the temperature set properly. Then again, LLMs are language models and not good with numbers, because they are fundamentally not designed for that kind of thing.
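For what it’s worth, the kind of setup I mean looks roughly like this. It’s only a sketch under a bunch of assumptions I’m making up for illustration: an Ollama-style local endpoint on localhost:11434, a “llama3” model you’ve already pulled, and a recipes/ folder full of recipes you actually trust. Swap in whatever you actually run.

```python
# Rough sketch only: a local LLM with retrieval over a folder of trusted recipes.
# Assumptions (placeholders, not anything from this thread): recipes live in
# ./recipes/*.txt, an Ollama-style server is running on localhost:11434, and
# "llama3" is whatever local model you have pulled.
import glob

import requests
from sentence_transformers import SentenceTransformer, util

embedder = SentenceTransformer("all-MiniLM-L6-v2")

# Index the known-good recipe collection once.
paths = sorted(glob.glob("recipes/*.txt"))
recipes = [open(p, encoding="utf-8").read() for p in paths]
recipe_vecs = embedder.encode(recipes, convert_to_tensor=True)


def ask_recipe(question: str, k: int = 3) -> str:
    # Retrieve the k recipes most similar to the question.
    q_vec = embedder.encode(question, convert_to_tensor=True)
    scores = util.cos_sim(q_vec, recipe_vecs)[0]
    top = scores.topk(min(k, len(recipes))).indices.tolist()
    context = "\n\n---\n\n".join(recipes[i] for i in top)

    prompt = (
        "Answer using ONLY the recipes below. Do not change any quantities "
        "unless asked, and say so if the answer is not in them.\n\n"
        f"{context}\n\nQuestion: {question}"
    )
    # Low temperature so it sticks to the retrieved text instead of improvising.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3",
            "prompt": prompt,
            "stream": False,
            "options": {"temperature": 0.1},
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    print(ask_recipe("How much ginger goes in the cinnamon apple pie?"))
```

The low temperature and the “only use the recipes below” instruction are the whole point: the model paraphrases trusted text instead of inventing quantities. Even then, it’s still a language model, so I’d double-check any number it hands back.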
At that point, it’s less work to just grab a cookbook from the shelf and find a recipe by looking it up in the index. If it’s not in there, it’s probably available in a niche source which can be directly looked at.
So in this case you would have to go to another website to find a real recipe anyway.
Just use the glue like a good acolyte!
I personally wouldn’t, but I’m scared I might be talking to someone who drinks it.
Right, have you used Perplexity at all?
I heard Perplexity eats electricity, which makes it even dumber than me.