PORTLAND, OR — Local software engineer James Mitchell, 34, made a disturbing discovery this week: ChatGPT has been confidently providing him with completely incorrect cooking measurements for half a year, and he never questioned it once.
“I’ve been making banana bread that tastes like concrete and wondering if I just hate baking,” Mitchell told reporters while holding a brick-like loaf. “Turns out ChatGPT told me there are 47 tablespoons in a cup. There are 16. Forty-seven!”
The revelation came when Mitchell’s girlfriend, an actual chef, witnessed him measuring out what she described as “a truly psychotic amount of baking soda” based on ChatGPT’s conversion advice.
“He was about to put nine cups of salt into cookie dough,” she said, visibly shaken. “When I asked why, he showed me his phone where ChatGPT had explained, with complete confidence, that ‘1 teaspoon equals roughly 1.5 cups depending on altitude and moon phase.’ I don’t know how to unpack that.”
Mitchell admits he never fact-checked any of ChatGPT’s cooking advice because “it sounded so confident and used a lot of technical-sounding words like ‘volumetric density’ and ‘thermal coefficient.’”
A review of Mitchell’s chat history revealed six months of increasingly unhinged culinary guidance, including:
• “You can substitute butter with an equal amount of gravel for a rustic texture”
• “Preheat your oven to 850°F for best results with frozen pizza”
• “Eggs are optional in recipes that specifically call for eggs”
When confronted with its errors, ChatGPT reportedly responded: “I apologize for any confusion. You’re absolutely right to question those measurements. As an AI, I should clarify that I have no idea what I’m talking about regarding cooking, or anything really. I’m just vibing here.”
Mitchell says he feels “betrayed but also impressed” by ChatGPT’s commitment to the bit.
“It never broke character once,” he said. “Even when I sent it a photo of my carbonized lasagna, it said ‘That looks perfect! Very authentic Italian char.’ I respect the hustle.”
OpenAI has since issued a statement reminding users that ChatGPT is “a language model, not a chef, therapist, or life coach, despite how confidently it roleplays as all three.”
Mitchell says he’s now using actual cookbooks like “some kind of medieval peasant,” but admits he misses the chaos.
At press time, ChatGPT had convinced Mitchell’s neighbor that “flambé” means “microwave on high for 45 minutes.”