Restaurant Calorie Checker
Snap the plate, get calories and macros. Works at any restaurant — chain, indie, takeout, delivery. No menu lookup needed.
How to Estimate Restaurant Calories in 3 Steps
📸 Photo the plate
Shoot top-down and include a utensil or hand for scale. Snap before you dig in.
💬 Send to Nouri
Telegram chat. Optionally add context: "half portion," "no dressing on salad," "extra cheese."
🔢 Get estimate
Calories + protein + carbs + fat, with a note on invisible fat (dressing, cooking oil) where relevant.
Why Restaurant Tracking Breaks Traditional Apps
MyFitnessPal and similar apps depend on a food database. For restaurants, this means: (1) the chain has to have submitted nutrition data, (2) the menu item has to match exactly, (3) portion size has to be standard. All three fail constantly.
- Independent restaurants almost never have database entries — you're guessing.
- International restaurants are badly covered in US apps — their food rarely maps to database items.
- Menu items vary by location — the "Cobb salad" at one Chili's isn't the one someone logged three years ago.
- Portions are whatever the kitchen feels like — crowdsourced entries don't match what's on your plate tonight.
Photo-based tracking sidesteps all of this. If the AI can see it, it can estimate it — regardless of which restaurant, which city, which cuisine.
Real Scenarios Where This Helps
🍝 Date night at a new Italian spot
Restaurant isn't in any database. Menu is descriptive, not quantitative. Photo of the plated pasta gives you a calorie read on the spot.
🍜 Traveling in Asia
Local noodle shops, street food, regional dishes that don't exist in US databases. Photo-based estimation is language-agnostic.
💼 Business lunches
Client lunch, you don't control the order. Photo-estimate without a phone-typing ritual — discreet, instant.
🏠 Takeout and delivery
Container contents often don't match menu descriptions. Photo what's actually in the box.
🥂 Weddings and events
You don't have time to database-hunt during a 4-course reception. Photo each course, sum later.
🍣 Omakase, tasting menus
12 small courses, no menu, no portions posted. Photo each plate — aggregate estimate at the end of the meal.
Photo Technique: 3 Rules for Better Estimates
- Top-down angle. The AI needs to see the full plate perimeter. Side shots lose portion size cues.
- Reference object in frame. A fork, a smartphone, a drink glass. Gives the AI a known size to calibrate from.
- Before you eat. An untouched plate is the cleanest input. Half-eaten meals confuse portion estimation.
Following these reduces typical restaurant-meal error from ±25% to ±15% — still not laboratory-precise, but good enough for steering a weekly calorie average toward a goal.
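The weekly-average claim is worth a quick sanity check: independent per-meal errors partially cancel when you average a week of meals. A minimal simulation sketch — the meal count, calorie value, and uniform error band are illustrative assumptions, not measured data:

```python
import random
import statistics

def photo_estimate(true_kcal, rel_error, rng):
    """One photo estimate: the true value plus a random error within ±rel_error."""
    return true_kcal * (1 + rng.uniform(-rel_error, rel_error))

rng = random.Random(42)
TRUE_KCAL = 650                  # illustrative per-meal truth
MEALS_PER_WEEK = 21
PER_MEAL_ERR = 0.15              # the ±15% per-photo band from above

weekly_errors = []
for _ in range(10_000):          # simulate many weeks
    week = [photo_estimate(TRUE_KCAL, PER_MEAL_ERR, rng)
            for _ in range(MEALS_PER_WEEK)]
    weekly_errors.append(abs(statistics.mean(week) - TRUE_KCAL) / TRUE_KCAL)

avg_weekly_err = statistics.mean(weekly_errors)
# Per-meal errors partially cancel, so the weekly-average error
# lands well inside the ±15% per-meal band.
print(f"mean weekly-average error: {avg_weekly_err:.1%}")
```

Under these assumptions the weekly average drifts only a few percent, which is why photo estimates can steer a weekly total even when any single plate is off.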
What to Add for Better Accuracy
If you already know a detail that would be invisible in the photo, tell Nouri. A reply with "extra dressing on this," "they used a lot of butter," or "only ate half" updates the estimate immediately. These corrections also teach the AI your typical adjustments, so the next time you photo a similar plate, the baseline starts closer to right.
If you eat out frequently, combine this with the AI calorie counter workflow so home meals and restaurant meals use the same interface and accumulate into the same daily total.
When Restaurant Photo Estimation Is Weak
- Soups and stews with hidden ingredients. The AI sees broth; the meat and starch below the surface are inferred from recipe averages.
- Heavily sauced dishes (curries, creamy pastas). Sauce volume is visible, sauce calorie density isn't. Expect wider error.
- Ambiguous portion sizes at chain restaurants where "large" vs "regular" isn't obvious from the photo alone. Mention the size in the message.
- Buffet plates with 8+ items. Each item is small, and photo resolution limits identification. Two photos (one per half of the plate) help.
Frequently Asked Questions
How accurate are restaurant meal estimates?
Realistic accuracy is ±15–25% for restaurant meals vs ±10–15% for home meals. The gap comes mostly from invisible cooking fat — oil, butter, sauces — that restaurants use generously. Nouri applies a "restaurant adjustment" that assumes roughly 10% extra fat beyond what's visible on the plate. Useful for awareness, not for contest-week macro precision.
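To see what that kind of adjustment does to the numbers, here is an illustrative sketch — the 10% figure is from the answer above, but the function, its defaults, and the sample plate are assumptions, not Nouri's actual implementation:

```python
FAT_KCAL_PER_G = 9  # standard calorie density of fat

def restaurant_adjust(visible_kcal: float, visible_fat_g: float,
                      extra_fat_ratio: float = 0.10):
    """Add ~10% of visible calories as hidden cooking fat (oil, butter, dressing)."""
    extra_kcal = visible_kcal * extra_fat_ratio
    extra_fat_g = extra_kcal / FAT_KCAL_PER_G
    return visible_kcal + extra_kcal, visible_fat_g + extra_fat_g

# A plate that looks like 700 kcal with 25 g of visible fat
kcal, fat = restaurant_adjust(700, 25)
print(f"adjusted: {kcal:.0f} kcal, {fat:.1f} g fat")  # → adjusted: 770 kcal, 32.8 g fat
```

The adjustment shifts both the calorie total and the fat macro, which is why restaurant estimates skew higher than the plate "looks."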
Does it work with any cuisine?
Yes — vision models are trained on food from most major cuisines. Strong accuracy on American, Italian, Mexican, Japanese, Chinese, Indian, Thai, and Mediterranean. Less tested on highly regional dishes (specific regional Indian preparations, lesser-known Southeast Asian dishes, niche African cuisines). Results are still useful, just with wider error bars.
Can I use it for takeout and delivery?
Yes. Photo the plated food the way you'd eat it — after unpacking from delivery containers. Plate composition matters more than restaurant brand or cuisine.
Is it better to photo the menu or the food?
The food, by a wide margin. Menu descriptions are notoriously imprecise ("grilled chicken salad" can range from 300 to 1,200 kcal depending on dressing and cheese). A photo of the actual plate you're about to eat produces a much tighter estimate.
What if the restaurant provides nutrition info?
Use the restaurant info when it's available — chain restaurants in the US are legally required to post it. Nouri's photo estimate is the fallback for independent restaurants, international spots, and anywhere without menu-level nutrition data.
How should I frame the photo to get the best estimate?
Three rules: top-down angle so the plate area is visible; keep a utensil or hand in the frame for scale reference; avoid close-ups that crop out the plate edge. All three together reduce portion-size error by roughly 30% vs a random snapshot.
Does it handle shared appetizers or family-style meals?
Tell Nouri in a reply how much you personally ate ("I had about half of this"). The bot splits the estimate accordingly. Works for tapas, dim sum, Indian thalis, and shared apps.
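The split itself is simple proportional scaling of every macro. A hypothetical sketch — the field names and sample values are illustrative, not Nouri's actual schema:

```python
def split_estimate(estimate: dict, fraction: float) -> dict:
    """Scale every macro by the share you personally ate."""
    return {k: round(v * fraction, 1) for k, v in estimate.items()}

shared_appetizer = {"kcal": 840, "protein_g": 32, "carbs_g": 60, "fat_g": 50}
mine = split_estimate(shared_appetizer, 0.5)   # "I had about half of this"
print(mine)  # → {'kcal': 420.0, 'protein_g': 16.0, 'carbs_g': 30.0, 'fat_g': 25.0}
```

The same scaling works for any fraction: a third of a shared thali is `split_estimate(thali, 1/3)`.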
Related Reading
Eat Out Without Losing Track
Free photo-based restaurant tracking. No database lookup. Works anywhere.
Start Free in Telegram