Diet App Scorecard April 2026: 10 Apps, 2,995 Reviews Analyzed
- 17 Minute Read
MyFitnessPal's redesign backlash deepened, and six of ten calorie tracking apps now show review patterns flagged for authenticity concerns. Plus: Cal AI is recommending daily targets as low as 200 calories. The April 2026 scorecard.
As the CEO of MyNetDiary, I publish this scorecard with an obvious conflict: my company is part of the category being measured. That's why the methodology and the limits of the data matter. This scorecard is not an all-time ranking of calorie tracking apps. It's a monthly snapshot of what users wrote in April — what broke, what improved, and what they still trust enough to use every day.
Apps are selected from the US App Store top 100 Health & Fitness category as of April 30, 2026, and listed in download ranking order. Ratings are calculated from that month's written reviews only, after filtering duplicates and unrelated reviews. The App Store's all-time star rating, auto-generated Review Summary, and "Most Helpful" sorting can obscure what's happening right now. Six apps in this month's scorecard are flagged for authenticity concerns — see the methodology and the per-app sections below for what triggered each flag.
| App | # | Rating | Reviews | Top Praise | Top Complaint | Auth. |
|---|---|---|---|---|---|---|
| Cal AI | #5 | 1.92 | 208 | Photo logging concept, weight loss | Subscription failures, low-cal targets, crashes | Flagged* |
| MyFitnessPal | #7 | 2.31 | 1,438 | Long-term loyalty, food database | Redesign backlash, AI Coach, removed features | Flagged* |
| Cronometer | #11 | 4.06 | 83 | Free tier depth, micronutrient tracking | Search bug, intrusive ads | Organic |
| MyNetDiary | #22 | 4.54 | 196 | Weight loss results, ease of use, free barcode scanning | Premium pricing, AI feature paywall | Organic |
| Municorn | #25 | 3.02 | 65 | Quick AI capture, ease of use | Forced paywall, AI inaccuracy, refund denials | Flagged* |
| BitePal | #32 | 2.07 | 174 | Raccoon mascot, gamification | Charges after cancel, AI inaccuracy | Flagged* |
| Lose It! | #37 | 3.46 | 143 | Long-term loyalty, weight loss results | April update, paywall migration, ads | Organic |
| Simple | #49 | 4.20 | 569 | AI coach, fasting routines, accountability | Cancellation issues, repeated review prompts | Flagged* |
| MacroFactor | #60 | 3.05 | 21 | Adaptive algorithm, ease of use | Subscription loop, cost transparency | Organic |
| Calo | #97 | 4.45 | 98 | Photo and barcode scanning, weight loss | Refund denials, AI inaccuracy | Flagged* |
# = download ranking in top 100 Health & Fitness, April 30, 2026. Source: US App Store reviews. *Flagged: review patterns show signs consistent with rating inflation — clusters of brief, content-light five-star reviews paired with detailed, specific one-star reviews. See the per-app sections below and the published methodology.
Three months in, we can start telling one-month events from durable patterns. April had both.
MyFitnessPal's redesign backlash didn't stabilize — it deepened. After a 0.77-star drop from February to March, the natural assumption was that April would mark a floor. The opposite happened. The rating fell another 0.59 stars to 2.31, and review volume nearly doubled again from 752 to 1,438. Two months after launch, long-time users are still describing the same problems in new words: a food diary that requires extra taps to see the day at a glance, the loss of copy-between-days for individual foods, an AI Coach widely described as unwanted, and frequent crashes and login loops. Several reviewers said outright that they think MyFitnessPal is using paid reviews to mask the complaints — and as the next trend shows, the rating distribution lines up with that suspicion. The cumulative drop from February to April is now 1.36 stars on review volume that has grown 6×. I haven't seen a sustained reaction on this scale before in the category.
Review authenticity concerns went category-wide. In February, no apps were flagged for review authenticity concerns. In March, only BitePal was. In April, six of ten: Cal AI, MyFitnessPal, Municorn, BitePal, Simple, and Calo. The pattern is consistent across all six: clusters of brief, content-light five-star reviews ("Nice," "good," "Exquisite," "Great app") sit alongside detailed, specific one-star reviews documenting real product problems. Cal AI shows pre-use 5-stars whose body text reads "I want to wish it would work." Simple shows two completely different users posting word-for-word identical 100-character positive reviews — a near-impossibility in organic data — alongside 14 users submitting the same review repeatedly across the month. Calo's 98-review pool contains zero two-star and zero three-star ratings, which is statistically anomalous at that volume. Could any single one of these be coincidence? Maybe. All six apps showing similar patterns in the same month? I don't think so. But I can't prove it, and I'm not going to try.
I've been reading App Store reviews every morning since 2008. The pattern in this trend isn't new — I've watched at least three earlier generations of apps follow a similar arc. Heavy advertising pushes downloads. Suspicious five-star reviews drown out the complaints from people who actually try the product. And when the marketing eventually stops working, the apps fade. What's new in April 2026 is the scale: six apps in one month, all showing variations of the same review distribution simultaneously.
Cal AI's daily calorie recommendations got more dangerous between March and April. March reviews flagged targets of 470 to 670 calories per day — already low enough to raise medical concerns. April reviews flag a wider and more dangerous range: 200 to 1,100 calories per day. Two hundred calories is below basal metabolic rate for essentially any adult. Sustained intake at that level would be a medical emergency. And this isn't an AI hallucination on the calorie content of food — Cal AI is recommending some users eat 200 calories per day as their target. For an app currently ranked #5 in Health & Fitness downloads, with hundreds of thousands of users, this is a safety issue that deserves more attention than it's getting.
The most interesting small movement was Municorn, which rose from 2.31 in March to 3.02 in April. Reviews still flag the same problems — long onboarding leading to a forced paywall, denied refunds, AI inaccuracy — but the rating distribution now skews toward five-star ratings (about 35% of the pool) alongside the persistent one-stars. The bimodal split is sharp enough to flag for authenticity, but the underlying improvement may also be partly real. Worth watching whether May continues the trend or reverts.
The category contracted. March added three apps to the scorecard (Calo, FoodPilot, SnapCal). April lost two of them. FoodPilot exited with a March rating of 1.95 and complaints about unauthorized charges and AI scanning failures. SnapCal exited with 3.00 and reviews documenting calorie estimates 300 calories apart for the same meal. The market's verdict on at least one of those apps aligned with the reviews. Whether they re-enter in future months will depend on what the underlying products and the marketing do next.
If you want to see how these patterns developed, the February 2026 scorecard and March 2026 scorecard cover the earlier months.
Cal AI | Average rating: 1.92 / 5 | Reviews analyzed: 208 | Filtered: 6 random/unrelated reviews | Chart position: #5
April reviews skewed heavily negative. The most pervasive theme was subscription and billing failures: many users reported annual subscriptions stopping after about a month, with the app showing "no subscription found" despite confirmed payment, while support cycled through escalations that never resolved. Cancellation friction, surprise charges after free trials, and broken family plan invites recurred frequently. App stability was the second major issue — crashes, freezing, login loops, broken add-food buttons, and a group function offline for weeks. AI photo recognition drew mixed feedback, with some users helped but many reporting inaccurate calorie estimates. Several reviewers flagged dangerously low calorie targets ranging from 200 to 1,100 per day. A small minority praised ease of use and weight-loss results.
Authenticity flag: The review pool shows signs of incentivized or solicited positives. Alongside the high volume of substantive, detailed 1-star complaints describing real product use, there is a cluster of very brief, generic five-star ratings — single-word entries like "Nice," "good," "Exquisite," and "Invigorating," plus contradictory pre-use 5-stars (e.g., a 5-star review whose body reads "I want to wish it would work"). The asymmetry between content-rich negatives and content-free positives is consistent with rating manipulation at the top of the distribution.
A few of the April Cal AI reviews describe the app recommending daily calorie targets of 200 to 1,100 calories. This isn't about a misidentified apple in a photo — these are the app's own daily targets shown to users who entered their stats. For context, basal metabolic rate (the calories your body burns at rest) ranges from roughly 1,200 to 1,800 per day for adults. A 200-calorie target sustained over time would constitute medical-grade caloric restriction. If you're using Cal AI and your daily target appears to be below 1,200 calories, treat that recommendation with extreme caution and consult a registered dietitian or your doctor before following it.
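The BMR range cited above is easy to sanity-check yourself. A minimal sketch using the widely cited Mifflin-St Jeor equation follows; the profiles are illustrative inventions, not data from Cal AI users:

```python
def bmr_mifflin_st_jeor(weight_kg: float, height_cm: float, age: int, sex: str) -> float:
    """Estimate basal metabolic rate (kcal/day) with the Mifflin-St Jeor equation."""
    base = 10.0 * weight_kg + 6.25 * height_cm - 5.0 * age
    return base + (5.0 if sex == "male" else -161.0)

# Illustrative profiles (assumptions, not real users): even a small,
# sedentary adult burns far more than 200 kcal/day at complete rest.
print(round(bmr_mifflin_st_jeor(55, 160, 40, "female")))  # 1189
print(round(bmr_mifflin_st_jeor(80, 178, 30, "male")))    # 1768
```

Whatever equation an app uses internally, any daily target below the user's estimated BMR deserves an explicit warning, not a silent recommendation.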
MyFitnessPal | Average rating: 2.31 / 5 | Reviews analyzed: 1,438 | Filtered: 9 random/unrelated reviews | Chart position: #7
April reviews were overwhelmingly dominated by backlash against a late-April UI redesign, which surfaces in roughly 60% of reviews. Longtime users — many citing 5, 10, or 15+ years of daily use — called it the worst update they had ever experienced. The most frequent complaints: the food diary no longer shows the full day at a glance and requires extra taps into individual meals; copying items between days has been removed or restricted to copying entire meals; macro and calorie totals are harder to find; and a newly added AI Coach was widely described as unwanted and intrusive. Bugs were also reported — crashes, login loops, stuck loading screens, and meal-time misassignment. Many longtime premium subscribers stated they were canceling and seeking alternatives.
Authenticity flag: The pattern raises concerns. Five-star reviews skew heavily toward very brief, content-light praise ("Great," "Amazing," "Love it," "Perfect!," "10/10") — roughly 16% of all 5-star reviews are 30 characters or shorter — while the 1-star reviews are typically detailed and specifically reference the April 2026 redesign with named features. Several reviewers also said outright that they think MyFitnessPal is using paid reviews to offset the backlash. The asymmetry between content-rich, event-anchored negatives and content-free positives is consistent with rating inflation at the top of the distribution.
Cronometer | Average rating: 4.06 / 5 | Reviews analyzed: 83 | Filtered: 0 random/unrelated reviews | Chart position: #11
April reviews skewed positive. The strongest praise focused on the depth of nutrient tracking, the comprehensiveness of the food database, and the unusually generous free version, which many reviewers said offered features they had paid for in other apps. Long-term users credited Cronometer with sustained weight loss, weight gain for athletes, and maintenance. Custom food creation, barcode scanning, photo entry, recipe handling, and responsive customer support were repeatedly cited as strengths. Negative reviews clustered around two issues: a recent bug late in April where food search stopped returning results for some users, leaving the app temporarily unusable, and complaints about intrusive ads in the free tier. A smaller group cited a learning curve, food database gaps for produce variety and prepared meals, and certain features behind the paywall.
Authenticity: The review patterns appear organic. Both positive and negative reviews are predominantly detailed and content-rich, with the 5-star ratings tied to specific use cases and outcomes rather than generic praise.
MyNetDiary | Average rating: 4.54 / 5 | Reviews analyzed: 196 | Filtered: 2 random/unrelated reviews | Chart position: #22
April reviews skewed strongly positive. The most frequent praise centered on substantial weight loss outcomes, with reviewers citing losses ranging from roughly 12 pounds up to 130 pounds, and many citing years of consistent use. Reviewers repeatedly described the app as easy to use, customizable, and offering an unusually generous free tier, with the free barcode scanner and broad food database singled out as differentiators. The app was also valued for managing health conditions including diabetes, kidney stones, and A1C goals, and for nutrient and macro tracking. Negatives clustered around premium pricing perceived as expensive, the AI meal-photo feature being locked behind a paid tier, occasional reports of upgrade prompts persisting for paying users, and a smaller number of reports of settings being overridden or refund difficulties.
Authenticity: The review patterns appear organic. Most five-star reviews are detailed and content-rich, citing specific weight outcomes, named features, and multi-year usage; the negative reviews are similarly substantive and specific.
Municorn | Average rating: 3.02 / 5 | Reviews analyzed: 65 | Filtered: 2 random/unrelated reviews | Chart position: #25
April reviews split sharply between strong praise and strong complaints. Negative reviews focused on subscription practices: long onboarding questionnaires of 15 to 45 minutes that lead to a forced paywall, free trials that charged users despite cancellation, and refund requests being denied. A second cluster of complaints centered on accuracy: the AI food recognition often failed to identify items or returned incorrect calorie and macro counts, and fiber and other macros were reported as missing or untracked. Recipe creation freezes and app-fit issues were also cited. Positive reviewers praised quick AI meal capture, ease of use, weight-loss results, and the home-screen widget, with several saying the app helped them stay accountable and consistent.
Authenticity flag: The rating distribution is sharply bimodal — roughly 35% one-star and 35% five-star, with only 6% three-star — and some five-star reviews are very brief and generic ("Great," "Great App," "La mejor app," "Muy buena app") while the one-star reviews tend to be detailed and specific. This asymmetry between content-rich negatives and content-light positives is consistent with possible review inflation at the top of the distribution, though the small total sample size (65 reviews) limits certainty.
BitePal | Average rating: 2.07 / 5 | Reviews analyzed: 174 | Filtered: 12 random/unrelated reviews | Chart position: #32
April reviews skewed strongly negative. The dominant theme was subscription and billing practices: users reported being charged after they cancelled, hidden price increases beyond the rate they agreed to, and an inability to cancel through the app alone, since cancellation also required visiting the website. Customer support was repeatedly described as unresponsive or AI-only, with refund requests denied and disputes ultimately resolved through bank chargebacks. The second major cluster involved AI accuracy: photo and barcode scans frequently returned implausible calorie or macro counts that users said they had to correct manually, and several reviewers flagged missing fiber tracking and the inability to edit logged meals. Positive reviewers praised the raccoon mascot, the cute aesthetic, gamified accountability, and ease of use, with some reporting weight-loss progress.
Authenticity flag: The review pool shows two patterns of concern. First, an unusually large cluster of pre-use ratings was filtered out, including multiple users who explicitly said they had not yet used the app but were already submitting ratings — suggesting the app may be soliciting reviews very early in onboarding before users have meaningful experience to evaluate. Second, after filtering, roughly a quarter of the remaining five-star reviews are brief and content-light alongside detailed, specific one-star reviews documenting billing problems, an asymmetry consistent with rating inflation at the top of the distribution.
Lose It! | Average rating: 3.46 / 5 | Reviews analyzed: 143 | Filtered: 1 random/unrelated review | Chart position: #37
April reviews split between strong long-term loyalty and frustration with recent changes. Positive reviewers, many citing 5 to 15 years of consistent use, praised the app for substantial weight-loss outcomes (frequently in the 40 to 73 pound range), ease of use, the breadth of the food database, recipe creation, barcode scanning, and Apple Watch integration. Several welcomed a new AI logging feature and the redesigned dashboard. Negatives clustered around an April 2026 update that broke or relocated familiar features (custom workouts, themes, layout) and caused freezing for some users, and a steady migration of previously free features behind a subscription. Frequent in-app upgrade pop-ups, intrusive ads in the free tier, difficulty canceling subscriptions, and data-sharing concerns also recurred.
Authenticity: The review patterns appear organic, with both positive and negative reviews predominantly detailed and substantive — five-star ratings tied to specific use cases, weight outcomes, and named features, and one-star ratings tied to specific update changes, billing experiences, or paywalled features.
Simple | Average rating: 4.20 / 5 | Reviews analyzed: 569 | Filtered: 2 random/unrelated reviews | Chart position: #49
April reviews skewed strongly positive. Most reviewers credited the app with helping them establish intermittent fasting routines, build hydration and nutrition habits, and maintain accountability through daily check-ins. The AI coach characters (Avo, Blinky) and the supportive messaging tone were widely praised, with many users describing motivation, encouragement, and weight-loss progress. Some appreciated the photo meal logging and AI-generated suggestions. Negative reviews were dominated by subscription and billing complaints: difficulty canceling, charges continuing after cancellation, surprise renewal pricing, and ads perceived as misleading — particularly the tai chi walking content marketed to older users. A smaller cluster found the AI coach pushy, judgmental, or contradictory, and some flagged inaccurate calorie or step tracking and limited customization.
Authenticity flag: The review pool shows several signs consistent with rating manipulation. At least 14 users submitted identical or near-identical positive reviews multiple times during the month — same title and same text content reposted weeks apart — suggesting in-app re-prompting of paying users to rate. More notably, two completely different users posted word-for-word identical 100-plus-character positive reviews, which is highly unusual in organic data. Several negative reviewers also stated the app appeared to obstruct submission of low-star ratings. Combined with a moderate share of brief, content-light five-star entries alongside detailed billing complaints in the negatives, the pattern raises authenticity concerns despite the volume of substantive praise.
MacroFactor | Average rating: 3.05 / 5 | Reviews analyzed: 21 | Filtered: 0 random/unrelated reviews | Chart position: #60
April reviews split between strong praise and substantive complaints. Positive reviewers credited MacroFactor with going beyond basic calorie counting through its algorithm that calculates expenditure and adjusts targets based on actual weight changes and intake. First-time macro trackers found the app easy to use, appreciated the pre-loaded user data for common meals, and described it as fluid and well thought out. Negative reviews concentrated on subscription handling — several users reported paying and being trapped in a subscribe-loop with "Restore Purchases" failing and customer service unhelpful — and on misleading cost transparency, since the price is only revealed after a long onboarding questionnaire. A smaller cluster flagged inaccurate macro calculations on scanned items and recipes, missing weekly calorie summaries, and limited international food database coverage.
Authenticity: The review patterns appear organic. Both positive and negative reviews are uniformly detailed and specific, citing concrete features, outcomes, or failure modes; the small sample size shows no clusters of generic praise or signs of templated content.
Calo | Average rating: 4.45 / 5 | Reviews analyzed: 98 | Filtered: 2 random/unrelated reviews | Chart position: #97
April reviews skewed strongly positive. Most reviewers credited the app with ease of use, photo and barcode scanning that quickly captured meals, and meaningful weight-loss outcomes ranging from a few pounds in the first week up to roughly 30 pounds over a few months. Many described it as a simple, no-frills tracker that holds them accountable, with several flagging good value relative to the price. Negative reviews concentrated on subscription practices: refund denials after free-trial cancellations, a family-plan charge that did not unlock family access, and inability to log food without paying. A separate cluster cited inaccurate AI calorie counts, inconsistent results when rescanning the same photo, and an incomplete food database missing common items.
Authenticity flag: The review pool shows two patterns of concern. First, the 98-review set contains zero two-star and zero three-star ratings — a complete absence of middle ratings is statistically unusual at this volume and inconsistent with organic distributions. Second, roughly 16% of five-star reviews are very brief and generic ("Great app," "Love the app," "Definitely recommend!," "Great Help!"), while the one-star reviews are uniformly detailed and specific. The same asymmetry between content-rich negatives and content-light positives suggests rating inflation at the top.
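How unusual is a complete absence of middle ratings? A simple binomial sketch makes the claim concrete. The baseline mid-rating shares below are assumed values for illustration, not figures measured from this dataset:

```python
# If some fraction p of organic reviews land at 2 or 3 stars, then the
# chance that none of n independent reviews do is (1 - p) ** n.
def prob_zero_mid_ratings(n: int, p_mid: float) -> float:
    return (1.0 - p_mid) ** n

# Assumed baselines for illustration: even at a modest 5% mid-rating
# share, an all-extremes pool of 98 reviews is very unlikely by chance.
for p in (0.05, 0.10, 0.15):
    print(f"p_mid={p:.2f}: P(zero in 98) = {prob_zero_mid_ratings(98, p):.2e}")
```

The independence assumption is generous to the app; solicited or templated reviews would only make an all-extremes distribution more suspicious, not less.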
April had the most damaging set of findings I've seen in three months of doing this. The fifth most-downloaded app in Health & Fitness is rated 1.92 and is recommending some users eat 200 calories a day. The seventh is two months into a redesign that long-time users still describe as the worst update they've ever experienced. Six of the ten apps in the scorecard show review distributions flagged for authenticity concerns — six times the March count.
In April 2026, the highest-rated calorie tracking app in the Diet App Scorecard was MyNetDiary at 4.54 out of 5. The lowest was Cal AI at 1.92. Next month's scorecard will cover May 2026, and I'll be watching three things: whether MyFitnessPal's rating finally finds a floor, whether the authenticity signals that surfaced this month persist or fade, and whether Cal AI addresses the calorie target safety issue or it gets worse again. The complete methodology is published at diet-app-scorecard-methodology.html.
Track your meals and macros with MyNetDiary — try it free.
MyNetDiary received the highest average user rating among calorie tracking apps in the Diet App Scorecard for April 2026, scoring 4.54 out of 5 based on 196 filtered reviews. Calo followed at 4.45, Simple at 4.20, Cronometer at 4.06, Lose It! at 3.46, MacroFactor at 3.05, Municorn at 3.02, MyFitnessPal at 2.31, BitePal at 2.07, and Cal AI at 1.92. These are monthly review ratings from April 2026 reviews only — not the all-time star ratings the App Store displays. Users praised MyNetDiary's free tier with barcode scanning, comprehensive nutrient tracking, and substantial weight loss outcomes. MyNetDiary's database draws from USDA and NCC research-grade sources and tracks 108 nutrients per entry, the most among the apps analyzed. Cronometer, the next closest, tracks 92.
Cal AI's April reviews include reports of the app recommending daily calorie targets ranging from 200 to 1,100 calories. Two hundred calories is below basal metabolic rate for adults and constitutes medical-grade caloric restriction if sustained. If you're using Cal AI and your daily target appears below 1,200 calories, do not follow that recommendation without consulting a registered dietitian or your doctor. The April scorecard rates Cal AI at 1.92 — the lowest in the scorecard — and the reviews also document subscription failures, charges after cancellation, and AI calorie estimates that users describe as wildly inaccurate.
Look at the asymmetry between the positives and the negatives. Real five-star reviews are usually as detailed as real one-star reviews — both groups have something specific to say. When you see a wall of brief, generic five-star praise ("Great," "Love it," "Nice") sitting next to detailed one-star complaints documenting actual product problems, that asymmetry is a flag. Other patterns to watch for: reviews submitted before users have meaningfully tried the app, identical or near-identical reviews from different users, a complete absence of two-star and three-star ratings, and rating distributions that contradict the content of the reviews themselves. None of these patterns proves manipulation in isolation. They identify reviews that warrant scrutiny rather than reviews that are definitively inauthentic. The full criteria are published in the methodology, and authenticity is assessed by AI using the same prompt for every app.
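The length-asymmetry signal described above can be approximated in a few lines of code. This is an illustrative sketch only, not the scorecard's actual methodology; the 30-character cutoff, the 3:1 median-length ratio, and the 15% short-review threshold are all assumptions:

```python
from statistics import median

def length_asymmetry_flag(reviews, short_cutoff=30, ratio_threshold=3.0):
    """Flag a review pool when 1-star reviews are much longer than 5-star ones.

    `reviews` is a list of (stars, text) tuples. Returns (flagged, details).
    The cutoff and thresholds are illustrative assumptions, not calibrated values.
    """
    fives = [len(text) for stars, text in reviews if stars == 5]
    ones = [len(text) for stars, text in reviews if stars == 1]
    if not fives or not ones:
        return False, {}
    short_share = sum(1 for n in fives if n <= short_cutoff) / len(fives)
    ratio = median(ones) / max(median(fives), 1)
    flagged = ratio >= ratio_threshold and short_share >= 0.15
    return flagged, {
        "median_1star_len": median(ones),
        "median_5star_len": median(fives),
        "short_5star_share": round(short_share, 2),
    }

# Hypothetical pool echoing the pattern described in the article.
pool = [
    (5, "Nice"), (5, "good"), (5, "Great app"),
    (1, "Charged me after I cancelled the annual plan and support never replied."),
    (1, "The AI scanner logged a salad as 900 calories; totals are unusable."),
]
flagged, details = length_asymmetry_flag(pool)
print(flagged, details)
```

A heuristic like this only surfaces pools worth a human read; as the article says, none of these signals proves manipulation on its own.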
2,995 user reviews across ten calorie and food tracking apps. Two apps that joined the scorecard in March (FoodPilot and SnapCal) fell out of the top 100 in April and exited the analysis. Reviews were pulled from the US App Store, filtered for duplicates and unrelated content, and average ratings were recalculated for each app. The complete methodology is published at diet-app-scorecard-methodology.html.
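The recalculation step, dropping duplicates and then averaging what remains, can be sketched as follows. The duplicate rule here (same user, same normalized text) is a simplified stand-in; the published methodology defines the actual filters:

```python
def monthly_average(reviews):
    """Recompute a month's average rating after dropping duplicate reviews.

    `reviews` is a list of (user_id, stars, text). A duplicate is the same
    user posting the same text (case/whitespace-insensitive) -- a simplified
    stand-in for the scorecard's real duplicate and relevance filters.
    """
    seen, kept = set(), []
    for user_id, stars, text in reviews:
        key = (user_id, " ".join(text.lower().split()))
        if key not in seen:
            seen.add(key)
            kept.append(stars)
    return round(sum(kept) / len(kept), 2), len(kept)

# Hypothetical mini-pool: the reposted review is dropped before averaging.
reviews = [
    ("u1", 5, "Great app"),
    ("u1", 5, "Great  app"),   # repost by the same user, filtered out
    ("u2", 1, "Charged after cancel"),
    ("u3", 4, "Solid tracker"),
]
print(monthly_average(reviews))  # (3.33, 3)
```

Recomputing from filtered monthly reviews is what lets a scorecard diverge from the App Store's all-time star rating, which averages over years of history.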
MyFitnessPal's rating fell from 2.90 in March to 2.31 in April — a second consecutive month of major decline following its March 2026 redesign. Review volume nearly doubled from 752 to 1,438. Two months after the redesign launched, long-term users (many citing 5 to 15+ years of daily use) are still describing the same problems in detail: the food diary no longer shows the full day at a glance, copy-between-days has been removed for individual foods, macro and calorie totals are harder to find, and an unwanted AI Coach was added. The cumulative drop from February to April is 1.36 stars. Several reviewers said outright that they think MyFitnessPal is using paid reviews to mask the redesign complaints, and the rating distribution lines up with that suspicion.
In April 2026, the lowest-rated apps in the Diet App Scorecard — Cal AI (1.92), BitePal (2.07), and Municorn (3.02) — all rely on AI-first calorie estimation. Users consistently report inaccurate estimates with errors of hundreds of calories on common foods. Cal AI users in April reported the app recommending dangerously low daily calorie targets ranging from 200 to 1,100 calories per day. By contrast, MyNetDiary uses AI features like Meal Scan as a complement to its research-grade database of 2M+ verified foods, providing accurate nutrition tracking while maintaining the convenience of photo-based logging.
Based on the Diet App Scorecard for April 2026, MyNetDiary is the best-rated calorie counting app among the apps analyzed and offers the most feature-rich free tier. MyNetDiary provides accurate nutrition tracking for free with an ad-free experience, barcode scanning, and full macro and micronutrient tracking with 108 nutrients — features that competing apps increasingly paywall. MyFitnessPal and Lose It! both charge $79.99/year for barcode scanning. In April 2026, many MyNetDiary reviewers reported achieving their weight loss goals using only the free version.
Three notable changes. MyFitnessPal's rating dropped another 0.59 stars to 2.31, a second consecutive month of major decline driving the cumulative drop since February to 1.36 stars. Review authenticity concerns spread sharply: from one flagged app in March to six in April. And two apps that joined the scorecard in March (FoodPilot and SnapCal) fell out of the top 100 and exited the analysis, leaving 10 apps in April. MyNetDiary held essentially stable at 4.54, down 0.02 stars from March's 4.56.