Comparison
MBFC rates ~5,000 outlets through one researcher's assessment. Web Jury aggregates trust-weighted crowd votes, with visible vote distributions, a free API, and a growing catalog.
| Feature | Media Bias / Fact Check | Web Jury |
|---|---|---|
| Number of outlets rated | ~5,000 | Thousands (growing) |
| Methodology | Single researcher's assessment | Trust-weighted crowd vote |
| Accuracy score (separate from bias) | Yes | Yes |
| Vote distribution shown | No | Yes |
| Public API | Paid | Free + paid tiers (example below) |
| Browser extension | Third-party | Official |
| Per-article ratings | No | Coming Q2 |
| YouTubers + Twitter accounts | Limited | Yes |
| Open methodology | Documented | Documented |
| Reader response / appeals | Email-only | On-platform |
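To ground the Public API row, here is a minimal lookup sketch. The base URL, path, and response fields are invented for illustration; check the actual API documentation for the real shapes.

```python
import requests

# Hypothetical base URL and path, for illustration only.
BASE = "https://api.webjury.example/v1"

def outlet_rating(domain: str, api_key: str | None = None) -> dict:
    """Fetch one outlet's crowd rating; assumes the free tier allows unauthenticated reads."""
    headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
    resp = requests.get(f"{BASE}/outlets/{domain}", headers=headers, timeout=10)
    resp.raise_for_status()
    return resp.json()

# Assumed response shape: mean scores plus the full vote histogram.
rating = outlet_rating("example-news.com")
print(rating.get("bias_mean"), rating.get("accuracy_mean"), rating.get("histogram"))
```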
MBFC was a pioneer: one researcher, Dave Van Zandt, has rated thousands of outlets since 2015. The scale is impressive, but the methodology has obvious limits: single-rater scoring means single-person blind spots. Two outlets with near-identical coverage can receive different ratings on nothing more than one person's judgment call.
Crowd-sourced ratings flip the failure mode. The risk is brigading instead of bias. We handle that with trust-weighted votes (review history quality matters), distribution visibility (you see the histogram, not just the mean), and temporal smoothing (no single pile-on shifts public numbers).
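As a concrete illustration, here is a minimal Python sketch of those three defenses. The `Vote` type, the -2..+2 bias scale, and the `alpha` constant are assumptions made for this example, not Web Jury's actual pipeline.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Vote:
    score: int    # assumed bias scale: -2 (left) .. +2 (right)
    trust: float  # 0..1 weight derived from the voter's review-history quality

def aggregate(votes: list[Vote]) -> tuple[float, Counter]:
    """Trust-weighted mean, plus the raw histogram published alongside it."""
    weight = sum(v.trust for v in votes)
    mean = sum(v.score * v.trust for v in votes) / weight if weight else 0.0
    histogram = Counter(v.score for v in votes)  # readers see the distribution, not just the mean
    return mean, histogram

def smooth(previous_public: float, todays_mean: float, alpha: float = 0.1) -> float:
    """Temporal smoothing: today's votes move the public number by at most alpha."""
    return (1 - alpha) * previous_public + alpha * todays_mean
```

With `alpha = 0.1`, even a coordinated pile-on that drags one day's mean to an extreme shifts the published score by only a tenth of the distance, and the visible histogram makes the spike obvious anyway.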
Use both. MBFC's per-outlet write-ups are still excellent context. Use Web Jury when you want to see what readers actually think versus what one researcher concluded.
Free, no signup needed to browse. See how thousands of outlets are rated.
Other comparisons