Why "Safest City" Rankings Disagree: Methodology Behind the Lists
Every year, multiple publications release "safest cities in America" lists, and every year they disagree with each other. A city ranked #3 on one list might not appear on another at all. This is not because anyone is lying — it is because ranking methodology choices (which data, which cities qualify, how to weight categories) can produce radically different outcomes from the same underlying reality. Understanding these differences is essential to evaluating any safety ranking.
The population threshold problem
The single biggest reason rankings disagree is which cities they include:
- FBI data (raw) — Covers all 18,000+ reporting agencies with no population minimum. But comparing a town of 3,000 to a city of 300,000 is statistically meaningless. One murder in a 3,000-person town creates a murder rate of 33 per 100,000, making it appear more dangerous than Detroit.
- WalletHub — Uses cities with 300,000+ population for its annual rankings. This eliminates most suburbs and small cities, focusing on major metros.
- US News — Uses metro statistical areas (MSAs), which combine a core city with its surrounding suburbs. The "safest metro" includes suburban areas that dilute urban crime rates.
- Niche — Includes cities, suburbs, and towns down to 10,000 population using a mix of FBI data and user surveys. This lets small wealthy suburbs dominate the list.
A "safest cities" list limited to populations over 100,000 will produce completely different results from one that includes populations down to 10,000. The first surfaces large cities with low crime (Irvine, Gilbert, Honolulu); the second surfaces affluent suburbs (Carmel, Naperville, Johns Creek).
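The arithmetic behind the small-town distortion is simple enough to sketch. The one-murder, 3,000-person town comes from the example above; the large-city figures are hypothetical, chosen only to illustrate the comparison:

```python
# Per-capita rates amplify single incidents in small populations.
def rate_per_100k(incidents: int, population: int) -> float:
    """Crime rate per 100,000 residents."""
    return incidents / population * 100_000

# One murder in a 3,000-person town (the example above):
small_town = rate_per_100k(1, 3_000)       # ~33.3 per 100k
# 250 murders in a hypothetical city of 600,000:
big_city = rate_per_100k(250, 600_000)     # ~41.7 per 100k

print(f"small town: {small_town:.1f}, big city: {big_city:.1f}")
```

One bad year puts the town within striking distance of a major city's murder rate, which is why raw FBI data needs a population floor before it can support a ranking.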
Metro area vs city proper
This distinction alone can flip rankings. Consider Washington DC:
- City proper (population ~700,000) — Violent crime rate of roughly 800 per 100,000 in recent years, well above the national average.
- DC Metro area (population ~6.3 million, including Arlington, Bethesda, Alexandria, Fairfax) — The violent crime rate drops to roughly 300 per 100,000 when affluent Virginia and Maryland suburbs are included.
Rankings using metro areas systematically understate crime in urban cores and overstate safety in regions where wealthy suburbs offset high-crime centers. Rankings using city proper boundaries do the opposite — they miss the suburban safety that many metro residents actually experience.
Neither approach is wrong, but they answer different questions: "Is the core city safe?" vs "Is the region safe?"
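The DC example above can be checked as a population-weighted blend of jurisdictions. This is a minimal sketch: the suburban rate here is back-calculated to land near the rough ~300 metro figure, not taken from any dataset:

```python
# A metro rate is a population-weighted average of its parts,
# so safe suburbs pull the combined rate down.
def combined_rate(parts: list[tuple[int, float]]) -> float:
    """Blend (population, rate per 100k) pairs into one overall rate."""
    total_pop = sum(pop for pop, _ in parts)
    total_crimes = sum(pop * rate / 100_000 for pop, rate in parts)
    return total_crimes / total_pop * 100_000

dc_metro = combined_rate([
    (700_000, 800),    # DC proper: ~800 per 100k (from the example above)
    (5_600_000, 237),  # suburbs: illustrative rate chosen to match ~300 overall
])
print(f"{dc_metro:.0f} per 100k")  # close to the ~300 metro figure
```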
What gets counted and how it is weighted
Rankings differ in which crime categories they include and how they weight them:
- Violent crime only — Some lists focus on murder, rape, robbery, and aggravated assault. This favors cities with low violent crime but high property crime (e.g., San Francisco has relatively moderate violent crime but very high property crime).
- Total crime — Including property crime (burglary, larceny, motor vehicle theft, arson) changes rankings significantly because property crime is 3-4x more common than violent crime nationally. Cities with high theft rates drop in these rankings.
- Weighted composite — WalletHub weights violent crime more heavily than property crime (roughly 70/30) and also includes factors like sex offender density, law enforcement staffing per capita, and traffic fatality rates. US News uses a different weighting with quality of life factors.
- Non-crime factors — Some rankings incorporate police spending per capita, natural disaster risk, traffic safety, financial security, and other quality-of-life metrics. These can dramatically change results: a low-crime city in Tornado Alley might rank poorly on "overall safety."
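A weighted composite boils down to a weighted sum of the category rates. The 70/30 split below mirrors the rough weighting described above, but the scoring function and input rates are illustrative, not any publication's actual formula:

```python
# Sketch of a weighted composite score. Lower = safer.
# Rates are per 100,000 residents; weights are illustrative.
def composite_score(violent_rate: float, property_rate: float,
                    w_violent: float = 0.7, w_property: float = 0.3) -> float:
    return w_violent * violent_rate + w_property * property_rate

# Hypothetical profiles: A has low violent but high property crime
# (the San Francisco pattern); B is the reverse.
city_a = composite_score(violent_rate=200, property_rate=3_000)
city_b = composite_score(violent_rate=1_000, property_rate=2_000)

# At 70/30, A scores safer than B; with equal 50/50 weights,
# the order flips because A's property crime dominates its total.
print(city_a, city_b)
```

The same two cities swap places depending only on the weights, which is exactly why two lists using the same crime data can still disagree.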
The data year and reporting gap
Rankings published in 2025 may use crime data from 2022 or 2023 — there is always a 1-2 year lag between crime occurrence and FBI publication:
- FBI Crime Data Explorer publishes preliminary data about 9-12 months after year-end, and final data 18-24 months after year-end.
- Local police department data is usually available faster (within 3-6 months) but may use different definitions or counting methods than federal data.
- The UCR/NIBRS transition means 2021 data covers far fewer agencies than 2019 data. Rankings using 2021 FBI data are working with a 35% gap in population coverage.
Always check which data year a ranking uses. A "2025 Safest Cities" list using 2022 data may not reflect a crime spike (or drop) that occurred in 2023-2024.
How to read rankings critically
When you encounter a safest city ranking, ask these questions:
- What is the population threshold? A list of "safest cities over 200,000" and "safest places to live with 25,000+" will produce entirely different results.
- City proper or metro area? Metro rankings dilute urban crime with suburban safety. City-proper rankings miss suburban context.
- Which crime types are included? Violent-only lists favor different cities than total-crime lists. Check whether property crime, traffic deaths, or non-crime factors are included.
- What year of data? One-year snapshots are volatile — a single mass shooting event can make a normally safe city look dangerous. Multi-year averages (3-5 years) are more reliable.
- Is it rate-based or count-based? Per-capita rates are the only meaningful way to compare cities of different sizes. If a ranking uses raw crime counts, it is useless for comparison.
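The volatility point about one-year snapshots is easy to demonstrate. The yearly rates below are hypothetical, with a single 2022 event inflating one year:

```python
# A one-year snapshot vs a multi-year average, for a normally
# safe city with one mass-casualty event in 2022 (illustrative).
from statistics import mean

yearly_rates = {2020: 180, 2021: 175, 2022: 420, 2023: 185, 2024: 178}

snapshot_2022 = yearly_rates[2022]           # the spike year alone
five_year_avg = mean(yearly_rates.values())  # much closer to baseline

print(snapshot_2022, round(five_year_avg, 1))
```

A ranking built on 2022 alone would treat this city as dangerous; the five-year average shows the spike for the outlier it is.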
Why no ranking is definitive
Crime is not evenly distributed within a city. A city with a low average crime rate may have neighborhoods with very high crime and others with near-zero crime. City-level rankings smooth over this intra-city variation, which is often more important for individual safety decisions than the city average.
Example: Chicago's overall violent crime rate places it mid-pack among large cities, but the difference between the Loop (low crime) and Englewood (among the highest in the nation) is enormous. A ranking that says "Chicago ranks #47 in safety" tells you almost nothing about what life is like in a specific neighborhood.
For actual relocation or housing decisions, neighborhood-level data from local police departments, census tract crime maps, and community-sourced reports provide far more actionable information than any national ranking.
Frequently Asked Questions
Which safest city ranking should I trust?
No single ranking is definitive. Each uses different population thresholds, data sources, crime categories, and weighting. The most transparent rankings disclose their full methodology so you can evaluate whether their choices match your priorities. If you care most about violent crime, use a ranking that focuses on that. If property crime matters to you, find one that includes it.
Why is my city ranked differently on every list?
Rankings differ in population cutoffs (some include suburbs, others only large cities), whether they use city proper or metro area boundaries, which crime types they count, how they weight violent vs property crime, what data year they use, and whether they include non-crime factors like natural disaster risk or traffic safety.
Are FBI crime statistics accurate?
FBI data is the most comprehensive national dataset, but it has limitations: it depends on voluntary agency reporting, it undercounts crime that is not reported to police (roughly 40% of violent crimes and 55% of property crimes go unreported per the Bureau of Justice Statistics), and the 2021-2022 NIBRS transition created significant coverage gaps.
Does a high ranking mean a city is actually safe?
Not necessarily for every resident. City-level crime rates are averages that mask enormous variation between neighborhoods. A city with a low overall rate may have pockets of very high crime. For personal safety decisions, neighborhood-level data is far more relevant than city rankings.
Why do small cities always top safety rankings?
Small cities (under 50,000 population) have less statistical variation — a few incidents dramatically change their rates in either direction. Many "safest city" lists are dominated by affluent suburbs because wealthy communities with low density, high homeownership, and extensive policing naturally have low crime rates. This does not mean they are meaningfully "safer" than mid-sized cities with slightly higher rates.
How often are crime rankings updated?
Most major rankings (WalletHub, US News, Niche) publish annually, typically in Q1-Q2. However, they usually use FBI data that is 1-2 years old. A "2025 ranking" likely uses 2022 or 2023 crime data. Local police department dashboards are updated more frequently, often monthly or quarterly.