Walking In America Remains Dangerous — Especially In Florida

January 10, 2017

Laurel Wamsley

Despite efforts to reduce pedestrian deaths, many cities have become more dangerous for walkers in recent years. (Getty Images/EyeEm)