I mean... no? That's just not true. The fall of the Roman Empire led to centuries of brutal wars through all its colony states, plague was rampant, education and literacy dropped sharply, what freedoms existed under Roman rule were curtailed, individual monarchies and city-states fought over resources, the Catholic Church rose to prominence, and the whole thing famously and infamously led to the Dark Ages, and they weren't called that because black was in fashion.

There are others worth debating here (particularly South America; their issues are more Monroe Doctrine than self-inflicted, and the comparative prosperity of the 19th century attests to that), but this Rome one is transparently false. Diet quality, height, likelihood of dying violently, lifespan, and I think even child mortality all improved after the fall of the Roman Empire. Rome sucked ass to live under; it just looked nice to be powerful in.
The whole idea of the Renaissance in the 15th century, a full millennium after the fall of the Roman Empire, was a return to Roman ideals of democracy, the humanities, science, mathematics, expression, education, and cleanliness.