Right now people think America is a wreck, and in some obvious ways it is. But we overrate the importance of the national stage. We’re mostly unaware of the true stories of our era, as they will be written 30 or 40 years from now.
We’ve become a nicer, more tolerant, gentler nation in many respects. I think there’s more wisdom in the country than ever before. We’re screwing some other things up royally, but those failures are so visible that I want to stick my neck out and say the actual country, and the idea of America, are still underrated.