It seems that politicians, as well as news readers and commentators, have the wrong idea of a preferable political environment. They continually speak of bringing democracy to this country or that, and our government seems eager to commit troops to bring this about (as long as the politicians don’t have to face the enemy personally, it’s okay).
Bringing democracy to this country or that, at great expense of blood and treasure, seems to be the cause of the day. But is it worth it? Is democracy always a good thing? Recall, before you attempt to answer that question, that Adolf Hitler rose to power democratically. Democracy can have bad, even tragic, results.
Instead, the better political ideal would be laissez-faire capitalism, something the progressives are progressing away from.