Hitler won his war having left Nazi roots in the West
Hans Vogel - Pravda
December 8th, 2008
It is a historical fact that Hitler's Germany lost the Second World War. But did it, really?
It may be dangerous to even question the defeat of the Nazis, since at the very least one runs the risk of being declared insane and locked up. After all, we have here a historical fact, whose “factuality” is so absolutely lapidary and evident as to be the cornerstone of the very world we are living in.
In 1945 absolute evil, embodied by Adolf Hitler, was defeated by absolute good, represented by the forces of liberal democracy, led by the United States. Entire generations in "the West" have been instructed by their governments, their teachers, and by scholars and journalists what to think of World War II. The number of Hollywood films elaborating on all facets of the epic struggle between the United States and Germany is so great as to defy counting. In scope and intensity, few propaganda campaigns can rival the US post-war offensive to persuade the world to accept its skewed version of history. Its results are impressive indeed: hardly anybody in "the West" today will admit that it was actually the Red Army that defeated Hitler in the most bitterly fought campaign in history. The point can hardly even be raised in an academic environment: its mere suggestion will cause contemptuous smiles and a loss of intellectual prestige.
Now that the true character of the US political system is increasingly apparent to the whole world, now that the parasitical nature of its economic system is causing ever greater numbers of victims, now that the lies that underpin the US domination of world politics are finally being exposed, the time has come to take a serious look at what the “US victory over the Nazis” has brought us.
If Hitler were to rise from the grave tomorrow and look at the world, he would be a happy man, but also a confused one. He would look at the US and see a country that has adopted so many facets of Nazi Germany. Why then, he would wonder, had the US fought his Nazi Germany? He would conclude that he had won the war after all.
I am afraid Hitler would be right. He did indeed win the war.
In 2001, just after the destruction of the WTC towers in New York, Jean-Marie Colombani, editor-in-chief of the prestigious French newspaper Le Monde, wrote in an editorial, "We are all Americans."
I am afraid we in "the West" have all become Nazis now.