Europe has this image of being peace-loving, liberal, green, prosperous, harmless, etc., but I don't believe it. Does America do the dirty work for Europe? Many white Americans I talk to (especially liberals) love mother Europe more than America. Does Europe secretly pull all the strings? Are we being fooled? Should Europe, Canada, Australia, and New Zealand be called the White power alliance?