Back when we were in school, we always learned in history class about countries such as Spain, England, France, and Germany. Those countries have largely faded from prominence in world affairs as the center of power has shifted eastward toward Russia and China. Is this a good thing for those countries? And what, as Americans, can we learn from these once-great world powers?