It's pretty clear that many countries believe they have a responsibility for the health of their citizens. How is it that we can unite behind the idea of going to war to "bring freedom and democracy" to other nations, but not behind the idea of helping people live healthier lives?