Critics of America's health care system say it's really a "sick care" system: doctors and hospitals get paid only when they treat people who are already sick.
But that's starting to change. Health insurance companies and big government payers like Medicare are beginning to reward doctors and hospitals for keeping people healthy.
So, many health care companies are trying to position themselves as organizations that help people stay well.