How the American Health Care System Became Big Business

In the U.S., health care is now, above all, a business. The system organizes doctors and patients into a relationship that can be financially exploited, with hospitals, clinics, health insurers, the pharmaceutical industry, and medical device manufacturers extracting as much money as possible, as often as possible.