Living Better
I do not understand the "health care" system in the U.S.!
Doctors and drug companies have no incentive to make you well or to find cures. How did our system end up this way? Doctors do not profit if you are healthy, only if you are sick. If you are not sick, they recommend unneeded surgery or treatments so they can profit from you. We no longer find cures for diseases because doctors make a ton of money from the drug companies for recommending drug therapies to their patients, keeping the money coming in. This is an evil of the capitalist health care system we have in the USA.
If you are treated by a doctor and he fails to make you better, you still have to pay him. If your surgery fails, or even if you die because the doctor's procedure was unsuccessful, the doctor still expects to be paid and will go after your estate to collect. There is something very, very wrong with this.
The focus of doctors should be maintaining your wellness, not treating you after you get sick from poor diet and lifestyle choices. Just like a car, which lasts longer and runs better when it is properly maintained through preventive and scheduled maintenance programs, the body benefits from regular upkeep. I would much rather pay for scheduled wellness exams than deal with the stress and anxiety of getting sick. Treatment programs would still be needed in such a system, but integrating the current system with wellness programs would benefit everyone and reduce the extreme costs of the present one.
Just as annual or biannual dental checkups are a necessity for maintaining healthy teeth for a lifetime, the same principle applies to our bodies.
** Crossposted on Backroads of my Mind
1 comment:
That would be socialism, according to the Republicans. Free market is what everything should be based on -- even our broken health care system.