Which Is True? The FDA Wants to Keep Us Sick So They Can Make a Fortune From Drug Companies and Doctors?
Question by Scott: Which is true? That the FDA wants to keep us sick so they can make a fortune from drug companies and doctors? Or not?
I can show you a hundred websites that say the FDA, Big Pharma, and medical doctors are conspiring against the American people to keep us sick and make a fortune from us and our insurance companies.
I can also show you a hundred websites that say all of the above is a lie and there is no conspiracy.
Some will call a man an idiot for believing such nonsense; others will say he’s naive for thinking that doctors truly want to cure people.
My question is this: Which one is true? It has to be one or the other, so don’t say neither or both. One must be correct, and there must be proof. Let’s put this to rest once and for all. It’s not religion we’re talking about here, so we should be able to solve it.
Best answer:
Answer by flyguy09
The vast majority of doctors are compassionate people who want to provide the best care for their patients. Getting into medical school is extremely competitive, and it’s even more demanding once you’re in. On top of that, many med students are $200,000 in debt by the time they graduate, and during their internship they get paid about as much as a waiter. Keeping this in mind, it would be really stupid to go into medicine for the money, especially when you can make so much more in other professions. I would say that 90% of doctors went into medicine because they are genuinely passionate about helping people. So no, doctors don’t purposely give you treatments that don’t work just so you keep coming back.
With that being said, pharmaceutical companies definitely influence health care to a large degree. Somebody has to fund clinical trials, and that somebody is usually a pharmaceutical company. They’re not going to keep funding a clinical trial, or publicize its results, if it shows that their drug doesn’t work. Also, drug company representatives try to entice doctors to prescribe their drugs more often by giving them free lunches, nice pens, free vacations, and so on. So yes, drug companies are definitely there to make money, and they do influence the way health care is delivered in the United States. I don’t think that’s really a conspiracy, because it’s out in the open (e.g., you will probably see drug company reps with their suits and briefcases at your doctor’s office).
That doesn’t mean that so-called “holistic” practitioners are saints, though. Look at the individuals who make these claims against the FDA and doctors: they themselves are just trying to make money. They’ll write a web page or newsletter convincing you that everyone is out to get you, and that the only way to save yourself is to buy their herbal blend or supplement.
So in conclusion, if I had to choose whom to believe, I would choose my physician. If you feel like your physician is a jerk who is out to get your money, find another doctor. Like I said, the vast majority of doctors are good people.
Add your own answer in the comments!