I saw an interesting show on TV yesterday where this woman, who is a doctor, had cancer and didn't want to go through chemo or anything, so she significantly changed her diet and lifestyle and the cancer went away... I know that's prolly an extreme case, but it really made me think... I'm a strong believer that there are usually natural ways to cure or treat almost everything, but the big drug companies, and therefore the doctors and government, don't want people to know that so they don't lose money. But of course, many "medicines" actually cause problems or only mask them. How do you feel about this whole issue? Which do you think is generally better?
I don't know if I agree with you, Marc... I don't necessarily consider putting chemicals into your body NATURAL.
I didn't mean that, Trish. Just that natural remedies were their source. So why not consider going back to the source?
I don't trust Western remedies anymore. My doctors lied to me for their own gain, so anything any doctor tries to give to you, throw it back in their face. I am convinced that doctors are evil; I don't trust them and neither should anybody else.
Personally, I believe the mind itself can cure most things... And natural substances are far preferable to chemicals...
Natural is bullshit. A chemical is a chemical, whether it comes from a plant or a lab. Also, natural remedies tend to turn one purple.
LMAO, no, but for real, Trish, you need to IM me. I will tell you my story and explain more thoroughly why doctors are evil... In case you don't remember, it's Mr D here, I think you have my address.
Alright... um, I don't know if I do. On MSN I'm dnbgurl22@msn.com (would like to hear your story, but I know doctors are corrupt). Oh, btw! Where the hell have you been??
I've been out and about, but ya, I have you on my list, but you're not showing up as online. Check if you blocked me or not.