I have a feeling this post is not going to be very eloquently written, but can I just rant for a moment? The health care decision yesterday frustrated me, as I'm sure it did a lot of you. But even aside from that, health care has angered me lately.
Isn't it true that "back in the day," doctors were truly there to help people? It wasn't about money; it was about helping the sick and keeping people healthy. Doctors didn't make much money. They truly cared and genuinely wanted to help. I'm not saying that doctors today don't want to help people, but really, think about it...
It is ridiculous how much it costs to go to a doctor these days, and how much it costs for health insurance. And, I'm sorry, but if someone is sick and doesn't have money to pay...help them anyway! A person's health is more important than the money someone will make off of a doctor's visit! Doctors' visits, health insurance, prescriptions...they ALL. COST. TOO. MUCH. If doctors and drug companies and health insurance companies would get off their high horses, reach out to people, forget about a large salary, and focus on helping others...how wonderful that would be.
I told you this would be a rant. Personally I think health insurance is dumb. If someone is sick, HELP them. Greed annoys me.
Thanks for listening! Your thoughts?