Well, if you haven't seen this yet, it's not for lack of me telling you about it, that's for sure.
Again, I wish we'd had more time to talk about an issue that clearly could have filled up the whole hour and then some, and I don't think I've ever spoken in public at any point in my life and not wished afterwards that I'd been more eloquent, but...it was fun. I hope it sparks a lot of discussion, and I hope that people outside of medicine can try to empathize, even just a little bit, with the bigger issues at play.
Since I've started writing outside this blog a little more (see, most recently, my stint writing for Psychology Today, where my motto has pretty much been go big or go home, at least in terms of picking highly polarizing topics), I've been struck by how much people seem to really dislike doctors. As a modern medical trainee, I never grew up thinking that I would be loved or revered as a doctor--I just wanted to do my job and take care of people, and if they wanted to thank me for it, fine, I appreciate being recognized, but the approbation was beside the point.
Still, I'm really taken aback by some comments I've read, not just in response to my piece, but in response to doctors and their choices in general, and they make me wonder how this sentiment of generalized antipathy toward the medical profession, and doctors in particular, has evolved. We've all gotten used to hearing blanket statements about other professions ("All politicians are crooks" is probably the most common), and I have to say, some of the blanket statements I hear lobbed back and forth about our profession (that we're lazy, that we're arrogant, that we don't care about people, that all we want to do is earn a buck or a million of them) seem to be treading awfully close to that territory. How did this happen? Does it stem from specific bad experiences that patients have had with particular doctors, or is there something else, something more pervasive and societal, that has not only knocked doctors off their historic pedestal, but put us in the same box as scoundrels and opportunists?
I have not seen this in my particular practice, I should say. I like to think I have a good rapport with my patients, and while I know everyone thinks it's easy for anesthesiologists ("you just put them to sleep"), the fact is that we have to inspire the most trust in the least amount of time, so patient relations are key. I also want to underline that I don't particularly want to be put up on a pedestal (it's really only in the past few years that I've even become comfortable being addressed with the honorific--if people call me "Michelle," I'm generally much more at ease), and regardless of how society views doctors, it's still an honor and a privilege to serve our patients. But there's definitely been a change in the role of doctors in society, and as much as I've been talking about the changing face and culture within medicine, perhaps there's just as much of a change in how people outside of it relate to us.
Why do people hate doctors? I mean, I kind of hate my dentist, but you know, not really--I just hate the reasons I have to go see him. But he's a good guy, doing his job, and I don't think he's a criminal or anything. Where does personal experience bleed into societal expectation? At what point did Norman Rockwell's country doctor become Alec Baldwin in "Malice"?
Thoughts?