Medicine is still a male-dominated field. The United States has nearly twice as many male doctors as female doctors, according to the Henry J. Kaiser Family Foundation, and the ratio is even more skewed in leadership positions. "More than half of medical-school students are women, but we're nowhere near that at the chair level or the…