I have a hard time believing doctors will be replaced in large numbers. The nature of their jobs may shift away from hypothesis generation, but ultimately a human will be making the final diagnosis, choosing from several options that have been vetted algorithmically. Legally and practically, a human needs to arbitrate that process; people would feel uncomfortable otherwise.
Why? What people don't understand about automation is that it doesn't need to be perfect - just good enough.
Doctors make mistakes all the time. If you replace doctors' diagnostic duties with, say, IBM Watson MD, then as long as it's no worse than your average doctor, it won't matter.
The big open question in automation, whether in transportation or medicine, is liability, but that's something insurance companies can sort out easily between themselves :)
Yeah, in a very limited set of circumstances, and still with a pretty significant miss rate.
But the day will come when the machines are near-perfect.
And people already trust computers more than they did in the 70s. Perhaps not if you ask them directly, but the reality is that they do, in many areas of their lives, and without a second thought.