History of Medicine in America

Posted on October 10, 2015

Western medicine, and US medicine in particular, has changed greatly since the early days of America. Medicine started out as a trade, much like barrel making or carpentry. Just as young boys and young men were indentured to masters, aspiring doctors would attend a limited course of schooling and then be "interned" to a well-established physician. It was during these internships that much of the art and practice of medicine was learned. The link below is quite lengthy but vastly interesting.

The foundation for this video is the book "Murder by Injection," a history of the AMA by Eustace Mullins. I hope that this clip is both enjoyable and enlightening. Ultimately, my hope is that you will think and ask questions about what is going on with your body and what treatments are being recommended, and that you will become an educated consumer in the health care market.