Medieval medicine and computers
November 8, 2010

This is a bit of a provocative post, and its impressions (I dare not promote them to the level of conclusions) should be taken with the amount of salt found in a McDonald’s Happy Meal. Essentially, I was doing some reading about medieval medicine and was struck by some of the similarities between it and computer engineering, which I attempt to describe below.
Division between computer scientists and software engineers. The division between those who studied medicine, the physics (a name derived from physica, or natural science), and those who practiced medicine, the empirics, was extremely pronounced. There was a mutual distrust—Petrarch wrote, “I have never believed in doctors nor ever will” (Porter 169)—that stemmed in part from the social division between the physics and empirics. Physics would have obtained a doctorate from a university, having studied one of the three highest faculties possible (the others being theology and law), and tended to be among the upper strata of society. In fact, the actual art of medicine was not considered “worthy of study,” though the study of natural science was (Cook 407).
The division between computer scientists and software engineers is not as bad as the corresponding division in the 1500s, but there is a definite social separation (computer scientists work in academia, software engineers work in industry) and a communication gap between the two communities. In many other fields, a PhD is required to even be considered for a job; here, we see high school students starting up software companies (occasionally successfully) all the time, and if programming Reddit is any indication, there is a certain distaste for purely theoretical computer scientists.
The unsuitability of pure computer science for the real world. Though the study of pure medicine was highly regarded during this time, its theories and knowledge were tremendously off the mark. At the start of the 1500s the four humors of Hippocratic medicine were still widely believed in: to be struck by disease was to have an imbalance of black bile, yellow bile, phlegm and blood, and thus the cure was to apply the counteracting humor, a theory that justified such techniques as bloodletting (the practice of purposely bleeding a person). Even the fundamentals of how the human body worked were ill understood: it was not until William Harvey and his De motu cordis et sanguinis (1628) that the earlier view, in which food was concocted in the organs and then flowed outwards via the veins, arteries and nerves to the rest of the body, was challenged by the notion of the heart as a pump (Cook 426). If blood really circulated, what did the other organs do? Harvey’s theory completely overturned the existing understanding of how the human body worked, and it was quite controversial.
I have enough faith in computer science that I don’t think most of our existing knowledge is fundamentally wrong. But I also think we know tremendously little about the actual nature of computation, even at middling sizes, and that is quite humbling. Still, I am fundamentally optimistic about the future of computer science in dealing with large systems—more on this at the end.
Testing instead of formal methods. The lack of knowledge, however, did not stop the physicians (as distinct from physics) from practicing their medicine. Even the academics recognized the importance of “medieval practica; handbooks listing disorders from head to toe with a description of symptoms and treatment” (Porter 172). The observational (Hippocratic) philosophy continued to hold great sway: Thomas Sydenham, when asked about dissection, stated, “Anatomy—Botany—Nonsense! Sir, I know an old woman in Covent Garden who understands botany better, and as for anatomy, my butcher can dissect a joint full and well; now, young man, all that is stuff; you must go to the bedside, it is there alone you can learn disease.” (Porter 229)
In the absence of a convincing theoretical framework, empiricism rules. The way to gain knowledge is to craft experiments, conduct observations, and act accordingly. If a piece of code is buggy, how do you fix it? You add debug statements and observe the output; you do not construct a formal semantics and then prove the relevant properties. The day the latter becomes the preferred approach is the day practitioners of formal methods across the world will rejoice.
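As a minimal, purely illustrative sketch of that empirical style (the average function and its bug are invented for this post, not taken from any real codebase), this is how the diagnosis usually goes in practice: not a proof about the loop, but a debug print and a look at what it says.

```python
# Hypothetical buggy function, debugged the empirical way:
# add a print, observe the output, adjust. (Python 3)

def average(xs):
    total = 0
    for i in range(len(xs) - 1):  # bug: silently drops the last element
        total += xs[i]
    # The "experiment": observe how much was actually summed.
    print(f"debug: summed {i + 1} of {len(xs)} elements, total={total}")
    return total / len(xs)

if __name__ == "__main__":
    print(average([1, 2, 3, 4]))  # prints 1.5 instead of the expected 2.5;
                                  # the debug line shows only 3 of 4 elements were summed
```

No properties were proven; we ran the program, looked at the evidence, and now know where to fix it.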
No silver bullet. In the absence of reliable medical theories from the physics, quackery flourished in the eighteenth century, a century often dubbed the “Golden Age of Quackery” (Porter 284). There was no need to explain why your wares worked: one simply needed to give a good show (“drawing first a crowd and then perhaps some teeth, both to the accompaniment of drums and trumpets” (Porter 285)), sell a few dozen bottles of your cure, and then move on to the next town. These “medicines” claimed to do anything from curing cancer to restoring youth. While some of the quacks were merely charlatans, others earnestly believed in the efficacy of their treatments, and occasionally a quack remedy was actually effective.
I think the modern analogue to quackery is software methodologies in all shapes and sizes. Like quack medicines, some of these may be effective, but in the absence of scientific explanations we can only watch the show, buy in, and see if it works. And we can’t hope for any better until our underlying theories are better developed.
The future. Modern medical science was eventually saved, though not before years of inadequate theories and quackery had brought it to a state of tremendous disrepute. The complexity of the craft had to be legitimized, but this would not happen until a powerful scientific revolution built the foundations of modern medical practice.
Works referenced:
- Porter, Roy. The Greatest Benefit to Mankind. Fontana Press, 1999.
- Park, Katherine; Daston, Lorraine (eds.). The Cambridge History of Science: Volume 3, Early Modern Science. Cambridge University Press, 2006.
I don’t think it’s all that different either. Good analogy. :)
CS is great from the HW to the base-OS and programming libraries, and then it loses its footing as the complexity of state sends the variables spiraling out of control.
You’re not alone. Frederick Brooks writes in the preface to the 20th anniversary edition of The Mythical Man-Month:
“In preparing my retrospective and update of The Mythical Man-Month, I was struck by how few of the propositions asserted in it have been critiqued, proven, or disproven by ongoing software engineering research and experience.”
It feels like the majority of software development principles/practices/etc. are based purely on opinion, and maybe some experience if you’re lucky, but never based on science. It’s completely normal to make bold claims – such as “language X is better than language Y” or “NoSQL is better than relational databases” – evidenced only by opinion and anecdotes. Can medical researchers make such bold claims about drugs, supported only by opinion and anecdotes? It’s hard to discern the truth when there is so little data collected under reasonable experimental conditions.
What is our equivalent of a double-blind placebo-controlled study? Is such a study possible in our discipline? I’m no scientist, but it seems like software development is a difficult thing to run experiments on.
You reference a ‘Cook’ who does not appear in your list of works referenced. Would that be Hal Cook?
Despite the rapid acceptance of the circulation, Harvey’s theory of generation never caught on, from which we might conclude that every new idea from a software guru has to be judged on its own merits, no matter how often s/he was right before.
Most developers left computer science behind at university (if they even went). We need to see some shining examples of how computer science is going to help us improve reliability, productivity, scalability, etc.
Remember how all that CS research went into AI and spectacularly failed to deliver very much - I thought computers would be programming themselves by now?
A short time ago, a friend (and also one of the best programmers I know) told me that, contrary to what almost everybody thinks, our profession is more a craft than an engineering discipline. And we realized that this is one of the big causes (not justifications) of why it is so hard to plan and release projects on time.
Our development ‘tools’ and frameworks only add to this mess when the code doesn’t do what we expect it to, the documentation and samples are pretty sparse (if they exist at all), and after hours (or sometimes days) of trial and error, and prayers to Saint Google, we find that our code is right, but it needs a fix (or an entire update or service pack) installed to work properly.
Recently I discovered, after a lot of tuning on the servers, how to improve the performance of a slow web app by up to 1,000%: use Chrome instead of IE on the client side. In the name of Saint Bit, tell me how we can apply a scientific method instead of the empirical way to solve this kind of situation?
If we are in the same situation that medicine was in five centuries ago, it seems that we are doomed to be empirical artisans for another five centuries.
Thought-provoking comparison. Can software engineers learn much from medicine’s long history? Not until Vesalius and Harvey did practitioners begin to influence medical theorists. Only after old theories were debunked could new discoveries lead to new theories that could improve practice… tenured CS profs take note. Of course, we now live in very different times ;)
I agree that software engineering practice in industry is pretty awful, commercial software quality is abominable, and huge amounts of money and effort are spent on quality improvement initiatives that don’t improve quality. Commercial software writers can certainly use some guidance. But consider what drives innovations like the personal computer, mobile computing and social networks. These arose from commercial demand, not the university lab.
OTOH, the OOA/OOP paradigm shift shows that academics in endowed chairs have something to contribute. And the revolution in bioinformatics demands software that can pass scientific peer review. Informatics is not doomed–rather, we’re just beginning to glimpse some potential benefits. Maybe the lesson to be learned is that practitioners and academics can and should both learn from one another.
The responses to this post so far are indicative that you’ve struck a chord. I think your best comparison is between medicine quacks and programming frameworks/methodologies. It seems like every week there is some new methodology that everyone must be crazy not to implement because “look at how successful it was for Company X!”. I don’t buy most of these fads and it seems too often I’m saying “yep, saw that coming” when the fads are later shot to pieces (director’s cut of Robocop style) in public arenas like Slashdot and Reddit.
Thank you for the thought-provoking impressions and thank you for looking closely at history for guidance in today’s crazy world of programming - I hope more people are taking notes.