The modern system of workers’ compensation is so complex and arcane that it causes considerable grief for those who must deal with it on a daily basis. Yet these often cumbersome regulations are ultimately so vital to society that they appear, in one form or another, in every industrialized nation. A look at workers’ law over the years demonstrates the failure of the historical alternatives to formal workers’ compensation systems to meet the goals of either social justice or economic efficiency. While the orthopaedic surgeon may often lament the difficult compensation case appearing in clinic, it may add some perspective to review how and why this system became entrenched in the workplace.

Workers’ Compensation in Antiquity

The history of compensation for bodily injury begins shortly after the advent of written history itself [1]. Nippur Tablet No. 3191 from ancient Sumeria in the Fertile Crescent outlines the law of Ur-Nammu, king of the city-state of Ur. It dates to approximately 2050 B.C. [2]. The law of Ur provided monetary compensation for specific injuries to workers’ body parts, including fractures. The Code of Hammurabi, from 1750 B.C., provided a similar set of awards for specific injuries and their implied permanent impairments. Ancient Greek, Roman, Arab, and Chinese law provided sets of compensation schedules, with precise payments for the loss of a body part. For example, under ancient Arab law, the loss of a joint of the thumb was worth one-half the value of a finger. The loss of a penis was compensated according to the amount of length lost, and the value of an ear was based on its surface area [3]. All the early compensation schemes consisted of “schedules” such as this: specific injuries determined specific awards. The concept of an “impairment” (the loss of function of a body part) as distinct from a “disability” (the loss of the ability to perform specific tasks or jobs) had not yet arisen.

Yet the compensation schedules of antiquity were gradually abandoned as feudalism became the primary structure of government in the Middle Ages. The often arbitrary benevolence of the feudal lord determined which injuries, if any, garnered recompense. The concept of compensation for the worker was bound up in the doctrine of noblesse oblige: an honorable lord would care for his injured serf.

Common Law and the Early Industrial Revolution

The development of English common law in the late Middle Ages and the Renaissance provided a legal framework that persisted into the early Industrial Revolution across Europe and America. Three critical principles gradually developed that determined which injuries were compensable. They were generally so restrictive that they became known as the “unholy trinity of defenses.”

1. Contributory negligence.

If the worker was in any way responsible for his injury, the doctrine of contributory negligence held that the employer was not at fault. No matter how hazardous the exposed machinery of the day, a worker who slipped and lost an arm or leg was entitled to no compensation. The doctrine was established in the United States through the case of Martin v. the Wabash Railroad, in which a freight conductor fell off his train. Although inspectors subsequently blamed a loose handrail, he received no compensation because inspecting the train for faulty equipment was one of his job duties.

2. The “fellow servant” rule.

Under the “fellow servant” rule, employers were not held liable if a worker’s injuries resulted in any part from the action or negligence of a fellow employee. The rule was established in Britain through Priestley v. Fowler in 1837, the case of an injured butcher’s boy. In America, precedent was provided five years later by Farwell v. The Boston and Worcester Railroad Company.

3. The “assumption of risk.”

The doctrine of “assumption of risk” was exceptionally far-reaching. It held, simply, that employees know the hazards of any particular job when they sign their contracts; therefore, by agreeing to work in a position, they assume any inherent risk it carries. Employers were required to provide only such safety measures as were considered appropriate in the industry as a whole, which in the nineteenth century often left a great deal to be desired. Assumption of risk was often formalized at the beginning of an employee’s tenure: many industries required contracts in which workers abdicated their right to sue for injury. These became known as the “worker’s right to die,” or “death contracts.”

While these common law principles were quite restrictive, it was their method of enforcement that proved most cumbersome. An injured worker’s only recourse was through the tort system. In the nineteenth century, as in our own, these were exceptionally expensive legal affairs: most countries required considerable fees simply to file a personal injury lawsuit, fees more often than not beyond the limited means of the injured worker. It was so uncommon for a working man to win compensation for injury that private organizations such as the English “Friendly Societies” and the German “Krankenkassen” were formed, offering more affluent laborers the option of buying various kinds of disability insurance [5]. Nevertheless, the worker did occasionally prevail through tort litigation. As the century wore on, this began to happen frequently enough that employers too became uncomfortable with the capricious nature and high cost of battling civil suits.

To read the full article, please visit NCBI
