The Road to Worker Protections
For most injured workers, a complete recovery and the ability to return to work are only possible because of the medical and health care services provided by Workers’ Compensation insurance. However, the concept of healthcare support delivered through an employment setting and applied to on-the-job injuries is a recent one, and its premise and principles are designed to set aside centuries of unfair and often exploitative labor practices.
A Brief History of Labor Practices
To understand how and why America’s worker protection systems came into being, you first need to know about the European foundations that define their history. Europe’s culture around work habits and practices evolved significantly from the Middle Ages to modern times, and changes in economics, industries, and social norms caused equally significant changes in how, where, and why people worked.
Those changes in ‘employment’ thought patterns traveled with New World explorers to North America, where they, too, evolved to reflect the cultures in which they existed. Without a doubt, that evolution benefitted the workers, saving them from horrendous working conditions and poverty-level wages. But it also helped the companies that hired workers and the communities in which they worked. And each step of that evolution adds nuance to our understanding of the balance that now exists between workers, their employers, and the communities that benefit from a safe and productive industrial complex.
From Hell to Haven
The world’s workers have never had it easier than they do today. National laws governing safety and health standards ensure that employees aren’t required to risk their lives to make a living. Those standards, however, rest on the backs of the untold number of sick, injured, and dead workers that toiled without those protections in place.
Early Work Conditions
Before the Victorian era (1837–1901), working conditions were unregulated, and ‘employers’ were often more ‘owners’ than job providers. Lower-class citizens traded labor for food or lodging and were expected to work seven days a week for as many as 18 hours a day. Safety standards on the job were also unheard of, regardless of the location or type of work. Domestic workers were exposed to the smoke and flames of wood-burning stoves; field hands got no reprieve from the weather; and miners toiled deep underground with neither fresh-air ventilation nor protection from the toxic and caustic air they were breathing. Premature deaths were common, as were deaths caused by accidents and disasters. Children were not spared either; because their small size made them ideal candidates for many tasks, children were forced into work as early as age three and often died young from overwork and dangerous worksite conditions.
The Victorians improved the situation, but only slightly. While safety precautions were still nonexistent, workers during this period were finally given a day off per week, although they continued to work 14 to 16 hours a day during the other six.
The Industrial Revolution of the mid-19th Century, however, caused three major shifts in workplace cultures:
- it automated many tasks that were previously jobs for people;
- it provided new forms of work in factories, and
- it required a new kind of compensation since factory workers couldn’t live in the factory nor consume the products it produced.
All three shifts posed challenges that, eventually, provided the opportunity for more positive changes in the world’s work environments:
- Despite the increased productivity, the factories weren’t safe to work in, as many newly invented machines lacked safety features and were poorly manufactured.
- Workers were usually unskilled, and training was scarce if it happened at all. Debilitating and fatal injuries were common, and workers and their families were left destitute if/when the breadwinners were unable to return to their jobs.
- Wages, too, caused problems. Employers focused on profits, not on caring for their staff, and those profits rose when employment costs were low. Men were paid the most, women were often paid only half a man’s wage, and children were paid even less.
And working conditions remained terrible. There was rarely fresh-air ventilation, so employees inhaled the smoke and ash given off by the machines. Short lunch and dinner breaks led to exhaustion, which, in turn, led to accidents. Children suffered the most, and many were sickened or deformed by poor nutrition, toxic air, and the lack of exposure to sunlight. In fact, it was the harsh working conditions endured by children that contributed to the development of unions.
The Rise of Unions
The ‘Union’ – an organization of workers – first appeared in the early 1700s, as workers grew frustrated with low wages and poor working conditions. Workers came to realize that, by uniting as a single ‘workforce,’ they could gain the leverage needed to improve their situation, because the law offered an individual worker little or no protection from an overly oppressive boss. Although lawsuits were often filed over unhealthy work conditions, employers were just as often let off the liability hook by one of three defenses that were popular in those days:
1) The employee was negligent and caused their own injury;
2) Another employee was at fault and caused the injury, or
3) The injured worker knew it was a hazardous job and ‘assumed that risk’ by accepting the position.
As a unified force, however, a union could overcome those defenses by bringing different tactics to the fray. From their perspective, without their combined effort, the factories couldn’t produce, and employers would lose their customers.
Initially, these groups were made up of skilled workers – tradesmen whose productivity couldn’t be readily replicated by someone else. They fought for shorter hours, better pay, and safer working conditions to offset the toll taken on their health by exhaustion and poverty-level nutrition. The circumstances of child workers were also a common complaint, and many unions stopped work to force factory owners to stop using children as laborers.
By the mid-1800s, Unions had become common on both sides of the Atlantic, and the movement grew significantly as the new 20th century approached. Strikes – work cessations – became a favored method of forcing change and grew in both size and frequency. While localized strikes often involved only local workers, the Union movement itself was attracting growing numbers of laborers across regions and even countries. In 1892, 261 strikes in France engaged only about 50,000 workers, but by 1914, just 22 years later, 1,309 strikes took 438,000 workers offline. In Britain between 1909 and 1913, more than 2,000,000 workers refused to work due to adverse work conditions.
The American colonies were not immune to the influence of striking European workforces. Even before the Revolution, in 1768, a group of New York tailors went on strike to protest a wage reduction, and when the Federal Society of Journeymen Cordwainers, a union of shoemakers, formed in 1794, the American labor movement was launched. The trade unions fought back against British-based business practices by publishing their prices to demonstrate their value and prevent their replacement by cheaper, unskilled laborers. The Unions grew quickly, attracting ever-larger memberships as cities expanded and national communications systems developed.
Compensation for Injuries
Although the modern rules around compensating an injured worker are new, the concept itself has been around since approximately 2050 B.C. That “Law of Ur” provided financial compensation for injuries to specific body parts, including bone fractures. Over the next four thousand years, other cultures adopted the practice, although each was unique in the value it set on the particular body part that was harmed (the loss of a thumb was worth twice the compensation of the loss of a finger, according to the ancient Arabs).
It wasn’t until 1871, however, when German Chancellor Otto von Bismarck enacted the “Employers’ Liability Law,” that selected classes of workers finally received a reliable avenue of compensation for their on-the-job injuries. Von Bismarck wasn’t addressing workers’ demands out of personal concern for their well-being, however. Instead, he used the protections to secure the support of the general population, which was, at the time, leaning towards the Marxist and socialist movements. By suppressing that movement’s leadership while adopting its most popular ideas, von Bismarck maintained a steady hold over the nation.
The new law didn’t hold all employers liable for all work-related injuries, as its name might suggest. It mandated that employers pay into a fund that would, in turn, pay out to injured workers the money they needed to get medical care for their injuries. The original 1871 law applied only to injured laborers in the quarries, mines, railroads, and certain factories, but didn’t cover other industries or other worker concerns. Eventually, this disparity drove von Bismarck to extend the protections by adding the publicly funded Workers’ Accident Insurance provisions in 1884, then later the Public Pension Insurance (for those rendered disabled by a job injury) and Public Aid funding, to ensure long-term support for disabled former workers.
Perhaps the most significant action taken by von Bismarck, however, was to establish this state-administered ‘injured worker protection system’ as the ‘exclusive remedy’ for injuries that occurred on the job. Although injured workers could no longer sue their employers for compensation, they were no longer left unemployable or destitute, either.
Also, not insignificantly, von Bismarck’s introduction of the state as a player in the realm of workers’ compensation management set the stage for the involvement and oversight of America’s modern CMS in today’s workers’ compensation industry.
That discussion follows next month… See you then!