Definition - What does Workers' Compensation mean?
Workers' compensation is a legally mandated insurance that employers must carry for employees who get injured on the job. Workers' compensation laws are designed with employees in mind: they hold employers liable for any injury that an employee suffers while working, and they require the employer to provide adequate monetary compensation to an injured employee without any litigation.
Justipedia explains Workers' Compensation
Many jurisdictions around the world use workers' compensation laws to safeguard the interests of employees who get injured while working, because without such laws some employers might walk away from their responsibility toward those employees. In some jurisdictions, workers' compensation benefits also cover the dependents of an injured or deceased employee. In the United States, the Federal Employees' Compensation Act provides benefits to non-military federal employees who get injured on the job, and most states have their own laws that cover employees of non-government entities. In most states, an injured employee can receive workers' compensation benefits regardless of who was at fault, whether it was the employee, the employer, a customer, a coworker, or any other third party.