
What to Tell Your Doctor About a Workplace Injury

Workplace injuries are typically unexpected and can have devastating consequences for workers and their families. Some workers try to push through the pain, while others must seek treatment from a doctor. Those who need medical care may wonder whether they should tell the physician how the injury occurred.

The answer is yes.

Those injured on the job should tell their doctor about the injury as soon as possible. Many workplace injuries heal with proper medical care; left untreated or treated improperly, the same injuries can lead to long-term health problems.

When seeing the doctor, it is best to be honest about how the injury occurred. Individuals should be as specific as possible when describing how the injury happened. If someone is unsure of how the injury occurred, the doctor will need to know this as well.

Individuals should also tell their doctor what type of pain or discomfort they are experiencing, describing it in as much detail as possible. The doctor will need to know where the pain is located and how severe it is.

It is also critical to tell the doctor about other symptoms, such as swelling or bruising. These symptoms help the doctor determine the severity of the injury and whether further treatment is necessary.

By providing a doctor with the necessary information, individuals can ensure that they receive the proper treatment for their injury.

Injured While on the Job?

If you were seriously injured at work or your employer is not following the workers' compensation laws, you should contact an experienced workers' comp attorney. The dedicated team at The Sexton Law Firm can help you get the benefits you are entitled to under the law. Read what our past clients say about their results and contact us so we can get started working with you.