What is Scarring?

Scarring is a natural part of the healing process that follows an injury or, in some cases, an illness. All wounds leave a scar, although some scars are far more noticeable than others.

What Happens to the Skin During Scarring

Scars are the body's natural response to wounds. To close the wound and protect the body, new tissue forms beneath the scab. This new tissue is fibrous and differs in texture from the surrounding skin: it tends to be darker, more vulnerable to UV rays and further injury, and it contains no hair follicles or sweat glands.

When to Worry about Scarring

Once a scar has formed, it is generally nothing to worry about, although you may be unhappy with its cosmetic appearance. During the healing process, however, you may need medical attention if you suspect an infection; the signs include redness, pain and swelling around the wound. Once the wound has scarred over, the risk of infection has largely passed.

If you're concerned about scarring, some home remedies and over-the-counter ointments and creams may help diminish the appearance of your scars. You can also make an appointment with your physician, dermatologist or cosmetic specialist for more intensive scar reduction treatments.