A scar is a pinkish or brown patch of skin that forms where a wound or sore has healed. Scars are areas of fibrous tissue that replace normal skin after part of the dermis has been destroyed; they are the skin's natural way of repairing itself after injury. Most people have scars.