Why Women Paint Their Nails ...

By Leiann


Have you ever been asked why women paint their nails? Well-manicured nails are an outward symbol of health, femininity, and social status. Dirty or brittle nails may suggest you are not taking care of yourself or are sick.

When it comes to femininity and social status, wealthier women, who would not normally get their hands dirty with housecleaning, cooking, or other manual labor, were expected to have well-manicured nails.

The following YouTube video by HowStuffWorks, published on September 8, 2014, explains these little bits of nail polish history further and tells you why women paint their nails.

Beginning in the 20th century, nail polish became a beauty staple for women. Its reasons are not as fun and fashionable as you might expect.

Throughout the years, women have always set aside enough money for that little bottle of luxury. A little bit of color can give a woman a positive outlook.

Personally, I think well-manicured toenails are important. A woman can have the ugliest feet, but put on a splash of color and they look 100% better.

Whether you are wealthy or poor, you can always afford a nice little bottle of luxury, whether it is by Sally Hansen or Wet n Wild, and you can save money by doing it yourself.
