
Do Black Women Really Have To Bleach Their Skin?

by caribdirect

For years, Black women in Africa and the Caribbean have practiced skin bleaching. We at CaribDirect have published two articles on this controversial subject, one in 2011 (https://caribdirect.com/skin-bleaching-the-need-to-belong/) and the other in 2016 (https://caribdirect.com/dancehall-artist-khago-admits-bleaching-his-skin/).

The reason still seems to be the same: the pay-off comes in the form of job security, progress, and power. Skin bleaching is therefore a business decision. Lighter skin is felt to mean a much better chance of landing a better-paid job, especially in fields such as sales and marketing.

Do you think this is happening for the reasons stated above, or is it happening because some of our women have little or no self-esteem and cannot stand up to white employers? What do you think…?

caribdirect

We provide news and information for anyone interested in the Caribbean, whether you're based in the UK, Europe, or the Caribbean itself. Fresh ideas are always welcome, with opportunities for bright writers.

