For years, Black women in Africa and the Caribbean have practised bleaching their skin. We at CaribDirect have published two articles on this controversial subject, one in 2011 (https://caribdirect.com/skin-bleaching-the-need-to-belong/) and the other in 2016 (https://caribdirect.com/dancehall-artist-khago-admits-bleaching-his-skin/).
The reason still seems to be the same: the pay-off comes in the form of job security, progress, and power. Skin bleaching is therefore a business decision. It is felt that the appearance of lighter skin means much better chances of landing a better-paid job, and this is especially common in the fields of sales and marketing.
Do you think this is happening for the reasons stated above, or is it happening because some of our women have low self-esteem and cannot stand up to white employers? What do you think…?