The feminization of the workplace refers to the shift in gender and sex roles through the incorporation of women into groups or professions once dominated by men. It is a set of social theories seeking to explain occupational gender-related discrepancies.