Is a Female-Dominated Workforce a Positive Development?
Depending on how the workforce is measured, women have either become the majority of workers in the United States or are right on the cusp of becoming so.
According to a new Allstate-National Journal survey, most Americans think this is a positive development.