When Did Unions Start In America?
Sustained trade union organizing among American workers began in 1794 with the establishment of the first trade union. Until after World War II, discrimination in unions was common and kept Black workers, women, and immigrants out of higher-skilled and higher-paid jobs. How did unions start in America? Unions began forming in the