
Women's roles in nineteenth-century America

The 19th century is often called the "Woman's Century," a period of remarkable change and progress for American women, marked by great leaps forward in women's legal status, their entrance into higher education and the professions, and their roles in public life. This book examines the factors that shaped women's roles in the United States.
Print book, English
Greenwood Press, Westport, CT, ©2007