In the 1800s, women's rights changed tremendously. At the beginning of the century, women had few legal rights and were expected to be housewives, destined to raise children. Over the course of the 1800s, the roles and rights of women shifted as new opportunities for freedom emerged. Women took advantage of these opportunities and redefined their role in America.