One of the most important social factors that has shaped professional nursing in the U.S. is the changing role and social status of women. Historically, nursing was regarded as an extension of a woman's everyday domestic duties, and nurses often had little or no formal training or professional designation (Leddy, 2022).
As women's civil and political rights advanced through movements such as suffrage and feminism, nursing changed as well. With the emergence of professional nursing schools at the end of the 19th century and the beginning of the 20th, nursing became a respectable career choice for women (Leddy, 2022). Nurses' contributions during World War II further elevated their status as they demonstrated their ability and commitment in healthcare settings (Lloyd, 2023).
As women’s rights and opportunities increased, nursing became a profession with standardized training, certifications, and professional associations (Lloyd, 2023).
Today, nursing remains a predominantly female profession, and advances in gender equality have had a major impact on shaping the modern nursing workforce in the United States, with a focus on education, specialization, and professionalism (Leddy, 2022).