The Changing Role of Registered Nurses

Long before the 1870s, when the first nursing schools were founded in the United States, nursing care had been practiced as both a formal and an informal vocation for thousands of years. The Civil War solidified the nation's need for a national system of nursing education and standards of care; since then, the healthcare industry has modernized and diversified, and so has the profession of nursing. Since the days of musket rounds and battle cannons, nursing has evolved into an essential…