CommonSpirit in Chicago is launching a new "digital consumer experience," building on last year's strong results in quality of care, safety, and patient satisfaction.
According to their second-quarter financial report, they're "delivering on customer needs in significantly less time than the national average, and most (92%) patient needs are resolved within the first phone call, also ahead of industry averages." The report also notes that programs similar to the one being implemented in Chicago have yielded "$12 million in savings from operating costs, efficiency in processes and improved patient metrics."
They're also using AI to monitor patients for sepsis and to identify potential stroke patients. Since the initiative launched in 2015, both use cases have resulted in lower mortality rates, shorter hospital stays, and faster response times.
As AI becomes ever more unavoidable in our daily lives, the same is happening in hospitals. AI can be a great tool for assisting with virtual visits, record keeping, finding patient information, real-time nurse-to-patient translation, and more. However, because the technology is still relatively new at this scale, few regulations are in place yet.
Artificial intelligence has been used in some capacity for years across different industries, and healthcare is no exception. However, with the rise in visibility and awareness of what AI can provide, many questions about safety and ethics arise. As of now, there are no federal regulations limiting how and when AI can be used, so it's up to individual states to decide. Several bills have been proposed, but according to Stateline, only one law is currently on the books: a Georgia law allowing AI to be used in eye exams.
"In the long run, whatever artificial intelligence we use, it’s still the human — the person — that has to take that data, and the interpretation of that data in some respects, and apply it to the real person that’s in the bed, the nursing home or the home of that person,” says nurse Judy Schmidt. The American Nurses Association’s code of ethics backs this sentiment, noting that AI, or any advanced technology, shouldn't replace the skills and judgment of nurses.
“We’ve been using algorithms and machine-generated insights for a number of years, but now, it’s sort of getting more pressing with the complexity,” says Dan Weberg, vice president of the American Nurses Association\California. Since nurses work so closely with their patients and want to maintain their trust, it's only logical that they would want a say in legislation governing AI, to ensure it's used only when needed.