Why trust is the missing link in AI clinical decision support

Artificial intelligence-based clinical decision support systems (AI-CDSS) are no longer the "technology of the future" in critical care. They are increasingly embedded in how clinicians monitor risk, interpret trends, and predict deterioration. Yet the conversation often focuses on accuracy, workflow integration, and cost, leaving a key human factor underexplored: trust.

Awad, N. H. A., Aljohani, W., Yaseen, M. M., Awad, W. H. A., Abou Elala, R. A. S. A., & Ashour, H. M. A. (2025). When machines decide: Exploring how trust in artificial intelligence shapes the relationship between clinical decision support systems and nurse regret: A cross-sectional study. Nursing in Critical Care, 30(5), e70157.

In this cross-sectional study, the authors examined the relationship between ICU nurses' reliance on AI-CDSS, the decision regret they experience afterwards, and, most importantly, their trust in the AI itself.

Decision regret matters because it is not just a private emotion; it can translate into lost self-confidence, a shaken professional identity, moral distress, and reduced willingness to take part in future decisions. In intensive care, where decisions are frequent, time-pressured, and ethically complex, regret can become a silent factor in burnout and hesitation. The article suggests that this may affect whether AI support is experienced as a safety net or a second-guessing machine.

What the researchers did

The authors conducted a cross-sectional survey of ICU nurses (response rate 62.5%).

Nurses were eligible if they worked full-time in an intensive care unit and had at least 1.5 years of ICU experience; they were excluded if they had no exposure to AI-CDSS or held non-clinical roles.

Three validated measures were used:

  • Reliance on AI-CDSS (Health Systems Utility Scale, HSUS)
  • Post-decision regret (Decision Regret Scale; scored 0–100; see the scoring sketch after this list)
  • Trust in AI (AI Trust Scale; 4 items)
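
A 0–100 regret score is usually obtained by rescaling Likert responses. Here is a minimal scoring sketch, assuming the widely used five-item Decision Regret Scale format (five 1–5 items, items 2 and 4 reverse-coded, item mean rescaled to 0–100); the study's exact instrument and scoring may differ.

```python
# Illustrative scoring for a 0-100 decision-regret scale (assumed
# five-item format; the study's exact instrument may differ).
def score_regret(items: list[int]) -> float:
    """items: five responses, 1 (strongly agree) to 5 (strongly disagree)."""
    if len(items) != 5 or not all(1 <= x <= 5 for x in items):
        raise ValueError("expected five responses in the range 1-5")
    # Reverse-code items 2 and 4 (0-based indices 1 and 3) so that
    # higher always means more regret, then rescale the mean to 0-100.
    recoded = [6 - x if i in (1, 3) else x for i, x in enumerate(items)]
    return (sum(recoded) / 5 - 1) * 25

print(score_regret([2, 4, 1, 5, 2]))  # -> 15.0
```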

What did they find?

Reliance is associated with less regret, especially when trust is high.

Nurses reported moderate average levels of reliance on AI-CDSS, trust in AI, and post-decision regret.

Three results stand out:

  1. Greater reliance on AI-CDSS was associated with lower decision regret (r = -0.42).
  2. Greater trust in AI was associated with lower decision regret (r = -0.33).
  3. Reliance and trust were positively related (r = 0.51).

A moderation analysis showed that trust moderated the reliance-regret relationship, with a significant interaction term (p = 0.012). In other words, when nurses trusted the AI more, their reliance on it was more strongly associated with reduced regret.
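
To make that concrete, here is a minimal sketch of how such an interaction term is typically tested: an ordinary least squares regression with a mean-centred reliance × trust product term. The data are simulated and the variable names are illustrative assumptions; this is not the authors' analysis code.

```python
# Illustrative moderation analysis: does trust moderate the
# reliance -> regret relationship? Synthetic data, not the study's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 240  # hypothetical sample size

# Simulated scale scores (ranges are assumptions, not the study's values)
reliance = rng.normal(3.5, 0.8, n)   # e.g. an HSUS-style reliance score
trust = rng.normal(3.2, 0.9, n)      # e.g. a 4-item trust-scale mean
# Build regret so that reliance lowers it more when trust is high
regret = (55 - 4 * reliance - 3 * trust
          - 3 * (reliance - 3.5) * (trust - 3.2)
          + rng.normal(0, 12, n))

df = pd.DataFrame({"reliance": reliance, "trust": trust, "regret": regret})
# Mean-centre the predictors so the main effects stay interpretable
df["reliance_c"] = df["reliance"] - df["reliance"].mean()
df["trust_c"] = df["trust"] - df["trust"].mean()

# OLS with main effects plus the product term; the reliance_c:trust_c
# row is the "interaction term" the paper reports
model = smf.ols("regret ~ reliance_c * trust_c", data=df).fit()
print(model.summary().tables[1])
```

A negative interaction coefficient in output like this is what "trust strengthens the reliance-regret link" means in regression terms: as trust rises, the slope of regret on reliance becomes more steeply negative.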

AI implementations often focus on what the system can do. This study suggests there is also a need to manage how the system changes the emotional load on nurses.

If trust is low, AI can amplify second-guessing ("Did I miss something?") or shift anxiety about responsibility ("If the AI said X, should I have done X?"). If trust is high, AI support can act as reassurance and reduce decision regret after difficult calls.

Practical implementation

1) Treat "trust in AI" as a measurable outcome of implementation

2) Make training explicit: not just "how to use it" but also "when not to rely on it"

3) Strengthen psychological safety around nurses' sense of ownership of decisions

4) Make workflow fit visible: usability supports reliance

5) Align governance with bedside reality: clarify responsibility early
