It is no secret that implementation of evidence-based practice (EBP) in health care is nowhere near the levels it should be.1 It also is well known that a myriad of factors affect the diffusion of innovations, such as the adoption of EBP, and many of Everett Rogers’ observations about the variables that affect this diffusion remain relevant.2 Recent studies about human behavior and social networks made me think about the adoption of EBP and Rogers’ observations related to the role of individuals in this process, particularly the compatibility of the innovation itself with the values, beliefs, and ideas of individuals or groups.
One study3 examined 126,000 stories distributed on Twitter between 2006 and 2017 and found the truth took 6 times as long as falsehoods to reach 1500 people and that falsehoods diffused statistically significantly farther, faster, deeper, and more broadly than the truth in all categories of information. It also found that although bots certainly accelerated the rate of diffusion, these findings were essentially the same when the analyses were repeated after all bots were removed. In other words, humans were more likely to spread false news than real news. Similarly, an earlier study4 that focused on online social network use of scientific and conspiracy news found the vast majority of likes (77.9%) and comments (80%) were from users usually interacting with conspiracy — not scientific — stories. The authors of both studies expressed concern that rejecting science and real news in favor of conspiracy and fake news can, among other consequences, profoundly impact the allocation of resources and the health and well-being of populations. These studies also appear to confirm the veracity of Jonathan Swift’s observation more than 300 years earlier that “Falsehood flies, and the Truth comes limping after it.” Going back to Rogers and EBP, it also makes me wonder: Could this be an important driving force behind the fact that some practices in health care are adopted with virtually no evidence at all while real science is ignored for decades?
Take the case of opioid medications for noncancer chronic pain management. For decades, health care professionals prescribed this treatment even though no study had ever compared long-term opioid therapy with no opioid therapy for this indication and ample evidence of an increased risk of harm was known.5 By contrast, studies and guidelines dating back almost 2 decades recommend the consumption of clear fluids up to 2 hours and a light meal up to 6 hours before surgery to reduce the risk of harm. Yet — regardless of the scheduled procedure time — many patients are still told not to eat or drink “after midnight.”6
In an example closer to home in our wound care arena, dozens of studies have been published over the past 4 (!) decades showing that wounds heal faster, with less pain and a lower risk of infection, when covered with a moisture-retentive dressing. Still, gauze dressings continue to be used in practice and described by some as “standard care.” At the same time, the practice of routine sharp debridement of venous and pressure ulcers appears to have been readily adopted even though the level of evidence for this debridement method is much lower than that of other, less invasive methods, and studies examining the efficacy of this routine have not been conducted.7,8 These are but a few examples of how rejecting science in favor of hearsay, tradition, fake news, or whatever you want to call it can cause resources to be misdirected and the health of populations to be put at risk.3,4
The question that remains is why. The authors of the social media studies found fake news typically was more novel (although users may not have perceived it as such) and that both the science and conspiracy communities were polarized, commenting mainly inside their own community.3,4 The latter seems to indicate that untested and seemingly unlimited access to information does not change the reality that birds of a feather (continue to) flock together. It also confirms the importance of Rogers’ observations about individual variables that affect the adoption of new ideas.2 Do we adopt untested practices and resist those that are effective because that is the norm in our community, because that is the only information we are exposed to in our echo-chamber,9 or because they fit our previously held beliefs? Or do we resist new ideas, evidence, or guidelines because they were not generated within our own community (eg, a discipline that is not “ours”)?
With so many lies, real news, and scientific and conspiracy information (literally) at our fingertips, these are important questions to ponder. We need to find a balance — a common ground — in order to move science, evidence, and health care forward through research and the adoption of EBP. It seems to me most people who need health care, regardless of their social media communication preferences, would agree that receiving care that has been shown to be effective is much preferred over care provided just because someone thinks it works or has heard about it.