Prologue: A Two-Part Exploration of FEMA’s Challenges
Emergency management is a field built on urgency, efficiency, and resilience. Yet, when we examine the evolution of the Federal Emergency Management Agency (FEMA) and its role in disaster response and relief, we see a system riddled with inefficiencies and bureaucratic entanglements. How did we get here? Why do we continue to tolerate a system that so often fails those it was designed to help?
This two-part article explores these questions through the lens of Systems and Behavioral Laws—principles that describe how organizations and people behave in structured environments. In part one, we applied those laws to uncover how FEMA's disaster response and relief efforts may have evolved into their current, potentially inefficient state. We highlighted how well-intentioned policies became distorted over time, potentially resulting in a system that prioritizes meeting bureaucratic metrics over true disaster resilience.
Below, in part two, we will delve into cognitive biases—the psychological forces that drive resistance to reform. These biases may keep us locked into a system we know is flawed yet continue to reinforce. Together, these two articles lay the groundwork for understanding, at least in part, why FEMA's disaster response and relief efforts may need reform and why systemic change may be so difficult to achieve.
Introduction: The Psychology Behind Resistance to Change
My dissertation on leadership selection in public safety required me to research psychological methods of measuring leadership and personality traits. After completing the dissertation, I found myself diving deeper into the psychology of how we think and make decisions. That journey led me to explore cognitive biases and heuristics, the mental shortcuts and patterns that shape our perceptions, beliefs, and behaviors—often without us even realizing it.
As a result of this research, I developed a four-hour presentation on cognitive biases and heuristics as they apply to public safety, which I've had the opportunity to present across the country. What has become clear through these sessions, and in my own experience, is how profoundly these biases affect our willingness (or reluctance) to embrace change, especially when the change involves something as deeply ingrained as FEMA, now the subject of discussions about its reformation or dissolution.
There are hundreds of cognitive biases, but for the purposes of this article, I've selected just a few that I've observed potentially at play in discussions, both online and in person, regarding the reform or elimination of FEMA. These biases are listed alphabetically, not in order of priority, but together they help explain why even the most logical arguments for change are often met with fierce resistance.
Before we get into these biases, though, I would like to state that when I talk about the reformation or elimination of FEMA, I am also implicitly talking about the overhaul of the Robert T. Stafford Act and other legislation for which FEMA is the primary executor, legislation that may itself produce the inefficiencies we are hoping to change. It isn't just FEMA.
1. The Backfire Effect
The Backfire Effect occurs when people’s beliefs become stronger, not weaker, when confronted with contradictory evidence. You might assume that presenting facts would change minds, but the reality is quite the opposite—especially when those beliefs are deeply held.
Over the past month, as discussions about reforming or eliminating FEMA have gained momentum, I’ve watched this effect play out repeatedly. Rather than engaging with new information about FEMA’s inefficiencies or systemic flaws, many people double down on their original positions.
Research supports this phenomenon. Certainty and misinformation are incredibly powerful. Even when people know that information is incorrect, it continues to influence their thinking (Gorman & Gorman, 2017; Kolbert, 2017; Mercier & Sperber, 2017; Wadley, 2012). Colleen Seifert from the University of Michigan notes:
"Misinformation stays in memory and continues to influence our thinking, even if we correctly recall that it is mistaken. Managing misinformation requires extra cognitive effort... If the information fits with your prior beliefs, and makes a coherent story, you are more likely to use it even though you are aware that it's incorrect" (Wadley, 2012).
In short, facts alone won’t change minds. People often cling to their beliefs even more fiercely when they feel those beliefs are under attack.
2. The Bandwagon Effect
The Bandwagon Effect is the tendency for people to adopt beliefs or behaviors simply because many others have done so. It's a form of groupthink, where the popularity of an idea is mistaken for its validity (Cherry, 2015).
With FEMA’s role in disaster response being a longstanding fixture, many assume that its continued existence and practices are inherently correct because “everyone” believes in them. As the conversation about FEMA reform has become more mainstream, I’ve noticed that some individuals jump on the bandwagon—either to defend FEMA’s status quo or to oppose reform—without fully understanding the complexities of the issue. This bias is often reinforced by social media echo chambers and opinion polls, which can amplify the illusion of consensus (Obermaier, Koch, & Baden, 2013).
3. Belief Bias
Belief Bias refers to our tendency to accept or reject arguments based on whether their conclusions align with our existing beliefs, rather than on their actual logical merit.
When it comes to FEMA, many people struggle to objectively evaluate arguments for reform because they are emotionally or ideologically attached to the current system. Instead of analyzing whether FEMA’s processes are effective, they default to defending the agency simply because it aligns with their preconceived notions of federal disaster response.
To challenge this, I often encourage people to ask: “When and how did I form this belief?” Recognizing the origins of our beliefs can help us better understand why we defend them so strongly.
4. Confirmation Bias
Confirmation Bias is the tendency to seek out, interpret, and remember information that confirms our existing beliefs, while ignoring or dismissing information that contradicts them.
In discussions about FEMA, I've seen people selectively reference success stories and the particular areas where FEMA performs well, while overlooking the agency's many inefficiencies and failures. Once people form an opinion, they are more likely to "embrace information that confirms that view while ignoring, or rejecting, information that casts doubt on it" (Heshmat, 2015).
Richard Feynman famously said:
"The first principle is that you must not fool yourself—and you are the easiest person to fool."
This rings true in emergency management. We must actively challenge our own assumptions to avoid falling into the trap of confirmation bias.
5. Congruence Bias
Congruence Bias is similar to confirmation bias but focuses on the tendency to test only our own hypotheses, rather than considering alternative explanations.
In the FEMA reform debate, many people are so intent on proving that the current system works that they fail to explore alternative models that could yield better results. As Sherlock Holmes (via Arthur Conan Doyle) wisely stated:
"It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts." (A Scandal in Bohemia)
We need to be willing to test alternative hypotheses for the role of FEMA and be open to the idea that a different approach might be more effective.
6. Dunning-Kruger Effect
The Dunning-Kruger Effect describes how people with limited knowledge or skill in a particular area often overestimate their competence. Conversely, those with more expertise tend to underestimate their abilities because they understand the complexity of the subject.
I’ve observed this firsthand in the emergency management community. Many younger emergency managers, fresh out of school or training, are overconfident in what they’ve been taught about FEMA (and perhaps even by FEMA) and its processes. Meanwhile, older, more experienced emergency managers tend to be more open-minded, having witnessed numerous exceptions to the rules over time. They’ve developed a kind of Socratic wisdom, acknowledging that there’s always more to learn.
As Bertrand Russell put it:
"The whole problem with the world is that fools and fanatics are so certain of themselves, yet wiser people so full of doubt."
7. Escalation of Commitment Bias (Sunk Cost Fallacy)
Escalation of Commitment Bias, also known as the Sunk Cost Fallacy, is the tendency to continue investing in a failing course of action simply because you've already invested time, money, or effort into it.
This bias is particularly strong when it comes to FEMA. Many people have "skin in the game"—they’ve worked with FEMA (or for FEMA), relied on its services, or contributed to its operations. As a result, they are naturally resistant to reform or elimination because it would mean admitting that their past efforts were part of a flawed system.
8. Focalism (Focusing Illusion)
Focalism is the tendency to place too much emphasis on a single piece of information when making judgments or predictions.
In the case of FEMA, people often focus on one aspect of the agency’s performance while ignoring the broader inefficiencies. As Daniel Kahneman noted:
"Nothing in life is as important as you think it is when you are thinking about it." (Kahneman, 2011)
However, we must also consider the opposite of focalism when discussing reform. There are certainly aspects of FEMA that are efficient, and we may not be recognizing them; we should keep those aspects functioning lest we suffer a lesson in unintended consequences.
9. Mere Exposure Effect
The Mere Exposure Effect refers to the tendency to prefer things simply because they are familiar.
FEMA has been a fixture in American disaster response for decades. For many people, this familiarity breeds comfort, making them resistant to change simply because they’re used to the way things are.
10. Reactive Devaluation Bias
Reactive Devaluation Bias occurs when people reject or devalue ideas simply because they originate from an opponent or someone they dislike.
In today’s polarized political climate, this bias is particularly relevant. For those who are strongly opposed to President Trump, the fact that discussions about FEMA reform are associated with his administration makes them instantly resistant to the idea, regardless of its merit.
11. Semmelweis Reflex
The Semmelweis Reflex describes the tendency to reject new ideas because they contradict established beliefs.
This was famously illustrated by Dr. Ignaz Semmelweis, who discovered that handwashing could drastically reduce infections but was dismissed by his peers because it contradicted prevailing medical beliefs.
In the context of FEMA, many people reject reform ideas out of hand simply because they challenge the status quo of federal disaster response.
12. Status Quo Bias
Status Quo Bias is the tendency to prefer things to remain the same rather than embracing change. People often fear the unknown and find comfort in familiar systems, even if those systems are flawed.
When it comes to FEMA, many prefer to stick with what they know rather than risk trying something new—even if that new approach could lead to better outcomes.
Conclusion: Embracing a New Paradigm
Understanding these cognitive biases is the first step toward overcoming resistance to FEMA reform. By recognizing how our minds naturally cling to the familiar, defend existing beliefs, and resist contradictory information, we can begin to open ourselves to new possibilities.
I encourage everyone to keep an open mind: imagine new paradigms for FEMA and the associated disaster response and relief legislation, and consider what the results of those paradigms could be. Be open to new information, challenge your assumptions, and continuously re-imagine what a more effective FEMA—or an alternative agency—could look like.
As Socrates wisely said:
"The unexamined life is not worth living."
Let’s not let unexamined beliefs hold us back from creating a more resilient and effective system for disaster response.
References
Cherry, K. (2015). The bandwagon effect: Why people tend to follow the crowd. Verywell Mind.
Gorman, S. E., & Gorman, J. M. (2017). Denying to the Grave: Why We Ignore the Facts That Will Save Us. Oxford University Press.
Heshmat, S. (2015). What is confirmation bias? Psychology Today.
Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.
Kolbert, E. (2017). Why facts don’t change our minds. The New Yorker.
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134.
Marcus, G. (2008). Kluge: The Haphazard Construction of the Human Mind. Houghton Mifflin Harcourt.
Mercier, H., & Sperber, D. (2017). The Enigma of Reason. Harvard University Press.
Obermaier, M., Koch, T., & Baden, C. (2013). The bandwagon effect in political discourse. Journal of Communication.
Poundstone, W. (2017). Head in the Cloud: Why Knowing Things Still Matters When Facts Are So Easy to Look Up. Little, Brown and Company.
Wadley, J. (2012). The stubborn power of misinformation. University of Michigan News Service.