Terry Gerton Cybersecurity is a major focus for us. We discuss it frequently, and our audience is highly engaged with the topic. Typically, we cover cybersecurity tools, training, and regulations. However, we’re also observing that phishing and social engineering attacks are becoming increasingly sophisticated each year. Breaches persist across all levels of government. From your vantage point at Fors Marsh, what accounts for the disconnect between cybersecurity technology investments and actual results?
Nicole Togno I believe the honest answer is that we’ve been addressing the wrong problem. Agencies invest heavily in tools, and those tools are excellent at managing systems. However, they haven’t invested as much in understanding the people working within those systems. People are complex. We’re all intricate beings. We’re busy and increasingly operate under pressure as our work evolves and technology becomes more integrated into our workflows. We’re constantly making countless judgment calls throughout the day. Attackers understand this. Phishing is so effective not because people are careless or incompetent, but because it’s designed to exploit how humans actually process information, especially under stress. When I examine why these breaches continue to happen, I don’t necessarily see a technology gap. I approach it from a different perspective. The real gap lies in our understanding of human behavior.
Terry Gerton What I’m hearing you say is that regardless of our investment, we’ll never achieve a foolproof technical solution to cybersecurity—it’s fundamentally a behavioral challenge. What practical implications does this have for government leaders when they assess their cyber risks?
Nicole Togno In practical terms, it means people will always be human. Let me approach this from a different angle. Many technologists enter this field asking, “What do people need to know?” From a behavioral perspective, we should be asking different questions: What shapes how people behave, and what drives their actions? These are fundamentally different inquiries. There’s a framework I find particularly useful—it’s straightforward and well-established in behavioral science. It’s called COM-B, which stands for Capability, Opportunity, Motivation, and Behavior. The core concept is that behavior isn’t just about knowledge—it’s far more complex. Behavior is rarely a purely logical process. It’s really about whether the environment we create makes certain behaviors easy or difficult, and whether people are genuinely motivated to act. When I analyze a security issue, I’m not approaching it as a technologist. I’m not just asking whether people completed their technology training to gain knowledge. I’m asking whether this environment enables people to make secure choices, and whether they trust the system enough to engage with it honestly. These questions don’t come naturally to technologists, and that’s not a criticism—it’s simply a different discipline. That’s why I believe behavioral science offers a valuable complementary perspective.
Terry Gerton That’s a fascinating viewpoint. I’m reflecting on the last cyber training I went through about identifying phishing emails and avoiding suspicious links. It sounds like you’re suggesting that approach is insufficient.
Nicole Togno It’s definitely not enough. Most of us have experienced those click-through training sessions that feel like mere compliance exercises. You gain knowledge, but knowing and doing are entirely different things. Behavioral science has long recognized this—there’s extensive research demonstrating that information alone rarely changes behavior. Security programs still largely operate on the assumption that if you inform people about risks, they’ll act logically and responsibly. But that’s simply not how humans function. When I’m rushing to meet a deadline and encounter a login prompt requiring three, four, or five additional steps, I’m tempted to find a workaround—not because I disregard security, but because my brain is focused on completing my work. We’ve become highly efficient in our society, with systems that push us to work smarter and faster. That drive for efficiency isn’t a character flaw—it’s simply how our cognition works. When training is your entire strategy, you’re essentially placing the burden of security on individuals and then blaming them when things go wrong. That approach feels both ineffective and somewhat unfair.
Terry Gerton I’m speaking with Nicole Togno, Senior Director for Civilian Experience and Policy Research at Fors Marsh. Nicole, let’s be candid—most cybersecurity professionals are technologists, not behavioral scientists. They’re not trained to create the kind of environment you describe. What should they consider when evaluating their cybersecurity risks? What immediate steps could they take?
Nicole Togno That’s an excellent question. It starts with viewing your employees as partners in the security process rather than assets or risks to manage. This sounds simple but represents a significant shift in how most security programs are designed by technologists. In practice, it means understanding how people actually navigate these systems daily, identifying their pressure points, and recognizing where they attempt to work around obstacles because they lack better options. It means creating channels where people feel safe reporting mistakes without fear of punishment. Currently, in many agencies and workplaces, if someone clicks a malicious link, their instinct is to hope nobody notices. This represents a real cultural problem with genuine security consequences. We need to build more trust—trust between leaders and employees throughout the organization. This isn’t some soft, abstract concept. In this context, trust is a genuine security asset that technologists and leaders should actively cultivate within their systems.
Terry Gerton What you’re describing sounds like it would be swimming upstream for a CISO. That’s not the easy path. The easy path is to purchase cybersecurity tools or put more checklists in place and say, “I’ve built the system that is going to address my cyber risk.” How do you empower the decision-makers to actually change their approach to this?
Nicole Togno It really starts with a change in perspective. Instead of treating security as a set of rules forced upon people, it should be something you create together with them, giving everyone the ability to contribute. Organizations often overlook that their own staff are one of the best sources of information about where the true weaknesses lie. Empower tech teams to talk to employees to discover unofficial shortcuts, or to learn which procedures are so frustrating that nobody actually follows them, even if they don’t say so openly. Many security programs aren’t set up to bring that knowledge to the surface; they’re built to push directives downward. There’s a wealth of insight just sitting there, waiting for technologists, leaders, and CISOs to tap into. It’s incredibly valuable for them to understand not just how to get people to follow the rules, but to start asking: what are people telling us about how this system truly operates in practice? It’s a completely different conversation. This shift in mindset gives them fresh data and interesting insights they wouldn’t uncover through a standard internal penetration test on their technology assets alone.
Terry Gerton If you’re a federal CIO or a CISO and you’re saying, “That sounds like something I’d really like to try, but I have no idea where to start,” how can you begin to build an environment, a structure, a culture that seeks out that employee feedback?
Nicole Togno The easiest first step is simply to talk to people. It doesn’t need to be formal. It doesn’t require a town hall meeting or an official survey. Just sit down with a small group of employees from various roles and ask them two questions: Where does security slow down your work, and what do you do when that happens? Then just listen. Don’t defend the current program. Don’t explain the policy. Just listen. What you hear might be surprising. You might learn about actions people are taking that you weren’t aware of, processes that were abandoned long ago, or instances where following the secure path felt too difficult. That information is pure gold. It shows you precisely where your program is failing in the real world, beyond the technical testing your experts perform to verify the system works. It reveals the actual daily experiences of the people you depend on to keep the agency secure. That’s where I’d begin. Simply talk to people. You can’t design for human behavior if you don’t understand the real humans using your systems.
Terry Gerton Once you get that kind of feedback, what’s the next step?
Nicole Togno That’s an excellent question. Once you have that feedback, the next step is to give it to your technology team. Show them the evidence: you’ve built a strong, well-designed system up to this point. But to reach the next level, you need to involve the people using it. Provide them with these insights to guide better decisions. You didn’t know about this workaround before. Now you do. What are we going to do about it? Take those quick steps to gather the insight and then act on it. It can be one small change at a time. It doesn’t require a complete overhaul or a full system redesign. It’s simply taking these firsthand insights from the people in your organization who use your tools and systems every day and making small, continuous improvements.