The Human Factor in Securing the Code Development Lifecycle
There is nothing more human than making mistakes (if you are reading this sentence, you should agree). 🙂
However, when it comes to the code development process and cybersecurity, this truism cannot be overstated and must be addressed with full attention. Understanding and managing the human factor is critical to safeguarding the code development lifecycle.
Developers (a.k.a. humans) are still responsible for every stage of the process: planning, coding, testing, deployment, and maintenance. Until that changes, developers must work hard to close the gap between the need for secure code (applications and technologies) and the fact that humans make mistakes. Many tools, best practices, and technologies can help in the process and point the developer to potential vulnerabilities in the code. However, biases, distractions, and carelessness are human characteristics, and technology can only minimize the risk.
Cognitive Biases - A Human Weakness
A significant part of human vulnerability lies in our cognitive biases. One common bias that affects people's decisions is confirmation bias: the tendency to search for, interpret, favor, and recall information that confirms or supports one's prior beliefs.
In code development, confirmation bias can manifest in several ways. For instance, if developers strongly believe their code is secure, they may unconsciously discount code review feedback or testing results that suggest otherwise, overlooking warnings or dismissing them as false positives and leaving the code vulnerable to potential exploits.
A Harvard Business School report explores the impact of confirmation bias in business contexts. It shows how this bias can blind us to essential data and insights, and it provides strategies to counteract it.
Biases come in many shapes and affect our judgment. Below are a few of the most relevant biases that can affect a developer's decision-making in the development lifecycle.
Anchoring bias is the tendency to adjust our judgments (especially numerical judgments) toward the first piece of information received (Tversky & Kahneman, 1974).
Availability bias is the tendency by which a person evaluates the probability of events by the ease with which relevant instances come to mind (Tversky & Kahneman, 1973).
Hindsight bias is a propensity to perceive events as being more predictable once they occur (Fischhoff, 1975).
Omission bias is the preference for harm caused by omissions over equal or lesser harm caused by acts (Baron & Ritov, 2004).
Outcome bias is the tendency to judge the quality of a decision based on the information about the outcome of that decision. These judgments are erroneous for the normative assumption that "information that is available only after a decision is made is irrelevant to the quality of the decision" (Baron & Hershey, 1988, p. 569).
Overconfidence bias is a typical inclination of people to overestimate their abilities to successfully perform a particular task (Brenner et al., 1996).
The good news is that there are ways to reduce these biases and minimize the conflict between human error and the need for a secure code development process.
Ways to Resolve the Conflict
Education, Training, and Building Resilience
Developers should pursue continuous education and training to stay current on the latest security practices and potential vulnerabilities, including security awareness training that helps them recognize these cognitive effects. No, there is no need for psychology sessions! Education and training on the secure development lifecycle, backed by real-world examples and use cases of development processes that have gone south, can help build a more resilient work process and team. Developing mental resilience helps individuals manage stress, avoid burnout, and maintain attention to detail, which is crucial in a field where minor oversights can lead to significant security vulnerabilities.
Creating a Security Culture
Creating a working culture that prioritizes security is essential. This involves more than just implementing policies: it requires a top-down approach in which managers lead by example and integrate security practices into daily workflows.
Recognizing and rewarding good security practices motivates employees to take security more seriously.
The cybersecurity space has shifted from asking when a cyberattack will occur to asking how an organization can minimize its attack surface.
Creating an organizational security culture is a great idea; for the organization's development team, it is a must-have practice.
Shifting the Perspective
Security as an Integral Part of Coding. Lastly, it is critical to shift from viewing security as a hindrance or an afterthought to seeing it as an integral part of the coding process. Developers should be encouraged to take ownership of the security of their code. When security is a shared responsibility rather than the domain of a specific team, it becomes a collective effort to which everyone contributes.
Integrate Security into the Development Lifecycle: Plan, design, and write code with potential security risks in mind.
For example, when designing a user authentication feature, a developer with a security-first mindset would consider potential vulnerabilities and design the feature to mitigate those risks.
Include a security focus in code reviews: this could involve using secure coding checklists during manual reviews or employing automated tools that scan for common security flaws.
Collaboration with Security Teams: Work closely with security teams throughout development, ensuring that security expertise is applied from the start and that security is an integral part of development rather than a retrospective addition.
Develop a Security Mindset: Cultivate a security mindset by thinking about how someone could exploit the code and how to prevent it. For instance, when writing a piece of code that handles user input, a developer should consider potential input validation issues that could lead to vulnerabilities such as SQL injection.
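To illustrate, here is a short Python sketch contrasting an injectable query with a parameterized one, using an in-memory SQLite table; the schema, data, and function names are made up for the example.

```python
import sqlite3

# In-memory demo database; the schema and row are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")


def find_role_unsafe(name: str) -> list:
    # VULNERABLE: string interpolation lets crafted input rewrite the query.
    return conn.execute(
        f"SELECT role FROM users WHERE name = '{name}'"
    ).fetchall()


def find_role_safe(name: str) -> list:
    # SAFE: the placeholder makes the driver treat input as data, never SQL.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()


payload = "' OR '1'='1"           # classic injection string
print(find_role_unsafe(payload))  # leaks every row: [('admin',)]
print(find_role_safe(payload))    # matches nothing: []
```

The attacker never "breaks" the safe version because the parameter is bound after the SQL is parsed, so the injection string is just an odd-looking username.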
Finally, here is a short video of Professor Daniel Kahneman explaining cognitive bias and systems.