Size and complexity are the enemies of cybersecurity
In cybersecurity we are always faced with the chance that our system harbors some unknown vulnerability, and with the possibility that this vulnerability will be discovered by a malicious actor who will then use it against our system, as well as against other, similar systems.
Cybersecurity vulnerabilities are the result of two kinds of errors or defects: design errors and implementation errors. A design error occurs when the functionality of a system or component is not properly and comprehensively analyzed and understood, so the resulting design does not cover all possible use cases. Analysis of a system requires understanding and capturing all the ways the system will be used, as well as the limits on that use, so that only the planned functionality is enabled by the system. The design is the plan for how the system will implement the functionality identified by the analysis; it captures the structure of the system or component and how its major functionality is partitioned.
Implementation is the realization of the design: the development of the system or component, using software development tools such as editors and compilers, in the specified languages and frameworks. All configuration is also part of the implementation. The development process often includes build and integration processes, coding standards, design patterns, code reviews, and testing as methods to increase the likelihood that the resulting implementation is true to the design and contains as few defects as possible.
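As a hypothetical illustration (the file-serving scenario and all names below are invented, not drawn from any real system), the following sketch contrasts the two kinds of defects: a design defect, where the design never considered a whole class of inputs, and an implementation defect, where the code fails to faithfully realize a rule the design does state.

```python
# Hypothetical example distinguishing a design defect from an implementation defect.
# The scenario (a small file-serving helper) is invented for illustration only.

import os

BASE_DIR = "/srv/app/public"
MAX_NAME_LEN = 64  # the design states: "file names longer than 64 characters are rejected"

def read_public_file(name: str) -> bytes:
    # Implementation defect: the design says "longer than 64 is rejected",
    # but this off-by-one check lets a 65-character name through.
    if len(name) > MAX_NAME_LEN + 1:
        raise ValueError("name too long")

    # Design defect: the design only considered plain file names, so it never
    # required the path to stay inside BASE_DIR. The code below faithfully
    # implements that incomplete design, and a name like "../../etc/passwd"
    # escapes the intended directory.
    path = os.path.join(BASE_DIR, name)
    with open(path, "rb") as f:
        return f.read()
```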
Both the design phase and the implementation phase provide opportunities for creating defects that may become vulnerabilities in the system or component. The number of defects produced in a system or component correlates directly with the system's size and complexity. With respect to size, defects typically occur at a roughly constant rate per unit of development, so a design or implementation that is twice as large will, roughly, contain twice as many defects.
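A minimal sketch of that proportional relationship follows; the defect rate used is an illustrative assumption, not a measured industry figure.

```python
# Minimal sketch of the "defects scale with size" observation.
# DEFECTS_PER_KLOC is an illustrative assumption, not a measured figure.

DEFECTS_PER_KLOC = 5.0  # assumed defects introduced per 1,000 lines of code

def expected_defects(lines_of_code: int) -> float:
    """Expected defects if they occur at a constant rate per unit of development."""
    return DEFECTS_PER_KLOC * lines_of_code / 1000.0

small = expected_defects(50_000)    # a 50 KLOC component
large = expected_defects(100_000)   # the same component at twice the size

print(small, large, large / small)  # 250.0 500.0 2.0 -- twice the size, twice the defects
```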
The complexity of a design or implementation has a similar effect on the creation of defects, but unlike size, the number of defects can grow much faster than linearly as complexity increases. Complexity is difficult to quantify, but it is related to how many elements are needed to create a solution and how many relationships exist among those elements. It is also related to how many steps are needed to accomplish a use case of the system or component.
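One rough way to make such a count concrete is sketched below; the small component graph and use cases are invented for illustration, and the counts are only a crude proxy, not an established complexity metric.

```python
# Rough complexity proxy: count elements, the relationships among them,
# and the steps in each use case. The component graph is invented for illustration.

components = {
    "ui":     ["auth", "orders"],   # ui talks to auth and orders
    "auth":   ["db"],
    "orders": ["db", "auth"],
    "db":     [],
}

use_cases = {
    "place_order": ["ui", "auth", "orders", "db"],  # steps taken to fulfil the use case
    "login":       ["ui", "auth", "db"],
}

num_elements = len(components)
num_relationships = sum(len(deps) for deps in components.values())
longest_use_case = max(len(steps) for steps in use_cases.values())

print(num_elements, num_relationships, longest_use_case)  # 4 5 4
```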
Humans are good at keeping simple constructs in their minds and understanding them. When the size and complexity of a system increase, the human developer must break it down into simpler pieces that can each be understood at once, and relationships must be abstracted and reduced to easily understood assemblies. When a system is large and complex, the number of possible relationships tends to grow faster than linearly, and the number of possible paths through the system grows exponentially. Thus the system quickly becomes unwieldy to the human developer as its size and complexity grow.
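A small, hypothetical calculation makes that growth concrete: pairwise relationships among n components can grow quadratically, while the number of distinct paths through k independent branch points can grow exponentially.

```python
# How relationships and paths grow with system size.
# n components -> up to n*(n-1)/2 pairwise relationships (quadratic growth).
# k independent binary decision points -> up to 2**k distinct paths (exponential growth).

def max_pairwise_relationships(n_components: int) -> int:
    return n_components * (n_components - 1) // 2

def max_paths(n_branch_points: int) -> int:
    return 2 ** n_branch_points

for n in (10, 20, 40):
    print(n, max_pairwise_relationships(n), max_paths(n))

# 10     45            1024
# 20    190         1048576
# 40    780   1099511627776
```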
As the size and complexity of a system grow, so do the number of defects and the resulting vulnerabilities. Conversely, the number of defects and cybersecurity vulnerabilities shrinks as the system or component is made smaller and simpler. This strongly suggests that small, simple designs and implementations should be heavily favored over large, complex ones if effective cybersecurity is to be achieved.
This article was written by David W. Viel, Ph.D., the founder and CEO of Cognoscenti Systems, LLC. He has extensive experience in the research and development of mission-critical systems in a wide variety of fields, including military control systems, space, modeling and simulation, computer languages, telecommunications, and distributed systems. He has also led a number of teams developing mission-critical systems. The article was first published here.