Since the qualitative metrics are all subjective in nature, virtually every risk element can be characterized by either of the first two metrics: Low, Medium, and High, or Ordinal Ranking. The Vital, Critical, and Important metric, however, describes only an asset's value to an organization.
The Baseline approach makes no effort to scale risk or to value information assets. Instead, it seeks to identify the safeguards already in place, compare them with what industry peers are doing to secure their information, and then enhance security wherever it falls short of peer practice. A further word of caution is appropriate here. The Baseline approach is founded on an interpretation of due care that is at odds with the well-established legal definition of due care. An organization that relies solely on the Baseline approach could face liability with an inadequate legal defense should a threat event cause a loss that could have been prevented by available technology or practice, but that was not implemented because the Baseline approach was used.
The classic quantitative algorithm, as presented in FIPSPUB-65, that laid the foundation for information security risk assessment is simple:
(Asset Value x Exposure Factor = Single Loss Expectancy) x Annualized Rate of Occurrence = Annualized Loss Expectancy
For example, let's look at the risk of fire. Assume the Asset Value is $1M, the Exposure Factor is 50%, and the Annualized Rate of Occurrence is 1/10 (once in ten years). Plugging these values into the algorithm yields the following:
($1M x 50% = $500K) x 1/10 = $50K
Using conventional cost/benefit assessment, the $50K ALE represents the cost/benefit break-even point for risk mitigation measures. In other words, the organization could justify spending up to $50K per year to prevent the occurrence or reduce the impact of a fire.
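The two-step calculation above can be sketched in a few lines of Python, using the fire-risk figures from the example (the function names are illustrative, not part of FIPSPUB-65 itself):

```python
def single_loss_expectancy(asset_value: float, exposure_factor: float) -> float:
    """SLE = Asset Value x Exposure Factor."""
    return asset_value * exposure_factor


def annualized_loss_expectancy(sle: float, aro: float) -> float:
    """ALE = SLE x Annualized Rate of Occurrence."""
    return sle * aro


asset_value = 1_000_000   # Asset Value: $1M
exposure_factor = 0.50    # 50% of the asset's value lost per fire event
aro = 1 / 10              # one fire expected every ten years

sle = single_loss_expectancy(asset_value, exposure_factor)
ale = annualized_loss_expectancy(sle, aro)

print(f"SLE = ${sle:,.0f}")  # SLE = $500,000
print(f"ALE = ${ale:,.0f}")  # ALE = $50,000
```

The resulting $50K ALE is the annual cost/benefit break-even point: any mitigation measure costing more than that per year would exceed the expected annual loss it prevents.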
It is true that the classic FIPSPUB-65 quantitative risk assessment took the first steps toward establishing a quantitative approach. However, in the effort to simplify fundamental statistical analysis processes so that everyone could readily understand them, the algorithms went too far. The consequence was results that had little credibility, for several reasons, three of which follow:
Yes, this primitive algorithm had shortcomings, but advances in quantitative risk assessment technology and methodology, which explicitly address uncertainty and support technically correct risk modeling, have largely done away with those problems.
Pros and Cons of Qualitative and Quantitative Approaches
In this brief analysis, the features of specific tools and approaches will not be discussed. Rather, the pros and cons generally associated with qualitative and quantitative methodologies will be addressed.
We are proud to bring all of our members a legal copy of this outstanding book. This version is getting a bit old and may not contain everything the latest edition covers; however, it is one of the best tools you have to review the basics of security. Investing in the latest edition would help you in your studies and also show your appreciation to Auerbach for letting me use their book on the site.