Risk and Responsibility

In his entry on “risk” for the Stanford Encyclopedia of Philosophy, Sven Ove Hansson noted that there are at least five relevant senses of the word in common usage today:

  • An unwanted event that may or may not occur.
  • The cause of an unwanted event that may or may not occur.
  • The probability of an unwanted event that may or may not occur.
  • The statistical expectation value of an unwanted event that may or may not occur.
  • The fact that a decision is made when the outcome probabilities are known.

The first three definitions are often assigned to different terms – such as consequence, hazard, and threat (or vulnerability), respectively – and treated as contributors to the fourth, which is the standard technical meaning of “risk” among engineers and risk analysts. The fifth is only invoked in formal decision theory, where “decision under risk” is an alternative to “decision under certainty” (each option is associated with exactly one outcome) and “decision under uncertainty” (outcome probabilities are unknown).
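As a concrete illustration of that fourth, expectation-value sense – a sketch of the standard formulation, not something drawn from Hansson's entry – the risk associated with a set of unwanted outcomes can be written as

\[ R \;=\; \sum_{i} p_i \, c_i \]

where \(p_i\) is the probability of unwanted outcome \(i\) and \(c_i\) is a measure of its severity (e.g., fatalities or monetary loss). For instance, a hazard with a 1-in-10,000 annual probability of causing \$1,000,000 in damage contributes an expected \$100 per year to the total risk.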

A 2012 paper by philosophers Ibo van de Poel and Jessica Nihlén Fahlquist with the same title as this column (Chapter 35 in Handbook of Risk Theory, Springer) points out that “risk” is more nuanced in fields such as psychology and social science. In particular, contextual elements come into play, which may include “by whom the risk is run, whether the risk is imposed or voluntary, whether it is a natural or man-made risk, and so on.” It is precisely these kinds of factors that give risk its moral dimension, and thus imply some degree of responsibility on the part of those who confront it on behalf of others – such as engineers.

In a 2011 paper on “The Relation Between Forward-Looking and Backward-Looking Responsibility” (Chapter 3 in Moral Responsibility: Beyond Free Will and Determinism, Springer), van de Poel proposed nine distinct notions of responsibility:

  1. Cause – The earthquake was responsible for killing 100 people.
  2. Role – The driver is responsible for controlling the vehicle.
  3. Authority – The superintendent is responsible for the construction project.
  4. Capacity – The person has the ability to act in a responsible way.
  5. Virtue – The person has the disposition to act in a responsible way.
  6. Obligation – The person is responsible for the safety of the passengers.
  7. Accountability – The person is responsible for explaining what he/she did.
  8. Blameworthiness – The person is responsible for what happened.
  9. Liability – The person is responsible for the cost of the damages.

The first four are strictly descriptive, while the other five are normative; of those, #5-6 are forward-looking (prospective) and #7-9 are backward-looking (retrospective). Philosophers throughout history have mostly addressed retrospective responsibility, especially #8 – in particular, the necessary and sufficient conditions under which someone can properly be held responsible for something that has already occurred. However, engineering ethics is (or should be) more concerned with the prospective responsibility of engineers who assess, manage, and communicate risk in an effort to avoid unwanted events; i.e., a combination of #3-6.

Interestingly, van de Poel and Nihlén Fahlquist assign responsibility for risk assessment primarily to scientists, and suggest that responsibility for both risk management and risk communication falls mainly on government officials and corporate executives. The only responsibility that they attribute specifically to engineers is for risk reduction; i.e., safety understood as maintaining risk at or below an acceptable level. However, they acknowledge the obvious question that arises: Are engineers also responsible for setting this threshold, or is that determination made by others?

As I see it, the answer to both questions is a resounding “yes”! The consensus process by which most engineering codes and standards are developed today provides opportunities for all stakeholders – engineers and non-engineers alike – to have a say in establishing how much risk is permissible. However, the current trend toward more complex and prescriptive design criteria carries its own risks, which may be easily overlooked. Engineering is not simply a matter of following rules set down by others; each individual engineer must still exercise appropriate knowledge, skill, and judgment on a case-by-case basis.

Another complication is what philosophers call the “problem of many hands” (PMH) – the difficulty of allocating responsibility when multiple people are engaged in a particular activity. Van de Poel and Nihlén Fahlquist suggest three possible solutions and conclude, “What is needed is probably a combination of these three approaches”:

  • Responsibility-as-virtue (#5 above). This “would entail a focus on how to develop and cultivate people’s character with the aim to establish a willingness to actively take responsibility.”
  • Procedural distribution of responsibility. This would essentially apply the familiar consensus process to the ethical realm, not just technical considerations.
  • Institutional design. This would involve creating environments that foster responsibility-as-virtue and “minimize unintended collective consequences of individual actions.”

The idea, again, is to “prevent risks from materializing instead of distributing responsibility when it has already materialized.” Avoiding PMH would thus require those involved to be “virtuous-responsible people who use their judgment to form a balanced response to conflicting demands.”

This should sound familiar if you have been reading this space over the last couple of years, or at least my most recent installment (“Virtuous Engineering,” September 2013). Having identified objectivity, care, and honesty as the moral virtues of engineering, and practical judgment as the intellectual virtue of engineering, I see the union of these traits as constitutive of the more general virtue of responsibility within the context of engineering practice. In other words, the virtuous engineer is a responsible engineer.▪

About the author: Jon A. Schmidt, P.E., SECB

Jon A. Schmidt (jschmid@burnsmcd.com) is a Senior Associate Structural Engineer in the Aviation & Federal Group at Burns & McDonnell in Kansas City, Missouri. He serves as President on the NCSEA Board of Directors, was the founding chair of the SEI Engineering Philosophy Committee, and shares occasional thoughts at twitter.com/JonAlanSchmidt.
