Huawei or the highway?

Information technology is complicated. And so, for those who do not understand how it works, it may be dangerous. The same is true of cars and rotary saws. Risk management is not just for certified professionals; we all practice it. The sister who guides her younger brother across a busy street, the advisor to a president on a matter of national security—these people manage risk. They—we all—have to weigh the probabilities of success and failure, of living and dying.

Information technology is complicated. And those who understand it know well that it can be dangerous—and know, perhaps, just how dangerous it could be. More and more, the systems on which our society depends—our systems of communication and transportation, our systems for producing and distributing medicines and food, our systems for providing safety and security, and our systems for providing healthcare—are becoming linked and interdependent and, along the way, running the risk of producing catastrophic consequences for large parts of our country.

The contention that the Internet of Things (IoT) is a major development in the history of the human race is no exaggeration. The potential for good is significant; the potential for evil is unparalleled. To be sure, wars are not new, and exploitation, slavery, and torture feature prominently in the records of human conduct across the ages. But the IoT will provide us a tool—or an entire toolbox—for making lives safer, healthier, and happier, or for rendering lives more vulnerable to psychological manipulation, political oppression, and even physical threats, all on scales far greater than ever seen before.

Hackers bent on disrupting our power grid and communication networks will have opportunities to cause more trouble than was even imaginable in centuries past. Wars of the future will be fought in several new dimensions. Governments bent on controlling their citizens will be able to exercise their powers more subtly, more insidiously, and more effectively than any totalitarian regime to date. So we should build into our IT and our political systems the means by which their processes can be examined—build in windows offering transparency rather than back doors allowing surreptitious assaults on our freedoms.

No one should believe that information technology is essentially benign. IT is essentially amoral: neither good nor evil. Only the uses of technology are appropriately judged in terms of morality. Nor should anyone believe that IT will inevitably lead societies toward democracy. The history of this technology is brief, but there are enough lessons already to show that human rights will always remain in jeopardy.

The notion of planting chips in people so that employers or other authorities can easily identify and track individuals is not new. Facial recognition and other means of identifying people and determining their whereabouts and patterns of travel are only increasing in deployment and efficiency. These latter technologies do not require the implantation of any device, and so they skirt the need to address people directly—either to gain their acquiescence or to force them into compliance. They save time and money and reach many more people than any program for implanting chips possibly could. Connecting common electronic devices into one big network—or even just a few little networks—can have far-reaching consequences for large parts of a society.

The American federal government has prohibited the transfer of some technologies to Huawei. It has pressed other countries to avoid purchasing, and becoming dependent on, products from Huawei. One stated reason for this policy is the concern that Huawei has put, or may put, ‘back doors’ into its products—means by which the government of China could gain access to information.

International trade has produced abundant benefits—cheaper goods among them—but some of the greatest are the peaceful relationships that result from the entanglement of economic incentives for orderly growth and development. Nations, however, cannot afford to let economic considerations increase their vulnerability should those international relationships come to grief. If even friends and allies can fall into conflict, surely nations at large can, when the risks and rewards of cooperation and competition are so great.

Electronic components that can be programmed and controlled remotely—especially those that govern essential services on which large parts of society depend—pose far greater risks to a nation than does, for example, steel purchased from abroad. Policymakers are wise to be concerned about the potential for mischief—or devastating consequences—brought about by foreign powers tinkering with the controls of our communication, transportation, and security systems. Truly exhaustive testing of foreign-manufactured components for the IoT is not practical. And even if every component were examined and passed, there would remain the potential for reprogramming the devices for malicious purposes once they are in place. America ought to develop its own industries for the design and manufacture of components for 5G networks. Risks to the maintenance of critical infrastructure ought to be minimized.

But if our leaders decide to purchase from any foreign country the kit that implements the IoT, they must be sure that it will work safely and securely for our citizens. And, by the way, they ought to be just as careful about putting American products into service in 5G networks. The seeds of terrorism can grow in American soil.

In the end, lawmakers have no easy task in deciding on policies that could affect the well-being and even the lives of the people they serve. All they need is to be morally certain that their choices to permit the use of certain components of the IoT will do no harm.
