Appropriate automation: Human system dynamics for control systems
Hybrid intelligence for appropriate automation
An approach is needed to guide process control engineering in the quest for appropriate automation. Sometimes insights can come from unlikely places. Early research in psychology may have something to offer here.
Figure 3 depicts a model of a closed-loop control system that embodies the first principles discussed. The ideas in Figure 3 are neither entirely new nor purely theoretical. Although it incorporates ideas such as search, proposed by Newell and Simon (1976) as a requisite for intelligence, an earlier version of the figure has been used as a reference design for military and commercial sensor management systems, and similar models abound in the human factors and HSI research literature (cf. Harris, Ballard, Girard, and Gluckman, 1993; Wickens and Carswell, 2007).
Note that the words "human," "software," and "hardware" do not appear in the figure. The architecture can be applied to any intelligent control system, whether it is purely silicon-based, mixed silicon-organic, or even purely organic. Systems that are organized as in Figure 3, and that comprise human and machine components, exhibit capabilities and limitations as a system that are not evident, and may not even exist, in either component alone, because they are emergent. Thus such systems are hybrid intelligence systems. Such systems can routinely "solve" induction problems.
This architecture is canonical because it contains the essential components of a real-time, closed-loop control system capable of intelligent behavior, yet it does not specify how the processes are implemented. As a simple rule of thumb, at least for the foreseeable future, the induction authority should be human and the deductive engine should be a machine, but those allocations of function might change.
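That allocation of function can be sketched in code. The following is a minimal illustration of the idea, not an implementation of any fielded system or of the published reference design; all names here (the rule table, deductive_engine, induction_authority, the example situations and actions) are hypothetical:

```python
# Sketch: machine deduction inside the loop, human induction on escalation.
# All identifiers and example values are illustrative assumptions.

def deductive_engine(situation, rules):
    """Machine component: applies known rules (deduction)."""
    return rules.get(situation)  # None if no rule covers the situation

def induction_authority(situation, rules):
    """Human component (stubbed here): generalizes from a novel situation
    to a new rule -- the step the article argues is not yet automatable."""
    action = "open_relief_valve"  # stand-in for a human judgment
    rules[situation] = action     # induction extends the rule base
    return action

def control_step(situation, rules):
    """One pass through the sense-decide-act loop of the architecture."""
    action = deductive_engine(situation, rules)
    if action is None:  # no rule covers the situation: escalate to the human
        action = induction_authority(situation, rules)
    return action

rules = {"pressure_high": "throttle_feed"}
print(control_step("pressure_high", rules))         # machine handles the known case
print(control_step("pressure_spike_novel", rules))  # escalated to the human
print(control_step("pressure_spike_novel", rules))  # now covered by the new rule
```

The emergent property the article describes shows up even in this toy: after one escalation, the system as a whole handles a situation that neither the original rule base nor the human alone had anticipated in advance.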
IBM's Watson system, for example, may be capable of induction (Fan, Ferrucci, Gondek, and Kalyanpur, 2010). Watson is the first of an emerging generation of machine intelligence systems intended to operate in real-world environments and solve real-world problems. Yet such intelligent machines cannot operate in a completely automated fashion. Their recommended solutions to induction problems are just one more source of information for the human control authority. Watson is designed to be a team member, not the final decision-maker. Such machines must be integrated with human cognition in a coherent fashion to augment and enhance human cognitive performance. Hence, with the advent of Watson, new principles for integrating human and machine intelligence into a cohesive whole are needed. We trust that improved understanding of the fundamentals of induction and human cognition will contribute measurably to the next generation of control engineering.
What cannot be automated? From the perspective of first principles in control engineering, at least one answer is: induction cannot be automated, at least not yet. Failure to understand the implications of that answer can be disastrous. New technology, with IBM's Watson as its progenitor, may force us to revisit this principle in the near future.
- Steven D. Harris is president of Rational, LLC. Jennifer McGovern Narkevicius, PhD, is the managing director of Jenius, LLC and co-chair of INCOSE's Human Systems Integration Working Group. Edited by Mark T. Hoske, content manager, CFE Media, Control Engineering, firstname.lastname@example.org.
- Automation, in this context, concerns the implementation of the control process inside a machine.
- Even with closed-loop control, humans are involved.
- Inductive logic devices aren't available in process control, yet.
What efficiencies will be available when computing power and programming allow machine-based inductive reasoning with closed-loop control?
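As a thought experiment on that question, the sketch below imagines a machine performing a small induction step inside a closed loop: it generalizes a process model (a single gain) from observed input/output data, then applies the induced model deductively to choose its next input. The linear plant, the numbers, and the function names are hypothetical illustrations, not a claim about any available product:

```python
# Hypothetical sketch of machine induction inside a closed control loop.

def induce_gain(observations):
    """Induction step: generalize a linear gain from past (u, y) pairs
    via least squares through the origin."""
    num = sum(u * y for u, y in observations)
    den = sum(u * u for u, _ in observations)
    return num / den

def deduce_input(gain, setpoint):
    """Deduction step: apply the induced model to compute the input
    expected to produce the setpoint."""
    return setpoint / gain

# Plant with an unknown true gain of 2.5; the controller never sees this number.
plant = lambda u: 2.5 * u

# Observe the plant, induce a model, then act on the induced model.
observations = [(u, plant(u)) for u in (1.0, 2.0, 3.0)]
gain = induce_gain(observations)       # induced model: gain of 2.5
u = deduce_input(gain, setpoint=10.0)  # deduced input: 4.0
print(round(plant(u), 6))              # → 10.0
```

The efficiency the question points at is visible even here: once the induction step is mechanized, the loop adapts its own model from data instead of waiting for a human to supply one, though real processes are, of course, far less obliging than a noiseless linear plant.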
This is a full-length version of an article scheduled to appear under the same headline in the March 2014 Control Engineering print and digital edition.
Aledo, J.A., Martinez, S., and Valverde, J.C. (2013). Parallel discrete dynamical systems on independent local functions. Journal of Computational and Applied Mathematics, 237, pp. 335-339.
Boy, G.A. (2013). Orchestrating Human Centered Design. London: Springer-Verlag.
Boyd, J.R. (1976). Destruction and creation. Unpublished lecture notes.
Becker, B., Greenberg, R.J., Haimovich, A.M., Parisi, M., and Harris, S. (1991). F-14 sensor fusion and integrated mode requirements. Technical Proceedings of the 1991 Joint Service Data Fusion Symposium, pp. 629-647. Laurel, MD: Johns Hopkins University Applied Physics Laboratory.
Denman, W. (2013). MetiTarski and the verification of dynamical systems. Presentation at National Institute of Aerospace / Langley Research Center, 8 July.
Doyle, J.C., and Csete, M. (2011). Architecture, constraints and behavior. PNAS, 108, pp. 15624-15630.
Fan, J., Ferrucci, D., Gondek, D., and Kalyanpur, A. (2010). PRISMATIC: Inducing knowledge from a large scale lexicalized relation resource. Proceedings of the NAACL HLT 2010 First International Workshop on Formalisms and Methodology for Learning by Reading, pp. 122-127. Los Angeles, CA: Association for Computational Linguistics.
Fitts, P.M. (1951). Human Engineering for an Effective Air Navigation and Traffic Control System. Washington, DC: National Research Council.
Goodloe, A., and Muñoz, C. (2013). Compositional verification of a communication protocol for a remotely operated aircraft. Science of Computer Programming, 78(7), pp. 813-827.
General Accounting Office (2003). Military Personnel: Navy Actions Needed to Optimize Ship Crew Size and Reduce Total Ownership Costs, GAO-03-520. Washington, DC.
Harris, S.D., and Helander, M.G. (1984). Machine intelligence in real systems: Some ergonomics issues. In G. Salvendy (Ed.), Proceedings of the First International Conference on Human-Computer Interaction. Amsterdam: Elsevier.
Harris, S.D., and Owens, J.M. (1986). Some critical factors that limit the effectiveness of machine intelligence in military systems applications. Journal of Computer-Based Instruction, 13(2), pp. 30-33.
Harris, S.D., Ballard, L., Girard, R., and Gluckman, J. (1993). Sensor fusion and situation assessment: Future F/A-18. In A. Levis and I.S. Levis (Eds.), Science of Command and Control: Part III, Coping with Change. Fairfax, VA: AFCEA International Press.
Haskins, C. (Ed.) (2011). INCOSE Systems Engineering Handbook v. 3.2. San Diego, CA: International Council on Systems Engineering.
Helbing, D. (2013). Globally networked risks and how to respond. Nature, 497, pp. 51-59.
Jones, M., Plott, C., Jones, M., Olthoff, T., and Harris, S. (2010). The Cab Technology Integration Lab: A locomotive simulator for human factors research. Proceedings of the Human Factors and Ergonomics Society 54th Annual Meeting. Human Factors and Ergonomics Society.
Mortveit, H., and Reidys, C. (2008). An Introduction to Sequential Dynamical Systems. New York: Springer.
Newell, A., and Simon, H.A. (1976). Computer science as empirical inquiry: Symbols and search. Communications of the ACM, 19(3), pp. 113-126.
Palmer, J.A. (2013). Superdome partial power outage. Report prepared for Entergy New Orleans, Inc., SMG, Louisiana Stadium and Exposition District. Louisville, KY: Palmer Engineering and Forensics.
Reason, J. (1990). Human Error. Cambridge: Cambridge University Press.
Smullen, R.R., and Harris, S.D. (1988). Air Combat Environment Test and Evaluation Facility (ACETEF). In North Atlantic Treaty Organization Advisory Group for Aerospace Research and Development (AGARD) Conference Proceedings No. 452, "Flight Test Techniques." Papers presented at the Flight Mechanics Panel Symposium, Edwards Air Force Base, CA, 17-20 October 1988.
Tolman, E.C., and Brunswik, E. (1935). The organism and the causal texture of the environment. Psychological Review, 42, pp. 43-77.
Wickens, C.D., and Carswell, C.M. (2007). Human information processing. In G. Salvendy (Ed.), Handbook of Human Factors. New York: John Wiley & Sons.