
The Rationality of Automation Bias in Security Operation Centers


Source
Journal of Information Systems Security
Volume 20, Number 2 (2024)
Pages 89–109
ISSN 1551-0123 (Print)
ISSN 1551-0808 (Online)
Authors
Jack Tilbury — The University of Tulsa, USA
Stephen Flowerday — The University of Tulsa, USA
Publisher
Information Institute Publishing, Washington DC, USA


Abstract

Security Operation Centers (SOCs) comprise people, processes, and technology and are responsible for protecting their organizations against all forms of cyber incident. These teams consist of SOC analysts, ranging from Tier 1 to Tier 3. In defending against cyber-attacks, SOCs monitor and respond to alert traffic from numerous sources. A commonly discussed challenge, however, is the sheer volume of alerts that must be assessed. To aid SOC analysts in the alert triage process, SOCs integrate automation and automated decision aids (ADAs). Research in the human-automation field has demonstrated that automation can degrade operators' cognitive skills, because human operators may become over-reliant on automated systems even in the presence of contradictory information. This cognitive bias is known as automation bias. This study develops four critical success factors (CSFs) for the adoption of automation within SOCs, aimed at mitigating automation bias: (1) Task-based Automation; (2) Process-based Automation; (3) Automation Performance Appraisal; and (4) SOC Analyst Training of Automated Systems. Applying these CSFs achieves a beneficial balance between the SOC analyst and the use of automation. This study promotes the human-in-the-loop approach, whereby experienced and cognitively aware SOC analysts remain at the core of SOC processes.


Keywords

Security Operation Center, SOC, SOC Analyst, Automation Bias, Automated Decision Aids, Critical Success Factors.


References

Afzaliseresht, N., Miao, Y., Michalska, S., Liu, Q., and Wang, H. (2020). From logs to Stories: Human-centred data mining for cyber threat intelligence. IEEE Access, 8, 19089–19099. Scopus. https://doi.org/10.1109/ACCESS.2020.2966760

Agyepong, E., Cherdantseva, Y., Reinecke, P., and Burnap, P. (2020). Challenges and performance metrics for security operations center analysts: A systematic review. Journal of Cyber Security Technology, 4(3), 125–152. https://doi.org/10.1080/23742917.2019.1698178

Akinrolabu, O., Agrafiotis, I., and Erola, A. (2018). The challenge of detecting sophisticated attacks: Insights from SOC Analysts. Proceedings of the 13th International Conference on Availability, Reliability and Security, 1–9. https://doi.org/10.1145/3230833.3233280

Alahmadi, B. A., Axon, L., and Martinovic, I. (2022). 99% False Positives: A Qualitative Study of SOC Analysts’ Perspectives on Security Alarms. 31st USENIX Security Symposium, 19. https://www.usenix.org/conference/usenixsecurity22/presentation/alahmadi

Andrade, R. O., and Yoo, S. G. (2019). Cognitive security: A comprehensive study of cognitive science in cybersecurity. Journal of Information Security and Applications, 48. Scopus. https://doi.org/10.1016/j.jisa.2019.06.008

Baroni, P., Cerutti, F., Fogli, D., Giacomin, M., Gringoli, F., Guida, G., and Sullivan, P. (2021). Self-Aware Effective Identification and Response to Viral Cyber Threats. In Jancarkova T., Lindstrom L., Visky G., and Zotz P. (Eds.), Int. Conf. Cyber Confl., CYCON (Vols. 2021-May, pp. 353–370). NATO CCD COE Publications; Scopus. https://doi.org/10.23919/CyCon51939.2021.9468294

Basyurt, A. S., Fromm, J., Kuehn, P., Kaufhold, M.-A., and Mirbabaie, M. (2022). Help Wanted—Challenges in Data Collection, Analysis and Communication of Cyber Threats in Security Operation Centers. Int. Conf. Wirtschaftsinformatik, WI. 17th International Conference on Wirtschaftsinformatik, WI 2022. Scopus. https://www.scopus.com/inward/record.uri?eid=2-s2.0-85171997510&partnerID=40&md5=30a02b455898c7c2c9d2421d82606470

Bridges, R. A., Rice, A. E., Oesch, S., Nichols, J. A., Watson, C., Spakes, K., Norem, S., Huettel, M., Jewell, B., Weber, B., Gannon, C., Bizovi, O., Hollifield, S. C., and Erwin, S. (2023). Testing SOAR tools in use. Computers and Security, 129, 103201. https://doi.org/10.1016/j.cose.2023.103201

Brown, P., Christensen, K., and Schuster, D. (2016). An Investigation of Trust in a Cyber Security Tool. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 60(1), 1454–1458. https://doi.org/10.1177/1541931213601333

Butavicius, M., Parsons, K., Lillie, M., McCormac, A., Pattinson, M., and Calic, D. (2020). When believing in technology leads to poor cyber security: Development of a trust in technical controls scale. Computers and Security, 98, 102020. https://doi.org/10.1016/j.cose.2020.102020

Chamberlain, L. B., Davis, L. E., Stanley, M., and Gattoni, B. R. (2020). Automated Decision Systems for Cybersecurity and Infrastructure Security. 2020 IEEE Security and Privacy Workshops (SPW), 196–201. https://doi.org/10.1109/SPW50608.2020.00048

Chamkar, S. A., Maleh, Y., and Gherabi, N. (2022). The Human Factor Capabilities in Security Operation Centers (SOC). EDPACS, 66(1), 1–14. https://doi.org/10.1080/07366981.2021.1977026

Chen, Y., Zahedi, F. M., Abbasi, A., and Dobolyi, D. (2021). Trust calibration of automated security IT artifacts: A multi-domain study of phishing-website detection tools. Information and Management, 58(1), 103394. https://doi.org/10.1016/j.im.2020.103394

Cichonski, P., Millar, T., Grance, T., and Scarfone, K. (2012). Computer Security Incident Handling Guide: Recommendations of the National Institute of Standards and Technology (NIST Special Publication 800-61r2). National Institute of Standards and Technology. https://doi.org/10.6028/NIST.SP.800-61r2

Cummings, M. L. (2004). Automation Bias in Intelligent Time Critical Decision Support Systems. American Institute for Aeronautics and Astronautics First Intelligent Systems Technical Conference, Reston, VA. https://web.archive.org/web/20051218092750id_/http://web.mit.edu:80/aeroastro/www/labs/halab/papers/CummingsAIAAbias.pdf

De-Arteaga, M., Fogliato, R., and Chouldechova, A. (2020). A Case for Humans-in-the-Loop: Decisions in the Presence of Erroneous Algorithmic Scores. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–12. https://doi.org/10.1145/3313831.3376638

Devo. (2021). 2021 Devo SOC Performance Report. Ponemon Institute. https://www.devo.com/wp-content/uploads/sites/1/2021/12/2021-Devo-SOC-Performance-Report.pdf

Dietrich, C., Krombholz, K., Borgolte, K., and Fiebig, T. (2018). Investigating System Operators’ Perspective on Security Misconfigurations. Proceedings of the 2018 ACM SIGSAC Conference on Computer and Communications Security, 1272–1289. https://doi.org/10.1145/3243734.3243794

Dykstra, J., Met, J., Backert, N., Mattie, R., and Hough, D. (2022). Action Bias and the Two Most Dangerous Words in Cybersecurity Incident Response: An Argument for More Measured Incident Response. IEEE Security and Privacy, 20(3), 102–106. https://doi.org/10.1109/MSEC.2022.3159471

Endsley, M. R., and Kaber, D. B. (1999). Level of automation effects on performance, situation awareness and workload in a dynamic control task. Ergonomics, 42(3), 462–492. https://doi.org/10.1080/001401399185595

Erola, A., Agrafiotis, I., Happa, J., Goldsmith, M., Creese, S., and Legg, P. A. (2017). RicherPicture: Semi-automated cyber defence using context-aware data analytics. 2017 International Conference On Cyber Situational Awareness, Data Analytics And Assessment (Cyber SA), 1–8. https://doi.org/10.1109/CyberSA.2017.8073399

González-Granadillo, G., González-Zarzosa, S., and Diaz, R. (2021). Security Information and Event Management (SIEM): Analysis, Trends, and Usage in Critical Infrastructures. Sensors, 21(14), 4759. https://doi.org/10.3390/s21144759

Goodall, J. R., Ragan, E. D., Steed, C. A., Reed, J. W., Richardson, D., Huffer, K., Bridges, R., and Laska, J. (2019). Situ: Identifying and Explaining Suspicious Behavior in Networks. IEEE Transactions on Visualization and Computer Graphics, 25(1), 204–214. https://doi.org/10.1109/TVCG.2018.2865029

Hámornik, B. P., and Krasznay, C. (2018). A Team-Level Perspective of Human Factors in Cyber Security: Security Operations Centers. In D. Nicholson (Ed.), Advances in Human Factors in Cybersecurity (Vol. 593, pp. 224–236). Springer International Publishing. https://doi.org/10.1007/978-3-319-60585-2_21

Hauptman, A. I., Schelble, B. G., McNeese, N. J., and Madathil, K. C. (2023). Adapt and overcome: Perceptions of adaptive autonomous agents for human-AI teaming. Computers in Human Behavior, 138. Scopus. https://doi.org/10.1016/j.chb.2022.107451

Husák, M., Sadlek, L., Špaček, S., Laštovička, M., Javorník, M., and Komárková, J. (2022). CRUSOE: A toolset for cyber situational awareness and decision support in incident handling. Computers and Security, 115, 102609. https://doi.org/10.1016/j.cose.2022.102609

Jajodia, S., Liu, P., Swarup, V., and Wang, C. (Eds.). (2010). Cyber Situational Awareness: Issues and Research (Vol. 46). Springer US. https://doi.org/10.1007/978-1-4419-0140-8

Kassner, M. (2015, February 2). Anatomy of the Target data breach: Missed opportunities and lessons learned. ZDNet/tech. https://www.zdnet.com/article/anatomy-of-the-target-data-breach-missed-opportunities-and-lessons-learned/

Kinyua, J., and Awuah, L. (2021). AI/ML in security orchestration, automation and response: Future research directions. Intelligent Automation and Soft Computing, 28(2), 527–545. Scopus. https://doi.org/10.32604/iasc.2021.016240

Kokulu, F. B., Soneji, A., Bao, T., Shoshitaishvili, Y., Zhao, Z., Doupé, A., and Ahn, G.-J. (2019). Matched and Mismatched SOCs: A Qualitative Study on Security Operations Center Issues. Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security, 1955–1970. https://doi.org/10.1145/3319535.3354239

Merritt, S. M., Ako-Brew, A., Bryant, W. J., Staley, A., McKenna, M., Leone, A., and Shirase, L. (2019). Automation-Induced Complacency Potential: Development and Validation of a New Scale. Frontiers in Psychology, 10, 225. https://doi.org/10.3389/fpsyg.2019.00225

Metzger, U., and Parasuraman, R. (2005). Automation in Future Air Traffic Management: Effects of Decision Aid Reliability on Controller Performance and Mental Workload. Human Factors: The Journal of the Human Factors and Ergonomics Society, 47(1), 35–49. https://doi.org/10.1518/0018720053653802

Miloslavskaya, N. (2016). Security Operations Centers for Information Security Incident Management. 2016 IEEE 4th International Conference on Future Internet of Things and Cloud (FiCloud), 131–136. https://doi.org/10.1109/FiCloud.2016.26

Mosier, K. L., Skitka, L. J., Heers, S., and Burdick, M. (1998). Automation Bias: Decision Making and Performance in High-Tech Cockpits. The International Journal of Aviation Psychology, 8(1), 47–63. https://doi.org/10.1207/s15327108ijap0801_3

Naseer, H., Maynard, S. B., and Desouza, K. C. (2021). Demystifying analytical information processing capability: The case of cybersecurity incident response. Decision Support Systems, 143, 113476. https://doi.org/10.1016/j.dss.2020.113476

Neupane, S., Ables, J., Anderson, W., Mittal, S., Rahimi, S., Banicescu, I., and Seale, M. (2022). Explainable Intrusion Detection Systems (X-IDS): A Survey of Current Methods, Challenges, and Opportunities. IEEE Access, 10, 112392–112415. Scopus. https://doi.org/10.1109/ACCESS.2022.3216617

Ofte, H. J., and Katsikas, S. (2023). Understanding situation awareness in SOCs, a systematic literature review. Computers and Security, 126, 103069. https://doi.org/10.1016/j.cose.2022.103069

Osborne, C. (2021, May 13). Colonial Pipeline ransomware attack: Everything you need to know. ZDNet/tech. https://www.zdnet.com/article/colonial-pipeline-ransomware-attack-everything-you-need-to-know/

Palmer, D. (2017, October 27). WannaCry ransomware: Hospitals were warned to patch systems to protect against cyber-attack - but didn't. ZDNet/tech. https://www.zdnet.com/article/wannacry-ransomware-hospitals-were-warned-to-patch-system-to-protect-against-cyber-attack-but-didnt/

Parasuraman, R., and Manzey, D. H. (2010). Complacency and Bias in Human Use of Automation: An Attentional Integration. Human Factors: The Journal of the Human Factors and Ergonomics Society, 52(3), 381–410. https://doi.org/10.1177/0018720810376055

Parasuraman, R., Sheridan, T. B., and Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 30(3), 286–297. https://doi.org/10.1109/3468.844354

Rainbolt, G. W., and Dwyer, S. L. (2012). Critical Thinking: The Art of an Argument (Vol. 1). Wadsworth Cengage Learning.

Ryan, T. J., Alarcon, G. M., Walter, C., Gamble, R., Jessup, S. A., Capiola, A., and Pfahler, M. D. (2019). Trust in Automated Software Repair: The Effects of Repair Source, Transparency, and Programmer Experience on Perceived Trustworthiness and Trust. In A. Moallem (Ed.), HCI for Cybersecurity, Privacy and Trust (Vol. 11594, pp. 452–470). Springer International Publishing. https://doi.org/10.1007/978-3-030-22351-9_31

Singh, I. L., Molloy, R., and Parasuraman, R. (1993). Individual Differences in Monitoring Failures of Automation. The Journal of General Psychology, 120(3), 357–373. https://doi.org/10.1080/00221309.1993.9711153

Skitka, L. J., Mosier, K. L., and Burdick, M. (1999). Does automation bias decision-making? International Journal of Human-Computer Studies, 51(5), 991–1006. https://doi.org/10.1006/ijhc.1999.0252

van der Kleij, R., and Leukfeldt, R. (2020). Cyber Resilient Behavior: Integrating Human Behavioral Models and Resilience Engineering Capabilities into Cyber Security. In T. Ahram and W. Karwowski (Eds.), Advances in Human Factors in Cybersecurity (Vol. 960, pp. 16–27). Springer International Publishing. https://doi.org/10.1007/978-3-030-20488-4_2

Vielberth, M., Böhm, F., Fichtinger, I., and Pernul, G. (2020). Security Operations Center: A Systematic Study and Open Challenges. IEEE Access, 8, 227756–227779. https://doi.org/10.1109/ACCESS.2020.3045514