Minding the IT-OT Gap: Student Project

As we have built the industrial cybersecurity program, one of the coolest things we have done is develop our ESET 1181 Introduction to Cyber-Physical Systems course.

Industrial Cybersecurity students take the course in their first semester. I like to think of it as the entire program packed into a nutshell: with 20+ hands-on activities, students get brief exposure to topics that other courses will cover in greater depth.

During lucky week 13, we discuss the roles and responsibilities that professionals will encounter, and describe the factors (especially the non-technology factors) that have created the IT-OT gap.

I describe several real-life experiences where the gap loomed large. Then I ask students to deploy their creative energies to express the gap. Over the years, students have submitted poetry, skits, videos, and posters. I wanted to highlight a couple of student videos that you might find entertaining. Happy viewing!

New Video Describing ISU Industrial Cybersecurity Degree Program

One of the things I did not fully anticipate when I left FireEye/Mandiant to lead ISU’s Industrial Cybersecurity Program was the full range of skills I would need to excel as a Program Coordinator.

Naturally, domain expertise is important, but the rest of the role (curriculum design and development, overseeing faculty, interacting with students from a variety of ages and social backgrounds, receiving calls from parents, hiring faculty, guiding adjuncts, running advisory committees, creating exams and scoring rubrics, selecting high-quality materials, working with department, college, and university curriculum review committees, making proposals to the state board of education, and submitting grants) has been much more involved and challenging than I anticipated!

I was very pleased to have some good marketing help in the form of the video above. It features a variety of faculty and staff who have made the Industrial Cybersecurity program a great success over the years! I love that it features our real students.

A special shout-out to Ryan Pitcher, who is among the most dedicated faculty I can imagine. At school early and late. Always willing to take time for students and peer instructors alike. It is faculty like him who make ESTEC a hiring staple for technical professionals at regionally, nationally, and globally competitive firms!

If you are looking for well-prepared, entry-level industrial cybersecurity talent, please reach out to me.

Easterly: Software Liability Regime for Critical Infrastructure

When I read congressional testimony from Jen Easterly on January 31, 2024, I was quite surprised. Check out this quote:

Unfortunately, the technology base underpinning much of our critical infrastructure is inherently insecure, because for decades software developers have been insulated from responsibility for defects in their products. This has led to misaligned incentives that prioritize features and speed to market over security, leaving our nation vulnerable to cyber invasion. That must stop.

The discussion over liability for software has been going on for a long time. Dale Peterson touched on it in his post last week.

Our world depends on software – for which there are varying degrees of quality control. And the companies selling this software intentionally disclaim responsibility for their products via EULAs. Remarkably – but not unexpectedly – this is true even in the ICS space. Here is an excerpt from a leading ICS vendor’s EULA. I don’t intend to pick on this vendor alone, because this type of language is standard practice across the industry.

VENDOR makes no representation or warranty, express or implied, that the operation of the Software will be uninterrupted or error free, or that the functions contained in the Software will meet or satisfy Your intended use or requirements; You assume complete responsibility for decisions made or actions taken based on information obtained using the Software. In addition, due to the continual development of new techniques for intruding upon and attacking networks, VENDOR does not warrant that the Software or any equipment, system or network on which the Software is used will be free of vulnerability to intrusion or attack.

The key passage there: “VENDOR does not warrant that the software… will be free of vulnerability.”

Can you imagine requiring drivers who cross a high suspension bridge to sign a user license that says “we do not warrant that this bridge will support your vehicle”?

In my CYBR 3383 Security Design Principles class, one of the principles we highlight is “professional liability” — which means putting one’s name, reputation, honor, and career on the line.

Putting your name on the line is the standing expectation of engineers who sign engineering documents. And it extends to other fields as well.

In general, professional liability is missing from the software industry.

While I am not an expert in software development, I observe that there is a lack of support for software engineering licensure. I am not saying a license should be required in all cases, but certain software is running processes upon which millions of people depend for clean water and reliable electricity every day. Shouldn’t there be some overarching minimum professional standard for these individuals and their work?

Easterly envisions:

We must drive toward a future where defects in our technology products are a shocking anomaly, a future underpinned by a software liability regime based on a measurable standard of care and safe harbor provisions for software developers who do responsibly innovate by prioritizing security.

I’d like to hear more about that…

Cybersecurity in the Control Systems Engineering PE Exam

I was visiting the NCEES website the other day. NCEES is the National Council of Examiners for Engineering and Surveying — the group that produces and maintains the Fundamentals of Engineering (FE) and Principles and Practice of Engineering (PE) examinations.

In the United States, passing the PE exam is a requirement for obtaining professional licensure as an engineer.

PE exams are offered in 16 fields, ranging (alphabetically) from Agricultural & Biological to Structural.

Over the course of seven years developing the country’s first Industrial Cybersecurity degree program, I have asked myself what success would look like for the country.

One core idea (and I am not the only person to think this way – see the DOE Cyber-Informed Engineering effort) is that professional licensure for all engineering AND engineering technology fields would require some basic knowledge, or even better, basic competency, in cybersecurity.

So, when reviewing the NCEES PE exam specifications document for Control Systems Engineering (CSE), I was pleased to find an entry for “Security”. It states:

D. Security of Industrial Automation and Control Systems
1. Security (e.g., physical, cyber, network, firewalls, routers, switches, protocols,
hubs, segregation, access controls)
2. Security life cycle (e.g., assessment, controls, audit, management of change)
3. Requirements for a security management system
4. Security risk assessment and system design
5. Product development and requirements
6. Verification of security levels (e.g., level 1, level 2)

This seems like a great start. I was left wondering what the exam questions might actually entail. Maybe I will have to take the exam to find out! I was able to gather that the CSE exam is offered exactly once each year at Pearson VUE centers.

Perhaps more importantly, I wondered:
* Are these the most important security concepts for a control systems engineer to know?
* How will cybersecurity knowledge affect the behavior of a control systems engineer?
* What are the correct answer rates for each question?

Interestingly, the exam specifications for the following exams (where we might hope to find it) do not name security (as in cybersecurity) among covered topics:
* Electrical and computer – electronics, controls, and communications
* Electrical and computer – power
* Industrial and systems

Specifications for the other exams, where we would be less likely to expect it (Agricultural and Biological, Architectural, Chemical, Civil, Environmental, Fire Protection, Mechanical, Metallurgical, Mining and Mineral Processing, Naval Architecture and Marine, Nuclear, Petroleum, and Structural), do not mention cybersecurity despite its cross-cutting implications.

Helpfully, the NCEES website makes pass-rate information available for all of the exams. A review of this data shows that in 2023, the Control Systems Engineering exam was administered 221 times, with a 57% first-time pass rate.

The data also indicates that roughly 19,000 individuals take a PE exam in some field for the first time each year (the data provided is biannual for some tests; I multiplied those figures by two to make an annual estimate).
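The annualization arithmetic above can be sketched as a tiny helper. The function name and the figures below are hypothetical (not NCEES data), and I am assuming the biannual counts cover a six-month reporting window that gets scaled up to a full year:

```python
def annual_estimate(first_time_takers: int, period_months: int = 12) -> int:
    """Scale a reported first-time-taker count to a 12-month estimate.

    Counts reported over a 12-month window pass through unchanged;
    counts covering a shorter window are scaled up so every exam can
    be summed on the same annual basis.
    """
    return first_time_takers * 12 // period_months

# Illustrative, made-up figures: an exam reporting 150 first-time
# takers over a 6-month window contributes an estimated 300 per year.
print(annual_estimate(150, period_months=6))  # 300
print(annual_estimate(221))                   # 221 (already annual)
```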

In short, I believe there is a real opportunity to bake cybersecurity into the engineering discipline here, but it is going to require some serious effort!

The IA (Cybersecurity) Workforce

A few days ago I noticed a White House release regarding its federal artificial intelligence workforce development program, which you can read about at this link. Part of the initiative involves creating a workforce development and recruitment effort modeled after the NSA/NSF Scholarship for Service (SFS) program.

Over dinner at the SFS student job fair in Washington, DC last month, the SFS Program Office announced that Victor Piotrowski would leave his role with SFS to help stand up the federal AI workforce initiative. Several faculty members from long-standing SFS schools, including me from Idaho State University and Dr. Jim Alves-Foss from the University of Idaho, stood in ovation for Victor’s dedication. I shouted, “Thank you, Victor!” at the top of my voice.

I think the new federal AI initiative is very interesting, and I’m glad that the CyberCorps Scholarship for Service initiative can serve as a useful model for what federal policy can do to address this important need.

All of our lives are being impacted by AI. I just finished reading Mustafa Suleyman and Michael Bhaskar’s work “The Coming Wave”.

To me, the book argues persuasively that individuals, communities, academics, policy-makers, and executives need to think differently about technology in general, the evolution of technology, and the ramifications of technology across all disciplines.

I think the most intriguing part of the book was the discussion of what the next 10 or 20 years could hold, based on current developments and market forces. The authors’ treatment of containment as the only strategy that can harness the wave for long-term human flourishing (homo technologicus) was also of great interest. The book featured several examples of failed technological containment and several examples of moderately successful technological containment.

Near the end of the book, the authors describe Luddite efforts to physically destroy factory machines, such as automated looms, during the Industrial Revolution in England. The authors assert that within 50 or so years of those destructive attacks, Luddite concerns were irrelevant, because by then the children of the technological resistance were benefiting directly from the vastly improved lives that the technology ultimately enabled.

This got me thinking about the relationships between technology, individual well-being, social well-being, and political structures. On the relationship to political structures, I realized more fully that communism as originally conceptualized by Marx and Engels was a political response to fundamentally technological advancements.

Suleyman and Bhaskar did not make this connection directly, but they did point out that governments have generally failed to develop sound policies specifically around technological advancement. They call for intentional efforts to develop a cadre of interdisciplinary professionals dedicated to advancing the field of technological safety/security.