CS4CA Conference Recap

Last week I attended the Cyber Security for Critical Assets conference in Houston. I was impressed. Low key, probably 200 attendees plus reps from 20 vendors. High asset owner presence. I loved hearing the hallway conversations — people talking about real challenges and real experiences.

I was surprised by the candor in the technical deep dive sessions: OT security practitioners at asset owners talking about how they had built their programs, including what technologies they had selected for what purpose, and what they planned to do next.

I moderated a panel on workforce development. While it seems everyone likes training, ours was the only presentation that dealt specifically with intentionally developing your people. Panelists included Nikolai Zlatarev of Castleton Commodities, Jude Ejiobi of University of Houston Downtown, and Chuck Bondurant of the Texas Public Utilities Commission.

It was a great panel, and I appreciated representation from industry, academia, and (quasi) government. Each panelist had a different professional pathway into OT security, and that was a key takeaway. It is such an interdisciplinary field.

Nikolai emphasized that the best way to develop skills was through real-world experience. He liked to throw people all the way in. He explained that his career was one great challenge after another. He was obviously a man who had learned to thrive in ambiguous environments, with a knack for keeping a positive attitude.

Jude does a good bit of security consulting work, and was brought into academia as a lecturer, but now helps direct the UH Downtown Master's in Security Management. Practical and well-spoken.

Chuck came into OT security when he learned that he wasn’t cut out for retirement after military service. His job involves liaising with and supporting Texas utilities on cybersecurity. When he took the job, he had no idea what a PLC was, but his military cybersecurity experience gave him a solid foundation to build on.

I think there is great value to having candid conversations about developing ourselves and our programs. I will look forward to next year!

Student Research Symposium and Engineering Technicians

Two weeks ago ISU hosted its annual Research and Creative Works Symposium. It is a really neat event that gives any student – graduate or undergraduate – a forum at which to share their work. Students simply sign up and, voilà, they have a speaking spot in front of a panel of judges and anyone who might come to support them!

I think this is a very useful, low stakes practice for students.

This year I am supervising two graduate student research projects. One student, who chose to present at the symposium, is investigating the effects of ransomware in Southeast Idaho. A handful of organizations, including local governments, health care providers, and manufacturers, have taken a ransomware hit. The student's research is qualitative, relying on interviews with individuals who were impacted. Really insightful to have a collection of first-hand perspectives.

I was so impressed that all my graduate students will be required to present next year!

But that wasn’t what I found the most intriguing. After my student presented, I stuck around to hear other presentations.

I was especially intrigued by research into training levels of psychiatric technicians. Psychiatric care technicians work in mental health facilities and have more regular contact with patients than any other health care provider. They engage in a variety of interventions, including group therapy sessions.

Despite the boots-on-the-ground role that these technicians play for patients, in Idaho and 45 other states (if I remember the statistic correctly), there is no baseline education or training requirement for such technicians.

The student had created and administered a questionnaire to a variety of psychiatric care professionals at a local psychiatric hospital to gauge what level of training psychiatric technicians should have. The student's research found (again, if I remember correctly) that a technician should have between 8 and 40 hours of training related to their role – training that could be completed concurrently with job duties.

When she was finished, another member of the audience (who I surmised was the student’s supervisor) asked “what do you plan on doing next?”

“Well,” the student responded, “In our program we have learned about lobbying. I am going to lobby for change. I think there needs to be a minimum level of training these technicians need to have.”

Ahhh. Now you can see where I am going…

In the OT sphere, in the USA alone, we have probably several hundred thousand technicians – instrumentation and control technicians, electrical engineering technicians, mechanical engineering technicians – who install, configure, operate and maintain industrial control systems – systems that provide electricity, drinking water, and manufactured goods. Not a single state (0 of 50) requires them to have any cybersecurity training or demonstrated competence.

“Wow,” I thought, “We require barbers and hairdressers to have professional licensure in all 50 states. But there are no requirements for those individuals whose job performance directly affects the well-being of millions.”

This is an addressable issue. Let’s get on it.

Tech Expo 2024!

The ISU College of Technology hosts an annual technology fair for middle school and high school students from across Southeast Idaho. The event, held in the ICCU Dome, attracts more than 2,000 teens to explore technology careers.

Industrial Cybersecurity has hosted a Tech Expo “booth” for 7 years. During the event, I stand in the thoroughfare and ask the youthful attendees “have you ever hacked a computer? Come on over and we will show you how!” 

Not a tough sell.

This year I had several of my current Industrial Cybersecurity students run the booth — coaching the high-schoolers through the exercise. We sit the high-schoolers across from one another and help them create a “secret” file via command line, use a default ssh password to access the other person’s computer, steal the secret, and race to shut off the other person’s computer.
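For the curious, the exercise runs roughly like this (a sketch only: the username, IP address, and file contents here are hypothetical stand-ins for our lab setup, and the "victim" machine is the partner's computer across the table):

```shell
# On your own machine: create a "secret" file from the command line
echo "the crown jewels" > secret.txt

# Log in to your partner's machine over ssh -- the account still uses
# the factory-default password, which is the whole point of the lesson
ssh student@10.0.0.12

# Steal your partner's secret by copying it back to your own machine
scp student@10.0.0.12:secret.txt stolen.txt

# Race to shut the other computer down first
ssh student@10.0.0.12 'sudo shutdown -h now'
```

The commands themselves are trivial; the takeaway the high-schoolers walk away with is that a default password is no password at all.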

Students who have never thought about this get quite excited. When asked what they learned, some students reply that they didn’t know hacking was so easy.

The event is a whirlwind as the booth stays full for four hours. I would estimate we ran 50 or 60 students through the exercise.

What impressed me most this year is that when I asked students what they were planning to do after they finished high school, I had three or four tell me “I am going into cybersecurity.” I even had one tell me, “I heard about hackers taking out a power grid. I want to do that.”

With a big smile I was able to tell them, “We are almost full for Fall, but I think there’s still some room. Just call or email, and we will get you signed up!”

PCAST Report on CPS Resilience

I enjoyed reviewing the President's Council of Advisors on Science and Technology (which boasts members from some big-name institutions) “Report to the President” on a “Strategy for Cyber-Physical Resilience”.

The Strategy offers a total of 14 recommendations across four categories:

  • Establish performance goals
    • 1.A Define sector minimum viable operating capabilities and minimum viable
      delivery objectives
    • 1.B Establish and measure leading indicators
    • 1.C Commit to radical transparency and stress testing
  • Bolster and coordinate research and development
    • 2.A Establish a National Critical Infrastructure Observatory
    • 2.B Formulate a national plan for cyber-physical resilience research
    • 2.C Pursue cross-ARPA coordination
    • 2.D Radically increase engagement on international standards
    • 2.E Embed content on cyber-physical resilience skills into engineering professions
      and education programs
  • Break down silos and strengthen government cyber-physical resilience capacity
    • 3.A Establish consistent prioritization of critical infrastructure
    • 3.B Bolster Sector Risk Management Agencies staffing and capabilities
    • 3.C Clarify and strengthen Sector Risk Management Agency authorities
    • 3.D Enhance the DHS Cyber Safety Review Board (CSRB)
  • Develop greater industry, board, CEO and executive accountability and flexibility
    • 4.A Enhance Sector Coordinating Councils
    • 4.B Promote supply chain focus and resilience by design

The report provides some context and insight on each of these. I can’t help but comment on 1.A. “Define sector minimum viable operating capabilities and minimum viable delivery objectives”.

I really like this idea because it shifts focus from the system itself (networks, software, process equipment) to the delivery of the critical function (power, water, food, etc.). This is a great step in thinking through what matters most.

My observation is that in a highly interconnected world, with global supply chains, setting a scope for performance for an entire sector seems challenging because sectors don’t really “exist”. They are not monoliths. Their value is the service they provide to various users and customers, rather than to themselves.

Consider that the number and size of infrastructure service providers can vary greatly depending on geography. What is the minimum level of electricity or water to sustain quality of life for Southeast Idaho? For the city of Los Angeles? For the state of Texas? So the approach has got to include both sector and geography.

And within those geographies, various organizations rely on infrastructure services. Who should receive those services first?

Then, we have to recognize that sources of communications, energy, food, water, and medicine frequently (most frequently?, almost always?) operate across geographic boundaries — including in some (many?) cases across national boundaries.

Finally, each sector is not truly independent of other sectors. One geo-sector’s minimum viability may depend upon and/or conflict with that of another geo-sector.

I am pleased that the PCAST took up this topic. I am very optimistic about incorporating function-centered thinking. I find intriguing the idea of establishing minimum viable operating capabilities and objectives. However, I remain concerned that administrative constructs based primarily on sectors and geographies leave significant gaps.

There are some words in the strategy, such as “enhance supply chain focus” and “enhance cross-sector coordinating councils” that could address this concern, but I found this presented as “do this too” rather than as an indispensable component of CPS robustness and resiliency.

I am not advocating abandonment of sector and geography thinking, but I believe we will need some additional paradigms and/or alternative perspectives to do this well.

Minding the IT-OT Gap: Student Project

As we have built the industrial cybersecurity program one of the coolest things we have done is develop our ESET 1181 Introduction to Cyber-Physical Systems course.

Industrial Cybersecurity students take the course in their first semester. I like to think of it as the entire program packed into a nutshell. With 20+ hands-on activities, students get brief exposure to topics that other courses will cover in greater depth.

During lucky week 13, we discuss the roles and responsibilities that professionals will encounter, and describe the factors (especially the non-technology factors) that have created the IT-OT gap.

I describe several real-life experiences where the gap loomed beneath. Then I ask students to deploy their creative energies to express the gap. Over the years students have submitted poetry, skits, videos, and posters. I wanted to highlight a couple of student videos that you might find entertaining. Happy viewing!

New Video Describing ISU Industrial Cybersecurity Degree Program

One of the things I did not fully anticipate when I left FireEye/Mandiant to lead ISU’s Industrial Cybersecurity Program was the full range of skills I would need to excel as a Program Coordinator.


Naturally, domain expertise is important — but the rest of the job has been much more involved and challenging than I anticipated: curriculum design and development, hiring and overseeing faculty, guiding adjuncts, interacting with students from a variety of ages and social backgrounds, receiving calls from parents, running advisory committees, creating exams and scoring rubrics, selecting high-quality materials, dealing with department, college, and university curriculum review committees, making proposals to the state board of education, and submitting grants!

I was very pleased to have some good marketing help in the form of the video above. It features a variety of faculty and staff that have made the Industrial Cybersecurity program a great success over the years! I love that it features our real students.

A special shout-out to Ryan Pitcher, who is among the most dedicated faculty I can imagine. At school early and late. Always willing to take time for students and peer instructors alike. It is faculty like him that make ESTEC a hiring staple for technical professionals at regionally, nationally, and globally competitive firms!

If you are looking for well prepared entry level industrial cybersecurity talent, please reach out to me.

Easterly: Software Liability Regime for Critical Infrastructure

When I read congressional testimony from Jen Easterly on January 31, 2024, I was quite surprised. Check out this quote:

Unfortunately, the technology base underpinning much of our critical infrastructure is inherently insecure, because for decades software developers have been insulated from responsibility for defects in their products. This has led to misaligned incentives that prioritize features and speed to market over security, leaving our nation vulnerable to cyber invasion. That must stop.

The discussion over liability for software has been going on for a long time. Dale Peterson touched on it in his post last week.

Our world depends on software – for which there are varying degrees of quality control. And the companies selling this software intentionally disclaim responsibility for their products via EULAs. Remarkably – but not unexpectedly – this is true even in the ICS space. Here is an excerpt from a leading ICS vendor's EULA. I don't intend to pick on this vendor alone, because this type of language is standard practice across the industry.

VENDOR makes no representation or warranty, express or implied, that the operation of the Software will be uninterrupted or error free, or that the functions contained in the Software will meet or satisfy Your intended use or requirements; You assume complete responsibility for decisions made or actions taken based on information obtained using the Software. In addition, due to the continual development of new techniques for intruding upon and attacking networks, VENDOR does not warrant that the Software or any equipment, system or network on which the Software is used will be free of vulnerability to intrusion or attack.

Key passage there “VENDOR does not warrant that the software… will be free of vulnerability.”

Can you imagine requiring drivers who cross a high suspension bridge to sign a user license that says “we do not warrant that this bridge will support your vehicle”?

In my CYBR 3383 Security Design Principles class, one of the principles we highlight is “professional liability” — which means putting one’s name, reputation, honor, career, on the line.

Putting your name on the line is the standing expectation of engineers who sign engineering documents. And it extends to other fields as well.

In general, professional liability is missing from the software industry.

While I am not an expert in software development, I observe that there is a lack of support for software engineering licensure. I am not saying a license should be required in all cases, but certain software is running processes upon which millions of people depend for clean water and reliable electricity every day. Shouldn’t there be some overarching minimum professional standard for these individuals and their work?

Easterly envisions:

We must drive toward a future where defects in our technology products are a shocking anomaly, a future underpinned by a software liability regime based on a measurable standard of care and safe harbor provisions for software developers who do responsibly innovate by prioritizing security.

I’d like to hear more about that…

Cybersecurity in Control Systems Engineer PE Exam

I was visiting the NCEES web site the other day. NCEES is the National Council of Examiners for Engineering and Surveying. That is the group that produces/maintains the Fundamentals of Engineering (FE) and Principles and Practices of Engineering (PE) examinations.

Across the United States, passing the PE exam is a requirement for obtaining professional licensure as an engineer.

PE exams are offered in 16 fields, ranging (alphabetically) from Agricultural & Biological to Structural.

During the course of seven years developing the country’s first Industrial Cybersecurity degree program, I have asked myself what success would look like for the country.

One core idea (and I am not the only person to think this way – see DOE Cyber Informed Engineering effort) is that professional licensure for all engineering AND engineering technology fields would require some basic knowledge, or even better, basic competency in cybersecurity.

So, when reviewing the NCEES PE exam specifications document for Control Systems Engineering (CSE), I was pleased to find an entry for “Security”. It states:

D. Security of Industrial Automation and Control Systems
1. Security (e.g., physical, cyber, network, firewalls, routers, switches, protocols,
hubs, segregation, access controls)
2. Security life cycle (e.g., assessment, controls, audit, management of change)
3. Requirements for a security management system
4. Security risk assessment and system design
5. Product development and requirements
6. Verification of security levels (e.g., level 1, level 2)


This seems like a great start. I was left wondering what the exam questions might actually entail. Maybe I will have to take the exam to find out! I was able to gather that the CSE exam is offered exactly once each year at Pearson VUE centers.

Perhaps more importantly, I wondered:
* Are these the most important security concepts for a control systems engineer to know?
* How will cybersecurity knowledge affect the behavior of a control systems engineer?
* What are the correct answer rates for each question?

Interestingly, the exam specifications for the following exams (where we might hope to find it) do not name security (as in cybersecurity) among covered topics:
* Electrical and computer – electronics, controls, and communications
* Electrical and computer – power
* Industrial and systems

Specifications for the other exams (where we might be less likely to expect it): Agricultural and Biological, Architectural, Chemical, Civil, Environmental, Fire Protection, Mechanical, Metallurgical, Mining and Mineral Processing, Naval Architecture and Marine, Nuclear, Petroleum, and Structural do not mention cybersecurity despite its cross-cutting implications.

Helpfully, the NCEES website makes pass-rate information available for all of the exams. A review of this data shows that in 2023, the Control Systems Engineering exam was administered 221 times, with a 57% first-time pass rate.

The data also indicates that roughly 19,000 individuals take a PE exam in some field for the first time each year (for some exams the data is reported semiannually; I multiplied those figures by two to produce an annual estimate).

In short, I believe there is a real opportunity to bake cybersecurity into the engineering discipline here, but it is going to require some serious effort!

The IA (Cybersecurity) Workforce

A few days ago I noticed a White House release regarding its federal artificial intelligence workforce development program, which you can read about at this link. Part of the initiative involves creating a workforce development and recruitment effort modeled after the NSA/NSF Scholarship for Service (SFS) program.

Over dinner at the SFS student job fair in Washington, DC last month, the SFS Program Office announced that Victor Piotrowski would leave his role with SFS to help stand up the federal AI workforce initiative. Several faculty members from long-standing SFS schools, including myself from Idaho State University and Dr. Jim Alves-Foss from the University of Idaho, gave Victor a standing ovation for his dedication. I shouted, “Thank you, Victor!” at the top of my voice.

I think the new federal AI initiative is very interesting, and I'm glad that the CyberCorps Scholarship for Service initiative can serve as a useful model for what federal policy can do to address this important need.

All of our lives are being impacted by AI. I just finished reading Mustafa Suleyman and Michael Bhaskar's work “The Coming Wave“.

To me, the book argues persuasively that individuals, communities, academics, policy-makers, and executives need to think differently about technology in general, the evolution of technology, and the ramifications of technology across all disciplines.

I think the most intriguing part of the book was the discussion of what the next 10 or 20 years could hold, based on current developments and market forces. The authors' treatment of containment as the only strategy that can harness the wave for long-term human (homo technologicus) benefit was also of great interest. The book featured several examples of failed technological containment and several examples of moderately successful technological containment.

Near the end of the book, the authors describe Luddite efforts to physically destroy factory machines, such as automated looms, during the industrial revolution in England. The authors assert that within 50 or so years of those destructive attacks, Luddite concerns were irrelevant, because by then the children of the technological resistance were benefiting directly from the vastly improved lives that the technology ultimately enabled.

This got me thinking about the relationships between technology, individual well-being, social well-being, and political structures. On the relationship to political structures, I realized more fully that communism, as originally conceptualized by Marx and Engels, was a political response to fundamentally technological advancements.

Suleyman and Bhaskar did not make this connection directly, but they did point out that governments have generally failed to develop sound policies specifically around technological advancement. They call for intentional efforts to develop a cadre of interdisciplinary professionals dedicated to advancing the field of technological safety/security.

The Secretary and CICUL

I had the opportunity to meet with Energy Secretary Jennifer Granholm for about 5 minutes this past week on her visit to Idaho National Laboratory.

In my short time with her, I found Secretary Granholm energetic and inquisitive.

INL’s Eleanor Taylor, two ISU students interning at the INL, and I led Secretary Granholm and Idaho Representative Mike Simpson on a tour of the Cybercore Integration Center University Laboratory (CICUL).

As an INL/ISU joint appointee, I have the opportunity to leverage INL’s two decades of leadership in industrial cybersecurity to help create the next generation of engineers, technicians, analysts, managers, and researchers to defend the country’s critical infrastructures — it is an exciting mission!

We intend to use the CICUL to design, pilot and accelerate adoption of cyber-informed engineering of industrial control systems (ICS) by developing transformative educational experiences and conducting innovative research.

So, there was no better way to show our mission and our progress than by turning the time over to a pair of fantastic students/interns to describe the equipment in the laboratory, and explain their summer projects.

They did a great job describing the wastewater treatment skid, the user manual and startup guide they created, and their plans for allowing universities to integrate the skid into their educational offerings. Great opportunity for them!