Bad, Good, and Super-Cringey Infosec Lab Environments

I’ve had the (dubious) honor and privilege of witnessing a couple decades of IT educational lab environments. Even after well over a decade of full-time cybersecurity work, I often still have to re-certify on various tasks which require I complete a live lab or CTF (capture the flag). I build such environments myself. The way we train people to use the command line, explore tools, and learn about how security works has changed drastically, mostly for the better.

When I was in tech school and college ages ago, most training was still theoretical – on paper or fill-in-the-blank. Forward-thinking instructors might find a way to simulate systems and tasks in some manner, but it was a tiresome, manual process. Virtualized environments have been revolutionary for IT training in general. We now have the ability to build complex network environments that closely simulate reality, yet can be reset or manipulated by instructors at will. Gone (should be) the days of lab questions that required a single static response when reality allowed for many. Gone (should be) the days of unstable and unrealistic labs that require lots of physical intervention.

Unfortunately, problems persist across many educational and certification lab environments. I still find myself in CTF or certification environments that teach more about how to game or fix the lab than about the learning objectives. When I bring this up to other infosec professionals, I often get the same response: “well, we had to figure them out, and surviving made us clever…”. That’s certainly debatable, but in my opinion, it is as much gate-keeping as “we had to network in strip clubs” or “we didn’t have documentation outside of man pages”.

In an educational environment, learning objectives and expectations should be clear and quantifiable to both the students and the instructors. If the environment accomplishes those objectives for the intended audience, then it did its job. If it does not, it failed. We have to recognize that not every student is the same type of learner we are, and not every student has the same technical background. Remember that this is the era of the tablet: many young people today do not own a home PC, much less one they administer. Providing challenges for exceptional students is fine, but those challenges should not be a barrier to expected learning progression.

All that said, let’s talk about some essential characteristics of a good educational lab environment – whether it be a CTF, a classroom exercise, a certification test, or a cyber range:

  1. It always provides consistent output when given a specific input.
    This seems obvious, but it’s not. Consistency often breaks down under heavy load, when other users modify the same environment, or when there is significant system latency. For example, if a lab host doesn’t always receive a DHCP address, takes a very long time to boot, or can be unexpectedly reset or re-addressed by other users, students may follow the instructions or a logical set of steps and receive different output than their neighbors (or no output at all). This can be acceptable if it is irrelevant to the specified task, but if it impacts objectives or confuses the target audience, it’s a big problem. A simple pre-flight readiness check (see the first sketch after this list) catches many of these failures before a student is ever seated.
  2. There is a clear (and working) way to report bugs or outages.
    No lab environment is foolproof. I’ve never seen one that was completely stable and bug-free without some human administration. There should always be a way for participants to report bugs in systems, software, or tasks. If you are providing a 24/7/365 environment, you should set clear expectations about when help is available and what to do if problems occur outside support hours. If legitimate bugs or outages impact a participant’s score or course completion, there should be clear ways for them to record the problem and contest the outcome.
  3. Where applicable, all valid responses are accepted.
    It is the nature of the command line (and computing environments in general) that there are frequently multiple ways to reach the same outcome. There are many different ways to navigate and reference file structures in Windows and Linux. There are multiple commands, shortcuts, and modules that perform the same function. Sometimes names or commands are not case sensitive. Within reason, a good lab environment should accept any valid method of reaching and reporting an answer (a sketch of this kind of flexible answer checking appears after this list). These potential paths and responses can take some time to fully identify, but flexibility is key. If you require a very specific command or a specific case, there should be a solid reason for it, and that reason should be made clear to participants.
  4. It is tested and/or monitored for stability, latency, and performance, especially under load.
    It’s simply a fact of life: your lab environment will not perform the same with 20 (or 200) people using it as it did when you first set it up. There will be unforeseen issues with system or network performance, or with lab access and load times. To the best of your ability, test your environment with multiple users – and anticipate what will happen if system performance drops drastically. Will latency cause systems to be incompletely loaded when the student gains access to them? Will it cause test or exercise completion to take substantially longer? Even a crude concurrency test (sketched after this list) will expose gross problems. Once the lab environment goes live, make sure that somebody is routinely monitoring performance and stability and adjusting accordingly.
  5. A crash, time-out, or outage that is your lab’s fault won’t cost substantial student progress or success.
    System problems are a fact of life – even more so in a lab environment. Particularly in time-based tests, CTFs, or exercises, students should not be penalized for technical issues with the lab environment that they did not cause. I’ve suffered many a crash or spontaneous reset of my lab system that meant starting the entire exercise, flag, or test from scratch. This is indescribably frustrating even to the most seasoned professional. To a newcomer tentatively exploring cybersecurity as a field, a few such demoralizing losses could mean the end of their journey. To the best of your ability, ensure that if your lab environment or a lab system crashes or resets, student progression, notes, and exam progress are not totally lost; frequent checkpointing of scored state (sketched after this list) goes a long way here. If such a failure occurs in a timed environment, consider making reasonable concessions with regard to deadlines.
  6. Instructions, flags, and commands are checked for typos and clarity.
    If I see one more for-profit infosec cheat sheet or lab exercise with “nmap” spelled “namp” or “netstat” spelled “netsat”, there’s a fair chance I will throw my mouse at my monitor. If you are instructing or requiring your participants to use specific commands or flags to complete an exercise, please triple-check that those commands are spelled properly in:

    – your CTF flags and answers

    – your lab guide

    – your test questions

    Mistakes happen. We all make typos. I’ve probably made a few in this blog! That’s cool. Where typos become really unfortunate is when they impede progression or completion. If your code is wrong, or your flags are capitalized or ordered wrong, or you mistype commands in questions or answers, you’re immediately and directly harming your participants’ success. Take the time to have a second person read through your text, and also have a person at the participants’ technical level run through the exercise in advance; even a naive automated check of tool names (sketched after this list) will catch the worst offenders. You know how to type the commands correctly from memory. They do not.
  7. Participant expectations and/or learning objectives are made clear from the start.
    So you’ve made this super cool CTF or student lab, and you’re just itching to let your participants in to play! That’s awesome – but what exactly are you using it to teach or test? Computing environments are nearly infinitely complex, and problems like latency and lab bugs can send participants down unexpected rabbit holes. You must make it clear to participants how you will measure their success, the scope of the exercise, and what they are trying to accomplish. To make this clear to them, you must also make it clear to yourself. Once you identify what success in the test, lab exercise, or CTF looks like, you can then quantify how successful the environment was at enabling that success. For example, if you’ve built an educational lab to teach the use of nmap, and a large number of students get hung up trying to reach the target host due to networking issues, it is likely you who have failed at your objective – not the students. Your students should know that fixing the network is not an objective, so they can promptly report the lab issue.
  8. If applicable, the lab is tested for functionality at multiple resolutions, in various browsers, common multi-monitor arrangements, and/or accessibility settings.
    We’ve all been dropped into a lab or certification test where our biggest problem was just viewing and manipulating the screen comfortably. If you are not providing participants all of the hardware they will use, assume that any reasonable modern screen resolution may be in use. When possible, provide support for multiple monitors – it’s 2020! If your lab has a web interface, support common modern browsers and operating systems; even an automated screenshot pass at several resolutions (sketched after this list) is better than nothing. If there is any potential for accessibility to be an issue (this is the norm, not the exception), ensure that typical accessibility options like magnifiers, screen readers, and color-blind-friendly color schemes will not break the essential functionality of the lab environment. If you can’t, then prepare to offer reasonable support and concessions when simply viewing the environment in a reasonable manner becomes a problem.
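
To make several of these points concrete, here are a few minimal sketches – all in Python, all illustrative rather than production-ready, and with hostnames, URLs, and file paths that are placeholders for your own environment. First, for point 1: the kind of pre-flight check a facilitator might run before seating a student, verifying that the target host actually resolved an address and answers on the ports the exercise depends on.

```python
#!/usr/bin/env python3
"""Pre-flight readiness check for a lab seat (point 1)."""
import socket
import sys

TARGET = "lab-target.example.local"  # placeholder lab hostname
REQUIRED_PORTS = [22, 80]            # ports this exercise depends on

def main() -> int:
    try:
        # If DHCP/DNS never happened, this is where we find out.
        addr = socket.gethostbyname(TARGET)
    except socket.gaierror:
        print(f"FAIL: {TARGET} did not resolve - no address assigned?")
        return 1
    for port in REQUIRED_PORTS:
        try:
            with socket.create_connection((addr, port), timeout=5):
                print(f"OK: {TARGET}:{port} reachable")
        except OSError:
            print(f"FAIL: {TARGET}:{port} unreachable - don't seat a student yet")
            return 1
    return 0

if __name__ == "__main__":
    sys.exit(main())
```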
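
For point 3, a sketch of flexible answer checking: normalize away the differences that shouldn’t matter – case, surrounding whitespace, path-separator style – and accept any known-valid form of the answer. The accepted values here are invented examples.

```python
def normalize(answer: str) -> str:
    """Collapse differences that shouldn't matter: case, surrounding
    whitespace, and Windows-vs-Unix path separators."""
    return answer.strip().lower().replace("\\", "/").rstrip("/")

def is_correct(submitted: str, accepted: list[str]) -> bool:
    """Accept any of the known-valid forms of the answer."""
    return normalize(submitted) in {normalize(a) for a in accepted}

# Both path spellings (and any casing) should score as correct:
ACCEPTED = ["C:/Users/student/flag.txt", r"C:\Users\student\flag.txt"]
assert is_correct(r"c:\USERS\student\flag.txt", ACCEPTED)
assert is_correct("  C:/Users/student/flag.txt/ ", ACCEPTED)
```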
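
For point 4, a crude concurrency smoke test: hit the lab’s web front end with a pile of simultaneous requests and report the latency spread. A real load test deserves a purpose-built tool, but even this will expose gross problems; the URL and user count are placeholders.

```python
#!/usr/bin/env python3
"""Crude concurrency smoke test for a lab's web front end (point 4)."""
from concurrent.futures import ThreadPoolExecutor
import time
import urllib.request

LAB_URL = "https://lab.example.local/login"  # placeholder endpoint
SIMULATED_USERS = 50

def fetch() -> float:
    """Time one full request/response cycle."""
    start = time.monotonic()
    with urllib.request.urlopen(LAB_URL, timeout=30) as resp:
        resp.read()
    return time.monotonic() - start

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=SIMULATED_USERS) as pool:
        futures = [pool.submit(fetch) for _ in range(SIMULATED_USERS)]
        times = sorted(f.result() for f in futures)  # raises on any failure
    print(f"fastest {times[0]:.2f}s, "
          f"median {times[len(times) // 2]:.2f}s, "
          f"slowest {times[-1]:.2f}s")
```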
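
For point 5, one way to keep a crash or reset from erasing progress: checkpoint scored state to disk after every event, and write atomically so an ill-timed failure can’t corrupt the file. The file path and state shape are hypothetical.

```python
import json
import os
import tempfile

PROGRESS_FILE = "student_progress.json"  # placeholder path

def save_progress(state: dict) -> None:
    """Write atomically: dump to a temp file in the same directory,
    then rename into place so a crash mid-write can't corrupt it."""
    fd, tmp_path = tempfile.mkstemp(
        dir=os.path.dirname(os.path.abspath(PROGRESS_FILE)))
    with os.fdopen(fd, "w") as f:
        json.dump(state, f)
    os.replace(tmp_path, PROGRESS_FILE)

def load_progress() -> dict:
    """Recover whatever was last checkpointed, or start fresh."""
    try:
        with open(PROGRESS_FILE) as f:
            return json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        return {"completed_flags": [], "notes": ""}

# Checkpoint after every scored event, not just at the end:
state = load_progress()
state["completed_flags"].append("flag_03")
save_progress(state)
```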
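
For point 6, a naive typo linter for lab guides: compare every word in the text against the tool names the exercise actually uses and flag near-misses like “namp”. The tool list and similarity cutoff are assumptions, and it will produce some false positives – which is fine for a pre-publication check that a human reviews.

```python
#!/usr/bin/env python3
"""Flag likely tool-name typos in a lab guide (point 6)."""
import difflib
import re
import sys

KNOWN_TOOLS = {"nmap", "netstat", "wireshark", "tcpdump", "curl"}

def check(path: str) -> int:
    text = open(path, encoding="utf-8").read().lower()
    problems = 0
    for word in sorted(set(re.findall(r"[a-z][a-z0-9-]{2,}", text))):
        if word in KNOWN_TOOLS:
            continue
        # Deliberately loose cutoff: "namp" vs "nmap" only scores 0.75.
        close = difflib.get_close_matches(word, KNOWN_TOOLS, n=1, cutoff=0.7)
        if close:
            print(f"possible typo: '{word}' (did you mean '{close[0]}'?)")
            problems += 1
    return problems

if __name__ == "__main__":
    sys.exit(1 if check(sys.argv[1]) else 0)
```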
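
Finally, for point 8, a minimal smoke test of a web-fronted lab at several window sizes, using Selenium (you’d need the selenium package plus a browser driver installed). It simply captures screenshots for a human to review; the URL is a placeholder.

```python
"""Capture the lab UI at several window sizes for review (point 8)."""
from selenium import webdriver

LAB_URL = "https://lab.example.local/"  # placeholder lab front end
RESOLUTIONS = [(1024, 768), (1366, 768), (1920, 1080), (2560, 1440)]

driver = webdriver.Firefox()  # or webdriver.Chrome(), etc.
try:
    for width, height in RESOLUTIONS:
        driver.set_window_size(width, height)
        driver.get(LAB_URL)
        driver.save_screenshot(f"lab_{width}x{height}.png")
        print(f"captured lab_{width}x{height}.png")
finally:
    driver.quit()
```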

I would gently remind all lab creators, administrators, and vendors that the success of their cybersecurity lab environments in meeting learning objectives reflects strongly on their organization’s professionalism and motivation to see their participants succeed. I’ve heard many complaints from instructors and lab facilitators at established institutions that they are not receiving adequate funding or resources to meet these basic standards. Labs are an indispensable part of cybersecurity training in 2020. There is no excuse for well-resourced educational organizations not to support and fund professional-quality labs.

If you want to learn more about building great training labs and CTFs, I highly recommend you pick up a copy of Building Virtual Machine Labs: A Hands-On Guide by Tony Robinson. It’s an easy read that offers step-by-step solutions for building something you can be proud of.
