Lisa Lorenzin: Webinar recording - a standards-based approach to ICS security: Greetings!
As a follow-up to David's earlier comment: if you were interested
in the TCG webinar on a standards-based approach to ICS security
but were unable to attend yesterday, the recording is available.
Bob Radvanovsky: Repost: Green Lights Forever...: Posting got munged...reposted.
makes it a little bit harder to sniff or join. It's just one layer of
defense in depth. Nobody expects that putting up a chain-link fence around
a substation is going to be sufficient by itself, either.
Erik Hjelmvik: No ICS cyber security at BlackHat: Tom, the ICS Village PCAP you've shared from DEFCON 22 cannot possibly
contain traffic from a properly configured SPAN port!
How do I know? Well, the packet capture ran for 33 hours and over
990,000 packets were captured, but not a single packet went
between a client and an ICS device. [...]
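The kind of check Hjelmvik describes can be sketched in code. The snippet below is a simplified illustration, not his actual tooling: it assumes each packet has already been reduced to a (src_ip, dst_ip, dst_port) tuple, and the set of ICS service ports is my own assumption for the example.

```python
# Simplified sketch of the check described above: given per-packet
# (src_ip, dst_ip, dst_port) tuples pulled from a capture, count how many
# packets are headed for well-known ICS/SCADA service ports.
# The port list is an assumption for illustration, not from the original post.
ICS_PORTS = {102, 502, 20000, 44818}  # S7comm, Modbus/TCP, DNP3, EtherNet/IP

def count_client_to_ics(packets):
    """Return the number of packets destined for an assumed ICS service port."""
    return sum(1 for _src, _dst, dport in packets if dport in ICS_PORTS)

# Tiny example: two HTTP packets and one Modbus/TCP packet.
sample = [
    ("10.0.0.5", "10.0.0.9", 80),
    ("10.0.0.5", "10.0.0.9", 80),
    ("10.0.0.5", "10.0.0.20", 502),
]
print(count_client_to_ics(sample))  # prints 1
```

A capture taken from a correctly placed SPAN port on an ICS network segment should produce a non-zero count; 33 hours of traffic with zero hits is what makes the shared PCAP suspicious.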
The content of this meeting is a mix of technical and executive topics, serving the purpose of peer information sharing and education, with a view to supporting strategic planning and validating the participants' focus and priorities.
FI-ISAC, the Financial Institutes – Information Sharing and Analysis Centre, was launched as an initiative of two members of ENISA's Permanent Stakeholder Group (PSG), Wim Hafkamp (NL) and Ferenc Suba (HU). The first meeting was held in 2008 in Hungary.
The European FI-ISAC is an independent organisation and is actively supported by ENISA. Its members come from the financial sector (banks or banking associations), national CERTs and Law Enforcement Agencies (LEAs). Other organisations represented are: ENISA, Europol, the European Central Bank (ECB), the European Payments Council (EPC) and the European Commission.
The purpose of the meetings is information exchange on:
Vulnerabilities, technology trends and threats
e- and m-channels, cards, central systems, and all other ICT-related topics.
Stay updated - subscribe to RSS feeds of both ENISA news items & press releases!
"The most important failure was one of imagination." -- 9/11 Report
This sentence, and the reforms that it (and others like it) compelled after the attack on the Twin Towers, has driven many of the changes in the way intelligence analysts have done their jobs over the last 13 years. Fundamental to these changes were (and are) attempts to get analysts to think differently. Specifically, most of the discussion and many of the efforts were aimed at increasing divergent thinking abilities among intelligence professionals. Red teaming, brainstorming, and the ubiquitous informal encouragement to "think outside the box" are all, to one degree or another, divergent thinking strategies. There are good reasons, however, for analysts to master the flip side of divergent thinking -- convergent thinking -- as well.
There is quite a bit of excellent research suggesting that a strong divergent thinking skillset is not enough. In fact, the research goes further: having only strong divergent thinking skills likely lowers forecasting accuracy. That's right -- lowers. Psychologists, for example, have long known that having too many choices is not only unproductive but counterproductive. In 2000, Sheena Iyengar and Mark Lepper showed the effects of too many options with respect to consumer products. Participants in their experiments showed more interest in the huge selection of jams with which they were presented but were more likely to actually make a decision and buy one (and to be more satisfied with their purchase) if presented with a smaller assortment. Don't understand how this works? Just take a look at the clip of Robin Williams as a recent Soviet emigre in the movie Moscow On The Hudson at the top of this post...
Beyond the realm of jam, and much more directly relevant to intel professionals, Philip Tetlock, in his groundbreaking work on the correlates of forecasting accuracy, Expert Political Judgment, found that one popular analytic methodology, scenario analysis, doesn't work at all. Generating more and more plausible scenarios is actually counterproductive. His experiments showed that "such exercises will often fail to open the mind of inclined-to-be-closed-minded hedgehogs but succeed in confusing already-inclined-to-be-open-minded foxes" (p. 199 of the 2005 edition, for those interested in such things). Finally, research conducted by Mercyhurst's own Shannon (Ferrucci) Wasko, using a real-world intelligence problem and a controlled experiment, showed much the same effect: divergent thinking alone lowers forecasting accuracy.
What's an analyst to do? While divergent thinking is useful for developing concepts, ideas or hypotheses, convergent thinking is useful for focusing the analytic effort. I have found that there are three crucial convergent thinking techniques:
Grouping. Grouping (and its corollary, Establishing Relationships) is probably the most useful of the convergent thinking techniques. In order to get a handle on all of the ideas that typically emerge from any divergent thinking exercise, it is important to be able to group similar ideas or hypotheses together. Critical to this effort are the labels assigned to the various groups. All sorts of cultural and cognitive biases can easily come into play with poorly chosen group names (for example, think how easily the labels "terrorist", "freedom fighter", "good" or "evil" can influence future analysis). Mindmapping and other concept mapping techniques are very useful when attempting to use grouping as a way to deal with an overabundance of ideas.
Prioritizing. Deciding which ideas, concepts or hypotheses deserve the most emphasis is crucial if collection and analytic resources are to be used efficiently. Treating every idea as if it is equal to all the others generated by the divergent thinking process makes no sense. Yet, as with any convergent thinking process, the decision regarding which concept is first among the putative equals should be made carefully. Problems typically arise when the team setting the priorities is not diverse enough. For example, a team of economists might well give economics issues undue emphasis.
Filtering. Filtering, as a convergent thinking technique, explicitly recognizes the awful truth of intelligence analysis - there is never enough time. Filtering can be used, in its extreme application, to eliminate some possibilities entirely from further consideration. Typically, however, analysts will use filtering to limit the level and extent of collection activities. For example, intel professionals looking at pre-election activity in a certain country might decide to focus their collection activities at the county rather than at the city or town level. As with grouping and prioritizing, where to draw these kinds of lines is fraught with difficulty and should not be done lightly.
These are just the three techniques that I think are the most important. There are clearly other convergent thinking strategies that are useful to analysts - don't hesitate to leave your favorite in the comments!
Just got turned on to Joe Lau's book, An Introduction to Critical Thinking and Creativity (H/T to Edutopia). I haven't had time to read more than the Introduction (free to download) and review the Table of Contents but it was enough to get me to order the book. Why? I really like his definition of critical thinking. Lau identifies the 10 abilities of a critical thinker and it seems like a pretty comprehensive list to me:
Understand the logical connections between ideas.
Formulate ideas succinctly and precisely.
Identify, construct, and evaluate arguments.
Evaluate the pros and cons of a decision.
Evaluate the evidence for and against a hypothesis.
Detect inconsistencies and common mistakes in reasoning.
Analyze problems systematically.
Identify the relevance and importance of ideas.
Justify one's beliefs and values.
Reflect on the justification of one's own beliefs and values.
Obviously this is just my first take on it but I think it is worth checking out.
Most gamers understand that games fall into genres. For example, Scrabble, Boggle and my own game, Widget, are all examples of "word games". There is no standardized list of game genres, of course, but gamers are like Supreme Court Justice Potter Stewart when it comes to genres (Stewart, in trying to define pornography, famously wrote in Jacobellis v. Ohio, "I know it when I see it.").
Most games, then, fit neatly into existing genres, and truly new genres come along only rarely. It is even more rare for a new genre of games to have a large-scale cultural or social impact. The last such genre that I can think of was the role-playing game, epitomized by the first and still one of the most popular games, Dungeons and Dragons. Whether you played D&D or not (or liked it if you did play it), there is no denying that it spawned a genre of games that has impacted, and continues to impact, both culture and society.
Today there is a new genre of games - cooperative tabletop games - that I think has a chance to have a similar impact on the way we teach not just intelligence but just about everything.
Cooperative games are labelled as such because players cooperate with each other to defeat the game. This kind of play style has long been a staple of many video games where players will gather as teams to defeat a common enemy.
While there are a few examples that date back as far as the 1980s, modern cooperative tabletop games typically require much more nuanced gameplay than their video game counterparts. True cooperation on everything from strategy to resources is usually necessary to defeat these challenging games.
If you are not familiar with this genre (and most people are not), I strongly recommend you get some of these games and play them. Two good examples to start with are Pandemic and Forbidden Desert. Both games put you and the rest of the players in a race to beat the game. Either everyone wins or no one wins.
There are many variations on the theme but typically these games throw an escalating series of challenges at the players. Pandemic, for example, envisions a team of experts working to stop a global disease epidemic. Forbidden Desert asks players to collect a series of artifacts and escape the desert before sandstorms swallow the players.
Players in these games usually assume a variety of roles, such as Engineer or Medic, each with a particular skill useful in defeating whatever it is the game throws at them. Players can and do discuss everything from strategy to resource allocation. This kind of game doesn't just encourage cooperation but demands it from every player.
My recent game, Spymaster (which has proved incredibly popular - I have given out nearly 200 copies to date), was designed as such a game. Small groups of players have to make collaborative decisions about how and where to place certain collection assets in order to collect various information requirements, all while losing the fewest possible assets. While the current version of Spymaster allows the players to determine how they will make decisions about asset allocation, I am thinking about an "advanced" version of the game that will assign various roles to the players coupled, of course, with unique capabilities associated with each role.
Whether or not you have had a chance to play Spymaster, once you have played a couple of these kinds of games, the possibilities for their use in class become very apparent. There is a lot of learning going on in these games, and not all of it is knowledge-based. Teamwork, conflict management and collaboration are all essential elements of these games.
More importantly for classroom use, these games can be designed to take a relatively small amount of time to play. Unlike videogames, tabletop games also tend to expose the underlying system to the players in a bit more detail. Likewise, tabletop games are vastly less expensive to design and produce than videogames, which means that more topics could be covered for the same or less money - clearly a consideration in these budget-restricted times. Finally, bringing a tabletop game into a secure facility is vastly easier than trying to import electrons.
Do I really think that cooperative tabletop games will change intel training and education? I'm not sure, but I know that they can - and that this is an experiment in games-based learning worth attempting.
A couple of weeks ago, I made a print-and-play version of my new game about collection management, Spymaster, available to anyone who reads this blog and would drop me an email (The offer is still open, by the way, in case you missed it the first time). Since then, I have mailed out over 100 copies to everyone from the DNI's office to troops deployed in Afghanistan to academics in Japan to the Norwegian police forces! Feedback is starting to trickle in and the comments have been largely positive (whew!) even from some very experienced collection managers (Thanks!). In addition, I have received a number of outstanding suggestions for enhancing or improving the game. Some of these include:
Making different collection assets work better or worse against different information requirements.
Increasing the point value of information requirements collected early.
Making some of the OSINT cards "Burn - 0" or impossible to burn.
Giving players a budget and assigning dollar values to each collection asset such that players had to stay within their budget as well.
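To make the last two suggestions concrete, here is a minimal sketch of how early-collection bonuses and a per-asset budget might be modeled. All names, costs, and point values are invented for illustration and are not from the actual Spymaster rules.

```python
# Hypothetical model of two suggested Spymaster variants (invented numbers):
# 1) requirements collected earlier score more points, and
# 2) collection assets cost money and players must stay within a budget.

ASSET_COSTS = {"HUMINT": 5, "SIGINT": 8, "OSINT": 1}  # assumed dollar costs

def score_requirement(base_points, turn_collected, bonus_per_turn=2, max_turn=10):
    """Earlier collection earns a bigger bonus (illustrative formula)."""
    return base_points + bonus_per_turn * max(0, max_turn - turn_collected)

def affordable_assets(budget, chosen):
    """Return True if the chosen assets fit within the players' budget."""
    return sum(ASSET_COSTS[a] for a in chosen) <= budget

print(score_requirement(10, turn_collected=3))      # 10 + 2*7 = 24
print(affordable_assets(12, ["HUMINT", "SIGINT"]))  # 5 + 8 = 13 > 12 -> False
```

Even a toy model like this makes the playability trade-off visible: each new rule is one more thing players must track at the table.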
I recognize that these suggestions may not make much sense if you haven't played the game, but all of them (plus many more) are fantastic ideas designed to make the game more real. And therein lies the rub...
One of the classic problems of games designed to simulate some aspect of the real world is the trade-off between realism and playability. Playability is really just how easy it is to play the game. Every time you add a new rule to make the game more realistic, you make the game more difficult to play and therefore less playable. It's not quite as simple as that, but it gives you a good idea of how the problem manifests itself. Great games designed to simulate reality often give a strong sense of realism while remaining relatively simple, but the truth of it is, like the Heisenberg Uncertainty Principle, the more you try to do one, the less, typically, you are able to do the other.
The problem of playability versus realism is analogous to the problem of feature creep in project management. Most people have been involved in a project that started out simple but, over time, grew incredibly complex as more and more "good ideas" were added. Each idea, in and of itself, was justifiable but, in the end, led to an unwieldy mess. Figuring out where to draw the line is just as important in game design as it is in project management.
This constraint is even stricter in the modern intelligence classroom. Here, unless the course is entitled "collection management", there is likely a highly limited amount of time to devote to a game on collection management. Consider the case of Spymaster. I wanted a game that would replace a one-hour lecture on collection management for our intro classes. To make this work, I would need to be able to set up the game, explain the rules, play the game, and then conduct an outbrief, all within an hour. That's pretty tough to do (at least for me) while still making the game meet your learning objectives. It becomes a very careful balance of putting good ideas into the game while not running out of time to play the game in class.
The classic solution to this problem is to have a basic version and an advanced version (or several advanced versions). These can be included in the rules from the outset or added later as expansion packs. Right now, this is exactly what I am doing with all of the feedback I am receiving - scouring it for good ideas I want to put into more advanced versions of Spymaster!