Workshop on Formal Methods and Security Protocols (at LICS'98)

June 25, 1998, Indianapolis, Indiana
by Scott Stoller.

The Workshop on Formal Methods and Security Protocols was held in Indianapolis, Indiana, on June 25, 1998, immediately following LICS'98. It was organized by Nevin Heintze of Bell Labs and Jeannette Wing of CMU.

Roger Needham (Cambridge University and Microsoft Cambridge Research Labs) gave the keynote address. He observed that the assumptions underlying the design of security protocols must change as technology advances. For example, the advent of "truly personal" pocket-sized computers, which are used only by the owner, may provide a feasible mechanism for storage of large amounts of confidential information. He emphasized that security within an organization and security between organizations are often dealt with differently, and that in some cases, it might be feasible to base the latter on direct pairwise negotiation (e.g., use of a different public/private key pair, or even a different list of one-time keys, for each pair of organizations). He also observed that formal methods tend to aim at "absolute" guarantees, while in many commercial applications, business considerations lead to the use of systems that provide weaker guarantees; formal methods that can deal with such guarantees would be useful.

Needham also made a few points regarding the famous Burrows-Abadi-Needham logic of authentication ("BAN logic"). He observed that authentication and key establishment protocols are much easier to get right if the designer doesn't try too hard to optimize them (e.g., by reducing the amount of data that has to be encrypted), and that the speed of current hardware has made such optimizations relatively unimportant. Therefore, the community should start to focus on verification of other classes of protocols, such as electronic commerce protocols and software anti-piracy protocols. Also, he pointed out that the abstraction of "freshness" used in BAN logic was not completely successful, because it combines two distinct notions: functional dependence (or correspondence) and recency.

Needham's remarks were followed by presentations of the contributed papers, which can be accessed via the URL http://www.cs.bell-labs.com/~nch/fmsp. [Available soon according to workshop organizers -ed.]

Scott D. Stoller (Indiana University) described preliminary work on reducing the problem of verifying an authentication or key establishment protocol to a finite-state problem, suitable for model checking. Reductions are needed to bound the number of encryptions performed by the adversary and the number of protocol runs by the honest principals.
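
To see why such a bound matters, here is a minimal Python sketch (illustrative only, not Stoller's actual reduction; all names are invented) of the standard observation that a Dolev-Yao-style adversary can derive infinitely many messages unless the number of encryption steps is capped:

    from itertools import product

    def close(knowledge, keys, max_encryptions):
        """Compute the messages the adversary can derive, allowing at most
        `max_encryptions` rounds of encryption/decryption.  Messages are
        strings; ('enc', key, msg) tuples represent ciphertexts."""
        derived = set(knowledge)
        for _ in range(max_encryptions):
            new = set()
            for key, msg in product(keys, derived):
                new.add(('enc', key, msg))   # adversary encrypts anything it has
            for item in derived:
                if isinstance(item, tuple) and item[0] == 'enc' and item[1] in keys:
                    new.add(item[2])         # adversary decrypts with a known key
            derived |= new
        return derived

    # With the cap, the closure is finite: here just 3 messages.
    print(len(close({'nonce_a'}, {'k_adv'}, max_encryptions=2)))

With the adversary's derivations (and the number of protocol runs) bounded in this way, the combined system becomes finite-state and amenable to a model checker.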

Catherine Meadows (Naval Research Lab) discussed the use of the NRL Protocol Analyzer to examine the Internet Key Exchange protocol (IKE), which is really a combination of several subprotocols and is therefore considerably larger than most of the security protocols that have been mechanically verified so far. The scale of this analysis prompted several improvements to the Protocol Analyzer. The analysis led to a clarification of the IKE specification. The analysis was not compositional; it would be nice if compositional verification techniques could be used to verify each subprotocol separately.

Will Marrero (CMU) described a machine-checkable logic of knowledge for specifying security properties of electronic commerce protocols; this is joint work with Ed Clarke (CMU) and Somesh Jha (CMU). One feature of this logic, compared to some other knowledge-based security logics such as BAN logic, is that it has a precise semantics with respect to a well-defined model of computation with an operational semantics for protocols. Only first-order knowledge can be expressed in the logic, but this seems sufficient for many purposes and should facilitate efficient implementation.
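
As a rough illustration of how a knowledge operator can be given a precise, operational meaning, the following Python sketch (hypothetical, not the semantics of the Clarke-Jha-Marrero logic) interprets "principal P knows m" as derivability of m from the messages P has observed:

    def knows(observed, keys, goal):
        """Return True if `goal` is derivable from the `observed` messages,
        closing under projection of pairs and decryption with known keys."""
        derived = set(observed)
        changed = True
        while changed:
            changed = False
            for m in list(derived):
                parts = []
                if isinstance(m, tuple) and m[0] == 'pair':
                    parts = [m[1], m[2]]       # split a pair
                elif isinstance(m, tuple) and m[0] == 'enc' and m[1] in keys:
                    parts = [m[2]]             # decrypt with a known key
                for p in parts:
                    if p not in derived:
                        derived.add(p)
                        changed = True
        return goal in derived

    # Bob received {Na, Kab} encrypted under his key Kb, so he "knows" Kab:
    msg = ('enc', 'Kb', ('pair', 'Na', 'Kab'))
    print(knows({msg}, {'Kb'}, 'Kab'))   # True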

The next talk was on "ActiveSPEC: A Framework for the Specification and Verification of Active Network Services and Security Policies", by Darryl Dieckman, Perry Alexander, and Philip A. Wilsey (all U. of Cincinnati). In this framework, an active network is specified by its security policies, services, and resources. The PVS theorem prover from SRI is used to produce formal proofs that an active network satisfies its requirements.

Richard Kemmerer (U.C. Santa Barbara) described the specification and analysis of Mobile IP using ASTRAL; this is joint work with Zhe Dang. This work exploits the real-time features of ASTRAL, a model checker for real-time systems. As with most model-checking approaches, a restricted (but useful!) kind of verification is performed: the user indicates the number of agents in the system and the length of time (on the clock of the simulated system, not the CPU time used by the simulator) for which the system should be simulated, and the model checker checks whether the specified properties hold within those bounds.
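
The flavor of this style of bounded checking can be conveyed by a small Python sketch (illustrative; ASTRAL itself is quite different, and every name below is invented): explore all states reachable within the user-supplied clock bound and test the property in each:

    def check_bounded(n_agents, time_bound, step, prop):
        """Explore every state reachable within `time_bound` clock ticks and
        return the first (clock, state) violating `prop`, or None."""
        frontier = [(0, tuple(0 for _ in range(n_agents)))]
        seen = set()
        while frontier:
            clock, state = frontier.pop()
            if (clock, state) in seen or clock > time_bound:
                continue
            seen.add((clock, state))
            if not prop(state):
                return clock, state              # counterexample within bounds
            for nxt in step(state):
                frontier.append((clock + 1, nxt))
        return None                              # property holds within bounds

    # Toy system: each agent's counter may tick up; property: counters stay < 4.
    step = lambda s: [s[:i] + (s[i] + 1,) + s[i+1:] for i in range(len(s))]
    print(check_bounded(n_agents=2, time_bound=3, step=step,
                        prop=lambda s: max(s) < 4))   # None: no violation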

Grit Denker (SRI) described the use of rewriting logic to specify security protocols and the use of Maude, an interpreter for rewriting logic, as an analysis tool; this is joint work with J. Meseguer (SRI) and C. Talcott (Stanford). The use of rewriting logic offers some advantages; for example, it may facilitate modular verification of protocols.
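
As a toy illustration of the rewriting-logic view (everything here is invented for exposition; real Maude specifications look quite different), a protocol state can be modeled as a multiset of facts, with each protocol step a rewrite rule that consumes and produces facts:

    from collections import Counter

    def apply_rule(state, lhs, rhs):
        """If every fact in `lhs` occurs in `state`, rewrite by removing
        `lhs` and adding `rhs`; otherwise return None (rule not enabled)."""
        if all(state[f] >= n for f, n in lhs.items()):
            return state - Counter(lhs) + Counter(rhs)
        return None

    # Toy rule: if A holds a fresh nonce and the network is idle, A sends it to B.
    state = Counter({('fresh', 'A', 'Na'): 1, ('net', 'idle'): 1})
    lhs = {('fresh', 'A', 'Na'): 1, ('net', 'idle'): 1}
    rhs = {('net', 'msg', 'A', 'B', 'Na'): 1, ('sent', 'A', 'Na'): 1}
    print(apply_rule(state, lhs, rhs))

Because each rule is local to the facts it mentions, rules for different subprotocols can in principle be analyzed separately, which is one source of the modularity mentioned above.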

Paul F. Syverson (Naval Research Lab) talked about "Relating Two Models of Computation for Security Protocols". He presented a model of computation for cryptographic protocols that can serve as a semantics for SvO, a BAN-like logic, and described the relationship between that model and the computational model underlying the NRL Protocol Analyzer. The ultimate goal is to achieve a better integration of these two complementary approaches to protocol analysis.

F. Javier Thayer Fabrega (MITRE) discussed strand space pictures; this is joint work with Jonathan C. Herzog (MITRE) and Joshua D. Guttman (MITRE). Their recent work on strand spaces has already provided very simple and lucid correctness proofs for cryptographic protocols. This paper makes such analyses even more lucid by augmenting them with diagrams that illustrate the protocol's behavior, attacks on the protocol, or crucial steps in the correctness proof.
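
For readers unfamiliar with the vocabulary, here is a small Python sketch of the basic strand-space data structures (an informal rendering, not the MITRE formalism): a strand is a sequence of signed terms, with '+' marking a send and '-' a receive, and a bundle adds edges from each send to a matching receive:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Node:
        strand: str   # which strand this node lies on
        index: int    # position along the strand
        sign: str     # '+' for send, '-' for receive
        term: str     # the message term

    # Two strands of a toy one-round exchange:
    initiator = [Node('init', 0, '+', '{Na}Kb'), Node('init', 1, '-', '{Nb}Ka')]
    responder = [Node('resp', 0, '-', '{Na}Kb'), Node('resp', 1, '+', '{Nb}Ka')]

    # Bundle edges pair each send with a receive of the same term; drawing
    # them yields exactly the kind of picture the paper advocates.
    nodes = initiator + responder
    edges = [(s, r) for s in nodes if s.sign == '+'
                    for r in nodes if r.sign == '-' and r.term == s.term]
    for send, recv in edges:
        print(f"{send.strand}[{send.index}] --{send.term}--> {recv.strand}[{recv.index}]")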

The contributed talks were followed by a panel discussion featuring Martin Abadi (DEC SRC), Roger Needham, and Doug Tygar (CMU). Martin Abadi pointed out the need for formal methods that can deal with higher-level specifications; this is important for enabling the verification community to proceed from analysis of security protocols in isolation to verification of higher-level requirements of larger systems (e.g., distributed filesystems) in which such protocols are used. Doug Tygar suggested that SET would be a good case study, and expressed his hope that formal methods, by requiring careful study of a problem domain as part of the development of suitable abstractions, might aid in the discovery of new attacks, such as timing attacks or power attacks on smart cards. [Some formal work on SET has already been done by yours truly and Cathy Meadows and presented at FC98. Work on protocols related to SET has also been presented by Steve Brackin at the '97 DIMACS Security Protocols Workshop and by Dominique Bolignano at CSFW10. -Paul S] Martin Abadi and Roger Needham agreed that going out and developing suitable abstractions for a new problem domain is often the most productive part of the use of formal methods.

Edmund Clarke and Nevin Heintze will organize a second workshop on this topic, to be held in July 1999 in conjunction with FLOC'99. For more info, keep your eye on the FLOC'99 home page: http://www.cs.bell-labs.com/cm/cs/what/floc99/