**Review of the Computer Security Foundations Symposium, CMU, Pittsburgh, Pennsylvania, USA, June 23-25, 2008**

Review by Kumar Avijit

July 10, 2008

- Paper 1

Riccardo Focardi opened the conference with his paper titled "Language based secure communication". He presented a process calculus with abstractions for expressing secret and authenticated communication, and argued that these abstractions are easier to implement in the real world than both the low-level primitives of the spi-calculus and the abstract private channels of the pi-calculus. He then presented a model of the network in the form of a low-level calculus to account for adversaries that can intercept, store, duplicate and forward messages. The main idea here was to introduce an extra binder in the input forms to represent the network's view of a message; thus a secret message can only be interpreted as bytes by an adversary. The rest of the talk was about formalizing the semantics as a labeled transition system and characterizing observational equivalence as a bisimilarity, thereby allowing coinductive proof techniques for observational equivalence.

- Paper 2

The next paper, titled "Refinement types for secure implementations", was presented by Cedric Fournet. He presented a concurrent functional programming language with refinement types that can be used to write programs adhering to a security policy. Refinement types capture properties of program values as first-order logic formulas. The security policy is embedded in the program code at various points by means of 'assert' clauses that check whether a logical formula holds at a program point. Typechecking ensures that no assertion fails at runtime. Fournet then described an extension to model adversaries by adding a type denoting public data.
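The assume/assert style of policy embedding can be pictured with a toy sketch. This is not the paper's typed language (there, assertions are discharged statically by the refinement typechecker); it is a hypothetical dynamic rendering in Python, and all names (`policy`, `assume`, `read_file`) are made up for illustration.

```python
# Toy sketch: a security policy as a set of first-order facts. 'assume' adds
# facts; 'policy_assert' checks that a fact is derivable before a sensitive
# action runs. In the paper this check is discharged statically by the
# refinement typechecker; here it is dynamic, purely to illustrate the idea.

policy = set()  # facts assumed so far, e.g. ("CanRead", "alice", "report.txt")

def assume(fact):
    policy.add(fact)

def policy_assert(fact):
    if fact not in policy:
        raise PermissionError(f"assertion failed: {fact}")

def read_file(user, filename):
    # the 'assert' clause guarding the sensitive operation
    policy_assert(("CanRead", user, filename))
    return f"contents of {filename}"

assume(("CanRead", "alice", "report.txt"))
print(read_file("alice", "report.txt"))   # succeeds; any other user raises
```

In the actual system no such runtime check is needed: the typechecker proves, from the refinement types of the values in scope, that every assertion holds on all executions.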

As examples of the expressiveness of the language, Fournet described implementations of various cryptographic primitives and protocols. He compared the implementation of the typechecker to a previous approach that used ProVerif.

- Paper 3

The next talk was about a paper on "A trust management approach for flexible policy management in security-typed languages", presented by Sruthi Bandhakavi. She presented a language called RTI which allows decentralized control over security policies while maintaining the privacy requirements of principals. RTI is based on the RT language for defining role membership. The language has imperative commands for adding or deleting policy statements, and for branching on policy queries. In addition to variables, RTI also protects policy statements by assigning security labels to them. It is ensured that the branching construct does not leak information about the security policy: a query decision consults only the part of the policy that is visible at the current security level of the computation. Thus an adversary cannot learn another individual's private policy by executing an if statement. A dual condition ensures integrity. A non-interference theorem is proved, stating that low computation does not depend on any changes to the high part of the policy.

- Paper 4

Michael Clarkson gave an exciting presentation about his joint work with Fred Schneider on "Hyperproperties". He started off with a bit of history about the classification of properties of traces as safety and liveness, and the theorem that every property of traces is expressible as the intersection of a safety property and a liveness property. He then motivated his work by observing that most security properties studied today, e.g. non-interference, are not properties of traces but of sets of traces, and hence are inexpressible as safety or liveness. This was followed by the analogues of safety and liveness for hyperproperties (properties of sets of traces), and the comment that hyperproperties are expressive enough to capture higher-order properties, by the analogous result for second-order logic. Clarkson then mentioned important differences between simple properties and hyperproperties, such as the latter not being closed under refinement. Just as for properties, every hyperproperty is the intersection of a hypersafety and a hyperliveness property. Clarkson mentioned that it is easy to extend hyperproperties to probabilistic hyperproperties. He finished his talk by illustrating a 'world map' classifying various security properties.

- Paper 5

Next was the presentation by R. Ann Miura-Ko on her paper titled "Security decision-making among interdependent organizations". This paper was in a different vein from the rest because it did not aim at formal techniques applying directly to security, but rather at modeling the impact of security decisions made by organizations on the security of other organizations, using approaches from economics. Miura-Ko started by presenting a compelling example in which web users reuse the same passwords across sites, thereby creating security dependencies between the services offered by the web sites. She then described how to model this dependence using weighted directed edges between sites. Here a negative weight denotes that increased security at the source leads to an increased security hazard at the destination, whereas a positive weight denotes the destination benefiting from increased security investment at the source. She then extended the model with the investment in security made by each node, and obtained a utility function that translates the effective investment made by an organization into the total benefit it obtains. Finally she described a free-riding index, which measures how much a site can reduce its investment due to positive influence from other sites. Iliano Cervesato noted that the security of a site could also depend on adversary intent, and asked how this would affect the model. In response, Miura-Ko mentioned that attack graphs may be used to model the path that an adversary takes through the network, but that this is a new dimension and complicates the problem.

- Paper 6

The next paper was titled "Tractable enforcement of declassification properties". This paper introduces a notion of non-interference in the presence of declassification, called delimited non-disclosure. It defines this property as a bisimulation between program states, where the bisimulation relation is actually a family of relations indexed by the policy at the program point represented by the states involved. This takes into account that declassification introduces local policies, and that all intermediate states should be taken into account when defining a non-interference policy in such cases. The paper then defines an 'abstract' type system for enforcing delimited non-disclosure in terms of a type system for plain non-interference, and presents a case study of typing the JVM.

- Paper 7

The final paper of the day's session was on "End-to-end enforcement of erasure and declassification", presented by Stephen Chong. He started off by noting that plain non-interference is in general not an adequate or usable property, because the confidentiality of information often changes as program execution proceeds. He gave examples of situations where declassification, a weakening of confidentiality, is needed, and others where erasure, a strengthening, is called for. The policies specify erasure and declassification from one security level to another, conditioned on the values of program expressions. Stephen showed how declassification and erasure policies can be enforced by a combination of static typechecking and runtime checks. Erasure is ensured at runtime at all points in the program where memory is updated: at such points, all variables whose policies require erasure are set to zero. Declassification is done at runtime by the guarded declassify command. The runtime checks ease the burden on static typechecking and allow for more expressive conditions. The type system presented was the common one for non-interference, with an additional rule for guarded declassification. Finally, Stephen showed an example implementation of Civitas, a secure remote voting system, using the erasure and declassification constructs.
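The runtime erasure mechanism, zeroing variables whose policies demand it at every memory update, can be sketched as a toy Python model. This is not Chong's system; the store, the policy encoding, and the voting-flavored variable names are all hypothetical.

```python
# Toy sketch of runtime erasure: every memory update re-checks each variable's
# erasure condition, and any variable whose condition now holds is overwritten
# with zero. Policy and store representations are illustrative.

store = {"vote": 42, "tally": 0}
# erasure policy: variable -> condition (on the store) under which it must be erased
erasure_policy = {"vote": lambda s: s.get("submitted", 0) == 1}

def update(var, value):
    store[var] = value
    # enforce erasure at every memory update
    for v, cond in erasure_policy.items():
        if v in store and cond(store):
            store[v] = 0

update("submitted", 1)      # once the vote is submitted, 'vote' must be erased
print(store["vote"])        # 0
```

The static type system then only has to ensure that erased data has not already flowed elsewhere before the condition became true.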

The second day started with a joint CSF-LICS invited talk by David Basin on his work with Christoph Sprenger on "Cryptographically-sound protocol model abstractions". The first session of this day was joint with LICS.

- Paper 8

Rohit Chadha presented his paper titled "Expressiveness and complexity of randomization in finite state machines". The problem was to determine the expressive power of finite-state randomized monitors on non-probabilistic systems. Chadha started the motivation by discussing the class of languages accepted by randomized monitors with infinitely many states. By a previous result, this class is exactly the countable unions of safety properties, also termed Almost Safety properties. After formalizing a notion of acceptance by a randomized finite state machine with probabilities associated with each transition (a Markov chain), Chadha developed a hierarchy of languages based on the upper bound on the probability of acceptance of the language by any such machine. One of the attendees noted the absence of non-determinism in the automata discussed, and Chadha acknowledged that this was another possible consideration.

- Paper 9

Next Henry DeYoung presented his work on "An authorization logic with explicit time". He motivated the general trend of using a logic for expressing policies, giving arguments such as precision of expression, flexibility in the management of policies, and amenability to policy analysis. Next he showed by example how explicit time in policies is needed in everyday social situations. He gave compelling reasons why traditional approaches, which have no notion of time, are inadequate, citing that correct proofs may be rejected because of expired credentials. He also noted that even temporal logic seems inadequate, because policies often need to refer to specific time intervals. He then presented his logic, which treats time intervals as a modality. He presented a sequent calculus and described various favorable meta-theoretic properties of the logic. Lastly he discussed extending the calculus with linearity to model consumable credentials.

- Paper 10

Yuri Gurevich was the next speaker, and his paper was on DKAL, a distributed-knowledge authorization language. He began with a motivation similar to the previous talk's, about using logic for access control. After describing a little history of languages from Datalog to Binder to SecPAL, he moved on to a brief primer on Datalog, highlighting the absence of functions and negation. He then discussed the modifications to SecPAL made in DKAL, such as the directional nature of 'says', which now includes the principal to whom a statement is addressed. This prevents information leakage through probing attacks. Another modification was that 'can say' was given first-order status as a function in its own right, rather than being used in conjunction with 'says' as a ternary relation. This enables writing more complex policies with nested occurrences of 'can say'. In addition, there is a notion of the knowledge of a principal, captured by the 'knows' predicate, and one of trust between principals. Finally, the worst-case complexity of satisfying a query is polynomial in the lengths of the policy and the query, each raised to a power linear in the depth and width of quotations.

- Paper 11

The next talk was by Glenn Bruns on using Belnap logic for composing access control policies. In contrast with the previous two talks, this paper took a more abstract view of access control policies as predicates on access requests. This abstraction was suitable because the paper reasons about policy composition. Glenn started by motivating the use of the four-valued Belnap logic to distinguish between denial and undefinedness of a policy on an access request. In a two-valued world, conflicts in the composed policy cannot be expressed as the outcome of composition, since the result of the composition must be either true or false. In such a setting, having four truth values, namely true, false, undefined and conflict, seems natural. Glenn showed how these values can be arranged in two different lattices, one based on truth and the other on information content. The policies themselves can then be ordered using point-wise extensions of these orders. Glenn then talked about a propositional query language over policies, with the order relations giving rise to predicates. He briefly mentioned that such queries can be translated into first-order logic formulae.

- Paper 12

Next Jeffrey Vaughan gave an exciting talk about his work entitled "Evidence-based audit". The setting of this work was distributed authorization, and the paper presented a dependently typed language for writing programs for such environments. Jeff identified three problems that plague the security of such systems:

- The implementation of the reference monitors may be buggy. Here the reference monitors are considered part of the trusted computing base.
- The policy language may not be expressive enough to capture the institutional policy.
- The system may be configured with an inherently bad policy. This could happen for various reasons, such as the administrators not completely understanding a complex policy language.

Jeff then showed that the language is strongly normalizing by translating it to the calculus of constructions with dependent pairs.

- Paper 13

Post-lunch, the first paper, titled "Automated verification of remote electronic voting protocols in applied pi-calculus", was presented by Catalin Hritcu. Catalin started by presenting a long list of properties desired of remote voting protocols, and noted that some of them seem conflicting. The paper focused on a subset of these properties: voter eligibility, non-reusability and inalterability in the category of soundness properties, and receipt-freeness and immunity to simulation attacks, forced-abstention attacks and randomization attacks in the category of coercion resistance. For each of these properties, Catalin presented a formalization in applied pi-calculus using the notion of observational equivalence. For coercion resistance, their novel approach used dummy voters as balancers, in parallel with registered voters, to achieve the necessary indistinguishability in the final election results. It was shown that coercion resistance implies receipt-freeness. An attendee asked whether coercion resistance and checkability of votes were conflicting properties. Catalin responded that a voter could always fake compliance when the coercer asks him or her to show that a particular vote was cast.

- Paper 14

The next paper was titled "Specifying secure transport layers". This paper characterizes confidentiality and authentication properties of protocols in terms of trace properties in the CSP language, the motivation being to analyze layered architectures. First the paper defines an abstract model of the network, dividing it into three layers: application, secure transport and network. Agents are allowed to create messages in the application layer only, but an intruder is given the extra capability to 'get' and 'put' transport-layer messages as well. An intruder can only manufacture messages based on its knowledge, the definition of which is left open in the paper. A hierarchy of confidentiality and authentication properties of single messages as well as of sessions is defined by considering various combinations of properties of message traces. For instance, a trace in which the intruder can neither fake a message nor re-ascribe a message as coming from an honest agent provides sender authentication. Several examples are given of protocols that are believed to satisfy the properties characterized in the paper.

- Paper 15

Next Jean Goubault-Larrecq presented his paper titled "Towards producing formally checkable security proofs, automatically". He first showed that extracting formal proofs from verification tools such as ProVerif was hard, in fact harder than the process of verification itself. Then he showed that if one uses model-checking of finite models to establish security properties, then one can instrument the model-checker to emit formal proofs automatically.

- Paper 16

The next paper was on "Composition of password based protocols", presented by Mark Ryan. The motivating example for this paper was a pair of password-based protocols, each of which is resistant to guessing attacks independently, but not when composed. The processes were formalized in applied pi-calculus. Next, passive guessing attacks were formalized as the indistinguishability of the situation where the attacker guesses the correct password from one where he guesses a wrong password. Since the attacker does not insert or remove packets, this is the case of a passive adversary. It was shown that frames resistant to passive guessing attacks are closed under composition. For the case of active adversaries, it was shown that well-tagged processes remain resistant to guessing attacks under composition. In the end a simple transformation to well-tagged processes was defined; the idea was to enclose each occurrence of the secret under a hash function.

- Paper 17

Dominique Unruh was the next speaker and his paper was titled "Computational soundness of symbolic zero-knowledge proofs against active attackers". The paper discussed what properties a cryptographic zero-knowledge proof needs to satisfy in order for it to be a computationally sound zero-knowledge proof in the Dolev-Yao model.

- Paper 18

The last talk of the day was on "Joint state theorems for public-key encryption and digital signature functionalities with local computation". Joint state theorems are theorems about composition of protocols where the composed protocols are allowed to share state. The paper discusses limitations with an earlier work on joint-state theorems, and presents a new theorem using the IITM model. Further, it discusses joint-state realizations of public-key encryption and signature functionalities.

The last day of CSF started with a joint CSF-LICS invited talk by Dexter Kozen. He talked about formalizing flowcharts with non-local control flow in terms of Kleene algebra with tests, using equational reasoning. He started off by highlighting a previous result by Böhm and Jacopini that deterministic flowcharts can be expressed as while programs using auxiliary variables, and another result by Ashcroft and Manna that these auxiliary variables are necessary.

This was followed by a short tutorial on some results on Kleene algebra. A boolean subalgebra of the Kleene algebra was then introduced to define Kleene algebra with tests. Every deterministic automaton for Kleene algebra with tests is equivalent to a while program. Finally he presented a translation from arbitrary flowcharts to strictly deterministic automata.

- Paper 19

The first talk of this session was on "Type systems for observational determinism" by Tachio Terauchi. As is clear from the title, the talk presented a type system for ensuring that a non-deterministic program is observationally deterministic with respect to given sets of secret and public variables. Observational determinism means that even in the presence of a scheduler, an attacker should not be able to infer the pattern of high inputs by looking at the low outputs. Terauchi divided the type checking into three phases:

- The program is sliced to throw away the part that does not affect the low variables.
- It is then checked if the sliced program mentions any high variables.
- Finally, fractional capabilities are used to determine if the sliced program is deterministic.

The main idea behind using fractional capabilities is to assign a write capability to each variable. This capability is used up when there is a write to that variable, and capabilities cannot be duplicated. This rules out non-deterministic programs in which concurrent writes happen to the same variable, and is ensured by the type checking rule for spawning threads, in which capabilities are split between the spawning process and the spawned thread. In the case of channel I/O, capabilities can be passed because of the deterministic nature of the communication.
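The capability-splitting discipline can be sketched dynamically (in the paper it is enforced statically by the type system). The class below and its fractions are hypothetical, purely to illustrate why splitting prevents concurrent writes.

```python
# Toy model of fractional write capabilities: a thread may write a variable
# only if it holds the full capability (1.0). Spawning splits capabilities
# between parent and child, so two threads can never both hold a full
# capability for the same variable.

class Capabilities:
    def __init__(self, caps):
        self.caps = dict(caps)           # variable -> fraction in [0, 1]

    def can_write(self, var):
        return self.caps.get(var, 0.0) == 1.0

    def split(self, fractions):
        # hand the given fractions of each capability to a spawned thread
        child = {v: self.caps[v] * f for v, f in fractions.items()}
        for v in fractions:
            self.caps[v] -= child[v]
        return Capabilities(child)

parent = Capabilities({"x": 1.0})
child = parent.split({"x": 0.5})
print(parent.can_write("x"), child.can_write("x"))  # False False
```

After the split, neither thread can write `x`, so a racy double write is unrepresentable; passing the fraction back (as over a deterministic channel) restores the full capability.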

The type system can accept non-confluent programs. Steve Zdancewic asked if adding confluence-checking would affect the complexity of the algorithm, to which Terauchi responded that the algorithm would still run in polynomial time.

- Paper 20

The next talk was titled "Information flow in systems with schedulers" and was presented by Chenyi Zhang. The speaker talked about extending various information flow properties, such as non-deducibility of High actions from Low views, to the case when the high and low processes are interleaved by a scheduler.

- Paper 21

The final session of the conference consisted of two papers describing the use of existing proof technologies in real-world applications. The first talk was about "A correctness proof of a mesh-security architecture", presented by Doug Kuhlman. Doug gave a brief overview of the protocol used in the mesh wireless network, and then described using PCL (Protocol Composition Logic) for verifying the correctness of the protocol.

- Paper 22

The last talk was about "Formal analysis of PKCS#11". The speaker talked about attacks on the PKCS#11 API that could lead to leakage of keys, and described a language in which to model mutable-storage-based APIs such as PKCS#11.
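A classic instance of the kind of key-leakage attack at issue here is the wrap/decrypt attack due to Clulow: if one key carries both the wrap and decrypt attributes, wrapping a sensitive key and then decrypting the resulting blob reveals the sensitive key in the clear. The toy token below is a simplification of a PKCS#11-style API; XOR stands in for real encryption and all names are illustrative.

```python
# Toy PKCS#11-style token illustrating Clulow's wrap/decrypt attack.
# Keys live on the token and are referred to by handles; 'wrap' exports a key
# encrypted under another key, 'decrypt' is ordinary decryption.

import os

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

class Token:
    def __init__(self):
        self.keys = {}                      # handle -> (key bytes, attribute set)

    def generate(self, attrs):
        handle = len(self.keys)
        self.keys[handle] = (os.urandom(16), set(attrs))
        return handle

    def wrap(self, wrapper, target):        # export 'target' encrypted under 'wrapper'
        key, attrs = self.keys[wrapper]
        assert "wrap" in attrs
        return xor(key, self.keys[target][0])

    def decrypt(self, handle, ciphertext):  # ordinary decryption on the token
        key, attrs = self.keys[handle]
        assert "decrypt" in attrs
        return xor(key, ciphertext)

t = Token()
k1 = t.generate({"wrap", "decrypt"})        # conflicting attributes on one key
k2 = t.generate({"sensitive"})
blob = t.wrap(k1, k2)
print(t.decrypt(k1, blob) == t.keys[k2][0]) # True: the sensitive key leaks
```

Models like the language presented in the talk make the token's mutable key store and attribute state explicit, precisely so that attribute combinations enabling this kind of attack can be found automatically.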