USENIX Security Symposium VII Tutorials
by Nimisha Mehta


This year the four-day USENIX Security Symposium took place in San Antonio, Texas. The program chair was Avi Rubin of AT&T Labs-Research.

There were 708 people who attended the conference this year. The overall impression among the people I met (mostly from the research community) was that this year's symposium was more technically well-rounded than previous years' and included more interesting talks and papers. The main topics covered in the technical sessions included intrusion detection, crypto, CAs, access control, and web security.

The tutorials provided an in-depth look at a particular topic, taught by a champion in the field. Here we will briefly cover the tutorial on certification by Carl Ellison of CyberCash, Inc. and the cryptography tutorials by Bruce Schneier of Counterpane Systems. There were several other tutorials that I did not attend.

The half-day tutorial on certification, entitled "Certification: Identity, Trust, and Empowerment," provided a broad overview of the history, philosophy, and deployment of public key certificates. First, Ellison traced the historical development of public key certificates from Diffie and Hellman's early papers on public keys in 1976, to Kohnfelder's MIT thesis in 1978, to the notion of global identity certificates as embodied in X.500 and X.509 in the late 1980s. He spoke about how the advocates of X.500 dreamt of a global directory (i.e., a telephone book) binding all people to their certificates. This failed, however, because of a non-technical yet fundamental problem of confidentiality: certain organizations cannot reveal the identities of their employees.

He then turned to the more recent developments of the early 1990s, from PEM (privacy-enhanced mail) to PGP and SSL. PEM, needing a global hierarchy of names, failed for the same reasons as X.500. However, RIPEM, which does not need certificates, is more widely deployed than PEM. PGP, developed by Phil Zimmermann in response to FBI efforts to gain access to all information passed in cleartext, is used worldwide for securing email, while SSL is widely used for securing Web transactions. However, Ellison complained that there is no real trust management engine when using SSL. The certificates are merely toll-booth certificates: the user learns nothing about the server other than that it holds a certificate from, e.g., VeriSign.

SPKI, on the other hand, is based on an authorization model, in which certificates carry permissions along with names. This, Ellison believes, gives meaning to the certificates and would be a step forward from the current toll-booth certificates. He envisions as many issuers as there are entities: different issuers for accessing medical files, for writing prescriptions, and for access behind the firewall. He then went on to discuss the overall issues of trust and empowerment. He warned us that CAs focus their attention only on trusting the strength of the cryptographic algorithms, trusting the legal support, and trusting the procedures for revocation, reissuance, and unique identification; they do not address how one goes about trusting the issuer to grant permissions. He advised always asking whether the issuer of a certificate is empowered with that authority. For example, in the PICS system one needs to decide whether a page was rated by a trusted rating service.
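
To make the authorization model concrete, here is a toy sketch (mine, not Ellison's) of an SPKI-style certificate as a 5-tuple of issuer, subject, delegation flag, authorization, and validity period; the field names and example values are illustrative, not the actual SPKI encoding.

    # A minimal sketch of an SPKI-style authorization certificate as a
    # 5-tuple: (issuer, subject, delegation, authorization, validity).
    # Field names and values here are illustrative only.
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class AuthCert:
        issuer: str          # (hash of the) public key of the issuer
        subject: str         # (hash of the) public key of the subject
        may_delegate: bool   # may the subject pass this permission on?
        authorization: str   # the permission granted
        not_after: date      # end of the validity period

    # The hospital's key grants the doctor's key a permission directly;
    # no global name or identity certificate is involved.
    cert = AuthCert(issuer="hash-of-hospital-key",
                    subject="hash-of-dr-lee-key",
                    may_delegate=False,
                    authorization="read medical-record-42",
                    not_after=date(1999, 1, 1))

Note that the certificate binds a permission to a key, not a name to a person, which is precisely Ellison's point.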

The other big issue he discussed was identity: how does one name the correct entity? His recurrent message was that global names are not the correct approach, since they require human guesswork as part of the security protocol: users must distinguish among computer-generated unique identifiers. He instead supports the local name space approach introduced in the SDSI model by Rivest and Lampson at MIT. The basic idea of SDSI is that each user maintains his or her own local name space, and these name spaces are linked together by referring to names in other name spaces. In Ellison's words, "SDSI did for security for namespaces what Einstein did for physics," noting that Einstein replaced the notion of global space and time with local ones relative to the observer.
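
As a rough illustration of how linked local name spaces resolve, consider the following sketch; the keys, names, and resolution rule are my own simplification, not SDSI's actual syntax.

    # A toy sketch of SDSI-style linked local name spaces. Each principal
    # maps local names either directly to keys or to names defined in
    # someone else's name space.
    namespaces = {
        "key-alice": {"bob": "key-bob",                 # Alice's "bob"
                      "dentist": ("key-bob", "carol")}, # Alice's "bob's carol"
        "key-bob":   {"carol": "key-carol"},            # Bob's "carol"
    }

    def resolve(owner_key, name):
        """Follow a (possibly linked) local name down to a key."""
        value = namespaces[owner_key][name]
        if isinstance(value, tuple):        # a name in another name space
            next_key, next_name = value
            return resolve(next_key, next_name)
        return value

    # Alice reaches her dentist's key purely through her own and Bob's
    # local names; no global directory is consulted.
    assert resolve("key-alice", "dentist") == "key-carol"
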

Overall, this tutorial is recommended for those who want a general overview of the history and future of public key certificates and to dispel any false notions about them. The main messages were: (1) we do not really need a global PKI in order to enable electronic commerce; (2) local name spaces are more secure than global ones; (3) certificates that bind authorizations provide a more trustworthy approach; and (4) certificates do not bind a public key to a person, but rather a name string to a public key.

The cryptography tutorials consisted of two half-day sessions by Bruce Schneier. The first half focused on the basics of cryptography: it surveyed the various algorithms for symmetric cryptography, public-key cryptography, one-way hash functions, and random number generation, and covered a few current cryptographic protocols. The second half discussed how cryptography is used on the Internet for electronic commerce, secure email, trust management, and IP security. Here I will mainly discuss the first half.

He first described the standard secret key algorithms, including DES, IDEA, Blowfish, RC5, CAST, and Skipjack. He recommended using Triple-DES if possible, otherwise IDEA, RC4, or Blowfish. He announced that a new standard, AES (the Advanced Encryption Standard), will be chosen by NIST in 1999 from the submissions it receives. He then discussed the difficulty of generating a stream of random numbers. Any deterministic method for generating a stream of data will *not* generate truly random numbers. Most pseudorandom generators use some sort of secret seed; however, once the seed is known, the stream can easily be reproduced. He noted that Berkeley graduate students Ian Goldberg and David Wagner had cracked the random number generator used by Netscape.
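
His point about seeds is easy to demonstrate. The following sketch uses Python's (non-cryptographic) random module to show that anyone who learns the seed can replay the entire stream:

    import random

    # A deterministic generator is completely determined by its seed:
    # whoever learns the seed can reproduce the "random" stream.
    random.seed(42)
    stream1 = [random.randrange(256) for _ in range(8)]

    random.seed(42)                # same seed ...
    stream2 = [random.randrange(256) for _ in range(8)]

    assert stream1 == stream2      # ... same stream, byte for byte
    # This is exactly why cryptographic keys must come from an
    # unpredictable source, not from a guessable seed.
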

Cryptanalysis is the study of breaking codes. He compared differential and linear cryptanalysis; both can be protected against by increasing the number of rounds and by using large, random S-boxes. On the other hand, the only defense against a brute-force attack is a long key. Currently, a $1M computer can break a 56-bit key in only 3.6 hours and a 64-bit key in 38 days. He advised assuming that, for every five years into the future, attacks will become either ten times faster or ten times cheaper. However, he cautioned against believing that cryptography will solve all our problems; for example, it is easier to implement algorithms and protocols correctly than it is to handle and manage private keys securely.
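
These figures scale in a simple way: each additional key bit doubles the search space, while his rule of thumb shrinks attack time tenfold every five years. A back-of-the-envelope sketch, taking the quoted 3.6-hour/56-bit figure as the baseline:

    # Back-of-the-envelope scaling of the figures quoted in the talk:
    # each extra key bit doubles the search space, and attacks get ten
    # times faster or cheaper every five years.
    BASE_BITS, BASE_HOURS = 56, 3.6        # the $1M-machine baseline

    def hours_to_break(bits, years_from_now=0):
        space = 2 ** (bits - BASE_BITS)       # relative key-space size
        speedup = 10 ** (years_from_now / 5)  # 10x every five years
        return BASE_HOURS * space / speedup

    print(hours_to_break(64) / 24)            # ~38 days, matching his estimate
    print(hours_to_break(80) / (24 * 365))    # ~6,900 years today
    print(hours_to_break(80, 25) / 24)        # ~25 days, 25 years from now
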

As for public-key algorithms, he noted that although RSA depends on the difficulty of factoring, our ability to factor has been doubling every ten years. He also briefly mentioned that it is too early to judge the future of elliptic curves, since researchers have not yet proved that no subexponential-time algorithm exists for the elliptic curve discrete logarithm problem. His advice for choosing an algorithm: decide what the value of your secret is and how long it must remain secure.

The main messages delivered in both halves of the cryptography tutorials were: (1) "the problem with bad cryptography is that it looks just like good cryptography"; (2) it is prudent to prepare for the worst; (3) the social problems are much harder than the mathematics; (4) the solution is different for each consumer, depending on their specific needs; and (5) "if you think cryptography can solve your problem, then you don't understand your problem and you don't understand cryptography."