[Because this issue of Cipher is over 100K bytes, the email version
has been split into two parts. This is part 1]

_/_/_/_/ _/_/_/ _/_/_/_/ _/ _/ _/_/_/_/ _/_/_/_/ _/ _/ _/ _/ _/ _/ _/ _/ _/ _/ _/ _/_/_/_/ _/_/_/_/ _/_/ _/_/_/_/ _/ _/ _/ _/ _/ _/ _/ _/ _/_/_/_/ _/_/_/ _/ _/ _/ _/_/_/_/ _/ _/

==========================================================================
Newsletter of the IEEE Computer Society's TC on Security and Privacy
Electronic Issue 66                                     May 17, 2005

Hilarie Orman, Editor                    Sven Dietrich, Assoc. Editor
cipher-editor @ ieee-security.org        cipher-assoc-editor @ ieee-security.org

Bob Bruen, Book Review Editor, cipher-bookrev @ ieee-security.org
Yong Guan, Calendar Editor, cipher-cfp @ ieee-security.org
==========================================================================

The newsletter is also at http://www.ieee-security.org/cipher.html

Contents:
 * Letter from the Editor
 * Commentary and Opinion
     o Two opposing viewpoints on DAC
        - Ninghui Li and Mahesh V. Tripunitara
        - Jon Solworth and Robert Sloan
     o Allan Friedman's review of the Financial Cryptography
       Conference, February 28 - March 3, 2005, Roseau, Dominica
     o Robert Bruen's review of Silence on the Wire. A Field Guide to
       Passive Reconnaissance and Indirect Attacks by Michal Zalewski
     o Robert Bruen's review of Darknet: Hollywood's War Against the
       Digital Generation by J. D.
Lasica
     o Robert Bruen's review of Secrets of Reverse Engineering
       by Eldad Eilam
 * News Items
     o National Coordination Office for Information Technology
       Research and Development releases PITAC report, "Cyber
       Security: A Crisis of Prioritization"
     o CERT Issues Report on Insider Sabotage, contributed by
       Sven Dietrich
     o Special to Cipher, Chris Lonvick and Russ Housley discuss
       recent developments in standardizing the SSH protocol
     o Researcher Reveals Microsoft Funding
     o DARPA Shifts Research Horizons
     o Book reviews, Conference Reports and Commentary and News items
       from past Cipher issues are available at the Cipher website
 * Conference and Workshop Announcements
     o Upcoming calls-for-papers and events
 * List of Computer Security Academic Positions, by Cynthia Irvine
     o RWTH Aachen University
 * Staying in Touch
     o Information for subscribers and contributors
     o Recent address changes
        - Yvo Desmedt
        - Tom Van Vleck
 * Links for the IEEE Computer Society TC on Security and Privacy
     o Becoming a member of the TC
     o TC Officers (a new slate)
     o TC publications for sale (including SRSP 2005)

====================================================================
Letter from the Editor
====================================================================

Dear Readers,

Last week I was at the IEEE Symposium on Research in Security and
Privacy, one of the two major efforts of the Technical Committee. It
featured some very good papers, and one major controversy over
discretionary access control. The two sides explain their positions
in this issue.

At the conference, Carl Landwehr presented a thought-provoking short
talk on network security ecology. His thesis is that symbiotic
computer viruses may be our inevitable future. Viruses that protect
our machines from other viruses and sap our resources to a tolerable
degree may win out over any other security solution.
I thought about that as I disinfected my laptop from the viruses
picked up on the hotel's wireless network during the conference.

Robert Bruen has, as usual, found three interesting new books to
review for us. At the SRSP event I was pleased to notice that several
of the books he has reviewed for us in the past 12 months were on
display by the publishers.

We have an excellent report summarizing the Financial Cryptography
conference, written by Allan Friedman. This is the first review of
this conference that we've had in Cipher, and it is a welcome
addition. We also have a contribution from the IETF Security Area
about recent developments in standardizing the SSH protocol.

The Technical Committee has begun a new term for officers. Jon Millen
is Chairman, Cynthia Irvine is Vice Chairman, and Heather Hinton gets
thanks for her service and moves to the distinguished position of
Past Chairman. [Ed. Correction: The Technical Committee officers'
terms complete at the end of the calendar year; Heather Hinton is
still the Chairman]

I'm grateful to our Cipher contributors, and especially to Yong Guan,
who has a magic touch with the CFP and Calendar entries for the
website and for the summaries in the newsletter.

Hilarie Orman
cipher-editor @ ieee-security.org

====================================================================
Commentary and Opinion
====================================================================

Book reviews from past issues of Cipher are archived at
http://www.ieee-security.org/Cipher/BookReviews.html, and conference
reports are archived at
http://www.ieee-security.org/Cipher/ConfReports.html

____________________________________________________________________
The Li-Tripunitara vs. Sloan-Solworth DAC Controversy
Two Viewpoints
____________________________________________________________________

At the 2005 IEEE SRSP conference, Li and Tripunitara presented a
refutation of a 2004 IEEE SRSP paper by Sloan and Solworth.
During the question period, a representative of Sloan and Solworth
read a statement objecting to some claims in the 2005 paper. Both
sides were invited to submit an informative statement to Cipher, as a
way of motivating wider community scrutiny of the formal aspects of
discretionary access control.

____________________________________________________________________
Ninghui Li and Mahesh V. Tripunitara
____________________________________________________________________

Our paper, titled "On Safety in Discretionary Access Control" (LT05),
in the 2005 IEEE Symposium on Security and Privacy has two main
contributions. The first dispels an apparently prevailing myth that
safety is undecidable in DAC. In their paper in the 2004 IEEE
Symposium on Security and Privacy, Solworth and Sloan use this myth
to motivate a new access control scheme (the Solworth-Sloan scheme)
and claim that the scheme captures a broad class of DAC schemes. Our
second contribution is a demonstration that their claim is erroneous.

We dispel the myth that safety is undecidable in DAC by arguing that
DAC should not be equated with the well-known access matrix scheme
due to Harrison, Ruzzo and Ullman (for which safety is known to be
undecidable). One can have access control systems based on the HRU
scheme with no DAC features. Furthermore, it is unclear how certain
features in DAC schemes can be achieved in the HRU scheme. Note that
Harrison et al. did not equate DAC with their scheme. We also
demonstrate that safety is efficiently decidable for the
Graham-Denning scheme, which subsumes the DAC schemes to which the
Solworth-Sloan paper refers.

On the question of whether the Solworth-Sloan scheme captures all
known DAC schemes, we first provide a precise and detailed
description of the Solworth-Sloan scheme, based on their informal
description. We then consider Strict DAC with Change of Ownership
(SDCO), one of the DAC schemes that Solworth and Sloan claim can be
implemented in their scheme.
We use the hints in the Solworth-Sloan paper to provide a complete
construction, and observe several deficiencies in the construction.
Solworth and Sloan claim that our construction is not what they
intended; however, the approach they say they intended has more
serious deficiencies.

Space limitations preclude us from including technical details in
this statement. For technical details please see our web page at
http://www.cs.purdue.edu/homes/ninghui/dac_safety/

____________________________________________________________________
Jon Solworth and Robert Sloan
____________________________________________________________________

In 1976 Harrison, Ruzzo and Ullman (HRU) presented a model that could
implement many protection systems, including many DAC systems, and
showed that the safety problem for their model was undecidable. In
Oakland 2004, we presented a model that can implement all the kinds
of DACs in Osborn, Sandhu and Munawer's 2000 paper ("OSM00"). We
proved that it has a decidable safety problem and claimed that it was
the first system that both had a proof that its safety property was
decidable and could implement all those OSM00 DACs. By our result, it
is obvious that HRU must be able to express at least some access
control scheme not in the OSM00 DACs.

In Section 5.2, LT05 (discussed above) claims that our work has
"deficiencies from the standpoints of correctness" and that it "does
not capture the state invariant in [its encoding of DAC with change
of ownership] that in every state, there is exactly one owner for
every object that exists." That claim is erroneous. In our scheme,
"Ordinary object labels are of the form <U, N>, where U is a user and
N ... [is a] tag. An ordinary object label[ed] <U, N> is 'owned' by
the user U" (Section 3). Thus an object's ownership is determined by
its label. Change of ownership is allowed by a rule denoted
rl(<*u, *>, <*v, *>) = {*u} (a slight variant is shown in our
Figure 6 and described in the text of Section 5).
This rule enables user U to perform a change of ownership, giving
away an object it owns to user V. This implements OSM00's change of
ownership.

Section 5.2 of LT05 builds a model that claims to implement our
scheme. However, this LT05 model is inconsistent with our change of
ownership rule described above. It encodes object ownership by
mapping a label to a group of owner(s), and changing the owner
through the group mechanism; this does not ensure that "there is one
owner for every object". Our scheme does.

____________________________________________________________________
Notes from the Financial Cryptography Conference
February 28 - March 3, 2005
Roseau, Dominica
by Allan Friedman, Harvard University
____________________________________________________________________

Introduction (Sven Dietrich, Carnegie Mellon University)

Financial Cryptography 2005 (http://fc05.ifca.ai/) was held in
Roseau, Dominica (that's Dominica, between Guadeloupe and Martinique,
NOT the Dominican Republic on the island of Hispaniola shared with
Haiti). The conference venue was the Fort Young Hotel in the heart of
Roseau; about 60 representatives from research and industry from
around the world attended. The hotel was booked solid very early, so
many of us had to seek alternate housing at surrounding hotels.

The Balas Bar in the Fort Young ended up being the break area for the
sessions and a general meeting area before and after the talks.
Wireless service (sometimes reliable, sometimes not, your mileage did
vary) was available both in the conference room and outside in the
break area, which was quite useful in case someone needed information
or references. However, there were ample opportunities to network
with colleagues throughout the day until late in the night, e.g. at
the Rump Session and Banquet.
The conference was peppered with cryptograms (mostly - if not all -
courtesy of Ray Hirschfeld) to jog your brain: the obligatory T-shirt
contained them front and back, and the meal coupons were anagrams of
each meal (Monday breakfast, Monday lunch, etc.). Yes, what
cryptographers do for fun... Motto: It's not a junket! Really.
[Ed. Really?]

See you at FC'06 in Anguilla, BWI!

----------------
Monday, February 28, 2005

Threats and Attacks (Session chair: Avi Rubin)

Fraud with Asymmetric Multi-hop Cellular Networks - Gildas Avoine

The problem is how to encourage nodes in a mobile ad hoc network to
forward traffic to the next hop on its path to or from the station.
The originator of the message should be charged, but intermediaries
should be rewarded. The scheme has to be lightweight. Small amounts
of cheating are tolerable, but systematic fraud should be detectable
and punishable. Every message carries the sender's MAC, computed with
a symmetric key known only to the sender; the forwarders then submit
it to the station for some (50-75%) chance of reward. The system can
be subverted by two nodes in the same cell that simply don't go
through the station and so avoid a charge. This can be solved by
forcing message authentication. Another problem is the reliance on
the one key for authentication. This means that the station is
vulnerable to an oracle attack to determine the key. Using a hash of
the key helps.

Protecting Secret Data from Insider Attacks - David Dagon, Wenke Lee,

The goal is to make it as hard as possible for someone who has gained
access to a database (intruder or insider) to learn something useful.
They present a storage system that is incredibly large, so that
finding useful data is difficult even if someone has subverted some
of the defenses. It comprises a tera-scale padded table with
encrypted data broken into shares. It should be big enough that it
can't be stored in memory, forcing a disk access and the ensuing
performance drag.
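The "encrypted data broken into shares" step can be sketched with toy
Shamir-style threshold secret sharing, in which recombining shares is
exactly Lagrange interpolation. This is an illustration only: the
field size, threshold, and share count below are assumptions, not the
parameters of the Dagon-Lee system.

```python
# Toy Shamir-style secret sharing over a prime field. Recombination
# at x=0 is Lagrange interpolation; the real system's encoding and
# parameters are not specified here.
import random

P = 2**61 - 1  # a Mersenne prime, used as the field modulus

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):       # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, poly(x)) for x in range(1, n + 1)]

def recombine(shares):
    """Lagrange interpolation at x=0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        # pow(den, P-2, P) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(123456789, k=3, n=5)
assert recombine(shares[:3]) == 123456789   # any 3 of 5 shares suffice
```

Fewer than k shares reveal nothing about the secret, which is what
makes a single stolen chunk of the padded table useless.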
Lagrange interpolation can be used for data integrity. Linear scans
of the table are useless, as is any single chunk of the table. A
brute-force online attack is really slow, since it requires disk I/O,
but proper usage with appropriate information can run 300 disk
lookups/second. The speed difference between memory (proper usage)
and disk (brute-force access looking for shares) can be thought of as
a "poor man's" one-way function.

Countering Identity Theft through Digital Uniqueness, Location
Cross-Checking, and Funneling - P.C. van Oorschot and S. Stubblebine

Document-driven authentication has failed: easy to duplicate, hard to
detect duplicates, no back-channel for document management. Instead,
assert uniqueness of a credential through location. Use cell phones
with geolocation enabled (911 service), which will allow for
authenticator entry tied to a geographic space. For each
authentication, query the device: if there is one signal (PIN)
entered, assume proper use. If no signal or more than one signal is
received, there is a system error. Whenever ID is asserted, verify
through the local phone signal. Theft of the device is noticed, and
cloning causes multiple signals, so neither is a very strong attack.
There are still many legal and economic details to work out for
privacy issues.

First Keynote (Session chair: Moti Yung)

Trust and Swindling on the Internet - Bezalel Gavish

Fraud in online auctions has grown. Trust is important, and Gavish
detailed many fraud schemes currently being used on online auction
sites that prey on flaws in the trust-based system. Many auction
sites claim a fraud rate of less than 1%; this doesn't correspond
with anecdotal evidence, but we can't easily get data from the
auction houses themselves. He ran a survey of 130 auction winners
(10% response of 1300), 21% of whom were dissatisfied with their
experience. BUT - that's only 2% of all auctions, and there is a
likelihood of reporting bias.
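The keynote's "only 2% of all auctions" remark follows directly from
the survey arithmetic:

```python
# Arithmetic behind the "2% of all auctions" figure: 21% of the 130
# respondents were dissatisfied, but measured against all 1300
# surveyed auctions that is only about 2%.
surveyed = 1300
respondents = 130                    # 10% response rate
dissatisfied = 0.21 * respondents    # about 27 winners
share_of_all = dissatisfied / surveyed
assert round(share_of_all, 2) == 0.02
```

The gap between 21% and 2% is why the reporting-bias caveat matters:
the dissatisfaction rate among non-respondents is unknown.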
It's harder than you'd think to get good data about the results of
auctions, but there is reason to believe the fraud rate is higher
than we'd think.

Digital Signing Methods (Session chair: Giovanni di Crescenzo)

Identity-based Partial Message Recovery Signatures (or How to Shorten
ID-based Signatures) - Fangguo Zhang (Sun Yat Sen University,
P.R. China) and Yi Mu and Willy Susilo (University of Wollongong,
Australia)

Can we get the security of an L-bit key's signature (difficulty of
forging) with fewer than L bits if we want to recover the message?
Using Abe-Okamoto's (1999) method for shortening signatures for
message recovery, they are able to use the signature to recover
discarded bits in a truncated message. This is a good work-around for
schemes that have fixed message lengths.

Time Capsule Signature - Yevgeniy Dodis (NYU) and Dae Hyun Yum
(POSTECH)

Goal: a signature that is valid at a future time t, but not valid
now. They use absolute time to define the "time capsule" signature's
activation, but the time server can be independent of this protocol
(it doesn't need to contact anyone at any specific time). The other
party should be able to verify that the signature will be valid at t.
They construct "identity-based trap-door hard-to-invert relations".
Clever idea, but unclear how it would be used.

Policy-Based Cryptography and Applications - Walid Bagga and Refik
Molva (Eurecom)

Most security systems use policy for access control and crypto for
confidentiality and authentication. These systems are usually slapped
together, since they are hard to combine. Policies are monotonic
logical expressions of ands and ors, defined through trusted
authority and assertion. Policy-based encryption integrates policy
with crypto. Keys can be policies: decrypt a message only if the
predefined policy/key allows. Suppose a client sends an encrypted
request to a service provider along with a policy for privacy
certification. The SP can only decrypt if it has a cert issued by an
authority that the sender trusts.
While trust negotiation is terribly flawed on this level, this scheme
sidesteps the negotiation phase, making communication more efficient.

Panel: Phishing
Organizer: Steve Myers, with Drew Dean, Stuart Stubblebine, Richard
Clayton, and Mike Szydlo

Steve reminded us that phishing is an attack that combines social
engineering and technology to get valuable information, or otherwise
take advantage of the victim. Phishers fool their victims with
convincing reasons to visit misleading sites, often using fake
addresses or even fake SSL certs. RSA did a survey showing that
consumers are now more scared. Anti-phishing.org shows phishing
emails are on the rise.

Stuart talked about the futility of both technical and social
countermeasures. Some advice we give consumers is conflicting (Citi
says don't click on unsolicited emails, then sends them) or less than
helpful in the long run (eliminate the use of clicking in email?).
Fooling the phishing site by always giving the wrong password first
won't work for long as scammers become more sophisticated. Merchants
don't want to give up the direct channel of email, but as email
becomes riskier, who knows?

Drew reminded us that man-in-the-middle web-spoofing attacks are as
old as the web. Phishing should be easier to protect against, since
it is spam, not MitM, but it spreads more easily. Phishing attacks
the user, not the computer. The attacks aren't that technically
sophisticated, and some client-side defenses should work. Why don't
we sync our spam filter with our browser? Tools like the Stanford
SpoofGuard would also help.

Steve proposed an alternate protocol for authenticating websites.
It's easy to duplicate a website, and hard for firms to police
trademark misuse by others. The password model authenticates the user
to the site, but not the site to the user. Use images sent back from
the website for the user to authenticate as the password is typed in
letter by letter.
The user just has to *recognize* the images, not recall them
explicitly. No hardware is required, and the scheme can have a large
image space. The plan is still vulnerable to a MitM attack, but it
makes phishing harder.

Richard reiterated that any single tool has enough flaws to conclude
that it will not solve the problem. Even if a user can reliably prove
her ID, there are no mechanisms binding authentication to action, so
Alice can think she is paying her gas bill while the attacker
withdraws 1000 pounds. The intermediary software must be trustworthy,
and its distribution is vulnerable to more phishing, as is the
distribution of client certificates. He charged that current browser
technology didn't guarantee the credibility of anything on the
screen, and that fully defensive behavior would probably prevent the
user from using the bank's services. However, he argued that
implementing many small steps would raise the cost of phishing enough
to potentially reduce it.

Discussion after the panel wondered about the impact of these
security flaws on the consumer market, and whether the market could
drive better security. There was some debate over whether regulation
and liability needed to be adjusted to foster the appropriate
environment. Someone in the audience pointed out that the bad guys
were better at cooperating than the good guys. Several people
suggested attacking phishing at the level of email media: ISPs can
check source domains or blacklist known sites. These solutions are
part of a larger network security debate.

----------------
Tuesday, March 1, 2005

Privacy (Session chair: David Pointcheval)

A Privacy Protecting Coupon System - Liqun Chen, Matthias Enzmann,
Ahmad-Reza Sadeghi, Markus Schneider, Michael Steiner

Repeated-use coupons are useful to both the consumer (get free stuff)
and the seller (customer lock-in, brand loyalty). Unlike paper
coupons, digital coupons involve identity tracking as the easiest
solution.
The vendor needs protection against forging and double use, and usage
control to prevent multiple users from pooling coupons. The
privacy-sensitive consumer wants to prevent linking between issuance
and redemption. The consumer buys a multi-coupon, each part good for
one redemption (e.g. buy 10 tunes for the price of 9). They present a
signing scheme based on (Camenisch-Lysyanskaya 2002) that uses a
different exponent root for each signature. To issue a coupon, the
customer chooses a random string, computes a binding factor, and
computes a value D using the vendor's public key. The vendor then
computes a blinded signature, which the customer can then unblind. To
redeem a single coupon, the vendor only needs to know that at least
one unredeemed coupon exists, but shouldn't be able to link any two
coupons to a single multi-coupon. The solution is to show that a
provable but undisclosed multi-coupon signature covers the single
coupon.

Testing Disjointness of Private Datasets - Aggelos Kiayias and
Antonina Mitrofanova

Suppose two distrustful parties with private datasets from the same
lexical universe want to compare these datasets without revealing the
contents to each other. The presenters offered three protocols based
on homomorphic encryption principles, each optimizing for different
attributes. Any private intersection predicate evaluation (PIPE)
system should be designed to reflect the size of the lexicon and the
sets, revealing only a yes or no as to whether there are any common
elements in the two sets. In the first solution, each actor encrypts
the entire alphabet, with an added primitive denoting the members
they possess: any common elements will be observable due to
homomorphic properties. This is straightforward, but impractical with
large lexicons. The second and third schemes use "superposed
encryption", where one party computes a polynomial based on
encrypting the held set and superposing it, then encrypting the
combination.
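The yes/no predicate these protocols compute can be illustrated in
the clear: with 0/1 indicator vectors over the shared lexicon, a
random nonzero multiple of the inner product is zero exactly when the
sets are disjoint, and the mask hides how large any overlap is. The
sketch below deliberately omits the homomorphic-encryption layer that
provides the actual privacy; names and the modulus are illustrative
assumptions, not the paper's construction.

```python
# Toy illustration of the disjointness predicate behind a PIPE
# protocol. Real protocols evaluate this under homomorphic encryption
# so neither party sees the other's indicator vector; here everything
# is in the clear to show only the algebra.
import random

P = 2**31 - 1  # prime modulus (illustrative)

def indicator(lexicon, subset):
    """0/1 vector marking which lexicon elements the party holds."""
    return [1 if w in subset else 0 for w in lexicon]

def disjointness_test(vec_a, vec_b):
    """True iff the two sets share no element. The random mask r
    hides how many elements overlap, revealing only zero/nonzero."""
    r = random.randrange(1, P)
    masked = r * sum(a * b for a, b in zip(vec_a, vec_b)) % P
    return masked == 0

lexicon = ["apple", "pear", "plum", "fig", "date"]
a = indicator(lexicon, {"apple", "fig"})
b = indicator(lexicon, {"pear", "date"})
assert disjointness_test(a, b)                           # disjoint
assert not disjointness_test(a, indicator(lexicon, {"fig"}))
```

Encrypting the full indicator vector is what makes the first protocol
impractical for large lexicons, as the summary notes.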
The latter two are more efficient for smaller sets in larger
universes. It is not clear exactly what the ability to test private
datasets for common elements would be used for.

Hardware-oriented mechanisms (Session chair: Jacques Traore)

RFID traceability: A multiplayer problem - Gildas Avoine and Philippe
Oechslin

The presenters summarized how RFID technology works, and observed
that the privacy issues have to do with information leakage (the ID
can convey other information about the bearer) and traceability. The
former issue can be solved at the vendor level. RFID is bad for
traceability: tags cannot be turned off, they are not always evident
to the bearer, their range of readability is increasing, and they
present an equipped observer with easy-to-analyze logs. Physical
solutions involve physically destroying the tags, preventing them
from being read with Faraday cages, or preventing the response from
being read with blocker tags. Software solutions that allow the
authorized party to read the tag but prevent unauthorized tracking
across time/space are just being developed, and have issues.

RFID technology has a stack of a physical layer, a communication
layer and an application layer. Any of these layers can be attacked
to trace the tag, so a privacy scheme needs to address all layers.
The data-based application layer can be attacked based on the
multiple signals sent. The communication layer can be attacked where
the reader is responsible for preventing collisions, both with
deterministic and probabilistic protocols. The interface of the
physical layer is very open to eavesdropping, radio fingerprinting,
etc. A focus on the application layer alone won't suffice; ideally,
we'll need crypto at the communication layer. Stronger protocols are
more expensive. Discussion generated some controversy about killing
and jamming tags.

Information-theoretic security analysis of physical unclonable
functions - P. Tuyls, B. Skoric, S. Stallinga, A.H. Akkermans and W.
Ophey

In crypto-based systems, storing keys on weak media can be the
weakest point, allowing cloning and abuse. The solution is to use
physical artifacts that are unclonable. They must be easy to
evaluate, hard to invert and unpredictable even to someone who has
the function. Multiple different inputs should produce random-looking
outputs. Philips has a prototype optical system: shine a light on a
pattern, and get a pattern back, from which you generate a key. Each
angle forms the first half of a different key pair. Like biometrics,
they are robust in perfect conditions, but very sensitive to noise.
Security-wise, physical cloning requires the actual device.
Electronic cloning requires the full challenge-response space, or at
least extraction of all the entropy, as they go on to prove.
Questions about how large the space is were mollified, since the
prototype has 10^8 challenge options, with thousands of bits of
output space.

Supporting Financial Transactions (Session chair: Liqun Chen)

Risk Assurance for Hedge Funds using Zero Knowledge Proofs - Michael
Szydlo

Hedge fund investors want to know as much as possible about the risks
of their investment. At the same time, the fund follows private,
proprietary investment strategies, and wishes to keep its holdings as
secret as possible. The funds use statistical arbitrage, so they need
to keep their actions secret; they are also subject to less
regulation, so investors are more exposed if the fund's positions are
riskier (statistically) than what the investors approved. If the fund
managers make statements about their risk exposure using ZK proofs,
investors will be able to measure their risk without being able to
learn the actual investments. Risk proofs can be about firm earnings,
geopolitics, asset allocation percentages, etc. The investment
contract includes ZK proofs based on market predicates, verified by a
trusted third party. This is less for individual investors than for
institutions, which have precise risk models for their investments.
Dishonest actions now leave a clear paper trail for fraud. Discussion
revealed that the proofs must be defined strictly enough to be
useful, but broadly enough to prevent an oracle-like attack.

Systems, Applications and Experiences (Session chair: Lynne Coventry)

Views, Reactions and Impact of Digitally-Signed Mail in e-Commerce -
Simson L. Garfinkel, Jeffrey I. Schiller, Erik Nordlander, David
Margrave, Robert C. Miller

Secure email has been around for a while, and S/MIME has been around
since 1998, but neither seems to be too popular. S/MIME allows the
mail client to easily verify a signature; in business circumstances
where getting the sender's key is easy, it should be easy to use.
S/MIME may not be the best standard (document updates, subject not
signed), but it's common. The project is based on the results of a
survey of Amazon merchants, since Amazon uses S/MIME for VAT notices.
417 surveys were completed (25% EU - 75% US) and the respondents were
reasonably well off, educated online merchants. Most respondents
didn't know if their email client could handle encryption. Most felt
that bank and online merchant information should be signed, with some
respondents desiring tax returns and personal correspondence also
signed. Merchants didn't believe that signed messages were more
trustworthy. Some felt that signed mail was important, but that it
was too complicated or not worth the effort: they had no reason to
send out signed emails. In the discussion, it came up that people
don't think of a signature as a message verifier, but as a sender
verifier.

Securing Sensitive Data with the Ingrian DataSecure Platform - Andrew
Koyfman

There is a large need to keep databases secure, but we can't focus on
perimeter defenses alone: there are too many breaches, and too many
insider attacks. Application-level security that encrypts data before
it goes in the DB requires firms to change their apps.
So it's easier to add on database-level encryption that is
transparent to existing apps. And we don't need all the data
encrypted, just the sensitive stuff. Encrypting the columns can break
existing read mechanisms, so we use triggers that only authorized
parties can use to decrypt data. A network-attached centralized
server handles the keys and crypto for all users, with each user
having different permissions (time of day, rate of processing, etc.).
Like all database apps, there are hassles: indexed fields are tricky
and there are legacy issues.

Ciphire Mail Email Encryption and Authentication - Lars Eilebrecht

Why a new encrypted email scheme? The hard part of encrypted email is
key distribution: the web of trust doesn't really work for normal
users, and the trusted third parties needed for X.509 or PKIX are
unproven. Ciphire transparently sits between the user's mail client
and server. The setup uses a centralized server to generate a key for
an email address - not a person - without much user involvement. It
uses ASN.1 (based on X.509) and encrypts the header as well as the
message, and each MIME part separately. The hybrid trust model uses
the hierarchical nature of PKI (with some detection against server
subversion), but each client checks its own certs and other certs,
and compares summary hash values with communication partners, so many
users would be able to see if something had changed. It's not
compatible with extant systems, and the dispute resolution and key
revocation bits are sketchy. But the system is trying to be as simple
for the user as possible.

----------------
Wednesday, March 2, 2005

Second Keynote (Session chair: Andrew Patrick)

Lynne Coventry - Usable Security: A conundrum?

The goal of most systems is something other than security. The
question is whether reconciling usability and security is impossible
or just difficult. The default is to make things convenient, rather
than secure.
Many parties are involved in the burgeoning fraud arena - attackers,
legal users of applications, programmers, ISP owners, company
managers, etc. - all involved in the tradeoff, BUT legal users have
the steepest cost/benefit range. Since users think of money as safe
to start with, security is seen as an added cost to them. A dizzying
array of ATM fraud was presented, most of it capitalizing on users'
trust in the banks and the people around them. We can ask users to
secure themselves, but PINs have usability issues, and letting users
set their own risk by pre-specifying hours or locations may or may
not work. Memory sets limits on "something you know", and natural
behavior around biometric systems leads to false reject rates of
20-30%, so that wraps it up for "something you are." We need more HCI
and user studies: people can be their own worst enemy. Even
increasing public awareness can backfire by reducing confidence and
use. A participant asked how banks make their final decisions on
security measures, but there appears to be no clear metric.

Message Authentication (Session chair: Yuliang Zheng)

A User-Friendly Approach to Human Authentication of Messages - Jeff
King and Andre dos Santos

If the user trusts a smart card, but doesn't trust the intermediary
computer, how can the user trust the smart card to sign a message to
a server? They propose using hard AI problems that the device can
initiate, the human can solve, but the computer can't. The user needs
to be able to extract a unique message from the smart card, without
the computer identifying (and being able to tamper with) that
message. The device generates a 3D ray-traced image with text in it.
The user can verify the image and read the text message. Any
attacking intermediary would have to redraw the image with a new
message, which would involve interpreting the scene to re-run a ray
tracer. There are other mechanisms you could use: speech, or
handwriting.
The system must be easy to use, and depends on the AI problem parameters. The developed image system can transmit 20 characters, so it's good for status messages, but not meaningful communication.

Auctions and Voting (Session chair: Yvo Desmedt)

Small Coalitions Cannot Manipulate Voting - Edith Elkind and Helger Lipmaa

Plurality systems create incentives for malicious coalitions to not vote their true preferences in order to manipulate the outcome. Arrow's work shows that manipulation is not completely preventable, but we can make it hard to do. A random preround of pair-wise comparisons chooses which half of the candidates go on the ballot, but this can be a source of manipulation itself. The authors extend Conitzer and Sandholm to apply to more than one manipulator, but it is not clear what systematic rule determines the maximum fraction of manipulators that can be tolerated. A question highlighted the fact that the model also assumes equal weights across preference strength for each actor.

Efficient Privacy-Preserving Protocols for Multi-Unit Auctions - Felix Brandt and Tuomas Sandholm

In a multi-unit auction with identical items, the auction is an efficient way to allocate goods based on value, but you have to trust the auctioneer. Auction fraud is easy with an untrustworthy auctioneer, and hard to detect. The bidders don't want to reveal their bids to each other, since that can expose strategy, so the protocol should be robust against coalitions. The protocol uses distributed homomorphic encryption: the parties generate keys, publish encrypted bids, jointly compute the outcome vectors, and then perform distributed decryption. Bad-faith participation can be identified, those bids removed, and the computation restarted.

Event Driven Private Counters - Eujin Goh and Philippe Golle

In an instant runoff election, preference orderings are publicly revealed anonymously. Still, it can be possible to submit preference orderings that are combinatorially identifiable, allowing for credible vote-selling.
Common solutions such as encrypting counters add too much work to the tallying of votes. The authors present a "private counter" protecting a value that updates based on events, without revealing what the new value is. Individual preferences are never actually revealed. Results are computed collectively, and if a winner is not reached, the removal of one candidate is announced as an event. All private counters update, and another result is computed. If the private counter is secure, then this system is as well.

----------------

Thursday, March 3, 2005

User Authentication (Session chair: Mike Szydlo)

Secure Biometric Authentication for Weak Computational Devices - Mikhail J. Atallah, Keith B. Frikken, Michael T. Goodrich and Roberto Tamassia

Biometric authentication should be fairly straightforward, but standard systems need to compare the cleartext form of the biometric data, because hashing similar readings produces different outcomes. The system assumes a smart card with a biometric reader and a server to be accessed, with the goal that attacking the server should not let you impersonate the user. Even sending the biometric reading XOR'd with a one-time pad is not enough, since it will leak information over time. The authors use Boolean vectors to compute the Hamming distance between the encrypted measured value and the encrypted stored value. This yields a very lightweight protocol that requires possession of the smart card AND either the source biometric or access to the server. Since the smart card and server are both involved in the authentication, each authentication system requires a separate smart card.

Panel: Incentives, Markets and Information Security. Organizer: Allan Friedman, with Bezalel Gavish, Paul Syverson, Sven Dietrich and Richard Clayton

Gavish presented a straightforward economic analysis of a commonly discussed information security issue: spam.
Gavish argues the problem stems from the low marginal cost of sending messages in a digital environment, and proposes a fee-based system that gives a credit to each recipient, claimable from the sender. While the general idea has been discussed before [5], this approach involves both the end parties and the service providers. Gavish advocated a dynamic pricing scheme, and highlighted important areas of research for implementation.

Paul Syverson shifted the focus from mechanisms to institutions, arguing that "identity theft is about neither identity nor theft." Syverson highlighted flaws in the current state of consumer authentication, where information that has a very high value in specific contexts (a social security number can open a line of credit or obtain a new password) is undervalued by some actors, leading to arbitrage and fraud. This also introduced the concept of a security externality, where poor protection or overuse of identifying and authenticating information can raise fraud rates for other parties.

Sven Dietrich demonstrated that a single security issue like distributed denial of service (DDoS) attacks presents the opportunity for multiple levels of analysis that stem from unique features of information systems. The nature of the attack stems from the decentralized environment, where the coordination costs of a bot-net are less than the damage inflicted on the target. Networks of subverted machines also raise the question of who should bear responsibility for the damage caused, since the software manufacturer, the machine owner and the local ISP could all, in theory, have prevented the machine from causing damage. Dietrich also explained that networks of subverted machines are traded in illicit marketplaces, raising questions of trust and quality guarantees. While no single approach can solve the problem of DDoS attacks, each layer of analysis opens an opportunity to raise the costs, reduce the damages and mitigate the harms of this critical issue.
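As an aside, the sender-pays, recipient-credit mechanism Gavish described can be illustrated with a toy model. This is only a sketch of the economic intuition - the account names, the one-cent fee, and the accept/claim interface are all hypothetical, not part of the proposed protocol:

```python
# Toy model of a sender-pays, recipient-credit email scheme, in the
# spirit of Gavish's proposal. Names, fee amount, and the claim
# interface are hypothetical illustrations.

class CreditMailService:
    def __init__(self, fee_cents=1):
        self.fee = fee_cents
        self.balances = {}  # account -> balance in cents

    def send(self, sender, recipient, wanted):
        """The sender posts the fee with every message. If the recipient
        marks the message as wanted, the credit returns to the sender;
        otherwise the recipient keeps it, so spam carries a real cost."""
        self.balances[sender] = self.balances.get(sender, 0) - self.fee
        beneficiary = sender if wanted else recipient
        self.balances[beneficiary] = self.balances.get(beneficiary, 0) + self.fee

service = CreditMailService()
# A spammer blasting a million unwanted messages loses $10,000 ...
for _ in range(1_000_000):
    service.send("spammer", "victim", wanted=False)
# ... while ordinary, wanted correspondence stays effectively free.
service.send("alice", "bob", wanted=True)
print(service.balances["spammer"], service.balances["alice"])  # -1000000 0
```

The asymmetry is the point: the marginal cost of a message is near zero only for senders whose mail the recipients actually want.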
Finally, Richard Clayton took a step back, acknowledging the importance of economics in the field of security but tempering this enthusiasm with several observations. Using the example of email payments, he illustrated that proposed economic solutions might fall flat for simple economic or technical reasons. Furthermore, economics is a nice tool, but good numbers are needed to judge efficacy: it is one thing to build internally consistent models, but to further extend the field, these models should be consistent with empirical data. Clayton summed up by urging people to learn more about economics, but suggesting that it was "perhaps not yet time to change departments."

____________________________________________________________________
Book Review By Robert Bruen May 14, 2005
____________________________________________________________________

Silence on the Wire. A Field Guide to Passive Reconnaissance and Indirect Attacks by Michal Zalewski. No Starch Press 2005. ISBN 1-59327-046-1. LoC TK105.59.Z35. 281 pages. $39.95. Index. Bibliographic references in endnotes.

"Silence on the Wire" is an unusual and highly interesting security book. Though written in narrative form, unlike other security books, it does not fit into the category of Kevin Mitnick (The Art of Deception and The Art of Intrusion) or Ira Winkler (Spies Among Us). The discovery of a technical book in this style is cool. Zalewski builds on his passive OS fingerprinting work to provide us with a framework for looking at network security. It feels like we are listening in on his thoughts as he observes and analyzes the way TCP works. Most of us in the security universe have analyzed TCP packet structure, payloads and the misuse of the format, but here we see someone watching as the traffic flows. The net is a living entity whose global behavior reflects the cumulative behavior of local events, something akin to the sound of a highway as the individual automobiles drive by.
Listening to that highway sound tells you something about the highway which is different from what just one car is doing. When a competent thief breaks into a place, steals something, and then disappears into the darkness, there is no disturbance and no trace is left - the opposite of a gang bursting into a bank in broad daylight. Quietly gathering small bits of information which gain meaning when aggregated is the preferred method of reconnaissance. Listening for loud noises is not so hard, but knowing when those one or two unusual packets are a precursor to an attack is much more difficult. Successful exploits require deep understanding as well as patience. It is not enough to find some bad line of code that allows a buffer overflow. There is a creative side which watches how things happen, as a sea of activity gives up a hint of a weakness, followed by a careful crafting of a method to take advantage of it. This approach is not the usual attack on a victim. Instead, we encounter a watchful eye on a complex environment which seeks out the details overlooked by a noisy attacker. "Under the radar" is an apt phrase to describe passive reconnaissance. Getting into the head of someone who knows how to do it can be a challenge unless they choose to reveal how they think and observe. This book is one of the rare opportunities to peek inside. I am not sure we can all learn how to do this well, because it is not a technique which can simply be mastered; it may very well be something you get at birth. In any case, understanding what it is all about is still worthwhile, especially from the author of this book. Zalewski's perspective on network attacks is unique and valuable. I recommend "Silence on the Wire" to anyone who wants to broaden their own view.
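For readers unfamiliar with Zalewski's earlier passive OS fingerprinting work (the p0f tool), the core idea can be sketched in a few lines: fields observed in a TCP SYN are matched against a signature table, without ever sending a probe. The signatures below are illustrative stand-ins, not an accurate fingerprint database:

```python
# Minimal sketch of passive OS fingerprinting in the spirit of
# Zalewski's p0f: match TCP SYN fields against known signatures.
# The signature values here are illustrative, not authoritative.

SIGNATURES = {
    # (initial TTL, window size, don't-fragment flag set)
    (64, 5840, True):   "Linux 2.4/2.6 (illustrative)",
    (128, 65535, True): "Windows XP (illustrative)",
    (255, 4128, False): "Cisco IOS (illustrative)",
}

def guess_os(ttl, window, df):
    """Round the observed TTL up to a common initial value (each router
    hop decrements it by one), then look up the resulting triple."""
    for initial in (32, 64, 128, 255):
        if ttl <= initial:
            ttl = initial
            break
    return SIGNATURES.get((ttl, window, df), "unknown")

# A SYN arriving with TTL 51 most likely started life at 64.
print(guess_os(51, 5840, True))
```

The real tool considers many more header quirks (options ordering, MSS, timestamps), but the quiet, listen-only character of the technique is exactly what the book's title refers to.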
____________________________________________________________________
Book Review By Robert Bruen March 13, 2005
____________________________________________________________________

Darknet: Hollywood's War Against the Digital Generation by J. D. Lasica. Wiley & Sons 2005. ISBN 0471683345. 320 pages. $25.95. Index.

About four years ago Peter Biddle et al. prepared a good paper called The Darknet and the Future of Content Distribution. It was essentially a statement of the situation and a dire warning for those who were concerned about protecting copyright. They made the mistake of defining darknets as a copyright-free zone for sharing files, when in fact darknets are a logical outgrowth of network development. The most noticeable characteristics are those that keep these nets private from all but a chosen few. This is not as sinister as it sounds. For example, we might invite only a select group of friends when we throw a party, or we might sponsor a community get-together which lets anyone in. It is simply a matter of choice, not necessarily a conspiracy to do harm. The Internet is one of the most open forms of technology and social interaction that we have ever seen. Nevertheless, many networks which connect to it are private, such as a corporate network, where access is limited to employees. These networks are dark to us if we do not work there, and perhaps even if we do work there. These same networks ought to allow only encrypted traffic, making them truly dark to outsiders. Again, nothing sinister, just private.

The darknets in question are somewhat more sinister, however. The lawsuits started by the Recording Industry Association of America (RIAA) now number in the thousands. The RIAA is suing individuals deemed to be stealing music by sharing digitized versions of songs, commonly called MP3s because of the compression algorithm. Much of the difficulty started with Napster, but the situation now appears to be completely out of control.
Several businesses, like Apple, have finally moved to selling music over the Internet for about a dollar a song, a smarter move than suing the kids who make up your customer base. One of the consequences of this struggle is the creation of darknets which swap songs, movies, games, software, and whatever else is of interest. The attack by the RIAA has forced the use of the technology underground, not unlike the old days of alcohol prohibition, which helped fuel an underground economy that today probably rivals the visible economy. Digital technology now allows individuals to do in their homes what only a few years ago required large sums of money, expertise and other resources. Powerful private networks are not very expensive, so when a small group of college kids decides to create a darknet, there are few obstacles in their way. It is not a crime to build one, but like most things, it can be used for good or evil. The real issue is that these groups now have a place to do what they want without interference; they no longer have to participate in activities in the open. Down the road, we will see many more of these darknets, or rather see their repercussions. A major concern is that this goes far beyond music sharing into all possible forms of entertainment. We are at a fork in the road: the entertainment industry against the techno-savvy young adults. In the long run, my money is on the kids. The movie industry has been hesitant to jump into the legal solution, but it is worried because the time has arrived when downloading a movie is no different from downloading a song. The networks are faster, the software is smarter, storage is cheaper, and powerful computers cost less. Add encryption to the mix and there is no easy way to stop the process. The scope of the issue is both broad and deep. The best source of current information is this book by J. D. Lasica. The requisite web site is http://www.darknet.com.
It is a fascinating, well-researched book that everyone interested in the future of technology and society ought to read. In an ironic example of how foolish the establishment can be, Steve Jobs of Apple has ordered all Wiley books removed from Apple's retail stores because Wiley is publishing an unauthorized biography of Jobs. Now that he is a power within the industry, he has decided to step on others on their way up. I urge everyone to learn about the issue by reading Lasica's book. It is an accessible book which brings the social consequences of responses to technology to those who most need to know. I read it in one sitting, completely absorbed by his insights into the issues and the barrage of thoughts about future implications. While darknets are not new, the social impact of their existence is still unfolding.

____________________________________________________________________
Book Review By Robert Bruen May 14, 2005
____________________________________________________________________

Reversing: Secrets of Reverse Engineering by Eldad Eilam. Wiley Publishing 2005. ISBN 0-7645-7481-7. 589 pages. $40.00. Three appendices. Index with bibliographic references included.

It is not clear that secrets are involved in reverse engineering, except that the secret is expertise. The process of getting a program to run is straightforward: design, code, compile, link and run. The binary executable is a translation of an idea into source code, the source code is compiled into object code, which is then linked with code from system libraries. The process is non-trivial, but like time's arrow, it should be a reversible one. The trick, of course, is mastering each level of translation, which is not so simple. Anyone in the tech field ought to have written some program in some language, even if it was just "Hello World." Since the point is to run a program, most people stop there. Over the years, however, many reasons to go to the code have popped up.
Debuggers have become sophisticated to the point where symbolic references, stepping through the code, branching and all sorts of bells and whistles are the norm. They are so good that a binary can be reversed with almost no effort, if you understand the process. Just as obviously, the folks who would like to prevent their code from being reversed have also been making progress. They strip out useful information, obfuscate variable names and use other techniques to make it difficult to reverse their code. But in the end, what was translated within software can be reversed; the only question is how hard it will be. In physics, time's arrow should be reversible, at least within the math. In the world we live in, it is not possible to reverse things we break, like eggs and drinking glasses. Those who would like to prevent reverse engineering are stuck in the math world, at least until a technique is developed to make it impossible to unwind what has been done and still have the code run. Until now, reverse engineering has had scattered sources on the web and in chapters of good books such as "Security Warrior" by Cyrus Peikari and Anton Chuvakin. Eldad Eilam's book fills the gap admirably. This is a top-notch book covering everything you need to know about reverse engineering code in any environment. Mastering techniques, products and concepts is the goal, and it is all here. There is also a web site with source code to accompany the book. Be forewarned: you will need to learn assembly language to get through it. If you skipped the hard parts of writing code, you will be challenged. This is not a book for wimps. There are disassembly listings, there are step-by-step, detailed instructions on reversing programs, and there are charts explaining the conceptual approach. Eilam covers .Net reversal as well as copy protection. While none of this should be controversial, all of it seems to be.
"Reversing" is the best book available to learn, understand and practice reverse engineering while it is still legal. [Part 2 of Cipher is being sent separately]