Review of Financial Cryptography 2006
Anguilla, British West Indies
February 27 - March 2, 2006

Review by Sven Dietrich, Carnegie Mellon University
May 13, 2006

This year's Financial Cryptography (FC) conference was once again held in Anguilla, where the first instance of FC was held (one that I personally attended back in 1997, despite a looming American Airlines pilot strike). This year's conference had a sense of retrospective, reinforced by Ron Rivest's keynote speech and Michael Froomkin's invited talk. The general chair was Patrick McDaniel, Penn State University, and the program co-chairs were Avi Rubin, Johns Hopkins University, and Giovanni di Crescenzo, Telcordia. They put in considerable effort to make this an excellent conference with a solid program. The main topics this year were Authentication and Fraud Detection, Privacy, Reputation and Mix-nets, Conditional Financial Cryptography, Payment Systems, and Efficient Protocols. There also were a series of Short Papers, an industry presentation, and a rump session.

The conference itself was held at Paradise Cove, a resort on the southwest side of the island, facing the island of St. Maarten only a few miles across the sea. As the number of attendees hovered around 70 people (too large for the set of rooms at the small resort), all from industry, government, and academia, many had to find alternate housing arrangements that made the Traveling Conference Attendee, err, Traveling Salesman Problem look tame. Left-hand driving is a must, which one can easily forget as the cars are left-hand drive as well. Many drivers heard the desperate reminders of 'stay left!' from their passengers. However, shuttle service from the various hotels to the conference venues and the events was excellent.

The sessions were mostly held at the Paradise Cove, and those turning green with envy can be reassured that the sessions were held in the basement conference center, where no sunlight penetrated. A series of wireless base stations assured connectivity, and many attendees opted for the Tor approach to web access. The rum(p) session was held at the original FC location, Chandeliers, the conference facility at the Interisland hotel. Other highlights were the live music night with reggae legend Bankie Banx at the Dune Preserve and the Beach BBQ at the Anguilla Great House (for most attendees these were a short stroll via the beach at Rendezvous Bay, but not close by car - a simple exercise in non-Euclidean distance metrics).

Now on to the technical part of the program. The sessions were summarized by various attendees at the conference. Those summaries were collected by Will Enck, Penn State University, who also assisted Patrick McDaniel during registration.

Stay tuned for the Call for Papers for Financial Cryptography 2007 and please consider submitting and attending.

Sven Dietrich
Program Chair, FC 2007
Session Summaries
by Will Enck, Penn State University
Keynote Address: Ron Rivest
Title: Perspectives on Financial Cryptography Revisited

Summarized by Will Enck, Penn State University

At the first FC in 1997, Ron Rivest presented his perspectives on financial cryptography. In this presentation, he discussed 18 debatable propositions. With time, we gain experience, and hence new perspectives on security. For the 2006 FC keynote address, he revisited these points.

P1: There is little difference between Internet payment schemes and interstellar payment schemes. In 1997, the Internet was not nearly as embedded into our daily lives. This proposition is false, as we now recognize the need for the Internet to be connected to the ``real world'' (e.g., delivery).

P2: Historically, payment schemes have not worked well. Payment schemes have transitioned through many forms of exchange. Sampling a few, we observe metal, paper money, checks, credit cards, and more recently, electronic money. In 1997, electronic money was a relatively new concept, and it was debatable whether or not it would fare well. Looking back, Rivest believes the original proposition was mostly true. Hyperinflation exists with electronic money in MMORPGs, but things are getting better, e.g., through risk management.

P3: Electronic cash systems will enable anyone with a PC to be a ``mint'' for his own brand of currency. Electronic cash begets many of the same concerns as when the printing press became widely available. In particular, many believe it will diminish the role of the central bank. In 2006, Rivest believes that while the proposition is technically true, it has proven false in practice, as large financial institutions continue to dominate.

P4: National currencies won't go away, to be replaced by cyberspace dollars. Looking back in 2006, and given proposition P2, Rivest believes this will remain true.

P5: Individual privacy is already lost, and must be regained. In 2006, this remains true. In fact, it has gotten worse. Anyone can buy someone's cell phone history for only $100; Tivo tracks your viewing habits; Amazon profiles your buying habits; Google characterizes your web searches. The real question is, how much do people care?

P6: User profiling has a definite ``up-side'' for the user. Despite the privacy infringement introduced by profiling, it does allow for better service, e.g., spending profiles aid fraud detection. In 2006, this remains true, but only when profiling works well; the up-side disappears when, say, Tivo guesses incorrectly.

P7: Governments will not allow payment systems to support true (payer or payee) anonymity for large payments. With respect to law enforcement, there are many good reasons to document the identities of those making large transactions, e.g., bribery, kickbacks, political payments. In 2006, this remains true, especially post 9/11. Now, however, there is not even serious debate about it.

P8: Achieving payer anonymity for small payments by cryptographic means is too expensive (in terms of complexity and CPU time). In 2006, this remains true, but mostly in terms of user and system complexity. Again, as people do not care much about anonymity, there has been little demand for such systems.

P9: Anonymity will be a value-added feature that a user may purchase. Conversely, a user may break his own anonymity in a transaction for a fee. In 2006, Rivest believes the original proposition to be false. Anonymity as a value-added feature only exists in a minor way. Again, there is a recurring theme that people only care about ease-of-use, disregarding any concern for privacy. Avi Rubin noted that many people do care about their cell phone records. Subsequently, another attendee mentioned that there is a parallel between identity theft and anonymity, but Rivest disagreed, saying that they are different beasts.

P10: Multi-application smart cards will never make it big. This mainly results from users' comfort with having one card per issuer. In 2006, we are beginning to see signs of change, e.g., Hong Kong's ``octopus card.'' Even though the multi-application smart card has not caught on, we are seeing a similar phenomenon occur with cell phones. The cell phone is a prime candidate for use as an authentication device or trusted agent. The question that remains is, do we trust it?

P11: Anonymity for small-value payments will arise (only) from anonymity of card-holder/card relationship. Smart cards can be obtained anonymously and then used for many small-value purchases. In 2006, this remains true, as small pre-paid application cards (e.g., for transit) exist.

P12: Smart cards will be ``broken into'' on a regular basis, but the cost of doing so will rise dramatically over the next decade. While smart card use has not exploded, the proposition can be directly applied to RFIDs. In this regard, in 2006 the proposition remains true. We have already seen remote power analysis of RFIDs, and Rivest expects the trend to continue.

P13: Digital coins will not be used for large-value transactions. The main concern with using digital coins for large-value transactions is replication. The problem is much less severe in micro-payments. Rivest believes the proposition is still true, but only because digital coins just are not being used. He then revised the proposition to state that digital coins will never make it.

P14: Payment schemes with off-line coin transfers between users won't make it. There is really no good business model for digital coins, and by virtue of the analysis of P11, the original proposition remains true.

P15: Micropayment schemes will be the system of choice for purchasing most information over the web. Most information on the web is of low value, but in 2006 we have still not seen the rise of micropayments. This is mainly due to the success and dominance of ad-based systems. Rivest revised the original proposition to state, while ``small payment'' schemes may thrive, true ``micro'' payment schemes may never make it. He further commented that some specialized applications may occur.

P16: General-purpose public key infrastructures (PKI's) are not necessary for financial cryptography--they can (and will) be special-cased. In 2006, there is still no general purpose PKI; the original proposition remains true.

P17: Voting systems and payment systems will be seen as being very close. One can view voting for a candidate as giving a $1 coin to that person. Of course, in voting, anonymity is necessary. The more the analogy is investigated, the more superficial the similarities seem, e.g., selling one's vote has no real counterpart. Therefore, the proposition is false, but it is still useful to think about the differences.

P18: ``Alice's crypt restaurant'' can serve any feasible combination of system requirements at a workable cost (not necessarily cheap). In 2006, this remains true, even more so with the magic of elliptic curves and bilinear maps.

In summary, 1997 was an ``optimistic'' year with too much emphasis on anonymity. Over the years, it has become apparent that most people are more concerned with convenience than privacy. Currently, privacy is secondary to functionality. Therefore, we need to design systems so that privacy has no cost.

Technical Paper Session: Authentication and Fraud Detection
Session Chair: Sven Dietrich

Summarized by Marina Blanton, Purdue University

1) Title: Phoolproof Phishing Prevention
Presenter: Bryan Parno

Phishing is a growing problem. It currently affects millions of people per year and results in large losses to businesses. A typical phishing attack involves a legitimate-looking email from a bank or another institution that redirects the user to a website. The website looks secure and collects information about the user that can later be sold or misused. Current phishing techniques include copying images, DNS tricks, JavaScript attacks, and certificates. Advanced techniques make use of target selection, socially aware attacks, and context-aware attacks.

Security is secondary to users, since all they want is functionality. Thus, they choose bad passwords, are not good at parsing domain names or scrutinizing certificates, and face too many pop-ups. The main goal of this work is to prevent the attacker from viewing or modifying a user's account by limiting what an attacker can do. The work assumes that the user will make mistakes and tries to help her avoid them. This is done by guarding user accounts using mobile devices.

The main idea is that a mobile device creates a public/private key pair and then authenticates to the various servers the user wants to use. The server refuses the connection if the mobile device does not authenticate properly. Every time a user wants to access her account, she has to initiate the connection from her mobile device. Advanced features for improved security are also available. The protocol uses standard SSL, and the private key never leaves the mobile device.

Attacks such as hijacking of account setup, theft of the mobile device, malware on the device, and attacks on the network are also addressed in this work. There is a prototype implementation of the scheme with Nokia Smartphone and Firefox browser.
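The device-side challenge-response described above can be sketched as follows. This is a toy illustration only, not the paper's actual protocol: it uses textbook RSA with insecurely small primes, and all names and parameters are hypothetical.

```python
import hashlib

# Toy textbook-RSA parameters: demo only, far too small to be secure.
P, Q = 1000003, 1000033
N = P * Q
E = 65537
D = pow(E, -1, (P - 1) * (Q - 1))  # device-held private exponent

def device_sign(challenge: bytes) -> int:
    """The phone signs the server's challenge; D never leaves the device."""
    h = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % N
    return pow(h, D, N)

def server_verify(challenge: bytes, sig: int) -> bool:
    """The server checks against the public key registered at account setup."""
    h = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % N
    return pow(sig, E, N) == h

challenge = b"nonce-123"
assert server_verify(challenge, device_sign(challenge))
```

Because a phishing site never learns the device's private key, it cannot produce a valid signature for the real server's challenge, which is the property the scheme relies on.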

2) Title: A Protocol for Secure Public Instant Messaging
Presenter: Mohammad Mannan

In IM, a user has a pre-established password with the server. A user starts by authenticating to the server and communicates with other users by going through the server, with exceptions where communication between the clients is direct.

Why do we need security in IM? IM is used very widely in homes and businesses; the amount of data exchanged over IM may soon surpass that of email. Companies are affected by security-related problems in IM. Existing solutions have drawbacks, and strong password protocols are not designed to fit the framework. The goal of this research is thus to overcome those drawbacks. The design goals include mutual assurance of identity, secure communication, forward security, repudiation, and replay detection.

In the solution given in the paper, Instant Messaging Key Exchange (IMKE), all communication consists of three phases of messages: authentication and key exchange with the server, public key distribution (when a client requests to talk to another client), and session key transport (when a client contacts another client). IMKE features avoidance of off-line dictionary attacks, independence of public key protocols, secure communication without the need to maintain long-term secrets, and other properties. Not all attacks, however, are addressed: in IMKE, keyloggers can collect passwords, a false public key of the server allows off-line dictionary attacks, a malicious server can forward false public keys, and IM worms are also a problem. IMKE was integrated with Jabber; it achieves usable performance and is incrementally deployable. In summary, the scheme is simple, integratable, and practical.

3) Title: Using Automated Banking Certificates to Detect Unauthorized
Financial Transactions
Presenter: Sven Dietrich (substitute)

This work focuses on detecting attacks conducted by authorized insiders in the banking domain. In current systems, insiders can modify transactions. Auditing (through audit files) is not sufficient.

The solution given in this work is called the Transaction Authentication Service (TAS), which utilizes Automated Banking Certificates (ABC). It involves hashing to perform consistency checks between participating banks. The paper re-defines the concept of transactional workflow, and the integrity is achieved by exploiting inter-transactional relations. It gives a new way of constructing audit tools for integrity drift detection.

Panel: Ten Years of Financial Cryptography
Moderator: Moti Yung

Summarized by Bryan Parno, Carnegie Mellon University

Moti Yung prefaced the panel with a brief history of the Financial Cryptography conference, which began in Anguilla in 1997 under the leadership of Bob Hettinga, Vince Cate, and Ray Hirschfeld. The conference attracted cryptographers, security experts, cipherpunks, lawyers and government workers. In the early years, the conference brimmed with optimism over the prospects of electronic money and e-payment systems, but the conference has since expanded its horizons to encompass PKI, voting systems, and fraud, as well as policy and legal issues, and its venues have ranged from the Caymans to Bermuda to Guadeloupe, and finally back to Anguilla. Moti concluded by asking the two panelists, Nicko van Someren from nCipher and Jacques Stern from ENS, to comment on the state of the science behind Financial Cryptography, as well as its relevancy to industry.

Since nCipher provides security solutions for many customers in the financial world, Nicko makes his living as a financial cryptographer. He began his talk by analyzing various changes in the world of financial cryptography over the last 10 years and concluded with his thoughts on future directions. In 1997, the primary business driver was technology for technology's sake backed by a glut of cash, whereas today, two of the primary motivators are regulatory compliance and the need to combat identity theft. The legal landscape has shifted as well. While export controls are no longer the concern they were in 1997, the DMCA and the aftereffects of September 11th have both impinged on security researchers (e.g., anonymity systems tend to encounter a more hostile environment), though the increase in requirements for audit information about financial transactions offers room for additional technical innovation. In the future, Nicko believes that anonymity systems will continue to struggle, since most people simply do not care enough about it. He advocates additional thought on the meaning and authentication of identity, and argues that Digital Rights Management is here to stay, so the community should embrace it, rather than hoping it will all go away. Finally, he suggests that in addition to designing systems that are resilient to node or component failure, we should begin to design systems that are robust against user failure, since the world is full of naive users who do not always behave the same as cryptographers or security experts.

As the representative of academia, Jacques began his talk by considering the science behind Financial Cryptography, asking whether it amounted to merely considering traditional cryptographic topics in the context of financial transactions. Jacques rejected this hypothesis, since it leaves out too many aspects of financial cryptography, including digital identity, defenses against phishing and spam, content protection and electronic voting. As an alternative, he proposed defining financial cryptography as the study of the protection of digital assets and transactions, though he still found this definition mildly unsatisfying. Jacques suggested that the financial cryptography community has produced a number of interesting and innovative ideas, particularly in the areas of e-cash and payments, e-voting, traitor tracing, combating spam with puzzles and applications of homomorphic encryption. However, in considering MONEO, a system in Europe for making small payments, and current e-voting systems, Jacques concluded that ideas from the financial cryptography community have not influenced real-world applications as much as might be desired.

In the discussion that followed, it was suggested that since large office retailers such as Staples and Office Max have been devoting advertising and shelf space to shredders, people may care more about privacy than Nicko believes. Gene Tsudik commented that a paper shredder appeals to a primal urge to destroy that a digital equivalent cannot match. When asked where he first expected to see the use of peer-to-peer electronic cash, Nicko responded that he expects it will only emerge in niche markets.

Technical Paper Session: Privacy
Session Chair: Mike Szydlo

Summarized by Jong Youl Choi, Indiana University

1) Title: Privacy in encrypted content distribution using private broadcast encryption
Presenter: Adam Barth

There are many applications that aim to distribute information using broadcast encryption. Examples can be found in Windows Encrypted File System (EFS), OpenPGP, GNU Privacy Guard (GPG), and MS Outlook. However, none of these applications provide recipient privacy. In other words, an attacker can determine identities authorized to read encrypted content.

One trivial solution to hide recipients in encrypted content distribution is to remove all identity information from the recipient list. However, the cost of this scheme increases linearly with the number of recipients. The paper introduces an efficient solution that requires only a constant number of operations regardless of the number of recipients.

2) Title: A Private Stable Matching Algorithm
Presenter: Philippe Golle

Matching algorithms are in high demand whenever we need to match elements of one set to those of another. For example, we can use matching algorithms for matching men and women for marriage or for assigning medical students to hospital residency programs. Unfortunately, a perfect match is generally impossible. Instead, the stable matching algorithm designed by Gale and Shapley is the most common in practice. In terms of privacy, however, the algorithm does not sufficiently protect participants, as it reveals intermediate information such as preferences and match histories during its operation.

The contribution of this paper is a private stable matching algorithm that reveals no information that would break participant privacy. For this purpose, the proposed scheme uses re-encryption mix networks and the Paillier encryption scheme as privacy building blocks.
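For reference, the classic (non-private) Gale-Shapley algorithm that the paper's protocol protects can be sketched as below; the private version runs this same logic under mix networks and Paillier encryption, which this sketch does not attempt to show.

```python
def gale_shapley(men_prefs, women_prefs):
    """Classic stable matching: men propose in preference order,
    women tentatively accept and may trade up."""
    free = list(men_prefs)                     # men not yet engaged
    next_choice = {m: 0 for m in men_prefs}    # index of next woman to propose to
    engaged = {}                               # woman -> man
    rank = {w: {m: i for i, m in enumerate(p)} for w, p in women_prefs.items()}
    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:  # w prefers m to her current partner
            free.append(engaged[w])
            engaged[w] = m
        else:
            free.append(m)
    return engaged

print(gale_shapley({"A": ["x", "y"], "B": ["y", "x"]},
                   {"x": ["A", "B"], "y": ["B", "A"]}))
# → {'y': 'B', 'x': 'A'}
```

Note how the execution trace itself (who proposed to whom, who was rejected) leaks preference information, which is exactly the leakage the private protocol eliminates.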

3) Title: Private Policy Negotiation
Presenter: Klaus Kursawe

In many electronic transactions, two participants must agree on some level of policy before initiating communications. For example, a service provider needs to publish its privacy policy, and a user needs to agree with that privacy policy to access services. Based on their pre-defined preferences, interactive negotiation is also possible between a server and a user. However, it is hard to prevent preference information of both sides from leaking during negotiation. For example, in P3P (Platform for Privacy Preferences Project), a server provides a list of policies and a client interactively chooses among them. In this scheme, the client's preferences are not revealed, but the server's preferences are not protected. From this point of view, the paper provides a mutually privacy-preserving policy negotiation protocol. The proposed solution is based on two-party computation with homomorphic cryptosystems.

Technical Paper Session: Reputation and Mix-Nets
Session Chair: Yacov Yacobi

Summarized by Marina Blanton, Purdue University
1) Title: Uncheatable Reputation for Distributed Computation Markets
Presenter: Radu Sion

This work addresses building reliable reputation in grid systems. In such systems, many workstations comprising a grid can be used to run a computation, and all together offer a lot of CPU power and storage space. Individual machines fail often, and thus we want to build a reputation system to establish how reliable each server is.

Reputation reporting by dishonest clients can suffer from two problems: bad mouthing and ballot stuffing. In bad mouthing, a good service is provided, but the client gives the service a bad rating. In ballot stuffing, a good rating is given to a service that does not deserve it. Thus, reputation mechanisms should not rely solely on assumptions outside the system and should take the actual performance of the service into consideration.

The setting considered in this work is the following: a client wants to perform a computation and has a number of constraints on its job. A service can evaluate the client's function. In the normal scenario, the client contacts a scheduler and asks for an available server. But in the setting where the clients are not trusted to provide truthful information about the service (by possibly giving incorrect reputation as listed above), we would like to have secure witnessing of the computation performed.

The idea is to invite witnesses that will monitor the interaction between the client and the server. Given a threshold c on the number of dishonest witnesses, the number of witnesses a client uses needs to be 2c+1. The client is free to choose its witnesses (can be the same set for different computations).

The protocol works as follows: the client multicasts its functionality to the chosen witnesses. The witnesses select the service for that functionality, rate the service (computing a new reputation for it), and return the result of the computation to the client. To achieve this, ringers are used as a building block. The ringers provide protection from lazy service providers that, instead of conducting the computation, simply return a random answer. Each witness works separately and creates its own ringers. Then all witnesses engage in a distributed protocol to agree on a new rating of the service. Assuming that the service providers are not malicious but can be lazy, the use of ringers results in better performance than re-checking the results.
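The ringer idea can be sketched as follows. This is a minimal illustration under my own assumptions (the workload `f`, the function names, and the parameters are all hypothetical): a witness precomputes the answer on a few random inputs before outsourcing, and a lazy provider that returns junk is caught when any ringer value is wrong.

```python
import random, hashlib

def f(x):
    """Stand-in for the outsourced computation."""
    return hashlib.sha256(str(x).encode()).hexdigest()

def make_ringers(inputs, k=3):
    """Witness precomputes f on k random 'ringer' inputs before outsourcing."""
    return {x: f(x) for x in random.sample(inputs, k)}

def check_results(results, ringers):
    """Accept only if every ringer position carries the precomputed answer."""
    return all(results.get(x) == y for x, y in ringers.items())

inputs = list(range(100))
ringers = make_ringers(inputs)

honest = {x: f(x) for x in inputs}
lazy = {x: "junk" for x in inputs}
assert check_results(honest, ringers)
assert not check_results(lazy, ringers)
```

The witness only pays for k evaluations of f rather than re-doing the whole job, which is why ringers beat full result-checking against lazy (but not actively malicious) providers.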

2) Title: An Efficient Publicly Verifiable Mix-net for Long Inputs
Presenter: Jun Furukawa

Verifiable mix-nets perform mixing in such a way that the output can be verified. In the scheme given in this work, the output can be publicly verified, mixing has a practical computation cost, and it works for arbitrarily long inputs. None of the previous schemes could simultaneously achieve all of the above. The scheme is provably robust and anonymous. It uses verifiable encryption and zero-knowledge arguments for decryption and permutation as its building blocks. It also provides protection against collusion of a user and the mixer.

3) Title: Auditable Privacy: On Tamper-evident Mix Networks
Presenter: Jong Choi

Mix networks can be described as a sequence of mix servers that make tracing of a message impossible and are used as a basic building block in applications. Each mixer has a random permutation that is kept secret. However, an attacker might compromise a mixer and leak the secret information. Such attacks can leak permutations, private keys, and in general any security-critical information. Public channels are possible, and the attacker could create a covert channel to leak secret information (hidden in the output), so that the leak is not detectable by others.

The key generation phase is safe against covert channels, but the mixing phase is vulnerable. To address this, in this work each mixer is required to produce a commitment at the end of the key generation phase, and then each mixing can be verified using a witness. That is, the mixing phase is now tamper-evident. The solution is based on re-encryption mix-nets and requires each server to perform ElGamal re-encryption and permutation.

Merkle hash trees are used to produce a commitment. Verification is interactive: an observer chooses either 0 (left) or 1 (right), the server opens the corresponding values and hashes of the others, and the observer checks that there is no deviation from the commitment. Security can be improved by lowering the probability of cheating from 1/2 to 1/2^k using k commitments. Also, the interactive verification can be replaced with a non-interactive version, which in addition allows for many stealth observers not known to the server.
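The commit-and-challenge pattern can be sketched with a minimal two-leaf Merkle commitment. This is a simplification under my own assumptions (real trees are deeper, and the committed values and names here are illustrative, not from the paper): the observer's random bit forces the server to open one leaf, and a server that committed to the wrong values survives each round only with probability 1/2.

```python
import hashlib, secrets

H = lambda b: hashlib.sha256(b).digest()

def commit(left: bytes, right: bytes):
    """Two-leaf Merkle commitment: root = H(H(left) || H(right))."""
    h0, h1 = H(left), H(right)
    return H(h0 + h1), (h0, h1)

def respond(leaves, hashes, bit):
    """Server opens the challenged leaf, plus only the sibling's hash."""
    return leaves[bit], hashes[1 - bit]

def verify(root, bit, opened_leaf, sibling_hash):
    """Observer recomputes the root from the opened leaf and sibling hash."""
    h = H(opened_leaf)
    pair = h + sibling_hash if bit == 0 else sibling_hash + h
    return H(pair) == root

leaves = (b"perm-share-0", b"perm-share-1")
root, hashes = commit(*leaves)
for _ in range(10):  # k rounds: undetected cheating survives with prob 2^-k
    b = secrets.randbelow(2)
    leaf, sib = respond(leaves, hashes, b)
    assert verify(root, b, leaf, sib)
```

Repeating the challenge with k independent commitments drives the cheating probability down to 1/2^k, matching the improvement described above.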

Short Technical Paper Session
Session Chair: Gene Tsudik

Summarized by Philippe Golle, PARC

A Practical Implementation of Secure Auctions based on Multiparty Integer Computation
Presenter: Ivan Damgard

Danisco is an international processor of farm products that buys its ingredients from a number of producers of raw materials. A double auction is used to determine the fair market price of commodities. Each seller has a quantity to sell at every price, and likewise each buyer has a quantity to buy at every price. These inputs are submitted to a trusted auctioneer, who draws the corresponding demand and supply curves. Where the demand and supply curves cross, the market clearing price is found.

The job of the auctioneer boils down to adding up inputs, then finding the market price with a binary search. The goal of this paper is to replace the trusted auctioneer with multiple trustees (e.g., Danisco, the farmers' association, and a neutral party). This is done with the use of secret sharing techniques. Buyers and sellers encrypt shares of the quantities they want to buy and sell under the public keys of the trustees. The trustees add up the shares (which is trivial with secret sharing). They must then do a comparison to find the market clearing price (which is public). For the comparison, the paper uses a trick by Cramer, Damgard and Ishai published at TCC 2005. Intuitively, the comparison step is hard because comparison is bit-wise, whereas secret sharing shares a whole number.
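The share-and-add step can be sketched with plain additive secret sharing. This toy (my own assumptions throughout: the data, modulus, and function names are illustrative) reconstructs the aggregate curves in the clear and finds the crossing with a linear scan; the real protocol keeps totals shared and uses the secure comparison trick for that last step.

```python
import random

P = 2**31 - 1  # share arithmetic is done mod a public prime

def share(value, n=3):
    """Split value into n additive shares; their sum mod P reconstructs it."""
    parts = [random.randrange(P) for _ in range(n - 1)]
    return parts + [(value - sum(parts)) % P]

def reconstruct(shares):
    return sum(shares) % P

# toy data: quantities each participant would sell/buy at prices 0..4
sell_bids = [[0, 1, 2, 3, 4], [0, 0, 1, 2, 3]]
buy_bids = [[4, 3, 2, 1, 0], [3, 2, 2, 1, 0]]

def aggregate(bids):
    """Each trustee sums the shares it holds per price; totals are reconstructed."""
    totals = []
    for price in range(len(bids[0])):
        shares_per_bidder = [share(b[price]) for b in bids]
        trustee_sums = [sum(s[t] for s in shares_per_bidder) % P for t in range(3)]
        totals.append(reconstruct(trustee_sums))
    return totals

supply = aggregate(sell_bids)  # [0, 1, 3, 5, 7]
demand = aggregate(buy_bids)   # [7, 5, 4, 2, 0]
clearing = next(p for p in range(5) if supply[p] >= demand[p])
print(clearing)                # → 3
```

No trustee ever sees an individual bid, only random-looking shares, yet the per-price totals come out exactly right because addition commutes with sharing.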

Defeating Malicious Servers in a Blind Signatures Based Voting System
Presenter: Jacques Traore

This talk presents two protocol failures in a voting system called Votopia. Votopia was designed by Korean and Japanese teams. It was used to vote for the most valuable players during the 2002 FIFA world cup.

Votopia is based on blind signatures. In a nutshell, ballots are encrypted, signed and mixed. After the mixing step, ballots with valid signatures are decrypted. If a signature is invalid, Votopia uses a back tracing procedure to detect if the anomaly comes from a cheating mixnet or a cheating voter. If a mix cheated, it is disqualified. If a voter cheated, her ballot is discarded.

This presentation identifies the following failure in Votopia: the first mix can modify the results of the election without being detected, with the help of an accomplice who obtains signatures on ballots. The first mix receives the list of true ballots and replaces the ballots submitted by people whom the mix does not like with ballots prepared by its accomplice. When the final list is decrypted, it contains only valid ballots with valid signatures.

To repair Votopia, the proposed solution is to replace blind signatures on ballots with fair blind signatures. The voting protocol is otherwise similar to Votopia. But now, if a voter asks for a fair blind signature but does not cast a ballot, the anonymity of the signature is revoked and the signature is put on a black list. This prevents the attack against Votopia described above.

The repaired version of Votopia was tried during a student election in France and the EU referendum. In the trials, all the ballots collected were valid (so there was no need for back-tracking).

Pairing-Based Threshold Cryptography Improving on Libert-Quisquater and Baek-Zheng
Presenter: Yvo Desmedt

In an identity-based encryption scheme, the public key of a user is her identity. In Boneh-Franklin, a Trusted Authority (TA) has a secret key that allows it to compute the private keys of users. But trust in the TA is problematic. This presentation shows how to use secret sharing and threshold decryption to build an identity-based cryptosystem in which trust in a single party is not required.

Credit transfer within market-based resource allocation infrastructure
Presenter: Tyler Close

In complex computing infrastructures (e.g., PlanetLab), it is hard to determine efficient allocations of resources among computing tasks. To manage this complexity, one approach is to rely on market-based mechanisms (auctions) to distribute resources: tasks are allocated currency and can bid for resources to finish their runtime. This talk analyzes the requirements for credit transfers in such a system.

Among the requirements identified are an audit trail, network efficiency, and resistance to denial-of-service attacks; more are given in the paper. The conclusion is that credit transfer is about access control, not cash schemes.

A Note on Chosen-Basis Decisional Diffie-Hellman Assumptions
Presenter: Michael Szydlo

This talk gives a cryptanalysis of two complexity assumptions introduced at Financial Crypto 2005 by Pointcheval and Abdalla to prove the security of a three party password authentication scheme. Both assumptions consist of complex interactive games. This talk shows that both assumptions are unsound. The security of the three party password authentication scheme of Pointcheval and Abdalla is an open question: only the security assumptions were cryptanalyzed, not the protocol itself. Two lessons can be drawn for provable security: complicated assumptions are risky, and non-interactive hardness assumptions should be favored.

Cryptanalysis of a partially blind signature scheme or "How to make 100$ bills with 1$ and 2$ ones"
Presenter: Jacques Stern

This talk presents a cryptanalysis of a partially blind signature scheme due to Cao, Lin and Xue. A blind signature allows a signer (say, a bank) to sign a message (say, a coin serial number) without learning it. Blind signatures are used in e-cash systems to ensure that coins are untraceable. A partially blind signature consists of two parts: the first is visible to the signer (e.g., the coin's value or expiration date), while the second is not (e.g., the coin serial number).

In 2005, Cao, Lin and Xue proposed a partially blind signature scheme, and proved that their signatures are untraceable and unforgeable. This talk shows that these proofs are incorrect, and that neither untraceability nor unforgeability holds. The partially blind signature scheme of Cao, Lin and Xue is insecure.

An Efficient Group Signature with Concurrently-Secure Joining
Presenter: I. Teranishi

A group signature scheme consists of a protocol to join the group, a protocol to sign a message, and a protocol to trace a signature. When multiple users want to join a group, most existing group signature schemes require several sequential executions of the join protocol. This talk proposes a new group signature scheme that allows for concurrent joining. The new scheme is based on the Furukawa-Imai group signature scheme, and the talk describes a 3-move concurrently-secure joining protocol for it. The computational cost of signing and of joining for multiple users is smaller than in the original scheme of Furukawa and Imai.

Rump Session
Session Chair: Moti Yung

Summarized by William Enck, Penn State University

This year's Rump Session had two innovations: an anonymous announcement of a talk and an auction of presentation slots held during the session itself!

21:00-21:05 Opening Remarks

21:05-21:15 Key Exchange Using Password and Long Key
Vlad Kolesnikov*, Charles Rackoff

In this talk, the speaker addressed one of the contributions of his TCC'06 paper with the same title. He presented a family of key exchange schemes of Halevi and Krawczyk and described a practical attack on some members of the family. The attack allows an intruder I to test one candidate password of *another* player P with each login *as I*.

21:15-21:20 VietCrypt (an announcement)
Lan Nguyen

More information can be found at

21:20-21:25 Anonymous announcement given by Philippe Golle

To enhance the privacy of the speaker, the announcement remains anonymous.

21:25- 21:30 Short Group Signature with Flexible Join
Toshiaki Makita*, Yoshifumi Manabe, and Tatsuaki Okamoto


21:30-21:35 Split, Scatter and Throw
Burton Rosenberg

We note that the SAN switch is a natural place to perform secret sharing of disk data blocks. By ensuring that the target disks are located in a diversity of locations, so-called Split, Scatter and Throw, we obtain a simple, keyless security solution for backup, resilient to lost data tapes and retired hard drives. A software-only implementation using FUSE on Linux was also introduced.
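The keyless splitting step can be sketched with simple n-of-n XOR shares (a minimal illustration of the idea only; the talk's SAN-switch and FUSE implementation details are not reproduced here). Any n-1 shares are uniformly random and reveal nothing; all n together reconstruct the block.

```python
# n-of-n XOR secret sharing of a data block: keyless, information-theoretic.
import os
from functools import reduce

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(block: bytes, n: int) -> list:
    """Split a block into n shares; all n are needed to reconstruct."""
    shares = [os.urandom(len(block)) for _ in range(n - 1)]
    # Last share = block XOR (all random shares), so the XOR of all n
    # shares collapses back to the original block.
    shares.append(reduce(xor, shares, block))
    return shares

def reconstruct(shares: list) -> bytes:
    return reduce(xor, shares)

block = b"disk block contents"
shares = split(block, 4)
assert reconstruct(shares) == block
```

Scattering the four shares to disks in different locations means the loss or theft of any single tape or drive discloses only uniform random bytes.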

21:35-21:40 CrypTool: e-Learning Tool for Cryptography
Bernhard Esslinger

CrypTool is an open-source project, started eight years ago, that develops a freeware GUI package to make learning cryptography easier.

It contains modern and classical algorithms, protocols, and schemes, many of them visualized. Unique are its exhaustive online help and the extent of its cryptanalysis functions. It is written in C++.

The package is downloaded more than 2000 times a month and used in many companies and organizations worldwide. The next version 1.4.00 will be released in April 2006.

Any help to contribute to this open source project is appreciated.

Bernhard Esslinger is a lecturer at the University of Siegen and a director at DB.

21:40-21:45 Security of UC Blind Signature
Seiji Doi*, Yoshifumi Manabe, and Tatsuaki Okamoto


21:45-21:55 The Roseau-Massacre information disclosure attack (-or- How to track financial cryptographers)
Sven Dietrich

In the digital world, subtle vulnerabilities occur when data structures are not completely cleared. When a subsequent use of that data structure requires less space, bytes from the old data remain. A similar vulnerability exists in the real world. At the 2005 FC, attendees asked the front desk for directions to a BBQ restaurant. After arriving at the destination, they discovered that the reverse side of the paper with the directions contained the hotel charge summaries of other FC attendees.
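The digital version of this remanence bug is easy to demonstrate (a contrived sketch with made-up data): a buffer is reused for a shorter write without being cleared, so trailing bytes of the previous contents survive.

```python
# Data remanence in a reused buffer: the shorter new write does not
# overwrite the tail of the old contents. (Hypothetical data.)
buf = bytearray(b"charge summary: guest 1742, $2,310.55")

new = b"directions to BBQ"
buf[:len(new)] = new        # buffer reused, old tail never cleared

leaked = bytes(buf)
# The start holds the new data, but the old charge summary's tail leaks.
assert leaked.startswith(b"directions to BBQ")
assert b"1742" in leaked    # stale bytes from the previous use remain
```

The fix, digitally as on paper, is to zero (or shred) the whole buffer before reuse, e.g. `buf[:] = b"\x00" * len(buf)` prior to the new write.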

21:55-22:05 Providing cell-phone service to the perfect terrorist
Radu Sion

User identities are strongly tied to cell-phones. Whenever you make a call, providers record the time, location, and destination number. As data services become more prevalent, VoIP can be used to circumvent this logging.

22:05-22:10 A Vulnerability in a Ciphertext Comparison Scheme
Vlad Kolesnikov

The speaker briefly presented some aspects of a ciphertext comparison scheme of Peng, Boyd, Dawson and Lee. There, a set of authorities determines which of two homomorphically encrypted numbers x, y is greater. The idea is to obtain a threshold encryption of the product (x-y) * r_1 * ... * r_n, where the r_i are chosen by the authorities such that the product does not overflow; the (threshold) decryption of the product reveals which number is greater. The speaker's observation is that, with high probability, the prime factorization of the product reveals a lot of information about (x-y), which necessitates a revision of the scheme.
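The observation can be illustrated numerically (toy values and hypothetical parameters, not the scheme's actual ones): since (x-y) divides the decrypted product, an observer who factors the product narrows (x-y) down to a small set of divisors.

```python
# Toy illustration: the decrypted value v = (x-y) * r1 * r2 * r3 leaks
# information about (x-y), because (x-y) must be among v's divisors.
import random

x, y = 37, 12                    # hypothetical plaintexts, so x - y = 25
rs = [random.randrange(1, 1000) for _ in range(3)]   # authorities' blinders
v = (x - y) * rs[0] * rs[1] * rs[2]

# An observer of v learns that (x - y) divides v; enumerating the small
# divisors of v leaves only a few candidates for the difference.
candidates = [d for d in range(1, 100) if v % d == 0]
assert (x - y) in candidates
```

In the intended scheme only the sign of the decrypted value should leak; the divisor structure shown here is the extra leakage the speaker pointed out.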

22:10-22:30 This presentation time was *for auction* for last-minute rump session entries

The winners of the auctions were:
(1) Announcement: WEIS 2006 -- Jean Camp.
More information at WEIS06
(2) Announcement: CANS 2006 -- Yi Mu
More information at
(3) Secure Networking with Mono-Policies: the good, the bad, and the ugly
Yvo Desmedt

In the classical work on a Byzantine adversary in computer networks, which was generalized by Dolev-Dwork-Waarts-Yung to include privacy, the number of corrupted nodes is bounded by a threshold. Since many routers run on similar platforms and attacks can be replicated, the threshold-bounded adversary is unrealistic. An alternative model uses a node coloring to indicate on which platform each node runs. Since the resources of the enemy are assumed bounded, the adversary can only infect all nodes running a threshold number of platforms or fewer. In sharp contrast with the classical case, the security requirements are no longer equivalent to demanding the existence of node-disjoint paths (the ugly part). Unfortunately, the security conditions become co-NP-complete (the bad part). The good news for mono-policies is that when the communication paths are monochrome (all use the same platform), security can easily be achieved. This last result provides the foundation for the colored PKI model proposed in Comm. ACM, August 2004. For more details see, e.g., Section 4 in Proc. ISAAC 2005 (Springer).
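The easy "monochrome path" case can be sketched as a colored-graph search (a toy model with a hypothetical four-node network, not taken from the talk): if sender and receiver are connected by paths in several distinct colors, corrupting all nodes of any one platform still leaves an uncompromised path.

```python
# Toy check for a monochrome path: BFS from s to t allowed to use only
# intermediate nodes of one given color (platform).
from collections import deque

def monochrome_path(graph, color_of, s, t, color):
    """Return True if s can reach t via intermediate nodes of `color` only."""
    seen, queue = {s}, deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            return True
        for v in graph[u]:
            if v not in seen and (v == t or color_of[v] == color):
                seen.add(v)
                queue.append(v)
    return False

# Hypothetical network: s -- a(red) -- t  and  s -- b(blue) -- t.
graph = {"s": ["a", "b"], "a": ["s", "t"], "b": ["s", "t"], "t": ["a", "b"]}
colors = {"a": "red", "b": "blue"}

# One monochrome path per color: compromising every "red" node still
# leaves the all-"blue" path intact, and vice versa.
assert monochrome_path(graph, colors, "s", "t", "red")
assert monochrome_path(graph, colors, "s", "t", "blue")
```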

(4) The ARIANE attack on RSA - Oops, sorry, I mean RSI!
Ariane Dekking (physiotherapist, specialized in RSI)

Newsflash!... As some of you may know, Alice and Bob are secretly in love with each other. They use encryption to keep their e-mail relationship hidden from the outside world. After too many hours of pleasure, sitting uncomfortably behind the computer, they started to have physical complaints.

This is what many people will recognize as Repetitive Strain Injury (RSI). RSI literally means complaints of overstrain caused by repetitive movements, which can lead to a range of different symptoms in the arms, neck, and shoulders.

To attack RSI you can try to use the ARIANE approach: Anguilla Reverse Independently Adaptive Nerve Enhancer! (Only the FC'06 rump session attendees in Anguilla might appreciate the full extent of this approach...)

What might be a better solution though, is to go to and take a look at the information for treating RSI. Or write me an e-mail at

(5) Lightweight Email Signatures
Ron Rivest
(joint work with Ben Adida, David Chau, and Susan Hohenberger)

We present Lightweight Email Signatures (LES), a simple cryptographic architecture for authenticating email. LES is an extension of DKIM, the recent IETF effort to standardize domain-based email signatures. LES shares DKIM's ease of deployment: they both use the DNS to distribute a single public key for each domain. Importantly, LES supports common uses of email that DKIM jeopardizes: multiple email personalities, firewalled ISPs, incoming-only email forwarding services, and other common uses that often require sending email via a third-party SMTP server. In addition, LES does not require DKIM's implied intra-domain mechanism for authenticating users when they send email.

LES provides these features using identity-based signatures. Each domain authority generates a master keypair, publishes the public component in the DNS, and stores the private component securely. Using this private component, the authority delivers to each of its users, via email, an individual secret key whose identity string corresponds to the user's email address. A sender then signs messages using this individual secret key. A recipient verifies such a signature by querying the appropriate master public key from the DNS, computing the sender's public key, and verifying the signature accordingly. As an added bonus, the widespread availability of user-level public keys enables deniable authentication, such as ring signatures. Thus, LES provides email authentication with optional repudiability.

We built a LES prototype to determine its practicality. Basic user tests show that the system is relatively easy to use, and that cryptographic performance, even when using deniable authentication, is well within acceptable range.


Technical Paper Session: Conditional Financial Cryptography
Session Chair: Ray Hirschfeld

Summarized by Ivan Osipkov, University of Minnesota

1) Title: Construction for Token-Controlled PKE
Presenter: David Galindo

Given a user's public key and a token, we want to encrypt a message such that it can be decrypted only when the token is released. In addition, we want to ensure that the ciphertext will not decrypt under another token. The token is given to a designated trusted party, which releases it only when certain conditions are met. One of the goals is to ensure that the same token can be used for multiple users without compromising security, and that the token size is small. More precisely, knowing the token does not allow one to learn anything about the message unless the corresponding secret key is known, and without the token (even if the secret key is known) one learns nothing about the encrypted message. The authors introduce the notion of strong existential unforgeability, which ensures that both the token originator and the ciphertext receivers will be able to detect whether the TTP provides an incorrect token. Previous approaches allowed the TTP to modify the token so that the ciphertext would still decrypt, but to a possibly different plaintext. The authors propose a scheme based on the Fujisaki-Okamoto transformation and TPOW functions. In addition, when RSA is used for the TPOW function, the security reductions are tight.

2) Title: Authenticated Key-Insulated Public Key Encryption and Timed-Release Cryptography
Presenter: Ivan Osipkov

The authors formalize the notion of secure Timed-Release PKE (TR-PKE) using an offline agent that periodically publishes tokens. Anyone can encrypt a message such that only the designated receiver will be able to decrypt it, and only when the token for the appropriate day is published. The attacks considered include breaking confidentiality by a third party that compromises the agent, and timed-release confidentiality, in which the receiver adaptively tries to choose the time and his public key in order to decrypt ahead of time. The authors show that the existence of secure TR-PKE is equivalent to that of SKIE-OT with random-access key updates; together with related work, this implies that the existence of TR-PKE is equivalent to the existence of IBE. They show how to construct secure TR-PKE from IBE and the multiple-encryption technique of Dodis-Katz, and then provide a construction based on an adapted Boneh-Franklin (BF) IBE. The authors extend this construction to include sender authentication, such that a third party (even having compromised the timed-release agent) cannot forge a new ciphertext; moreover, an adaptive receiver cannot forge a ciphertext to itself from a given sender for a future time. In addition, confidentiality against third parties and timed-release confidentiality hold even if the sender's secret key is compromised. The resulting scheme, TR-PKAE, is almost as efficient as BF IBE itself. It can be extended to use multiple independent agents via Pedersen's threshold scheme, and to the HIBE of Boneh et al. to reduce the repository requirements for tokens.

3) Title: Conditional Encrypted Mapping and Comparing Encrypted Numbers
Presenter: Vlad Kolesnikov

The general problem is how to use a semi-trusted third party to compute the "Greater Than" predicate on encrypted numbers such that 1) the third party does not need to know the plaintext values of the inputs and knows only the public encryption key, and 2) the third party does not learn anything about the meaning of the output of the computation, which can be extracted using the corresponding secret key. The resulting scheme is called GT-CEM (Conditional Encrypted Mapping) and requires correctness and statistical privacy (i.e., statistically, one cannot learn more from the third party's output than the actual result of the computation). The proposed construction uses an additively homomorphic public-key encryption scheme such as Paillier's, and encryption is computed bit-by-bit. The construction proceeds in two steps. First, a randomized mapping, (-1,1)-RM, is constructed; this is a variation of GT-CEM except that it takes as input one encrypted number instead of two, and it returns a) the corresponding encrypted secret value if the plaintext of the encrypted input is -1 or 1, or b) a random answer if the plaintext is 0 or random (a random answer is needed for 0 because GT-CEM works on the difference vector). Second, to construct GT-CEM, the encryption of the difference vector of the inputs is computed, to which, after randomization, (-1,1)-RM is applied bit-by-bit, ensuring that only the important bit is non-random and can be isolated. The key idea is to come up with a linear function f such that from f(-1) and f(1) one can reconstruct the corresponding plaintext secret, while at the same time keeping f(0) random. The authors discuss how one can construct a more general variation of (-1,1)-RM and how a more general predicate can be used. It is shown how GT-CEM can be used for PSPP auctions and for proxy selling with a secret reserve price.
Finally, the authors compare the costs of the proposed protocols with related work, showing that theirs outperforms it in terms of communication and modular multiplications, partially because related work transfers a one-bit secret per execution, while this work aggregates them up to a security-dependent constant.

4) Title: Revisiting Oblivious Signature-Based Envelopes
Presenter: Gene Tsudik

Suppose a third party generates signing and verification keys, distributing the public parameters. The authorized receiver obtains a (confidential) signature on some public message M. The sender has a secret plaintext P. The sender and receiver interact such that the sender transfers P confidentially, without learning whether the receiver can recover it, and the receiver cannot learn anything about P unless he has the appropriate signature. Such a scheme, called OSBE, must have soundness, obliviousness (the sender does not know whether the receiver has the signature), and semantic security against the receiver (without the right signature, the receiver learns nothing about P). The author's approach uses a key agreement protocol without authentication, with both parties ending up with the same shared key if and only if the receiver has the proper signature; this key is then used to encrypt the message P. Several instantiations were discussed, secure under standard cryptographic assumptions; of these, Schnorr-OSBE is the most efficient. One property considered is (perfect) forward secrecy: the adversary should not be able to recover a secret shared during an interaction even if it knows the master secret (the signing key) of the third party. RSA-OSBE provides PFS, and the authors show how to provide it in the remaining protocols. Another interesting property introduced is impostor obliviousness: here, the sender runs the OSBE protocol repeatedly with the same receiver and should not be able to tell whether the receiver is an impostor (in the case of simple obliviousness, only a single run is performed). RSA-OSBE is impostor-oblivious, while the remaining schemes are not. Finally, the authors note that OSBE is an asymmetric version of secret handshakes and propose how to combine two runs of OSBE to obtain a secret handshake scheme.

Technical Paper Session: Payment Systems
Session Chair: Jean Camp

Summarized by Vlad Kolesnikov, University of Toronto

1) Provably Secure Electronic Cash based on Blind Multisignature Schemes.
By Yoshikazu Hanatani (The University of Electro-Communications), Yuichi Komano (Toshiba Corporation), Kazuo Ohta (The University of Electro-Communications), and Noboru Kunihiro (The University of Electro-Communications)

The main subject of the talk and the paper was electronic cash. The speaker discussed the standard model of untraceable electronic cash, where the players are the bank, the user, and the shop. The goal is to enable offline payments satisfying the standard requirements: untraceability and double-spending detection.

The speaker noted that known schemes are vulnerable to insider attacks, i.e., to attacks by rogue employees of banks. He proposed a solution in principle: e-cash with multiple banks, where the issuing process is distributed among several banks. For this task, a new primitive is needed: blind multisignatures.

The speaker defined the model and gave definitions of security for this primitive. In particular, he required resistance to adaptive chosen-message attacks on blind multisignatures.

He then presented protocols for coin issuing and tracing of double-spenders. He also mentioned proofs of the corresponding security results.

2) Efficient Provably Secure Restrictive Partially Blind Signatures from Bilinear Pairings.
By Xiaofeng Chen and Fangguo Zhang (Sun Yat-sen University, China) and Yi Mu and Willy Susilo (University of Wollongong, Australia)

The main contribution of the paper is a restrictive blind signature scheme and a restrictive partially blind signature scheme, both constructed from bilinear pairings. The schemes are short and efficient.

The speaker first discussed preliminaries. As a background, he discussed blind signatures, partially blind signatures, restrictive blind signatures, and restrictive partially blind signatures.

He then presented the building blocks of the schemes: a variant of the GDH blind signature scheme, a variant of the BLS short signature scheme, and a blind variant of the BLS short signature scheme. The speaker then discussed the paper's constructions and their applications to electronic cash, and summarized the contribution.

3) Privacy-Protecting Coupon System Revisited
Presenter: Lan Nguyen

The paper addresses privacy issues in coupon systems. The author began with previous work on Privacy-Protecting Coupons (PPC). The main point of comparison is the coupon system presented at FC 2005 (CESSS05). In that system there is a vendor and many users; the vendor gives m-use coupons to users, and we want redeeming to be anonymous and unlinkable, and coupons to be unshareable. The speaker also mentioned related work: TFS04, NS05, Chaum, Brands, Ferguson, OO, Chen, Verheul, PV. None of these quite achieves what is done in the paper.

The author then discusses building blocks and assumptions. His construction relies on bilinear pairings and related assumptions, using the Boneh-Boyen signature scheme, the Camenisch-Lysyanskaya signature scheme, and the Dodis-Yampolskiy verifiable random function as building blocks.

The contribution of the paper is in several directions. This work allows coupon revocation, which was not available previously. The author also formalizes the model and proposes a new scheme. The scheme has a constant cost of redeeming coupons (vs linear in m in CESSS05). This constitutes a tradeoff, since the initialization of this scheme is linear in m, vs constant in CESSS05.

Panel: Identity Management
Moderator: Frank Trotter

Notes by Jean Camp, Indiana University

Summarized by William Enck, Penn State University

Who are you anyway?
Can an identity be secured, managed, and verified?
Can federated IDs work?
Can corporations cooperate?

Is there an answer, or many? There are proposals, but what are the critical elements of a financially viable and privacy-preserving solution?

Jon Callas, CTO PGP:
Four things are called ID management:

Mike Froomkin
If you run an ID system, then you must engage in legal compliance: EU, California disclosure. What happens when the cops show up? Design must be sensitive to local compliance requirements.

Most discussions one hears are about "home brew". People are talking about building systems. How much value is there of government issued IDs? Reliance on government IDs will come with possible conditions. Privacy rules are coming. As we drift towards national ID, we should have privacy conditions that come with the use.

Bob Hettinga

The only interesting thing about crypto for most people is to get money back when they send code. That is why people got into cypherpunks around '95.

Cryptography used in finance is FC. So, my definition is very broad. It is basically anything that is not crypto used for and against nation states.

Does what you are doing reduce risk-adjusted transaction costs? There exist token and notational money, which Bob calls bearer and book-entry transactions, respectively. With book-entry, you send someone to jail if they lie about the books.

I use this American Airlines credit card. AA wants to bribe you with an award to keep you from just flying cheap. That's why they use it.

They are transfer prices to fill the seat. Why not just fly cheaper. Airlines are built to fly cheaper. The marketing guys say that. In finance, you can get that from the market, the price.

They give you the card to do price discrimination. The opposite of privacy is price discrimination. Odlyzko has the best paper on that.

I have another identity with a credit card under another name where I don't have my name. It was worth it to them to provide it. Identity can be brokered.

We are an online bank. How can we reduce the cost of identity management? We look up to see whether the address matches, so we know they are responsible. We have been talking about this for 10-15 years.

Frank was the first person who put blind signatures on the net. They don't do it now. Why?

We stopped because it was not financially viable.

They got a lot of the porn market. Once a user got an account, they could be identified. Mark Twain identified a pornographer, and Mark Twain got upset at the concentration of pornographers.

There were not enough people. I know immediately that they have an account.

Vinnie did a paper in 97-98 on what were really attribute certificates. It might be time to re-publish.

I will pay someone to validate transactions.

We need to do that for book-entry. Private law is created by business all the time.

You have to know depositors under know-your-customer requirements.

Support costs are my killers. I have very few customers who eat up most of my support costs. Identifying them by type would be great. Knowing how to identify them and offer differential services is critical.

He wants people not to walk away from their history. If your trouble customer is always Chris Marlow and your good customer is always Jon, then you need a strong binding.


The only thing PGP has done differently is that it creates a market for CAs. Certifying someone doesn't mean that you like them. We had an idea that you can dis-identify someone: negative certification.

Does EverBank use digital signatures?

You have a digital signature that shows you have stolen password/ID. We installed PGP Universal last week; in the future, it will be required for all users. Not unlike a fax or a letter, a digitally signed mechanism is a valid signature.

I clipped a GIF into a digitally signed document; the insecure GIF of the pen signature was considered valid, not the secure signature.

A signature is an act, not a thing. One person always sent copies of receipts, not real ones, and had to change. But they allowed him to fax the receipts, not to Xerox them.

The fax is wire fraud if not real. But the Xerox is not.

If you had software that you trust, would you let the Xerox be real?

ECash and certs did not catch on

What is the fraud overhead for distinct mechanisms?

Credit card fraud is highest, Fedwire is essentially zero risk, and checks vary widely. The real ID management is more mailbox than Internet.

The last time a Fedwire financial transfer failed was when a bank failed, in '86.

I have a colleague who woke up and found $2.6M in his account that had been misrouted. It sat there for ten days; the FBI thought it was too small to trouble with, and he did not dare touch it. Then it was moved out.

I have gotten 10-15 calls of which 2 were genuine fraud. But you have to be careful, because you can refuse valid customers.

You concentrate risk when you concentrate identity

Are there low-forge passport IDs?

On 9/11 they weren't

That is the difference between authentication and authorization. On 9/11, the hijackers were identified and authenticated as having tickets.

Is there an acceptable level of digital due diligence? There is a corollary in the digital world.

If you spend money verifying identities, would national IDs be cheaper?

I would be cynical for a long time about the value of national IDs because you could steal them.

Merely identifying someone is useless. It doesn't identify what they do.

Part of the cost in book-entry settlement is this concentration of risk.

There are people working on it.

MF: The amount of binding is much higher than it was 20 years ago.

A friend of mine, an anti-CAPPS lobbyist, tried to have a kid. They decided to pay to fly to Brazil and have the kid there, documenting him at birth as a citizen of two countries without extradition treaties.

Brands's company presentation was Chaum's presentation with identity substituted for money.

Micropayments were going to be a killer app. Why didn't they happen?

Larger institutions will have to force change. It won't be bottom-up. I am talking about securities transactions.

What about the use of identity credentials in environments they were not designed for? Please comment.

Most notorious is the SSN. It was set up not to be an ID, so it is a bad ID. We should tame rather than kill national ID cards.

SSN is a bad microfederated identity

There are stories about airlines whose security is bad because it's not about security; it is about preventing secondary arbitrage of airline tickets.

I worked at Morgan Stanley as a kid -- companies that issued interest on bearer bonds couldn't take it off their taxes. Bearer bonds went away because they were too expensive.

So do you believe that national IDs do not increase security?

National IDs exist. You have to design them in such a way that ID theft is reduced but detectable. If a national ID is too good, ID theft is not believed; if it is too weak, ID theft is rampant.

There is a law in Arizona that when you register to vote, the state must demand and keep copies of all materials. What do you think? What's the point?

In Miami, they would use the database to look for absentee fraud. Then, in order to increase fraud (they called it "reducing complexity"), they removed the counter-signature. This reduced prosecutions. I'll look into Arizona.

When will we have PKI?

Eight years ago.

What has failed is "the PKI." What is happening is that lots of PKIs are springing up. BMW requires PGP signatures. Skype has a PKI -- a simple, elegant little PKI. The government solved everything with bridge CAs, e.g., trusted introducers. The PKIs are specific, not one monolithic PKI.

Verisign killed it.

We might see "distinguished names." You will see the same with virtual or federal IDs.

You talked about Real ID cards and passports. What's the difference?

Real ID requires driver's licenses to function as national IDs by 2007. It won't happen by 2007, but it will happen. It is being sold on terrorism; fraud and illegal immigration are the real drivers.

There is ID fraud

There are civil liberty costs.

Where is the demand?

It is being pushed down. If they could reduce liability for me, I would support it.

Three main sources. The first is people who are anti-immigrant. Second, the people who believe it will stop terrorism. Third, the people who want to sell it.

There is a "cult of identity": the belief that if I know your true name, I know your attributes.

We provide identity to get things done. This room may feel uncomfortable with that but identity is a simplifier.

Gilmore is unable to fly domestically. There are things growing. We'll get to it next year. A cashed check is a CA authentication.

We went through immense effort to create digital signatures. We loved them to death.

There was a European Directive on E-money. We had regulators who did _not_ understand e-money, banking, or innovation. E-gold money is not in Europe in regulatory terms.

The fascinating thing is that with regulation, the problem used to be capture and the revolving door. Now it is primarily ideological constraints, based on level-playing-field arguments. Incumbents make level-playing-field arguments, and innovation is killed.

What if they had everything? What are the real civil liberties arguments?

Risks of national ID:

Invited Talk: Michael Froomkin
Title: Are We All Cypherpunks Yet?

Summarized by Mohammad Mannan, Carleton University

Michael Froomkin's talk primarily focused on comparing the environment and status of financial cryptography, security, and privacy in the real and cyber worlds over the last ten years. In 1997, when Financial Cryptography (FC) started, there was a "change the world" attitude in the security community. PKI, digital signatures, and digital certificates seemed on the verge of becoming big; e-cash was just around the corner. The U.S. government attempted to control communication privacy in an "encrypted" world by influencing industry to implement the Clipper Chip, Skipjack, etc. Export controls were also in place in the U.S. in 1997 -- another attempt to keep crypto tools out of the mass market.

Froomkin argued that there have been no "big" changes in the last ten years. The crypto community has won a few battles but no wars. A "corporate" feeling has replaced the "Woodstock" feeling in the community. Many cypherpunks of 1997 are now vice-presidents or CTOs of successful companies. The hype around PKI and distinguished names is long gone. SSL is viewed as a step toward secure e-commerce, but Froomkin also mentioned that SSL certificates are far beyond users' understanding, and numerous SSL warnings simply train users to ignore those messages. Consumers have yet to use digital certificates. Nevertheless, significant progress has been made on legal issues for digital signatures. The dream of e-cash is still unrealized; Mondex is a primary example of e-cash failure. Froomkin dismissed the characterization of cellphone payment systems as e-cash, categorizing them instead as a "smart" use of cellphones.

In the realm of communications privacy, many things are unchanged compared to 1997. Very few entities other than financial organizations use crypto tools to protect user privacy. Tor and onion routing are good in theory but have had limited practical impact. Web-based services, e.g., Gmail and Google search history, have made online privacy much worse than in 1997. Things are pretty bad legally, too. Laws such as the U.S. Patriot Act and the Foreign Intelligence Surveillance Act (FISA) have threatened civil liberties more than anything since World War II. Guantanamo Bay, Afghanistan, CIA prisons, and Abu Ghraib have made privacy nothing but a "rounding error." The NSA is apparently spying on everyone, at home and abroad, though the details are unclear. "Lawfare" is defined as the pursuit of strategic aims through legal maneuvers; use of the legal system is discouraged in the light of national security. There was a little debate on how different the situation would be under a different U.S. administration, and on why people's reaction to the Iraq war is not the same as to the Vietnam War.

Although the export issues around DES and other crypto have been resolved, mass-market use of crypto is yet to be realized. Crypto tools are not used in Windows (at least not with an easy user interface), P2P, or VoIP. However, Froomkin noted one interesting new development in the P2P domain: to fight ISPs' attempts to reduce BitTorrent traffic ("traffic shaping"), RC4 encryption was recently introduced in the BitComet client.

Froomkin predicted that the crypto wars are coming back, with communications privacy a major battleground. Governments want access to electronic communications as well as telephone conversations, and VoIP traffic has been targeted. The EU data retention directive requires providers to keep call setup information for two years, including details of unanswered calls. Governments are collecting information from several checkpoints, e.g., banks, ISPs, telephone companies, and Internet cafes; money flows also reveal a lot of information. Is FC relevant to current privacy issues? Froomkin stated that much financial cryptography has useful non-financial aspects too; in that sense, FC is becoming more relevant.

Technical Paper Session: Efficient Protocols
Session Chair: Tatsuaki Okamoto

Summarized by Lan Nguyen, Commonwealth Scientific and Industrial Research Organization

1) Efficient Broadcast Encryption Scheme with Log-Key Storage
Presenter: Yong Ho Hwang

This presentation by Yong is on broadcast encryption, where a centre sends encrypted messages to a group of legitimate users. Receivers should be stateless, i.e., a device decrypts the current transmission using only its initial configuration. Applications include Pay-TV, multicast communication, and DRM.

A generic model for broadcast encryption is provided, comprising three algorithms: Set-up, Broadcast, and Decryption. The important measures for broadcast encryption are transmission cost, key storage, and computation cost. The presentation then discussed related work. Communication-efficient broadcast encryption schemes include the SD scheme and the Pi scheme, but they require too much memory. Broadcast encryption schemes with a log-key restriction, where a receiver can store at most O(log n) secret keys, include the Complete Subtree scheme, the Stratification Subset Difference (SSD) scheme, and the B1 scheme. The contribution of the presented paper is a broadcast encryption scheme under the log-key restriction with more efficient transmission. Yong presented the basic scheme, without a tree, for 16 users, and then the complete scheme with a tree, based on the SSD and B1 schemes and a new key assignment technique that travels between stratifications. The complete scheme was demonstrated for 64 users. A lemma shows that the key assignment satisfies the key indistinguishability property. Among schemes under the O(log n) key storage restriction, the complete scheme achieves the most efficient transmission cost with reasonable computation cost, and it is well suited to large groups of users. A question asked what k represents: the number of stratifications.
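To make the log-key idea concrete, here is a toy sketch of the Complete Subtree scheme mentioned above (not the paper's construction): users are leaves of a binary tree, each user stores the log n + 1 keys on its leaf-to-root path, and a broadcast encrypts the session key under a "cover" of subtree roots that excludes the revoked leaves. The heap indexing, the hash-based key derivation, and the XOR "cipher" are my illustrative assumptions.

```python
import hashlib

def node_key(master, node):
    # Derive a per-node key from a master secret (toy KDF, for illustration).
    return hashlib.sha256(master + node.to_bytes(4, "big")).digest()

class CompleteSubtree:
    """Toy Complete Subtree broadcast encryption for n = 2^k users.
    Tree nodes are heap-indexed 1..2n-1; user u is leaf n + u."""

    def __init__(self, n, master=b"demo-master-secret"):
        self.n, self.master = n, master

    def user_keys(self, u):
        # Each user stores the keys on its leaf-to-root path: log2(n) + 1 keys.
        keys, v = {}, self.n + u
        while v >= 1:
            keys[v] = node_key(self.master, v)
            v //= 2
        return keys

    def cover(self, revoked):
        # Mark all nodes on paths from revoked leaves up to the root.
        marked = set()
        for u in revoked:
            v = self.n + u
            while v >= 1:
                marked.add(v)
                v //= 2
        if not marked:
            return [1]  # no revocations: encrypt once under the root key
        # Cover = unmarked nodes whose parent is marked.
        return [v for v in range(2, 2 * self.n)
                if v not in marked and v // 2 in marked]

    def broadcast(self, session_key, revoked):
        # Encrypt the session key under every cover node's key (toy XOR cipher).
        return [(v, bytes(a ^ b for a, b in
                          zip(session_key, node_key(self.master, v))))
                for v in self.cover(revoked)]

    @staticmethod
    def decrypt(user_keys, bcast):
        for v, ct in bcast:
            if v in user_keys:  # a cover node on this user's path
                return bytes(a ^ b for a, b in zip(ct, user_keys[v]))
        return None  # revoked: no cover node lies on this user's path
```

The transmission cost is the cover size, which is what the paper's scheme improves on while keeping the O(log n) key storage.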

2) Efficient Correlated Action Selection
Presenter: Marina Blanton

This presentation by Marina is about a game-theoretic problem in two-player strategic games, where each player has a set of possible moves and the players move simultaneously. A payoff function is computed from each pair of simultaneous moves, and each player tries to maximize its expected payoff by choosing suitable moves. Game theory has shown that correlation between the players' actions can result in higher payoffs for both. This can be achieved by a trusted third-party mediator who performs the action selection according to a probability distribution and secretly tells each player its designated move. Marina gave the example of two competing stores, each of which must decide in advance every week whether to run a sale: it is acceptable if at most one store runs a sale, and unacceptable otherwise. The problem is that the players' privacy could be weakened while this coordination takes place; this paper uses cryptographic methods to address the problem.

The problem can be described as a game in which two parties want to collaborate using a joint strategy described by a list of m pairs. Each pair of actions is chosen randomly according to a probability distribution, and each player learns only its own move, without learning anything else. Some schemes have been proposed to address this problem: the DHR scheme (Crypto '00) eliminated the third-party mediator but is efficient only for a uniform distribution, and the Teague scheme (FC '04) handles non-uniform distributions but is still worst-case exponential in the representation of the joint strategy. This paper proposes the most efficient scheme to date. The m action pairs are denoted {(a_i, b_i)}, i = 1, ..., m. Each pair (a_i, b_i) is chosen with probability p_i/2^l, where p_i is an l-bit integer, so p_1 + ... + p_m = 2^l. The two players jointly compute the prefix sums P_i = p_1 + ... + p_i for i = 1, ..., m. To obtain (a_i, b_i), they generate a random l-bit integer r and find the index i such that r lies in [P_{i-1}, P_i), an event of probability p_i/2^l. This can be done with a semi-honest protocol using a semantically secure homomorphic encryption scheme such as Paillier. The authors also propose an efficient solution against dishonest behaviour, using threshold ElGamal encryption and a two-party computation technique (ST, Asiacrypt '04). Marina showed that the new proposal is more efficient in the worst case than the two prior works. A question asked how to create the table of collaborative incentives; that is a separate matter that depends on the situation, and one for statisticians rather than a computer science issue.
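The prefix-sum selection step can be sketched in the clear as follows; in the actual protocol the prefix sums, r, and the index lookup are all handled jointly under homomorphic encryption so that neither party sees r or i, but the sampling logic is the same. The function names are my own.

```python
import secrets
from bisect import bisect_right
from itertools import accumulate

def index_for(prefix, r):
    # Return the 0-based index i such that r falls in [P_{i-1}, P_i),
    # with prefix = [P_1, ..., P_m] and P_0 = 0.
    return bisect_right(prefix, r)

def select_action_pair(pairs, probs, l):
    """Pick one pair (a_i, b_i) with probability p_i / 2^l, where the
    p_i are l-bit integers summing to 2^l."""
    assert sum(probs) == 1 << l             # p_1 + ... + p_m = 2^l
    prefix = list(accumulate(probs))        # P_i = p_1 + ... + p_i
    r = secrets.randbelow(1 << l)           # random l-bit integer
    return pairs[index_for(prefix, r)]

# The two-store example: at most one store runs a sale in a given week.
pairs = [("sale", "no sale"), ("no sale", "sale"), ("no sale", "no sale")]
choice = select_action_pair(pairs, [2, 1, 1], 2)  # probabilities 1/2, 1/4, 1/4
```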

3) Efficient Cryptographic Protocols Realizing E-Markets with Price Discrimination
Presenter: Moti Yung

This presentation by Moti is about using cryptographic techniques to provide electronic markets with price discrimination. The scenario: a seller advertises a product with a certain desired revenue, and different buyers are willing to pay different prices for it. The seller's objective is to maximize revenue, or at least obtain the desired revenue. If the seller sets the same price for everyone, some buyers with lower willingness to pay do not buy the product. With price discrimination, buyers pay for the product according to their willingness, revenue increases, and everyone is happy. However, price discrimination is difficult to apply in practice due to consumer discomfort. This paper proposes efficient cryptographic protocols under which a sale is feasible if the sum of the buyers' willingness-to-pay prices exceeds the desired revenue, even if the seller allows discounted prices. It is important that price discrimination occurs but nobody learns the bids. Cryptographic solutions have been proposed before, but they are inefficient or not robust.

The ideal solution should protect buyer privacy and provide auditability and robustness against malicious parties. The proposed solution uses the Paillier threshold homomorphic encryption scheme on the encrypted bid values, together with proofs of knowledge of range bounds. The homomorphically encrypted values are processed, and the authorities finally decrypt only the result. The scheme can be modified to deal with surplus, where a specific proportion of the slack can be allocated to support the seller.
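The additively homomorphic property that makes this work can be shown with a minimal textbook (non-threshold) Paillier sketch; the paper's scheme uses the threshold variant plus range proofs, both omitted here, and the toy primes and bid values below are my own assumptions.

```python
import math
import random

# Toy Paillier parameters (real deployments use large primes).
p, q = 2357, 2551
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
g = n + 1                                   # standard choice g = n + 1

def encrypt(m):
    # Enc(m) = g^m * r^n mod n^2 for random r coprime to n.
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # Dec(c) = L(c^lam mod n^2) * lam^{-1} mod n, with L(x) = (x - 1) / n.
    L = (pow(c, lam, n2) - 1) // n
    return (L * pow(lam, -1, n)) % n

# Multiplying ciphertexts adds plaintexts, so the authorities can check
# the sum of bids against the desired revenue without seeing any bid:
bids = [30, 50, 40]                         # buyers' willingness-to-pay
total = decrypt(math.prod(encrypt(b) for b in bids) % n2)
desired_revenue = 100
feasible = total >= desired_revenue         # only the sum is revealed
```

In the threshold variant, no single authority holds lam; a quorum must cooperate to perform the final decryption, which is what gives the robustness mentioned above.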

Closing session: The end

The conference adjourned after final remarks by Avi Rubin, Giovanni di Crescenzo, and Ray Hirschfeld.

See you next year!