Review of the Fourteenth International Conference on Financial Cryptography and Data Security
Tenerife, Canary Islands, Spain
January 25-28, 2010
Review by Vaibhav Garg and Debin Liu
Disclaimer: The following review of the conference is limited by the authors' ability to comprehend the various talks. If we have made a mistake or quoted someone incorrectly, we apologize. We do not intend to cause offense.
The Fourteenth International Conference on Financial Cryptography and Data Security was held from January 25-28th in Tenerife, Canary Islands, Spain. The conference was supplemented by three workshops: the First Workshop on Real-Life Cryptographic Protocols and Standardization (RLCPS'10), the Workshop on Ethics in Computer Security Research (WECSR 2010), and the First International Workshop on Lightweight Cryptography for Resource-Constrained Devices (WLC'2010). There is a separate write-up for WECSR 2010.
Session Chair: Moti Yung, Google and Columbia University
Invited Talk: Ueli Maurer, ETH Zurich: Constructive Cryptography - A primer
He talked about the need for a stepwise development approach in cryptographic research. He introduced constructive cryptography, which modularizes the research process in order to adequately assess what cryptography can and cannot achieve.
Session 1: Session Chair - Ahmad-Reza Sadeghi
Paper 1: The Phish Market Protocol: Securely Sharing Attack Data Between Competitors
Authors: Tal Moran, Harvard University, Tyler Moore, Harvard University
This paper was presented by Tyler Moore. Banks hire "take-down" companies to patrol the Internet for phishing sites. Take-down companies compete for clients, and this kind of data is considered a competitive advantage. However, there might be significant gaps between two companies' data sets. It might cost up to $330,000,000 when phishing data is not shared, which it currently is not. The paper presents a protocol to share data about phishing sites so as not to provide an unfair advantage to any of the participants.
Solution: Set up a phish market to share phishing information. The Buyer learns only URLs that phish its client banks. The Seller cannot learn who the Buyer's clients are. The Buyer must pay for each new URL learned, but does not pay for URLs it already knows. Buyers pay with encrypted coins and reveal only total payments to the Seller.
Transactions: Seller offers URL; Buyer sends encrypted payment; Buyer proves payment is good; Buyer proves he knew some URLs.
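The goal - the Buyer pays only for URLs it did not already know - can be illustrated without the cryptography as a toy set-difference over hashed URLs (`settle` and the hashing stand-in are hypothetical; the real protocol additionally hides the Buyer's client list and per-URL payments from the Seller):

```python
import hashlib

def h(url: str) -> str:
    """Hash URLs so raw strings need not be exchanged (toy stand-in for the real crypto)."""
    return hashlib.sha256(url.encode()).hexdigest()

def settle(seller_urls, buyer_known, price_per_url=1):
    """Buyer pays only for URLs it did not already know."""
    known = {h(u) for u in buyer_known}
    new = [u for u in seller_urls if h(u) not in known]
    return new, len(new) * price_per_url

# The Seller offers two URLs; the Buyer already knew one, so it pays for one.
new, bill = settle(["http://phish-a.example", "http://phish-b.example"],
                   ["http://phish-b.example"])
```

The actual protocol replaces the plain payment count with encrypted coins and zero-knowledge proofs, so neither party learns more than the total owed.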
What if the seller tries to sell fake data?
The assumption here is that both the companies want to do business.
Did you talk to take down companies?
Yes we did.
This paper was presented by Moti Yung. He introduced homomorphic encryption and the bush operation. The bush model is not scalable: it has load-balancing issues, remote geographic locations, and overlay networks in peer-to-peer architectures. The idea is to distribute the entire computation, including decryptions, over the different nodes of a tree. This solves the "biggest brother" problem, where decryptions are all done at the root, and yields a tree homomorphic encryption scheme, which is scalable.
This paper won the best paper award. It was presented by Roger Dingledine. The Tor network has too few relays and they are too slow. There should be incentives for nodes to become relays. For example, it should be easy to become a relay, so you provide users with a graphical, point-and-click interface for becoming a relay. Also, differentiate users by performance: reward the good ones and penalize the bad ones. One incentive would be that to get faster speeds a node would have to become a relay.
Does this proposal conflict with FCC net neutrality?
This paper was presented by Prithvi Bisht. The problem is SQL injection attacks on databases. Existing solutions have limitations; for example, a developer may use PREPARE statements, but they would have to parse each query individually. This paper presented an automated technique for sound program transformation that inserts PREPARE statements in place of unsafe SQL queries.
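The before/after effect of such a transformation can be sketched with Python's sqlite3 module (a hedged illustration of the idea only, not the authors' tool, which rewrites application code automatically):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

name = "x' OR '1'='1"  # attacker-controlled input

# Unsafe: the input is spliced into the SQL text, so the OR clause is parsed as SQL
# and the query matches every row.
unsafe = conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

# Safe: a prepared/parameterized statement treats the input purely as data,
# so the query matches no row.
safe = conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()
```

Placeholder syntax (`?` vs. `%s` vs. named parameters) varies by database driver, but the principle is the same.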
(Felix Gröbert) How do we identify data precisely?
A. By using an SQL parser.
This paper was presented by Marcel Keller. They presented a new protocol for AES encryption. They took the approach of inversion by masking, which has the advantage of fewer online operations. They evaluated their technique using the VIFF benchmark. They chose AES because of its arithmetic structure.
The motivation was to automatically detect guessing attacks, including undetectable online guessing attacks. The approach was verification based on pseudo-randomness: the image of a one-way function on the secret is known, or the image of a trap-door one-way function on the secret is known. Guessing rules: from one-way function images, or by inverting one-way trapdoors.
They presented a protocol for secure modulo reduction in the homomorphic Paillier cryptosystem. The efficiency of the protocol is independent of the bit length of the x being encrypted. The protocol can be used to compute statistics such as the mean across multiple parties.
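As background, the Paillier cryptosystem is additively homomorphic: multiplying ciphertexts adds the underlying plaintexts mod n. A toy implementation with tiny, insecure parameters (illustration of the homomorphism only, not the authors' modulo-reduction protocol):

```python
import math
import random

# Toy Paillier cryptosystem (tiny primes, for illustration only -- not secure).
p, q = 11, 13
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
g = n + 1  # standard simple choice of generator

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # precomputed decryption constant

def enc(m):
    """Encrypt m under the public key (n, g) with fresh randomness r."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    """Decrypt with the private key (lam, mu)."""
    return (L(pow(c, lam, n2)) * mu) % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts mod n.
a, b = 37, 58
assert dec((enc(a) * enc(b)) % n2) == (a + b) % n
```

The modulo-reduction protocol builds on this property to let parties jointly reduce an encrypted value without decrypting it.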
Pseudonyms are not good enough for trust. This paper presents signatures of reputation, a cryptographic primitive that allows for reputation with privacy and anonymity. It has formal privacy and security properties and users can obtain and prove reputation without being linked. Trust can be expressed through voting.
This paper was presented by Jens Grossklags. Users lack incentives to take actions to protect information. There is a need to identify how valuable information is in the context of security decision making. An agent's decision to invest in security is not based on another agent's decision. Self-protection is a public good, while self-insurance is a private good. For every expert user, there may be n-1 naive users. They considered three cases: weakest link, best shot, and total effort. They define the price of uncertainty as the ratio of the expected payoff in the complete-information environment over the payoff in the incomplete-information environment.
This paper was presented by Paul Syverson.
We need an algorithm for large market players (LMPs). LMPs try to hide their trading intentions, but these are hard to hide, so we need new infrastructures or protocols. The approach is a general model underlying trading strategies that leak no information to exploiters: curious observers, individual curious brokers, and colluding curious brokers. This approach uses existing infrastructure.
What is the transaction cost?
A. We treat it as trade-off.
He began with an introduction to game theory. He talked about Nash equilibrium and gave a couple of examples showing how to reach an equilibrium solution using backward induction, as well as some concepts about extensive games, strategies in dynamic games, and subgame perfection. Some examples of applications of game theory to security: security of the physical and MAC layers, and intrusion detection systems. He explained the relationship between game theory and cryptography: game-theoretic concepts can be used in the design of cryptographic mechanisms for rational users. Examples of game theory applied to security: revocation in ephemeral networks, and location privacy. Wireless networks bring formidable challenges in terms of malicious and selfish behavior. Game-theoretic modeling of security mechanisms can help predict, and influence via mechanism design, the behavior of the involved parties.
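Backward induction on a toy two-player extensive-form game can be sketched as follows (the payoffs and tree encoding are hypothetical, for illustration): solve the last mover's choice first, then fold the values back up the tree.

```python
# A game tree node is either a payoff tuple (leaf, payoffs for P1 and P2)
# or a pair (player, {action: subtree}). Hypothetical payoffs.
tree = ("P1", {
    "L": ("P2", {"l": (3, 1), "r": (0, 0)}),
    "R": ("P2", {"l": (1, 2), "r": (2, 4)}),
})

def backward_induction(node):
    """Return (payoffs, equilibrium action path) of the subgame-perfect outcome."""
    if not isinstance(node[1], dict):   # leaf: a payoff tuple
        return node, []
    player, moves = node
    idx = 0 if player == "P1" else 1    # which payoff this player maximizes
    best = None
    for action, sub in moves.items():
        payoff, path = backward_induction(sub)
        if best is None or payoff[idx] > best[0][idx]:
            best = (payoff, [action] + path)
    return best

payoffs, path = backward_induction(tree)
# P2 plays l after L (1 > 0) and r after R (4 > 2); anticipating this,
# P1 plays L (3 > 2), so the subgame-perfect outcome is (3, 1) via L, l.
```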
This paper was presented by Frank Stajano. The primary idea presented was the use of distance bounding to prevent relay attacks, enabled by using two channels. Multichannel protocols against relay attacks yield a more specific answer and use an unrelayable channel to sample a physical property of the prover. Properties of an unrelayable channel: unclonability, unsimulability, untransportability.
This paper was presented by Joseph Bonneau. Personal knowledge questions are cheap. The security provided, however, is bad, and in terms of privacy the user has to provide sensitive information. A targeted attacker can find this information by doing a web search, looking at public records, social engineering, dumpster diving, or an acquaintance attack. For a trawling attacker, given 100 accounts, there was 50% success after 5000 guesses for PINs and 50% success after 168 guesses for last names. The data was obtained by crawling Facebook and from public sources.
What if you let user pick the questions?
A. Users do not understand entropy. They usually pick mother's maiden name since they have seen it many times and assume it is good.
This paper was presented by Meredith Patterson. They did horrible things to X.509, exploiting semantic and syntactic inconsistencies. X.509 certificates are signed by CAs, but there are several implementation flaws: DNS cache poisoning, using MD5 at all, and Border Gateway Protocol attacks. They had some recommendations for improvement.
Validation, parsing, and generation should be the same. Reduce the attack surface by using a parser to generate implementations of protocols. Do not hand-roll the parser; there is no way to equate two outputs otherwise - it is an undecidable problem. Basic Encoding Rules (BER) are too permissive; use Distinguished Encoding Rules (DER) instead. Everyone should use and interpret the data the same way. Validate every layer and the data at every layer. Certificate Authorities should test implementations.
This paper was presented by Adam Barth. They challenged the assumptions that favor proactive security. A probabilistic model of an attacker is bad, since probabilistic models are good for random risks like fire; attackers are strategic, and for that we need stochastic models. Expected loss is not a good metric since it is driven by its tail; use Value at Risk instead. The attack graph model is a directed graph, and the defender allocates a budget over its edges.
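The contrast between expected loss and Value at Risk can be seen on a small hypothetical loss distribution, where a rare tail event dominates the mean but drops out of the 95% quantile:

```python
# Hypothetical loss distribution: (loss, probability). The 1000-unit tail
# event contributes most of the mean despite having probability 0.01.
losses = [(0, 0.90), (10, 0.09), (1000, 0.01)]

expected_loss = sum(l * p for l, p in losses)  # 0.9 + 10 = 10.9

def value_at_risk(dist, alpha):
    """Smallest loss x such that P(loss <= x) >= alpha."""
    cum = 0.0
    for loss, p in sorted(dist):
        cum += p
        if cum >= alpha:
            return loss
    return max(l for l, _ in dist)

var95 = value_at_risk(losses, 0.95)  # 10: the tail event is excluded at the 95% level
```

This is only a numerical illustration of why the two metrics can point in different directions, not the paper's model.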
This paper was presented by Paul A. Karger. The goal was to improve the software security of smart cards. They designed a full-blown operating system that looks like Unix. Applications are sandboxed to control malicious behavior. For security policies regarding reading cards they used ISO 7816-4. Mandatory access control is based on modifications to the Bell-LaPadula and Biba models. A privacy-preserving authentication protocol allows the use of mandatory access controls in commercial settings across the entire Internet. Directory implementation: leaf files point to their parent directories; pathname lookup is by linear search.
Why store file names multiple times?
A. For consistency checking.
This paper was presented by Maria Dubovitskaya. The problem is that people and organizations can leak information through the data they purchase. The solution proposed is priced oblivious transfer. Existing priced oblivious transfer protocols have issues; for example, purchases are linkable and there is no option to recharge. The authors proposed a new protocol to overcome these problems.
Have you thought about implementation?
This paper was presented by Tom Chothia. It discusses the security issues of passports with RFID chips in them. ISO 14443 handles the basic wireless communication. The government says the range is 9 cm, but that is not true; the figure is based on FCC guidelines, and the real range is much greater, i.e., more than 50 cm. ISO 7816 causes the generation of error messages that give away more information. The ICAO specifications for passports are publicly available. Given the problems in the system, it is easy to monitor someone's passport and, based on that, their movements.
What can you do?
It is illegal to write 'wrap it in foil' in a public paper.
Why is the UK using BAC as opposed to the USA using plain text?
They have always used it.
Can you use timing attacks to identify contents of passports?
Yes but it is very error prone.
Are you blaming RFID or just this implementation?
What is the directionality for broadcast?
This paper was presented by Sebastien Canard from Orange. They talked about a new e-cash scheme that can be used to produce coins of different denominations.
Q1. (Jean Pierre-Hubaux) Why is e-cash useful for Orange?
You can use it on the iPhone or any other kind of phones.
Q2. Mobile phone have been around for a while. Why e-cash now?
We use phones for everything now. So why not this?
Comment: Main problem of e-cash is legislation for roles of telecom provider.
Q3. What about overhead? Couldn't you use a whole pile of coins of denomination 1 for everything?
A. It is not yet clear, which would be more efficient.
This paper was presented by Feng Hao. They primarily talked about the limitations of formal analysis and also presented a new attack on HMQV.
Provable security consists of adversarial models, security definitions, and security proofs. Everyone does formal analysis, but a protocol should not only be theoretically secure but also practically secure. For example, MQV vs. HMQV is highly debatable: MQV is not provable, but HMQV, derived from MQV, has proofs. The derivation involves hash functions, removing static keys, and using ephemeral public keys. However, a small subgroup attack is possible - an attack that should have been covered by the formal analysis. MQV keeps static keys separate from ephemeral ones, but HMQV combines them for optimization purposes, which is not always good.
Wormhole replay attack: self-communication can be attacked using a man in the middle. HMQV assumes there is only one mobile client. This is a common problem with formal models: models can only capture a subset of an attacker's capabilities.
What is the impact of the first attack?
A. Attacker can do transactions and repudiate given protocols.
The paper breaks the security of a real world contactless payment system.
Contactless technology is used a lot, and there are implementation issues. For example, randomness is based only on time, so if you can fix the time there is no more randomness. It is possible to build a system from off-the-shelf components that fixes the nonce on the card, and use it to extract all the keys from the card. If it turns out that the keys are the same on all cards, you can then decrypt the contents of many cards in milliseconds. You can even do electronic pick-pocketing.
Have they upgraded yet?
A. No they haven't.
Ulrich Rührmair presented this paper. The paper talks about the use of diodes as a building block of COA/PUF/POF. The specific diodes are produced by the ALILE process. The structures produced are not clonable.
POF: a secret binary key is stored in non-volatile memory, and invasive attacks can get to that key. Alternatively, you can have hardware and derive the key from the random components of the diodes, so the key is not stored explicitly and is harder to obtain invasively. You can get 3 bits of key per diode.
(Feng Hao) Did you consider that correlations might decrease entropy?
Why is it more difficult to get the key when you have it as hardware?
A. If you look at the structure with a microscope you can't really tell anything, so one is forced to measure, which makes it difficult.
What code is trusted? Behind the curtain, even small programs use external libraries and an operating system, both of which allow the possibility of exploitation. We utilize the idea of FROGS. We basically have 1000 lines of ARM code that we release to the auditors for checking. It is a three-step process:
Authentication: Check voter is genuine and give them a voter card.
Ballot Selection Device: Save vote on card.
Ballot Casting Device: Display vote + audio of vote and cast ballot.
The paper basically talks about a voting machine that has very few lines of code to ensure that it can be audited and thus provide trust.
Studies show that vote selection + vote casting never works because users never check?
Yes. It is so most of the time. But it is better than not having the check.
Can voters verify their votes?
Not in this system.
What about the code in the smart card?
A. The only guarantee is that the data in the smart card is the same as the data on the display.
This paper was presented by Ross Anderson. Single sign-on has not been successful, with the exception of 3DS, which is the credit card companies' answer to card-not-present fraud. The security and privacy of this scheme are bad; it is the worst single sign-on protocol ever. It is successful because merchants are no longer liable. Users also lose the statutory protection of signatures, and customers are liable for all losses.
The paper discussed the problems inherent in the design of the 3-D Secure protocol. It argues that the reason for the adoption of this protocol is not technical but economic, since it allows the merchants to be free of liability and puts the liability on the customer.
When the iFrame opens, can you tell if TLS is used?
No, you can't even tell the URL.
How many are using this protocol?
In Sept 2009, 240 million registered card holders.
Do you think they will see that fraud is still high and switch to a different protocol?
Once deployed it becomes a question of information governance, because everyone will say its not their problem.
Comments on how to change regulatory effects?
The USA has it right. Liability is not on users, and users can also sue.
This paper was presented by Aniket Kate. The paper talked about the problem of using the same pseudonym across different sessions: if the same pseudonym is used, an attacker can keep track of it and learn some information about the identity of the user. The basic idea here is to change the pseudonym frequently. A new circuit construction methodology based on Sphinx was introduced and proven secure in the universal composability framework. The new circuit enhancements were given for Tor, pairing-based onion routing, and certificate-less onion routing.
Among other things, program chair Radu Sion talked about the acceptance rate. He also noted that the program committee was only from Europe and the USA. The best paper award was presented to Roger Dingledine. Rachel Greenstadt talked about her research on attacks on stylometry; she needs participants for her research. Roger Dingledine told stories about the Icelandic banking crisis and China's Blue Shield program and what they mean for security and privacy. Joseph Bonneau talked about the idea of electronic protest and using the Internet as the new frontier for demanding reforms.
She talked about the various studies that her research group has conducted to study the way users interact with security software and warnings in order to try and understand how they make security decisions in real life. She also talked about the difficulties faced in this kind of research including the ethical and legal complications.
One study she described looked at SSL certificate warnings and the effectiveness of different Internet browsers in presenting them. They found that Firefox 3 was most successful, but the reason might have been that Firefox 3 was newly deployed at the time of the study and users simply might have been unaware of how to ignore the warning. This example elucidates the problem of hidden variables in some of these studies and brings to light how difficult it can be to produce good study designs.
How do you deceive the users?
Make the incentives reasonable.
Would the results hold for a larger password dictionary?
You can never be certain without actually doing the experiment. But I think they should.
This paper was presented by Benedikt Westermann. They talk about JonDonym. Every time a cascade gets initialized it generates new key pairs. This is done by OpenSSL, but due to a bug in the Debian package only a small number of keys can be generated. In the encryption scheme the same IV is used for both directions, and the same key stream is generated by both mixes. Thus JonDonym is susceptible to replay attacks.
Was forward secrecy not considered?
Can AN.ON fix this without an architectural change?
Yes, partially. They have to modify the protocol, but they have to update every mix, which takes time.
This paper was presented by Sven Schäge. They presented a new ring signature scheme that is unforgeable under the Computational Diffie-Hellman (CDH) assumption for bilinear groups.
This paper was presented by Tyler Moore. They found that a lot of companies end up paying for self-referential advertisements to typo-squatters, since it is too expensive to try to have all squatters prosecuted. Many such sites were being used to host pay-per-click advertisements. They built a crawler that works on a distance metric to find similarly named websites. One finding was that the shorter the name of a website, the lesser the chance it has of being typo-squatted.
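A plausible distance metric for such a crawler is Levenshtein edit distance (the talk did not specify the exact metric used; `likely_squats` is a hypothetical helper):

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance (insertions, deletions, substitutions)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution (free if equal)
        prev = cur
    return prev[-1]

def likely_squats(target: str, candidates, max_dist=1):
    """Candidate domains within a small edit distance of the target (but not identical)."""
    return [c for c in candidates if 0 < levenshtein(target, c) <= max_dist]

# e.g. likely_squats("example.com", ["examp1e.com", "exampl.com", "other.net"])
# flags the one-character substitution and the one-character deletion.
```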
How many sites were being used for phishing?
How much money do these squatters make?
Based on a back of the envelope calculation, a lower bound might be 10 million dollars.
Why do you think there was no phishing?
A. Maybe because pay-per-click carries a lesser penalty than phishing. We will probably see more advertising fraud.
This paper was presented by Thomas Schneider. They present the idea of Secure Function Evaluation (SFE) using garbled circuits. It is mostly not practical due to computational issues; there are also communication issues, such as the amount of data storage required, which is particularly bad for mobile networks. To overcome these issues, one does not send a lot of garbled circuits but only sends session information, to save on computation and storage overheads.
What about implementation cost?
A. In practice we already have the hardware. Nothing extra is required.
This paper was presented by Toni Perkovic. They presented a new authentication scheme that guards against shoulder-surfing attacks, then conducted a user study to measure the effectiveness of their scheme.
Did you compare these with coverage systems?
No, we did not. A major drawback of this scheme is it requires new hardware.
Using the Modulo 10 table it should be possible to break the password with intersection attack.
A. I do not think so, because of the nature of the table.
This paper was presented by Marko Vukolic. They presented an open, standardized protocol for all vendors for compliance and interoperability. They address two issues: first, they automate the task of deploying keys and certificates, and second, they introduce new strict access control.
Are you planning to put this into a product?
A. This is actually for an IBM product.
The paper was presented by Octavian Catrina. The basic idea presented in the paper is that it should be possible to do secure computation with fixed-point rational numbers, not just integers and booleans. The scheme presented allows a complete family of protocols, including arithmetic operations and comparisons. The corruption threshold is t < n/2, where n is the number of participants. Communication complexity can be computed by counting the number of times the primitive is invoked.
This paper was presented by Kwangjo Kim. Contract signing is important for markets. A prover and a verifier might want to dispute or receive a signature on a common message M. For dispute resolution there is usually a Trusted Third Party (TTP). Previous approaches, however, ignore player privacy and fairness, and are based on verifiable encryption of digital signatures with a universal verifiability property that does not provide abuse-freeness.
What are the communication channels, are they authenticated for example?
A. Yes, we use authenticated channels.
This paper was presented by Felix Gröbert. They targeted the weaknesses of USB-based class 1 smart card readers that are more complex than standard class 1 readers. They also did a proof of concept.
Any statistics about the usage of such readers in German banks?
A. No.
What is your contribution?
Panel Members: Bernhard Haemmerli, Acris GmbH & University of Applied Sciences Lucerne, Rafael Llarena, Atos Origin, Michael Samson, FI-ISAC y NVB, Thomas Kohler, UBS. Moderator: Henning Arendt, @bc, Previous Chair of European Finance Forum (Panel Members are from here on referred to by their first initial so B, R, M, T and H).
H: There is a need for infrastructure change. We need authentication and Denial of Service (DoS) resistance. We need to look five years ahead, at what kinds of infrastructural challenges the financial sector will face. There have been eight recommendations. We need cloud computing and virtualization on top of what is there. Digital identities should be made mandatory for the financial sector.
T: The private sector is prepared for major attacks, but the networks between the private and public sectors are weak. Crises usually cross jurisdictional and national boundaries, which is bad. We are looking at a project between European banks. There needs to be an exchange of information to save time and resources.
Other organizations have secure networks; why can't you use the existing structures without trying to invent a new one?
Banks are competitive and there is not a lot of trust. They do not think that existing networks provide enough protection.
Comment: (Michael Samson) Banks need to cooperate and not be competitive when it comes to studying vulnerabilities; the key is trust. Networks for information exchange are, however, not secure. We are trying to have information exchange between larger banks, but it is only starting off.
What is being done for trustworthy information exchange?
We are having a meeting of people to discuss possibilities, there is a formally signed agreement. We also try to have a stable group of people that meets instead of two new people every meeting.
What is being done electronically?
We do not do that. We exchange less sensitive information via email. We are trying to build a secure email network for sensitive information, an effort that started a few years back.
Is this network for all the banks in the Netherlands?
Only members of FFI. There is a project underway called Communication Middleware for Monitoring Financial Critical Infrastructure (CoMiFin) for secure information exchange. It is right now a two-year research project. We are not sure if it can be deployed commercially.
R: We need to stress the importance of the 17 global providers in physical infrastructure protection, and stress secure information exchange.
B: Trust is essential, and we can build it slowly. We need to use more social science than cryptography. Personal trust leads to institutional trust; it requires negotiation skills. See the middle ground between competition and collaboration. There are many dimensions to Critical Information Infrastructure Protection (CIIP), which has historically been the way to go: policy, technology, political negotiation, and directions for future research. We need to educate people in CIIP, but not many academic institutions have programs in it.
H: We need to look at the kinds of tools needed for secure information exchange. Secure email would help, but trust in a person is more important. We need to be able to abstract the less sensitive part of the information and at least communicate that, since sensitive, or "red", information cannot always be communicated.
Simplified multilevel security has been seen as a working concept; a two-level scheme would not help. Have you seen the security literature on this?
We would like to know more.
There is also the power plant industry; they handle information exchange. We need to have a central server, but there is political trouble in deciding where this server should reside. Trust also ranges from real trust to abstract trust, and banks cannot believe in the abstract. We need to reinvent trust.
Comment (Ross Anderson): For best practices you should look at the USA. We need to look at how much more is being lost. We need to look at public information sharing, like releasing aggregate data about losses. Breach reporting is done in the USA, the UK, and France; why not elsewhere?
The situation in the UK is different. Fraud is limited and not very important; the problem for the public is limited. We need to tell people to secure their computers. This kind of information can raise trust issues: customer trust would be lower if they knew that banks released such data.
Question: The main goal is trust between banks, but you also want confidential networks between banks. The protocols used are flawed; this can only be fixed if network providers work with banks.
We see the same attacks again and again, and the move to mitigate them is slow. Maybe we need collaboration between banks and the research community. It is not a matter of money; it is a matter of will. In Italy, losses are low, so there is no will. In Italy there is lots of card cloning, but it is cheap. Maybe losses seem low because the customer is liable, not the banks.
(Paul Syverson) There are many platforms for secure information exchange. Are any of these being used by banks?
Banks are dumping liability on customer and then calling losses low. Isn't that unethical?
M: Banks are changing. We are moving to EMV cards. We try to monitor strange behavior. The reports from Cambridge are very strong; we play them down. We can't move too quickly; it's not practical.
B: CoMiFin does not trust academia. They are afraid of over reaction. It is difficult to train academics.
R: We need trust between academia and financial sector.
M: Fraud is illegal. We can't change too quickly, since the number of systems involved is too large.
T: I concur with Michael. We have many dedicated initiatives to improve security. We need supranational initiatives. CoMiFin is a good example of the research community and finance working together. It is not, however, a silver bullet.