Call for Papers
DSN 2007 (Dependable Systems and Networks)
Workshop on Assurance Cases for Security
The Metrics Challenge

Edinburgh, Scotland - UK
27 June 2007

For critical systems it is important to know whether the system
is trustworthy and to be able to communicate, review and debate
the level of trust achieved. In the safety domain, explicit
Safety Cases are increasingly required by law, regulations and
standards. It has become common for the case to be made using a
goal-based approach, where claims (or goals) are made about the
system and arguments and evidence are presented to support those
claims.

The need to understand risk is not just a safety issue: more and
more organizations need to know their risks and to be able to
communicate them to multiple stakeholders and to address them.
The type of argumentation used for safety cases is not specific
to safety alone: it can also be used to justify the adequacy of
systems with respect to other attributes of interest, including
security and reliability.

An international community has begun to form around this issue
of generalized assurance cases and the challenge of moving from
the rhetoric to the reality of being able to implement convincing
and valid cases. In a recent article in IEEE Security and Privacy
we outlined what we have done so far in the security area, what
we hope to achieve, and where we go next.

Prior workshops, beginning with one held at DSN 2004, have
identified a number of technical, policy and research
challenges. This workshop will focus on one of these challenges:
metrics for assurance cases for security. Such metrics can be
essential both for deciding what resources to devote to
developing an assurance case and for judging the efficacy of the
resulting case. However, there is no commonly accepted approach
to this topic. We would like to be able to answer questions (in
the context of security) such as:

* What makes an argument compelling?

* Are there standard patterns for arguments?

* What arguments should be compelling? What arguments do people
		 actually find compelling?

* How do additional arguments or evidence serve to make a case
		 more compelling?

* If there are accepted notions of what makes a case compelling,
		 to what extent do we know that these accepted notions are
		 correct or incorrect?

* Is there a measure of compellingness that could be used to
		 compare alternative argumentation structures?

* How can assurance cases be composed? If they are composed, is
		 it also possible to compose the metrics associated with the
		 individual cases?

* How can arguments of differing compelling force be combined to
		 support a case's claims?

* What new types of evidence are needed to create sounder
		 arguments, and how will we measure that increased soundness?

* By what metrics do we assess the effectiveness of evidence?

* What is the cost/benefit justification for developing an
		 assurance case?

* Are there different levels of effort depending on the
		 motivation? Can these levels be quantified?

* Can it be shown that a well-defined and well-executed assurance
		 case process will cost less than current assurance practices?

* Given two cases, one that costs more and, by some metric, is
		 more compelling than the other, how does one make the trade?

The purpose of the workshop is to understand these and other
questions in the context of assurance cases for security and to
identify viable technical approaches.


The workshop will be held on day two, June 27, of DSN 2007. It
will consist of:

* invited talks at the beginning of a session followed by brief
		 presentations of position papers.

* discussion of the application of metrics to example toy
		 assurance cases for security.

* consolidation and conclusions.


The workshop will identify the state of the practice in metrics
for assurance cases in the context of security and will identify
promising ways forward and research directions. The workshop will
produce the following outputs:

* Identification of candidate metrics for assurance cases for
		 security and the characteristics those metrics must possess

* A listing of the major classes of evidence for assurance cases
		 for security and a mapping of classes of evidence to metrics

* Candidate methods for combining the various classes of
		 evidence toward the desired system security properties.

Participation and Submissions

Attendance at the workshop will be open to all interested
parties. For active participation, submission of a position paper
of no more than six pages is required. The submission should
conform to the proceedings publication format for IEEE
Conferences and should be submitted electronically, in PDF
format, via e-mail to weinstock at. Please use the subject
"DSN AC Workshop Submission" so that your submission is not
overlooked. Submissions will be reviewed by the organizers, and
those accepted will be published in the DSN Proceedings
supplemental volume.


Organizers

Robin Bloomfield, Center for Software Reliability (UK)
Marcelo Masera, Joint Research Center of the European Commission (Italy)
Ann Miller, University of Missouri at Rolla (US)
O. Sami Saydjari, Cyber Defense Agency (US)
Charles B. Weinstock, Software Engineering Institute (US)


Important Dates

Submission Deadline: March 9, 2007
Author Notification: April 13, 2007
Camera Ready Copy Due: May 4, 2007


For questions or concerns regarding the workshop, please contact
Sami Saydjari at Cyber Defense Agency (ssaydjari at) or Chuck
Weinstock at the Software Engineering Institute (weinstock at).