Program

Twenty-Second Annual Computer Security Applications Conference (ACSAC)

Practical Solutions To Real World Security Problems


December 11-15, 2006
Miami Beach Resort and Spa
Miami Beach, FL, USA

Presented by Applied Computer Security Associates (ACSA)





Conference and Registration Information

Invitation to ACSAC 22

I would personally like to invite you to attend the 22nd Annual Computer Security Applications Conference (ACSAC). This year the conference travels south to Miami Beach, FL! Though the venue is new, we will continue the format established last year: the Technical Program, consisting of refereed papers and panels, will be presented Tuesday through Thursday, while a workshop will be held on Monday, and full- and half-day tutorials will be offered both Monday and Friday. This advance program provides the details you will need to register, travel, and attend, but please also visit http://www.acsac.org, where you can find many useful resources, including full archives of past ACSACs.

We hope to see you in Miami Beach this December!

Dan Thomsen, Conference Chair

Welcome Reception

Please join us on Monday evening (Dec. 11, 2006) at 6:00 pm for hors d'oeuvres and drinks and an opportunity to meet others in the security community. This is also a good opportunity for session chairs to meet their speakers!

Important Dates to Remember

November 13, 2006: Last day to reserve a room at the conference hotel at the conference rate.

November 13, 2006: Last day for early/reduced conference registration fee.

November 13, 2006: Last day to cancel your conference registration and obtain a refund less a service charge of $25.00. Cancellations must be in writing. See the Registration Form for complete details.



Conference Location

Hotel Information

The conference will be held at the Miami Beach Resort and Spa (http://www.miamibeachresortandspa.com).

Reservations

REGISTER EARLY! ACSAC has reserved a block of rooms at group room rates until November 13. The single/double rate is $108.00/night plus tax (the prevailing government per diem).

All reservations must be made directly with Miami Beach Resort and Spa.

Online Instructions

  1. Go to www.miamibeachresortandspa.com
  2. Click on Reservations
  3. At the bottom of the screen, click on the Group key
  4. Enter Attendee Code CA7460121

Hotel reservations: +1-866-765-9090.

Please be sure to use the Attendee Code to associate your reservation with the ACSAC conference. This makes the special negotiated room rates available to you and gives the conference a credit that helps to lower the registration fees.

The room rate is available three days before and after the conference if you would like to stay over one or both weekends.

Note: there have been intermittent problems getting the correct rate on the web site. The hotel has been extremely helpful in correcting them. Please complete your reservation, then let us know at conference_chair@acsac.org and we will make sure it gets straightened out.

Cutoff Date: To qualify for the negotiated rates, hotel reservations by attendees must be received on or before Monday, November 13, 2006.

Directions to the Conference Hotel

  • From Miami International Airport:
    • Exit the airport and follow the sign that says “LeJeune Road North”
    • From LeJeune Road North, take the “112 East-Miami Beach” exit (second lane from the right)
    • After the toll, 112 East-Miami Beach becomes I-195 - Miami Beach
    • Once you are on I-195, go straight (stay in the left lane); after you cross the long bridge you will be on Arthur Godfrey Rd (also 41st Street)
    • Keep going until you see Indian Creek Drive, where you’ll turn left
    • Indian Creek will merge with Collins Avenue – proceed to 4833 Collins Avenue

  • From Ft. Lauderdale / Hollywood International Airport:
    • Exit Airport to I-95 South
    • Take I-95 to I-195 Miami Beach
    • Once you are on I-195, go straight (stay in the left lane); after you cross the long bridge you will be on Arthur Godfrey Rd (also 41st Street)
    • Keep going until you see Indian Creek Drive, where you’ll turn left
    • Indian Creek will merge with Collins Avenue – proceed to 4833 Collins Avenue

Transportation to Miami Beach

Miami International Airport (www.miami-airport.com) is the closest airport (approximately 10 miles west of Miami Beach). For information on taxis and shuttle services: http://www.miami-airport.com/html/taxi_and_shuttle_service.html

Fort Lauderdale-Hollywood International Airport (www.broward.org/airport) is also convenient, approximately 30 minutes north of Miami Beach. Information on public transportation: http://www.miamidade.gov/transit/

Official Miami Beach website: http://www.visitmiamibeach.us/

Meals and Special Diet Requests

The Conference Committee has selected lunch menus that we hope everyone will enjoy. We realize that some individuals have special dietary needs. We have made arrangements to offer a vegetarian meal at lunch that will feature some combination of pasta, vegetables, and/or fruits. Please indicate your dietary request on the registration form and upon your arrival, please check your registration packet to ensure that your lunch tickets indicate your dietary request. If there are problems, please contact the conference registration desk.

Special Instructions for Foreign Visitors

If you are traveling from outside the United States, you may need to obtain a visa. Details on requesting a letter of invitation from ACSAC can be found on the conference web site's visa request page.



Conference At-A-Glance

Monday, December 11, 2006
8:30-17:00    Workshop: Host Based Security Assessment
8:30-12:00    Tutorial M1 | Tutorial M3 | Tutorial M4
13:30-17:00   Tutorial M2

Tuesday, December 12, 2006
Time          Track 1 | Track 2 | Track 3
8:30-10:00    Distinguished Practitioner: Dixie Baker, SAIC
10:30-12:00   Applied Distributed Collaboration | Client Access in Untrusted Environments | Vulnerability Management
13:30-15:00   Network Intrusion Detection | Panel: Challenges for Web Services Security | Personal Identification Verification
15:30-17:00   Network Security | Security in Systems | Case Studies

Wednesday, December 13, 2006
Time          Track 1 | Track 2 | Track 3
8:30-10:00    Invited Essayist: Brian Witten, Symantec Corporation
10:30-12:00   Applied Sandboxing | Malware | Case Studies
13:30-15:00   Applied Detection Techniques | Panel: Partnering with Industry and Academia - The DHS S&T Approach to Cyber Security Research, Development, Test, and Evaluation | Industrial Control System Security
15:30-17:30   Works in Progress (Tracks 1 and 2 combined) | Case Studies

Thursday, December 14, 2006
Time          Track 1 | Track 2 | Track 3
8:30-10:00    Classic Papers: Peter G. Neumann, SRI International; Jeremy Epstein, webMethods Inc.
10:30-12:00   Applied Randomization | Intrusion Detection | Case Studies
13:30-15:00   Messaging Security | Countermeasures | Certification and Accreditation
15:30-17:30   Information Flow and Leakage | Panel: Highlights from the 2006 New Security Paradigms Workshop | Minimum Security Requirements

Friday, December 15, 2006
8:30-12:00    Tutorial F5 | Tutorial F6 | Tutorial F7
13:30-17:00



Technical Program

Tuesday, December 12, 2006, 8:30-10:00

Opening Plenary

Introductory remarks:
Dan Thomsen, Cyber Defense Agency, LLC, Conference Chair
Christoph Schuba, Linköping University, Program Chair

Introduction of the Distinguished Practitioner
Marshall Abrams, The MITRE Corporation

Distinguished Practitioner:
Privacy and Security in Public Health: Maintaining the Delicate Balance between Personal Privacy and Population Safety.
   Dr. Dixie Baker, SAIC

Amidst threats of pandemic influenza and bioterrorist attack, public health surveillance and preparedness have never been more important. Early detection of biological events, electronic reporting of laboratory test results, efficient exchange of case reports across jurisdictions, and timely alerting of health threats are critical components of effective health protection. Equally important to public health surveillance, preparedness, and response is the timely availability of information relating to individuals' healthcare behaviors and clinical conditions - posing a threat to personal privacy. Public health is challenged to maintain an optimal balance between protecting the nation's health and respecting the personal privacy of its citizens.


About the Speaker:

Dr. Dixie B. Baker is a Technical Fellow and Vice President for Technology at Science Applications International Corporation (SAIC), where she serves as the Chief Technology Officer (CTO) for the Enterprise and Infrastructure Solutions Group (E&ISG). With a total staff of over 12,000 people, E&ISG leads SAIC’s business in homeland security, health and life sciences, energy, environment, and enterprise solutions. As CTO, Dr. Baker serves as the Group’s principal visionary and spokesperson on science and technology issues, develops partnerships and strategic alliances with technology suppliers, oversees research and development investments, and represents SAIC in national and international forums. In addition, Dr. Baker serves as a senior consultant on projects of strategic importance to the Group and to SAIC.

An internationally recognized thought leader in high-assurance architecture and information protection, Dr. Baker has applied her expertise primarily to the health and life sciences for the past ten years. She was the Principal Investigator for the Patient Centered Access to Secure Systems Online (PCASSO) project, a National Library of Medicine-sponsored research project that is widely regarded as ground-breaking in providing patients safe and secure Web access to their complete medical records. Her research team’s paper won ACSAC’s 1997 Best Paper Award. She has provided testimony to the National Committee on Vital and Health Statistics (NCVHS) as input to the development of the Health Insurance Portability and Accountability Act (HIPAA) security standards, and more recently, as guidance toward technology solutions for protecting the confidentiality of health records released to third parties. For the Centers for Disease Control and Prevention (CDC), she defined, and now is helping implement, an architecture that will enable the production, management, distribution, and use of semantically interoperable data-collection instruments for disease surveillance across the U.S.

Dr. Baker has published and lectured extensively in a number of technology and health-related areas, including information protection, high-assurance architecture, electronic medical records, and Internet safety. In 2001, she was awarded the John P. McGovern Lectureship in Information and Communications, presented by the Medical Library Association. In September 2004, at the invitation of the Ministry of Health of the People’s Republic of China, she presented a keynote address and paper at the IDEAS04DH Workshop on Medical Information Systems, held in Beijing.

Dr. Baker holds a Ph.D. in Education Research Methodologies and an M.S. in Computer Science from the University of Southern California, as well as M.S. and B.S. degrees from Florida State University and The Ohio State University respectively.


Tuesday, December 12, 2006, 10:30-12:00

Track 1: Technical Papers

Title: Applied Distributed Collaboration
Chair: Christoph Schuba, Linköping University

Shamon: A System for Distributed Mandatory Access Control.
Jonathan McCune, Carnegie Mellon University
Stefan Berger, IBM
Ramon Caceres, IBM
Trent Jaeger, Pennsylvania State University
Reiner Sailer, IBM

A Framework for Collaborative DDoS Defense.
George Oikonomou, University of Delaware
Jelena Mirkovic, University of Delaware
Peter Reiher, University of California, Los Angeles
Max Robinson, The Aerospace Corporation

V-COPS: A Distributed Vulnerability-based Cooperative Alert System.
Shiping Chen, George Mason University
Dongyu Liu, George Mason University
Songqing Chen, George Mason University
Sushil Jajodia, George Mason University

Track 2: Technical Papers

Title: Client Access in Untrusted Environments
Chair: Jay Kahn, The MITRE Corporation

Delegate: A Proxy Based Architecture for Secure Website Access from an Untrusted Machine.
Ravi Chandra Jammalamadaka, University of California, Irvine
Timothy W.van der Horst, Brigham Young University
Sharad Mehrotra, University of California, Irvine
Kent E. Seamons, Brigham Young University
Nalini Venkatasubramanian, University of California, Irvine

KLASSP: Entering Passwords on a Spyware Infected Machine Using a Shared-Secret Proxy.
Dinei Florencio, Microsoft
Cormac Herley, Microsoft

Vulnerability Analysis of MMS User Agents.
Collin Mulliner, University of California, Santa Barbara
Giovanni Vigna, University of California, Santa Barbara

Track 3: Case Studies

Title: Vulnerability Management

Using the National Vulnerability Database in Enterprise Information Security Programs.
Peter Mell, NIST
Tony Sager, NSA

Understanding the critical vulnerabilities in enterprise information systems and how to take effective actions to eliminate those vulnerabilities and mitigate risks to important organizational missions is a top priority for both public and private sector organizations today. The National Institute of Standards and Technology, in collaboration with the Department of Homeland Security’s National Cyber Security Division and US-CERT, has initiated a comprehensive project to develop a broad-based database of key vulnerabilities in information systems. The National Vulnerability Database (NVD) is a comprehensive cyber security vulnerability database that integrates all publicly available U.S. Government vulnerability resources and provides references to industry resources. The NVD is based on and synchronized with the Common Vulnerabilities and Exposures (CVE) vulnerability naming standard. Linking the vulnerability database to standardized security controls (i.e., safeguards and countermeasures) necessary to protect information systems, and to recommended configuration settings for commercial products employed within those systems, is a new and innovative part of the NVD effort that promises to have a significant effect on the overall security of both public and private sector enterprises. This session covers the latest activities in the NVD, the expansion into the configuration settings area, and the application of automated tools and standardized specification languages to bring greater efficiencies to the process.
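
To make the linkage described above concrete, here is a minimal, hypothetical sketch of CVE-named vulnerability records cross-referenced to security controls and recommended configuration settings. The record fields, example entries, and control names are illustrative assumptions, not the actual NVD schema or data.

    # Hypothetical sketch of CVE-keyed vulnerability records linked to security
    # controls and configuration settings (illustrative only; not the NVD schema).
    from dataclasses import dataclass, field

    @dataclass
    class VulnRecord:
        cve_id: str            # identifier from the CVE naming standard
        summary: str
        severity: float        # e.g., a CVSS base score
        controls: list = field(default_factory=list)   # mitigating controls
        settings: list = field(default_factory=list)   # recommended configurations

    db = [
        VulnRecord("CVE-2006-0001", "example buffer overflow", 9.0,
                   ["SI-2 Flaw Remediation"], ["disable the legacy service"]),
        VulnRecord("CVE-2006-0002", "example weak default password", 5.1,
                   ["IA-5 Authenticator Management"], ["enforce a password policy"]),
    ]

    def remediation_plan(min_severity: float) -> None:
        """Print controls and settings for vulnerabilities at or above a score."""
        for rec in sorted(db, key=lambda r: r.severity, reverse=True):
            if rec.severity >= min_severity:
                print(rec.cve_id, "->", rec.controls, rec.settings)

    remediation_plan(5.0)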


Tuesday, December 12, 2006, 13:30-15:00

Track 1: Technical Papers

Title: Network Intrusion Detection
Chair: Giovanni Vigna, University of California, Santa Barbara

Backtracking Algorithmic Complexity Attacks Against a NIDS.
Randy Smith, University of Wisconsin, Madison
Cristian Estan, University of Wisconsin, Madison
Somesh Jha, University of Wisconsin, Madison

NetSpy: Automatic Generation of Spyware Signatures for NIDS.
Hao Wang, University of Wisconsin, Madison
Somesh Jha, University of Wisconsin, Madison
Vinod Ganapathy, University of Wisconsin, Madison

Detecting Policy Violations through Traffic Analysis.
Jeffrey Horton, University of Wollongong
Rei Safavi-Naini, University of Wollongong

Track 2: Panel

Title: Challenges for Web Services Security
Chair: Anoop Singhal, Ph.D., NIST

The advance of Web services technologies promises to have far-reaching effects on the Internet and enterprise networks. Web services based on the eXtensible Markup Language (XML), Simple Object Access Protocol (SOAP), and related open standards, and deployed in Service Oriented Architectures (SOA) allow data and applications to interact without human intervention through dynamic and ad hoc connections. Web services technology can be implemented in a wide variety of architectures, can co-exist with other technologies and software design approaches, and can be adopted in an evolutionary manner without requiring major transformations to legacy applications and databases.

The security challenges presented by the Web services approach are formidable and unavoidable. Many of the features that make Web services attractive, including greater accessibility of data, dynamic application-to-application connections, and relative autonomy (lack of human intervention) are at odds with traditional security models and controls. Difficult issues and unsolved problems exist, such as the following:

  1. Confidentiality and integrity of data transmitted via Web services protocols in service-to-service transactions, including data that transits intermediary (pass-through) services (a minimal sketch of this end-to-end integrity idea follows this list).
  2. Functional integrity of the Web services themselves, requiring both establishment in advance of the trustworthiness of services to be included in service orchestrations or choreographies, and the establishment of trust between services on a by-transaction basis.
  3. Availability in the face of denial of service attacks that exploit vulnerabilities unique to Web service technologies, especially targeting core services, such as discovery service, on which other services rely.
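
As a language-level illustration of the first challenge above (end-to-end protection that survives pass-through intermediaries), the minimal sketch below authenticates a message body with a key shared only by the two endpoints. It is a toy under stated assumptions, not WS-Security itself, which addresses the problem with XML Signature and XML Encryption; the shared key is assumed to be established out of band.

    # Toy sketch of end-to-end message integrity across pass-through
    # intermediaries (not WS-Security, which uses XML Signature): the body is
    # authenticated with a key shared only by the endpoints, so a relaying
    # service cannot alter it undetected.
    import hashlib
    import hmac

    SHARED_KEY = b"assumed-out-of-band-key"  # assumption: pre-established key

    def sign(body: bytes) -> str:
        return hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()

    def relay(envelope: dict) -> dict:
        # An intermediary merely forwards the envelope; any tampering with the
        # body would invalidate the signature on verification.
        return envelope

    body = b"<getQuote symbol='ACME'/>"
    envelope = {"body": body, "sig": sign(body)}

    received = relay(envelope)
    ok = hmac.compare_digest(received["sig"], sign(received["body"]))
    print("integrity verified:", ok)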

Panelists:

  • Mike McIntosh, IBM Research, Yorktown Heights
  • Prof. Carl Gunter, Department of Computer Science, Univ. of Illinois
  • Jeremy Epstein, webMethods
  • Rafae Bhatti, CERIAS, Purdue University
  • Karen Goertzel, Manager Software Security, Booz Allen Hamilton, Inc.
  • Thomas Ray, Director of Security Services, Washington Mutual Card Services

Track 3: Case Studies

Title: Personal Identification Verification

FIPS 201: The Standard and Its Effect on the Public and Private Sectors.
Bill MacGregor, NIST
Johnny Hsiung, Altan Labs
Kevin Brault, BearingPoint
Steve Weymann, Infogard

In response to Homeland Security Presidential Directive #12, the National Institute of Standards and Technology initiated a new program for improving the identification and authentication of federal employees and contractors for access to federal facilities and information systems. Federal Information Processing Standard (FIPS) 201 was developed to satisfy the Personal Identity Verification (PIV) requirements of HSPD 12, approved by the Secretary of Commerce, and issued on February 25, 2005. In addition to the core PIV security standards, a number of guidelines, reference implementations, and conformance tests have been identified as being needed to: implement and use the PIV system; protect the personal privacy of all subscribers of the PIV system; authenticate identity source documents to obtain the correct legal name of the person applying for a PIV card; electronically obtain and store required biometric data (e.g., fingerprints, facial images) from the PIV system subscriber; create a PIV card that is personalized with data needed by the PIV system to later grant the subscriber access to Federal facilities and information systems; assure appropriate levels of security for all applicable Federal applications; and provide interoperability among Federal organizations using the standards. This session provides the latest implementation information from organizations managing the deployment of the PIV technologies.


Tuesday, December 12, 2006, 15:30-17:00

Track 1: Technical Papers

Title: Network Security
Chair: Cristina Serban, AT&T

Practical Attack Graph Generation for Network Defense.
Kyle Ingols, MIT Lincoln Laboratory
Richard Lippmann, MIT Lincoln Laboratory
Keith Piwowarski, MIT Lincoln Laboratory

Secure Distributed Cluster Formation in Wireless Sensor Networks.
Kun Sun, Intelligent Automation, Inc.
Pai Peng, Opsware Inc.
Peng Ning, North Carolina State University
Cliff Wang, Army Research Office

Specification-Based Intrusion Detection in WLANs.
Rupinder Gill, Queensland University of Technology
Jason Smith, Queensland University of Technology
Andrew Clark, Queensland University of Technology

Track 2: Technical Papers

Title: Security in Systems
Chair: Lujo Bauer, Carnegie Mellon University

From Languages to Systems: Understanding Practical Application Development in Security-typed Languages.
Boniface Hicks, Pennsylvania State University
Kiyan Ahmadizadeh, Pennsylvania State University
Patrick McDaniel, Pennsylvania State University

An Internet Voting System Supporting User Privacy.
Aggelos Kiayias, University of Connecticut
Michael Korman, University of Connecticut
David Walluck, University of Connecticut

A Study of Access Control Requirements for Healthcare Systems Based on Audit Trails from Access Logs.
Lillian Røstad, Norwegian University of Science and Technology
Ole Edsberg, Norwegian University of Science and Technology

Track 3: Case Studies

Title: Case Studies
Chair: Ed Keefe, FBI

Challenges for Secure Web Services.
Anoop Singhal, NIST

The advance of Web services technologies promises to have far-reaching effects on the Internet and enterprise networks. Web services based on the eXtensible Markup Language (XML), Simple Object Access Protocol (SOAP), and related open standards, and deployed in Service Oriented Architectures (SOA) allow data and applications to interact without human intervention through dynamic and ad hoc connections. Web services technology can be implemented in a wide variety of architectures, can co-exist with other technologies and software design approaches, and can be adopted in an evolutionary manner without requiring major transformations to legacy applications and databases.

The security challenges presented by the Web services approach are formidable and unavoidable. Many of the features that make Web services attractive, including greater accessibility of data, dynamic application-to-application connections, and relative autonomy (lack of human intervention) are at odds with traditional security models and controls. Difficult issues and unsolved problems exist, such as the following:

  • Confidentiality and integrity of data transmitted via Web services protocols in service-to-service transactions, including data that transits intermediary (pass-through) services.
  • Functional integrity of the Web services themselves, requiring both establishment in advance of the trustworthiness of services to be included in service orchestrations or choreographies, and the establishment of trust between services on a by-transaction basis.
  • Availability in the face of denial of service attacks that exploit vulnerabilities unique to Web service technologies, especially targeting core services, such as discovery service, on which other services rely.

While many of the Web services challenges have been met with existing standards, there are a number of challenges that standards organizations are currently addressing—particularly in the area of Web services discovery and reliability. Few of these challenges will be addressed by any of the emerging web services security standards, the majority of which are limited to extending, enhancing, or augmenting current web services security standards in order to better provide message integrity, message confidentiality, and consumer/provider authentication.


Best Practices in Identity and Access Management.
Paul Henry, Secure Computing

Today’s requirements for Identity & Access Management (IAM) go well beyond our historical reliance on usernames and passwords. Regulatory demands as well as threats both internal and external to our networks have redefined the requirements for IAM. This presentation takes an in-depth look at the requirements for building an IAM and includes:

  • How Did We Get Here
    • Usernames and passwords are obsolete
  • Two Factor Authentication
    • It is not what you use but how you use it
  • Centralized Access Policy Management
    • Policy must be enforced at all endpoints
  • Scan and Block
    • Permit access only from trusted machines

Attendees will walk away with a firm understanding of creating an IAM that can mitigate many of the risks associated with today’s hostile network environment.


Preventing Identity Theft and Data Security Breaches: The Problem with Regulation.
Brooke Oberwetter, CEI

Analysis of the possible effects of federal legislation suggests that comprehensive regulation of security practices might have the unintended consequences of locking in current technologies and thwarting market incentives both for collaboration among firms when necessary and for competition between them to produce newer and more innovative solutions for cybersecurity problems. The objective of this presentation is to underscore the idea that private innovations (and private actors such as security firms) are much better suited for meeting the constantly evolving challenges of computer security than federal regulators could ever be.


Wednesday, December 13, 2006, 8:30-10:00

Invited Essayist Plenary

Introduction of the Invited Essayist
Dan Thomsen, Cyber Defense Agency, LLC

Invited Essayist:
Engineering Sufficiently Secure Computing.
   Brian Witten, Symantec Corporation

We propose an architecture of four complementary technologies increasingly relevant to a growing number of home users and organizations: cryptography, separation kernels, formal verification, and rapidly improving techniques relevant to software defect density estimation. Cryptographic separation protects information in transmission and storage. Formally proven properties of separation kernel based secure virtualization can bound risk for information in processing. Then, within each strongly separated domain, risk can be measured as a function of people and technology within that domain. Where hardware, software, and their interactions are proven to behave as and only as desired under all circumstances, such hardware and software can be considered to not substantially increase risk. Where the size or complexity of software is beyond such formal proofs, we discuss estimating risk related to software defect densities, and emerging work related to binary analysis with potential for improving software defect density estimation.
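
As a back-of-envelope illustration of the defect-density idea sketched above, risk for software too large for formal proof can be bounded by an estimated residual defect density times code size. All numbers below are invented for the example.

    # Back-of-envelope illustration of defect-density-based risk estimation.
    # All numbers are invented for illustration.
    density_per_kloc = 0.5   # estimated residual defects per 1,000 lines of code
    size_kloc = 200          # size of the software in KLOC
    expected_residual_defects = density_per_kloc * size_kloc
    print(f"expected residual defects: {expected_residual_defects:.0f}")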


About the Speaker:

As Director of Government Research, Mr. Brian Witten leads all federally sponsored research and development within Symantec. Symantec Government Research is charged with the responsibility of developing technology for future Symantec products and services emerging from federally sponsored research solving nationally critical problems. Symantec pursues much of this research in partnership with world-renowned universities. An experienced information security expert, Mr. Witten has also worked closely with both established industry leaders and early stage venture backed companies founded on disruptive technology.

Prior to joining Symantec, Mr. Witten worked at the Defense Advanced Research Projects Agency (DARPA), the U.S. military’s central research and development organization charged with sponsoring revolutionary, high-payoff research to maintain the technological superiority of the U.S. military. While at DARPA, he focused on creation of new network security technologies to protect current and future information systems supporting "Network Centric Warfare." At DARPA, Mr. Witten managed an R&D investment portfolio of more than $150 million in U.S. and international efforts.

Mr. Witten began his technology career as an officer in the U.S. Air Force, where he first began collaborating with leading academic institutions and commercial firms in information security research while assigned to Rome Laboratories and Air Force Research Labs (AFRL).

Mr. Witten received his B.S. in Electrical and Computer Engineering from the University of Colorado.


Wednesday, December 13, 2006, 10:30-12:00

Track 1: Technical Papers

Title: Applied Sandboxing
Chair: Konstantin Beznosov, University of British Columbia

A Module System for Isolating Untrusted Software Extensions.
Philip Fong, University of Regina
Simon Orr, University of Regina

How to Automatically and Accurately Sandbox Microsoft IIS.
Wei Li, Rether Networks, Inc.
Lap-chung Lam, Rether Networks, Inc.
Tzi-cker Chiueh, Rether Networks, Inc.

Data Sandboxing: A Technique for Enforcing Confidentiality Policies.
Tejas Khatiwala, University of Illinois at Chicago
Raj Swaminathan, University of Illinois at Chicago
V.N. Venkatakrishnan, University of Illinois at Chicago

Track 2: Technical Papers

Title: Malware
Chair: Anoop Singhal, National Institute of Standards and Technology

On Detecting Camouflaging Worm.
Wei Yu, Texas A&M University
Xun Wang, Ohio State University
Prasad Calyam, Ohio State University
Dong Xuan, Ohio State University
Wei Zhao, Texas A&M University

Bluetooth Worms: Models, Dynamics, and Defense Implications.
Guanhua Yan, Los Alamos National Laboratory
Stephan Eidenbenz, Los Alamos National Laboratory

Back to the Future: A Framework for Automatic Malware Removal and System Repair.
Francis Hsu, University of California, Davis
Hao Chen, University of California, Davis
Thomas Ristenpart, University of California, San Diego
Jason Li, University of California, Davis
Zhendong Su, University of California, Davis

Track 3: Case Studies

Title: Case Studies
Chair: Ed Giorgio, Booz Allen Hamilton

Trusted Storage.
Dave Anderson, Seagate Research

Storage systems, such as disk drives, and other computing-system peripherals are critical components of the security, privacy, and trust configuration of a computing platform. This session provides a framework with which to understand why and how peripheral devices should be secured as independent roots of trust. The framework provides a generic security model for all peripheral devices, and shows how peripherals can be configured as roots of trust, each playing a complementary role in establishing the overall security and privacy goals of platform-based and networked computing.

The session begins with security measures for storage systems that exist today and their relative effectiveness. It will then go into where and how to secure access control of the storage system, discussing in detail what needs to be controlled and how to grant control in a secure manner.

The Trusted Computing Group's Trusted Storage Use Cases will be reviewed in depth, highlighting the technical requirements being solved by the formal specifications. Relationships and cooperation with other industry storage standards (e.g., SCSI and ATA) will be discussed, and the TCG's specification for secure and trusted storage will be outlined.


Putting Trust into the Network: Securing Your Network through Trusted Access Control.
Steve Hanna, Juniper Networks

Today, client network connection requests are granted or denied based on the client's ability to prove its credentials, including passwords, machine certificates, and user certificates. This approach ignores the possibility that the client platform contains malicious code (e.g., viruses, Trojans, malware) that spreads through the network once IP connectivity is granted. Trusted Computing and its hardware elements provide the most reliable and secure method to ascertain end-point integrity for clients seeking connectivity to a network. Through trusted network connection protocols and trusted platform mechanisms, platforms can be authenticated before being given full network connectivity. This speaker will address the architecture, applications, and status of the Trusted Network Connect specification, which is backed by more than 90 companies, and also discuss how to properly implement it.
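
The Trusted Network Connect specification defines its own protocols and attestation formats; the sketch below illustrates only the admission decision the abstract describes, with posture attributes and policy thresholds invented for the example.

    # Rough sketch of a network-admission decision in the spirit of Trusted
    # Network Connect: endpoint integrity is assessed before full connectivity.
    # Posture attributes and policy thresholds are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Posture:
        av_signature_age_days: int   # age of anti-virus signatures
        patch_level: int             # installed patch baseline
        attestation_valid: bool      # e.g., hardware-rooted measurement verified

    def admit(p: Posture) -> str:
        if not p.attestation_valid:
            return "deny"            # the reported posture itself cannot be trusted
        if p.av_signature_age_days > 7 or p.patch_level < 10:
            return "quarantine"      # remediation network only
        return "full-access"

    print(admit(Posture(av_signature_age_days=2, patch_level=12,
                        attestation_valid=True)))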


Employing Encryption to Combat Data Theft.
Derek Tumulak, Ingrian Networks

In the wake of continued data thefts and security breaches and increasingly rigorous security and privacy mandates, encryption of data at rest is becoming a necessity for nearly any organization that manages sensitive customer or employee data. This presentation will provide an overview of industry mandates for encryption of data at rest, weigh the pros and cons of various encryption solutions, and offer best practices for deploying an encryption solution.


Wednesday, December 13, 2006, 13:30-15:00

Track 1: Technical Papers

Title: Applied Detection Technologies
Chair: Arthur R. Friedman, OASD(NII)/DoD CIO

Static Detection of Vulnerabilities in x86 Executables.
Greg Banks, University of California, Santa Barbara
Marco Cova, University of California, Santa Barbara
Viktoria Felmetsger, University of California, Santa Barbara
Giovanni Vigna, University of California, Santa Barbara

Foreign Code Detection on the Windows/X86 Platform.
Susanta Nanda, Stony Brook University
Wei Li, Stony Brook University
Lap-Chung Lam, Rether Networks
Tzi-cker Chiueh, Stony Brook University

PolyUnpack: Automating the Hidden-Code Extraction of Unpack-Executing Malware.
Paul Royal, Georgia Institute of Technology
Mitch Halpin, Georgia Institute of Technology
David Dagon, Georgia Institute of Technology
Robert Edmonds, Georgia Institute of Technology
Wenke Lee, Georgia Institute of Technology

Track 2: Panel

Title: Partnering with Industry and Academia - The DHS S&T Approach to Cyber Security Research, Development, Test, and Evaluation
Chair: Dr. Douglas Maughan, U.S. Department of Homeland Security (DHS), Science and Technology (S&T) Directorate

The Science and Technology (S&T) Directorate in the U.S. Department of Homeland Security (DHS) has the mission to conduct research, development, test and evaluation (RDT&E), and timely transition of cyber security capabilities to operational units within DHS, as well as federal, state, local, and critical infrastructure sector operational end users for homeland security purposes. Collaboration and partnerships with academia, research labs, and private industry are required for success in this mission, and this panel will present some of the partnerships that DHS S&T is currently involved in and supporting. The speakers will present their projects and their experiences working with DHS S&T and the other partners.

The DNSSEC Deployment Initiative is a community-based, international effort to transition the current state of DNSSEC to large-scale deployment that will strengthen the Internet domain name system against attacks.

The DHS-SRI International Identity Theft Technology Council (ITTC) is a working forum where experts and leaders from the government, private, financial, IT, venture capital, and academic and scientific sectors come together to address the problem of identity theft and related criminal activity on the Internet.

Project LOGIC is a 12-month technology integration and demonstration project jointly supported by industry partners and DHS S&T. The project demonstrates an opportunity to reduce vulnerabilities of oil and gas process control environments by sensing, correlating and analyzing abnormal events to identify and prevent cyber security threats. It is also an illustration of a successful model for collaboration between infrastructure owners, technology providers, research labs, and the Government.

The DETER testbed, jointly supported by NSF and DHS S&T, is a shared testbed infrastructure that is specifically designed for medium-scale repeatable experiments, and especially for experiments that may involve "risky" code such as self-propagating malware.

The Secure Wireless Data Program is a collaboration between the U.S. and Canadian governments and multiple industry partners. SRI International leads an integration of new and innovative security technologies followed by trials to test the solution in real-case scenarios. The program is focused on overlays and complementary technologies that can be used to enhance security with minimal impact on the usability of the basic mobile data platform.

Panelists:

  • Dr. Steve Crocker, Shinkuro
  • Dave Jevans, Anti-Phishing Working Group and IronKey
  • Tom Aubuchon, Chevron Pipeline
  • Terry Benzel, USC-ISI
  • Mark Schertler, Voltage Security OR Dr. Ulf Lindqvist, SRI International

Track 3: Case Studies

Title: Industrial Control System Security

An Overview of Emerging Standards, Guidelines, and Implementation Activities.
Stu Katzke, NIST
Joe Weiss, KEMA

Industrial and process control systems are an integral part of the critical infrastructure and the protection of those systems is a priority for the federal government. From air traffic control systems to the systems managing the nation’s largest electric power grids, industrial control systems are playing an increasingly important role in the economic and national security interests of the United States. Until recently, industrial control systems had little resemblance to traditional information systems in that they were isolated systems running proprietary software and control protocols. However, as these systems are integrated more closely into mainstream organizational information systems to promote connectivity, efficiency, and remote access capabilities, they have started to resemble the more traditional information systems. While the change in industrial control system architecture supports new information system capabilities, it introduces many of the same vulnerabilities that exist in current networked information systems. This session addresses the emerging industrial and process control security standards within the public and private sectors and the overall effect of those standards and activities in helping to secure these important systems.


Wednesday, December 13, 2006, 15:30-17:00

Combined Track 1 and 2: Works in Progress Session

Title: Works in Progress (WiP) Session
Chair: Cristina Serban, AT&T

  1. Maarten Rits and Mohammad Ashiqur Rahaman - SAP Labs France: Secure SOAP Requests in Enterprise SOA
  2. Edward Colbert, Dan Wu, Yue Chen and Barry Boehm - University of Southern California: Cost Estimation for Secure Software and Systems
  3. Eduardo B. Fernandez and Maria M. Larrondo-Petrie - Florida Atlantic University: A Methodology to build secure systems using patterns
  4. Tugkan Tuglular - Izmir Institute of Technology: Test Case Generation for Firewall Testing
  5. Uciel Fragoso-Rodriguez, Maryline Laurent-Maknavicius and Jose Incera-Dieguez - Instituto Tecnológico Autónomo de México, Mexico: Federated Identity Architectures Evaluation
  6. John McDermott and Myong Kang - NRL: An Open-Source High-Robustness VMM
  7. Tim Kelley, Indiana University: A Method for Increasing Transmission Rates in Covert Timing Channels
  8. Rosalie M. McQuaid, William Heinbockel, Joseph Judge, Peter Kertzner and Brian Soby - MITRE Corporation: Security Information Management for Enclave Networks (SIMEN)
  9. William Claycomb and Dongwan Shin - New Mexico Tech: Designing and Implementing Access Control for Impromptu Collaboration
  10. Takuya Mishina, Yuji Watanabe, Yasuharu Katsuno and Sachiko Yoshihama - IBM Research, Tokyo Research Laboratory: Semantic Fine-grained Data Provenance Tracking
  11. Ravi Chandra Jammalamadaka and Sharad Mehrotra - University of California, Irvine: Outsourcing Data Sharing Requirements to an Untrusted Service Provider
  12. Kris Britton and David Perlowski - NSA Center for Assured Software: Software Assurance Analysis Methodology
  13. Ron Finkbine - Indiana University Southeast: A Database for Managing Mutant Programs
  14. David Botta, Rodrigo Werlinger, André Gagné, Konstantin Beznosov, Lee Iverson, Brian Fisher and Sidney Fels - University of British Columbia: HOT Admin: Human, Organization, and Technology Centered Improvement of IT Security Administration
  15. Coimbatore Chandersekaran, Edward A. Schneider and William R. Simpson - Institute for Defense Analyses: Evaluating Security in Distributed Service-Oriented Systems
  16. Houssain Kettani - Jackson State University, MS: A Cryptographic Application of Number Systems Base Conversion
  17. Sashikanth Chandrasekaran - Oracle: Event Processing to Verify Compliance with Security Policies [a case study]
  18. SPECIAL PRESENTATION: David Bell: Looking Back -- Addendum

Track 3: Case Studies

Title: Case Studies
Chair: Ron Ritchey, Booz Allen Hamilton

Canadian-US Security Enhanced BlackBerry Trial.
Mark Schertler, Voltage

BlackBerry devices are representative of today's state-of-the-art for small portable communications devices and are widely deployed in the private sector, as well as by U.S. and Canadian government agencies at the federal, state, and local levels. To study and enhance security for BlackBerry devices, the joint Canada-US Public Safety Technical Program (PSTP) initiated a US-Canadian trial. Trial partners include Defence Research and Development Canada (DRDC), an agency of the Department of National Defence (DND), and the Homeland Security Advanced Research Projects Agency (HSARPA), an agency of the United States Department of Homeland Security (DHS). The trial was directed at improving the security of existing, proven BlackBerry technology for use by the public safety, emergency preparedness, and law enforcement communities in both countries.

The trial’s research focused on overlays and complementary technologies that can be used to enhance security with minimal impact on the usability of the basic BlackBerry system. The trial had the following objectives:

  • Improve the security of BlackBerry technology with minimal impact on the usability of the basic BlackBerry system.
  • Explore opportunities and requirements for the secure use of BlackBerry devices by the public safety, emergency preparedness, and law enforcement communities.
  • Demonstrate policy enforcement and procedure constraints by integrating policy scanning and encryption technologies.
  • Demonstrate the reduction of public key infrastructure overhead through use of new public key technologies developed in government sponsored research.
  • Improve mission assurance by extending the coverage of BlackBerry communications beyond the terrestrial cellular phone system through mobile satellite ground stations that can be mounted on small marine vessels and all-terrain wheeled vehicles.
  • Improve interoperability by stimulating the development of inexpensive mobile communications nodes for first responders that support multiple emerging wireless access protocols, new portable devices and digital services.
  • Examine the secure e-mail solution jointly developed by Canada’s Communications Security Establishment (CSE) and the US’s National Security Agency (NSA).

This discussion will cover the trial’s objectives, execution, and results.


Wi-Fi Protected Access for Protection and Automation.
Dennis Holstein, OPUS Publishing

CIGRE Study Committee B5 commissioned a survey of applications using Wi-Fi in protection and automation schemes, together with an analysis of how IEEE 802.11i's mitigation of security vulnerabilities affects system reliability and performance. Working Group (WG) B5.22 was further tasked to recommend design requirements and prioritized security levels needed for Wi-Fi protected access related to critical mission protection and automation functions. This presentation summarizes the findings of that investigation. Design requirements and security levels needed for Wi-Fi protected access are prioritized in terms of their mitigation of risk related to critical mission protection and automation functions. Specific mechanisms needed to adequately implement Wi-Fi protected access are identified and related to existing or emerging standards.


XL Global Services: A Compelling Case Study in Data Privacy.
Mark Schertler, Voltage

XL Global Services provides IT services to the XL Capital group of companies, a $50 billion corporation with locations worldwide. Global Chief Security Officer Tom Dunbar is charged with the Herculean mission of assuring that sensitive data will remain privacy-protected and secure according to corporate policies and legislative requirements. With 3,400 local, mobile, and remote users, a solution for leveraging digital communications had to feature (a) strong encryption with digital signature support to verify email senders, while ultimately being simple and usable; (b) enterprise scalability, while remaining smoothly flexible; and (c) auditable support of complex regulatory compliance laws, while providing the lowest possible total cost of ownership. In this presentation, Mark Schertler will outline the data privacy implementation, while sharing tips, best practices, and real-world usable guidance for information technology and security pros.


Thursday, December 14, 2006, 8:30-10:00

Classic Papers Plenary

Introduction of the Classic Papers
Tom Haigh, Adventium Labs and the Cyber Defense Agency, LLC


Fifteen Years after TX: A Look Back at High Assurance Multi-Level Secure Windowing.
   Jeremy Epstein, webMethods, Inc.

Research in the late 1980s and early 1990s produced a prototype high assurance multi-level secure windowing system that allowed users to see information of multiple classifications on the same screen, performing cut & paste from low to high windows. This retrospective discusses the motivations for the project, reviews the architecture and implementation of the prototype, discusses developments in the intervening years, and concludes with lessons learned.
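
The one-way cut & paste policy described above is an instance of the classic rule that information may flow only upward in sensitivity. A toy dominance check (labels invented for illustration, not the TX implementation) makes the asymmetry explicit:

    # Toy dominance check illustrating why a paste from a low window into a
    # high window is allowed but not the reverse (labels invented; not the TX
    # design).
    LEVELS = {"UNCLASSIFIED": 0, "SECRET": 1, "TOP SECRET": 2}

    def paste_allowed(src_level: str, dst_level: str) -> bool:
        # Information may flow upward only: the destination must dominate
        # the source.
        return LEVELS[dst_level] >= LEVELS[src_level]

    assert paste_allowed("UNCLASSIFIED", "TOP SECRET")      # low -> high: OK
    assert not paste_allowed("TOP SECRET", "UNCLASSIFIED")  # high -> low: blocked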


About the Speaker:

Mr. Jeremy Epstein has been involved in information security for nearly 20 years, and is a well-known researcher in the area. He is currently Senior Director of Product Security at webMethods, where he's responsible for analyzing and improving the security of all products, designing security for new products, assessing third party security products, and complying with security standards. He's also Lead Analyst with Cyber Defense Agency LLC where he leads several DARPA efforts. Prior to joining webMethods and CDA, he led a security research group at Network Associates, and was responsible for the C2 security evaluation of Novell NetWare. The research described in this classic paper was performed when he led a research group at TRW, Inc. in the late 1980s and early 1990s. He has published over 20 papers in refereed research conferences including USENIX, IEEE, and ACSAC, as well as several articles in trade magazines. In his spare time, he works to improve security of electronic voting systems as a member of the Virginia legislature's committee on voting equipment recommendations. Mr. Epstein holds a BS in Computer Science from New Mexico Tech, an MS in Computer Sciences from Purdue University, and an ABD from George Mason University.


Risks of Untrustworthiness.
   Peter G. Neumann, SRI International Computer Science Lab

This paper revisits the risks of untrustworthiness, and considers some incidents involving computer-based systems that have failed to live up to what had been expected of them. The risks relate to security, reliability, survivability, human safety, and other attributes, and span a variety of applications and critical infrastructures, such as electric power, telecommunications, transportation, finance, medical care, and elections. The range of causative factors and the diversity of the resulting risks are both enormous. Unfortunately, many of the problems seem to recur far too often. Various lessons therefrom and potential remedies are discussed.


About the Speaker:

Peter G. Neumann (Neumann@CSL.sri.com) has doctorates from Harvard and Darmstadt. After 10 years at Bell Labs in Murray Hill, New Jersey, in the 1960s, during which he was heavily involved in the Multics development jointly with MIT and Honeywell, he has been in SRI's Computer Science Lab since September 1971. He is concerned with computer systems and networks, trustworthiness/dependability, high assurance, security, reliability, survivability, safety, and many risks-related issues such as voting-system integrity, crypto policy, social implications, and human needs including privacy. He moderates the ACM Risks Forum, edits CACM's monthly Inside Risks column, chairs the ACM Committee on Computers and Public Policy, and chairs the National Committee for Voting Integrity (http://www.votingintegrity.org).

He created ACM SIGSOFT's Software Engineering Notes in 1976, and was its editor for 19 years and still contributes the RISKS section. He has participated in four studies for the National Academies of Science: Multilevel Data Management Security (1982), Computers at Risk (1991), Cryptography's Role in Securing the Information Society (1996), and Improving Cybersecurity for the 21st Century: Rationalizing the Agenda (2006). His 1995 book, Computer-Related Risks, is still timely!

He is a Fellow of the ACM, IEEE, and AAAS, and is also an SRI Fellow. He received the National Computer System Security Award in 2002 and the ACM SIGSAC Outstanding Contributions Award in 2005. He is a member of the U.S. Government Accountability Office Executive Council on Information Management and Technology, and the California Office of Privacy Protection advisory council. He co-founded People For Internet Responsibility (PFIR, http://www.PFIR.org). He has taught courses at Darmstadt, Stanford, U.C. Berkeley, and the University of Maryland. See his website (http://www.csl.sri.com/neumann) for Senate and House testimonies, papers, bibliography, further background, etc.


Thursday, December 14, 2006, 10:30-12:00

Track 1: Technical Papers

Title: Applied Randomization
Chair: Steven J. Greenwald, Independent Consultant

Address-Space Randomization for Windows Systems.
Lixin Li, Global Infotek
James Just, Global Infotek
R. Sekar, Stony Brook University

Address Space Layout Permutation (ASLP): Towards Fine-Grained Randomization of Commodity Software.
Chongkyung Kil, North Carolina State University
Jinsuk Jun, North Carolina State University
Christopher Bookholt, North Carolina State University
Jun Xu, North Carolina State University
Peng Ning, North Carolina State University

Known/Chosen Key Attacks against Software Instruction Set Randomization.
Yoav Weiss, Discretix Technologies Ltd.
Elena Gabriela Barrantes, Universidad de Costa Rica

Track 2: Technical Papers

Title: Intrusion Detection
Chair: Carrie Gates, CA Labs

Automatic Evaluation of Intrusion Detection Systems.
Frédéric Massicotte, Communications Research Center
François Gagnon, Carleton University
Yvan Labiche, Carleton University
Lionel Briand, Carleton University
Mathieu Couture, Carleton University

Offloading IDS Computation to the GPU.
Nigel Jacob, Tufts University
Carla Brodley, Tufts University

Anomaly Based Web Phishing Page Detection.
Ying Pan, Singapore Management University
Xuhua Ding, Singapore Management University

Track 3: Case Studies

Title: Case Studies
Chair: Gary Wilson, Booz Allen Hamilton

Certification and Accreditation at the National Oceanic and Atmospheric Administration (NOAA), National Environmental Satellite, Data, and Information Service (NESDIS).
Dan Gambel, Mitretek Systems

In 2004, NOAA systems were identified by the Department of Commerce as deficient in having current and compliant accreditation packages approved. NOAA operates a significant number of legacy systems that were developed prior to current standards and guidelines for IT security. As a result, the documentation of the security of the various systems was fragmentary and incomplete. In an attempt to achieve accreditation for these national critical and mission critical systems, NOAA elected to standardize the content of the security plans, an approach that was not acceptable to the DOC Office of Inspector General. Mark Noto was brought into the security team by a newly appointed CIO to correct the problems with the process. The approach to fixing the problems was to assemble a team of existing contractors and NOAA staff into “red teams” to define the specific requirements for the System Security Plans, risk assessments, testing, and contingency documentation. Once the documentation requirements were adequate, three packages covering the three most critical systems were prepared and submitted for post-certification review by NOAA and the DOC OIG. Based on comments on the initial packages, a number of changes were made to the process, especially on details of the architecture and the consistency of the scanning and inventory. Based on the revised requirements, NOAA NESDIS has prepared, certified, and approved packages for all active national critical and mission critical systems and is progressing through mission essential systems. The presentation will address the current process and how the senior officials are responding to the information being provided. In addition, the NOAA process modifications necessary to accommodate FISMA will be identified and, where available, solutions applicable to NOAA will be provided.


Using Predictive Analytics and Modeling to Improve Insider Threat Detection and Cyber Threat Identification.
Peter Frometa, SPSS Inc

A valuable approach to insider threat detection and cyber threat identification involves applying predictive analysis. Learn how predictive analysis can be leveraged to identify key characteristics of valid versus invalid network access attempts and web traffic patterns, as well as to uncover patterns in documented malicious activity. As millions of cases pass through network ports on a daily basis, predictive analytic techniques can also be applied to better predict and prevent activity that signals potentially suspicious behavior, or cases that could indicate malicious web robots or intrusion attempts. A major advantage of this approach is that, instead of focusing efforts solely on specific, previously named viruses, predictive analytics looks at behavior as a whole and targets specific patterns or anomalies, greatly enhancing the likelihood of identifying and stopping new viruses or emerging variants as well as potential hacker activity. Through the incorporation of additional structured data sources (such as employee key card access files, log-on audit logs, and file access logs) and unstructured data sources (such as the content of web pages, downloaded files and documents, and intelligence reports), predictive analytics allows cyber threat identification to address not only external threats but also those that exist internal to an organization (insider threats).
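
As a toy illustration of the behavior-centric approach described above (scoring deviation from a per-user baseline rather than matching named signatures), consider the sketch below; the feature, data, and threshold are invented for the example.

    # Toy illustration of behavior-based anomaly scoring, as opposed to
    # matching named signatures: flag log-on events far from a per-user
    # baseline. Feature values and the threshold are invented for illustration.
    from statistics import mean, pstdev

    baseline_logon_hours = [9, 9, 10, 8, 9, 10, 9]   # a user's historical pattern
    mu, sigma = mean(baseline_logon_hours), pstdev(baseline_logon_hours)

    def anomaly_score(hour: int) -> float:
        return abs(hour - mu) / sigma if sigma else 0.0

    for event_hour in [9, 10, 3]:                    # a 03:00 log-on stands out
        score = anomaly_score(event_hour)
        flag = "SUSPICIOUS" if score > 3 else "ok"
        print(f"logon at {event_hour:02d}:00 -> score {score:.1f} ({flag})")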


The Business of Enterprise Privacy and Compliance.
Mark Schertler, Voltage

While there has been significant focus on compliance as a driver for security investments, as well as significant FUD generated by vendors, the use of security products can drive significant business value at the intersection of compliance with core business drivers. This presentation examines how, even when faced with daunting regulatory requirements and significant fines, many organizations can achieve concrete, measurable, and near-term business value by deploying enterprise privacy and security solutions that also deliver business process efficiencies. The focus will be specifically on case studies from the insurance and financial services industries.


Thursday, December 14, 2006, 13:30-15:00

Track 1: Technical Papers

Title: Messaging Security
Chair: John Totah, Sun Microsystems, Inc.

Addressing SMTP-based Mass-Mailing Activity Within Enterprise Networks.
David Whyte, Carleton University
Paul van Oorschot, Carleton University
Evangelos Kranakis, Carleton University

Using Attribute-Based Access Control to Enable Attribute-Based Messaging.
Rakesh Bobba, University of Illinois
Omid Fatemieh, University of Illinois
Fariba Khan, University of Illinois
Carl Gunter, University of Illinois
Himanshu Khurana, University of Illinois

Enhancing Signature-based Collaborative Spam Detection with Bloom Filters.
Jeff Yan, University of Newcastle upon Tyne
Pook Leong Cho, University of Newcastle upon Tyne

Track 2: Technical Papers

Title: Countermeasures
Chair: Rick Smith, University of St. Thomas

Extended Protection Against Stack Smashing Attacks Without Performance Loss.
Yves Younan, Katholieke Universiteit Leuven
Davide Pozza, Politecnico di Torino
Frank Piessens, Katholieke Universiteit Leuven
Wouter Joosen, Katholieke Universiteit Leuven

PAST: Probabilistic Authentication of Sensor Timestamps.
Ashish Gehani, University of Notre Dame
Surendar Chandra, University of Notre Dame

Towards Database Firewall: Mining the Damage Spreading Patterns.
Kun Bai, Pennsylvania State University, University Park
Peng Liu, Pennsylvania State University, University Park

Track 3: Case Studies

Title: Certification and Accreditation

Understanding the Risks to Enterprises and their Information Technology Infrastructure.
Ron Ross, NIST
Julie Mehan, Hatha Systems

Understanding the risks to enterprise missions resulting from the operation of highly connected and complex information systems and networks is a top priority for public and private sector organizations today. Establishing a cost-effective and disciplined approach to assessing the effectiveness of the security controls (i.e., safeguards and countermeasures) employed to protect enterprise information systems and the critical missions supported by those systems is the driving force behind today’s certification and accreditation (C&A) efforts. Many efforts are underway, both nationally and internationally, to streamline the C&A process and to promote activities that facilitate credible, risk-based decisions on the part of authorizing officials (a.k.a. designated accreditation authorities). This session addresses two of the most significant C&A processes currently employed by the federal government to address the security issues related to federal information systems: NIST Special Publication 800-37, covering the C&A process for federal non-national security systems, and the Defense Department’s Information Assurance Certification and Accreditation Process, covering the needs of the warfighter and key military applications and systems.


Thursday, December 14, 2006, 15:30-17:00

Track 1: Technical Papers

Title: Information Flow and Leakage
Chair: Ed Schneider, IDA

A General Dynamic Information Flow Tracking Framework for Security Applications.
Lap Chung Lam, Rether Networks, Inc.
Tzi-cker Chiueh, Stony Brook University

Covert and Side Channels due to Processor Architecture.
Zhenghong Wang, Princeton University
Ruby Lee, Princeton University

CryptoPage: an Efficient Secure Architecture with Memory Encryption, Integrity and Information Leakage Protection.
Guillaume Duc, ENST Bretagne
Ronan Keryell, ENST Bretagne

Protecting Privacy in Key-Value Search Systems.
Yinglian Xie, Carnegie Mellon University
David O'Hallaron, Carnegie Mellon University
Michael Reiter, Carnegie Mellon University

Track 2: Panel

Title: Highlights from the 2006 New Security Paradigms Workshop (NSPW)
Chair: Carol Taylor, University of Idaho

This panel highlights a selection of the most interesting and provocative papers from the 2006 New Security Paradigms Workshop. The URL for more information is http://www.nspw.org.

The panel consists of authors of the selected papers, and the session is moderated by the workshop's general chairs. We present selected papers focusing on exciting major themes that emerged from the workshop. These are the papers that will provoke the most interesting discussion at ACSAC.

  1. Googling Considered Harmful by Greg Conti

    Virtually every Internet user on the planet uses the powerful free tools offered by a handful of information service providers in many aspects of their personal and professional lives. As a result, users and organizations are freely providing unprecedented amounts of sensitive information in return for such services as Internet search, email, mapping, blog hosting, instant messaging and language translation. Traditional security measures, such as cryptography and network firewalls, are largely ineffective because of the implicit trust paradigm with the service provider. In this paper, we directly address this problem by providing a threat analysis framework of information disclosure vectors, including fingerprinting of individuals and groups based on their online activities, examining the effectiveness of existing privacy countermeasures, and clearly outlining the critical future work required to protect our corporate, organizational and individual privacy when using these services.

  2. Cent, Five Cent, Ten Cent, Dollar: Hitting Malware Where it Really Hurts by Richard Ford and Sarah Gordon

    Spyware, adware, bots. In each case, there is significant evidence of an increasing financial motivation behind the writing and distribution of these programs. In this paper, the concept of using our knowledge of these financial motivators to combat malicious software is introduced. Can attacks on business models actually provide relief that technology alone cannot? Can we deploy our technology directly, in order to receive direct benefits of this indirect attack on revenue streams? Our conclusion is that not only is this a possible solution, but that it may be an extremely effective one. This is illustrated by a description of our business model attack generator, MARK, the Multihost Adware Revenue Killer. Using MARK, we demonstrate simple but effective attacks against malicious-code-generated revenue streams. However, the creation and deployment of MARK raises thorny legal and ethical questions, as the impact of the technology is widespread and could easily be targeted at legitimate online marketing models. Do the ends justify the means?

  3. Challenging the Anomaly Detection Paradigm by Carol Taylor and Carrie Gates

    In 1987, Dorothy Denning published the seminal paper on anomaly detection as applied to intrusion detection on a single system. Her paper sparked a new paradigm in intrusion detection research with the notion that malicious behavior could be distinguished from normal system use. Since that time, a great deal of anomaly detection research based on Denning's original premise has occurred. However, Denning's assumptions about anomalies that originate on a single host have been applied essentially unaltered to networks. In this paper we question the application of Denning's work to network based anomaly detection, along with other assumptions commonly made in network-based detection research. We examine the assumptions underlying selected studies of network anomaly detection and discuss these assumptions in the context of the results from studies of network traffic patterns. The purpose of questioning the old paradigm of anomaly detection as a strategy for network intrusion detection is to reconfirm the paradigm as sound or begin the process of replacing it with a new paradigm in light of changes in the operating environment.

Track 3: Case Studies

Title: Minimum Security Requirements

FIPS 200: The Standard and Its Effect on the Public and Private Sectors.
Ron Ross, NIST
Graydon McKee, Unisys

The Federal Information Security Management Act of 2002 places significant requirements on Federal agencies for the protection of information and information systems, including those systems comprising the critical infrastructure of the United States. The National Institute of Standards and Technology (NIST) is leading the development of key information system security standards and guidelines as part of its FISMA Implementation Project. One of the principal security standards, Federal Information Processing Standard (FIPS) 200, identifies minimum security requirements for federal information and information systems. The FIPS 200 minimum security requirements are linked to another NIST publication, Special Publication 800-53, which describes the minimum security controls (safeguards and countermeasures) necessary to protect enterprise missions in the face of ever-increasing and sophisticated attacks. This session provides insights into the effects of the legislation and the implementing security standards and guidance on the public and private sectors.



Workshop

Title: Host Based Security Assessment: Standards to Implementations
Chair: Dr. Harvey Rubinovitz, The MITRE Corporation
Monday, 11 December 2006, 8:30 a.m. - 4:30 p.m.

In recent years, malware has become one of the major threats to service availability and information security. To reduce the number of infections, security assessments are performed to check a computer’s configuration against a known standard that offers the best security for a given environment. A number of companies and agencies have published their standards and, depending on a system’s ownership, the owners of the system may need to maintain a list of systems with their degree of compliance. For groups that have a large number of computer systems, performing an assessment and determining compliance can be a long process. Another issue is determining which standard should be used to determine compliance.

Federal Information Processing Standards (FIPS) have been established to guide agencies in complying with the Federal Information Security Management Act (FISMA). The National Institute of Standards and Technology (NIST) is actively holding workshops on how FISMA requires organizations to conduct security assessments of federal systems, including those federal systems operated by contractors. Each organization, once agreeing on a standard configuration, needs a way to perform the checks in a systematic fashion. To help with this process, organizations may consider either in-house solutions or commercial products to perform the checks. Many of the products not only check but also enforce standard profiles.

This workshop will focus on security assessments, including how they may be defined in standards by either government agencies or commercial organizations, how technology is being implemented and utilized to perform the assessments, and how on-going enforcement is accomplished. The workshop will also examine the need to facilitate the research and development of the next generation of standards and implementations to assist in the creation of more secure configurations. Previous participants have agreed that past workshops have provided a useful and exciting forum for members of the standards and software development worlds to exchange ideas, opinions, and concerns. Due to community interest in security assessments and the rapidly evolving technologies, this year’s workshop should generate much discussion.
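
To make the idea of a systematic configuration check concrete, the following minimal Python sketch (illustrative only, not drawn from the workshop materials; the baseline setting names and values are invented) compares a host's settings against a required baseline and reports deviations:

    # Illustrative compliance check; the baseline below is invented for this sketch.
    REQUIRED_BASELINE = {
        "password_min_length": 12,   # hypothetical setting names
        "firewall_enabled": True,
        "automatic_updates": True,
    }

    def assess(host_settings):
        """Return findings where the host deviates from the required baseline."""
        findings = []
        for key, expected in REQUIRED_BASELINE.items():
            actual = host_settings.get(key)
            if actual != expected:
                findings.append("%s: expected %r, found %r" % (key, expected, actual))
        return findings

    # Example run against a host that is weak on one setting and missing another.
    host = {"password_min_length": 8, "firewall_enabled": True}
    for finding in assess(host):
        print("NON-COMPLIANT:", finding)

A real assessment tool would, of course, read the settings from the operating system and map each finding back to the published standard it violates.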

Pre-registration is required, as there is a registration fee to cover the cost of the workshop, lunch, and snacks. Position papers are encouraged. To register, contact Dr. Harvey Rubinovitz, Workshop Chair, The MITRE Corporation, M/S S145, 202 Burlington Road, Bedford, Massachusetts 01730; (781)-271-3076; hhr@mitre.org. If you are interested in attending, please check off the appropriate box on the conference registration form and add in the workshop fee of $50.



Tutorials

ACSAC is pleased to host seven tutorials this year, distributed between Monday, December 11, and Friday, December 15, 2006. See the descriptions that follow for more details about each tutorial, its instructor(s) and when it will be given.

Attendees enrolled in any of the following tutorials are provided lunch on the day of their tutorial.

Although everyone attending a tutorial will be provided a copy of the materials used by the instructor, only those who pre-register for the tutorial will be guaranteed the tutorial materials at the beginning of the tutorial instruction. See the registration form for more information. Please note the tutorial registration fees are for tutorials only; registration for the technical portion of the Conference is separate.

Tutorial M1

Building Biometric Authentication Systems: Pitfalls and How to Avoid Them
Speaker: Dr. Jan Jürjens, The Open University
Time: Monday Morning 12/11/2006 Half-Day Tutorial

Biometric techniques are increasingly used to identify, verify, and authenticate people. Building authentication systems that rely on these techniques is, however, still very challenging: it is difficult to estimate the reliability of the current biometric sensors and matching algorithms; it is hard to adequately calibrate these algorithms to achieve the trade-off between false positive and false negative rates most suitable for a given application; and it is very challenging to correctly build systems that employ biometric authentication so that the overall system will be secure.

Based on practical experiences from related industrial R&D projects, the tutorial gives a hands-on introduction into how to correctly build biometric authentication systems and how to securely embed them into the system context. The tutorial will report on experiences and lessons learnt, and on common pitfalls in designing such systems. Tutorial participants will gain up-to-date knowledge on the state of the art in biometric authentication and on how to use this technology securely. A textbook on secure systems development used during the tutorial is distributed to each participant.
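
As a concrete illustration of the calibration trade-off mentioned above, the following Python sketch (not part of the tutorial materials; the match scores are made-up example data) estimates false accept and false reject rates at a decision threshold and scans for an approximate equal error rate:

    # Made-up similarity scores: impostor comparisons should score low,
    # genuine comparisons high. Real systems estimate these from test data.
    impostor_scores = [0.10, 0.20, 0.25, 0.30, 0.40, 0.45]
    genuine_scores = [0.35, 0.50, 0.60, 0.70, 0.80, 0.90]

    def far_frr(threshold):
        """False accept rate and false reject rate at a decision threshold."""
        far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
        frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
        return far, frr

    def approximate_eer(steps=100):
        """Scan thresholds; return the one where FAR and FRR are closest."""
        lo = min(impostor_scores + genuine_scores)
        hi = max(impostor_scores + genuine_scores)
        thresholds = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
        return min(thresholds, key=lambda t: abs(far_frr(t)[0] - far_frr(t)[1]))

    t = approximate_eer()
    far, frr = far_frr(t)
    print("threshold=%.2f FAR=%.2f FRR=%.2f" % (t, far, frr))

Raising the threshold lowers false accepts at the cost of more false rejects; which point on that curve is "correct" depends on the application, which is exactly the calibration problem the tutorial addresses.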

Prerequisites: Some basic knowledge in computer security would be helpful.

High-Level Outline

The tutorial covers the following subtopics.

  • Introduction: What Is Biometric Authentication and how does it work?
  • Which Biometric Authentication Technique is best for my application?
  • How do I calibrate it correctly for the right trade-off between false positives and false negatives?
  • Architectures for Biometric Authentication Systems
  • Common Pitfalls in Designing and Implementing Biometric Authentication Systems and How to Avoid Them
  • Security Testing Biometric Authentication Systems
  • Operational Security of Biometric Authentication Systems: What to look out for.

Each topic is covered in about 20-30 min.

About the Instructor

Dr. Jan Jurjens is a Senior Lecturer (equiv. US Assoc. Prof.) at the Open University (the British long-distance university in Milton Keynes near London). He is the author of a book on Secure Systems Development with UML (Springer 2004) and an introductory book on IT-Security (Springer 2006) and about 50 papers and 10 invited talks in refereed international books, journals, and conferences, mostly on computer security and software engineering. He has created and lectured courses on secure systems development at the University of Oxford, the Technical University of Munich, Carlos III Univ. Madrid, and the University of Innsbruck, as well as over 30 tutorials on secure software engineering at international conferences. He is the initiator and current chair of the working group on Formal Methods and Software Engineering for Safety and Security (FoMSESS) within the German Society for Informatics (GI). He is a member of the executive board of the Division of Safety and Security (Fachbereich Sicherheit) within the GI, the executive board of the committee on Modeling (QFA Modellierung) of the GI, the advisory board of the Bavarian Competence Center for Safety and Security (KoSiB), the working group on e-Security of the Bavarian regional government, the IFIP Working Group 1.7 "Theoretical Foundations of Security Analysis and Design", and the IFIP Working Group on Critical Infrastructure Protection which is currently being founded. He has been leading various security-related projects with industry.

Jan Jurjens studied Mathematics and Computer Science at the Univ. of Bremen (Germany) and the Univ. of Cambridge (GB). He did research towards a PhD at the Univ. of Edinburgh (GB), Bell Laboratories (Palo Alto, USA), and the Univ. of Oxford (GB), and received a DPhil (Doctor of Philosophy) in Computing from the Univ. of Oxford. Before joining the faculty at the Open University, he directed the Competence Center for IT-Security at the group on Software & Systems Engineering at the Technical University of Munich.

Tutorial M2

Defenses Against Viruses, Worms, and Malicious Software
Speaker: Dr. Tom Chen, Southern Methodist University
Time: Monday Afternoon 12/11/2006 Half-Day Tutorial

The Internet has created a fertile environment for viruses and worms, since virtually every computer is now interconnected into a global community. Almost every PC user has encountered a virus or worm at one time or another. Unlike many types of security attacks directed at compromising a specific target, the self-replicating nature of viruses and worms creates a large-scale attack on the general community. The Internet itself is also affected by the resulting congestion.

This tutorial will give an overview of computer viruses, worms, and Trojan horses. The tutorial is organized into three major parts. The first part introduces the audience to the self-replicating mechanisms of viruses and worms, and describes how malicious software programs function. The possible effects on hosts and networks are described with real-life examples.

The second part of the tutorial gives an overview of current host-based and network-based defenses. Hosts are protected by antivirus software and operating system patching. Network-based defenses consist of various network equipment such as firewalls, intrusion detection systems, server proxies, and routers. In addition to explaining each type of defense, the limitations of each defense are pointed out. The limitations are important to understanding why malware outbreaks continue to be a major problem today and into the foreseeable future.

The third part of the tutorial gives an overview of some current research areas in improving defenses. The automation of defenses will be critical in the face of new worms that can be much faster than today’s manual, reactive defenses. Automated defenses will first depend on accurate detection of new outbreaks. New outbreaks must be detected before a virus/worm signature is available, so new behavior-based detection methods must be used. Unfortunately, behavior-based detection can result in a high number of false positives, so current research is seeking to improve the accuracy of behavior-based detection. After detection of a new outbreak, automated defenses will exercise some action to quarantine the worm. Examples proposed by Cisco and Microsoft will be described. Also, the use of tarpits and rate throttling to slow down outbreaks will be explained.
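
As one illustration of the quarantining ideas surveyed in this part, the sketch below models connection rate throttling in Python, in the spirit of published "virus throttling" work: outbound connections to recently contacted hosts pass immediately, while connections to new destinations are queued and released at a bounded rate, slowing a fast-scanning worm without noticeably affecting normal traffic. This is a simplified teaching model, not any vendor's implementation:

    from collections import deque

    class ConnectionThrottle:
        """Simplified rate throttle: recently contacted hosts are allowed
        immediately; new destinations are queued and released at a fixed
        rate per time tick."""

        def __init__(self, working_set_size=5, release_per_tick=1):
            self.working_set = deque(maxlen=working_set_size)
            self.delay_queue = deque()
            self.release_per_tick = release_per_tick

        def request(self, dest):
            if dest in self.working_set:
                return "allowed"
            self.delay_queue.append(dest)
            return "queued"

        def tick(self):
            """Called once per interval; releases a bounded number of
            queued destinations into the working set."""
            released = []
            for _ in range(min(self.release_per_tick, len(self.delay_queue))):
                dest = self.delay_queue.popleft()
                self.working_set.append(dest)
                released.append(dest)
            return released

    throttle = ConnectionThrottle()
    print(throttle.request("10.0.0.1"))  # queued: a new destination
    print(throttle.tick())               # one destination released this tick
    print(throttle.request("10.0.0.1"))  # allowed: now in the working set

A worm probing hundreds of new addresses per second fills the delay queue and is slowed to the release rate, while a user who revisits the same few servers rarely notices the throttle at all.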

Prerequisites: None specified.

High Level Outline

  1. Introduction
    Historical and recent cases
  2. Malware Basics
    Computer viruses, Worms, Trojan horses, Vulnerabilities and exploits, Replication mechanisms, Payloads, Congestion effects
  3. Current Defenses
    Antivirus software and patching, Firewalls, Intrusion detection systems, Router access control lists, Mail server proxies, Limitations of current defenses
  4. Early Detection and Warning
    Host-based detection, Honeypots and black holes, Network-based intrusion detection, Early warning systems
  5. Dynamic Quarantining
    Quarantine requirements, Cisco Network Admission Control, Microsoft Network Access Protection, Honeypots and tarpits, Rate throttling
  6. References and Further Reading

About the Instructor

Thomas M. Chen is an associate professor in the Department of Electrical Engineering at Southern Methodist University in Dallas, Texas. He received the BS and MS degrees in electrical engineering from the Massachusetts Institute of Technology in 1984, and the PhD in electrical engineering from the University of California, Berkeley, in 1990. He is currently the editor-in-chief of IEEE Communications Magazine, a senior technical editor for IEEE Network, a past associate editor for ACM Transactions on Internet Technology, and past founding editor of IEEE Communications Surveys. He served as the treasurer for the IEEE Technical Committee on Security and Privacy 2004-2005. He is a member of the Technical Advisory Board for the Voice over IP Security Alliance. Prior to joining SMU, he was a senior member of the technical staff at GTE Laboratories (now Verizon) working on ATM research. He is the co-author of ATM Switching Systems (Artech House, 1995). He was the recipient of the IEEE Communications Society’s Fred W. Ellersick best paper award in 1996. His research interests include network security, traffic modeling, network performance, and network management.

Tutorial M3

Live Forensics
Speakers: Dr. Frank Adelstein, ATC-NY and Dr. Golden Richard, University of New Orleans
Time: Monday 12/11/2006 Full-Day Tutorial

Traditional digital forensics focuses on analyzing a copy (or “image”) of a disk to extract information such as deleted files, file fragments, and web browsing history, and to build a timeline that provides a partial view of what happened on the computer. Live forensics, an emerging area in which information is gathered on running systems, offers some distinct advantages over traditional “dead” forensics, which focuses on disk images. Live forensics can provide information on the running state of the machine that cannot be gathered by static methods, such as running processes, memory dumps, open network connections, and unencrypted versions of encrypted files. This information can both serve as digital evidence and help direct or focus traditional analysis methods.

This tutorial covers the area of live forensics, including the types of information that can be gathered, how the evidence can be analyzed, and how it can work in conjunction with traditional methods, as well as satisfy forensic requirements. We will briefly review static disk analysis techniques, briefly cover network packet analysis, and then discuss gathering information on a live machine. The tutorial includes demonstrations. At the end, the students should understand what live state information is available on a computer, some of the different methods to gather the information, and the “best practices” that should be observed when performing a live analysis.
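
For a flavor of the kind of live-state information discussed, here is a hedged Python sketch using the third-party psutil library to snapshot running processes and open network connections. It is illustrative only, not one of the tutorial's tools, and as the tutorial's "best practices" discussion makes clear, any live collection inevitably perturbs the system being examined:

    import psutil  # third-party library: pip install psutil

    def live_snapshot():
        """Collect volatile state: running processes and open inet connections.
        Listing connections may require elevated privileges on some systems."""
        processes = [(p.info["pid"], p.info["name"])
                     for p in psutil.process_iter(["pid", "name"])]
        connections = [(c.laddr, c.raddr, c.status)
                       for c in psutil.net_connections(kind="inet")]
        return processes, connections

    procs, conns = live_snapshot()
    print("%d processes, %d inet connections" % (len(procs), len(conns)))
    for pid, name in procs[:5]:
        print(pid, name)

None of this state survives a power-off, which is precisely why live collection can recover evidence, such as open connections or unencrypted data in memory, that a disk image cannot.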

Prerequisites: None. This tutorial focuses on the emerging area of forensic analysis of live systems. The tutorial does not assume students have a background in forensics and will spend approximately 25% of the time reviewing the basic ideas of digital forensics. The rest of the course will focus on gathering and analyzing live data (network and host based forensics). Those familiar with “traditional” or static forensic analysis but who are interested in live forensics should also benefit from the course. This course will not cover legal issues.

High Level Outline

  1. Introduction
  2. Traditional Forensics Background (1.5 hours)
  3. Network Analysis (1.0 hours)
  4. Live Forensics (2.5 hours)
  5. Big Demo/Scenario of putting it all together (0.5 hours)
  6. Summary/Wrap up

About the Instructors

Dr. Frank Adelstein is the technical director of computer security at ATC-NY in Ithaca, NY. He is the principal designer of a live forensic investigation product (marketed as Online Digital Forensic Suite™ and LiveWire Investigator™) and has worked in the area of live investigation for the last 5 years. He has also been the principal investigator on numerous research and development projects including security, wireless networking, intrusion detection, and training.

Professor Golden G. Richard III is an Associate Professor at the University of New Orleans, where he developed the Information Assurance curriculum and coordinated the effort to have the University of New Orleans certified by the National Science Foundation as a Center of Academic Excellence. He teaches courses in digital forensics, computer security, and operating systems internals. He is also a co-founder of Digital Forensic Solutions, LLC and is the author of the digital forensics tool “Scalpel.”

Richard and Adelstein are the chair and vice-chair of the Digital Forensic Research Workshop, the premier workshop on research advances in the area of digital forensics. They have co-authored the book “Fundamentals of Mobile and Pervasive Computing” (from McGraw-Hill).

Tutorial M4

Security Engineering
Speaker: Dr. Steven J. Greenwald, Independent Consultant
Time: Monday 12/11/2006 Full-Day Tutorial

Based on Ross Anderson’s carefully researched and eminently practical book Security Engineering: A Guide to Building Dependable Distributed Systems, this tutorial will cover how to make distributed systems more secure with the help of both technological mechanisms and management strategies. It will cover the entire field of computer security, although it is, of course, severely limited by the one-day format.

Real-world examples of how information systems have been defeated will be covered, as well as the roles of technology, policy, psychology, and legal issues. Practical examples such as the security of ATMs, multi-level security, information warfare, hardware security, e-commerce, intellectual property protection, biometrics, and tamper resistance will be covered. Each section will examine what goes wrong.

Prerequisites: None.

High Level Outline

  1. A Quick Overview of Security Engineering Basics (1 hour).
  2. Conventional Computer Security Issues (1 hour).
  3. Hardware Engineering Aspects of Information Security (1 hour).
  4. Attacks on Networks (1 hour).
  5. Electronic Commerce (1 hour).
  6. Policy, Management, and Assurance (1 hour).
  7. Conclusions and General Q&A (½ hour).

About the Instructor

Dr. Steven J. Greenwald is an Independent Consultant in the field of Information Systems Security specializing in distributed security, formal methods, security policy modeling, and related areas. He also works with organizational security policy consulting, evaluation, training, and auditing.

Dr. Greenwald is also a Research Fellow of Virginia’s Commonwealth Information Security Center (CISC) and an adjunct professor at James Madison University (an NSA Designated Center of Academic Excellence in Information Security Assurance) where he teaches several graduate courses for their M.S. degree in Computer Science concentrating in INFOSEC.

Dr. Greenwald served as the 2001 General Chair of the New Security Paradigms Workshop (NSPW), has been past Program Chair for NSPW, and also serves on the program committees of other conferences. He is a member of the Association for Computing Machinery and the IEEE Computer Society. More information about him, including his publications, can be found at his web site at http://www.gate.net/~sjg6.

Tutorial F5

Next-Generation Wireless Risks & Defenses
Speaker: Mr. Richard Rushing, AirDefense
Time: Friday 12/15/2006 Full-Day Tutorial

As wireless networks continue to grow, the ever-present danger of new, more sophisticated hacking tools is also on the upswing. Simultaneously, the knowledge needed to use these tools to hack into wireless networks is decreasing. Hackers, armed with new tools such as live CDs and a host of others, are launching increasingly sophisticated attacks on networks and clients. Networks that a year ago were said to be unbreakable can now be broken in less than 10 minutes. The risks posed by wireless clients and the hostile environments of hotspots are making it more difficult for enterprise networks to stay secure.

This session will look at the current and future generation of wireless attack tools. These attack tools, freely available on the Internet, can damage, destroy, and infiltrate most wireless networks. This session will enable network and security administrators to build defenses and strategies against attacks, wireless network breaches, and infections.

Topics covered at-a-glance:

  • BlueXing and New Bluetooth Attacks
  • WEP Attacks and Injection
  • LeapCrack and Brute-Force Tools: Old Dictionary Attacks Get New Life
  • PEAP Exploits and How to Prevent MITM Issues with Clients
  • Probing Stations and How to Prevent Them from Becoming Snarfed
  • Protecting the Network from DoS Attacks
  • Layer 3 Attacks: From ARP Spoofing to What Is Possible from the Air
  • 802.1x Exploits and Vulnerabilities

Prerequisites: None.

High Level Outline

  1. Introduction
  2. Wireless Networking
  3. Wireless Vulnerabilities and Tools
  4. Breaking Wireless
  5. Stealth Wireless Attacks
  6. Other Attack Vectors
  7. Layered Approach to Security
  8. Recommended Wireless Security Strategy
  9. Questions and Answers

About the Instructor

Richard Rushing is a recognized IT security expert with 20 years of experience as a system analyst, engineer, consultant and architect. Richard has set security standards and policies for entire organizations and taught workshops on IDS, Security Protocols, and Network Security. Richard was most recently Chief Technical Officer of VeriSign's Network Security Services division, where he identified and developed products and services to maintain VeriSign's focus on leading-edge security solutions. A much-in-demand speaker on information security, Richard has presented at leading security conferences, including Networld+Interop, RSA, Computer Security Institute, SANS Security Conferences, InfoSec and CyberTerrorism.

Tutorial F6

Using the Certification and Accreditation Process to Manage Enterprise Risk
Speakers: Dr. Ron Ross, National Institute of Standards and Technology, and Dr. Julie Mehan, Hatha Systems
Time: Friday 12/15/2006 Full-Day Tutorial

This tutorial provides an in-depth look at the process of certification and accreditation of information systems as a critical activity in managing enterprise risk. The fundamental concepts of security certification and accreditation as described in NIST Special Publication 800-37 will be discussed in the context of an integrated risk management framework. The integrated risk framework includes the principal components of an enterprise information security program: categorizing the information system according to system criticality/sensitivity, selecting appropriate security controls (i.e., safeguards and countermeasures) for the system, determining security control effectiveness, and determining residual risk to the enterprise's mission or business case. Each phase of the NIST four-phase certification and accreditation process will be described, as well as the roles and responsibilities of individuals within the enterprise participating in the process. The tutorial also examines the U.S. Department of Defense approach to certification and accreditation. Upon completion of the tutorial, attendees will have a fundamental understanding of the major components of an information security program, how the certification and accreditation process fits into the program, and what types of information are required by authorizing officials to make credible risk-based decisions on whether to place information systems into operation or continue their operation.

Prerequisites: None.

High Level Outline

  1. Introduction
  2. The Fundamentals
  3. The NIST 800-37 C&A Process
  4. The Department of Defense C&A Process
  5. Summary

About the Instructor

Dr. Ron Ross is a senior computer scientist and information security researcher at the National Institute of Standards and Technology (NIST). His areas of specialization include security requirements definition, security testing and evaluation, and information assurance. Dr. Ross currently leads the Federal Information Security Management Act (FISMA) Implementation Project for NIST, which includes the development of key security standards and guidelines for the federal government, contractors supporting the federal government, and the critical information infrastructure. His recent publications include Federal Information Processing Standards (FIPS) Publication 199 (the security categorization standard), FIPS Publication 200 (the minimum security requirements standard), NIST Special Publication 800-53 (the security controls guideline), NIST Special Publication 800-53A (the security assessment guideline), and NIST Special Publication 800-37 (the system certification and accreditation guideline). Dr. Ross is also the principal architect of the risk management framework and nine-step process that integrates the suite of NIST security standards and guidelines into a comprehensive enterprise-wide information security program. Dr. Ross is a frequent speaker at public and private sector venues including federal agencies, state and local governments, and Fortune 500 companies. In addition to his responsibilities at NIST, Dr. Ross supports the U.S. State Department in the international outreach program for information security and critical infrastructure protection.

Dr. Ross previously served as the Director of the National Information Assurance Partnership, a joint activity of NIST and the National Security Agency. A graduate of the United States Military Academy at West Point, Dr. Ross served in a variety of leadership and technical positions during his twenty-year career in the United States Army. While assigned to the National Security Agency, he received the Scientific Achievement Award for his work on an inter-agency national security project and was awarded the Defense Superior Service Medal upon his departure from the agency. Dr. Ross is a two-time recipient of the Federal 100 award for his leadership and technical contributions to critical information security projects affecting the federal government. During his military career, Dr. Ross served as a White House aide and as a senior technical advisor to the Department of the Army. Dr. Ross is a graduate of the Program Management School at the Defense Systems Management College and holds both Masters and Ph.D. degrees in Computer Science from the United States Naval Postgraduate School.

Tutorial F7

Acquisition and Analysis of Large Scale Network Data V.3
Speaker: Dr. John McHugh, Dalhousie University
Time: Friday 12/15/2006 Full-Day Tutorial

Detecting malicious activity in network traffic is greatly complicated by the large amounts of noise, junk, and other questionable traffic that can serve as cover for these activities. With the advent of low cost mass storage devices and inexpensive computer memory, it has become possible to collect and analyze large amounts of network data covering periods of weeks, months, or even years. This tutorial will present techniques for collecting and analyzing such data, both from network flow data that can be obtained from many routers or derived from packet header data and directly from packet data such as that collected by TCPDump, Ethereal, and Network Observer. This version of the course will contain examples from publicly available packet data such as the Dartmouth Crawdad wireless data repository and will deal with issues such as the acquisition of data in IP-unstable environments such as those involving DHCP.

Because of the quantity of the data involved, we develop techniques, based on filtering of the recorded data stream, for identifying groups of source or destination addresses of interest and extracting the raw data associated with them. The address groups can be represented as sets or multisets (bags) and used to refine the analysis. For example, the set of addresses within a local network that appear as source addresses for outgoing traffic in a given time interval approximates the currently active population of the local network. These can be used to partition incoming traffic into that which might be legitimate and that which is probably not since it is not addressed to active systems. Further analysis of the questionable traffic develops smaller partitions that can be identified as scanners, DDoS backscatter, etc. based on flag combinations and packet statistics. Traffic to and from hosts whose sources appear in both partitions can be examined for evidence that its destinations in the active set have been compromised. The analysis can also be used to characterize normal traffic for a customer network and to serve as a basis for identifying anomalous traffic that may warrant further examination.
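
The active-set partitioning idea can be sketched in a few lines of Python (an illustration under assumed flow records reduced to (source, destination) tuples; the tutorial itself works with flow tools such as the SiLK suite rather than code like this):

    from ipaddress import ip_address, ip_network

    LOCAL_NET = ip_network("192.0.2.0/24")  # example documentation prefix

    def active_hosts(outgoing_flows):
        """Local addresses seen as sources of outbound traffic in the interval."""
        return {src for (src, dst) in outgoing_flows
                if ip_address(src) in LOCAL_NET}

    def partition_incoming(incoming_flows, active):
        """Split inbound flows by whether they target a currently active host."""
        maybe_legit = [f for f in incoming_flows if f[1] in active]
        questionable = [f for f in incoming_flows if f[1] not in active]
        return maybe_legit, questionable

    outgoing = [("192.0.2.10", "198.51.100.1"), ("192.0.2.20", "203.0.113.7")]
    incoming = [("203.0.113.9", "192.0.2.10"),   # aimed at an active host
                ("198.51.100.5", "192.0.2.99")]  # inactive target: likely scan or backscatter
    legit, questionable = partition_incoming(incoming, active_hosts(outgoing))
    print("possibly legitimate:", legit)
    print("questionable:", questionable)

The questionable partition can then be subdivided further, for example by flag combinations and packet statistics, into scanners, DDoS backscatter, and so on, as the description above outlines.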

Prerequisites: General familiarity with IP network protocols. Elementary familiarity with simple statistical measures.

High Level Outline

  1. Introduction (45 Minutes)
  2. Data Collection (45 Minutes)
  3. The SiLKtools Analysis Suite (90 Minutes)
  4. Advanced Analysis (90 Minutes)
  5. Case studies (60 Minutes)
  6. General Questions and Discussion (30 minutes)

About the Instructor

Dr. John McHugh holds the Canada Research Chair in Privacy and Security at Dalhousie University in Halifax, NS where he leads the Privacy and Security Laboratory. Prior to joining Dalhousie, he was a senior member of the technical staff with the CERT Situational Awareness Team, where he did research in survivability, network security, and intrusion detection. Recently, he has been involved in the analysis of large scale network flow data. He was a professor and former chairman of the Computer Science Department at Portland State University in Portland, Oregon. His research interests include computer security, software engineering, and programming languages. He has previously taught at The University of North Carolina and at Duke University. He was the architect of the Gypsy code optimizer and the Gypsy Covert Channel Analysis tool. Dr. McHugh received his PhD degree in computer science from the University of Texas at Austin. He has a MS degree in computer science from the University of Maryland, and a BS degree in physics from Duke University.



Awards and Opportunities

Best Paper Award

An annual prize for the Outstanding Paper is based on both the written paper and the oral presentation at the conference. A plaque and honorarium will be presented to the winning author.

To determine the Outstanding Paper, a set of best paper candidates is selected based on the recommendation of the Program Committee. A subcommittee then attends the presentation of each candidate paper and meets to select the winning paper. If the timing of paper presentations permits, the award is announced at the next available opportunity during the conference.

Best Student Paper Award

The winner of the student paper award is selected by the Student Awards Committee in consultation with ACSA. The winning paper may have multiple authors but the primary content of the paper must have been developed by students; students must provide written confirmation to the Student Awards Chair that they meet this policy. A student is defined as anyone who has a current course load of at least 9 credit hours or equivalent as explained by the student or who is enrolled in a degree-granting program and is not employed in a professional capacity outside of the university more than 20 hours per week.

ACSA Conferenceship Program

ACSA offers a Conferenceship Program for selected students who need assistance to attend the Annual Computer Security Applications Conference. Conferenceships can be requested by any student and will be awarded to students who will get the most from ACSAC; priority is given to students who are authors of papers and who have a co-author also attending ACSAC. The conferenceship covers registration, 4 nights at the conference hotel, and $400 (North American student) or $700 (international student) toward airline tickets and other expenses; a copy of the airline ticket receipt is required.

To be considered for the Conferenceship Program, please submit the following information to the Student Awards Chair at student_chair@acsac.org: your name, your address, and the name of the institution at which you are a student; a list of applicable course work that you have completed or are currently enrolled in; your current grade point average; a short narrative discussing why you are interested in the security field, relevant areas of interest, and the type of career you plan on pursuing; and two letters of recommendation from faculty. This material is typically due by October 1.

Works in Progress (WiP) Session

The Works In Progress (WiP) Session is intended as a forum to introduce new ideas, report on ongoing work that may or may not be complete, and state positions on controversial issues or open problems. Submissions may be given to the WiP Chair at wip_chair@acsac.org or the Program Chair at program_chair@acsac.org.



Conference Committee

  • Dan Thomsen, Cyber Defense Agency, LLC (Conference Chair)
  • Christoph Schuba, Linköpings University (Program Chair)

  • Rafae Bhatti, IBM Almaden Research Center (Site Arrangements Co-Chair)
  • Laura Corriss, Barry University (Site Arrangements)
  • Dan Faigin, The Aerospace Corporation (Tutorials Chair)
  • Arthur Friedman, OASD(NII)/DoD CIO (Registration)
  • Carrie Gates, CA Labs (Publicity)
  • Tom Haigh, Cyber Defense Agency, LLC (Guest Speaker Liaison)
  • Noel Hardy, Aspect Security (Recording Secretary)
  • Tracy Hawkins, NSA (Registration)
  • Paul Jardetzky, Network Appliance, Inc. (Panel Chair)
  • Jay Kahn, The MITRE Corporation (Distribution Chair)
  • Charles Payne, Adventium Labs (Program Co-Chair)
  • Steven Rome, Booz Allen Hamilton, Inc. (Case Studies)
  • Ron Ross, NIST (Case Studies)
  • Harvey H. Rubinovitz, The MITRE Corporation (Workshop Chair)
  • Pierangela Samarati, Università degli Studi di Milano (Program Co-Chair)
  • Andre dos Santos, University of Puerto Rico at Mayagüez (Student Awards Chair)
  • Ed Schneider, Institute for Defense Analyses (Treasurer)
  • Linda Schliper, Trusted Computer Solutions (Mailing Lists)
  • Cristina Serban, AT&T (Works in Progress Chair)
  • Rick Smith, University of St. Thomas, Minnesota (Publicity)
  • Robert H'obbes' Zakon, Zakon Group LLC (Web Advisor)



Program Committee

  • Christoph Schuba (PC Chair), Linköpings University
  • Charles Payne (PC Co-Chair), Adventium Labs
  • Pierangela Samarati (PC Co-Chair), Università degli Studi di Milano

  • Tuomas Aura, Microsoft Research, UK
  • Lujo Bauer, Carnegie Mellon University
  • Terry Benzel, USC - ISI
  • Konstantin Beznosov, University of British Columbia
  • Rafae Bhatti, Florida International University
  • Marc Dacier, Eurecom Institute
  • Carrie Gates, CA Labs
  • Dieter Gollmann, Hamburg University of Technology
  • Wesley Higaki, Symantec Corporation
  • Cynthia Irvine, Naval Postgraduate School
  • Paul Jardetzky, Network Appliance, Inc.
  • Jan Jürjens, TU München
  • Myong Kang, Naval Research Laboratory
  • James Kempf, DoCoMo Labs USA
  • Angelos Keromytis, Columbia University
  • Carl Landwehr, University of Maryland
  • Peng Liu, Pennsylvania State University
  • Javier Lopez, University of Malaga
  • Bryan Lyles, Telcordia Technologies
  • Patrick McDaniel, Pennsylvania State University
  • John McDermott, Naval Research Laboratory
  • Joon Park, Syracuse University
  • Anoop Singhal, National Institute of Standards and Technology
  • Andre Dos Santos, University of Puerto Rico at Mayagüez
  • Giovanni Vigna, University of California Santa Barbara
  • Simon Wiseman, QinetiQ
  • Diego Zamboni, IBM Zürich Research Laboratory



ACSAC Steering Committee

  • Marshall Abrams, The MITRE Corporation
  • Jeremy Epstein, webMethods, Inc.
  • Daniel Faigin, The Aerospace Corporation
  • Steve Rome, Booz Allen Hamilton
  • Ron Ross, National Institute of Standards and Technology
  • Ravi Sandhu, George Mason University
  • Christoph Schuba, Linköpings University
  • Ann Marmor-Squires, The Sq Group
  • Dan Thomsen, Cyber Defense Agency, LLC



ACSA

  • Marshall Abrams, The MITRE Corporation (ACSA Chair & Treasurer)
  • Jeremy Epstein, webMethods, Inc. (ACSA Vice President)
  • Daniel Faigin, The Aerospace Corporation (ACSA Secretary)
  • Steven Greenwald, Independent Consultant
  • Steve Rome, Booz Allen Hamilton (ACSA President)
  • Harvey Rubinovitz, The MITRE Corporation (ACSA Assistant Treasurer)
  • Ann Marmor-Squires, The Sq Group (Chair Emerita)
  • Mary Ellen Zurko, IBM Corporation



About the Sponsor

acsalogonew-lite.gif

ACSA had its genesis in the first Aerospace Computer Security Applications Conference in 1985. That conference was a success and evolved into the Annual Computer Security Applications Conference (ACSAC). ACSA was incorporated in 1987 as a non-profit association of computer security professionals who have a common goal of improving the understanding, theory, and practice of computer security. ACSA continues to be the primary sponsor of the annual conference.

In 1989, ACSA began the Distinguished Practitioner Series at the annual conference. Each year, an outstanding computer security professional is invited to present a lecture of current topical interest to the security community.

In 1991, ACSAC began the Best Paper by a Student Award, presented at the annual conference. This award is intended to encourage active student participation in the conference. The award-winning student author receives an honorarium and all conference expenses. Additionally, our Student Conferenceship program assists selected students in attending the Conference by paying for the conference fee and tutorial expenses. Applicants must be undergraduate or graduate students, nominated by a faculty member at an accredited university or school, and show the need for financial assistance to attend this conference.

An annual prize for the Outstanding Paper has been established for the Annual Computer Security Applications Conference. The winning author receives a plaque and an honorarium. The award is based on both the written and oral presentations.

ACSA initiated the Marshall D. Abrams Invited Essay in 2000 to stimulate development of provocative and stimulating reading material for students of Information Security, thereby forming a set of Invited Essays. Each year's Invited Essay addresses an important topic in Information Security not adequately covered by the existing literature.

This year’s ACSAC continues the Classic Papers feature begun in 2001. The classic papers are updates of some of the seminal works in the field of Information Security that reflect developments in the research community and industry since their original publication. ACSA continues to be committed to serving the security community by finding additional approaches for encouraging and facilitating dialogue and technical interchange. In the past, ACSA has sponsored small workshops to explore various topics in Computer Security (in 2000, the Workshop on Innovations in Strong Access Control; in 2001, the Workshop on Information Security System Rating and Ranking; in 2002, the Workshop on Application of Engineering Principles to System Security Design). In 2003, ACSA became the sponsor of the already established New Security Paradigms Workshop (NSPW). ACSA also maintains a Classic Papers Bookshelf that preserves seminal works in the field and a web site focusing on Strong Access Control/Multi-Level Security (http://www.sac-tac.org).

For more information on ACSA and its activities, please visit www.acsac.org/acsa/. ACSA is always interested in suggestions from interested professionals and computer security professional organizations on other ways to achieve its objectives of encouraging and facilitating dialogue and technical interchange.

To learn more about the conference, visit the ACSAC web page at http://www.acsac.org

To be added to the Conference Mailing List, visit us on the World Wide Web at http://www.acsac.org/list/

For questions, contact the committee members using the following E-mail addresses:


ACSAC wishes to thank ATSEC for being an ACSAC Platinum sponsor.

atsec-logo-rgb-tiff.gif
