Research Projects 2007 (by faculty)

The funded projects listed below were active during the 2007 calendar year; the funded running total for that year appears in the left navigation menu.

SoD: Collaborative Research: Transparency and Legal Compliance in Software Systems
Annie Anton

$270,407 by the National Science Foundation
08/01/2007 - 07/31/2010

Healthcare information systems are becoming ubiquitous and thus increasingly subject to attack, misuse and abuse. Mechanisms are needed to help analysts disambiguate regulations so that they may be clearly specified as software requirements. In addition, organizations are increasingly required to comply with the law and account for their actions. We propose a requirements management framework that enables executives, business managers, software developers and auditors to distribute legal obligations across business units and/or personnel with different roles and technical capabilities.

The Design and Use of Digital Identities
Annie Anton ; Julia Earp

$103,006 by Purdue University
09/15/2004 - 08/31/2007

This project will address a wide variety of digital identity needs by developing required Flexible, Multiple and Dependable Digital Identity (FMDDI) technology, based on a sound underlying set of definitions and principles. We will apply, expand, and refine the theory of identities, the mechanisms to embody and enforce them, as well as study the implications of their use, and develop appropriate educational vehicles to teach people how digital identities should be used effectively.

ITR: Encoding Rights, Permissions, and Obligations: Privacy Policy Specification and Compliance
Annie Anton ; Julia Earp ; Lynda Aiman-Smith ; David Baumer

$932,000 by the National Science Foundation
09/15/2003 - 02/28/2010

This research focuses on how society uses, values, and protects citizens' personal information. From the perspective of system design, software engineers need methods and tools to enable them to design systems that reflect those values and protect personal information accordingly. This research examines how privacy considerations and value systems influence the design, deployment and consequences of IT. The goal is to develop concepts, tools and techniques that help IT professionals and policy makers bring policies and system requirements into better alignment. An action-oriented set of conceptual tools, including guidelines and privacy-relevant policy templates, will be constructed and validated.

Collaborative Research: A Comprehensive Policy-Driven Framework for Online Privacy Protection: Integrating IT, Human, Legal and Economic Perspectives
Annie Anton ; Ting Yu ; David Baumer ; Michael Rappa

$534,000 by the National Science Foundation
09/15/2004 - 08/31/2010

Privacy is increasingly a major concern that prevents the exploitation of the Internet's full potential. Consumers are concerned about the trustworthiness of the websites to which they entrust their sensitive information. Although significant industry efforts are seeking to better protect sensitive information online, existing solutions are still fragmented and far from satisfactory. Specifically, existing languages for specifying privacy policies lack a formal and unambiguous semantics, are limited in expressive power and lack enforcement as well as auditing support. Moreover, existing privacy management tools aimed at increasing end-users' control over their privacy are limited in capability or difficult to use.

Triangle Computer Science Distinguished Lecturer Series
Franc Brglez

$43,320 by Army Research Office
09/01/2007 - 08/31/2010

Since 1995, the Triangle Computer Science Distinguished Lecturer Series (TCSDLS) has been hosting influential university researchers and industry leaders from computer-related fields as speakers at the three universities within the Research Triangle Area. The lecturer series, sponsored by the Army Research Office (ARO), is organized and administered by the Computer Science departments at Duke University, NC State University, and the University of North Carolina at Chapel Hill. This proposal argues for continuation, for an additional 3 years, of this highly successful lecturer series.

CAREER: Adaptive Automated Design of Stored Derived Data
Rada Chirkova

$489,810 by the National Science Foundation
08/01/2005 - 07/31/2011

The goal of this project is to develop an extensible framework for designing and using derived data in answering database queries efficiently. The outcomes of the project are expected to be general and independent of a specific data model (e.g., relational or XML), while giving guarantees with respect to query-performance improvement. The approach consists of developing and evaluating mathematical models and algorithms for common types of queries on relational and XML data. Expected outcomes of the project include automated tuning of data-access characteristics in a variety of applications, thus enhancing the quality of user interactions with data-intensive systems.

Integration and Interoperability of XML Data
Rada Chirkova

$80,000 by Center for Advanced Computing and Communication (CACC)
07/01/2007 - 06/30/2010

This research proposal builds upon recent advances in the Semantic Web and information integration. The objective is to produce the formalism and lay the groundwork for a broad spectrum of information-integration tasks, facilitating rapid development of efficient and effective integration systems. We concentrate on information integration from XML sources. A user query is translated and executed on XML data, which is either the native model of the source or is obtained using wrappers that represent a source's data in XML. We consider, in particular, the XML pipeline approach as the preferred architecture for the implementation of an extensible system for integration/mashup.

Efficient View-Design Algorithms To Achieve Near-Optimal Performance Of Sets of Relational Queries
Rada Chirkova

$253,180 by the National Science Foundation
09/15/2003 - 08/31/2007

The goal of this proposal is to develop effective methods to improve the performance of frequent, important queries on large databases. This problem is important for improving the efficiency of user interactions with relational data-management systems; solving it will have impact on query optimization, data warehousing, and information integration. The project focuses on evaluating queries using auxiliary relations, or views. Our main objective is to develop theoretically rigorous yet practically applicable techniques for designing views that help in computing sets of frequent and important queries with optimal or near-optimal speedup on large databases with given statistics.

Runtime/Operating System Synergy to Exploit Simultaneous Multithreading
Vincent Freeh ; Frank Mueller

$380,000 by the National Science Foundation
08/01/2004 - 07/31/2008

This proposal focuses on a synergistic approach combining runtime and operating system support to fully exploit the capabilities of SMTs. To meet this objective, it studies three different approaches. First, it investigates the benefits of using a helper thread alongside the primary thread, by building a reference implementation of an SMT-aware Message Passing Interface library. Second, it investigates the benefits of dynamic mode switching between single-threaded and multi-threaded configurations. Third, it modifies the operating system to create an SMT-aware scheduler. The benefits are demonstrated for a variety of applications, including large-scale benchmarks and other nationally relevant parallel codes.

NOAA Interdisciplinary Scientific Environmental Technology(ISET) Cooperative Research and Education Center
Vince Freeh (Co-PI) ; Fredrick Semazzi (PI) ; Lian Xie ; Jingpu Liu

$978,528 by NC A & T State University via the National Oceanic & Atmospheric Administration
09/01/2006 - 08/31/2011

NOAA awarded $12.5 million to fund the Interdisciplinary Scientific Environmental Technology (ISET) Cooperative Research and Education Center. NC A&T State University is the lead institution. The team includes a diverse network of scientists and engineers from A&T, NC State University, University of Minnesota, University of North Carolina at Pembroke, City University of New York, University of Alaska Southeast, California State University-Fresno, and Fisk University, as well as industrial, state and federal government partners. NC State University is the lead university for the research thrust on the analysis of global observing systems, which includes numerical and physical research and analysis of hurricanes.

CAREER: New Directions in Managing Structured Peer-to-Peer Networks
Khaled Harfoush

$408,894 by the National Science Foundation
03/15/2004 - 08/31/2010

In the research component of my career development program, I focus on strategies for addressing the challenges and opportunities that face the deployment of structured P2P systems. In particular, I introduce new schemes to locate resources and strategies to serve them. I also introduce new schemes for topology inference, integration, and organization in order to optimize content distribution. The proposed educational aspect of my career development program focuses on (1) enhancing our department's networking curriculum, (2) extending opportunities for women, under-represented minorities, and undergraduates in research, and (3) encouraging students to participate in the computer science community outside the university.

CAREER: Assisted Navigation in Large Visualization Spaces
Christopher Healey

$370,403 by the National Science Foundation (ACIR/ACR)
02/01/2001 - 01/31/2008

This project will investigate methods for navigating complex information spaces. Work will focus on a system designed to help viewers visualize, explore, and analyze large, multidimensional datasets. Detailed local displays will be combined with a high-level global overview of areas of interest within a dataset. Local views will use perceptual cues to harness the low-level human visual system. Global overviews will identify and cluster elements of interest to produce an underlying graph that: (1) supports efficient navigation via graph traversal, and (2) provides an effective visualization of the areas of interest and their relationships to one another.

Visualizing Network Data and Environments
Christopher Healey

$40,000 by CACC
01/01/2006 - 05/31/2007

This proposal describes a one-year research project to apply techniques from scientific visualization to the problem of displaying, monitoring, and analyzing network-based data.

A Bioinformatics Computing Cluster for NC State University
Steffen Heber

$227,029 by North Carolina Biotechnology Center
02/01/2007 - 01/31/2008

The Bioinformatics Research Center (BRC) at NC State University is one of the world's premier centers for education and research in bioinformatics. Funding will provide a Linux cluster of 54 dual-Xeon compute nodes to enhance the computational resources of the BRC. BRC will partner with the NC State Information Technology Division (ITD) to leverage the proposed cluster investment in two ways: first, BRC faculty will gain access to the high-performance computing (HPC) resources - currently more than 400 blade processors. Second, with a shared computing resource housed and administered by ITD, personal systems will not need to be purchased or maintained.

Forensic Analysis of Medical Devices
SP Iyer

$20,000 by the National Science Foundation
08/01/2006 - 07/31/2008

Automation in medical devices has led to the use of software as an integral part of these devices. Given the safety-critical nature of these devices, it is important to ensure that such software is free of defects. However, in the event of an accident it becomes important to determine the cause of the error -- an activity termed forensic analysis. We propose to work on formal-methods-based tools that can be used to understand and identify the source of an error. This project will be carried out at NCSU and at the FDA.

Request for Support for the International Conference on Information and Communications Security (ICICS 2006)
Dennis Kekas

$5,000 by National Science Foundation
09/15/2006 - 08/31/2007

The International Conference on Information and Communications Security will be held in December 2006 in the Research Triangle of North Carolina. This is a well-established security conference being held for the first time in North America. Support from NSF is sought to broaden participation, particularly of students, to provide researchers improved access to the latest research results, and to promote the development and dissemination of solutions to some of the nation's pressing security needs in the computing and communications areas.

Workshop on STEM Education K-12
Dennis Kekas ; Glenn Kleiman

$49,630 by National Science Foundation (NSF)
07/15/2007 - 06/30/2010

The NCSU Center for Advanced Computing and Communication (CACC) plans to host a workshop July 31 - August 1, 2007, at the William and Ida Friday Institute for Educational Innovation on NCSU's Centennial Campus in Raleigh, NC. The goal of the workshop is to bring together select individuals from industry, government, and academia to develop a national series of workshops addressing the problem of stimulating K-12 interest in STEM.

NSF Partnership in the Center for Advanced Computing and Communication
Dennis Kekas ; Mladen Vouk

$492,240 by CACC-NSF
09/15/1999 - 08/31/2007

The Center for Advanced Computing and Communication (CACC) is a membership-based industry/university cooperative research center co-located at North Carolina State University and Duke University. North Carolina State University was selected by the National Science Foundation in 1981 as a site for an industry/university cooperative research center in communications and signal processing. The center was named the Center for Communications and Signal Processing until 1994 when a second center site at Duke University was added. The CACC research goal is to create concepts, methods and tools for use in the analysis, design and implementation of advanced computer and communication systems.

Bayesian Pedagogical Agents for Dynamic High-Performance Inquiry-Based Science Learning Environments
James Lester ; Hiller Spires ; John Nietfeld

$605,436 by the National Science Foundation
01/01/2007 - 12/31/2009

Pedagogical agents are embodied software agents that have emerged as a promising vehicle for promoting effective learning. The proposed work has two complementary technology and learning thrusts. First, it will develop a full suite of Bayesian pedagogical agent technologies that leverage probabilistic models of inference to systematically reason about the multitude of factors that bear on tutorial decision making in dynamic high-performance inquiry-based science learning environments. Second, it will provide a comprehensive account of the cognitive processes and results of interacting with Bayesian pedagogical agents in inquiry-based science learning by conducting extensive empirical studies of learning processes and outcomes.

CAREER: Transparent, Interactive Desktop Parallel Computing for Scientific Data Processing
Xiaosong Ma

$400,000 by the National Science Foundation
03/01/2006 - 12/31/2012

While individual workstations in scientific research environments have become more powerful, they cannot meet the needs of today's interactive data processing tasks. Meanwhile, idle desktop resources are not efficiently utilized. This project aims at harnessing the collective idle resources within institutional boundaries to speed up computation- or data-intensive tasks routinely executed on desktop machines. We will build a novel desktop parallel computing framework, which will integrate distributed computing and storage resources to create an execution platform similar to that provided by a parallel computer, while maintaining the comfort and responsiveness of desktop sequential computing and the autonomy of resource donors.

Joint Faculty Appointment
Xiaosong Ma

$549,457 by UT-Battelle, LLC
09/21/2003 - 08/15/2012

Xiaosong Ma's joint work with NCSU and Oak Ridge National Laboratory (ORNL) will bridge the gap between the two organizations in a practical manner, cooperatively researching parallel I/O in conjunction with the Genomes to Life (GTL) and Scientific Data Management projects within the Computer Science and Mathematics Division at ORNL.

Runtime Data Management for Data-Intensive Scientific Applications
Xiaosong Ma

$299,992 by the US Department of Energy
08/15/2005 - 08/14/2008

Many applications used on a daily basis by scientists fail to take advantage of state-of-the-art computer systems. This problem is more severe for data-intensive applications, such as bioinformatics and visualization codes, whose parallelization is more recent and less studied in parallel architecture design compared to traditional simulations. We propose to address these problems by investigating efficient runtime data management for data-intensive applications. We plan to build novel technologies for generic, automatic parallel execution-plan optimization and to enhance parallel scientific data libraries by hiding I/O costs.

Transparent Data Recovery for Parallel File Systems
Xiaosong Ma

$31,846 by Oak Ridge National Laboratories - UT-Battelle LLC
02/15/2007 - 09/30/2007

Dr. Xiaosong Ma and her students will attack the problem of transparent data recovery to improve the reliability of large parallel file systems. With the proposed technique, job input data will be automatically staged into a supercomputer, while a modified system performs just-in-time patching to ensure the staged data are available when the corresponding job is scheduled.

Collaborative Research: Application-Adaptive I/O Stack For Data-Intensive Scientific Computing
Xiaosong Ma ; Vincent Freeh ; John Blondin

$266,002 by the National Science Foundation
09/15/2006 - 08/31/2010

In this proposal, we address the I/O stack performance problem with adaptive optimizations at multiple layers of the HEC I/O stack (from high-level scientific data libraries to secondary storage devices and archiving systems), and propose effective communication schemes to integrate such optimizations across layers. Our proposed PATIO (Parallel AdapTive I/O) framework will coordinate storage resources ranging from processors to tape archiving systems.

Comparative and Web-Enabled Virtual Screening
Xiaosong Ma (Co-PI) ; Jacqueline Hughes-Oliver (PI) ; Moody Chu ; Gary Howell; Morteza Khaledi

$1,111,110 by the National Institutes of Health
09/23/2005 - 07/31/2008

The long-term objective of this project is to develop computational algorithms and software to gain theoretical and empirical insights in the use of chemical diversity for determining quantitative structure-activity relationships (QSARs). In addition to addressing scientific and technical goals with respect to QSAR modeling, planning-period tasks will include specific activities to bring together the researchers and to facilitate inter-disciplinary communication. Specific Aim 1 is to develop and enhance collaborations between three broad disciplines: statistics, computer science, and chemistry.

CSR--EHS: Collaborative Research: Hybrid Timing Analysis via Multi-Mode Execution
Frank Mueller

$140,000 by the National Science Foundation
08/01/2007 - 07/31/2010

Real-time embedded systems require known bounds on the worst-case execution time (WCET) of tasks. Static timing analysis provides such bounds, yet cannot keep pace with architectural innovations and hardware performance variation due to chip fabrication scaling. Instead of simulating execution, this work promotes actual execution in hardware to bound WCETs. This renders tedious hardware modeling unnecessary while guaranteeing correct behavior regardless of complexity or variation of hardware. The approach will be evaluated by FPGA synthesis to assess its feasibility and to validate a prototype. Advanced architectural features are studied in co-design space exploration to combine predictability and tight WCET bounds.

CAREER: Exploiting Binary Rewriting to Analyze and Alleviate Memory Bottlenecks for Scientific Applications
Frank Mueller

$400,000 by the National Science Foundation
06/01/2003 - 05/31/2010

Today, high-performance clusters of shared-memory multiprocessors (SMPs) are employed to cope with large data sets for scientific applications. On these SMPs, hybrid programming models combining message passing and shared memory are often less efficient than pure message passing, although the former fits SMP architectures more closely. For more information on this project, see Dr. Mueller's web page.

MOLAR: Modular Linux and Adaptive Runtime Support for HEC OS/R Research
Frank Mueller

$93,708 by the US Department of Energy
02/01/2005 - 01/31/2009

This project addresses issues of adaptive, reliable, and efficient operating and runtime system solutions for ultra-scale high-end scientific computing, with the following goals: (1) Create a modular and configurable Linux system based on application/runtime requirements. (2) Build runtime systems that leverage the OS modularity and configurability to improve efficiency, reliability, scalability, and ease of use. (3) Advance computer reliability, availability and serviceability management systems to work cooperatively. (4) Explore the use of advanced monitoring and adaptation to improve application performance and predictability of system interruptions. Our focus is on developing scalable algorithms for high availability without single points of failure and without single points of control.

ITR: Collaborative Research: SPARTA: Static Parametric Timing Analysis to Support Dynamic Decisions in Embedded Systems
Frank Mueller

$130,000 by the National Science Foundation
09/01/2003 - 08/31/2008

Embedded systems with temporal constraints rely on timely scheduling and a priori knowledge of worst-case execution times (WCETs). Static timing analysis derives safe bounds on WCETs, but its applicability has been limited to hard real-time systems and small code snippets. This proposal addresses these limitations of timing analysis for embedded systems. It contributes a novel approach to program analysis through parametric techniques of static timing analysis and provides innovative methods for exploiting them.

Online Data Reconstruction for Supercomputers
Frank Mueller

$15,000 by Oak Ridge National Laboratories
01/01/2007 - 06/30/2007

This work seeks to build online recovery mechanisms for transient supercomputer job data. With the proposed on-demand data reconstruction, staged input files that are unavailable due to I/O node failures in a parallel file system are transparently patched from source copies using the recovery metadata.
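
The patching idea above can be sketched in a few lines. This is an illustrative toy, not the actual supercomputer system; all class and method names are our own, and recovery metadata is simplified to a mapping from file names to source-copy locations.

```python
# Minimal sketch of on-demand data reconstruction (illustrative names only,
# not from the actual system). Each staged file records recovery metadata --
# the location of its source copy -- so a read that hits a failed I/O node
# can be patched transparently, just in time.

class StagedStore:
    def __init__(self, source_copies):
        self.source_copies = source_copies  # recovery metadata: name -> data
        self.staged = {}                    # data staged onto I/O nodes
        self.failed = set()                 # names lost to I/O node failures

    def stage(self, name):
        # Normal staging of a job's input file before the job runs.
        self.staged[name] = self.source_copies[name]

    def fail(self, name):
        # Simulate an I/O node failure that loses this staged file.
        self.failed.add(name)

    def read(self, name):
        if name in self.failed or name not in self.staged:
            # Transparent patch: re-fetch from the source copy on demand,
            # so the job never observes the failure.
            self.staged[name] = self.source_copies[name]
            self.failed.discard(name)
        return self.staged[name]
```

The key property is that `read` always succeeds as long as the source copy survives, which is exactly what makes the recovery transparent to the scheduled job.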

Collaborative Research: Effective Detection and Alleviation of Scalability Problems
Frank Mueller ; Jerzy Bernholc

$231,652 by the National Science Foundation
09/01/2004 - 08/31/2009

The focus of this project is to develop tool support that gives scientific programmers the ability to inquire about scalability problems and correlate this information back to source code. Furthermore, we believe that tools should be able to suggest and evaluate optimizing transformations to alleviate these problems. This would constitute a significant improvement over current performance-analysis practice. The key intellectual merit is in providing an automatic framework for detecting scalability problems and correlating them back to source code. We will experiment with our framework on the ASCI codes, which are intended to stress high-performance clusters.

Virtual Simple Architecture (VISA): Exceeding the Complexity Limit in Safe Real-Time Systems
Frank Mueller ; Eric Rotenberg

$275,000 by the National Science Foundation
08/15/2003 - 07/31/2007

While essential for real-time scheduling, deriving worst-case execution times (WCET) for contemporary processors is intractable. The Virtual Simple Architecture (VISA) framework shifts the burden of bounding the WCETs of tasks, in part, to hardware. A VISA is the pipeline timing specification of a hypothetical simple processor. WCET is derived for a task assuming the VISA. Tasks are executed speculatively on an unsafe complex processor. Yet, before deadlines become jeopardized, a simple mode is entered. Overall, VISA provides a general framework for safe operation on unsafe processors.
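
The mode-switch decision described above can be illustrated with a small sketch. This is our own simplification, not the actual VISA mechanism: we assume a task split into sub-tasks, each with a WCET budget derived under the simple VISA pipeline model, and a checkpoint test that abandons speculation once the complex processor falls behind that budget.

```python
# Illustrative sketch of a VISA-style mode switch (simplified by us; the
# real framework involves hardware support and per-sub-task checkpoints).
# The task runs speculatively in complex mode; if a checkpoint is reached
# later than the cumulative VISA-derived budget allows, the processor
# falls back to the simple mode whose WCET bound is guaranteed.

def choose_mode(elapsed, budgets, completed):
    """Return 'complex' while speculation is still safe, else 'simple'.

    elapsed   -- execution time consumed so far
    budgets   -- per-sub-task WCET budgets under the VISA pipeline model
    completed -- number of sub-tasks already finished
    """
    # Cumulative budget for the sub-tasks finished so far; exceeding it
    # would jeopardize the WCET guarantee for the remaining sub-tasks.
    allowance = sum(budgets[:completed])
    return "complex" if elapsed <= allowance else "simple"
```

For example, with budgets of 2 time units per sub-task, finishing two sub-tasks in 3 units keeps speculation safe, while taking 5 units forces the switch to simple mode.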

NeTS-NOSS: Secure, Robust and DoS-Resilient Code Dissemination in Wireless Sensor Networks
Peng Ning

$269,902 by the National Science Foundation
08/01/2007 - 07/31/2011

Sensor networks are ideal candidates for a wide range of applications, such as monitoring of critical infrastructures, data acquisition in hazardous environments, and military operations. It is usually necessary to reprogram sensor nodes through wireless links after they are deployed. In this project, we will investigate secure, robust, and DoS-resilient remote programming of sensor nodes through wireless links. We expect to develop three groups of fundamental techniques as a result, including secure and proactively robust encoding of binary code images, DoS-resilient mechanisms for authenticating binary images, and efficient and effective techniques for remote sensor programming in hybrid sensor networks.

CAREER: Towards Trustworthy and Resilient Sensor Networks
Peng Ning

$400,000 by the National Science Foundation
07/01/2005 - 06/30/2011

Sensor networks are ideal candidates for a wide range of applications such as critical infrastructure protection. It is necessary to guarantee the trustworthiness and resilience of sensor networks as well as the sensing applications. The objective of this project is to develop practical techniques for building trustworthy and resilient sensor networks, along with instructional materials that facilitate teaching these techniques. The research activities are focused on practical broadcast authentication, trustworthy and resilient clock synchronization, and light-weight and collaborative intrusion detection in sensor networks, seeking effective integration of cryptographic techniques, application semantics, and other knowledge or constraints.

Collaborative Research: CT-T: A Resilient Real-Time System For a Secure and Reconfigurable Power Grid
Peng Ning

$28,500 by the National Science Foundation
09/01/2007 - 08/31/2008

Energy infrastructure is a critical underpinning of modern society; any compromise or sabotage of its secure and reliable operation would have a prominent impact on people's daily lives and the national economy. Past failures such as the massive northeastern power blackout of August 2003 have revealed serious defects in both system-level management and device-level designs. This project proposes a hardware-in-the-loop reconfigurable system with embedded intelligence and resilient coordination schemes to tackle the vulnerabilities of the power grid. The proposed system will be fully evaluated in terms of real-time responsiveness, fault resiliency, and the ability for local collaboration in emergent/catastrophic events.

Efficient and Resilient Key Management for Wireless Sensor Networks
Peng Ning

$173,165 by Syracuse University
05/01/2005 - 07/30/2008

Security of sensor networks is a critical issue, especially when the sensor networks are deployed in hostile environments for mission critical applications. This project aims at developing efficient and resilient key management techniques for wireless sensor networks, including novel key pre-distribution techniques, effective use of knowledge extracted from practical sensor deployment models as well as application semantics, effective integration of public key and secret key, and specific techniques for key management in hybrid sensor networks consisting of a small number of resourceful nodes and a potentially large number of resource constrained, regular sensor nodes.

Cyber-TA: NCSU: Large-Scale Privacy-Preserving Collaborative Intrusion Analysis
Peng Ning

$80,107 by SRI International
07/01/2006 - 07/14/2008

We will focus on one thrust of research in the Cyber-TA initiative. We will explore practical schemes for Internet-scale collaborative sharing of sensitive information-security log content, while providing extensive guarantees of contributor anonymity. Cyber-TA will enable much greater sharing of even the most sensitive system and security log content, allowing contributors to release "rich-content" (anonymized) alert information that can enable new directions in ultra-large-scale repository correlation.

Collaborative Research: Trustworthy and Resilient Location Discovery in Wireless Sensor Networks
Peng Ning

$150,000 by the National Science Foundation
10/01/2004 - 06/30/2008

The objective of this project is to develop a comprehensive suite of techniques to prevent, detect, or survive malicious attacks against location discovery in sensor networks. The PIs will investigate key management schemes suitable for authenticating beacon messages, explore techniques to make existing location discovery schemes more resilient, seek beaconless location discovery that uses deployment knowledge instead of beacon nodes, and finally investigate methods to integrate the proposed techniques so that they can be combined cost-effectively for sensor network applications. This project will provide specific technical solutions that can be integrated with the sensor network techniques currently being developed.

ARO Workshop on Security of Embedded Systems and Networks
Peng Ning ; Frank Mueller

$21,000 by the Army Research Office
09/15/2006 - 09/14/2007

Embedded systems and networks are used heavily in critical defense applications. The integrity of embedded infrastructures, such as configuration and code, is of utmost importance. New techniques are needed that allow updates to the infrastructure of an embedded system without violating its integrity. This workshop intends to bring together researchers with expertise in a variety of techniques for ensuring the security and integrity of mission-critical embedded systems and networks.

WiSeNeT: Wireless Sensor Network Testbed for Research and Education
Peng Ning ; Injong Rhee

$108,105 by the Army Research Office
05/01/2006 - 04/30/2008

We propose to build a heterogeneous wireless sensor network testbed consisting of over 200 sensor nodes with varying capabilities in terms of processing, energy efficiency and radio transmission capacities. The proposed testbed provides realistic large-scale wireless sensor network environments for evaluating and validating the ideas, protocols and systems conceived in various other activities. The data and experience gained from operating and managing a real network environment will also provide practical insights for students and researchers into the operation of large-scale heterogeneous sensor networks, helping to identify new security and performance problems and to develop practical solutions.

Positioning and Reliable Data Transmission of Sensor Networks
Peng Ning ; Wesley Snyder

$199,823 by the US Army
08/01/2004 - 07/31/2007

Reliable and sufficient sensor coverage is an important requirement for a successful sensor network deployment. The goal of the project is to study and develop optimal positioning algorithms for sensor network deployment that provide surveillance and monitoring of assets or facilities. This project considers two scenarios: 1) assuming we deploy sensors for asset protection, an optimal sensor positioning and deployment algorithm is needed; 2) if sensors are deployed randomly over a geographical region to protect a certain asset, what is the best scheduling plan for turning on a subset of the deployed sensors such that sufficient surveillance can be provided?

Reliable Medium Access in Wireless Networks: Vulnerabilities, Protection, and Recovery
Peng Ning ; Wenye Wang (PI)

$223,071 by Army Research Office
08/ 1/2007 - 02/28/2008

The goal is to study vulnerabilities of medium access in wireless networks and to develop preventive algorithms for protection and reactive algorithms for recovery in the aftermath of cyber-attacks. The approach will be to start with detailed middleware-based traffic injection, monitoring, measurement, and analysis in the Networking of Wireless Information Systems (NeTWIS) lab of North Carolina State University. The collected data will later be used for evaluation and verification of our proposed solutions.

Dimensioning Access Networks Subject to Percentile End-to-End Delay SLAs
Harry Perros ; Yannis Viniotis

$40,000 by Center for Advanced Computing and Communication (CACC)
07/ 1/2007 - 06/30/2008

This project is a continuation of the project entitled "IP Triple and Quadruple Play Services: Modeling and Design," funded during the 2005/2006 academic year. The project deals with the dimensioning of an access network. Specifically, the goal is to determine the size of the upstream and downstream links as a function of the number of ADSL/cable modems supported by the access network; alternatively, given the size of the upstream and downstream links, to determine how many ADSL/cable modems can be supported.

IP Triple and Quadruple Play Services: Modeling and Design
Harry Perros ; Yannis Viniotis

$40,000 by CACC
07/ 1/2006 - 06/30/2007

The award will fund research activities aiming to make significant contributions to capacity planning and the automation of monitoring tools through modeling, simulation, testbed emulation, and on-line optimization. Specific goals include the study of models, response surfaces, and advanced simulation methods, and the creation of an automated paradigm for on-line optimization of capacity tuning.

CT-ER: Metamorphic Worm Detection
Douglas Reeves

$137,057 by the National Science Foundation
08/15/2006 - 01/31/2009

Internet worms are software programs that propagate from computer to computer across the network, without intervention by or knowledge of users, for the purpose of compromising the defenses of those machines against unauthorized access or use. Worms can spread very quickly through the vulnerable population of hosts, sometimes in only seconds, to achieve worldwide penetration. This speed allows them to bypass conventional methods of positive identification and human response.

Tracing Attacks Through Non-Cooperative Networks and Stepping Stones with Timing-Based Watermarking
Douglas Reeves ; Peng Ning

$1,179,321 by the US Department of Interior
09/29/2003 - 02/28/2007

Increasingly, the nation's infrastructure is connected by the Internet for computing, communication, monitoring, and control. Adversaries (hackers, criminals, terrorists, etc.) can and do exploit this connectivity to attack networked computers and devices. In order to defend against such attacks, and to prosecute the adversaries, it is necessary to be able to identify the source of the attack. Tracing an attack back through the Internet to its source is the goal of this research project. As part of this project, tools for watermarking flows and performing timing correlation in real time will be developed.
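A toy illustration of the timing-based watermarking named in the title: bits are embedded by nudging inter-packet delays (IPDs) into even- or odd-numbered quantization slots. The names and parameters here are hypothetical, and the project's actual scheme adds redundancy to survive network jitter:

```python
def embed_watermark(ipds_ms, bits, q=20):
    """Embed one bit per inter-packet delay (IPD, in milliseconds) by
    shifting each IPD to the centre of a quantization slot of width q:
    an even-numbered slot encodes 0, an odd-numbered slot encodes 1."""
    marked = []
    for ipd, bit in zip(ipds_ms, bits):
        slot = ipd // q
        if slot % 2 != bit:
            slot += 1  # bump to the next slot to flip the parity
        marked.append(slot * q + q // 2)  # centre of the target slot
    return marked

def decode_watermark(ipds_ms, q=20):
    """Recover the embedded bits by quantizing the observed IPDs."""
    return [(ipd // q) % 2 for ipd in ipds_ms]

bits = [1, 0, 1, 1]
marked = embed_watermark([31, 44, 57, 12], bits)
print(decode_watermark(marked))  # → [1, 0, 1, 1]
```

Because the watermark travels in the flow's timing rather than its payload, it can in principle be correlated across stepping stones even when the traffic is encrypted.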

NeTS-NOSS: Exploring the Design Space of Sensor Networks Using Route-Aware MAC Protocols
Injong Rhee ; Robert Fornaro

$584,999 by the National Science Foundation
01/ 1/2005 - 12/31/2008

As applications for wireless sensor networks are extremely diverse, sensor network designers will benefit immensely from (sensor) network protocols that can provide a wide spectrum of design choices, especially for very low energy budget applications. In this proposal, the PIs plan to develop a suite of new MAC protocols for sensor network applications based on a new approach, called Route-aware Media Access Control (RASMAC), that can greatly diversify design choices for application designers. A comprehensive evaluation of the developed protocols and their performance models is planned that involves design and implementation of a wildlife tracking system.

NeTS-NR: Traffic Quantization: A Formal Framework for Quality of Service (QoS) and Scalability in Packet-Switched Networks
George Rouskas

$357,314 by the National Science Foundation
09/ 1/2004 - 08/31/2009

Traffic quantization is a new approach to supporting per-flow functionality in packet-switched networks in an efficient and scalable manner. We propose the concept of tiered service to alleviate the complexity associated with supporting per-flow QoS: a quantized network offers a small set of service tiers, and each flow is mapped to the tier that guarantees its QoS. Research will consist of four components: develop novel quantized implementations of weighted fair queueing (WFQ); develop Linux implementations of quantized WFQ to validate the theoretical results; extend the quantization approach to multiple traffic parameters; and investigate efficient constraint-based routing algorithms for quantized traffic.
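The tier-mapping step at the heart of tiered service can be sketched in a few lines; the tier values and single rate parameter are hypothetical simplifications (the project considers multiple traffic parameters):

```python
import bisect

# Hypothetical service tiers (rates in Mb/s): a quantized network offers
# only this small, discrete menu instead of arbitrary per-flow guarantees.
TIERS = [1, 2, 5, 10, 50, 100]

def map_flow_to_tier(required_rate):
    """Return the smallest tier that still guarantees the flow's QoS
    requirement, or None if no tier is large enough."""
    i = bisect.bisect_left(TIERS, required_rate)
    return TIERS[i] if i < len(TIERS) else None

print(map_flow_to_tier(3.2))   # → 5
print(map_flow_to_tier(200))   # → None
```

The per-flow state a router must keep then scales with the number of tiers rather than the number of flows, which is the source of the scalability benefit.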

Lambda Scheduling for Grid Applications
George Rouskas

$40,000 by Center for Advanced Computing and Communication (CACC)
07/ 1/2007 - 06/30/2008

This effort is the first step towards a long-term vision of a distributed scheduling infrastructure that will allow researchers to reserve high-performance network paths via straightforward interfaces to support their research endeavors. Our work will support the development of sophisticated scheduling algorithms and capabilities, and complicated policy implementation and enforcement techniques, and their incorporation into single-domain schedulers; in future research, we will extend and incorporate these capabilities into multi-domain schedulers. Simulation results of algorithms will be used for prototype implementation and experimentation, and the results will be fed back into the simulation environment for continuous enhancement of the algorithms.

CPATH CB: Computing Across Curricula
George Rouskas ; Lisa Bullard ; Jeffrey Joines ; Lawrence Silverberg, Eric Wiebe

$274,749 by the National Science Foundation
07/ 1/2007 - 09/30/2010

The focus of this project is to streamline pathways through which students receive an education that equips them with the computing tools necessary to serve as future computing leaders of society. To this end, we will assemble a community of individuals, each of whom is invested in their own unique way in revitalizing undergraduate computing education. The community will involve faculty representatives from several academic departments and delegates from industry partner organizations, and will open up meaningful channels for dialogue to flow from industry to the university, leading to a more diverse, flexible workforce of computing professionals.

Collaborative Research: NeTS-FIND: The SILO Architecture For Services Integration, Control, and Optimization For the Future Internet
George Rouskas ; Rudra Dutta

$228,000 by the National Science Foundation
09/15/2006 - 02/28/2010

The objective of this project is to formulate a formal framework for a non-layered internetworking architecture in which complex protocols are composed from elemental functional blocks in a configurable manner, and to demonstrate its potential by developing proof-of-concept prototypes. We propose a new internetworking architecture that represents a significant departure from current philosophy. The proposed architecture is flexible and extensible so as to foster innovation and accommodate change, it supports network convergence, it allows for the integration of security features at any point in the networking stack, and it is positioned to take advantage of hardware-based performance-enhancing techniques.

A Formal Approach to Traffic Grooming in Optical Networks with General Topologies
George Rouskas ; Carla Savage ; Rudra Dutta

$404,968 by the National Science Foundation
09/ 1/2003 - 08/31/2007

We address the problem of grooming traffic into lightpaths for transport over general topology optical networks so as to minimize the network cost. We will first study the traffic grooming problem in a number of elemental topologies such as rings, stars, and trees. We will consequently develop hierarchical approaches to tackle the problem in general topologies by decomposing it into smaller subproblems involving elemental topologies. The end-result of this project will be a suite of traffic grooming algorithms with formally verified properties that can be flexibly and efficiently applied within a variety of optical network and cost models.

Joint Faculty Agreement For Nagiza Samatova
Nagiza Samatova

$686,881 by Oak Ridge National Laboratory
08/ 9/2007 - 08/ 8/2017

Dr. Nagiza Samatova's joint work with NC State University and Oak Ridge National Laboratory (ORNL) will provide the interface between the two organizations, aiming to collaboratively address computational challenges in scientific data management and the large-scale analysis of DOE-mission applications. (Supplement)

Joint Faculty Agreement For Nagiza Samatova
Nagiza Samatova

$507,294 by Oak Ridge National Laboratories - UT Battelle, LLC
08/ 9/2007 - 08/ 8/2015

Dr. Nagiza Samatova's joint work with NC State University and Oak Ridge National Laboratory (ORNL) will provide the interface between the two organizations, aiming to collaboratively address computational challenges in Scientific Data Management, Data-Intensive Computing for Understanding Complex Biological Systems, Knowledge Integration for the Shewanella Federation, and the Large-Scale Analysis of Biological Networks with Applications to Bioenergy Production.

High-Performance Data Analytics with Demonstrations to DOE-Mission Applications
Nagiza Samatova

$1,120,002 by Oak Ridge National Laboratories & UT-Battelle, LLC
10/ 4/2007 - 08/31/2012

Terascale computing and high-throughput experiments enable studies of complex natural phenomena on a scale not possible just a few years ago. With this opportunity comes a new problem: the massive quantities of complex data produced. However, answers to fundamental science questions remain largely hidden in these data. The goal of this work is to provide scalable, high-performance data analytics technologies to help application scientists extract knowledge from these raw data. Toward this goal, this project will research and develop methodologies for addressing key bottlenecks and provide proof-of-principle demonstrations on DOE applications.

Enumeration and Structure in Families of Partitions, Compositions, and Combinations
Carla Savage

$183,287 by the National Science Foundation
07/15/2003 - 06/30/2007

The proposed research is an investigation of fundamental questions involving the structure of combinatorial families and relationships between families with intrinsically different characterizations. The focus is on families of integer partitions, compositions, and combinations. The topics under investigation include new work on partitions and compositions defined by linear inequalities; new tools for investigating classical questions about generalizations of the Rogers-Ramanujan identities; and symmetric chain decompositions with geometric applications.

NeTS-NBD: Measurement-Based Mobility Modeling for MANETs
Mihail Sichitiu (ECE) ; Injong Rhee

$484,827 by the National Science Foundation
08/15/2006 - 07/31/2009

Mobile ad-hoc networks (MANETs) have been the focus of significant research activity in the past decade. Thousands of algorithms and protocols for MANETs have been proposed, evaluated, and compared. One of the defining characteristics of MANETs is their mobility. We propose to develop and evaluate a hybrid mobility model that is relatively easy to generate and, at the same time, produces realistic mobility traces that, in turn, yield meaningful results for MANET simulations. The proposed model has the desirable characteristic that it is customizable to match any scenario while allowing users to vary key parameters.

Agent-Based Conceptual Model and Policy Architecture for Virtual Organizations
Munindar Singh

$40,000 by CACC
07/ 1/2006 - 08/15/2007

Virtual organizations (VOs) are organizations of entities such as people and businesses that collaborate to address their collective and individual goals. We model business partners and the VOs they form as autonomous agents. The objective of this project is to address two major, related challenges: (1) how to ensure that agents interact correctly within and across VOs under different circumstances; and (2) how to specify agents and VOs in a perspicuous policy-based manner that engenders confidence in the functioning of the VOs involved.

ITR:Computational Principles of Trust
Munindar Singh

$573,473 by the National Science Foundation
09/ 1/2000 - 06/30/2007

Successful interaction relies heavily upon trust. This applies equally to electronic commerce and virtual social communities. However, figuring out whom to trust, and to what extent, is extremely difficult in open networked information environments. Trust is a complex concept involving aspects of competence and good nature (of the trusted party) and of risk tolerance and urgency (of the trusting party). This project studies distributed, scalable computational approaches to trust management, especially with regard to aggregate phenomena such as the emergence of subcommunities, pivots (which link different subcommunities), and the sensitivity of a community to invasion by nontrustworthy players.

Principles of Commitment Protocols
Munindar Singh

$345,000 by the National Science Foundation
05/15/2002 - 04/30/2007

Business protocols structure and streamline interactions among autonomous business partners. Traditional representations of protocols specify legal sequences of actions but not their meaning. Thus they cannot adequately support flexible interactions, e.g., to handle exceptions and exploit opportunities. This project is developing a declarative model of protocols that gives meaning to, and reasons about, states and actions based on the participants' commitments. This approach improves flexibility while maintaining rigor. This project is studying practical protocols from real-life domains such as transactions among financial institutions and other varieties of electronic business.

Toward Cognitive Habile Agents
Robert St. Amant

$375,266 by the National Science Foundation
01/ 1/2006 - 12/31/2009

Tool use is an agent's manipulation of objects in the environment to transform the interaction between the environment and the agent's sensors or actuators such that its goals are more efficiently achieved. We propose four core capabilities for a habile agent: the ability to generalize existing effectivities to those provided by a tool under the agent's control; the ability to perform detailed internal simulations based on hypothetical application of tool-using abilities; the use of symmetry in recognizing opportunities for tool use; and the use of a general image-schematic representation to control tool-using behavior.

Intelligent Human-Machine Interface and Control for Highly Automated Chemical Screening Processes
Robert St. Amant ; David Kaber ; Mo-Yuen Chow

$786,000 by the National Science Foundation
10/ 1/2004 - 09/30/2007

The breakthrough information technology that we will develop through this ITR project is an intelligent, adaptive human-machine interface to support the new role of screening process supervisors in the safe and effective, distributed control of high-time-stress, high-risk automated chemical and toxicity testing. The development of this technology will be based on cognitive modeling of supervisory controller behaviors during actual chemical screening processes, and on model predictions of operator performance with different interactive information display design alternatives during the design phase and during chemical process run-time.

National Extreme Events Data And Research Center (NEED) Transforming The National Capability For Resilience To Extreme Weather And Climate Events (Supplement)
Ranga Vatsavai

$19,999 by Oak Ridge National Laboratory via US Dept of Energy
03/16/2015 - 09/30/2016

An NCSU graduate student will develop a machine learning approach to linking extreme atmospheric ridging events with extreme surface temperatures, employing a Gaussian Process (GP)-based predictive analysis tool that leverages the predictive value of spatial and temporal correlations and builds on ORNL's past success in spatial classification, temporal change prediction, and parallelizing GP for large spatiotemporal extents.

Scientific Data Management Center for Enabling Technologies
Mladen Vouk

$885,000 by the U.S. Department of Energy
11/15/2006 - 11/14/2012

With the increasing volume and complexity of data produced by ultra-scale simulations and high-throughput experiments, understanding the science is largely hampered by the lack of comprehensive, end-to-end data management solutions ranging from initial data acquisition to final analysis and visualization. The SciDAC-1 Scientific Data Management (SDM) Center succeeded in bringing an initial set of advanced data management technologies to DOE application scientists in astrophysics, climate, fusion, and biology. Building on our early successes, we will improve the SDM framework to address the needs of ultra-scale science.

Center for Scientific Data Management-Agent Technology Enabling Communication Among Tools and Data
Mladen Vouk

$906,987 by the U.S. Department of Energy
08/15/2001 - 08/14/2007

The Scientific Data Management Center is a SciDAC-funded center whose goal is to establish an Enabling Technology Center that will provide a coordinated framework for the unification, development, deployment, and reuse of scientific data management software, including scientific workflow technologies, specifically through SDM's Scientific Process Automation (SPA) focus area. The goal of this technology is to allow easy and accurate interactions and flows among the distributed computational, storage, and application resources used in scientific discovery.

Markers of STEM Success (MOSS): An Eleven-Year Longitudinal Study of High Achieving Young Women's Interests, Experiences, and Preparation for STEM Careers
Mladen Vouk (Co-PI) ; Sarah Berenson (PI) ; Joan Michael ; Roger Woodard; Susan Bracken

$511,512 by the National Science Foundation
10/ 1/2006 - 09/30/2009

Over the past seven years, we have collected data on 250 high-achieving young women, ages 11-20, for an intervention project and an ITWF project. High-achieving is defined as those girls selected for, or electing to take, Algebra 1 in the middle grades, putting them on track to take calculus in high school. The proposed research provides an opportunity to extend and redirect the current database for a new study. By 2009 we expect to have 100 longitudinal records to inform post-undergraduate analysis, 200 longitudinal records to inform undergraduate analysis, and 300 longitudinal records to inform high school analysis.

Virtual Computing Environment Services for NC Community College System
Mladen Vouk (Co-PI) ; Eric Sills (PI) ; Frank Peeler ; Henry Schaffer; Sara Stein

$892,200 by the NC Department of Community Colleges
12/ 1/2007 - 12/31/2010

NC State's Virtual Computing Environment (VCE) is a computing environment in use at NC State, which is very attractive to the NC Community College System (NCCCS). It is expected to help them deliver educational services in a manner both superior to and with less resource expenditure than their current computer labs. After investigating the VCE the NCCCS put it into their plans. They then requested and received a Legislative appropriation to fund a production pilot using NC State's resources. The services provided under this agreement by NC State will support pilots at a number of NC Community College campuses.

ITR: Collaborative Research: Procedural Modeling of Urban Land Use and Form
Ben Watson

$50,000 by Northwestern University (NSF)
05/16/2007 - 05/15/2008

This subcontract supports work on an award authored by this investigator at another institution. That institution has made a significant change in the research direction of the award and significantly reduced this investigator's role. The original goal of this award was automating the production of urban settings for digital entertainment and simulation. Since this research direction is no longer the award's priority, this subcontract will support only the development of techniques for high-level, more intuitive control of land use simulation and for the generation of highly differentiated (not self-similar) buildings.

CAREER:Managing Complexity: Fidelity Control For Optimal Usability in 3D Graphics Systems
Benjamin Watson

$59,153 by the National Science Foundation
08/ 1/2006 - 01/31/2009

Drastic improvements in the speed of 3D graphics rendering hardware have been accompanied by even more drastic increases in the size of displayed models. Researchers trying to display these models have been forced to reduce display speed and interactivity or to reduce the fidelity of the displayed views of their models. What are the best methods for preserving visual fidelity as model complexity is automatically reduced? What is the most effective way of striking the display speed vs. visual fidelity compromise? Our research will examine these questions, resulting in prototype systems and investigations of their effectiveness through user studies.

Hyper-Resolution Rendering and Display
Benjamin Watson

$60,000 by the National Science Foundation
09/ 1/2006 - 02/29/2008

Bringing the rendering technology that created the game industry to personal and business imaging will require achieving interactive display at printer resolution. With such displays, documents might be read directly off desktops, and analysts might spread a massive dataset across walls and then lean forward to see the finest data detail. For such display, verbose pixel-by-pixel representations will not scale. This research will achieve a higher-level representation built on two fundamental technologies: adaptive frameless rendering, which exploits both spatial and temporal coherence in imagery; and change-based rendering, which builds imagery using higher-level primitives such as gradients and edges.

REU Site:Design Tech: Sparking Research in Interactive Visual Design
Benjamin Watson ; Christopher Healey ; R. Michael Young ; Patrick Fitzgerald

$268,763 by the National Science Foundation
03/ 1/2006 - 02/28/2010

Participants in this interactive designed-technology hothouse for undergraduate researchers and designers will work with computer science and design faculty and industry on projects spanning artificial intelligence, graphics, visualization, and visual and interactive design. Sample projects include: advanced AI for interactive narratives and games, including camera control, story planning, and level design; automated tours through virtual and visualized environments; visualizing streaming news feeds using swarming sprites and interactive, ambient display walls; and PDA-based art installations and real-world navigation tools. Students will gain the cross-disciplinary and cross-cultural teamwork and communication skills so important in designed-technology research and industry.

CAREER: The Test-Driven Development of Secure and Reliable Software Applications
Laurie Williams

$413,764 by the National Science Foundation
04/ 1/2004 - 03/31/2011

Our nation's critical infrastructure demands that our current and future IT professionals have the knowledge, tools, and techniques to produce reliable and trustworthy software. The objective of this research is to extend, validate, and disseminate a software development practice to aid in the prevention of computer-related disasters. The practice is based upon test-driven development (TDD), a software development technique with tight verification and validation feedback loops. The proposed work extends the TDD practice and provides a supportive open-source tool for explicitly situating security and reliability as primary attributes considered in these tight feedback loops.

CT-ER: On the Use of Security Metrics to Identify and Rank the Risk of Vulnerability- and Exploit-Prone Components
Laurie Williams

$201,063 by the National Science Foundation
08/ 1/2007 - 01/31/2011

We propose to build, evolve, and validate a statistical prediction model whereby security-related ASA alerts from one or more tools and other software metrics are used to predict the actual overall security of a system. Our research involves collecting and analyzing a significant amount of data on software programs including security-related ASA alerts and actual security vulnerabilities and exploits, based upon inspections, testing failures, field failures, and reported exploits.

Hot Spot Identification and Test-Driven Development
Laurie Williams

$45,982 by Nortel Networks
08/16/2006 - 12/31/2007

The grant involves collaboration on two projects for improving software quality at Nortel. In one project, a means of identifying "hot spots" in code will be developed and validated based upon historical data (static complexity, code churn, defect history for the module, and potentially code coverage data). In the second project, we will examine the efficacy of the test-driven development practice for improving software quality.

Regression Testing Without Code
Laurie Williams

$115,067 by ABB, Inc.
01/10/2005 - 08/15/2007

The goal of this research project is to develop a validated method of regression test selection for software components whose source code is not available to the software development organizations incorporating them into their products. This includes all third-party components used "off the shelf," as well as internal components where the source code is not readily available at the time of regression test selection. This project will involve an understanding of component changes, regression testing, and release documentation.

Extending Extreme Programming
Laurie Williams ; Mladen Vouk

$254,134 by CACC-NSA
09/15/2003 - 12/31/2008

The Extreme Programming methodology was designed for relatively small teams of collocated programmers working on non-critical, small-medium, object-oriented projects. Little empirical assessment has been done on the methodology, though a sizable amount of anecdotal evidence supports the use of the methodology under these conditions. We are proposing collaborative research with the NSA and Galois in which we will empirically assess the efficacy of Extreme Programming practices in a high-confidence, secure, functional programming project. Additionally, we will work on integrating formal methods, reliability, and security testing into the set of Extreme Programming practices.

BPC-A: The STARS Alliance: A Southeastern Partnership for Diverse Participation in Computing
Laurie Williams ; Mladen Vouk ; Sarah Berenson

$303,219 by UNC-Charlotte (NSF)
03/ 1/2006 - 02/28/2010

Our goal is to broaden participation in computing by developing a Southeastern partnership to implement, institutionalize, and disseminate effective practices for recruiting, bridging, retaining, and graduating women, under-represented minorities, and persons with disabilities in computing disciplines. The Alliance will implement a comprehensive set of activities to provide high-quality opportunities to a large audience of post-secondary students, including a Student Leadership Corps, pair programming, a Web portal, a Marketing and Careers Campaign, summer REUs, and a STARS Celebration Conference and Exchange.

Collaboration through Agile Software Development Practices: A Means for Improvement in Quality and Retention of IT Workers
Laurie Williams ; Mladen Vouk ; Jason Osborne ; Winser Alexander; and Sarah Berenson

$812,587 by the National Science Foundation
06/15/2003 - 06/30/2009

This ITWF award to NCSU, NCA&T, and Meredith College will support a three-year study of the collaborative aspects of agile software development methodologies. We believe the collaboration and the social component inherent in these methodologies are appealing to people whose learning and training models are socially oriented, such as some minority groups, women, and men. The project's objective is to perform extensive, longitudinal experimentation in advanced undergraduate software engineering classes at the three institutions to examine student success and retention in the educational and training pipeline when the classes utilize an agile software development model.

On Expediting Software Engineer AWAREness of Anomalous Code
Laurie Williams ; Tao Xie

$40,000 by CACC
07/ 1/2006 - 12/31/2007

Our objective is to continue development of the Automated Warning Application for Reliability Engineering (AWARE) tool. AWARE will continuously provide the programmer with prioritized and trained information on faults revealed via compilation, static analysis, and dynamic testing. We are extending the functionality of the tool in the following ways: (1) more sophisticated prioritization of alerts; (2) learning of how often to initiate alert runs and how to display the alerts; (3) enhanced automated test case generation; and (4) redundant test case reduction. AWARE will provide the programmer with better diagnosis information, improving programmer productivity and product quality.

CAREER: Automated Synthesis of Bidding Strategies for Trading Agents
Peter Wurman

$300,010 by the National Science Foundation
08/ 1/2001 - 07/31/2007

This project will investigate approaches to building a strategy generation engine as a component of a flexible trading agent that converts user preferences, auction rules, and a model of the other agents into a decisionable format. The first strategy generation engine will produce game-theoretic representations of the decision problem. For small problems, the game can be solved and an equilibrium bidding strategy selected. However, for intractable larger problems, alternate strategy generation engines will be constructed which use other decision technologies. Ideally, the agent will be able to make this decision by assessing the structure of the problem instance.

CT-ISG: Collaborative Research: A New Approach to Testing and Verification of Security Policies
Tao Xie

$227,275 by the National Science Foundation
08/ 1/2007 - 07/31/2011

Security policies such as access control and firewall policies are among the most fundamental and widely used privacy and security mechanisms. Assuring the correctness of security policies is a critical yet challenging task. We propose to develop a uniform representation of security policies across application domains, such as XACML access control policies and firewall policies, and a set of novel techniques for testing and verification of both static and stateful policies based on this uniform representation.

CSR---SMA: Improving Software System Reliability via Mining Properties for Software Verification
Tao Xie

$20,000 by the National Science Foundation
08/ 1/2007 - 07/31/2008

Most correctness, security, and robustness violations of software systems are caused by incorrect usage of application-specific APIs. But API details and implicit usage properties are often not documented by developers. Manually specifying a large number of properties or behaviors for static verification is often inaccurate or incomplete, apart from being cumbersome and prohibitively expensive. In this project, we develop a set of practical techniques and tools for inferring properties centered around a single API call and properties relating multiple API calls, and for improving the inference results through automatic test generation and dynamic analysis.

Improving Software Dependability Via Mining Properties For Software Verification
Tao Xie

$50,000 by Army Research Office
06/18/2007 - 03/17/2008

Most correctness, security, and robustness violations are caused by incorrect usage of system or application programming interfaces (APIs). Although API properties and behaviors can be formally specified and statically verified against software by current state-of-the-art static verification tools, API details and their usage properties are often not documented by developers. We propose to develop new approaches for mining API properties for static verification from the API client call sites in existing code repositories.

Integrating Static and Dynamic Analysis to Improve Automatic Unit-Test Generation
Tao Xie ; Jun Xu

$40,000 by CACC
07/01/2006 - 06/30/2007

Unit testing is an important activity in assuring the quality of software programs. Although object-oriented test-generation tools exist for Java programs, two issues pose barriers to their wide adoption. First, existing unit-test generation tools usually produce many false warnings that can overwhelm and frustrate developers. Second, these tools generate large numbers of random input values for the method under test, yet the generated inputs may not exercise the meaningful, important situations in which the method is actually used in the system under test. To address these two issues, we propose to integrate static and dynamic analysis to improve automatic test generation by considering the usage contexts of the unit within the system: we use static and dynamic analysis to collect usage-context information for the unit under test, and then use that information to guide test generation. Developers can use the resulting improved Java test-generation tools to augment their manually written tests and better assure software quality.
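The usage-context idea can be approximated by recording the concrete argument values a method receives in the running system and seeding the test generator with them instead of purely random inputs. A hedged sketch in Python (the decorator-based capture and the `normalize` example are our illustration, not the project's actual Java tooling):

```python
import functools
from typing import Any, Callable, Dict, List, Tuple

# Observed argument tuples per function name (dynamic analysis stand-in).
CAPTURED: Dict[str, List[Tuple[Any, ...]]] = {}


def capture_usage(func: Callable) -> Callable:
    """Record the concrete arguments each call site passes, so they can
    later seed test generation with realistic inputs."""
    @functools.wraps(func)
    def wrapper(*args):
        CAPTURED.setdefault(func.__name__, []).append(args)
        return func(*args)
    return wrapper


@capture_usage
def normalize(score: int, max_score: int) -> float:
    return score / max_score


# System code exercising the unit with meaningful inputs:
normalize(7, 10)
normalize(0, 10)


def generate_seeded_tests(name: str) -> List[Tuple[Any, ...]]:
    """Seed test generation from observed usage contexts; a real tool
    would then mutate these seeds rather than start from random values."""
    return list(CAPTURED.get(name, []))
```

Seeding from real call sites biases generated tests toward the situations that matter in the deployed system, which is exactly what purely random generation tends to miss.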

Creating Effective Task Descriptions from Action Plans
R. Michael Young

$315,000 by the National Science Foundation
08/15/2004 - 07/31/2008

Artificial intelligence planning systems are being put to use to determine the activities of a wide range of intelligent interactive systems. The ability of these systems to explain their plans to human users is essential for their successful adoption and use. We are investigating the generation of natural language descriptions of plan data structures. This work will develop a cognitive and computational model of task context and its role in the generation of action descriptions.
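At its simplest, describing a plan step means mapping an action data structure (an operator name plus bound parameters) to a sentence, with task context deciding how much detail to include. A toy sketch with invented operator names and templates:

```python
from dataclasses import dataclass
from typing import Dict, Tuple


@dataclass
class Action:
    """One step of a plan: a planning operator with bound arguments."""
    operator: str
    args: Tuple[str, ...]


# Template realization: one surface pattern per planning operator.
TEMPLATES: Dict[str, str] = {
    "move": "Go from {0} to {1}.",
    "pickup": "Pick up the {0}.",
    "unlock": "Unlock the {0} with the {1}.",
}


def describe(action: Action) -> str:
    """Render a plan step as an imperative English instruction,
    falling back to a generic description for unknown operators."""
    template = TEMPLATES.get(action.operator)
    if template is None:
        return f"Perform {action.operator}({', '.join(action.args)})."
    return template.format(*action.args)
```

A context-sensitive generator would go further, e.g. omitting arguments the user already knows, but the operator-to-template mapping is the starting point.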

CAREER: Plan-Based Integration of Control and Coherence in Intelligent Exploratory Environments
R. Michael Young

$480,695 by the National Science Foundation
03/15/2001 - 08/31/2007

The use of virtual environments has shown success in applications ranging from education to entertainment. One limitation of these systems is that users' activities within them are over- or under-constrained. In this project, I will develop new models for the structure of user interactions within virtual worlds. Because a user's understanding of the activity in a world provides scaffolding for her own exploration, presenting the user with an environment in which action can be readily understood encourages the user to acquire and employ knowledge of the environment. This activity leads to an increased understanding of the world the environment models.

HI-FIVES: Using Web-Based Gaming to Improve Student Comprehension of Information Technology in Science
R. Michael Young (Co-PI) ; Leonard Annetta (PI) ; Deborah Mangum ; Thomas Miller

$1,197,270 by the National Science Foundation
09/01/2005 - 08/31/2009

Researchers in science education, computer science, distance education, and the NC Department of Public Instruction are partnering with the Kenan Fellows Program to harness the untapped potential of inexpensive, online multi-user competitive simulation software to improve the science achievement and IT skills of NC's grade 6-12 students. Over three years, teacher participants will learn how to use this technology to increase student science achievement and motivation to enter IT-related science careers. The intellectual merit of the project lies in rigorous assessment and evaluation of how these environments most effectively improve the IT skills and science content mastery of students.

CT-ISG: Collaborative Research: A Framework for the Modeling and Management of Obligations in Security Policies
Ting Yu

$180,000 by the National Science Foundation
08/01/2007 - 07/31/2011

The correct and reliable operation of an information system relies not only on users' capabilities but often on users' obligations. Managing obligations in security policies poses significant challenges, since obligations have different properties from access control. This project develops a comprehensive framework for the management of obligations, including obligation modeling, specification, analysis, monitoring, and discharge. Though the framework is formal in nature and deliberately general, the evaluation of its usefulness and effectiveness is firmly grounded in real applications, in particular cross-domain data sharing systems and privacy policy enforcement systems.
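The key contrast with access control is temporal: an access decision is made at request time, while an obligation is a pending duty that must be monitored until it is discharged or its deadline passes. A minimal model of that distinction, under illustrative assumptions (the field names and the abstract clock are ours, not the project's formalism):

```python
from dataclasses import dataclass


@dataclass
class Obligation:
    """A duty a subject incurs (e.g., 'notify the patient within 30 days
    of disclosing a record'), tracked until it is discharged."""
    subject: str
    duty: str
    deadline: int          # abstract clock tick by which the duty is due
    discharged: bool = False

    def discharge(self) -> None:
        """Mark the duty as fulfilled (e.g., the notification was sent)."""
        self.discharged = True

    def status(self, now: int) -> str:
        """'fulfilled' | 'pending' | 'violated' at the given time."""
        if self.discharged:
            return "fulfilled"
        return "pending" if now <= self.deadline else "violated"


ob = Obligation("hospital", "notify patient of disclosure", deadline=30)
```

A monitor then sweeps outstanding obligations as the clock advances, flagging those that become "violated"; no analogous runtime tracking is needed for a plain permit/deny decision.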