Research Projects 2005 (by faculty)

The funded projects listed below were active during the 2005 calendar year; the funded running total for that year appears in the left navigation menu.


Transnational Digital Government: Specifying Software Requirements and Policy for Compliance
Annie Anton

$81,400 by the University of Florida
06/16/2003 - 06/30/2005

This project involves the following research activities: (1) requirements elicitation, using goal- and scenario-based analysis of user requirements to specify usage scenarios that will provide a basis for the software solutions the team will provide to the participating stakeholders in the Dominican Republic and Belize; (2) codification and specification of relevant information-sharing and privacy practices, regulations, and policies to ensure that the security, laws, autonomy, and culture of participating countries are not compromised; and (3) development of privacy enforcement tools that will serve as an interface between the data and the applications that process it, preventing applications from accessing data, or combining data, in ways that would violate stated policy and subsequently compromise an organization's (e.g., a government agency's) trustworthiness. This research focuses on privacy, employing governments' efforts to address the negative societal impacts of illicit drug production, trafficking, and consumption, as well as transnational border control, as the primary domains for investigation. Thus, the proposal addresses an application of IT that raises a multitude of ethical and societal concerns.

The Design and Use of Digital Identities
Annie Anton ; Julia Earp

$103,006 by Purdue University
09/15/2004 - 08/31/2007

This project will address a wide variety of digital identity needs by developing required Flexible, Multiple and Dependable Digital Identity (FMDDI) technology, based on a sound underlying set of definitions and principles. We will apply, expand, and refine the theory of identities, the mechanisms to embody and enforce them, as well as study the implications of their use, and develop appropriate educational vehicles to teach people how digital identities should be used effectively.

ITR: Encoding Rights, Permissions, and Obligations: Privacy Policy Specification and Compliance
Annie Anton ; Julia Earp ; Lynda Aiman-Smith ; David Baumer

$932,000 by the National Science Foundation
09/15/2003 - 02/28/2010

This research focuses on how society uses, values, and protects citizens' personal information. From the perspective of system design, software engineers need methods and tools that enable them to design systems that reflect those values and protect personal information accordingly. This research examines how privacy considerations and value systems influence the design, deployment, and consequences of IT. The goal is to develop concepts, tools, and techniques that help IT professionals and policy makers bring policies and system requirements into better alignment. An action-oriented set of conceptual tools, including guidelines and privacy-relevant policy templates, will be constructed and validated.

Collaborative Research: A Comprehensive Policy-Driven Framework for Online Privacy Protection: Integrating IT, Human, Legal and Economic Perspectives
Annie Anton ; Ting Yu ; David Baumer ; Michael Rappa

$534,000 by the National Science Foundation
09/15/2004 - 08/31/2010

Privacy is increasingly a major concern that prevents the exploitation of the Internet's full potential. Consumers are concerned about the trustworthiness of the websites to which they entrust their sensitive information. Although significant industry efforts are seeking to better protect sensitive information online, existing solutions are still fragmented and far from satisfactory. Specifically, existing languages for specifying privacy policies lack a formal and unambiguous semantics, are limited in expressive power and lack enforcement as well as auditing support. Moreover, existing privacy management tools aimed at increasing end-users' control over their privacy are limited in capability or difficult to use.

CAREER: Adaptive Automated Design of Stored Derived Data
Rada Chirkova

$489,810 by the National Science Foundation
08/01/2005 - 07/31/2011

The goal of this project is to develop an extensible framework for designing and using derived data in answering database queries efficiently. The outcomes of the project are expected to be general and independent of a specific data model (e.g., relational or XML), while giving guarantees with respect to query-performance improvement. The approach consists of developing and evaluating mathematical models and algorithms for common types of queries on relational and XML data. Expected outcomes of the project include automated tuning of data-access characteristics in a variety of applications, thus enhancing the quality of user interactions with data-intensive systems.

Efficient View-Design Algorithms To Achieve Near-Optimal Performance Of Sets of Relational Queries
Rada Chirkova

$253,180 by the National Science Foundation
09/15/2003 - 08/31/2007

The goal of this proposal is to develop effective methods to improve the performance of frequent, important queries on large databases. This problem is important for improving the efficiency of user interactions with relational data-management systems; solving it will have impact on query optimization, data warehousing, and information integration. The project focuses on evaluating queries using auxiliary relations, or views. Our main objective is to develop theoretically rigorous yet practically applicable techniques for designing views that help compute sets of frequent and important queries with optimal or near-optimal speedup on large databases with given statistics.
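To make the idea concrete, here is a minimal sketch (using SQLite and a hypothetical sales table; the project's actual view-design algorithms are far more general): a derived table precomputed from the base data answers a frequent aggregate query without rescanning the base relation.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales(region TEXT, amount REAL);
INSERT INTO sales VALUES ('east', 10), ('east', 20), ('west', 5);
-- A view materialized as derived data, designed for a frequent query:
CREATE TABLE sales_by_region AS
  SELECT region, SUM(amount) AS total FROM sales GROUP BY region;
""")

# The frequent query now reads the small derived table instead of the
# (potentially huge) base table.
print(con.execute(
    "SELECT total FROM sales_by_region WHERE region = 'east'").fetchone())
```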

Incremental Read-Aheads to Improve Throughput in Data-Intensive Applications
Rada Chirkova

$40,000 by CACC
07/01/2004 - 12/31/2005

Modern information system architectures place applications in an application server and persistent objects in a database. In this setting, to improve application throughput we propose to use data prefetching (read-aheads) to minimize total data-access time of an application, in a manner that affects neither the application code nor the backend DBMS. Our methodology is based on automatically merging SQL queries to produce query sequences with low total response time, in ways that exploit the application's data-access patterns. Our approach is independent of the application domain and can be a component of container managed persistence that can be implemented in middleware.
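The flavor of the query-merging step can be illustrated as follows (a hypothetical example with an invented orders table, not the project's actual rewriting rules): two point queries that the application would issue in separate round trips are merged into one, and the middleware splits the prefetched rows.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders(customer_id INT, id INT, status TEXT);
INSERT INTO orders VALUES (1, 10, 'open'), (2, 11, 'shipped');
""")

# Instead of two round trips ...
#   SELECT id, status FROM orders WHERE customer_id = 1
#   SELECT id, status FROM orders WHERE customer_id = 2
# ... a read-ahead merge issues one query and splits rows in middleware:
rows = con.execute(
    "SELECT customer_id, id, status FROM orders WHERE customer_id IN (1, 2)")
by_customer = {}
for cust, oid, status in rows:
    by_customer.setdefault(cust, []).append((oid, status))
```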

Cost Match for project "NCSU TIE: Wireless Sensor Networks for Structural Health Monitoring of Buildings and Bridges"
Rudra Dutta ; Mihail Sichitiu

$25,000 by CACC
05/16/2005 - 12/31/2005

Cost-matching funds in support of the NSF-funded project "NCSU TIE: Wireless Sensor Networks for Structural Health Monitoring of Buildings and Bridges."

NCSU TIE: Wireless Sensor Networks for Structural Health Monitoring of Buildings and Bridges
Rudra Dutta ; Mihail Sichitiu

$100,000 by CACC-NSF
08/15/2003 - 12/31/2005

The proposed research will improve the state-of-the-art in Structural Health Monitoring of bridges and similar structures. There are two broad parts. The first part addresses sensing and interpreting SHM data, the second part addresses the transfer of data from sensors to the location where interpretation occurs. This proposal ties together the expertise in SHM provided by RB2C, and the networking expertise provided by CACC. As we describe, the SHM sensors that are the focus of our investigation demand a power-efficient wireless network and integration with the network; our research will result in networking algorithms and an integrated testbed.

Runtime/Operating System Synergy to Exploit Simultaneous Multithreading
Vincent Freeh ; Frank Mueller

$380,000 by the National Science Foundation
08/01/2004 - 07/31/2008

This proposal focuses on a synergistic approach combining runtime and operating system support to fully exploit the capabilities of SMTs. To meet this objective, it studies three different approaches. First, it investigates the benefits of using a helper thread alongside the primary thread, by building a reference implementation of an SMT-aware Message Passing Interface library. Second, it investigates the benefits of dynamic mode switching between single-threaded and multi-threaded configurations. Third, it modifies the operating system to create an SMT-aware scheduler. The benefits are demonstrated for a variety of applications, including large-scale benchmarks and other nationally relevant parallel codes.

The Centroid Decomposition and Other Approximations to the Singular Value Decomposition
Robert Funderlic ; Moody Chu

$521,999 by the National Science Foundation
07/01/2002 - 06/30/2006

The centroid decomposition, an approximation for the singular value decomposition, had a long but early history within the statistics/psychometrics community for factor analysis research. We revisit the centroid method first in its original context and then generalize and modernize it to arbitrary matrices. We show the centroid method can be cast as an n-step (linear) ascent method on a hypercube. Furthermore, we have shown empirically that the centroid decomposition is statistically sound. A major purpose of this work is to show fundamental relationships between the singular value, centroid and semi-discrete decompositions. This unifies an entire class of truncated SVD approximations.
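The hypercube-ascent view of the centroid method can be sketched in a few lines (our own illustration; the variable names and the deflation step are ours): a sign vector z is improved by single flips until z^T(AA^T)z cannot increase, then a rank-one factor is extracted.

```python
import numpy as np

def centroid_factor(A):
    """One centroid step: find a sign vector z that locally maximizes
    z^T (A A^T) z by single-sign-flip ascent on the hypercube, then form
    rank-1 factors analogous to the SVD's leading singular pair."""
    m = A.shape[0]
    B = A @ A.T
    z = np.ones(m)
    improved = True
    while improved:
        improved = False
        for i in range(m):
            # Flipping z[i] changes z^T B z by 4*(B[i,i] - z[i]*(B[i] @ z)).
            if 4.0 * (B[i, i] - z[i] * (B[i] @ z)) > 1e-12:
                z[i] = -z[i]
                improved = True
    u = A.T @ z
    u /= np.linalg.norm(u)      # unit "right" centroid vector
    return A @ u, u             # left factor and right vector

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 5))
c, u = centroid_factor(A)
A1 = A - np.outer(c, u)         # deflate, then repeat for more factors
```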

CAREER: New Directions in Managing Structured Peer-to-Peer Networks
Khaled Harfoush

$408,894 by the National Science Foundation
03/15/2004 - 08/31/2010

In the research component of my career development program, I focus on strategies for addressing the challenges and opportunities that face the deployment of structured P2P systems. In particular, I introduce new schemes to locate resources and strategies to serve them. I also introduce new schemes for topology inference, integration, and organization in order to optimize content distribution. The proposed educational aspect of my career development program focuses on (1) enhancing our department's networking curriculum, (2) extending opportunities for women, under-represented minorities, and undergraduates in research, and (3) encouraging students to participate in the computer science community outside the university.

CAREER: Assisted Navigation in Large Visualization Spaces
Christopher Healey

$370,403 by the National Science Foundation (ACIR/ACR)
02/01/2001 - 01/31/2008

This project will investigate methods for navigating complex information spaces. Work will focus on a system designed to help viewers visualize, explore, and analyze large, multidimensional datasets. Detailed local displays will be combined with a high-level global overview of areas of interest within a dataset. Local views will use perceptual cues to harness the low-level human visual system. Global overviews will identify and cluster elements of interest to produce an underlying graph that (1) supports efficient navigation via graph traversal, and (2) provides an effective visualization of the areas of interest and their relationships to one another.

A Perceptual Visualization Architecture
Christopher Healey

$354,029 by the National Science Foundation
09/15/2000 - 08/31/2005

This project will address three issues: (1) can we harness and apply low-level human perception to the problem of visualizing large, complex, multidimensional datasets? (2) can we embed this knowledge in an AI-based system that will assist viewers in constructing perceptually-optimal visualizations in a general way to address a wide range of problem environments? (3) can results from perception be bound to stylistic properties in Impressionist painting, thereby creating a system that allows a viewer to "paint" an expressive visual representation of their data, while at the same time ensuring the result accurately portrays the underlying data values being displayed?

Interactive Exploration of Complex Datasets Via the Effective Generation of Text and Graphics
Christopher Healey ; Robert St. Amant ; Michael Young

$569,338 by the National Science Foundation (IIS/IDM)
09/15/2000 - 08/31/2005

This project studies methods for interactive exploration of complex data spaces through the combination of textual and graphical discourse engines, a plan recognition system, and an interaction manager. Users begin by asking questions about their data. The system responds using text and graphics. Text responses are built by a discourse engine; graphical images are constructed using a perceptual visualization assistant. Plan recognition algorithms analyze queries and users' reactions to the responses they receive. This allows the system to anticipate future queries, cache relevant statistics, and guide the discourse and visualization systems during evaluation of new user queries.

Symbolic Representation Based Partial Order Methods
S. Purushothaman Iyer

$160,000 by the National Science Foundation
09/01/2002 - 08/31/2006

Symbolic representations are used in the analysis of finite- and infinite-state concurrent systems. However, they can suffer constraint explosion, much like the state explosion encountered in analyzing finite-state designs of concurrent systems. Both explosions stem from considering all interleavings of a concurrent system during analysis. Partial-order techniques rely on the notion of independence among actions to avoid considering all possible interleavings. The proposed research will investigate the notion of unfolding, which aids both in discovering independent actions and in succinctly representing the state space of systems.

Automated Analysis of Probabilistic Open Systems
S. Purushothaman Iyer

$210,000 by the National Science Foundation
09/15/2001 - 07/31/2005

The project will explore semantic theories of systems that have both non-determinism and probabilistic choice. In particular, notions of equality and approximate equality of system behaviors will be investigated. Furthermore, the effect of these notions on compositional reasoning will also be studied. The second topic of the investigation will be a thorough comparison of the semantic theories developed in this project against traditional approaches to dealing with non-determinism and probabilistic choice. Finally, practical algorithms for process minimization and for checking equality (and approximate equality) of processes will be designed and implemented in the Concurrency Workbench of the New Century.

NSF Partnership in the Center for Advanced Computing and Communication
Dennis Kekas ; Mladen Vouk

$492,240 by CACC-NSF
09/15/1999 - 08/31/2007

The Center for Advanced Computing and Communication (CACC) is a membership-based industry/university cooperative research center co-located at North Carolina State University and Duke University. North Carolina State University was selected by the National Science Foundation in 1981 as a site for an industry/university cooperative research center in communications and signal processing. The center was named the Center for Communications and Signal Processing until 1994 when a second center site at Duke University was added. The CACC research goal is to create concepts, methods and tools for use in the analysis, design and implementation of advanced computer and communication systems.

Joint Faculty Appointment
Xiaosong Ma

$549,457 by UT-Battelle, LLC
09/21/2003 - 08/15/2012

Xiaosong Ma's joint work with NCSU and Oak Ridge National Laboratory (ORNL) will bridge the gap between the two organizations in a practical manner, supporting cooperative research on parallel I/O in conjunction with the Genomes to Life (GTL) and Scientific Data Management projects within the Computer Science and Mathematics Division at ORNL.

Runtime Data Management for Data-Intensive Scientific Applications
Xiaosong Ma

$299,992 by the US Department of Energy
08/15/2005 - 08/14/2008

Many applications used daily by scientists fail to take advantage of state-of-the-art computer systems. This problem is more severe for data-intensive applications, such as bioinformatics and visualization codes, whose parallelization is more recent and less studied in parallel architecture design than that of traditional simulations. We propose to address these problems by investigating efficient runtime data management for data-intensive applications. We plan to build novel technologies for generic, automatic optimization of parallel execution plans and to enhance parallel scientific data libraries by hiding I/O costs.

Collaborative Research: Reusable, Observation-Based Performance Prediction Across Platforms
Xiaosong Ma ; Frank Mueller

$76,566 by the National Science Foundation
08/01/2004 - 07/31/2006

This project investigates observation-based execution time estimation for resource planning and usage estimation in the grid environment. The proposed approaches will collect, manage, and utilize application characteristics and performance results, and transfer such information across applications and platforms. Thus, performance data from one application's executions on one platform helps predict the performance of another application on another platform. The expected outcome of this research is a meta-predictor: an efficient and sufficiently accurate cross-platform performance prediction tool that provides performance predictions as a service to grid users. These approaches will be validated on production platforms with applications representative of nationally relevant high-end codes.

Comparative and Web-Enabled Virtual Screening
Xiaosong Ma (Co-PI) ; Jacqueline Hughes-Oliver (PI) ; Moody Chu ; Gary Howell ; Morteza Khaledi

$1,111,110 by the National Institutes of Health
09/23/2005 - 07/31/2008

The long-term objective of this project is to develop computational algorithms and software to gain theoretical and empirical insights in the use of chemical diversity for determining quantitative structure-activity relationships (QSARs). In addition to addressing scientific and technical goals with respect to QSAR modeling, planning-period tasks will include specific activities to bring together the researchers and to facilitate inter-disciplinary communication. Specific Aim 1 is to develop and enhance collaborations between three broad disciplines: statistics, computer science, and chemistry.

CAREER: Exploiting Binary Rewriting to Analyze and Alleviate Memory Bottlenecks for Scientific Applications
Frank Mueller

$400,000 by the National Science Foundation
06/01/2003 - 05/31/2010

Today, high-performance clusters of shared-memory multiprocessors (SMPs) are employed to cope with large data sets for scientific applications. On these SMPs, hybrid programming models combining message passing and shared memory are often less efficient than pure message passing, although the former fits SMP architectures more closely. For more information on this project, see Dr. Mueller's Web page.

MOLAR: Modular Linux and Adaptive Runtime Support for HEC OS/R Research
Frank Mueller

$93,708 by the US Department of Energy
02/01/2005 - 01/31/2009

This project addresses adaptive, reliable, and efficient operating and runtime system solutions for ultra-scale high-end scientific computing, with the following goals: (1) create a modular and configurable Linux system based on application/runtime requirements; (2) build runtime systems that leverage OS modularity and configurability to improve efficiency, reliability, scalability, and ease of use; (3) advance computer reliability, availability, and serviceability management systems to work cooperatively; (4) explore the use of advanced monitoring and adaptation to improve application performance and the predictability of system interruptions. Our focus is on developing scalable algorithms for high availability without single points of failure and without single points of control.

ITR: Collaborative Research: SPARTA: Static Parametric Timing Analysis to Support Dynamic Decisions in Embedded Systems
Frank Mueller

$130,000 by the National Science Foundation
09/01/2003 - 08/31/2008

Embedded systems with temporal constraints rely on timely scheduling and a priori knowledge of worst-case execution times (WCETs). Static timing analysis derives safe WCET bounds, but its applicability has been limited to hard real-time systems and small code snippets. This proposal addresses these limitations of timing analysis for embedded systems. It contributes a novel approach to program analysis through parametric techniques of static timing analysis and provides innovative methods for exploiting them.
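As a toy illustration of what "parametric" means here (made-up constants, not the output of any real analyzer): the analysis produces a formula over still-unknown loop bounds rather than a single number, and the runtime evaluates it cheaply once the parameters are known, enabling dynamic decisions.

```python
def parametric_wcet(n_iterations):
    """Hypothetical parametric WCET bound, in cycles: static analysis fixes
    the constants, while the loop bound stays symbolic until run time."""
    C_SETUP, C_PER_ITER = 40, 12      # invented constants for illustration
    return C_SETUP + C_PER_ITER * n_iterations

# A scheduler can now make an admission decision per invocation:
admit = parametric_wcet(100) <= 2000
```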

Reducing Frequency Via Speculation and Fall-Back Recovery
Frank Mueller

$300,000 by the National Science Foundation
07/01/2002 - 06/30/2006

This work puts forth a two-tier approach to reduce the processor frequency of complex embedded systems. First, tight worst-case timing analysis reduces the perceived upper bound on the number of cycles consumed by tasks. Second, architecture simulation and processors with dual frequency/voltage modes enable significant additional power savings. Architecture simulation produces an approximate worst-case timing estimate. A higher recovery frequency is utilized as a fall-back mode to ensure safe operation bounded by tight worst-case timing analysis. These two approaches complement each other, initially reducing power requirements by a significant amount compared with a naive approach.

Collaborative Research: Effective Detection and Alleviation of Scalability Problems
Frank Mueller ; Jerzy Bernholc

$231,652 by the National Science Foundation
09/01/2004 - 08/31/2009

The focus of this project is to develop tool support that lets scientific programmers inquire about scalability problems and correlate this information back to source code. Furthermore, we believe that tools should be able to suggest and evaluate optimizing transformations to alleviate these problems. This would constitute a significant improvement over current performance analysis practice. The key intellectual merit is in providing an automatic framework for detecting scalability problems and correlating them back to source code. We will experiment with our framework on the ASCI codes, which are intended to stress high-performance clusters.

Virtual Simple Architecture (VISA): Exceeding the Complexity Limit in Safe Real-Time Systems
Frank Mueller ; Eric Rotenberg

$275,000 by the National Science Foundation
08/15/2003 - 07/31/2007

While essential for real-time scheduling, deriving worst-case execution times (WCET) for contemporary processors is intractable. The Virtual Simple Architecture (VISA) framework shifts the burden of bounding the WCETs of tasks, in part, to hardware. A VISA is the pipeline timing specification of a hypothetical simple processor. WCET is derived for a task assuming the VISA. Tasks are executed speculatively on an unsafe complex processor. Yet, before deadlines become jeopardized, a simple mode is entered. Overall, VISA provides a general framework for safe operation on unsafe processors.

CAREER: Towards Trustworthy and Resilient Sensor Networks
Peng Ning

$400,000 by the National Science Foundation
07/01/2005 - 06/30/2011

Sensor networks are ideal candidates for a wide range of applications, such as critical infrastructure protection. It is necessary to guarantee the trustworthiness and resilience of sensor networks as well as the sensing applications. The objective of this project is to develop practical techniques for building trustworthy and resilient sensor networks, as well as instructional materials that facilitate teaching these techniques. The research activities are focused on practical broadcast authentication, trustworthy and resilient clock synchronization, and light-weight, collaborative intrusion detection in sensor networks, seeking effective integration of cryptographic techniques, application semantics, and other knowledge or constraints.

Efficient and Resilient Key Management for Wireless Sensor Networks
Peng Ning

$173,165 by Syracuse University
05/01/2005 - 07/30/2008

Security of sensor networks is a critical issue, especially when the sensor networks are deployed in hostile environments for mission critical applications. This project aims at developing efficient and resilient key management techniques for wireless sensor networks, including novel key pre-distribution techniques, effective use of knowledge extracted from practical sensor deployment models as well as application semantics, effective integration of public key and secret key, and specific techniques for key management in hybrid sensor networks consisting of a small number of resourceful nodes and a potentially large number of resource constrained, regular sensor nodes.
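For background flavor, a minimal sketch of random key pre-distribution in the style of Eschenauer and Gligor (an established baseline, not the novel techniques this project proposes): each node receives a random ring of key IDs from a global pool, and two neighbors can secure their link exactly when their rings intersect.

```python
import random

def assign_key_rings(num_nodes, pool_size=1000, ring_size=50, seed=0):
    """Give each sensor a random ring of key IDs drawn from a global pool."""
    rng = random.Random(seed)
    pool = range(pool_size)
    return [set(rng.sample(pool, ring_size)) for _ in range(num_nodes)]

rings = assign_key_rings(100)
shared = rings[0] & rings[1]   # any common key can secure the 0-1 link
```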

Collaborative Research: Trustworthy and Resilient Location Discovery in Wireless Sensor Networks
Peng Ning

$150,000 by the National Science Foundation
10/01/2004 - 06/30/2008

The objective of this project is to develop a comprehensive suite of techniques to prevent, detect, or survive malicious attacks against location discovery in sensor networks. The PIs will investigate key management schemes suitable for authenticating beacon messages, explore techniques to make existing location discovery schemes more resilient, seek beaconless location discovery that uses deployment knowledge instead of beacon nodes, and finally investigate methods to integrate the proposed techniques so that they can be combined cost-effectively for sensor network applications. This project will provide specific technical solutions that can be integrated with the sensor network techniques currently being developed.

ITR: Integrating Intrusion Detection with Intelligent Visualization and Interaction Strategies
Peng Ning ; Christopher Healey ; Robert St. Amant

$415,099 by the National Science Foundation
08/15/2002 - 08/31/2006

This project is motivated by current limitations of intrusion detection systems, which are generally unable to fully detect unknown attacks, or even unknown variations of known attacks, without generating a large number of false alarms. The focus of this project is to integrate intrusion detection with visualization techniques and human computer interaction strategies to address these limitations. Our system will include interactive intrusion detection algorithms that capitalize on human knowledge and judgment, novel visualization and interaction techniques to monitor for potential attacks, and semi-automated tools for constructing and evaluating attack profiles to extend the capabilities of an intrusion detection system.

Reduce False Alerts, Uncover High-Level Attack Strategies and Predict Attacks in Progress Using Prerequisites of Intrusions
Peng Ning ; Douglas Reeves

$330,000 by the National Science Foundation
07/01/2002 - 06/30/2006

Current intrusion detection systems (IDSs) usually generate a large number of false alerts, and often do not detect novel attacks or variations of known attacks. Moreover, most of the existing IDSs focus on low-level attacks or anomalies; none of them can capture the logical steps or strategies behind these attacks. This project will exploit alert correlation techniques to reduce false alerts, discover attackers' high-level strategies, and predict possible future attacks based on the detected attacks in progress.
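The prerequisite/consequence idea behind the correlation can be sketched as follows (the three-entry knowledge base is our own toy example): an edge is drawn from alert a to a later alert b whenever a's consequences supply one of b's prerequisites, and chains of edges expose the strategy.

```python
# Hypothetical knowledge base: alert type -> (prerequisites, consequences).
KB = {
    "port_scan":    ({"ip_reachable"},       {"service_discovered"}),
    "buffer_oflow": ({"service_discovered"}, {"shell_access"}),
    "install_ddos": ({"shell_access"},       {"ddos_ready"}),
}

def correlate(alerts):
    """Link alert a -> b when a precedes b and a's consequences intersect
    b's prerequisites; alerts is a time-ordered list of type names."""
    edges = []
    for i, a in enumerate(alerts):
        for b in alerts[i + 1:]:
            if KB[a][1] & KB[b][0]:
                edges.append((a, b))
    return edges

print(correlate(["port_scan", "buffer_oflow", "install_ddos"]))
```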

Correlating Alerts Using Prerequisites of Intrusions Towards Reducing False Alerts & Uncovering High Level Attacks
Peng Ning ; Douglas Reeves

$258,812 by the U.S. Army Research Office
07/01/2002 - 05/31/2005

Current intrusion detection systems (IDSs) usually generate a large number of false alerts, and often do not detect novel attacks or variations of known attacks. Moreover, most of the existing IDSs focus on low-level attacks or anomalies; none of them can capture the logical steps or strategies behind these attacks. As a result, it is difficult for human users or intrusion response systems to understand the nature of the attack and to take appropriate actions. To address these issues, this project will investigate techniques to correlate intrusion alerts on the basis of the prerequisites and consequences of attacks.

Positioning and Reliable Data Transmission of Sensor Networks
Peng Ning ; Wesley Snyder

$199,823 by the US Army
08/01/2004 - 07/31/2007

Reliable and sufficient sensor coverage is an important requirement for a successful sensor network deployment. The goal of the project is to study and develop optimal positioning algorithms for sensor network deployments that provide surveillance and monitoring of assets or facilities. This project considers two scenarios: (1) when sensors can be placed deliberately for asset protection, an optimal sensor positioning and deployment algorithm is needed; (2) when sensors are deployed randomly over a geographic region to protect a given asset, the best scheduling plan must be determined for activating a subset of the deployed sensors so that sufficient surveillance is provided.

A Performance Evaluation Study of an Optical Network with Temporal and Spatial Scheduling: A CACC Enhancement Project
Harry Perros

$56,517 by CACC-Nortel Networks
03/01/2004 - 12/31/2005

We propose to construct simulation and queueing-based models of an optical architecture proposed by Nortel Networks. There is little argument in the networking community that wireline networks will eventually be based on an optical core. Currently, owing to the collapse of the "bubble", the networking industry has drastically reduced its R&D on optical networks. However, it is important to have blueprints ready for commercialization when the tide turns! This proposed research aims at evaluating the performance of one such blueprint.

Tracing Attacks Through Non-Cooperative Networks and Stepping Stones with Timing-Based Watermarking
Douglas Reeves ; Peng Ning

$1,179,321 by the US Department of Interior
09/29/2003 - 02/28/2007

Increasingly, the nation's infrastructure is connected by the Internet for computing, communication, monitoring, and control. Adversaries (hackers, criminals, terrorists, etc.) can and do exploit this connectivity to attack networked computers and devices. In order to defend against such attacks, and to prosecute the adversaries, it is necessary to be able to identify the source of the attack. Tracing an attack back through the Internet to its source is the goal of this research project. As part of this project, tools for watermarking flows and performing timing correlation in real-time will be developed.
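One simple form of timing-based watermarking, shown purely for illustration (a quantization scheme of our own; the project's actual embedding may differ): inter-packet delays are nudged to even or odd multiples of a quantization step, and the downstream correlator recovers the bits.

```python
def embed_bit(ipd, bit, q=0.02):
    """Embed one bit by moving an inter-packet delay (seconds) to the
    nearest even (bit 0) or odd (bit 1) multiple of the step q."""
    k = round(ipd / q)
    if k % 2 != bit:
        k += 1                   # shift to the correct parity bin
    return k * q

def decode_bit(ipd, q=0.02):
    """Recover the embedded bit from an observed delay."""
    return round(ipd / q) % 2

marked = [embed_bit(d, b) for d, b in zip([0.113, 0.087, 0.142], [1, 0, 1])]
assert [decode_bit(d) for d in marked] == [1, 0, 1]
```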

CAREER: Investigation of Error Recovery Techniques for Interactive Video Transmission over Wireless Networks
Injong Rhee

$269,075 by the National Science Foundation
04/01/1999 - 03/31/2005

We propose to study a new class of error recovery techniques that focuses on eliminating error propagation. The approach is to isolate errors when they occur by preventing them from propagating, so that delays in repairing data losses affect only the duration of error propagation. Our Recovery from Error Spread using Continuous Updates (RESCU) scheme does not introduce any delay in video playout and has the potential to achieve good error resilience. In contrast to existing approaches, our proposed techniques have the potential to work well with or without feedback channels and to scale for multicast. Encouraging preliminary results indicate that this potential is highly realizable.

NeTS-NOSS: Exploring the Design Space of Sensor Networks Using Route-Aware MAC Protocols
Injong Rhee ; Robert Fornaro

$584,999 by the National Science Foundation
01/01/2005 - 12/31/2008

As applications for wireless sensor networks are extremely diverse, sensor network designers will benefit immensely from (sensor) network protocols that can provide a wide spectrum of design choices, especially for very low energy budget applications. In this proposal, the PIs plan to develop a suite of new MAC protocols for sensor network applications based on a new approach, called Route-aware Media Access Control (RASMAC), that can greatly diversify design choices for application designers. A comprehensive evaluation of the developed protocols and their performance models is planned that involves design and implementation of a wildlife tracking system.

Investigation and Evaluation of Binary Increase Congestion Control for Ultra-Scale Networks
Injong Rhee ; Khaled Harfoush

$40,000 by CACC
07/01/2004 - 06/30/2005

The main approach of the proposed research is to emulate the per-flow characteristics of a "scalable" protocol with multiple parallel flows. The main goals of the multi-flow emulation are to achieve protocol fairness and to overcome end-system bottlenecks. A new congestion control protocol called BIC (binary increase congestion control) is proposed as the protocol being emulated. The per-flow characteristics of BIC exhibit good scalability and fairness, making it suitable for multi-flow emulation. Third-party testing by SLAC over production networks, involving a preliminary version of BIC and six other scalable TCP variants, reported that BIC consistently topped the rankings in stability, scalability, and fairness. In this project, the PIs plan to enhance the responsiveness and scalability of BIC while retaining its fairness and stability properties.
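The window growth that gives BIC its name can be sketched as follows (a simplified, illustrative model of the per-RTT increment, not the full protocol):

```python
def bic_increment(cwnd, w_max, s_max=32.0, s_min=0.01):
    """Simplified BIC per-RTT increment: below the last known maximum
    w_max, jump halfway toward it (binary search), clamped to
    [s_min, s_max]; above w_max, probe gently for a new maximum."""
    if cwnd < w_max:
        step = (w_max - cwnd) / 2.0       # binary search toward w_max
    else:
        step = (cwnd - w_max) + s_min     # cautious max probing
    return min(max(step, s_min), s_max)

# After a loss event reduced the window from 100, growth is fast at first
# and flattens as cwnd approaches the old maximum:
cwnd, w_max = 80.0, 100.0
for _ in range(8):
    cwnd += bic_increment(cwnd, w_max)
```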

NeTS-NR: Traffic Quantization: A Formal Framework for Quality of Service (QoS) and Scalability in Packet-Switched Networks
George Rouskas

$357,314 by the National Science Foundation
09/01/2004 - 08/31/2009

Traffic quantization is a new approach to supporting per-flow functionality in packet-switched networks in an efficient and scalable manner. We propose the concept of tiered service to alleviate the complexity associated with supporting per-flow QoS: a quantized network offers a small set of service tiers, and each flow is mapped to the tier that guarantees its QoS. Research will consist of four components: develop novel quantized implementations of weighted fair queueing (WFQ); develop Linux implementations of quantized WFQ to validate the theoretical results; extend the quantization approach to multiple traffic parameters; and investigate efficient constraint-based routing algorithms for quantized traffic.
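The tier-mapping step can be illustrated with a toy rule (geometrically spaced tiers of our own choosing; designing and analyzing such mappings rigorously is the point of the project):

```python
import math

def tier_for_rate(rate_mbps, base=1.0, factor=2.0):
    """Map a flow's requested rate to the smallest service tier covering it,
    with tiers spaced geometrically (base, 2*base, 4*base, ...). The network
    then manages a handful of tiers instead of per-flow state."""
    if rate_mbps <= base:
        return base
    return base * factor ** math.ceil(math.log(rate_mbps / base, factor))

assert tier_for_rate(3.2) == 4.0   # a 3.2 Mb/s flow lands in the 4 Mb/s tier
```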

A Formal Approach to Traffic Grooming in Optical Networks with General Topologies
George Rouskas ; Carla Savage ; Rudra Dutta

$404,968 by the National Science Foundation
09/01/2003 - 08/31/2007

We address the problem of grooming traffic into lightpaths for transport over general topology optical networks so as to minimize the network cost. We will first study the traffic grooming problem in a number of elemental topologies such as rings, stars, and trees. We will consequently develop hierarchical approaches to tackle the problem in general topologies by decomposing it into smaller subproblems involving elemental topologies. The end-result of this project will be a suite of traffic grooming algorithms with formally verified properties that can be flexibly and efficiently applied within a variety of optical network and cost models.

Enumeration and Structure in Families of Partitions, Compositions, and Combinations
Carla Savage

$183,287 by the National Science Foundation
07/15/2003 - 06/30/2007

The proposed research is an investigation of fundamental questions involving the structure of combinatorial families and relationships between families with intrinsically different characterizations. The focus is on families of integer partitions, compositions, and combinations. The topics under investigation include new work on partitions and compositions defined by linear inequalities; new tools for investigating classical questions about generalizations of the Rogers-Ramanujan identities; and symmetric chain decompositions with geometric applications.

Analysis and Evaluation of Combinatorial Structures and Algorithms: US-France Cooperative Research
Carla Savage ; William Stewart

$21,000 by the National Science Foundation
02/05/2003 - 01/31/2006

This project is the U.S. portion of a joint NSF/CNRS proposal for a cooperative research effort involving U.S. faculty from North Carolina State and Drexel Universities and French faculty from the University of Versailles Saint-Quentin. The funds will support travel for the U.S. participants to visit the University of Versailles to establish a cooperative research program focusing on an integrated approach to the analysis of combinatorial structures arising in applications.

ITR: Computational Principles of Trust
Munindar Singh

$573,473 by the National Science Foundation
09/01/2000 - 06/30/2007

Successful interaction relies heavily upon trust. This applies equally to electronic commerce and virtual social communities. However, figuring out who to trust and to what extent is extremely difficult in open networked information environments. Trust is a complex concept and involves aspects of competence and good nature (of the trusted party) and the risk tolerance and urgency (of the trusting party). This project studies distributed, scalable computational approaches for trust management, especially with regard to aggregate phenomena such as the emergence of subcommunities, pivots (which link different subcommunities), and the sensitivity of a community to invasion by nontrustworthy players.

Principles of Commitment Protocols
Munindar Singh

$345,000 by the National Science Foundation
05/15/2002 - 04/30/2007

Business protocols structure and streamline interactions among autonomous business partners. Traditional representations of protocols specify legal sequences of actions but not their meaning. Thus they cannot adequately support flexible interactions, e.g., to handle exceptions and exploit opportunities. This project is developing a declarative model of protocols that gives meaning to, and reasons about, states and actions based on the participants' commitments. This approach improves flexibility while maintaining rigor. This project is studying practical protocols from real-life domains such as transactions among financial institutions and other varieties of electronic business.

Protocols for Processes: A Semantic Approach Based on WEL and OWL-S
Munindar Singh

$220,000 by BBNT Solutions, LLC
02/02/2004 - 09/30/2005

Serious applications of Web services involve complex processes. However, current approaches are inadequate for flexibly modeling and enacting processes. This project is developing an approach for specifying and enacting processes that is based on flexibly represented protocols. Protocols specify what rather than how; thus they naturally maximize the autonomy, heterogeneity, and dynamism of the interacting services. This project is developing a language for representing protocols that incorporates selected abstractions based on commitments and roles, and enables rule-based specifications of service compositions based on protocols. The project includes a proof-of-concept prototype that demonstrates the above ideas.

Intelligent Human-Machine Interface and Control for Highly Automated Chemical Screening Processes
Robert St. Amant ; David Kaber ; Mo-Yuen Chow

$786,000 by the National Science Foundation
10/01/2004 - 09/30/2007

The breakthrough information technology we will develop through this ITR project is an intelligent, adaptive human-machine interface to support the new role of screening-process supervisors in the safe and effective distributed control of time-stressed, high-risk automated chemical and toxicity testing. The development of this technology will be based on cognitive modeling of supervisory controller behaviors during actual chemical screening processes, and on model predictions of operator performance with different interactive information display design alternatives during the design phase and during chemical process run-time.

Towards an International Research Partnership Program on Human-Automation Interaction in the Life Sciences
Robert St. Amant ; David Kaber ; Robert Kelly ; Mo-Yuen Chow ; and Leonard Bull

$58,699 by the National Science Foundation
09/01/2004 - 08/31/2005

The goal is to develop a large "international research partnership program" proposal by building a collection of small, integrated research projects at NCSU in bioinformatics that will complement an integrated set of projects on life sciences automation to be conducted at CELISCA. The idea behind the proposal is that the two universities may be able to define a large high-throughput screening process by linking the biocatalysis and high-performance analytical chemistry facilities at CELISCA with the microarray research facilities at NCSU.

Human-Computer Visualization
Robert St. Amant ; Christopher Healey

$30,000 by Soar Technology
08/16/2004 - 01/31/2005

We propose to apply visualization and intelligent user interface concepts to support situation assessment for company-level decision-making. Our plan of work has three parts. First, we propose to perform an analysis of existing map-based techniques for visualizing strategic information, in order to identify strengths, weaknesses, and opportunities for improvement. Second, we propose to identify promising techniques for building map-based visualizations that present tactical information relevant to decision-making in a Military Operations in Urban Terrain (MOUT) context. Third, we propose to develop a prototype visualization that will illustrate the effectiveness of the techniques we identify.

Structured Methods to Evaluate the Performance of Distributed Software
William Stewart

$120,738 by the University of California - Riverside
02/05/2004 - 08/31/2006

This is a collaborative research effort aimed at removing the computational barriers to widespread adoption of Markov chain modeling technology, with application to the performance modeling of concurrent software. We adopt a compositional modeling formalism in which the underlying Markov chain is kept as a sum of Kronecker products of matrices of small dimension. Goals include: (1) exploring the relationship between high-level formalisms suitable for modeling concurrent software and their underlying Kronecker structure; (2) developing efficient solution methods for asynchronous interactions and block-based Kronecker descriptions, and studying their complexity; (3) exploring how to combine the savings due to lumping with those due to implicit representations.

National Extreme Events Data And Research Center (NEED) Transforming The National Capability For Resilience To Extreme Weather And Climate Events (Supplement)
Ranga Vatsavai

$19,999 by Oak Ridge National Laboratory via US Dept of Energy
03/16/2015 - 09/30/2016

An NCSU graduate student will develop a machine learning approach to linking extreme atmospheric ridging events with extreme surface temperatures, employing a Gaussian Process (GP)-based predictive analysis tool that leverages the predictive value of spatial and temporal correlations and builds on ORNL's past success in spatial classification, temporal change prediction, and parallelizing GP for large spatiotemporal extents.
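A minimal sketch of such a GP-based predictive tool, using scikit-learn on synthetic stand-in data (the actual work involves spatiotemporal kernels and parallelization at much larger scale):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic stand-in: a ridge-strength index vs. a temperature anomaly.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 10.0, 40).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

# The RBF kernel captures smooth correlation; WhiteKernel absorbs noise.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X, y)
mean, std = gp.predict(np.array([[4.5]]), return_std=True)  # mean + uncertainty
```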

Center for Scientific Data Management-Agent Technology Enabling Communication Among Tools and Data
Mladen Vouk

$906,987 by the U.S. Department of Energy
08/15/2001 - 08/14/2007

The Scientific Data Management Center is a SciDAC-funded center whose goal is to establish an enabling-technology center providing a coordinated framework for the unification, development, deployment, and reuse of scientific data management software, including scientific workflow technologies, specifically through SDM's Scientific Process Automation (SPA) focus area. The goal of this technology is to allow easy and accurate interactions and flows among the distributed computational, storage, and application resources used in scientific discovery.

Women and Information Technology: A Comparative Study of Young Women from Middle Grades through High School and into College
Mladen Vouk ; Sarah Berenson ; Joan Michael

$500,027 by the National Science Foundation
08/15/2002 - 07/31/2006

This project is a seven-year longitudinal research study of young women who were identified as talented in mathematics in middle school. The purpose of this research is to address the current low participation of young women in IT careers. The project will develop and test a model of the factors associated with young women's decisions to persist in advanced mathematics and computer science courses, preparing themselves for, and choosing, careers in information technology (IT). IT careers are defined in this proposal as those requiring an electrical engineering, computer science, or computer engineering degree.

Security Assessment of Cisco mIOS - CNL portion
Mladen Vouk ; Dennis Kekas

$88,863 by NTI-Cisco Systems
10/01/2003 - 12/31/2005

This project will develop a security and exploit assessment methodology and scenarios based on analyses of existing solutions and parameterization of internal and external interfaces; explore and develop an initial mIOS security probe and metrics; and explore and develop an initial definition, and possibly a working model, of an anomaly-based security module.

CAREER: The Test-Driven Development of Secure and Reliable Software Applications
Laurie Williams

$413,764 by the National Science Foundation
04/01/2004 - 03/31/2011

Our nation's critical infrastructure demands that our current and future IT professionals have the knowledge, tools, and techniques to produce reliable and trustworthy software. The objective of this research is to extend, validate, and disseminate a software development practice to aid in the prevention of computer-related disasters. The practice is based upon test-driven development (TDD), a software development technique with tight verification and validation feedback loops. The proposed work extends the TDD practice and provides a supportive open-source tool for explicitly situating security and reliability as primary attributes considered in these tight feedback loops.

Regression Testing Without Code
Laurie Williams

$115,067 by ABB, Inc.
01/10/2005 - 08/15/2007

The goal of this research project is to produce a validated method of regression test selection for software components whose source code is not available to the development organizations incorporating them into their products. This includes all third-party components that are used "off the shelf", as well as internal components whose source code is not readily available at the time of regression test selection. The project will involve an understanding of component changes, regression testing, and release documentation.

Supporting Evidence-Based Software Engineering
Laurie Williams

$40,000 by CACC
07/01/2005 - 12/31/2006

This project will apply a research framework to provide industry-based research results on agile software development, software process transition, process customization, reliability and quality prediction, requirements prioritization, pair programming vs. inspection, and distributed software development. The research will also adapt an Eclipse plug-in to enable detailed causal analysis of field failures and to empirically examine the defect-removal efficacy of V&V techniques, such as inspections and unit testing, to build up evidence about these practices.

Academy for Software Engineering Educators & Trainers
Laurie Williams

$10,000 by the National Science Foundation
09/01/2005 - 08/31/2006

A special one-day Academy for Software Engineering Educators & Trainers will be offered on the day prior to the start of The Conference on Software Engineering Education and Training (CSEE&T). The purpose of the Academy is to provide an opportunity for software engineering educators and trainers to learn from master instructors in a highly dynamic, hands-on, interactive environment. The Academy will provide a learning opportunity to PhD students who will be entering academic careers in the near term, new faculty members, and mid-career faculty members who are beginning to teach software engineering course(s).

On the Identification of Agile and Plan-Driven Best Practices
Laurie Williams ; Carla Savage ; Jason Osborne

$40,000 by CACC
07/01/2004 - 12/31/2005

The objective of this proposal is to define and validate a methodology for identifying software development best practices, or sets of best practices, to guide the tailoring of defined processes. In this way, organizations can choose a set of mutually supportive, non-conflicting practices for software development based on their project characteristics. Through this research, we believe member organizations will be better able to understand the implications of tailoring a defined method in specific ways, increasing a team's ability to complete reliable projects on time.

Extending Extreme Programming
Laurie Williams ; Mladen Vouk

$254,134 by CACC-NSA
09/15/2003 - 12/31/2008

The Extreme Programming methodology was designed for relatively small teams of collocated programmers working on non-critical, small-to-medium object-oriented projects. Little empirical assessment has been done of the methodology, though a sizable amount of anecdotal evidence supports its use under these conditions. We are proposing collaborative research with the NSA and Galois in which we will empirically assess the efficacy of Extreme Programming practices in a high-confidence, secure, functional programming project. Additionally, we will work on integrating formal methods, reliability, and security testing into the set of Extreme Programming practices.

Collaboration through Agile Software Development Practices: A Means for Improvement in Quality and Retention of IT Workers
Laurie Williams ; Mladen Vouk ; Jason Osborne ; Winser Alexander ; and Sarah Berenson

$812,587 by the National Science Foundation
06/15/2003 - 06/30/2009

This ITWF award to NCSU, NCA&T, and Meredith College will support a three-year study of the collaborative aspects of agile software development methodologies. We believe the collaboration and the social component inherent in these methodologies is appealing to people whose learning and training models are socially oriented, such as some minority groups, women, and men. The project's objective is to perform extensive, longitudinal experimentation in advanced undergraduate software engineering college classes at the three institutions to examine student success and retention in the educational and training pipeline when the classes utilize an agile software development model.

Continuous Checking of Static Analysis and Automated Unit Tests for Java Programs
Laurie Williams ; Jun Xu

$40,000 by CACC
07/01/2005 - 12/31/2006

Researchers propose to develop a "Continuous Checking" tool that would use the computer's available CPU cycles to continuously provide feedback to software developers on compilation, static analysis, and testing defects. The user can train the tool to reduce false positives reported from static analysis. Researchers will implement Continuous Checking as a plug-in for the open-source Eclipse development environment and will validate the effectiveness of the tool via empirical studies of open-source programs and by working closely with CACC members.

CAREER: Automated Synthesis of Bidding Strategies for Trading Agents
Peter Wurman

$300,010 by the National Science Foundation
08/01/2001 - 07/31/2007

This project will investigate approaches to building a strategy generation engine as a component of a flexible trading agent that converts user preferences, auction rules, and a model of the other agents into a form suitable for automated decision making. The first strategy generation engine will produce game-theoretic representations of the decision problem. For small problems, the game can be solved and an equilibrium bidding strategy selected. For intractable larger problems, however, alternate strategy generation engines will be constructed that use other decision technologies. Ideally, the agent will be able to make this choice by assessing the structure of the problem instance.

Techniques for Building Robust Software
Jun Xu

$40,000 by CACC
07/01/2004 - 08/31/2005

The proposed research develops two methods for finding robustness problems in software: a black-box method and a static source-code analysis method. It contributes a comprehensive methodology that addresses limitations in current research. The major phases and deliverables of the project are: (1) a POSIX error interface definition and classification in an XML database; (2) black-box robustness testing tools, including a stress-insertion tool based on both ptrace/ltrace and library wrappers; (3) a compiler that enforces robustness properties using intra- and inter-procedural analysis; (4) robustness properties derived from black-box evaluation results.
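A tiny illustration of the library-wrapper style of stress insertion (a Python stand-in for illustration only; the project targets POSIX C interfaces via ptrace/ltrace-style tracing and wrappers):

```python
import builtins, errno, random

_real_open = builtins.open

def _faulty_open(*args, **kwargs):
    """Wrapper-based stress insertion: occasionally make open() fail with
    a POSIX error so the caller's error-handling paths are exercised."""
    if random.random() < 0.1:
        raise OSError(errno.EIO, "injected I/O error")
    return _real_open(*args, **kwargs)

builtins.open = _faulty_open   # install the wrapper for the test run
```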

Creating Effective Task Descriptions from Action Plans
R. Michael Young

$315,000 by the National Science Foundation
08/15/2004 - 07/31/2008

Artificial intelligence planning systems are being put to use to determine the activities of a wide range of intelligent interactive systems. The ability for these kinds of systems to explain their plans to human users is essential for the systems' successful adoption and use. We are investigating the generation of natural language descriptions of plan data structures. This work will develop a cognitive and computational model of task context and its role in the generation of action descriptions.

CAREER: Plan-Based Integration of Control and Coherence in Intelligent Exploratory Environments
R. Michael Young

$480,695 by the National Science Foundation
03/15/2001 - 08/31/2007

The use of virtual environments has shown success in applications ranging from education to entertainment. One limitation of these systems is that users' activities within them are over- or under-constrained. In this project, I will develop new models for the structure of user interactions within virtual worlds. Because a user's understanding of the activity in a world provides scaffolding for her own exploration, presenting the user with an environment in which action can be readily understood encourages the user to acquire and employ knowledge of the environment. This activity leads to an increased understanding of the world the environment models.

HI-FIVES: Using Web-Based Gaming to Improve Student Comprehension of Information Technology in Science
R. Michael Young (Co-PI) ; Leonard Annetta (PI) ; Deborah Mangum ; Thomas Miller

$1,197,270 by the National Science Foundation
09/01/2005 - 08/31/2009

Researchers in science education, computer science, distance education, and the NC Department of Public Instruction are partnering with the Kenan Fellows Program to harness the untapped potential of inexpensive, online multi-user competitive simulation software in improving the science achievement and IT skills of NC's grade 6-12 students. Over three years, teacher participants will learn how to use this technology to increase student science achievement and motivation to enter IT-related science careers. Intellectual merits of the project entail rigorous assessment and evaluation of how these environments most effectively improve the IT skills and science content mastery of students.