
Total: $49,015,716

Current Computer Science Research Projects (by faculty)

The funded projects listed below are currently active; their combined funding total appears above.

Track 2: CS10K: BJC-STARS: Scaling CS Principles through STARS community & Leadership Development
Tiffany Barnes

$500,000 by National Science Foundation
10/01/2015 - 09/30/2018

BJC-STARS is a CS10K proposal to broaden access to Computer Science education through engaging colleges and universities to prepare and support regional communities of high school teachers to teach the Beauty and Joy of Computing (BJC) Computer Science Principles course. We will leverage the successful STARS model focusing on engaging faculty and students in a community committed to leading efforts to broaden participation in computing. Each year, we will engage new university faculty who will teach BJC and facilitate professional development and support to high school teachers and students. We will also build a STARS community among participating high school teachers and students, engaging them in the need to broaden participation in computing.

Collaborative Research: Modeling Social Interaction and Performance in STEM Learning
Tiffany Barnes

$200,003 by National Science Foundation
09/01/2014 - 08/31/2017

Despite long-standing awareness that social interaction is an integral part of knowledge construction, efforts to study complex collaborative learning have traditionally been relegated to qualitative and small-scale methodologies. Relatively new data traces left by online learning environments, including massive open online courses (MOOCs), offer the first real hope for scaling up such analyses. The purpose of the proposed research is to develop comprehensive models for collaborative learning which in turn will enable instructional design and the authentic assessment of the individual within the group context. This task is undertaken by an interdisciplinary team of researchers with specializations in natural language processing, discourse analysis, social network analysis, educational data mining and psychometrics.

REU Site: Interactive and Intelligent Media
Tiffany Barnes

$359,999 by National Science Foundation
04/01/2013 - 03/31/2017

The REU Site at NC State University will immerse a diverse group of undergraduates in a vibrant research community of faculty and graduate students working on cutting-edge games, intelligent tutors, and mobile applications. We will recruit students from underrepresented groups and colleges and universities with limited research opportunities through the STARS Alliance, an NSF-funded national consortium of institutions dedicated to broadening participation in computing. Using the Affinity Research Groups and STARS Training for REUs models, we will engage faculty and graduate student mentors with undergraduates to create a supportive culture of collaboration while promoting individual contributions to research through just-in-time training for both mentors and students throughout the summer.

BPC-AE: Scaling the STARS Alliance: A National Community for Broadening Participation Through Regional Partnerships
Tiffany Barnes

$150,000 by UNC Charlotte (via National Science Foundation)
01/01/2013 - 03/21/2017

The Beauty and Joy of Computing project presents a unique opportunity to scale the STARS Alliance further while also enhancing national efforts to engage more high school teachers and students in teaching and learning computing and to build stronger university/college/K-12 partnerships. Through this supplement, we will extend the Alliance with at least three new STARS Computing Corps, providing leadership training to a group of 8-10 students in each Corps, all focused on supporting the BJC effort. New Corps will provide teaching assistance to high school teachers implementing the BJC course through classroom visits and monthly Computer Science Teachers Association chapter meetings. These new STARS Computing Corps will also teach BJC material through middle school Citizen Schools after-school programs and K-12 summer camps. This will provide a vibrant community of support for high school teachers and students engaging with the new BJC course.

BPC-AE: Scaling the STARS Alliance: A National Community for Broadening Participation Through Regional Partnerships (supplement)
Tiffany Barnes

$39,164 by UNC-Charlotte via National Science Foundation
01/01/2013 - 12/31/2016

The Beauty and Joy of Computing project presents a unique opportunity to scale the STARS Alliance further while also enhancing national efforts to engage more high school teachers and students in teaching and learning computing and to build stronger university/college/K-12 partnerships. Through this supplement, we will extend the Alliance with at least three new STARS Computing Corps, providing leadership training to a group of 8-10 students in each Corps, all focused on supporting the BJC effort. New Corps will provide teaching assistance to high school teachers implementing the BJC course through classroom visits and monthly Computer Science Teachers Association chapter meetings. These new STARS Computing Corps will also teach BJC material through middle school Citizen Schools after-school programs and K-12 summer camps. This will provide a vibrant community of support for high school teachers and students engaging with the new BJC course.

Type I: Collaborative Research: FRABJOUS CS - Framing a Rigorous Approach to Beauty and Joy for Outreach to Underrepresented Students in Computing at Scale
Tiffany Barnes

$352,831 by National Science Foundation
02/01/2013 - 08/31/2016

In this FRABJOUS CS project, we will prepare 60 high school teachers to teach the Beauty and Joy of Computing (BJC) Computer Science Principles curriculum. The BJC course is a rigorous introductory computing course that highlights the meaning and applications of computing while introducing low-threshold programming environments: Snap! (a Scratch-derived language), GameMaker, and AppInventor. BJC is informed and inspired by the Exploring Computer Science curriculum, which was explicitly designed to channel the interests of urban high school students with "culturally relevant and meaningful curriculum" [Goode 2011][Margolis 2008]. The BJC course uses collaborative classroom methods including pair learning, and student-selected projects are geared toward leveraging students' knowledge of social media, games, devices, and the Internet. At UNC Charlotte in 2010 and 2011, PI Barnes engaged college students in supporting the BJC course, and in after-school outreach and summer camps that excite middle and high school students about this curriculum at different levels. The project engages three university faculty members and six college students to help the high school teachers build a Computer Science Teachers Association chapter and provide ongoing professional development and support for the BJC course. The project also engages high school teachers and an education researcher to help refine and enrich the BJC curriculum to be easier to adopt and teach in high schools.

Type I: Collaborative Research: FRABJOUS CS - Framing a Rigorous Approach to Beauty and Joy for Outreach to Underrepresented Students in Computing at Scale (Supplement)
Tiffany Barnes

$86,000 by National Science Foundation
02/01/2013 - 08/31/2016

In this FRABJOUS CS project, we will prepare 60 high school teachers to teach the Beauty and Joy of Computing (BJC) Computer Science Principles curriculum. The BJC course is a rigorous introductory computing course that highlights the meaning and applications of computing while introducing low-threshold programming environments: Snap! (a Scratch-derived language), GameMaker, and AppInventor. BJC is informed and inspired by the Exploring Computer Science curriculum, which was explicitly designed to channel the interests of urban high school students with "culturally relevant and meaningful curriculum" [Goode 2011][Margolis 2008]. The BJC course uses collaborative classroom methods including pair learning, and student-selected projects are geared toward leveraging students' knowledge of social media, games, devices, and the Internet. At UNC Charlotte in 2010 and 2011, PI Barnes engaged college students in supporting the BJC course, and in after-school outreach and summer camps that excite middle and high school students about this curriculum at different levels. The project engages three university faculty members and six college students to help the high school teachers build a Computer Science Teachers Association chapter and provide ongoing professional development and support for the BJC course. The project also engages high school teachers and an education researcher to help refine and enrich the BJC curriculum to be easier to adopt and teach in high schools.

Educational Data Mining for Individualized Instruction in STEM Learning Environments
Min Chi; Tiffany Barnes

$639,401 by National Science Foundation
09/01/2014 - 08/31/2017

Human one-on-one tutoring is one of the most effective educational interventions. Tutored students often perform significantly better than students in classroom settings (Bloom, 1984; Cohen, Kulik, & Kulik, 1982). Computer learning environments that mimic aspects of human tutors have also been highly successful. Intelligent Tutoring Systems (ITSs) have been shown to be highly effective in improving students' learning in real classrooms (Anderson, Corbett, Koedinger, & Pelletier, 1995; Koedinger, Anderson, Hadley, & Mark, 1997; VanLehn et al., 2005). The development of ITSs has enabled schools and universities to reach out and educate students who otherwise would be unable to take advantage of one-on-one tutoring due to cost and time constraints (Koedinger, Anderson, Hadley, & Mark, 1997). Despite the high payoffs provided by ITSs, significant barriers remain. High development costs and the challenges of knowledge engineering have prevented their widespread deployment: development, testing, and even maintenance require a diverse team of software developers, domain experts, and educational theorists, and, generally speaking, an average of 80 person-hours per hour of tutoring content. In this proposed work, our goal is to automatically design effective personalized ITSs directly from log data. We will combine co-PI Dr. Barnes's data-driven approach on learning what to teach with PI Dr. Chi's data-driven work on learning how to teach. More specifically, we will explore two important undergraduate STEM domains, discrete math and probability, and employ two types of ITSs: an example-based ITS, the discrete math tutor, and a rule-based ITS, Pyrenees. The former can automatically generate hints directly from students' prior solutions, while the latter has hard-coded domain rules and teaches students a domain-general problem-solving strategy within the context of probability.
For learning how to teach, we will apply reinforcement learning to induce adaptive pedagogical strategies directly from students' log files, focusing on three levels of pedagogical decisions: 1) whether or not to give students hints, and what level of hint (step level); 2) whether to show students a worked example or ask them to engage in problem solving (problem level); and 3) whether or not to teach students a meta-cognitive problem-solving strategy (metacognitive level).
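As a rough illustration of the policy-induction step described above, the following is a minimal sketch of offline tabular Q-learning over logged tutoring decisions. The state/action encoding, reward signal, and toy log are invented for illustration; the project's actual strategies are induced from real student log files.

```python
from collections import defaultdict

# Hypothetical log format: each step is (state, action, reward, next_state),
# where actions are pedagogical decisions such as "give_hint" or "show_example".
def induce_policy(log, alpha=0.1, gamma=0.9, epochs=50):
    """Induce a greedy pedagogical policy from logged steps via
    offline tabular Q-learning (an illustrative sketch)."""
    q = defaultdict(float)
    actions = {a for (_, a, _, _) in log}
    for _ in range(epochs):
        for s, a, r, s2 in log:
            best_next = max((q[(s2, a2)] for a2 in actions), default=0.0)
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
    # For each observed state, pick the highest-valued action.
    best = {}
    for (s, a), v in dict(q).items():
        if s not in best or v > best[s][1]:
            best[s] = (a, v)
    return {s: a for s, (a, _) in best.items()}

# Toy episodes: hints pay off when a student is stuck; worked examples
# pay off when a topic is new.
log = [("stuck", "give_hint", 1.0, "solved"),
       ("stuck", "show_example", 0.0, "stuck"),
       ("new_topic", "show_example", 1.0, "solved"),
       ("new_topic", "give_hint", 0.0, "new_topic")]
```

On this toy log, `induce_policy(log)` learns to give hints in the "stuck" state and show a worked example in the "new_topic" state.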

Membership for Center of Hybrid Multicore Productivity Research (CHMPR), Full Member
Rada Chirkova

$80,000 by Northrop Grumman
01/01/2016 - 12/31/2017

Membership dues from Northrop Grumman for the Center of Hybrid Multicore Productivity Research. Exploring and analyzing the available data is key to making the right decisions. It is well known that "data wrangling," which includes many kinds of end-to-end data enablement, makes up 60-80% of the total effort in analytics on large-scale data. We look to address the challenge of maximizing the usefulness of the available data by providing tools, science, and talent for next-generation technologies and infrastructure. We focus on empowering organizations that wish to unlock the value of decisions based on their data, and envision a future where technologies and tools for data enablement provide significant business advantages to such organizations. At NCSU, we will lead national and international efforts in this space by developing and providing technologies and tools for bridging the time gap between the acquisition of data and real-time and long-term decision making.

NeTS: JUNO: Service Offering Model and Versatile Network Resource Grooming for Optical Packet and Circuit Integrated Networks
Rudra Dutta

$291,956 by National Science Foundation
04/01/2014 - 03/31/2017

The explosive growth in bandwidth represented by advances in optical communication and networking technologies has underpinned the increasing reach and reliability of the Internet in the last two decades. However, the potential impact of increasingly sophisticated recent advances in optical technology, such as rapid switching and elastic wavelengths, has not yet been realized. The main cause is that such technology, while possible to integrate into the data plane of planetary-scale networks, is difficult to accommodate in current planning, management, and control strategies. We propose in this project to work hand-in-hand with collaborating researchers from NICT, Japan, who are working to realize a novel technology of hybrid optical packet/circuit switching. Such a technology could be immensely useful to large transport network operators, but there are no existing algorithms that can easily determine how a provider can provision resources between the circuit and packet possibilities on an ongoing dynamic basis. We envision a novel approach to this problem in which we utilize the concept of a "choice marketplace" that allows sophisticated rendezvous semantics between customer and provider, and allows them to cooperatively guide network resource provisioning to dynamically fulfill network objectives such as maximizing the performance received by network traffic. Our approach also allows balancing of various objectives, such as network utilization, latency, and the increasingly important metric of energy expenditure in the network.

NeTS: Small: Collaborative Research: Enabling Robust Communication in Cognitive Radio Networks with Multiple Lines of Defense
Rudra Dutta; Peng Ning

$249,901 by National Science Foundation
10/01/2013 - 09/30/2016

Cognitive radio is an emerging advanced radio technology in wireless access, with many promising benefits including dynamic spectrum sharing, robust cross-layer adaptation, and collaborative networking. Opportunistic spectrum access (OSA) is at the core of cognitive radio technologies and has received great attention recently, with a focus on improving spectrum utilization efficiency and reliability. However, the state of the art still suffers from one severe security vulnerability that has been largely overlooked by the research community so far: a malicious jammer can leverage publicly available channel-statistic information to effectively jam the channels, disrupting legitimate network communication and leading to serious spectrum underutilization. In this project, we address the challenge of effective anti-jamming communication in cognitive radio networks (CRNs). We propose a multiple-lines-of-defense approach that considers and integrates defense technologies from different dimensions, including frequency hopping, power control, cooperative communication, and signal processing. The proposed defense approach enables both reactive and proactive protection, from evading jammers to competing against jammers to expelling jamming signals, and thus guarantees effective anti-jamming communication under a variety of network environments.
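As a toy illustration of the frequency-hopping line of defense, a keyed pseudorandom hop schedule keeps the channel sequence unpredictable to a jammer who lacks the shared secret. This is a sketch only (the function name and parameters are invented); the project's actual defense integrates hopping with power control, cooperative communication, and signal processing.

```python
import hashlib

def hop_channel(shared_key: bytes, slot: int, num_channels: int = 64) -> int:
    """Derive the channel for a given time slot from a shared secret, so a
    jammer without the key cannot predict the hop sequence.
    (Illustrative sketch; names and parameters are invented.)"""
    digest = hashlib.sha256(shared_key + slot.to_bytes(8, "big")).digest()
    return int.from_bytes(digest[:4], "big") % num_channels

# A sender and receiver holding the same key agree on every slot's channel.
key = b"shared-secret"
sequence = [hop_channel(key, t) for t in range(5)]
```

Because the schedule is derived from a cryptographic hash, observing past channel usage gives the jammer no statistical edge in predicting the next hop.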

NeTS: Large: Collaborative Research: Network Innovation Through Choice
Rudra Dutta; George Rouskas

$643,917 by National Science Foundation
09/15/2011 - 08/31/2016

This project builds on the SILO project, started in 2006, to design a new architecture for the Internet. In this new project, we will collaborate with teams of researchers from the University of Kentucky, the University of Massachusetts, and RENCI to design critical parts of a new Internet architecture that will support the flexible use of widely applicable information transport and transformation modules to create good solutions for specific communication applications. The key idea is to allow a network to offer information transformation services at the edge or in the core, transparently to the application, and to create a framework in which applications can issue a request not only for communication but for specific reusable services. We also propose research tasks that will enable network virtualization and isolation seamlessly at many levels, currently a difficult but highly relevant problem in practical networking.

TWC: Medium: Collaborative: Improving Mobile-Application Security via Text Analytics
William Enck

$300,000 by National Science Foundation
07/01/2015 - 06/30/2018

Computing systems that make security decisions often fail to take into account human expectations. This failure occurs because human expectations are commonly drawn from textual sources (e.g., a mobile application's description) and are hard to extract and codify. This proposal seeks to extract expectation context from natural-language text artifacts presented to users as they find, install, and run software. The proposed work focuses specifically on mobile applications to demonstrate concrete and practical advances in our scientific understanding of applying user-expectation context to security decisions. These findings will advance the state of the art in identifying and classifying malware and grayware, as well as identify better methods of communicating risk to users. We will also gain a better understanding of the unique challenges of applying text analytics to the security domain.

TWC: Medium: Collaborative: Improving Mobile-Application Security via Text Analytics (supplement)
William Enck

$8,000 by National Science Foundation
07/01/2015 - 06/30/2018

Smartphones and mobile devices have become a dominant computing platform for billions of users. There are many static and dynamic program-analysis tools for detecting security and privacy concerns in mobile applications. However, few approaches bridge the semantic gap between code and visual presentation. Ignoring this context results in analysis that falsely reports an application as malicious (e.g., the user really wanted to use an app to record phone calls) or fails to detect suspicious behavior (e.g., an app that collects sensitive information via text input). We propose to use a hybrid static/dynamic approach to extract the text labels from Android application UIs, followed by text analytics to semantically label the type of input asked for by the application. Doing so will better characterize the amount of security- and privacy-sensitive information entered into Android applications, as well as enable outlier detection to identify applications that ask for unexpected information (e.g., an SSN) for their semantic class (e.g., banking applications). This analysis will be applied at scale to identify potential privacy infringements in mobile application stores.
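The outlier-detection idea described above can be sketched as follows, assuming a hypothetical corpus that maps each semantic class to the input-field types observed across its apps. The data model, names, and 5% threshold are all illustrative, not the project's actual pipeline.

```python
from collections import Counter

def unexpected_inputs(app_class, app_fields, corpus):
    """Flag input-field types that are rare among apps of the same semantic
    class. `corpus` maps class -> list of field-type sets, one set per app.
    (Hypothetical data model; the project derives field types from UI labels.)"""
    observed = corpus[app_class]
    freq = Counter(f for fields in observed for f in fields)
    n = len(observed)
    # A field type seen in under 5% of same-class apps is treated as suspicious.
    return {f for f in app_fields if freq[f] / n < 0.05}

# 99 typical banking apps plus one that also asks for an SSN.
corpus = {"banking": [{"username", "password"}] * 99
                     + [{"username", "password", "ssn"}]}
flags = unexpected_inputs("banking", {"username", "password", "ssn"}, corpus)
```

Here `flags` contains only `"ssn"`: username and password fields are expected for the banking class, while an SSN field is a same-class outlier.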

CAREER: Secure OS Views for Modern Computing Platforms
William Enck

$400,000 by National Science Foundation
02/01/2013 - 01/31/2018

Controlling the access and use of information is a fundamental challenge of computer security. Emerging computing platforms such as Android and Windows 8 further complicate access control by relying on sharing and collaboration between applications. When more than two applications participate in a workflow, existing permission systems break down due to their boolean nature. In this proposal, we seek to provide applications with residual control of their data and its copies. To do this, we propose secure OS views, which combine a new abstraction for accessing data with whole-system information tracking. We apply secure OS views to modern operating systems (e.g., Android and Windows 8), which use database-like abstractions for sharing and accessing information. Similar to a database view, a secure OS view uses runtime context to dynamically define the protection domain, allowing the return of the real value, a fake value, or the apparent nonexistence of the record.
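A database-view-like mediation layer of the kind described above can be sketched in a few lines. The policy format and all names here are invented for illustration; the actual mechanism operates inside the OS with whole-system information tracking.

```python
def secure_view(record, field, caller_ctx, policy):
    """Mediate access to one field of a shared record: depending on runtime
    context, return the real value, a benign fake value, or behave as if the
    record does not exist. (Sketch of the abstraction; policy format and
    names are invented.)"""
    decision = policy(record, field, caller_ctx)
    if decision == "allow":
        return record[field]
    if decision == "fake":
        return type(record[field])()  # benign default, e.g. "" for a string
    return None                       # "nonexistent": hide the record entirely

contact = {"name": "Alice", "phone": "555-0100"}
policy = lambda rec, field, ctx: "allow" if ctx == "trusted" else "fake"
```

A trusted caller sees the real phone number, while an untrusted caller receives an empty string and cannot distinguish it from a contact with no phone on file.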

Correct Enforcement of Access Control Policy in Modern Operating Systems; Research Area: 5.3 Information and Software Assurance
William Enck

$411,895 by US Army - Army Research Office
05/09/2016 - 11/08/2016

Consumer operating systems are changing. Modern platforms such as Android, iOS, and Windows 8 provide new abstractions for specifying and enforcing access control policy on third-party applications run by end users. The new abstractions add complexity to both policy specification and enforcement. In this proposal, we focus specifically on the correctness of enforcement in these modern platforms. The proposed work seeks to extract a formal semantics of access control policy by mining existing code bases. We then analyze the extracted model for correct enforcement of security goals. The models will also be compared across platform variations, as well as across different platforms. In doing so, we seek to harden existing platforms and establish stronger trustworthiness in a security-critical layer of platforms relied upon by consumers, enterprises, and governments.

TWC: Small: Collaborative: Characterizing the Security Limitations of Accessing the Mobile Web
William Enck

$167,000 by National Science Foundation
10/01/2012 - 09/30/2016

Mobile browsers are beginning to serve as critical enablers of modern computing. With a combination of rich features that rival their desktop counterparts and strong security mechanisms such as TLS/SSL, these applications are becoming the basis of many other mobile apps. Unfortunately, the security guarantees provided by mobile browsers and the risks associated with today's mobile web have not been evaluated in great detail. In the proposed work, we will investigate the security of mobile browsers and the mobile-specific websites they access. Characterizing and responding to the threats in this space is critical, especially given that cellular devices are used by more than five billion people around the world.

Collaborative Research: Research in Student Peer Review: A Cooperative Web-Services Approach
Edward Gehringer

$1,034,166 by National Science Foundation
09/01/2014 - 08/31/2017

Peer review between students has a 40-year history in academia. During the last half of that period, web-based peer-review systems have been used in tens of thousands of classes. Many online systems have been developed, in diverse settings and with diverse purposes. The systems, however, have common concerns: assigning capable reviewers to each student submission, ensuring review quality, and delivering reliable scores in cases where the systems are used for summative review of student work. Many strategies have been proposed to meet those concerns and tested in relatively small numbers of courses. The next step is to scale up the studies to learn how well they perform in diverse settings and with large numbers of students. This project brings together researchers from several peer-review systems, including some of the largest, to build web services that can be incorporated into existing systems to test these strategies and visualize the results.

Collaborative Research: Research in Student Peer Review: A Cooperative Web-Services Approach (Supplement)
Edward Gehringer

$40,000 by National Science Foundation
09/01/2014 - 08/31/2017

The students will assist our efforts to build a database of peer-review responses that can be mined for quantitative research studies. The database will be composed of anonymized data from the peer-review systems of the constituent projects: CritViz, CrowdGrader, Expertiza, and Mobius/Slip. Among other items, it will contain peer feedback and ratings, and links to submitted work. The students will also embark on a qualitative research study to determine what STEM students value about the peer-review process, using a common set of research protocols to investigate three research questions: What do students value about receiving reviews? What do they value about giving reviews? Do their reactions differ based on demographics, age/level of study, or academic major?

CSR: Medium:Collaborative Research: Holistic, Cross-Site, Hybrid System Anomaly Debugging for Large Scale Hosting Infrastructures
Xiaohui (Helen) Gu

$518,000 by National Science Foundation
08/01/2015 - 07/31/2019

Hosting infrastructures provide users with cost-effective computing solutions by obviating the need for users to maintain complex computing infrastructures themselves. Unfortunately, due to their inherent complexity and sharing nature, hosting infrastructures are prone to various system anomalies caused by external or internal faults. The goal of this project is to investigate a holistic, cross-site, hybrid system anomaly debugging framework that intelligently integrates production-site black-box diagnosis and developer-site white-box analysis into a more powerful hosting infrastructure anomaly debugging system.

CAREER: Enable Robust Virtualized Hosting Infrastructures via Coordinated Learning, Recovery, and Diagnosis
Xiaohui (Helen) Gu

$450,000 by National Science Foundation
01/01/2012 - 12/31/2016

Large-scale virtualized hosting infrastructures have become the fundamental platform for many real-world systems such as cloud computing, enterprise data centers, and educational computing labs. However, due to their inherent complexity and sharing nature, hosting infrastructures are prone to various runtime problems such as performance anomalies and software/hardware failures. The overarching objective of this proposal is to systematically explore innovative runtime reliability management techniques for large-scale virtualized hosting infrastructures. Our research focuses on handling performance anomalies in distributed systems that are often very difficult to reproduce offline. Our approach combines the power of online learning, knowledge-driven first-response recovery, and in-situ diagnosis to handle unexpected system anomalies more efficiently and effectively. We aim at transforming runtime system anomaly management from a trial-and-error guessing game into an efficient knowledge-driven self-healing process.
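The online-learning component can be illustrated with a minimal sliding-window detector that flags metric samples deviating sharply from recent history. This is a stand-in sketch under invented parameters, not the project's actual learning method.

```python
from collections import deque

class OnlineAnomalyDetector:
    """Flag metric samples (e.g., latency, CPU) that deviate sharply from a
    sliding window of recent history -- a minimal stand-in for the
    online-learning component described above."""
    def __init__(self, window=50, threshold=3.0):
        self.samples = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value):
        anomalous = False
        if len(self.samples) >= 10:  # wait for a minimal baseline
            mean = sum(self.samples) / len(self.samples)
            var = sum((x - mean) ** 2 for x in self.samples) / len(self.samples)
            std = var ** 0.5 or 1e-9  # guard against zero variance
            anomalous = abs(value - mean) / std > self.threshold
        self.samples.append(value)
        return anomalous
```

Feeding the detector a steady metric stream and then a sudden spike returns `True` for the spike while normal fluctuation stays below the z-score threshold.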

Cloud Configuration Management System for Elastic Application Deployment in Private Clouds
Xiaohui (Helen) Gu

$49,567 by Credit Suisse Securities, LLC
01/02/2015 - 08/15/2016

Cloud computing infrastructure provides an elastic application deployment environment. Applications can be dynamically instantiated on different physical hosts on demand. However, in order to fully exploit the elasticity of the cloud infrastructure, applications should be able to automatically configure themselves when their components are placed in, or migrated to, data centers in geographically distributed regions. Unfortunately, today's technology does not provide such automatic configuration support: the application developer still must configure applications manually. The objective of this project is to develop an automatic application configuration management framework that decouples configuration management from application logic. We propose to conduct the following tasks: 1) identifying the configuration problems in the existing cloud platform and container model; 2) designing the configuration management framework and interfaces that decouple configuration management from the application, using a multi-tier online auction application as a case study; and 3) implementing the designed configuration management framework and optimizing its performance.
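One simple way to decouple configuration from application logic, in the spirit described above, is to resolve settings from the deployment environment at startup so a migrated component picks up its new region's values without code changes. The key names below are illustrative, not part of the project's framework.

```python
import os

def resolve_config(defaults, env=os.environ):
    """Build an application's runtime configuration from environment
    variables, falling back to defaults. A sketch of configuration
    decoupling; key names are illustrative."""
    return {key: env.get("APP_" + key.upper(), default)
            for key, default in defaults.items()}

# A component migrated to another data center picks up local settings
# from its environment without any change to application code.
defaults = {"db_host": "localhost", "db_port": "5432"}
config = resolve_config(defaults, env={"APP_DB_HOST": "db.east.example.com"})
```

Here the migrated instance resolves `db_host` from its environment while `db_port` falls back to the default.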

1. Predictive Analytics and Visualization of Narrative Threads for Large Document Collections; 2. Visualizing the Structure of a Deep Convolution Neural Net for Text Understanding
Christopher Healey

$108,262 by SAS Institute Inc
08/16/2015 - 08/15/2016

Our goal in this project is to focus on two related tasks. First, we will extend work from last year's (2014-2015) SAS-NCSU MRA research project to study predictive analytics for narrative threads: the ability to predict how a sequence of narrative events could unfold in the future, based on past and current events; algorithms to determine how different past choices would have affected current and future states; and visualization and user-interface tools and techniques to allow users to explore the predictive space in efficient and effective ways. Second, we will collaborate with the deep learning group to study the problem of visualizing nodes and relationships in the layers of a convolutional neural network (CNN). Visualization may allow researchers to gain important insights into how different nodes in the layers of a CNN contribute to its final output.

Identification of Translational Hormone-Response Gene Networks and Cis-Regulatory Elements
Steffen Heber (co-PI); Jose Alonso (Lead PI, CALS); Anna Stepanova (CALS); Cranos Williams (ECE)

$897,637 by National Science Foundation
08/01/2015 - 07/31/2020

Plants, as sessile organisms, need to constantly adjust their intrinsic growth and developmental programs to environmental conditions. These environmentally triggered "adjustments" often involve changes in the developmentally predefined patterns of one or more hormone activities. In turn, these hormonal changes result in alterations at the gene expression level and the concurrent alteration of cellular activities. In general, these hormone-mediated regulatory functions are achieved, at least in part, by modulating the transcriptional activity of hundreds of genes. The study of these transcriptional regulatory networks not only provides a conceptual framework to understand the fundamental biology behind these hormone-mediated processes, but also the molecular tools needed to accelerate the progress of modern agriculture. Although often overlooked, understanding the translational regulatory networks behind complex biological processes has the potential to empower similar advances in both basic and applied plant biology. By taking advantage of the recently developed ribosome footprinting technology, genome-wide changes in translation activity in response to ethylene were quantified at codon resolution, and new translational regulatory elements have been identified in Arabidopsis. Importantly, the detailed characterization of one of the regulatory elements identified indicates that this regulation is NOT miRNA-dependent, and that the identified regulatory element is also responsive to the plant hormone auxin, suggesting a role in the interaction between these two plant hormones. These findings not only confirm the basic biological importance of translational regulation and its potential as a signal-integration mechanism, but also open new avenues to identifying, characterizing, and utilizing additional regulatory modules in plant species of economic importance.
Towards that general goal, a plant-optimized ribosome footprinting methodology will be deployed to examine the translation landscape of two plant species, tomato and Arabidopsis, in response to two plant hormones, ethylene and auxin. A time-course experiment will be performed to maximize the detection sensitivity (strong vs. weak) and diversity (early vs. late activation) of additional translational regulatory elements. The large amount and dynamic nature of the generated data will also be utilized to generate hierarchical transcriptional and translational interaction networks between these two hormones and to explore the possible use of these types of diverse information to identify key regulatory nodes. Finally, the comparison between the two plant species will provide critical information on the conservation of the regulatory elements identified and, thus, inform research on future practical applications.

Intellectual merit: The identification and characterization of signal-integration hubs and cis-regulatory elements of translation will allow us not only to better understand how information from different origins (environment and developmental programs) is integrated, but also to devise new strategies to control this flow for the advancement of agriculture.

Broader impacts: A new outreach program will promote interest among middle and high school students in combining biology, computers, and engineering. We will use our current NSF-supported Plants4kids platform (ref), with its web-based bilingual outreach tools, monthly demos at the science museum, and visits to local schools, to implement this new outreach program. Examples of demonstration modules will include comparisons between simple electronic and genetic circuits.

Transcriptional Nodes Coordinate Patterning and Cellular Proliferation During Carpel Margin Meristem Development
Steffen Heber/co-PI ; Robert Franks/Lead PI-Genet

$771,784 by National Science Foundation
03/ 1/2014 - 02/28/2017

The coordination of spatial patterning cues and cellular proliferation underlies diverse processes from cancerous growth to reproductive development. A long-term objective of my research program is to understand how proliferative cues are coordinated with spatial information during organogenesis. In Arabidopsis thaliana this coordination of patterning and proliferation is necessary within the carpel margin meristem (CMM) to generate ovules that when fertilized will become seeds. In the previous funding period we demonstrated that the SEUSS (SEU) and AINTEGUMENTA (ANT) transcription factors regulate critical patterning events that support carpel margin meristem and ovule development. Our genetic analysis demonstrates that SEU and ANT share a partially redundant and overlapping function essential for proper seed formation. As SEU and ANT do not share sequence similarity, the molecular basis for this redundancy is not understood. We propose that the SEU and ANT activities synergistically converge at key transcriptional nodes. A node in this sense is a gene or a set of related genes that requires the combined activities of SEU and ANT for its expression. Our recently published transcriptomic analysis indicates that many of these nodes encode known transcriptional regulators. By studying these nodes we hope to better understand the transcriptional hierarchies that control CMM development and uncover the mechanistic basis of the synergistic action of SEU and ANT. Our transcriptomics study cannot determine if the nodes that we have identified are directly or indirectly regulated by SEU or ANT activity. However, even if these genes are indirectly controlled by SEU and ANT activity, their expression within the developing CMM suggests they may still play a critical functional role during CMM development.
Furthermore, having now identified a set of genes that are enriched for CMM expression we are in a position to study the cis-regulatory elements that support gene expression within the CMM and medial gynoecial domain. Thus, here we propose to: 1) Identify direct targets of SEU regulation within the CMM to further refine the transcriptional hierarchy required for CMM development; 2) Assay the functional role of two of these nodes during CMM development, one encoded by the transcription factor PERIANTHIA and the second encoded by members of the REM family of B3 domain-containing proteins; 3) Identify cis-acting DNA regulatory elements required for CMM expression. Scientific significance: Understanding the coordination of cellular proliferation and spatial patterning during organogenesis is broadly of interest to scientists working in a diversity of fields. Completion of these specific aims will move us toward this future goal by illuminating the mechanistic basis for the overlapping functions of SEU and ANT during carpel margin and ovule development. Additionally, we expect that by elucidating the molecular mechanisms of the synergistic action of SEU and ANT upon key transcriptional nodes, we will engender a greater understanding of the molecular underpinnings of non-additivity within transcriptional networks and the complexity of developmental programs. Past NSF funding for this project (IOS-0821896) has resulted in the publication of five articles in well-respected journals (two in Plant Physiology, and one each in Developmental Biology, PLoS One, and BMC Plant Biology). Broader impacts: I ensure a broad societal impact from my program by integrating my research efforts with my teaching and training responsibilities and by widely disseminating materials and results. Furthermore, I initiated and continue to lead an outreach group that prepares and presents hands-on science demonstrations at local North Carolina schools.
Our group has reached over 1500 Kindergarten through Grade 12 students over the past six years and continues to develop new demonstration modules inspired by our current work in developmental biology and genetics.

Collaborative Research: Transforming Computer Science Education Research Through Use of Appropriate Empirical Research Methods: Mentoring and Tutorials
Sarah Heckman

$406,557 by National Science Foundation
09/ 1/2015 - 08/31/2020

The computer science education (CSEd) research community consists of a large group of passionate CS educators who often contribute to other disciplines of CS research. There has been a trend in other disciplines toward more rigorous and empirical evaluation of various hypotheses. However, many of the practices that we apply to demonstrate rigor in our disciplinary research are ignored or actively avoided when performing research in CSEd. This suggests that CSEd is “theory scarce” because most publications are not research and do not provide the evidence or replication required for meta-analysis and theory building. An increase in empiricism in CSEd research will move the field from “scholarly teaching” to the “scholarship of teaching and learning” (SoTL), providing the foundation for meta-analysis and the generation of theories about teaching and learning in computer science. We propose the creation of training workshops and tutorials to educate the educators about appropriate research design and evaluation of educational interventions. The creation of laboratory packages, “research-in-a-box,” will support sound evaluation and replication, leading to meta-analysis and theory building in the CSEd community.

Learning Environments Across Disciplines LEADS: Supporting Technology Rich Learning Across Disciplines: Affect Generation and Regulation During Co-Regulated Learning in Game-Based Learning Environments (Supplement)
James Lester

$114,672 by McGill University/Social Sciences and Humanities Research Council of Canada
04/ 1/2012 - 02/28/2020

Contemporary research on multi-agent learning environments has focused on self-regulated learning (SRL), while relatively little effort has been made to use co-regulated learning as a guiding theoretical framework (Hadwin et al., 2011). This oversight needs to be addressed given the complex roles that self- and other-regulatory processes play when human learners and artificial pedagogical agents (APAs) interact to support learners' internalization of cognitive, affective, and metacognitive (CAM) SRL processes. We will use the Crystal Island learning environment to investigate these issues.

SCH: INT: Collaborative Research: A Self-Adaptive Personalized Behavior Change System for Adolescent Preventive Healthcare
James Lester

$952,818 by National Science Foundation
10/ 1/2013 - 09/30/2017

Although the majority of adolescent health problems are amenable to behavioral intervention, and most adolescents are comfortable using interactive computing technology, few health information technology interventions have been integrated into adolescent care. The objective of the proposed research is to design, implement, and investigate INSPIRE, a self-adaptive personalized behavior change system for adolescent preventive healthcare. With a focus on adolescents, INSPIRE will enable adolescents to be active participants in dynamically generated, personalized narrative experiences that operationalize theoretically grounded interventions for behavior change through interactive narratives' plot structures and virtual character interactions.

CHS: Medium: Adapting to Affect in Multimodal Dialogue-Rich Interaction with Middle School Students
James Lester ; Kristy Boyer ; Bradford Mott ; Eric Wiebe

$1,184,073 by National Science Foundation
08/ 1/2014 - 07/31/2017

Despite the great promise offered by learning environments for K-12 science education, realizing their potential poses significant challenges. In particular, current learning environments do not adaptively support children's affect. This project investigates computational models of affect that support multimodal dialogue-rich interaction. With an emphasis on devising scalable solutions, the project focuses on machine-learning techniques for automatically acquiring affect and dialogue models that can be widely deployed in natural classroom settings.

Tutorial Planning with Markov Decision Processes for Counterinsurgency Training Environments
James Lester ; Bradford Mott ; Jonathan Rowe

$1,072,237 by US Army - Army Research Laboratory
04/10/2015 - 04/ 9/2018

Intelligent tutoring systems (ITSs) are highly effective for education and training. Tutorial planning is a critical component of ITSs, controlling how scaffolding is structured and delivered to learners. Devising data-driven tutorial planners that automatically induce scaffolding models from corpora of student data holds significant promise. This project investigates a data-driven framework for tutorial planning that is based on modular reinforcement learning. This framework explicitly accounts for the inherent uncertainty in how learners respond to different types of tutorial strategies and tactics, and automatically induces and refines tutorial planning policies in order to optimize measures of learning outcomes.
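As an illustrative sketch only (not the project's actual system), the data-driven tutorial-planning idea can be reduced to a tiny tabular reinforcement-learning loop: tutorial tactics are actions, coarse learner states are states, and a simulated learner stands in for corpus data. All states, actions, and probabilities below are hypothetical.

```python
import random

# Toy tutorial planner trained by tabular Q-learning.
# The simulated learner and its improvement probabilities are invented
# for illustration; a real system would induce these from student data.

ACTIONS = ["hint", "prompt", "no_op"]       # tutorial tactics
STATES = ["struggling", "progressing"]      # coarse learner states

def simulate_step(state, action, rng):
    """Toy learner model: hints help struggling learners the most."""
    if state == "struggling":
        p_improve = {"hint": 0.7, "prompt": 0.4, "no_op": 0.1}[action]
    else:
        p_improve = {"hint": 0.3, "prompt": 0.5, "no_op": 0.4}[action]
    improved = rng.random() < p_improve
    reward = 1.0 if improved else 0.0       # proxy for a learning outcome
    next_state = "progressing" if improved else "struggling"
    return next_state, reward

def train_policy(episodes=5000, alpha=0.1, gamma=0.9, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        state = rng.choice(STATES)
        for _ in range(10):                 # one short tutoring session
            if rng.random() < epsilon:      # epsilon-greedy exploration
                action = rng.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[(state, a)])
            nxt, r = simulate_step(state, action, rng)
            best_next = max(q[(nxt, a)] for a in ACTIONS)
            q[(state, action)] += alpha * (r + gamma * best_next - q[(state, action)])
            state = nxt
    return q

q = train_policy()
# The induced policy should learn to prefer hints for struggling learners.
best = max(ACTIONS, key=lambda a: q[("struggling", a)])
```

The modular-RL framework in the project decomposes this into several such learners whose policies are combined; the sketch shows only the single-module core.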

The Effectiveness of Intelligent Virtual Humans in Facilitating Self-Regulated Learning in STEM with MetaTutor
James Lester (Co-PI) ; Roger Azevedo

$1,365,603 by National Science Foundation
09/ 1/2014 - 08/31/2017

Intelligent virtual humans (IVHs) are able to connect with real people in powerful, meaningful, and complex ways. They can mimic the behavior of real people and therefore add a rich social dimension to computer interactions, providing not only a wealth of information but presenting information in more personals ways. This 3-year project will focus on testing the effectiveness of IVHs in facilitating college students self-regulated learning in STEM with MetaTutor. More specifically, we plan to test IVHs detection, monitoring, and modeling (both facially and verbally) the temporal dynamics of learners self-regulatory processes to enhance learners deployment of effective learning strategies, accurate metacognitive judgments, and appraisals of emotional states. This will be accomplished by aligning and conducting complex computational and statistical analyses of a multitude of trace data (e.g., log-files, eye-tracking), behavioral (e.g., human-virtual human dialogue moves), physiological measures (e.g., GSR, ECG, EEG), and learning outcome data collected in real-time. The proposed research, in the context of using IVHs, is extremely challenging and will help us to better understand the nature and temporal dynamics of these processes, how they contribute to various types of learning, and provide the empirical basis for designing intelligent virtual human systems. The results from this grant will contribute significantly to models and theories of social, cognitive, and physiological bases of human-virtual human interactions; statistical and computational methods to make inferences from complex multi-channel data; theoretical and conceptual understanding of temporally-aligned data streams, and enhancing students understanding of complex science topics by making more sensitive and intelligent virtual humans.

Guiding Understanding via Information from Digital Environments (GUIDE)
James Lester Co-PI ; Eric Wiebe Lead PI

$1,238,549 by Concord Consortium via National Science Foundation
09/15/2015 - 08/31/2019

This project will utilize research and development groups at the Concord Consortium and NC State University. Educational software for teaching high school multi-level genetics developed by the Concord Consortium will be enhanced by intelligent agents and machine-based tutoring system technologies developed at NC State to help enhance the learning experience for students. These groups will collaborate closely to develop and research a hybrid system that combines technological intervention and teacher pedagogical expertise to illuminate and guide student learning in deeply digital curricula and classrooms.

SHF:Medium:Collaborative:Transfer Learning in Software Engineering
Tim Menzies

$464,609 by National Science Foundation
08/ 2/2014 - 06/30/2017

Software engineers need better ways to recognize best practices in past projects, and to understand how to transfer and adapt those experiences to current projects. No project is exactly like previous projects- hence, the trick is to find which parts of the past are most relevant and can be transferred into the current project. We propose novel automated methods to apply the machine learning concept of transfer learning to adapt lessons from past software engineering project data to new conditions.
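One simple tactic from the cross-project prediction literature gives a feel for "finding which parts of the past are most relevant": a relevancy filter that keeps only past-project rows lying near the new project's data before training. The sketch below is a hypothetical, stdlib-only illustration; the data and the choice of Euclidean distance are assumptions, not the project's method.

```python
import math

def euclidean(a, b):
    """Plain Euclidean distance between two metric vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def relevancy_filter(past_rows, new_rows, k=2):
    """For each new-project row, keep its k nearest past-project rows."""
    selected = set()
    for target in new_rows:
        ranked = sorted(range(len(past_rows)),
                        key=lambda i: euclidean(past_rows[i][0], target))
        selected.update(ranked[:k])
    return [past_rows[i] for i in sorted(selected)]

# Invented example: (metrics vector, defective?) pairs from an old project.
past = [((1.0, 2.0), False), ((1.1, 2.1), False),
        ((9.0, 8.0), True),  ((9.2, 7.9), True),
        ((50.0, 50.0), True)]               # outlier unlike the new project
new = [(1.05, 2.05), (9.1, 8.0)]            # unlabeled new-project rows

training_set = relevancy_filter(past, new, k=2)
# The outlier far from the new project's data is filtered out before training.
```

A learner trained on `training_set` then sees only past experience that resembles the current project, which is the intuition behind transferring lessons selectively rather than wholesale.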

Enabling Evidence-Based Modernization
Timothy Menzies

$70,000 by Carnegie Mellon University via USAF
01/13/2016 - 09/30/2016

Working with the SEI Enabling Evidence-Based Modernization project, researchers at NC State will explore optimum acquisition decisions based on multiple preferences of stakeholders. NC State researchers will attend periodic meetings to contribute to the project team's research discussions and understanding and co-author with the SEI team at least one white paper/report and one conference paper about this research.

SHF: Small: Scalable Trace-Based Tools for In-Situ Data Analysis of HPC Applications (ScalaJack)
Frank Mueller

$457,395 by National Science Foundation
06/ 1/2012 - 05/31/2017

This decade is projected to usher in the period of exascale computing with the advent of systems with more than 500 million concurrent tasks. Harnessing such hardware with coordinated computing in software poses significant challenges. Production codes tend to face scalability problems, but current performance analysis tools seldom operate effectively beyond 10,000 cores. We propose to combine trace analysis and in-situ data analysis techniques at runtime. Application developers thus create ultra-low-overhead measurement and analysis facilities on the fly, customized for the performance problems of a particular application. We propose an analysis generator called ScalaJack for this purpose. Results of this work will be contributed as open-source code to the research community and beyond, as done in past projects. Pluggable, customizable analysis not only allows other groups to build tools on top of our approach but also lets them contribute components to our framework that will be shared in a repository hosted by us.

CPS: Breakthrough: Collaborative Research: Bringing the Multicore Revolution to Safety-Critical Cyber-Physical Systems
Frank Mueller

$400,000 by National Science Foundation
02/ 1/2013 - 01/31/2017

Multicore platforms have the potential of revolutionizing the capabilities of embedded cyber-physical systems but lack predictability in execution time due to shared resources. Safety-critical systems require such predictability for certification. This research aims at resolving this multicore ``predictability problem.'' It will develop methods that enable shared hardware resources to be allocated predictably, including support for real-time operating systems, middleware, and associated analysis tools. The devised methods will be evaluated through experimental research involving synthetic micro-benchmarks and code for unmanned air vehicles ``re-thinking'' their adaptation to changing environmental conditions within cyber-physical systems.

Hobbes: OS and Runtime Support for Application Composition
Frank Mueller

$300,000 by Sandia National Laboratories via US Dept of Energy
10/24/2013 - 10/23/2016

This project intends to deliver an operating system and runtime system (OS/R) environment for extreme-scale scientific computing. We will develop the necessary OS/R interfaces and low-level system services to support isolation and sharing functionality for designing and implementing applications as well as performance and correctness tools. We propose a lightweight OS/R system with the flexibility to custom-build runtimes for any particular purpose. Each component executes in its own "enclave" with a specialized runtime and isolation properties. A global runtime system provides the software required to compose applications out of a collection of enclaves, join them through secure and low-latency communication, and schedule them to avoid contention and maximize resource utilization. The primary deliverable of this project is a full OS/R stack based on the Kitten operating system and Palacios virtual machine monitor that can be delivered to vendors for further enhancement and optimization.

SHF: Small: RESYST: Resilience via Synergistic Redundancy and Fault Tolerance for High-End Computing
Frank Mueller

$376,219 by National Science Foundation
10/ 1/2010 - 09/30/2016

In High-End Computing (HEC), faults have become the norm rather than the exception for parallel computation on clusters with tens to hundreds of thousands of cores. As the core count increases, so does the overhead for fault-tolerant techniques that rely on checkpoint/restart (C/R) mechanisms. At 50% overheads, redundancy becomes a viable alternative to fault recovery and actually scales, which makes the approach attractive for HEC. The objective of this work is to develop a synergistic approach that combines C/R-based fault tolerance with redundancy in computation to achieve high levels of resilience. This work alleviates scalability limitations of current fault-tolerant practices. It contributes to fault modeling as well as fault detection and recovery, significantly advancing existing techniques by controlling levels of redundancy and checkpointing intervals in the presence of faults. It is transformative in providing a model where users select a target failure probability at the price of using additional resources.
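The scaling argument can be made concrete with a back-of-the-envelope model (our own toy calculation, not the project's analysis): C/R waste grows as failures arrive faster at scale, while dual redundancy costs a fixed 50% of resources. The per-node MTBF and checkpoint cost below are illustrative guesses, and the overhead estimate uses Young's classic optimal-checkpoint-interval approximation.

```python
import math

def cr_waste_fraction(nodes, node_mtbf_h=87600.0, ckpt_cost_h=0.25):
    """Approximate C/R overhead via Young's optimal checkpoint interval.

    system MTTF shrinks as 1/nodes; waste is checkpoint-writing time plus
    expected recomputation after a failure, as fractions of wall time.
    """
    system_mttf = node_mtbf_h / nodes            # failures arrive faster at scale
    interval = math.sqrt(2 * ckpt_cost_h * system_mttf)
    return ckpt_cost_h / interval + interval / (2 * system_mttf)

DUAL_REDUNDANCY_WASTE = 0.5    # every task runs twice: a fixed 50% of resources

small = cr_waste_fraction(1_000)     # C/R is cheap at modest scale
large = cr_waste_fraction(100_000)   # ...but overwhelms redundancy at extreme scale
```

Under these assumed parameters the C/R waste stays well below 50% at a thousand nodes but exceeds it at a hundred thousand, which is the crossover that motivates combining the two mechanisms.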

Resilience for Global Address Spaces
Frank Mueller

$153,934 by Lawrence Berkeley National Laboratory via US Dept of Energy
09/24/2013 - 08/15/2016

The objective of this work is to provide functionality for the BLCR Linux module under a PGAS runtime system (within the DEGAS software stack) to support advanced fault-tolerant capabilities, which are of specific value in the context of large-scale computational science codes running on high-end clusters and, ultimately, exascale facilities. Our proposal is to develop and integrate into DEGAS a set of advanced techniques to reduce the checkpoint/restart (C/R) overhead.

Resilience for Global Address Spaces (Supplement)
Frank Mueller

$50,000 by Lawrence Berkeley National Laboratory via US Dept of Energy
09/24/2013 - 08/15/2016

The objective of this work is to provide functionality for the BLCR Linux module under a PGAS runtime system (within the DEGAS software stack) to support advanced fault-tolerant capabilities, which are of specific value in the context of large-scale computational science codes running on high-end clusters and, ultimately, exascale facilities. Our proposal is to develop and integrate into DEGAS a set of advanced techniques to reduce the checkpoint/restart (C/R) overhead.

CAREER:Expanding Developers' Usage of Software Tools by Enabling Social Learning
Emerson Murphy-Hill

$495,721 by National Science Foundation
08/ 1/2013 - 07/31/2018

Tools can help software developers alleviate the challenge of creating and maintaining software. Unfortunately, developers only use a small subset of the available tools. The proposed research investigates how social learning, an effective mechanism for discovering new tools, can help software developers to discover relevant tools. In doing so, developers will be able to increase software quality while decreasing development time.

FSE 2016 Doctoral Consortium and Mentorship Sessions Program
Emerson Murphy-Hill

$24,952 by National Science Foundation
07/ 1/2016 - 06/30/2017

The two proposed events at the 2016 Symposium on the Foundations of Software Engineering will build connections and enhance research capacity for junior researchers. Specifically, the doctoral symposium and mentorship sessions will help senior doctoral students and pre-tenure faculty make connections with senior researchers and get feedback on their work, fostering the creation of a supportive community of scholars and a spirit of collaborative research.

TWC: Small: Collaborative: Discovering Software Vulnerabilities Through Interactive Static Analysis
Emerson Murphy-Hill

$249,854 by National Science Foundation
10/ 1/2013 - 09/30/2016

Software vulnerabilities originating from insecure code are one of the leading causes of security problems people face today. Current tool support for secure programming focuses on catching security errors after the program is written. We propose a new approach, interactive static analysis, to improve upon static analysis techniques by introducing a new mixed-initiative paradigm for interacting with developers to aid in the detection and prevention of security vulnerabilities.

SHF: Small: Expressive and Scalable Notifications from Program Analysis Tools
Emerson Murphy-Hill ; Sarah Heckman

$250,000 by National Science Foundation
10/ 1/2012 - 09/30/2016

A wide variety of program analysis tools have been created to help software developers do their jobs, yet the output of these tools is often difficult to understand and varies significantly from tool to tool. As a result, software developers may waste time trying to interpret the output of these tools, instead of making their software more capable and reliable. This proposal suggests a broad investigation of several types of program analysis tools, with the end goal being an improved understanding of how program analysis tools can inform developers in the most expressive and uniform way possible. Once this goal is reached, we can create program analysis tools that enable developers to make tremendous strides towards more correct, more reliable, and more on-time software systems.

Is Wireless Channel Dependable for Security Provisioning?
Peng Ning (co-PI)

$350,000 by the National Science Foundation
08/12/2013 - 07/31/2016

Wireless security is receiving increasing attention as wireless systems become a key component in our daily life as well as critical cyber-physical systems. Recent progress in this area exploits physical layer characteristics to offer enhanced and sometimes the only available security mechanisms. The success of such security mechanisms depends crucially on the correct modeling of underlying wireless propagation. It is widely accepted that wireless channels decorrelate fast over space, and half a wavelength is the key distance metric used in existing wireless physical layer security mechanisms for security assurance. We believe that this channel correlation model is incorrect in general: it leads to a wrong hypothesis about the inference capability of a passive adversary and results in a false sense of security, which will expose legitimate systems to severe threats with little awareness. In this project, we seek to understand the fundamental limits in passive inference of wireless channel characteristics, and further advance our knowledge and practice in wireless security.

NeTS: Small: Collaborative Research: Creating Semantically-Enabled Programmable Networked Systems (SERPENT)
Kemafor Ogan

$278,271 by National Science Foundation
10/ 1/2015 - 09/30/2018

The separation of control and data plane in SDN architectures helps merge packet and circuit paradigms into a single architecture and enables logical centralization of the control function. This enables new thinking about solutions to path optimization problems frequently encountered in networking, from routing to traffic engineering. The SERPENT project proposes to develop effective solutions for representing, storing and manipulating network state using rich semantic models such that path and topology embedding problems can be solved using a semantic database framework. This will simplify creation of novel network control and management systems able to cope with increasingly complex user requirements.

Type System for Naval Essential Tasks
Kemafor Ogan

$48,725 by Datanova Scientific via US Navy- Office of Naval Research
07/ 6/2015 - 11/ 6/2016

Knowledge graphs are information networks with a specific topology that can be modeled as algebraic data types in a type system called Flutes, created by Datanova Scientific to rigorously analyze formal approaches to semantic integration. This project will demonstrate the capability of Flutes typing for summarizing knowledge graphs, collecting and collapsing high-dimensional data into low-dimensional data. Typing will help optimize queries based on column values, similar to NoSQL databases. By using Flutes to implement a tactical knowledge base for the DoD and other agencies, front-end applications can avoid including code for data validation, integrity-constraint checking, or mission data extraction.

III: Small: Optimization Techniques for Scalable Semantic Web Data Processing in the Cloud
Kemafor Ogan

$446,942 by National Science Foundation
09/ 1/2012 - 08/31/2016

Achieving scalable processing of the increasing amount of publicly available Semantic Web data will hinge on parallelization. The Map-Reduce programming paradigm recently emerged as a de-facto parallel data processing standard and has demonstrated effectiveness with respect to structured and unstructured data. However, Semantic Web data presents challenges not adequately addressed by existing techniques due to its flexible, fine-grained data model and the need to reason beyond explicitly represented data. This project will investigate optimization techniques that address these unique challenges based on rethinking Semantic Web data processing on Map-Reduce platforms from the ground up, from query algebra to query execution.
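To see why the fine-grained triple model stresses Map-Reduce, consider that even a two-pattern query such as (?p type Person) AND (?p worksAt ?org) requires a join on ?p, i.e., a full map/shuffle/reduce round over the triples. The sketch below is a minimal single-process illustration of that round, not the project's system; the data and query are invented.

```python
from collections import defaultdict

# Invented toy triple store.
triples = [
    ("alice", "type", "Person"),
    ("bob",   "type", "Person"),
    ("alice", "worksAt", "NCSU"),
    ("orgX",  "type", "Company"),
]

def map_phase(triple):
    """Emit (join key, tagged value) pairs for triples matching a pattern."""
    s, p, o = triple
    if p == "type" and o == "Person":       # pattern 1, keyed on ?p
        yield s, ("pat1", None)
    if p == "worksAt":                      # pattern 2, keyed on ?p
        yield s, ("pat2", o)

def shuffle(pairs):
    """Group mapped pairs by join key (the shuffle step)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Emit a join result only when both patterns matched the same subject."""
    orgs = [v for tag, v in values if tag == "pat2"]
    if any(tag == "pat1" for tag, _ in values):
        for org in orgs:
            yield key, org

mapped = [pair for t in triples for pair in map_phase(t)]
results = [r for k, vs in shuffle(mapped).items() for r in reduce_phase(k, vs)]
# results holds the single binding joining both patterns: ("alice", "NCSU")
```

Every additional triple pattern in a query adds another such join round, which is exactly the cost the proposed algebra-level optimizations aim to reduce.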

REU Site: Science of Software
Christopher Parnin ; Emerson Murphy-Hill ; Sarah Heckman

$355,365 by National Science Foundation
01/ 1/2016 - 12/31/2018

There are not enough skilled data science researchers, particularly in software engineering. Hence, this REU Site in Science of Software (SOS) will engage undergraduates as data scientists studying exciting and meaningful SE research problems. Students work side-by-side with faculty mentors to gain experience in qualitative and quantitative research methods in SOS. Activities include dataset challenges, pair research, literature reviews, and presentations. Ultimately, each student works independently toward a published research result with their faculty mentors.

TWC SBE: Medium: Collaborative: User-Centric Risk Communication and Control on Mobile Devices
Douglas Reeves

$267,096 by the National Science Foundation
09/ 1/2013 - 08/31/2016

Human-system interaction is an integral part of any system. Because the vast majority of ordinary users have limited technical knowledge and can easily be confused and/or worn out by repeated security notifications/questions, the quality of users' decisions tends to be very low. On the other hand, any system targeting end users must have the flexibility to accommodate a wide spectrum of different users, and therefore needs to get the full range of users involved in the decision-making loop. This dilemma between fallible human nature and inevitable human decision making is one main challenge to the goal of improving security. In this project, we aim at developing principles and mechanisms for usable risk communication and control. The major technical innovations include (1) multi-granularity risk communications; (2) relative risk information in the context of comparison with alternatives; (3) discovery and integration of risk information from multiple sources; and (4) expanded opportunities for risk communication and control.

RI: Small: Collaborative Research: Speeding Up Learning through Modeling the Pragmatics of Training
David Roberts

$156,203 by National Science Foundation
10/ 1/2013 - 08/31/2016

We propose to develop techniques that will enable humans to train computers efficiently and intuitively. In this proposed work, we draw inspiration from the ways that human trainers teach dogs complex behaviors to develop novel machine learning paradigms that will enable intelligent agents to learn from human trainers quickly, and in a way that humans can intuitively take advantage of. This research aims to return to the basics of programming---it seeks to develop novel methods that allow humans to tell computers what to do. More specifically, this research will develop learning techniques that explicitly model and leverage the implicit communication channel that humans use while training, a process akin to interpreting the pragmatic implicature of a natural language communication. We will develop algorithms that view the training process as an intentional communicative act, and can vastly outperform standard reward-seeking algorithms in terms of the speed and accuracy with which human trainers can generate desired behavior.

CPS: Synergy: Integrated Sensing and Control Algorithms for Computer-Assisted Training (Computer Assisted Training Systems (CATS) for Dogs)
David Roberts ; Alper Bozkurt ECE ; Barbara Sherman CVM

$1,029,403 by National Science Foundation
10/ 1/2013 - 09/30/2016

We propose to develop tools and techniques that will enable more effective two-way communication between dogs and handlers. We will work to create non-invasive physiological and inertial measuring devices that will transmit real-time information wirelessly to a computer. We will also develop technologies that will enable the computer to train desired behaviors using positive reinforcement without the direct input from humans. We will work to validate our approach using laboratory animals in the CVM as well as with a local assistance dog training organization working as a consultant.

Graduate Industrial Traineeship for Chirag Kapadia
George Rouskas

$46,641 by SAS Institute, INC
09/21/2015 - 08/15/2016

NCSU through the SAS GA will provide research and analysis to SAS as set forth in this Agreement. Such research and analysis shall include, but is not limited to, research, generation, testing, and documentation of operations research software. SAS GA will provide such services for SAS' offices in Cary, North Carolina, at such times as have been mutually agreed upon by the parties.

NeTS:Small: Computationally Scalable Optical Network Design
George Rouskas

$429,995 by NSF
08/ 1/2011 - 07/31/2016

Optical networking forms the foundation of the global network infrastructure, hence the planning and design of optical networks is crucial to the operation and economics of the Internet and its ability to support critical and reliable communication services. With this research project we aim to make contributions that will lead to a quantum leap in the ability to optimally solve a range of optical network design problems. In particular, we will develop compact formulations and solution approaches that can be applied efficiently to instances encountered in Internet-scale environments. Our goal is to lower the barrier to entry in fully exploring the solution space and in implementing and deploying innovative designs. The solutions we will develop are "future-proof" with respect to advances in DWDM transmission technology, as the size of the corresponding problem formulations is independent of the number of wavelengths.
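A flavor of the underlying design problem: two lightpaths that share a fiber link must be assigned different wavelengths. The sketch below is the standard first-fit greedy heuristic for wavelength assignment, shown purely as a textbook baseline; it is not the compact formulations this project proposes, and the example topology is invented.

```python
def first_fit_assignment(lightpaths):
    """Assign each lightpath (a set of links) the lowest wavelength
    index not already used on any of its links."""
    used = {}                                  # link -> set of wavelengths in use
    assignment = []
    for links in lightpaths:
        w = 0
        while any(w in used.get(link, set()) for link in links):
            w += 1                             # first wavelength free on all links
        assignment.append(w)
        for link in links:
            used.setdefault(link, set()).add(w)
    return assignment

# Three lightpaths over a chain A-B-C; paths 0 and 1 share link (A, B),
# and paths 0 and 2 share link (B, C).
paths = [{("A", "B"), ("B", "C")},
         {("A", "B")},
         {("B", "C")}]
assignment = first_fit_assignment(paths)
# assignment == [0, 1, 1]: conflicting paths land on distinct wavelengths
```

Heuristics like this scale but sacrifice optimality; exact formulations whose size grows with the wavelength count do not scale, which is the gap wavelength-independent compact formulations target.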

In Situ Indexing and Query Processing of AMR Data
Nagiza Samatova

$383,000 by US Department of Energy
09/ 1/2014 - 08/31/2017

One of the most significant advances for large-scale scientific simulations has been the advent of Adaptive Mesh Refinement, or AMR. By using dynamic gridding, AMR can achieve substantial savings in memory, computation, and disk resources while maintaining or even increasing simulation accuracy, relative to static, uniform gridding. However, the resultant non-uniform structure of the simulation mesh produced by AMR methods causes inefficient access patterns during data analysis and visualization. Given the exponential increase in simulation output, massive I/O operations are becoming a substantial bottleneck in simulations and analysis. To efficiently analyze AMR data, we propose an integrated, three-prong approach that aims: (a) To devise an AMR query model; (b) To develop effective in situ indexing and query processing methods for AMR data analytics; and (c) To investigate data storage layout strategies for AMR data retrieval optimized for analytics-induced heterogeneous data access patterns. The results, algorithms, and software will be in the public domain.
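The in-situ indexing idea in aim (b) can be sketched in miniature: while the data is still in memory, bin each cell's value so that a later range query inspects only candidate bins instead of scanning every cell. This is a hypothetical stdlib-only illustration; real AMR indexing must additionally cope with the non-uniform mesh levels and parallel I/O.

```python
from collections import defaultdict

def build_bin_index(values, bin_width=10.0):
    """Built in situ: map each value's bin id to the cell ids in that bin."""
    index = defaultdict(list)
    for cell_id, v in enumerate(values):
        index[int(v // bin_width)].append(cell_id)
    return index

def range_query(index, values, lo, hi, bin_width=10.0):
    """Answer lo <= value <= hi by visiting only the candidate bins."""
    hits = []
    for b in range(int(lo // bin_width), int(hi // bin_width) + 1):
        for cell_id in index.get(b, []):      # candidate cells only
            if lo <= values[cell_id] <= hi:   # refine within edge bins
                hits.append(cell_id)
    return sorted(hits)

# Invented per-cell field values from a simulation step.
temps = [5.0, 12.5, 47.0, 18.0, 99.9]
idx = build_bin_index(temps)
hot = range_query(idx, temps, 10.0, 50.0)
# hot == [1, 2, 3]: only cells whose value falls in [10, 50]
```

Building `idx` while the step's data is resident avoids a later full re-read from disk, which is the essence of moving indexing in situ.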

Gender Differences in the Risk for Alzheimer’s disease in ADNI-1 MCI.
Nagiza Samatova

$45,000 by Duke University
01/ 1/2016 - 12/31/2016

The project will analyze cognitive, biomarker, and genetic data in 398 ADNI-1 MCI subjects to identify the pathway crosstalks that are active for each individual patient, utilizing single nucleotide polymorphism (SNP) data, and to determine which of these pathway crosstalks discriminate between genders. A novel graph-based methodology is used to identify these discriminating crosstalking pathways, which will be validated by building prognostic models that predict cognitive decline and comparing them against models using non-gender-specific biomarkers.

Runtime System for I/O Staging in Support of In-Situ Processing of Extreme Scale Data
Nagiza Samatova

$286,140 by Oak Ridge National Laboratory/Dept. of Energy
01/31/2011 - 08/31/2016

Accelerating the rate of insight and scientific productivity demands new solutions to managing the avalanche of data expected at extreme scale. Our approach is to use tools that can reduce, analyze, and index the data while it is still in memory (referred to as "in situ" processing of data). In order to deal with the large amount of data generated by the simulations, our team has partnered with many application teams to deliver proven technology that can accelerate their knowledge discovery process. These technologies include ADIOS, FastBit, and Parallel R. In this proposal we wish to integrate these technologies and create a runtime system that will allow scientists to build an easy-to-use scientific workflow system, running in situ on extra nodes of the machine, that not only accelerates I/O but also pre-analyzes, indexes, visualizes, and reduces the overall amount of information from these simulations.

Joint Faculty Agreement For Nagiza Samatova
Nagiza Samatova

$686,881 by Oak Ridge National Laboratory
08/ 9/2007 - 08/ 8/2016

Dr. Nagiza Samatova's joint appointment with NC State University and Oak Ridge National Laboratory (ORNL) will provide the interface between the two organizations, aiming to collaboratively address computational challenges in scientific data management and the large-scale analysis of DOE-mission applications. (Supplement)

Scalable Data Management, Analysis, and Visualization (SDAV) Institute
Nagiza Samatova ; Anatoli Melechko

$750,000 by US Department of Energy
02/15/2012 - 02/14/2017

SDAV is a unique and comprehensive combination of scientific data management, analysis, and visualization expertise and technologies aimed at enabling scientific knowledge discovery for applications running on state-of-the-art computational platforms located at DOE's primary computing facilities. This integrated institute focuses on tackling key challenges facing applications in our three focus areas through a well-coordinated team and management organization that can respond to changing technical and programmatic objectives. The proposed work portfolio is a blend of applied research and development aimed at having key software services operate effectively on large distributed-memory multi-core and many-core platforms, especially DOE's open high-performance computing facilities. Our goal is to create an integrated, open source, sustainable framework and software tools for the science community.

Collaborative Research: Understanding Climate Change: A Data Driven Approach
Nagiza Samatova ; Frederick Semazzi

$1,815,739 by National Science Foundation
09/ 1/2010 - 08/31/2016

The goal is to provide a computational capability for effective and efficient exploration of high-resolution climate networks derived from multivariate, uncertain, noisy, and spatio-temporal climate data. We plan to increase the efficiency and climatological relevance of network pattern identification through integrated research activities focused on: (a) supporting comparative analysis of multiple climate networks; (b) constraining the search space via exploiting the inherent structure (e.g., multi-partite) of climate networks; (c) establishing the foundation to efficiently update solutions for perturbed (changing) graphs; and (d) designing and implementing parallel algorithms scalable to thousands of processors on multi-node multi-core supercomputer architectures.

Consortium for Nonproliferation Enabling Capabilities
Nagiza Samatova, co-PI ; Robin Gardner (Nuclear Eng

$9,744,249 by US Department of Energy
07/31/2014 - 07/30/2019

NC State University, in partnership with University of Michigan, Purdue University, University of Illinois at Urbana Champaign, Kansas State University, Georgia Institute of Technology, NC A&T State University, Los Alamos National Lab, Oak Ridge National Lab, and Pacific Northwest National Lab, proposes to establish a Consortium for Nonproliferation Enabling Capabilities (CNEC). The vision of CNEC is to be a pre-eminent research and education hub dedicated to the development of enabling technologies and technical talent for meeting the grand challenges of nuclear nonproliferation in the next decade. CNEC research activities are divided into four thrust areas: 1) Signatures and Observables (S&O); 2) Simulation, Analysis, and Modeling (SAM); 3) Multi-source Data Fusion and Analytic Techniques (DFAT); and 4) Replacements for Potentially Dangerous Industrial and Medical Radiological Sources (RDRS). The goals are: 1) Identify and directly exploit signatures and observables (S&O) associated with special nuclear material (SNM) production, storage, and movement; 2) Develop simulation, analysis, and modeling (SAM) methods to identify and characterize SNM and facilities processing SNM; 3) Apply multi-source data fusion and analytic techniques to detect nuclear proliferation activities; and 4) Develop viable replacements for potentially dangerous existing industrial and medical radiological sources. In addition to research and development activities, CNEC will implement educational activities with the goal to develop a pool of future nuclear non-proliferation and other nuclear security professionals and researchers.

Lecture Hall Polytopes, Inversion Sequences, and Eulerian Polynomials
Carla Savage

$30,000 by Simons Foundation
09/ 1/2012 - 08/31/2017

Over the past ten years, lecture hall partitions have emerged as fundamental structures in combinatorics and number theory, leading to new generalizations and new interpretations of several classical theorems. This project takes a geometric view of lecture hall partitions and uses polyhedral geometry to investigate their remarkable properties.
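Lecture hall partitions have a compact definition: a sequence (λ1, …, λn) of nonnegative integers with λ1/1 ≤ λ2/2 ≤ … ≤ λn/n. The classical Lecture Hall Theorem of Bousquet-Mélou and Eriksson states that the number of such sequences summing to N equals the number of partitions of N into odd parts less than 2n. Both counts can be checked by brute force for small cases; a minimal sketch (function names are illustrative):

```python
from itertools import product

def lecture_hall_partitions(n, total):
    """Sequences (l1,...,ln) with l1/1 <= l2/2 <= ... <= ln/n summing to total."""
    results = []
    for seq in product(range(total + 1), repeat=n):
        if sum(seq) != total:
            continue
        # l_i / i <= l_{i+1} / (i+1)  <=>  (i+1) * l_i <= i * l_{i+1}
        if all((j + 2) * seq[j] <= (j + 1) * seq[j + 1] for j in range(n - 1)):
            results.append(seq)
    return results

def odd_part_partitions(n, total):
    """Count partitions of total into odd parts less than 2n."""
    parts = list(range(1, 2 * n, 2))
    def count(remaining, max_idx):
        if remaining == 0:
            return 1
        return sum(count(remaining - parts[i], i)
                   for i in range(max_idx + 1) if parts[i] <= remaining)
    return count(total, len(parts) - 1)

# Lecture Hall Theorem check for small cases
for n, total in [(2, 3), (3, 5)]:
    assert len(lecture_hall_partitions(n, total)) == odd_part_partitions(n, total)
```

For n = 2 and N = 3 both counts are 2: the lecture hall partitions (0, 3) and (1, 2) match the odd-part partitions 3 and 1+1+1.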

CRII: CSR: Pervasive Gesture Recognition Using Ambient Light
Muhammad Shahzad

$174,878 by National Science Foundation
05/ 1/2016 - 04/30/2018

The PI proposes to use ambient light for recognizing human gestures. The intuition behind the proposed approach is that as a user performs a gesture in a lit room, the amount of light that he/she reflects and blocks changes, resulting in a change in the intensity of light in all parts of the room. This change can be measured, and the pattern of change in light intensity differs from gesture to gesture. Leveraging this observation, the proposed approach first learns these patterns for different gestures and then recognizes the gestures in real time.
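As a rough illustration of the learn-then-match idea described above (not the PI's actual system, which uses real light sensors and richer features; all names and signal shapes here are made up), a template-matching classifier over normalized light-intensity traces might look like:

```python
import numpy as np

def nearest_gesture(sample, templates):
    """Classify a light-intensity trace by Euclidean distance to stored templates.

    sample: 1-D array of light-sensor readings for one gesture.
    templates: dict mapping gesture name -> 1-D array of the same length.
    Normalization removes the absolute lighting level so only the *pattern*
    of change matters.
    """
    sample = (sample - sample.mean()) / (sample.std() + 1e-9)
    best, best_d = None, float("inf")
    for name, tpl in templates.items():
        tpl = (tpl - tpl.mean()) / (tpl.std() + 1e-9)
        d = np.linalg.norm(sample - tpl)
        if d < best_d:
            best, best_d = name, d
    return best

# Toy traces: a "push" dims the sensor once mid-trace; a "swipe" dips twice.
t = np.linspace(0, 1, 50)
push = 1.0 - 0.5 * np.exp(-((t - 0.5) / 0.1) ** 2)
swipe = 1.0 - 0.3 * (np.exp(-((t - 0.3) / 0.05) ** 2)
                     + np.exp(-((t - 0.7) / 0.05) ** 2))
templates = {"push": push, "swipe": swipe}
noisy = push + np.random.default_rng(0).normal(0, 0.02, 50)
print(nearest_gesture(noisy, templates))  # expected: push
```

A real system would segment gestures from a continuous sensor stream and fuse several sensors, but the match-against-learned-patterns step is the same in spirit.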

Context-Aware Correlation-Based Program Optimizations
Xipeng Shen

$28,000 by IBM Canada Limited
07/ 1/2014 - 06/30/2017

A component (e.g., a function or loop) in a program often exhibits different behaviors (e.g., execution paths) in different contexts. Such context sensitivity exists in High Performance Computing (HPC) applications, and even more commonly in Business Analytics and Optimization (BAO) programs. This collaboration with IBM aims at developing context-aware correlation-based program optimizations, a new way to tackle context sensitivity in code specializations that effectively removes some limitations in current compiler technology.

Context-Aware Correlation-Based Program Optimizations (Supplement)
Xipeng Shen

$28,000 by IBM Canada Limited
07/ 1/2014 - 06/30/2017

In this project, we propose to build up context-aware correlation-based program optimizations, a new way to tackle context sensitivity in code specializations that effectively removes some limitations in current compiler technology.

Data Locality Enhancement of Dynamic Simulations for Exascale Computing
Xipeng Shen

$409,214 by US Department of Energy
06/15/2015 - 06/14/2017

Computer simulation is important for scientific research in many disciplines. Many such programs are complex, and transfer a large amount of data in a dynamically changing pattern. Memory performance is key to maximizing computing efficiency in the era of Chip Multiprocessors (CMP) due to the growing disparity between the slowly expanding memory bandwidth and the rapidly increasing demands for data by processors. The importance is underlined by the trend towards exascale computing, in which processors are expected to each contain hundreds or thousands of (heterogeneous) cores. Unfortunately, today's computer systems lack support for a high degree of memory transfer. This project proposes to improve the memory performance of dynamic applications by developing two new techniques that are tailored especially for the emerging features of CMP. The first technique is asynchronous streamlining, which analyzes the memory reference patterns of an application during runtime and regulates both control flows and memory references on the fly. The second technique is neighborhood-aware locality optimization, which concentrates on the non-uniform relations among computing elements. This research will produce a robust tool for scientific users to enhance program locality on multi- and many-core systems in ways that are not possible with existing tools. Further, it will contribute to the advancement of computational sciences and promote academic research and education in the challenging field of scientific computing.

CAREER: Input-Centric Program Behavior Analysis and Adaptation
Xipeng Shen

$266,165 by National Science Foundation
07/28/2014 - 08/31/2016

By analyzing and predicting program dynamic behaviors, program behavior analysis offers the fundamental support for program transformations and resource management. Its effectiveness is crucial for the maximization of computing efficiency. This research proposes to include program inputs---a so far virtually ignored dimension---into the focus of program behavior analysis, cultivating a new paradigm, namely input-centric program behavior analysis and adaptation. This input-centric paradigm will create many new opportunities for enhancing the matching between software and hardware, hence significantly improving the performance and power efficiency in modern computing. The proposed technique, input-centric program behavior analysis and adaptation, consists of three components. The first two components, program input characterization and input-behavior modeling, resolve the complexities of program inputs, extract important features, and recognize the correlations between characterized input features and program behaviors. The third component, input-centric adaptation, capitalizes on the novel opportunities that the first two components create, making dynamic optimizations proactive and holistic, but without losing the adaptivity to inputs and environmental changes. Together, the three components make evolvable programming systems more feasible than before. In such a system, the input-behavior models embody the central knowledge base, which grows incrementally across program production runs. As the knowledge base becomes larger, behavior prediction becomes more accurate, stimulating better software-hardware matching and making the program and runtime systems perform increasingly better. Because of the fundamental role of program behavior analysis in software-hardware matching, this research helps pave the way for advancing the optimizations in various layers in the software execution stack (compilers, virtual machines, OS, etc.).

CCF:SHF:Small: Non-Uniformity-Centric Program Optimizations for Dynamic Computations on Chip Multiprocessors
Xipeng Shen

$404,956 by National Science Foundation
06/16/2014 - 08/31/2016

In this project, Dr. Xipeng Shen and his team are building a new paradigm of program optimizations. It is motivated by a growing gap between trends in processor development and the needs of modern data-intensive dynamic applications. This class of applications, ranging from differential equation solvers to data mining tools to particle dynamics simulations, plays an essential role in science and humanity. But these applications feature tremendous data accesses and complex patterns in data accesses or control flows. These properties make them a great challenge for modern processors, which are evolving exactly opposite to these applications' needs: a chip's aggregate computing power is rapidly outgrowing memory bandwidth, and the rise of throughput-oriented manycores makes system throughput even more sensitive to irregular computations. The paradigm being built by Dr. Shen's team, namely "non-uniformity-centric optimizations", distinctively takes the non-uniform inter-core relations in modern systems as the first-order constraint for program optimizations. Specifically, they are developing a framework named PipeReg as a new way to reorganize data accesses and threads during run time to reduce the influence of irregular computations on the throughput of massively parallel processors. Meanwhile, they are investigating a novel kind of program transformations called neighborhood-aware transformations, which exploits the non-uniform interactions among threads in on-chip storage (e.g., shared cache) in a multi-socket multicore system. Together, the two techniques will synergistically remove some important barriers for data-intensive dynamic applications to tap into the full power of future computing systems. The outcome of this research will provide essential support for enhancing the computing efficiency of data-intensive dynamic applications in the era of heterogeneous parallel systems.
Because of the critical roles of these applications, this research will help foster sustained advancement in science, commerce, health, and so on. Beyond its technical content, this project stresses technology transfer, develops new teaching materials and tools, emphasizes demographic diversity, and improves the training of both graduate and undergraduate students.

SHF: Small: Improving Memory Performance on Fused Architectures through Compiler and Runtime Innovations
Xipeng Shen ; Frank Mueller

$470,000 by National Science Foundation
08/ 1/2015 - 07/31/2018

Contemporary architectures are adopting an integrated design of conventional CPUs with accelerators on the same die with access to the same memory, albeit with different coherence models. Examples include AMD's Fusion architecture, Intel's integrated main-stream CPU/GPU product line, and NVIDIA Tegra's integrated graphics processor family. Integrated GPUs feature shared caches and a common memory interconnect with multicore CPUs, which intensify resource contention in the memory hierarchy. This creates new challenges for data locality, task partitioning and scheduling, as well as program transformations. Most significantly, a program running on GPU warps and CPU cores may adversely affect the performance and power of one another. The objective of this work is to understand these novel implications of fused architectures by studying their effects, qualifying their causes, and quantifying the impacts on performance and energy efficiency. We propose to advance the state of the art by creating spheres of isolation between CPU and GPU execution via novel systems mechanisms and compiler transformations that reduce cross-boundary contention with respect to shared hardware resources. This synergy between systems and compiler techniques has the potential to significantly improve performance and power guarantees for co-scheduling program fragments on fused architectures. The proposed work, if successful, has the potential to transform resource allocation and scheduling at the systems level and compiler optimizations at the program level to create a synergistic development environment with significant performance and power improvements and vastly increased isolation suitable for synergistic co-deployment of programs crossing boundaries on innovative fused architectures.

EAGER/Cybermanufacturing: Just-In-Time Compilation of Product Manufacturing Data to Machine Instructions via an Industrial Machine Operating System
Xipeng Shen, Co-PI ; Binil Starly, Lead PI (ISE)

$299,999 by National Science Foundation
09/ 1/2015 - 08/31/2017

Intelligent machines are purported to be the backbone of the cybermanufacturing initiative. Yet the conventional approach to making a machine 'cyber-enabled' is to outfit the machine with an array of multi-modal sensors, which is then integrated into the network and enterprise system through communication and computing platforms. Making further development challenging, almost all industrial machine vendors have closed hardware and software architectures, which makes extensibility and adaptation to a cyber-manufacturing environment difficult. We propose a new architecture, which we term the 'Industrial Machine Operating System' (iMOS): a flexible framework for writing machine software. It will be a collection of hardware configurations, data structures, tools, libraries, and semantics to simplify the task of creating a cyber-physical enabled manufacturing machine, designed to operate across a wide variety of manufacturing process platforms. (Total Award Amount: $299,999.00)

EAGER: Cognitive Modeling of Strategies for Dealing with Errors in Mobile Touch Interfaces
Robert St. Amant ; Emerson Murphy-Hill

$281,076 by National Science Foundation
09/ 1/2014 - 08/31/2016

Touch interfaces on mobile phones and tablets are notoriously error-prone in use. One plausible reason for slow progress in improving usability is that research and design efforts in HCI take a relatively narrow focus on isolating and eliminating human error. We take a different perspective: failure represents breakdowns in adaptations directed at coping with complexity. The key to improved usability is understanding the factors that contribute to both expertise and its breakdown. We propose to develop cognitive models of strategies for touch interaction. Our research will examine the detailed interactions between users' perceptual, cognitive, and motor processes in recognizing, recovering from, and avoiding errors in touch interfaces. Our proposal is for three stages of research: exploratory experiments, analysis and modeling, and finally validation experiments.

CHS: SMALL: Direct Physical Grasping, Manipulation, and Tooling of Simulated Objects
Robert St. Amant ; Christopher Healey

$496,858 by National Science Foundation
08/ 1/2014 - 07/31/2017

This proposal is for the development and evaluation of CAPTIVE, a Cube with Augmented Physical Tools, to support exploration of three-dimensional information. The design of CAPTIVE is founded on the concept of tool use, in which physical objects (tools) are used to modify the properties or presentation of target objects. CAPTIVE integrates findings across a wide range of areas in human-computer interaction and visualization, from bimanual and tangible user interfaces to augmented reality. CAPTIVE is configured as a desktop augmented reality/fishtank virtual reality system, with a stereoscopic display, a haptic pointing device, and a user-facing camera. In one hand the user holds a wireframe cube that contains virtual objects; in the other, the pointing device, augmented to reflect its function as a tool: a probe for pointing at, choosing, and moving objects; a magnifying or semantic lens for filtering, recoding, and elaborating information; a cutting plane that shows slices or projection views. CAPTIVE supports visualization with more fluid and natural interaction techniques, improving the ability of users to explore and understand 3D information.

Parameterized Algorithms Respecting Structure in Noisy Graphs (PARSiNG).
Blair Sullivan

$249,140 by US Navy - Space and Naval Warfare Systems Center (SPAWAR) via DARPA
09/30/2014 - 07/30/2017

This extension to the PARSiNG project focuses on issues related to improving accessibility and usability for downstream analysts of the related open source software toolkit. This may include new features (such as a graphical interface), improved I/O and data format support, extension of the modular framework to additional DARPA-relevant problems, and testing and incorporation of algorithmic coloring advances.

Joint Faculty Appointment For Vida Blair Sullivan (Supplement)
Blair Sullivan

$205,145 by Oak Ridge National Laboratory
09/13/2013 - 08/15/2016

The PI's unique combination of expertise in structural graph theory and scalable graph algorithms for data-driven science is necessary to ensure the success of ORNL-based projects using applied discrete mathematics to enable advances in graph analysis, anomaly detection, cybersecurity, quantum computing, and computational science. The PI will direct and conduct fundamental research, collaborate with ORNL staff, write up research results for peer-reviewed publication, give presentations, and mentor students and junior staff as appropriate.

Moore Foundation Data-Driven Discovery Investigator
Vida Blair Sullivan

$1,500,000 by Gordon and Betty Moore Foundation
11/10/2014 - 12/ 1/2019

Understanding and identifying intermediate-scale structure is key to designing robust tools for data analysis, just as the interdependence of local interactions and global behavior is key in many science domains. We thus focus on constructing a theory and tools for using this structure to improve analysis and identification of relationships in massive graph data. Through careful integration of tools from graph theory, computational complexity, statistics, and parallel algorithm design, the proposed work will derive novel measures of graph similarity based on structural representations and application-inspired features of interest. We will design efficient, scalable sampling algorithms which leverage inherent sparsity and structure to de-noise and improve accuracy of parameter estimation. As a specific example of science domain impact, we focus on improving understanding of the brain. Applying our new tools for characterizing graph-theoretic structure in such networks, scientists will be able to build higher fidelity models of brain network formation and evolution. Additionally, efficient algorithms from the associated parameterized framework will enable rapid comparison of regions and identification of discrepancies, abnormalities, and influential components for specific tasks.

National Extreme Events Data And Research Center (NEED) Transforming The National Capability For Resilience To Extreme Weather And Climate Events
Ranga Vatsavai

$10,890 by Oak Ridge National Laboratory via US Dept of Energy
03/16/2015 - 09/30/2016

An NCSU graduate student will develop a machine learning approach to linking extreme atmospheric ridging events with extreme surface temperatures, employing a Gaussian Process (GP)-based predictive analysis tool that leverages the predictive value of spatial and temporal correlations and builds on ORNL's past success in spatial classification, temporal change prediction, and parallelizing GP for large spatiotemporal extents.
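The GP-based approach rests on the assumption that values at nearby points in space or time are correlated, which a radial-basis-function kernel encodes directly. A minimal numpy-only sketch of GP regression (illustrative parameters and function names, not the project's actual tool):

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, length_scale=1.0, noise=1e-2):
    """Gaussian Process regression with an RBF kernel.

    Nearby inputs (in space or time) get high kernel values, so the
    posterior mean interpolates observations using their correlations.
    """
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length_scale ** 2)

    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf(X_test, X_train)
    alpha = np.linalg.solve(K, y_train)          # K^{-1} y
    mean = K_s @ alpha                           # posterior mean
    cov = rbf(X_test, X_test) - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0, None))

# Toy 1-D "temporal" series: observe a signal every 2 time units,
# then predict (with uncertainty) at unobserved times.
X = np.arange(0, 10, 2.0).reshape(-1, 1)
y = np.sin(X).ravel()
mu, sd = gp_predict(X, y, np.array([[1.0], [3.0]]))
```

A production tool would use richer spatiotemporal kernels and the parallelization the abstract mentions; the linear-algebra core is the same.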

Architecture Driven Optimization Approaches for Pattern Discovery from Big Data, support for student Seokyong Hong under "GO!" Program (Supplement)
Ranga Vatsavai

$98,521 by Oak Ridge National Laboratory via US Dept of Energy
10/15/2015 - 09/30/2016

Pattern discovery and predictive modeling from seemingly related "Big Data" represented as massive, ad-hoc, heterogeneous networks (e.g., extremely large graphs with complex, possibly unknown structure) is an outstanding problem in many application domains. To address this problem, we will design graph-mining algorithms capable of: (i) discovering relationship patterns from such data and (ii) using those discovered patterns as features for classification and predictive modeling. In particular, we will develop, implement, and demonstrate novel, automated, and scalable graph-pattern discovery algorithms on two different architectures: (1) Cray's Urika, and (2) a Hadoop-based cluster. The research will bring forth the advantages and disadvantages of these two architectures and show possible optimization approaches for pattern discovery from big data.

Mini-Apps for Data-Intensive Discovery on Big Data Architectures
Ranga Vatsavai

$25,000 by Oak Ridge National Laboratory
01/ 1/2016 - 09/30/2016

Scaling up scientific data analysis and machine learning algorithms for data-driven discovery is a grand challenge that we face today. Despite the growing need for analysis from science domains that are generating 'Big Data' from instruments and simulations, building high-performance analytical workflows of data-specific algorithms has been daunting because: (i) the 'Big Data' hardware and software architecture landscape is constantly evolving, (ii) newer architectures impose new programming models, and (iii) data-parallel kernels of analysis algorithms and their performance facets on different architectures are poorly understood. To address these problems, this project seeks to: (i) develop scalable data-parallel kernels of popular data analysis algorithms, (ii) implement 'Mini-Apps' of those kernels using different programming models (e.g. MapReduce, Bulk-synchronous), (iii) benchmark and validate the performance of the kernels on diverse architectures (Rhea, Urika-XA, Amazon EC2, etc.), and (iv) demonstrate a workflow of kernels on a DOE-centric image-analysis use-case.
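A data-parallel kernel in the MapReduce style can be illustrated with one k-means iteration: assigning points to their nearest centers is an embarrassingly parallel map, and the per-center averaging is a reduction. A single-node sketch (hypothetical illustration only, not one of the project's Mini-Apps):

```python
from functools import reduce
import math

def assign(point, centers):
    """Map step: emit (index of nearest center, (point, 1))."""
    idx = min(range(len(centers)), key=lambda i: math.dist(point, centers[i]))
    return idx, (point, 1)

def merge(acc, pair):
    """Reduce step: accumulate per-center coordinate sums and counts."""
    idx, (p, c) = pair
    sums, counts = acc
    sums[idx] = [a + b for a, b in zip(sums[idx], p)]
    counts[idx] += c
    return sums, counts

def kmeans_step(points, centers):
    """One k-means iteration expressed as a data-parallel map + reduce."""
    mapped = [assign(p, centers) for p in points]  # embarrassingly parallel map
    dim = len(centers[0])
    sums, counts = reduce(merge, mapped,
                          ([[0.0] * dim for _ in centers], [0] * len(centers)))
    new_centers = []
    for i in range(len(centers)):
        if counts[i]:
            new_centers.append([s / counts[i] for s in sums[i]])
        else:
            new_centers.append(centers[i])  # keep empty centers unchanged
    return new_centers

points = [[0, 0], [0, 1], [10, 10], [10, 11]]
print(kmeans_step(points, [[0.0, 0.5], [9.0, 9.0]]))  # → [[0.0, 0.5], [10.0, 10.5]]
```

The same kernel maps naturally onto MapReduce (one reducer per center) or a bulk-synchronous model (local sums, then a global combine each superstep), which is precisely the kind of cross-model comparison the Mini-Apps are meant to enable.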

National Extreme Events Data And Research Center (NEED) Transforming The National Capability For Resilience To Extreme Weather And Climate Events (Supplement)
Ranga Vatsavai

$19,999 by Oak Ridge National Laboratory via US Dept of Energy
03/16/2015 - 09/30/2016

An NCSU graduate student will develop a machine learning approach to linking extreme atmospheric ridging events with extreme surface temperatures, employing a Gaussian Process (GP)-based predictive analysis tool that leverages the predictive value of spatial and temporal correlations and builds on ORNL's past success in spatial classification, temporal change prediction, and parallelizing GP for large spatiotemporal extents.

Triangle Computer Science Distinguished Lecture Series
Mladen Vouk

$20,100 by Duke University (US Army - Army Research Office)
01/ 1/2014 - 12/31/2016

Since 1995, the Triangle Computer Science Distinguished Lecturer Series (TCSDLS) has been hosting influential university researchers and industry leaders from computer-related fields as speakers at the three universities within the Research Triangle Area. The lecturer series, sponsored by the Army Research Office (ARO), is organized and administered by the Computer Science departments at Duke University, NC State University, and the University of North Carolina at Chapel Hill. This proposal argues for continuation, for an additional 3 years, of this highly successful lecturer series which is being led by Duke University.

TC: Small: Defending against Insider Jammers in DSSS- and FH-Based Wireless Communication Systems
Mladen Vouk ; Huaiyu Dai, ECE ; Peng Ning

$499,064 by National Science Foundation
09/ 1/2010 - 08/31/2016

Jamming resistance is crucial for applications where reliable wireless communication is required, such as rescue missions and military applications. Spread spectrum techniques such as Frequency Hopping (FH) and Direct Sequence Spread Spectrum (DSSS) have been used as countermeasures against jamming attacks. However, these anti-jamming techniques require that senders and receivers share a secret key to communicate with each other, and thus are vulnerable to insider attacks where the adversary has access to the secret key. The objective of this project is to develop a suite of techniques to defend against insider jammers in DSSS- and FH-based wireless communication systems. We will develop novel and efficient insider-jamming-resistant techniques for both DSSS- and FH-based wireless communication systems. Our proposed research consists of two thrusts. The first thrust is to develop novel spreading/despreading techniques, called DSD-DSSS (which stands for DSSS based on Delayed Seed Disclosure), to enhance DSSS-based wireless communication to defend against insider jamming threats, while the second thrust is to develop a new approach, called USD-FH (which stands for FH based on Uncoordinated Seed Disclosure), to enable senders and receivers using FH to communicate without pre-establishing any common secret hopping pattern. A key property of our new approaches is that they do not depend on any secret shared by the sender and receivers. Our solution has the potential to significantly enhance the anti-jamming capability of today's wireless communication systems.
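The core idea of delayed seed disclosure can be sketched in a toy, noiseless simulation: the sender spreads the payload under a fresh random seed and only afterwards transmits the seed itself, spread under a publicly known sequence, so a reactive jammer learns the seed too late to jam the payload. XOR "spreading" below stands in for real DSSS modulation, and all names and parameters are illustrative, not the actual DSD-DSSS scheme:

```python
import hashlib

def pn_sequence(seed: bytes, n_chips: int) -> list:
    """Derive a pseudo-noise chip sequence from a seed (hash-based sketch)."""
    chips, counter = [], 0
    while len(chips) < n_chips:
        block = hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        for byte in block:
            for bit in range(8):
                chips.append((byte >> bit) & 1)
        counter += 1
    return chips[:n_chips]

SPREAD = 8                # chips per data bit
PUBLIC_SEED = b"public"   # known to all; used only for the seed field

def spread(bits, seed):
    chips = pn_sequence(seed, len(bits) * SPREAD)
    expanded = (bit for b in bits for bit in [b] * SPREAD)
    return [b ^ chips[i] for i, b in enumerate(expanded)]

def despread(chips_rx, seed):
    chips = pn_sequence(seed, len(chips_rx))
    raw = [c ^ chips[i] for i, c in enumerate(chips_rx)]
    # majority vote over each group of SPREAD chips
    return [int(sum(raw[i:i + SPREAD]) * 2 > SPREAD)
            for i in range(0, len(raw), SPREAD)]

# Sender: payload spread under a fresh seed; the seed bits are spread under
# the public sequence and transmitted AFTER the payload.
payload = [1, 0, 1, 1, 0, 0, 1, 0]
seed = b"\x42"  # fresh per message in a real system
seed_bits = [(seed[0] >> i) & 1 for i in range(8)]
tx = spread(payload, seed) + spread(seed_bits, PUBLIC_SEED)

# Receiver: buffer everything, recover the seed first, then the payload.
n_pay = len(payload) * SPREAD
rx_seed_bits = despread(tx[n_pay:], PUBLIC_SEED)
rx_seed = bytes([sum(b << i for i, b in enumerate(rx_seed_bits))])
recovered = despread(tx[:n_pay], rx_seed)
assert recovered == payload
```

The receiver needs no pre-shared secret: it only needs to buffer the signal until the trailing seed field arrives, which is what removes the insider's advantage.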

CSR: Small: Collaborative Research: Enabling Cost-effective Cloud HPC
Mladen Vouk ; Xiaosong Ma

$311,998 by National Science Foundation
10/ 1/2013 - 09/30/2016

The proposed work examines novel services built on top of public cloud infrastructure to enable cost-effective high-performance computing. We will explore the on-demand, elastic, and configurable features of cloud computing to complement the traditional supercomputer/cluster platforms. If successful, this research will result in tools that adaptively aggregate, configure, and re-configure cloud resources for different HPC needs, with the purpose of offering low-cost R&D environments for scalable parallel applications.

TWC: Frontier: Collaborative: Rethinking Security in the Era of Cloud Computing
Mladen Vouk ; Peng Ning

$749,996 by National Science Foundation
09/ 1/2013 - 08/31/2018

Increased use of cloud computing services is becoming a reality in today's IT management. The security risks of this move are active research topics, yielding cautionary examples of attacks enabled by the co-location of competing tenants. In this project, we propose to mitigate such risks through a new approach to cloud architecture defined by leveraging cloud providers as trusted (but auditable) security enablers. We will exploit cooperation between cloud providers and tenants in preventing attacks as a means to tackle long-standing open security problems, including protection of tenants against outsider attacks, improved intrusion detection and security diagnosis, and security-monitoring inlays.

NeTS: Small: Investigation of Human Mobility: Measurement, Modeling,Analysis, Applications and Protocols
Mladen Vouk ; Injong Rhee

$298,356 by National Science Foundation
08/ 1/2010 - 07/31/2016

Simulating realistic mobility patterns of mobile devices is important for the performance study of mobile networks because deploying a real testbed of mobile networks is extremely difficult, and furthermore, even with such a testbed, constructing repeatable performance experiments using mobile devices is not trivial. Humans are a big factor in simulating mobile networks because most mobile nodes or devices (cell phones, PDAs, and cars) are attached to or driven by humans. Emulating the realistic mobility patterns of humans can enhance the realism of simulation-based performance evaluation of human-driven mobile networks. Our NSF-funded research that ends this year has studied the patterns of human mobility using GPS traces of over 100 volunteers from five different sites, including university campuses, New York City, Disney World, and a state fair. This research has revealed many important fundamental statistical properties of human mobility, namely heavy-tail flight distributions, self-similar dispersion of visit points, and a least-action principle for trip planning. Most of all, it finds that people tend to optimize their trips in ways that minimize the discomfort or cost of travel (e.g., distance). No existing mobility models explicitly represent all of these properties. Our results are very encouraging, and the proposed research will extend the work well beyond what has been accomplished so far. First, we will perform a measurement study tracking the mobility of 100 to 200 students on a campus simultaneously, and analyze the mobility patterns associated with the geo-physical and social contexts of participants, including social networks, interactions, spatio-temporal correlations, and meetings. Second, we will cast the problem of mobility modeling as an optimization problem, borrowing techniques from AI and robotics, which will make it easy to incorporate the statistical properties of mobility commonly arising in group mobility traces; the realism of our models in expressing human mobility will surpass that of any existing human mobility model. Third, we will develop new routing protocols that leverage the statistical properties found in real traces to optimize delivery performance. The end products of the proposed research are (a) a new human mobility model capable of realistically expressing mobility patterns arising from reactions to social and geo-physical contexts, (b) its implementation in network simulators such as NS-2/3 and GloMoSim, (c) mobility traces that contain both the trajectories of people on a university campus and their contact times, and (d) new, efficient routing protocols for mobile networks.
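The heavy-tail flight distributions mentioned above are often illustrated with Levy-walk-style synthetic traces, in which each "flight" has a heavy-tailed length and a random direction. The sketch below is a minimal illustration of that idea, not the project's actual model; the parameter choices (a Pareto tail with exponent `alpha` and minimum flight length `x_min`) are assumptions for demonstration only.

```python
import math
import random

def sample_flight_length(alpha=1.5, x_min=1.0, rng=random):
    # Inverse-CDF sampling from a Pareto (heavy-tail) distribution:
    # P(L > x) = (x_min / x)^alpha. This stands in for the heavy-tail
    # flight lengths observed in human-mobility GPS traces.
    u = rng.random()
    return x_min * (1.0 - u) ** (-1.0 / alpha)

def levy_walk(steps=1000, alpha=1.5, seed=42):
    # Generate a toy 2-D trajectory: each flight has a heavy-tailed
    # length and a uniformly random direction.
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    trace = [(x, y)]
    for _ in range(steps):
        length = sample_flight_length(alpha=alpha, rng=rng)
        theta = rng.uniform(0.0, 2.0 * math.pi)
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        trace.append((x, y))
    return trace

trace = levy_walk(steps=500)
```

A trace like this reproduces only the flight-length property; capturing the other observed regularities (self-similar visit-point dispersion, least-action trip planning) is precisely what requires the richer, optimization-based modeling the project proposes.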

I-Corps L: Recognize - An Application to Support Visual Learning
Ben Watson ; Patrick Fitzgerald (Design)

$50,000 by National Science Foundation
08/15/2015 - 07/31/2016

RECOGNIZE is a visual quiz game, with a mechanic much like the TV game show “Name That Tune.” Rather than naming a snippet of music, players match a slowly revealed “source” image (e.g., a picture of the artist Salvador Dali) to one of several “target” images (a different picture of Dali, as well as several other artists). RECOGNIZE is unique in its fully visual quizzing mechanic, with both questions and answers posed visually. This project will further develop RECOGNIZE into a viable commercial product with a broad range of educational applications, including individual instruction, group exercises, and distance learning.

EDU: Motivating and Reaching Students and Professionals with Software Security Education
Laurie Williams ; Emerson Murphy-Hill ; Kevin Oliver (Education)

$300,000 by National Science Foundation
09/ 1/2013 - 08/31/2017

According to a 2010 report based on interviews with 2,800 Information Technology professionals worldwide, the gap between hacker threats and suitable security defenses is widening, and the types and numbers of threats are changing faster than ever before. In 2010, Jim Gosler, a fellow at Sandia National Laboratory who works on countering attacks on U.S. networks, claimed that there are approximately 1,000 people in the country with the skills needed for cyber defense, and that 20 to 30 times that many are needed. Additionally, the Chief Executive Officer (CEO) of the security firm Mykonos Software indicated that today's graduates in software engineering are unprepared to enter the workforce because they lack a solid understanding of how to make their applications secure. Given this shortage of security expertise, educating both students and professionals already in the workforce is paramount. In this grant, we provide a plan for motivating and delivering software security education to students and professionals.

Growing The Science Of Security Through Analytics
Laurie Williams ; Munindar Singh

$5,939,339 by NSA (US Dept of Defense)
03/28/2014 - 03/27/2017

Since August 2011, North Carolina State University's (NCSU) analytics-focused Science of Security Lablet (SOSL) has embraced and helped build a foundation for the NSA vision of the Science of Security (SoS) and a SoS community. Jointly with the other SOSLs, we formulated five SoS hard problems, which lie at the core of the BAA. At NCSU, data-driven discovery and analytics have been used to formulate, validate, evolve, and solidify security models and theories as well as the practice of cyber-security. We propose to (1) investigate solutions to five cross-dependent hard problems, building on our extensive experience and research, including in the current SOSL; (2) advance our SoS community development activities; and (3) enhance our evaluation efforts regarding progress on the hard problems by bringing in experts on science evaluation.

HCC: Small: Collaborative Research: Integrating Cognitive and Computational Models of Narrative for Cinematic Generation
R. Michael Young

$352,696 by National Science Foundation
08/ 1/2013 - 07/31/2017

Virtual cinematography, the use of a virtual camera within a three-dimensional virtual world, is increasingly being used to communicate in both entertainment and serious contexts (e.g., training, education, news reporting). While many aspects of virtual camera use are automated, camera control is currently determined either by a pre-programmed script or by a human operator controlling the camera at the time of filming. This project seeks to develop a model of virtual cinematography that is both computational -- providing a software system capable of generating camera control directives automatically -- and cognitive -- capable of modeling a viewer's understanding of an unfolding cinematic sequence.
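To make the notion of automatically generated "camera control directives" concrete, the sketch below shows one toy way such directives might be represented and selected from story actions. The data structure and rule set are illustrative assumptions for this example, not the project's actual model or API.

```python
from dataclasses import dataclass

@dataclass
class CameraDirective:
    # One automatically generated camera instruction. Field names
    # here are hypothetical, chosen only for illustration.
    target: str      # entity the camera should frame
    shot: str        # shot scale, e.g. "close-up", "medium", "wide"
    duration: float  # seconds to hold the shot

def directive_for_action(action: str, actor: str) -> CameraDirective:
    # Toy rule set mapping a kind of story action to a shot choice,
    # standing in for the cognitive/computational model the project
    # proposes (which would also reason about viewer comprehension).
    rules = {
        "dialogue": ("close-up", 3.0),
        "movement": ("wide", 5.0),
        "gesture": ("medium", 2.5),
    }
    shot, duration = rules.get(action, ("medium", 3.0))
    return CameraDirective(target=actor, shot=shot, duration=duration)

d = directive_for_action("dialogue", "hero")
```

A fixed lookup table like this is essentially the "pre-programmed script" approach the project aims to move beyond; the proposed model would instead derive such directives from a computational account of narrative and of the viewer's evolving understanding.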