Total: $47,972,063

# Current Computer Science Research Projects (by faculty)

The funded projects listed below are active projects, and the running funding total for the active projects appears in the left navigation menu.

Collaborative Research: Modeling Social Interaction and Performance in STEM Learning
Tiffany Barnes

$200,003 by National Science Foundation
09/ 1/2014 - 08/31/2017

Despite long-standing awareness that social interaction is an integral part of knowledge construction, efforts to study complex collaborative learning have traditionally been relegated to qualitative and small-scale methodologies. Relatively new data traces left by online learning environments, including massive open online courses (MOOCs), offer the first real hope for scaling up such analyses. The purpose of the proposed research is to develop comprehensive models for collaborative learning which in turn will enable instructional design and the authentic assessment of the individual within the group context. This task is undertaken by an interdisciplinary team of researchers with specializations in natural language processing, discourse analysis, social network analysis, educational data mining and psychometrics.

REU Site: Interactive and Intelligent Media
Tiffany Barnes

$359,999 by National Science Foundation
04/ 1/2013 - 03/31/2016

The REU Site at NC State University will immerse a diverse group of undergraduates in a vibrant research community of faculty and graduate students working on cutting-edge games, intelligent tutors, and mobile applications. We will recruit students from underrepresented groups and from colleges and universities with limited research opportunities through the STARS Alliance, an NSF-funded national consortium of institutions dedicated to broadening participation in computing. Using the Affinity Research Groups and STARS Training for REUs models, we will engage faculty and graduate student mentors with undergraduates to create a supportive culture of collaboration while promoting individual contributions to research through just-in-time training for both mentors and students throughout the summer.

Type I: Collaborative Research: FRABJOUS CS - Framing a Rigorous Approach to Beauty and Joy for Outreach to Underrepresented Students in Computing at Scale
Tiffany Barnes

$352,831 by National Science Foundation
02/ 1/2013 - 02/28/2016

In this FRABJOUS CS project, we will prepare 60 high school teachers to teach the Beauty and Joy of Computing (BJC) Computer Science Principles curriculum. The BJC course is a rigorous introductory computing course that highlights the meaning and applications of computing, while introducing the low-threshold programming languages Snap-Scratch, GameMaker, and AppInventor. BJC is informed and inspired by the Exploring Computer Science curriculum, which was explicitly designed to channel the interests of urban high school students with "culturally relevant and meaningful curriculum" [Goode 2011][Margolis 2008]. The BJC course uses collaborative classroom methods including pair learning, and student-selected projects are geared toward leveraging students' knowledge of social media, games, devices, and the internet. At UNC Charlotte in 2010 and 2011, PI Barnes engaged college students in supporting the BJC course, and in after-school outreach and summer camps that excite middle and high school students about this curriculum at different levels. The project engages three university faculty members and six college students to help the high school teachers build a Computer Science Teachers Association chapter and provide ongoing professional development and support for the BJC course. The project also engages high school teachers and an education researcher to help refine and enrich the BJC curriculum so that it is easier to adopt and teach in high schools.

Type I: Collaborative Research: FRABJOUS CS - Framing a Rigorous Approach to Beauty and Joy for Outreach to Underrepresented Students in Computing at Scale (Supplement)
Tiffany Barnes

$86,000 by NSF
02/ 1/2013 - 02/28/2016

In this FRABJOUS CS project, we will prepare 60 high school teachers to teach the Beauty and Joy of Computing (BJC) Computer Science Principles curriculum. The BJC course is a rigorous introductory computing course that highlights the meaning and applications of computing, while introducing the low-threshold programming languages Snap-Scratch, GameMaker, and AppInventor. BJC is informed and inspired by the Exploring Computer Science curriculum, which was explicitly designed to channel the interests of urban high school students with "culturally relevant and meaningful curriculum" [Goode 2011][Margolis 2008]. The BJC course uses collaborative classroom methods including pair learning, and student-selected projects are geared toward leveraging students' knowledge of social media, games, devices, and the internet. At UNC Charlotte in 2010 and 2011, PI Barnes engaged college students in supporting the BJC course, and in after-school outreach and summer camps that excite middle and high school students about this curriculum at different levels. The project engages three university faculty members and six college students to help the high school teachers build a Computer Science Teachers Association chapter and provide ongoing professional development and support for the BJC course. The project also engages high school teachers and an education researcher to help refine and enrich the BJC curriculum so that it is easier to adopt and teach in high schools.

CAREER: Educational Data Mining for Student Support in Interactive Learning Environment
Tiffany Barnes

$237,770 by UNC Charlotte/NSF
11/ 1/2013 - 06/30/2015

Creating intelligent learning technologies from data has unique potential to transform the American educational system, by building a low-cost way to adapt learning environments to individual students, while informing research on human learning. This project will create the technology for a new generation of data-driven intelligent tutors, enabling the rapid creation of individualized instruction to support learning in science, technology, engineering, and mathematics (STEM) fields. This has the potential to make individualized learning support accessible to a broad audience, from children to adults, including students who are traditionally underrepresented in STEM fields. This project will (1) develop computational methods to derive cognitive models from data that can be used to support individual learners through guidance, feedback, and help; (2) develop approaches to providing student support that leverage data to provide hints and guidance based on information such as frequency of student responses, probability of future errors, and solution efficiency; (3) develop interactive visualization tools for teachers to learn from student data in real time, to allow teachers and instructional designers to tailor instruction to address actual, rather than perceived, student problem areas; and (4) conduct formal empirical evaluations of the pedagogical effectiveness of our student support. Our software will construct adaptive support for teaching and learning in logic, discrete mathematics, and other STEM domains using a data-driven approach. From the extensive but tractable student performance data in computer-aided learning environments, we will automatically construct student cognitive models. Our cognitive models will build on our prior work using Markov Decision Processes and dimensionality reduction methods that leverage past data to assess student performance, direct a student's learning path, and provide contextualized hints.
We will use machine learning techniques to expand our problem-specific models into more general cognitive models to bootstrap the construction of new tutors and learn about student learning. For teachers and learning researchers, we will build a web-based visualization and analysis tool to graphically and interactively model student solutions annotated with performance data that reflects frequency, tendency to commit future errors, and closeness to a final solution. Through our new tutors and tools we will conduct experiments to understand student learning in a variety of contexts and domains, including logic, algebra, and chemistry. We will engage a team of diverse students and colleagues to bring interdisciplinary expertise to our research and share our findings broadly. This award is funded under the American Recovery and Reinvestment Act of 2009 (Public Law 111-5).
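The Markov Decision Process approach mentioned above can be illustrated with a minimal sketch: past student solution paths define states and actions, value iteration scores each state, and the hint is the logged action leading toward the highest-value successor. All state names, actions, and reward/discount values below are hypothetical, not the project's actual code or data.

```python
# Illustrative sketch of data-driven, MDP-based hint generation.
# States and actions would be mined from student interaction logs.

def value_iteration(transitions, goal_states, gamma=0.9, n_iters=100):
    """transitions: {state: {action: next_state}} observed in student logs.
    Goal (solved) states are worth 100; each step costs 1."""
    values = {s: 0.0 for s in transitions}
    for g in goal_states:
        values[g] = 100.0
    for _ in range(n_iters):
        for s, actions in transitions.items():
            if s in goal_states or not actions:
                continue
            values[s] = max(-1.0 + gamma * values.get(ns, 0.0)
                            for ns in actions.values())
    return values

def next_step_hint(state, transitions, values):
    """Hint = the logged action whose successor state has the highest value."""
    actions = transitions.get(state, {})
    if not actions:
        return None
    return max(actions, key=lambda a: values.get(actions[a], 0.0))

# Tiny hypothetical logic-proof state graph mined from past solutions:
logs = {
    "start":    {"apply_modus_ponens": "mid1", "apply_simplification": "dead_end"},
    "mid1":     {"apply_conjunction": "solved"},
    "dead_end": {},
    "solved":   {},
}
V = value_iteration(logs, goal_states={"solved"})
print(next_step_hint("start", logs, V))  # suggests "apply_modus_ponens"
```

States on paths that historically reached a solution accumulate higher values than dead ends, so frequently successful next steps naturally become the hints.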

BPC-AE: Scaling the STARS Alliance: A National Community for Broadening Participation Through Regional Partnerships
Tiffany Barnes

$150,000 by UNC-UNC Charlotte (NSF)
01/ 1/2013 - 12/31/2014

The Beauty and Joy of Computing project presents a unique opportunity to scale the STARS Alliance further while also enhancing national efforts to engage more high school teachers and students in teaching and learning computing and building stronger university/college/K-12 partnerships. Through this supplement, we will extend the Alliance with at least three new STARS Computing Corps, providing leadership training to a group of 8-10 students in each Corps, all focused on supporting the BJC effort. New Corps will provide teaching assistance to high school teachers implementing the BJC course through classroom visits and monthly Computer Science Teachers Association chapter meetings. These new STARS Computing Corps will also teach BJC material in middle school Citizen Schools after-school programs and K-12 summer camps. This will provide a vibrant community of support for high school teachers and students engaging with the new BJC course.

DO 2 Task 3.3 - Bass
John Bass

$43,978 by LAS
09/13/2013 - 09/30/2014

LAS DO3 Task Order 2.8 Analytic Workflow
Kristy Boyer

$33,853 by LAS
03/31/2014 - 12/31/2014

Internal award supplement

LAS DO3 Task Order 2.8 Analytic Workflow
Kristy Boyer

$16,529 by LAS
03/31/2014 - 12/31/2014

DO3 Task Order 2.8 Analytic Workflow

Educational Data Mining for Individualized Instruction in STEM Learning Environments
Min Chi ; Tiffany Barnes

$639,401 by National Science Foundation
09/ 1/2014 - 08/31/2017

Human one-on-one tutoring is one of the most effective educational interventions. Tutored students often perform significantly better than students in classroom settings (Bloom, 1984; Cohen, Kulik, & Kulik, 1982). Computer learning environments that mimic aspects of human tutors have also been highly successful. Intelligent Tutoring Systems (ITSs) have been shown to be highly effective in improving students' learning in real classrooms (Anderson, Corbett, Koedinger, & Pelletier, 1995; Koedinger, Anderson, Hadley, & Mark, 1997; VanLehn et al., 2005). The development of ITSs has enabled schools and universities to reach out and educate students who otherwise would be unable to take advantage of one-on-one tutoring due to cost and time constraints (Koedinger, Anderson, Hadley, & Mark, 1997). Despite the high payoffs provided by ITSs, significant barriers remain. High development costs and the challenges of knowledge engineering have prevented their widespread deployment: development, testing, and even maintenance require a diverse team of software developers, domain experts, and educational theorists, and on average about 80 person-hours per hour of tutoring content. In this proposed work, our goal is to automatically design effective personalized ITSs directly from log data. We will combine co-PI Dr. Barnes's data-driven approach to learning what to teach with PI Dr. Chi's data-driven work on learning how to teach. More specifically, we will explore two important undergraduate STEM domains, discrete math and probability, and employ two types of ITSs: an example-based ITS, the discrete math tutor, and a rule-based ITS, Pyrenees. The former can automatically generate hints directly from students' prior solutions, while the latter has hard-coded domain rules and teaches students a domain-general problem-solving strategy within the context of probability.
For learning how to teach, we will apply reinforcement learning to induce adaptive pedagogical strategies directly from students' log files, focusing on three levels of pedagogical decisions: 1) whether or not to give students hints, and what level of hint (step level); 2) whether to show students a worked example or ask them to engage in problem solving (problem level); and 3) whether or not to teach students a meta-cognitive problem-solving strategy (metacognitive level).

I/UCRC Planning Grant: Site Addition to CHMPR I/UCRC
Rada Chirkova

$13,779 by National Science Foundation
09/ 1/2014 - 08/31/2015

The objective of this letter of intent is to indicate that North Carolina State University (NCSU) will join, as a site, the Center of Hybrid Multicore Productivity Research (CHMPR) in Year 2 of its planned phase II I/UCRC renewal. CHMPR consists of sites at the University of Maryland, Baltimore County, and the University of California, San Diego. CHMPR is proposing a phase II extension of its successful conduct of research based on the continued challenges arising from the evolution of multicore technologies. The focus of NCSU within the center will be the science of technologies for end-to-end enablement of data. We are forming an NCSU-based site to join the existing I/UCRC Center for Hybrid Multicore Productivity (CHMPR). This planning grant will support meetings with potential industrial partners and CHMPR co-investigators from the University of Maryland, Baltimore County. Planning activities will be facilitated by the NCSU CSC Director of Development & External Relations, Corporate & Alumni Relations, Mr. Ken Tate, as well as by the Director of the Institute for NEXT generation IT Systems, Mr. John Streck. Meetings organized using the planning grant funds will operationalize, transform, and integrate themes developed during the preparation of this planning grant, experiences from ongoing industry partnerships at the three centers, and concepts outlined in NSF's "Purple Book" entitled Managing the Industry/University Cooperative Research Center. The NCSU I/UCRC center site will represent an exciting and novel integration of approaches to Big Data in multiple disciplines, as we target technological innovations on extracting value from Big Data for a diverse pool of stakeholders.

DO3 Task 2.7 Data Readiness

$74,660 by Laboratory for Analytic Sciences
03/31/2014 - 12/31/2014

DO 2 Task 3.6 - Doyle
Jon Doyle

$46,904 by Laboratory for Analytic Sciences
09/13/2013 - 12/31/2014

LAS DO3 Task Order 2.9 KRM
Jon Doyle

$76,241 by LAS
03/31/2014 - 12/31/2014

DO3 Task Order 2.9 KRM

NeTS: JUNO: Service Offering Model and Versatile Network Resource Grooming for Optical Packet and Circuit Integrated Networks
Rudra Dutta

$291,956 by National Science Foundation (NSF)
04/ 1/2014 - 03/31/2017

The explosive growth in bandwidth represented by advances in optical communication and networking technologies has underpinned the increasing reach and reliability of the Internet in the last two decades. However, the potential impact of increasingly sophisticated recent advances in optical technology, such as rapid switching and elastic wavelengths, has not yet been realized. The main cause of this is that such technology, while possible to integrate into the data plane of planetary networking, is difficult to accommodate in current planning, management, and control strategies. In this project, we propose to work hand-in-hand with collaborating researchers from NICT, Japan, who are working to realize a novel technology of hybrid optical packet/circuit switching. Such a technology could be immensely useful to large transport network operators, but there are no existing algorithms that can easily determine how a provider should provision its resources between the circuit and packet possibilities on an ongoing dynamic basis. We envision a novel approach to this problem, in which we utilize the concept of a "choice marketplace" that allows sophisticated rendezvous semantics between customer and provider, and allows them to cooperatively guide network resource provisioning to dynamically fulfill network objectives such as maximizing the performance received by network traffic. Our approach also allows balancing of various objectives, such as network utilization, latency, and the increasingly important metric of energy expenditure in the network.

NeTS: Small: Collaborative Research: Enabling Robust Communication in Cognitive Radio Networks with Multiple Lines of Defense
Rudra Dutta ; Peng Ning

$249,901 by National Science Foundation
10/ 1/2013 - 09/30/2016

Cognitive radio is an emerging advanced radio technology in wireless access, with many promising benefits including dynamic spectrum sharing, robust cross-layer adaptation, and collaborative networking. Opportunistic spectrum access (OSA) is at the core of cognitive radio technologies and has received great attention recently, focusing on improving spectrum utilization efficiency and reliability. However, the state of the art still suffers from one severe security vulnerability that has been largely overlooked by the research community so far: a malicious jammer can always disrupt legitimate network communication by leveraging publicly available channel statistics to effectively jam the channels, leading to serious spectrum underutilization. In this project, we propose to address the challenge of effective anti-jamming communication in cognitive radio networks (CRNs). We propose a multiple-lines-of-defense approach, which considers and integrates defense technologies from different dimensions, including frequency hopping, power control, cooperative communication, and signal processing. The proposed defense approach enables both reactive and proactive protection, from evading jammers to competing against jammers to expelling jamming signals, and thus guarantees effective anti-jamming communication under a variety of network environments.

NeTS: Large: Collaborative Research: Network Innovation Through Choice
Rudra Dutta ; George Rouskas

$643,917 by National Science Foundation
09/15/2011 - 09/30/2015

This project builds on the SILO project, started in 2006, to design a new architecture for the Internet. In this new project, we will collaborate with teams of researchers from the University of Kentucky, the University of Massachusetts, and RENCI to design critical parts of a new architecture for the Internet that will support the flexible use of widely applicable information transport and transformation modules to create good solutions for specific communication applications. The key idea is to allow a network to offer information transformation services at the edge or in the core transparently to the application, and to create a framework in which applications can issue a request not only for communication but for specific reusable services. We also propose research tasks that will enable network virtualization and isolation seamlessly at many levels, currently a difficult but highly relevant problem in practical networking.

CAREER: Secure OS Views for Modern Computing Platforms
William Enck

$400,000 by National Science Foundation
02/ 1/2013 - 01/31/2018

Controlling the access and use of information is a fundamental challenge of computer security. Emerging computing platforms such as Android and Windows 8 further complicate access control by relying on sharing and collaboration between applications. When more than two applications participate in a workflow, existing permission systems break down due to their boolean nature. In this proposal, we seek to provide applications with residual control of their data and its copies. To do this, we propose secure OS views, which combine a new abstraction for accessing data with whole-system information tracking. We apply secure OS views to modern operating systems (e.g., Android and Windows 8), which use database-like abstractions for sharing and accessing information. Similar to a database view, secure OS views use runtime context to dynamically define the protection domain, allowing the return of the value, a fake value, or nonexistence of the record.

TWC: Small: Collaborative: Characterizing the Security Limitations of Accessing the Mobile Web
William Enck

$167,000 by NSF
10/ 1/2012 - 09/30/2015

Mobile browsers are beginning to serve as critical enablers of modern computing. With a combination of rich features that rival their desktop counterparts and strong security mechanisms such as TLS/SSL, these applications are becoming the basis of many other mobile apps. Unfortunately, the security guarantees provided by mobile browsers and the risks associated with today's mobile web have not been evaluated in great detail. In the proposed work, we will investigate the security of mobile browsers and the mobile-specific websites they access. Characterizing and responding to the threats in this space is critical, especially given that cellular devices are used by more than five billion people around the world.

Refining Security Policy for Smartphone Applications
William Enck

$49,726 by US Army - Army Research Office
08/15/2014 - 05/14/2015

Smartphones running Android and iOS have penetrated all areas of the public and private sectors. These platforms mark an important milestone in operating system security: they enforce security policies on applications, not users. Both platforms use least-privilege security goals for user applications downloaded from application markets. However, whereas Android largely enforces security goals with OS policy, iOS depends heavily on a review process. Unfortunately, review alone can be circumvented, as demonstrated by multiple recent works. This proposal seeks to deepen our understanding of least-privilege policy for user applications by designing formal frameworks and algorithms for modeling, extracting, and enforcing policy. Successful execution of the proposed tasks will enhance end-user security and privacy on Android and iOS devices.

Collaborative Research: Research in Student Peer Review: A Cooperative Web-Services Approach
Edward Gehringer

$1,034,166 by NSF
09/ 1/2014 - 08/31/2017

Peer review between students has a 40-year history in academia. During the last half of that period, web-based peer-review systems have been used in tens of thousands of classes. Many online systems have been developed, in diverse settings and with diverse purposes. The systems, however, have common concerns: assigning capable reviewers to each student submission, ensuring review quality, and delivering reliable scores in cases where the systems are used for summative review of student work. Many strategies have been proposed to meet those concerns, and tested in relatively small numbers of courses. The next step is to scale up the studies to learn how well they perform in diverse settings and with large numbers of students. This project brings together researchers from several peer-review systems, including some of the largest, to build web services that can be incorporated into existing systems to test these strategies and visualize the results.
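One of the concerns named above, assigning reviewers to submissions, can be sketched with a simple cyclic rotation that guarantees no student reviews their own work and every student reviews the same number of submissions. This is a hypothetical illustration, not the assignment strategy of any of the systems in the project; real systems also weigh reviewer competence and load.

```python
# Minimal sketch: rotation-based reviewer assignment for peer review.

def assign_reviewers(authors, k=2):
    """authors: ordered list of submission authors (one submission each).
    Returns {author: [reviewers]} via cyclic rotation, so each student
    reviews exactly k submissions and never their own."""
    n = len(authors)
    if k >= n:
        raise ValueError("need more students than reviews per submission")
    return {
        author: [authors[(i + shift) % n] for shift in range(1, k + 1)]
        for i, author in enumerate(authors)
    }

assignments = assign_reviewers(["ana", "bo", "cy", "dee"], k=2)
print(assignments["ana"])  # ['bo', 'cy']
```

The rotation keeps the workload perfectly balanced, which makes it a useful baseline against which capability-aware assignment strategies can be compared.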

CAREER: Enable Robust Virtualized Hosting Infrastructures via Coordinated Learning, Recovery, and Diagnosis
Xiaohui (Helen) Gu

$450,000 by National Science Foundation
01/ 1/2012 - 12/31/2016

Large-scale virtualized hosting infrastructures have become the fundamental platform for many real-world systems such as cloud computing, enterprise data centers, and educational computing labs. However, due to their inherent complexity and shared nature, hosting infrastructures are prone to various runtime problems such as performance anomalies and software/hardware failures. The overarching objective of this proposal is to systematically explore innovative runtime reliability management techniques for large-scale virtualized hosting infrastructures. Our research focuses on handling performance anomalies in distributed systems that are often very difficult to reproduce offline. Our approach combines the power of online learning, knowledge-driven first-response recovery, and in-situ diagnosis to handle unexpected system anomalies more efficiently and effectively. We aim at transforming runtime system anomaly management from a trial-and-error guessing game into an efficient knowledge-driven self-healing process.

Modeling Context and Sentiment to Visualize Narrative Threads in Large Document Collections
Christopher Healey

$91,749 by SAS Institute, Inc.
08/16/2014 - 08/15/2015

This project will investigate the use of text analytics and information visualization for analyzing, visualizing, and exploring large document collections. Text will be analyzed using a combination of SAS Text Miner and new algorithms designed to identify, structure, and associate context, sentiment, and other properties in a document collection. Results will be segmented into narrative threads that identify conceptual storylines within the document collection. Threads will be visualized using two approaches: node-link graphs and adjacency matrices. For the graphs, graph analysis algorithms will be studied to determine methods to derive useful metrics and summaries on the document collection. For the adjacency matrices, algorithms will be developed to order rows and columns in ways that expose patterns in the document collection. For both visualization techniques, issues of multivariate data representation and level-of-detail hierarchies will also be considered.
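Reordering an adjacency matrix so that related rows and columns sit next to each other is the basic idea behind the matrix visualization described above. As a minimal, hypothetical sketch (not the project's actual algorithm), sorting rows and columns by node degree already pulls well-connected threads toward the top-left corner; real matrix seriation methods are more sophisticated. The thread names and matrix below are invented for illustration.

```python
# Illustrative sketch: reorder a 0/1 adjacency matrix by node degree
# so densely connected items cluster toward the top-left.

def reorder_by_degree(matrix, labels):
    """Sort rows and columns of a square adjacency matrix by descending
    node degree, keeping labels aligned with the new order."""
    degree = [sum(row) for row in matrix]
    order = sorted(range(len(matrix)), key=lambda i: -degree[i])
    reordered = [[matrix[i][j] for j in order] for i in order]
    return reordered, [labels[i] for i in order]

# Hypothetical co-occurrence of narrative threads across documents:
threads = ["merger", "lawsuit", "ipo", "scandal"]
adj = [
    [0, 1, 1, 1],  # "merger" co-occurs with every other thread
    [1, 0, 0, 0],
    [1, 0, 0, 1],
    [1, 0, 1, 0],
]
m, names = reorder_by_degree(adj, threads)
print(names)  # ['merger', 'ipo', 'scandal', 'lawsuit']
```

After reordering, the most connected thread leads the matrix, so block patterns among related storylines become visible at a glance.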

Mixed-Initiative Visualization and UI Modeling for Cyber-Physical Data
Christopher Healey

$36,834 by Scientific Systems Company, Inc (US Air Force)
03/18/2014 - 09/18/2014

A significant challenge for cyber-physical systems is incorporating human judgment into a complex analysis process. In a fully automated analysis, results and their justification can be difficult for users to understand and trust; a more effective approach is to support interactive construction and incremental modification of findings. We are designing and implementing a visualization assistant, ViA, that supports mixed-initiative interaction to collaborate with an analyst during visualization construction. Mixed-initiative approaches allow the computer and the user to share their expertise: for example, large-scale computation, search, and query processing performed by the visualization system, together with the application of domain knowledge and expertise, as well as suggestions or constraints based on the data and tasks, provided by an analyst. Extensions of ViA will concentrate on improved higher-level support for an analyst's current workflow and mental models. One common criticism of past visualization efforts has been that we provided tools and asked analysts to fit their problems to our tools, rather than building tools for the analysts' problems.

Transcriptional Nodes Coordinate Patterning and Cellular Proliferation During Carpel Margin Meristem Development
Steffen Heber

$71,066 by National Science Foundation
03/ 1/2014 - 02/28/2017

The coordination of spatial patterning cues and cellular proliferation underlies diverse processes from cancerous growth to reproductive development. A long-term objective of my research program is to understand how proliferative cues are coordinated with spatial information during organogenesis. In Arabidopsis thaliana this coordination of patterning and proliferation is necessary within the carpel margin meristem (CMM) to generate ovules that, when fertilized, will become seeds. In the previous funding period we demonstrated that the SEUSS (SEU) and AINTEGUMENTA (ANT) transcription factors regulate critical patterning events that support carpel margin meristem and ovule development. Our genetic analysis demonstrates that SEU and ANT share a partially redundant and overlapping function essential for proper seed formation. As SEU and ANT do not share sequence similarity, the molecular basis for this redundancy is not understood. We propose that the SEU and ANT activities synergistically converge at key transcriptional nodes. A node in this sense is a gene or a set of related genes that requires the combined activities of SEU and ANT for its expression. Our recently published transcriptomic analysis indicates that many of these nodes encode known transcriptional regulators. By studying these nodes we hope to better understand the transcriptional hierarchies that control CMM development and uncover the mechanistic basis of the synergistic action of SEU and ANT. Our transcriptomics study cannot determine whether the nodes that we have identified are directly or indirectly regulated by SEU or ANT activity. However, even if these genes are indirectly controlled by SEU and ANT activity, their expression within the developing CMM suggests they may still play a critical functional role during CMM development.
Furthermore, having now identified a set of genes that are enriched for CMM expression we are in a position to study the cis-regulatory elements that support gene expression within the CMM and medial gynoecial domain. Thus here we propose to: 1) Identify direct targets of SEU regulation within the CMM to further refine the transcriptional hierarchy required for CMM development; 2) assay the functional role of two of these nodes during CMM development; one encoded by the transcription factor PERIANTHIA and the second encoded by members of the REM family of B3 domain-containing proteins; 3) Identify cis-acting DNA regulatory elements required for CMM expression. Scientific significance: Understanding the coordination of cellular proliferation and spatial patterning during organogenesis is broadly of interest to scientists working in a diversity of fields. Completion of these specific aims will move us toward this future goal by illuminating the mechanistic basis for the overlapping functions of SEU and ANT during carpel margin and ovule development. Additionally, we expect that by elucidating the molecular mechanisms of the synergistic action of SEU and ANT upon key transcriptional nodes, we will engender a greater understanding of the molecular underpinnings of non-additivity within transcriptional networks and the complexity of developmental programs. Past NSF funding for this project (IOS-0821896) has resulted in the publication of five articles in well-respected journals (two in Plant Physiology, and one each in Developmental Biology, PLoS One, and BMC Plant Biology). Broader impacts: I ensure a broad societal impact from my program by integrating my research efforts with my teaching and training responsibilities and by widely disseminating materials and results. Furthermore, I initiated and continue to lead an outreach group that prepares and presents hands-on science demonstrations at local North Carolina schools. 
Our group has reached over 1500 Kindergarten through Grade 12 students over the past six years and continues to develop new demonstration modules inspired by our current work in developmental biology and genetics.

North Carolina Bio-Preparedness Collaboration (NCB-Prepared)
Marc Hoit ; Laurie Williams

$1,760,486 by US Dept of Homeland Security via UNC-CH
06/ 1/2010 - 09/30/2014

For this project, we will explore the potential benefits of symptomatic and syndromic surveillance using existing NCB-Prepared data sources, including EMS, ED, and poison control data, to improve surveillance capacity and outbreak response relating to the area of food safety. During the initial phase, we will examine two years of NCB-Prepared national poison control data to evaluate its utility in evaluating trends in foodborne illness. This initial phase will produce preliminary statistics by working with the SAS analytics team of NCB-Prepared to incorporate poison control data into the system. Possible analytical techniques include descriptive statistics, Fourier analysis, and cluster analysis. Results from this phase will provide a baseline for identifying potential foodborne illness outbreaks in the future as part of the NCB-Prepared system. This first phase will demonstrate basic functionality of the poison control data by July 30, 2012. During the second phase, we will continue to explore relationships between the poison control, EMS, and ED data in relation to their ability to improve early detection of potential foodborne illness outbreaks. After the first phase, the project will have a national poison center data set relating to food safety issues covering at least 10 years. For example, we will select key national outbreaks and determine whether the historical data provided to NCB-Prepared could have been used to provide earlier signals that an outbreak was ongoing. A preliminary result will be produced for this second phase by September 30. Additional efforts will be made to help the team explore relationships between the poison control, EMS, and ED data as they pertain to foodborne illness outbreaks.
Comprehension-Driven Program Analysis (CPA) for Malware Detection in Android Phones
Xuxian Jiang

$556,488 by Iowa State University/US Air Force-Research Laboratory
02/ 3/2012 - 08/ 2/2016

Our goal is to develop new automated program analyses capable of proving that the application programs have security properties of interest to the DoD and demonstrate those analyses in the form of tools designed specifically to keep malicious code out of DoD Android-based mobile application marketplaces.

CAREER: Towards Exterminating Stealthy Rootkits -- A Systematic Immunization Approach
Xuxian Jiang

$424,166 by the National Science Foundation
02/15/2010 - 01/31/2015

Stealthy rootkit attacks are one of the most fundamental threats to cyberspace. With the capability of subverting the software root of trust of a computer system, i.e., the operating system (OS) or the hypervisor, a rootkit can instantly take control of the system and stealthily inhabit the victim. To effectively mitigate and defeat such attacks, researchers have explored various solutions. Unfortunately, the state-of-the-art defense is mainly reactive and in a fundamentally disadvantageous position in the arms race against these stealthy attacks. The proposed research aims to fundamentally change the arms race with a systematic immunization approach that proactively prevents and exterminates rootkit attacks. Inspired by the human immune system and fundamental biological design principles, the proposed approach transforms system software (i.e., the OS and the hypervisor) so that the new system tips the balance in favor of the rootkit defense. To accomplish that, we will investigate a suite of innovative techniques to preserve kernel/hypervisor control-flow integrity and evaluate their effectiveness with real-world malware and infrastructures. The proposed education components include the creation and dissemination of unique hands-on course materials with live demos, lab sessions, and tutorials.

Secure Open Systems Initiative
Dennis Kekas ; Peng Ning ; Mladen Vouk ; Rudra Dutta

$5,644,306 by Army Research Office
04/ 3/2008 - 11/30/2014

This program will establish a national Secure Open Systems Institute (SOSI), located on North Carolina State's premier Centennial Campus, that will be a global center for Open Source security research and development. The goals are twofold. First, SOSI will significantly contribute to strengthening mission critical information technology infrastructures vital to the Department of Defense, state and nation. Second, SOSI will accelerate the creation and growth of high tech industries in North Carolina and beyond by providing a centralized repository of research results, testing tools and qualification services.

Learning Environments Across Disciplines LEADS: Supporting Technology Rich Learning Across Disciplines: Affect Generation and Regulation During Co-Regulated Learning in Game-Based Learning Environments (Supplement)
James Lester

$30,894 by McGill University/Social Sciences and Humanities Research Council of Canada
04/ 1/2012 - 02/28/2020

Contemporary research on multi-agent learning environments has focused on self-regulated learning (SRL) while relatively little effort has been made to use co-regulated learning as a guiding theoretical framework (Hadwin et al., 2011). This oversight needs to be addressed given the complex roles that self- and other-regulatory processes play when human learners and artificial pedagogical agents (APAs) interact to support learners' internalization of cognitive, affective, and metacognitive (CAM) SRL processes. We will use the Crystal Island learning environment to investigate these issues.

SCH: INT: Collaborative Research: A Self-Adaptive Personalized Behavior Change System for Adolescent Preventive Healthcare
James Lester

$952,818 by National Science Foundation
10/ 1/2013 - 09/30/2017

Although the majority of adolescent health problems are amenable to behavioral intervention, and most adolescents are comfortable using interactive computing technology, few health information technology interventions have been integrated into adolescent care. The objective of the proposed research is to design, implement, and investigate INSPIRE, a self-adaptive personalized behavior change system for adolescent preventive healthcare. With a focus on adolescents, INSPIRE will enable adolescents to be active participants in dynamically generated, personalized narrative experiences that operationalize theoretically grounded interventions for behavior change through interactive narratives' plot structures and virtual character interactions.

Learning Environments Across Disciplines LEADS: Supporting Technology Rich Learning Across Disciplines: Affect Generation and Regulation During Co-Regulated Learning in Game-Based Learning Environments
James Lester

$77,864 by McGill University/Social Sciences and Humanities Research Council of Canada
04/ 1/2012 - 03/31/2015

Contemporary research on multi-agent learning environments has focused on self-regulated learning (SRL) while relatively little effort has been made to use co-regulated learning as a guiding theoretical framework (Hadwin et al., 2011). This oversight needs to be addressed given the complex roles that self- and other-regulatory processes play when human learners and artificial pedagogical agents (APAs) interact to support learners' internalization of cognitive, affective, and metacognitive (CAM) SRL processes. We will use the Crystal Island learning environment to investigate these issues.

LAS DO 2 task 3.3 - Lester-Taylor-Mott-Rowe
James Lester

$54,488 by Lab for Analytic Sciences/NSA
09/13/2013 - 09/30/2014

DO 2 task 3.3 activities (no other abstract available)

LAS DO 2 Task 3.8 - Lester
James Lester

$45,614 by Lab for Analytic Sciences/NSA
09/13/2013 - 09/30/2014

DO 2 Task 3.8 activities (no abstract available)

DO 2 Task 3.5 - Lester
James Lester

$37,577 by Lab for Analytic Sciences/NSA
09/13/2013 - 09/30/2014

DO 2 Task 3.5 activities (no abstract available)

CHS: Medium: Adapting to Affect in Multimodal Dialogue-Rich Interaction with Middle School Students
James Lester ; Kristy Boyer ; Bradford Mott ; Eric Wiebe

$1,184,073 by National Science Foundation
08/ 1/2014 - 07/31/2017

Despite the great promise offered by learning environments for K-12 science education, realizing their potential poses significant challenges. In particular, current learning environments do not adaptively support children's affect. This project investigates computational models of affect that support multimodal dialogue-rich interaction. With an emphasis on devising scalable solutions, the project focuses on machine-learning techniques for automatically acquiring affect and dialogue models that can be widely deployed in natural classroom settings.

Type I: ENGAGE: Immersive Game-Based Learning for Middle Grade Computational Fluency
James Lester ; Kristy Boyer ; Bradford Mott ; Eric Wiebe

$1,047,996 by National Science Foundation
01/ 1/2012 - 12/31/2014

The goal of the ENGAGE project is to develop a game-based learning environment that will support middle grade computer fluency education. It will be conducted by an interdisciplinary research team drawn from computer science, computer science education, and education. In collaboration with North Carolina middle schools, the research team will design, develop, deploy, and evaluate a game-based learning environment that enables middle school students to acquire computer fluency knowledge and skills. The ENGAGE project will be evaluated in middle grade classrooms with respect to both learning effectiveness and engagement.

Using Deep Learning to Build Context-Sensitive Language Models

$272,839 by SAS Institute, Inc.
09/ 1/2014 - 08/31/2015

An important problem in Natural Language Processing is creating a robust language model that represents how a set of documents is composed from a set of words. Having a robust language model is critical for many text analytic tasks, including topic modeling, information retrieval, text categorization, and sentiment analysis. Most language models utilize a bag-of-words representation that ignores word order. While these representations suffice for some text analytic tasks, a context-sensitive language model is essential for many tasks such as discovering semantic relationships, question answering, machine translation, and dialog modeling. This project will investigate the effectiveness of leveraging Deep Learning architectures to create context-sensitive language models.

Detection and Transition Analysis of Engagement and Affect in a Simulation-Based Combat Medic Training Environment
James Lester ; Bradford Mott

$478,592 by Columbia University/US Army Research Laboratory
12/19/2012 - 06/17/2015

The project will develop automated detectors that can infer the engagement and affect of trainees learning through the vMedic training system. This project will combine interaction-based methods for detecting these constructs (e.g., models making inference solely from the trainee's performance history) with scalable sensor-based methods for detecting these constructs, towards developing models that can leverage sensor information when it is available, but which can still assess trainee engagement and affect effectively even when sensors are not available. The automated detectors will be developed, integrated together, and validated for accuracy when applied to new trainees.
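The graceful-degradation idea described above can be sketched in a few lines: blend sensor evidence into the estimate when it is present, and fall back to the interaction-based model alone when it is not. The feature encoding and blend weights below are invented for illustration and are not the project's actual detectors:

```python
# Hypothetical sketch of a detector that degrades gracefully when sensor
# data is unavailable. Features and weights are illustrative assumptions.

def engagement_score(interaction_features, sensor_features=None):
    # Interaction-only estimate, e.g. recent success rate on training tasks.
    base = sum(interaction_features) / len(interaction_features)
    if sensor_features is None:
        return base                       # no sensors: interaction model alone
    # Sensors present: blend in a sensor-derived estimate.
    sensor = sum(sensor_features) / len(sensor_features)
    return 0.6 * base + 0.4 * sensor      # weights would be fit from data

print(engagement_score([1, 0, 1, 1]))              # -> 0.75 (interaction only)
print(engagement_score([1, 0, 1, 1], [0.5, 0.7]))  # blended estimate
```

In practice each sub-model would be a trained classifier rather than an average, but the control flow (sensor branch vs. fallback branch) is the point being illustrated.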

The Leonardo Project: An Intelligent Cyberlearning System for Interactive Scientific Modeling in Elementary Science Education
James Lester ; Bradford Mott ; Michael Carter ; Eric Wiebe

$3,499,409 by National Science Foundation
08/15/2010 - 07/31/2015

The goal of the Leonardo project is to develop an intelligent cyberlearning system for interactive scientific modeling. Students will use Leonardo's intelligent virtual science notebooks to create and experiment with interactive models of physical phenomena. As students design and test their models, Leonardo's intelligent virtual tutors will engage them in problem-solving exchanges in which they will interactively annotate their models as they devise explanations and make predictions. During the project, the Leonardo virtual science notebook system will be rolled out to 60 classrooms in North Carolina, Texas, and California.

Co-Design of Hardware / Software for Predicting MAV Aerodynamics
Frank Mueller

$799,999 by Virginia Polytechnic Institute and State University (US Air Force)
09/ 1/2012 - 10/31/2017

This proposal provides subcontractor support to Virginia Tech for a proposal submitted under the Air Force's Basic Research Initiative. The proposal will focus on development of reconfigurable mapping strategies for porting multi-block structured and unstructured-mesh CFD codes to computing clusters containing CPU/GPU processing units.

Hobbes: OS and Runtime Support for Application Composition
Frank Mueller

$300,000 by Sandia National Laboratories via US Dept of Energy
10/24/2013 - 10/23/2016

This project intends to deliver an operating system and runtime system (OS/R) environment for extreme-scale scientific computing. We will develop the necessary OS/R interfaces and low-level system services to support isolation and sharing functionality for designing and implementing applications as well as performance and correctness tools. We propose a lightweight OS/R system with the flexibility to custom-build runtimes for any particular purpose. Each component executes in its own "enclave" with a specialized runtime and isolation properties. A global runtime system provides the software required to compose applications out of a collection of enclaves, join them through secure and low-latency communication, and schedule them to avoid contention and maximize resource utilization. The primary deliverable of this project is a full OS/R stack based on the Kitten operating system and Palacios virtual machine monitor that can be delivered to vendors for further enhancement and optimization.

CPS: Breakthrough: Collaborative Research: Bringing the Multicore Revolution to Safety-Critical Cyber-Physical Systems
Frank Mueller

$225,000 by National Science Foundation
02/ 1/2013 - 01/31/2016

Multicore platforms have the potential of revolutionizing the capabilities of embedded cyber-physical systems but lack predictability in execution time due to shared resources. Safety-critical systems require such predictability for certification. This research aims at resolving this "multicore predictability problem." It will develop methods that enable shared hardware resources to be allocated predictably, including support for real-time operating systems, middleware, and associated analysis tools. The devised methods will be evaluated through experimental research involving synthetic micro-benchmarks and code for unmanned air vehicles "re-thinking" their adaptation to changing environmental conditions within cyber-physical systems.

SHF: Small: RESYST: Resilience via Synergistic Redundancy and Fault Tolerance for High-End Computing
Frank Mueller

$376,219 by National Science Foundation
10/ 1/2010 - 09/30/2015

In High-End Computing (HEC), faults have become the norm rather than the exception for parallel computation on clusters with 10s/100s of thousands of cores. As the core count increases, so does the overhead for fault-tolerant techniques that rely on checkpoint/restart (C/R) mechanisms. At 50% overheads, redundancy is a viable alternative to fault recovery that actually scales, which makes the approach attractive for HEC. The objective of this work is to develop a synergistic approach that combines C/R-based fault tolerance with redundancy in computation to achieve high levels of resilience. This work alleviates scalability limitations of current fault-tolerant practices. It contributes to fault modeling as well as fault detection and recovery, significantly advancing existing techniques by controlling levels of redundancy and checkpointing intervals in the presence of faults. It is transformative in providing a model where users select a target failure probability at the price of using additional resources.

Resilience for Global Address Spaces
Frank Mueller

$153,934 by Lawrence Berkeley National Laboratory via US Dept of Energy
09/24/2013 - 08/15/2015

The objective of this work is to provide functionality for the BLCR Linux module under a PGAS runtime system (within the DEGAS software stack) to support advanced fault-tolerant capabilities, which are of specific value in the context of large-scale computational science codes running on high-end clusters and, ultimately, exascale facilities. Our proposal is to develop and integrate into DEGAS a set of advanced techniques to reduce the checkpoint/restart (C/R) overhead.
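A classical first-order view of where C/R overhead comes from is Young's formula for the optimal checkpoint interval, tau = sqrt(2 * C * MTBF), where C is the cost of writing one checkpoint and MTBF is the mean time between failures. The sketch below applies it with illustrative numbers; these are not measurements from DEGAS or BLCR:

```python
import math

# First-order estimate of the optimal checkpoint interval (Young's formula).
# checkpoint_cost_s: seconds to write one checkpoint; mtbf_s: mean time
# between failures in seconds. Values below are illustrative only.

def optimal_checkpoint_interval(checkpoint_cost_s, mtbf_s):
    return math.sqrt(2.0 * checkpoint_cost_s * mtbf_s)

# 5-minute checkpoints on a machine with a 24-hour MTBF:
tau = optimal_checkpoint_interval(300.0, 24 * 3600.0)
print(round(tau / 60))  # -> 120 (checkpoint roughly every two hours)
```

The formula makes the trade-off concrete: cheaper checkpoints (smaller C) justify checkpointing more often, which is exactly why techniques that reduce per-checkpoint cost reduce overall C/R overhead.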

SHF: Small: Scalable Trace-Based Tools for In-Situ Data Analysis of HPC Applications (ScalaJack)
Frank Mueller

$457,395 by National Science Foundation
06/ 1/2012 - 05/31/2015

This decade is projected to usher in the period of exascale computing with the advent of systems with more than 500 million concurrent tasks. Harnessing such hardware with coordinated computing in software poses significant challenges. Production codes tend to face scalability problems, but current performance analysis tools seldom operate effectively beyond 10,000 cores. We propose to combine trace analysis and in-situ data analysis techniques at runtime. Application developers thus create ultra-low-overhead measurement and analysis facilities on the fly, customized for the performance problems of a particular application. We propose an analysis generator called ScalaJack for this purpose. Results of this work will be contributed as open-source code to the research community and beyond, as done in past projects. Pluggable, customizable analysis not only allows other groups to build tools on top of our approach but also to contribute components to our framework that will be shared in a repository hosted by us.

Operating System Mechanisms for Many-Core Systems-Phase II (PICASO II) Pico-kernel Adaptive and Scalable Operating Systems Phase II
Frank Mueller

$225,000 by Securboration via US Air Force Research Laboratory
06/ 1/2013 - 05/31/2015

The objective of this work is to design and evaluate novel system and program abstractions for combined performance and scalability, paving the path toward a future of operating systems supporting a massive number of cores on a single chip.

CAREER:Expanding Developers' Usage of Software Tools by Enabling Social Learning
Emerson Murphy-Hill

$495,721 by National Science Foundation
08/ 1/2013 - 07/31/2018

Tools can help software developers alleviate the challenge of creating and maintaining software. Unfortunately, developers only use a small subset of the available tools. The proposed research investigates how social learning, an effective mechanism for discovering new tools, can help software developers to discover relevant tools. In doing so, developers will be able to increase software quality while decreasing development time.

TWC: Small: Collaborative: Discovering Software Vulnerabilities Through Interactive Static Analysis
Emerson Murphy-Hill

$249,854 by National Science Foundation
10/ 1/2013 - 09/30/2016

Software vulnerabilities originating from insecure code are one of the leading causes of security problems people face today. Current tool support for secure programming focuses on catching security errors after the program is written. We propose a new approach, interactive static analysis, to improve upon static analysis techniques by introducing a new mixed-initiative paradigm for interacting with developers to aid in the detection and prevention of security vulnerabilities.

DO 2 Task 3.7 - Murphy Hill
Emerson Murphy-Hill

$49,486 by Laboratory for Analytic Sciences
09/13/2013 - 12/31/2014

DO 2 Task 3.7 activities

LAS DO3 Task Order 2.11 Mission Enabling - Murphy-Hill
Emerson Murphy-Hill

$72,651 by Laboratory for Analytic Sciences
03/31/2014 - 12/31/2014

DO3 Task Order 2.11 Mission Enabling

LAS DO3 Task Order 2.8 Analytic Workflow
Emerson Murphy-Hill

$16,686 by LAS
03/31/2014 - 12/31/2014

DO3 Task Order 2.8 Analytic Workflow

SHF: Small: Expressive and Scalable Notifications from Program Analysis Tools
Emerson Murphy-Hill ; Sarah Heckman

$250,000 by National Science Foundation
10/ 1/2012 - 09/30/2015

A wide variety of program analysis tools have been created to help software developers do their jobs, yet the output of these tools is often difficult to understand and varies significantly from tool to tool. As a result, software developers may waste time trying to interpret the output of these tools, instead of making their software more capable and reliable. This proposal suggests a broad investigation of several types of program analysis tools, with the end goal being an improved understanding of how program analysis tools can inform developers in the most expressive and uniform way possible. Once this goal is reached, we can create program analysis tools that enable developers to make tremendous strides towards more correct, more reliable, and more on-time software systems.

CISCO-NCSU Internship Program
Peng Ning

$160,000 by Cisco Systems, Inc.
07/12/2011 - 07/14/2016

This is a pilot internship program between NCSU and Cisco for 4 undergraduate students to learn through working part-time on real-life problems for Cisco, with the hope that this pilot program can grow and develop into a long-term working relationship. Specifically, NCSU students will participate in Cisco Software Application Support plus Upgrades (SASU) projects and/or conduct research for SASU. This will be done with the understanding that the interns are students, and as such are learning and being trained, with the training coming from both Cisco (for SASU-specific skills) and NCSU (through the undergraduate program they are enrolled in) for general relevant skills.

Is Wireless Channel Dependable for Security Provisioning?
Peng Ning (co-PI)

$350,000 by the National Science Foundation
08/12/2013 - 07/31/2016

Wireless security is receiving increasing attention as wireless systems become a key component in our daily life as well as in critical cyber-physical systems. Recent progress in this area exploits physical layer characteristics to offer enhanced, and sometimes the only available, security mechanisms. The success of such security mechanisms depends crucially on the correct modeling of underlying wireless propagation. It is widely accepted that wireless channels decorrelate fast over space, and half a wavelength is the key distance metric used in existing wireless physical layer security mechanisms for security assurance. We believe that this channel correlation model is incorrect in general: it leads to a wrong hypothesis about the inference capability of a passive adversary and results in a false sense of security, which will expose legitimate systems to severe threats with little awareness. In this project, we seek to understand the fundamental limits in passive inference of wireless channel characteristics, and further advance our knowledge and practice in wireless security.
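The half-wavelength rule the abstract questions comes from a textbook result: under the classical Clarke/Jakes scattering model, the spatial correlation of the channel at antenna separation d is J0(2*pi*d/lambda), the zeroth-order Bessel function. The sketch below evaluates that model numerically (this is the standard model assumption, not the project's measured channels):

```python
import math

# Spatial channel correlation under the Clarke/Jakes model:
# rho(d) = J0(2*pi*d/lambda). J0 is computed here from its integral form
# J0(x) = (1/pi) * integral_0^pi cos(x*sin(t)) dt via the trapezoidal rule.

def bessel_j0(x, steps=1000):
    h = math.pi / steps
    total = 0.5 * (math.cos(x * math.sin(0.0)) + math.cos(x * math.sin(math.pi)))
    for k in range(1, steps):
        total += math.cos(x * math.sin(k * h))
    return total * h / math.pi

def spatial_correlation(d_over_lambda):
    return bessel_j0(2.0 * math.pi * d_over_lambda)

for d in (0.1, 0.25, 0.5):
    # correlation decays quickly and turns negative by d = lambda/2
    print(d, round(spatial_correlation(d), 3))
```

The model predicts near-zero correlation around a half wavelength, which is why that distance is used as a decorrelation rule of thumb; the project's point is that real channels need not obey this idealized model.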

III: Small: Optimization Techniques for Scalable Semantic Web Data Processing in the Cloud
Kemafor Ogan

$446,942 by National Science Foundation
09/ 1/2012 - 08/31/2015

Achieving scalable processing of the increasing amount of publicly-available Semantic Web data will hinge on parallelization. The Map-Reduce programming paradigm recently emerged as a de-facto parallel data processing standard and has demonstrated effectiveness with respect to structured and unstructured data. However, Semantic Web data presents challenges not adequately addressed by existing techniques due to its flexible, fine-grained data model and the need to reason beyond explicitly represented data. This project will investigate optimization techniques that address these unique challenges based on rethinking Semantic Web data processing on Map-Reduce platforms from the ground up, from query algebra to query execution.

LAS DO3 Task Order 2.9 KRM
Kemafor Ogan

$74,523 by LAS
03/31/2014 - 12/31/2014

TWC SBE: Medium: Collaborative: User-Centric Risk Communication and Control on Mobile Devices
Douglas Reeves

$267,096 by the National Science Foundation
09/ 1/2013 - 08/31/2016

Human-system interaction is an integral part of any system. Because the vast majority of ordinary users have limited technical knowledge and can easily be confused and/or worn out by repeated security notifications/questions, the quality of users' decisions tends to be very low. On the other hand, any system targeting end users must have the flexibility to accommodate a wide spectrum of different users, and therefore needs to get the full range of users involved in the decision-making loop. This dilemma between fallible human nature and inevitable human decision making is one main challenge to the goal of improving security. In this project, we aim at developing principles and mechanisms for usable risk communication and control. The major technical innovations include (1) multi-granularity risk communications; (2) relative risk information in the context of comparison with alternatives; (3) discovery and integration of risk information from multiple sources; and (4) expanded opportunities for risk communication and control.
03/ 5/2014 - 12/ 4/2014

Android smartphones have grown in market share and have penetrated all corners of the market, including the US Government and, in particular, DoD. The ecosystem of the Android app marketplace, while encouraging creativity, also has lax standards. Recent work by Aiken's group shows that it is possible to use static analysis techniques to identify vulnerabilities due to abuse of permissions afforded to the software app by the user, but with potential for false positives and an attendant necessity for manual analysis. In this preliminary investigation, we propose to investigate a runtime monitor that could be used in combination with static analysis to enforce strict permission policies. The particular research questions we will consider are:

- design of a language for expressing positive (shall) and negative (should not) permissions;
- algorithms for instrumenting application code that would be used to maintain invariants implied by the permission policies set by the user;
- algorithms for instrumenting application code to collect trace data that could be mined later for surreptitious violations of security policies, and algorithms for deleting applications automatically when policies are violated.

The three parts of the research proposal, when taken together, correspond to traditional law enforcement strategies: setting of the law, monitoring for compliance, and imposition of penalty when laws are broken. While the ultimate goal is to validate the proposed work in the context of the Android marketplace, the proposed preliminary investigation will be theoretical in nature.
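A toy sketch of how such shall/should-not permission policies might be checked against an app's requested permissions; the policy format, permission names, and function are hypothetical, invented here for illustration:

```python
# Hypothetical checker for "shall" / "should not" permission policies.
# The policy dictionary format and permission names are illustrative only.

def check_policy(requested, policy):
    """Return a list of (kind, permission) violations of the policy."""
    violations = []
    for perm in policy.get("shall", []):       # permissions the app must hold
        if perm not in requested:
            violations.append(("missing", perm))
    for perm in policy.get("should_not", []):  # permissions it must never hold
        if perm in requested:
            violations.append(("forbidden", perm))
    return violations

policy = {"shall": ["INTERNET"], "should_not": ["READ_CONTACTS"]}
print(check_policy({"INTERNET", "READ_CONTACTS"}, policy))
# -> [('forbidden', 'READ_CONTACTS')]
```

A runtime monitor of the kind proposed would evaluate checks like this against observed behavior rather than a static manifest, but the policy semantics (required vs. prohibited permissions) are the same.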

NeTS: Small: Investigation of Human Mobility: Measurement, Modeling,Analysis, Applications and Protocols
Injong Rhee

$298,356 by National Science Foundation
08/ 1/2010 - 07/31/2015

Simulating realistic mobility patterns of mobile devices is important for the performance study of mobile networks because deploying a real testbed of mobile networks is extremely difficult, and furthermore, even with such a testbed, constructing repeatable performance experiments using mobile devices is not trivial. Humans are a big factor in simulating mobile networks as most mobile nodes or devices (cell phones, PDAs and cars) are attached to or driven by humans. Emulating the realistic mobility patterns of humans can enhance the realism of simulation-based performance evaluation of human-driven mobile networks. Our NSF-funded research that ends this year has studied the patterns of human mobility using GPS traces of over 100 volunteers from five different sites including university campuses, New York City, Disney World, and the State Fair. This research has revealed many important fundamental statistical properties of human mobility, namely heavy-tail flight distributions, self-similar dispersion of visit points, and a least-action principle for trip planning. Most of all, it finds that people tend to optimize their trips in a way that minimizes their discomfort or cost of trips (e.g., distance). No existing mobility models explicitly represent all of these properties. Our results are very encouraging and the proposed research will extend the work well beyond what has been accomplished so far.

- We will perform a measurement study tracking the mobility of 100 or 200 students on a campus simultaneously, and analyze the mobility patterns associated with geo-physical and social contexts of participants including social networks, interactions, spatio-temporal correlations, and meetings.
- We will cast the problem of mobility modeling as an optimization problem, borrowing techniques from AI and robotics, which will make it easy to incorporate the statistical properties of mobility patterns commonly arising from group mobility traces. The realism of our models in expressing human mobility will surpass any existing human mobility models.
- We will develop new routing protocols leveraging the researched statistical properties found in real traces to optimize delivery performance.

The end products of the proposed research are (a) a new human mobility model that is capable of realistically expressing mobility patterns arising from reaction to social and geo-physical contexts, (b) their implementation in network simulators such as NS-2/3 and GloMoSim, (c) mobility traces that contain both trajectories of people in a university campus and contact times, and (d) new efficient routing protocols for mobile networks.

NetSE: Large: Collaborative Research: Platys: From Position to Place in Next Generation Networks
Injong Rhee ; Munindar Singh

$706,167 by National Science Foundation
09/ 1/2009 - 08/31/2015

This project develops a high-level notion of context that exploits the capabilities of next generation networks to enable applications that deliver better user experiences. In particular, it exploits mobile devices always with a user to capture key elements of context: the user's location and, through localization, characteristics of the user's environment.

RI: Small: Collaborative Research: Speeding Up Learning through Modeling the Pragmatics of Training
David Roberts

$156,203 by National Science Foundation
10/ 1/2013 - 09/30/2015

We propose to develop techniques that will enable humans to train computers efficiently and intuitively. In this proposed work, we draw inspiration from the ways that human trainers teach dogs complex behaviors to develop novel machine learning paradigms that will enable intelligent agents to learn from human trainers quickly, and in a way that humans can intuitively take advantage of. This research aims to return to the basics of programming: it seeks to develop novel methods that allow humans to tell computers what to do. More specifically, this research will develop learning techniques that explicitly model and leverage the implicit communication channel that humans use while training, a process akin to interpreting the pragmatic implicature of a natural language communication. We will develop algorithms that view the training process as an intentional communicative act, and can vastly outperform standard reward-seeking algorithms in terms of the speed and accuracy with which human trainers can generate desired behavior.

CPS: Synergy: Integrated Sensing and Control Algorithms for Computer-Assisted Training (Computer Assisted Training Systems (CATS) for Dogs)
David Roberts ; Alper Bozkurt ECE ; Barbara Sherman CVM

$999,103 by National Science Foundation
10/ 1/2013 - 09/30/2016

We propose to develop tools and techniques that will enable more effective two-way communication between dogs and handlers. We will work to create non-invasive physiological and inertial measuring devices that will transmit real-time information wirelessly to a computer. We will also develop technologies that will enable the computer to train desired behaviors using positive reinforcement without the direct input from humans. We will work to validate our approach using laboratory animals in the CVM as well as with a local assistance dog training organization working as a consultant.

NeTS:Small: Computationally Scalable Optical Network Design
George Rouskas

$429,995 by NSF
08/ 1/2011 - 07/31/2015

Optical networking forms the foundation of the global network infrastructure, hence the planning and design of optical networks is crucial to the operation and economics of the Internet and its ability to support critical and reliable communication services. With this research project we aim to make contributions that will lead to a quantum leap in the ability to solve optimally a range of optical design problems. In particular, we will develop compact formulations and solution approaches that can be applied efficiently to instances encountered in Internet-scale environments. Our goal is to lower the barrier to entry in fully exploring the solution space and in implementing and deploying innovative designs. The solutions we will develop are "future-proof" with respect to advances in DWDM transmission technology, as the size of the corresponding problem formulations is independent of the number of wavelengths.

Graduate Industrial Traineeship for Vedika Seth
George Rouskas

$52,547 by SAS Institute, Inc.
05/11/2014 - 05/31/2015

NCSU through the SAS GA will provide research and analysis to SAS as set forth in this Agreement. Such research and analysis shall include, but is not limited to, research, generation, testing, and documentation of operations research software. SAS GA will provide such services for SAS' offices in Cary, North Carolina, at such times as have been mutually agreed upon by the parties.

Graduate Industrial Traineeship for Ameeta Muralidharan
George Rouskas

$65,051 by SAS Institute, Inc.
01/27/2014 - 05/31/2015

NCSU through the SAS GA will provide research and analysis to SAS as set forth in this Agreement. Such research and analysis shall include, but is not limited to, research, generation, testing, and documentation of operations research software. SAS GA will provide such services for SAS' offices in Cary, North Carolina, at such times as have been mutually agreed upon by the parties.

Graduate Industrial Traineeship for Namita Shubhy
George Rouskas

$65,050 by SAS Institute, Inc.
01/27/2014 - 05/31/2015

NCSU through the SAS GA will provide research and analysis to SAS as set forth in this Agreement. Such research and analysis shall include, but is not limited to, research, generation, testing, and documentation of operations research software. SAS GA will provide such services for SAS' offices in Cary, North Carolina, at such times as have been mutually agreed upon by the parties.

Runtime System for I/O Staging in Support of In-Situ Processing of Extreme Scale Data
Nagiza Samatova

$286,140 by Oak Ridge National Laboratory/Dept. of Energy
01/31/2011 - 08/31/2015

Accelerating the rate of insight and scientific productivity demands new solutions to managing the avalanche of data expected at extreme scale. Our approach is to use tools that can reduce, analyze, and index the data while it is still in memory (referred to as "in-situ" processing of data). To deal with the large amount of data generated by the simulations, our team has partnered with many application teams to deliver proven technology that can accelerate their knowledge discovery process. These technologies include ADIOS, FastBit, and Parallel R. In this proposal we wish to integrate these technologies and create a runtime system that will allow scientists to build an easy-to-use scientific workflow system that runs in situ, on extra nodes of the system, not only to accelerate I/O speeds but also to pre-analyze, index, visualize, and reduce the overall amount of information from these simulations.

Joint Faculty Agreement for Nagiza Samatova
Nagiza Samatova

$507,294 by Oak Ridge National Laboratories - UT Battelle, LLC
08/ 9/2007 - 08/ 8/2015

Dr. Nagiza Samatova's joint appointment with NC State University and Oak Ridge National Laboratory (ORNL) will provide the interface between the two organizations, which aim to collaboratively address computational challenges in scientific data management, data-intensive computing for understanding complex biological systems, knowledge integration for the Shewanella Federation, and the large-scale analysis of biological networks with applications to bioenergy production.

Scalable and Power Efficient Data Analytics for Hybrid Exascale Systems
Nagiza Samatova

$364,944 by Oak Ridge National Laboratories/US Dept. of Energy
01/31/2011 - 12/31/2014

The specific objectives of the proposal are as follows: 1. Design and develop data mining kernels and algorithms for acceleration on hybrid architectures, which include many-core systems, GPUs, and other accelerators. 2. Design and develop approximate scalable algorithms for data mining and analysis kernels enabling faster exploration, more efficient resource usage, reduced memory footprint, and more power-efficient computations. 3. Design and develop index-based data analysis and mining kernels and algorithms for performance and power optimizations, including index-based perturbation analysis kernels for noisy and uncertain data. 4. Demonstrate the results of our project by enabling analytics at scale for selected applications on large-scale HPC systems.

Analytics-driven Efficient Indexing and Query Processing of Extreme Scale AMR Data
Nagiza Samatova

$149,999 by National Science Foundation
05/ 1/2012 - 12/31/2014

One of the most significant advances for large-scale scientific simulations has been the advent of Adaptive Mesh Refinement (AMR). By using dynamic gridding, AMR can achieve substantial savings in memory, computation, and disk resources while maintaining or even increasing simulation accuracy, relative to static, uniform gridding. However, the resulting non-uniform structure of the simulation mesh produced by AMR methods causes inefficient post-simulation access patterns during AMR data analytics, which is becoming a substantial bottleneck given the exponential increase in simulation output. Toward bridging this gap in efficient analytics support for AMR data, we propose an integrated, three-pronged approach that aims: (a) to devise an AMR query model; (b) to explore effective indexing methods for AMR data analytics; and (c) to investigate data storage layout strategies for AMR data retrieval optimized for analytics-induced heterogeneous data access patterns.

Nagiza Samatova

$49,585 by Laboratory for Analytic Sciences
09/13/2013 - 12/31/2014

DO 2 Task 3.7 activities

Scalable Data Management, Analysis, and Visualization (SDAV) Institute
Nagiza Samatova ; Anatoli Melechko

$750,000 by US Department of Energy
02/15/2012 - 02/14/2017

The SDAV is a unique and comprehensive combination of scientific data management, analysis, and visualization expertise and technologies aimed at enabling scientific knowledge discovery for applications running on state-of-the-art computational platforms located at DOE's primary computing facilities. This integrated institute focuses on tackling key challenges facing applications in our three focus areas through a well-coordinated team and management organization that can respond to changing technical and programmatic objectives. The proposed work portfolio is a blend of applied research and development aimed at having key software services operate effectively on large distributed-memory multi-core and many-core platforms, especially DOE's open high-performance computing facilities. Our goal is to create an integrated, open-source, sustainable framework and software tools for the science community.

Scalable Statistical Computing For Physical Science Applications
Nagiza Samatova ; Anatoli Melechko

$354,646 by US Department of Energy (DOE)
12/ 2/2011 - 06/30/2015

Physical science applications such as nanoscience, fusion science, climate, and biology generate large-scale data sets from their simulations and high-throughput technologies. This necessitates scalable technologies for processing and analyzing this data. We plan to research and develop advanced data mining algorithms for knowledge discovery from this complex, high-dimensional, and noisy data. We will apply these technologies to DOE-mission scientific applications related to fusion energy, bioenergy, understanding the impacts of climate extremes, and insider threat detection and mitigation.

Ultrascale Computational Modeling of Phenotype-Specific Metabolic Processes in Microbial Communities
Nagiza Samatova ; Anatoli Melechko

$454,311 by Oak Ridge National Laboratories - UT Battelle (DOE)
01/15/2010 - 12/31/2014

Ultrascale computational modeling methods will be developed for revealing phenotype-specific metabolic processes and their cross-talks and applied to the critical DOE problem of acid mine drainage (AMD). The apex of the project will be a systematic and iterative computational procedure for: (1) identification and expression-level characterization of phenotype-related genes; (2) reconstruction of phenotype-specific metabolic pathways enriched by these genes; (3) elucidation of the symbiotic and/or competing interplay between these pathways across species; and (4) characterization of evolutionary and environmental adaptation of the community.

Collaborative Research: Understanding Climate Change: A Data Driven Approach
Nagiza Samatova ; Frederick Semazzi

$1,815,739 by National Science Foundation
09/ 1/2010 - 08/31/2015

The goal is to provide a computational capability for effective and efficient exploration of high-resolution climate networks derived from multivariate, uncertain, noisy, and spatio-temporal climate data. We plan to increase the efficiency and climatological relevance of network pattern identification through integrated research activities focused on: (a) supporting comparative analysis of multiple climate networks; (b) constraining the search space by exploiting the inherent structure (e.g., multi-partite) of climate networks; (c) establishing the foundation to efficiently update solutions for perturbed (changing) graphs; and (d) designing and implementing parallel algorithms scalable to thousands of processors on multi-node, multi-core supercomputer architectures.

LAS DO3 Task Order 2.6 Future States
Nagiza Samatova

$50,320 by LAS
03/31/2014 - 12/31/2014

DO3 Task Order 2.6 Future States

Lecture Hall Polytopes, Inversion Sequences, and Eulerian Polynomials
Carla Savage

$30,000 by Simons Foundation
09/ 1/2012 - 08/31/2017

Over the past ten years, lecture hall partitions have emerged as fundamental structures in combinatorics and number theory, leading to new generalizations and new interpretations of several classical theorems. This project takes a geometric view of lecture hall partitions and uses polyhedral geometry to investigate their remarkable properties.

Policy-Based Governance for the OOI Cyberinfrastructure
Munindar Singh

$124,688 by Univ of Calif-San Diego/NSF
09/ 1/2009 - 02/25/2015

This project will develop policy-based service governance modules for the Ocean Observatories Initiative (OOI) Cyberinfrastructure. The main objectives of the proposed project include (1) formulating the key conceptual model underlying the patterns of governance; (2) formalizing "best practices" patterns familiar to the scientific community and seeding the cyberinfrastructure with them; and (3) understanding user requirements for tools that support creating and editing patterns of governance.

DO 2 Task 3.7 - Singh
Munindar Singh

$43,889 by Laboratory for Analytic Sciences
09/13/2013 - 12/31/2014

DO 2 Task 3.7 activities

Student Support for Participation in the Symposium and Bootcamp on the Science of Security (HotSoS)
Munindar Singh

$5,000 by National Science Foundation (NSF)
01/ 1/2014 - 12/31/2014

This project will support travel by US student researchers to the Symposium and Bootcamp on the Science of Security (HotSoS), which will be held in April 2014 in Raleigh, North Carolina. Travel support will be critical in encouraging participation, which is especially important since HotSoS 2014 will be one of the first peer-reviewed events on the Science of Security.

LAS DO3 Task Order 2.11 Mission Enabling - Singh
Munindar Singh

$240,158 by Laboratory for Analytic Sciences
03/31/2014 - 12/31/2014

DO3 Task Order 2.11 Mission Enabling

LAS DO3 Task Order 2.8 Analytic Workflow
Munindar Singh

$61,570 by LAS
03/31/2014 - 12/31/2014

DO3 Task Order 2.8 Analytic Workflow

LAS DO3 Task Order 2.9 KRM
Munindar Singh

$80,116 by LAS
03/31/2014 - 12/31/2014

DO3 Task Order 2.9 KRM

Quality of Information-Aware Networks for Tactical Applications (QUANTA)
Munindar Singh

$669,028 by Penn State University (Army Research Laboratory)
09/28/2009 - 09/29/2014

This project will develop a computational approach to trust geared toward enhancing the quality of information in tactical networks. In particular, this project will develop a trust model that takes into account various objective and subjective qualities of service as well as the social relationships among the parties in a network who originate, propagate, or consume information. The proposed approach will build an ontology for quality of information and its constituent qualities, and will expand existing probabilistic techniques to multivalued settings. The project will develop a prototype software module that realizes the techniques for producing trust assessments regarding the information exchanged.

DO 2 Task 3.8 - St. Amant
Robert St. Amant

$49,493 by Laboratory for Analytic Sciences
09/13/2013 - 12/31/2014

DO 2 Task 3.8 activities

LAS DO3 Task Order 2.5 Cognitive Processing
Robert St. Amant

$90,860 by LAS
03/31/2014 - 12/31/2014

DO3 Task Order 2.5 Cognitive Processing

EAGER: Cognitive Modeling of Strategies for Dealing with Errors in Mobile Touch Interfaces
Robert St. Amant ; Emerson Murphy-Hill

$281,076 by National Science Foundation
09/ 1/2014 - 08/31/2016

Touch interfaces on mobile phones and tablets are notoriously error prone in use. One plausible reason for slow progress in improving usability is that research and design efforts in HCI take a relatively narrow focus on isolating and eliminating human error. We take a different perspective: failure represents breakdowns in adaptations directed at coping with complexity. The key to improved usability is understanding the factors that contribute to both expertise and its breakdown. We propose to develop cognitive models of strategies for touch interaction. Our research will examine the detailed interactions among users' perceptual, cognitive, and motor processes in recognizing, recovering from, and avoiding errors in touch interfaces. Our proposal is for three stages of research: exploratory experiments, analysis and modeling, and finally validation experiments.

DO 2 Task 3.2 - Streck
John Streck

$935,302 by NSA
09/13/2013 - 09/30/2014

LAS DO3 Task Order 2.2 Streck - Infrastructure
John Streck ; John Bass

$277,915 by Laboratory for Analytic Sciences
03/31/2014 - 12/31/2014

DO3 Task Order 2.2 Infrastructure

NCDS Data Science Faculty Fellow - Tracking Community Evolution in Dynamic Graph Data Using Tree-Like Structure
Vida Sullivan

$30,000 by National Consortium for Data Science (UNC-Chapel Hill)
01/ 1/2014 - 12/31/2014

"Big Data" sources for many real-world applications pose numerous challenges to understanding the complex and possibly hidden relationships between components of a complex network. Furthermore, these networks often consist of heterogeneous entities and types of relationships, and many existing algorithms for computing network features and similarity are not directly applicable. In order to draw actionable insights, analysts need to identify events of interest, place them in context, and understand their impact. Existing approaches, which emphasize visualization at the expense of analytics, struggle to coherently present networks with hundreds of entities, whereas practical applications require hundreds of thousands (or more). We propose to develop methods that integrate ideas from graph theory with multi-scale modeling (since events of interest may occur at different levels of granularity/contexts within the data) to improve comprehension of such relational data and form a foundation for novel methods of visualization and interaction.

Triangle Computer Science Distinguished Lecture Series

$6,700 by Duke University (US Army - Army Research Office)
01/ 1/2014 - 12/31/2014

Since 1995, the Triangle Computer Science Distinguished Lecturer Series (TCSDLS) has been hosting influential university researchers and industry leaders from computer-related fields as speakers at the three universities within the Research Triangle area. The lecturer series, sponsored by the Army Research Office (ARO), is organized and administered by the Computer Science departments at Duke University, NC State University, and the University of North Carolina at Chapel Hill. This proposal argues for continuation, for an additional three years, of this highly successful lecturer series, which is led by Duke University.

Collaborative Research: CPATH II: Incorporating Communication Outcomes into the Computer Science Curriculum
Mladen Vouk ; Michael Carter (co-PI)

$369,881 by National Science Foundation
10/ 1/2009 - 03/31/2015

In partnership with industry and faculty from across the country, this project will develop a transformative approach to developing the communication abilities (writing, speaking, teaming, and reading) of Computer Science and Software Engineering students. We will integrate communication instruction and activities throughout the curriculum in ways that enhance rather than replace the learning of technical content and that support the development of students' computational thinking abilities. We will implement the approach at two institutions. By creating concepts and resources that can be adapted by all CS and SE programs, this project also has the potential to increase higher education's ability nationwide to meet industry's need for CS and SE graduates with much better communication abilities than, on average, is the case today. In addition, by using the concepts and resources developed in this project, CS and SE programs will be able to increase their graduates' mastery of technical content and computational thinking.

Investigation of a Novel Indoor Localization (Navigation) Technique For Smartphones

$75,000 by Samsung Telecommunications America, LLC - TX
01/ 2/2012 - 12/31/2014

In this project, we aim to develop a new indoor localization technique relying on low-frequency radio, whose long-wave characteristics allow it to penetrate indoor obstacles (or bypass them by diffraction along the shortest path). A smartphone running this system would be able to identify its position by measuring straight-line distances from a few radio transmission towers deployed at city scale (or district scale). Straight-line distances unaffected by indoor obstacles would provide a three-dimensional position, including floor information and position within the floor (e.g., store information in a shopping complex).

CSR: Small: Collaborative Research: Enabling Cost-effective Cloud HPC
Mladen Vouk ; Xiaosong Ma

$311,998 by National Science Foundation
10/ 1/2013 - 09/30/2016

The proposed work examines novel services built on top of public cloud infrastructure to enable cost-effective high-performance computing. We will explore the on-demand, elastic, and configurable features of cloud computing to complement the traditional supercomputer/cluster platforms. If successful, this research will result in tools that adaptively aggregate, configure, and re-configure cloud resources for different HPC needs, with the purpose of offering low-cost R&D environments for scalable parallel applications.

TWC: Frontier: Collaborative: Rethinking Security in the Era of Cloud Computing

$749,996 by National Science Foundation
09/ 1/2013 - 08/31/2018

Increased use of cloud computing services is becoming a reality in today's IT management. The security risks of this move are active research topics, yielding cautionary examples of attacks enabled by the co-location of competing tenants. In this project, we propose to mitigate such risks through a new approach to cloud architecture defined by leveraging cloud providers as trusted (but auditable) security enablers. We will exploit cooperation between cloud providers and tenants in preventing attacks as a means to tackle long-standing open security problems, including protection of tenants against outsider attacks, improved intrusion detection and security diagnosis, and security-monitoring inlays.

DO 2 Task 3.5
Laurie Williams

$62,901 by LAS
09/13/2013 - 12/31/2014

Differential Analysis on Changes in Medical Device Software
Laurie Williams

$60,000 by NSF
10/ 1/2012 - 09/30/2014

As medical device technology evolves, so too does the software upon which the technology often relies. Changes in device software, after it has been approved or cleared, may compromise the safety of that device. Assessing the safety of such changes presents special challenges to regulators at the FDA. This project explores differential analysis techniques to assess the effects of software changes on device safety.

EDU: Motivating and Reaching Students and Professionals with Software Security Education
Laurie Williams ; Emerson Murphy-Hill ; Kevin Oliver (Education)

$300,000 by National Science Foundation
09/ 1/2013 - 08/31/2015

According to a 2010 report based on interviews with 2,800 information technology professionals worldwide, the gap between hacker threats and suitable security defenses is widening, and the types and numbers of threats are changing faster than ever before. In 2010, Jim Gosler, a fellow at the Sandia National Laboratory who works on countering attacks on U.S. networks, claimed that there are approximately 1,000 people in the country with the skills needed for cyber defense, and that 20 to 30 times that many are needed. Additionally, the Chief Executive Officer (CEO) of the Mykonos Software security firm indicated that today's graduates in software engineering are unprepared to enter the workforce because they lack a solid understanding of how to make their applications secure. Due in particular to this shortage of security expertise, education of students and of professionals already in the workforce is paramount. In this grant we provide a plan for motivating and providing software security education to students and professionals.

NSA / North Carolina State University Science of Security Lablet: Analytics Supporting Security Science
Laurie Williams ; Michael Rappa

$2,475,248 by National Security Agency via US Army Research Office
10/ 1/2011 - 11/30/2014

North Carolina State University (NCSU), led by the Department of Computer Science and the Institute for Advanced Analytics in conjunction with the Institute for Next Generation IT Systems (ITng), will create and manage a Science of Security Lablet (SOSL) research organization on behalf of the National Security Agency (NSA). The SOSL research organization will conduct and coordinate basic science research projects focused on the Science of Security. SOSL will coordinate with related Lablet activities sponsored by NSA at Carnegie Mellon University and at the University of Illinois at Urbana-Champaign (UIUC). SOSL will also coordinate with the Security Science Virtual Organization at Vanderbilt University. The coordination will be in the form of workshops and technical exchanges.

Science of Security
Laurie Williams ; Michael Rappa (joint coll)

$2,117,740 by US Army - Army Research Office (National Security Agency)
06/25/2013 - 06/24/2015

Critical cyber systems must inspire trust and confidence, protect the privacy and integrity of data resources, and perform reliably. Therefore, a more scientific basis for the design and analysis of trusted systems is needed. In this proposal, we aim to progress the Science of Security. The Science of Security entails the development of a body of knowledge containing laws, axioms and provable theories relating to some aspect of system security. Security science should give us an understanding of the limits of what is possible in some security domain, by providing objective and quantifiable descriptions of security properties and behaviors. The notions embodied in security science should have broad applicability - transcending specific systems, attacks, and defensive mechanisms. A major goal is the creation of a unified body of knowledge that can serve as the basis of a trust engineering discipline, curriculum, and rigorous design methodologies. As such, we provide eight hard problems in the science of security. We also present representative projects which we feel will make progress in the discipline of the science of security.

Growing The Science Of Security Through Analytics
Laurie Williams ; Munindar Singh

$2,255,237 by NSA (US Dept of Defense)
03/28/2014 - 03/27/2015

Since August 2011, North Carolina State University's (NCSU) analytics-focused Science of Security Lablet (SOSL) has embraced and helped build a foundation for the NSA vision of the Science of Security (SoS) and a SoS community. Jointly with other SOSLs, we formulated five SoS hard problems, which lie at the core of the BAA. At NCSU, data-driven discovery and analytics have been used to formulate, validate, evolve, and solidify security models and theories as well as the practice of cyber-security. We propose to (1) investigate solutions to five cross-dependent hard problems, building on our extensive experience and research, including in the current SOSL; (2) advance our SoS community development activities; and (3) enhance our evaluation efforts regarding progress on the hard problems by bringing in experts on science evaluation.

HCC: Small: Collaborative Research: Integrating Cognitive and Computational Models of Narrative for Cinematic Generation
R. Michael Young

$352,696 by the National Science Foundation
08/ 1/2013 - 07/31/2016

Virtual cinematography, the use of a virtual camera within a three-dimensional virtual world, is increasingly used to communicate in both entertainment and serious contexts (e.g., training, education, news reporting). While many aspects of the use of virtual cameras are currently automated, control of the camera is determined either by a pre-programmed script or by a human operator at the time of filming. This project seeks to develop a model of virtual cinematography that is both computational -- providing a software system capable of generating camera control directives automatically -- and cognitive -- capable of modeling a viewer's understanding of an unfolding cinematic.

ALICE: A Model for Sustaining Technology-Rich Adaptive Learning Spaces and Interactive Content Environments
R. Michael Young

$285,321 by Institute of Museum & Library Services
11/ 1/2013 - 10/31/2014

This proposal would fund the research and design phase of building the ALICE engine. The project will focus on creating a conceptual model for how to build an adaptive learning space; an architecture for the Artificial Intelligence (AI) engine and technology core; a series of "proof of concept" functional prototypes for collecting data and creating content; and a continuous assessment model for measuring the success of the AI in the quality of its engagement with the community and the success of the engine at enhancing a given technology-rich research and learning space.

DO 2 Task 3.4 - Young and Roberts
R. Michael Young

$1,017,700 by LAS/NSA
09/13/2013 - 09/30/2014