Current Research Projects (by faculty)

The funded projects listed below are currently active; a running total of their funding appears in the left navigation menu.


Collaborative Research: Integrating Computing in STEM: Designing, Developing and Investigating a Team-based Professional Development Model for Middle- and High-school Teachers
Tiffany Barnes

$861,773 by National Science Foundation (NSF)
09/ 1/2017 - 08/31/2020

Integrated Snap! is a comparison study between the traditional single-teacher professional development (PD) model and a community-of-practice PD model in which teachers attend as a group. The participating CTE/CS, math, engineering, and science teachers will foster the integration of the Computer Science Principles curriculum with science curricula in high school classrooms. The professional development will follow Borko’s (2004) phases. We will develop a PD model centered on helping content teachers learn best practices for integrating computing in their classrooms; that PD will be piloted, replicated, and then modified for other sites and contexts. Teachers will work together to build simulation and programming tools, and corresponding classroom activities, for exploring computational concepts in the context of their disciplines. An integral part of Integrated Snap! is to have teams of teachers from the same school or district attend the PD together, building a community among the teachers and their students doing similar work across their classrooms.

REU Site: Socially Relevant Computing and Analytics
Tiffany Barnes

$359,997 by National Science Foundation
03/ 1/2017 - 02/29/2020

The REU Site at NC State University will immerse a diverse group of undergraduates in a vibrant research community of faculty and graduate students building and analyzing cutting-edge human-centric applications including games, tutors, and mobile apps. We will recruit students from underrepresented groups and colleges and universities with limited research opportunities through the STARS Computing Corps, an NSF-funded national consortium of institutions dedicated to broadening participation in computing. Using the Affinity Research Groups and STARS Training for REUs models, we will engage faculty and graduate student mentors with undergraduates to create a supportive culture of collaboration while promoting individual contributions to research through just-in-time training for both mentors and students throughout the summer.

Track 2: CS10K: BJC-STARS: Scaling CS Principles through STARS Community & Leadership Development
Tiffany Barnes

$500,000 by National Science Foundation
10/ 1/2015 - 09/30/2018

BJC-STARS is a CS10K proposal to broaden access to Computer Science education through engaging colleges and universities to prepare and support regional communities of high school teachers to teach the Beauty and Joy of Computing (BJC) Computer Science Principles course. We will leverage the successful STARS model focusing on engaging faculty and students in a community committed to leading efforts to broaden participation in computing. Each year, we will engage new university faculty who will teach BJC and facilitate professional development and support to high school teachers and students. We will also build a STARS community among participating high school teachers and students, engaging them in the need to broaden participation in computing.

Type I: Collaborative Research: FRABJOUS CS - Framing a Rigorous Approach to Beauty and Joy for Outreach to Underrepresented Students in Computing at Scale
Tiffany Barnes

$352,831 by National Science Foundation
02/ 1/2013 - 08/31/2018

In this FRABJOUS CS project, we will prepare 60 high school teachers to teach the Beauty and Joy of Computing (BJC) Computer Science Principles curriculum. The BJC course is a rigorous introductory computing course that highlights the meaning and applications of computing while introducing low-threshold programming languages such as Snap!/Scratch, GameMaker, and AppInventor. BJC is informed and inspired by the Exploring Computer Science curriculum, which was explicitly designed to channel the interests of urban high school students with "culturally relevant and meaningful curriculum" [Goode 2011][Margolis 2008]. The BJC course uses collaborative classroom methods, including pair learning, and student-selected projects geared toward leveraging students' knowledge of social media, games, devices, and the internet. At UNC Charlotte in 2010 and 2011, PI Barnes engaged college students in supporting the BJC course, and in after-school outreach and summer camps that excite middle and high school students about this curriculum at different levels. The project engages three university faculty members and six college students to help the high school teachers build a Computer Science Teachers Association chapter and provide ongoing professional development and support for the BJC course. The project also engages high school teachers and an education researcher to help refine and enrich the BJC curriculum to be easier to adopt and teach in high schools.

Type I: Collaborative Research: FRABJOUS CS - Framing a Rigorous Approach to Beauty and Joy for Outreach to Underrepresented Students in Computing at Scale (Supplement)
Tiffany Barnes

$86,000 by NSF
02/ 1/2013 - 08/31/2018

In this FRABJOUS CS project, we will prepare 60 high school teachers to teach the Beauty and Joy of Computing (BJC) Computer Science Principles curriculum. The BJC course is a rigorous introductory computing course that highlights the meaning and applications of computing while introducing low-threshold programming languages such as Snap!/Scratch, GameMaker, and AppInventor. BJC is informed and inspired by the Exploring Computer Science curriculum, which was explicitly designed to channel the interests of urban high school students with "culturally relevant and meaningful curriculum" [Goode 2011][Margolis 2008]. The BJC course uses collaborative classroom methods, including pair learning, and student-selected projects geared toward leveraging students' knowledge of social media, games, devices, and the internet. At UNC Charlotte in 2010 and 2011, PI Barnes engaged college students in supporting the BJC course, and in after-school outreach and summer camps that excite middle and high school students about this curriculum at different levels. The project engages three university faculty members and six college students to help the high school teachers build a Computer Science Teachers Association chapter and provide ongoing professional development and support for the BJC course. The project also engages high school teachers and an education researcher to help refine and enrich the BJC curriculum to be easier to adopt and teach in high schools.

Collaborative Research: Modeling Social Interaction and Performance in STEM Learning
Tiffany Barnes

$200,003 by National Science Foundation
09/ 1/2014 - 08/31/2018

Despite long-standing awareness that social interaction is an integral part of knowledge construction, efforts to study complex collaborative learning have traditionally been relegated to qualitative and small-scale methodologies. Relatively new data traces left by online learning environments, including massive open online courses (MOOCs), offer the first real hope for scaling up such analyses. The purpose of the proposed research is to develop comprehensive models for collaborative learning which in turn will enable instructional design and the authentic assessment of the individual within the group context. This task is undertaken by an interdisciplinary team of researchers with specializations in natural language processing, discourse analysis, social network analysis, educational data mining and psychometrics.

Type I: Collaborative Research: FRABJOUS CS - Framing a Rigorous Approach to Beauty and Joy for Outreach to Underrepresented Students in Computing at Scale (Supplement)
Tiffany Barnes

$565,874 by National Science Foundation
09/ 1/2016 - 08/31/2018

In recent decades, coding has evolved from a professional activity of a few million developers to a nearly universally needed skill. However, there are still fewer than 2,000 teachers prepared to teach computer science to high school students. In this supplement we propose to engage 240 teachers in NC districts, and an additional 1,000 online, in BJC professional development.

BPC-AE: Scaling the STARS Alliance: A National Community for Broadening Participation Through Regional Partnerships
Tiffany Barnes

$150,000 by UNC Charlotte (via NSF)
01/ 1/2013 - 03/21/2018

The Beauty and Joy of Computing project presents a unique opportunity to scale the STARS Alliance further while also enhancing national efforts to engage more high school teachers and students in teaching and learning computing and building stronger university/college/K-12 partnerships. Through this supplement, we will extend the Alliance with at least three new STARS Computing Corps, providing leadership training to a group of 8-10 students in each Corps, all focused on supporting the BJC effort. New Corps will provide teaching assistance to high school teachers implementing the BJC course through classroom visits and monthly Computer Science Teachers Association chapter meetings. These new STARS Computing Corps will also teach BJC material through middle school Citizen Schools after-school programs and K-12 summer camps. This will provide a vibrant community of support for high school teachers and students engaging with the new BJC course.

EXP: Data-driven Support for Novice Programmers
Tiffany Barnes ; Min Chi

$549,874 by National Science Foundation
09/ 1/2016 - 08/31/2019

While intelligent tutors have been shown to increase student learning in programming and other domains, and creative, exploratory programming environments are assumed to promote novice interest and motivation to learn to program, there are no environments that provide both creative tasks and intelligent support. We propose to extend our methods for data-driven hint generation, model tracing, and knowledge tracing to augment Snap and Java programming environments to be more supportive for novice programmers doing open-ended creative tasks.
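
As a rough sketch of the data-driven hint idea described above (the log format and state names are hypothetical; real systems operate over rich representations of student programs), hints can be derived from the transitions most often taken by students who reached a correct solution:

```python
from collections import defaultdict

# Hypothetical logs: each trace is the sequence of program states one
# student passed through, ending (for successful students) in "solved".
traces = [
    ["start", "loop", "loop+cond", "solved"],
    ["start", "loop", "solved"],
    ["start", "cond", "loop+cond", "solved"],
]

# Count transitions, keeping only those on paths that reached a solution.
next_counts = defaultdict(lambda: defaultdict(int))
for trace in traces:
    if trace[-1] != "solved":
        continue
    for cur, nxt in zip(trace, trace[1:]):
        next_counts[cur][nxt] += 1

def hint(state):
    """Suggest the successor state most often taken by successful students."""
    if state not in next_counts:
        return None  # no data for this state: fall back to another hint policy
    return max(next_counts[state], key=next_counts[state].get)

print(hint("start"))  # -> "loop"
```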

Scaling a Rigorous CS Principles Curriculum: A Supplement to Beauty and Joy of Computing in New York City STEM-C MSP
Tiffany Barnes ; Glenn Kleiman

$568,967 by Education Development Center via National Science Foundation
01/ 1/2017 - 12/31/2018

In recent decades, coding has evolved from a professional activity of a few million developers to a nearly universally needed skill. However, there are still fewer than 2,000 teachers prepared to teach computer science to high school students. In this supplement we propose to engage 162 teachers in BJC professional development and Train-the-Trainer workshops.

GRIP: Computer Science for All K-12 Students
Tiffany Barnes, Co-PI ; James Lester, Co-PI ; Glenn Kleiman, PI (Friday Institute) ; Eric Wiebe, Co-PI (Friday Institute)

$547,187 by Game-Changing Research Incentive Program (GRIP)
02/ 1/2017 - 01/31/2018

This project addresses the critical national need to “Empower all American students from kindergarten through high school to learn computer science and be equipped with the computational thinking skills they need to be creators in the digital economy, not just consumers, and to be active citizens in our technology-driven world” (President Obama, 2016 State of the Union Address). We take a systemic approach in which we will:
- Conduct research to determine how students develop the key concepts and processes of computer science and the effectiveness of alternative teaching approaches.
- Design learning resources that guide and support the teaching and learning of computer science, using emerging technologies such as artificial intelligence-driven tutors and interactive, game-like virtual learning environments.
- Influence the practice of teaching computer science in K-12 schools by providing both professional learning and mentoring opportunities for teachers.
- Inform policy decisions at the state, district, and school levels by providing information for principals, district leaders, local school board members, and state policymakers to inform their decisions about adding computer science to the K-12 curriculum.

Evaluation For Actionable Change: A Data-Driven Approach
Tiffany Barnes, Co-PI ; Collin Lynch, Co-PI

$799,837 by National Science Foundation
01/ 1/2016 - 12/31/2018

Randomized controlled trials (RCTs) are expensive and often show small effects. Even RCTs of widely adopted digital learning platforms can show disappointing results, and these results have little impact on subsequent adoptions of programs already entrenched in the educational landscape. New methods are needed both to estimate effects and to indicate ways of improving outcomes for already-adopted digital learning tools. With platforms currently in wide-scale use, novel approaches to assessing use patterns and their relations with outcomes can both evaluate maximal effectiveness and provide means for improved effectiveness. Our study will use a data-driven approach to identify patterns of both student use and teacher implementation in the widely adopted software Spatial Temporal Mathematics (ST Math). By linking these patterns with important learning and motivational outcomes, we can form recommendations regarding promising actions teachers and administrators can take in implementing ST Math, and refinements program developers can make in guiding students toward successful patterns. This work has the potential not only to transform use and success of the platform studied, but to create methods that can be refined and transferred to the study and implementation of other platforms.
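
As a sketch of the pattern-finding step (the features, values, and outcomes below are invented; the actual study draws on ST Math logs and validated outcome measures), per-student usage profiles can be clustered and each cluster linked to the outcomes it co-occurs with:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-student usage features extracted from software logs:
# [minutes per week, puzzles attempted, fraction of puzzles replayed]
usage = np.array([
    [30, 12, 0.1],
    [95, 40, 0.4],
    [25, 10, 0.0],
    [110, 55, 0.5],
])
outcomes = np.array([0.42, 0.71, 0.39, 0.80])  # e.g., normalized post-test score

# Cluster students by how they used the software...
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(usage)

# ...then link each usage pattern to the learning outcomes it co-occurs with.
for k in range(2):
    print(f"pattern {k}: mean outcome {outcomes[labels == k].mean():.2f}")
```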

CAREER: Improving Adaptive Decision Making in Interactive Learning Environments
Min Chi

$547,810 by National Science Foundation
03/ 1/2017 - 02/28/2022

For many forms of interactive environments, the system's behaviors can be viewed as a sequential decision process wherein, at each discrete step, the system is responsible for selecting the next action to take from a set of alternatives. The objective of this CAREER proposal is to learn robust interaction strategies that will lead to desirable outcomes in complex interactive environments. The central idea of this project is that strategies should not only be effective in complex interactive environments but they should also be efficient, focusing solely on the key features of the domain and the crucial decision points. These are the features and decisions that are not only associated with desirable outcomes, but without which the desirable outcomes are unlikely to occur.
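
A minimal sketch of this framing (states, actions, and policy entries are hypothetical): the induced strategy stores decisions only for the crucial decision points and falls back to an inexpensive default everywhere else:

```python
# Hypothetical framing of an interactive system as a sequential decision
# process: at each step the system picks one action from a set of alternatives.
ACTIONS = ["give_hint", "elicit_step", "show_example"]

# A learned strategy only needs entries for the *crucial* decision points;
# everywhere else a cheap default suffices (the efficiency idea above).
crucial_policy = {
    ("struggling", "new_topic"): "show_example",
    ("struggling", "seen_topic"): "give_hint",
}

def select_action(state):
    """Return the policy's action at key states, else a default action."""
    return crucial_policy.get(state, "elicit_step")

trajectory = [("ok", "new_topic"), ("struggling", "new_topic")]
print([select_action(s) for s in trajectory])  # default, then policy action
```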

DIP: Integrated Data-driven Technologies for Individualized Instruction in STEM Learning Environments
Min Chi ; Tiffany Barnes

$1,999,438 by National Science Foundation
08/15/2017 - 07/30/2022

In this proposed work, our goal is to automatically design effective personalized ITSs directly from log data. We will combine Co-PI Dr. Barnes' data-driven approach to learning what to teach with PI Dr. Chi's data-driven work on learning how to teach. More specifically, we will explore three important undergraduate STEM domains: discrete math, probability, and programming.

Educational Data Mining for Individualized Instruction in STEM Learning Environments
Min Chi ; Tiffany Barnes

$639,401 by National Science Foundation
09/ 1/2014 - 08/31/2018

Human one-on-one tutoring is one of the most effective educational interventions. Tutored students often perform significantly better than students in classroom settings (Bloom, 1984; Cohen, Kulik, & Kulik, 1982). Computer learning environments that mimic aspects of human tutors have also been highly successful. Intelligent Tutoring Systems (ITSs) have been shown to be highly effective in improving students' learning in real classrooms (Anderson, Corbett, Koedinger, & Pelletier, 1995; Koedinger, Anderson, Hadley, & Mark, 1997; VanLehn et al., 2005). The development of ITSs has enabled schools and universities to reach out and educate students who otherwise would be unable to take advantage of one-on-one tutoring due to cost and time constraints (Koedinger, Anderson, Hadley, & Mark, 1997). Despite the high payoffs provided by ITSs, significant barriers remain: high development costs and the challenges of knowledge engineering have prevented their widespread deployment. A diverse team of software developers, domain experts, and educational theorists is required for development, testing, and even maintenance; generally speaking, it requires an average of 80 man-hours per hour of tutoring content. In this proposed work, our goal is to automatically design effective personalized ITSs directly from log data. We will combine Co-PI Dr. Barnes' data-driven approach to learning what to teach with PI Dr. Chi's data-driven work on learning how to teach. More specifically, we will explore two important undergraduate STEM domains, discrete math and probability, and employ two types of ITSs: an example-based ITS, the discrete math tutor, and a rule-based ITS, Pyrenees. The former can automatically generate hints directly from students' prior solutions, while the latter has hard-coded domain rules and teaches students a domain-general problem-solving strategy within the context of probability. For learning how to teach, we will apply reinforcement learning to induce adaptive pedagogical strategies directly from students' log files, focusing on three levels of pedagogical decisions: 1) whether or not to give students hints, and what level of hint (step level); 2) whether to show students a worked example or ask them to engage in problem solving (problem level); and 3) whether or not to teach students a meta-cognitive problem-solving strategy (metacognitive level).
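
As an illustration of the "learning how to teach" step (the logged tuples, state abstraction, and rewards are invented for this sketch), a tabular Q-learning pass over logged transitions can induce a problem-level policy choosing between a worked example and problem solving:

```python
from collections import defaultdict

# Hypothetical logged tuples: (state, action, reward, next_state), where the
# state abstracts a student's situation and reward reflects learning gain.
log = [
    (("low_prior",), "worked_example", 0.0, ("medium",)),
    (("medium",), "problem_solving", 1.0, ("high",)),
    (("low_prior",), "problem_solving", 0.0, ("low_prior",)),
]

ACTIONS = ["worked_example", "problem_solving"]
Q = defaultdict(float)
alpha, gamma = 0.1, 0.9

# Offline Q-learning over the logged transitions (several passes).
for _ in range(100):
    for s, a, r, s2 in log:
        best_next = max(Q[(s2, a2)] for a2 in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])

def pedagogical_decision(state):
    """Problem-level decision: show a worked example or elicit problem solving."""
    return max(ACTIONS, key=lambda a: Q[(state, a)])

print(pedagogical_decision(("low_prior",)))  # -> "worked_example"
```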

Using Real-Time Multichannel Self-Regulated Learning Data to Enhance Student Learning and Teachers' Decision-Making with MetaDash
Min Chi, Co-PI ; Roger Azevedo (Psychology) ; Soonhye Park, Co-PI (STEM Education)

$914,585 by National Science Foundation
04/ 1/2017 - 03/31/2020

This 3-year project will focus on laboratory and classroom research in the Raleigh, Chapel Hill, and Durham areas. A team of interdisciplinary researchers from NCSU's Department of Psychology (Dr. Roger Azevedo), Computer Science (Dr. Min Chi), and STEM Education (Dr. Soonhye Park) will conduct empirical and observational research aimed at improving teachers' decision-making based on their analyses of students' real-time, multichannel self-regulated learning data. We will use multichannel data to understand the nature of self-regulatory processes in students as they use MetaTutor to learn challenging science topics (e.g., human biological systems). This will be accomplished by aligning and conducting complex computational and statistical analyses of a multitude of trace data (e.g., log files, eye tracking, facial expressions of emotion), behavioral data (e.g., human-pedagogical agent dialogue moves), physiological measures (e.g., EDA), learning outcomes, and classroom data (e.g., teacher-student interactions, gaze behavior of teachers' attention, and use of data presented by the visualization tool). The proposed research, in the context of using MetaTutor and a visualization tool for teachers, is extremely challenging and will help us to better understand the nature and temporal dynamics of these processes in classroom contexts and how they contribute to various types of learning and use of self-regulatory skills, and will provide an empirical basis for designing an intelligent teacher analytics tool. The results from this grant will contribute significantly to models of the social and cognitive bases of student-machine-teacher interactions; statistical and computational methods used to make inferences from complex multichannel data; theoretical and conceptual understanding of temporally aligned data streams; enhancement of students' understanding of complex science topics through more sensitive and intelligent advanced learning technologies; and enhanced understanding of how teachers use real-time student data to improve their instructional decision-making, based on data presented in teacher analytic tools.

SCH: INT: Collaborative Research: S.E.P.S.I.S.: Sepsis Early Prediction Support Implementation System
Min Chi, Co-PI ; Julie Ivy, PI

$834,725 by National Science Foundation
10/ 1/2015 - 09/30/2018

Every year approximately 700,000 people die in US hospitals. In 16% of these deaths, the first diagnosis at death was septicemia, one of the most common delayed diagnoses associated with inpatient death. Sepsis is one of the ten leading causes of death. While it is difficult to estimate how many of these deaths could have been averted or postponed if a better system of care were in place, it is widely recognized that sepsis patients have better outcomes if treated earlier. Sepsis is one of the most common diagnoses with delayed effective treatment interventions. As opposed to wrong diagnoses, delayed diagnoses have historically not been considered adverse events, as there is no change in patient condition as a result of care delivered. However, patients with delayed diagnoses do have worse outcomes than those who receive timely treatment. These diagnostic and/or treatment delays, associated with inpatient mortality and long-term morbidity, represent a significant and modifiable patient safety issue. Awareness of sepsis is low; many septic patients are under-diagnosed at an early stage, when aggressive treatment could still reverse the course of the infection. Early recognition and implementation of early goal-directed therapy improve outcomes and decrease mortality. For every one-hour delay in treating severe sepsis or septic shock with antibiotics, there is an incremental decrease in patient survival; for example, a delay in antibiotics of five hours decreases survival to 50%. We propose to take a data-driven, evidence-based approach that integrates computer science and industrial engineering to develop personalized sepsis diagnosis and treatment plans. The goal of this research is to integrate electronic health records (EHRs) and clinical expertise to provide an evidence-based framework for diagnosing sepsis patients, accurately risk-stratifying patients within the sepsis spectrum, and developing intervention policies that inform sepsis treatment decisions. We will achieve this research goal through accomplishing three specific aims based on a longitudinal data set of EHRs of patients discharged from Mayo Clinic Rochester and Christiana Care Health System hospitals.

I/UCRC: Site application to join I/UCRC known as CHMPR
Rada Chirkova

$298,533 by National Science Foundation
09/ 1/2016 - 08/31/2019

The objective of this proposal is to indicate that North Carolina State University (NCSU) will join, as a site, the Center of Hybrid Multicore Productivity Research (CHMPR) in Year 2 of its Phase II I/UCRC renewal. The focus of NCSU within the center will be the science of technologies for end-to-end enablement of data. The research at NCSU complements well the work being done at the other I/UCRC centers and at the other sites of the CHMPR center. NCSU has had extensive positive experience with I/UCRC centers over the years and is very comfortable with the model.

Membership in Center of Hybrid Multicore Productivity Research (CHMPR) - Full Member
Rada Chirkova

$80,000 by Merck & Company
01/ 1/2017 - 12/31/2018

Exploring and analyzing the available data is key to making the right decisions. It is well known that “data wrangling,” which includes many kinds of end-to-end data enablement, makes up 60-80% of the total effort in analytics on large-scale data. We look to address the challenge of maximizing the usefulness of the available data, by providing tools, science, and talent for next-generation technologies and infrastructure. We focus on empowering organizations that wish to unlock the value of decisions based on their data, and envision a future where technologies and tools for data enablement provide significant business advantages to such organizations. At NCSU, we will lead national and international efforts in this space, by developing and providing technologies and tools for bridging the time gap between the acquisition of data and real-time and long-term decision making.

Membership in Center of Hybrid Multicore Productivity Research (CHMPR) - Full Member
Rada Chirkova

$80,000 by SAS Institute, Inc
01/ 1/2017 - 12/31/2018

Exploring and analyzing the available data is key to making the right decisions. It is well known that “data wrangling,” which includes many kinds of end-to-end data enablement, makes up 60-80% of the total effort in analytics on large-scale data. We look to address the challenge of maximizing the usefulness of the available data, by providing tools, science, and talent for next-generation technologies and infrastructure. We focus on empowering organizations that wish to unlock the value of decisions based on their data, and envision a future where technologies and tools for data enablement provide significant business advantages to such organizations. At NCSU, we will lead national and international efforts in this space, by developing and providing technologies and tools for bridging the time gap between the acquisition of data and real-time and long-term decision making.

BD Spokes: PLANNING: SOUTH: Collaborative: Rare Disease Observatory
Rada Chirkova

$71,143 by National Science Foundation
09/ 1/2016 - 08/31/2018

One of the greatest challenges in the rare disease domain is access to trusted, verified data. With advances in mapping the human genome, over 7,000 rare diseases have been identified. However, no integrated, comprehensive patient registries exist that reliably collect data on these patients and their conditions and would allow queries on outcomes and economic impact. This planning grant will concentrate on building a large data system that can be accessed by a large collaborative community of those in the rare disease space, including state and federal agencies, clinicians, investigators, patient advocacy groups, industry, and SPOKE: South.

BD Spokes: PLANNING: SOUTH: Collaborative: Rare Disease Observatory (Supplement)
Rada Chirkova

$14,000 by National Science Foundation
07/27/2017 - 08/31/2018

Managing and extracting useful information from massive data sets is critical in nearly every research field and industry sector. Modeling, data management, and analytics are key elements in understanding and using these data sets, and their importance in academia, government, and industry is predicted to grow exponentially. Success in solving the problem of extracting value from massive data hinges on balancing fundamental research, technological know-how, and commercial market intelligence. This proposal puts forward a plan for enhanced activities for a graduate student, making use of important data, mentorship, and collaboration resources, with outreach activities that maximize impact. The proposal is focused on an intensive training program with a mentor who is a senior researcher at the Laboratory for Analytic Sciences (LAS). LAS is a federal-government entity located in Raleigh, North Carolina and focused on translational research in support of the Department of Defense and the Intelligence Community. The training will focus on building and strengthening the student's interaction skills that are critical for today's workforce, by exposing the student to collaborative work on knowledge graphs with players in the government and industry.

Membership for Center of Hybrid Multicore Productivity Research (CHMPR), Full Member
Rada Chirkova

$80,000 by Northrop Grumman
01/ 1/2016 - 12/31/2017

Membership dues from Northrop Grumman for the Center of Hybrid Multicore Productivity Research. Exploring and analyzing the available data is key to making the right decisions. It is well known that “data wrangling,” which includes many kinds of end-to-end data enablement, makes up 60-80% of the total effort in analytics on large-scale data. We look to address the challenge of maximizing the usefulness of the available data, by providing tools, science, and talent for next-generation technologies and infrastructure. We focus on empowering organizations that wish to unlock the value of decisions based on their data, and envision a future where technologies and tools for data enablement provide significant business advantages to such organizations. At NCSU, we will lead national and international efforts in this space, by developing and providing technologies and tools for bridging the time gap between the acquisition of data and real-time and long-term decision making.

NeTS: JUNO: Service Offering Model and Versatile Network Resource Grooming for Optical Packet and Circuit Integrated Networks
Rudra Dutta

$291,956 by National Science Foundation (NSF)
04/ 1/2014 - 03/31/2018

The explosive growth in bandwidth represented by advances in optical communication and networking technologies has underpinned the increasing reach and reliability of the Internet over the last two decades. However, the potential impact of increasingly sophisticated recent advances in optical technology, such as rapid switching and elastic wavelengths, has not yet been realized. The main cause is that such technology, while possible to integrate into the data plane of planetary-scale networking, is difficult to accommodate in current planning, management, and control strategies. In this project we will work hand-in-hand with collaborating researchers from NICT, Japan, who are working to realize a novel hybrid optical packet/circuit switching technology. Such a technology could be immensely useful to large transport network operators, but no existing algorithms can easily determine how a provider should provision its resources between the circuit and packet possibilities on an ongoing, dynamic basis. We envision a novel approach to this problem that utilizes the concept of a "choice marketplace," which allows sophisticated rendezvous semantics between customer and provider and lets them cooperatively guide network resource provisioning to dynamically fulfill network objectives such as maximizing the performance received by network traffic. Our approach also allows balancing of various objectives, such as network utilization, latency, and the increasingly important metric of energy expenditure in the network.
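
As a rough sketch of the trade-off such provisioning algorithms must make (the candidate splits, metrics, and weights are all hypothetical), each packet/circuit split can be scored by a weighted combination of the objectives named above:

```python
# Hypothetical candidate provisionings: fraction of capacity dedicated to
# circuit service vs. packet service, with modeled per-candidate metrics.
candidates = [
    # (circuit_fraction, utilization, mean_latency_ms, energy_kw)
    (0.2, 0.60, 4.0, 10.0),
    (0.5, 0.75, 6.5, 11.0),
    (0.8, 0.85, 9.0, 13.5),
]

# Weights encode the operator's current balance of objectives: higher
# utilization is good; latency and energy are costs to be minimized.
w_util, w_lat, w_energy = 1.0, 0.05, 0.02

def score(c):
    _, util, lat, energy = c
    return w_util * util - w_lat * lat - w_energy * energy

best = max(candidates, key=score)
print(f"provision {best[0]:.0%} of capacity as circuit")
```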

TWC: Medium: Collaborative: Improving Mobile-Application Security via Text Analytics
William Enck

$300,000 by National Science Foundation
07/ 1/2015 - 06/30/2018

Computing systems that make security decisions often fail to take human expectations into account. This failure occurs because human expectations are commonly drawn from textual sources (e.g., a mobile application's description) and are hard to extract and codify. This project seeks to extract expectation context from the natural-language text artifacts presented to users as they find, install, and run software. The proposed work focuses specifically on mobile applications to demonstrate concrete and practical advances in our scientific understanding of applying user expectation context to security decisions. These findings will advance the state of the art in identifying and classifying malware and grayware, as well as identify better methods of communicating risk to users. We will also gain a better understanding of the unique challenges of applying text analytics to the security domain.

TWC: Medium: Collaborative: Improving Mobile-Application Security via Text Analytics (supplement)
William Enck

$8,000 by National Science Foundation
07/ 1/2015 - 06/30/2018

Smartphones and mobile devices have become a dominant computing platform for billions of users. There are many static and dynamic program analysis tools for detecting security and privacy concerns in mobile applications. However, few approaches bridge the semantic gap between code and visual presentation. Ignoring this context results in analysis that falsely reports an application as malicious (e.g., the user really wanted an app to record phone calls) or fails to detect suspicious behavior (e.g., an app that collects sensitive information via text input). We propose a hybrid static/dynamic approach that extracts the text labels from Android application UIs, followed by text analytics that semantically label the type of input requested by the application. Doing so will better characterize the amount of security- and privacy-sensitive information entered into Android applications, and will enable outlier detection to identify applications that ask for information (e.g., SSN) unexpected for their semantic class (e.g., banking applications). This analysis will be applied at scale to identify potential privacy infringements in mobile application stores.
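
A minimal sketch of the proposed outlier-detection step (the app names, semantic classes, and expected-input sets are hypothetical; the actual analysis would learn expected inputs from market-scale data):

```python
# Hypothetical output of the UI-label extraction step: for each app, its
# semantic class and the input types its screens ask for.
apps = {
    "BankA": ("banking", {"username", "password", "amount"}),
    "BankB": ("banking", {"username", "password", "ssn"}),
    "GameX": ("game",    {"nickname"}),
}

# Expected inputs per semantic class (hard-coded here for illustration).
expected = {
    "banking": {"username", "password", "amount", "account_number"},
    "game":    {"nickname", "email"},
}

# Flag apps whose requested inputs fall outside their class's expectation.
for name, (cls, inputs) in apps.items():
    unexpected = inputs - expected[cls]
    if unexpected:
        print(f"{name}: asks for {unexpected}, unusual for class '{cls}'")
```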

NSF Travel Grant Support for ACM WiSec 2016
William Enck

$5,000 by National Science Foundation
07/ 1/2016 - 05/31/2018

The 9th Association for Computing Machinery (ACM) Conference on Security and Privacy in Wireless and Mobile Networks (WiSec 2016) will be held at the Darmstadtium in Darmstadt, Germany, from July 18 to July 20, 2016 [1]. This proposal requests $5,000 in funding to assist approximately five (5) United States-based graduate students to attend WiSec 2016.

CAREER: Secure OS Views for Modern Computing Platforms
William Enck

$400,000 by National Science Foundation
02/ 1/2013 - 01/31/2018

Controlling the access and use of information is a fundamental challenge of computer security. Emerging computing platforms such as Android and Windows 8 further complicate access control by relying on sharing and collaboration between applications. When more than two applications participate in a workflow, existing permission systems break down due to their boolean nature. In this proposal, we seek to provide applications with residual control of their data and its copies. To do this, we propose secure OS views, which combine a new abstraction for accessing data with whole-system information tracking. We apply secure OS views to modern operating systems (e.g., Android and Windows 8), which use database-like abstractions for sharing and accessing information. Similar to a database view, a secure OS view uses runtime context to dynamically define the protection domain, allowing the return of the real value, a fake value, or the nonexistence of the record.
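
A toy sketch of the view semantics described above (the record store, policy rules, and app names are hypothetical; the proposed mechanism operates inside the OS with whole-system information tracking):

```python
# Hypothetical record store; the "view" an app sees is computed at access
# time from runtime context: the real value, a fake value, or no record.
contacts = {1: {"name": "Alice", "phone": "555-0100"}}

def decide(app, context):
    # Toy policy: apps in the background see nothing; only trusted apps
    # see real values; everyone else gets a plausible fake.
    if not context["foreground"]:
        return "hide"
    return "full" if app in {"dialer"} else "fake"

def view(record_id, app, context):
    policy = decide(app, context)
    record = contacts.get(record_id)
    if record is None or policy == "hide":
        return None  # record appears not to exist
    if policy == "fake":
        return {"name": "Alice", "phone": "555-0000"}  # fake value
    return record  # real value

print(view(1, "dialer", {"foreground": True}))        # real record
print(view(1, "flashlight", {"foreground": False}))   # None
```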

TWC: Frontier: Collaborative: Rethinking Security in the Era of Cloud Computing
William Enck ; Peng Ning ; Mladen Vouk

$749,996 by National Science Foundation
09/ 1/2013 - 08/31/2018

Increased use of cloud computing services is becoming a reality in today's IT management. The security risks of this move are active research topics, yielding cautionary examples of attacks enabled by the co-location of competing tenants. In this project, we propose to mitigate such risks through a new approach to cloud architecture defined by leveraging cloud providers as trusted (but auditable) security enablers. We will exploit cooperation between cloud providers and tenants in preventing attacks as a means to tackle long-standing open security problems, including protection of tenants against outsider attacks, improved intrusion detection and security diagnosis, and security-monitoring inlays.

Collaborative Research: Research in Student Peer Review: A Cooperative Web-Services Approach
Edward Gehringer

$1,034,166 by NSF
09/ 1/2014 - 08/31/2018

Peer review between students has a 40-year history in academia. During the last half of that period, web-based peer-review systems have been used in tens of thousands of classes. Many online systems have been developed, in diverse settings and with diverse purposes. The systems, however, have common concerns: assigning capable reviewers to each student submission, ensuring review quality, and delivering reliable scores in cases where the systems are used for summative review of student work. Many strategies have been proposed to meet those concerns and tested in relatively small numbers of courses. The next step is to scale up the studies to learn how well they perform in diverse settings and with large numbers of students. This project brings together researchers from several peer-review systems, including some of the largest, to build web services that can be incorporated into existing systems to test these strategies and visualize the results.

Collaborative Research: Research in Student Peer Review: A Cooperative Web-Services Approach (Supplement)
Edward Gehringer

$40,000 by National Science Foundation
09/ 1/2014 - 08/31/2018

The students assist our efforts to build a database of peer-review responses that can be mined for quantitative research studies. The database will be composed of anonymized data from the peer-review systems of the constituent projects: CritViz, CrowdGrader, Expertiza, and Mobius/Slip. Among other items, it will contain peer feedback and ratings, and links to submitted work. The students will also embark on a qualitative research study to determine what STEM students value about the peer-review process, using a common set of research protocols to investigate three research questions: What do students value about receiving reviews? What do they value about giving reviews? Do their reactions differ based on demographics, age/level of study, or academic major?

CSR: Medium: Collaborative Research: Holistic, Cross-Site, Hybrid System Anomaly Debugging for Large Scale Hosting Infrastructures
Xiaohui (Helen) Gu

$518,000 by National Science Foundation
08/ 1/2015 - 07/31/2019

Hosting infrastructures provide users with cost-effective computing solutions by obviating the need for users to maintain complex computing infrastructures themselves. Unfortunately, due to their inherent complexity and sharing nature, hosting infrastructures are prone to system anomalies caused by various external or internal faults. The goal of this project is to investigate a holistic, cross-site, hybrid system anomaly debugging framework that intelligently integrates production-site black-box diagnosis and developer-site white-box analysis into a more powerful hosting infrastructure anomaly debugging system.

CAREER: Enable Robust Virtualized Hosting Infrastructures via Coordinated Learning, Recovery, and Diagnosis
Xiaohui (Helen) Gu

$450,000 by National Science Foundation
01/ 1/2012 - 12/31/2017

Large-scale virtualized hosting infrastructures have become the fundamental platform for many real-world systems such as cloud computing, enterprise data centers, and educational computing labs. However, due to their inherent complexity and sharing nature, hosting infrastructures are prone to various runtime problems such as performance anomalies and software/hardware failures. The overarching objective of this proposal is to systematically explore innovative runtime reliability management techniques for large-scale virtualized hosting infrastructures. Our research focuses on handling performance anomalies in distributed systems that are often very difficult to reproduce offline. Our approach combines the power of online learning, knowledge-driven first-response recovery, and in-situ diagnosis to handle unexpected system anomalies more efficiently and effectively. We aim to transform runtime system anomaly management from a trial-and-error guessing game into an efficient, knowledge-driven self-healing process.

Visualizing Wildfire Narratives, CHMPR Core Project
Christopher Healey

$59,983 by Center of Hybrid Multicore Productivity Research (CHMPR) - NCSU Research Site
09/27/2017 - 12/31/2017

This project will investigate the use of large-scale social media data capture to identify, construct, and visualize risk narratives: conversations formed from explicitly or implicitly related social media posts over time. We will build narratives based on an “anchor topic” of interest, estimate sentiment on the text within the narrative, then present the social media posts and related narratives using an interactive, web-based text visualization system that represents posts, sentiment, narratives, and other text properties using a suite of text visualization techniques. Our overall goal is to assess the strengths, limitations, and capabilities of our system, and to determine how to scale the text analytics and visualization approaches as the social media database grows.
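
A toy sketch of the narrative-plus-sentiment pipeline (the posts, anchor topic, and lexicon are invented; the actual system uses large-scale social media capture and a full sentiment model):

```python
# Hypothetical posts and a tiny sentiment lexicon for illustration only.
posts = [
    "Evacuations ordered as the wildfire spreads fast",
    "Firefighters making great progress on the wildfire line",
    "Smoke everywhere, terrible air quality near the fire",
]
lexicon = {"great": 1, "progress": 1, "terrible": -1}

# Build a narrative: posts explicitly related to the anchor topic.
anchor = "wildfire"
narrative = [p for p in posts if anchor in p.lower()]

def sentiment(text):
    """Naive lexicon sum; stands in for a trained sentiment model."""
    return sum(lexicon.get(w.strip(",."), 0) for w in text.lower().split())

for p in narrative:
    print(sentiment(p), p)
```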

CHS: SMALL: Direct Physical Grasping, Manipulation, and Tooling of Simulated Objects
Christopher Healey ; Robert St. Amant

$496,858 by National Science Foundation
08/ 1/2014 - 07/31/2018

This proposal is for the development and evaluation of CAPTIVE, a Cube with Augmented Physical Tools, to support exploration of three-dimensional information. The design of CAPTIVE is founded on the concept of tool use, in which physical objects (tools) are used to modify the properties or presentation of target objects. CAPTIVE integrates findings across a wide range of areas in human-computer interaction and visualization, from bimanual and tangible user interfaces to augmented reality. CAPTIVE is configured as a desktop augmented reality/fishtank virtual reality system [120], with a stereoscopic display, a haptic pointing device, and a user-facing camera. In one hand the user holds a wireframe cube that contains virtual objects; in the other, the pointing device, augmented to reflect its function as a tool: a probe for pointing at, choosing, and moving objects; a magnifying or semantic lens for filtering, recoding, and elaborating information; a cutting plane that shows slices or projection views. CAPTIVE supports visualization with more fluid and natural interaction techniques, improving the ability of users to explore and understand 3D information.

Identification of Translational Hormone-Response Gene Networks and Cis-Regulatory Elements
Steffen Heber (co-PI) ; Jose Alonso (Lead PI, CALS) ; Anna Stepanova (CALS) ; Cranos Williams (ECE)

$897,637 by National Science Foundation
08/ 1/2015 - 07/31/2020

Plants, as sessile organisms, need to constantly adjust their intrinsic growth and developmental programs to environmental conditions. These environmentally triggered "adjustments" often involve changes in the developmentally predefined patterns of one or more hormone activities. In turn, these hormonal changes result in alterations at the gene expression level and concurrent alterations of cellular activities. In general, these hormone-mediated regulatory functions are achieved, at least in part, by modulating the transcriptional activity of hundreds of genes. The study of these transcriptional regulatory networks not only provides a conceptual framework to understand the fundamental biology behind these hormone-mediated processes, but also the molecular tools needed to accelerate the progress of modern agriculture. Although often overlooked, understanding of the translational regulatory networks behind complex biological processes has the potential to empower similar advances in both basic and applied plant biology. By taking advantage of the recently developed ribosome footprinting technology, genome-wide changes in translation activity in response to ethylene were quantified at codon resolution, and new translational regulatory elements have been identified in Arabidopsis. Importantly, the detailed characterization of one of the regulatory elements identified indicates that this regulation is NOT miRNA dependent, and that the identified regulatory element is also responsive to the plant hormone auxin, suggesting a role in the interaction between these two plant hormones. These findings not only confirm the basic biological importance of translational regulation and its potential as a signal-integration mechanism, but also open new avenues to identifying, characterizing, and utilizing additional regulatory modules in plant species of economic importance. Toward that general goal, a plant-optimized ribosome footprinting methodology will be deployed to examine the translation landscape of two plant species, tomato and Arabidopsis, in response to two plant hormones, ethylene and auxin. A time-course experiment will be performed to maximize the detection sensitivity (strong vs. weak) and diversity (early vs. late activation) of additional translational regulatory elements. The large amount and dynamic nature of the generated data will also be utilized to generate hierarchical transcriptional and translational interaction networks between these two hormones and to explore the possible use of these types of diverse information to identify key regulatory nodes. Finally, the comparison between two plant species will provide critical information on the conservation of the regulatory elements identified and, thus, inform research on future practical applications.

Intellectual merit: The identification and characterization of signal-integration hubs and cis-regulatory elements of translation will allow us not only to better understand how information from different origins (environment and developmental programs) is integrated, but also to devise new strategies to control this flow for the advance of agriculture.

Broader impacts: A new outreach program will promote interest among middle and high school students in combining biology, computers, and engineering. We will use our current NSF-supported Plants4kids platform (ref), with its web-based bilingual dissemination tools, monthly demos at the science museum, and visits to local schools, to implement this new outreach program. Examples of demonstration modules will include comparisons between simple electronic and genetic circuits.

Transcriptional Nodes Coordinate Patterning and Cellular Proliferation During Carpel Margin Meristem Development
Steffen Heber (co-PI) ; Robert Franks (Lead PI, Genetics)

$771,784 by National Science Foundation
03/ 1/2014 - 02/28/2018

The coordination of spatial patterning cues and cellular proliferation underlies diverse processes from cancerous growth to reproductive development. A long-term objective of my research program is to understand how proliferative cues are coordinated with spatial information during organogenesis. In Arabidopsis thaliana this coordination of patterning and proliferation is necessary within the carpel margin meristem (CMM) to generate ovules that, when fertilized, will become seeds. In the previous funding period we demonstrated that the SEUSS (SEU) and AINTEGUMENTA (ANT) transcription factors regulate critical patterning events that support carpel margin meristem and ovule development. Our genetic analysis demonstrates that SEU and ANT share a partially redundant and overlapping function essential for proper seed formation. As SEU and ANT do not share sequence similarity, the molecular basis for this redundancy is not understood. We propose that the SEU and ANT activities synergistically converge at key transcriptional nodes. A node in this sense is a gene or a set of related genes that requires the combined activities of SEU and ANT for its expression. Our recently published transcriptomic analysis indicates that many of these nodes encode known transcriptional regulators. By studying these nodes we hope to better understand the transcriptional hierarchies that control CMM development and uncover the mechanistic basis of the synergistic action of SEU and ANT. Our transcriptomics study cannot determine whether the nodes that we have identified are directly or indirectly regulated by SEU or ANT activity. However, even if these genes are indirectly controlled by SEU and ANT activity, their expression within the developing CMM suggests they may still play a critical functional role during CMM development. Furthermore, having now identified a set of genes that are enriched for CMM expression, we are in a position to study the cis-regulatory elements that support gene expression within the CMM and medial gynoecial domain. Thus here we propose to: 1) identify direct targets of SEU regulation within the CMM to further refine the transcriptional hierarchy required for CMM development; 2) assay the functional role of two of these nodes during CMM development, one encoded by the transcription factor PERIANTHIA and the second encoded by members of the REM family of B3 domain-containing proteins; and 3) identify cis-acting DNA regulatory elements required for CMM expression.

Scientific significance: Understanding the coordination of cellular proliferation and spatial patterning during organogenesis is broadly of interest to scientists working in a diversity of fields. Completion of these specific aims will move us toward this goal by illuminating the mechanistic basis for the overlapping functions of SEU and ANT during carpel margin and ovule development. Additionally, we expect that by elucidating the molecular mechanisms of the synergistic action of SEU and ANT upon key transcriptional nodes, we will engender a greater understanding of the molecular underpinnings of non-additivity within transcriptional networks and the complexity of developmental programs. Past NSF funding for this project (IOS-0821896) has resulted in the publication of five articles in well-respected journals (two in Plant Physiology, and one each in Developmental Biology, PLoS One, and BMC Plant Biology).

Broader impacts: I ensure a broad societal impact from my program by integrating my research efforts with my teaching and training responsibilities and by widely disseminating materials and results. Furthermore, I initiated and continue to lead an outreach group that prepares and presents hands-on science demonstrations at local North Carolina schools. Our group has reached over 1,500 kindergarten through grade 12 students over the past six years and continues to develop new demonstration modules inspired by our current work in developmental biology and genetics.

Collaborative Research: Transforming Computer Science Education Research Through Use of Appropriate Empirical Research Methods: Mentoring and Tutorials
Sarah Heckman

$406,557 by National Science Foundation
09/ 1/2015 - 08/31/2020

The computer science education (CSEd) research community consists of a large group of passionate CS educators who often contribute to other disciplines of CS research. There has been a trend in other disciplines toward more rigorous and empirical evaluation of various hypotheses. However, many of the practices that we apply to demonstrate rigor in our disciplinary research are ignored or actively avoided when performing research in CSEd. This suggests that CSEd is "theory scarce": most publications are not research and do not provide the evidence or replication required for meta-analysis and theory building. An increase in empiricism in CSEd research will move the field from "scholarly teaching" to the "scholarship of teaching and learning" (SoTL), providing the foundation for meta-analysis and the generation of theories about teaching and learning in computer science. We propose the creation of training workshops and tutorials to educate the educators about appropriate research design and evaluation of educational interventions. The creation of laboratory packages, "research-in-a-box," will support sound evaluation and replication, leading to meta-analysis and theory building in the CSEd community.

TrackWhere: An App for Tracking and Visualizing Communication Across Teams During Product Lifecycle
Arnav Jhala ; Christopher Parnin

$29,999 by Lenovo
04/15/2017 - 01/15/2018

This project seeks to create an intelligent monitoring and visualization framework for communication across different groups involved in design, development, and release of hardware and software products within Lenovo.

TrackWhere: An App for Tracking and Visualizing Communication Across Teams During Product Lifecycle (Supplement)
Arnav Jhala ; Christopher Parnin

$20,000 by Lenovo
04/15/2017 - 01/15/2018

This project seeks to create an intelligent monitoring and visualization framework for communication across different groups involved in design, development, and release of hardware and software products within Lenovo.

SaTC: CORE: Medium: Collaborative: Taming Web Content Through Automated Reduction in Browser Functionality
Alexandros Kapravelos

$406,609 by National Science Foundation
09/ 1/2017 - 08/31/2021

The browser is constantly evolving to meet the demands of Web applications. Although this evolution supports the innovation that we see on the internet, there are security implications to consider, such as attacks against the browser that leverage bugs introduced by rapid development. In this project, we plan to examine how certain web applications work and associate their behavior directly with the corresponding browser functionality. Our goal is to characterize what functionality is needed from the browser when rendering a page and certain components. By building such a system, we will be able to identify, for example, what is needed from the browser to render a web advertisement. To better protect internet users, we will leverage that information to identify when web applications diverge from their expected behavior and attack the user's browser. We will use this information to limit the functionality exposed to web applications, eliminating multiple classes of attacks such as browser fingerprinting and drive-by downloads.
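
As a loose illustration of the divergence check described above (the page, its profile, and the API names are hypothetical; real profiles would be collected from instrumented browsers at scale):

```python
# Hypothetical per-page profiles: the set of browser interfaces a page was
# observed to use during normal rendering.
expected_profile = {
    "ads.example/banner": {"Canvas2D", "setTimeout", "XMLHttpRequest"},
}

def check(page, observed_apis):
    """Flag a page that exercises browser functionality outside its profile."""
    baseline = expected_profile.get(page, set())
    extra = observed_apis - baseline
    if extra:
        print(f"{page}: unexpected use of {extra} (possible fingerprinting or attack)")

check("ads.example/banner",
      {"Canvas2D", "setTimeout", "WebGLDebugRendererInfo"})
```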

XS-Shredder: A Cross-Layer Framework for Removing Code Bloat in Web Applications
Alexandros Kapravelos

$300,000 by Arizona State University via Office of Naval Research
07/ 1/2017 - 06/30/2019

Modern web applications are incredibly complex pieces of software, with frameworks and libraries that help web developers write their applications quickly. However, these frameworks and libraries increase the attack surface of the web application. In this proposal, we present the design of a framework, called XS-Shredder, that is able to debloat all layers of the web application software stack: client-side code, server-side code, database, and operating system. This framework will perform inter- and intra-layer analysis, ultimately resulting in a web application that is semantically identical, yet with a significantly reduced attack surface.

Intelligence Augmentation – Development of a Field-Based Query System for Business Risk Assessment, Center Core Project
Michael Kowolenko

$45,000 by Center for Hybrid Multicore Productivity Research (CHMPR) via NSF
05/ 1/2017 - 12/31/2017

Determining business risk requires in-depth assessment of the multiple external customers that can influence business success or failure. A methodology focused on characterizing the "wants and needs" of these customers allows for the development of analytics that can be used for qualifying risk. This project will fuse data to obtain facts or attributes of customers. Attributes are assigned relative weights based on input from subject-matter experts (SMEs). An interface presents the analyst with the overall and relative risk to the enterprise, depending on the weights assigned. The system allows for scenario playing by adjusting relative risks based on user input.
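
A minimal sketch of the weighted-risk computation and the scenario playing it enables (the attribute names, values, and SME weights are hypothetical):

```python
# Hypothetical customer attributes (0-1 scale) and SME-assigned weights;
# the overall risk is a weighted combination.
attributes = {"regulatory_exposure": 0.8, "supplier_concentration": 0.4,
              "market_volatility": 0.6}
weights = {"regulatory_exposure": 0.5, "supplier_concentration": 0.2,
           "market_volatility": 0.3}

def risk(attrs, w):
    """Overall risk as the weighted sum of attribute scores."""
    return sum(attrs[k] * w[k] for k in attrs)

print(f"overall risk: {risk(attributes, weights):.2f}")   # 0.66

# Scenario playing: the analyst shifts weight toward supplier risk.
weights["regulatory_exposure"], weights["supplier_concentration"] = 0.3, 0.4
print(f"adjusted risk: {risk(attributes, weights):.2f}")  # 0.58
```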

Collaborative Research: Big Data from Small Groups: Learning Analytics and Adaptive Support in Game-based Collaborative Learning
James Lester

$1,249,611 by National Science Foundation
10/ 1/2016 - 09/30/2021

The proposed project focuses on integrating models of game-based and problem-based learning in a computer-supported collaborative learning environment (CSCL). As groups of students solve problems in these environments, their actions generate rich and dynamic streams of fine-grained multi-channel data that can be instrumented for investigating students' learning processes and outcomes. Using the big data generated by small groups, we will leverage learning analytics to provide adaptive support for collaboration that will allow these models to be used at larger scales in real classrooms. The project will study CSCL in the context of an environmental-science-based digital game that will employ specific strategies to support the problem-based learning goals of helping students construct explanations, reason effectively, and become self-directed learners. In problem-based learning, students are active, intentional learners who collaboratively negotiate meaning. The project will embed models induced using learning analytic techniques inside of a digital game environment to enable students to cultivate collaborative learning competencies that translate to non-digital classroom settings.

Learning Environments Across Disciplines LEADS: Supporting Technology Rich Learning Across Disciplines: Affect Generation and Regulation During Co-Regulated Learning in Game-Based Learning Environments (Supplement)
James Lester

$114,672 by McGill University/Social Sciences and Humanities Research Council of Canada
04/ 1/2012 - 02/28/2020

Contemporary research on multi-agent learning environments has focused on self-regulated learning (SRL), while relatively little effort has been made to use co-regulated learning as a guiding theoretical framework (Hadwin et al., 2011). This oversight needs to be addressed given the complex roles that self- and other-regulatory processes play when human learners and artificial pedagogical agents (APAs) interact to support learners' internalization of cognitive, affective, and metacognitive (CAM) SRL processes. We will use the Crystal Island learning environment to investigate these issues.

Learning Environments Across Disciplines LEADS: Supporting Technology Rich Learning Across Disciplines: Affect Generation and Regulation During Co-Regulated Learning in Game-Based Learning Environments (Supplement)
James Lester

$3,737 by McGill University
04/ 1/2017 - 03/31/2018

Contemporary research on multi-agent learning environments has focused on self-regulated learning (SRL), while relatively little effort has been made to use co-regulated learning as a guiding theoretical framework (Hadwin et al., 2011). This oversight needs to be addressed given the complex roles that self- and other-regulatory processes play when human learners and artificial pedagogical agents (APAs) interact to support learners' internalization of cognitive, affective, and metacognitive (CAM) SRL processes. We will use the Crystal Island learning environment to investigate these issues.

REFLECT: Improving Science Problem Solving with Adaptive Game-Based Reflection Tools
James Lester ; Roger Azevedo (Psychology)

$1,300,000 by National Science Foundation
04/15/2017 - 03/31/2020

Reflection has long been recognized as a central component of effective learning. With the overarching goal of improving middle school students' science problem solving and learning outcomes, the REFLECT project has the objective of investigating a suite of theoretically grounded, adaptive game-based reflection tools to scaffold students' cognitive and metacognitive processes. The project will center on the design, development, and investigation of game-based learning tools for science education that adaptively scaffold students’ reflection through both embedded and retrospective support. It will culminate in a classroom experiment to study the impact of the adaptive reflection tools on both problem solving and learning. The results from this project will contribute significantly to theoretical and computational models of reflection, and produce both design principles and learning technologies that support the creation of effective learning environments.

CHS: Medium: Adapting to Affect in Multimodal Dialogue-Rich Interaction with Middle School Students
James Lester ; Kristy Boyer ; Bradford Mott ; Eric Wiebe

$1,184,073 by National Science Foundation
08/ 1/2014 - 07/31/2018

Despite the great promise offered by learning environments for K-12 science education, realizing its potential poses significant challenges. In particular, current learning environments do not adaptively support children's affect. This project investigates computational models of affect that support multimodal dialogue-rich interaction. With an emphasis on devising scalable solutions, the project focuses on machine-learning techniques for automatically acquiring affect and dialogue models that can be widely deployed in natural classroom settings.

ENGAGE: A Game-Based Curricular Strategy for Infusing Computational Thinking into Middle School Science
James Lester ; Brad Mott ; Eric Wiebe (Friday Institute)

$2,498,862 by National Science Foundation
08/15/2016 - 07/31/2019

Recent years have seen a growing recognition that computer science is vital for scientific inquiry. The middle school grade band is critical for shaping students’ aspirations and skills, and many issues relating to workforce underproduction and underrepresentation of diverse students in computer science can be traced back to middle school. To address this problem, the project will deeply integrate computer science into middle school science classrooms. Centered on a game-based learning environment that features collaborative learning, the project will have a specific focus on addressing gender issues in middle school computer science education with the goal of creating learning interactions that are both effective and engaging for all students.

Tutorial Planning with Markov Decision Processes for Counterinsurgency Training Environments
James Lester ; Bradford Mott ; Jonathan Rowe

$1,072,237 by US Army - Army Research Laboratory
04/10/2015 - 04/ 9/2018

Intelligent tutoring systems (ITSs) are highly effective for education and training. Tutorial planning is a critical component of ITSs, controlling how scaffolding is structured and delivered to learners. Devising data-driven tutorial planners that automatically induce scaffolding models from corpora of student data holds significant promise. This project investigates a data-driven framework for tutorial planning that is based on modular reinforcement learning. This framework explicitly accounts for the inherent uncertainty in how learners respond to different types of tutorial strategies and tactics, and automatically induces and refines tutorial planning policies in order to optimize measures of learning outcomes.
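
For concreteness, here is a minimal tabular Q-learning loop of the kind such a data-driven planner could build on; the states, actions, and rewards below are hypothetical stand-ins for learner states, scaffolding moves, and outcome measures, not the project's actual modular-RL policy model:

    import random

    ACTIONS = ["hint", "prompt", "no_op"]          # hypothetical tutoring moves

    def choose(Q, s, eps=0.1):
        """Epsilon-greedy action selection over a tabular policy Q."""
        if random.random() < eps:
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: Q.get((s, a), 0.0))

    def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.95):
        """One Q-learning step; r stands in for a learning-outcome measure."""
        old = Q.get((s, a), 0.0)
        best_next = max(Q.get((s_next, a2), 0.0) for a2 in ACTIONS)
        Q[(s, a)] = old + alpha * (r + gamma * best_next - old)

    Q = {}
    q_update(Q, s="stuck", a=choose(Q, "stuck"), r=1.0, s_next="progress")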

Collaborative Research: PRIME: Engaging STEM Undergraduate Students in Computer Science with Intelligent Tutoring Systems
James Lester ; Bradford Mott ; Eric Wiebe (Friday Instit

$1,499,828 by National Science Foundation
09/ 1/2016 - 08/31/2020

Significant advances in intelligent tutoring systems have paved the way for engaging STEM undergraduates in computer science. This research has spawned a new generation of personalized learning environments that offer significant promise for providing students with adaptive learning experiences that are crafted to their individual needs. Spurred by this significant promise and building on a research infrastructure developed by the project team in a series of NSF-supported projects, the PRIME project will transform introductory computer science education with state-of-the-art intelligent tutoring systems technologies.

Guiding Understanding via Information from Digital Environments (GUIDE)
James Lester (Co-PI) ; Eric Wiebe (Lead PI)

$1,238,549 by Concord Consortium via National Science Foundation
09/15/2015 - 08/31/2019

This project will utilize research and development groups at the Concord Consortium and NC State University. Educational software for teaching high school multi-level genetics developed by the Concord Consortium will be enhanced by intelligent agents and machine-based tutoring system technologies developed at NC State to help enhance the learning experience for students. These groups will collaborate closely to develop and research a hybrid system that combines technological intervention and teacher pedagogical expertise to illuminate and guide student learning in deeply digital curricula and classrooms.

Health Quest: Engaging Adolescents in Health Careers with Technology-Rich Personalized Learning
James Lester, II

$1,301,820 by National Institutes of Health (NIH)
08/ 1/2017 - 07/31/2022

Leveraging intelligent game-based learning technologies, the Health Quest project focuses on developing and disseminating technology-rich resources to broaden the interests of adolescents in biomedical, behavioral and clinical research careers. The project centers on the development of technology-rich learning resources. These include a game-based learning environment featuring health careers as well as an online community that includes a speaker series featuring a broad range of health professionals. The final year of the project will see a full evaluation of the Health Quest program and its impact on students’ interest in biomedical, behavioral and clinical research careers.

Collaborative Research: Fostering Collaborative Computer Science Learning with Intelligent Virtual Companions for Upper Elementary Students
Collin Lynch (co-PI) ; Eric Wiebe

$1,399,088 by National Science Foundation
08/15/2017 - 07/31/2021

The University of Florida and North Carolina State University jointly propose FLECKS, a Design and Development proposal for the NSF's Discovery Research PreK-12 (DRK-12) program. FLECKS (Friendly Learning Environment for Kids' Computer Science) addresses the pressing need for the development of fundamental computer science competencies in upper elementary-school children. The goal of the proposed project is to design, develop, and investigate FLECKS, an intelligent learning environment to teach collaborative computer science problem solving. Collaboration is a central academic and professional practice in computational thinking, yet it presents many challenges for elementary school students. Students often struggle to collaborate successfully due to individual differences in academic status; gender; cultural background; personality; attitudes toward collaboration; or attitudes toward learning. In order to address these challenges, FLECKS will provide dyads of students with a rich, scaffolded environment where they use an interactive online coding environment to engage in computer science challenges related to their STEM subject areas. Central to the innovation is the way in which the dyads are supported. FLECKS are animated virtual characters that take a rich set of multimodal features as input, and then adapt to students’ patterns of collaboration, including who has control of the keyboard and mouse; who speaks when; and the problem-solving actions the students take in the online environment.

SHF:Medium:Collaborative:Transfer Learning in Software Engineering
Tim Menzies

$464,609 by National Science Foundation
08/ 2/2014 - 06/30/2018

Software engineers need better ways to recognize best practices in past projects, and to understand how to transfer and adapt those experiences to current projects. No project is exactly like previous projects; hence, the trick is to find which parts of the past are most relevant and can be transferred into the current project. We propose novel automated methods to apply the machine learning concept of transfer learning to adapt lessons from past software engineering project data to new conditions.
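
One well-known flavor of this idea in software analytics, shown only as an illustration and not necessarily the method proposed here, is relevancy filtering: train only on the past-project examples nearest to the current project's data.

    import math

    def euclidean(x, y):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

    def relevancy_filter(past_rows, new_rows, k=2):
        """past_rows: [(feature_tuple, label)]; keep the k nearest per new row."""
        keep = set()
        for x in new_rows:
            nearest = sorted(range(len(past_rows)),
                             key=lambda i: euclidean(past_rows[i][0], x))[:k]
            keep.update(nearest)
        return [past_rows[i] for i in sorted(keep)]

    past = [((1.0, 2.0), "buggy"), ((9.0, 9.0), "clean"), ((1.2, 1.8), "buggy")]
    print(relevancy_filter(past, new_rows=[(1.1, 2.1)]))  # the two nearby rows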

SHF:Medium:Scalable Holistic Autotuning for Software Analytics
Timothy Menzies ; Xipeng Shen

$898,349 by National Science Foundation
07/ 1/2017 - 06/30/2021

This research proposes to advance the state of the art to holistic scalable autotuners, which tune all levels of options for multiple optimization objectives at the same time. It will achieve this ambitious goal through the development of a set of novel techniques that efficiently handle the tremendous tuning space. These techniques take advantage of the synergies between all those options and goals by exploiting relevancy filtering (to quickly dispose of unhelpful options), locality of inference (which enables faster updates to outdated tunings), and redundancy reduction (which reduces the search space for better tunings). This new autotuner will be a faster method for finding better tunings that satisfy more goals. To test this claim, this research will assess whether this new tool can reduce the total computational resources required for effective SE data analytics by orders of magnitude.
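
A toy sketch of the overall shape (random search plus a crude relevancy filter); the tuning space and objective below are placeholders, and the proposed system's filtering is far more sophisticated:

    import random

    def tune(space, objective, budget=100):
        """space: {option: [values]}; objective: config dict -> cost (lower is better)."""
        best, best_cost = None, float("inf")
        for _ in range(budget):
            cfg = {opt: random.choice(vals) for opt, vals in space.items()}
            cost = objective(cfg)
            if cost < best_cost:
                best, best_cost = cfg, cost
        return best, best_cost

    def relevant_options(space, objective):
        """Crude relevancy filter: freeze options whose choice never moves the cost."""
        kept = {}
        for opt, vals in space.items():
            costs = []
            for v in vals:
                cfg = {o: vs[0] for o, vs in space.items()}  # hold others fixed
                cfg[opt] = v
                costs.append(objective(cfg))
            if max(costs) - min(costs) > 1e-9:
                kept[opt] = vals
        return kept

    # Toy objective: only 'threads' matters, so 'logging' gets filtered out.
    space = {"threads": [1, 2, 4, 8], "logging": [0, 1]}
    objective = lambda cfg: 100 / cfg["threads"]
    print(tune(relevant_options(space, objective), objective, budget=20))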

Large-Scale Automatic Analysis of the OAI Magnetic Resonance Image Dataset
Frank Mueller

$331,603 by UNC - UNC Chapel Hill
08/15/2017 - 07/31/2022

The goal of this proposal is to optimize and to openly provide to the OA community a new technology to rapidly and automatically measure cartilage thickness, appearance and changes on magnetic resonance images (MRI) of the knee for huge image databases. This will allow assessment of trajectories of cartilage loss over time and associations with clinical outcomes on an unprecedented scale; future work will focus on incorporating additional disease markers, ranging from MRI-derived biomarkers for bone and synovial lesions, to biochemical biomarkers, to genetic information.

HPC Power Modeling and Active Control
Frank Mueller

$386,290 by Lawrence Livermore National Laboratory via US Department of Energy
10/25/2016 - 09/30/2019

As we approach the exascale era, power has become a primary bottleneck. The US Department of Energy has set a power constraint of 20MW for each exascale machine. To achieve one exaflop within 20MW, it is necessary that we use power intelligently to maximize performance under a power constraint. In this work, we propose to alleviate the shortcomings of current HPC systems in addressing power constraints by (1) power-aware machine partitioning, (2) power-constrained job scheduling, (3) systematic provisioning and procurement of hardware under a power cap, (4) modeling of network, deep memories, and storage, as well as (5) investigating the inter-dependence between power and cooling.
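
As a toy illustration of point (2), a greedy scheduler that admits queued jobs only while a cluster-wide power cap holds (the job power figures are made up, and real power-constrained scheduling must also weigh priorities and performance):

    def schedule(jobs, power_cap):
        """jobs: [(name, power_watts)]; return (admitted, deferred)."""
        admitted, deferred, used = [], [], 0.0
        for name, watts in sorted(jobs, key=lambda j: j[1]):  # smallest first
            if used + watts <= power_cap:
                admitted.append(name)
                used += watts
            else:
                deferred.append(name)
        return admitted, deferred

    print(schedule([("A", 8e6), ("B", 7e6), ("C", 9e6)], power_cap=2e7))
    # (['B', 'A'], ['C'])  -- C waits until the cap allows it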

Student Travel Grant for RTSS'17 Ph.D. Student Poster Forum on Real-Time Aspects of Internet of Things and Cyber-Physical Systems
Frank Mueller

$15,000 by National Science Foundation
09/ 1/2017 - 08/31/2018

As General Chair of the IEEE Real-Time Systems Symposium in 2017 (RTSS 2017), the PI proposes to organize, in conjunction with the symposium, a Ph.D. student poster forum. The forum will have significant educational value. Essays and posters will be solicited for the Ph.D. student poster forum. Funds are requested to sponsor the Ph.D. student forum. The funds will provide stipends (both merit-based and need-based) and cover organizers’ costs.

A Deep-Learning Approach Towards Auto-Tuning CFD Codes
Frank Mueller

$90,000 by Virginia Tech via US Air Force Office of Scientific Research
06/ 1/2017 - 02/14/2018

Heterogeneous computing systems are increasingly becoming the norm in high-performance computing (HPC). For instance, as of the November 2015 TOP500 List, more than 35% of the computational power on the TOP500 comes from systems containing heterogeneous computing devices, e.g., CPUs, GPUs, APUs, Xeon Phis, and even FPGAs. However, significant hurdles impede a domain scientist's ability to extract high performance out of such heterogeneous devices, including (1) selecting the appropriate algorithm(s) for the target heterogeneous device, (2) setting the runtime parameters, and (3) configuring the hardware relative to some evaluation metric, e.g., performance, power, or energy efficiency. Unfortunately, exploring hardware and software design choices often requires time-consuming simulations, and while some brute-force auto-tuning support has been proposed, the results are heuristics that are often narrow in scope, i.e., only applicable to a particular problem. This proposal seeks to study, analyze, and synthesize deep-learning approaches that expose the various parameters as "knobs" that can be tuned via deep learning to optimize for the metric of interest, whether it be performance, power, or energy efficiency.
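
A minimal sketch of the surrogate-model flavor of this idea, assuming scikit-learn is available and using made-up knob settings and timings; a real pipeline would train on measured runs of the CFD code:

    # Fit a small neural model mapping knob settings to runtime, then query it
    # to rank untried configurations. Data and knob names are hypothetical.
    from itertools import product
    from sklearn.neural_network import MLPRegressor

    measured_cfgs = [[0, 64], [1, 64], [0, 256], [1, 1024]]   # [algorithm, block]
    measured_time = [3.1, 2.4, 2.7, 1.9]                      # seconds (made up)

    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
    model.fit(measured_cfgs, measured_time)

    candidates = [list(c) for c in product([0, 1], [64, 256, 1024])]
    best = min(candidates, key=lambda c: model.predict([c])[0])
    print("predicted-best knobs:", best)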

TWC: Small: Collaborative: Discovering Software Vulnerabilities Through Interactive Static Analysis
Emerson Murphy-Hill

$249,854 by National Science Foundation
10/ 1/2013 - 09/30/2018

Software vulnerabilities originating from insecure code are one of the leading causes of security problems people face today. Current tool support for secure programming focuses on catching security errors after the program is written. We propose a new approach, interactive static analysis, to improve upon static analysis techniques by introducing a new mixed-initiative paradigm for interacting with developers to aid in the detection and prevention of security vulnerabilities.

CAREER:Expanding Developers' Usage of Software Tools by Enabling Social Learning
Emerson Murphy-Hill

$495,721 by National Science Foundation
08/ 1/2013 - 07/31/2018

Tools can help software developers alleviate the challenge of creating and maintaining software. Unfortunately, developers only use a small subset of the available tools. The proposed research investigates how social learning, an effective mechanism for discovering new tools, can help software developers to discover relevant tools. In doing so, developers will be able to increase software quality while decreasing development time.

SHF:Small: Enabling Scalable and Expressive Program Analysis Notifications
Emerson Murphy-Hill ; Sarah Heckman

$265,853 by National Science Foundation (NSF)
08/15/2017 - 07/31/2020

Previous research shows that existing notifications produced by integrated development environments are poorly understood and overwhelming to programmers. We propose building on our prior work to create a new architecture for program analysis notifications that enables toolsmiths to create scalable and understandable notifications for a variety of program analysis tools.

NeTS: Small: Collaborative Research: Creating Semantically-Enabled Programmable Networked Systems (SERPENT)
Kemafor Ogan

$278,271 by National Science Foundation
10/ 1/2015 - 09/30/2018

The separation of control and data plane in SDN architectures helps merge packet and circuit paradigms into a single architecture and enables logical centralization of the control function. This enables new thinking about solutions to path optimization problems frequently encountered in networking, from routing to traffic engineering. The SERPENT project proposes to develop effective solutions for representing, storing and manipulating network state using rich semantic models such that path and topology embedding problems can be solved using a semantic database framework. This will simplify creation of novel network control and management systems able to cope with increasingly complex user requirements.
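
To make the representation concrete, here is a toy triple-based model of network state with a reachability query; the vocabulary and node names are illustrative, not the project's actual semantic model:

    TRIPLES = {
        ("switchA", "linkedTo", "switchB"),
        ("switchB", "linkedTo", "switchC"),
        ("switchB", "capacityGbps", 100),
    }

    def neighbors(node):
        return [o for s, p, o in TRIPLES if s == node and p == "linkedTo"]

    def path_exists(src, dst, seen=None):
        """Depth-first reachability over 'linkedTo' triples."""
        if src == dst:
            return True
        seen = (seen or set()) | {src}
        return any(path_exists(n, dst, seen)
                   for n in neighbors(src) if n not in seen)

    print(path_exists("switchA", "switchC"))  # True

Storing topology, capacity, and policy facts in one uniform triple form is what lets a semantic database answer path and embedding queries alongside ordinary metadata queries.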

REU Site: Science of Software
Christopher Parnin ; Emerson Murphy-Hill ; Sarah Heckman

$355,365 by National Science Foundation
01/ 1/2016 - 12/31/2018

There are not enough skilled data science researchers, particularly in software engineering. Hence, this REU Site in Science of Software (SOS) will engage undergraduates as data scientists studying exciting and meaningful SE research problems. Students work side-by-side with faculty mentors to gain experience in qualitative and quantitative research methods in SOS. Activities include dataset challenges, pair research, literature reviews, and presentations. Ultimately, each student works independently toward a published research result with their faculty mentors.

An Innovative Curriculum for Cybersecurity Education
Douglas Reeves ; Jason King

$234,839 by National Security Agency
04/ 3/2017 - 04/ 2/2018

N.C. State will develop, document, and deliver course materials for institutions wishing to educate and prepare Cybersecurity graduates to meet the workforce needs of the Federal government and industry, in support of the Cybersecurity National Action Plan. This curriculum addresses key elements of the identified knowledge, skill, and ability areas. Individual courses will be modularized and packaged for use by others, including syllabi, teaching materials, readings, and fully documented, hands-on student exercises. A course roadmap will be presented, along with a means for prospective students to self-assess their needs. An educator's workshop will be organized to train teachers in the use of these materials.

EAGER: Formal Models of Trainer Feedback for I-Learning Theoretical Guarantees
David Roberts

$70,043 by National Science Foundation
08/15/2016 - 12/31/2017

Machines learning from feedback is a well-studied problem. If robots and software systems can successfully adapt, they can remain useful in changing environments, in situations unanticipated at design time, and can take direction from human users. This proposal contributes a new algorithm, Income Learning (I-learning), that is designed to thrive in these scenarios. The emphasis of the work will be on theoretical and empirical analyses of how I-learning and existing temporal difference methods (that maximize the expected reward) differ in performance on a variety of tasks, and how I-learning is better able to take advantage of human teaching.
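
For reference, here is the temporal-difference baseline mentioned above in its simplest tabular form (a TD(0) value update toward expected discounted reward); I-learning itself is the project's contribution and is not reproduced here:

    def td0_update(V, s, r, s_next, alpha=0.1, gamma=0.95):
        """One TD(0) step toward the expected discounted reward."""
        V[s] = V.get(s, 0.0) + alpha * (r + gamma * V.get(s_next, 0.0) - V.get(s, 0.0))

    V = {}
    td0_update(V, s="before_feedback", r=1.0, s_next="after_feedback")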

CPS: Synergy: Integrated Sensing and Control Algorithms for Computer-Assisted Training (Computer Assisted Training Systems (CATS) for Dogs)
David Roberts ; Alper Bozkurt (ECE) ; Barbara Sherman (CVM)

$1,029,403 by National Science Foundation
10/ 1/2013 - 09/30/2018

We propose to develop tools and techniques that will enable more effective two-way communication between dogs and handlers. We will work to create non-invasive physiological and inertial measuring devices that will transmit real-time information wirelessly to a computer. We will also develop technologies that will enable the computer to train desired behaviors using positive reinforcement without the direct input from humans. We will work to validate our approach using laboratory animals in the CVM as well as with a local assistance dog training organization working as a consultant.

Scalable Data Management, Analysis, and Visualization (SDAV) Institute
Nagiza Samatova ; Anatoli Melechko

$750,000 by US Department of Energy
02/15/2012 - 02/14/2018

SDAV is a unique and comprehensive combination of scientific data management, analysis, and visualization expertise and technologies aimed at enabling scientific knowledge discovery for applications running on state-of-the-art computational platforms located at DOE's primary computing facilities. This integrated institute focuses on tackling key challenges facing applications in our three focus areas through a well-coordinated team and management organization that can respond to changing technical and programmatic objectives. The proposed work portfolio is a blend of applied research and development aimed at having key software services operate effectively on large distributed-memory multi-core and many-core platforms, especially DOE's open high-performance computing facilities. Our goal is to create an integrated, open-source, sustainable framework and software tools for the science community.

Consortium for Nonproliferation Enabling Capabilities
Nagiza Samatova, co-PI ; Robin Gardner (Nuclear Engineering)

$9,744,249 by US Department of Energy
07/31/2014 - 07/30/2019

NC State University, in partnership with University of Michigan, Purdue University, University of Illinois at Urbana Champaign, Kansas State University, Georgia Institute of Technology, NC A&T State University, Los Alamos National Lab, Oak Ridge National Lab, and Pacific Northwest National Lab, proposes to establish a Consortium for Nonproliferation Enabling Capabilities (CNEC). The vision of CNEC is to be a pre-eminent research and education hub dedicated to the development of enabling technologies and technical talent for meeting the grand challenges of nuclear nonproliferation in the next decade. CNEC research activities are divided into four thrust areas: 1) Signatures and Observables (S&O); 2) Simulation, Analysis, and Modeling (SAM); 3) Multi-source Data Fusion and Analytic Techniques (DFAT); and 4) Replacements for Potentially Dangerous Industrial and Medical Radiological Sources (RDRS). The goals are: 1) identify and directly exploit signatures and observables associated with special nuclear material (SNM) production, storage, and movement; 2) develop simulation, analysis, and modeling methods to identify and characterize SNM and facilities processing SNM; 3) apply multi-source data fusion and analytic techniques to detect nuclear proliferation activities; and 4) develop viable replacements for potentially dangerous existing industrial and medical radiological sources. In addition to research and development activities, CNEC will implement educational activities with the goal of developing a pool of future nuclear nonproliferation and other nuclear security professionals and researchers.

Lecture Hall Polytopes, Inversion Sequences, and Eulerian Polynomials
Carla Savage

$30,000 by Simons Foundation
09/ 1/2012 - 08/31/2018

Over the past ten years, lecture hall partitions have emerged as fundamental structures in combinatorics and number theory, leading to new generalizations and new interpretations of several classical theorems. This project takes a geometric view of lecture hall partitions and uses polyhedral geometry to investigate their remarkable properties.
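
For reference, a minimal statement of the central objects (the standard definition, due to Bousquet-Mélou and Eriksson):

    % A lecture hall partition of length n is a sequence of nonnegative
    % integers lambda = (lambda_1, ..., lambda_n) satisfying
    \[
      \frac{\lambda_1}{1} \le \frac{\lambda_2}{2} \le \cdots \le \frac{\lambda_n}{n}.
    \]
    % Lecture Hall Theorem: the number of such sequences with
    % lambda_1 + ... + lambda_n = N equals the number of partitions of N
    % into odd parts smaller than 2n.

The polyhedral view in this project studies the cone cut out by these n inequalities.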

SaTC: CORE: Small: Collaborative: A Broad Treatment of Privacy in Blockchains
Alessandra Scafuro

$249,922 by National Science Foundation (NSF)
09/ 1/2017 - 08/31/2020

A blockchain is a public, distributed, append-only database whose consistency is maintained by the combined work of users across the world rather than a single party, thus avoiding single points of failure and trust. The public nature of the blockchain, however, raises important privacy concerns. Existing work has partially addressed privacy concerns for the restricted case of blockchains used for financial transactions. As blockchains are set to be used in a variety of contexts, the proposed work will initiate a broad treatment of privacy definitions and provide constructions achieving new privacy goals that can be implemented across different blockchains.

Analysis of Hash-Based Signatures Schemes in the QROM
Alessandra Scafuro

$96,752 by Silicon Valley Community Foundation
12/ 1/2017 - 12/31/2018

Digital signatures are fundamental cryptographic building blocks that guarantee the authenticity and integrity of digital communications. Currently adopted signature schemes (such as ECDSA) leverage number-theoretic properties, and their security relies on the intractability of number-theoretic problems such as integer factorization and the discrete logarithm problem. Shor's algorithm, however, shows that such problems are practically solvable on quantum computers. The threat of quantum attacks triggered NIST's competition for the development and standardization of signature schemes that are secure in the presence of quantum adversaries. There are several candidates in this competition, and Cisco's research team is participating with the LMS scheme. The advantage of the LMS scheme over the other candidates is its simplicity. The goal of the proposed research is to develop a formal treatment of the post-quantum security of the LMS scheme.
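
To illustrate why hash-based designs are considered simple (security rests on the hash function alone, with no number theory), here is Lamport's classic one-time signature scheme in miniature; this is a textbook illustration, not the LMS scheme itself, which builds many-time signatures from related one-time constructions:

    import hashlib, os

    H = lambda b: hashlib.sha256(b).digest()

    def keygen(bits=256):
        """Secret key: two random values per message-hash bit; public key: their hashes."""
        sk = [(os.urandom(32), os.urandom(32)) for _ in range(bits)]
        pk = [(H(a), H(b)) for a, b in sk]
        return sk, pk

    def bits_of(msg, n=256):
        d = int.from_bytes(H(msg), "big")
        return [(d >> i) & 1 for i in range(n)]

    def sign(sk, msg):
        """Reveal one secret value per bit of the message hash."""
        return [pair[bit] for pair, bit in zip(sk, bits_of(msg))]

    def verify(pk, msg, sig):
        return all(H(s) == pair[bit] for s, pair, bit in zip(sig, pk, bits_of(msg)))

    sk, pk = keygen()
    sig = sign(sk, b"hello")          # each key pair must sign only ONE message
    print(verify(pk, b"hello", sig))  # True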

NeTS: Small: Fine-grained Measurement of Performance Metrics in the Internet of Things
Muhammad Shahzad

$449,999 by National Science Foundation
10/ 1/2016 - 09/30/2019

The PI proposes to develop a framework for passive and fine-grained measurement of performance metrics in the Internet of Things, which include both Quality of Service metrics, such as latency, loss, and throughput, and Resource Utilization metrics, such as power consumption, storage utilization, and radio on-time. Measurements of these performance metrics can be used reactively by network operators to perform tasks such as detecting and localizing offending flows that are responsible for causing delay bursts, throughput deterioration, or even power surges. These measurements can also be used proactively by network operators to locate and preemptively update any potential bottlenecks.

CSR:Small:Collaborative Research: Scalable Fine-Grained Cloud Monitoring for Empowering IoT
Muhammad Shahzad

$257,996 by National Science Foundation
09/15/2016 - 08/31/2019

Due to the rapid adoption of the cloud computing model, the size of data centers and the variety of cloud services are increasing at an unprecedented rate. As a result, fine-grained monitoring of the health and usage of data center resources is becoming increasingly important and challenging. In this work, we address the problem of efficiently acquiring and transporting cloud management and monitoring data. For data acquisition, we address the crucial challenge of controlling data size. For data transportation, we focus on efficiently moving the data from the point it is collected inside the data center to the point where it needs to be stored for processing.

CRII: CSR: Pervasive Gesture Recognition Using Ambient Light
Muhammad Shahzad

$174,878 by National Science Foundation
05/ 1/2016 - 04/30/2018

The PI proposes to use ambient light for recognizing human gestures. The intuition behind the proposed approach is that as a user performs a gesture in a lit room, the amount of light that he/she reflects and blocks changes, resulting in a change in the intensity of light in all parts of the room. This change can be measured, and the pattern of change in the intensity of light is different for different gestures. Leveraging this observation, the proposed approach first learns these patterns for different gestures and then recognizes the gestures in real time.
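
A minimal sketch of the recognition step, assuming nearest-neighbor matching over made-up light-intensity traces; the project's actual learning pipeline is more sophisticated:

    def distance(a, b):
        """Squared Euclidean distance between two equal-length traces."""
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def recognize(trace, templates):
        """templates: {gesture_name: [intensity samples]}; pick the closest."""
        return min(templates, key=lambda g: distance(trace, templates[g]))

    templates = {"swipe": [5, 3, 1, 3, 5], "push": [5, 4, 3, 2, 1]}
    print(recognize([5, 3, 2, 3, 5], templates))  # swipe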

CSR:Small:Supporting Position Independence and Reusability of Data on Byte-Addressable Non-Volatile Memory
Xipeng Shen

$499,998 by National Science Foundation (NSF)
08/16/2017 - 07/31/2020

Byte-Addressable Non-Volatile Memory (NVM) is the upcoming next generation of memory with tremendous potential benefits. This proposal is about offering programming system-level support of persistency on NVM. Particularly, it focuses on effective support of the usage of dynamic data structures on NVM.

Cognitive Computing-Based Compilation (YUE ZHAO)
Xipeng Shen

$10,846 by IBM Canada Limited
06/30/2016 - 06/29/2018

This project proposes to leverage IBM Watson-like cognitive computing engines for improving the efficacy of IBM commercial compilers (XLC/C++, XLFortran). Recent years have witnessed some exciting improvements in cognitive computing engines, as demonstrated by the increasing impact of IBM Watson. In this project, we propose to leverage such engines to allow compilers to automatically accumulate knowledge on appropriate ways to compile programs, learn from it, and then apply it to new programs. Success will remove the difficulty for users of choosing appropriate compilation flags, largely reduce the tuning effort required from users, and help compilers make better decisions to produce high-quality code.

Data Locality Enhancement of Dynamic Simulations for Exascale Computing
Xipeng Shen

$409,214 by US Department of Energy
06/15/2015 - 06/14/2018

Computer simulation is important for scientific research in many disciplines. Many such programs are complex and transfer a large amount of data in a dynamically changing pattern. Memory performance is key to maximizing computing efficiency in the era of Chip Multiprocessors (CMP) due to the growing disparity between slowly expanding memory bandwidth and the rapidly increasing demand for data by processors. The importance is underlined by the trend towards exascale computing, in which the processors are each expected to contain hundreds or thousands of (heterogeneous) cores. Unfortunately, today’s computer systems lack support for a high degree of memory transfer. This project proposes to improve the memory performance of dynamic applications by developing two new techniques that are tailored especially to the emerging features of CMP. The first technique is asynchronous streamlining, which analyzes the memory reference patterns of an application during runtime and regulates both control flows and memory references on the fly. The second technique is neighborhood-aware locality optimization, which concentrates on the non-uniform relations among computing elements. This research will produce a robust tool for scientific users to enhance program locality on multi- and many-core systems in a way that is not possible with existing tools. Further, it will contribute to the advancement of computational sciences and promote academic research and education in the challenging field of scientific computing.
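
As a generic illustration of locality optimization for dynamic simulations (not the project's specific techniques), reordering particles by spatial cell makes neighbors in space neighbors in memory, so neighbor-pair computations walk contiguous data:

    def cell_of(pos, cell_size):
        """Map a 2D position to its containing grid cell."""
        return (int(pos[0] // cell_size), int(pos[1] // cell_size))

    def reorder_by_cell(particles, cell_size=1.0):
        """particles: [(x, y)]; return them sorted by containing cell."""
        return sorted(particles, key=lambda p: cell_of(p, cell_size))

    scattered = [(9.7, 0.2), (0.1, 0.3), (9.9, 0.1), (0.2, 0.4)]
    print(reorder_by_cell(scattered))  # near-in-space points now adjacent

Because the particles move, such a reordering must be repeated as the simulation evolves, which is precisely why runtime (rather than compile-time-only) locality techniques matter here.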

Cognitive Computing-Based Compilation (YUE ZHAO)
Xipeng Shen

$22,500 by IBM Canada Limited
01/ 1/2017 - 12/31/2017

This project proposes to leverage IBM Watson-like cognitive computing engines to improve the efficacy of IBM commercial compilers (XLC/C++, XLFortran) and to enable intelligent compilation-as-a-service on the cloud.

Exploring Deep Learning and Ontology-Based Framework
Xipeng Shen

$30,000 by Lawrence Livermore National Laboratory via US Dept of Energy
06/20/2017 - 12/31/2017

This work is to explore the potential of deep learning in optimizing sparse matrix computations, and to complete an ontology-based framework with a web interface for enabling effective knowledge accumulation and knowledge sharing for HPC.

SHF: Small: Improving Memory Performance on Fused Architectures through Compiler and Runtime Innovations
Xipeng Shen ; Frank Mueller

$470,000 by National Science Foundation
08/ 1/2015 - 07/31/2018

Contemporary architectures are adopting an integrated design of conventional CPUs with accelerators on the same die with access to the same memory, albeit with different coherence models. Examples include AMD's Fusion architecture, Intel's integrated main-stream CPU/GPU product line, and NVIDIA Tegra's integrated graphics processor family. Integrated GPUs feature shared caches and a common memory interconnect with multicore CPUs, which intensify resource contention in the memory hierarchy. This creates new challenges for data locality, task partitioning and scheduling, as well as program transformations. Most significantly, a program running on GPU warps and CPU cores may adversely affect the performance and power of one another. The objective of this work is to understand these novel implications of fused architectures by studying their effects, qualifying their causes, and quantifying the impacts on performance and energy efficiency. We propose to advance the state of the art by creating spheres of isolation between CPU and GPU execution via novel systems mechanisms and compiler transformations that reduce cross-boundary contention with respect to shared hardware resources. This synergy between systems and compiler techniques has the potential to significantly improve performance and power guarantees for co-scheduling program fragments on fused architectures. If successful, the proposed work has the potential to transform resource allocation and scheduling at the systems level and compiler optimizations at the program level, creating a synergistic development environment with significant performance and power improvements and vastly increased isolation suitable for co-deployment of programs crossing boundaries on innovative fused architectures.

Realizing Cyber Inception: Toward a Science of Personalized Deception for Cyber Defense
Munindar Singh

$375,360 by University of Southern California via US Army Research Office
09/ 1/2017 - 12/31/2017

Frequent security breaches have highlighted both the growing importance of cybersecurity and weaknesses of traditional methods such as firewalls, malware detection, intrusion detection, and prevention technologies. To leap ahead of attackers, we must move beyond passive defense strategies toward a new science of interactive personalized deception for cyberdefense. Our proposed approach involves (1) building models of attackers and their propensities and (2) characterizing computers, networks, users, and their relationships and interactions so as to enable realistic deception. We will develop a modular framework for evaluation of the key deception techniques consisting of a pluggable game-based scaffolding.

SHF: Small: Supporting Regular Expression Testing, Search, Repair, Comprehension, and Maintenance
Kathryn Stolee

$499,996 by National Science Foundation (NSF)
08/15/2017 - 07/31/2020

Regular expressions (regexes) are responsible for numerous faults in many software products, and yet, static bug finders and automated program repair techniques generally ignore this common language feature. First, I propose to explore and characterize regex-related bugs in bug repositories. From there, I propose to develop approaches for detecting regex-related bugs using static analysis and patching regex-related bugs using automated program repair. The proposed detection and patching techniques both depend on similarity analysis of regexes. The expected research outcomes include a publicly available data set of regex-related faults, new regex-related bug patterns for static bug finders like FindBugs and PMD, and in the best case, an open source tool for automated patch generation for regular expressions.
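
One plausible way to realize the similarity analysis, offered as an assumption rather than the proposal's actual technique, is to compare the matching behavior of two regexes on a pool of sample strings:

    import re

    def behavioral_similarity(p1, p2, samples):
        """Jaccard similarity of the sets of sample strings each regex matches."""
        m1 = {s for s in samples if re.fullmatch(p1, s)}
        m2 = {s for s in samples if re.fullmatch(p2, s)}
        union = m1 | m2
        return len(m1 & m2) / len(union) if union else 1.0

    samples = ["a1", "ab", "12", "a", "1"]
    print(behavioral_similarity(r"[a-z]\d", r"[a-z][0-9]", samples))  # 1.0
    print(behavioral_similarity(r"[a-z]\d", r"\w\d", samples))        # 0.5

Behaviorally near-identical regexes are natural candidates both for clustering faulty patterns and for mining replacement patches.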

SHF: Medium: Collaborative Research: Semi and Fully Automated Program Repair and Synthesis via Semantic Code Search
Kathryn Stolee

$387,661 by National Science Foundation
07/ 1/2016 - 06/30/2020

Software plays an integral role in our society. However, software bugs are common, routinely cause security breaches, and cost our economy billions of dollars annually. The software industry struggles to overcome this challenge: Software is so inherently complex, and mistakes so common, that new bugs are typically reported faster than developers can fix them. Recent research has demonstrated the potential of automated program repair techniques to address this challenge. However, these techniques often produce low-quality repairs that break existing functionality. In this research, we develop new techniques to fix bugs and implement new features automatically, producing high-quality code.

Algorithms for Exploiting Approximate Network Structure (Research Area 10: Network Science)
Vida Blair Sullivan

$538,199 by US Army-Army Research Office
05/15/2017 - 05/14/2020

We propose a new framework for efficient, robust, and noise-tolerant network algorithms that guarantee near-optimal solutions to NP-hard problems by exploiting structure inherent in real-world networks. We model networks as consisting of a majority that belongs to a structural graph class, plus a few deviations resulting from measurement errors, unusual behaviors, and/or unexplained exceptions. We will develop algorithms that exploit this more approximate form of graph structure and guarantee near-optimal solutions and polynomial running time for any network that is “close” to a structural graph class, initially focusing on hierarchical/tree-like networks (e.g., those arising in biology and social behavior).

Moore Foundation Data-Driven Discovery Investigator
Vida Blair Sullivan

$1,500,000 by Gordon and Betty Moore Foundation
11/10/2014 - 12/ 1/2019

Understanding and identifying intermediate-scale structure is key to designing robust tools for data analysis, just as the interdependence of local interactions and global behavior is key in many science domains. We thus focus on constructing a theory and tools for using this structure to improve analysis and identification of relationships in massive graph data. Through careful integration of tools from graph theory, computational complexity, statistics, and parallel algorithm design, the proposed work will derive novel measures of graph similarity based on structural representations and application-inspired features of interest. We will design efficient, scalable sampling algorithms which leverage inherent sparsity and structure to de-noise and improve accuracy of parameter estimation. As a specific example of science domain impact, we focus on improving understanding of the brain. Applying our new tools for characterizing graph-theoretic structure in such networks, scientists will be able to build higher fidelity models of brain network formation and evolution. Additionally, efficient algorithms from the associated parameterized framework will enable rapid comparison of regions and identification of discrepancies, abnormalities, and influential components for specific tasks.

CRII: CSR: Rethinking the FTL in SSDs -- a file translation layer instead of a flash translation layer
Hung-Wei Tseng

$174,998 by National Science Foundation
03/15/2017 - 02/28/2019

SSDs (solid state drives) have become popular in all kinds of computing systems. However, these systems still use the existing block interface to manage SSDs, resulting in multiple layers of indirection, under-utilized parallelism inside the SSD, overheads for in-storage computing, and difficulties in sharing files among heterogeneous computing devices. This work will reshape the current system stack and simplify the software/hardware interface for SSDs by having the SSD directly map files into physical block addresses. This work will demonstrate the effect of the proposed system in applications ranging from file systems to datacenter storage and virtualized machines.
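
A toy model of the core idea: the device maps a file directly to physical block addresses, so a byte offset resolves to a physical block without the traditional block-layer indirection (the structures below are simplified illustrations, not the proposed design):

    class FileTranslationLayer:
        def __init__(self):
            self.table = {}        # file_id -> list of physical block addresses

        def create(self, file_id, pbas):
            self.table[file_id] = list(pbas)

        def resolve(self, file_id, offset, block_size=4096):
            """Translate a byte offset in a file straight to a physical block."""
            return self.table[file_id][offset // block_size]

    ftl = FileTranslationLayer()
    ftl.create("log.txt", pbas=[880, 881, 1024])
    print(ftl.resolve("log.txt", offset=5000))  # 881

Collapsing the file-system extent map and the flash translation layer into one table is what removes the redundant indirection the abstract describes.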

Property Rights and Land Tenure in the Slums of Bangalore
Ranga Vatsavai

$21,237 by Duke University via Omidyar Network
01/ 1/2017 - 02/ 8/2018

Slums have become an inescapable feature of cities in the developing world, and the number of people living in slums has increased rapidly, coming close to 1 billion and rising higher (UN-Habitat 2010). Relatively little is known, however, about patterns of slum development over periods of time and about factors associated with progressive improvements. One of the objectives of this research is to develop a prototype methodology for semi-automatic slum identification and categorization that can speedily and reliably be adapted for use in other cities.

In Situ Summarization for Existing Petascale and Future Exascale HPC Infrastructures (In-Situ Descriptive and Feature-Based Summarization)
Ranga Vatsavai ; Nagiza Samatova

$45,060 by Kitware, Inc. via US Army
07/ 1/2017 - 12/31/2017

Datasets being generated by experiment and simulation today are increasingly large, and nations across the world – including China, the United States, Europe, and Japan – have all invested heavily in developing computers capable of processing or generating these datasets. These datasets come from applications in many areas and are driven by national security issues as well as industries of strategic value. Large-scale computing is seen as driving technological developments in biology and biomedicine, high-energy physics (a key to stockpile stewardship), and materials science. All of these are areas where the United States has traditionally led the world. However, recent developments have placed China as the leader in building large-scale computers, and there is a concern that this could result in the loss of a leadership role for the United States in many of the related technologies. With this in mind, the United States has placed a renewed emphasis on developing exascale computer platforms and, strategically, on the development of algorithms that can make use of large computers to support decisions in science and engineering – an area where the United States arguably still leads the world. In this proposal, we propose adapting Kitware’s Catalyst and Cinema platforms to perform new summarization tasks, including compression, scalably on these new architectures. To demonstrate the effectiveness of these summarizations for this Phase I project, we will adapt a simulation program to use Catalyst for in-situ processing – saving only the dynamic summarization – in order to provide stakeholders with the information necessary to make a decision and be confident in the simulation process.

Triangle Computer Science Distinguished Lecture Series
Mladen Vouk

$20,100 by Duke University (US Army - Army Research Office)
01/ 1/2014 - 12/31/2017

Since 1995, the Triangle Computer Science Distinguished Lecturer Series (TCSDLS) has been hosting influential university researchers and industry leaders from computer-related fields as speakers at the three universities within the Research Triangle area. The lecturer series, sponsored by the Army Research Office (ARO), is organized and administered by the Computer Science departments at Duke University, NC State University, and the University of North Carolina at Chapel Hill. This proposal argues for the continuation, for an additional three years, of this highly successful lecturer series, which is led by Duke University.