Current Research Projects (by faculty)

The projects listed below are currently funded and active; the running total of funding for the active projects appears in the left navigation menu.

Collaborative Research: Integrating Computing in STEM: Designing, Developing and Investigating a Team-based Professional Development Model for Middle- and High-School Teachers
Tiffany Barnes

$861,773 by National Science Foundation (NSF)
09/01/2017 - 08/31/2020

Integrated Snap! is a comparison study between the traditional single-teacher professional development (PD) model and a community-of-practice PD model in which teachers attend as a group. The project brings together CTE/CS, math, engineering, and science teachers to foster the integration of the Computer Science Principles curriculum with the science curriculum in high school classrooms. The professional development will follow Borko’s (2004) phases. We will develop a PD model centered on helping content teachers learn best practices for integrating computing into their classrooms; that PD will be piloted, replicated, and then modified for other sites and contexts. Teachers will work together to build simulation and programming tools, and corresponding classroom activities, for exploring computational concepts in the context of their discipline. An integral part of Integrated Snap! is having teams of teachers from the same school or district attend the PD together, building a community among the teachers and their students doing similar work across their classrooms.

REU Site: Socially Relevant Computing and Analytics
Tiffany Barnes

$359,997 by National Science Foundation
03/01/2017 - 02/29/2020

The REU Site at NC State University will immerse a diverse group of undergraduates in a vibrant research community of faculty and graduate students building and analyzing cutting-edge human-centric applications including games, tutors, and mobile apps. We will recruit students from underrepresented groups and colleges and universities with limited research opportunities through the STARS Computing Corps, an NSF-funded national consortium of institutions dedicated to broadening participation in computing. Using the Affinity Research Groups and STARS Training for REUs models, we will engage faculty and graduate student mentors with undergraduates to create a supportive culture of collaboration while promoting individual contributions to research through just-in-time training for both mentors and students throughout the summer.

Track 2: CS10K: BJC-STARS: Scaling CS Principles through STARS Community & Leadership Development
Tiffany Barnes

$500,000 by National Science Foundation
10/01/2015 - 09/30/2018

BJC-STARS is a CS10K proposal to broaden access to Computer Science education through engaging colleges and universities to prepare and support regional communities of high school teachers to teach the Beauty and Joy of Computing (BJC) Computer Science Principles course. We will leverage the successful STARS model focusing on engaging faculty and students in a community committed to leading efforts to broaden participation in computing. Each year, we will engage new university faculty who will teach BJC and facilitate professional development and support to high school teachers and students. We will also build a STARS community among participating high school teachers and students, engaging them in the need to broaden participation in computing.

Type I: Collaborative Research: FRABJOUS CS - Framing a Rigorous Approach to Beauty and Joy for Outreach to Underrepresented Students in Computing at Scale
Tiffany Barnes

$352,831 by National Science Foundation
02/01/2013 - 08/31/2018

In this FRABJOUS CS project, we will prepare 60 high school teachers to teach the Beauty and Joy of Computing (BJC) Computer Science Principles curriculum. The BJC course is a rigorous introductory computing course that highlights the meaning and applications of computing while introducing the low-threshold programming languages Snap! (a Scratch-based language), GameMaker, and AppInventor. BJC is informed and inspired by the Exploring Computer Science curriculum, which was explicitly designed to channel the interests of urban high school students with a "culturally relevant and meaningful curriculum" [Goode 2011][Margolis 2008]. The BJC course uses collaborative classroom methods, including pair learning, and student-selected projects geared toward leveraging students' knowledge of social media, games, devices, and the internet. At UNC Charlotte in 2010 and 2011, PI Barnes engaged college students in supporting the BJC course, and in after-school outreach and summer camps that excite middle and high school students about this curriculum at different levels. The project engages three university faculty members and six college students to help the high school teachers build a Computer Science Teachers Association chapter and to provide ongoing professional development and support for the BJC course. The project also engages high school teachers and an education researcher to help refine and enrich the BJC curriculum so that it is easier to adopt and teach in high schools.

Type I: Collaborative Research: FRABJOUS CS - Framing a Rigorous Approach to Beauty and Joy for Outreach to Underrepresented Students in Computing at Scale (Supplement)
Tiffany Barnes

$86,000 by NSF
02/01/2013 - 08/31/2018

In this FRABJOUS CS project, we will prepare 60 high school teachers to teach the Beauty and Joy of Computing (BJC) Computer Science Principles curriculum. The BJC course is a rigorous introductory computing course that highlights the meaning and applications of computing while introducing the low-threshold programming languages Snap! (a Scratch-based language), GameMaker, and AppInventor. BJC is informed and inspired by the Exploring Computer Science curriculum, which was explicitly designed to channel the interests of urban high school students with a "culturally relevant and meaningful curriculum" [Goode 2011][Margolis 2008]. The BJC course uses collaborative classroom methods, including pair learning, and student-selected projects geared toward leveraging students' knowledge of social media, games, devices, and the internet. At UNC Charlotte in 2010 and 2011, PI Barnes engaged college students in supporting the BJC course, and in after-school outreach and summer camps that excite middle and high school students about this curriculum at different levels. The project engages three university faculty members and six college students to help the high school teachers build a Computer Science Teachers Association chapter and to provide ongoing professional development and support for the BJC course. The project also engages high school teachers and an education researcher to help refine and enrich the BJC curriculum so that it is easier to adopt and teach in high schools.

Collaborative Research: Modeling Social Interaction and Performance in STEM Learning
Tiffany Barnes

$200,003 by National Science Foundation
09/01/2014 - 08/31/2018

Despite long-standing awareness that social interaction is an integral part of knowledge construction, efforts to study complex collaborative learning have traditionally been relegated to qualitative and small-scale methodologies. Relatively new data traces left by online learning environments, including massive open online courses (MOOCs), offer the first real hope for scaling up such analyses. The purpose of the proposed research is to develop comprehensive models for collaborative learning which in turn will enable instructional design and the authentic assessment of the individual within the group context. This task is undertaken by an interdisciplinary team of researchers with specializations in natural language processing, discourse analysis, social network analysis, educational data mining and psychometrics.

Type I: Collaborative Research: FRABJOUS CS - Framing a Rigorous Approach to Beauty and Joy for Outreach to Underrepresented Students in Computing at Scale (Supplement)
Tiffany Barnes

$565,874 by National Science Foundation
09/01/2016 - 08/31/2018

In recent decades, coding has evolved from a professional activity of a few million developers to a near universally needed skill. However, there are still fewer than 2,000 teachers prepared to teach computer science to high school students. In this supplement we propose to engage 240 teachers in NC districts, and an additional 1,000 online, in BJC professional development.

EXP: Data-driven Support for Novice Programmers
Tiffany Barnes ; Min Chi

$549,874 by National Science Foundation
09/01/2016 - 08/31/2019

While intelligent tutors have been shown to increase student learning in programming and other domains, and creative, exploratory programming environments are assumed to promote novice interest and motivation to learn to program, there are no environments that provide both creative tasks and intelligent support. We propose to extend our methods for data-driven hint generation, model tracing, and knowledge tracing to augment Snap and Java programming environments to be more supportive for novice programmers doing open-ended creative tasks.
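As a rough illustration of the data-driven hint idea described above (a sketch, not the project's actual implementation), prior students' solution paths can be pooled into a state-transition graph, and a struggling student can then be nudged toward the successor state most often taken from their current state. The program states and paths below are entirely hypothetical:

```python
from collections import defaultdict

def build_hint_graph(solution_paths):
    """Count observed transitions between program states across prior students."""
    transitions = defaultdict(lambda: defaultdict(int))
    for path in solution_paths:
        for current, nxt in zip(path, path[1:]):
            transitions[current][nxt] += 1
    return transitions

def next_step_hint(transitions, state):
    """Suggest the successor state most often taken by prior students, if any."""
    successors = transitions.get(state)
    if not successors:
        return None
    return max(successors, key=successors.get)

# Hypothetical paths: each is a sequence of (simplified) program states.
paths = [
    ["start", "loop", "loop+if", "done"],
    ["start", "loop", "loop+print", "loop+if", "done"],
    ["start", "if", "loop+if", "done"],
]
graph = build_hint_graph(paths)
print(next_step_hint(graph, "start"))  # most prior students went to "loop"
```

A real system would also need a canonical representation of program states (e.g., normalized abstract syntax trees) so that superficially different student programs map to the same node.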

Scaling a Rigorous CS Principles Curriculum: A Supplement to Beauty and Joy of Computing in New York City STEM-C MSP
Tiffany Barnes ; Glenn Kleiman

$568,967 by Education Development Center via National Science Foundation
01/01/2017 - 12/31/2018

In recent decades, coding has evolved from a professional activity of a few million developers to a near universally needed skill. However, there are still fewer than 2,000 teachers prepared to teach computer science to high school students. In this supplement we propose to engage 162 teachers in BJC professional development and Train the Trainer workshops.

Evaluation For Actionable Change: A Data-Driven Approach
Tiffany Barnes, Co-PI ; Collin Lynch, Co-PI

$799,837 by National Science Foundation
01/01/2016 - 12/31/2018

Randomized controlled trials (RCTs) are expensive and often show small effects. Even RCTs of widely adopted digital learning platforms can show disappointing results, and these results have little impact on subsequent adoptions of programs already entrenched in the educational landscape. New methods are needed both to estimate effects and to indicate ways of improving outcomes for already-adopted digital learning tools. With platforms currently in wide-scale use, novel approaches to assessing use patterns and their relations with outcomes can both evaluate maximal effectiveness and provide means for improved effectiveness. Our study will use a data-driven approach to identify patterns of both student use and teacher implementation in the widely adopted software Spatial Temporal Mathematics (ST Math). By linking these patterns with important learning and motivational outcomes, we can form recommendations regarding promising actions teachers and administrators can take in implementing ST Math, and refinements program developers can make in guiding students toward successful patterns. This work has the potential not only to transform use and success of the platform studied, but to create methods that can be refined and transferred to the study and implementation of other platforms.
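A minimal sketch of the "link use patterns to outcomes" step might look like the following. Everything here is illustrative: the feature names, the pattern-labeling rule, the thresholds, and the numbers are invented for the example, not drawn from ST Math data.

```python
# Hypothetical per-student logs: (minutes_per_week, retry_ratio, posttest_gain).
students = [
    (90, 0.6, 12), (85, 0.5, 10), (20, 0.1, 2),
    (95, 0.7, 14), (15, 0.2, 1), (25, 0.1, 3),
]

def usage_pattern(minutes, retry_ratio):
    """Assign a coarse use-pattern label from two log-derived features."""
    if minutes >= 60 and retry_ratio >= 0.4:
        return "sustained-persistent"
    return "light-use"

def mean_gain_by_pattern(rows):
    """Link each use pattern to its average learning outcome."""
    totals, counts = {}, {}
    for minutes, retry, gain in rows:
        p = usage_pattern(minutes, retry)
        totals[p] = totals.get(p, 0) + gain
        counts[p] = counts.get(p, 0) + 1
    return {p: totals[p] / counts[p] for p in totals}

print(mean_gain_by_pattern(students))
```

In practice the patterns would be discovered by clustering rather than hand-written rules, and the outcome comparison would control for prior achievement; this sketch only shows the shape of the pattern-to-outcome linkage.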

CAREER: Improving Adaptive Decision Making in Interactive Learning Environments
Min Chi

$547,810 by National Science Foundation
03/01/2017 - 02/28/2022

For many forms of interactive environments, the system's behaviors can be viewed as a sequential decision process wherein, at each discrete step, the system is responsible for selecting the next action to take from a set of alternatives. The objective of this CAREER proposal is to learn robust interaction strategies that will lead to desirable outcomes in complex interactive environments. The central idea of this project is that strategies should not only be effective in complex interactive environments but they should also be efficient, focusing solely on the key features of the domain and the crucial decision points. These are the features and decisions that are not only associated with desirable outcomes, but without which the desirable outcomes are unlikely to occur.

DIP: Integrated Data-driven Technologies for Individualized Instruction in STEM Learning Environments
Min Chi ; Tiffany Barnes

$1,999,438 by National Science Foundation
08/15/2017 - 07/30/2022

In this proposed work, our goal is to automatically design effective personalized intelligent tutoring systems (ITSs) directly from log data. We will combine Co-PI Dr. Barnes's data-driven approach to learning what to teach with PI Dr. Chi's data-driven work on learning how to teach. More specifically, we will explore three important undergraduate STEM domains: discrete math, probability, and programming.

Educational Data Mining for Individualized Instruction in STEM Learning Environments
Min Chi ; Tiffany Barnes

$639,401 by National Science Foundation
09/01/2014 - 08/31/2018

Human one-on-one tutoring is one of the most effective educational interventions. Tutored students often perform significantly better than students in classroom settings (Bloom, 1984; Cohen, Kulik, & Kulik, 1982). Computer learning environments that mimic aspects of human tutors have also been highly successful: Intelligent Tutoring Systems (ITSs) have been shown to be highly effective in improving students' learning in real classrooms (Anderson, Corbett, Koedinger, & Pelletier, 1995; Koedinger, Anderson, Hadley, & Mark, 1997; VanLehn et al., 2005). The development of ITSs has enabled schools and universities to reach out and educate students who otherwise would be unable to take advantage of one-on-one tutoring due to cost and time constraints (Koedinger, Anderson, Hadley, & Mark, 1997). Despite the high payoffs provided by ITSs, significant barriers remain: high development costs and the challenges of knowledge engineering have prevented their widespread deployment. A diverse team of software developers, domain experts, and educational theorists is required for development, testing, and even maintenance, and building an ITS requires an average of roughly 80 person-hours per hour of tutoring content. In this proposed work, our goal is to automatically design effective personalized ITSs directly from log data. We will combine Co-PI Dr. Barnes's data-driven approach to learning what to teach with PI Dr. Chi's data-driven work on learning how to teach. More specifically, we will explore two important undergraduate STEM domains, discrete math and probability, and employ two types of ITSs: an example-based ITS, the discrete math tutor, and a rule-based ITS, Pyrenees. The former can automatically generate hints directly from students' prior solutions, while the latter has hard-coded domain rules and teaches students a domain-general problem-solving strategy within the context of probability.
For learning how to teach, we will apply reinforcement learning to induce adaptive pedagogical strategies directly from students' log files, focusing on three levels of pedagogical decisions: 1) whether or not to give students hints, and at what level of detail (step level); 2) whether to show students a worked example or ask them to engage in problem solving (problem level); and 3) whether or not to teach students a meta-cognitive problem-solving strategy (metacognitive level).
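To make the induction step concrete, here is a deliberately simplified, bandit-style sketch of learning a pedagogical decision (worked example vs. problem solving) from logged outcomes. The full project uses reinforcement learning over sequences of decisions; this one-step version, with invented states, actions, and gains, only illustrates the "estimate value from logs, then pick the best action" pattern:

```python
from collections import defaultdict

def estimate_action_values(log):
    """Average observed outcomes per (state, action) pair from tutoring logs."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for state, action, outcome in log:
        sums[(state, action)] += outcome
        counts[(state, action)] += 1
    return {k: sums[k] / counts[k] for k in sums}

def induced_policy(values, state, actions):
    """Choose the action with the highest estimated outcome in this state."""
    return max(actions, key=lambda a: values.get((state, a), float("-inf")))

# Hypothetical log: (student state, pedagogical action, normalized learning gain).
log = [
    ("low-knowledge", "worked-example", 0.8),
    ("low-knowledge", "problem-solving", 0.3),
    ("low-knowledge", "worked-example", 0.7),
    ("high-knowledge", "problem-solving", 0.9),
    ("high-knowledge", "worked-example", 0.4),
]
values = estimate_action_values(log)
print(induced_policy(values, "low-knowledge", ["worked-example", "problem-solving"]))
```

A sequential version would propagate value backward through each tutoring session (e.g., Q-learning over step-, problem-, and metacognitive-level decisions) rather than averaging immediate outcomes.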

Using Real-Time Multichannel Self-Regulated Learning Data to Enhance Student Learning and Teachers' Decision-Making with MetaDash
Min Chi, Co-PI ; Roger Azevedo (Psychology) ; Soonhye Park, Co-PI (Education)

$914,585 by National Science Foundation
04/01/2017 - 03/31/2020

This 3-year project will focus on laboratory and classroom research in the Raleigh, Chapel Hill, and Durham areas. A team of interdisciplinary researchers from NCSU's Department of Psychology (Dr. Roger Azevedo), Computer Science (Dr. Min Chi), and STEM Education (Dr. Soonhye Park) will conduct empirical and observational research aimed at improving teachers' decision-making based on their analyses of students' real-time, multi-channel self-regulated learning data. We will use multi-channel data to understand the nature of self-regulatory processes in students using MetaTutor to learn challenging science topics (e.g., human biological systems). This will be accomplished by aligning and conducting complex computational and statistical analyses of a multitude of trace (e.g., log files, eye tracking, facial expressions of emotion), behavioral (e.g., human-pedagogical agent dialogue moves), and physiological (e.g., EDA) measures, together with learning outcomes and classroom data (e.g., teacher-student interactions, gaze behavior of teachers' attention, and use of data presented by the visualization tool). The proposed research, in the context of using MetaTutor and a visualization tool for teachers, is extremely challenging and will help us better understand the nature and temporal dynamics of these processes in classroom contexts and how they contribute to various types of learning and use of self-regulatory skills, and will provide an empirical basis for designing an intelligent teacher analytics tool.
The results from this grant will contribute significantly to models of social and cognitive bases of student-machine-teacher interactions; statistical and computational methods used to make inferences from complex multi-channel data; theoretical and conceptual understanding of temporally-aligned data streams; enhancing students’ understanding of complex science topics by making more sensitive and intelligent advanced learning technologies; and, enhanced understanding of how teachers use real-time student data to enhance their instructional decision-making, based on data presented in teacher analytic tools.

SCH: INT: Collaborative Research: S.E.P.S.I.S.: Sepsis Early Prediction Support Implementation System
Min Chi, Co-PI ; Julie Ivy, PI

$834,725 by National Science Foundation
10/01/2015 - 09/30/2018

Every year approximately 700,000 people die in US hospitals. In 16% of these deaths, the first diagnosis at death was septicemia, one of the most common delayed diagnoses associated with inpatient death; sepsis is one of the ten leading causes of death, and it is among the most common diagnoses for which effective treatment is delayed. While it is difficult to estimate how many of these deaths could have been averted or postponed if a better system of care were in place, it is widely recognized that sepsis patients have better outcomes if treated earlier. As opposed to wrong diagnoses, delayed diagnoses have historically not been considered adverse events, since there is no change in patient condition as a result of the care delivered. However, patients with delayed diagnoses do have worse outcomes than those who receive timely treatment. These diagnostic and/or treatment delays, with their associated inpatient mortality and long-term morbidity consequences, represent a significant and modifiable patient safety issue. Awareness of sepsis is low; many septic patients are under-diagnosed at an early stage, when aggressive treatment could still reverse the course of the infection. Early recognition and implementation of early goal-directed therapy improve outcomes and decrease mortality: for every one-hour delay in treating severe sepsis or septic shock with antibiotics, there is an incremental decrease in patient survival. For example, a delay in antibiotics of five hours decreases survival to 50%. We propose to take a data-driven, evidence-based approach that integrates computer science and industrial engineering to develop personalized sepsis diagnosis and treatment plans.
The goal of this research is to integrate electronic health records (EHRs) and clinical expertise to provide an evidence-based framework for diagnosing sepsis patients, accurately risk-stratifying patients within the sepsis spectrum, and developing intervention policies that inform sepsis treatment decisions. We will achieve this goal through three specific aims based on a longitudinal data set of EHRs from discharged patients of Mayo Clinic Rochester and Christiana Care Health System hospitals.
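For flavor of what a rule-based early sepsis screen looks like, here is a sketch of the classic SIRS criteria (temperature, heart rate, respiratory rate, white blood cell count). This is purely illustrative of the screening idea, not the project's EHR-driven model and not clinical guidance; the two-criteria flag threshold is the textbook SIRS convention:

```python
def sirs_score(temp_c, heart_rate, resp_rate, wbc):
    """Count met SIRS criteria (a classic, simplified sepsis screen)."""
    score = 0
    if temp_c > 38.0 or temp_c < 36.0:   # fever or hypothermia
        score += 1
    if heart_rate > 90:                  # tachycardia
        score += 1
    if resp_rate > 20:                   # tachypnea
        score += 1
    if wbc > 12.0 or wbc < 4.0:          # abnormal white blood cell count
        score += 1
    return score

def flag_possible_sepsis(vitals):
    """Raise a flag when two or more criteria are met (textbook SIRS threshold)."""
    return sirs_score(**vitals) >= 2

patient = {"temp_c": 38.6, "heart_rate": 104, "resp_rate": 18, "wbc": 13.1}
print(flag_possible_sepsis(patient))  # three criteria met, so the flag is raised
```

The project's contribution is precisely to move beyond such fixed rules, learning risk stratification and intervention policies from longitudinal EHR data instead.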

IUCRC Pre-proposal Phase I NC State University: Center for Accelerated Real Time Analytics (CARTA)
Rada Chirkova

$747,647 by National Science Foundation (NSF)
06/01/2018 - 05/31/2023

Real-time analytics is the leading edge of a smart-data revolution, pushed by Internet advances in sensor hardware on one side and AI/ML streaming acceleration on the other. We propose the creation of a Center for Accelerated Real Time Analytics (CARTA) to explore the realm of streaming analytics applications. The center will be led by the University of Maryland, Baltimore County, with partners from NCSU, Rutgers, and other affiliated universities. The proposed center will work with next-generation hardware technologies, such as the IBM Minsky with on-board GPU-accelerated processors and flash RAM, and smart cyber-physical sensor systems, to build cognitive analytics systems and active storage devices for real-time analytics. This will enable the automated ingestion and simultaneous analytics of big data sets generated in various domains, including cyberspace, healthcare, the Internet of Things (IoT), and the scientific arena, and the creation of self-learning, self-correcting "smart" systems. At the core of these technologies are the techniques of data wrangling that enable end-to-end real-time data processing, and the infrastructure of the next generation of high-performance analytics systems.

I/UCRC: Site application to join I/UCRC known as CHMPR
Rada Chirkova

$298,533 by National Science Foundation
09/01/2016 - 08/31/2019

The objective of this proposal is to indicate that North Carolina State University (NCSU) will join, as a site, the Center of Hybrid Multicore Productivity Research (CHMPR) in Year 2 of its Phase II I/UCRC renewal. The focus of NCSU within the center will be the science of technologies for end-to-end enablement of data. The research at NCSU complements well the work being done at the other I/UCRC centers and at the other sites of the CHMPR center. NCSU has had extensive positive experience with I/UCRC centers over the years, and is very comfortable with the model.

Membership in Center of Hybrid Multicore Productivity Research (CHMPR) - Full Member
Rada Chirkova

$80,000 by Merck & Company
01/01/2017 - 12/31/2018

Exploring and analyzing the available data is key to making the right decisions. It is well known that “data wrangling,” which includes many kinds of end-to-end data enablement, makes up 60-80% of the total effort in analytics on large-scale data. We look to address the challenge of maximizing the usefulness of the available data, by providing tools, science, and talent for next-generation technologies and infrastructure. We focus on empowering organizations that wish to unlock the value of decisions based on their data, and envision a future where technologies and tools for data enablement provide significant business advantages to such organizations. At NCSU, we will lead national and international efforts in this space, by developing and providing technologies and tools for bridging the time gap between the acquisition of data and real-time and long-term decision making.

Membership in Center of Hybrid Multicore Productivity Research (CHMPR) - Full Member
Rada Chirkova

$80,000 by SAS Institute, Inc
01/01/2017 - 12/31/2018

Exploring and analyzing the available data is key to making the right decisions. It is well known that “data wrangling,” which includes many kinds of end-to-end data enablement, makes up 60-80% of the total effort in analytics on large-scale data. We look to address the challenge of maximizing the usefulness of the available data, by providing tools, science, and talent for next-generation technologies and infrastructure. We focus on empowering organizations that wish to unlock the value of decisions based on their data, and envision a future where technologies and tools for data enablement provide significant business advantages to such organizations. At NCSU, we will lead national and international efforts in this space, by developing and providing technologies and tools for bridging the time gap between the acquisition of data and real-time and long-term decision making.

BD Spokes: PLANNING: SOUTH: Collaborative: Rare Disease Observatory
Rada Chirkova

$71,143 by National Science Foundation
09/01/2016 - 08/31/2018

One of the greatest challenges in the rare disease domain is access to trusted, verified data. With advances in mapping the human genome, over 7,000 rare diseases have been identified. However, no integrated, comprehensive patient registries exist that reliably collect data on these patients and their conditions and would support queries on, for example, outcomes and economic impact. This planning grant will concentrate on building a large data system that can be accessed by a broad collaborative community in the rare disease space, including state and federal agencies, clinicians, investigators, patient advocacy groups, and industry, through the South Big Data Spoke.

BD Spokes: PLANNING: SOUTH: Collaborative: Rare Disease Observatory (Supplement)
Rada Chirkova

$14,000 by National Science Foundation
07/27/2017 - 08/31/2018

Managing and extracting useful information from massive data sets is critical in nearly every research field and industry sector. Modeling, data management, and analytics are key elements in understanding and using these data sets, and their importance in academia, government, and industry is predicted to grow exponentially. Success in solving the problem of extracting value from massive data hinges on balancing fundamental research, technological know-how, and commercial market intelligence. This proposal puts forward a plan for enhanced activities for a graduate student, making use of important data, mentorship, and collaboration resources, with outreach activities that maximize impact. The proposal is focused on an intensive training program with a mentor who is a senior researcher at the Laboratory for Analytic Sciences (LAS). LAS is a federal-government entity located in Raleigh, North Carolina and focused on translational research in support of the Department of Defense and the Intelligence Community. The training will focus on building and strengthening the student's interaction skills that are critical for today's workforce, by exposing the student to collaborative work on knowledge graphs with players in the government and industry.

TWC: Medium: Collaborative: Improving Mobile-Application Security via Text Analytics
William Enck

$300,000 by National Science Foundation
07/01/2015 - 06/30/2018

Computing systems that make security decisions often fail to take into account human expectations. This failure occurs because human expectations are commonly drawn from textual sources (e.g., a mobile application's description) and are hard to extract and codify. This proposal seeks to extract expectation context from the natural-language text artifacts presented to users as they find, install, and run software. The proposed work focuses specifically on mobile applications, to demonstrate concrete and practical advances in our scientific understanding of applying user-expectation context to security decisions. These findings will advance the state of the art in identifying and classifying malware and grayware, and will identify better methods of communicating risk to users. We will also gain a better understanding of the unique challenges of applying text analytics to the security domain.

TWC: Medium: Collaborative: Improving Mobile-Application Security via Text Analytics (supplement)
William Enck

$8,000 by National Science Foundation
07/01/2015 - 06/30/2018

Smartphones and mobile devices have become a dominant computing platform for billions of users. Many static and dynamic program analysis tools exist for detecting security and privacy concerns in mobile applications. However, few approaches bridge the semantic gap between code and visual presentation. Ignoring this context results in analysis that falsely reports an application as malicious (e.g., when the user really wanted an app that records phone calls) or fails to detect suspicious behavior (e.g., an app that collects sensitive information via text input). We propose a hybrid static/dynamic approach that extracts the text labels from Android application UIs and then applies text analytics to semantically label the type of input requested by the application. Doing so will better characterize the amount of security- and privacy-sensitive information entered into Android applications, and will enable outlier detection to identify applications that ask for unexpected information (e.g., an SSN) for their semantic class (e.g., banking applications). This analysis will be applied at scale to identify potential privacy infringements in mobile application stores.
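Once UI labels have been extracted and semantically typed, the outlier-detection step reduces to comparing each app's requested inputs against the norm for its class. The sketch below illustrates that comparison only; the app names, class labels, and per-class allowlists are invented for the example, and a real system would learn the class norms from a large corpus rather than hand-coding them:

```python
# Hypothetical extracted UI input labels per app.
apps = {
    "bank-one": {"account number", "password", "ssn"},
    "bank-two": {"account number", "password"},
    "flashlight-x": {"ssn", "email"},
}
app_classes = {"bank-one": "banking", "bank-two": "banking", "flashlight-x": "utility"}

# Input types considered expected for each semantic class (illustrative allowlist).
expected = {"banking": {"account number", "password", "ssn", "email"},
            "utility": set()}

def unexpected_requests(apps, app_classes, expected):
    """Flag (app, input) pairs where an app asks for data outside its class norm."""
    flags = []
    for app, inputs in apps.items():
        allowed = expected[app_classes[app]]
        for item in sorted(inputs - allowed):
            flags.append((app, item))
    return flags

print(unexpected_requests(apps, app_classes, expected))
```

Here the flashlight app is flagged for requesting an email address and an SSN, inputs no app in its class should need, while the banking apps pass.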

TWC: Frontier: Collaborative: Rethinking Security in the Era of Cloud Computing
William Enck ; Peng Ning ; Mladen Vouk

$749,996 by National Science Foundation
09/01/2013 - 08/31/2018

Increased use of cloud computing services is becoming a reality in today's IT management. The security risks of this move are active research topics, yielding cautionary examples of attacks enabled by the co-location of competing tenants. In this project, we propose to mitigate such risks through a new approach to cloud architecture defined by leveraging cloud providers as trusted (but auditable) security enablers. We will exploit cooperation between cloud providers and tenants in preventing attacks as a means to tackle long-standing open security problems, including protection of tenants against outsider attacks, improved intrusion detection and security diagnosis, and security-monitoring inlays.

Collaborative Research: Research in Student Peer Review: A Cooperative Web-Services Approach
Edward Gehringer

$1,034,166 by NSF
09/01/2014 - 08/31/2018

Peer review between students has a 40-year history in academia. During the last half of that period, web-based peer-review systems have been used in tens of thousands of classes. Many online systems have been developed, in diverse settings and with diverse purposes. The systems, however, share common concerns: assigning capable reviewers to each student submission, ensuring review quality, and delivering reliable scores in cases where the systems are used for summative review of student work. Many strategies have been proposed to meet those concerns and tested in relatively small numbers of courses. The next step is to scale up the studies to learn how well these strategies perform in diverse settings and with large numbers of students. This project brings together researchers from several peer-review systems, including some of the largest, to build web services that can be incorporated into existing systems to test these strategies and visualize the results.

Collaborative Research: Research in Student Peer Review: A Cooperative Web-Services Approach (Supplement)
Edward Gehringer

$40,000 by National Science Foundation
09/01/2014 - 08/31/2018

The students assist our efforts to build a database of peer-review responses that can be mined for quantitative research studies. The database will be composed of anonymized data from the peer-review systems of the constituent projects: CritViz, CrowdGrader, Expertiza, and Mobius/Slip. Among other items, it will contain peer feedback and ratings, and links to submitted work. They will embark on a qualitative research study to determine what STEM students value about the peer-review process. They will use a common set of research protocols to investigate three research questions: What do students value about receiving reviews? What do they value about giving reviews? Do their reactions differ, based on demographics, age/level of study, or academic major?

CSR: Medium:Collaborative Research: Holistic, Cross-Site, Hybrid System Anomaly Debugging for Large Scale Hosting Infrastructures
Xiaohui (Helen) Gu

$518,000 by National Science Foundation
08/ 1/2015 - 07/31/2019

Hosting infrastructures provide users with cost-effective computing solutions by obviating the need for users to maintain complex computing infrastructures themselves. Unfortunately, due to their inherent complexity and sharing nature, hosting infrastructures are prone to system anomalies caused by external or internal faults. The goal of this project is to investigate a holistic, cross-site, hybrid system anomaly debugging framework that intelligently integrates production-site black-box diagnosis and developer-site white-box analysis into a more powerful hosting infrastructure anomaly debugging system.

CHS: SMALL: Direct Physical Grasping, Manipulation, and Tooling of Simulated Objects
Christopher Healey ; Robert St. Amant

$496,858 by National Science Foundation
08/ 1/2014 - 07/31/2018

This proposal is for the development and evaluation of CAPTIVE, a Cube with Augmented Physical Tools, to support exploration of three-dimensional information. The design of CAPTIVE is founded on the concept of tool use, in which physical objects (tools) are used to modify the properties or presentation of target objects. CAPTIVE integrates findings across a wide range of areas in human-computer interaction and visualization, from bimanual and tangible user interfaces to augmented reality. CAPTIVE is configured as a desktop augmented reality/fishtank virtual reality system, with a stereoscopic display, a haptic pointing device, and a user-facing camera. In one hand the user holds a wireframe cube that contains virtual objects; in the other, the pointing device, augmented to reflect its function as a tool: a probe for pointing at, choosing, and moving objects; a magnifying or semantic lens for filtering, recoding, and elaborating information; a cutting plane that shows slices or projection views. CAPTIVE supports visualization with more fluid and natural interaction techniques, improving the ability of users to explore and understand 3D information.

Identification of Translational Hormone-Response Gene Networks and Cis-Regulatory Elements
Steffen Heber (co-PI) ; Jose Alonso (Lead PI-CALS) ; Anna Stepanova (CALS) ; Cranos Williams (ECE)

$897,637 by National Science Foundation
08/ 1/2015 - 07/31/2020

Plants, as sessile organisms, need to constantly adjust their intrinsic growth and developmental programs to the environmental conditions. These environmentally triggered “adjustments” often involve changes in the developmentally predefined patterns of one or more hormone activities. In turn, these hormonal changes result in alterations at the gene expression level and the concurrent alterations of the cellular activities. In general, these hormone-mediated regulatory functions are achieved, at least in part, by modulating the transcriptional activity of hundreds of genes. The study of these transcriptional regulatory networks not only provides a conceptual framework to understand the fundamental biology behind these hormone-mediated processes, but also the molecular tools needed to accelerate the progress of modern agriculture. Although often overlooked, understanding of the translational regulatory networks behind complex biological processes has the potential to empower similar advances in both basic and applied plant biology arenas. By taking advantage of the recently developed ribosome footprinting technology, genome-wide changes in translation activity in response to ethylene were quantified at codon resolution, and new translational regulatory elements have been identified in Arabidopsis. Importantly, the detailed characterization of one of the regulatory elements identified indicates that this regulation is NOT miRNA dependent, and that the identified regulatory element is also responsive to the plant hormone auxin, suggesting a role in the interaction between these two plant hormones. These findings not only confirm the basic biological importance of translational regulation and its potential as a signal integration mechanism, but also open new avenues to identifying, characterizing and utilizing additional regulatory modules in plant species of economic importance.
Towards that general goal, a plant-optimized ribosome footprinting methodology will be deployed to examine the translation landscape of two plant species, tomato and Arabidopsis, in response to two plant hormones, ethylene and auxin. A time-course experiment will be performed to maximize the detection sensitivity (strong vs. weak) and diversity (early vs. late activation) of additional translational regulatory elements. The large amount and dynamic nature of the generated data will also be utilized to generate hierarchical transcriptional and translational interaction networks between these two hormones and to explore the possible use of these types of diverse information to identify key regulatory nodes. Finally, the comparison between two plant species will provide critical information on the conservation of the regulatory elements identified and, thus, inform research on future practical applications. Intellectual merit: The identification and characterization of signal integration hubs and cis-regulatory elements of translation will make it possible not only to better understand how information from different origins (environment and developmental programs) is integrated, but also to devise new strategies to control this flow for the advance of agriculture. Broader impacts: A new outreach program will promote interest among middle and high school students in combining biology, computers, and engineering. We will use our current NSF-supported Plants4kids platform (ref), with its web-based bilingual dissemination tools and monthly demos at the science museum and local schools, to implement this new outreach program. Examples of demonstration modules will include comparison between simple electronic and genetic circuits.

Collaborative Research: Transforming Computer Science Education Research Through Use of Appropriate Empirical Research Methods: Mentoring and Tutorials
Sarah Heckman

$406,557 by National Science Foundation
09/ 1/2015 - 08/31/2020

The computer science education (CSEd) research community consists of a large group of passionate CS educators who often contribute to other disciplines of CS research. There has been a trend in other disciplines toward more rigorous and empirical evaluation of various hypotheses. However, many of the practices that we apply to demonstrate rigor in our discipline research are ignored or actively avoided when performing research in CSEd. This suggests that CSEd is “theory scarce” because most publications are not research and do not provide the evidence or replication required for meta-analysis and theory building. An increase in empiricism in CSEd research will move the field from “scholarly teaching” to the “scholarship of teaching and learning” (SoTL), providing the foundation for meta-analysis and the generation of theories about teaching and learning in computer science. We propose the creation of training workshops and tutorials to educate the educators about appropriate research design and evaluation of educational interventions. The creation of laboratory packages, “research-in-a-box,” will support sound evaluation and replication leading to meta-analysis and theory building in the CSEd community.

PFI: BIC: FEEED: Flexible, Equitable, Efficient and Effective Distribution
Julie Ivy ; Min Chi

$400,000 by UNC - NC A & T State University
08/ 1/2017 - 07/31/2020

Nonprofit hunger relief organizations operate in a complex environment consisting of a large and diverse donor base and a dynamic distribution network. These organizations generate a large amount of unstructured and complex data on food collection, inventory management, and distribution activities. However, existing information systems lack the infrastructure to interpret this large-scale data to provide real-time policy recommendations and support operational and strategic decision-making. The proposed smart service system will synthesize data from disparate sources to create a real-time perspective of the environment and learn from the actions of the decision maker. Specifically, this system will automatically predict, visualize, and recommend decisions that will advance operational effectiveness of food collection, distribution, and resource management in a way that is efficient and equitable and will identify opportunities to improve a food bank’s capability to satisfy hunger need.

SaTC: CORE: Medium: Collaborative: Taming Web Content Through Automated Reduction in Browser Functionality
Alexandros Kapravelos

$406,609 by National Science Foundation
09/ 1/2017 - 08/31/2021

The browser is constantly evolving to meet the demands of Web applications. Although this evolution supports the innovation that we see on the internet, there are security implications that we need to consider, such as attacks against the browser that leverage bugs introduced by rapid development. In this project, we plan to examine how certain web applications work and associate their behavior directly with the corresponding browser functionality. Our goal is to be able to characterize what functionality is needed from the browser when rendering a page and its components. By building such a system we will be able to identify, for example, what is needed from the browser to render a web advertisement. To better protect internet users, we are going to leverage that information to identify when web applications diverge from their expected behavior and attack the users' browser. We will use this information to limit the functionality exposed to web applications, thereby eliminating multiple classes of attacks, such as browser fingerprinting and drive-by downloads.

XS-Shredder: A Cross-Layer Framework for Removing Code Bloat in Web Applications
Alexandros Kapravelos

$300,000 by Arizona State University via Office of Naval Research
07/ 1/2017 - 06/30/2019

Modern web applications are incredibly complex pieces of software, with frameworks and libraries that assist web developers to write their applications quickly. However, these frameworks and libraries increase the attack surface of the web application. In this proposal, we present the design of a framework, called XS-Shredder, which is able to debloat all layers of the web application software stack: client-side code, server-side code, database, and operating system. This framework will perform analysis inter- and intra-layer, ultimately resulting in a web application that is semantically identical, yet with a significantly reduced attack surface.

Collaborative Research: Big Data from Small Groups: Learning Analytics and Adaptive Support in Game-based Collaborative Learning
James Lester

$1,249,611 by National Science Foundation
10/ 1/2016 - 09/30/2021

The proposed project focuses on integrating models of game-based and problem-based learning in a computer-supported collaborative learning environment (CSCL). As groups of students solve problems in these environments, their actions generate rich and dynamic streams of fine-grained multi-channel data that can be instrumented for investigating students' learning processes and outcomes. Using the big data generated by small groups, we will leverage learning analytics to provide adaptive support for collaboration that will allow these models to be used at larger scales in real classrooms. The project will study CSCL in the context of an environmental-science-based digital game that will employ specific strategies to support the problem-based learning goals of helping students construct explanations, reason effectively, and become self-directed learners. In problem-based learning, students are active, intentional learners who collaboratively negotiate meaning. The project will embed models induced using learning analytic techniques inside of a digital game environment to enable students to cultivate collaborative learning competencies that translate to non-digital classroom settings.

Learning Environments Across Disciplines LEADS: Supporting Technology Rich Learning Across Disciplines: Affect Generation and Regulation During Co-Regulated Learning in Game-Based Learning Environments (Supplement)
James Lester

$114,672 by McGill University/Social Sciences and Humanities Research Council of Canada
04/ 1/2012 - 02/28/2020

Contemporary research on multi-agent learning environments has focused on self-regulated learning (SRL) while relatively little effort has been made to use co-regulated learning as a guiding theoretical framework (Hadwin et al., 2011). This oversight needs to be addressed given the complex nature that self-and other-regulatory processes play when human learners and artificial pedagogical agents (APAs) interact to support learners' internalization of cognitive, affective, and metacognitive (CAM) SRL processes. We will use the Crystal Island learning environment to investigate these issues.

REFLECT: Improving Science Problem Solving with Adaptive Game-Based Reflection Tools
James Lester ; Roger Azevedo (Psychology)

$1,300,000 by National Science Foundation
04/15/2017 - 03/31/2020

Reflection has long been recognized as a central component of effective learning. With the overarching goal of improving middle school students' science problem solving and learning outcomes, the REFLECT project has the objective of investigating a suite of theoretically grounded, adaptive game-based reflection tools to scaffold students' cognitive and metacognitive processes. The project will center on the design, development, and investigation of game-based learning tools for science education that adaptively scaffold students’ reflection through both embedded and retrospective support. It will culminate in a classroom experiment to study the impact of the adaptive reflection tools on both problem solving and learning. The results from this project will contribute significantly to theoretical and computational models of reflection, and produce both design principles and learning technologies that support the creation of effective learning environments.

CHS: Medium: Adapting to Affect in Multimodal Dialogue-Rich Interaction with Middle School Students
James Lester ; Kristy Boyer ; Bradford Mott ; Eric Wiebe

$1,184,073 by National Science Foundation
08/ 1/2014 - 07/31/2018

Despite the great promise offered by learning environments for K-12 science education, realizing their potential poses significant challenges. In particular, current learning environments do not adaptively support children's affect. This project investigates computational models of affect that support multimodal dialogue-rich interaction. With an emphasis on devising scalable solutions, the project focuses on machine-learning techniques for automatically acquiring affect and dialogue models that can be widely deployed in natural classroom settings.

ENGAGE: A Game-Based Curricular Strategy for Infusing Computational Thinking into Middle School Science.
James Lester ; Brad Mott ; Eric Wiebe (Friday Instit

$2,498,862 by National Science Foundation
08/15/2016 - 07/31/2019

Recent years have seen a growing recognition that computer science is vital for scientific inquiry. The middle school grade band is critical for shaping students’ aspirations and skills, and many issues relating to workforce underproduction and underrepresentation of diverse students in computer science can be traced back to middle school. To address this problem, the project will deeply integrate computer science into middle school science classrooms. Centered on a game-based learning environment that features collaborative learning, the project will have a specific focus on addressing gender issues in middle school computer science education with the goal of creating learning interactions that are both effective and engaging for all students.

Collaborative Research: PRIME: Engaging STEM Undergraduate Students in Computer Science with Intelligent Tutoring Systems
James Lester ; Bradford Mott ; Eric Wiebe (Friday Instit

$1,499,828 by National Science Foundation
09/ 1/2016 - 08/31/2020

Significant advances in intelligent tutoring systems have paved the way for engaging STEM undergraduates in computer science. This research has spawned a new generation of personalized learning environments that offer significant promise for providing students with adaptive learning experiences that are crafted to their individual needs. Spurred by this significant promise and building on a research infrastructure developed by the project team in a series of NSF-supported projects, the PRIME project will transform introductory computer science education with state-of-the-art intelligent tutoring systems technologies.

Guiding Understanding via Information from Digital Environments (GUIDE)
James Lester (Co-PI) ; Eric Wiebe (Lead PI)

$1,238,549 by Concord Consortium via National Science Foundation
09/15/2015 - 08/31/2019

This project will utilize research and development groups at the Concord Consortium and NC State University. Educational software for teaching high school multi-level genetics developed by the Concord Consortium will be enhanced by intelligent agents and machine-based tutoring system technologies developed at NC State to help enhance the learning experience for students. These groups will collaborate closely to develop and research a hybrid system that combines technological intervention and teacher pedagogical expertise to illuminate and guide student learning in deeply digital curricula and classrooms.

Health Quest: Engaging Adolescents in Health Careers with Technology-Rich Personalized Learning
James Lester, II

$1,301,820 by National Institutes of Health (NIH)
08/ 1/2017 - 07/31/2022

Leveraging intelligent game-based learning technologies, the Health Quest project focuses on developing and disseminating technology-rich resources to broaden the interests of adolescents in biomedical, behavioral and clinical research careers. The project centers on the development of technology-rich learning resources. These include a game-based learning environment featuring health careers as well as an online community that includes a speaker series featuring a broad range of health professionals. The final year of the project will see a full evaluation of the Health Quest program and its impact on students’ interest in biomedical, behavioral and clinical research careers.

DeepGen: Dynamic Generation of Training Simulation Scenarios with Deep Reinforcement Learning (W911NF-17-S-0003: CCE-HS-3: Training)
James Lester, II ; Bradford Mott ; Jonathan Rowe

$1,398,004 by US Army - Army Research Laboratory
12/ 8/2017 - 12/ 7/2021

Automated scenario generation offers considerable promise for addressing the needs of simulation-based training. Given recent advances in machine learning, including artificial neural networks and deep learning, data-driven approaches to automatically generating scenarios that are customized to the cognitive and affective characteristics of individual learners hold great potential. This project will investigate a critical research question for adaptive simulation-based training: how can we devise generalizable, data-driven scenario generation models that dynamically adapt training events to achieve target learning objectives in simulation-based virtual training environments? To answer this question, the project will investigate a data-driven framework for dynamic scenario generation that formalizes the task as a deep reinforcement learning problem. We will demonstrate the generalizability of the approach by investigating its implementation in multiple distinct simulation-based virtual training environments.
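
As a toy illustration of framing scenario generation as a reinforcement learning problem (the states, training events, and learner model below are invented, and tabular Q-learning stands in for the deep RL the project proposes):

```python
import random

# States abstract the learner's progress, actions are candidate training
# events, and reward reflects progress toward a learning objective.
random.seed(0)
STATES = ["novice", "intermediate", "proficient"]
EVENTS = ["easy_drill", "hard_drill", "review"]

def simulate(state, event):
    """Toy learner model: easy drills advance novices, hard drills
    advance intermediate learners, everything else is neutral."""
    if event == "easy_drill" and state == "novice":
        return "intermediate", 1.0
    if event == "hard_drill" and state == "intermediate":
        return "proficient", 1.0
    return state, 0.0

# Tabular Q-learning over (state, event) pairs.
Q = {(s, e): 0.0 for s in STATES for e in EVENTS}
alpha, gamma, eps = 0.5, 0.9, 0.2
for _episode in range(2000):
    s = "novice"
    for _step in range(6):
        if random.random() < eps:
            e = random.choice(EVENTS)                      # explore
        else:
            e = max(EVENTS, key=lambda a: Q[(s, a)])       # exploit
        s2, r = simulate(s, e)
        Q[(s, e)] += alpha * (r + gamma * max(Q[(s2, a)] for a in EVENTS) - Q[(s, e)])
        s = s2

# The learned policy selects the next training event per learner state.
policy = {s: max(EVENTS, key=lambda a: Q[(s, a)]) for s in STATES}
```

The learned policy matches the toy model (easy drills for novices, hard drills for intermediates); the project's challenge is doing this with generalizable deep models over real training environments rather than a hand-built simulator.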

Multimodal Visitor Analytics: Investigating Naturalistic Engagement with Interactive Tabletop Science Exhibits
James Lester, II ; Jonathan Rowe ; James Minogue

$1,951,956 by National Science Foundation (NSF)
03/ 1/2018 - 02/28/2021

Recent advances in multimodal learning analytics present new opportunities for investigating learning and engagement in informal education settings. In this project, we will investigate visitors’ learning experiences in science museums using multimodal visitor analytics, which marries the rich multi-channel data streams produced by fully-instrumented exhibit spaces with the data-driven modeling functionalities afforded by recent advances in machine learning. The project will leverage Future Worlds, a fully-instrumented prototype digital interactive exhibit about sustainability, which was developed and piloted by the project team in a previously funded NSF Informal Science Education proof-of-concept project. The research team will conduct a series of museum studies to investigate how learners interact with Future Worlds and other exhibits in a science museum, and will utilize learning analytic techniques to model visitors’ cognitive, affective, and behavioral components of learning and engagement. The project will produce a detailed empirical account of visitors’ learning experiences in a science museum, as well as an open-source software platform for conducting multimodal visitor analytics, which will help other informal education researchers utilize learning analytics with their own datasets.

Collaborative Research: Fostering Collaborative Computer Science Learning with Intelligent Virtual Companions for Upper Elementary Students
Collin Lynch (co-PI) ; Eric Wiebe

$1,399,088 by National Science Foundation
08/15/2017 - 07/31/2021

The University of Florida and North Carolina State University jointly propose FLECKS, a Design and Development proposal for the NSF's Discovery Research PreK-12 (DRK-12) program. FLECKS (Friendly Learning Environment for Kids' Computer Science) addresses the pressing need for the development of fundamental computer science competencies in upper elementary-school children. The goal of the proposed project is to design, develop, and investigate FLECKS, an intelligent learning environment to teach collaborative computer science problem solving. Collaboration is a central academic and professional practice in computational thinking, yet it presents many challenges for elementary school students. Students often struggle to collaborate successfully due to individual differences in academic status, gender, cultural background, personality, and attitudes toward collaboration and learning. In order to address these challenges, FLECKS will provide dyads of students with a rich, scaffolded environment where they use an interactive online coding environment to engage in computer science challenges related to their STEM subject areas. Central to the innovation is the way in which the dyads are supported: animated virtual characters take a rich set of multimodal features as input and then adapt to students’ patterns of collaboration, including who has control of the keyboard and mouse, who speaks when, and the problem-solving actions the students take in the online environment.

CRII: SHF: Supporting Domain-Specific Inquiry with Rule-Based Modeling
Christopher Martens

$161,142 by National Science Foundation (NSF)
03/ 1/2018 - 02/29/2020

An increasingly common method for communicating and critiquing the emergent behavior of complex systems is interactive simulation, which can teach interactors about the way a system works by revealing system-level properties like feedback loops and tension between objectives. The Ceptre programming language provides a way to author interactive simulations in a rule-based way, amenable to both intuitive understanding and analysis. We propose to expand Ceptre’s audience by implementing a user interface that enforces syntax-level and type-level correctness of programs, which can be run and deployed in the browser for rapid prototyping.
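
The rule-based style Ceptre supports can be sketched as multiset rewriting: a rule fires by consuming its left-hand-side facts from the simulation state and producing its right-hand side. The Python sketch and rule set below are illustrative only and do not use Ceptre's actual syntax.

```python
from collections import Counter

def fire(state, rule):
    """Apply one (lhs, rhs) rule to a Counter state, or return None."""
    lhs, rhs = rule
    if all(state[fact] >= n for fact, n in lhs.items()):
        nxt = state - Counter(lhs)   # consume the left-hand side
        nxt.update(rhs)              # produce the right-hand side
        return nxt
    return None

def run(state, rules, max_steps=100):
    """Repeatedly fire the first applicable rule until quiescence."""
    for _ in range(max_steps):
        for rule in rules:
            nxt = fire(state, rule)
            if nxt is not None:
                state = nxt
                break
        else:
            break  # no rule applies: the simulation is quiescent
    return state

# Made-up crafting rules: 2 wood + an axe -> 1 plank (axe kept);
# 2 planks -> 1 shelter.
rules = [
    (Counter({"wood": 2, "axe": 1}), Counter({"planks": 1, "axe": 1})),
    (Counter({"planks": 2}), Counter({"shelter": 1})),
]
final = run(Counter({"wood": 8, "axe": 1}), rules)
```

Running from 8 wood and an axe, the system converts wood to planks and planks to shelters until no rule applies, illustrating the emergent, trace-producing behavior such programs make easy to author and analyze.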

SHF: Medium: Collaborative: Transfer Learning in Software Engineering
Tim Menzies

$464,609 by National Science Foundation
08/ 2/2014 - 06/30/2018

Software engineers need better ways to recognize best practices in past projects, and to understand how to transfer and adapt those experiences to current projects. No project is exactly like previous projects; hence, the trick is to find which parts of the past are most relevant and can be transferred into the current project. We propose novel automated methods to apply the machine learning concept of transfer learning to adapt lessons from past software engineering project data to new conditions.
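
A minimal sketch of the underlying idea (the features, labels, and nearest-neighbor relevancy measure are invented for illustration): rather than learning from all historical projects, select only the past examples most similar to the current project's context.

```python
# Relevancy-based transfer: predict for a new project using only the
# k most similar past projects, not the whole history.

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def transfer_predict(past, target, k=3):
    """past: list of (feature_vector, label); target: feature vector.
    Majority label among the k most relevant past examples."""
    relevant = sorted(past, key=lambda ex: euclidean(ex[0], target))[:k]
    labels = [label for _, label in relevant]
    return max(set(labels), key=labels.count)

# Toy past-project data: (team_size, size_in_kloc) -> defect proneness.
past = [((5, 10), "low"), ((6, 12), "low"), ((4, 8), "low"),
        ((40, 300), "high"), ((35, 250), "high")]
```

A small new project is matched against other small projects and a large one against large ones, so lessons transfer from the most comparable past experience; real transfer learners relearn what "relevant" means per dataset rather than fixing a distance metric.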

EAGER: Empirical Software Engineering for Computational Science
Timothy Menzies

$124,628 by National Science Foundation (NSF)
05/ 1/2018 - 05/31/2019

If we could improve the methods of Computational Science, then this would also improve our understanding and utilization of the physical processes studied by Computational Scientists (such as molecular dynamics, quantum chemistry, and computational materials). Much of the work in Computational Science is related to the software that implements it. Hence, in this work, we apply state-of-the-art empirical software engineering methods to Computational Science.

SHF: Medium: Scalable Holistic Autotuning for Software Analytics
Timothy Menzies ; Xipeng Shen

$898,349 by National Science Foundation
07/ 1/2017 - 06/30/2021

This research proposes to advance the state of the art to holistic scalable autotuners, which tune all levels of options for multiple optimization objectives at the same time. It will achieve this ambitious goal through the development of a set of novel techniques that efficiently handle the tremendous tuning space. These techniques take advantage of the synergies between all those options and goals by exploiting relevancy filtering (to quickly dispose of unhelpful options), locality of inference (which enables faster updates to outdated tunings), and redundancy reduction (which reduces the search space for better tunings). This new autotuner will be a faster method for finding better tunings that satisfy more goals. To test this claim, this research will assess whether this new tool can reduce the total computational resources required for effective SE data analytics by orders of magnitude.
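
As a rough illustration of the flavor of such a tuner (the option space, synthetic cost model, and quartile-based relevancy filter below are all invented; real autotuners are far more sophisticated):

```python
import random

# A tiny random-search autotuner with a crude "relevancy filter" that
# flags options whose settings separate good configurations from bad ones.
random.seed(1)
SPACE = {"opt_level": [0, 1, 2, 3], "unroll": [1, 2, 4, 8], "pad": [0, 1]}

def runtime(cfg):
    """Synthetic cost model: 'pad' is deliberately irrelevant."""
    return 100 - 10 * cfg["opt_level"] - 2 * cfg["unroll"]

def tune(space, cost, samples=200):
    trials = []
    for _ in range(samples):
        cfg = {k: random.choice(v) for k, v in space.items()}
        trials.append((cost(cfg), cfg))
    best = min(trials, key=lambda t: t[0])[1]
    # Relevancy filter: an option is "relevant" if the settings seen in
    # the best quartile differ from those seen in the worst quartile.
    trials.sort(key=lambda t: t[0])
    q = len(trials) // 4
    relevant = []
    for k in space:
        top = {cfg[k] for _, cfg in trials[:q]}
        bottom = {cfg[k] for _, cfg in trials[-q:]}
        if top != bottom:
            relevant.append(k)
    return best, relevant

best, relevant = tune(SPACE, runtime)
```

Here the filter discards the `pad` option, which never affects the cost; a holistic tuner would apply this kind of pruning jointly across compiler, learner, and workload options to shrink the search space before deeper optimization.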

Large-Scale Automatic Analysis of the OAI Magnetic Resonance Image Dataset
Frank Mueller

$331,603 by UNC - UNC Chapel Hill
08/15/2017 - 07/31/2022

The goal of this proposal is to optimize and to openly provide to the OA community a new technology to rapidly and automatically measure cartilage thickness, appearance and changes on magnetic resonance images (MRI) of the knee for huge image databases. This will allow assessment of trajectories of cartilage loss over time and associations with clinical outcomes on an unprecedented scale; future work will focus on incorporating additional disease markers, ranging from MRI-derived biomarkers for bone and synovial lesions, to biochemical biomarkers, to genetic information.

HPC Power Modeling and Active Control
Frank Mueller

$386,290 by Lawrence Livermore National Laboratory via US Department of Energy
10/25/2016 - 09/30/2019

As we approach the exascale era, power has become a primary bottleneck. The US Department of Energy has set a power constraint of 20MW on each exascale machine. To achieve one exaflop within 20MW, it is necessary that we use power intelligently to maximize performance under a power constraint. In this work, we propose to alleviate the shortcomings of current HPC systems in addressing power constraints by (1) power-aware machine partitioning, (2) power-constrained job scheduling, (3) systematic provisioning and procurement of hardware under a power cap, (4) modeling of network, deep memories, and storage, as well as (5) investigating the inter-dependence between power and cooling.
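
Power-constrained job scheduling can be sketched with a toy greedy scheduler (job names, power draws in MW, and values are invented) that admits jobs by value per watt until the system power cap is reached:

```python
def schedule(jobs, power_cap):
    """jobs: list of (name, power_mw, value).
    Greedily admit jobs in order of value density (value per MW)
    while total admitted power stays under the cap."""
    admitted, used = [], 0.0
    for name, power, value in sorted(jobs, key=lambda j: j[2] / j[1], reverse=True):
        if used + power <= power_cap:
            admitted.append(name)
            used += power
    return admitted, used

# Invented workload: each job requests power and carries a value.
jobs = [("climate", 8.0, 10.0), ("qcd", 6.0, 9.0), ("genomics", 10.0, 9.0)]
admitted, used = schedule(jobs, power_cap=20.0)
```

Greedy-by-density is only a heuristic for this knapsack-style problem; production schedulers must also handle job arrival times, per-job power capping, and the partitioning and cooling interactions listed above.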

Auto-Tuned Per-Loop Compilation
Frank Mueller

$50,000 by Lawrence Livermore National Laboratory
01/24/2018 - 01/31/2019

HPC applications require careful tuning to exploit close to peak performance on cutting-edge hardware platforms. This work hypothesizes that traditional per-module optimizations fall short of fully exploiting a compiler's capabilities, even when interprocedural optimizations complement local and global ones. This project proposes to investigate the viability of separately compiling major loops in an auto-tuning effort. Such an ensemble of loop units, when linked together, has the potential to improve not only single-loop but also overall application performance, thereby edging closer to peak performance for a given platform.

Student Travel Grant for RTSS'17 Ph.D. Student Poster Forum on Real-Time Aspects of Internet of Things and Cyber-Physical Systems
Frank Mueller

$15,000 by National Science Foundation
09/ 1/2017 - 08/31/2018

As General Chair of the IEEE Real-Time Systems Symposium in 2017 (RTSS 2017), the PI proposes to organize, in conjunction with the symposium, a Ph.D. student poster forum. The forum will have significant educational value. Essays and posters will be solicited for the Ph.D. student poster forum. Funds are requested to sponsor the forum; they will provide stipends (both merit-based and need-based) and cover organizers’ costs.

TWC: Small: Collaborative: Discovering Software Vulnerabilities Through Interactive Static Analysis
Emerson Murphy-Hill

$249,854 by National Science Foundation
10/ 1/2013 - 09/30/2018

Software vulnerabilities originating from insecure code are one of the leading causes of security problems people face today. Current tool support for secure programming focuses on catching security errors after the program is written. We propose a new approach, interactive static analysis, to improve upon static analysis techniques by introducing a new mixed-initiative paradigm for interacting with developers to aid in the detection and prevention of security vulnerabilities.

CAREER:Expanding Developers' Usage of Software Tools by Enabling Social Learning
Emerson Murphy-Hill

$495,721 by National Science Foundation
08/ 1/2013 - 07/31/2018

Tools can help software developers alleviate the challenge of creating and maintaining software. Unfortunately, developers only use a small subset of the available tools. The proposed research investigates how social learning, an effective mechanism for discovering new tools, can help software developers to discover relevant tools. In doing so, developers will be able to increase software quality while decreasing development time.

SHF: Small: Enabling Scalable and Expressive Program Analysis Notifications
Emerson Murphy-Hill ; Sarah Heckman

$265,853 by National Science Foundation (NSF)
08/15/2017 - 07/31/2020

Previous research shows that existing notifications produced by integrated development environments are poorly understood and overwhelming to programmers. We propose building on our prior work to create a new architecture for program analysis notifications that enables toolsmiths to create scalable and understandable notifications for a variety of program analysis tools.

NeTS: Small: Collaborative Research: Creating Semantically-Enabled Programmable Networked Systems (SERPENT)
Kemafor Ogan

$278,271 by National Science Foundation
10/ 1/2015 - 09/30/2018

The separation of control and data plane in SDN architectures helps merge packet and circuit paradigms into a single architecture and enables logical centralization of the control function. This enables new thinking about solutions to path optimization problems frequently encountered in networking, from routing to traffic engineering. The SERPENT project proposes to develop effective solutions for representing, storing and manipulating network state using rich semantic models such that path and topology embedding problems can be solved using a semantic database framework. This will simplify creation of novel network control and management systems able to cope with increasingly complex user requirements.

CRII: SHF: Building Visibility into the Cognitive Processes of Software Engineers via Biosensors
Christopher Parnin

$159,662 by National Science Foundation (NSF)
02/ 1/2018 - 01/31/2020

Despite its vast capacity and associative powers, the human brain does not deal well with interruptions. Particularly in situations where information density is high, such as during a programming task, recovering from an interruption requires extensive time and effort. Although researchers recognize this problem, no programming tool takes into account the brain's structure and limitations in its design. In this project, we measure the cognitive load of programmers during different programming tasks. To measure cognitive load, we collect metrics from both biosensors and brain-imaging devices. We apply our measures to applications in the software engineering domain: 1) measuring cognitive load during technical interviews, and 2) correlating complexity measures of code with higher measures of cognitive load.

REU Site: Science of Software
Christopher Parnin ; Emerson Murphy-Hill ; Sarah Heckman

$355,365 by National Science Foundation
01/ 1/2016 - 12/31/2018

There are not enough skilled data science researchers, particularly in software engineering. Hence, this REU Site in Science of Software (SOS) will engage undergraduates as data scientists studying exciting and meaningful SE research problems. Students work side-by-side with faculty mentors to gain experience in qualitative and quantitative research methods in SOS. Activities include dataset challenges, pair research, literature reviews, and presentations. Ultimately, each student works independently toward a published research result with their faculty mentors.

CPS: Synergy: Integrated Sensing and Control Algorithms for Computer-Assisted Training (Computer Assisted Training Systems (CATS) for Dogs)
David Roberts ; Alper Bozkurt ECE ; Barbara Sherman CVM

$1,029,403 by National Science Foundation
10/ 1/2013 - 09/30/2018

We propose to develop tools and techniques that will enable more effective two-way communication between dogs and handlers. We will work to create non-invasive physiological and inertial measuring devices that will transmit real-time information wirelessly to a computer. We will also develop technologies that will enable the computer to train desired behaviors using positive reinforcement without direct input from humans. We will work to validate our approach using laboratory animals in the CVM as well as with a local assistance-dog training organization serving as a consultant.

Consortium for Nonproliferation Enabling Capabilities
Nagiza Samatova, co-PI ; Robin Gardner (Nuclear Engineering)

$9,744,249 by US Department of Energy
07/31/2014 - 07/30/2019

NC State University, in partnership with University of Michigan, Purdue University, University of Illinois at Urbana Champaign, Kansas State University, Georgia Institute of Technology, NC A&T State University, Los Alamos National Lab, Oak Ridge National Lab, and Pacific Northwest National Lab, proposes to establish a Consortium for Nonproliferation Enabling Capabilities (CNEC). The vision of CNEC is to be a pre-eminent research and education hub dedicated to the development of enabling technologies and technical talent for meeting the grand challenges of nuclear nonproliferation in the next decade. CNEC research activities are divided into four thrust areas: 1) Signatures and Observables (S&O); 2) Simulation, Analysis, and Modeling (SAM); 3) Multi-source Data Fusion and Analytic Techniques (DFAT); and 4) Replacements for Potentially Dangerous Industrial and Medical Radiological Sources (RDRS). The goals are: 1) identify and directly exploit signatures and observables associated with special nuclear material (SNM) production, storage, and movement; 2) develop simulation, analysis, and modeling methods to identify and characterize SNM and facilities processing SNM; 3) apply multi-source data fusion and analytic techniques to detect nuclear proliferation activities; and 4) develop viable replacements for potentially dangerous existing industrial and medical radiological sources. In addition to research and development activities, CNEC will implement educational activities with the goal of developing a pool of future nuclear nonproliferation and other nuclear security professionals and researchers.

Lecture Hall Polytopes, Inversion Sequences, and Eulerian Polynomials
Carla Savage

$30,000 by Simons Foundation
09/ 1/2012 - 08/31/2018

Over the past ten years, lecture hall partitions have emerged as fundamental structures in combinatorics and number theory, leading to new generalizations and new interpretations of several classical theorems. This project takes a geometric view of lecture hall partitions and uses polyhedral geometry to investigate their remarkable properties.

SaTC: CORE: Small: Collaborative: A Broad Treatment of Privacy in Blockchains
Alessandra Scafuro

$249,922 by National Science Foundation (NSF)
09/ 1/2017 - 08/31/2020

A blockchain is a public, distributed, append-only database whose consistency is maintained by the combined work of users across the world rather than a single party, thus avoiding single points of failure and trust. The public nature of the blockchain, however, raises important privacy concerns. Existing work partially addressed privacy concerns for the restricted case of blockchains used for financial transactions. As blockchains are set to be used in a variety of contexts, the proposed work will initiate a broad treatment of privacy definitions and provide constructions achieving new privacy goals that can be implemented across different blockchains.

Analysis of Hash-Based Signatures Schemes in the QROM
Alessandra Scafuro

$96,752 by Silicon Valley Community Foundation
12/ 1/2017 - 12/31/2018

Digital signatures are fundamental cryptographic building blocks that guarantee the authenticity and integrity of digital communications. Currently adopted signature schemes (such as ECDSA) leverage number-theoretic properties, and their security relies on the intractability of solving number-theoretic problems such as integer factorization and the discrete logarithm problem. Shor's algorithm, however, shows that such problems are practically solvable on quantum computers. The threat of quantum attacks triggered NIST's competition for the development and standardization of signature schemes that are secure in the presence of quantum adversaries. There are several candidates in this competition, and Cisco's research team is participating with the LMS scheme. The advantage of the LMS scheme over the other candidates is its simplicity. The goal of the proposed research is to develop a formal treatment of the post-quantum security of the LMS scheme.

NeTS: Small: Fine-grained Measurement of Performance Metrics in the Internet of Things
Muhammad Shahzad

$449,999 by National Science Foundation
10/ 1/2016 - 09/30/2019

The PI proposes to develop a framework for passive and fine-grained measurement of performance metrics in the Internet of Things, including both Quality of Service metrics such as latency, loss, and throughput, and resource utilization metrics such as power consumption, storage utilization, and radio-on time. Measurements of these performance metrics can be used reactively by network operators to perform tasks such as detecting and localizing offending flows that are responsible for causing delay bursts, throughput deterioration, or even power surges. These measurements can also be used proactively by network operators to locate and preemptively update any potential bottlenecks.
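At its simplest, passive measurement of latency and loss means pairing observed request and response events without injecting any traffic. The sketch below uses made-up device IDs and timestamps to show the bookkeeping; it is not the project's framework.

```python
# Passive latency/loss bookkeeping: match each observed request to its
# response by (device, sequence-number) key. All values are illustrative.
requests = {("dev1", 1): 0.000, ("dev1", 2): 0.050, ("dev2", 1): 0.010}
responses = {("dev1", 1): 0.012, ("dev2", 1): 0.031}

latencies = []
lost = 0
for key, t_req in requests.items():
    t_resp = responses.get(key)
    if t_resp is None:
        lost += 1                       # no response observed: count as loss
    else:
        latencies.append(t_resp - t_req)

avg_latency = sum(latencies) / len(latencies)
loss_rate = lost / len(requests)
print(avg_latency, loss_rate)
```

A fine-grained version of this idea would keep such state per flow and per device, which is where the scalability challenge the abstract alludes to comes from.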

CSR:Small:Collaborative Research: Scalable Fine-Grained Cloud Monitoring for Empowering IoT
Muhammad Shahzad

$257,996 by National Science Foundation
09/15/2016 - 08/31/2019

Because of the rapid adoption of the cloud computing model, the size of data centers and the variety of cloud services are increasing at an unprecedented rate. As a result, fine-grained monitoring of the health and usage of data center resources is becoming increasingly important and challenging. In this work, we address the problem of efficiently acquiring and transporting cloud management and monitoring data. For data acquisition, we address the crucial challenge of controlling data size. For data transportation, we focus on efficiently moving the data from the point it is collected inside the data center to the point it needs to be stored for processing.

XPlacer: Extensible and Portable Optimization of Data Placement in Memory
Xipeng Shen

$235,398 by Lawrence Livermore National Laboratory
01/29/2018 - 09/30/2020

Modern supercomputers with heterogeneous components (e.g., GPUs) feature complex memory systems to meet the ever growing demands for data by processors. Putting data into the proper part of a memory system is essential for program performance, but is difficult to do. To address this challenge, we propose a new paradigm featuring three interacting components: 1) an extensible memory specification language to describe memory properties, 2) a compiler for analyzing data access patterns and transforming code for runtime adaptation, and 3) a data placement runtime to find and materialize the best data placements on the fly. The result will be a software framework (named XPlacer) that transforms OpenMP code to automatically place its data in memory in a way best suiting the GPU architecture, inputs, and program phases.

CSR:Small:Supporting Position Independence and Reusability of Data on Byte-Addressable Non-Volatile Memory
Xipeng Shen

$499,998 by National Science Foundation (NSF)
08/16/2017 - 07/31/2020

Byte-Addressable Non-Volatile Memory (NVM) is the upcoming next generation of memory with tremendous potential benefits. This proposal is about offering programming system-level support of persistency on NVM. Particularly, it focuses on effective support of the usage of dynamic data structures on NVM.

Accelerating DNN Pruning on CPU-GPU Supercomputers
Xipeng Shen

$29,999 by Oak Ridge National Laboratories - UT-Battelle LLC
04/18/2018 - 09/30/2018

Network pruning is a common approach for compressing CNN models to fit into resource-constrained devices that have limited storage and computing power. This work explores composability-centered CNN pruning to address the problem, including recently proposed structural CNN models such as GoogLeNet and ResNet.
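For context, the most common pruning baseline is magnitude-based: zero out the weights with the smallest absolute values. The toy sketch below illustrates that baseline only, not the project's composability-centered algorithm, and the weight values are invented.

```python
# Magnitude-based weight pruning (a common baseline): zero out the
# given fraction of weights with the smallest absolute values.
def prune(weights, sparsity):
    k = int(len(weights) * sparsity)        # number of weights to zero
    threshold = sorted(abs(w) for w in weights)[k - 1] if k else float("-inf")
    return [0.0 if abs(w) <= threshold else w for w in weights]

pruned = prune([0.5, -0.02, 0.3, 0.01, -0.7, 0.04], 0.5)
print(pruned)  # → [0.5, 0.0, 0.3, 0.0, -0.7, 0.0]
```

Pruning a real CNN additionally requires retraining after each pruning step, and the cost of that retraining across many candidate configurations is what composability-centered approaches aim to reduce.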

Cognitive Computing-Based Compilation (YUE ZHAO)
Xipeng Shen

$10,846 by IBM Canada Limited
06/30/2016 - 06/29/2018

This project proposes to leverage IBM Watson-like cognitive computing engines for improving the efficacy of IBM commercial compilers (XLC/C++, XLFortran). Recent years have witnessed exciting improvements in cognitive computing engines, as demonstrated by the increasing impact of IBM Watson. In this project, we propose to leverage such engines to allow compilers to automatically accumulate knowledge about appropriate ways to compile programs, learn from it, and then apply it to new programs. Success will remove the difficulty of choosing appropriate compilation flags, largely reduce the tuning effort required from users, and help compilers make better decisions to produce high-quality code.

SHF: Small: Improving Memory Performance on Fused Architectures through Compiler and Runtime Innovations
Xipeng Shen ; Frank Mueller

$470,000 by National Science Foundation
08/ 1/2015 - 07/31/2018

Contemporary architectures are adopting an integrated design of conventional CPUs with accelerators on the same die with access to the same memory, albeit with different coherence models. Examples include AMD's Fusion architecture, Intel's integrated main-stream CPU/GPU product line, and NVIDIA Tegra's integrated graphics processor family. Integrated GPUs feature shared caches and a common memory interconnect with multicore CPUs, which intensify resource contention in the memory hierarchy. This creates new challenges for data locality, task partitioning and scheduling, as well as program transformations. Most significantly, a program running on GPU warps and CPU cores may adversely affect the performance and power of one another. The objective of this work is to understand these novel implications of fused architectures by studying their effects, qualifying their causes, and quantifying the impacts on performance and energy efficiency. We propose to advance the state of the art by creating spheres of isolation between CPU and GPU execution via novel systems mechanisms and compiler transformations that reduce cross-boundary contention with respect to shared hardware resources. This synergy between systems and compiler techniques has the potential to significantly improve performance and power guarantees for co-scheduling program fragments on fused architectures. Impact: The proposed work, if successful, has the potential to transform resource allocation and scheduling at the systems level and compiler optimizations at the program level to create a synergistic development environment with significant performance and power improvements and vastly increased isolation suitable for synergistic co-deployment of programs crossing boundaries on innovative fused architectures.

Workshop on Inter-Disciplinary Research Challenges in Computer Systems
Xipeng Shen ; James Tuck

$44,954 by National Science Foundation (NSF)
03/ 1/2018 - 02/28/2019

This proposal requests travel support funds to enable invited participants to defray the costs of traveling to and attending the first ASPLOS Grand Challenges and Synergy Workshop (GCSW) collocated with ASPLOS-23. The PIs will ensure the creation and release of a comprehensive report to capture the discussions at the workshop.

CAREER: On the Foundations of Semantic Code Search
Kathryn Stolee

$500,000 by National Science Foundation (NSF)
08/ 1/2018 - 07/31/2023

Semantic code search uses behavioral specifications, such as input/output examples, to identify code in a repository that matches the specification. Challenges include handling scenarios when 1) there are too few solutions, 2) it is difficult to understand how solutions differ, and 3) there are too many solutions. I propose techniques to 1) expand the scope of code that can be modeled and find approximate solutions when an exact one does not exist, 2) determine the differences between two code fragments, and 3) navigate a large space of possible solutions by selecting inputs that maximally divide the solution space.
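The core behavioral-matching idea can be sketched in a few lines: keep only the repository candidates whose observed behavior agrees with every input/output example. The candidate functions and examples below are made up for illustration; real semantic code search also handles code that cannot simply be executed this way.

```python
# Behavioral (input/output) matching: a candidate survives only if it
# reproduces every example. Candidates and examples are illustrative.
candidates = {
    "upper": lambda s: s.upper(),
    "title": lambda s: s.title(),
    "strip": lambda s: s.strip(),
}
examples = [("abc", "ABC"), ("a b", "A B")]

matches = [name for name, fn in candidates.items()
           if all(fn(inp) == out for inp, out in examples)]
print(matches)  # → ['upper']
```

Note how the first example alone leaves both `upper` and `strip` ambiguous until more examples arrive; choosing inputs that split the surviving candidates as evenly as possible is exactly the "maximally divide the solution space" strategy the abstract describes.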

SHF: Small: Supporting Regular Expression Testing, Search, Repair, Comprehension, and Maintenance
Kathryn Stolee

$499,996 by National Science Foundation (NSF)
08/15/2017 - 07/31/2020

Regular expressions (regexes) are responsible for numerous faults in many software products, and yet, static bug finders and automated program repair techniques generally ignore this common language feature. First, I propose to explore and characterize regex-related bugs in bug repositories. From there, I propose to develop approaches for detecting regex-related bugs using static analysis and patching regex-related bugs using automated program repair. The proposed detection and patching techniques both depend on similarity analysis of regexes. The expected research outcomes include a publicly available data set of regex-related faults, new regex-related bug patterns for static bug finders like FindBugs and PMD, and in the best case, an open source tool for automated patch generation for regular expressions.
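Since the proposed detection and patching techniques depend on regex similarity analysis, a minimal behavioral notion of similarity is worth making concrete: score two regexes by how often they agree on a set of probe strings. The probes and patterns below are illustrative, not from the project.

```python
import re

# Behavioral regex similarity: fraction of probe strings on which two
# patterns agree (both match, or both fail). Probes are illustrative.
def similarity(p1, p2, probes):
    agree = sum((re.fullmatch(p1, s) is None) == (re.fullmatch(p2, s) is None)
                for s in probes)
    return agree / len(probes)

probes = ["a", "aa", "ab", "b", ""]
print(similarity(r"a+", r"aa*", probes))  # → 1.0 (equivalent on these probes)
print(similarity(r"a+", r"a*", probes))   # disagree only on the empty string
```

Patterns that behave identically on well-chosen probes are candidates for being semantically equivalent, which is one way a repair tool could rank replacement regexes.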

SHF: Medium: Collaborative Research: Semi and Fully Automated Program Repair and Synthesis via Semantic Code Search
Kathryn Stolee

$387,661 by National Science Foundation
07/ 1/2016 - 06/30/2020

Software plays an integral role in our society. However, software bugs are common, routinely cause security breaches, and cost our economy billions of dollars annually. The software industry struggles to overcome this challenge: Software is so inherently complex, and mistakes so common, that new bugs are typically reported faster than developers can fix them. Recent research has demonstrated the potential of automated program repair techniques to address this challenge. However, these techniques often produce low-quality repairs that break existing functionality. In this research, we develop new techniques to fix bugs and implement new features automatically, producing high-quality code.

Algorithms for Exploiting Approximate Network Structure Research Area 10: Network Science
Vida Blair Sullivan

$538,199 by US Army-Army Research Office
05/15/2017 - 05/14/2020

We propose a new framework for efficient, robust, and noise-tolerant network algorithms that guarantee near-optimal solutions to NP-hard problems by exploiting structure inherent in real-world networks. We model networks as consisting of a majority that belongs to a structural graph class, plus a few deviations resulting from measurement errors, unusual behaviors, and/or unexplained exceptions. We will develop algorithms which exploit this more approximate form of graph structure and guarantee near-optimal solutions and polynomial running time for any network that is "close" to a structural graph class, initially focusing on hierarchical/tree-like networks (e.g. those arising in biology and social behavior).
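One elementary way to quantify "close to a structural graph class" is edge-deletion distance. For the class of forests this is computable exactly: a graph with m edges, n vertices, and c connected components has m - (n - c) edges beyond a spanning forest. The sketch below illustrates that measure on a made-up graph; the project's framework targets far richer classes and noise models.

```python
# Edge-deletion distance to a forest: m - (n - c), computed with a
# small union-find. The example graph is illustrative.
def edges_beyond_forest(n, edges):
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    components = n
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            components -= 1
    return len(edges) - (n - components)

# A 4-cycle is one edge away from being a tree.
print(edges_beyond_forest(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # → 1
```

A network with a small value of such a distance is "a structural graph class plus a few deviations," which is exactly the shape of input the proposed algorithms exploit.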

Moore Foundation Data-Driven Discovery Investigator
Vida Blair Sullivan

$1,500,000 by Gordon and Betty Moore Foundation
11/10/2014 - 12/ 1/2019

Understanding and identifying intermediate-scale structure is key to designing robust tools for data analysis, just as the interdependence of local interactions and global behavior is key in many science domains. We thus focus on constructing a theory and tools for using this structure to improve analysis and identification of relationships in massive graph data. Through careful integration of tools from graph theory, computational complexity, statistics, and parallel algorithm design, the proposed work will derive novel measures of graph similarity based on structural representations and application-inspired features of interest. We will design efficient, scalable sampling algorithms which leverage inherent sparsity and structure to de-noise and improve accuracy of parameter estimation. As a specific example of science domain impact, we focus on improving understanding of the brain. Applying our new tools for characterizing graph-theoretic structure in such networks, scientists will be able to build higher fidelity models of brain network formation and evolution. Additionally, efficient algorithms from the associated parameterized framework will enable rapid comparison of regions and identification of discrepancies, abnormalities, and influential components for specific tasks.

CRII: CSR: Rethinking the FTL in SSDs -- a file translation layer instead of a flash translation layer
Hung-Wei Tseng

$174,998 by National Science Foundation
03/15/2017 - 02/28/2019

SSDs (solid-state drives) have become popular in all kinds of computing systems. However, these systems still use the existing block interface to manage SSDs, resulting in multiple layers of indirection, under-utilized parallelism inside the SSD, overhead for in-storage computing, and difficulty sharing files among heterogeneous computing devices. This work will reshape the current system stack and simplify the software/hardware interface for SSDs by using the SSD to directly map files to physical block addresses on the SSD. This work will demonstrate the effect of the proposed system in applications ranging from file systems to datacenter storage and virtualized machines.
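The key data-structure change can be pictured as a single table from file names to physical blocks, collapsing the file system's logical-block map and the drive's flash translation layer into one lookup. The class below is a toy model of that idea; names, block numbers, and the interface are invented for illustration.

```python
# Toy model of a file translation layer: the drive maps files directly
# to physical block addresses, removing the separate logical-block
# indirection. Allocation policy here is trivially sequential.
class FileTranslationLayer:
    def __init__(self):
        self.table = {}          # file name -> list of physical blocks
        self.next_block = 0

    def allocate(self, name, n_blocks):
        blocks = list(range(self.next_block, self.next_block + n_blocks))
        self.next_block += n_blocks
        self.table[name] = blocks
        return blocks

    def physical_blocks(self, name):
        # one lookup instead of file-system map + FTL map
        return self.table[name]

ftl = FileTranslationLayer()
ftl.allocate("log.txt", 3)
print(ftl.physical_blocks("log.txt"))  # → [0, 1, 2]
```

A real design must also handle wear leveling, garbage collection, and concurrent access, which is why the project targets the whole system stack rather than just the mapping table.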

Deep Learning on Remote Sensing Imagery
Ranga Vatsavai

$50,000 by Lenovo
02/ 1/2018 - 12/31/2018

Massive amounts of remote sensing data are being collected and archived from satellites and airborne platforms (including drones) on a daily basis. This data supports a wide range of applications of national importance. Examples include crop type mapping, forest mapping, urban neighborhood mapping, damage from flooding, hailstorms, and forest fires, impacts of climate change on crops, unusual crop detection (e.g., poppy plantations), changes in biomass, and understanding the complex interaction between food, energy, and water. Classification of these high-resolution images requires object-based and arbitrary patch-based classification to capture relevant spatial context. The advent of multiple instance learning and deep learning took the natural image processing community by storm. However, its application to satellite images has been slow due to training data and computational requirements. In this project, we develop deep learning algorithms for classification of satellite images and scale these algorithms on Lenovo/Intel's new architectures and software infrastructure (e.g., Neon, Caffe, Theano, and MXNet).
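Patch-based classification starts by tiling the raster into patches that each carry local spatial context. The sketch below shows only that tiling step on a tiny made-up "image"; real pipelines operate on multi-band rasters and feed the patches to a CNN.

```python
# Extract size x size patches from a 2-D raster with a given stride.
# The 4x4 "image" and patch size are illustrative.
image = [[1,  2,  3,  4],
         [5,  6,  7,  8],
         [9, 10, 11, 12],
         [13, 14, 15, 16]]

def patches(img, size, stride):
    out = []
    for r in range(0, len(img) - size + 1, stride):
        for c in range(0, len(img[0]) - size + 1, stride):
            out.append([row[c:c + size] for row in img[r:r + size]])
    return out

tiles = patches(image, 2, 2)     # non-overlapping 2x2 patches
print(len(tiles))                # → 4
```

Choosing patch size and stride trades off spatial context against the number of training samples, one of the training-data constraints the abstract mentions.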

Big Data & Machine Learning
Ben Watson

$5,000 by Caterpillar, Inc.
12/15/2017 - 12/31/2018

Analyze the data and video provided by Caterpillar to develop possible measures of user experience such as frustration, engagement, or flow; compare results to the survey results the University collects as part of the funded project. Surveys will ideally be well-established instruments in the public domain, or custom built for the project. If necessary, surveys may be licensed by the University, as long as they do not restrict use of the resulting data. The University will provide Caterpillar with the results of this comparison.

Science of Security Lablet: Impact through Research, Scientific Methods, and Community Development
Laurie Williams ; Munindar Singh

$467,750 by US Dept. of Defense (DOD)
04/ 4/2018 - 09/14/2022

This project proposes the continuation of the Science of Security Lablet at NC State University. Science of Security refers to the study of cybersecurity from an explicitly scientific perspective. Cybersecurity encompasses elements of technology, human behavior, and policy. Science of Security seeks to identify and apply the appropriate scientific principles on cybersecurity problems, enhancing rigor and reproducibility, thereby improving the transfer of research to practice. This Lablet provides a home for investigations into diverse topics pertaining to a Science of Security. The Lablet will support the three major elements of a Science of Security: research, scientific methods, and community engagement.