1993 — 1999
Bradley, Elizabeth
NYI: New Approaches to Engineering Design: Controlled Chaos and Computer Automation @ University of Colorado At Boulder
This project explores two new approaches to engineering design. Each of these threads is composed of synergistic subprojects; these subprojects combine analytical mathematics, numerical simulation, symbolic reasoning, and physical experiments. The first approach is to actively use chaos to improve the designs of various engineering systems. One example of this is causing gases to burn more effectively by induction of chaotic mixing in a combustion chamber. A small, well-placed obstruction can interact with the inflow and with the chamber's geometry to cause large effects: chaos's symptomatic "sensitive dependence on initial conditions." These effects are tested both experimentally and with computational fluid dynamics simulations. Another example is the use of chaos to broaden the capture range of a phase-locked loop circuit. The second approach to improving engineering design is through computer automation. The ultimate goal of this endeavor is a suite of programs that explores different high-level analysis and design paradigms. Such tools would not only streamline an engineer's problem-solving task, but actually amplify creativity. The first step is a program that autonomously constructs mathematical models from information about a system, presented in a variety of forms and formats: measurements made directly on the physical device; a user's observations described to the program in varying degrees of precision, symbolically or graphically; partial models and hypotheses suggested by a user; and general mathematical theory that is encoded in the program's knowledge base.
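The "sensitive dependence on initial conditions" invoked above is easy to demonstrate. The sketch below is purely illustrative and not from the project; the logistic map stands in for the chaotic combustion dynamics:

```python
# Illustrative only: sensitive dependence on initial conditions in the
# logistic map x -> r*x*(1-x). Two trajectories starting 1e-10 apart
# diverge to order-one separation within a few dozen iterations.

def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.4)
b = logistic_trajectory(0.4 + 1e-10)
gap = [abs(x - y) for x, y in zip(a, b)]

# The initial gap of 1e-10 grows by many orders of magnitude.
print(f"initial gap: {gap[0]:.1e}, largest gap: {max(gap):.3f}")
```

The same qualitative effect is what a small, well-placed obstruction exploits in a combustion chamber: a tiny perturbation produces a large change in the flow.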

2000 — 2004
Lee, Yung-Cheng (co-PI); Cai, Xiao-Chuan (co-PI); Bradley, Elizabeth
ITR: An Interactive Experimental/Numerical Simulation System With Applications in MEMS Design @ University of Colorado At Boulder
This project will build a new generation of numerical simulation systems by creating a feedback path between physical experiments and numerical solvers. There are a number of exciting implications of this data-adaptive simulation idea. Engineering fluid flows are inherently complex. This complexity limits measurement and precision, so engineers are forced to work with fluid flows based on very sparse information. Numerical solvers, on the other hand, can resolve tiny flow structures, but they generally run in an open-loop mode and are thus unverified. Coupling the two forms of technology offers powerful advantages to each. Comparisons against live experimental data will allow simulation algorithms to be verified quantitatively, in detail, and in-line. Once it is verified in this fashion, one can use the simulation with confidence on related problems. One can also use the sensor information to correct the solver's data, or even to adjust the solver parameters on the fly. Moreover, once the solver is properly synchronized with the real system, one could use the former to explore the physics of the latter in more detail than sensors would allow - and still trust the results.
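The feedback path described above can be caricatured in a few lines. This sketch is our own invention, not the project's solver: a toy model with a mis-estimated parameter is periodically "nudged" toward sparse sensor readings of the true system, while an open-loop copy drifts away:

```python
# Hedged sketch of data-adaptive simulation (model and numbers invented):
# a solver with a slightly wrong parameter is corrected toward sparse
# "sensor" measurements of the true system, keeping it synchronized.

import math

def step(x, decay):
    """One step of a toy exponential-decay model."""
    return x * math.exp(-decay)

true_decay, model_decay = 0.10, 0.12   # the model's parameter is off
gain = 0.5                             # nudging strength
x_true, x_open, x_nudged = 1.0, 1.0, 1.0

for t in range(50):
    x_true = step(x_true, true_decay)
    x_open = step(x_open, model_decay)          # open-loop: never corrected
    x_nudged = step(x_nudged, model_decay)
    if t % 5 == 0:                              # sparse sensor readings
        x_nudged += gain * (x_true - x_nudged)  # feedback correction

err_open = abs(x_open - x_true)
err_nudged = abs(x_nudged - x_true)
print(f"open-loop error {err_open:.4f}, nudged error {err_nudged:.4f}")
```

Even with a wrong parameter and only one measurement every five steps, the nudged simulation tracks the truth far more closely than the open-loop run.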
A particularly compelling application area for data-adaptive simulation techniques is microelectromechanical systems (MEMS). This emerging technology is driving a revolution in engineering design that is placing new demands on numerical simulation. Accurate modeling of the interaction of tiny, flexible, moving structures with high-speed chaotic fluids is challenging. To resolve the fine details in this kind of simulation, computational fluid dynamics technology requires extremely fine meshes and the solution of very large systems of nonlinear equations. This makes it difficult to build production-quality computer-aided design (CAD) tools for MEMS, which in turn forces engineers to fabricate devices without testing them. Functional CAD tools would allow MEMS designers to achieve one-pass design, much as VLSI does now.

2001 — 2003
Shandas, Robin (co-PI); Lee, Yung-Cheng (co-PI); Bright, Victor (co-PI); Bradley, Elizabeth; Hertzberg, Jean
Acquisition of a Particle Image Velocimetry System @ University of Colorado At Boulder
CTS-0114109 Jean Hertzberg, University of Colorado
In this proposal, funding is requested to purchase a Particle Image Velocimetry (PIV) system to enhance the research capabilities of the PI and four co-PIs. They are actively engaged in a number of interesting research problems in fluid mechanics. These include real-time simulation and control of a two-dimensional jet, evaluation of micro-electro-mechanical systems (MEMS) fluidic devices, cardiovascular fluid dynamics, and infectious aerosol generation.

2001
Bradley, Elizabeth Howe
R03 Activity Code Description: To provide research support specifically limited in time and amount for studies in categorical program areas. Small grants provide flexibility for initiating studies which are generally for preliminary short-term projects and are non-renewable.
Health Effects of Job Displacement Among Older Workers
In the last two decades, older workers have represented an increasing proportion of the displaced workers in the U.S. Displaced workers are workers who experience involuntary loss of a permanent job, due to either business closing or layoff. Although the impact of job displacement on earnings is well established, the health effects of displacement among older workers in the U.S. remain largely unknown. The broad objective of the proposed research is to examine the health consequences of job displacement among older workers in the U.S. The specific aims are to: 1) assess the effect of job displacement on subsequent physical disability and mental health, and investigate the persistence of observed effects; 2) assess the impact of re-employment and jobless duration on physical disability and mental health; and 3) examine the effect of job displacement on reports of disease onset. Using data from the first three waves of the Health and Retirement Survey (HRS), a nationally representative sample of older workers in the U.S., multivariate, longitudinal techniques will be used to investigate each aim. Physical disability will be measured using a composite score derived from self-reports of difficulties with activities of daily living and mobility tasks. Mental health will be assessed using available items from the Center for Epidemiological Studies-Depression scale. Onset of disease will be measured by self-reported myocardial infarction, high blood pressure, and cancer. The proposed analyses build on the Investigators' previous examinations of the health effects of job displacement among older workers using data from the first two waves of the HRS. In addition, these proposed analyses are an important component of the Investigators' longer term goal of examining and explaining variations in the health effects of job displacement among older workers in comparable data sets in other countries.
The larger goal of exploring cross-national comparisons will be the aim of a future application for national funding and is planned as the next step in the Investigators' research.

2003 — 2009
Anderson, Kenneth (co-PI); Bradley, Elizabeth
Collaborative Research: ITR: Software For Interpretation of Cosmogenic Isotope Inventories - Combination of Geology, Modeling, Software Engineering and Artificial Intelligence @ University of Colorado At Boulder
This project aims to improve the accuracy of geological chronologies based on cosmogenic isotope data. It combines research on computer models of the effects of various processes on the rate of accumulation of cosmogenic isotopes in geological formations, the development of a database of measurements (such as isotope abundances) and environmental parameters relevant to cosmogenic isotope dating (such as solar output and the state of the terrestrial magnetic field at different times), and an artificial intelligence (AI) system to support the use of cosmogenic isotope accumulation models in dating. The AI system will act as an expert system, guiding the user in the choice of modeling tools and data and integrating the selected models and data. The combined AI/database/modeling system will be built using a component-based architecture to make it simpler to modify and extend as new processes and data sets are added.
Because of the central importance to geological inference of establishing the age of geological structures, this project, if successful, should be helpful in a number of problems in research and applied geology. The project will also strengthen the connections between the geological and computer science research communities, both by virtue of the collaboration itself and by demonstrating the potential for exploiting information technology to further geological research.

2007 — 2012
Diwan, Amer (co-PI); Bradley, Elizabeth
CSR-SMA: Validating Architectural Simulators Using Non-Linear Dynamics Techniques @ University of Colorado At Boulder
This project strives to improve the state-of-the-art in architectural simulators in two ways: (i) by enabling users to determine if an architectural simulator correctly models the key aspects of a real hardware system; and (ii) by helping simulator implementors to identify weaknesses in their simulators. This problem is critical because much architectural research today uses simulations; if the simulators are not accurate then the results from the simulator may be misleading. Prior work validates simulators by using aggregate metrics (e.g., do the simulator and the real hardware execute a program in a similar number of total cycles?). This validation does not consider whether or not the simulator and the hardware exhibit similar time-varying behavior.
A key innovation of this project is that it embraces techniques from the field of non-linear dynamics. These techniques are well suited for computer systems for two reasons. First, computer systems are non-linear at all levels (e.g., the cost of a cache miss is not fixed and instead depends on many factors) and thus linear techniques will be of only limited use. Second, many aspects of computer systems are unknown (e.g., the hardware manufacturer may not reveal the full specifications of the hardware) or unmeasurable (e.g., it may not be possible to measure how often a particular event occurs deep down in the hardware). Non-linear dynamics techniques are specifically designed to deal with such missing information.

2007 — 2009
Bradley, Elizabeth Howe
R01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.
For-Profit Ownership and End-of-Life Care
DESCRIPTION (provided by applicant): Hospice is an increasingly important aspect of care for patients with cancer and their families; currently, more than half of all patients dying from cancer use hospice. In the last decade, the hospice industry has experienced dramatic growth, particularly in for-profit ownership. The number of for-profit hospices grew 300% while the number of nonprofit hospices grew only 43%. Despite this expansion, we know little about how for-profit ownership affects end-of-life care for patients with cancer and their families. To address this gap in knowledge, we will produce critical evidence about the impact of for-profit hospice ownership on cancer care. Our specific aims are: 1) to determine the association between hospice ownership type and claims-based quality indicators for patients dying from cancer, adjusted for patient and county-level features, and 2) to determine the association between hospice ownership type and additional quality indicators for patients dying from cancer that cannot be assessed with claims data and examine how these associations are modified by other organization-level features. For the first of these specific aims, we have created a unique dataset that has linked the Surveillance, Epidemiology, and End Results-Medicare database with the Medicare Provider of Services data and the Area Resource File and includes detailed clinical and utilization data for 88,293 patients with cancer. For the second specific aim, we will develop, pretest, and field a survey of 500 hospice organizations to evaluate organization-based quality indicators that have been recommended by RAND, the National Quality Forum, and the Institute of Medicine. The phenomenon of increasing for-profit ownership of hospice is dramatic, current, and has potentially far-reaching implications for patients with cancer and their families.
At a time when government and employers are looking for private sector solutions to health care problems, it makes sense to ask what impact for-profit hospice ownership has on end-of-life care and to identify opportunities for improving cancer care. Relevance in lay language: We will generate knowledge about the influence of for-profit hospice ownership on the experiences of patients with cancer and their families.

2012 — 2016
Bradley, Elizabeth
Reduced-Order Dynamical Models For Effective Power Management in Computer Systems @ University of Colorado At Boulder
This project concerns reduced-order modeling and forecasting techniques for reducing power use in microprocessor chips. Modern power management solutions for these systems employ extremely simple control strategies---e.g., lowering the clock frequency by a fixed, pre-determined amount if a processor's load crosses some threshold. The methods developed by the control theory and nonlinear dynamics communities have long since moved beyond this level of sophistication. Model-based prediction, in particular, could enable vastly improved power management, but only if the models involved are accurate. If one could predict that a particular thread of computation would be bogged down for the next 0.6 seconds waiting for data from the computer's memory, for instance, one could put that thread on a low-power hold for that time period. Prediction of the future behavior of a complex nonlinear dynamical system like a modern computer is a serious challenge, however---and it is all but impossible if one uses mathematics that assumes linearity and/or time invariance, as has been the rule until recently in the computer systems community. The approach proposed here uses a novel reduced-order modeling strategy that first transforms the time-series data into a 2D representation called a tau-return map. The power of this representation is that it brings out temporal relationships explicitly, "unfolding" the temporal patterns in the time series into a spatial dimension. Forecast models working on this transformed data can be used to create accurate predictions of processor and memory loads in multicore processors: information that can be used to dynamically adapt the computation to the resources, and vice versa.
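The tau-return-map idea can be sketched minimally (this is our illustration; the project's forecast models are far more sophisticated): collect the pairs (x(t), x(t+tau)) and forecast by nearest-neighbor lookup on that 2D map.

```python
# Hedged sketch of a tau-return map and a nearest-neighbor forecast on
# it. The 'trace' is an invented toy load signal, not real processor data.

def tau_return_map(series, tau):
    """Pairs (x_t, x_{t+tau}): the 2D 'unfolded' representation."""
    return [(series[i], series[i + tau]) for i in range(len(series) - tau)]

def forecast(series, tau, query):
    """Predict x(t+tau) for a new value via nearest neighbor on the map."""
    pairs = tau_return_map(series, tau)
    x_now, x_next = min(pairs, key=lambda p: abs(p[0] - query))
    return x_next

# Toy 'processor load' trace with period-3 structure; the return map
# makes the temporal pattern explicit and the forecast recovers it.
trace = [0.2, 0.7, 0.4] * 20
print(forecast(trace, tau=1, query=0.7))   # -> 0.4
```

On real load traces the map is noisier, so one would fit a local model to several neighbors rather than copying a single one, but the representation is the same.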
Reducing the power use of microprocessor chips, one of the most critical classes of engineered systems in use today, is an important challenge in the modern world of ubiquitous computation. In order to manage power use effectively, one must be able to dynamically adapt the computation to the resources. Monitoring is one key element in solving that problem: if one knew which processing units in a multi-core processor were busy and which ones were idle, for instance, one could re-route work from the former to the latter. Forecasting is another key element: doing that kind of reallocation reactively is good, but doing it PROACTIVELY---based on a prediction of those loads and levels---would be far better. The complexity of modern computer systems makes prediction very difficult, however. These systems have large numbers of internal variables that interact in complex, nonlinear ways, and only a few of those variables can be monitored. The approach proposed here uses a mathematical mapping to tease the important temporal relationships out of these narrow streams of data that can be measured from a running computer. It builds forecast models in that new space using mathematics that fully respects the complexity and nonlinearity of the underlying system---unlike traditional approaches, which generally treat computer systems as linear and time-invariant. It uses those forecast models to save power by tailoring the computational load to the available resources, and vice versa. The potential impact of this work is significant, particularly in view of the recent design evolution of multicore processors and the rapid proliferation of mobile devices. Because computers are so common and so critical, this work has the potential to contribute to science, engineering, and well beyond.

2012 — 2017
White, James; Bradley, Elizabeth; Anderson, Kenneth (co-PI); Marchitto, Thomas (co-PI)
INSPIRE: Automating Reasoning in Interpreting Climate Records of the Past @ University of Colorado At Boulder
This INSPIRE award is jointly funded by the Information Integration and Informatics Program in the Information and Intelligent Systems Division of the Computer and Information Sciences Directorate, the Marine Geology and Geophysics Program in the Ocean Sciences Division of the Geosciences Directorate, the Arctic Natural Sciences Program in the Arctic Sciences Division and the Antarctic Glaciology Program in the Antarctic Sciences Division in the Office of Polar Programs, and the Office of Cyberinfrastructure.
The critical first step in the analysis of paleoclimate records like ice or sediment cores is the construction of an age model, which relates the depth in a core to the calendar age of the material at that point. The reasoning involved in age-model construction is complex, subtle, and scientifically demanding because the processes that control the rate of material accumulation over time, and that affect the core between formation and sampling, are unknown. Geoscientists approach this problem by treating the core like a crime scene and asking the question: "What physical and chemical processes could have produced this situation, and what does that say about the timeline?" However, the sheer number of possibilities, coupled with the volume and complexity of the climatology data that is currently available and is continually collected, severely limit the scope of these investigations. The goal of this project is to remove this roadblock. This research will lead to an integrated software tool called CScibox that uses automated reasoning techniques to help scientists analyze ice and sediment cores. It employs a cyberinfrastructure that provides powerful, intuitive tools on a scientist's desktop while taking full advantage of modern data- and computation-intensive computing and networking infrastructure -- including workflow-based computation, parallel execution, distributed systems, cluster machines and the cloud.
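By way of illustration only (this is not CScibox code, and the tie points are invented), the simplest possible age model linearly interpolates calendar age between dated control points in a core:

```python
# Hedged sketch: a linear age model for a core. Depths (m) and ages
# (years before present) in 'ties' are hypothetical radiocarbon dates.

def age_at_depth(depth, control_points):
    """Linearly interpolate age between (depth, age) tie points."""
    pts = sorted(control_points)
    for (d0, a0), (d1, a1) in zip(pts, pts[1:]):
        if d0 <= depth <= d1:
            frac = (depth - d0) / (d1 - d0)
            return a0 + frac * (a1 - a0)
    raise ValueError("depth outside the dated interval")

ties = [(0.0, 0.0), (2.0, 4000.0), (5.0, 12000.0)]
print(age_at_depth(3.5, ties))   # midway between 2 m and 5 m -> 8000.0
```

The hard part, which this sketch sidesteps entirely, is the reasoning about accumulation rates, hiatuses, and post-depositional disturbance that tells a scientist whether linear interpolation between two dates is even defensible.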
CScibox will not only improve the ability of individual geoscientists to analyze single cores; it has the potential to transform the field of paleoclimatology by facilitating rapid, reproducible analysis and synthesis of the information in the diverse collections of raw data available in data archives to foster understanding and improved scientific decision making. This will have broad impacts for society by allowing scientists to develop deeper insights into the roles of various factors in the complex relationships that give rise to geological records of the earth's climate that are used in today's models of environmental change. This project also has a broad educational impact. Students involved in the development and implementation of CScibox will develop skills in interdisciplinary research and will learn how to apply computational methodology in a challenging scientific context that has not yet significantly benefitted from developments in information technology. CScibox is designed to be easy to install and use; see the project web site (http://www.cs.colorado.edu/~lizb/cscience.html) for source code, documentation, and examples of its use. Future steps include extending the work to other paleoclimate data, working with geoscientists to make the user interface as intuitive as possible, and holding demos and workshops at geosciences conferences to educate that community about what the tool can do and how to use it.

2014 — 2016
Meiss, James (co-PI); Bradley, Elizabeth
EAGER: Characterizing Regime Shifts in Data Streams Using Computational Topology - the Mathematics of Shape @ University of Colorado At Boulder
Time-series data arise in a wide array of engineered systems, including network traffic, vibration sensors on machine tools, acoustic sensors on reactor containment vessels, and many other examples. The development of efficient and effective methods to characterize the patterns in such data has widespread utility in engineering, commerce and other fields. Methods for characterizing patterns in these streams could be used to detect malware attacks on a network, a lathe bearing that is degrading, or an impending containment failure in a reactor. Common challenges include observability - situations when sensors are expensive or difficult to deploy, or when they perturb the behavior under examination - as well as high information content, noise, and rapid regime shifts. The ultimate goal of this EArly-concept Grant for Exploratory Research (EAGER) project is to use computational topology, the fundamental mathematics of shape, to deal with these challenges. Shape is perhaps the roughest notion of structure and can be particularly robust to contamination of the signal. The specific goal of this study is to develop new methods for identifying and categorizing the temporal patterns associated with regime shifts in a stream of data.
A topological approach to time series analysis is distinct from standard methods of the machine learning and stream-mining communities, which typically use probabilistic approaches and often implicitly assume linearity. This project seeks to extract nonlinear structure not necessarily visible in a regression or spectral approach. Indeed, a regime shift need not correspond to a change in the frequency content of a signal, but could nevertheless be represented as a shift in the homology (e.g., Betti numbers) of the embedded signal. A goal is to develop techniques useful to engineers and scientists for the detection of incipient system failure or rapid evaluation of state changes from hidden causes. Existing algorithms of computational topology often require lengthy computations, especially for large data sets in many dimensions. However, since not all of those variables may be observable, one may have to reconstruct the full dynamics from partial measurements--e.g., using the process called delay-coordinate embedding. This project seeks rapid evaluation of Betti numbers based on incomplete, partial embeddings. A novel aspect is that the dynamics gives rise to a multivalued map on a simplicial complex, a "witness map." Selection of multiple parameters in the algorithms will be based on persistent homology, previously developed only for the analysis of static data sets and for a single parameter. The ultimate goal is robust and rapid regime detection for a limited data stream from a "black-box" source.
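Delay-coordinate embedding, mentioned above, is standard enough to sketch. The dimension and delay below are illustrative; in practice both must be chosen from the data:

```python
# Sketch of delay-coordinate embedding: reconstruct m-dimensional state
# vectors from a single scalar measurement stream using delayed copies.

def delay_embed(series, dim, tau):
    """Return vectors (x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau})."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[i + j * tau] for j in range(dim)) for i in range(n)]

signal = [0, 1, 2, 3, 4, 5, 6, 7]
vectors = delay_embed(signal, dim=3, tau=2)
print(vectors[0])   # -> (0, 2, 4)
```

The point cloud of such vectors is what the topological machinery (witness complexes, Betti numbers, persistent homology) then operates on.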

2015 — 2018
Meiss, James (co-PI); Bradley, Elizabeth
The Shape of Data: a New Way to Detect Critical Shifts in System Performance @ University of Colorado At Boulder
This project is about a new way to detect and classify shifts in the patterns of measurements taken by sensors. Subtle shifts in the output from a vibration sensor on a bridge abutment, for instance, can indicate that a crack is developing in that structure. Identifying these shifts can be a real challenge because modern sensors can generate so much information, and so quickly. Existing approaches to this problem use the mathematics of statistics: averages, variability, and the like. This project uses techniques from topology, the branch of mathematics that is concerned with shape, to detect these kinds of shifts.
This project will develop techniques to assess and characterize temporal patterns in nonstationary time-series data. The project will compute aspects of the homology---e.g., the first few Betti numbers---of the stream "on the fly" to obtain a signature of its regime. To mitigate the computational burden of traditional techniques, this project uses a simplicial complex based on a small, representative set of "landmarks," with the remaining data, the "witnesses," defining the connections. New techniques include a generalization of the oft-used false near-neighbor method to evaluate the fidelity of the topology of the reconstruction, efficient selection of landmarks and choice of witness relation, classification of structure through multi-parameter persistent homology of embedded data, and development of a map on the witness complex to obtain a dynamical signature of each regime. The goals include efficient detection of regime shifts, the development of a catalog of signatures within regimes, and the detection and mitigation of noise. The classification techniques will be applicable, for example, to the detection of failure modes in manufacturing systems, malware attacks on a network, incipient heart problems, or an impending containment failure.
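One common way to choose the "landmarks" that anchor a witness complex is greedy maxmin selection. The sketch below is a generic one-dimensional version of that idea (the project's own selection strategy may differ):

```python
# Sketch of greedy "maxmin" landmark selection: repeatedly pick the data
# point farthest from all landmarks chosen so far, so a few landmarks
# cover the data set well. Data here are invented 1D points.

def maxmin_landmarks(points, k):
    """Greedily pick k landmarks, each maximizing distance to the rest."""
    landmarks = [points[0]]                  # seed with the first point
    while len(landmarks) < k:
        def d_to_landmarks(p):
            return min(abs(p - q) for q in landmarks)
        landmarks.append(max(points, key=d_to_landmarks))
    return landmarks

data = [0.0, 0.1, 0.2, 5.0, 5.1, 10.0]
print(maxmin_landmarks(data, 3))   # -> [0.0, 10.0, 5.0]
```

The remaining points become the "witnesses" that determine which landmarks get connected, which is what keeps the complex small enough for on-the-fly homology computations.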

2017 — 2021
Bradley, Elizabeth W
R01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.
Functions of the Ph-Domain Leucine Rich Protein Phosphatase (Phlpp)1 in Osteoclasts @ University of Minnesota
Abstract Osteoporosis is a significant cause of morbidity and mortality in developed countries. In the United States, over 8 million women and 2 million men age 50 and above have osteoporosis. Another 51 million women and 35 million men have osteopenia. This translates into a higher risk of fracture within this population, with half of all women and one quarter of men projected to suffer a low bone mass-related fracture. Osteoporotic fractures impart deleterious consequences, including increased hospitalization and mortality. An estimated 25% of individuals will die within one year of a hip fracture. Despite the availability of anti-resorptive therapies that decrease fracture rates, their prescription and use have declined significantly due to fears of extremely rare but serious adverse events. This has created a treatment gap that threatens the quality of life of millions of Americans and poses a tremendous challenge to our healthcare systems and economy. It also highlights the need to better understand osteoclast formation and function. The goal of this five-year application is to determine the role of the protein phosphatase Phlpp1 in osteoclastogenesis and bone density and to define the epigenetic mechanisms required for Phlpp1 to modulate Csf1r (c-fms) expression and osteoclastogenesis. The five-year deliverables are: 1) definition of the functions of Phlpp1 in osteoclast progenitor cells, 2) characterization of the cytosolic functions of Phlpp1 that influence M-CSF responsiveness, Csf1r expression and osteoclast function, and 3) determination of the nuclear functions of Phlpp1 that influence Csf1r transcription and osteoclastogenesis. This work will increase our understanding of how Phlpp1 and associated pathways affect bone density.

2020 — 2023
Meiss, James (co-PI); Bradley, Elizabeth; Berger, Thomas (co-PI)
Harnessing the Data Revolution in Space Physics: Topological Data Analysis and Deep Learning For Improved Solar Eruption Prediction @ University of Colorado At Boulder
Eruptions generated by sunspots --- large concentrations of magnetic field on the visible surface of the Sun --- can have a number of dire impacts on Earth-based technological systems, crippling satellites and power grids, among many other things. With enough advance notice, the effects of these events can be mitigated, but predicting them is a real challenge. In current operational practice, this is accomplished by human forecasters examining images of the Sun, classifying each sunspot according to a taxonomy developed in the 1960s, and then using look-up tables of historical probabilities to say whether or not it will erupt in the next 24 hours. Recently, there has been a burst of work on machine-learning methods to automate this task. To date, the "features" used in these approaches have been predominantly physics-based: the gradient of the magnetic field, for instance, or the sum of its strength over high-flux regions. The main objective of this 3-year research project is to leverage algorithms based on the fundamental mathematics of shape --- topology and geometry --- to improve the performance of these methods. The specific plan is to use these powerful techniques to extend the relevant feature set to include characteristics of the magnetic field that are based purely on the geometry and topology of 2D magnetogram images. Although this approach ignores the 3D structure of the full electromagnetic fields, it can enhance the predictive skill of machine learning systems. Preliminary results show clear topological changes emerging in magnetograms of a 2017 sunspot more than 24 hours before it flared, as well as clear improvements in the accuracy scores of a neural-net based flare prediction method that employs these shape-based features. Better predictions of solar flares could allow operators of power grids, airlines, communications satellites, and other critical infrastructure systems to mitigate the effects of these potentially destructive events.
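The simplest feature of the kind described is Betti-0 of a thresholded image: the number of connected strong-field patches. This sketch uses an invented toy array, not a real magnetogram, and the project's features are much richer:

```python
# Hedged sketch: Betti-0 (count of 4-connected components) of the pixels
# exceeding a flux threshold in a small 2D magnetogram-like array.

from collections import deque

def betti0(image, threshold):
    """Count 4-connected components of pixels exceeding threshold."""
    rows, cols = len(image), len(image[0])
    seen, count = set(), 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and (r, c) not in seen:
                count += 1                      # found a new component
                queue = deque([(r, c)])
                seen.add((r, c))
                while queue:                    # flood-fill the component
                    i, j = queue.popleft()
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and image[ni][nj] > threshold
                                and (ni, nj) not in seen):
                            seen.add((ni, nj))
                            queue.append((ni, nj))
    return count

toy = [[0, 9, 0, 0],
       [0, 9, 0, 8],
       [0, 0, 0, 8]]
print(betti0(toy, threshold=5))   # two separate strong-field patches -> 2
```

Sweeping the threshold and tracking how such counts appear and merge is the one-parameter persistence computation that underlies the shape-based features described above.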
The broader impacts of this project also include the development of the STEM workforce through the training of graduate students at the University of Colorado at Boulder, as well as education and outreach, including community lectures, development of large-scale, online courses and public lecture series. The interdisciplinary nature of the project will deepen the contact between the fields of space weather, applied mathematics, and computer science, bringing researchers, students, and post-docs from both fields into productive new collaborations. The collaboration with the Space Weather Technology, Research, and Education Center at the University of Colorado offers unique opportunities to factor in real-world forecasting constraints and set the stage for transitioning the results to operational status.
For the first time, this 3-year research project would provide systematic quantitative measures of the shape of 2D magnetic structures in the Sun's photosphere for the purposes of solar flare prediction. In a sense, this amounts to a mathematical systemization of the venerable McIntosh and Hale classification systems. This approach differs from current studies in the solar physics community that model the magnetic field-line structure: it uses topology to address the structure of two-dimensional sets. The analysis is restricted to photospheric magnetic field structures; the goal is to extract a formal characterization of shape that can be leveraged by machine learning to improve flare prediction. The considered addition of geometry into these methods by the project team is essential if they are to capture the full richness and physical relevance of the structures important in the evolution of a sunspot. This research project will point the way forward to a more robust set of features for machine-learning-based eruption prediction architectures. The research and EPO agenda of this project supports the Strategic Goals of the AGS Division in discovery, learning, diversity, and interdisciplinary research.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.