1995 — 2001
Rosen, J. Ben; Petzold, Linda; Tranquillo, Robert
MDC: a High-Performance Problem-Solving Environment For Optimization and Control of Chemical and Biological Processes @ University of California-Santa Barbara
1
2000 — 2005
Yang, Tao (co-PI) [⬀]; Petzold, Linda; Mezic, Igor (co-PI) [⬀]; Macdonald, Noel (co-PI) [⬀]; Tirrell, Matthew (co-PI) [⬀]
ITR: Computational Infrastructure For Microfluidic Systems With Applications to Biotechnology @ University of California-Santa Barbara
The applications of microfluidic devices (which involve liquids moving in spaces measured in micrometers, i.e. millionths of a meter) are growing explosively. As a specific example, consider the development of microsystems for blood testing and screening. For consumers, one could envision devices available in drugstores that could perform genetic screening for conditions of concern to individuals. At a larger scale, use of such devices in blood banks could significantly reduce the time and blood lost in screening the 14 million pints of blood donated per year. Sample preparation is a critical bottleneck in the development of integrated miniature analytical systems, and it remains largely unaddressed. It is currently done outside the microsystem by mixing, shaking, and pipetting, because there are no effective integrated design methods. Improved computational methods promise to allow integration and interconnection of microfluidic components. This will have an effect analogous to that of automated VLSI design methods on microelectronics; it will revolutionize the field.
This project will develop a computational infrastructure for simulation and design of microfluidic systems involving non-Newtonian, micrometer/nanometer-scale flows dominated by surface-related phenomena. Computational and analytical tools will be developed, and their results compared with theory and experiment. The project emphasizes methods to deliver complex molecules to flow surfaces, to create surface reaction sites, and to provide the components for molecular-scale mixing and dispensing. It will design, fabricate, and characterize both stationary and oscillating MEMS fluidic channels and surfaces to evaluate molecular-scale mixing, flow, delivery, and dispensing of complex biological fluids. The focus will be on surface-dominated flow and reaction phenomena that can be scaled for delivery of single molecules to programmed reaction sites. Such surface-related phenomena should find broad application in making MEMS-based, "chip-scale" analytical instruments and "biochips". The computational tools required to analyze and design such devices are currently nonexistent. This project brings together a team of computer scientists, numerical analysts, fluid dynamicists, experimentalists, and microscale process theoreticians who will collaborate closely on creating those tools and using them.
1
2002 — 2006
Petzold, Linda; Milstein, Frederick (co-PI) [⬀]; Maroudas, Dimitrios (co-PI) [⬀]
Collaborative Research ITR/AP: Enabling Microscopic Simulators to Perform System-Level Analysis @ University of California-Santa Barbara
Research:
An interdisciplinary research team, at four universities, is collaborating on this medium-size Information Technology Research (ITR) project aimed at systematically bridging the gap between microscopic descriptions of complex material systems and systems-level analysis of direct engineering importance. A mathematics-assisted computational methodology will be developed that will enable microscopic-level simulators to perform systems-level analysis directly, without the need to pass through an intermediate-level description of the material system through macroscopic (partial differential or integro-differential) evolution equations. Specifically, an ensemble-averaged "coarse" time-stepper-based computational superstructure will be "wrapped around" state-of-the-art microscopic dynamic simulators, such as molecular dynamics, kinetic Monte Carlo, lattice-Boltzmann, or hybrid codes. This methodology will enable microscopic simulators to perform advanced systems-level analysis: stability, bifurcation, "coarse" integration, sensitivity, and control tasks for complex, nonlinear distributed processes. The planned algorithms will run on massively parallel machines.
The computational framework will consist of the following basic elements: (i) choice of statistics of interest (e.g. distribution moments) for describing the coarse behavior; (ii) "lifting" of a macroscopic initial condition to an ensemble of consistent microscopic configurations; (iii) evolution over the same (short) time period of each initial microscopic configuration in the ensemble according to a microscopic simulator that embodies the best current description of the physical system; (iv) averaging ("restriction") over the ensemble of the evolved microscopic configurations to provide a macroscopic evolved system state; and (v) execution of the previous three steps over a finite set of macroscopic initial conditions. This new approach is robust in its implementation and portable in its range of scientific and engineering applications. It has general applicability to all systems for which a macroscopic description is conceptually possible, yet unavailable in closed form. It circumvents the difficulty in obtaining and closing such macroscopic models, while computationally extracting precisely the information that would be obtained by a macroscopic model, had the model been available in closed form. This provides the link between ITR and a spectrum of application areas.
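The five elements above amount to a "lift-run-restrict" loop that can be sketched in a few lines. In this toy sketch (all function names, the Ornstein-Uhlenbeck "microscopic simulator", and every parameter value are illustrative assumptions, not part of the project's actual codes), the coarse variable is the ensemble mean, and the loop drives a coarse projective integration step:

```python
import numpy as np

rng = np.random.default_rng(0)

def micro_step(x, dt=0.01):
    # Toy "microscopic" simulator: Ornstein-Uhlenbeck particles,
    # standing in for molecular dynamics or kinetic Monte Carlo.
    return x - x * dt + np.sqrt(2 * 0.1 * dt) * rng.normal(size=x.shape)

def lift(m, n_particles=5000):
    # (ii) Lift a macroscopic state (here, the ensemble mean m) to
    # an ensemble of consistent microscopic configurations.
    return m + rng.normal(scale=0.3, size=n_particles)

def restrict(x):
    # (iv) Restrict: ensemble-average back to the coarse variable.
    return x.mean()

def coarse_time_stepper(m, n_micro=10, dt=0.01):
    # (iii) Evolve the lifted ensemble over a short time, then restrict.
    x = lift(m)
    for _ in range(n_micro):
        x = micro_step(x, dt)
    return restrict(x)

# Coarse projective integration: two coarse states a short burst
# apart estimate the (unavailable) macroscopic time derivative,
# enabling a large extrapolation step.
m = 2.0
for _ in range(20):
    m1 = coarse_time_stepper(m)
    dm_dt = (m1 - m) / (10 * 0.01)   # slope over the short burst
    m = m1 + 0.5 * dm_dt             # projective step of length 0.5
print(f"coarse mean after projective integration: {m:.3f}")
```

The projective step extrapolates the coarse state forward without ever writing down a macroscopic evolution equation, which is precisely the point of the "wrapper".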
Impact:
The impact of the research will be on establishing a powerful and general link between state-of-the-art microscopic-level simulations and fast systems level analysis capabilities. Although the research focuses on specific problems in heterogeneous hard materials and complex fluids, the computational framework is applicable to a broad range of complex systems, including biological systems, their processing and function. Since it has the potential to revolutionize engineering systems-level analysis, it could have educational impact as well as furthering advances in microelectronics, bioinformatics and nanotechnology.
1
2002 — 2013
Birnir, Bjorn (co-PI) [⬀]; Petzold, Linda; Homsy, George (co-PI) [⬀]; Meiburg, Eckart (co-PI) [⬀]; Maroudas, Dimitrios (co-PI) [⬀]
IGERT: Graduate Education Program in Computational Science and Engineering With Emphasis On Multiscale Problems in Fluids and Materials @ University of California-Santa Barbara
This IGERT program is structured to provide a unique Ph.D. program in interdisciplinary research and education in Computational Science and Engineering (CSE). The vision is to educate students for whom working in interdisciplinary teams is the norm, and who have the ability to acquire knowledge, ways of thinking, and perspectives from other disciplines. The proposed IGERT Ph.D. experience is different from one in a traditional discipline, and possibly unique among CSE programs in the USA. The IGERT Ph.D. theses will be jointly supervised, and those students with a particular disciplinary orientation will share resources, knowledge, and approaches with IGERT students with other orientations. While a typical IGERT Ph.D. thesis will still have a strong focus in a discipline, it will contain major elements of independent creative work in other disciplines relevant to the general problem area under study. IGERT students and faculty will work together in three Focus Groups: Microscale Engineering, Complex Fluids, and Computational Materials Science, to solve a wide range of important and timely problems that depend deeply on integration of information from the smaller scales to the larger scales. These multiscale problems require a strong foundation in both engineering and the mathematical and computational sciences. The curriculum ensures depth in one area and a significant exposure to high-level courses in one or more ancillary areas. It includes new courses in atomic-scale computer simulation and high-performance computing, to specifically address the multiscale nature of the Focus Group problems and their computational requirements. An internship is required to broaden and reinforce the interdisciplinary research experience, and a required series of workshops and seminars will give IGERT students a significant exposure to important aspects of career development and ethics.
IGERT is an NSF-wide program intended to meet the challenges of educating U.S. Ph.D. scientists and engineers with the multidisciplinary backgrounds and the technical, professional, and personal skills needed for the career demands of the future. The program is intended to catalyze a cultural change in graduate education by establishing innovative new models for graduate education and training in a fertile environment for collaborative research that transcends traditional disciplinary boundaries. In the fifth year of the program, awards are being made to twenty-one institutions for programs that collectively span the areas of science and engineering supported by NSF.
1
2003 — 2006
Sugar, Robert (co-PI) [⬀]; Brown, Frank [⬀]; Petzold, Linda; Metiu, Horia (co-PI) [⬀]
Acquisition of a High Performance Central Computing Facility At UCSB @ University of California-Santa Barbara
With support from the Major Research Instrumentation (MRI) Program, the Department of Chemistry and Biochemistry at the University of California in Santa Barbara will acquire a Beowulf cluster. This equipment will enhance research in a number of areas including a) simulation of biomaterial microstructures; b) simulation and interpretation of single molecule spectroscopy experiments; c) field-theoretic computer simulations of polymers and complex fluids; and d) modeling and computation of flow-structure interaction in multi-phase and complex fluids.
A cluster of fast, modern computer workstations is vital to serving the computing needs of active research departments. Such a "computer network" also serves as a development environment for new theoretical codes and algorithms, provides state-of-the-art graphics and visualization facilities, and supports research in state-of-the-art applications of parallel processing. These studies will have a significant impact in a wide number of areas, including materials science and physics.
1
2003 — 2008
Carlson, Jean (co-PI) [⬀]; Khammash, Mustafa [⬀]; Petzold, Linda
ITR Collab: Theory and Software Infrastructure For a Scalable Systems Biology @ University of California-Santa Barbara
The now well-known vision and challenge in post-genomics biology is to make the entire process of research scalable to large networks using high-throughput techniques and large-scale computation. Computational biology and bioinformatics have focused attention on the need for sophisticated methods for handling large databases and tools for modeling and simulating complex networks. Not as widely recognized is that the scalability of the more subtle processes of drawing meaningful and reliable scientific, medical, and biological inferences from the wealth of data and computation is equally important and requires the development of fundamentally new theory and software. The research objective of this project is to develop the theoretical foundation and information technology infrastructure necessary to accelerate progress in systems biology, with concrete demonstrations on a variety of biological experiments. This ambitious goal requires augmenting bioinformatics and current modeling and simulation approaches with greater understanding of the organizational principles underlying network complexity, including connections with molecular details, and exploiting this understanding to advance mainstream experimental biology. Building on recent breakthroughs in theory and scalable algorithms for systematic robustness analysis and model (in)validation of nonlinear network models with uncertain rate constants, the project maps out a research path that will (1) develop the necessary rigorous and practical mathematical theory; (2) embody it in a software environment that supports the complex iterative processes involved in going from raw data to modeling, analysis, and inference, with tight feedback to experimentation and modeling throughout; and (3) apply the theory and software to specific experimental studies in biology as a way of grounding the entire endeavor. The intellectual merit combines immediate practical impact and conceptual depth.
Automating and computationally augmenting scientific and mathematical inference from noisy and incomplete data for uncertain models has long been an elusive goal. Achieving it in the context of complex biological systems is for the first time both a necessity and an achievable goal. To do this, data and modeling assertions and questions must be described in a common framework that is biologically natural, yet can be stored, manipulated, shared, and ultimately turned over to powerful algorithms for resolution. Our objective is to create tools that make it possible to systematically answer questions such as: Is a proposed model consistent with experimental data? If so, is it robust to additional perturbations that are plausible but untested? Are different models at multiple scales of resolution consistent? What is the most promising experiment to refute or confirm a model? Traditionally, such network-level questions that arise naturally in biology have been considered computationally intractable, since the underlying systems are typically stochastic, nonlinear, nonequilibrium, uncertain, multiscale, and hybrid (mixing continuous and discrete mathematics), limiting approaches to heuristic and brute-force methods, or to extreme simplification. Recently this situation changed profoundly, based on new methods developed by the research team and their collaborators. A crucial insight is that evolution favors high robustness to uncertain environments and components, yet allows severe fragility to novel perturbations, and this robust-yet-fragile feature must be exploited explicitly in scalable algorithmic approaches. The broader impact lies in the synergistic links this work forges with similar challenges that exist throughout science and technology, such as the Internet, aerospace systems design, materials science, multiscale physics, stochastic multiscale chemistry, and disturbance ecology.
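As a cartoon of the model-(in)validation question posed above (and only a cartoon: the project's actual machinery involves rigorous robustness analysis and semidefinite programming, while the data, model, and function names here are invented for illustration), one can ask by brute force whether any rate constant in an asserted uncertainty interval reconciles a simple decay model with noisy measurements:

```python
import numpy as np

# Hypothetical data: a measured concentration at a few time points,
# with known measurement noise sigma.
t_data = np.array([0.0, 1.0, 2.0, 3.0])
y_data = np.array([1.00, 0.62, 0.37, 0.22])
sigma = 0.03

def model(t, k):
    # Candidate model class: first-order decay y(t) = exp(-k t),
    # with an uncertain rate constant k.
    return np.exp(-k * t)

def consistent(k_lo, k_hi, n_grid=200, z=3.0):
    # Brute-force (in)validation: does ANY rate constant in the
    # asserted uncertainty interval [k_lo, k_hi] reproduce every
    # data point to within z standard deviations?  (The project
    # replaces this kind of exhaustive search with scalable proofs.)
    for k in np.linspace(k_lo, k_hi, n_grid):
        if np.all(np.abs(model(t_data, k) - y_data) <= z * sigma):
            return True
    return False

print(consistent(0.4, 0.6))   # True: some k near 0.49 fits the data
print(consistent(1.0, 2.0))   # False: this model class is refuted
```

A `False` answer is the valuable one: it certifies that no parameter value in the asserted range can explain the data, so either the model structure or the uncertainty assertion must be revised.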
The theoretical foundations build broadly on robust control theory, dynamical systems, numerical analysis, operator theory, real algebraic geometry, computational complexity theory, duality and optimization, and semidefinite programming. The results will be made accessible to the broadest possible audience, both through representative and challenging experimental biology and through connections with other examples of complex systems. The preliminary progress already made by this team is striking and has been applied to understanding, for example, the robustness of complex control systems, the performance of Internet protocols, and bacterial chemotaxis and stress response. The work is creating new mathematics and algorithms, beginning to appear in the highest-impact journals, and concretely demonstrating that this research can help experimental biologists. Diversity and breadth appear at every level. In the research group of the lead PI (Doyle), 6 of 11 graduate students and 2 of 4 postdoctoral scholars are women, representing broad racial and ethnic diversity. The other 5 co-PIs are from a broad spectrum of disciplines and diverse but elite academic institutions, 3 are women, and all PIs have strong and very concrete commitments to integrative, multidisciplinary research, diversity, educational innovation, and outreach at every level, including K-12. The team members are frequent featured speakers at integrative conferences and in interdisciplinary colloquia at premier universities, and speakers and organizers of workshops and short courses in systems biology. This program both directly involves leading mainstream biology, and has broad contact with it through additional collaborations, creating conduits to broad dissemination of the research results in biology. The team's algorithms and software infrastructure are becoming de facto standard tools empowering research in multiple disciplines, and forming a solid foundation upon which this program builds.
1
2004 — 2010
Yang, Tao (co-PI) [⬀]; Liu, Xu-Dong (co-PI) [⬀]; Petzold, Linda; Alkire, Richard
ITR - (ASE) - (SIM+DMC): Computational Toolbox For the Investigation of Multiscale Surface Processes @ University of California-Santa Barbara
This project focuses on the development of a computational toolbox for investigation of multiscale surface processes that are central to nanotechnology as well as other current technologies. Two physical systems will be studied that span from nano-scale phenomena to large-scale deterministic transport phenomena. The algorithms and software, developed to simulate and extract information from multiscale systems, are generic over a broad class of problems, and will contribute well beyond the applications used in their development. The physical systems include electrodeposition of metallic nanoclusters with additives to achieve specific shapes, and environmental degradation through interaction of pits, crevices and cracks. The physical systems, chosen for their computational structure, are characteristic of a large class of systems where controlled shape evolution is exploited to produce desired structures. Key issues are to understand how small-scale surface interactions guide spontaneous self-organization, how to extract insight from noisy data and uncertain fundamental understanding, and how to ensure quality control at multiple scales in manufacturing. Computational tools will be developed for simulation and sensitivity analysis in multi-phenomena multiscale systems that require methods for coupling of stochastic and deterministic models. Challenges for deterministic simulation include the effective use of parallel computers, and dealing with moving boundaries, ill-conditioning and stiffness. We will explore classes of preconditioners for the iterative methods that solve large linear systems of equations at each time step, in particular a newly-developed multigrid method that is well-suited to moving boundary problems. Challenges for stochastic simulation include stiffness, which has only recently been recognized as a barrier to efficiency for stochastic simulation. Sensitivity analysis is an important part of this effort.
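The stochastic simulations referred to here are of the kind performed by Gillespie's stochastic simulation algorithm (SSA). A minimal direct-method sketch for a birth-death process (parameter values and names are illustrative, not the project's models) shows why long runs are expensive: every individual reaction event is simulated.

```python
import numpy as np

rng = np.random.default_rng(1)

def ssa_birth_death(k_birth=10.0, k_death=1.0, x0=0, t_end=20.0):
    # Gillespie direct method for the birth-death process
    #   0 --k_birth--> X,   X --k_death--> (degradation).
    # Each step draws the time to the next reaction from an
    # exponential with total rate a0, then picks which reaction fired.
    t, x = 0.0, x0
    while t < t_end:
        a = (k_birth, k_death * x)       # propensities
        a0 = a[0] + a[1]
        t += rng.exponential(1.0 / a0)
        if rng.random() * a0 < a[0]:
            x += 1                        # birth event
        else:
            x -= 1                        # death event
    return x

# At stationarity the copy number is Poisson(k_birth / k_death),
# so independent runs should scatter around 10.
samples = [ssa_birth_death() for _ in range(200)]
print("mean copy number:", sum(samples) / len(samples))
```

Stiffness arises when one reaction channel fires orders of magnitude more often than the others, forcing the loop above through enormous numbers of events to advance the slow dynamics of interest.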
For the deterministic computations, we will make use of recently developed methods that are adaptive in space and time. We will develop new sensitivity analysis methods and software for stochastic systems, and couple them to deterministic sensitivity analysis for the physical systems of interest. We will facilitate the use of our toolkit by extending to larger-scale software systems a recently-developed environment for the rapid creation of GUIs for scientific and numerical software. This project addresses the National Priority Area of Advanced Science and Engineering (ASE), and the Technical Focus Areas of Innovation in computational modeling or simulation in research or education (sim) (primary), and of Innovative approaches to the integration of data, models, communications, analysis and/or control systems, including dynamic, data-driven applications for use in prediction, risk-assessment and decision-making (dmc) (secondary).
Broader Impacts:
The proposed project will impact the National Priority Area of ASE through the development of algorithms and software to enhance the use of high performance computers in the investigation of multiscale surface processes. The availability of such a toolbox will accelerate fundamental scientific research and engineering design in an area with the potential for large economic impact. Software developed as a result of this project will be widely distributed in the scientific and engineering, computer science and mathematical sciences communities. The educational activities feature a multidisciplinary, cross-institutional approach to graduate education. Students will work in multidisciplinary teams, with joint thesis advisors from a primary and a secondary discipline. This approach has recently been undertaken at UCSB with some success; we plan to institutionalize this approach to graduate education in Computational Science and Engineering (CSE) at UCSB, and to export the model to UIUC.
The model also includes industrial internships, career development workshops, and mentoring of undergraduates. Both UIUC and UCSB have been pioneers in developing graduate programs in CSE and have programs with a similar structure which will facilitate the sharing of educational ideas and innovations across the institutions.
1
2006
Petzold, Linda R.
R01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.
Multiscale Modeling & Analysis of Circadian Rhythm @ University of California Santa Barbara
DESCRIPTION (provided by applicant): A multicellular model of mammalian circadian rhythm generation and synchronization will be developed through an integrated program of neurophysiological experiments, theoretical modeling, and multi-scale systems analysis and computation. Our hypothesis is that the coupling of individual pacemaker neurons is mediated by the neurotransmitter vasoactive intestinal peptide (VIP) prevalent in the suprachiasmatic nucleus (SCN). Experiments will identify the dose- and phase-dependence of SCN neurons on VIP for circadian synchrony. These experiments will guide the development of a pacemaker cell model in which a detailed description of the gene regulatory network responsible for rhythmic electrical activity is combined with a simplified description of the VIP signaling pathways implicated in circadian coupling. The pacemaker model will be used as a building block in the construction of neural population models that account for the known anatomy and physiology of the SCN, including the core and shell divisions and the distribution of VIP-producing cells in the two divisions. The resulting models will cover a wide range of time and length scales, ranging from the gene regulation level to the neuron signaling level to the tissue level. Deterministic simulation codes will be developed to allow the efficient simulation of large ensembles of coupled SCN neurons. Stochastic effects at the gene and signaling levels will be studied by developing combined deterministic/stochastic simulation codes. Parallel theoretical work on the population model and a simplified surrogate model will yield insights into stochastic effects in large neuron populations. The combined experimental, theoretical and computational work will allow systematic perturbation analysis of the roles of individual neurons and their interconnections on the precision and robustness of circadian rhythm generation.
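The detailed gene-network population models described above are far beyond a short example, but the core synchronization question can be caricatured with a Kuramoto-type phase model, a standard simplified surrogate for coupled circadian oscillators. In this sketch, VIP-mediated coupling is reduced to a single global coupling constant K, and all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def order_parameter(theta):
    # Kuramoto order parameter r in [0, 1]: 1 = perfect synchrony.
    return np.abs(np.mean(np.exp(1j * theta)))

def simulate(n=100, K=0.5, t_end=200.0, dt=0.05):
    # Each "neuron" has an intrinsic period near 24 h (with
    # cell-to-cell heterogeneity); all-to-all coupling of strength K
    # stands in for VIP-mediated signaling.
    omega = 2 * np.pi / rng.normal(24.0, 1.0, size=n)   # rad/hour
    theta = rng.uniform(0, 2 * np.pi, size=n)
    for _ in range(int(t_end / dt)):
        z = np.mean(np.exp(1j * theta))                 # mean field
        # Mean-field form of the Kuramoto model: every phase is
        # pulled toward the population's mean phase.
        theta += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta))
    return order_parameter(theta)

print(f"synchrony with coupling:    {simulate(K=0.5):.2f}")   # near 1
print(f"synchrony without coupling: {simulate(K=0.0):.2f}")   # near 0
```

Even this caricature reproduces the qualitative experiment: heterogeneous cells drift apart without coupling, and a modest coupling strength locks the population into a coherent rhythm.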
0.958
2007 — 2009
Petzold, Linda R.
R01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.
Multiscale Modeling & Analysis of Circadian Rhythm Generation & Synchronization @ University of California Santa Barbara
DESCRIPTION (provided by applicant): A multicellular model of mammalian circadian rhythm generation and synchronization will be developed through an integrated program of neurophysiological experiments, theoretical modeling, and multi-scale systems analysis and computation. Our hypothesis is that the coupling of individual pacemaker neurons is mediated by the neurotransmitter vasoactive intestinal peptide (VIP) prevalent in the suprachiasmatic nucleus (SCN). Experiments will identify the dose- and phase-dependence of SCN neurons on VIP for circadian synchrony. These experiments will guide the development of a pacemaker cell model in which a detailed description of the gene regulatory network responsible for rhythmic electrical activity is combined with a simplified description of the VIP signaling pathways implicated in circadian coupling. The pacemaker model will be used as a building block in the construction of neural population models that account for the known anatomy and physiology of the SCN, including the core and shell divisions and the distribution of VIP-producing cells in the two divisions. The resulting models will cover a wide range of time and length scales, ranging from the gene regulation level to the neuron signaling level to the tissue level. Deterministic simulation codes will be developed to allow the efficient simulation of large ensembles of coupled SCN neurons. Stochastic effects at the gene and signaling levels will be studied by developing combined deterministic/stochastic simulation codes. Parallel theoretical work on the population model and a simplified surrogate model will yield insights into stochastic effects in large neuron populations. The combined experimental, theoretical and computational work will allow systematic perturbation analysis of the roles of individual neurons and their interconnections on the precision and robustness of circadian rhythm generation.
0.958
2010 — 2016
Khammash, Mustafa (co-PI) [⬀]; Petzold, Linda
Collaborative Research: Next-Generation Algorithms For Stochastic Spatial Simulation of Cell Polarization @ University of California-Santa Barbara
Cell polarity, whereby cellular components that were previously uniformly distributed become asymmetrically localized, is essential to the diverse specialized functions of eukaryotic cells. A hallmark of cell polarity is spatial localization. From a modeling point of view, spatial localization cannot be understood without proper modeling of the spatial dynamics governing its creation and time evolution. At the same time, spatial dynamics are profoundly influenced by stochastic events that manifest as cellular noise. Therefore, deep understanding of cell polarity inevitably requires the proper modeling, simulation, and analysis of stochastic spatial dynamics. The fundamental problem limiting work in this area in the past has been the computational complexity of stochastic spatial simulations. This project develops the experimental data and algorithms for modeling, simulation and analysis of spatial stochastic dynamics arising in cell polarity in the yeast pheromone response system. A novel algorithm is developed to address the computationally intensive task of spatial stochastic simulation. The algorithm is further developed and integrated into a powerful software infrastructure to enable its widespread use. Experiments capable of capturing stochastic variability inform model development and analysis.
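Spatial stochastic simulation in this setting typically means the reaction-diffusion master equation (RDME), in which diffusive hops between mesh voxels are treated as first-order reaction events inside a Gillespie-style loop. The following 1-D, diffusion-only sketch (names and parameters are illustrative assumptions, not the project's algorithms or software) shows the event-driven structure and hints at its cost:

```python
import numpy as np

rng = np.random.default_rng(3)

def rdme_diffusion(n_voxels=10, n_mol=100, d=5.0, t_end=10.0):
    # Minimal RDME-style kernel: molecules hop between voxels of a
    # 1-D reflecting grid, each hop handled as a Gillespie reaction
    # event.  Real cell-polarity models add chemistry and 3-D
    # geometry, which is what makes the full problem so expensive.
    x = np.zeros(n_voxels, dtype=int)
    x[0] = n_mol                       # all molecules start in voxel 0
    t = 0.0
    while t < t_end:
        rates = 2.0 * d * x            # hop propensity: d per direction
        rates[0] = d * x[0]            # boundary voxels hop one way
        rates[-1] = d * x[-1]
        a0 = rates.sum()
        t += rng.exponential(1.0 / a0)
        i = rng.choice(n_voxels, p=rates / a0)   # which voxel fires
        if i == 0:
            j = 1
        elif i == n_voxels - 1:
            j = n_voxels - 2
        else:
            j = i + rng.choice([-1, 1])          # hop left or right
        x[i] -= 1
        x[j] += 1
    return x

x = rdme_diffusion()
print("final occupancy per voxel:", x)  # roughly flat after mixing
```

Every hop of every molecule is a separate event, so the event rate grows with both the molecule count and the mesh resolution; taming exactly this scaling is the motivation for the next-generation algorithms of the project.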
Software developed as a result of this project enables routine simulation of highly complex spatial stochastic phenomena across the sciences and engineering. All software will be made widely available. Tutorial courses and presentations at meetings and workshops will be given to ensure the accessibility of the research. Graduate students involved in this project are provided with a unique, highly multidisciplinary research experience. Students at UCSB work as a tightly-knit team co-advised by Petzold and Khammash, with extended visits to UCI to work in Yi's experimental lab, learning about the possibilities and limitations of the experimental techniques. UCI students focused on experiment spend significant time at UCSB working with the modelers, learning first-hand what the systems-level approach can bring to biological research.
1
2011 — 2014
Doyle, Francis J (co-PI) [⬀]; Henson, Michael; Herzog, Erik [⬀]; Petzold, Linda R.; Taylor, Stephanie R. (co-PI) [⬀]
R01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.
Mechanisms and Modeling of Networked Circadian Pacemaker Synchronization
DESCRIPTION (provided by applicant): The mammalian suprachiasmatic nucleus (SCN) is required for daily cycles in behavior and physiology. How the cells of the SCN synchronize to coordinate behavior is largely unknown. We have established a collaborative program combining experimental and computational methods to study large numbers of circadian oscillators, their connections, and the real-time kinetics by which they self-synchronize and respond to perturbations in environmental timing cues. To understand circadian regulation within the brain, we must understand the topology and types of interactions between circadian neurons. Aim 1 will monitor the network of SCN oscillators as they synchronize during fetal development, during entrainment, following a phase shift, and after restoration of cell-cell communication in the adult SCN. Using novel wavelet-based time series analyses, we will estimate the strength and direction of individual connections in the SCN. Aim 2 will use graph theory and spatial statistics to quantify network features of the developing and adult SCN. These analyses will define the mechanisms of synchronization during normal development and following environmental perturbations, and the relative contributions of local, regional, and global coupling to period precision. Aim 3 will compare the performance of the SCN networks under the four conditions with both deterministic and stochastic model networks. The computational models will investigate the effects of intrinsic noise and cell-cell heterogeneity on circadian synchronization. Revealing how circadian oscillators interact to generate a coherent rhythmic output will have important clinical implications for prevention and treatment of circadian rhythm disruptions, including mood and sleep disorders.
0.905
2012 — 2016
Van Der Ven, Anton (co-PI) [⬀]; Pollock, Tresa [⬀]; Begley, Matthew (co-PI) [⬀]; Petzold, Linda
DMREF: GOALI - Discovery, Development, and Deployment of High Temperature Coating/Substrate Systems @ University of California-Santa Barbara
TECHNICAL ABSTRACT:
The convergence of new computational capabilities, advanced characterization techniques and the ability to generate and harness large-scale data enables new pathways for the discovery, development and deployment of advanced materials systems. This program engages a multidisciplinary team to develop a fundamental framework for design of a new class of multilayered systems for deployment in new, energy efficient power generation and propulsion systems. Novel complementary computational and experimental tools developed will be integrated with existing tools and applied to a promising new class of intermetallic-strengthened cobalt-base alloys. The unique high-temperature properties of these alloys, when combined with thermal barrier coatings, promise very substantial improvements in powerplant efficiency, motivating GE Energy and GE Global Research as partners in this DMREF-GOALI program. The program will take a systems approach, developing tools and models that permit simultaneous design of the metallic substrate and intermetallic bond coat for compatibility with the ceramic top coat, going beyond the linear, experiment-driven approach historically employed for independent development of these three critical system elements.
NON-TECHNICAL ABSTRACT:
The convergence of new computational capabilities, advanced characterization techniques and the ability to generate and harness large-scale data enables new pathways for the discovery, development and deployment of advanced materials systems. This program engages an engineering and computer science team to develop a fundamental framework for design of new multilayered materials systems for energy efficient power generation and aircraft propulsion. Novel complementary computational and experimental tools will be developed and integrated with existing tools to accelerate development of a newly discovered cobalt-base substrate material along with compatible environmental protection layers. The program will take a systems approach, developing tools and models that permit simultaneous design of the layered system, going beyond the linear, experiment-driven approach historically employed for independent development of these critical system elements.
|
1 |
2012 — 2014 |
Petzold, Linda R. |
R01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.
Stochastic Simulation Service: a Cloud Computing Framework For Modeling and Simul @ University of California Santa Barbara
DESCRIPTION (provided by applicant): Stochasticity plays an important role in many biological processes. Examples include bistable genetic switches, noise-enhanced robustness of oscillations, and fluctuation-enhanced sensitivity or stochastic focusing. Numerous cellular systems, including development, morphogenesis, polarization, and chemotaxis, rely on spatial stochastic noise for robust performance. At the same time, stochastic simulations are complex and consume large amounts of computer time. They may require the researcher to be proficient in the use of one or more complex software packages. Learning to use existing simulation tools and to integrate them with other software takes considerable time. In many cases, the tools do not exist and require the expertise of mathematicians and computer scientists to develop them. Often, researchers must purchase and maintain clusters of computers to perform the large-scale computations. All of this adds cost and delay to the research process. Currently, no software package allows researchers to easily build a stochastic model of a biological system and scale it up to increasing levels of detail and complexity. We propose to build an environment in which the modeler can focus his/her attention on the biology, alleviating the burden of software installation and versions, mathematical algorithms, code optimizations, computer systems, etc. This environment will run on laptops and computer workstations (for small problems), extending on demand to high-performance compute clusters, grids, and public or private clouds, thus creating a cost-effective and energy-efficient solution for simulations of all sizes. We will equip this environment with state-of-the-art software for key classes of problems, and make it easy for software developers to integrate new and improved algorithms without the need to develop their own software infrastructure.
We will develop new algorithms and software to address key computational capabilities that have not previously been attainable: (1) fully-adaptive, hybrid solvers for stiff (and nonstiff) well-mixed systems, (2) efficient computation of probabilities of rare events, and (3) simulation of spatial stochastic systems at speeds that are several orders of magnitude faster than previous methods. The availability of such a community resource will enable and accelerate progress in both biology and algorithm development.
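The standard exact method for simulating well-mixed stochastic systems of the kind described above is Gillespie's direct-method stochastic simulation algorithm (SSA). The minimal sketch below is for orientation only; the function name, rate constants, and birth-death example are illustrative choices, not part of the proposed software:

```python
import math
import random

def gillespie_ssa(x, rates, stoich, t_end, seed=0):
    """Direct-method SSA for a well-mixed reaction system.

    x      : list of species counts (initial state)
    rates  : list of propensity functions, each taking the state x
    stoich : list of state-change vectors, one per reaction
    Returns the trajectory as a list of (time, state) pairs.
    """
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, list(x))]
    while t < t_end:
        a = [r(x) for r in rates]              # propensities at current state
        a0 = sum(a)
        if a0 == 0:
            break                              # no reaction can fire
        t += -math.log(1.0 - rng.random()) / a0  # exponential waiting time
        u, j, acc = rng.random() * a0, 0, a[0]   # pick reaction j w.p. a[j]/a0
        while acc < u:
            j += 1
            acc += a[j]
        x = [xi + si for xi, si in zip(x, stoich[j])]
        traj.append((t, list(x)))
    return traj

# Illustrative birth-death process: 0 -> X at rate k, X -> 0 at rate g*X
k, g = 10.0, 0.1
traj = gillespie_ssa([0], [lambda x: k, lambda x: g * x[0]],
                     [[1], [-1]], t_end=50.0)
```

The cost of firing every reaction event individually is what makes large or stiff systems expensive, which is the motivation for the hybrid and accelerated solvers proposed in (1)-(3).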
|
0.958 |
2019 — 2020 |
Petzold, Linda R. |
R01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies.
Stochss: a Next-Generation Toolkit For Simulation-Driven Biological Discovery @ University of California Santa Barbara
Project Summary
The development of a mathematical model is critical to the understanding of complex biological processes because it codifies current understanding so that it can be tested against existing data. A good model with sufficient detail can be used to identify potential points of intervention (for example, drug targets) at which an undesired outcome (for example, the effects of disease) of the process might be altered. Model development proceeds through a cycle of model building, simulation of the model under numerous conditions, and comparison to experimental data. The cycle is repeated, often augmented by new experiments to capture additional data, until the resulting model can plausibly explain the data. Tremendous amounts of time and effort must be devoted to finding and/or developing tools to analyze the model and compare it to the data; to fit the parameters and assess the effects of the typically large uncertainty in both the data and the parameters; to simulate the model and analyze the simulation data; to refine the model to better capture our increased understanding at each stage of the process; and to decide which additional experiments would add most to our understanding. Our objective in the proposed work is to facilitate and accelerate the modeling process by providing state-of-the-art, well-integrated tools that report complete and informative results at each stage, enabling the modeler and the experimentalist to focus on what they do best: scientific discovery. This is a renewal proposal that builds on the capabilities and infrastructure developed in the current project, in which we developed StochSS, a novel Software-as-a-Service offering for quantitative modeling of biochemical networks capable of seamless deployment in public cloud environments.
StochSS does an excellent job of supporting two of the major steps of the modeling process: Model Building, which takes a model description and puts it into a form that the StochSS simulation engines can work with, and Simulation, which performs the simulations to produce the results. The proposed project has three complementary Aims. The first is to further develop StochSS's core capabilities and to take the steps that will ensure its long-term sustainability; the second is to develop a Model Development Toolkit; and the third is to develop a Model Exploration Toolkit. Both toolkits will be integrated into our existing StochSS Model Building and Simulation environment and will leverage our existing software infrastructure for cloud computing.
Aim 1. Core Capabilities and Long-Term Sustainability. This aim has three sub-aims: (1) instituting practices that will help ensure community involvement and better long-term sustainability of StochSS beyond NIH funding, (2) extending core StochSS functional capabilities, and (3) improving compatibility with other software via support for standard data formats.
Aim 2. Model Development Toolkit. Develop and integrate tools to facilitate and accelerate the process of Model Development: the iterations of modeling, simulation, and experiment that are typically required to converge on the most plausible model that can explain the data. The Model Development Toolkit will address parameter estimation and quantification of uncertainty, generation and evaluation of the set of plausible models, and optimal design of experiments.
Aim 3. Model Exploration Toolkit. Develop and integrate tools for Model Exploration: the process of exploring the parameter space to ensure that the model is robust to variations in uncertain and/or undetermined parameters, to find the regions of parameter space in which the model is capable of yielding a given behavior, and to discover all of the qualitatively distinct behaviors that the model is capable of within the space of uncertain and/or undetermined parameters.
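The kind of parameter-space exploration described in Aim 3 can be sketched generically: sweep the uncertain parameter(s) of a model and classify the qualitative behavior at each point. The toy model below (the logistic map), the function name, and the classification thresholds are all illustrative assumptions for this sketch, not StochSS code:

```python
def logistic_map_behavior(r, n_transient=500, n_sample=100):
    """Classify the long-run behavior of x -> r*x*(1-x) at parameter r.

    Discards a transient, then counts the distinct values visited
    (rounded to 6 decimals) to label the regime.
    """
    x = 0.5
    for _ in range(n_transient):      # discard transient dynamics
        x = r * x * (1 - x)
    seen = set()
    for _ in range(n_sample):         # sample the attractor
        x = r * x * (1 - x)
        seen.add(round(x, 6))
    if len(seen) == 1:
        return "fixed point"
    if len(seen) <= 8:
        return f"period-{len(seen)} cycle"
    return "chaotic/aperiodic"

# Sweep the uncertain parameter r and record the regime at each point
sweep = {r / 100: logistic_map_behavior(r / 100) for r in range(205, 400, 10)}
```

For a real biochemical model the "behavior" would be a feature of the simulated trajectories (e.g. oscillatory vs. monostable vs. bistable), and the sweep would run over the full space of uncertain and/or undetermined parameters, but the structure of the exploration is the same.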
|
0.958 |
2021 — 2026 |
Pruitt, Beth [⬀] Petzold, Linda Clegg, Dennis (co-PI) [⬀] Manjunath, Bangalore (co-PI) [⬀] |
N/A (no activity code was retrieved)
Nrt-Urol: Data Driven Biology @ University of California-Santa Barbara
This National Science Foundation Research Traineeship (NRT) award to the University of California, Santa Barbara will develop, implement, and test an innovative graduate education model on the theme of Data Driven Biology (DDB). The program's goal is to train a new generation of biological scientists and engineers who can work across disciplines, are fluent in data analytics and experimental methods, and who will advance fundamental research in quantitative biology and bioengineering. Cell and developmental biology are in transition from qualitative observational sciences to quantitative, data-rich fields that leverage modeling and design principles from physics and engineering. Advances in imaging and sequencing technologies, paired with machine learning and computer vision tools, are having a transformative impact on quantitative cell biology. To take full advantage of these technologies, the modern biological engineer needs to be fluent both in the design of biological experiments and in data mining strategies to integrate information across scales (temporal and from genetic/molecular to cellular and tissue scales). DDB seeks to provide students with the breadth to collaborate across disciplines meaningfully and with the depth to answer biological questions with scientific rigor supported by knowledge of and experience with data science approaches. The project anticipates training 70 Ph.D. students, including 30 funded trainees, from doctoral programs in: biological engineering, biomolecular science & engineering, chemical engineering, computer science, electrical and computer engineering, molecular, cellular and developmental biology, mechanical engineering, and physics.
Through DDB, students will learn how to design experiments; acquire and integrate multi-modal, disparate data; and integrate machine learning and computational approaches to extract patterns and meaning from biological data to understand and leverage heterogeneity in stem-cell-derived models. Trainees will be supported by a new curriculum, which will serve as the basis of an emergent Biological Engineering Ph.D. program. Onboarding will include a structured course on seminal research papers and best practices for designing interdisciplinary inquiry. Armed with these training elements, students will be immersed in an in-vivo research experience to conduct hands-on redesign of seminal experiments and to personally implement advanced research methods to test the conclusions of these seminal papers. Students will also engage in co-mentored research rotation projects across diverse labs (experimental and modeling). To support self-reflection, deliberate career planning, and self-efficacy, the program will deploy a three-pronged mentoring plan, including a faculty advisor, peer feedback, and self-assessment through individual development plans. Finally, internships and externships will provide trainees with immersive exchange opportunities across a research network committed to convergent and translational training. This will allow students to experience firsthand how fundamental discovery can ultimately impact applied health applications.
The NSF Research Traineeship (NRT) Program is designed to encourage the development and implementation of bold, new potentially transformative models for STEM graduate education training. The program is dedicated to effective training of STEM graduate students in high priority interdisciplinary or convergent research areas through comprehensive traineeship models that are innovative, evidence-based, and aligned with changing workforce and research needs.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
|
1 |