1983 — 1985 |
Chazelle, Bernard |
Theoretical Computational Geometry (Computer Research) |
0.967 |
1987 — 1990 |
Chazelle, Bernard; Dobkin, David (co-PI) |
Investigations Into the Nature of Search - Data Structures and Geometric Applications |
1 |
1990 — 1993 |
Chazelle, Bernard; Dobkin, David (co-PI) |
The Design and Implementation of Geometric Algorithms
Research will be conducted in three broad areas of computational geometry, spanning the spectrum from practical considerations that arise when implementing and debugging geometric algorithms to theoretical questions in algorithm design and lower bound proofs: (1) the design and analysis of algorithms, concentrating mostly on problems involving nonlinear surfaces in dimensions higher than two, in particular hidden surface removal, triangulations of real-algebraic varieties, and multidimensional searching; (2) the construction of an environment for implementing geometric algorithms, with tools that researchers can ultimately use to produce and share geometric software; and (3) the design of robust geometric algorithms, which entails tackling problems arising from finite-precision arithmetic as well as the degeneracy of common real-world geometric data.
|
1 |
1993 — 1997 |
Chazelle, Bernard; Dobkin, David (co-PI) |
The Design and Analysis of Geometric Algorithms
The project investigates problems within three broad areas of computational geometry: (1) design and implementation of efficient and robust geometric primitives, (2) algorithmic complexity of multidimensional searching and (3) mathematical tools (e.g. discrepancy theory) for randomized (and derandomized) geometric algorithms. These areas cover the spectrum from practical considerations which arise when implementing and debugging geometric algorithms to theoretical questions arising in algorithm design and lower bound proofs. Tools continue to be built which can ultimately be used to produce and share geometric software.
|
1 |
1998 — 2001 |
Chazelle, Bernard; Dobkin, David |
Design and Experimentation in Computational Geometry
This project involves the dual pursuit of core theoretical investigations in computational geometry and experimentation with geometric codes. Theoretical investigations will address lower bounds, geometric sampling techniques and derandomization. Experimentation will involve the implementation of algorithms as well as the development of tools to make future implementations easier. Special emphasis will be given to problems of simplification and multiresolution of shapes in 3D, sampling and optimization, algorithm animation and experimentation tools.
|
1 |
2000 — 2004 |
Chazelle, Bernard; Dobkin, David |
Algorithms and Experimentation in Computational Geometry
The project combines central theoretical investigations in geometric computing with experimentation on geometric codes. A major part of the effort will be devoted to problems of multi-scale representation and simplification of shapes in 3D, with applications to computer graphics and virtual reality, as well as to sampling and optimization, algorithm animation, and visualization. On the theoretical side, the work is expected to draw mostly from complexity theory, discrepancy theory, and algorithm design, while the experimental aspect will emphasize software building using available geometric codes and animation tools.
|
1 |
2001 — 2005 |
Chazelle, Bernard; Dobkin, David (co-PI); Finkelstein, Adam (co-PI); Funkhouser, Thomas |
Itr/Im:3d Shape-Based Retrieval and Its Applications
This research will investigate methods for automatic retrieval and analysis of 3D models. It will develop computational representations of 3D shape for which indices can be built, similarity queries can be answered efficiently, and interesting features can be computed robustly. Next, it will build user interfaces which permit untrained users to specify shape-based queries. This will include queries specified with text, 3D models, 2D sketching, and high-level methods based on constraints and rules. It will combine elements of computer graphics, computer vision, and computational geometry.
Applications of shape-based query methods will include Internet search engines, computer-aided design, molecular biology, medicine, and security. In each application the researchers will work with domain experts to understand the critical elements of the 3D databases and the challenging shape queries for which new methods are required. For example, working with molecular biologists will help develop query tools for the Protein Data Bank to find macromolecules matching a given shape. These methods will aid classification of proteins for which only low-resolution electron density maps are available, and aid searches for proteins matching a specific binding site.
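One simple family of shape representations in the spirit described above is a "shape distribution": sample many random points on a model's surface, histogram the distances between random point pairs, and answer similarity queries by comparing histograms. The sketch below is a generic illustration of that idea (NumPy; the mesh inputs and function names are hypothetical), not the project's actual retrieval system.

```python
import numpy as np

def sample_surface_points(vertices, faces, n, rng):
    """Draw n points uniformly from a triangle mesh.
    vertices: (V, 3) float array; faces: (F, 3) integer array of vertex indices."""
    tri = vertices[faces]                                              # (F, 3, 3)
    areas = 0.5 * np.linalg.norm(
        np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0]), axis=1)
    idx = rng.choice(len(faces), size=n, p=areas / areas.sum())        # area-weighted triangle choice
    r1, r2 = rng.random(n), rng.random(n)                              # uniform barycentric sampling
    s = np.sqrt(r1)
    a, b, c = 1.0 - s, s * (1.0 - r2), s * r2
    t = tri[idx]
    return a[:, None] * t[:, 0] + b[:, None] * t[:, 1] + c[:, None] * t[:, 2]

def d2_descriptor(vertices, faces, n_pairs=100_000, bins=64, seed=0):
    """D2-style shape distribution: histogram of distances between random surface
    point pairs, normalized by the mean distance so the descriptor is scale-invariant."""
    rng = np.random.default_rng(seed)
    p = sample_surface_points(vertices, faces, 2 * n_pairs, rng)
    d = np.linalg.norm(p[:n_pairs] - p[n_pairs:], axis=1)
    d /= d.mean()
    hist, _ = np.histogram(d, bins=bins, range=(0.0, 3.0), density=True)
    return hist

def shape_distance(h1, h2):
    """L1 distance between two descriptors; smaller means more similar shapes."""
    return float(np.abs(h1 - h2).sum())
```

A retrieval system built this way indexes every model by its descriptor and answers a query by ranking models according to shape_distance from the query's descriptor.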
|
1 |
2003 — 2007 |
Chazelle, Bernard; Dobkin, David |
New Directions in Geometric Algorithm Design
Special Abstract for 0306283 (Dobkin and Chazelle, Princeton University): New Directions in Geometric Algorithm Design
This project pursues a number of objectives that have been at the heart of important new developments in computational geometry. Chief among them is the issue of algorithm design for large datasets: how to deal with high dimensionality and uncertainty; how to optimize functions approximately in sublinear time; how to simplify and visualize complex data; how to customize data structures to speed up search. The effort is to involve a mix of theoretical and experimental investigations, with targeted applications to surface simplification, 3D shape matching, massive dataset visualization, and protein structure prediction. The theoretical issues involve combinatorial geometry, algorithm design, and basic complexity theory.
This effort is aimed at deriving new computational methods for solving problems of a geometric or biological nature that have resisted past investigations for one of two reasons: either the input data is too massive to be processed directly and can only be "sampled" cleverly, or the number of variables is itself so high that standard methods suffer from an exponential blowup in the time it takes to run them. New dimension reduction techniques are needed to resolve this bottleneck.
The proposal is a careful outline of research work that may greatly aid in geometric data
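The dimension reduction mentioned above is, in spirit, what random-projection techniques of the Johnson-Lindenstrauss type provide: project high-dimensional points onto a random low-dimensional subspace while approximately preserving pairwise distances. The sketch below is a generic illustration of that technique (NumPy), not code from the project.

```python
import numpy as np

def jl_project(X, k, seed=0):
    """Johnson-Lindenstrauss-style random projection.

    X: (n, d) data matrix; k: target dimension with k << d.
    With k on the order of log(n) / eps**2, all pairwise distances are
    preserved within a (1 +/- eps) factor with high probability.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    R = rng.standard_normal((d, k)) / np.sqrt(k)   # random Gaussian projection matrix
    return X @ R

# Toy usage: one pairwise distance before and after projecting 10,000-dim points to 400 dims.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 10_000))
Y = jl_project(X, k=400)
print(np.linalg.norm(X[3] - X[17]), np.linalg.norm(Y[3] - Y[17]))
```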
|
1 |
2005 — 2007 |
Chazelle, Bernard; Singh, Mona |
Sger: Algorithms For Predicting Protein Function Using Interaction Maps
Intellectual Merit
The goal of this project is to develop algorithms for analyzing protein interaction maps in order to make novel predictions about a protein's biological process. The aim is to provide a framework for moving from individual pairwise linkages to exploiting entire interaction networks, where each interaction may arise from experimental and/or computational methods.
Methods for analyzing interaction networks are in their infancy, and most current approaches predict the function of a protein by considering only the annotations of its direct interaction partners. In contrast, the proposed methods will use the global connectivity of interaction networks, the relationships between functions, and several high-throughput data sources in making predictions. The hope is that these novel network-based algorithms will yield functional predictions for many as-yet uncharacterized proteins.
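To make the contrast concrete, the toy sketch below (a hypothetical graph and labels, not the project's algorithm) scores a candidate function for every protein by iteratively propagating annotation evidence across the entire interaction network, rather than looking only at a protein's direct neighbors.

```python
import networkx as nx

def propagate_function(graph, annotated, function, alpha=0.8, iters=50):
    """Score each protein for one candidate function by propagating evidence
    over the whole interaction network.

    graph: networkx.Graph of protein-protein interactions
    annotated: dict mapping protein -> set of known functions
    Returns dict protein -> score in [0, 1]; high-scoring unannotated
    proteins are predicted to carry the function.
    """
    seed = {p: 1.0 if function in annotated.get(p, set()) else 0.0 for p in graph}
    score = dict(seed)
    for _ in range(iters):
        new = {}
        for p in graph:
            nbrs = list(graph.neighbors(p))
            nbr_avg = sum(score[q] for q in nbrs) / len(nbrs) if nbrs else 0.0
            # Blend the propagated neighborhood signal with the original annotation.
            new[p] = alpha * nbr_avg + (1 - alpha) * seed[p]
        score = new
    return score

# Toy usage with a hypothetical five-protein network.
g = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("D", "E")])
known = {"A": {"kinase"}, "B": {"kinase"}}
print(propagate_function(g, known, "kinase"))
```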
Broader Impact
Both PIs teach cross-disciplinary courses in computational biology, and the research outlined here will further enhance their educational efforts. The co-PI, Chazelle, has designed and is teaching a new undergraduate course: an integrated, quantitative introduction to the natural sciences. Together with biologists, physicists, and chemists, the PI, Singh, has designed a graduate course, "Introduction to Computational Molecular Biology and Genomics"; she has co-taught it with a molecular biologist for the past four years.
The proposed work will develop methods using interaction maps for baker's yeast and fruit fly. Humans share many proteins and pathways with these model organisms, so network analysis methods may allow transfer of information from them to humans, potentially revealing critical information about proteins and pathways implicated in human disease. Predictions and software will be made available on the web (www.cs.princeton.edu/mona/software.html).
|
1 |
2006 — 2009 |
Chazelle, Bernard |
Data-Powered Algorithms
This project intends to focus on several aspects of this emerging area: self-improving algorithms; online data reconstruction; sublinear algorithms; dimension reduction; low-entropy data structures; nonuniformly priced computation; and data analysis. The scope of the proposed research is intentionally wide-ranging. Indeed, it is the PI's belief, backed by his preliminary investigations, that progress on these topics will rely on the emergence of common threads and broadly applicable methods. The intellectual merit of this proposal is to lay down the foundations for a systematic study of data-powered algorithms. The effort will involve theoretical investigations coupled with extensive computer experimentation. The principal beneficiaries of data-powered algorithms come from engineering and the natural sciences, and this is where the broader impact of this proposal will be sought; specifically, via the PI's ongoing research collaborations with biologists and physicists and, on the educational front, his joint effort with them to develop a year-long algorithms-based course for quantitative scientists.
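As a toy illustration of the "self-improving" idea only (the self-improving sorters in the literature learn finer per-coordinate predictors and carry entropy-optimal guarantees; everything named below is hypothetical), an algorithm can spend a training phase learning where inputs drawn from a fixed but unknown distribution tend to fall, then exploit that knowledge in its steady state:

```python
from bisect import bisect_left
import random

def train_boundaries(training_inputs, n_buckets):
    """Training phase: learn bucket boundaries (empirical quantiles)
    from sample inputs drawn from the unknown input distribution."""
    pooled = sorted(v for x in training_inputs for v in x)
    step = max(1, len(pooled) // n_buckets)
    return pooled[step::step][:n_buckets - 1]

def self_improving_sort(x, boundaries):
    """Steady-state phase: route each element to its learned bucket,
    sort the (typically small) buckets, and concatenate."""
    buckets = [[] for _ in range(len(boundaries) + 1)]
    for v in x:
        buckets[bisect_left(boundaries, v)].append(v)
    out = []
    for b in buckets:
        out.extend(sorted(b))
    return out

# Toy usage: train on a few inputs from the distribution, then sort fresh ones.
rng = random.Random(0)
train = [[rng.gauss(0, 1) for _ in range(100)] for _ in range(20)]
b = train_boundaries(train, n_buckets=16)
print(self_improving_sort([rng.gauss(0, 1) for _ in range(100)], b)[:5])
```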
|
1 |
2008 — 2014 |
Charikar, Moses (co-PI); Barak, Boaz (co-PI); Tarjan, Robert (co-PI); Arora, Sanjeev; Chazelle, Bernard |
Collaborative Research: Understanding, Coping With, and Benefiting From Intractability.
Expeditions in Computing: Understanding, Coping with, and Benefiting from Intractability
Computational intractability imposes a limit on the ability to understand nature and design systems. Intractability is a stumbling block for postmen, travelling salesmen, map colorers, and millions of others who would like to complete their tasks as efficiently as possible, yet it is the foundation of 21st century cryptography, which in turn is a pillar of electronic commerce. In order to understand, manage, and exploit intractability, it is imperative that progress be made on proving intractability in many forms and computational models, and on unraveling the interconnections among different forms and uses of intractability. This Expedition will explore an array of diverse but interrelated topics in computational intractability including algorithms, complexity, cryptography, analysis, geometry, combinatorics, and quantum mechanics. A "Center for Intractability," the first of its kind, will be based at Princeton. Addressing some of the deepest and hardest theoretical problems standing in the way of significant advancements in computer science, this Expedition involves a high degree of collegial interactivity through collaborations among geographically local participating institutions. Outreach includes an active "Women in Theory" program as well as programs targeting undergraduate and high-school students.
|
1 |
2010 — 2015 |
Chazelle, Bernard |
Af: Small: New Directions in Computational Geometry
This award seeks progress in new areas of computational geometry: specifically, collective behavior, sublinearity, hereditary structures, and low-entropy geometric algorithms. Recent work on multiagent agreement systems has revealed the great potential of a geometric approach to the subject. A new generating function, the "total s-energy", shows considerable promise for the study of consensus. A thorough investigation of this new device is the starting point of this project.
Geometric data tends to capture a vast amount of coherence and hence little entropy. The second part of this project looks at four instances of this phenomenon: Markov sources, hereditary complexity, sublinear computing, and self-improving geometric algorithms. In the first case, the data is generated by a Markov chain; in the second, it is presented within a larger structure that is given explicitly; in the third, the data is only accessible in limited amounts through random sampling; in the fourth, it comes from an unknown memoryless low-entropy distribution. In all cases, the objective is the same: to go beyond worst-case analysis and make room for a more realistic analytical framework.
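For context, the total s-energy referred to above is, as defined in Chazelle's work on multiagent agreement systems (restated here for orientation; the project's exact formulation may differ), a generating function summing the s-th powers of the lengths of the edges of the communication graphs over all time steps:

```latex
\[
E(s) \;=\; \sum_{t \ge 0} \; \sum_{(i,j) \in E_t} \bigl| x_i(t) - x_j(t) \bigr|^{s},
\]
```

where $x_i(t)$ is the state of agent $i$ at step $t$ and $E_t$ is the edge set of the communication graph in force at step $t$. Bounds on $E(s)$ translate into bounds on the convergence rate of the underlying agreement dynamics.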
|
1 |
2010 — 2014 |
Chazelle, Bernard |
Ccf: Af Analytical Tools For Natural Algorithms
This project seeks to build an algorithmic foundation for the analysis of complex self-organizing systems. Computing theory has been successful at pinning down the computational power of common dynamical systems, but less so at analyzing their behavior and functionality. Recent research points to the potential benefits of viewing these systems as "natural algorithms." The approach is premised on the belief that the algorithm itself is at least as important as the data it generates and that the current reliance on simulations and statistics must be supplemented with analytical tools, something akin to an "algorithms calculus." The research will be oriented along the following lines: new proof techniques and analytical tools; communication-based algorithmic renormalization; geometry, topology, and approximation of complex systems.
The ambition of this project is to lay the foundations of an algorithmic theory of complex adaptive systems. Algorithm design has been one of the most fruitful areas of computer science in the past few decades. Building a bridge between this intellectual discipline and the ever-growing area of complex systems could potentially have a major impact on science as a whole. Accordingly, the present project will involve collaborations with computer scientists, control theorists, physicists, and systems biologists.
|
1 |
2014 — 2017 |
Chazelle, Bernard |
Af: Small: An Algorithmic Approach to Collective Behavior
This proposal seeks to answer fundamental questions about collective behavior by attacking them from an "algorithmic" perspective. The systems under consideration consist of agents communicating with one another and taking autonomous actions based on the information gathered along the way. The agents could be members of a social network, flocking birds, flashing fireflies, swarming bacteria, etc. In all cases, agents are equipped with their own (possibly distinct) decision procedures that tell them what to do under what conditions and what agents to listen to. Out of this diversity of local interactions, striking patterns will often emerge: birds will form triangles; fireflies will reach perfect synchronization; bacteria will perform quorum sensing. How does one study emergent self-organization of this sort? The premise of this work is that an algorithmic approach to collective behavior holds the promise of a uniquely novel and powerful line of attack.
The traditional tools for the study of complex systems draw mostly from the fields of dynamics and statistical physics. The PI's algorithmic approach will unfold in three phases. One is to develop general methods for decomposing complex systems into simpler ones. Just as graph clustering techniques are essential parts of the toolkit of network analysis, so "renormalization" methods are crucial for the analysis of self-organization and collective emergence. The second phase of this project entails the design of new tools for dynamic networks and time-varying random walks. The third phase is to investigate specific models of collective behavior, in particular classical systems for swarming and opinion dynamics. The challenge there is to classify the dynamic regimes of these systems (attractive, periodic, chaotic, Turing-universal, etc.).
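A minimal sketch of the "time-varying random walk" ingredient, under the generic assumption that the walker simply follows whatever graph is in force at each step (the graphs and helper names below are hypothetical, not the project's model):

```python
import random

def time_varying_walk(graph_at, start, steps, seed=0):
    """Random walk over a graph whose topology changes with time.

    graph_at(t) returns the adjacency dict {node: [neighbors]} in force at step t.
    If the current node is isolated at some step, the walker stays put.
    """
    rng = random.Random(seed)
    node, path = start, [start]
    for t in range(steps):
        nbrs = graph_at(t).get(node, [])
        node = rng.choice(nbrs) if nbrs else node
        path.append(node)
    return path

# Toy usage: the graph alternates between a path and a cycle on four nodes.
path_graph = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
cycle_graph = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(time_varying_walk(lambda t: path_graph if t % 2 == 0 else cycle_graph, start=0, steps=10))
```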
Broader impacts include curriculum development on this inter-disciplinary field and outreach presentations of the research at all levels.
|
1 |
2020 — 2023 |
Chazelle, Bernard |
Af: Small: Natural Algorithms and Dynamic Networks
The emergence of large-scale order from local interactions is a theme common to computing and living systems. The field of natural algorithms seeks to discover the fundamental principles that underlie this phenomenon in all its remarkable diversity. Can simple algorithmic rules explain how birds flock, how termites cooperate, how opinions polarize, how oscillators self-synchronize, how languages evolve through iterated learning, or how robustness emerges from adaptive systems? Beyond providing answers to these specific questions, the broader ambition of this project is to build bridges between the fields of algorithms and dynamical systems. Nearly all of the processes under consideration involve dynamic networks whose nodes represent autonomous agents interacting under time-varying topologies. It is often crucial to perform dimension reduction on such systems in a manner respectful of the dynamics. This task is approached through the lens of "semantic renormalization," a process that involves clustering dynamic graphs hierarchically. The theme of the project is highly multidisciplinary and forms the basis of graduate seminars and undergraduate projects with participation from computer science, mathematics, statistics, neuroscience, genomics, evolutionary biology, and mechanical engineering.
The project draws upon a wide range of techniques from areas as diverse as dynamical systems, machine learning, statistical mechanics, and network theory. It consists of four main parts: (a) "iterated learning" features students who, acting as Bayesian agents, learn from teachers and then, in turn, become teachers themselves; (b) "dynamic random walks" model random walks over graphs whose topology changes over time via a feedback loop; (c) "opinion dynamics" investigates how autonomous agents can reach consensus through mutual, constrained interaction; (d) "averaging systems" are coupled dynamical systems driven by convex combination updates over embedded dynamic networks.
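The sketch below gives one standard, concrete instance of the averaging systems described in (c) and (d): a bounded-confidence (Hegselmann-Krause-style) model in which each agent repeatedly moves to the average of the opinions within its confidence radius, a convex-combination update over a neighbor graph that changes as the opinions do. It illustrates the class of systems, not the project's specific model.

```python
import numpy as np

def bounded_confidence(opinions, radius, steps=100, tol=1e-9):
    """Bounded-confidence opinion dynamics (Hegselmann-Krause style).

    opinions: 1-D array of initial opinions; radius: confidence threshold.
    Each step, agent i averages the opinions of all agents (itself included)
    within `radius` of its own -- a convex-combination update whose
    'who listens to whom' graph is itself dynamic.
    """
    x = np.asarray(opinions, dtype=float)
    for _ in range(steps):
        close = np.abs(x[:, None] - x[None, :]) <= radius       # dynamic neighbor graph
        new_x = (close * x[None, :]).sum(axis=1) / close.sum(axis=1)
        if np.max(np.abs(new_x - x)) < tol:                      # opinions have settled into clusters
            return new_x
        x = new_x
    return x

# Toy usage: eleven evenly spread opinions collapse into clusters set by the radius.
print(bounded_confidence(np.linspace(0.0, 1.0, 11), radius=0.3))
```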
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
|
1 |