2003 — 2007
Charikar, Moses (co-PI); Barak, Boaz; Yao, Andrew (co-PI)
Foundations of Complexity Theory
Complexity theory addresses the question of how much of a computational resource, such as time or memory, is needed to solve a given problem. In recent years complexity theory has found connections and applications in many scientific and engineering disciplines. This project investigates complexity theory across a broad front. The topics include communication complexity, decision trees, quantum algorithms, quantum cryptography, and other foundational issues in complexity theory. The goal is to gain deep insights into the nature of computation, so that one can perform computational tasks most effectively in any model. It is expected that a variety of algebraic and combinatorial techniques will be called upon to explore these topics.
Two of the most interesting scientific developments in recent years spring from the realization that complexity theory can be utilized for cryptography, and that quantum mechanics can be used to perform powerful computation. Advances on these fronts have attracted interest from the scientific community at large, as their relevance is far-reaching. The research of this project could lead to substantial further progress in these two areas.
In the broader domain of public education, the Principal Investigator has in recent years given many talks on topics in complexity, cryptography, and quantum computing. Many of these talks are geared toward general audiences, and often toward undergraduate students. Thoughtfully designed lectures of this kind can stimulate the intellectual curiosity of young minds and develop their interest in the computing sciences. With further research advances in these areas, public awareness and interest can be expected to grow, to the benefit of a stronger scientific education.
2004 — 2009
Charikar, Moses; Barak, Boaz; Yao, Andrew (co-PI)
ITR Collaborative Research: ASE-DMC Computational Complexity of Interactive Computation
In the rapidly growing world of Internet infrastructures, we face many challenging new problems. These arise in part because the usual assumptions made in problems of this general type may no longer hold. For example, many typical questions dealing with massive data sets involve networks or graphs of prohibitively large size; only partial information can be obtained, and that information changes dynamically. There is an increasing need to develop the theoretical foundation for these myriad complex processes. In particular, there are many unresolved fundamental issues regarding the computational and information-theoretic complexity of interactive computing, both in the classical setting and in other emerging computational paradigms. This project will investigate several interrelated areas:
- Major open problems in communication complexity (see the protocol sketch below).
- Two information-theoretic identification problems: Guessing Secrets and Finding Favorites.
- Two directions in quantum information processing: the quantum decision tree model and quantum communication complexity.
- Using techniques from the study of the so-called "power law model" for realistic networks to develop new methods for the analysis of on-line algorithms.

Impact: Interactive computing is prevalent in almost all areas of computing and communications, with applications in numerous areas of science and engineering, such as security, finance, information retrieval, bioinformatics, and beyond. However, the current theoretical foundation for interactive computation is primitive and far from satisfactory. The proposed study of the computational complexity of interactive computation is meant to strengthen our understanding and provide insight that is crucial for the design and analysis of interactive algorithms. Because of the fundamental and far-reaching nature of the proposed work, this study will help bring together different areas, and cross-fertilization can be expected to occur.
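The following toy protocol is a standard textbook illustration of the flavor of communication complexity, not a result of this project; all names and parameters are purely illustrative. Alice and Bob each hold an n-bit string and want to decide whether the strings are equal. Any deterministic protocol must exchange roughly n bits, but with a shared random evaluation point a short polynomial fingerprint suffices, at the cost of a small error probability.

import random

def fingerprint(bits, prime, base):
    # Evaluate the bit string as a polynomial at `base`, modulo `prime`.
    h = 0
    for b in bits:
        h = (h * base + b) % prime
    return h

n = 10_000
alice = [random.randint(0, 1) for _ in range(n)]
bob = list(alice)
bob[1234] ^= 1                     # flip one bit so the strings differ

prime = 1_000_003                  # a prime much larger than n
base = random.randrange(1, prime)  # shared random evaluation point

# Alice sends only her O(log n)-bit fingerprint; Bob compares locally.
same = fingerprint(alice, prime, base) == fingerprint(bob, prime, base)
print("protocol says equal:", same)

The difference of the two string-polynomials is nonzero and has degree below n, so it has fewer than n roots modulo the prime; the protocol therefore errs with probability below n/prime, here about 1%.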
2006 — 2011
Barak, Boaz |
CT-ISG: Cryptographic Foundations for Next-Generation Security Applications
This project tackles some of the foundational challenges that the advent of the networked society and of pervasive databases pose to cryptography. Specifically, it studies the existence of cryptographic protocols that allow mutually distrusting parties to achieve a common goal with a minimal sacrifice in privacy. This includes, as special cases, protocols for electronic elections, electronic auctions, privacy-preserving data mining, and many more applications.
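As a minimal sketch of this kind of protocol (an illustration only, assuming honest-but-curious parties and secure pairwise channels; it is not a construction from this project): three mutually distrusting parties can learn the sum of their private inputs, such as votes or bids, without any party learning another's input, using additive secret sharing.

import secrets

Q = 2**61 - 1  # public modulus, larger than any possible sum

def share(x, n=3):
    # Split x into n additive shares that sum to x modulo Q.
    shares = [secrets.randbelow(Q) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % Q)
    return shares

inputs = [17, 5, 30]                     # each party's private value
all_shares = [share(x) for x in inputs]  # party i sends its j-th share to party j

# Each party j locally adds the shares it received (one from every party)...
partials = [sum(all_shares[i][j] for i in range(3)) % Q for j in range(3)]

# ...and only these partial sums are published. Their total is the joint sum,
# yet no single published value reveals any individual input.
total = sum(partials) % Q
assert total == sum(inputs) % Q
print("joint sum:", total)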
A central issue in such protocols is the effect on security of executing multiple instances of one or several protocols over a network. It is known that in some cases such interaction can lead to subtle but fatal security flaws, and so constructing protocols whose security is robust with respect to such interactions is a central question in cryptography. This project uses recent and new ideas and techniques to attack this problem.
More concretely, the research studies the tradeoff between trust and interaction. There are known protocols with robust security assuming the existence of completely trusted third parties. This project focuses on weakening the assumption of a trusted party, and aims to discover to what extent this can be done without compromising security. The goal is to obtain both positive results (protocols achieving robust security under weaker trust assumptions than previously known) and negative results (showing that certain security goals are impossible to achieve without some trust assumptions). This will let users of secure applications know what is and is not achievable in terms of trust and security.
This project also tackles the main sources of inefficiency in current constructions of robustly secure protocols, in order to find out to what extent the current efficiency bottlenecks in such constructions are inherent.
2008 — 2014
Charikar, Moses (co-PI); Barak, Boaz; Tarjan, Robert (co-PI); Arora, Sanjeev; Chazelle, Bernard (co-PI)
Collaborative Research: Understanding, Coping With, and Benefiting From Intractability
Expeditions in Computing: Understanding, Coping with, and Benefiting from Intractability

Computational intractability imposes a limit on the ability to understand nature and design systems. Intractability is a stumbling block for postmen, travelling salesmen, map colorers, and millions of others who would like to complete their tasks as efficiently as possible, yet it is the foundation of 21st-century cryptography, which in turn is a pillar of electronic commerce. In order to understand, manage, and exploit intractability, it is imperative that progress be made on proving intractability in many forms and computational models, and on unraveling the interconnections among different forms and uses of intractability. This Expedition will explore an array of diverse but interrelated topics in computational intractability, including algorithms, complexity, cryptography, analysis, geometry, combinatorics, and quantum mechanics. A "Center for Intractability," the first of its kind, will be based at Princeton. Addressing some of the deepest and hardest theoretical problems standing in the way of significant advances in computer science, this Expedition involves a high degree of collegial interactivity through collaborations among geographically close participating institutions. Outreach includes an active "Women in Theory" program as well as programs targeting undergraduate and high-school students.
2008 — 2009
Barak, Boaz |
Women in Theory Workshop
The purpose of the workshop is twofold: first, to deliver an invigorating educational program; second, to provide an outstanding opportunity to bring together women students from different departments across the country (and possibly internationally), so as to foster a sense of kinship and camaraderie. The workshop will also provide access to role models in the area by having senior and junior faculty present, and will reinforce these initial relationships through a long-term mentoring program organized by Anne Condon. Our long-term goal is that these efforts will help widen the pipeline of women doing theory.
While there are groups for "Women in Computer Science" and "Women in Engineering" at both the university and national level, our objective in creating a "Women in Theory" workshop is to enable meaningful technical interactions among the participants. There will be significant overlap in research interests in this group to facilitate a technical program that is interesting to everyone.
List of confirmed technical speakers:
- Dorit Aharonov, Hebrew University
- Shuchi Chawla, University of Wisconsin-Madison
- Julia Chuzhoy, TTI
- Irit Dinur, Weizmann Institute
- Cynthia Dwork, Microsoft Research
- Joan Feigenbaum, Yale
- Shafi Goldwasser, MIT and Weizmann Institute
- Tal Malkin, Columbia University
- Eva Tardos, Cornell
- Lisa Zhang, Bell Labs
We will also have talks by the participating students, as well as sessions devoted to discussing issues that are especially relevant to women scientists, such as finding a job, salary negotiation, communicating with your advisor, and work-life balance.
2016 — 2021
Barak, Boaz |
AF: Large: Collaborative Research: Algebraic Proof Systems, Convexity, and Algorithms
This project tackles some of the central questions in algorithms, optimization, and the theory of machine learning through the lens of the "Sum of Squares" algorithmic framework. In particular, it will allow us to understand which classes of functions can be efficiently minimized, and what computational resources are needed to do so. If successful, it will significantly advance our understanding of all these key areas, produce new practical algorithmic methodologies, and build new connections with other fields, including quantum information theory, statistical physics, extremal graph theory, and more.
This collaborative grant will foster new interactions between intellectual communities that have had relatively little interaction so far, and the PIs will organize workshops, courses, and other events that bring these communities together. The students and postdocs trained will gain a uniquely broad view of the landscape of these areas.
The PIs propose a unified approach to the development and analysis of convex proof systems that include and generalize the "Sum of Squares" (SoS) method. Despite considerable recent progress, understanding the performance of SoS seems to be out of reach for most current techniques. Significant progress in this area requires the synthesis of ideas and techniques from different domains, including theoretical computer science, optimization, algebraic geometry, quantum information theory, and machine learning. The research plans include both theory-building and problem-solving aspects, with the ultimate goal of obtaining a complete understanding of the SoS method and related proof systems, as well as their algorithmic implications.
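As a small worked illustration of the kind of certificate SoS produces (a standard toy example, not taken from the proposal): the polynomial p(x, y) = x^2 - 2xy + 2y^2 is globally nonnegative, and a degree-2 SoS proof of this fact is a positive semidefinite Gram matrix Q with p = z^T Q z for z = (x, y). Factoring Q recovers an explicit sum of squares.

import numpy as np

# Gram matrix of p(x, y) = x^2 - 2xy + 2y^2 in the monomial basis z = (x, y).
Q = np.array([[1.0, -1.0],
              [-1.0, 2.0]])

# Q is positive semidefinite, which certifies p >= 0 everywhere.
eigvals = np.linalg.eigvalsh(Q)
assert np.all(eigvals >= 0), "no PSD Gram matrix, hence no SoS certificate here"

# A factorization Q = L L^T turns the certificate into explicit squares:
# p = ||L^T z||^2; here L^T = [[1, -1], [0, 1]], i.e. p = (x - y)^2 + y^2.
L = np.linalg.cholesky(Q)
print("squares (coefficients of x and y):", L.T)

# Sanity check at random points.
rng = np.random.default_rng(0)
for _ in range(5):
    x, y = rng.normal(size=2)
    z = np.array([x, y])
    assert np.isclose(x**2 - 2*x*y + 2*y**2, z @ Q @ z)

In general, searching over all PSD Gram matrices of a given degree is a semidefinite program; that is the sense in which SoS is an algorithmic framework and not only a proof system.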
Research efforts will be directed along several thrusts: Unique Games and related problems (Small Set Expansion, Max Cut, Sparsest Cut), analysis of average-case problems (e.g., Planted Clique), applications to Machine Learning (sparse PCA, dictionary learning), algorithmic speedups of SoS, and connections to math and physics (e.g., quantum entanglement, p-spin glasses, extremal graph theory and algebraic geometry). While the main focus is on theoretical aspects, this project is also concerned with effective computational methods, and the outcomes may yield novel practical techniques for machine learning and optimization.
Other key features of this proposal include its strong integration with curriculum development and undergraduate research projects, and its role in training the next wave of graduate students and postdocs, equipping them with the tools needed to work across these areas.
2016 — 2019
Barak, Boaz |
TWC: Small: Complexity Assumptions for Cryptographic Schemes
This project investigates the foundational computational underpinnings of secure systems. Cryptographic constructions such as encryption and signatures rely for their security on the conjectured computational difficulty of certain problems. For example, many public-key encryption schemes currently in use would be broken if someone discovered an efficient algorithm for factoring large integers. Unfortunately, the current state of the art is that we are unable to prove that these problems are truly hard, and so we need to rely on unproven conjectures. Moreover, a handful of these conjectures serve as the foundation of many if not most current cryptographic schemes, and so each such conjecture is a single point of failure whose refutation could have severe consequences for a large fraction of our systems. In particular, progress in quantum computing threatens some of these conjectures, and motivates a reevaluation of the right basis for cryptography.
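The following toy sketch (illustrative only: textbook RSA with tiny primes and no padding, which is completely insecure in practice) makes the dependence concrete: anyone who can factor the public modulus immediately recovers the private key.

from math import gcd

p, q = 1009, 1013            # secret primes (real keys use ~1000-bit primes)
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)
e = 65537                    # public exponent
assert gcd(e, phi) == 1
d = pow(e, -1, phi)          # private exponent, e * d = 1 (mod phi); Python 3.8+

m = 42                       # message encoded as an integer < n
c = pow(m, e, n)             # encrypt under the public key (n, e)
assert pow(c, d, n) == m     # decrypt with the private key d

# An attacker who factors n recovers phi, hence d, hence the message.
# Brute-force factoring is feasible here only because n is tiny.
f = next(k for k in range(2, n) if n % k == 0)
d_attack = pow(e, -1, (f - 1) * (n // f - 1))
assert pow(c, d_attack, n) == m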
The goal of this project is to better understand and remedy these issues. Concretely, the project investigates cryptographic schemes, and particularly public-key encryption, that are based on assumptions qualitatively different from current ones, and hence more likely to survive the technological or algorithmic advances that would break current schemes. The project obtains new forms of evidence for computational assumptions, thus deriving some assurance that current schemes are secure. The project also explores assumptions, for new cryptographic primitives such as software obfuscation, that are better founded in computational-complexity considerations.
2022 — 2024
Barak, Boaz; Ba, Demba (co-PI); Janson, Lucas; Pehlevan, Cengiz
A Theory of Learned Representations in Artificial and Natural Neural Networks
Deep learning is successful in practice because, through large amounts of data and computation, useful general representations are learned that enable the performance of complex tasks. Such representation learning is one of the most important and least understood aspects of deep learning: there are currently no quantitative measures for the quality of representations, nor ways to certify that methods achieve a desired quality. This project is concerned with obtaining such measures and using both empirical and theoretical approaches to obtain certified representation-learning algorithms, as well as connecting these to representation learning in human and animal brains. Such an understanding is crucial for obtaining robust general algorithms that can be used for a wide variety of applications and tasks. This project will form new connections between machine learning, signal processing, statistics, and computational neuroscience. It will also result in stronger statistical guarantees for representation learning, placing it on a firmer mathematical foundation. As deep learning is used for increasingly consequential decisions, rigorous guarantees such as the ones pursued here become ever more important. Results of the project will also be used in education efforts at the K-12, college, and graduate levels, including in programs aimed at groups historically under-represented in computing.
This project combines insights from machine learning, statistics, signal processing, and computational neuroscience to obtain a theory of representations in both artificial and natural neural networks. Specifically, the project aims to develop both task-dependent and task-independent measures of representation quality. Task-dependent measures capture the quality of a representation through its performance in downstream tasks, while task-independent measures define quality in terms of intrinsic properties of the representation and the input distribution. The project will obtain relations between the two types of measures and hence characterize conditions under which representation-learning algorithms transfer. The project will also result in rigorous bounds on representation quality under suitable assumptions. Through the study of representation quality, the project aims to explain prevalent features of real-world natural and artificial neural networks. These features include: locality in parameter space of neural responses in the visual and auditory systems; mixed selectivity of neurons that respond to signals of different types (for example, olfactory neurons in mice that respond to both spatial and odorant changes); and cross-modal neurons that respond to the same concept in signals of different types (for example, visual and auditory signals in the brain). The project will also connect representation learning to classical notions in signal processing and learning, such as dictionary learning, as well as to well-known open questions in deep learning, including the prevalence of hierarchical representations, generalization of over-parameterized models, and simplicity bias.
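As a toy illustration of the two kinds of measures (the function names and the synthetic data below are hypothetical, not the project's actual proposed measures): a linear probe scores a representation by downstream accuracy, while the participation ratio of its covariance spectrum, a task-independent quantity used in computational neuroscience, scores its effective dimensionality.

import numpy as np

rng = np.random.default_rng(0)

def participation_ratio(Z):
    # Task-independent: (sum of covariance eigenvalues)^2 / (sum of squares),
    # an effective dimensionality of the representations Z (rows = samples).
    lam = np.linalg.eigvalsh(np.cov(Z.T))
    return lam.sum() ** 2 / (lam ** 2).sum()

def linear_probe_accuracy(Z, y):
    # Task-dependent: accuracy of a least-squares linear classifier
    # trained on representations Z to predict binary labels y.
    X = np.hstack([Z, np.ones((len(Z), 1))])   # append a bias column
    w, *_ = np.linalg.lstsq(X, 2.0 * y - 1.0, rcond=None)
    return float(np.mean((X @ w > 0) == y))

# Synthetic "representations": 500 samples in 32 dimensions; the downstream
# label depends only on coordinate 0.
Z = rng.normal(size=(500, 32))
y = (Z[:, 0] > 0).astype(int)

print("participation ratio:", participation_ratio(Z))         # near the ambient dimension, 32
print("linear probe accuracy:", linear_probe_accuracy(Z, y))  # well above chance

Neither score implies the other in general: the isotropic representation above happens to have maximal effective dimensionality and to be linearly decodable for this task, and characterizing when such task-independent and task-dependent measures agree is exactly the kind of relation the project seeks.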
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.