2007 — 2010
Kim, Young-Han |
Collaborative Research: The Role of Feedback in Two-Way Communication Networks @ University of California-San Diego
Many common communication situations take place over inherently two-way channels, such as telephone systems, digital subscriber lines (DSL), cellular networks, and the Internet. In fact, even 'point-to-point' systems, where the end goal is to transfer information in one direction, often give rise to two-way communication scenarios due to the presence of feedback. In such systems, one can receive feedback from the other end of the channel, which can be used to improve the quality of communication. Although feedback is present in many communication systems and is already used in certain primitive forms, as in channel estimation and automatic repeat request, the theory behind its use is far from complete. This research investigates the role of feedback in two-way communication networks and provides architecture-level guidance for designing robust and efficient communication systems. While positive results lead to novel approaches to communication systems design, negative results prevent over-engineering and allow more confidence in simple and modular implementations. At the same time, feedback is a pivotal concept in biological and artificial control systems, learning machines, and communication networks. A deeper understanding of the role of feedback in one area (communication) will lead to a better understanding of the role of feedback in a broader multidisciplinary context.
Concretely, this research focuses on and develops new approaches for tackling problems arising in the following areas: 1) feedback capacity of single-user channels with memory (new coding theorems based on directed information, causal conditioning, and Shannon strategy, as well as the development of concrete schemes for achieving the fundamental limits), 2) multiple-user channels with feedback (emphasis on multiple access channels and broadcast channels: characterization of fundamental limits as well as the construction of practical coding schemes), 3) capacity region of the two-way channel such as the Blackwell-Shannon binary multiplying channel (dynamic programming and infinite-dimensional convex optimization), 4) robust feedback coding techniques under channel uncertainty (universal decoding schemes), and 5) reliable communication with noisy feedback (new perspectives on cross-layer design of channel codes and network protocols). Massey's directed information plays the role of Shannon's mutual information in many feedback communication problems. Thus directed information is also investigated as a fundamental notion in general causal inference problems. Examples include gambling in horse-race markets with causal side information and its dual in source coding.
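For reference, Massey's directed information from the channel input sequence $X^n$ to the output sequence $Y^n$ is
$$ I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}), $$
and, under suitable regularity conditions on the channel (for example, stationarity and information stability), the feedback capacity of a channel with memory admits the multi-letter characterization
$$ C_{\mathrm{FB}} = \lim_{n \to \infty} \max_{p(x^n \| y^{n-1})} \frac{1}{n} I(X^n \to Y^n), $$
where $p(x^n \| y^{n-1}) = \prod_{i=1}^{n} p(x_i \mid x^{i-1}, y^{i-1})$ denotes a causally conditioned input distribution. These definitions are standard background, recorded here only to fix notation for the items above.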
2008 — 2013
Kim, Young-Han |
CAREER: Network Information Theory: Coding for Communication, Control, and Computing @ University of California-San Diego
Despite significant advances in information theory and its application to point-to-point communications, a general theory governing optimal information flow over networks does not yet exist. Except for a few simple network models, the capacity region of a general memoryless network remains unknown, due to the complex tradeoff between competition and cooperation among the many nodes in the network. Modern applications such as sensor networks, peer-to-peer systems, and distributed storage further raise a new set of challenging problems spanning control, estimation, compression, computation, communication, and networking.
This research program provides a common set of conceptual, mathematical, and algorithmic tools for the emerging convergence of computation, control, and communication over networks, with the ultimate goal of developing a unified framework for characterizing the fundamental performance limits of such systems. Towards this goal, we focus on three concrete problems representing the intersection of computation, control, and communication -- 1) the capacity of the relay channel (how to summarize the relay's noisy observation of the codeword: a list intersection technique via coding for distributed computing), 2) networked control (how to summarize the observation to stabilize a linear dynamical system: a variant of rate distortion coding for control), and 3) collision avoidance for multiple access (how to build cooperation from a common source of randomness: distributed generation of correlated random variables). Novel coding schemes will be proposed, while classical coding techniques will be reinterpreted and extended to broader applications. An algebraic framework for capacity regions is also developed to check the optimality of such coding schemes. The research program is complemented by educational activities that include curriculum development for a graduate course on network information theory and the writing of an accompanying textbook.
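As background for item 2), the classical data-rate theorem (a known result quoted here for context, not a contribution of this project) quantifies the minimum rate needed to stabilize an unstable linear system over a digital link. Under standard assumptions on the plant and disturbances, a discrete-time linear system with open-loop eigenvalues $\lambda_1, \dots, \lambda_d$ can be stabilized over a noiseless rate-$R$ feedback link if and only if
$$ R > \sum_{i:\,|\lambda_i| > 1} \log_2 |\lambda_i|; $$
in the scalar case $x_{t+1} = a x_t + u_t + w_t$ with $|a| > 1$, the threshold reduces to $R > \log_2 |a|$. The rate-distortion-style coding for control pursued in this program can be viewed as extending this picture to noisy channels and richer network settings.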
2011 — 2014
Kim, Young-Han |
CIF: Small: Collaborative Research: A New Approach to Joint Source-Channel Coding @ University of California-San Diego
Shannon's source-channel separation theorem states that separating the source coder (compressor/decompressor) and the channel coder (error correction coder/decoder) via the universal digital interface of 'bits' is optimal for point-to-point communication of a single data source. While this modular design principle has inspired the basic architecture of most of today's communication systems, it is outperformed by more complex 'joint' source-channel coders for communication of multiple sources over networks---an increasingly important task given today's explosive demand for, and supply of, distributed information sources. At the same time, many emerging applications involve computing a summary of data from multiple nodes, making a coordinated decision, and performing a joint action among these nodes, rather than merely communicating sources. This research establishes a hybrid source-channel coding architecture for these applications that is as simple as Shannon's separation architecture, yet achieves much improved performance. The new architecture will eventually lead to the discovery of practical algorithms for distributed computation, sensing, decision making, and coordination over networks.
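For concreteness, the separation theorem invoked above can be stated in its simplest memoryless form (standard background, recorded here only to anchor the discussion): a discrete memoryless source with entropy rate $H(S)$ can be communicated reliably over a discrete memoryless channel of capacity $C$, at one source symbol per channel use, if
$$ H(S) < C, $$
and cannot be if $H(S) > C$. Compressing the source to roughly $H(S)$ bits per symbol and protecting those bits with a capacity-achieving channel code is therefore optimal in this point-to-point setting; it is exactly this modular optimality that fails to carry over, in general, to multiple sources and multiuser networks, which motivates the hybrid architecture studied here.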
Specifically, this research focuses on and develops new approaches for tackling the following problems: 1) hybrid coding for communicating correlated sources over multiuser channels (a unified approach for joint source-channel coding), 2) hybrid coding for network communication (a new relaying scheme based on joint source-channel coding), 3) implementation issues for hybrid coding (design of a practical code that is a good channel code and a good source code simultaneously), and 4) a mathematical framework for performance analysis and code design (information theoretic tools when the codebook and the message are entangled).
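Schematically, and with illustrative notation that is not taken from the project itself, a hybrid source-channel code of the kind described in the literature operates as follows: the encoder quantizes the source sequence $S^n$ to a codeword $U^n(M)$ from a single joint codebook and transmits a symbol-by-symbol function of codeword and source, while the decoder recovers a codeword index $\hat{M}$ from the channel output $Y^n$ and forms a symbol-by-symbol reconstruction,
$$ X_i = x\bigl(U_i(M), S_i\bigr), \qquad \hat{S}_i = \hat{s}\bigl(U_i(\hat{M}), Y_i\bigr), \qquad i = 1, \dots, n. $$
The same codeword thus serves simultaneously as a source code (a description of $S^n$) and as a channel code (a carrier of information across the channel), which is the sense in which a single code must be good at both tasks, as in item 3) above.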
2011 — 2014
Rao, Bhaskar; Kim, Young-Han
EAGER: A Multi-User Communication and Information Theoretic Approach to the Sparse Signal Recovery Problem @ University of California-San Diego
This research project examines the theoretical, algorithmic, and computational issues that arise in compressed sensing (CS) and other signal processing problems in which the solution vector has many zeros. In addition to the exciting compressed sensing area, this research will benefit numerous signal processing applications where a sparsity constraint on the solution vector naturally arises. Brain imaging techniques such as magnetoencephalography (MEG) and electroencephalography (EEG) are currently important examples. Sparse communication channels with large delay spread, high-resolution spectral analysis, and direction-of-arrival estimation are other important examples. An effective solution to this problem will have significant impact by providing new and valuable tools to the practicing signal processing engineer. In addition, the tools will be of interest to researchers in cognitive science, neuroscience, and machine learning, where sparsity issues naturally arise, such as sparse coding of signals in the brain or learning from data that is often assumed to lie on a low-dimensional manifold.
This project provides a comprehensive and tighter integration of compressed sensing and multi-user information theory. This makes it possible to utilize the rich body of results in network information theory that has been successfully applied to the implementation of communication systems. The theoretical tools necessary to enable this integration are being developed by the investigators. This research enables significant advances in both theory and practice in the CS field. The information-theoretic results are leveraged to provide insights into performance limits and guidance on practical CS-based system design. The implementation experience gained from communication systems will be translated into practical algorithm development and efficient CS-based system design.
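As a toy illustration of the sparse recovery problem underlying this project, the following minimal sketch runs orthogonal matching pursuit (OMP), a standard greedy algorithm from the CS literature; it is not the algorithm developed in this project, and the dimensions, sparsity level, and Gaussian sensing matrix are arbitrary illustrative choices.

```python
# A minimal sketch of greedy sparse recovery via orthogonal matching pursuit (OMP).
# Generic textbook algorithm, not the method developed in this project; the problem
# sizes and the Gaussian sensing matrix below are illustrative choices only.
import numpy as np

def omp(A, y, k):
    """Recover a k-sparse x from noiseless measurements y = A @ x."""
    residual = y.copy()
    support = []
    x_hat = np.zeros(A.shape[1])
    coeffs = np.zeros(0)
    for _ in range(k):
        # Greedy step: pick the column most correlated with the current residual.
        correlations = np.abs(A.T @ residual)
        if support:
            correlations[support] = 0.0  # do not reselect columns already chosen
        support.append(int(np.argmax(correlations)))
        # Projection step: least-squares fit on the current support, update residual.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x_hat[support] = coeffs
    return x_hat

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n, k = 40, 100, 5          # 40 measurements, ambient dimension 100, 5 nonzeros
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x = np.zeros(n)
    x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    y = A @ x
    x_hat = omp(A, y, k)
    print("relative recovery error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```

In this noiseless toy setting the greedy selection typically recovers the support exactly; the information-theoretic viewpoint pursued in this project asks, more broadly, how many measurements are fundamentally required for such recovery and how practical algorithms should be designed to approach that limit.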
2013 — 2016
Kim, Young-Han |
CIF: Small: New Directions in Network Information Theory @ University of California-San Diego
Most coding schemes developed in network information theory are combinations of a handful of basic components using Shannon's random coding technique. With the goal of advancing our understanding of these random coding schemes and making them applicable in practice, this research explores three important problems in network information theory. First, this project investigates the simultaneous decoding rule that is easy to analyze, yet powerful enough to achieve the maximum achievable rates of random coding over interference channels. Second, this project applies the insights thus gained on random coding and simultaneous decoding to the index coding problem, in which multiple messages are communicated through a single, noise-free link to multiple receivers with different pieces of side information. Third, this project develops a concatenated coding architecture based on random coding, product codes, and iterative decoding that can provide a systematic method for translating random coding schemes to practical, implementable coding techniques for real-world networks.
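To make the index coding setting concrete, here is the smallest standard example (a textbook illustration, not a result of this project): two receivers each want one message and already know the other message as side information, so a single XOR broadcast serves both, halving the number of transmissions compared with sending the two messages separately.

```python
# Toy index coding instance: receiver 1 wants x1 and knows x2;
# receiver 2 wants x2 and knows x1. One XOR broadcast suffices.
def broadcast(x1: int, x2: int) -> int:
    """The server sends a single coded symbol: the bitwise XOR of the two messages."""
    return x1 ^ x2

def receiver1(coded: int, known_x2: int) -> int:
    """Receiver 1 cancels its side information x2 to recover x1."""
    return coded ^ known_x2

def receiver2(coded: int, known_x1: int) -> int:
    """Receiver 2 cancels its side information x1 to recover x2."""
    return coded ^ known_x1

if __name__ == "__main__":
    x1, x2 = 0b1011, 0b0110
    coded = broadcast(x1, x2)
    assert receiver1(coded, x2) == x1
    assert receiver2(coded, x1) == x2
    print("both receivers decode from a single broadcast symbol")
```

The index coding problem studied in this project replaces this two-receiver instance with arbitrary demand and side-information patterns, for which the optimal broadcast rate is unknown in general.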
Playing an ever-increasing role in our networked society, network information theory studies the fundamental limits on information flow over networks and the optimal coding techniques, protocols, and architectures that achieve these limits. This research investigates canonical problems in network information theory that involve interference and broadcast, offering fresh insights and new mathematical tools for optimal information flow in several important applications such as network coding, wireless communication, peer-to-peer networking, and content broadcasting. The concatenated coding architecture developed in this research has the potential to provide a new framework for transforming theoretical concepts in network information theory into practical algorithms for applications.