1995 — 1998
Ortega, Antonio |
Adaptive Compression Techniques For Digital Video Communications @ University of Southern California
The area of digital image and video compression and communications has seen growing activity in recent years, including standardization efforts such as JPEG and MPEG, and an increasing number of current or proposed applications ranging from cable and satellite TV to videoconferencing and future multimedia services. This trend will continue to create an incentive for new compression techniques that are more efficient as well as better adapted to particular transmission environments. The main thrust of this research is to provide adaptive compression techniques for video communications. Adaptive techniques are those in which encoders match their encoding procedure to the local characteristics of the source or the channel. Rate-distortion optimal solutions and fast heuristic approximations are being obtained for some of the problems of interest in video communications, such as bit allocation and rate control. A novel scalar adaptive quantizer, close in spirit to arithmetic coding, is being investigated. This new technique, combined with popular quantization techniques such as trellis-coded quantization, will be used in real encoding situations. Convergence and asymptotic properties of this scheme are also being studied. For variable channel situations where transmission resources are shared, as in ATM networks, rate constraints for each video connection are being designed to enable good individual video quality as well as efficient overall network utilization. This research is being carried out in the Digital Video Communications research laboratory currently being set up within the Signal and Image Processing Institute. Our main education goal is to provide teaching in the area of signal, image and video processing. We will introduce a digital video communications course, designed to cover material specific to video compression, including the relevant communications aspects, and to provide hands-on experience for the students.
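As a concrete illustration of the rate-distortion optimized bit allocation mentioned above, here is a minimal sketch of the classic Lagrangian approach: each coding unit picks the operating point minimizing D + λR, with λ tuned by bisection to meet a rate budget. The operating points and budget below are invented for illustration; this is not the project's actual algorithm.

```python
# Lagrangian rate-distortion bit allocation sketch (illustrative numbers only).

def allocate(points_per_unit, budget, lo=0.0, hi=1000.0, iters=50):
    """points_per_unit: list of (rate, distortion) option lists, one per unit."""
    def pick(lmbda):
        choices = [min(pts, key=lambda p: p[1] + lmbda * p[0])
                   for pts in points_per_unit]
        return choices, sum(r for r, _ in choices)

    for _ in range(iters):                  # bisection on the Lagrange multiplier
        mid = (lo + hi) / 2.0
        _, rate = pick(mid)
        if rate > budget:
            lo = mid                        # over budget: penalize rate more
        else:
            hi = mid
    return pick(hi)[0]

units = [[(8, 10.0), (4, 25.0), (2, 60.0)],   # (rate, distortion) per option
         [(8, 5.0), (4, 12.0), (2, 30.0)]]
print(allocate(units, budget=8))
```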
1995 — 1996
Sawchuk, Alexander (co-PI); Nikias, Chrysostomos (co-PI); Kuo, Chung-Chieh Jay (co-PI); Jenkins, B. Keith; Leahy, Richard; Ortega, Antonio
CISE Research Instrumentation: A Computer Laboratory For Multidimensional Signal and Image Processing @ University of Southern California
This award is to purchase equipment dedicated to research in computer and information science and engineering. Specifically, the equipment will be used for research in multidimensional signal and image processing, including in particular: 1) fusion of multimodal neuroimaging data; 2) adaptive quantization of image and video; 3) automatic target recognition via deformable template matching; 4) design of high resolution diffractive optics for photonic interconnections and computing; and 5) advanced adaptive multidimensional and array signal processing. Common to all of these projects is a need for access to fast numerical computation and high resolution visualization and display capabilities. The goal of this project is to set up a state-of-the-art facility for processing, visualization and display of multidimensional data. Toward this end, a computer for high performance numerical computation, and a RAM-based workstation for display of high resolution video image sequences with high performance graphics capability, will be purchased.
1997 — 2000
Sawchuk, Alexander; McLeod, Dennis (co-PI); Kuo, Chung-Chieh Jay (co-PI); Levi, Anthony; Ortega, Antonio; Neumann, Ulrich (co-PI); Shahabi, Cyrus (co-PI)
High Performance Processors and Networks For Video Compression, Distributed Visualization, Database Systems and Collaborative Telepresence @ University of Southern California
USC has received a Major Research Instrumentation award for the acquisition of processing hardware, special-purpose high-speed, high-resolution digital video storage and display hardware, and miscellaneous data communications hardware for an Integrated Media Research Network (IMRN) to support research and training programs in high-performance multimedia, graphics, visualization, and database systems. Research projects to be supported include the generation, compression and transmission of real-time video over shared networks; processing of remote high resolution 3-D visualization and computation-intensive graphics; robust distribution and networking of interactive multimedia data within a heterogeneous distributed computing environment; distributed database management techniques for video and audio servers; and utilization of multiprocessor computers for collaborative telepresence over long physical distances. Besides making the enhanced facilities available to students pursuing research in high-performance graphics, visualization and database systems, USC plans to connect the IMRN to the School of Engineering's Instructional Television network, which supports two-way live interactive broadcasts of regular credit courses in engineering, computer science and mathematics. It is envisaged that this connection will allow students to work on research projects anywhere on the main campus, the medical school campus and USC's Information Sciences Institute, and will allow classroom demonstrations across these locations as well as the local Los Angeles area. Research results on database management for video data and distance learning will contribute directly to these distance learning activities.
1998 — 1999
Ortega, Antonio |
1998 Workshop On Multimedia Signal Processing @ University of Southern California
The 2nd IEEE Signal Processing Society Multimedia Signal Processing Workshop will take place in Redondo Beach, CA, in December 1998. The first workshop was held in Princeton, NJ, in June 1997. The objective of the workshop is to bring together experts in areas that are increasingly required to interact with each other as emerging integrated media systems become available. As an example, the list of topics includes areas such as databases, communications, processing and interfaces. The workshop will study how issues arising in these fields affect signal processing for multimedia systems. The funds requested in this proposal are intended to facilitate the participation of students, in the form of reduced registration fees and/or travel support. Reduced registration will be made available to as many students as possible.
1998 — 2001
Chugg, Keith (co-PI); Beerel, Peter; Ortega, Antonio
Challenges in CISE: Algorithm and Implementation Co-Design For Optimizing Average Complexity/Performance: A Case-Study in High-Performance Low-Power Mobile Multimedia Comm. Design @ University of Southern California
We propose a new paradigm for system design in which the algorithm and circuit implementations are jointly designed to achieve high performance while minimizing average power consumption. We propose to drive the research using a case study of an adaptive equalizer/decoder where both algorithm selection and asynchronous hardware design will be tackled using a Minimum Average Complexity (MAC) approach. In particular, we will develop a detailed asynchronous design of representative components of a MAC-optimized mobile radio receiver, variable complexity algorithms for joint adaptive equalization and decoding, a characterization of the average computational load of these algorithms for a typical mobile user, general procedures for MAC algorithm optimization, an architecture for the new algorithms, and a quantitative estimate of the power savings relative to a standard design.
1998 — 2002
Scholtz, Robert; Prata, Aluizio (co-PI); Ortega, Antonio; Chugg, Keith (co-PI)
Fundamental Experimental and Analytical Studies in Ultra Wideband Radio With Application to Wireless Multimedia Communication @ University of Southern California
In this proposal a complete study of fundamental issues, both experimental and analytical, in ultra-wideband (UWB) radio is presented. The proposed work involves a strong interaction between experimental and analytical components, so that experimentally derived models can be used to design optimized algorithms, which can in turn be tested. For this purpose, radio, video, and test equipment is requested to complement existing facilities in the ultra-wideband radio lab (UltRa Lab). This equipment will be used for closely related research projects that support the development of fully mobile indoor video communication systems. The motivating wireless technology is spread-spectrum impulse radio which alleviates multipath problems, but must coexist in the same spectrum with signals in the frequency range from roughly 500 MHz to 2 GHz. The experiments are aimed at (1) quantifying the ability of different radio systems to coexist in the same band without mutually interfering, (2) quantifying the distortion properties of ultra-wideband antennas and propagation environments, and (3) measuring the effects of indoor radio performance anomalies on video transmissions. The equipment requested includes anechoic chamber components, a flexible software video compression system, high quality LANs that can be utilized in both the video and radio interference experiments, and a bit error-rate tester and other components necessary for interference and coexistence tests. These experiments will result in a database of channel measurements, which characterize propagation, antenna, and interference effects. The database will be used to extract statistical models which, in turn, will be used to develop algorithms for receiver signal processing and video rate control. This process of experimentation, model extraction, and algorithm development will be iterated throughout the duration of the requested support. Each of these experiments is briefly described below.
(1) Interference Rejection and Coexistence Experiments and Analysis (Scholtz and Chugg). This experimental work will explore the ability of impulse radio with power spectrum thinly spread roughly from 500 MHz to 2 GHz to coexist over short range channels with the myriad of other electronic systems in that broad spectrum. This effort will test coexistence with LANs, cordless phones, microwave ovens, wireless TV links, etc. The processing gain of the spread-spectrum techniques employed in the impulse radio will be checked and the dynamic range of current impulse radios will be tested. These experiments will determine the direction of research efforts on impulse radio implementation.
(2) Characterization of ultra-wideband antennas and propagation environment (Prata, Chugg, Scholtz). The experimental work to be carried out in the controlled environment of an anechoic chamber supports the characterization of ultra-wideband antennas for impulse radio, and the characterization of narrower-band antennas to evaluate their ability to receive/reject impulses. This environment will be used to evaluate new UWB antenna designs that provide more robust spatial coverage, better pulse shaping characteristics, and polarization diversity, and to study the propagation of UWB signals through different kinds of materials. The equipment purchased under this grant is destined for an anechoic chamber at least 15' (w) x 15' (h) x 30' (l) to be constructed in the near future (architectural work on the building containing this chamber is beginning now).
(3) Experiments with Software-Compressed Asynchronous Video over UWB Wireless Links (Ortega, Chugg). Two stages are planned in the experimental work with robust asynchronous video transmission. In the first stage, laptops and wireless LAN equipment will be used to test techniques currently under study for rate control over a time-varying channel. All video functions (compression, rate control, error correction, retransmission, etc.) will be implemented in software. In the second stage, the algorithms will be adapted to operate over an experimental impulse radio link. At that point, the results from the initial stage will be incorporated, and changes needed for the particular transmission conditions of impulse radio will be implemented.
Each of the student researchers supported under the requested funding will be involved in all three of these aspects. This research will be further supported by the UltRa Lab infrastructure, which has been developed recently under the support of NSF and through the Integrated Media System Center ERC and our industrial sponsors. The infrastructure includes laboratory space, equipment, a team of undergraduate merit scholars to assist in laboratory procedures, and access to circuit and semiconductor process expertise from both USC collaborators and industrial partners. The research described in this proposal is a portion of the fundamental investigations required to develop and demonstrate UWB indoor radio multimedia systems (complementary investigations into integrated circuit design are also necessary, but are not part of the research proposed herein). The original equipment budget for this proposal has been revised to reflect equipment acquisitions from other sources and no other proposals for this work are currently pending.
1998 — 2001
Ortega, Antonio |
Tools For Image and Video Transmission Over Heterogeneous Networks @ University of Southern California
Most current digital video systems merely replace existing modes of analog video delivery with their more efficient digital counterparts. However, as packet-based networks, such as the Internet, start to be an essential component of the communications infrastructure and general purpose computers become the predominant communication terminal, there is a need to provide a new set of algorithms for video compression and processing, optimized for an end-to-end software processing environment. The challenge is then to find efficient ways of embedding video data on a fundamentally asynchronous transmission/processing infrastructure. To address this challenge, this project concentrates on two main areas: (i) computation/bandwidth scalable coding and (ii) robust asynchronous video transmission algorithms. The Internet is a heterogeneous network, both in terms of its components (hosts, routers) and in terms of available bandwidth and processing speed. Thus bandwidth/computation scalable algorithms are desirable to enable less complex encoding/decoding at the expense of a reduction in video quality. This project is developing a design methodology for variable complexity, input-dependent coding/decoding algorithms where the goal is optimizing average (instead of "worst-case") complexity performance. Robust asynchronous video transmission algorithms are also being developed, motivated by the limited transmission reliability offered by the Internet. Video is sensitive not only to losses of information (e.g., packets being dropped due to congestion) but also to excessive delay. This work concentrates on new compression techniques for asynchronous video transmission where each video unit (a part of a frame) has a "lifetime" during which it is useful, and which can span more than a single frame time interval. This area includes novel work in both rate control and image domain error correction/detection techniques.
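To make the "lifetime" notion concrete, here is a toy sketch of a sender that transmits video units earliest-deadline-first and drops units whose lifetime has expired rather than sending them late. All names and the timing model are invented for illustration; this is not the project's actual transmission algorithm.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class VideoUnit:
    deadline: float                        # time until which the unit is useful
    name: str = field(compare=False)

def schedule(units, send_time_per_unit, now=0.0):
    """Send earliest-deadline-first; drop units whose lifetime has expired."""
    heap = list(units)
    heapq.heapify(heap)
    sent, dropped = [], []
    while heap:
        unit = heapq.heappop(heap)
        if now >= unit.deadline:
            dropped.append(unit.name)      # lifetime over: not worth sending
            continue
        now += send_time_per_unit
        sent.append(unit.name)
    return sent, dropped

units = [VideoUnit(0.10, "f1-slice0"), VideoUnit(0.05, "f1-slice1"),
         VideoUnit(0.30, "f2-slice0")]
print(schedule(units, send_time_per_unit=0.08))
```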
2004 — 2009
Chugg, Keith (co-PI); Gupta, Sandeep; Ortega, Antonio; Breuer, Melvin (co-PI)
ITR-(ASE+NHS)-(int): A Digital System Paradigm For Yield Enhancement and Graceful Degradation Via Error Acceptance @ University of Southern California
Despite extensive research into improving fabrication processes, continued VLSI scaling will be inhibited by high variations in process parameters, higher defect densities, and higher susceptibility to external noise. We propose two notions, namely error-tolerance and acceptable operation, to facilitate imprecise computation: these notions systematically capture the fact that an increasingly large class of digital systems can be useful even if they do not perfectly conform to a rigid design specification. We propose to develop a systematic methodology for design and test of this class of digital systems that will exploit the notion of error tolerance, to enable dramatic improvements in scale, speed, and cost. In the proposed methodology, system specification will include a description of the types of errors at system outputs, and the thresholds on their severities, that are tolerable. The design methodology will exploit this information to obtain designs that provide higher performance and/or lower costs.
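As a hedged illustration of what an acceptance test under this error-tolerance methodology might look like, the sketch below checks a circuit's outputs against severity and error-rate thresholds; all numbers are invented, and the specification language developed in the project would be far richer.

```python
def acceptable(outputs, references, max_severity=2, max_error_rate=0.05):
    """Accept a circuit whose output errors stay within invented
    severity and frequency thresholds (error-tolerant testing sketch)."""
    errors = [abs(o - r) for o, r in zip(outputs, references)]
    if max(errors) > max_severity:                 # any single error too severe?
        return False
    rate = sum(e > 0 for e in errors) / len(errors)
    return rate <= max_error_rate                  # are errors rare enough?

print(acceptable([10, 12, 9, 7], [10, 11, 9, 7], max_error_rate=0.5))  # True
print(acceptable([10, 20, 9, 7], [10, 11, 9, 7]))                      # False
```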
Over the next 15 years, the proposed approach will provide dramatic improvements in scale, speed, and cost for a wide class of digital systems, including many integral to NHS. This will enable development and wider deployment of devices with advanced capabilities in areas such as speech processing, real-time translation of spoken natural languages, and biometrics.
2006 — 2010
Mitra, Urbashi (co-PI); Ortega, Antonio; Heidemann, John; Papadopoulos, Christos
NeTS-NBD: Maltraffic Analysis and Detection in Challenging and Aggregate Traffic (MADCAT) @ University of Southern California
Many compromised computers today generate maltraffic, such as denial-of-service (DoS) attacks, spyware reporting home, unauthorized applications, spam, and worms. Current defenses are becoming increasingly brittle. There are several reasons for this challenge: encryption limits packet content inspection, aggregation at the network edge limits the use of filtering and blacklisting due to potential collateral damage, increased traffic volumes allow maltraffic to hide, and applications are often cloaked through layered protocols (SOAP over HTTP or varying port allocation) or active concealment.
This proposal applies signal processing and detection theory to network traffic to detect maltraffic in these challenging scenarios. We will use features such as packet timing and frequency, careful design of the measurement and detection systems, and study of inherent behaviors in protocols to address these challenges.
Broader Impact: The results of this work will include (a) the development of a systematic methodology for applying signal processing methods to network traffic; (b) the analysis of new signal representation and detection methods specific to maltraffic; and (c) the identification, understanding, and modeling of key identifying features and inherent behaviors of maltraffic and how they are shaped by the network. Our new approaches will yield a deeper understanding of network traffic, and will be tested with traces of real network traffic, resulting in new tools to combat these problems.
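As one hedged example of the kind of timing-based feature the proposal mentions, the sketch below bins packet arrival times into a count series and measures how strongly the spectrum concentrates at a single frequency, which tends to separate periodic "phone home" beaconing from irregular user traffic. All parameters and the synthetic traces are invented.

```python
import numpy as np

def periodicity_score(arrival_ms, bin_ms=10, total_ms=10_000):
    """Bin packet arrival times (ms) into a count series and measure how
    much spectral energy concentrates at a single nonzero frequency."""
    bins = np.zeros(total_ms // bin_ms)
    idx = np.clip((np.asarray(arrival_ms) // bin_ms).astype(int), 0, len(bins) - 1)
    np.add.at(bins, idx, 1)
    spec = np.abs(np.fft.rfft(bins - bins.mean())) ** 2
    return spec[1:].max() / (spec[1:].sum() + 1e-12)

rng = np.random.default_rng(0)
beacon = np.arange(0, 10_000, 500) + rng.normal(0, 2, 20)  # periodic beaconing
web = rng.uniform(0, 10_000, 200)                          # irregular browsing
print(periodicity_score(beacon), periodicity_score(web))   # high vs. low score
```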
2010 — 2014
Ortega, Antonio |
CIF: Small: Wavelets On Graphs - Theory and Applications @ University of Southern California
A key recent trend has been toward dramatic increases in the amounts of data that can be gathered for analysis. Examples include online social networks, search logs, DNA analysis, and surveillance, among many others. A major challenge is to extract useful information from these large datasets; without better tools, much of this data risks being underutilized. This research aims at developing innovative data representation tools to enable significantly faster and more accurate analysis of these emerging datasets.
This research is motivated by two observations: i) data points in these emerging datasets can often be seen as part of a large graph, and ii) tools for the analysis of data on graphs tend to be global in nature, making it difficult to identify trends that manifest themselves in relatively small regions of the graph. Inspired by wavelet techniques developed over the past 20 years, this work studies a new class of wavelets that are defined on graphs. This project studies the underlying theory for these wavelets on graphs, including the design of localized, invertible and critically sampled transforms. The team is also addressing two concrete applications to illustrate the potential benefits of these methods. In one application, these new tools are applied to genomic datasets (where graphs correspond to genetic pathways) and to the analysis of data in social networks. The second class of applications consists of applying these new transforms to the development of new tools for image and video processing.
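For intuition, here is a toy illustration of spectral filtering on a graph via the Laplacian eigenbasis, the starting point for many graph wavelet constructions; the localized, invertible, critically sampled transforms studied in this project are considerably more involved, and the graph and kernel below are invented.

```python
import numpy as np

# Toy path graph on 6 nodes: adjacency, then combinatorial Laplacian.
A = np.zeros((6, 6))
for i in range(5):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Graph Fourier basis: Laplacian eigenvectors; eigenvalues act as frequencies.
lam, U = np.linalg.eigh(L)

def spectral_filter(x, g):
    """Filter graph signal x with spectral kernel g(lambda)."""
    return U @ (g(lam) * (U.T @ x))

x = np.array([0.0, 0.0, 1.0, 1.0, 0.0, 0.0])         # piecewise-smooth signal
lowpass = spectral_filter(x, lambda l: np.exp(-2.0 * l))
highpass = x - lowpass                                # crude wavelet-like detail
print(np.round(lowpass, 3), np.round(highpass, 3))
```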
2014 — 2017
Mitra, Urbashi; Ortega, Antonio
NeTS: Medium: A Sparse Decomposition Framework For Complex System Design and Analysis @ University of Southern California
Large scale, pervasive communication systems are an important part of everyday life and, with the emergence of Internet of Everything technology, are becoming an essential part of our infrastructure. Future communication systems necessitate substantially new design paradigms because they are large scale and involve a complex interaction of communications, "natural" (environmental) networks, and control. The seamless management of a system of this complexity, interconnection, and scale is daunting. A fundamentally new approach is needed to tackle the issues of scale and complexity. The investigators are studying novel mathematical models for network design and optimization for systems of much larger size than those that are tractable with current methods. They plan to demonstrate their utility on applications such as cognitive radio, wireless body area sensing networks and potentially models for bacterial populations. The new methods have the potential to impact the design and control of very general, large-scale networks such as biological networks, social networks and the SmartGrid.
This research develops a novel theoretical framework to address the challenges of large scale system design by analyzing, tracking, and controlling Markov processes over graphs associated with complex systems. The correlation induced in these large Markov chains is exploited via sparse approximation theory employing graph wavelets for representation. Typical complex systems induce an underlying sparsity that enables dimensionality reduction via compressed sensing-like schemes. This research develops new sparse techniques for formal modeling, analysis and optimization of large-scale systems that evolve temporally, by designing novel graph wavelets for directed graphs, in combination with new sparse approximation algorithms and control methods tailored to the complex systems for estimation, communication and control.
2015 — 2018
Ortega, Antonio |
CIF: Small: Graph Signal Sampling: Theory and Applications @ University of Southern California
Modern society is increasingly reliant on large scale, distributed, interconnected, and complex systems, such as the Internet, smart grids, intelligent buildings or highways. Furthermore, much of the information now being generated is also interconnected in complex ways (e.g., the Web). While these systems and datasets can be monitored by recording relevant data, the volumes of such data make it difficult to address critical tasks, such as anomaly detection, in a timely manner. These datasets often exhibit a natural graph structure, with graph nodes representing measurements or information (e.g., the temperature of a sensor or data from a web page), and graph edges representing the relationships between nodes (e.g., distance between sensors or links between webpages). This project develops novel methods for sampling of very large scale graph datasets, with the goal of making it possible to measure only a small fraction of carefully selected nodes, while preserving the ability to analyze the whole system.
Sampling theory is a major element of signal processing theory and applications, but has only recently been considered for graph signals. While recent progress has been made under the assumption that the graph is fully known, these techniques are prohibitively expensive for practical datasets of interest. This project addresses fundamental questions for the challenging problem of sampling when only partial graph information is available (e.g., decisions based on smaller subsets of connected nodes). For example, given local graph connectivity information and assumptions about the graph signals of interest, such as their frequency localization, the goal is to identify the best set of vertices to sample locally in order to obtain a reliable estimate of the corresponding global graph signals.
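As a hedged baseline, the sketch below greedily selects sample vertices so that the sampled rows of the first k Laplacian eigenvectors remain well conditioned, then reconstructs a bandlimited signal from those samples. Note that this baseline assumes full knowledge of the graph, precisely the assumption the project seeks to relax; the graph and signal are invented.

```python
import numpy as np

def greedy_sampling_set(U_k, m):
    """Greedily pick m vertices so the sampled rows of the first-k
    Laplacian eigenvectors U_k stay well conditioned."""
    chosen = []
    for _ in range(m):
        best, best_score = None, -1.0
        for v in range(U_k.shape[0]):
            if v in chosen:
                continue
            rows = U_k[chosen + [v], :]
            score = np.linalg.svd(rows, compute_uv=False).min()
            if score > best_score:
                best, best_score = v, score
        chosen.append(best)
    return chosen

# Toy path graph on 6 nodes; sample 3 vertices for a 3-bandlimited signal.
A = np.zeros((6, 6))
for i in range(5):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A
_, U = np.linalg.eigh(L)
S = greedy_sampling_set(U[:, :3], 3)
x = U[:, :3] @ np.array([1.0, 0.5, -0.2])            # exactly bandlimited signal
x_hat = U[:, :3] @ np.linalg.pinv(U[S, :3]) @ x[S]   # recover from 3 samples
print(S, np.allclose(x, x_hat))
```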
2016 — 2017
Ortega, Antonio |
Workshop On Graph Signal Processing: Student Travel Support @ University of Southern California
A graph signal is a signal in which the relationships between its components follow the structure encoded in a weighted graph. The purpose of graph signal processing is to exploit this underlying structure to analyze and process graph signals. The last few years have seen significant progress in the development of theory, tools, and applications of graph signal processing. The Graph Signal Processing (GSP) Workshop is a forum intended to disseminate ideas to a broader audience and to exchange ideas and experiences on the future path of this emerging field.
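For readers new to the area, a tiny worked example: the graph Laplacian quadratic form x^T L x sums (x_i - x_j)^2 over edges, so it is small exactly when a signal respects the graph structure. The graph and signals below are invented for illustration.

```python
import numpy as np

# A 4-node graph with unit-weight edges 0-1, 0-2, 1-2, 2-3.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

smooth = np.array([1.0, 1.1, 1.0, 0.9])    # neighboring values agree
rough = np.array([1.0, -1.0, 1.0, -1.0])   # neighboring values clash
for x in (smooth, rough):
    print(x @ L @ x)                       # small for smooth, large for rough
```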
The main goal of the funds requested in this proposal is to facilitate the participation of students by providing a limited number of travel grants. The recipients of the travel awards will be selected by the organizing committee from among students who apply for them and who are principal authors of papers presented at the workshop. These awards will be based on merit (e.g., novelty of the research) but also on travel distance. We expect to provide $500 travel awards to 20 students.
2017 — 2020
Annavaram, Murali; Ortega, Antonio
SHF: Small: Accelerating Graph Analytics Through Coordinated Storage, Memory and Computing Advances @ University of Southern California
Graphs represent the relationship between different entities and are the representation of choice in diverse domains, such as web page ranking, social networks, drug interactions, and communicable disease spreading. Due to the sheer size of graphs in these important domains, billions of vertices with tens of billions of edges, graph processing is a data intensive task. The size of the graphs is expected to far exceed the size of the main memory available in many computer systems. As such, graph analytics will be hobbled by their inability to quickly access graph vertices and edges from computer storage. Current storage systems are mostly block-based and hence treat graph data as a collection of bytes organized into pages. The advent of affordable solid state drives (SSDs) allows one to envision a future where SSDs can be made semantically aware of the underlying graph storage. Rather than treating storage as a collection of blocks, semantic awareness enables SSDs to consider graph structure when deciding how vertices and edges are laid out and how to access the graph elements efficiently.
This research advances the vision of semantic graph storage by proposing to make the SSD controller treat graph vertices and edges as first-class objects. In particular, this research will design and implement a set of application programming interfaces (APIs) that allow application developers and algorithm designers to specify graph layout and query storage systems using graph-oriented access requests, such as finding all the neighbors of a given vertex. A new runtime layer for SSDs will also be developed to exploit the semantic awareness to improve SSD endurance, garbage collection and caching. The benefits of semantic graph storage will be demonstrated by rethinking the implementation of graph signal processing algorithms to achieve an order-of-magnitude improvement in performance. Such dramatic performance improvements will in turn enable a variety of compelling societal benefits such as accelerated drug discovery. This research also provides opportunities for a new generation of students to study, implement and optimize graph analytics on experimental SSD platforms and to study the tradeoffs between clean abstractions and the performance impact of those abstractions.
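As a purely hypothetical sketch of what a graph-oriented storage API of the kind described might look like: all names below are invented, and a real system would serve these requests from the SSD controller and runtime rather than from Python.

```python
# Hypothetical graph-oriented storage interface; names are illustrative only.

class GraphStore:
    def __init__(self):
        self._adj = {}                      # vertex id -> list of neighbor ids

    def add_edge(self, u, v):
        self._adj.setdefault(u, []).append(v)
        self._adj.setdefault(v, []).append(u)

    def neighbors(self, v):
        """A graph-level request ("all neighbors of v") rather than a
        block-level read; a semantically aware SSD could lay out adjacency
        lists so this touches a minimal number of pages."""
        return self._adj.get(v, [])

store = GraphStore()
for u, v in [(0, 1), (0, 2), (1, 2), (2, 3)]:
    store.add_edge(u, v)
print(store.neighbors(2))                   # -> [0, 1, 3]
```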
2020 — 2023
Ortega, Antonio |
CIF: Small: Graph Signal Processing Methods For Data-Driven System Design @ University of Southern California
Data-driven design is leading to unprecedented performance improvements in many widely used systems. Examples of recent successes can be found in speech recognition, advanced video analysis or imaging-based medical diagnosis, to name just a few. This project is motivated by the observation that more data does not always lead to better system design. In fact, extensive use of poorly understood data can create significant risks once systems are deployed. For example, data may introduce bias toward specific system outputs (e.g., lead to incorrect diagnoses), or performance might degrade significantly under even small changes in data collection (e.g., microphone characteristics, camera resolution). These risks are a major obstacle to wider adoption of data-driven tools, in particular in critical applications. This project develops methods to select data for improved system design, based on new models for large scale datasets. The ultimate goal of the project is to reduce deployment risk by designing systems based on the most representative dataset rather than simply using the largest dataset.
In many applications, such as sensing, anomaly detection, classification, recognition or identification, systems are designed by first collecting significant amounts of data, and then optimizing system parameters using that data. As task complexity, data size and the number of system parameters increase, system analysis and characterization tasks become a major challenge, with estimates often based on end-to-end performance on the training set. Examples of these tasks include (i) estimating system accuracy, (ii) characterizing system stability to changes in data, (iii) determining the correct amounts of data needed for training or (iv) predicting their ability to generalize to different situations. In this project, graph-based approaches are developed to characterize large datasets in high dimensional space. This research is focused on theoretical, algorithmic and practical aspects of system characterization and design. On the theoretical front, this project tackles the problem of designing graphs that capture relevant properties of the data space, developing asymptotic results to link the distribution of the data to properties of graphs and related graph signals. On the algorithmic front, efficient methods for graph construction and task complexity estimation are developed, with the goal of enabling selection of the most representative dataset. As an application, practical deep learning architectures are considered, methods to increase their robustness are studied, and new strategies for active and transfer learning are developed.
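As a hedged illustration of the graph-based dataset characterization described above, the sketch below builds a k-nearest-neighbor graph over synthetic data points and uses the smoothness of the label signal on that graph as a rough proxy for task difficulty; the connection to the project's asymptotic theory is only suggestive, and all data are invented.

```python
import numpy as np

def knn_graph(X, k=3):
    """Symmetric k-NN adjacency over the rows of X (Euclidean distance)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    A = np.zeros_like(d)
    for i in range(len(X)):
        A[i, np.argsort(d[i])[:k]] = 1.0
    return np.maximum(A, A.T)              # symmetrize

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (20, 2)),     # two well-separated classes
               rng.normal(2, 0.3, (20, 2))])
y = np.array([0.0] * 20 + [1.0] * 20)
A = knn_graph(X)
L = np.diag(A.sum(axis=1)) - A
print(y @ L @ y)   # label smoothness: low when classes are graph-separable
```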
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.