
# Oberwolfach Reports


**Volume 4, Issue 3, 2007, pp. 2079–2172**

**DOI: 10.4171/OWR/2007/36**

Published online: 2008-06-30

## Wavelet and Multiscale Methods

Albert Cohen (1), Wolfgang Dahmen (2), Ronald A. DeVore (3) and Angela Kunoth (4)

(1) Université Pierre et Marie Curie, Paris, France

(2) Technische Hochschule Aachen, Germany

(3) Texas A&M University, College Station, USA

(4) Universität zu Köln, Germany

Complex scientific models, such as those arising in climate research, turbulence, fluid–structure interaction, and the nanosciences, demand ever finer resolution in order to increase reliability. This demand cannot be met simply by increasing computational power. Indeed, greater computing power even compounds the problem by generating vast data sets for which efficient organization principles are not available. Extracting essential information from complex structures, and developing rigorous models for quantifying the quality of that information, is an increasingly important issue. This manifests itself in recent developments in various areas. Examples include regression techniques such as projection pursuit in stochastic modeling, the investigation of greedy algorithms in complexity theory, and compression and encoding techniques in signal and image processing. Further representative examples are the compression of fully populated matrices arising from boundary integral equations through concepts like multipole expansions, panel clustering or, more generally, hierarchical matrices, as well as adaptive solution techniques in numerical simulation based on continuous models such as partial differential or integral equations.

The mathematical methods emerging to address these problems share several common features, including the nonlinearity of the solution methods and the ability to separate solution components living on different length scales. Having to deal with the appearance and interaction of local features at different levels of resolution has, for instance, brought about multigrid methods as a key methodology that has advanced the frontiers of computability for certain problem classes in numerical analysis.
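To make the multigrid idea mentioned above concrete, here is a minimal, self-contained numpy sketch (not taken from the report; the model problem, grid size, and smoothing parameters are illustrative choices) of a V-cycle for the 1D Poisson equation, using weighted-Jacobi smoothing, full-weighting restriction, and linear-interpolation prolongation:

```python
import numpy as np

def residual(u, f, h):
    """Residual r = f - A u for the 1D Poisson operator -u'' with mesh width h."""
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2.0 * u[1:-1] - u[:-2] - u[2:]) / h**2
    return r

def smooth(u, f, h, sweeps=3, omega=2.0 / 3.0):
    """Weighted-Jacobi smoothing: damps the high-frequency error components."""
    for _ in range(sweeps):
        u[1:-1] += omega * 0.5 * (h**2 * f[1:-1] + u[:-2] + u[2:] - 2.0 * u[1:-1])
    return u

def vcycle(u, f, h):
    """One multigrid V-cycle for -u'' = f with homogeneous Dirichlet data."""
    n = len(u) - 1
    if n <= 2:
        u[1] = 0.5 * (h**2 * f[1] + u[0] + u[2])   # exact solve on coarsest grid
        return u
    u = smooth(u, f, h)                            # pre-smoothing
    r = residual(u, f, h)
    rc = np.zeros(n // 2 + 1)                      # full-weighting restriction
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    ec = vcycle(np.zeros_like(rc), rc, 2.0 * h)    # coarse-grid error correction
    u += np.interp(np.arange(n + 1), np.arange(0, n + 1, 2), ec)  # prolongation
    return smooth(u, f, h)                         # post-smoothing

# Model problem: -u'' = pi^2 sin(pi x) on (0,1), exact solution sin(pi x).
n = 64
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
f = np.pi**2 * np.sin(np.pi * x)
u = np.zeros(n + 1)
for _ in range(20):
    u = vcycle(u, f, h)
```

Each V-cycle reduces the algebraic error by a mesh-independent factor, which is the sense in which such systems can be solved to discretization-error accuracy in linear time.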
In fact, the separation of frequencies plays an important role in preconditioning the linear systems arising from elliptic partial differential equations, so that the corresponding large-scale systems can be solved to discretization-error accuracy in optimal, linear time. A related but different concept for managing the interaction of different length scales centers on wavelet bases and multilevel decompositions. In the very spirit of harmonic analysis, they allow one to decompose complex objects into versatile and simple building blocks that in turn support the analysis of multiscale features. While this ability was first exploited primarily for treating *explicitly* given objects, like digital signals, images, or data sets, the use of such concepts for recovering *implicitly* given objects, like solutions of partial differential or boundary integral equations, has become a major recent focus of attention. The close marriage of discretization, analysis, and the solution process based on *adaptive* wavelet methods has led to significant theoretical advances as well as new algorithmic paradigms for linear and nonlinear stationary variational problems. Through thresholding and best *N*-term approximation based on wavelet expansions, concepts from nonlinear approximation theory and harmonic analysis become practically manageable. In our opinion, these ideas open promising perspectives not only for signal and image processing but also for the numerical analysis of differential and integral equations, covering, in particular, such operator equations with stochastic data.

These two research areas have developed relatively independently of one another. Our first Oberwolfach workshop 'Wavelet and Multiscale Methods', held in July 2004, sought to bring the various disciplines utilizing multiscale techniques together by inviting leading experts and emerging young scientists from areas that rarely interact.
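Thresholding and best *N*-term approximation can be sketched in a few lines. The following is an illustrative numpy example (not from the report): an orthonormal Haar transform, with the approximation formed by keeping only the *N* largest-magnitude wavelet coefficients; the test signal and the value of *N* are arbitrary choices for the demonstration.

```python
import numpy as np

def haar_transform(x):
    """Orthonormal Haar decomposition of a signal whose length is a power of 2."""
    x = np.asarray(x, dtype=float).copy()
    coeffs = []
    while len(x) > 1:
        avg = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # scaling (coarse) part
        det = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # wavelet (detail) part
        coeffs.append(det)
        x = avg
    coeffs.append(x)                               # coarsest scaling coefficient
    return coeffs

def haar_inverse(coeffs):
    """Inverse of haar_transform."""
    x = coeffs[-1].copy()
    for det in reversed(coeffs[:-1]):
        up = np.empty(2 * len(x))
        up[0::2] = (x + det) / np.sqrt(2.0)
        up[1::2] = (x - det) / np.sqrt(2.0)
        x = up
    return x

def best_n_term(x, n_terms):
    """Keep only the n_terms largest-magnitude Haar coefficients of x."""
    coeffs = haar_transform(x)
    magnitudes = np.sort(np.abs(np.concatenate(coeffs)))
    thresh = magnitudes[-n_terms]                  # n-th largest magnitude
    kept = [np.where(np.abs(c) >= thresh, c, 0.0) for c in coeffs]
    return haar_inverse(kept)

# A piecewise-smooth signal with a jump is captured well by few Haar terms.
t = np.linspace(0.0, 1.0, 256)
signal = np.sin(2.0 * np.pi * t) + (t > 0.5)
approx = best_n_term(signal, 32)
max_err = np.max(np.abs(signal - approx))
```

The point of the nonlinearity is that the retained coefficients depend on the signal itself, which is what makes such approximations so effective for objects with localized features.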
That workshop not only accelerated the advancement of nonlinear and multiscale methodologies but also provided beneficial cross-fertilization among the diverse disciplines that participated (see Oberwolfach Report 34/2004). Among the recognizable outcomes of that workshop were: (i) the emergence of compressed sensing as an exciting alternative to the traditional sensing–compression paradigm, (ii) fast online computational algorithms based on adaptive partitions for mathematical learning, and (iii) clarification of the role of coarsening in adaptive numerical methods for PDEs.

The workshop *Wavelet and Multiscale Methods*, organised by Albert Cohen (Paris), Wolfgang Dahmen (Aachen), Ronald A. DeVore (Columbia) and Angela Kunoth (Bonn), was held July 29 – August 4, 2007. The meeting was well attended, with over 50 participants and broad geographic representation from all continents. It brought together a blend of researchers with the various backgrounds described in the following.

Compressed sensing, as developed by Candès, Donoho, Vershynin, Gilbert, Strauss, and others, advocates a fascinating alternative to the usual sensing-then-compression methodology. The classical model of limited bandwidth is replaced by sparsity models, and the role of traditional sampling is played by sensing functionals that are typically based on random vectors. One can then prove that, under certain circumstances, far fewer observations are needed to record all the information required to encode sparse signals.

Adaptive methods for numerically solving a wide range of equations with proven optimality (in terms of the number of computations needed to achieve a prescribed error tolerance) originally involved coarsening procedures. The necessity of such coarsening was brought into question at the previous workshop, and subsequent work of Stevenson has shown that coarsening can be avoided for scalar elliptic problems through cautious bulk chasing.
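The compressed-sensing setting described above can be illustrated with a short numpy sketch (not part of the report): a sparse vector is recovered exactly from far fewer random measurements than unknowns, using orthogonal matching pursuit, one standard greedy recovery algorithm. All dimensions and the random seed are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# A k-sparse signal of length n, observed through only m << n random functionals.
n, k, m = 256, 8, 120
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
y = A @ x                                      # the m observations

def omp(A, y, k):
    """Orthogonal matching pursuit: greedy sparse recovery in k steps."""
    residual = y.copy()
    chosen = []
    for _ in range(k):
        # Pick the column most correlated with the current residual ...
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in chosen:
            chosen.append(j)
        # ... then re-fit by least squares on all columns chosen so far.
        coef, *_ = np.linalg.lstsq(A[:, chosen], y, rcond=None)
        residual = y - A[:, chosen] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[chosen] = coef
    return x_hat

x_hat = omp(A, y, k)
recovery_error = np.linalg.norm(x_hat - x)
```

With a Gaussian sensing matrix and m on the order of k log n, such greedy recovery succeeds with overwhelming probability, which is the phenomenon the theory referred to above makes precise.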
As in the previous workshop, the participants were experts in areas like nonlinear approximation theory (e.g., DeVore, Temlyakov), statistics (e.g., Picard, Kerkyacharian), finite elements (e.g., Braess, Oswald, Xu), multigrid methods (e.g., Braess, Hackbusch), spectral methods (e.g., Canuto), harmonic analysis and wavelets (e.g., Cohen, Daubechies, Petrushev, Schneider, Stevenson), numerical fluid mechanics (e.g., Süli), conservation laws (e.g., Tadmor), and systems of stationary operator equations (e.g., Dahmen, Kunoth, Schwab).

One of the main objectives of this workshop was to foster synergies through the interaction of scientists from different disciplines, accelerating the development of new methodologies in these various domains. It also served to bridge theoretical foundations with applications. Examples of conceptual issues that were advanced by the workshop are:

- convergence theory for adaptive multilevel methods for high-dimensional PDEs;
- extension of fast solution methods such as multigrid and multiscale methods to more complex models, such as control problems involving partial differential equations and partial differential equations with stochastic data;
- adaptive multiscale methods for coupled systems involving partial differential and integral equations;
- incorporating anisotropy in analysis, estimation, compression, and encoding;
- adaptive treatment of nonlinear and time-dependent variational problems;
- interaction of different scales under nonlinear mappings, e.g., for flow problems and for problems with stochastic data.
We feel that our workshop propelled further advancement of several emerging areas: the numerical aspects of compressed sensing, including stability and optimality; deterministic methods for compressed sensing based on coding theory; the design and analysis of universal estimators in nonparametric statistical estimation and machine learning, where nonlinear multiscale techniques may offer much more efficient alternatives to schemes based on complexity regularization; and solution concepts for problems of high spatial dimension that exploit anisotropy, for instance in mathematical finance, in quantum chemistry, and in electronic structure calculations.

In summary, we find that the conceptual similarities occurring in these diverse application areas suggest a wealth of synergies and cross-fertilization. These concepts are, in our opinion, relevant not only for the development of efficient solution methods for large-scale problems but also for the formulation of rigorous mathematical models for quantifying the extraction of essential information from complex objects.


Cohen Albert, Dahmen Wolfgang, DeVore Ronald, Kunoth Angela: Wavelet and Multiscale Methods. *Oberwolfach Rep.* 4 (2007), 2079–2172. doi: 10.4171/OWR/2007/36