Mathematical and Computer Sciences

Computational analysis tools for multiscale engineering systems (Petzold)

Although a great deal of work has been devoted to simulation of large-scale engineering systems, relatively little effort has focused on the development of algorithms and software for computational analysis: the mathematical and computational tools for extracting information from the simulation and making use of it for decision-making and design. This is particularly the case for multiscale engineering systems, where reliable techniques for simulation are just now coming of age. Prof. Petzold's research group has been engaged in the development of methods and software for sensitivity analysis, estimation of Lyapunov exponents, and design optimization for continuum-scale and multiscale systems, with applications to the study and design of microfluidic systems for mixing.
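
To make the flavor of such analysis concrete, the sketch below shows forward sensitivity analysis for a hypothetical one-parameter ODE model dx/dt = -p*x: the sensitivity s = dx/dp is obtained by integrating the variational equation alongside the state. This is a minimal illustration only, not Prof. Petzold's software, which targets large-scale and multiscale systems.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal forward-sensitivity sketch (illustration only): for the hypothetical
# model dx/dt = f(x, p) = -p*x, the sensitivity s = dx/dp satisfies the
# variational equation ds/dt = (df/dx)*s + df/dp, integrated with the state.
p = 0.5

def rhs(t, y):
    x, s = y
    dxdt = -p * x
    dsdt = -p * s - x        # (df/dx)*s + df/dp  with  f = -p*x
    return [dxdt, dsdt]

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], rtol=1e-8, atol=1e-10)
print("computed x(10), dx/dp(10):", sol.y[0, -1], sol.y[1, -1])
print("analytic x(10), dx/dp(10):", np.exp(-10 * p), -10 * np.exp(-10 * p))
```
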
Sharp gradients and interface tracking (Ceniceros, Gibou, Liu)

Solutions with sharp gradients occur in a wide variety of important applications such as interfacial flows and multiphase material decomposition. Prof. Ceniceros, Prof. Gibou and Prof. Liu work on different numerical approaches to accurately and efficiently resolve flows with sharp transitions. Prof. Liu is one of the developers of the Ghost Fluid Method for multiphase flows. This method captures the boundary conditions on the fluid interface in a sharp fashion. He is also developing high-order conservative schemes for multidimensional hyperbolic equations and a second-order version of the Ghost Fluid Method. Prof. Ceniceros is developing adaptive and computationally efficient numerical strategies for immersed-interface and diffuse-interface models to study multiphase flows. In the immersed-interface setting, these strategies merge the level set method with adaptive (moving) meshes, front tracking, and adaptive mesh refinement. For the diffuse-interface (phase field) model, a new fast and stable method for 2D and 3D simulations is being developed. Prof. Gibou is developing high-order accurate numerical methods for free surface flows, two-phase flows and multiphase flows with phase change in a level set framework. A new class of multiphase flow solvers for adaptive Cartesian grids is being developed in two and three spatial dimensions. A hallmark of this approach is that its design does not assume any particular structure on the mesh, thereby avoiding mesh-generation constraints (see http://www1.engr.ucsb.edu/~fgibou/).
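
As a deliberately simplified illustration of the interface-capturing idea behind these methods, the Python sketch below advects a one-dimensional level set function with first-order upwinding; the zero crossing of phi tracks the interface without smearing it. It is not the authors' codes and omits the reinitialization, adaptivity, and sharp (ghost-fluid) jump treatments discussed above.

```python
import numpy as np

# 1D level-set advection sketch: transport phi with a constant velocity u using
# first-order upwinding; the zero crossing of phi marks the interface location.
N, L, u, dt, steps = 200, 1.0, 1.0, 0.002, 100
x = np.linspace(0.0, L, N)
dx = x[1] - x[0]
phi = x - 0.25                                   # interface initially at x = 0.25

for _ in range(steps):
    if u > 0:
        dphi = (phi - np.roll(phi, 1)) / dx      # backward difference (upwind)
    else:
        dphi = (np.roll(phi, -1) - phi) / dx     # forward difference (upwind)
    phi = phi - dt * u * dphi

k = np.argmin(np.abs(phi))                       # grid point nearest the interface
print("interface near x =", x[k], " (exact:", 0.25 + u * dt * steps, ")")
```
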
Stochastic partial differential equations (Birnir)

The modeling of complex fluids and materials frequently includes interface fluctuations that can be described by stochastic partial differential equations driven by noise. Even when no external noise is present, the equations describing fluid and material interfaces are nonlinear and sometimes ill-posed. Such systems amplify very small noise in the surroundings and, as a result, can mimic stochastic PDEs. In spite of the fact that their solutions are random variables, they can be solved numerically and the solutions used to compute statistical averages. However, since the solutions are not smooth, and this in turn influences the averages, great care must be taken in their numerical solution, especially to capture the small scales. Prof. Birnir and his collaborators have developed numerical schemes that permit the accurate computation of statistical quantities such as the width (correlation) function and applied them to surface growth and land-surface evolution, completely characterizing the interface dynamics and texture.
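
A stripped-down illustration of computing such a statistical quantity is sketched below: a one-dimensional Edwards-Wilkinson growth equation driven by space-time white noise is integrated with an explicit Euler-Maruyama step and the interface width is monitored. This is a toy example with assumed parameters, not the schemes developed by Prof. Birnir and collaborators.

```python
import numpy as np

# Toy 1D Edwards-Wilkinson equation  h_t = nu*h_xx + noise  on a periodic grid,
# integrated with Euler-Maruyama; the interface width w(t) = sqrt(<(h-<h>)^2>)
# is the kind of statistical quantity one averages over realizations.
N, nu, dx, dt, steps = 256, 1.0, 1.0, 0.01, 2000
rng = np.random.default_rng(0)
h = np.zeros(N)

for n in range(steps):
    lap = (np.roll(h, 1) - 2.0 * h + np.roll(h, -1)) / dx**2
    h += dt * nu * lap + np.sqrt(dt / dx) * rng.standard_normal(N)
    if (n + 1) % 500 == 0:
        w = np.sqrt(np.mean((h - h.mean()) ** 2))
        print(f"t = {(n + 1) * dt:5.1f}   interface width = {w:.3f}")
```
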
Homogenization (Birnir)

Homogenization is used in flow and electromagnetic problems where a separation of scales exists, so that large-scale, typically slow, modulations are imposed on small-scale, usually fast, oscillations. Homogenization of advanced materials can be used to guide experimental tests of materials and their design. Recent mathematical advances have made possible the homogenization of both complex electromagnetic resonances and turbulent flows. Prof. Birnir and his collaborators have applied homogenization to both fluid and material problems.
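
As a standard textbook illustration of the idea (periodic diffusion with rapidly oscillating coefficients, not the specific fluid and material problems mentioned above), the effective coefficient is obtained from a cell problem on the unit periodicity cell Y:

```latex
% Classical periodic homogenization of  -div( a(x/eps) grad u_eps ) = f.
% Cell problems on the unit cell Y:
\[
  -\nabla_y \cdot \bigl( a(y)\,(\nabla_y \chi^j + e_j) \bigr) = 0
  \quad \text{in } Y, \qquad \chi^j \ \text{$Y$-periodic},
\]
% effective (homogenized) coefficient and limiting equation:
\[
  a^{\mathrm{hom}}_{ij} = \int_Y a(y)\,\bigl( \delta_{ij} + \partial_{y_i}\chi^j(y) \bigr)\,dy,
  \qquad
  -\nabla \cdot \bigl( a^{\mathrm{hom}} \nabla u^0 \bigr) = f ,
\]
% with u_eps converging to u^0 as eps -> 0.
```
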
Cluster computing and computational grid computing environments (Wolski, Yang)

Prof. Wolski's research is devoted to the development of grid computing as a generic resource. In the area of cluster computing, Prof. Yang is developing a runtime system for threaded execution of MPI parallel programs on networked workstations and SMPs. He is investigating a cluster-based storage system for high reliability and expandability, with the goal of allowing cluster users seamless access to a large, low-cost storage system. Finally, he is developing a web-based execution environment for parallel jobs in a multiprogrammed cluster. His goal is to improve the availability and manageability of a computing cluster by means of clustering infrastructure support.
Semi-automatic generation of graphical user interfaces for scientific computing (Petzold)

There is a great need for tools and environments that facilitate the development of scientific computing software and make it easier to use. Prof. Petzold's research group has been developing an environment that allows developers and/or sophisticated users of scientific software to quickly, easily, and semi-automatically create matching Java front ends for their programs. This is accomplished via a process of compile-time dataflow analysis and automated constraint extraction. Revision management presents some of the most interesting and challenging research problems: reestablishing the linkage between two independent but interlinked modules when one of them is changed.
Fast Solvers (Chandrasekaran)

A common bottleneck in numerical computations is the solution of linear systems of equations. One of the most common techniques is to exploit properties of the matrix (sparsity, smoothness, etc.) to develop a fast matrix-vector multiplication algorithm, and then use it in an iterative solver. However, the speed of iterative solvers is problem-dependent, and they usually require a good preconditioner, which is difficult to come by. Prof. Chandrasekaran and his collaborators have instead taken a different approach to the problem. They exploit the low rank of certain sub-blocks of the matrix to design fast direct solvers. These fast solvers require no preconditioners. Furthermore, the structure of the matrix is captured using a purely algebraic representation, which can be computed rapidly, so the technique is widely applicable. For example, it has led to the first fast direct solver for Kress's spectral discretization of the integral equations of two-dimensional scattering theory. The usefulness of the algorithm for solving both sparse and dense linear systems arising from PDEs and integral equations is currently being investigated.
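
The structure these solvers exploit can be seen in a few lines of Python (a toy example with an assumed smooth kernel, not the solver itself): off-diagonal blocks of such matrices have numerically low rank, as the rapid decay of their singular values shows.

```python
import numpy as np

# Toy illustration of the low-rank structure exploited by fast direct solvers:
# build a dense matrix from a smooth kernel and examine an off-diagonal block.
n = 400
x = np.linspace(0.0, 1.0, n)
A = 1.0 / (1.0 + np.abs(x[:, None] - x[None, :]))   # smooth, non-singular kernel

B = A[: n // 2, n // 2 :]                            # an off-diagonal sub-block
s = np.linalg.svd(B, compute_uv=False)
rank = int(np.sum(s > 1e-10 * s[0]))                 # numerical rank at 1e-10 tolerance
print("numerical rank of the off-diagonal block:", rank, "of", n // 2)
```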

Computation in Complex Fluids

High Resolution Simulation of Free Surface Flows (Gibou)

Free surface flow models are used to simulate many physical phenomena with applications across science and engineering. Prof. Gibou's research is twofold: first, he is developing high resolution algorithms that can simulate and predict the behavior of complex free surface flows; second, he seeks to apply these algorithms to a wide range of applications in collaboration with scientists and engineers (Banerjee, Fast, Meiburg, Nguyen, etc.). A characteristic of his research is the development of so-called sharp-interface and multiscale numerical algorithms (see http://www1.engr.ucsb.edu/~fgibou/).
High Resolution Simulation of Multiphase Flows with Phase Change (Gibou)

Over the last two decades, there has been an ongoing quest for new computational methods to solve multiphase flows with phase change. This thrust has been motivated in part by the energy industry, as phase change processes allow fluids to store and release large amounts of heat. Other applications include the study of condensation in dehumidification systems for manned space flight, which is particularly difficult to study experimentally in microgravity environments and of considerable interest to NASA. The study of phase change with physical experiments remains a challenge, mainly because of the small time and length scales associated with these processes. Consequently, such studies are limited to empirical correlations for specific cases. Theoretical results, starting with the work of Rayleigh, have offered some insight into the nature of simple solutions and have provided revealing stability analyses. However, they rely on considerable simplifications.

Numerical simulations offer a promising avenue, and several approaches have been introduced in the last two decades. The main challenges for a direct numerical simulation come from the fact that the interface location must be computed as part of the solution process and that discontinuities in material properties across the interface must be preserved. Finally, the problem involves dissimilar length scales, with smaller scales influencing larger ones, so that nontrivial pattern formation dynamics can be expected to occur on all intermediate scales. The result is a highly nonlinear problem that is very sensitive to numerical errors and prone to numerical instabilities.

Prof. Gibou is developing efficient numerical methods for the simulation of multiphase flows with phase change. In particular, he has developed, with his co-workers at UCSB (Banerjee and Chen) and at Lockheed Martin (Nguyen), the first numerical algorithm that properly treats interfacial phase change in the sharp limit. The goal is to extend this work to three-dimensional flows, with applications to various physical studies of interest to NASA and DOE national laboratories (see http://www1.engr.ucsb.edu/~fgibou/).

Multi-scale Computational Methods for Polymeric Fluids and Soft Materials (Ceniceros)

Complex fluids and soft materials are complicated mixtures characterized by multiple phases and micro- and nano-structures that, when subjected to processing flows, determine the macroscopic properties of the material, such as toughness, ductility, and optical clarity. A computational approach based on molecular dynamics, in which a physical model is constructed with atomic resolution, is computationally infeasible for practical materials. A successful computational method must therefore use upscaling or coarse-graining. It must also be multi-scale to faithfully capture the coupling between microstructure and flow. Prof. Ceniceros, in collaboration with Profs. Banerjee, Fredrickson, and Garcia-Cervera, is working on the development and analysis of efficient multi-scale numerical methods based on the field-theoretic approach, in which particle-particle interactions are replaced by interactions between the particles and one or more fluctuating fields.
Numerical Methods for Multi-phase Flows and Free Surface Phenomena (Ceniceros)

A wide variety of important flows are characterized by the presence of fluid interfaces that separate the different bulk components in a multiphase immiscible fluid. Examples include droplets and bubbles, water waves, and fluid jets. As they evolve in a typically complex motion, the fluid interfaces can deform significantly, leading to regions of high curvature that are difficult to resolve numerically. Moreover, multiphase flow material quantities have sharp gradients across a fluid interface, and vorticity concentrates largely there. Prof. Ceniceros and collaborators are developing and applying accurate computational methods for interfacial flows in two and three dimensions. The numerical methods span a wide range of approaches, including adaptive front tracking, level set capturing, moving meshes, adaptive mesh refinement, boundary integral methods, the immersed boundary method, and diffuse-interface (phase-field) models.
Field-theoretic computer simulations (Fredrickson)

Prof. Fredrickson has developed a novel and promising computer simulation strategy for handling the rich variety of self-assembly and equilibrium phase behavior exhibited by complex fluids. Rather than sampling atomic and molecular coordinates, as in a conventional Monte Carlo or molecular dynamics simulation, the approach transforms molecular-based models into statistical field theories by formal analytical methods. The fields are the fluctuating chemical and/or electrostatic potentials, and the statistical weights (replacing the usual Boltzmann factor) are complex-valued rather than real and positive definite. Simulations are carried out using finite difference or finite element schemes. Prof. Fredrickson is studying the equilibrium properties of multi-block copolymer melts, polyelectrolyte solutions, colloidal suspensions, and microemulsions. Simulation results are benchmarked against experimental measurements carried out at UCSB in the laboratories of David Pine, Timothy Deming, and Edward Kramer, and in partnership with a number of international companies, including Dow Chemical, Rhodia, Atofina, Mitsubishi Chemical, and CSP Technologies.
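
One standard way to sample a complex-valued statistical weight is complex Langevin dynamics, which has been used in field-theoretic polymer simulations. The sketch below shows the idea on a single complex variable with the toy "action" S(z) = sigma z^2/2 (an assumed test case, not the polymer field theory itself): the Langevin average of z^2 should approach the exact value 1/sigma.

```python
import numpy as np

# Toy complex-Langevin sampler for the weight exp(-S) with S(z) = 0.5*sigma*z^2,
# sigma complex with positive real part. The variable is evolved in the complex
# plane with drift -dS/dz and real noise; <z^2> should approach 1/sigma.
rng = np.random.default_rng(1)
sigma = 1.0 + 1.0j
z, dt = 0.0 + 0.0j, 1e-3
nsteps, nburn = 500_000, 50_000
acc = 0.0 + 0.0j

for n in range(nsteps):
    drift = -sigma * z                                   # -dS/dz
    z = z + drift * dt + np.sqrt(2.0 * dt) * rng.standard_normal()
    if n >= nburn:
        acc += z * z

print("<z^2> =", acc / (nsteps - nburn), "   exact:", 1.0 / sigma)
```
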
Korteweg stresses in miscible fluid flows (Meiburg)

Miscible fluid flows occur in a wide range of industrial, environmental, and biological processes. Prof. Meiburg is investigating the influence of non-conventional (so-called Korteweg) stresses on such flows in the presence of steep concentration gradients, which may give rise to an 'effective surface tension' under certain conditions. He employs both linear stability theory and highly resolved direct numerical simulations, for both capillary tubes and Hele-Shaw cells, in close collaboration with corresponding experimental studies at USC and ESPCI, Paris. The goal is to establish the magnitude of Korteweg stresses and to derive a set of constitutive equations that can serve as a basis for analyzing such flows.
Flow of macromolecular fluids (Leal)

The dynamics of macromolecular fluids in flow is critical in many materials processing applications, as well as in biological and other naturally occurring systems. The goal of computational simulation is the prediction not only of the continuum flow variables (velocity u and pressure p), but also of the corresponding microstructural state and stress distributions, since these control both the flow and transport properties of the fluid and the properties of any product that results from the flow. The unusual feature of macromolecular liquids (and, indeed, all “non-Newtonian” fluids) is that internal relaxation processes are slow, so the microstructural state can be modified greatly from the equilibrium configuration by interaction with a flow, with major changes in the macroscopic properties.

The computational problem is thus to solve the Cauchy equations of motion together with material model equations that describe the coupling of the microstructural state of the material with the flow. The transition from microstructure to macroscopic flow occurs via the relationship between stress and the microstructural state of the material. The state of the material at each material point is described via a statistical distribution function, and the latter is calculated either directly, by solving a multidimensional advection-diffusion equation, or through a corresponding stochastic “Langevin-type” equation. Alternatively, one can attempt to derive equations for the leading moments of the distribution function starting from the fundamental statistical mechanical models, but this involves closure approximations that may change the mathematical character of the problem. A large variety of challenging computational problems is associated with each of the possible approaches: solving an advection-diffusion-based model, solving the stochastic differential equation model, or introducing closures or other approximations. Both in the configuration space for micro-variables and in physical space for the flow, Lagrangian or particle-based techniques (such as “smoothed particle hydrodynamics”) appear to be advantageous and amenable to parallelization, but there are unresolved fundamental issues. The huge size of the problems is also a major issue. For a fully 3D, time-dependent flow, the multidimensional, time-dependent configuration-space problem must be solved at enough material points to provide adequate spatial resolution for the stress. The configuration problem for each material point is itself multidimensional in the configuration-space independent variables and time. This is a large problem under any circumstances. However, in some key materials the microstructural state can also develop very short length scales, even in a flow domain where one would expect smooth variations on longer length scales. For example, in liquid crystalline polymers, instabilities in the flow lead to disclinations and the onset of a very short length scale “polydomain” structure. This second type of problem represents a special challenge for simulations.
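
As a small, self-contained illustration of the stochastic ("Langevin-type") route, the sketch below runs Brownian dynamics for the Hookean dumbbell model in steady simple shear and estimates the polymer contribution to the shear stress from the Kramers average of the connector vectors. The model and parameters are assumed for illustration; the full configuration-space problems described above are vastly larger.

```python
import numpy as np

# Brownian dynamics of Hookean dumbbells in steady simple shear (dimensionless).
# Each connector vector Q obeys  dQ = (kappa @ Q - 0.5*Q) dt + dW ; the polymer
# shear stress follows from the Kramers average <Q1*Q2>, which tends to Wi here.
rng = np.random.default_rng(2)
Np, dt, steps = 10_000, 1e-3, 6_000
Wi = 1.0                                    # Weissenberg number
kappa = np.zeros((3, 3))
kappa[0, 1] = Wi                            # kappa[i, j] = du_i/dx_j for u = (Wi*y, 0, 0)
Q = rng.standard_normal((Np, 3))            # equilibrium ensemble: <Q Q> = I

for _ in range(steps):
    drift = Q @ kappa.T - 0.5 * Q
    Q = Q + drift * dt + np.sqrt(dt) * rng.standard_normal((Np, 3))

QQ = Q.T @ Q / Np                           # ensemble average <Q Q>
print("dimensionless polymer shear stress <Q1 Q2> =", QQ[0, 1], " (expected ~", Wi, ")")
```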

Interface dynamics (Leal)

Many key processes involve the motion of multiphase fluids, consisting of two (or more) bulk fluid phases that are immiscible and separated by an interface that contains additional surface-active components known as surfactants. One physical phenomenon being studied in the group of Prof. Leal is the coalescence of two drops in a flow. Our theoretical approach to this problem is via the standard continuum description of two Newtonian fluids, with a sharp interface and a fully coupled mass-transfer mechanism for the surfactant distribution on the interface. Non-uniform interface concentrations produce Marangoni stresses that couple with the bulk fluid motion and have a major effect on the circumstances in which a collision actually leads to coalescence. From a computational point of view, the problem is a special challenge due to the necessity of obtaining very accurate solutions in regions with extremely different length scales: one at the whole-drop scale (1-100 microns) and a second at the scale of the extremely thin fluid film between the drops (which may become as thin as 50-100 Angstroms). The flow-induced deformation of the interface is critical to determining whether film rupture occurs, and thus it is critical to obtain very accurate representations of the evolving interface geometry during a collision. We are currently exploring a numerical implementation of the method of matched asymptotic expansions, as well as novel boundary-integral codes, with and without surfactant at the interface. We are also pursuing more recent developments using diffuse interface models in collaboration with other CSI researchers.

Computation in Microscale Engineering

Mixing in microchannels (Mezic)

Prof. Mezic is researching effective stirring processes to decrease the mixing length and microchannel cross-section using theoretical, computational, and experimental methods in both passive and active modes. Passive designs include patterning of the bottom microchannel surface to induce three-dimensional flows with a substantial cross-sectional component. Active designs include the transverse momentum mixer under study at UCSB in collaboration with Prof. Carl Meinhart, wherein oscillatory motion is introduced in side channels to stir the flow effectively. Dynamical systems and control theory tools are developed to control and optimize mixer performance.
Bubbles and bubble migration in microdevices (Homsy)

Prof. Homsy is studying the manipulation of microbubbles in microchannels by the exploitation of surface tension variations using both theoretical and experimental approaches. He studies the speed of propagation of a bubble in a temperature gradient, and its dependence on system parameters such as the geometry of the microchannel and the viscosity and surface tension of the liquid. He also studies the production of vapor bubbles by asymptotic methods and simplified models of evaporation near contact lines between liquid, solid and vapor. At issue is the prediction of the size and shape of fully three dimensional bubbles as functions of heat input into the system, channel geometry, and fluid properties.

Computation in Materials

Analysis and simulations of complex materials (Garcia-Cervera)

The dynamical formation and evolution of microstructure are common features in a large number of physical systems, such as ferromagnetic and elastic materials, superconductors, and polymeric melts. Prof. Garcia-Cervera is developing fast and accurate numerical methods for the study of microstructure in systems with nonlocal interactions. These methods combine fast summation algorithms with adaptive mesh refinement and effective time-stepping techniques. He is one of the authors of the Gauss-Seidel Projection Method, which made it possible to perform realistic computations efficiently in the presence of nanometer-scale magnetic vortices. Prof. Garcia-Cervera has used asymptotic analysis to study the structure of domain walls and the dynamics of thin ferromagnetic films, and is currently studying the numerical solution of complex Langevin equations in the framework of ferromagnetism, where thermal effects play a fundamental role in the origin of microstructure.
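
The underlying model in this micromagnetic work is the Landau-Lifshitz equation for the unit-length magnetization m (written here in a standard form; thermal effects enter through an additional fluctuating field):

```latex
% Landau-Lifshitz dynamics of the magnetization m, |m| = 1:
\[
  \frac{\partial \mathbf{m}}{\partial t}
  = -\gamma\, \mathbf{m} \times \mathbf{H}_{\mathrm{eff}}
    - \gamma \alpha\, \mathbf{m} \times \bigl( \mathbf{m} \times \mathbf{H}_{\mathrm{eff}} \bigr),
  \qquad |\mathbf{m}| = 1,
\]
% where H_eff collects the exchange, anisotropy, external-field, and stray-field
% contributions; the stray field is the nonlocal term that the fast summation
% algorithms mentioned above are designed to handle.
```
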
High Resolution Simulation of the Stefan Problem (Gibou)

The technology of crystal growth has advanced enormously during the past two or three decades, and the development and refinement of molecular beam epitaxy (MBE) has been among the most important of these advances. Broadly stated, MBE is simply crystallization by condensation or reaction of a vapor in ultra-high vacuum. Applications include device structures in solid-state physics, electronics, and opto-electronics. The Stefan problem is a moving boundary model in which the main physical process is diffusion. It is therefore one of the main models used in the simulation of epitaxial growth. Other applications of this model include solidification processes, tissue engineering, combustion, bacterial colonies, etc. Prof. Gibou is developing high resolution numerical methods to solve this problem (see http://www1.engr.ucsb.edu/~fgibou/).
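
In its classical form (written schematically below for a temperature field; in epitaxial growth the diffusing quantity is the adatom density), the Stefan problem couples diffusion in each bulk phase to a front whose normal velocity is set by the jump in diffusive flux across it:

```latex
% Classical two-phase Stefan problem: diffusion in each phase, a condition on the
% moving front Gamma(t), and an interface velocity set by the flux jump.
\[
  \frac{\partial T}{\partial t} = D_{\pm}\, \Delta T \quad \text{in } \Omega_{\pm}(t),
  \qquad
  T = T_{\Gamma} \quad \text{on } \Gamma(t),
  \qquad
  L\, V_n = \bigl[\, D\, \nabla T \cdot \mathbf{n} \,\bigr]_{\Gamma},
\]
% where V_n is the normal velocity of the front, [.]_Gamma denotes the jump across
% Gamma, L is a (suitably scaled) latent heat, and T_Gamma may include
% Gibbs-Thomson curvature corrections.
```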

Computation in Systems Biology

Application of systems engineering tools to biological problems (Doyle)

Our research in computational systems biology is focused on the application of systems engineering tools to problems in biology. We bring traditional systems engineering tools (for example, model identification, parametric sensitivity, and closed-loop analysis) to bear on the analysis of complex, hierarchical biological systems. The guiding principle is that systems-level behavior can only be understood by considering systematic interactions across multiple temporal and spatial scales.
Multiscale simulation of complex biological systems (Petzold)

In microscopic systems formed by living cells, the small numbers of reactant molecules can result in dynamical behavior that is discrete and stochastic rather than continuous and deterministic. The Stochastic Simulation Algorithm (SSA) of Gillespie has been widely used to treat these problems. However, as a procedure that simulates every reaction event, the SSA is necessarily inefficient for most realistic problems. There are two main reasons for this, both arising from the underlying multiscale nature of the problem: (a) stiffness, i.e., the presence of multiple time scales; and (b) the need to include in the simulation both species that are present in relatively small quantities and should be modeled by a discrete stochastic process, and species that are present in larger quantities and are more efficiently modeled by a deterministic differential equation (or at some scale in between). The work in Prof. Petzold's research group seeks to address both of these issues, with accelerated discrete stochastic methods that are specifically designed to deal with stiffness, and with hybrid methods designed to model each reaction at the appropriate scale.
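
For concreteness, the sketch below implements Gillespie's direct method for a hypothetical two-reaction birth-death system (production at rate k1, degradation at rate k2*x). Because every individual reaction event is simulated, the cost grows with the total propensity; this is precisely what the accelerated and hybrid methods described above are designed to avoid.

```python
import numpy as np

# Gillespie direct-method SSA for a toy birth-death system:
#   production  0 -> X   with propensity a1 = k1
#   degradation X -> 0   with propensity a2 = k2 * x
rng = np.random.default_rng(3)
k1, k2 = 10.0, 0.1
x, t, t_end = 0, 0.0, 100.0

while t < t_end:
    a1, a2 = k1, k2 * x
    a0 = a1 + a2
    t += rng.exponential(1.0 / a0)        # time to the next reaction event
    if rng.random() * a0 < a1:            # choose which reaction fires
        x += 1
    else:
        x -= 1

print("copy number at the final time:", x, "  (stationary mean k1/k2 =", k1 / k2, ")")
```
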
Image Segmentation with Application to Radiotherapy (Gibou)

Segmentation is the art of automatically separating an image into different regions in a fashion that mimics the human visual system. It is therefore a broad term that is highly dependent on the application at hand; e.g., one might want to segment each object individually, groups of objects, parts of objects, etc. In order to segment a particular image, one must first identify the intended result before a set of rules can be chosen to target this goal. The human eye uses low-level information such as the presence of boundaries, regions of different intensity or color, brightness, and texture, but also mid-level and high-level cognitive information, for example, to identify objects or to group individual objects together. As a direct consequence, there is a wide variety of approaches to the segmentation problem, and many successful algorithms have been proposed and developed to simulate a number of these different processes. Prof. Gibou's research on this topic has focused on a class of methods known as deformable models, based on energy minimization.
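
A stripped-down illustration of the region-based energy-minimization idea is sketched below: the two-phase piecewise-constant (Chan-Vese type) energy with the curve-length penalty dropped, minimized by alternately updating the region means and the pixel labels on a synthetic image. The deformable-model algorithms referred to above add the geometric regularization and level-set evolution that this toy example omits.

```python
import numpy as np

# Two-phase piecewise-constant segmentation without the length penalty:
# alternate between updating the region means and reassigning each pixel to
# the region whose mean it matches best (each step lowers the energy).
rng = np.random.default_rng(4)
img = rng.normal(0.2, 0.05, (64, 64))
img[20:44, 20:44] += 0.6                      # a brighter synthetic "organ"

mask = img > img.mean()                       # initial guess for the object region
for _ in range(20):
    c_in, c_out = img[mask].mean(), img[~mask].mean()
    mask = (img - c_in) ** 2 < (img - c_out) ** 2

print("object pixels found:", int(mask.sum()), " (true object size:", 24 * 24, ")")
```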

A natural field of application for such algorithms is medicine. Three-dimensional conformal radiotherapy (3DCRT) and intensity-modulated radiation therapy (IMRT) are being widely developed and implemented for clinical applications. These procedures depend upon intensive use of patient imaging. The availability of spiral computerized tomography (CT) scanners has made practical the acquisition of large patient image sets consisting of around one hundred reconstructed planes. Most frequently, these three-dimensional studies are fused with a treatment planning CT in order to transfer the target volume onto it. Using this radiotherapy technology, the radiation oncologist can prescribe dose distributions that conform closely to tumor target volumes. With computerized treatment planning, it is also possible to reduce the dose that neighboring normal anatomical structures receive during the course of the radiotherapy procedure. However, the implementation of this technology is hampered by the effort required to segment tumor volumes and normal anatomical structures so that they can be represented numerically in the computer. More often than not, these structures must be segmented on workstations by drawing closed contours around the cross-sections of the anatomy as perceived by the operator in axial CT reconstructions. The construction of a series of such closed polygons in consecutive CT reconstruction planes (or slices) constitutes the process of anatomical structure segmentation as it is most commonly implemented for radiotherapy treatment planning. Software tools that support this procedure are provided in most commercial treatment planning systems and use current state-of-the-art image display and graphical interaction techniques. Nevertheless, segmentation remains a subjective and time-consuming part of the treatment planning process.

Prof. Gibou is developing real-time segmentation algorithms that take into account prior knowledge of the organ to be segmented. The key idea is to incorporate the structure of the target organ into the segmentation while processing three-dimensional data. The benefit of this approach is that the human time required by the manual segmentation process can be cut down drastically while still retaining the desired accuracy. This work is in close collaboration with researchers at Stanford University (see http://www1.engr.ucsb.edu/~fgibou/).