I’ve been running some of the test cases with the Stanford University Unstructured (SU2, read “S-U-squared”) code to generate CFD data sets to play with. I also used SU2 to produce the function evaluations in a recent Monte Carlo demo for ME470: Uncertainty Quantification. It’s a very nice set of codes with good documentation, and it’s easy to get running, both on your laptop and on your cluster.
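The demo used SU2 runs as the function evaluations. Here’s a minimal sketch of the same Monte Carlo idea with a toy quadratic standing in for the expensive simulation (the function and distribution below are illustrative choices, not the ones from the demo):

```python
import random
import statistics

def quantity_of_interest(x):
    """Toy stand-in for an expensive simulation (e.g., an SU2 run)
    mapping an uncertain input x to a scalar output."""
    return x ** 2 + 0.5 * x

def monte_carlo_mean(n_samples, seed=0):
    """Estimate the mean output when x ~ Uniform(-1, 1)."""
    rng = random.Random(seed)
    samples = [quantity_of_interest(rng.uniform(-1.0, 1.0))
               for _ in range(n_samples)]
    return statistics.mean(samples)

# For this toy function the true mean is 1/3; the estimate
# converges at the usual 1/sqrt(n) rate.
print(monte_carlo_mean(100000))
```

Swap `quantity_of_interest` for something that launches a solver run and parses its output, and you have the basic shape of the demo.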
Also, it turns out that the Tecplot files produced by SU2 are surprisingly easy to work with in Hadoop. Stay tuned for some BIGDATA computations on CFD flow fields (which I hope to finish in time for SIAM CSE).
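Part of why the format plays well with Hadoop: the ASCII Tecplot output is line-oriented (a few header lines, then one node record per line), so a mapper can parse each record independently. Here’s a sketch of a record parser over a hypothetical miniature file in that style (the variable names and values below are made up, not actual SU2 output):

```python
import io

# A tiny Tecplot-ASCII-style fragment: header lines, then one
# whitespace-separated node record per line.
sample = """\
TITLE = "Visualization of the solution"
VARIABLES = "x","y","Pressure","Mach"
ZONE NODES= 4, ELEMENTS= 2, DATAPACKING=POINT
0.0 0.0 101325.0 0.30
1.0 0.0 101200.0 0.32
0.0 1.0 101310.0 0.31
1.0 1.0 101150.0 0.33
"""

def parse_records(stream):
    """Yield one dict per node record, skipping Tecplot header lines."""
    names = None
    for line in stream:
        line = line.strip()
        if line.startswith("VARIABLES"):
            names = [v.strip().strip('"')
                     for v in line.split("=", 1)[1].split(",")]
        elif line and line[0] in "-0123456789.":
            yield dict(zip(names, map(float, line.split())))

records = list(parse_records(io.StringIO(sample)))
print(len(records), records[0]["Mach"])
```

A Hadoop mapper would do essentially what the loop body does, one line at a time.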
I’ve been working with Qiqi Wang to implement a continuation method, combined with a nonlinear least squares solver, to approximate trajectories of the Lorenz equations at different parameter values. I used DOLFIN to build the solver, which interfaces with PETSc’s SNES. Here is a figure of the trajectories as a function of the parameter R. I started with a reference solution at R=30 and used the continuation method with the solver to take R down to 1.
Click the figure for higher resolution. Here’s a link to the code. Check out Qiqi’s arXiv paper here.
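The real solver discretizes the trajectory and hands a nonlinear least squares problem to PETSc’s SNES through DOLFIN. As a much simpler illustration of just the continuation idea, here’s a pure-Python sketch that integrates the Lorenz equations with RK4 and warm-starts each new value of R from the end of the previous trajectory (the parameter schedule and step sizes are arbitrary choices for illustration, not the ones from the actual study):

```python
def lorenz(state, r, sigma=10.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz equations with parameter R."""
    x, y, z = state
    return (sigma * (y - x), x * (r - z) - y, x * y - beta * z)

def rk4_step(state, r, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = lorenz(state, r)
    k2 = lorenz([s + 0.5 * dt * k for s, k in zip(state, k1)], r)
    k3 = lorenz([s + 0.5 * dt * k for s, k in zip(state, k2)], r)
    k4 = lorenz([s + dt * k for s, k in zip(state, k3)], r)
    return [s + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

def continuation(r_values, state=(1.0, 1.0, 1.0), dt=0.01, steps=2000):
    """Walk R through r_values, warm-starting each trajectory from
    the end of the previous one -- the continuation idea."""
    trajectories = {}
    for r in r_values:
        path = [list(state)]
        for _ in range(steps):
            path.append(rk4_step(path[-1], r, dt))
        trajectories[r] = path
        state = path[-1]  # warm start for the next parameter value
    return trajectories

trajs = continuation([30.0, 20.0, 10.0, 1.0])
```

The least squares formulation replaces this time-marching with a solve for the whole trajectory at once, which is what makes continuation in R well behaved even for a chaotic system.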
You know, at some point we’ll update the pictures in the header of this blog. You’d think a blog called “simulation informatics” would have cool pictures.
Mathematicians are known to engage in hero worship, and I’m not really an exception.
Matt Knepley is a researcher at Argonne National Laboratory who works on the award-winning software package PETSc (Portable, Extensible Toolkit for Scientific Computation). It contains a suite of scalable tools for a host of scientific computing problems.
I’ve never met Matt in person, but I suspect that when I do I’ll be a little star struck.
This is pretty impressive:
STANFORD RESEARCHERS BREAK MILLION-CORE SUPERCOMPUTER BARRIER
Joe will be speaking in our minisymposium Is MapReduce Good for Science and Simulation Data? at SIAM CSE in late February. I hope he talks about this data!
It’s hard to apply informatics to simulation data if you don’t have any simulation data.
I’ve recently started playing with FEniCS, a set of tools for solving PDEs with finite element methods. It includes a Python-embedded language with syntax very close to mathematical notation (the Unified Form Language, UFL) for posing and solving PDEs in variational form. Scripts written in UFL can be compiled and run on a variety of platforms, including distributed memory clusters and multicore architectures; I think there is some support for GPUs, too. This is all done under the hood. The compiled code can link to and take advantage of powerful solver libraries like PETSc and Trilinos.
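To give a flavor of what “syntax close to mathematical notation” means, take the variational form of the Poisson problem:

```latex
\text{find } u \in V \text{ such that } \quad
\int_\Omega \nabla u \cdot \nabla v \, dx = \int_\Omega f \, v \, dx
\quad \text{for all } v \in V .
```

UFL expresses the two sides almost verbatim as `a = inner(grad(u), grad(v))*dx` and `L = f*v*dx`, and FEniCS assembles and solves the resulting linear system from exactly that description.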
FEniCS is developed and maintained by a team of top-notch computational science researchers and software developers. Development is very active, and their support via Launchpad is astounding — prompt, helpful, and informative. I really can’t say enough good things about them.
Following the download-and-install instructions on the website was straightforward on both my late-2009 13-inch MacBook Pro (dual core, 8 GB of RAM) running OS X 10.6 and a Dell workstation with two quad-core processors and 12 GB of shared memory running Ubuntu 12.04. The demos are remarkably helpful, and so is the accompanying book.
I’m working on getting it running on a distributed memory cluster. Stay tuned.
I finished a set of background notes for the uncertainty quantification class (ME470) I’m teaching this quarter with Gianluca Iaccarino. A link to the notes is below. Let me know if you think I left anything out (or if you find any typos)!
ME470 background notes
I’ve heard a few statisticians complain about the rise in popularity (and funding) of research in uncertainty quantification, claiming that it’s all just a rehashing of statistics. And I think they have a point, for the most part. At the very least, those of us working in UQ should be aware of the excellent resources from the statistics literature that address UQ-like problems.
‘Computer Experiments’ by Koehler and Owen (Chapter 9 of ‘Handbook of Statistics 13,’ 1996) is one of my personal favorites. It covers Kriging and least squares surrogate models for expensive computer simulations, along with a variety of experimental designs. I consider it a must-read for anyone interested in UQ.
Here’s a PDF.
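For a concrete sense of the Kriging idea the chapter covers, here’s a minimal zero-mean Gaussian-process interpolant with a squared-exponential correlation function, where a cheap sine function stands in for the expensive simulation (the kernel, length scale, and nugget below are illustrative choices, not taken from the chapter):

```python
import numpy as np

def kriging_fit(x_train, y_train, length=0.3, nugget=1e-10):
    """Fit a simple zero-mean Kriging interpolant: solve the
    correlation system K w = y for the weights w."""
    d = x_train[:, None] - x_train[None, :]
    K = np.exp(-0.5 * (d / length) ** 2) + nugget * np.eye(len(x_train))
    weights = np.linalg.solve(K, y_train)
    return x_train, weights, length

def kriging_predict(model, x_new):
    """Evaluate the interpolant: correlate new points with the
    training sites and apply the fitted weights."""
    x_train, weights, length = model
    k = np.exp(-0.5 * ((x_new[:, None] - x_train[None, :]) / length) ** 2)
    return k @ weights

# Pretend each y value is one run of an expensive simulation.
x = np.linspace(0.0, 1.0, 6)
y = np.sin(np.pi * x)
model = kriging_fit(x, y)
print(kriging_predict(model, np.array([0.25])))
```

The interpolant reproduces the training runs (up to the nugget) and can then be evaluated cheaply anywhere in the design space, which is the whole point of a surrogate.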
I gave a talk at the INFORMS ICS conference last week at the beautiful Eldorado hotel in Santa Fe. The talk was a broad overview and introduction to uncertainty quantification, with a bit about model reduction. It was part of a session organized by Brian Adams and Laura Swiler from Sandia Labs. Here’s a link to my slides.
INFORMS ICS slides
I’m going to start posting here more when interesting things happen. Stay tuned.
At SIAM Annual this year, Emily Shuckburgh gave a really great talk about simulation informatics in the study of ocean currents. She and her collaborators began with simulations of ocean currents and hypothesized the existence of unstable manifolds in the ocean around Antarctica. Based on this idea, they then looked for evidence of these unstable manifolds in satellite data, and found them. Finally, to VERIFY their existence, they went to Antarctica and dropped floats in the ocean to track where they would go. The floats followed the expected manifolds!
That’s taking simulation informatics — the initial simulation studies — all the way to real world science!
See the outline here: http://meetings.siam.org/sess/dsp_programsess.cfm?SESSIONCODE=14996
I’m looking for relevant papers to post, but I don’t expect to get to it soon, so it may be a while before they make their way onto this page.
People from our mini having a great time at dinner! Bet you wish you were here!