I’ve been working with Qiqi Wang to implement a continuation method combined with a nonlinear least squares solver to approximate trajectories of the Lorenz equations at different parameter values. I used DOLFIN to build the solver, which interfaces with PETSc’s SNES. Here is a figure of the trajectories as a function of the R parameter. I started with a reference solution at R=30 and used the continuation method with the solver to take R down to 1.
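To give a flavor of the idea: the sketch below is not our DOLFIN/SNES solver, just a rough, self-contained analogue using SciPy's `least_squares`. It discretizes a short Lorenz trajectory with the implicit midpoint rule, solves for the whole trajectory as a nonlinear least squares problem, and then lowers R in steps, warm-starting each solve from the previous solution. The discretization, time step, horizon, and R schedule here are my own illustrative choices.

```python
import numpy as np
from scipy.optimize import least_squares

SIGMA, BETA = 10.0, 8.0 / 3.0  # classic Lorenz parameters; R is the one we continue

def lorenz(u, R):
    """Lorenz right-hand side; u has shape (..., 3)."""
    x, y, z = u[..., 0], u[..., 1], u[..., 2]
    return np.stack([SIGMA * (y - x), x * (R - z) - y, x * y - BETA * z], axis=-1)

def residual(flat, R, u0, dt):
    """Implicit-midpoint defect of the discrete trajectory, plus the initial condition."""
    u = flat.reshape(-1, 3)
    defect = (u[1:] - u[:-1]) / dt - lorenz(0.5 * (u[:-1] + u[1:]), R)
    return np.concatenate([u[0] - u0, defect.ravel()])

# crude forward-Euler integration at R = 30 provides the reference first guess
dt, N = 0.005, 100
u0 = np.array([1.0, 1.0, 1.0])
traj = [u0]
for _ in range(N - 1):
    traj.append(traj[-1] + dt * lorenz(traj[-1], 30.0))
guess = np.array(traj)

# continuation: lower R in steps, warm-starting each solve from the last solution
for R in (30.0, 20.0, 10.0, 1.0):
    sol = least_squares(residual, guess.ravel(), args=(R, u0, dt))
    guess = sol.x.reshape(-1, 3)
    print(f"R = {R:4.0f}  residual cost = {sol.cost:.2e}")
```

Because each solve starts from a nearby solution, the nonlinear solver only has to make a small correction at each R step — that warm-starting is the whole point of the continuation.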
Click the figure for higher resolution. Here’s a link to the code. Check out Qiqi’s arXiv paper here.
You know, at some point we’ll update the pictures in the header of this blog. You’d think a blog called “simulation informatics” would have cool pictures.
It’s hard to apply informatics to simulation data if you don’t have any simulation data.
I’ve recently started playing with FEniCS — a set of tools for solving PDEs with finite element methods. It includes a Python-like language, the Unified Form Language (UFL), whose syntax closely mirrors mathematical notation for posing and solving PDEs in variational form. Scripts written with UFL are compiled and can run on a variety of platforms, including distributed-memory clusters and multicore architectures — I think they have some support for GPUs, too. This is all done under the hood, and the compiled code can link to and take advantage of powerful solver libraries like PETSc and Trilinos.
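As a concrete example of what "PDEs in variational form" means here, take the Poisson problem, the standard first FEniCS demo. Multiplying the strong form by a test function and integrating by parts gives a weak form that UFL expressions mirror almost symbol for symbol:

```latex
% Strong form: find u with -\nabla^2 u = f in \Omega, u = 0 on \partial\Omega.
% Multiply by a test function v (vanishing on the boundary) and integrate by parts:
\int_\Omega \nabla u \cdot \nabla v \,\mathrm{d}x
  = \int_\Omega f \, v \,\mathrm{d}x
  \quad \text{for all } v \in V .
```

In UFL the left-hand side becomes essentially `a = inner(grad(u), grad(v))*dx` and the right-hand side `L = f*v*dx`, which is what makes the scripts read like the math.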
FEniCS is developed and maintained by a team of top-notch computational science researchers and software developers. Development is very active, and their support via Launchpad is astounding — prompt, helpful, and informative. I really can’t say enough good things about them.
Following the download-and-install instructions on the website was straightforward on both my late-2009 13-inch MacBook Pro (dual-core, 8 GB of RAM) running OS X 10.6 and a Dell workstation with two quad-core processors and 12 GB of shared memory running Ubuntu 12.04. The demos are remarkably helpful, and so is the accompanying book.
I’m working on getting it running on a distributed memory cluster. Stay tuned.