RBC3D Banner

RBC3D

Spectral boundary integral solver for cell-scale flows

Authors: S. H. Bryngelson, H. Zhao, A. Isfahani, J. B. Freund

RBC3D is a flow solver for soft capsules and cells that implements the methods described in Zhao et al., JCP (2010) and subsequent work. It solves the boundary integral form of the Stokes equations with an algorithm tailored to cell-scale simulations:

  • Spectrally accurate spherical harmonics represent the deforming cell surfaces
  • A modified Green's function approximation handles near-range interactions
  • Electrostatic-like repulsion prevents cells from intersecting
  • A weak formulation imposes no-slip boundary conditions (e.g., at vessel walls)
  • Parallel communication via MPI enables large simulations, such as model vascular networks

Together, these features ensure that simulations are robust.
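For reference, the continuum problem that these features discretize can be written as a standard boundary integral representation of Stokes flow around a capsule with interior-to-exterior viscosity ratio λ (after Pozrikidis). The exact formulation and nondimensionalization used in RBC3D follow Zhao et al. (2010), so treat this as a sketch rather than the equations as coded:

u_j(\mathbf{x}_0) = \frac{2}{1+\lambda}\left[ u_j^{\infty}(\mathbf{x}_0) - \frac{1}{8\pi\mu}\int_{\Gamma} \Delta f_i(\mathbf{x})\, G_{ij}(\mathbf{x},\mathbf{x}_0)\,\mathrm{d}S + \frac{1-\lambda}{8\pi}\,\mathrm{PV}\!\int_{\Gamma} u_i(\mathbf{x})\, T_{ijk}(\mathbf{x},\mathbf{x}_0)\, n_k(\mathbf{x})\,\mathrm{d}S \right]

Here x_0 lies on the cell surface Γ, u^∞ is the background flow, G and T are the Stokeslet and stresslet kernels, Δf is the membrane traction jump set by the cell mechanics, and PV denotes a principal-value integral.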

Installation

To install on PACE Phoenix, first salloc a node so that srun is available, then run the following from the RBC3D root directory:

ml gcc/12.1.0-qgxpzk mvapich2/2.3.7-733lcv python/3.9.12-rkxvr6 netcdf-fortran cmake
./rbc.sh install-phoenix
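For the salloc step mentioned above, a minimal sketch of requesting an interactive node is below; the account name, task count, and walltime are placeholders, so substitute values appropriate for your allocation:

salloc -A <your-account> -N 1 -n 24 -t 02:00:00
# once the allocation is granted, load the modules and run the installer as above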

Note that if the gcc, mvapich2, mkl, and fftw modules work on your Phoenix account, you should use this installer invocation instead for a faster build:

ml gcc mvapich2 mkl python/3.9.12-rkxvr6 netcdf-fortran fftw cmake
./rbc.sh install

If you're on COC-ICE, load a different set of modules before running the same installer script:

ml gcc/12.3.0 mvapich2/2.3.7-1 intel-oneapi-mkl/2023.1.0 python/3.10.10 netcdf-fortran/4.6.0-mva2-hdf5-1.14 fftw/3.3.10-mva2 cmake
./rbc.sh install

Before you can run cmake, you must set these environment variables; you can place them in your ~/.bashrc. If RBC3D is not in your $HOME directory, replace $HOME/RBC3D with the path where you placed it.

export PETSC_DIR=$HOME/RBC3D/packages/petsc-3.19.6
export PETSC_ARCH=arch-linux-c-opt

Then, to build and run a case:

mkdir build
cd build
cmake ..
make case # or just `make` to make common and all the cases
cd case
srun -n 1 ./initcond
srun ./tube

This will generate output files in build/case/D. To keep the output files in examples/case/D instead, run the executables from examples/case, as shown below.
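Concretely, that looks like this after building:

cd examples/case
srun -n 1 ../../build/case/initcond
srun ../../build/case/tube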

On other supercomputing clusters, it should be straightforward to replace the module loads with the equivalent modules available on your system. If one of them isn't available, you can follow the manual build instructions available here.
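As a hypothetical sketch for another Lmod-based cluster (module names vary by site; the compiler and MPI modules below are placeholders):

ml gcc <your-mpi-module> mkl python netcdf-fortran fftw cmake
./rbc.sh install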

Papers that use RBC3D

This section documents the papers that make use of RBC3D.

License

MIT.