Notes on the workflow for generating the XSEDE16 Visualization Showcase submission and other issues that have arisen with VisIt, ParaView, Blue Waters, etc.

My general approach is to work on my workstation with ParaView as much as possible. My workstation is beefier than a single BW node, and I find ParaView (usually) has a more user-friendly interface than VisIt. Some decisions were probably suboptimal because of this, considering the longer-term view of productivity for day-to-day vis. The main drawback is that I only have about 1.5 TB of storage, so moving over the entire run of .silo files wouldn't work. Another is that my workstation can't render at 4K through the OpenGL backend. In theory I could use Mesa software rendering to get around this, but at that point I might as well go back to BW.

This is also driven by wanting to do a lot of quick iterations on different visualizations especially when choosing colormaps and camera angles.

(A note on Python. VTK, ParaView, and VisIt each have different Python APIs. I prefer to use vtkpython when I can, because I learned VTK first and it's easy to whip up a downstream-flowing vis pipeline in VTK. pvpython/pvbatch are needed when the pipeline makes use of a PV-specific feature. VisIt's Python requires rethinking the pipeline, because VisIt's philosophy is top-down, upstream flow.)

Preprocessing

Some of this is irrelevant going forward, because I think we're committed to VisIt given harmdecomp and parallelization.

To reduce file size, I extracted B, rho, and the BH mesh as separate, compressed VTK XML files.

  • serial.pbs: controls the extractions. I couldn't get a task-parallel version to work (at least, not one that would recover gracefully).

  • silo2vtk.py: pvpython script that does the actual extraction. It had to run under ParaView's Python (via pvbatch) because it relies on ParaView's VisIt Bridge plugin to read the Silo format. These were renamed to extract$var.py by the job script.

  • The extraction process creates a .vtm file and a subdirectory containing the .vt* files with the actual data. I usually just scp'd the actual data and deleted the extraneous .vtm files and directories from the command line. BHs became .vtu vtkXMLUnstructuredGrids; rho and B became .vts vtkXMLStructuredGrids.
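The per-variable extraction driven by serial.pbs can be sketched as a small command generator — a stdlib-only sketch, assuming one extract$var.py copy per variable as in the renaming above; the aprun/pvbatch command line is illustrative, not the actual contents of serial.pbs:

```python
import string

# Variables extracted from each .silo dump (from the notes: B, rho, and the BH mesh).
VARIABLES = ["B", "rho", "BHmesh"]

def extraction_commands(variables, template="aprun -n 1 pvbatch extract${var}.py"):
    """Build one pvbatch invocation per variable, mirroring the
    extract$var.py renaming the job script performs."""
    t = string.Template(template)
    return [t.substitute(var=v) for v in variables]

for cmd in extraction_commands(VARIABLES):
    print(cmd)
```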

Magnetic Field Line Entrainment

Geometry renders faster than a full pipeline, so once the pipeline is set, it's worth exporting the geometry.

  • streamline_export_3.pvsm: Saved streamline geometry as .vtp vtkXMLPolyData.

    • These needed to be generated with the full spherical B mesh.

    • The BHs were brought in, translated up in Z, and then glyphed with a polygonal sphere source.

    • These spheres were then used as a custom streamline source.

    • Then did File | Save Geometry ...

  • write_glyphs.py: Uses vtkpython, produces vtkXMLPolyData.

  • make_orbit_plane_slice.py: Uses vtkpython, produces vtkXMLPolyData.

  • Then I created a ParaView scene by loading the geometries for the BHs, streamlines, and orbit-plane slices. Once I was happy with the scene, I rendered the streamlines separately from the slice and BHs. (This skips over a lot of trial and error to find what was most flexible for Adobe After Effects.)

  • Streamlines were rendered as lines with very low opacity, going for a particular effect, something like: http://www.shutterstock.com/pic-285929834/stock-vector-rainbow-waves-colorful-gradient-light-blend-line-vector-curves.html?src=RPKsMJJPcV3D3tepaFcZ7Q-1-12 


VisIt has an option to randomly generate seeds within a radius, which would avoid the StreamTracerWithCustomSource approach; that approach can be flaky. With the camera where it is, there isn't much visual difference. I'm not sure whether VisIt can export geometry, but if it can, the rest of the workflow would be the same.
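The random-seeds-within-a-radius idea can also be reproduced by hand when feeding a custom streamline source — a pure-Python sketch of uniform sampling inside a sphere, with a hypothetical BH center and seed radius:

```python
import math
import random

def random_seeds_in_radius(center, radius, n, rng=None):
    """Generate n points uniformly distributed inside a sphere of the given
    radius around center -- the same idea as VisIt's random seed sphere.
    Standard trick: uniform direction on the unit sphere (normalized
    Gaussian samples), cube-root-distributed radial distance."""
    rng = rng or random.Random(0)
    seeds = []
    for _ in range(n):
        d = [rng.gauss(0.0, 1.0) for _ in range(3)]
        norm = math.sqrt(sum(x * x for x in d)) or 1.0
        # Cube root makes the radial density uniform in volume, not just radius.
        r = radius * rng.random() ** (1.0 / 3.0)
        seeds.append(tuple(c + r * x / norm for c, x in zip(center, d)))
    return seeds

# Example: 100 seeds around a (hypothetical) BH position.
seeds = random_seeds_in_radius((10.0, 0.0, 0.0), 2.0, 100)
```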

Gas Density

This was a total hack. ParaView had no idea how to volume render spherical meshes; it would just churn for hours (I let it run overnight once) and produce nothing. However, ParaView has a GPU ray casting option, which is very fast but requires image data (vtkXMLImageData) and that the mesh fit into VRAM.

Unfortunately, I don’t seem to have the ParaView state file for this, but essentially the process was:

  • Load the vts sequence.

  • Apply a Clip filter in box mode; I think around -200 to 200 in X and Y and -50 to 50 in Z. It's important to keep the cells intersecting the box for the next step.

  • Apply an image resampling filter (Resample To Image). I believe I resampled at 2000x2000x250.

  • Then it was either Save Geometry … or calling Write() from the Python console.

With the resampled volume, GPU ray casting was interactive. The hard part was getting the colormap and opacity curves to be interesting; that was just patience and playing with the settings until something useful came out.
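The "fits into VRAM" constraint is easy to sanity-check before resampling — a small sketch; the resolution matches the 2000x2000x250 resample above, while the scalar precision and array count are assumptions:

```python
def volume_bytes(dims, bytes_per_scalar=4, n_arrays=1):
    """Memory needed to hold a resampled image-data volume:
    one scalar array of the given precision per sample point."""
    nx, ny, nz = dims
    return nx * ny * nz * bytes_per_scalar * n_arrays

# A 2000 x 2000 x 250 float32 rho volume is 1e9 points, ~3.7 GiB --
# tight but feasible on a workstation GPU, which is why the clip box
# and resample resolution matter.
size = volume_bytes((2000, 2000, 250))
print(size / 2**30, "GiB")
```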

  • render.sh, rho_volume.py: These might have been for testing. I ran into an issue when I got to After Effects, and I may have ended up just re-rendering the frames directly from ParaView.

Magnetic Field Intensity

  • make_slice_normal.py: Uses vtkpython, outputs vtkXMLPolyData. Extracted slice geometry again; the trick was to extract the slice based on the BHs.

  • orbit_camera.py: Uses vtkpython, outputs images. Places the camera based on the normal to the slice plane at the origin. I couldn't figure out how to synchronize the camera orbit with the slices in PV, either through the GUI or with pvpython.

  • visit_renders.pbs, BHslice.py: This is a bare-bones VisIt version which works on Blue Waters.
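The camera placement in orbit_camera.py boils down to simple vector math — a stdlib sketch, assuming the slice plane passes through the origin and its normal is already known (function and parameter names here are hypothetical, and the view-up choice is illustrative):

```python
import math

def normalize(v):
    """Return v scaled to unit length (left unchanged if zero)."""
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return tuple(x / n for x in v)

def camera_on_normal(normal, distance, view_up=(0.0, 1.0, 0.0)):
    """Place the camera `distance` units out along the slice-plane normal
    (plane through the origin), looking back at the origin -- the idea
    behind placing the camera 'based on the normal to the slice plane'."""
    n = normalize(normal)
    return {
        "position": tuple(distance * x for x in n),
        "focal_point": (0.0, 0.0, 0.0),
        "view_up": view_up,
    }

cam = camera_on_normal((1.0, 1.0, 0.0), 500.0)
```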


Video Production

  • Test videos done on workstation with ffmpeg.

  • Drafts and final video done with Adobe After Effects on Win 7.

  • During the previous steps, frames were rendered as individual .tifs or .pngs at HD 1080. I would have preferred to render at 4K or 8K and then downsample; there are still jaggies in a lot of places. There are also some color balance or video output settings issues: nothing looks quite as good as the individual frames do on my workstation monitor.

  • The time scale was mostly guesswork to stretch out what we had. I think I ended up with about 12 rendered frames per second, which was then time-interpolated to 30 fps for the .mov.

  • Text annotations were added in AE; they have a Gaussian blur that might be a defect of our Windows setup (long story).

  • Un-annotated colormap bars were rendered separately for placement in AE; easier than futzing around in ParaView.

  • Added fade-in/out intertitles, which look better and lengthen the video. The video was a little short by showcase standards; most were closer to 3-5 minutes.

  • Saved as QuickTime H.264, though there are a million options in AE.
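The 12 fps to 30 fps time stretch amounts to a fractional frame-index mapping — a stdlib sketch; AE's frame interpolation does the actual blending, this just shows which rendered frames each output frame falls between:

```python
def source_frame_positions(n_source, fps_in=12, fps_out=30):
    """For each output frame at fps_out, the (fractional) source-frame
    position at fps_in. An integer position is an exact rendered frame;
    a fractional one is blended from its two neighbors."""
    n_out = int(n_source * fps_out / fps_in)
    return [i * fps_in / fps_out for i in range(n_out)]

# One second of rendered frames (12) becomes 30 output frames:
pos = source_frame_positions(12)
```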

Miscellaneous

PV will not correctly mix polygonal geometry inside of a volume. VisIt will, but only if the geometry is opaque. Because of this, plus the parallelization issues, further production is moving to VisIt.

