
Commit 7188d612 authored by Sven K

Adding some details about the Carpet ASCII plotter.

parent 0f1846da
@@ -598,6 +598,29 @@ The following combination of the peano block regular files is supported:
\end{description}
\subsubsection{The Carpet output file format}
The following combinations for the Carpet output file format are supported:
%
\begin{equation*}
\texttt{Carpet}
~\times~
\texttt{Cartesian}
~\times~
\texttt{vertices}
~\times~
\left\{
\begin{matrix}
\texttt{ascii} \\
\texttt{hdf5}
\end{matrix}
\right\}
\end{equation*}
%
That is, by design, the Carpet file format always stores vertex data
sampled on a Cartesian grid. The Carpet plotter can be used in ASCII mode
(creating CSV tables) and in HDF5 mode (creating compact H5 files). Furthermore,
the plotter can be used in a Finite Volume, Discontinuous Galerkin or
Hybrid (Limiting) solver application. The two options in detail are:
\begin{description}
\item [\texttt{Carpet::Cartesian::Vertices::HDF5}]
The CarpetHDF5 block regular file format is the file format
@@ -619,15 +642,43 @@ The following combination of the peano block regular files is supported:
In ExaHyPE, we currently support plotting the CarpetHDF5 file format directly
from an ADERDG scheme by interpolating onto a Cartesian subgrid. We also support
dumping data in the CarpetHDF5 file format from the Finite Volume scheme; however,
this results in a loss of quality in the output files, as the interpolation from
cell to vertex data is not exact, in particular given the limited amount of ghost
cell information available. The output will look blurred
on coarse grids.
\\
For details about the CarpetHDF5 files as well as an overview of the many
postprocessing tools, see section \ref{section:carpethdf5-file-format} in the
appendix. In order to use the CarpetHDF5 file format, you have to build
\exahype\ with HDF5 support.
%
\item [\texttt{Carpet::Cartesian::Vertices::ASCII}]
This flavour of the Carpet file format does \emph{not} store exactly the
same data structure in just another container (as the Peano format does) but
instead writes plain, and potentially very lengthy, CSV tables. They can be read
directly by Gnuplot or similar tools. However, in practice they are only useful
for lower-dimensional data because the files quickly grow large. A sketch of how
either flavour is requested in the specification file is given after this list.
\end{description}
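%
For illustration, either flavour is requested with a \texttt{plot} block inside
the solver section of the specification file, just like the other \exahype\
plotters. The sketch below assumes the usual \texttt{plot ... end plot} syntax;
the writer names, the number of variables, the output paths and the timing values
are placeholders and have to be adapted to the application at hand.
\begin{verbatim}
  plot Carpet::Cartesian::Vertices::HDF5 ConservedWriterH5
    variables const = 5
    time            = 0.0
    repeat          = 0.01
    output          = ./conserved
  end plot

  plot Carpet::Cartesian::Vertices::ASCII ConservedWriterCSV
    variables const = 5
    time            = 0.0
    repeat          = 0.01
    output          = ./conserved-csv
  end plot
\end{verbatim}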
The Carpet writer offers a few distinct features (cf.\ the specification-file
sketch after this list):
\begin{itemize}
\item Genuine dimension-reducing slicing of the data (i.e.\ writing a 1D cut
or a 2D plane from 3D data). As the file format has so far been used only in
highly symmetric astrophysical simulations, we concentrated on simple
Cartesian slicing (i.e.\ planes and lines aligned with the Cartesian
coordinate system).
\item The option to write all quantities of the output vector to one file or to
multiple files, as well as to open one file per timestep or to collect
several timesteps in a single file.
\item So far, every MPI rank writes its own file, as is the case for all
other ExaHyPE plotters.
\end{itemize}
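%
For example, a 1D cut along the $x$ axis of a 3D run could be requested by fixing
the remaining coordinates through the plotter's selection parameter. The following
is a rough sketch only: the \texttt{select} keyword and its slicing syntax are
assumptions modelled on the other \exahype\ plotters and should be checked against
the Carpet plotter sources; the writer name, path and values are again placeholders.
\begin{verbatim}
  plot Carpet::Cartesian::Vertices::ASCII LineWriter
    variables const = 5
    time            = 0.0
    repeat          = 0.01
    output          = ./line
    select          = y:0.5,z:0.5
  end plot
\end{verbatim}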
\todo{Add some screenshots of what Carpet output looks like}
\subsubsection{The Flash output file format}
\begin{description}
\item [\texttt{Flash::hdf5}] The experimental FlashHDF5 block regular
file format tries to
resemble the file format used by the FLASH code. At the current stage, a
......