
I plan to design and implement a new density processor. The differences from the existing processors are:

- the observable area is generalized to polygon shapes
- the density is measured at multiple fixed points inside the polygon. Current processors either i) average the density over a specified area (i.e. a scalar) or ii) compute the density for every pedestrian at every time step. Because of point 1), I want to use an unstructured grid; the preferred triangulation algorithm is the one already developed and implemented by @BZoennchen (which has to be merged first)

Something like this, where each vertex in the triangulation is a point to evaluate the density:
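To make the idea concrete, here is a minimal sketch in Python (the actual processor would live in Vadere's Java code base) of evaluating a Gaussian density at fixed evaluation points, e.g. the triangulation vertices. The parameters `sigma` and `cutoff` are assumptions for illustration, not the final processor attributes:

```python
import numpy as np

def gaussian_density(eval_points, ped_positions, sigma=0.7, cutoff=3.0):
    """Gaussian kernel density at fixed evaluation points.

    eval_points:   (M, 2) array, e.g. vertices of the triangulation
    ped_positions: (N, 2) array, pedestrian positions at one time step
    cutoff:        pedestrians farther than cutoff * sigma are ignored
    """
    densities = np.zeros(len(eval_points))
    norm = 1.0 / (2.0 * np.pi * sigma ** 2)
    for i, p in enumerate(eval_points):
        d2 = np.sum((ped_positions - p) ** 2, axis=1)
        mask = d2 <= (cutoff * sigma) ** 2  # apply the cut-off radius
        densities[i] = norm * np.exp(-d2[mask] / (2.0 * sigma ** 2)).sum()
    return densities
```

One row of the output table below would then be one entry of this vector, keyed by time step and point ID.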

The output would be something like:

timeStep | point ID | density
---|---|---
1 | 1 | 0.5
1 | 2 | 0.6
1 | 3 | 0.1
2 | 1 | 0.9
2 | 2 | 0.6
2 | 3 | 0.8
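For later analysis this long format can be reshaped into a time step x point matrix; a small sketch (the example rows match the table above):

```python
import numpy as np

# rows of the processor output: (timeStep, pointID, density)
rows = [(1, 1, 0.5), (1, 2, 0.6), (1, 3, 0.1),
        (2, 1, 0.9), (2, 2, 0.6), (2, 3, 0.8)]

steps = sorted({r[0] for r in rows})
points = sorted({r[1] for r in rows})

# D[i, j] = density at time step steps[i] and evaluation point points[j]
D = np.zeros((len(steps), len(points)))
for t, p, d in rows:
    D[steps.index(t), points.index(p)] = d
```

`D` is then the matrix on which dimensionality reduction (see below) can operate.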

- a true/false flag to consider agents inside or outside the defined polygon
- a setting for the density algorithm (reuse the available ones)
- options forwarded to the triangulation algorithm (to control the accuracy of the density sampling)
- options forwarded to the density algorithm (e.g. the cut-off for the Gaussian density)
- only compute the density every `X` time steps (as this can be quite an expensive operation)
- allow averaging the density to one point (maybe this would be another processor)
- to reconstruct the density over the polygon, write a description of either the points or the entire triangulation, either into the data file or into a separate file
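A possible shape for such a point description file, sketched in Python; the file name, delimiter, and column layout are assumptions, not a final format. The point IDs are meant to match the `point ID` column of the density output:

```python
import csv

def write_point_map(path, vertices):
    """Write the evaluation points (triangulation vertices) to a file.

    vertices: list of (x, y) tuples; IDs are assigned in order, starting at 1,
    matching the point IDs used in the density output table.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f, delimiter=" ")
        writer.writerow(["pointID", "x", "y"])
        for pid, (x, y) in enumerate(vertices, start=1):
            writer.writerow([pid, x, y])
```

Storing the full triangulation (vertices plus simplices) instead of only the points would additionally allow interpolating the density anywhere inside the polygon.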

The goal is to obtain a density evolution over time at fixed points in the scenario. I plan to investigate how many dimensions are needed to represent the density (using PCA) and to use the output as a quantity of interest (QoI).
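The PCA step can be sketched with plain NumPy via an SVD of the centered density matrix (rows = time steps, columns = evaluation points); the 95% variance threshold is an assumed example value:

```python
import numpy as np

def density_pca_dims(D, var_threshold=0.95):
    """Number of principal components needed to explain var_threshold
    of the variance of the density matrix D (timeSteps x points)."""
    Dc = D - D.mean(axis=0)                     # center each point's time series
    s = np.linalg.svd(Dc, compute_uv=False)     # singular values = PCA spectrum
    explained = s ** 2 / np.sum(s ** 2)         # variance ratio per component
    return int(np.searchsorted(np.cumsum(explained), var_threshold) + 1)
```

If only a few components are needed, the low-dimensional representation could serve directly as the QoI.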
