Browsing by Author "Ahrens, James"
Now showing 1 - 2 of 2
Item: Approaches for In Situ Computation of Moments in a Data-Parallel Environment (The Eurographics Association, 2020)
Authors: Tsai, Karen C.; Bujack, Roxana; Geveci, Berk; Ayachit, Utkarsh; Ahrens, James
Editors: Frey, Steffen; Huang, Jian; Sadlo, Filip
Feature-driven in situ data reduction can overcome the I/O bottleneck that large simulations face on modern supercomputer architectures in a semantically meaningful way. In this work, we use pattern detection as a black-box detector of arbitrary feature templates of interest. In particular, we use moment invariants because they allow pattern detection independent of the specific orientation of a feature. We provide two open-source implementations of a rotation-invariant pattern detection algorithm for high performance computing (HPC) clusters with a distributed memory environment. The first is a straightforward integration approach. The second makes use of the Fourier transform and the Cross-Correlation Theorem. In this paper, we compare the two approaches with respect to performance and flexibility and showcase results of the in situ integration with real-world simulation code.

Item: Selection of Optimal Salient Time Steps by Non-negative Tucker Tensor Decomposition (The Eurographics Association, 2021)
Authors: Pulido, Jesus; Patchett, John; Bhattarai, Manish; Alexandrov, Boian; Ahrens, James
Editors: Agus, Marco; Garth, Christoph; Kerren, Andreas
Choosing salient time steps from spatio-temporal data is useful for summarizing the sequence and for developing visualizations for animations before committing time and resources to producing them for an entire time series. Animations can be developed more quickly with visualization choices that work best for a small set of the important salient time steps. Here we introduce a new unsupervised learning method for finding such salient time steps. The volumetric data is represented by a 4-dimensional non-negative tensor, X(t; x; y; z). The presence of latent (not directly observable) structure in this tensor allows a unique representation and compression of the data. To extract the latent time-features we use non-negative Tucker tensor decomposition. We then map these time-features to their maximal values to identify the salient time steps. We demonstrate that this choice of time steps gives a good representation of the time series as a whole.
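As a rough illustration of the Cross-Correlation Theorem that the second implementation in the first item builds on, the following single-node NumPy sketch performs FFT-based template matching on a small 2D array. It is not the authors' distributed-memory code and it leaves out the moment-invariant machinery that provides rotation invariance; the function name, array sizes, and template placement are placeholder choices.

```python
import numpy as np

def fft_cross_correlation(field, template):
    """Circular cross-correlation via the Cross-Correlation Theorem:
    corr = IFFT( FFT(field) * conj(FFT(template)) ), template zero-padded to the field's shape."""
    padded = np.zeros_like(field)
    padded[: template.shape[0], : template.shape[1]] = template
    spectrum = np.fft.fftn(field) * np.conj(np.fft.fftn(padded))
    return np.real(np.fft.ifftn(spectrum))

# Toy usage: a 2D scalar field that contains an exact copy of the template.
rng = np.random.default_rng(0)
field = rng.random((128, 128))
template = field[40:48, 60:68].copy()
# Subtracting the means makes the match peak stand out from the background.
corr = fft_cross_correlation(field - field.mean(), template - template.mean())
print(np.unravel_index(np.argmax(corr), corr.shape))  # expected: (40, 60)
```

The appeal of the transform route is the usual one: evaluating the correlation sum directly at every position costs on the order of n·m operations for n field samples and an m-sample template, whereas the FFT formulation costs on the order of n log n.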
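The pipeline described in the second item (represent the series as a non-negative 4D tensor X(t; x; y; z), factor it with a non-negative Tucker decomposition, then take the time step at which each latent time-feature peaks) can be sketched with the TensorLy library. This is an illustrative stand-in rather than the authors' implementation; the synthetic data, the per-mode ranks, and the iteration count are arbitrary placeholders.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import non_negative_tucker

# Synthetic stand-in for a simulation series: 20 time steps of a 32^3 volume,
# i.e. a non-negative 4D tensor X(t, x, y, z).
rng = np.random.default_rng(0)
data = tl.tensor(rng.random((20, 32, 32, 32)))

# Non-negative Tucker decomposition; the per-mode ranks are free parameters.
core, factors = non_negative_tucker(data, rank=[4, 8, 8, 8], n_iter_max=200)

# factors[0] has shape (20, 4): one non-negative time-feature per column.
time_features = tl.to_numpy(factors[0])

# Take the time step at which each time-feature attains its maximum.
salient = sorted({int(np.argmax(time_features[:, r])) for r in range(time_features.shape[1])})
print("candidate salient time steps:", salient)
```

On real simulation output, the ranks would be tuned to balance compression against reconstruction error, and several time-features may select the same time step, hence the deduplication in the last line.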