An end-to-end KNN-based PTV approach for high-resolution measurements and uncertainty quantification (2205.02766v3)

Published 5 May 2022 in physics.flu-dyn

Abstract: We introduce a novel end-to-end approach to improving the resolution of particle image velocimetry (PIV) measurements. The method blends information from different snapshots, without the need for time-resolved measurements, by exploiting the similarity of flow regions across snapshots. The main hypothesis is that, with a sufficiently large ensemble of statistically-independent snapshots, it is feasible to identify flow structures that are morphologically similar but occur at different time instants. Measured individual vectors from different snapshots with similar flow organisation can thus be merged, resulting in an artificially increased particle concentration. This makes it possible to refine the interrogation region and, consequently, to increase the spatial resolution. The measurement domain is split into subdomains, and the similarity is enforced only on a local scale, i.e. morphologically-similar regions are sought only among subdomains corresponding to the same flow region. The identification of locally-similar snapshots is based on an unsupervised K-nearest-neighbours search in a space of significant flow features. These features are defined in terms of a Proper Orthogonal Decomposition performed in subdomains on the original low-resolution data, obtained either with standard cross-correlation or by binning Particle Tracking Velocimetry (PTV) data with a relatively large bin size. A refined bin size is then selected according to the number of "sufficiently close" snapshots identified. The statistical dispersion of the velocity vectors within the bin is used to estimate the uncertainty and to select the optimal K which minimises it. The method is tested and validated against datasets of progressively increasing complexity: two virtual experiments based on direct simulations of the wake of a fluidic pinball and of a channel flow, and experimental data collected in a turbulent boundary layer.
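
The abstract describes a concrete pipeline: extract POD coefficients per subdomain as flow features, find the K most similar snapshots with an unsupervised nearest-neighbour search, merge their vectors to emulate a higher particle concentration, and use the dispersion within the refined bin both as an uncertainty estimate and as the criterion for choosing K. The Python sketch below illustrates this logic on synthetic data; the array shapes, parameter values, and the scikit-learn-based implementation are assumptions for illustration, not the authors' code, and a real application would merge individual PTV vectors inside refined bins rather than the coarse-grid vectors used here.

import numpy as np
from sklearn.decomposition import PCA            # POD of uniformly sampled snapshots ~ PCA
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Synthetic low-resolution ensemble for a single subdomain: n_snapshots fields,
# each with n_points coarse velocity vectors (illustrative stand-in for real data).
n_snapshots, n_points = 500, 64
U_low = rng.normal(size=(n_snapshots, n_points))

# Step 1 -- flow features: leading POD coefficients of the low-resolution subdomain data.
n_modes = 5
features = PCA(n_components=n_modes).fit_transform(U_low)   # shape (n_snapshots, n_modes)

def merged_statistics(K, target=0):
    """Merge the K snapshots most similar to `target` in feature space and return
    the ensemble-mean velocity and its dispersion (used here as the uncertainty proxy)."""
    nn = NearestNeighbors(n_neighbors=K).fit(features)       # Step 2 -- unsupervised KNN search
    _, idx = nn.kneighbors(features[target:target + 1])
    merged = U_low[idx[0]]            # Step 3 -- merged vectors: artificially increased seeding
    return merged.mean(axis=0), merged.std(axis=0).mean()

# Step 4 -- choose the K that minimises the estimated uncertainty.
candidate_K = [5, 10, 20, 50, 100]
uncertainty = {K: merged_statistics(K)[1] for K in candidate_K}
best_K = min(uncertainty, key=uncertainty.get)
print(f"selected K = {best_K}, dispersion = {uncertainty[best_K]:.3f}")

In the paper's formulation this selection is local: it would be repeated for each subdomain, with the refined bin size adapted to the number of "sufficiently close" snapshots actually found.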
