This paper seeks to advance the consistent classification of plume-generated fire patterns by addressing two central research questions: what representations capture the qualitative semantics of fire patterns to support consistent classification, and what image analytics can accurately identify correlated spatial fire patterns for precise location identification.
Fire patterns, consisting of fire effects that offer insights into fire behavior and origin, are currently classified based on investigators' visual observations, which leads to subjective interpretations. This study proposes a quantitative fire pattern classification framework to support fire investigators, aiming for consistency and accuracy. The framework integrates four components. First, it leverages human-computer interaction to extract fire patterns from surfaces, combining investigator expertise with computational analysis. Second, it employs an aspect ratio-based random forest model to classify fire pattern shapes. Third, fire scene point cloud segmentation enables identification of fire-affected areas and mapping of 2D fire patterns onto 3D scenes for spatial relationship analysis. Lastly, the spatial relationships between fire patterns and scene elements support interpretation of the fire scene. Together, these components provide pattern analysis that synthesizes qualitative and quantitative data. The framework's fire pattern shape classification achieves 93 percent precision on synthetic data and 83 percent on real fire patterns.
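To make the shape-classification component more concrete, the sketch below illustrates how a random forest could be trained on aspect ratio-style geometric features of extracted patterns. This is a minimal illustration, not the authors' implementation: the feature set, class labels, and synthetic data are assumptions introduced here for demonstration only.

```python
# Minimal sketch, assuming aspect-ratio-style features per extracted pattern.
# Feature names, class labels, and synthetic data below are illustrative
# placeholders, not taken from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score

rng = np.random.default_rng(0)

# Hypothetical features: width/height aspect ratio and bounding-box fill ratio.
# Placeholder labels 0, 1, 2 stand in for pattern shape classes.
n = 300
X = np.column_stack([
    rng.uniform(0.2, 3.0, n),   # aspect ratio of the extracted pattern
    rng.uniform(0.1, 1.0, n),   # pattern area / bounding-box area
])
y = rng.integers(0, 3, n)       # random placeholder shape labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Train the random forest shape classifier.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Report macro-averaged precision, the metric quoted in the abstract.
pred = clf.predict(X_test)
print("macro precision:",
      precision_score(y_test, pred, average="macro", zero_division=0))
```

In practice, the aspect-ratio features would be computed from the patterns extracted in the human-computer interaction step rather than sampled at random as above.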