Nonlinear Dynamic Field Embedding: On Hyperspectral Scene Visualization
Abstract

In many areas of research, complex signals are commonly represented by high-dimensional feature vectors. However, high-dimensional vectors are difficult to analyze and interpret due to the curse of dimensionality. Ongoing research efforts simplify this problem by seeking data representations in a low-dimensional space. Traditional dimensionality reduction methods devote little effort to formulating unifying platforms, and very few approaches provide the intuitive insights needed to design local-topology-preserving algorithms. This study introduces a new localized bilateral kernel function for computing the high-dimensional neighborhood graph of image data. The kernel function injects spatial sensitivity that allows disjoint neighborhoods to be maintained in the embedding space. Furthermore, the study exploits the force field interpretation from mechanics and devises a unifying nonlinear graph embedding framework. The generalized framework leads to novel unsupervised multidimensional artificial field embedding techniques that rely on the simple additive assumption of pair-dependent attraction and repulsion functions. The formulations capture long-range and short-range distance-related effects often associated with living organisms and help to establish algorithmic properties that mimic mutual behavior for the purpose of dimensionality reduction. The main benefits of the proposed models include the ability to preserve the local topology of data and to produce quality visualizations, i.e., visualizations that maintain meaningful disjoint neighborhoods. As part of the evaluation, visualization, gradient field trajectory, and semisupervised classification experiments are conducted on image scenes acquired by multiple sensors at various spatial resolutions over different types of objects. The results demonstrate the superiority of the proposed embedding framework over various widely used methods.
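To make the localized bilateral kernel concrete, the sketch below shows one common way such an affinity can be formed: a product of a spectral (feature-space) Gaussian and a spatial (pixel-coordinate) Gaussian, so that affinity decays when pixels are far apart in either domain. The function name, the two bandwidth parameters `sigma_f` and `sigma_s`, and the toy data are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def bilateral_kernel(features, coords, sigma_f=1.0, sigma_s=2.0):
    """Illustrative bilateral affinity: spectral similarity gated by spatial proximity.

    features : (n, d) array of per-pixel spectral vectors
    coords   : (n, 2) array of pixel (row, col) positions
    Returns an (n, n) affinity matrix.
    """
    # Pairwise squared distances in spectral (feature) space.
    f2 = np.sum(features**2, axis=1)
    d_feat = f2[:, None] + f2[None, :] - 2.0 * features @ features.T
    # Pairwise squared distances in spatial (image-coordinate) space.
    c2 = np.sum(coords**2, axis=1)
    d_spat = c2[:, None] + c2[None, :] - 2.0 * coords @ coords.T
    # Product of two Gaussians: affinity is high only when pixels are both
    # spectrally similar AND spatially close, which helps keep spatially
    # separated regions in disjoint neighborhoods of the graph.
    return np.exp(-d_feat / (2 * sigma_f**2)) * np.exp(-d_spat / (2 * sigma_s**2))

# Tiny example: four pixels with 3-band spectra; two spectrally similar
# pairs, one pair near the image origin and one pair far away.
X = np.array([[1.0, 0.0, 0.0],
              [0.9, 0.1, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.9, 0.1]])
P = np.array([[0, 0], [0, 1], [5, 5], [5, 6]], dtype=float)
W = bilateral_kernel(X, P)
```

In a full pipeline, `W` would serve as the weighted neighborhood graph over which the attraction and repulsion forces of the embedding framework are defined; the spatial factor is what distinguishes this kernel from a purely spectral Gaussian affinity.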