Browsing by Subject "Dimension reduction"
Now showing 1 - 2 of 2
Item
Active learning with generalized sliced inverse regression for high-dimensional reliability analysis
(Elsevier, 2022-01) Yin, Jianhua; Du, Xiaoping; Mechanical and Energy Engineering, School of Engineering and Technology

Predicting reliability with physical models at the design stage is computationally expensive when many random input variables exist. This work introduces a dimension reduction technique based on generalized sliced inverse regression (GSIR) to mitigate the curse of dimensionality. The proposed high-dimensional reliability method uses active learning to integrate GSIR, Gaussian process (GP) modeling, and importance sampling (IS), resulting in accurate reliability prediction at reduced computational cost. The method consists of three core steps: 1) identification of the importance sampling region, 2) dimension reduction by GSIR to produce a sufficient predictor, and 3) construction of a GP model for the true response with respect to the sufficient predictor in the reduced-dimension space. High accuracy and efficiency are achieved by iterating these three steps with active learning, adding new training points one by one in the region with a high chance of failure.

Item
Principal component analysis of hybrid functional and vector data
(Wiley, 2021) Jang, Jeong Hoon; Biostatistics and Health Data Science, School of Medicine

We propose a practical principal component analysis (PCA) framework that provides a nonparametric means of simultaneously reducing the dimensions of, and modeling, functional and vector (multivariate) data. We first introduce a Hilbert space that combines functional and vector objects into a single hybrid object. The framework, termed PCA of hybrid functional and vector data (HFV-PCA), is then based on the eigen-decomposition of a covariance operator that captures the simultaneous variation of functional and vector data in the new space.
This approach leads to interpretable principal components that have the same structure as each observation, and to a single set of scores that serves well as a low-dimensional proxy for hybrid functional and vector data. To support practical application of HFV-PCA, the explicit relationship between the hybrid PC decomposition and the functional and vector PC decompositions is established, leading to a simple and robust estimation scheme in which the components of HFV-PCA are calculated from components estimated with existing functional and classical PCA methods. This estimation strategy allows flexible incorporation of sparse and irregular functional data as well as multivariate functional data. We derive consistency results and asymptotic convergence rates for the proposed estimators, and demonstrate the efficacy of the method through simulations and an analysis of renal imaging data.
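The three-step loop described in the first abstract (identify an importance-sampling region, reduce dimension to a sufficient predictor, fit a GP in the reduced space, then add the most ambiguous point) can be sketched roughly as follows. Everything here is an illustrative assumption rather than the paper's implementation: plain (non-generalized) SIR stands in for GSIR, the GP is a minimal RBF-kernel regressor, the limit-state function `g` and all sample sizes are invented, and the candidate pool is a plain Monte Carlo population standing in for an importance-sampling population.

```python
import numpy as np

rng = np.random.default_rng(0)

def sir_direction(X, y, n_slices=5):
    # Plain sliced inverse regression (a stand-in for the paper's GSIR):
    # eigen-decompose the between-slice covariance of the slice means of X,
    # whitened by the sample covariance, to get one sufficient-predictor
    # direction.
    Xc = X - X.mean(axis=0)
    order = np.argsort(y)
    M = np.zeros((X.shape[1], X.shape[1]))
    for s in np.array_split(order, n_slices):
        m = Xc[s].mean(axis=0)
        M += len(s) / len(y) * np.outer(m, m)
    Sigma = np.cov(Xc, rowvar=False) + 1e-8 * np.eye(X.shape[1])
    w, V = np.linalg.eig(np.linalg.solve(Sigma, M))
    return np.real(V[:, np.argmax(w.real)])  # leading direction (unit norm)

def gp_fit_predict(t_train, y_train, t_test, ell=0.5, jitter=1e-4):
    # Minimal 1-D Gaussian-process regression with an RBF kernel.
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)
    L = np.linalg.cholesky(k(t_train, t_train) + jitter * np.eye(len(t_train)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    Ks = k(t_test, t_train)
    v = np.linalg.solve(L, Ks.T)
    var = np.clip(1.0 + jitter - np.sum(v ** 2, axis=0), 1e-12, None)
    return Ks @ alpha, var

def g(X):  # hypothetical limit state: failure when g(x) < 0
    return 2.0 - X @ np.array([1.0, 0.5, 0.2, 0.0, 0.0])

d = 5
X_pool = rng.standard_normal((20000, d))   # stand-in for an IS population (step 1)
X_train = rng.standard_normal((40, d))     # initial design
y_train = g(X_train)

for _ in range(10):                             # active-learning loop
    beta = sir_direction(X_train, y_train)      # step 2: sufficient predictor
    mean, var = gp_fit_predict(X_train @ beta, y_train, X_pool @ beta)  # step 3
    i = np.argmin(np.abs(mean) / np.sqrt(var))  # U-criterion: point most likely
    X_train = np.vstack([X_train, X_pool[i]])   # misclassified near g = 0
    y_train = np.append(y_train, g(X_pool[i]))

beta = sir_direction(X_train, y_train)
mean, _ = gp_fit_predict(X_train @ beta, y_train, X_pool @ beta)
pf_hat = np.mean(mean < 0)       # failure probability from the 1-D surrogate
pf_ref = np.mean(g(X_pool) < 0)  # brute-force reference on the same pool
```

The surrogate is only ever evaluated on the one-dimensional sufficient predictor `X @ beta`, which is the point of the dimension reduction: the GP sees a 1-D regression problem regardless of `d`.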
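The hybrid construction in the second abstract, embedding each (curve, vector) pair as one object and eigen-decomposing the combined covariance operator, can be sketched on discretized data. The grid, the uniform quadrature weights, and the simulated two-factor data below are illustrative assumptions; the paper's actual estimation scheme builds the HFV-PCA components from separately estimated functional and classical PCA components rather than from this direct concatenation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hybrid sample: each of n observations pairs a curve on an
# m-point grid with a p-dimensional vector, driven by 2 latent factors.
n, m, p = 200, 50, 3
t = np.linspace(0.0, 1.0, m)
scores = rng.standard_normal((n, 2))
curves = (scores[:, :1] * np.sin(2 * np.pi * t)
          + scores[:, 1:2] * np.cos(2 * np.pi * t)
          + 0.05 * rng.standard_normal((n, m)))
vectors = scores @ rng.standard_normal((2, p)) + 0.05 * rng.standard_normal((n, p))

# Hybrid inner product <(f,u),(g,v)> = \int f g dt + u.v.  Discretizing the
# integral with quadrature weights w and scaling the curve block by sqrt(w)
# makes ordinary PCA on the concatenation perform the eigen-decomposition
# of the hybrid covariance operator.
w = np.full(m, 1.0 / m)                  # uniform weights on [0, 1]
Z = np.hstack([curves * np.sqrt(w), vectors])
Zc = Z - Z.mean(axis=0)
U, S, Vt = np.linalg.svd(Zc, full_matrices=False)

evals = S ** 2 / n                       # eigenvalues of the covariance operator
pc_scores = Zc @ Vt.T                    # one set of scores per observation

# Each hybrid PC has the same structure as an observation: undo the
# quadrature scaling to read off its functional and vector parts.
phi_fun = Vt[0, :m] / np.sqrt(w)         # functional part of the first hybrid PC
phi_vec = Vt[0, m:]                      # vector part of the first hybrid PC
```

With two latent factors, the first two eigenvalues dominate and the single score matrix `pc_scores` acts as the low-dimensional proxy for both data types at once; each component splits back into a curve and a vector with unit hybrid norm.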