Results 241 to 250 of about 397,347
Some of the following articles may not be open access.
Domain adaptive partial least squares regression
Chemometrics and Intelligent Laboratory Systems, 2020
Abstract: In practical applications, training and test samples often come from different distributions, for example when instruments or external environmental factors change between measurements. A multivariate calibration model built on the training set therefore needs to be adaptive to meet the requirements of the test ...
Guangzao Huang +5 more
openaire +1 more source
Spectral Partial Least Squares Regression
IEEE 10th International Conference on Signal Processing Proceedings, 2010
Linear Graph Embedding (LGE) is the linearization of graph embedding and has been applied successfully in many domains. However, its high computational cost prevents these algorithms from being applied to large-scale, high-dimensional data sets. One major limitation of such algorithms is that the generalized eigenvalue problem is computationally expensive ...
Jiangfeng Chen, Baozong Yuan
openaire +1 more source
Microwave characterization using partial least square regression
2016 IEEE Conference on Electromagnetic Field Computation (CEFC), 2016
Inverse problems for determining the dielectric properties of materials (complex permittivity) are usually solved by iterative methods using a numerically based forward model. These methods are computationally expensive. In this paper, we propose a fast inversion model based on partial least squares regression (PLSR).
Sadou, Hakim +4 more
openaire +2 more sources
Partial least squares regression for graph mining
Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2008
Attributed graphs are increasingly common in application domains such as chemistry, biology and text processing. A central issue in graph mining is how to collect informative subgraph patterns for a given learning task. We propose an iterative mining method based on partial least squares regression (PLS).
Saigo H., Kramer N., Tsuda K.
openaire +2 more sources
PoLiSh — smoothed partial least-squares regression
Analytica Chimica Acta, 2001
Partial least-squares (PLS) regression is a very widely used technique in spectroscopy for calibration/prediction purposes. One of the most important steps in applying PLS regression is determining the correct number of dimensions to use in order to avoid over-fitting, and therefore to obtain a robust predictive model.
Douglas N. Rutledge +2 more
openaire +1 more source
Orthogonal Nonlinear Partial Least-Squares Regression
Industrial & Engineering Chemistry Research, 2003
A multivariate statistical regression technique is proposed to address underlying nonlinear correlations among the predictor variables, as well as between the predictor variables and the response variable. The method is based on a neural network architecture that preserves the orthogonality properties of the principal component analysis (PCA) approach.
Fuat Doymaz +2 more
openaire +1 more source
The objective function of partial least squares regression
Journal of Chemometrics, 1998
A simple objective function in terms of undeflated X is derived for the latent variables of multivariate PLS regression. The objective function fits into the basic framework put forward by Burnham et al. (J. Chemometrics, 10, 31–45 (1996)). We show that PLS and SIMPLS differ in the constraint put on the length of the X-weight vector.
ter Braak, C.J.F., de Jong, S.
openaire +2 more sources
Extreme partial least-squares regression
2021
A new approach, called Extreme-PLS, is proposed for dimension reduction in regression and adapted to distribution tails. The goal is to find linear combinations of predictors that best explain the extreme values of the response variable by maximizing the associated covariance.
Bousebata, Meryem +2 more
openaire +2 more sources
Sparse Kernel Partial Least Squares Regression
2003
Partial Least Squares Regression (PLS) and its kernel version (KPLS) have become competitive regression approaches. KPLS performs as well as or better than support vector regression (SVR) for moderately-sized problems with the advantages of simple implementation, less training cost, and easier tuning of parameters.
Michinari Momma, Kristin P. Bennett
openaire +1 more source
Additive Splines for Partial Least Squares Regression
Journal of the American Statistical Association, 1997
Abstract: This article introduces a generalization of partial least squares regression (PLS). Transforming the predictors by means of spline functions is a useful way to extend PLS into nonlinearity and to obtain a multiresponse additive model. We describe both statistical and computational aspects of this new method, termed additive splines partial ...
Jean-François Durand, Robert Sabatier
openaire +1 more source

