Results 241 to 250 of about 398,426 (282)
Some of the following articles may not be open access.

Partial least squares regression for graph mining

Proceedings of the 14th ACM SIGKDD international conference on Knowledge discovery and data mining, 2008
Attributed graphs are increasingly common in many application domains such as chemistry, biology and text processing. A central issue in graph mining is how to collect informative subgraph patterns for a given learning task. We propose an iterative mining method based on partial least squares regression (PLS).
Saigo H., Krämer N., Tsuda K.

PoLiSh — smoothed partial least-squares regression

Analytica Chimica Acta, 2001
Partial least-squares (PLS) regression is a very widely used technique in spectroscopy for calibration/prediction purposes. One of the most important steps in the application of the PLS regression is the determination of the correct number of dimensions to use in order to avoid over-fitting, and therefore to obtain a robust predictive model.
Douglas N. Rutledge   +2 more

Orthogonal Nonlinear Partial Least-Squares Regression

Industrial & Engineering Chemistry Research, 2003
A multivariate statistical regression technique is proposed to address underlying nonlinear correlations among the predictor variables, as well as between the predictor variables and the response variable. The method is based on a neural network architecture that preserves the orthogonality properties of the principal component analysis (PCA) approach.
Fuat Doymaz   +2 more

Partial least squares regression with multiple domains

Journal of Chemometrics, 2023
Abstract This paper introduces the multiple domain‐invariant partial least squares (mdi‐PLS) method, which generalizes the recently introduced domain‐invariant partial least squares method (di‐PLS). In contrast to di‐PLS which solely allows transferring of knowledge from a single source to a single target domain, the proposed approach
Bianca Mikulasek   +4 more

The objective function of partial least squares regression

Journal of Chemometrics, 1998
A simple objective function in terms of undeflated X is derived for the latent variables of multivariate PLS regression. The objective function fits into the basic framework put forward by Burnham et al. (J. Chemometrics, 10, 31–45 (1996)). We show that PLS and SIMPLS differ in the constraint put on the length of the X-weight vector.
ter Braak, C.J.F., de Jong, S.

Extreme partial least-squares regression

2021
A new approach, called Extreme-PLS, is proposed for dimension reduction in regression and adapted to distribution tails. The goal is to find linear combinations of predictors that best explain the extreme values of the response variable by maximizing the associated covariance.
Bousebata, Meryem   +2 more

Sparse Kernel Partial Least Squares Regression

2003
Partial Least Squares Regression (PLS) and its kernel version (KPLS) have become competitive regression approaches. KPLS performs as well as or better than support vector regression (SVR) for moderately-sized problems with the advantages of simple implementation, less training cost, and easier tuning of parameters.
Michinari Momma, Kristin P. Bennett

Additive Splines for Partial Least Squares Regression

Journal of the American Statistical Association, 1997
Abstract This article introduces a generalization of the partial least squares regression (PLS). Transforming the predictors by means of spline functions is a useful way to extend PLS into nonlinearity and to obtain a multiresponse additive model. We describe both statistical and computational aspects of this new method, termed additive splines partial
Jean-François Durand, Robert Sabatier

Partial Least Squares Regression for Beta Regression Models

2021
International ...
Bertrand, Frédéric   +1 more

A twist to partial least squares regression

Journal of Chemometrics, 2005
AbstractA modification of the PLS1 algorithm is presented. Stepwise optimization over a set of candidate loading weights obtained by taking powers of the y–X correlations and X standard deviations generalizes the classical PLS1 based on y–X covariances and hence adds flexibility to the modelling.
