Results 51 to 60 of about 2,762
Selection of tuning parameters in bridge regression models via Bayesian information criterion
We consider bridge linear regression modeling, which can produce a sparse or non-sparse model. A crucial point in the model-building process is the selection of adjusted parameters, including a regularization parameter and a tuning parameter in bridge ...
A. Antoniadis et al.
core
Variable Selection for Progressive Multistate Processes Under Intermittent Observation
Multistate models offer a natural framework for studying many chronic disease processes. Interest often lies in identifying which among a large list of candidate variables play a role in the progression of such processes. We consider the problem of variable selection for progressive multistate processes under intermittent observation based on ...
Xianwei Li, Richard J. Cook, Liqun Diao
wiley
Smooth Information Criterion for Regularized Estimation of Item Response Models
Item response theory (IRT) models are frequently used to analyze multivariate categorical data from questionnaires or cognitive test data. In order to reduce the model complexity in item response models, regularized estimation is now widely applied ...
Alexander Robitzsch
doaj
Learning Conditional Independence Differential Graphs From Time-Dependent Data
Estimation of differences in conditional independence graphs (CIGs) of two time series Gaussian graphical models (TSGGMs) is investigated, where the two TSGGMs are known to have similar structure.
Jitendra K. Tugnait
doaj
We consider random sample splitting for estimation and inference in high‐dimensional generalized linear models (GLMs), where we first apply the lasso to select a submodel using one subsample and then apply the debiased lasso to fit the selected model using the remaining subsample. We show that a sample splitting procedure based on the debiased ...
Omar Vazquez, Bin Nan
wiley
A Simple Information Criterion for Variable Selection in High‐Dimensional Regression
High‐dimensional regression problems, for example with genomic or drug exposure data, typically involve automated selection of a sparse set of regressors. Penalized regression methods like the LASSO can deliver a family of candidate sparse models.
Matthieu Pluntz et al.
wiley
We designed a distributed UAV swarm SAR observation architecture that deploys multiple UAV‐borne SAR systems in the three‐dimensional space around the target scene. Each node operates in a monostatic mode, which not only reduces synchronization requirements but also provides robustness against single‐node failure.
Wei Li et al.
wiley
Time–frequency–spatial (TFS) features play a crucial role in motor imagery electroencephalogram (EEG) classification. However, effectively leveraging these multidimensional features to enhance classification accuracy remains a significant challenge. Although feature selection techniques are widely used to extract informative TFS representations, most ...
Shaorong Zhang et al.
wiley
Conditional Variable Screening for Ultra‐High Dimensional Longitudinal Data With Time Interactions
In recent years, we have been able to gather large amounts of genomic data at a fast rate, creating situations where the number of variables greatly exceeds the number of observations. In these situations, most models that can handle a moderately high dimension will now become computationally infeasible or unstable.
Andrea Bratsberg et al.
wiley
Cauchy non-convex sparse feature selection method for the high-dimensional small-sample problem in motor imagery EEG decoding.
Zhang S et al.
europepmc