Results 51 to 60 of about 2,762

Selection of tuning parameters in bridge regression models via Bayesian information criterion

open access: yes, 2012
We consider the bridge linear regression modeling, which can produce a sparse or non-sparse model. A crucial point in the model building process is the selection of adjusted parameters including a regularization parameter and a tuning parameter in bridge ...
A Antoniadis   +29 more
core   +1 more source

Variable Selection for Progressive Multistate Processes Under Intermittent Observation

open access: yes, Statistics in Medicine, Volume 44, Issue 6, 15 March 2025.
ABSTRACT Multistate models offer a natural framework for studying many chronic disease processes. Interest often lies in identifying which among a large list of candidate variables play a role in the progression of such processes. We consider the problem of variable selection for progressive multistate processes under intermittent observation based on ...
Xianwei Li, Richard J. Cook, Liqun Diao
wiley   +1 more source

Smooth Information Criterion for Regularized Estimation of Item Response Models

open access: yes, Algorithms
Item response theory (IRT) models are frequently used to analyze multivariate categorical data from questionnaires or cognitive test data. In order to reduce the model complexity in item response models, regularized estimation is now widely applied ...
Alexander Robitzsch
doaj   +1 more source

Learning Conditional Independence Differential Graphs From Time-Dependent Data

open access: yes, IEEE Access
Estimation of differences in conditional independence graphs (CIGs) of two time series Gaussian graphical models (TSGGMs) is investigated where the two TSGGMs are known to have similar structure.
Jitendra K. Tugnait
doaj   +1 more source

Debiased lasso after sample splitting for estimation and inference in high‐dimensional generalized linear models

open access: yes, Canadian Journal of Statistics, Volume 53, Issue 1, March 2025.
Abstract We consider random sample splitting for estimation and inference in high‐dimensional generalized linear models (GLMs), where we first apply the lasso to select a submodel using one subsample and then apply the debiased lasso to fit the selected model using the remaining subsample. We show that a sample splitting procedure based on the debiased ...
Omar Vazquez, Bin Nan
wiley   +1 more source

A Simple Information Criterion for Variable Selection in High‐Dimensional Regression

open access: yes, Statistics in Medicine, Volume 44, Issue 1-2, 15-30 January 2025.
ABSTRACT High‐dimensional regression problems, for example with genomic or drug exposure data, typically involve automated selection of a sparse set of regressors. Penalized regression methods like the LASSO can deliver a family of candidate sparse models.
Matthieu Pluntz   +3 more
wiley   +1 more source

A 3D Imaging Method for UAV Swarm Polarimetric SAR Based on Joint Sparse and Low‐Rank Structure Constraints

open access: yes, IET Image Processing, Volume 19, Issue 1, January/December 2025.
We designed a distributed UAV swarm SAR observation architecture that deploys multiple UAV‐borne SAR systems in the three‐dimensional space around the target scene. Each node operates in a monostatic mode, which not only reduces synchronization requirements but also provides robustness against single‐node failure.
Wei Li   +5 more
wiley   +1 more source

Hierarchical Feature Recalibration Network for Motor Imagery Electroencephalogram (EEG) Classification

open access: yes, IET Signal Processing, Volume 2025, Issue 1, 2025.
Time–frequency–spatial (TFS) features play a crucial role in motor imagery electroencephalogram (EEG) classification. However, effectively leveraging these multidimensional features to enhance classification accuracy remains a significant challenge. Although feature selection techniques are widely used to extract informative TFS representations, most ...
Shaorong Zhang   +10 more
wiley   +1 more source

Conditional Variable Screening for Ultra‐High Dimensional Longitudinal Data With Time Interactions

open access: yes, Biometrical Journal, Volume 66, Issue 8, December 2024.
ABSTRACT In recent years, we have been able to gather large amounts of genomic data at a fast rate, creating situations where the number of variables greatly exceeds the number of observations. In these situations, most models that can handle a moderately high dimension will now become computationally infeasible or unstable.
Andrea Bratsberg   +2 more
wiley   +1 more source

Cauchy non-convex sparse feature selection method for the high-dimensional small-sample problem in motor imagery EEG decoding

open access: yes, Front Neurosci, 2023
Zhang S   +9 more
europepmc   +1 more source
