2021
[Russian-language abstract lost to encoding during extraction; the legible fragments refer to a LASSO-based algorithm and CRA.]
openaire +3 more sources
Computational Statistics & Data Analysis, 2015
zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Sunghoon Kwon +2 more
openaire +1 more source
Proceedings of the AAAI Conference on Artificial Intelligence, 2015
Graphical models provide a rich framework for summarizing the dependencies among variables. The graphical lasso approach attempts to learn the structure of a Gaussian graphical model (GGM) by maximizing the log likelihood of the data, subject to an l1 penalty on the elements of the inverse covariance matrix.
Maxim Grechkin +3 more
openaire +2 more sources
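The abstract above refers to the standard graphical-lasso objective, which maximizes log det(Θ) − tr(SΘ) − λ‖Θ‖₁ over positive-definite precision matrices Θ, where S is the sample covariance and λ the l1 weight. Below is a minimal sketch of that baseline estimator using scikit-learn's GraphicalLasso; the data and penalty value are placeholders, and it illustrates the generic method only, not whatever extension this particular paper proposes.

```python
# Minimal sketch of the standard graphical lasso (baseline only, not the
# paper's specific method): estimate a sparse inverse covariance matrix.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))   # placeholder data: 200 samples, 10 variables

model = GraphicalLasso(alpha=0.2)    # alpha = l1 penalty weight (placeholder value)
model.fit(X)

precision = model.precision_         # estimated inverse covariance matrix
edges = np.abs(precision) > 1e-6     # nonzero off-diagonal pattern = GGM edges
np.fill_diagonal(edges, False)
print("estimated edges:", int(edges.sum()) // 2)
```

The nonzero pattern of the estimated precision matrix is read off as the edge set of the Gaussian graphical model.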
Group lasso with overlap and graph lasso
Proceedings of the 26th Annual International Conference on Machine Learning, 2009
We propose a new penalty function which, when used as regularization for empirical risk minimization procedures, leads to sparse estimators. The support of the sparse vector is typically a union of potentially overlapping groups of covariates defined a priori, or a set of covariates which tend to be connected to each other when a graph of covariates ...
Laurent Jacob +2 more
openaire +1 more source
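For reference, the overlap-group penalty this abstract describes is commonly written as a minimum over latent decompositions of the coefficient vector into group-supported parts (notation mine; G denotes the predefined collection of possibly overlapping groups):

\Omega_{\mathrm{overlap}}(w) \;=\; \min \Big\{ \sum_{g \in \mathcal{G}} \lVert v_g \rVert_2 \;:\; \mathrm{supp}(v_g) \subseteq g \ \text{for all } g, \ \sum_{g \in \mathcal{G}} v_g = w \Big\}

Adding λ·Ω_overlap(w) to the empirical risk then yields estimators whose selected support is a union of groups from G.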
IEEE Transactions on Neural Networks, 2004
In the last few years, the support vector machine (SVM) method has motivated new interest in kernel regression techniques. Although the SVM has been shown to exhibit excellent generalization properties in many experiments, it suffers from several drawbacks, both of a theoretical and a technical nature: the absence of probabilistic outputs, the ...
openaire +2 more sources
Proceedings of the ACM/IEEE international conference on Human-robot interaction, 2007
Good situation awareness (SA) is especially necessary when robots and their operators are not collocated, such as in urban search and rescue (USAR). This paper compares how SA is attained in two systems: one that has an emphasis on video and another that has an emphasis on a three-dimensional map.
Jill L. Drury +2 more
openaire +1 more source

