Results 71 to 80 of about 3,741,721

Condensed-gradient boosting

open access: yes, International Journal of Machine Learning and Cybernetics
Abstract: This paper presents a computationally efficient variant of Gradient Boosting (GB) for multi-class classification and multi-output regression tasks. Standard GB uses a 1-vs-all strategy for classification tasks with more than two classes, which entails training one tree per class at every iteration (a minimal sketch of this baseline loop follows this entry).
Seyedsaman Emami   +1 more
openaire   +3 more sources
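
For readers unfamiliar with the baseline cost, here is a minimal sketch of the standard 1-vs-all multi-class boosting loop the abstract refers to, assuming a softmax/multinomial loss and scikit-learn regression trees; it illustrates the one-tree-per-class-per-iteration overhead and is not the condensed variant the paper proposes.

```python
# Minimal sketch of the standard 1-vs-all multi-class gradient boosting loop
# that the abstract describes as the baseline: one regression tree is fitted
# per class at every boosting iteration. This is NOT the condensed variant
# proposed in the paper; loss, learning rate and tree depth are illustrative.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_one_vs_all_gb(X, y, n_classes, n_iter=50, lr=0.1, max_depth=3):
    """y holds integer labels 0..n_classes-1; returns n_iter lists of trees."""
    n = X.shape[0]
    Y = np.eye(n_classes)[y]           # one-hot targets, shape (n, K)
    F = np.zeros((n, n_classes))       # additive raw scores
    ensemble = []
    for _ in range(n_iter):
        P = np.exp(F - F.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)          # softmax probabilities
        stage = []
        for k in range(n_classes):                 # one tree per class, per iteration
            residual = Y[:, k] - P[:, k]           # negative gradient of multinomial loss
            tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
            F[:, k] += lr * tree.predict(X)
            stage.append(tree)
        ensemble.append(stage)
    return ensemble
```

With K classes and M iterations this baseline fits M x K trees; reducing that multiplicative cost is the point of the condensed variant.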

Inhibiting stearoyl‐CoA desaturase suppresses bone metastatic prostate cancer by modulating cellular stress, mTOR signaling, and DNA damage response

open access: yes, FEBS Letters, EarlyView.
Bone metastasis in prostate cancer (PCa) patients is a clinical hurdle due to the poor understanding of the supportive bone microenvironment. Here, we identify stearoyl‐CoA desaturase (SCD) as a tumor‐promoting enzyme and potential therapeutic target in bone metastatic PCa.
Alexis Wilson   +7 more
wiley   +1 more source

Convection and extracellular matrix binding control interstitial transport of extracellular vesicles

open access: yes, Journal of Extracellular Vesicles, 2023
Extracellular vesicles (EVs) influence a host of normal and pathophysiological processes in vivo. Compared to soluble mediators, EVs can traffic a wide range of proteins on their surface including extracellular matrix (ECM) binding proteins, and their ...
Peter A. Sariano   +11 more
doaj   +1 more source

The multiple effects of gradient coupling on network synchronization

open access: yes, 2007
Recent studies have shown that the synchronizability of complex networks can be significantly improved by asymmetric couplings, and that increasing the coupling gradient always favors network synchronization (a rough numerical illustration follows this entry).
Lai, Choy Heng   +3 more
core   +1 more source
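
A rough numerical illustration of the idea, under two assumptions that are mine rather than the paper's: synchronizability is scored by the eigenratio of the coupling matrix (smaller is better, under the master stability function framework), and "gradient" coupling is approximated by a degree-weighted, row-normalized asymmetric coupling on a scale-free network.

```python
# Rough illustration only: compare the eigenratio lambda_N / lambda_2 (a common
# synchronizability score; smaller is better) for symmetric coupling versus a
# degree-weighted, row-normalized asymmetric coupling used here as a stand-in
# for "gradient" coupling. The network model and the coupling parameterization
# are illustrative assumptions, not the paper's exact model.
import numpy as np
import networkx as nx

def eigenratio(L):
    ev = np.sort(np.linalg.eigvals(L).real)
    return ev[-1] / ev[1]

G = nx.barabasi_albert_graph(200, 3, seed=0)   # connected scale-free network
A = nx.to_numpy_array(G)
k = A.sum(axis=1)                              # node degrees

L_sym = np.diag(k) - A                         # symmetric coupling (graph Laplacian)

W = A * k[np.newaxis, :]                       # weight incoming links by neighbor degree
W /= W.sum(axis=1, keepdims=True)              # row-normalize -> asymmetric coupling
L_asym = np.eye(len(k)) - W

print("eigenratio, symmetric coupling :", eigenratio(L_sym))
print("eigenratio, asymmetric coupling:", eigenratio(L_asym))
```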

Exponentiated Gradient Meets Gradient Descent

open access: yes, 2019
(Stochastic) gradient descent and the multiplicative update method are probably the most popular algorithms in machine learning. We introduce and study a new regularization which provides a unification of the additive and multiplicative updates (the two classical updates are sketched side by side after this entry).
Ghai, Udaya, Hazan, Elad, Singer, Yoram
openaire   +2 more sources
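
A small side-by-side sketch of the two classical update rules the abstract contrasts: additive (projected) gradient descent and the multiplicative exponentiated-gradient update on the probability simplex. The least-squares objective, data, and step sizes are illustrative, and this is not the unified regularization introduced in the paper.

```python
# Side-by-side sketch of the two classical updates the abstract contrasts:
# additive gradient descent (GD) with simplex projection, and the
# multiplicative / exponentiated gradient (EG) update with renormalization.
# Loss, data, and step sizes are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(200, 5))
x_true = np.array([0.5, 0.3, 0.1, 0.1, 0.0])        # ground truth on the simplex
b = A @ x_true + 0.01 * rng.normal(size=200)

def grad(x):                      # gradient of 0.5 * ||Ax - b||^2 / n
    return A.T @ (A @ x - b) / len(b)

def project_simplex(v):           # Euclidean projection onto the probability simplex
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u > css / (np.arange(len(v)) + 1))[0][-1]
    return np.maximum(v - css[rho] / (rho + 1), 0.0)

x_gd = np.full(5, 0.2)            # additive update + projection
x_eg = np.full(5, 0.2)            # multiplicative update + renormalization
for _ in range(500):
    x_gd = project_simplex(x_gd - 0.5 * grad(x_gd))
    w = x_eg * np.exp(-0.5 * grad(x_eg))
    x_eg = w / w.sum()

print("GD estimate:", np.round(x_gd, 3))
print("EG estimate:", np.round(x_eg, 3))
```

Both iterations stay on the simplex, but GD does so by explicit projection of an additive step, while EG stays there by construction through its multiplicative, exponential weighting.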

The (Glg)ABCs of cyanobacteria: modelling of glycogen synthesis and functional divergence of glycogen synthases in Synechocystis sp. PCC 6803

open access: yes, FEBS Letters, EarlyView.
We reconstituted Synechocystis glycogen synthesis in vitro from purified enzymes and showed that two GlgA isoenzymes produce glycogen with different architectures: GlgA1 yields denser, highly branched glycogen, whereas GlgA2 synthesizes longer, less‐branched chains.
Kenric Lee   +3 more
wiley   +1 more source

Diffusion pore imaging with generalized temporal gradient profiles

open access: yes, 2013
In porous-material research, one main interest of nuclear magnetic resonance (NMR) diffusion experiments is the determination of the shape of pores. While it has been a longstanding question whether this is achievable in principle, it has been shown recently ...
Kuder, Tristan Anselm   +1 more
core   +1 more source

PARP inhibitors elicit distinct transcriptional programs in homologous recombination competent castration‐resistant prostate cancer

open access: yes, Molecular Oncology, EarlyView.
PARP inhibitors are used to treat a small subset of prostate cancer patients. These studies reveal that PARP1 activity and expression differ between European American and African American prostate cancer tissue samples. Additionally, different PARP inhibitors cause unique and overlapping transcriptional changes, notably upregulation of the p53 pathway.
Moriah L. Cunningham   +21 more
wiley   +1 more source

Adenosine‐to‐inosine editing of miR‐200b‐3p is associated with the progression of high‐grade serous ovarian cancer

open access: yes, Molecular Oncology, EarlyView.
A‐to‐I editing of miRNAs, particularly miR‐200b‐3p, contributes to HGSOC progression by enhancing cancer cell proliferation, migration and 3D growth. The edited form is linked to poorer patient survival and points to novel molecular targets.
Magdalena Niemira   +14 more
wiley   +1 more source

AdaBatch: Efficient Gradient Aggregation Rules for Sequential and Parallel Stochastic Gradient Methods [PDF]

open access: yes, 2017
We study a new aggregation operator for gradients coming from a mini-batch in stochastic gradient (SG) methods that allows a significant speed-up in the case of sparse optimization problems (a rough sketch of a sparsity-aware aggregation rule follows this entry).
Bach, Francis, Défossez, Alexandre
core   +2 more sources
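
A minimal sketch of the kind of sparsity-aware aggregation the abstract alludes to: scaling each coordinate of the summed mini-batch gradient by the number of examples that actually touch that coordinate, rather than by the full batch size. This is a plausible reading of the setting, not a verified reproduction of the paper's AdaBatch operator; all names and parameters are illustrative.

```python
# Sketch of a sparsity-aware mini-batch aggregation rule: each coordinate of
# the summed gradient is divided by the number of examples in the mini-batch
# whose gradient is nonzero on that coordinate, instead of by the batch size.
# Illustrative only; not a verified reproduction of the paper's exact rule.
import numpy as np

def aggregate_sparse(per_example_grads):
    """per_example_grads: array of shape (batch, dim), mostly zeros."""
    G = np.asarray(per_example_grads)
    support = np.count_nonzero(G, axis=0)        # per-coordinate support size
    scale = np.where(support > 0, support, 1)    # avoid division by zero
    return G.sum(axis=0) / scale                 # vs. G.mean(axis=0) for plain SGD

def aggregate_plain(per_example_grads):
    return np.asarray(per_example_grads).mean(axis=0)

# Toy usage: sparse per-example gradients over 8 coordinates.
rng = np.random.default_rng(1)
G = rng.normal(size=(4, 8)) * (rng.random((4, 8)) < 0.25)
print("plain mean  :", np.round(aggregate_plain(G), 3))
print("sparse-aware:", np.round(aggregate_sparse(G), 3))
```

The intuition is that rarely-updated coordinates are not diluted by the many examples that do not touch them, which matters mainly when the per-example gradients are very sparse.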
