Results 31 to 40 of about 295,233

Learning to Learn without Gradient Descent by Gradient Descent

open access: yes, 2016
We learn recurrent neural network optimizers trained on simple synthetic functions by gradient descent. We show that these learned optimizers exhibit a remarkable degree of transfer in that they can be used to efficiently optimize a broad range of derivative-free black-box functions, including Gaussian process bandits, simple control objectives, global optimization benchmarks and hyper-parameter tuning tasks.
Chen, Yutian   +6 more
openaire   +2 more sources
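As a rough, hedged illustration of the idea in this abstract, the sketch below runs a tiny recurrent network as a derivative-free optimizer, feeding each query point and its observed value back into the network to propose the next query. The weights here are random placeholders standing in for the optimizer that, per the paper, would be trained by gradient descent on simple synthetic functions; the dimensions and the toy quadratic objective are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, HIDDEN = 2, 16  # search-space and hidden-state sizes (illustrative choices)

# Random weights stand in for a trained recurrent optimizer; in the paper these
# would be learned by gradient descent on simple synthetic functions.
W_h = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
W_i = rng.normal(scale=0.1, size=(HIDDEN, DIM + 1))  # input: last query and its value
W_o = rng.normal(scale=0.1, size=(DIM, HIDDEN))

def black_box(x):
    """Derivative-free objective to minimize (a toy quadratic; an assumption)."""
    return float(np.sum((x - 0.5) ** 2))

h = np.zeros(HIDDEN)  # recurrent state summarizing the query history
x = np.zeros(DIM)
best = np.inf
for _ in range(20):
    y = black_box(x)                       # only function values are observed
    best = min(best, y)
    h = np.tanh(W_h @ h + W_i @ np.concatenate([x, [y]]))
    x = W_o @ h                            # the network proposes the next query point
print("best value found:", best)
```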

ADADELTA: An Adaptive Learning Rate Method [PDF]

open access: yes, 2012
We present a novel per-dimension learning rate method for gradient descent called ADADELTA. The method dynamically adapts over time using only first order information and has minimal computational overhead beyond vanilla stochastic gradient descent.
Zeiler, Matthew D.
core  
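The snippet above only states that ADADELTA adapts per-dimension step sizes from first-order information; below is a minimal NumPy sketch of the update rule described in the paper, built from decaying averages of squared gradients and squared updates. The decay rate, epsilon, and toy quadratic objective are illustrative choices, not values taken from this listing.

```python
import numpy as np

def adadelta_step(x, grad, Eg2, Edx2, rho=0.95, eps=1e-6):
    """One ADADELTA update: per-dimension steps scaled by decaying RMS
    averages of past squared gradients and past squared updates."""
    Eg2 = rho * Eg2 + (1 - rho) * grad ** 2                # accumulate squared gradients
    dx = -np.sqrt(Edx2 + eps) / np.sqrt(Eg2 + eps) * grad  # per-dimension step
    Edx2 = rho * Edx2 + (1 - rho) * dx ** 2                # accumulate squared updates
    return x + dx, Eg2, Edx2

# Toy usage (an assumption): minimize f(x) = ||x||^2, whose gradient is 2x.
x = np.array([3.0, -2.0])
Eg2, Edx2 = np.zeros_like(x), np.zeros_like(x)
for _ in range(2000):
    x, Eg2, Edx2 = adadelta_step(x, 2 * x, Eg2, Edx2)
print(x)  # should end up close to the minimizer at the origin
```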

An upstream open reading frame regulates expression of the mitochondrial protein Slm35 and mitophagy flux

open access: yes, FEBS Letters, EarlyView.
This study reveals how the mitochondrial protein Slm35 is regulated in Saccharomyces cerevisiae. The authors identify stress‐responsive DNA elements and two upstream open reading frames (uORFs) in the 5′ untranslated region of SLM35. One uORF restricts translation, and its mutation increases Slm35 protein levels and mitophagy.
Hernán Romo‐Casanueva   +5 more
wiley   +1 more source

Structural instability impairs function of the UDP‐xylose synthase 1 Ile181Asn variant associated with short‐stature genetic syndrome in humans

open access: yes, FEBS Letters, EarlyView.
The Ile181Asn variant of human UDP‐xylose synthase (hUXS1), associated with a short‐stature genetic syndrome, has previously been reported as inactive. Our findings demonstrate that Ile181Asn‐hUXS1 retains catalytic activity similar to the wild‐type but exhibits reduced stability, a looser oligomeric state, and an increased tendency to precipitate ...
Tuo Li   +2 more
wiley   +1 more source

Exponentiated Gradient Meets Gradient Descent

open access: yes, 2019
(Stochastic) gradient descent and the multiplicative update method are probably the most popular algorithms in machine learning. We introduce and study a new regularization which provides a unification of the additive and multiplicative updates.
Ghai, Udaya, Hazan, Elad, Singer, Yoram
openaire   +2 more sources
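For readers skimming this entry, a small sketch contrasting the two updates the abstract refers to may help: the additive (gradient descent) step and the multiplicative (exponentiated gradient) step on the probability simplex. The paper's actual contribution, a regularization that unifies the two, is not implemented here; the step size and toy gradient are assumptions.

```python
import numpy as np

def gd_step(w, grad, eta=0.1):
    """Additive update: plain gradient descent."""
    return w - eta * grad

def eg_step(w, grad, eta=0.1):
    """Multiplicative update: exponentiated gradient on the probability simplex."""
    v = w * np.exp(-eta * grad)
    return v / v.sum()

# Toy usage (assumed values): one step from the uniform point against the same gradient.
w0 = np.full(3, 1.0 / 3.0)
g = np.array([1.0, 0.0, -1.0])
print(gd_step(w0, g))  # additive step; in general leaves the simplex
print(eg_step(w0, g))  # multiplicative step; stays on the simplex
```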

Inhibiting stearoyl‐CoA desaturase suppresses bone metastatic prostate cancer by modulating cellular stress, mTOR signaling, and DNA damage response

open access: yes, FEBS Letters, EarlyView.
Bone metastasis in prostate cancer (PCa) patients is a clinical hurdle due to the poor understanding of the supportive bone microenvironment. Here, we identify stearoyl‐CoA desaturase (SCD) as a tumor‐promoting enzyme and potential therapeutic target in bone metastatic PCa.
Alexis Wilson   +7 more
wiley   +1 more source

Reparameterizing Mirror Descent as Gradient Descent

open access: yes, 2020
Most of the recent successful applications of neural networks have been based on training with gradient descent updates. However, for some small networks, other mirror descent updates provably learn more efficiently when the target is sparse. We present a general framework for casting a mirror descent update as a gradient descent update on a different ...
Amid, Ehsan, Warmuth, Manfred K.
openaire   +2 more sources
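A hedged numerical sketch of the reparameterization idea: the unnormalized exponentiated-gradient (mirror descent) update can be closely reproduced by plain gradient descent on squared parameters, u = w^2 / 4, when the step size is small. The quadratic toy loss, target vector, and step size below are assumptions chosen for illustration, not values taken from the paper.

```python
import numpy as np

# Toy loss L(u) = 0.5 * ||u - target||^2 with gradient u - target (an assumption).
target = np.array([0.8, 0.1, 0.05])

def grad_L(u):
    return u - target

eta = 0.01
u_md = np.full(3, 1.0 / 3.0)   # mirror descent (unnormalized EG) iterate
w = 2.0 * np.sqrt(u_md)        # reparameterization: u = w**2 / 4

for _ in range(2000):
    # Mirror descent with a log link: multiplicative (exponentiated-gradient) update.
    u_md = u_md * np.exp(-eta * grad_L(u_md))
    # Plain gradient descent on w; the chain rule gives dL/dw = (w / 2) * dL/du.
    u_gd = w ** 2 / 4.0
    w = w - eta * (w / 2.0) * grad_L(u_gd)

print(u_md)          # exponentiated-gradient iterate
print(w ** 2 / 4.0)  # gradient-descent-on-w iterate; tracks the one above closely
```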

The (Glg)ABCs of cyanobacteria: modelling of glycogen synthesis and functional divergence of glycogen synthases in Synechocystis sp. PCC 6803

open access: yes, FEBS Letters, EarlyView.
We reconstituted Synechocystis glycogen synthesis in vitro from purified enzymes and showed that two GlgA isoenzymes produce glycogen with different architectures: GlgA1 yields denser, highly branched glycogen, whereas GlgA2 synthesizes longer, less‐branched chains.
Kenric Lee   +3 more
wiley   +1 more source

PARP inhibitors elicit distinct transcriptional programs in homologous recombination competent castration‐resistant prostate cancer

open access: yes, Molecular Oncology, EarlyView.
PARP inhibitors are used to treat a small subset of prostate cancer patients. These studies reveal that PARP1 activity and expression are different between European American and African American prostate cancer tissue samples. Additionally, different PARP inhibitors cause unique and overlapping transcriptional changes, notably, p53 pathway upregulation.
Moriah L. Cunningham   +21 more
wiley   +1 more source

Adenosine‐to‐inosine editing of miR‐200b‐3p is associated with the progression of high‐grade serous ovarian cancer

open access: yesMolecular Oncology, EarlyView.
A‐to‐I editing of miRNAs, particularly miR‐200b‐3p, contributes to HGSOC progression by enhancing cancer cell proliferation, migration and 3D growth. The edited form is linked to poorer patient survival and points to novel molecular targets.
Magdalena Niemira   +14 more
wiley   +1 more source
