Results 51 to 60 of about 38,453

Ultra-Short-Term Wind Power Prediction Based on Bidirectional Gated Recurrent Unit and Transfer Learning

open access: yes, Frontiers in Energy Research, 2021
Wind power forecasting (WPF) is imperative to the control and dispatch of the power grid. An ultra-short-term prediction method based on a multilayer bidirectional gated recurrent unit (Bi-GRU) and a fully connected (FC) layer is proposed.
Wenjin Chen   +9 more
doaj   +1 more source
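The Bi-GRU plus FC architecture named in this abstract can be illustrated with a minimal numpy sketch. This is a single bidirectional GRU layer using the standard GRU gate equations followed by a linear output layer; all parameter names and shapes are assumptions for illustration, not the paper's model:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step with the standard update/reset/candidate gates."""
    z = sigmoid(x @ Wz + h @ Uz)              # update gate
    r = sigmoid(x @ Wr + h @ Ur)              # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)  # candidate state
    return (1.0 - z) * h + z * h_tilde

def bigru(X, params_fwd, params_bwd, d_hidden):
    """Run a GRU over X (T, d_in) in both directions and concatenate.

    Returns hidden states of shape (T, 2 * d_hidden).
    """
    T = X.shape[0]
    hf = np.zeros(d_hidden)
    hb = np.zeros(d_hidden)
    Hf = np.empty((T, d_hidden))
    Hb = np.empty((T, d_hidden))
    for t in range(T):                 # forward pass over time
        hf = gru_step(X[t], hf, *params_fwd)
        Hf[t] = hf
    for t in reversed(range(T)):       # backward pass over time
        hb = gru_step(X[t], hb, *params_bwd)
        Hb[t] = hb
    return np.concatenate([Hf, Hb], axis=1)
```

For an ultra-short-term forecast, the last concatenated state would typically be passed through the FC layer, e.g. `y = H[-1] @ W_out + b_out`.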

Improving speech recognition by revising gated recurrent units

open access: yes, 2017
Speech recognition is largely taking advantage of deep learning, showing that substantial benefits can be obtained by modern Recurrent Neural Networks (RNNs).
Bengio, Yoshua   +3 more
core   +1 more source

Convolutional Neural Network-Based Bidirectional Gated Recurrent Unit–Additive Attention Mechanism Hybrid Deep Neural Networks for Short-Term Traffic Flow Prediction

open access: yes, Sustainability
To predict short-term traffic flow more accurately, this study proposes an integrated prediction model, CNN-BiGRU-AAM, which combines a convolutional neural network, a bidirectional gated recurrent unit, and an additive attention mechanism.
Song Liu   +5 more
semanticscholar   +1 more source
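The additive attention mechanism (AAM) in this model can be sketched in a few lines of numpy. This is the generic Bahdanau-style additive attention pooling over a sequence of hidden states, not the paper's exact formulation; `W` and `v` are assumed learnable parameters:

```python
import numpy as np

def additive_attention(H, W, v):
    """Additive attention pooling over hidden states H of shape (T, d).

    score_t = v . tanh(W h_t); the weights are a softmax over time,
    and the output is the weighted sum of the hidden states.
    """
    scores = np.tanh(H @ W) @ v               # (T,) unnormalized scores
    scores = scores - scores.max()            # for numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()  # attention weights
    context = alpha @ H                       # (d,) weighted sum
    return context, alpha
```

In a CNN-BiGRU-AAM pipeline, `H` would be the BiGRU outputs over the CNN feature sequence, and `context` would feed the final prediction layer.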

Gated Convolutional Bidirectional Attention-based Model for Off-topic Spoken Response Detection

open access: yes, 2020
Off-topic spoken response detection, the task aiming at predicting whether a response is off-topic for the corresponding prompt, is important for an automated speaking assessment system.
Li, Ruobing, Lin, Hui, Zha, Yefei
core   +1 more source

RaMBat: Accurate identification of medulloblastoma subtypes from diverse data sources with severe batch effects

open access: yes, Molecular Oncology, EarlyView.
To integrate multiple transcriptomics data with severe batch effects for identifying MB subtypes, we developed a novel and accurate computational method named RaMBat, which leveraged subtype‐specific gene expression ranking information instead of absolute gene expression levels to address batch effects of diverse data sources.
Mengtao Sun, Jieqiong Wang, Shibiao Wan
wiley   +1 more source

Contextual Urdu Lemmatization Using Recurrent Neural Network Models

open access: yes, Mathematics, 2023
In the field of natural language processing, machine translation is a rapidly growing research area that helps humans communicate more effectively by bridging the linguistic gap.
Rabab Hafeez   +7 more
doaj   +1 more source

Simple Recurrent Units for Highly Parallelizable Recurrence

open access: yes, 2018
Common recurrent neural architectures scale poorly due to the intrinsic difficulty in parallelizing their state computations. In this work, we propose the Simple Recurrent Unit (SRU), a light recurrent unit that balances model capacity and scalability ...
Artzi, Yoav   +4 more
core   +1 more source
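The SRU's parallelizability claim can be made concrete with a minimal numpy sketch: all matrix products are computed for every time step up front, and the remaining recurrence is purely elementwise. This is a simplified single-layer, unbatched version of the standard SRU equations, not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sru_forward(X, W, Wf, bf, Wr, br):
    """Simple Recurrent Unit over an input sequence X of shape (T, d).

    The projections below have no dependence on the recurrent state,
    so they can be batched across time; the loop carries only cheap
    elementwise operations, which is what makes SRU parallelizable.
    """
    T, d = X.shape
    U = X @ W                      # candidate values, (T, d)
    F = sigmoid(X @ Wf + bf)       # forget gates, (T, d)
    R = sigmoid(X @ Wr + br)       # reset gates, (T, d)
    c = np.zeros(d)
    H = np.empty((T, d))
    for t in range(T):             # elementwise-only recurrence
        c = F[t] * c + (1.0 - F[t]) * U[t]
        H[t] = R[t] * np.tanh(c) + (1.0 - R[t]) * X[t]  # highway mix
    return H
```

By contrast, a GRU or LSTM must compute its gate matrix products inside the time loop, since each gate depends on the previous hidden state.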

Age‐Related Characteristics of SYT1‐Associated Neurodevelopmental Disorder

open access: yes, Annals of Clinical and Translational Neurology, EarlyView.
ABSTRACT Objectives We describe the clinical manifestations and developmental abilities of individuals with SYT1‐associated neurodevelopmental disorder (Baker‐Gordon syndrome) from infancy to adulthood. We further describe the neuroradiological and electrophysiological characteristics of the condition at different ages, and explore the associations ...
Sam G. Norwitz   +3 more
wiley   +1 more source

Bidirectional convolutional recurrent neural network architecture with group-wise enhancement mechanism for text sentiment classification

open access: yes, Journal of King Saud University: Computer and Information Sciences, 2022
Sentiment analysis has been a well-studied research direction in computational linguistics. Deep neural network models, including convolutional neural networks (CNN) and recurrent neural networks (RNN), yield promising results on text classification ...
Aytuğ Onan
doaj   +1 more source

Glyph-aware Embedding of Chinese Characters

open access: yes, 2017
Given the advantage and recent success of English character-level and subword-unit models in several NLP tasks, we consider the equivalent modeling problem for Chinese.
Cai, Zheng, Dai, Falcon Z.
core   +1 more source
