
Hybrid-Domain Neural Network Processing for Sparse-View CT Reconstruction

IEEE Transactions on Radiation and Plasma Medical Sciences, 2021
X-ray computed tomography (CT) is one of the most widely used tools in medical imaging, industrial nondestructive testing, lesion detection, and other applications. However, decreasing the projection number to lower the X-ray radiation dose usually leads ...
Dianlin Hu   +8 more
semanticscholar   +1 more source

Adaptively Sparse Transformers Hawkes Process

International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 2023
Nowadays, many sequences of events are generated in areas as diverse as healthcare, finance, and social networks. These data have long been studied in the hope of predicting the type and occurrence time of the next event from the relationships among events. Recently, with the successful application of Recurrent Neural Networks ...
Gao, Yue, Liu, Jian-Wei
openaire   +1 more source

Group Sparse Optimal Transport for Sparse Process Flexibility Design

Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence, 2023
As a fundamental problem in Operations Research, sparse process flexibility design (SPFD) aims to design a manufacturing network across industries that achieves a trade-off between the efficiency and robustness of supply chains. In this study, we propose a novel solution to this problem with the help of computational optimal transport techniques ...
Dixin Luo, Tingting Yu, Hongteng Xu
openaire   +1 more source

Sparse Low-rank Adaptation of Pre-trained Language Models

Conference on Empirical Methods in Natural Language Processing, 2023
Fine-tuning pre-trained large language models in a parameter-efficient manner is widely studied for its effectiveness and efficiency. The popular method of low-rank adaptation (LoRA) offers a notable approach, hypothesizing that the adaptation process is ...
Ning Ding   +6 more
semanticscholar   +1 more source
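The LoRA baseline this paper builds on can be sketched in a few lines: the frozen weight W is adapted by a trainable low-rank product B @ A, so only r*(d_in + d_out) parameters are updated instead of d_in*d_out. The shapes, scaling, and names below are illustrative, not the paper's notation.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 64, 32, 4                  # rank r << min(d_in, d_out)

W = rng.standard_normal((d_out, d_in))      # frozen pre-trained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-init

def adapted_forward(x, alpha=8.0):
    """Forward pass with the low-rank update added to the frozen weight."""
    return (W + (alpha / r) * (B @ A)) @ x

x = rng.standard_normal(d_in)
# With B zero-initialized, the adapted model starts identical to the base model.
assert np.allclose(adapted_forward(x), W @ x)
```

Zero-initializing B is the standard choice so that training starts from the unmodified pre-trained model.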

Structured Nyquist Correlation Reconstruction for DOA Estimation With Sparse Arrays

IEEE Transactions on Signal Processing, 2023
Sparse arrays are known to achieve an increased number of degrees-of-freedom (DOFs) for direction-of-arrival (DOA) estimation, where an augmented virtual uniform array calculated from the correlations of sub-Nyquist spatial samples is processed to ...
Chengwei Zhou   +3 more
semanticscholar   +1 more source

Frequency-difference beamforming and sparse processing

Journal of the Acoustical Society of America
Frequency-difference processing enables the estimation of the direction of arrival (DOA) for sources beyond the spatial aliasing frequency. The beamforming method takes advantage of the frequency difference between multiple frequencies, enabling ...
Yongsung Park, P. Gerstoft
semanticscholar   +1 more source

Sparse Signal Processing

2014
Conventional sampling techniques are based on Shannon-Nyquist theory which states that the required sampling rate for perfect recovery of a band-limited signal is at least twice its bandwidth. The band-limitedness property of the signal plays a significant role in the design of conventional sampling and reconstruction systems.
Masoumeh Azghani, Farokh Marvasti
openaire   +1 more source
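The Shannon-Nyquist statement quoted in the abstract above can be demonstrated directly: sampling below twice the signal frequency makes distinct tones indistinguishable (aliasing). The frequencies and sampling rate here are illustrative.

```python
import numpy as np

fs = 4.0                  # sampling rate in Hz, below 2 * 5 Hz
t = np.arange(16) / fs    # sample instants

high = np.sin(2 * np.pi * 5.0 * t)   # 5 Hz tone, undersampled
low = np.sin(2 * np.pi * 1.0 * t)    # 1 Hz tone, properly sampled

# The 5 Hz tone aliases onto 1 Hz: both produce identical samples at fs = 4 Hz.
assert np.allclose(high, low)
```

This is exactly the failure mode that motivates both Nyquist-rate sampling and, when the signal is sparse, the sub-Nyquist recovery techniques surveyed in the chapter.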

Sparse inverse kernel Gaussian Process regression

Statistical Analysis and Data Mining: The ASA Data Science Journal, 2013
Regression problems on massive data sets are ubiquitous in many application domains including the Internet, earth and space sciences, and finances. Gaussian Process regression (GPR) is a popular technique for modeling the input–output relations of a set of variables under the assumption that the weight vector has a Gaussian prior.
Das, Kamalika, Srivastava, Ashok N.
openaire   +2 more sources
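The GPR setup the abstract describes can be sketched minimally: place a Gaussian prior over functions via an RBF kernel and compute the posterior mean at test points. The lengthscale and noise values below are arbitrary illustrative choices, not the paper's.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    """Squared-exponential kernel matrix between two sets of 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_predict(X_train, y_train, X_test, noise=1e-3):
    """GP posterior mean at X_test given noisy observations (X_train, y_train)."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_test, X_train)
    return K_s @ np.linalg.solve(K, y_train)

X_train = np.linspace(0.0, 2 * np.pi, 20)
y_train = np.sin(X_train)
mean = gp_predict(X_train, y_train, np.array([np.pi / 2]))
# The posterior mean should closely track sin(pi/2) = 1 on this dense fit.
```

The O(n^3) solve against the full kernel matrix is the scalability bottleneck that the sparse inverse-kernel approach in this paper targets.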

Validation-Based Sparse Gaussian Process Classifier Design

Neural Computation, 2009
Gaussian processes (GPs) are promising Bayesian methods for classification and regression problems. Design of a GP classifier and making predictions using it is, however, computationally demanding, especially when the training set size is large. Sparse GP classifiers are known to overcome this limitation.
Shevade, Shirish, Sundararajan, S
openaire   +2 more sources

SIGMA: A Sparse and Irregular GEMM Accelerator with Flexible Interconnects for DNN Training

International Symposium on High-Performance Computer Architecture, 2020
The advent of Deep Learning (DL) has radically transformed the computing industry across the entire spectrum from algorithms to circuits. As myriad application domains embrace DL, it has become synonymous with a genre of workloads across vision, speech ...
Eric Qin   +7 more
semanticscholar   +1 more source
