Results 201 to 210 of about 2,681,518 (295)
Leveraging large language models for data analysis automation. [PDF]
Jansen JA +3 more
europepmc +1 more source
Multi-step retrieval and reasoning improves radiology question answering with large language models. [PDF]
Wind S +11 more
europepmc +1 more source
Shaping inter-brain plasticity: A feasibility study of enhancing inter-brain synchrony with dyadic neurofeedback. [PDF]
Francis M +3 more
europepmc +1 more source
A Practical Guide and Assessment on Using ChatGPT to Conduct Grounded Theory: Tutorial.
Yue Y, Liu D, Lv Y, Hao J, Cui P.
europepmc +1 more source
Generalizable and scalable multistage biomedical concept normalization leveraging large language models. [PDF]
Dobbins NJ.
europepmc +1 more source
rMATS-cloud: Large-scale Alternative Splicing Analysis in the Cloud. [PDF]
Adams JI +7 more
europepmc +1 more source
Some of the following articles may not be open access.
Related searches:
IEEE Transactions on Communications, 2005
In this paper, we introduce the concept of nonsystematic turbo codes and compare them with classical systematic turbo codes. Nonsystematic turbo codes can achieve lower error floors than systematic turbo codes because of their superior effective free distance properties.
Banerjee A. +3 more
openaire +3 more sources
Proceedings. International Symposium on Information Theory, 2005. ISIT 2005., 2005
In this paper we introduce a new coding scheme, so-called laminated turbo codes. It is characterized by a block-convolutional structure that enables us to combine the advantages of a convolutional encoder memory and a block-oriented decoding method. We show that this block-convolutional structure is superior in terms of its error correction capability
Huebner, A. +3 more
openaire +2 more sources
DeepSeek-Coder-V2: Breaking the Barrier of Closed-Source Models in Code Intelligence
arXiv.org
We present DeepSeek-Coder-V2, an open-source Mixture-of-Experts (MoE) code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks. Specifically, DeepSeek-Coder-V2 is further pre-trained from an intermediate checkpoint of
DeepSeek-AI +39 more
semanticscholar +1 more source