Results 231 to 240 of about 67,235
Some of the following articles may not be open access.
This paper introduces a novel hybrid neural network architecture, the EGFTransformerModel, which integrates Particle Swarm Optimization (PSO) directly into a Transformer encoder. We replace a standard multi-head attention (MHA) layer with a PSOAttentionLayer, where the Query, Key, and Value (QKV) projection weights are treated as particles in a swarm ...
openaire +1 more source
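The abstract above only sketches the idea of treating Q/K/V projection weights as PSO particles, so here is a minimal, hypothetical NumPy sketch of that concept. The fitness function, dimensions, and PSO hyperparameters are all assumptions for illustration; the paper's actual PSOAttentionLayer formulation is not shown in the snippet.

```python
# Hypothetical sketch: optimize a single attention head's Q/K/V projection
# weights with standard Particle Swarm Optimization (PSO), as the abstract
# describes. Fitness here is a toy MSE against a random target -- an
# assumption, not the paper's objective.
import numpy as np

rng = np.random.default_rng(0)
d = 4             # toy model dimension (assumption)
n_particles = 8
n_iters = 30

X = rng.normal(size=(6, d))   # toy input sequence of 6 tokens
Y = rng.normal(size=(6, d))   # toy regression target

def attention(X, Wq, Wk, Wv):
    # Scaled dot-product attention for one head.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=-1, keepdims=True)
    return A @ V

def fitness(p):
    # Each particle flattens the three d x d projection matrices.
    Wq, Wk, Wv = p.reshape(3, d, d)
    return np.mean((attention(X, Wq, Wk, Wv) - Y) ** 2)

# Canonical PSO update: inertia w, cognitive c1, social c2.
pos = rng.normal(size=(n_particles, 3 * d * d))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()
w, c1, c2 = 0.7, 1.5, 1.5

start_loss = pbest_f.min()
for _ in range(n_iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("loss:", start_loss, "->", pbest_f.min())
```

Note that the personal-best bookkeeping makes the best loss monotonically non-increasing, which is the property a gradient-free layer like this would rely on.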
To VSM or Not to VSM: That is the Question
Journal of the Operational Research Society, 1995
openaire +1 more source

