Results 291 to 300 of about 552,537
Modern particle physics experiments observing collisions of particle beams generate large amounts of data. Complex trigger and data acquisition systems are built to select on-line the most interesting events and write them to persistent storage. The final stage of this selection process nowadays often happens on large computer clusters.
Kristina Marasović, Domagoj Jakobović
openalex +3 more sources
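As a rough illustration of the on-line selection described in the entry above (not code from the paper; the event fields, distribution, and threshold are invented for this sketch), a software trigger reduces to a predicate applied to each incoming event:

```python
# Toy sketch of an online trigger selection. Each "event" is a dict of
# reconstructed quantities; field name, distribution, and threshold are illustrative.
import random

def generate_event():
    """Produce a fake event with a random summed transverse energy (GeV)."""
    return {"sum_et": random.expovariate(1.0 / 50.0)}

def trigger_accept(event, threshold_gev=200.0):
    """Keep only events whose summed transverse energy exceeds the threshold."""
    return event["sum_et"] > threshold_gev

events = (generate_event() for _ in range(100_000))
selected = [e for e in events if trigger_accept(e)]
print(f"accepted {len(selected)} of 100000 events")
```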
The FLUKA Monte Carlo code has been widely applied and benchmarked in recent years to problems related to Galactic Cosmic Rays (GCR) and Solar Energetic Particle (SEP) events. Applications range from fundamental cosmic ray and neutrino physics, to doses to commercial aircraft and space radiation issues in low orbit or in deep space. The main results of
G. Battistoni
openalex +2 more sources
This paper examines issues encountered attempting to exploit a high-bandwidth, high-latency link in support of a high-energy physics (HEP) analysis application. The primary issue was that the TCP additive increase/multiplicative decrease (AIMD) algorithm is not suitable for "long fat networks". While this is a known problem, the magnitude of the impact
William Allcock +10 more
openalex +3 more sources
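To make the "long fat network" issue in the entry above concrete, here is a minimal back-of-the-envelope sketch of TCP's additive-increase/multiplicative-decrease recovery time; the link parameters are assumed values, not figures from the paper:

```python
# Minimal sketch of why a single loss is costly under TCP AIMD on a long fat
# network: the congestion window is halved, then regrows by only one segment
# per round-trip time. All numbers are illustrative assumptions.
def aimd_recovery_rtts(bandwidth_mbps=1000.0, rtt_ms=100.0, mss_bytes=1460):
    """Round trips needed to regrow from half the bandwidth-delay product."""
    bdp_bytes = bandwidth_mbps * 1e6 / 8.0 * rtt_ms / 1e3   # bandwidth-delay product
    cwnd_segments = bdp_bytes / mss_bytes                    # window that fills the pipe
    return cwnd_segments / 2.0                               # +1 segment per RTT after halving

rtts = aimd_recovery_rtts()
print(f"~{rtts:.0f} RTTs (~{rtts * 100.0 / 1000.0:.0f} s at 100 ms RTT) to recover after one loss")
```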
Modern particle physics experiments observing collisions of particle beams generate large amounts of data. Large trigger and data acquisition systems are built to select on-line the most interesting events and write them to persistent storage. The final stage of this selection process nowadays often happens on large computer farms.
Kristina Marasović +2 more
openalex +4 more sources
Some of the following articles may not be open access.
Optimizing Graph Neural Networks for Jet Tagging in Particle Physics on FPGAs
International Conference on Field-Programmable Logic and Applications, 2022
This work proposes a novel reconfigurable architecture for reducing the latency of JEDI-net, a Graph Neural Network (GNN) based algorithm for jet tagging in particle physics, which achieves state-of-the-art accuracy.
Zhiqiang Que +5 more
semanticscholar +1 more source
MadMiner: Machine Learning-Based Inference for Particle Physics
Computing and Software for Big Science, 2019
Precision measurements at the LHC often require analyzing high-dimensional event data for subtle kinematic signatures, which is challenging for established analysis methods.
J. Brehmer +3 more
semanticscholar +1 more source
FPGA-Accelerated Machine Learning Inference as a Service for Particle Physics Computing
Computing and Software for Big Science, 2019
Large-scale particle physics experiments face challenging demands for high-throughput computing resources both now and in the future. New heterogeneous computing paradigms on dedicated hardware with increased parallelization, such as Field Programmable ...
Javier Mauricio Duarte +22 more
semanticscholar +1 more source
Full Event Particle-Level Unfolding with Variable-Length Latent Variational Diffusion
SciPost Physics
The measurements performed by particle physics experiments must account for the imperfect response of the detectors used to observe the interactions. One approach, unfolding, statistically adjusts the experimental data for detector effects.
A. Shmakov +5 more
semanticscholar +1 more source
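For context on what "unfolding" means in the entry above, the sketch below shows the simplest response-matrix formulation; the paper itself uses a particle-level variational-diffusion approach, and the matrix and counts here are made up:

```python
# Minimal sketch of detector unfolding with a response matrix. This illustrates
# the problem only; it is not the paper's method, and the numbers are invented.
import numpy as np

# Response matrix R[i, j] = probability that a true event in bin j is observed in bin i.
R = np.array([[0.8, 0.15, 0.0],
              [0.2, 0.70, 0.2],
              [0.0, 0.15, 0.8]])

true_counts = np.array([1000.0, 500.0, 200.0])
observed = R @ true_counts                 # detector smears the true spectrum

unfolded = np.linalg.solve(R, observed)    # naive inversion recovers the truth here
print(unfolded)                            # -> [1000.  500.  200.]
```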
Physical characterization of aerosol particles during the Chinese New Year’s firework events
Atmospheric Environment, 2010
Measurements of particles from 10 nm to 10 μm were taken with a Wide-range Particle Spectrometer during the Chinese New Year (CNY) celebrations in 2009 in Shanghai, China. These celebrations provided an opportunity to study the number concentration and size distribution of particles in a particular atmospheric pollution situation due to firework
Min Zhang +8 more
openaire +1 more source
A fast Monte Carlo event generator for particle physics
Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 1986
Principal component analysis is applied to a data sample with the aim of retaining the smallest number of variables necessary to describe the data adequately. We find that in a particular example, of the initial 20 variables needed to fully describe the data, only 10 principal components suffice to determine their structure.
openaire +1 more source
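The abstract above describes keeping 10 of 20 variables via principal component analysis; a minimal sketch of that reduction, run on random stand-in data rather than the paper's sample, looks like this:

```python
# Minimal sketch of PCA-based dimensionality reduction: 20 variables per event,
# 10 retained principal components. The data are random and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
events = rng.normal(size=(5000, 20))          # 5000 fake events, 20 variables each

centered = events - events.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)        # eigen-decomposition of the covariance

order = np.argsort(eigvals)[::-1][:10]        # keep the 10 largest components
components = eigvecs[:, order]
reduced = centered @ components               # each event described by 10 coordinates
print(reduced.shape)                          # -> (5000, 10)
```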

