Results 341 to 350 of about 121,264 (366)
Some of the following articles may not be open access.
IEEE Spectrum, 1988
The development of minisupercomputers is discussed. These desk-side units combine interactive, high-resolution graphics and vector processing. Available and soon-to-be-available minisupers are described. Vector processing, used by all these machines, is examined, as is virtual memory, another key feature.
C.G. Bell +2 more
openaire +2 more sources
Classical benchmarking of Gaussian Boson Sampling on the Titan supercomputer
Quantum Information Processing, 2018
Gaussian Boson Sampling (GBS) is a model of photonic quantum computing where single-mode squeezed states are sent through linear-optical interferometers and measured using single-photon detectors. In this work, we employ a recent exact sampling algorithm ...
Brajesh Gupt +3 more
semanticscholar +1 more source
Supercomputing in medical imaging
IEEE Engineering in Medicine and Biology Magazine, 1988
It is suggested that the diagnostic imaging department of the future will make extensive use of computer networks, mass storage devices, and sophisticated workstations at which humans and machines will interact, assisted by techniques of computer vision and artificial intelligence, to achieve integration of multimodality imaging information and expert ...
openaire +3 more sources
The IEEE/ACS International Conference on Pervasive Services (ICPS 2004), Proceedings, 2004
Summary form only given. The synergistic advances in high-performance computing and in reconfigurable computing, based on field programmable gate arrays (FPGAs), form the basis for a new paradigm in supercomputing, namely reconfigurable supercomputing. This can be achieved through hybrid systems of microprocessors and FPGA modules that can leverage the
openaire +2 more sources
Computer, 2004
We need mobile supercomputers that provide massive computational performance from the power in a battery. These supercomputers will make our personal devices much easier to use. They will perform real-time speech recognition, video transmission and analysis, and high bandwidth communication.
Wayne Wolf +5 more
openaire +2 more sources
The Leading Edge, 1986
The first and second glances at Cray Research’s manufacturing plant (in the distinctly un‐Silicon Valley milieu of Chippewa Falls, Wisconsin) do not trigger images of the high technology frontier. The long, low, aesthetically neutral building looks very much like the shoe factory it, until quite recently, was; inside platoons of women, working shoulder
openaire +2 more sources
Graphic Supercomputing on the Ardent and Stellar Graphics Supercomputers
1990
Both the size of scientific and engineering problems and the quantity of data generated present a large challenge to researchers: namely, how to understand the results of their computations. Solving these problems interactively is essential and requires significant computation and graphics power.
openaire +2 more sources
1986
Better numerical procedures, improved computational power and additional physical insights have contributed significantly to progress in dealing with classical and quantum statistical mechanics problems. Past developments are discussed and future possibilities outlined.
openaire +2 more sources
International Symposium on VLSI Design, Automation and Test, 2022
Mitsuhisa Sato
semanticscholar +1 more source
SPIE Proceedings, 1987
In the late 1990s, supercomputers will have computational performance approaching one trillion floating-point operations per second. These high-performance systems will contain tens to hundreds of gigabytes of internal high-speed memory, either distributed, centralized, or combined, to provide the required bandwidth for efficient computation.
openaire +2 more sources

