The Mathematical Gazette, 1956
The Farey series F_n consists of all the proper fractions, in their lowest terms and in order of magnitude from 0/1 to 1/1, whose denominators do not exceed n. The series ...
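As a concrete illustration of the definition quoted above, here is a minimal Python sketch (mine, not from the Gazette note) that enumerates F_n by brute force; the function name `farey` is hypothetical.

```python
from fractions import Fraction

def farey(n):
    """Enumerate the Farey series F_n: every reduced proper fraction p/q with
    0/1 <= p/q <= 1/1 and denominator q <= n, in increasing order of magnitude."""
    # Fraction(p, q) reduces to lowest terms, and the set removes duplicates
    # such as 2/4 == 1/2; sorting orders the terms by magnitude.
    terms = {Fraction(p, q) for q in range(1, n + 1) for p in range(q + 1)}
    return sorted(terms)

# For example, F_5 = 0/1, 1/5, 1/4, 1/3, 2/5, 1/2, 3/5, 2/3, 3/4, 4/5, 1/1.
print(farey(5))
```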
Miscellaneous Notes (Tenth Series, Eleventh Series, Twelfth Series and Thirteenth Series)
Archives of Internal Medicine, 1962
These casual notes, turned out as the spirit moved Parkes Weber, for the most part deal with recollections and reminiscences of the last century. At the present time they resemble little snatches of conversation or recollected bits and snatches which might be interjected in talk or into a letter. The subject matter varies from nature and God, coins and ...
SIAM Journal on Discrete Mathematics, 1996
Given that in a multitude of situations polynomial and power-series invariants of structures which enumerate "combinatorial events" on these structures exhibit similar abstract properties, it becomes an interesting and important objective to develop all-encompassing simplest "theories" to capture general characteristics common to these invariants ...
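To make "polynomial invariants that enumerate combinatorial events" concrete, here is a hedged Python sketch of one classic invariant of this kind, the independence polynomial of a graph, computed by brute force; whether this particular invariant is among those the paper treats is an assumption on my part.

```python
from itertools import combinations

def independence_polynomial(vertices, edges):
    """Return coefficients c[k] = number of independent vertex sets of size k,
    i.e. the independence polynomial sum_k c[k] * x**k of the graph."""
    edge_set = {frozenset(e) for e in edges}
    coeffs = [0] * (len(vertices) + 1)
    for k in range(len(vertices) + 1):
        for subset in combinations(vertices, k):
            # A subset is independent if no two of its vertices are adjacent.
            if all(frozenset(pair) not in edge_set for pair in combinations(subset, 2)):
                coeffs[k] += 1
    return coeffs

# Path graph on 4 vertices: 1 + 4x + 3x^2, i.e. coefficients [1, 4, 3, 0, 0].
print(independence_polynomial([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4)]))
```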
Chronos: Learning the Language of Time Series
Trans. Mach. Learn. Res.
We introduce Chronos, a simple yet effective framework for pretrained probabilistic time series models. Chronos tokenizes time series values using scaling and quantization into a fixed vocabulary and trains existing transformer-based language model ...
Abdul Fatir Ansari +16 more
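The abstract describes the key Chronos idea, tokenization by scaling and quantization, only at a high level. Below is a minimal Python sketch of what such a tokenizer can look like; the mean-absolute-value scaling rule, bin range, and vocabulary size here are my assumptions for illustration, not the actual Chronos configuration.

```python
import numpy as np

def tokenize_series(values, vocab_size=100, low=-5.0, high=5.0):
    """Illustrative scaling-and-quantization tokenizer: scale the series by its
    mean absolute value, then map each scaled value to one of `vocab_size`
    uniformly spaced bins, yielding integer token ids for a language model."""
    values = np.asarray(values, dtype=float)
    scale = np.mean(np.abs(values))
    if scale == 0.0:
        scale = 1.0  # avoid division by zero on an all-zero series
    scaled = values / scale
    bins = np.linspace(low, high, vocab_size - 1)  # vocab_size - 1 edges -> vocab_size bins
    tokens = np.digitize(scaled, bins)             # token ids in [0, vocab_size - 1]
    return tokens, scale

tokens, scale = tokenize_series([10.0, 12.0, 9.5, 11.0, 30.0])
print(tokens, scale)  # the scale is kept so predictions can be mapped back to the original units
```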
Unified Training of Universal Time Series Forecasting Transformers
International Conference on Machine Learning
Deep learning for time series forecasting has traditionally operated within a one-model-per-dataset framework, limiting its potential to leverage the game-changing impact of large pre-trained models.
Gerald Woo +5 more

