
One Fits All: Power General Time Series Analysis by Pretrained LM

Neural Information Processing Systems, 2023
Although we have witnessed great success of pre-trained models in natural language processing (NLP) and computer vision (CV), limited progress has been made for general time series analysis.
Tian Zhou   +4 more
semanticscholar   +1 more source

Miscellaneous Notes (Tenth Series, Eleventh Series, Twelfth Series and Thirteenth Series)

Archives of Internal Medicine, 1962
These casual notes, turned out as the spirit moved Parkes Weber, for the most part deal with recollections and reminiscences of the last century. At the present time they resemble little snatches of conversation or recollected bits and snatches which might be interjected in talk or into a letter. The subject matter varies from nature and God, coins and
openaire   +1 more source

Chronos: Learning the Language of Time Series

Transactions on Machine Learning Research
We introduce Chronos, a simple yet effective framework for pretrained probabilistic time series models. Chronos tokenizes time series values using scaling and quantization into a fixed vocabulary and trains existing transformer-based language model ...
Abdul Fatir Ansari   +16 more
semanticscholar   +1 more source
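
The Chronos abstract above sketches a concrete mechanism: real-valued observations are scaled, quantized into a fixed vocabulary, and then handled by a standard transformer language model. The Python sketch below illustrates one plausible reading of that tokenization step (mean scaling followed by uniform binning); the function names, bin count, and clipping range are illustrative assumptions, not the paper's actual API.

import numpy as np

def tokenize_series(values, n_bins=4096, clip=15.0):
    # Illustrative scaling-and-quantization tokenizer (assumed parameters).
    # 1) Scale by the mean absolute value of the series.
    # 2) Clip the scaled values to a bounded range.
    # 3) Map each value to one of n_bins uniform bins -> integer token IDs.
    values = np.asarray(values, dtype=float)
    scale = float(np.mean(np.abs(values))) or 1.0   # guard against an all-zero series
    scaled = np.clip(values / scale, -clip, clip)
    edges = np.linspace(-clip, clip, n_bins - 1)    # uniform bin boundaries
    return np.digitize(scaled, edges), scale        # token IDs in [0, n_bins - 1]

def detokenize(tokens, scale, n_bins=4096, clip=15.0):
    # Map token IDs back to approximate real values (bin centers), undoing the scaling.
    centers = np.linspace(-clip, clip, n_bins)
    return centers[np.asarray(tokens)] * scale

In this reading, forecasting reduces to autoregressive next-token prediction over the integer IDs, with detokenization recovering numeric forecasts from sampled tokens.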

Multipartition Series

SIAM Journal on Discrete Mathematics, 1996
Given that in a multitude of situations polynomial and power-series invariants of structures which enumerate ``combinatorial events'' on these structures exhibit similar abstract properties, it becomes an interesting and important objective to develop all-encompassing simplest ``theories'' to capture general characteristics common to these invariants ...
openaire   +2 more sources

Monotone Series

American Journal of Mathematics, 1946
Not ...
openaire   +1 more source

Unified Training of Universal Time Series Forecasting Transformers

International Conference on Machine Learning
Deep learning for time series forecasting has traditionally operated within a one-model-per-dataset framework, limiting its potential to leverage the game-changing impact of large pre-trained models.
Gerald Woo   +5 more
semanticscholar   +1 more source

TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting

International Conference on Learning Representations
Time series forecasting is widely used in extensive applications, such as traffic planning and weather forecasting. However, real-world time series usually present intricate temporal variations, making forecasting extremely challenging.
Shiyu Wang   +7 more
semanticscholar   +1 more source

MOMENT: A Family of Open Time-series Foundation Models

International Conference on Machine Learning
We introduce MOMENT, a family of open-source foundation models for general-purpose time series analysis. Pre-training large models on time series data is challenging due to (1) the absence of a large and cohesive public time series repository, and (2 ...
Mononito Goswami   +5 more
semanticscholar   +1 more source

Series, Taylor — Maclaurin Series

1976
By a series we mean a set of numbers a1, a2, a3… such that we have a rule for calculating a2, a3 etc. from the first number a1. Series occur in many problems in chemistry such as specific heats of solids, the theory of black-body radiation, solution of the Schrödinger equation, statistical thermodynamics and Fourier series in X-ray crystallography.
openaire   +1 more source
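
The abstract above is describing power-series expansions; for context, the Taylor and Maclaurin series named in the title take the standard forms below, supplied here from standard calculus rather than quoted from the chapter.

\[
f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x-a)^n,
\qquad
f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!}\,x^n \quad \text{(Maclaurin, the case } a = 0\text{)}.
\]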

Poincaré series

Lithuanian Mathematical Journal, 1985
Let \(G\) be a subgroup of finite index of the full modular group \(\Gamma\), \(G_0\) be the subgroup of linear translations lying in \(G\). Denoting a generator of the group \(G_0\) by \(T^q=\begin{pmatrix} 1 & q \\ 0 & 1\end{pmatrix}\) and letting \(\mathcal K\) be a set of representatives of left cosets of \(G\) modulo \(G_0\), in the upper complex ...
openaire   +1 more source
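
The truncated abstract above only sets up the notation (\(G\), \(G_0\), the generator \(T^q\), and the coset representatives \(\mathcal K\)); under the standard conventions for a Poincaré series of weight \(k\) and index \(m\), the object being defined takes the form below. This display is supplied from the classical definition for orientation and is not quoted from the article.

\[
P_m(z) \;=\; \sum_{M \in \mathcal K} (cz+d)^{-k}\, e^{2\pi i m M z / q},
\qquad
M = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \in \mathcal K,
\]

where \(Mz = (az+b)/(cz+d)\) is the Möbius action on the upper half-plane and the sum converges for sufficiently large weight \(k\).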
