Results 111 to 120 of about 33,931 (272)
Introduction to Multi-Armed Bandits [PDF]
Multi-armed bandits are a simple but very powerful framework for algorithms that make decisions over time under uncertainty. An enormous body of work has accumulated over the years, covered in several books and surveys. This book provides a more introductory, textbook-like treatment of the subject.
openaire +2 more sources
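The framework described in this abstract can be illustrated with a minimal epsilon-greedy simulation (the Bernoulli arm means, epsilon value, and function name below are illustrative assumptions, not taken from the book):

```python
import random

def epsilon_greedy(true_means, steps=10000, eps=0.1, seed=0):
    """Simulate epsilon-greedy on Bernoulli arms: with probability eps pick a
    random arm (explore), otherwise pick the best-looking arm (exploit)."""
    rng = random.Random(seed)
    counts = [0] * len(true_means)    # number of pulls per arm
    values = [0.0] * len(true_means)  # running mean reward per arm
    total = 0.0
    for _ in range(steps):
        if rng.random() < eps:
            arm = rng.randrange(len(true_means))                        # explore
        else:
            arm = max(range(len(true_means)), key=lambda a: values[a])  # exploit
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
        total += reward
    return counts, total

counts, total = epsilon_greedy([0.2, 0.5, 0.8])
```

After enough steps the pull counts concentrate on the arm with the highest true mean, which is the exploration/exploitation trade-off the abstract refers to.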
Enhancing Analytic Hierarchy Process Modelling Under Uncertainty With Fine‐Tuning LLM
ABSTRACT Given that decision‐making typically encompasses stages such as problem recognition, the generation of alternatives, and the selection of the optimal choice, Large Language Models (LLMs) are progressively being integrated into tasks requiring the enumeration and comparative evaluation of alternatives, thereby promoting more rational decision ...
Haeun Park +3 more
wiley +1 more source
Combining Cmab with Matrix Factorization and Clustering For Enhanced Movie Recommendations [PDF]
The contextual multi-armed bandit (CMAB) algorithm faces problems of a large-scale, sparse data matrix and many arms. This paper proposes a method that combines matrix factorization with a clustering algorithm to enhance the performance of the CMAB ...
Xiong Shuyue
doaj +1 more source
ABSTRACT This article examines Venezuelan irregularized transits in South America, focusing on the dynamics of mobility and control that shape the southern corridor—a transnational space linking the Andean Region (Venezuela, Colombia, Ecuador, Peru, and Bolivia) to the Southern Cone, particularly Chile.
Soledad Álvarez Velasco +1 more
wiley +1 more source
Scaling Multi-Armed Bandit Algorithms
The Multi-Armed Bandit (MAB) is a fundamental model capturing the dilemma between exploration and exploitation in sequential decision making. At every time step, the decision maker selects a set of arms and observes a reward from each of the chosen arms. In this paper, we present a variant of the problem, which we call the Scaling MAB (S-MAB): The goal
Fouché, E., Komiyama, J., Böhm, K.
openaire +2 more sources
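The multiple-play protocol described in this abstract (selecting a set of arms per step and observing a reward from each) can be sketched with a generic top-k UCB1 rule; this is not the paper's S-MAB algorithm, just an assumed baseline for illustration:

```python
import math
import random

def multiple_play_ucb(true_means, k=2, steps=5000, seed=0):
    """Each round, play the k arms with the highest UCB1 index and
    observe a Bernoulli reward from each chosen arm."""
    rng = random.Random(seed)
    n = len(true_means)
    counts = [0] * n
    values = [0.0] * n
    t = 0
    for _ in range(steps):
        t += 1

        def index(a):
            # Unplayed arms get +inf so every arm is tried at least once.
            if counts[a] == 0:
                return float("inf")
            return values[a] + math.sqrt(2 * math.log(t) / counts[a])

        chosen = sorted(range(n), key=index, reverse=True)[:k]
        for arm in chosen:
            reward = 1.0 if rng.random() < true_means[arm] else 0.0
            counts[arm] += 1
            values[arm] += (reward - values[arm]) / counts[arm]
    return counts

counts = multiple_play_ucb([0.1, 0.4, 0.6, 0.9], k=2)
```

With a fixed set size k the two best arms end up played most often; the S-MAB variant additionally has to decide how many arms to play, which is the scaling question the paper studies.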
Multi armed bandit based resource allocation in Near Memory Processing architectures
Recent advances in 3D fabrication have allowed handling the memory bottlenecks for modern data-intensive applications by bringing the computation closer to the memory, enabling Near Memory Processing (NMP).
Shubhang Pandey, T.G. Venkatesh
doaj +1 more source
Multi-Armed Bandit Based Traffic Signal Control for Congestion Management [PDF]
Against the background of rapid urbanization and continuous growth of vehicle numbers, traffic congestion has become increasingly prominent, causing serious negative impacts on urban life and economic development.
Bai Ke
doaj +1 more source
Moving beyond diagnostic labels in psychiatry: outcome‐linked treatment modelling
General Psychiatry, Volume 38, Issue 6, December 2025.
Stanley Lyndon
wiley +1 more source
Learning the distribution with largest mean: two bandit frameworks*
Over the past few years, the multi-armed bandit model has become increasingly popular in the machine learning community, partly because of applications including online content optimization. This paper reviews two different sequential learning tasks that
Kaufmann Emilie, Garivier Aurélien
doaj +1 more source
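One of the two tasks this survey concerns, identifying the distribution with the largest mean, can be sketched with the simplest possible strategy, uniform allocation (the sampling budget and function name are assumptions for illustration, not the adaptive algorithms the paper reviews):

```python
import random

def empirical_best_arm(true_means, budget_per_arm=2000, seed=0):
    """Uniform-allocation best-arm identification: sample every Bernoulli arm
    equally often, then return the arm with the largest empirical mean."""
    rng = random.Random(seed)
    means = []
    for p in true_means:
        hits = sum(1 for _ in range(budget_per_arm) if rng.random() < p)
        means.append(hits / budget_per_arm)
    return max(range(len(true_means)), key=lambda a: means[a])

best = empirical_best_arm([0.3, 0.7, 0.5])
```

Adaptive strategies (e.g. successive elimination) improve on this by spending fewer samples on clearly suboptimal arms, which is the kind of refinement the surveyed frameworks formalize.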
Bandit Algorithms for Advertising Optimization: A Comparative Study [PDF]
In recent years, the rapid development of digital advertising has challenged advertisers to make optimal choices among multiple options quickly. This is crucial for increasing user engagement and return on investment.
Tian Ziyue
doaj +1 more source