Results 151 to 160 of about 15,666
Some of the following articles may not be open access.

Medusa: Accelerating Serverless LLM Inference with Materialization

International Conference on Architectural Support for Programming Languages and Operating Systems
Serverless is a promising paradigm for providing scalable, cost-efficient, and easy-to-use model inference services. However, the cold start of model inference functions requires loading models onto the devices, which incurs high latencies and undermines the ...
Shaoxun Zeng   +4 more
semanticscholar   +1 more source

Reliability-Aware Personalized Deployment of Approximate Computation IoT Applications in Serverless Mobile Edge Computing

IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Over the past few years, the integration of mobile edge computing (MEC) and serverless computing, known as serverless MEC (SMEC), has garnered considerable attention.
Kun Cao   +3 more
semanticscholar   +1 more source

λScale: Enabling Fast Scaling for Serverless Large Language Model Inference

arXiv.org
Serverless computing has emerged as a compelling solution for cloud-based model inference. However, as modern large language models (LLMs) continue to grow in size, existing serverless platforms often face substantial model startup overhead. This poses a ...
Minchen Yu   +10 more
semanticscholar   +1 more source

Optimus: Warming Serverless ML Inference via Inter-Function Model Transformation

European Conference on Computer Systems
Serverless ML inference is an emerging cloud computing paradigm for low-cost, easy-to-manage inference services. In serverless ML inference, each call is executed in a container; however, the cold start of containers results in long inference delays ...
Zicong Hong   +6 more
semanticscholar   +1 more source

Blockchain-enabled infrastructural security solution for serverless consortium fog and edge computing

PeerJ Computer Science
The rapid development of blockchain distributed ledgers, the Internet of Things (IoT), and fog computing-enabled connected devices and nodes has changed how we live. As a result, the growing rate of device sales and utilization increases ...
Abdullah Ayub Khan   +6 more
semanticscholar   +1 more source

Going serverless

Communications of the ACM, 2018
Serverless computing lets businesses and application developers focus on the program they need to run, without worrying about the machine on which it runs, or the resources it requires.
openaire   +1 more source

Serverless Cold Starts and Where to Find Them

European Conference on Computer Systems
This paper analyzes a month-long trace of 85 billion user requests and 11.9 million cold starts from Huawei's serverless cloud platform. Our analysis spans workloads from five data centers.
Artjom Joosen   +7 more
semanticscholar   +1 more source

LLMs for Generation of Architectural Components: An Exploratory Empirical Study in the Serverless World

International Conference on Software Architecture
Recently, the exponential growth in the capability and pervasiveness of Large Language Models (LLMs) has led to significant work in the field of code generation. However, this generation has been limited to code snippets.
Shrikara Arun   +2 more
semanticscholar   +1 more source

A Study of Triggering Serverless Functions in Serverless Environment

Tuijin Jishu/Journal of Propulsion Technology, 2023
Serverless computing has gained significant popularity due to its ability to provide scalable and cost-efficient cloud services. However, the adoption of serverless functions comes with its own set of challenges. One critical aspect is the triggering mechanism that initiates the execution of serverless functions in a serverless environment.
openaire   +1 more source

Exploring Serverless Security: Identifying Security Risks and Implementing Best Practices

International Journal of Scientific Research in Engineering and Management
FaaS, or serverless computing, extends existing cloud computing by removing or abstracting the notion of a server and the need to scale up and down on demand.
Ramasankar Molleti, Anirudh Khanna
semanticscholar   +1 more source
