Web Crawler Architecture over Cloud Computing compared with Grid Computing [PDF]
Mohamed Elaraby+3 more
openalex +1 more source
Pengembangan Aplikasi Pencari Harga Terbaik Berbasis Selenium Web Crawler [Development of a Best-Price Finder Application Based on a Selenium Web Crawler]
Lina Andriyani+3 more
openalex +2 more sources
A web spider is an automated program or script that independently crawls websites on the internet. Its job is to pinpoint and extract the desired data from each site; the data is then saved in a database and later used for various purposes.
openaire +1 more source
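The spider loop described in the snippet above can be sketched as follows. This is a minimal illustration only, not code from any of the listed papers: it fetches pages with the Python standard library, extracts outgoing links and the page `<title>` (the "desired data" here is an assumption for illustration), and saves results to a SQLite database.

```python
import sqlite3
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin


class PageParser(HTMLParser):
    """Collects outgoing links and the <title> text of one HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def parse_page(html, base_url):
    """Return (title, absolute_links) extracted from one HTML page."""
    parser = PageParser()
    parser.feed(html)
    return parser.title.strip(), [urljoin(base_url, h) for h in parser.links]


def crawl(seed_urls, db_path, max_pages=10):
    """Breadth-first spider: fetch a page, extract data, store it, follow links."""
    db = sqlite3.connect(db_path)
    db.execute("CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, title TEXT)")
    frontier, seen = list(seed_urls), set()
    while frontier and len(seen) < max_pages:
        url = frontier.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip unreachable pages
        title, links = parse_page(html, url)
        db.execute("INSERT OR REPLACE INTO pages VALUES (?, ?)", (url, title))
        frontier.extend(link for link in links if link not in seen)
    db.commit()
    db.close()
```

A production crawler would add politeness (robots.txt, rate limiting) and deduplication beyond this sketch; the parse/fetch/store separation shown here is the common structure.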
A Risk Manager for Intrusion Tolerant Systems: Enhancing HAL 9000 With New Scoring and Data Sources
ABSTRACT Background Intrusion Tolerant Systems (ITS) aim to maintain system security despite adversarial presence by limiting the impact of successful attacks. Current ITS risk managers rely heavily on public databases like NVD and Exploit DB, which suffer from long delays in vulnerability evaluation, reducing system responsiveness. Objective This work ...
Tadeu Freitas+6 more
wiley +1 more source
A user-oriented web crawler for selectively acquiring online content in e-health research. [PDF]
Xu S, Yoon HJ, Tourassi G.
europepmc +1 more source
Abstract Gerber half‐joints, broadly used in the last century as elements of concrete bridges, are prone to corrosion‐induced deterioration, which may lead to brittle shear collapse. It is of paramount importance to develop advanced numerical models for simulating the collapse behavior of Gerber half‐joints, taking material deterioration into account ...
Dario De Domenico+4 more
wiley +1 more source
Analysis and Detection of Bogus Behavior in Web Crawler Measurement
Quan Bai+3 more
openalex +1 more source
Recent advances in embedded technologies and self‐sensing concrete for structural health monitoring
Abstract Fully embedded and spatially diffuse sensors are central to the advancement of civil and construction engineering. Indeed, they serve as an enabling technology necessary for addressing the current challenges associated with through‐life management and structural health monitoring of existing structures and infrastructures. The need to identify
Marco Civera+2 more
wiley +1 more source
Nuevos retos de la tecnología web crawler para la recuperación de información [New Challenges for Web Crawler Technology in Information Retrieval]
Manuel Blázquez Ochando
openalex +2 more sources
Abstract The preservation of the degrading transport infrastructure is vital, as replacing it is impossible in view of limited budgets and environmental impacts. One component in overcoming this challenge is the evaluation of the reliability and the calculation of the remaining service life of existing structures.
Stefan Küttenbaum+5 more
wiley +1 more source