Results 201 to 210 of about 56,430
Some of the following articles may not be open access.
2019 IEEE 13th International Symposium on Applied Computational Intelligence and Informatics (SACI), 2019
BitTorrent is one of the largest sources of data exchange over the Internet. Copyright and legal pressure from authorities systematically closes torrent hosting sites, futilely trying to stop the damage done to the creative industry. As a consequence, centralized portals for searching torrent files are increasingly being replaced by the ...
Robert-George Simion, Mihai-Lica Pura
openaire +2 more sources
Accuracy Crawler: An Accurate Crawler for Deep Web Data Extraction
2018 International Conference on Control, Power, Communication and Computing Technologies (ICCPCCT), 2018
With the daily growth of the data available on the Internet, the size of the deep web is also continuously increasing. The large size of the deep web in comparison with the surface web makes it very difficult to locate deep web resources.
Anshul Khurana, Prafful Mishra
openaire +2 more sources
2018 OCEANS - MTS/IEEE Kobe Techno-Oceans (OTO), 2018
NOMAD is an autonomous benthic crawler carrying scientific instrumentation for scanning a continuous track of the seafloor and performing cyclic oxygen profiles and in-situ measurements of total exchange rates at depths of up to 6000 m. It expands the line of preceding crawlers by achieving the highest payload-to-weight ratio through the application of ...
Johannes Lemburg+4 more
openaire +2 more sources
ChainMR Crawler: A Distributed Vertical Crawler Based on MapReduce
2016
With the explosive growth of data on the Internet, a single vertical crawler cannot meet the performance requirements placed on crawlers. Existing distributed vertical crawlers also offer only weak customization. To solve these problems, this paper proposes a distributed vertical crawler named ChainMR ...
Xixia Liu, Zhengping Jin
openaire +2 more sources
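The MapReduce-style decomposition that a distributed vertical crawler like ChainMR builds on can be illustrated with a toy sketch (not the paper's implementation; the page records and host-keyed frontier below are invented for illustration): the map step emits (host, outlink) pairs from fetched pages, and the reduce step groups and deduplicates them into per-host crawl frontiers.

```python
from collections import defaultdict
from urllib.parse import urlparse

def map_page(page):
    """Map step: emit a (host, url) pair for every outlink on a fetched page."""
    for url in page["links"]:
        yield urlparse(url).netloc, url

def reduce_frontier(pairs):
    """Reduce step: group URLs by host and deduplicate, yielding per-host frontiers."""
    frontier = defaultdict(set)
    for host, url in pairs:
        frontier[host].add(url)
    return {host: sorted(urls) for host, urls in frontier.items()}

# Tiny illustrative crawl batch: two fetched pages with their outlinks.
pages = [
    {"url": "http://a.example/1",
     "links": ["http://a.example/2", "http://b.example/x", "http://a.example/2"]},
    {"url": "http://b.example/x",
     "links": ["http://b.example/y"]},
]
pairs = [kv for page in pages for kv in map_page(page)]
frontier = reduce_frontier(pairs)
```

In a real MapReduce job, the map and reduce functions would run on separate workers and the framework would handle the shuffle; partitioning the frontier by host also makes per-host politeness (rate limiting) straightforward.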
Smartphones are becoming more important in our everyday lives and it is increasingly common to perform critical tasks on these devices, such as making payments. For this reason, ensuring the quality of these applications is an important task. One way to do this is through software testing.
Jorge Ferreira, Ana C. R. Paiva
openaire +1 more source
Crawler Detection: A Bayesian Approach
International Conference on Internet Surveillance and Protection (ICISP06), 2006
In this paper, we introduce a probabilistic modeling approach for addressing the problem of Web robot detection from Web-server access logs. More specifically, we construct a Bayesian network that automatically classifies access-log sessions as crawler- or human-induced, by combining various pieces of evidence proven to characterize crawler and ...
Stassopoulou, Athena+3 more
openaire +3 more sources
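A heavily simplified version of this idea can be sketched as a naive Bayes classifier over binary session features. The feature names and probabilities below are invented for illustration; the paper instead learns a full Bayesian network from labelled access-log sessions.

```python
import math

# Hypothetical prior and per-feature likelihoods P(feature=True | class),
# hand-set for illustration only.
P_CRAWLER = 0.3
LIKELIHOOD = {
    "requests_robots_txt": {"crawler": 0.9, "human": 0.05},
    "no_referrer":         {"crawler": 0.8, "human": 0.2},
    "fetches_images":      {"crawler": 0.1, "human": 0.9},
}

def posterior_crawler(session):
    """Naive-Bayes posterior P(crawler | observed binary session features)."""
    log_c = math.log(P_CRAWLER)
    log_h = math.log(1 - P_CRAWLER)
    for feat, present in session.items():
        pc = LIKELIHOOD[feat]["crawler"]
        ph = LIKELIHOOD[feat]["human"]
        log_c += math.log(pc if present else 1 - pc)
        log_h += math.log(ph if present else 1 - ph)
    # Convert the two log scores into a normalized posterior.
    return 1 / (1 + math.exp(log_h - log_c))

bot_like = {"requests_robots_txt": True, "no_referrer": True, "fetches_images": False}
human_like = {"requests_robots_txt": False, "no_referrer": False, "fetches_images": True}
```

Under these made-up numbers, the bot-like session scores a posterior near 1 and the human-like session near 0; a real deployment would estimate the probabilities from logs and account for dependencies between features, which is what the Bayesian-network formulation adds over naive Bayes.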
Effect of Compliance on Ground Adaptability of Crawler Mobile Robots with Sub-Crawlers
2020 IEEE/SICE International Symposium on System Integration (SII), 2020
There is a demand for remotely controlled mobile robots with high rough-terrain performance on uneven roads whose surface has become rugged due to rubble deposited in the aftermath of a disaster. Crawler-type mobile robots with sub-crawlers (ground-adaptive crawler robots) are known to have a high level of rough-terrain performance ...
Katsuji Ogane+5 more
openaire +2 more sources
OCEANS 2018 MTS/IEEE Charleston, 2018
Mine and un-exploded ordnance removal is an international concern that involves locating the potential threat, classifying it, and containing, disabling or destroying it to prevent any danger or harm. Currently, AUVs are being used to survey the seabed to detect mines; however, the tasks of classifying the mines and of containing, disabling and ...
Stephen J. Wood, Gisela Susanne Bahr
openaire +2 more sources
2020
In this chapter, we will discuss a crawling framework called Scrapy and go through the steps necessary to crawl and upload the web crawl data to an S3 bucket.
openaire +2 more sources
2014 International Conference on Circuits, Systems, Communication and Information Technology Applications (CSCITA), 2014
Web Forums or Internet Forums provide a space for users to share, discuss and request information. Web Forums are sources of huge amounts of structured information that changes rapidly, so crawling them requires special software. A generic deep web crawler or a focused crawler cannot be used for this purpose.
Sreeja S R, Sangita Chaudhari
openaire +2 more sources