Results 241 to 250 of about 1,549,240 (292)
Some of the following articles may not be open access.

Knowledge Verification From Data

IEEE Transactions on Neural Networks and Learning Systems
Knowledge verification is an important task in the quality management of knowledge graphs (KGs). Knowledge is a summary of facts and events based on human cognition and experience. Due to the nature of knowledge, most knowledge quality (KQ) management methods are designed by human experts or based on the characteristics of existing knowledge, which may be ...
Xiangyu Wang et al.
openaire   +2 more sources

DATA COLLECTION, PROCESSING, VALIDATION, AND VERIFICATION

Health Physics, 2008
The collection, processing, validation, verification, formatting, filing, and storage of the required input data are some of the most important components in the National Institute for Occupational Safety and Health (NIOSH) Radiation Dose Reconstruction Program.
Deborah L. Martin et al.
openaire   +2 more sources
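
The NIOSH entry lists collection, processing, validation, verification, formatting, filing, and storage as pipeline stages. A minimal sketch of the validation stage, with purely illustrative field names and ranges (not NIOSH's actual schema):

```python
# Illustrative record validation; field names and bounds are assumptions,
# not the NIOSH program's actual data dictionary.

def validate_record(record):
    """Return a list of validation errors for one input record."""
    errors = []
    if not record.get("worker_id"):
        errors.append("missing worker_id")
    year = record.get("exposure_year")
    if not isinstance(year, int) or not 1940 <= year <= 2025:
        errors.append("exposure_year out of range")
    dose = record.get("recorded_dose_mSv")
    if not isinstance(dose, (int, float)) or dose < 0:
        errors.append("recorded_dose_mSv must be non-negative")
    return errors

records = [
    {"worker_id": "W001", "exposure_year": 1962, "recorded_dose_mSv": 1.4},
    {"worker_id": "", "exposure_year": 1962, "recorded_dose_mSv": -2.0},
]
reports = {r["worker_id"] or "<blank>": validate_record(r) for r in records}
```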

Differential Data Quality Verification on Partitioned Data

2019 IEEE 35th International Conference on Data Engineering (ICDE), 2019
Modern companies and institutions rely on data to guide every single decision. Missing or incorrect information seriously compromises any decision process. In previous work, we presented Deequ, a Spark-based library for automating the verification of data quality at scale. Deequ provides a declarative API, which combines common quality constraints with ...
Sebastian Schelter et al.
openaire   +1 more source
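
The declarative style the Deequ abstract describes can be sketched in plain Python; this is only an illustration of the idea, not Deequ's actual Scala/Spark API:

```python
# A toy declarative quality-check runner: constraints are named predicates
# over a table (here, a list of dicts), evaluated in one pass per check.

def is_complete(column):
    return lambda rows: all(r.get(column) is not None for r in rows)

def is_unique(column):
    def check(rows):
        vals = [r.get(column) for r in rows]
        return len(vals) == len(set(vals))
    return check

def has_min(column, threshold):
    return lambda rows: all(r[column] >= threshold for r in rows if column in r)

def run_checks(rows, checks):
    """Evaluate named constraints and report pass/fail per constraint."""
    return {name: check(rows) for name, check in checks.items()}

rows = [
    {"id": 1, "price": 9.5},
    {"id": 2, "price": 12.0},
    {"id": 2, "price": -1.0},  # duplicate id, negative price
]
result = run_checks(rows, {
    "id is complete": is_complete("id"),
    "id is unique": is_unique("id"),
    "price >= 0": has_min("price", 0.0),
})
```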

Data Stream Verification

2015
The problem is concerned with the following setting. A computationally limited client wants to compute some property of a massive input, but lacks the resources to store even a small fraction of the input, and hence cannot perform the desired computation locally.
openaire   +1 more source
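
One standard tool in this streaming setting, which the abstract's space-limited client motivates, is a constant-memory multiset fingerprint: evaluate the polynomial prod(r - x_i) mod p at a random point r while streaming, then compare fingerprints instead of stored inputs. A minimal sketch:

```python
# Constant-memory multiset fingerprint over a prime field: two streams with
# the same multiset of elements always agree; different multisets collide
# only with probability about n/p.
import random

P = (1 << 61) - 1  # a Mersenne prime modulus

def fingerprint(stream, r):
    fp = 1
    for x in stream:
        fp = fp * ((r - x) % P) % P
    return fp

random.seed(0)
r = random.randrange(1, P)
a = [3, 1, 4, 1, 5]
b = [1, 1, 3, 4, 5]   # same multiset, different order
c = [1, 2, 3, 4, 5]   # different multiset
same = fingerprint(a, r) == fingerprint(b, r)   # equal fingerprints
diff = fingerprint(a, r) == fingerprint(c, r)   # almost surely unequal
```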

Type-based data structure verification

ACM SIGPLAN Notices, 2009
We present a refinement type-based approach for the static verification of complex data structure invariants. Our approach is based on the observation that complex data structures are typically fashioned from two elements: recursion (e.g., lists and trees), and maps (e.g., arrays and hash tables).
Ming Kawaguchi et al.
openaire   +1 more source
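
The paper checks such invariants statically with refinement types; as a plain runtime analogue of the recursion-plus-ordering invariants it targets, here is a check of the binary-search-tree property (illustrative code, not the paper's system):

```python
# Runtime check of a recursive data-structure invariant: every key in a BST
# must lie strictly within the bounds implied by its ancestors.

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def is_bst(node, lo=float("-inf"), hi=float("inf")):
    if node is None:
        return True
    return (lo < node.key < hi
            and is_bst(node.left, lo, node.key)
            and is_bst(node.right, node.key, hi))

good = Node(5, Node(2, Node(1), Node(3)), Node(8))
bad = Node(5, Node(2, Node(1), Node(7)), Node(8))  # 7 > 5 in the left subtree
```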

source data verification (SDV)

2009
Also s.d. validation; procedures to ensure that data contained in the case record form (CRF) and later in the final report match original observations; these procedures (audit, inspection, quality control) may apply to raw data, hard copies, electronic CRFs, computer printouts, statistical analyses, tables etc.; s.d.v. should be carried out on key data
openaire   +1 more source
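
The procedure this glossary entry describes, comparing CRF entries against the original source observations on key data, can be sketched as a field-by-field diff; the field names below are hypothetical:

```python
# Illustrative source data verification (SDV): report every key field whose
# CRF value disagrees with the original source record.

KEY_FIELDS = ["subject_id", "visit_date", "systolic_bp"]

def sdv(source, crf, key_fields=KEY_FIELDS):
    """Return {field: (source_value, crf_value)} for every mismatch."""
    return {f: (source.get(f), crf.get(f))
            for f in key_fields
            if source.get(f) != crf.get(f)}

source = {"subject_id": "S-014", "visit_date": "2008-03-02", "systolic_bp": 128}
crf    = {"subject_id": "S-014", "visit_date": "2008-03-02", "systolic_bp": 182}
discrepancies = sdv(source, crf)  # a transposed-digit transcription error
```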

Integrity Verification of Cloud Data

2013
This paper proposes a new probabilistic scheme for verifying data integrity when the number of documents containing inserted pseudo-tuples becomes large. In this scheme, pseudo-tuples inserted at different granularities are signed, and the user constructs a new type of data structure.
Fan-xin Kong, Li Liu
openaire   +1 more source
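
The pseudo-tuple idea can be sketched as follows: the owner plants a few signed fake tuples among the real data, and an auditor who knows the secret recomputes each pseudo-tuple's tag to detect deletion or tampering probabilistically. HMAC stands in for the signatures here; the layout is illustrative, not the paper's exact construction:

```python
# Toy pseudo-tuple audit: real tuples are stored as-is, pseudo-tuples are
# stored with a MAC tag; the audit passes only if every planted pseudo-tuple
# is present with a valid tag.
import hmac, hashlib

SECRET = b"owner-secret-key"

def tag(tuple_bytes):
    return hmac.new(SECRET, tuple_bytes, hashlib.sha256).hexdigest()

def plant(data, pseudo):
    """Store real tuples untagged and pseudo-tuples with their MAC tags."""
    return data + [(p, tag(p)) for p in pseudo]

def audit(stored, pseudo):
    """Check that every planted pseudo-tuple is present with a valid tag."""
    tagged = dict(item for item in stored if isinstance(item, tuple))
    return all(hmac.compare_digest(tagged.get(p, ""), tag(p)) for p in pseudo)

real = [b"row1", b"row2"]
pseudo = [b"fake-a", b"fake-b"]
stored = plant(real, pseudo)
ok = audit(stored, pseudo)
tampered = [s for s in stored
            if not (isinstance(s, tuple) and s[0] == b"fake-a")]
bad = audit(tampered, pseudo)  # a pseudo-tuple was deleted
```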

Consistency Verification using Data Coloring

2010
Today, the number of functional errors escaping design verification and released into final silicon is growing, due to the increasing complexity of microprocessor systems and the shrinking production schedules of their development. The recent, widespread adoption of multi-core processor architectures is exacerbating the problem, due to the variable ...
Ilya Wagner, Valeria Bertacco
openaire   +1 more source

Ground Data Verification

10th Annual International Symposium on Geoscience and Remote Sensing, 2005
P.N. Churchill, R.J. Miller
openaire   +1 more source

Cancer Statistics, 2021

CA: A Cancer Journal for Clinicians, 2021
Rebecca L Siegel, Kimberly D Miller
exaly  
