Results 61 to 70 of about 20,567
Bounds on the Excess Minimum Risk via Generalized Information Divergence Measures
Given finite-dimensional random vectors Y, X, and Z that form a Markov chain in that order (Y→X→Z), we derive upper bounds on the excess minimum risk using generalized information divergence measures.
Ananya Omanwar +2 more
doaj +1 more source
Strength through diversity: how cancers thrive when clones cooperate
Intratumor heterogeneity can offer direct benefits to the tumor through cooperation between different clones. In this review, Kuiken et al. discuss existing evidence for clonal cooperativity to identify overarching principles, and highlight how novel technological developments could address remaining open questions.
Marije C. Kuiken +3 more
wiley +1 more source
Reducing Uncertainty of Weak Supervised Signals via Mutual Information Techniques
Weakly supervised learning (WSL) refers to training models using imperfect or noisy labels, which can significantly reduce the costs associated with manual labeling.
Yichen Liu, Hanlin Feng, Xin Zhang
doaj +1 more source
Monitoring circulating tumor DNA (ctDNA) in patients with operable breast cancer can reveal disease relapse earlier than radiology in a subset of patients. The failure to detect ctDNA in some patients with recurrent disease suggests that ctDNA could serve as a supplement to other monitoring approaches.
Kristin Løge Aanestad +35 more
wiley +1 more source
The global cancer burden is increasing, with projections up to the year 2050 showing unfavourable outcomes in terms of incidence and cancer-related deaths. The main challenges are prevention, improved therapeutics that increase cure rates, and enhanced health-related quality of life.
Ulrik Ringborg +43 more
wiley +1 more source
Decoding MRI-informed brain age using mutual information
Objective: We aimed to develop a standardized method to investigate the relationship between estimated brain age and regional morphometric features, meeting the criteria for simplicity, generalization, and intuitive interpretability.
Jing Li, Linda Chiu Wa Lam, Hanna Lu
doaj +1 more source
Kolmogorov Capacity with Overlap
The notion of δ-mutual information between non-stochastic uncertain variables is introduced as a generalization of Nair’s non-stochastic information functional.
Anshuka Rangi, Massimo Franceschetti
doaj +1 more source
This paper presents a novel approach for efficient feature extraction using mutual information (MI). In MI terms, optimal feature extraction creates a feature set from the data that jointly has the largest dependency on the ...
Farid Oveisi, Abbas Erfanian
doaj +1 more source
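As a rough illustration of the idea behind MI-based feature selection (a plug-in estimate on hypothetical toy data, not the authors' extraction algorithm, which operates on joint dependency of a whole feature set):

```python
from collections import Counter
import math

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in nats for discrete sequences."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # empirical joint counts
    px = Counter(xs)             # empirical marginals
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), with counts rearranged
        mi += (c / n) * math.log(c * n / (px[x] * py[y]))
    return mi

# Rank two toy binary features by their dependence on the label:
labels      = [0, 0, 1, 1, 0, 1, 0, 1]
informative = [0, 0, 1, 1, 0, 1, 0, 1]   # copies the label -> high MI
noisy       = [0, 1, 0, 1, 1, 0, 0, 1]   # independent of it -> zero MI

mi_inf = mutual_information(informative, labels)    # = log 2 ≈ 0.693 nats
mi_noise = mutual_information(noisy, labels)        # = 0.0
```

A feature-selection wrapper would keep the features with the highest MI scores; the joint-dependency criterion in the paper goes further by scoring sets of features together rather than one at a time.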
The effect of anisotropy on holographic entanglement entropy and mutual information
We study the effect of anisotropy on holographic entanglement entropy (HEE) and holographic mutual information (MI) in the Q-lattice model, by exploring the HEE and MI for infinite strips along arbitrary directions.
Peng Liu, Chao Niu, Jian-Pin Wu
doaj +1 more source
LDAcoop: Integrating non‐linear population dynamics into the analysis of clonogenic growth in vitro
Limiting dilution assays (LDAs) quantify clonogenic growth by seeding serial dilutions of cells and scoring wells for colony formation. The fraction of negative wells is plotted against cells seeded and analyzed using the non‐linear modeling of LDAcoop.
Nikko Brix +13 more
wiley +1 more source
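For background, the classical single-hit Poisson relation underlying limiting dilution analysis (the linear baseline that LDAcoop extends with non-linear cooperativity modeling) can be sketched as follows; the function name and toy numbers are illustrative:

```python
import math

# Classical single-hit Poisson model for limiting dilution assays (LDAs):
# the fraction of negative wells is F0 = exp(-f * s), where f is the
# clonogenic fraction and s is the number of cells seeded per well.
def clonogenic_fraction(cells_seeded, frac_negative):
    """Invert the Poisson zero term: f = -ln(F0) / s."""
    return -math.log(frac_negative) / cells_seeded

# With f = 0.1 and 10 cells seeded, exp(-1) ≈ 36.8% of wells stay negative;
# inverting that observed fraction recovers the clonogenic fraction.
est = clonogenic_fraction(10, math.exp(-0.1 * 10))
```

Deviations from this log-linear relationship across dilutions are exactly the non-linear population dynamics (e.g. clonal cooperation) that the LDAcoop analysis is designed to capture.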