Results 61 to 70 of about 247,902 (260)
Machine Learning with Squared-Loss Mutual Information
Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to solving various machine learning problems.
Masashi Sugiyama
doaj +1 more source
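Squared-loss mutual information replaces the logarithm in ordinary MI with the squared deviation of the density ratio from one (the Pearson divergence between the joint distribution and the product of its marginals); it vanishes exactly under independence. A minimal sketch for a discrete joint distribution, assuming the standard SMI definition (the snippet above names the quantity but gives no formula):

```python
import numpy as np

def squared_loss_mi(p_xy):
    """Squared-loss MI (Pearson divergence) of a discrete joint distribution.

    SMI = 0.5 * sum_{x,y} p(x)p(y) * (p(x,y) / (p(x)p(y)) - 1)^2
    """
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal p(x)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal p(y)
    p_ind = p_x * p_y                       # product of marginals
    ratio = p_xy / p_ind                    # density ratio p(x,y) / (p(x)p(y))
    return 0.5 * float(np.sum(p_ind * (ratio - 1.0) ** 2))

# Independent variables: the density ratio is 1 everywhere, so SMI is zero.
independent = np.outer([0.5, 0.5], [0.3, 0.7])
print(squared_loss_mi(independent))

# Correlated variables: SMI is strictly positive.
dependent = np.array([[0.4, 0.1],
                      [0.1, 0.4]])
print(squared_loss_mi(dependent))
```

Because the ratio enters only through a square, SMI can be estimated by least-squares density-ratio fitting, which is the practical appeal the abstract alludes to.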
Derivative of BICM mutual information [PDF]
Submitted to IET Electronics Letters.
Fabregas, Albert Guillen I +1 more
openaire +2 more sources
Subtype‐specific enhancer RNAs define transcriptional regulators and prognosis in breast cancers
This study employed machine learning methodologies to perform subtype‐specific classification of RNA‐seq data sets mapped to enhancers from TCGA‐derived breast cancer patients. Their integration with gene expression (referred to as ProxCReAM eRNAs) and chromatin accessibility profiles has the potential to identify lineage‐specific and ...
Aamena Y. Patel +6 more
wiley +1 more source
Improved Neural Networks Based on Mutual Information via Information Geometry
This paper presents a new algorithm based on the theory of mutual information and information geometry, with emphasis on adaptive mutual information estimation and maximum likelihood estimation.
Meng Wang +5 more
doaj +1 more source
Cytarabine is a key therapy for acute myeloid leukaemia (AML), but its efficacy is limited by the dNTPase SAMHD1, which hydrolyses its active metabolite. Screening nucleotide biosynthesis inhibitors revealed that IMPDH inhibitors selectively sensitise SAMHD1‐proficient AML cells to cytarabine.
Miriam Yagüe‐Capilla +9 more
wiley +1 more source
The mutual information of two random variables plays fundamental roles in many areas of applied mathematics. This quantity can be generalized to finite sets of random variables. In contrast to the nonnegativity of the usual mutual information, the triple mutual information can be negative as well as positive and its sign gives us a rough indication of ...
openaire +3 more sources
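A standard example of a negative triple mutual information is the XOR distribution: X and Y are independent fair bits and Z = X xor Y. Using the inclusion-exclusion form I(X;Y;Z) = H(X) + H(Y) + H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z), a quick numerical check (an illustrative sketch; the abstract states the sign fact but not this particular example):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a flattened probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# XOR joint: X and Y are independent fair bits, Z = X xor Y.
p = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        p[x, y, x ^ y] = 0.25

h = entropy
triple_mi = (h(p.sum(axis=(1, 2))) + h(p.sum(axis=(0, 2))) + h(p.sum(axis=(0, 1)))
             - h(p.sum(axis=2)) - h(p.sum(axis=1)) - h(p.sum(axis=0))
             + h(p))
print(triple_mi)  # -1.0: every pair is independent, yet any two bits determine the third
```

The negative sign signals synergy: the three variables jointly carry dependence that no pair exhibits, which is the "rough indication" of interaction structure the abstract mentions.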
Hierarchical clustering using mutual information [PDF]
We present a method for hierarchical clustering of data, the mutual information clustering (MIC) algorithm. It uses mutual information (MI) as a similarity measure and exploits its grouping property: the MI between three objects X, Y, and Z is equal to the sum of the MI between X and Y, plus the MI between Z and the combined object ...
Kraskov, A. +3 more
openaire +4 more sources
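The grouping property quoted in this entry can be verified numerically: writing the MI of a group as the total correlation C(X,Y,Z) = H(X) + H(Y) + H(Z) - H(X,Y,Z), one has C = I(X;Y) + I((X,Y);Z). A sketch on an arbitrary discrete joint distribution (a toy check of the identity, not code from the paper):

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a flattened probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
p = rng.random((2, 3, 2))
p /= p.sum()                        # arbitrary joint distribution p(x, y, z)

p_x, p_y, p_z = p.sum(axis=(1, 2)), p.sum(axis=(0, 2)), p.sum(axis=(0, 1))
p_xy = p.sum(axis=2)                # marginal of the combined object (X, Y)

i_xy = H(p_x) + H(p_y) - H(p_xy)    # MI between X and Y
i_xy_z = H(p_xy) + H(p_z) - H(p)    # MI between Z and the combined object (X, Y)

# Grouping property: the total correlation decomposes additively.
total_correlation = H(p_x) + H(p_y) + H(p_z) - H(p)
print(abs(total_correlation - (i_xy + i_xy_z)))   # ~0 up to rounding
```

This additivity is what lets MIC merge the most similar pair into one object and recurse without re-deriving the similarity measure at each level of the hierarchy.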
We have established a humanized orthotopic patient‐derived xenograft (Hu‐oPDX) mouse model of high‐grade serous ovarian cancer (HGSOC) that recapitulates human tumor–immune interactions. Using combined anti‐PD‐L1/anti‐CD73 immunotherapy, we demonstrate the model's improved biological relevance and enhanced translational value for preclinical ...
Luka Tandaric +10 more
wiley +1 more source
KDM7A and KDM1A inhibition suppresses tumour promoting pathways in prostate cancer
Treatment resistance is a major challenge for patients with advanced prostate cancer. This study examined an alternative approach to target the major prostate cancer‐promoting pathway by targeting epigenetic factors, whose levels are higher in tumours.
Jennie N Jeyapalan +16 more
wiley +1 more source
Dictator functions maximize mutual information [PDF]
Accepted for publication in the Annals of Applied Probability; 8 pages, 1 ...
Georg, Pichler +2 more
openaire +4 more sources