Results 31 to 40 of about 142,324

A lightweight image classification method based on dual-source adaptive knowledge distillation

open access: yes, Xi'an Gongcheng Daxue xuebao, 2023
A dual-source adaptive knowledge distillation (DSAKD) method is proposed to address feature information loss during the feature alignment process and the lack of consideration for the differences in ...
ZHANG Kaibing, MA Dongtong, MENG Yalei
doaj

Private Model Compression via Knowledge Distillation

open access: yes, 2018
The soaring demand for intelligent mobile applications calls for deploying powerful deep neural networks (DNNs) on mobile devices. However, the outstanding performance of DNNs notoriously relies on increasingly complex models, which in turn is associated ...
Bao, Weidong, et al.
core

Dapagliflozin prevents methylglyoxal‐induced retinal cell death in ARPE‐19 cells

open access: yes, FEBS Open Bio, EarlyView.
Diabetic macular oedema is an ocular complication of diabetes that may lead to permanent blindness. ARPE‐19 is a human retinal cell line used to study retinal diseases and potential therapeutics. Methylglyoxal is a compound elevated in uncontrolled diabetes due to high blood glucose.
Naina Trivedi, et al.
wiley

Counterclockwise block-by-block knowledge distillation for neural network compression

open access: yes, Scientific Reports
Model compression is a technique for transforming large neural network models into smaller ones. Knowledge distillation (KD) is a crucial model compression technique that involves transferring knowledge from a large teacher model to a lightweight student ...
Xiaowei Lan, et al.
doaj
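Several of the distillation entries in this listing describe the same core mechanism: a small student model is trained to mimic a large teacher's temperature-softened output distribution. A minimal NumPy sketch of that classic soft-target loss is shown below; the function names and temperature value are illustrative and are not taken from any of the papers listed here.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between the teacher's and student's softened
    distributions, scaled by T^2 as in the classic soft-target setup."""
    p = softmax(teacher_logits, T)  # teacher "soft targets"
    q = softmax(student_logits, T)  # student predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

In practice this term is typically combined with the ordinary cross-entropy on hard labels, weighted by a mixing hyperparameter.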

Knowledge Distillation in Image Classification: The Impact of Datasets

open access: yes, Computers
As the demand for efficient and lightweight models in image classification grows, knowledge distillation has emerged as a promising technique to transfer expertise from complex teacher models to simpler student models.
Ange Gabriel Belinga, et al.
doaj

Adversarially Robust Distillation

open access: yes, 2019
Knowledge distillation is effective for producing small, high-performance neural networks for classification, but these small networks are vulnerable to adversarial attacks.
Feizi, Soheil, et al.
core

Erythropoietin modulates hepatic inflammation, glucose homeostasis, and soluble epoxide hydrolase and epoxides in high‐fat diet‐induced obese mice

open access: yes, FEBS Open Bio, EarlyView.
Erythropoietin administration suppresses hepatic soluble epoxide hydrolase (sEH) expression, leading to increased CYP‐derived epoxides. This is associated with a shift in hepatic macrophage polarization characterized by reduced M1 markers and increased M2 markers, along with reduced hepatic inflammation, suppressed hepatic lipogenesis, and attenuated ...
Takeshi Goda, et al.
wiley

Electrochemical Evaluation of Compressed Selective Laser Melted AlSi7Mg and AlSi10Mg Alloys in Chloride Environment

open access: yes, Advanced Engineering Materials, EarlyView.
The corrosion performance of AlSi7Mg and AlSi10Mg alloys produced through selective laser melting (SLM) was examined under compressive stress in a chloride environment. Electrochemical analyses, including open‐circuit potential (OCP), cyclic potentiodynamic polarization (CPP), and electrochemical impedance spectroscopy (EIS), were complemented by scanning ...
Femi John Akinfolarin, et al.
wiley

Additive Gaussian Process Regression for Predictive Design of High‐Performance, Printable Silicones

open access: yes, Advanced Engineering Materials, EarlyView.
A chemistry‐aware design framework for tuning printable polydimethylsiloxane (PDMS) for vat photopolymerization (VPP) is developed using additive Gaussian process (GP) modeling. Polymer network mechanics informs variable groupings, feasible formulation constraints, and interaction variables.
Roxana Carbonell, et al.
wiley

Contrastive Learning‐Based Multi‐Level Knowledge Distillation

open access: yes, CAAI Transactions on Intelligence Technology
With the increasing constraints of hardware devices, there is a growing demand for compact models to be deployed on device endpoints. Knowledge distillation, a widely used technique for model compression and knowledge transfer, has gained significant ...
Lin Li, et al.
doaj
