Results 21 to 30 of about 9,200

CycleGAN Variants for Industrial Defect Data Augmentation [PDF]

open access: yes, ITM Web of Conferences
Industrial visual inspection is constrained by scarce labeled defect samples and complex surface patterns in bearings, steel, and ICs, significantly hindering deep learning detection models.
Xu Xiaoyu
doaj   +1 more source

Cycle-consistent adversarial networks improves generalizability of radiomics model in grading meningiomas on external validation

open access: yes, Scientific Reports, 2022
The heterogeneity of MRI is one of the major reasons for decreased performance of a radiomics model on external validation, limiting the model’s generalizability and clinical application.
Yae Won Park   +8 more
doaj   +1 more source

Segmentation-Enhanced CycleGAN [PDF]

open access: yes, 2019
Algorithmic reconstruction of neurons from volume electron microscopy data traditionally requires training machine learning models on dataset-specific ground truth annotations that are expensive and tedious to acquire. We enhanced the training procedure of an unsupervised image-to-image translation method with additional components derived from ...
Michał Januszewski, Viren Jain
openaire   +1 more source

Multi‐head mutual‐attention CycleGAN for unpaired image‐to‐image translation

open access: yes, IET Image Processing, 2020
Image-to-image translation, i.e. translating from a source image domain to a target image domain, has made significant progress in recent years. The most popular method for unpaired image-to-image translation is CycleGAN.
Wei Ji, Jing Guo, Yun Li
doaj   +1 more source
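For orientation, the baseline that these CycleGAN variants share is the standard unpaired objective from the original CycleGAN formulation (a general sketch, not the specific loss of any entry above): two generators G: X → Y and F: Y → X are trained against discriminators D_Y and D_X with adversarial losses plus a cycle-consistency term,

\[
\mathcal{L}(G, F, D_X, D_Y) = \mathcal{L}_{\mathrm{GAN}}(G, D_Y, X, Y) + \mathcal{L}_{\mathrm{GAN}}(F, D_X, Y, X) + \lambda \, \mathcal{L}_{\mathrm{cyc}}(G, F),
\]
\[
\mathcal{L}_{\mathrm{cyc}}(G, F) = \mathbb{E}_{x \sim p_{\mathrm{data}}(x)} \big[ \lVert F(G(x)) - x \rVert_1 \big] + \mathbb{E}_{y \sim p_{\mathrm{data}}(y)} \big[ \lVert G(F(y)) - y \rVert_1 \big],
\]

where λ weights the cycle-consistency term against the two adversarial losses.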

Deteriorated Characters Restoration for Early Japanese Books Using Enhanced CycleGAN

open access: yes, Heritage, 2023
Early Japanese books, classical humanities resources in Japan, have great historical and cultural value. However, Kuzushi-ji, the old characters used in early Japanese books, have been scratched, faded, and lost due to weathering and deterioration over the years.
Hayata Kaneko, Ryuto Ishibashi, Lin Meng
doaj   +1 more source

Unsupervised Single-Generator CycleGAN-Based Pansharpening With Spatial-Spectral Degradation Modeling

open access: yes, IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2023
Supervised pansharpening methods require ground truth, which is generally unavailable. Therefore, unsupervised pansharpening methods have gained popularity.
Wenxiu Diao   +3 more
doaj   +1 more source

Research on the Style of Art Works based on Deep Learning

open access: yes, Journal of Advanced Transportation, 2022
In view of the unsatisfactory results and major limitations of style transfer for artworks, this paper takes Chinese ink painting as its research subject.
Shulin Liu
doaj   +1 more source

Borrow from Anywhere: Pseudo Multi-modal Object Detection in Thermal Imagery

open access: yes, 2020
Can we improve detection in the thermal domain by borrowing features from rich domains like visual RGB? In this paper, we propose a pseudo-multimodal object detector trained on natural image domain data to help improve the performance of object detection ...
Akolekar, Ninad   +3 more
core   +1 more source

Comprehensive evaluation of similarity between synthetic and real CT images for nasopharyngeal carcinoma

open access: yes, Radiation Oncology, 2023
Background: Although magnetic resonance imaging (MRI)-to-computed tomography (CT) synthesis studies based on deep learning have significantly progressed, the similarity between synthetic CT (sCT) and real CT (rCT) has only been evaluated in image quality ...
Siqi Yuan   +5 more
doaj   +1 more source

Imaging Study of Pseudo-CT Synthesized From Cone-Beam CT Based on 3D CycleGAN in Radiotherapy

open access: yes, Frontiers in Oncology, 2021
Purpose: To propose a synthesis method for pseudo-CT (CTCycleGAN) images based on an improved 3D cycle generative adversarial network (CycleGAN) to address the limitations of cone-beam CT (CBCT), which cannot be directly applied to the correction of ...
Hongfei Sun   +14 more
doaj   +1 more source
