Results 131 to 140 of about 226,203 (262)

Modulating Two‐Photon Absorption in a Pyrene‐Based MOF Series: An In‐Depth Investigation of Structure–Property Relationships

open access: yes · Advanced Functional Materials, EarlyView.
This study investigates H4TBAPy‐based metal–organic frameworks (MOFs), namely NU‐1000, NU‐901, SrTBAPy, and BaTBAPy, for multiphoton absorption (MPA) performance. It observes topology‐dependent variations in the 2PA cross‐section, with BaTBAPy exhibiting the highest activity.
Simon N. Deger   +10 more
wiley   +1 more source

DC-LoRA: Domain correlation low-rank adaptation for domain incremental learning

open access: yes · High-Confidence Computing
Continual learning, characterized by the sequential acquisition of multiple tasks, has emerged as a prominent challenge in deep learning. During the process of continual learning, deep neural networks experience a phenomenon known as catastrophic ...
Lin Li   +4 more
doaj   +1 more source
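The DC-LoRA entry above builds on low-rank adaptation; the snippet does not describe the paper's domain-correlation mechanism, but a minimal sketch of the generic LoRA idea it extends could look like the following (the `LoRALinear` class, rank `r`, and scaling `alpha` are illustrative assumptions, not the authors' code):

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer with a trainable low-rank update (generic LoRA sketch)."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False                     # freeze pre-trained weights
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scale = alpha / r

    def forward(self, x):
        # y = W x + scale * (B A) x; only A and B receive gradients
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scale

# Usage: wrap an existing pre-trained projection and fine-tune only the low-rank factors.
adapted = LoRALinear(nn.Linear(768, 768), r=8)
```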

Laser‐Induced Graphene from Waste Almond Shells

open access: yes · Advanced Functional Materials, EarlyView.
Almond shells, an abundant agricultural by‐product, are repurposed to create a fully bioderived almond shell/chitosan composite (ASC) degradable in soil. ASC is converted into laser‐induced graphene (LIG) by laser scribing and proposed as a substrate for transient electronics.
Yulia Steksova   +9 more
wiley   +1 more source

Efficient Chinese-Malay Speech-Text Translation via Layer-Freezing Adaptation of Multimodal Foundation Models

open access: yes · IEEE Access
This paper addresses the challenge of Chinese-Malay speech-to-text translation (S2TT), a crucial yet under-resourced language pair in computational linguistics. We introduce Layer-Freezing Adaptive Fine-Tuning (LFAFT), a parameter-efficient strategy that
Xiao Liang   +4 more
doaj   +1 more source
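The LFAFT entry above names layer freezing as its parameter-efficient strategy; the snippet does not say how layers are selected, so the sketch below only shows the generic mechanism of freezing all but the top layers of a pre-trained stack (the `freeze_lower_layers` helper and the toy encoder are assumptions for illustration):

```python
import torch.nn as nn

def freeze_lower_layers(layers: nn.ModuleList, n_trainable: int) -> None:
    """Freeze every layer except the top `n_trainable` ones."""
    cutoff = len(layers) - n_trainable
    for i, layer in enumerate(layers):
        trainable = i >= cutoff
        for p in layer.parameters():
            p.requires_grad = trainable

# Usage: with a 12-layer encoder, only the top 2 layers receive gradient updates.
encoder = nn.ModuleList(
    [nn.TransformerEncoderLayer(d_model=256, nhead=4) for _ in range(12)]
)
freeze_lower_layers(encoder, n_trainable=2)
trainable_params = [p for layer in encoder for p in layer.parameters() if p.requires_grad]
```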

AdapterGNN: Parameter-Efficient Fine-Tuning Improves Generalization in GNNs

open access: yes · Proceedings of the AAAI Conference on Artificial Intelligence
Fine-tuning pre-trained models has recently yielded remarkable performance gains in graph neural networks (GNNs). In addition to pre-training techniques, inspired by the latest work in the natural language fields, more recent work has shifted towards applying effective fine-tuning approaches, such as parameter-efficient fine-tuning (PEFT).
Shengrui Li, Xueting Han, Jing Bai
openaire   +2 more sources
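The AdapterGNN entry above applies parameter-efficient fine-tuning to GNNs; the snippet does not give the adapter architecture or placement, so the following is only a generic bottleneck-adapter sketch with a frozen host layer (class names and the bottleneck width are illustrative assumptions):

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Small residual bottleneck trained while the host layer stays frozen."""
    def __init__(self, dim: int, bottleneck: int = 16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)
        nn.init.zeros_(self.up.weight)          # start as an identity mapping
        nn.init.zeros_(self.up.bias)

    def forward(self, h):
        return h + self.up(torch.relu(self.down(h)))

class AdaptedLayer(nn.Module):
    """Frozen pre-trained layer followed by a trainable adapter."""
    def __init__(self, pretrained_layer: nn.Module, dim: int):
        super().__init__()
        self.layer = pretrained_layer
        for p in self.layer.parameters():
            p.requires_grad = False
        self.adapter = BottleneckAdapter(dim)

    def forward(self, h, *args, **kwargs):
        return self.adapter(self.layer(h, *args, **kwargs))

# Usage: wrap one frozen message-passing or projection layer of a pre-trained GNN.
wrapped = AdaptedLayer(nn.Linear(64, 64), dim=64)
```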

Substrate Stress Relaxation Regulates Cell‐Mediated Assembly of Extracellular Matrix

open access: yes · Advanced Functional Materials, EarlyView.
Silicone‐based viscoelastic substrates with tunable stress relaxation reveal how matrix mechanics regulates cellular mechanosensing and cell‐mediated matrix remodelling in the stiff regime. High stress relaxation promotes assembly of fibronectin fibril‐like structures, increased nuclear localization of YAP and formation of β1 integrin‐enriched ...
Jonah L. Voigt   +2 more
wiley   +1 more source

Decision-focused fine-tuning of time series foundation models for dispatchable feeder optimization

open access: yes · Energy and AI
Time series foundation models provide a universal solution for generating forecasts to support optimization problems in energy systems. Those foundation models are typically trained in a prediction-focused manner to maximize forecast quality. In contrast,
Maximilian Beichter   +9 more
doaj   +1 more source
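The entry above contrasts prediction-focused training with decision-focused fine-tuning; the actual dispatchable-feeder optimization is not described in the snippet, so the sketch below only illustrates the general idea with a toy differentiable dispatch cost (the cost terms, `price`, and `penalty` are invented placeholders, not the paper's formulation):

```python
import torch

def prediction_loss(forecast, actual):
    """Prediction-focused objective: plain forecast error (MSE)."""
    return torch.mean((forecast - actual) ** 2)

def decision_loss(forecast, actual, price=0.2, penalty=1.0):
    """Toy decision-focused objective: cost of a dispatch decision taken on the
    forecast, evaluated against the realized demand. The 'decision' here is
    simply to schedule the forecast amount; shortfalls are penalized more
    heavily than over-procurement."""
    scheduled = forecast                                  # stand-in for the dispatch step
    shortfall = torch.clamp(actual - scheduled, min=0.0)
    surplus = torch.clamp(scheduled - actual, min=0.0)
    return torch.mean(penalty * shortfall + price * surplus)

# Decision-focused fine-tuning would backpropagate decision_loss instead of
# prediction_loss, so errors that matter for dispatch cost are weighted accordingly.
```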

Modulating Electrochemical CO2 Reduction Pathways via Interfacial Electric Field

open access: yes · Advanced Functional Materials, EarlyView.
Engineering interfacial electric fields in Cu/ITO electrodes enables precise control of CO2 reduction pathways. Charge transfer from Cu to ITO generates positively charged Cu species that steer selectivity from ethylene toward methane. This work demonstrates how interfacial electric‐field modulation can direct reaction intermediates and transform ...
Mahdi Salehi   +7 more
wiley   +1 more source

CO2 Reduction on Copper‐Nitrogen‐Doped Carbon Catalysts Tuned by Pulsed Potential Electrolysis: Effect of Pulse Potential

open access: yes · Advanced Functional Materials, EarlyView.
This study demonstrates that pulsed potential electrolysis significantly improves CO2 reduction performance on copper‐nitrogen doped carbon electrodes. The formation of cationic copper sites and metallic clusters as a function of applied intermittent potential leads to notable selectivity changes compared to potentiostatic reduction.
Dorottya Hursán   +13 more
wiley   +1 more source

Fine-tuning Large Language Models for Turkish Flutter Code Generation

open access: yes · Sakarya University Journal of Computer and Information Sciences
The rapid advancement of large language models (LLMs) for code generation has largely centered on English programming queries. This paper focuses on a low-resource language scenario, specifically Turkish, in the context of Flutter mobile app development.
Bugra Uluırmak, Rifat Kurban
doaj   +1 more source
