Quantitative Treatments for Explaining the Mechanism and Kinetics of Catalytic Electron Transfers in Murburn Processes, Particularly Involving Heme Enzymes Like (Per)oxidases and P450s.
The seminal Michaelis–Menten theorization for biological catalysis was based on “transition state” (TS), involving the formation of a topologically complementary substrate (S) and enzyme (E) complex (ES) at the “active site” of the latter. Rudolph Marcus put forth the theory of outer sphere electron transfer (ET) in a “donor–acceptor” TS complex, which …
Manoj KM et al.
Source: Europe PMC
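For reference, the two quantitative treatments named in this abstract have standard textbook forms (the notation below is the conventional one and is not quoted from the paper): the Michaelis–Menten rate law

    v = \frac{V_{\max}\,[S]}{K_M + [S]}

and the classical Marcus expression for the outer-sphere electron-transfer rate

    k_{ET} = A \exp\!\left(-\frac{(\lambda + \Delta G^{\circ})^{2}}{4 \lambda k_B T}\right),

where \lambda is the reorganization energy, \Delta G^{\circ} the standard free-energy change of the transfer, and A a prefactor containing the electronic coupling.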
Data Stitching for Dynamic Field Monitoring With NMR Probes.
Purpose: To propose a new method for characterizing sequences with higher resolution or readout length than allowed by standard field monitoring approaches. Methods: Our proposed method was devised to characterize entire readout gradients by stitching multiple segment-specific dynamic field measurements obtained across a matched number of …
Zhang J et al.
Source: Europe PMC
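The abstract only sketches the idea of stitching segment-wise field measurements. As a purely illustrative sketch (not the authors' pipeline), the snippet below offset-matches overlapping segments of a 1-D field/phase trace in their shared overlap region and concatenates them; the names stitch_segments and overlap are hypothetical.

```python
import numpy as np

def stitch_segments(segments, overlap):
    """Concatenate consecutive 1-D measurements that share `overlap` samples,
    removing the constant offset between segments. Purely illustrative; real
    field-camera stitching must also handle timing alignment and probe decay,
    which are ignored here."""
    stitched = np.asarray(segments[0], dtype=float)
    for seg in segments[1:]:
        seg = np.asarray(seg, dtype=float)
        # Estimate the constant offset from the shared overlap region.
        offset = stitched[-overlap:].mean() - seg[:overlap].mean()
        # Append only the non-overlapping tail of the new segment.
        stitched = np.concatenate([stitched, seg[overlap:] + offset])
    return stitched

# Toy usage: three noisy segments of a ramp, each with its own offset.
t = np.linspace(0.0, 1.0, 300)
truth = 10.0 * t
segs = [truth[0:120] + 0.3, truth[100:220] - 0.5, truth[200:300] + 1.1]
print(stitch_segments(segs, overlap=20)[:5])
```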
MR sequence design to account for nonideal gradient performance.
Purpose: MRI systems are traditionally engineered to produce close-to-ideal performance, enabling a simplified pulse sequence design philosophy. An example of this is the control of eddy currents produced by gradient fields; usually these are compensated for by pre-emphasizing the demanded waveforms.
West DJ et al.
Source: Europe PMC
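For background on the pre-emphasis mentioned above: eddy-current fields are commonly modeled as the time derivative of the demanded gradient convolved with a sum of decaying exponentials, and pre-emphasis adds the (approximately) inverse terms to the demanded waveform. The generic form is

    e(t) = -\frac{dG_{\mathrm{dem}}}{dt} \ast \sum_k a_k\, e^{-t/\tau_k},
    \qquad
    G_{\mathrm{pre}}(t) \approx G_{\mathrm{dem}}(t) + \frac{dG_{\mathrm{dem}}}{dt} \ast \sum_k b_k\, e^{-t/\tau_k},

where the amplitudes a_k, b_k and time constants \tau_k are system-specific fit parameters; none of these symbols or values come from the paper itself.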
Exact, time-dependent analytical equations for spiral trajectories and matching gradient and density-correction waveforms.
Purpose: To analytically define a spiral waveform and trajectory that match the constraints of gradient frequency, slew rate, and amplitude. Theory and Methods: Piecewise analytical solutions for gradient waveforms under the desired constraints are derived using the involute of a circle rather than an Archimedean spiral.
Krishnamoorthy G, Pipe JG.
Source: Europe PMC
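For reference, the two curve families contrasted in this abstract have simple standard parametrizations (these are the generic definitions, not the paper's constrained waveform solutions):

    Archimedean spiral:  r(\theta) = a\,\theta
    Involute of a circle of radius a:  x(t) = a(\cos t + t \sin t), \quad y(t) = a(\sin t - t \cos t)

Both asymptotically place successive turns about 2\pi a apart.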
DeepZero: Scaling up Zeroth-Order Optimization for Deep Model Training
Zeroth-order (ZO) optimization has become a popular technique for solving machine learning (ML) problems when first-order (FO) information is difficult or impossible to obtain.
Aochuan Chen et al.
Source: Semantic Scholar
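Since the abstract only names zeroth-order optimization, here is a minimal sketch of the standard two-point ZO gradient estimator that such methods build on; it is a generic illustration, not DeepZero's specific machinery.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-3, num_dirs=16, rng=None):
    """Two-point zeroth-order gradient estimate of f at x: averages
    (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u over random directions u,
    using only function evaluations (no autograd)."""
    rng = np.random.default_rng() if rng is None else rng
    grad = np.zeros_like(x)
    for _ in range(num_dirs):
        u = rng.standard_normal(x.shape)
        grad += (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
    return grad / num_dirs

# Toy usage: ZO gradient descent on a quadratic.
f = lambda x: float(np.sum((x - 1.0) ** 2))
x = np.zeros(5)
for _ in range(200):
    x -= 0.05 * zo_gradient(f, x)
print(np.round(x, 2))  # approaches the minimizer [1, 1, 1, 1, 1]
```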
Zeroth-Order Optimization Meets Human Feedback: Provable Learning via Ranking Oracles
In this study, we delve into an emerging optimization challenge involving a black-box objective function that can only be gauged via a ranking oracle, a situation frequently encountered in real-world scenarios, especially when the function is evaluated by …
Zhiwei Tang et al.
Source: Semantic Scholar
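As a toy illustration of optimizing with nothing but a ranking oracle, the sketch below uses a generic best-of-k perturbation scheme; it is not the algorithm analyzed in the paper, and all names here are hypothetical.

```python
import numpy as np

def rank_oracle(f, candidates):
    """Ranking oracle: returns candidate indices sorted from best (lowest f)
    to worst, without ever exposing the function values themselves."""
    return sorted(range(len(candidates)), key=lambda i: f(candidates[i]))

def ranking_search(f, x, steps=300, k=8, sigma=0.1, rng=None):
    """Each round, propose k random perturbations of x and move to whichever
    candidate the oracle ranks best (keeping x if it already ranks first)."""
    rng = np.random.default_rng() if rng is None else rng
    for _ in range(steps):
        cands = [x] + [x + sigma * rng.standard_normal(x.shape) for _ in range(k)]
        best = rank_oracle(f, cands)[0]
        x = cands[best]
    return x

x = ranking_search(lambda z: float(np.sum(z ** 2)), x=np.ones(4) * 3.0)
print(np.round(x, 2))  # drifts toward the minimizer at the origin
```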
Quantifying Over Trees in Monadic Second-Order Logic
Monadic Second-Order Logic (MSO) extends First-Order Logic (FO) with variables ranging over sets and quantifications over those variables. We introduce and study Monadic Tree Logic (MTL), a fragment of MSO interpreted on infinite-tree models, where the ...
M. Benerecetti et al.
Source: Semantic Scholar
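To make the added expressive power concrete, a textbook example of what the set variables buy (independent of the MTL fragment defined in the paper): reachability along an edge relation E is not first-order definable, but in MSO it is

    \mathrm{Reach}(x,y) \;\equiv\; \forall X\, \Big( \big( x \in X \;\land\; \forall z\, \forall z'\, ( z \in X \land E(z,z') \rightarrow z' \in X ) \big) \rightarrow y \in X \Big),

i.e. y lies in every set that contains x and is closed under E.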
Impact of through-slice gradient optimization for dynamic slice-wise shimming in the cervico-thoracic spinal cord.
Purpose: This study investigates the effectiveness of through-slice gradient optimization in dynamic slice-wise B0 shimming of the cervico-thoracic spinal cord to enhance signal recovery in gradient-echo (GRE) EPI sequences commonly used in functional MRI studies. Methods: Six volunteers underwent MRI acquisitions with dynamic shim updating (DSU) …
Breheret A et al.
Source: Europe PMC
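For background on why optimizing a through-slice gradient moment helps: with an idealized rectangular slice profile of thickness \Delta z, a residual linear through-slice field gradient G_z attenuates the gradient-echo signal at echo time TE by a sinc factor,

    S(TE) = S_0 \left| \mathrm{sinc}\!\left( \tfrac{1}{2}\, \gamma\, G_z\, \Delta z\, TE \right) \right|, \qquad \mathrm{sinc}(x) = \frac{\sin x}{x},

and a compensating through-slice gradient moment (a "z-shim") can cancel part of that phase dispersion. This is the standard idealized model, not a result reported in the study.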
On Exact Sampling in the Two-Variable Fragment of First-Order Logic
In this paper, we study the sampling problem for first-order logic proposed recently by Wang et al.: how to efficiently sample a model of a given first-order sentence on a finite domain? We extend their result for the universally quantified subfragment of …
Yuanhong Wang et al.
Source: Semantic Scholar
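To spell out the problem statement with an elementary example of ours (not one from the paper): take the sentence \varphi = \forall x\, \exists y\; E(x,y) over the domain D = \{1,2\}. Its models are exactly the binary relations E \subseteq D \times D in which both elements have at least one outgoing edge; there are 3 \times 3 = 9 such relations out of the 2^4 = 16 possible ones, and an exact sampler must return each of the 9 with probability exactly 1/9.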
Communication-Efficient Stochastic Zeroth-Order Optimization for Federated Learning
Federated learning (FL), as an emerging edge artificial intelligence paradigm, enables many edge devices to collaboratively train a global model without sharing their private data.
Wenzhi Fang et al.
Source: Semantic Scholar
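As a minimal sketch of how zeroth-order estimates slot into federated averaging (a generic illustration; the paper's actual communication-saving scheme, client sampling, and convergence analysis are not reproduced here):

```python
import numpy as np

def client_zo_grad(loss, x, mu, dirs, rng):
    """Local two-point ZO gradient estimate; uses only loss evaluations."""
    g = np.zeros_like(x)
    for _ in range(dirs):
        u = rng.standard_normal(x.shape)
        g += (loss(x + mu * u) - loss(x - mu * u)) / (2.0 * mu) * u
    return g / dirs

def federated_zo(client_losses, x, rounds=100, lr=0.1, mu=1e-3, dirs=8, seed=0):
    """Each round, every client computes a ZO gradient on its own loss and
    the server averages the estimates into one global update."""
    rng = np.random.default_rng(seed)
    for _ in range(rounds):
        grads = [client_zo_grad(loss, x, mu, dirs, rng) for loss in client_losses]
        x = x - lr * np.mean(grads, axis=0)
    return x

# Toy usage: two clients with shifted quadratic losses; the average optimum is 1.5.
clients = [lambda z: float(np.sum((z - 1.0) ** 2)),
           lambda z: float(np.sum((z - 2.0) ** 2))]
print(np.round(federated_zo(clients, np.zeros(3)), 2))
```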

