Results 331 to 340 of about 17,207,076 (361)
Some of the following articles may not be open access.
Journal of Visual Communication and Image Representation, 2009
Deformable models have been widely used in image segmentation since the introduction of snakes. Later, the introduction of level-set frameworks to solve the energy-minimization problem associated with the deformable model overcame some limitations of parametric active contours with respect to topological changes by embedding the surface ...
Qi Duan +2 more
openaire +1 more source
Unbundling active functionality
ACM SIGMOD Record, 1998
New application areas and technical innovations demand more and more new functionality from database management systems. However, adding functions to the DBMS as an integral part of it tends to create monoliths that are difficult to design, implement, validate, maintain and adapt.
Stella Gatziu +3 more
openaire +1 more source
Simplified Hardware Implementation of the Softmax Activation Function
International Conference on Modern Circuits and Systems Technologies, 2019
In this paper a simplified hardware implementation of a CNN softmax layer is proposed. Initially the softmax activation function is analyzed in terms of required accuracy and certain optimizations are proposed.
I. Kouretas, Vassilis Paliouras
semanticscholar +1 more source
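For context, the function this entry's hardware design approximates is the standard softmax. The sketch below is a plain NumPy reference implementation with the usual max-subtraction for numerical stability; it is not the simplified circuit proposed in the paper.

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Reference softmax over the last axis.

    Subtracting the maximum logit before exponentiating avoids overflow;
    the result is mathematically unchanged. A hardware-friendly variant
    would approximate exp() and the division, which this sketch does not.
    """
    shifted = logits - np.max(logits, axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=-1, keepdims=True)

print(softmax(np.array([2.0, 1.0, 0.1])))  # ~[0.659 0.242 0.099]
```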
Functionally Active VEGF Fusion Proteins
Protein Expression and Purification, 2001
Angiogenesis is stimulated by vascular endothelial growth factor (VEGF) acting via endothelial cell-specific receptors, such as VEGFR-2, that are overexpressed at the sites of angiogenesis. If VEGF retains activity as a fusion protein with a large N-terminal extension, it would facilitate development of VEGF-based vehicles for receptor-mediated ...
M. V. Backer, J. M. Backer
openaire +2 more sources
LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent Activation Function for Neural Networks
International Conference on Computer Vision and Image Processing, 2019
The activation function in a neural network is one of the important aspects that facilitates deep training by introducing non-linearity into the learning process.
S. K. Roy +3 more
semanticscholar +1 more source
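As a rough illustration of the idea named in the title (the exact formulation should be taken from the paper itself), a linearly scaled hyperbolic tangent is commonly written as f(x) = x * tanh(x). A minimal NumPy sketch:

```python
import numpy as np

def lisht(x: np.ndarray) -> np.ndarray:
    """Linearly scaled hyperbolic tangent, f(x) = x * tanh(x).

    Smooth, non-negative, and non-parametric. This follows the common
    formulation and may differ in detail from the paper's definition.
    """
    return x * np.tanh(x)

x = np.linspace(-3.0, 3.0, 7)
print(lisht(x))  # symmetric, non-negative outputs
```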
Other Activate Functionalities
2018
There are various ways to perform optimization based on model simulation. Two methods were already introduced: the use of the BobyqaOpt block presented in Section 8.2, and the use of batch simulation presented in Section 12.1.
Stephen L. Campbell, Ramine Nikoukhah
openaire +1 more source
Cortical activity and cognitive functioning
Electroencephalography and Clinical Neurophysiology, 1960
The present study was designed to investigate the effects of photically induced EEG disruption on cognitive functioning. Twenty-eight male subjects were selected for study. Fourteen of these had shown EEG activation and 14 had failed to activate during several previous exposures to intermittent photic stimulation.
L. C. Johnson +3 more
openaire +3 more sources
Optimizing nonlinear activation function for convolutional neural networks
Signal, Image and Video Processing, 2021
Munender Varshney, Pravendra Singh
semanticscholar +1 more source
Multivariate activation functions
International Journal of Wavelets, Multiresolution and Information Processing
Activation functions are critical components in neural network architectures, significantly influencing model performance and learning efficiency. While traditional univariate activation functions have been extensively studied and optimized, the exploration of multidimensional (multivariate) activation functions remains relatively nascent.
openaire +1 more source
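To make the univariate/multivariate contrast in this abstract concrete (an illustrative example only, not the construction studied in the article): a standard activation such as ReLU is applied element-wise, so each output depends on one input, whereas a multivariate activation maps a whole vector jointly, so each output can depend on every input. The norm-gated function below is a hypothetical example of the latter.

```python
import numpy as np

# Element-wise (univariate) activation: each output depends on one input.
def relu(x: np.ndarray) -> np.ndarray:
    return np.maximum(x, 0.0)

# A simple multivariate activation: outputs depend on the whole vector.
# Hypothetical illustration, not the article's definition.
def norm_gated(x: np.ndarray) -> np.ndarray:
    """Scale the vector by a saturating function of its overall norm."""
    r = np.linalg.norm(x, axis=-1, keepdims=True)
    return x * np.tanh(r) / (r + 1e-12)

v = np.array([1.0, -2.0, 0.5])
print(relu(v))        # [1.  0.  0.5]
print(norm_gated(v))  # every component depends on the full input vector
```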
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks, 1991
The high degree of complexity of the features associated with a unit in neural networks suggested that introducing fuzziness into the activity of the unit would be appropriate. It is demonstrated that the idea of an imprecise distinction between excitation and inhibition can be handled easily by fuzzy activation functions.
openaire +1 more source

