Results 131 to 140 of about 480,764
Some of the following articles may not be open access.
Journal of Community Psychology, 1979
In this study, the perceptions of unilateral terminators, "dropouts," of their visit to a community mental health center were examined. A telephone survey was conducted to assess client satisfaction, impressions of the setting and therapist, helpfulness of the visit, expectations of and reported services received, and source and degree of problem(s ...
W. H. Silverman, R. P. Beech
openaire +2 more sources
Remedial and Special Education, 1989
With the goal of achieving a better understanding of the nature of the dropout problem for students in special education, this review focuses first on the literature related to the dropout phenomenon in general education. The issues addressed are (a) consequences of dropping out, (b) definitions of dropouts and calculations of dropout rates, (c ...
Clara Wolman +2 more
openaire +1 more source
IEEE Transactions on Neural Networks and Learning Systems, 2023
Optimization algorithms are of great importance for training a deep neural network efficiently and effectively. However, existing optimization algorithms show unsatisfactory convergence behavior, either converging slowly or failing to avoid poor local optima.
Huangxing Lin +5 more
openaire +2 more sources
Journal of Psychosomatic Research, 1980
Of 90 couples entering home hemodialysis training for the treatment of end-stage renal disease, 12 patients withdrew during home training or the first year of home treatment to transfer to center dialysis. Four patients, all men over 60, withdrew from training due to inability to learn the procedures.
M. R. Lowry, E. Atcherson
openaire +2 more sources
Canadian Psychiatric Association Journal, 1973
One of the most troubling aspects of psychiatric practice is that a sizeable proportion of patients who ask for help do not keep their first scheduled appointment.
C. M. Rosenberg, A. E. Raynes
openaire +2 more sources
Neural Networks, 2018
Training a deep neural network with a large number of parameters often leads to overfitting. Recently, Dropout has been introduced as a simple yet effective regularization approach to combat overfitting in such models. Although Dropout has shown remarkable results on many deep neural networks, its actual effect on CNNs has not been ...
Alvin Poernomo, Dae-Ki Kang
openaire +2 more sources
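For readers skimming this entry, here is a minimal sketch of the standard (inverted) Dropout operation that the abstract above refers to, written in plain NumPy. The function name, rates, and toy shapes are illustrative assumptions, not taken from the Poernomo and Kang paper.

```python
import numpy as np

def dropout_forward(x, p_drop=0.5, training=True, rng=None):
    """Inverted dropout: randomly zero activations during training and
    rescale the survivors by 1/(1 - p_drop) so the expected activation
    matches the unmodified value used at inference time."""
    if not training or p_drop == 0.0:
        return x  # no-op at inference
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p_drop          # keep each unit with prob (1 - p_drop)
    return x * mask / (1.0 - p_drop)

# Toy usage: a hidden-layer activation of shape (batch, units).
h = np.ones((4, 8))
h_train = dropout_forward(h, p_drop=0.5, training=True)   # roughly half the units zeroed
h_eval  = dropout_forward(h, p_drop=0.5, training=False)  # unchanged
```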

