An entropy model for artificial grammar learning
A model is proposed to characterize the type of knowledge acquired in Artificial Grammar Learning (AGL). In particular, Shannon entropy is employed to compute the complexity of different test items in an AGL task, relative to the training items ...
Emmanuel Pothos
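For orientation, the quantity named here is the standard Shannon entropy, $H(X) = -\sum_i p_i \log_2 p_i$. On a hedged reading of this snippet, $p_i$ would be the probability of a symbol or chunk estimated from the training items, so that a test item's entropy reflects how expected its parts are given training; the paper's exact instantiation of the measure is not given in the excerpt, so this mapping is an assumption.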
Individual strategies in artificial grammar learning
Artificial Grammar Learning (AGL) has been used extensively to study theories of learning. We argue that compelling conclusions cannot be forthcoming without an analysis of individual strategies.
Pothos, E. M. et al.
Eye-movements in implicit artificial grammar learning
Artificial grammar learning (AGL) has been probed with forced-choice behavioral tests (active tests). Recent attempts to probe the outcomes of learning (implicitly acquired knowledge) with eye-movement responses (passive tests) have shown null results ...
Folia, Vasiliki et al.
Implicit learning of recursive context-free grammars.
Context-free grammars are fundamental for the description of linguistic syntax. However, most artificial grammar learning experiments have explored learning of simpler finite-state grammars, while studies exploring context-free grammars have not assessed ...
Martin Rohrmeier et al.
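To make the finite-state vs. context-free contrast concrete, a minimal sketch follows (the grammar, symbols, and depth bound are illustrative assumptions, not the paper's stimuli): a single recursive context-free rule, S -> a S b | a b, generates centre-embedded strings of the form a^n b^n, which no finite-state grammar can produce.

```python
# Minimal sketch (assumed toy grammar, not the paper's stimuli):
# sample strings from the recursive context-free rule S -> a S b | a b.
import random

def sample_anbn(depth: int = 0, max_depth: int = 4) -> str:
    """Expand S recursively; the depth bound keeps samples finite."""
    if depth < max_depth and random.random() < 0.5:
        return "a" + sample_anbn(depth + 1, max_depth) + "b"
    return "ab"

if __name__ == "__main__":
    # Every sample has matching counts of 'a' and 'b', e.g. 'ab', 'aabb', 'aaabbb'.
    print(sorted({sample_anbn() for _ in range(20)}))
```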
Assessing serial recall as a measure of artificial grammar learning
Implicit statistical learning is, by definition, learning that occurs without conscious awareness. However, measures that putatively assess implicit statistical learning often require explicit reflection, for example, deciding if a sequence ...
Holly E. Jenkins et al.
Comparing feedforward and recurrent neural network architectures with human behavior in artificial grammar learning
In recent years, artificial neural networks have achieved performance close to or better than that of humans in several domains: tasks that were previously human prerogatives, such as language processing, have seen remarkable improvements in the state of the art ...
Andrea Alamia et al.
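Because this entry hinges on the feedforward/recurrent contrast, a minimal sketch follows (an assumed PyTorch formulation with a hypothetical symbol set and string length, not the authors' models): the feedforward classifier sees a padded string all at once, whereas the recurrent classifier consumes it symbol by symbol through a hidden state.

```python
# Minimal sketch (assumptions: PyTorch, hypothetical vocabulary and length;
# not the authors' models) of two grammaticality classifiers for letter strings.
import torch
import torch.nn as nn

VOCAB = "MTVRX"   # hypothetical artificial-grammar symbol set
MAX_LEN = 8       # hypothetical maximum string length (shorter strings are padded)

def encode(s: str) -> torch.Tensor:
    """One-hot encode a string, zero-padded to MAX_LEN."""
    x = torch.zeros(MAX_LEN, len(VOCAB))
    for i, ch in enumerate(s[:MAX_LEN]):
        x[i, VOCAB.index(ch)] = 1.0
    return x

class Feedforward(nn.Module):
    """Sees the whole flattened string at once: no explicit sequential memory."""
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(MAX_LEN * len(VOCAB), hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)  # one grammaticality score per string

class Recurrent(nn.Module):
    """Processes the string symbol by symbol, carrying a hidden state."""
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.rnn = nn.LSTM(len(VOCAB), hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):
        _, (h, _) = self.rnn(x)
        return self.out(h[-1])  # score from the final hidden state

if __name__ == "__main__":
    # Untrained scores for one well-formed and one scrambled string.
    batch = torch.stack([encode("MTVRX"), encode("XXTVM")])
    print(Feedforward()(batch).squeeze(1), Recurrent()(batch).squeeze(1))
```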
Metrical presentation boosts implicit learning of artificial grammar.
The present study investigated whether a temporal hierarchical structure favors implicit learning. An artificial pitch grammar implemented with a set of tones was presented in two different temporal contexts, notably with either a strongly metrical ...
Tatiana Selchenkova et al.
Fronto-parietal contributions to phonological processes in successful artificial grammar learning
Sensitivity to regularities plays a crucial role in the acquisition of various linguistic features from spoken language input. Artificial grammar (AG) learning paradigms explore pattern recognition abilities in a set of structured sequences (i.e., ...
Dariya Goranskaya et al.
Artificial grammar learning of melody is constrained by melodic inconsistency: Narmour's principles affect melodic learning.
Considerable evidence suggests that people acquire artificial grammars incidentally and implicitly, an indispensable capacity for the acquisition of music or language.
Martin Rohrmeier, Ian Cross
Stimulus variation-based training enhances artificial grammar learning
The current study was designed to explore whether statistical learning ability is affected by the diversity of the stimulus set used in the training phase.
Rachel Schiff et al.

