Results 1 to 10 of about 176,605
LPG–PCFG: An Improved Probabilistic Context-Free Grammar to Hit Low-Probability Passwords [PDF]
With the development of the Internet, information security has attracted more attention. Password-based identity authentication is the first line of defense; however, the password-generation model is widely used in offline password ...
Xiaozhou Guo +4 more
doaj +4 more sources
A stochastic context free grammar based framework for analysis of protein sequences [PDF]
Background In the last decade, there have been many applications of formal language theory in bioinformatics such as RNA structure prediction and detection of patterns in DNA.
Nebel Jean-Christophe, Dyrka Witold
doaj +3 more sources
PCFGs Can Do Better: Inducing Probabilistic Context-Free Grammars with Many Symbols [PDF]
Probabilistic context-free grammars (PCFGs) with neural parameterization have been shown to be effective in unsupervised phrase-structure grammar induction.
Songlin Yang, Yanpeng Zhao, Kewei Tu
semanticscholar +3 more sources
A synchronous context free grammar for time normalization. [PDF]
We present an approach to time normalization (e.g. the day before yesterday ⇒ 2013-04-12) based on a synchronous context free grammar. Synchronous rules map the source language to formally defined operators for manipulating times (FindEnclosed ...
Bethard S.
europepmc +2 more sources
Grammar Compression with Probabilistic Context-Free Grammar [PDF]
We propose a new approach for universal lossless text compression, based on grammar compression. In the literature, a target string T has been compressed as a context-free grammar G in Chomsky normal form satisfying L(G) = {T}. Such a grammar is often called a straight-line program (SLP).
Naganuma, Hiroaki +4 more
openaire +4 more sources
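The snippet above defines an SLP: a CNF grammar whose language is exactly one string, with each nonterminal produced by either a terminal rule or a binary rule over earlier symbols. A minimal sketch of expanding such a grammar back into its string (our illustration with hypothetical nonterminal names, not the paper's algorithm):

```python
def expand(slp, symbol):
    """Expand a straight-line program back into its single string.

    slp maps each nonterminal to either a terminal character (X -> a)
    or a pair of nonterminals (X -> Y Z), as in Chomsky normal form.
    """
    rule = slp[symbol]
    if isinstance(rule, str):      # terminal rule X -> a
        return rule
    left, right = rule             # binary rule X -> Y Z
    return expand(slp, left) + expand(slp, right)

# Hypothetical SLP whose language is exactly {"abab"}:
slp = {"A": "a", "B": "b", "X": ("A", "B"), "S": ("X", "X")}
print(expand(slp, "S"))  # -> abab
```

Because "X" is expanded twice but defined once, the grammar can be much smaller than the string it encodes, which is the basis of grammar compression.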
A Context-Free Grammar Associated with Fibonacci and Lucas Sequences [PDF]
We introduce a context-free grammar G = {s ⟶ s + d, d ⟶ s} to generate Fibonacci and Lucas sequences. By applying the grammar G, we give a grammatical proof of the Binet formula.
Harold Ruilong Yang
doaj +2 more sources
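The grammar in the snippet above can be read as a substitution system: each step rewrites every s as s + d and every d as s. A small sketch (our illustration, not the paper's proof) tracking only symbol counts, whose totals follow the Fibonacci recurrence:

```python
def iterate_grammar(steps, start=(1, 0)):
    """Apply the substitution s -> s + d, d -> s to symbol counts.

    start = (count of s, count of d); default is a single 's'.
    Returns the total symbol count after each step.
    """
    s, d = start
    totals = [s + d]
    for _ in range(steps):
        s, d = s + d, s    # both rules applied simultaneously
        totals.append(s + d)
    return totals

print(iterate_grammar(6))  # -> [1, 2, 3, 5, 8, 13, 21]
```

Each total is the sum of the two preceding totals, since new s-symbols come from every old symbol while new d-symbols come only from old s-symbols; different starting counts yield the Lucas numbers.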
Implicit learning of recursive context-free grammars.
Context-free grammars are fundamental for the description of linguistic syntax. However, most artificial grammar learning experiments have explored learning of simpler finite-state grammars, while studies exploring context-free grammars have not assessed ...
Martin Rohrmeier +2 more
doaj +3 more sources
On the Size Complexity of Non-Returning Context-Free PC Grammar Systems [PDF]
Improving the previously known best bound, we show that any recursively enumerable language can be generated with a non-returning parallel communicating (PC) grammar system having six context-free components.
Erzsébet Csuhaj-Varjú, György Vaszil
doaj +3 more sources
In an attempt to provide a unified theory of grammars, a model is introduced which has two components. The first is a ''grammar form,'' which provides the general structure of the productions in the grammars to be defined. The second is an ''interpretation'', which yields a specific grammar.
A. Cremers, S. Ginsburg
semanticscholar +2 more sources
Cross-Domain Feature Enhancement-Based Password Guessing Method for Small Samples [PDF]
Password guessing is a crucial component of account-protection evaluation and intrusion detection, but its advancement is hampered by its reliance on password data. In password guessing research, there is a conflict between ...
Cheng Liu +7 more
doaj +2 more sources