Transformational grammars
A shortened version from: Anastasia Berdnikova & Denis Miretskiy
Introduction

'Colourless green ideas sleep furiously.' Chomsky constructed finite formal machines, 'grammars'. The question 'Does the language contain this sentence?' is intractable, but 'Can the grammar create this sentence?' can be answered. Transformational grammars (TG) are therefore sometimes called generative grammars.
Definition

TG = ( {symbols}, {rewriting rules α → β, called productions} )
{symbols} = {nonterminals} ∪ {terminals}
α contains at least one nonterminal; β consists of terminals and/or nonterminals.
Example: S → aS, S → bS, S → e (abbreviated S → aS | bS | e).
Derivation: S => aS => abS => abbS => abb.
Parse tree: the root is the start nonterminal S, the leaves are the terminal symbols of the sequence, and the internal nodes are nonterminals. The children of an internal node are given by the production applied to it.
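Productions like these can be run directly as a string-rewriting process. The following Python sketch (illustrative only; the representation and the random choice of rules are my own, not from the slides) derives a terminal string from S → aS | bS | e by repeatedly rewriting the leftmost nonterminal:

```python
import random

# The example grammar S -> aS | bS | e, with '' standing for the
# empty string e.
productions = {"S": ["aS", "bS", ""]}

def derive(start="S", rng=random.Random(0)):
    """Rewrite the leftmost nonterminal until only terminals remain."""
    s = start
    while any(sym in productions for sym in s):
        for i, sym in enumerate(s):
            if sym in productions:
                s = s[:i] + rng.choice(productions[sym]) + s[i + 1:]
                break
    return s

# Mirrors the derivation S => aS => abS => abbS => abb: every derived
# string consists only of a's and b's.
print(derive())
```

Each call continues the seeded random stream, so repeated calls give different but reproducible strings.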
The Chomsky hierarchy

Notation: W is a nonterminal, a is a terminal; α and γ are strings of nonterminals and/or terminals, including the null string; β is the same, excluding the null string.

regular grammars: W → aW or W → a
context-free grammars: W → β
context-sensitive grammars: α1Wα2 → α1βα2 (for example, AB → BA)
unrestricted (phrase structure) grammars: α1Wα2 → γ
The Chomsky hierarchy

[Figure: the Chomsky hierarchy of grammars.]
Automata

Each grammar has a corresponding abstract computational device, an automaton. Grammars are generative models; automata are parsers that accept or reject a given sequence.
- Automata are often easier to describe and understand than their equivalent grammars.
- Automata give a more concrete idea of how we might recognise a sequence using a formal grammar.
Parser abstractions associated with the hierarchy of grammars

---------------------------------------------------
Grammar                      Parsing automaton
---------------------------------------------------
regular grammars             finite state automaton
context-free grammars        push-down automaton
context-sensitive grammars   linear bounded automaton
unrestricted grammars        Turing machine
---------------------------------------------------
Regular grammars

W → aW or W → a (sometimes W → e is also allowed).
RG generate sequences from left to right (or from right to left: W → Wa or W → a).
RG cannot describe long-range correlations between the terminal symbols (the 'primary sequence').
An odd regular grammar

An example of a regular grammar that generates only strings of a's and b's that have an odd number of a's: start from S, with
S → aT | bS,
T → aS | bT | e.
Finite state automata

An FSA reads one symbol at a time from an input string. If the symbol is accepted, the automaton enters a new state. If the symbol is not accepted, the automaton halts and rejects the string. If the automaton reaches a final 'accepting' state, the input string has been successfully recognised and parsed by the automaton.
The {states, state transitions} of an FSA correspond to the {nonterminals, productions} of the corresponding grammar.
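To make this correspondence concrete, here is a small Python sketch (not from the slides) of the two-state FSA for the odd-a grammar above, with state S meaning an even number of a's seen so far and state T meaning an odd number:

```python
def odd_a(s):
    """FSA for the grammar S -> aT | bS, T -> aS | bT | e.
    State S: even number of a's so far; state T: odd number."""
    state = "S"
    for c in s:
        if c not in "ab":
            return False                      # symbol not accepted: halt and reject
        if c == "a":
            state = "T" if state == "S" else "S"
        # 'b' leaves the state unchanged (S -> bS, T -> bT)
    return state == "T"                       # T is the accepting state (T -> e)

print(odd_a("abb"))   # one a   -> True
print(odd_a("abba"))  # two a's -> False
```

The states play the role of the nonterminals S and T, and each transition corresponds to one production of the grammar.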
What a regular grammar can't do

RG cannot describe a language L when:
- L contains all strings of the form aa, bb, abba, baab, abaaba, etc. (a palindrome language);
- L contains all strings of the form aa, abab, aabaab, etc. (a copy language).
Regular language:    a b a a a b
Palindrome language: a a b b a a
Copy language:       a a b a a b
Palindrome and copy languages have correlations between distant positions.
Context-free grammars

The reason this matters: RNA secondary structure is a kind of palindrome language. Context-free grammars (CFG) permit additional rules that allow the grammar to create nested, long-distance pairwise correlations between terminal symbols.
S → aSa | bSb | aa | bb
S => aSa => aaSaa => aabSbaa => aabaabaa
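As a sketch of what this grammar accepts, the following Python function (an illustration, not part of the slides) tests membership in the language of S → aSa | bSb | aa | bb, i.e. the even-length palindromes over {a, b}:

```python
def in_palindrome_language(s):
    """Membership test for S -> aSa | bSb | aa | bb: even-length
    palindromes over {a, b} of length >= 2."""
    if len(s) == 2:
        return s in ("aa", "bb")              # termination rules S -> aa | bb
    if len(s) < 2 or len(s) % 2:
        return False
    # S -> aSa or S -> bSb: the outer symbols must match and be terminals,
    # and the inner part must itself derive from S.
    return s[0] == s[-1] and s[0] in "ab" and in_palindrome_language(s[1:-1])

print(in_palindrome_language("aabaabaa"))  # the derived example: True
print(in_palindrome_language("abab"))      # not a palindrome: False
```

The recursion peels matched outer symbols off both ends, mirroring how each nesting level of the derivation pairs two distant positions.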
Push-down automata

The parsing automaton for CFGs is called a push-down automaton. A limited number of symbols are kept on a push-down stack. A push-down automaton parses a sequence from left to right according to the following algorithm. The stack is initialised by pushing the start nonterminal onto it. The steps are iterated until no input symbols remain; if the stack is empty at the end, the sequence has been successfully parsed.
Algorithm: Parsing with a push-down automaton

Pop a symbol off the stack.
If the popped symbol is a nonterminal:
- Peek ahead in the input from the current position and choose a valid production for the nonterminal. If there is no valid production, terminate and reject the sequence.
- Push the right side of the chosen production onto the stack, rightmost symbols first.
If the popped symbol is a terminal:
- Compare it to the current symbol of the input. If it matches, move the automaton to the right on the input (the input symbol is accepted). If it does not match, terminate and reject the sequence.
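A minimal Python sketch of this algorithm follows. One deliberate simplification, so this is not exactly the automaton described above: instead of the one-symbol peek ahead, it tries each production in turn and backtracks, which suffices for the small palindrome grammar from the earlier slide (the function and variable names are mine):

```python
def pda_parse(s, productions, start="S"):
    """Push-down automaton parsing, with backtracking in place of the
    one-symbol look-ahead (enough for this small grammar)."""
    def step(stack, pos):
        if not stack:                      # stack empty at the end:
            return pos == len(s)           # success iff all input was consumed
        top, rest = stack[-1], stack[:-1]  # pop a symbol off the stack
        if top in productions:             # popped a nonterminal:
            for rhs in productions[top]:   # try each production in turn
                # push the right side, rightmost symbols first
                if step(rest + list(reversed(rhs)), pos):
                    return True
            return False                   # no valid production: reject
        # popped a terminal: compare it with the current input symbol
        return pos < len(s) and s[pos] == top and step(rest, pos + 1)
    return step([start], 0)

grammar = {"S": ["aSa", "bSb", "aa", "bb"]}  # palindrome CFG from the slides
print(pda_parse("aabaabaa", grammar))  # True
print(pda_parse("aabb", grammar))      # False
```

Because every right side begins with a terminal, each expansion is immediately checked against the input, so the search terminates.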
Context-sensitive grammars

Copy language: cc, acca, agaccaga, etc.

initialisation:          S → CW
nonterminal generation:  W → AÂW | GĜW | C
nonterminal reordering:  ÂG → GÂ, ÂA → AÂ, ĜA → AĜ, ĜG → GĜ
terminal generation:     CA → aC, CG → gC, ÂC → Ca, ĜC → Cg
termination:             CC → cc
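To see context-sensitive rules in action, the following Python sketch (illustrative; the rule positions in the derivation are chosen by hand) replays one derivation of 'acca' in this copy-language grammar:

```python
def apply_rule(seq, lhs, rhs, at):
    """Rewrite lhs -> rhs at position 'at'; the rule must match there.
    Symbols are kept as list items so that Â and Ĝ stay single tokens."""
    assert seq[at:at + len(lhs)] == lhs, "rule does not match here"
    return seq[:at] + rhs + seq[at + len(lhs):]

# One derivation of 'acca' (x cc x with x = 'a'); each step applies one
# rule of the grammar at a hand-picked position.
seq = ["S"]
for lhs, rhs, at in [
    (["S"], ["C", "W"], 0),        # initialisation:        S  -> CW
    (["W"], ["A", "Â", "W"], 1),   # nonterminal generation
    (["W"], ["C"], 3),             # stop generating
    (["Â", "C"], ["C", "a"], 2),   # terminal generation:   ÂC -> Ca
    (["C", "A"], ["a", "C"], 0),   # terminal generation:   CA -> aC
    (["C", "C"], ["c", "c"], 1),   # termination:           CC -> cc
]:
    seq = apply_rule(seq, lhs, rhs, at)
print("".join(seq))  # acca
```

Note how the hatted nonterminal Â carries a copy of the symbol a rightwards past C, which is what lets the grammar reproduce the first half of the string in the second half.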
Linear bounded automaton

The automaton works backwards through all possible derivations until either the start nonterminal is reached or no valid derivation is found. There is a finite number of possible derivations to examine. Abstractly, the automaton is a 'tape' of linear memory with a read/write head. However, the number of possible derivations is exponentially large.
NP problems and 'intractability'

Nondeterministic polynomial (NP) problems: there is no known polynomial-time algorithm for finding a solution, but a proposed solution can be checked for correctness in polynomial time. Context-sensitive grammar parsing is an example.
A subclass of NP problems is the NP-complete problems: a polynomial-time algorithm that solves one NP-complete problem would solve all of them.
Unrestricted grammars and Turing machines

The left and right sides of the production rules can be any combination of symbols. The parsing automaton is a Turing machine. There is no general algorithm for determining whether a string has a valid derivation in less than infinite time.