Context Free Grammars and Languages

Pumping Lemma for CFL's:

Theorem: Let L be any CFL. Then there is a constant n, depending on L, such that if z is in L and |z| ≥ n, then we may write z = uvwxy such that 1) |vx| ≥ 1, 2) |vwx| ≤ n and 3) for all i ≥ 0, u v^i w x^i y is in L.

Proof: Let L be the CFL and let G be a CNF grammar generating L. Let the grammar G have k variables; we can take n = 2^k. Consider any string z in L with |z| ≥ n.

Now the parse tree for z must have a path of length at least k+1. On the longest path of this tree at least one variable must appear more than once. Let P be the longest path in the parse tree of z. There must be two vertices v1, v2 on the path P satisfying the following properties:
1) v1 and v2 have identical labels,
2) vertex v1 is closer to the root and vertex v2 is closer to the leaf.
(Choosing v1 and v2 among the lowest k+1 variable vertices of P also guarantees that the subtree below v1 yields a string of length at most 2^k = n.)

Now take the subtree T1 whose root is v1 and the subtree T2 whose root is v2, and let z1 be the yield of T1 and z2 the yield of T2. Then z1 can be divided into three parts, z1 = z3 z2 z4, such that z3 and z4 cannot both be ε, although one of them may be ε.

We can illustrate z, the parse tree T and the subtrees T1, T2 with an example. Take the CNF grammar

A → BC | a,  B → BA | b,  C → BA

Here k = 3, so n = 2^k = 8. Let z = bbbbbaba, so |z| ≥ n = 8.

On the longest path P of the parse tree the variable A appears twice: vertex v1 (labelled A) is the root of a subtree T1 with yield z1 = bbbba, and below it vertex v2 (also labelled A) is the root of a subtree T2 with yield z2 = a. Writing z1 = z3 z2 z4 gives z3 = bbbb, z2 = a and z4 = ε. Because v1 and v2 carry the same label we have

A ⇒* z3 z2 z4 (that is, A ⇒* z1),
A ⇒* z3 A z4,
A ⇒* z3 z3 A z4 z4,
and in general A ⇒* z3^i A z4^i for every i ≥ 0.

Now the original string z can be divided as follows:

z = b bbbba ba, i.e. z = u z1 y with u = b and y = ba.
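To make the pumping step concrete, here is a minimal sketch (not part of the original notes) that tests membership in L(G) for the example grammar with the CYK algorithm and then checks the pumped strings u v^i w x^i y for a few values of i, using u = b, v = z3 = bbbb, w = z2 = a, x = z4 = ε and y = ba from the decomposition above. The function name cyk and the dictionary-based grammar encoding are illustrative choices, not from the notes.

# CYK membership test for the example CNF grammar
#   A -> BC | a,  B -> BA | b,  C -> BA   (start symbol A)
# and a check that the pumped strings u v^i w x^i y stay in L(G).

UNARY = {"a": {"A"}, "b": {"B"}}                  # A -> a, B -> b
BINARY = {("B", "C"): {"A"},                      # A -> BC
          ("B", "A"): {"B", "C"}}                 # B -> BA, C -> BA
START = "A"

def cyk(w):
    """Return True iff the start symbol derives the non-empty string w."""
    n = len(w)
    if n == 0:
        return False
    # table[i][l] = set of variables deriving the substring w[i:i+l]
    table = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, ch in enumerate(w):
        table[i][1] = set(UNARY.get(ch, set()))
    for l in range(2, n + 1):            # length of the span
        for i in range(n - l + 1):       # start of the span
            for s in range(1, l):        # split point
                for left in table[i][s]:
                    for right in table[i + s][l - s]:
                        table[i][l] |= BINARY.get((left, right), set())
    return START in table[0][n]

# Decomposition of z = bbbbbaba taken from the example above.
u, v, w, x, y = "b", "bbbb", "a", "", "ba"

for i in range(4):
    pumped = u + v * i + w + x * i + y
    print(i, pumped, cyk(pumped))

For i = 0 the pumped string is baba, which the grammar still generates (A ⇒ BC ⇒* ba ba), matching clause 3) of the lemma; larger i just repeat the A ⇒* z3 A z4 step more often.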

z = u z3 z2 z4 y. Renaming the substrings z3, z2, z4 as v, w, x respectively gives z = uvwxy. Now we observe that u v^i w x^i y ∈ L for all i ≥ 0, and that |vx| ≥ 1 and |vwx| ≤ n. Hence the theorem.

Closure properties of CFL:

Theorem: Context free languages are closed under union, concatenation and Kleene closure.

Proof: Let L1, L2 be two CFL's generated by the grammars G1 = (V1, T1, P1, S1) and G2 = (V2, T2, P2, S2). Let us assume that the variables of G1 and G2 are disjoint.

Union: For L1 ∪ L2 we can construct the grammar G3 = (V1 ∪ V2 ∪ {S}, T1 ∪ T2, P1 ∪ P2 ∪ {S → S1 | S2}, S). Any derivation in G3 begins with S, i.e. it starts either S ⇒ S1 ⇒* w1 or S ⇒ S2 ⇒* w2. We also observe that every such w1 or w2 is in the language L1 ∪ L2. So G3 is a CFG for L1 ∪ L2.

Concatenation: For L1L2 we can construct the grammar G4 = (V1 ∪ V2 ∪ {S}, T1 ∪ T2, P1 ∪ P2 ∪ {S → S1S2}, S). Any derivation in G4 begins with S, i.e. it starts S ⇒ S1S2 ⇒* w1S2 ⇒* w1w2. We also observe that w1w2 is in the language L1L2. So G4 is a CFG for L1L2.

Kleene closure: For L1* we can construct the grammar G5 = (V1 ∪ {S}, T1, P1 ∪ {S → SS1 | S1S | ε}, S). Any derivation in G5 begins with S, i.e. it starts S ⇒ ε, or S ⇒ SS1 ⇒ SS1S1 ⇒ … ⇒ S1S1…S1 ⇒* w1w2…wk, where each wi is derived from S1 and hence is in L1. We also observe that every such string (including ε) is in L1*. So G5 is a CFG for L1*. So CFL's are closed under union, concatenation and Kleene closure (a small code sketch of these three constructions is given at the end of this section).

Theorem: Context free languages are not closed under intersection and complementation.

Proof: Intersection: Let L1 = {a^i b^i c^j | i, j ≥ 1} and L2 = {a^i b^j c^j | i, j ≥ 1}; both are CFL's.
L1 = {abc, aabbc, abcc, aabbcc, …}
L2 = {abc, aabc, abbcc, aaabbbccc, …}
L1 ∩ L2 = {abc, aabbcc, aaabbbccc, …} = {a^i b^i c^i | i ≥ 1}.
This language is not a CFL, which we can prove using the pumping lemma: for z = a^n b^n c^n any substring vwx with |vwx| ≤ n can overlap at most two of the three blocks, so pumping changes the count of at most two of the symbols a, b, c and u v^2 w x^2 y falls outside the language. So CFL's are not closed under intersection.

Complementation: Let us assume that CFL's are closed under complementation. From De Morgan's law, for any CFL's L1 and L2,
L1 ∩ L2 = Σ* − [(Σ* − L1) ∪ (Σ* − L2)].
By the assumption the complements Σ* − L1 and Σ* − L2 are CFL's, so L3 = (Σ* − L1) ∪ (Σ* − L2) is a CFL (closure under union), and then L1 ∩ L2 = Σ* − L3 would again be a CFL (by the assumption). But the L1 ∩ L2 above is not a CFL, so our assumption fails, which is a contradiction. So CFL's are not closed under complementation.
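The union, concatenation and Kleene-closure constructions above translate directly into code. Below is a minimal sketch (not from the notes) that builds G3, G4 and G5 from grammars given as (V, T, P, S) tuples; the helper names union_cfg, concat_cfg and star_cfg, the tuple-based production encoding and the sample grammars are all illustrative assumptions.

# Sketch of the union / concatenation / Kleene-closure constructions from
# the proof above.  A grammar is a tuple (V, T, P, S): V variables, T
# terminals, P a set of productions (head, body) with body a tuple of
# symbols, S the start variable.  The variables of G1 and G2 are assumed
# disjoint and "S" is assumed to be a fresh start symbol.

def union_cfg(G1, G2):
    (V1, T1, P1, S1), (V2, T2, P2, S2) = G1, G2
    P = P1 | P2 | {("S", (S1,)), ("S", (S2,))}                 # S -> S1 | S2
    return (V1 | V2 | {"S"}, T1 | T2, P, "S")

def concat_cfg(G1, G2):
    (V1, T1, P1, S1), (V2, T2, P2, S2) = G1, G2
    P = P1 | P2 | {("S", (S1, S2))}                            # S -> S1 S2
    return (V1 | V2 | {"S"}, T1 | T2, P, "S")

def star_cfg(G1):
    V1, T1, P1, S1 = G1
    P = P1 | {("S", ("S", S1)), ("S", (S1, "S")), ("S", ())}   # S -> SS1 | S1S | ε
    return (V1 | {"S"}, T1, P, "S")

# Illustrative input grammars: G1 generates {a^n b^n | n >= 1},
# G2 generates {c^n | n >= 1}.
G1 = ({"S1"}, {"a", "b"}, {("S1", ("a", "S1", "b")), ("S1", ("a", "b"))}, "S1")
G2 = ({"S2"}, {"c"}, {("S2", ("c", "S2")), ("S2", ("c",))}, "S2")

G3 = union_cfg(G1, G2)    # grammar for L1 ∪ L2
G4 = concat_cfg(G1, G2)   # grammar for L1 L2
G5 = star_cfg(G1)         # grammar for L1*

for head, body in sorted(G5[2]):
    print(head, "->", " ".join(body) if body else "ε")

Using a fresh start symbol S, rather than reusing S1 or S2, is what keeps these constructions correct even when the old start symbols appear on the right-hand side of productions of G1 or G2.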
