
Graph Information Ratio

Lele Wang†∗ and Ofer Shayevitz∗
†Stanford University and ∗Tel Aviv University
Emails: [email protected], [email protected]

Abstract—We introduce the notion of information ratio Ir(H/G) between two (simple, undirected) graphs G and H, which characterizes the maximal number of source symbols per channel use that can be reliably sent over a channel with confusion graph H, where reliability is measured w.r.t. a source confusion graph G. Many different results are provided, including in particular lower and upper bounds on Ir(H/G) in terms of various graph properties, inequalities and identities for behavior under strong product and disjoint union, relations to graph cores, and notions of graph criticality. Informally speaking, Ir(H/G) can be interpreted as a measure of similarity between G and H. We make this notion precise by introducing the concept of information equivalence between graphs, a more quantitative version of homomorphic equivalence. We then describe a natural partial ordering over the space of information equivalence classes, and endow it with a suitable metric structure that is contractive under the strong product. Various examples and intuitions are discussed.

I. INTRODUCTION

The zero-error capacity of a noisy channel is a well-known problem in information theory, originally introduced and studied by Shannon [1]. One canonical way to describe this problem is the following: A sender is trying to convey one of M distinct messages to a receiver over a channel with some finite input alphabet V. The channel noise is characterized by a (simple, undirected) channel confusion graph H over the input alphabet (i.e., with a vertex set V(H) = V) and an edge set E(H). The sender maps his messages to a sequence of channel inputs v_1, ..., v_n, and the receiver in turn observes an arbitrary sequence of edges e_1, ..., e_n ∈ E(H), such that v_i ∈ e_i. This mapping of messages to inputs is called zero-error if the receiver can always determine the message uniquely from the sequence of edges. The rate of the mapping is defined as R = n^{-1} log M, which corresponds to the number of bits sent per channel use. The zero-error capacity of the channel, also known as the Shannon graph capacity C(H), is the supremum over all rates R for which a zero-error mapping exists. It is not difficult to verify that the maximal zero-error rate for one use of the channel (n = 1) is exactly log α(H), where α(H) is the independence number of the channel graph H. More generally, the graph capacity is given by C(H) = log Θ(H), where
$$\Theta(H) \stackrel{\mathrm{def}}{=} \lim_{n\to\infty} \sqrt[n]{\alpha(H^n)}.$$
Here H^n is the n-fold strong product (a.k.a. AND product) of H with itself. Recall that for two graphs H_1, H_2, the strong product H_1 ⊠ H_2 is a graph with vertex set V(H_1) × V(H_2),


where (h_1, h_2) ∼ (h_1', h_2') if for each i either h_i ∼ h_i' in H_i or h_i = h_i'. The limit above exists due to supermultiplicativity, but is in general notoriously difficult to compute or even to approximate [2].

Another closely related problem is that of zero-error compression. Here we are given a source confusion graph G, and the sender needs to map source sequences g_1, ..., g_k ∈ G to one of M messages, that is then sent noiselessly to the receiver. This mapping is called zero-error if any two source sequences g_1, ..., g_k ∈ G and g_1', ..., g_k' ∈ G that are not confusable w.r.t. G, i.e., for which g_j ≁ g_j' in at least one coordinate j, are always mapped to distinct messages. In other words, this means that the receiver is always able to output a list of source sequences that are all confusable with the correct one, and are guaranteed to contain it. The rate of the mapping is R = k^{-1} log M, which corresponds to the number of message bits per source symbol. The zero-error compression rate of G is the infimum over all rates for which a zero-error mapping exists. We note that this problem was originally introduced by Körner [3] in a slightly different setting where the source is probabilistic and a small error probability is allowed; in that case, the optimal compression ratio is the so-called Körner entropy of the graph. It is not difficult to verify that the minimal rate for source sequences of length one (k = 1) is exactly log χ(Ḡ), where χ(Ḡ) is the chromatic number of the complement graph Ḡ. More generally, the zero-error graph compression rate of G is given by log χ̄_f(G), where
$$\bar\chi_f(G) \stackrel{\mathrm{def}}{=} \lim_{k\to\infty} \sqrt[k]{\chi\big(\overline{G}^{\vee k}\big)}.$$
Here Ḡ^{∨k} is the k-fold OR product of Ḡ with itself. Recall that for two graphs G_1, G_2, the OR product G_1 ∨ G_2 is a graph with vertex set V(G_1) × V(G_2), where (g_1, g_2) ∼ (g_1', g_2') if g_i ∼ g_i' in G_i for at least one coordinate i. The above limit χ̄_f(G), also known as the fractional chromatic number of the complement graph Ḡ, exists due to submultiplicativity and can be computed by solving a simple linear program [4], [5].
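To make the two graph products concrete, the following small sketch (an illustration only, assuming Python with networkx; brute force, so only for tiny graphs) builds both products, verifies the classical size-5 independent set in C_5 ⊠ C_5 (hence α(C_5 ⊠ C_5) ≥ 5 and Θ(C_5) ≥ √5), and checks the standard identity that complementation exchanges the strong and OR products:

```python
import itertools
import networkx as nx

def strong_product(G, H):
    # Strong (AND) product: (g1, h1) ~ (g2, h2) iff each coordinate pair is
    # adjacent or equal (and the two vertices are not identical).
    return nx.strong_product(G, H)

def or_product(G, H):
    # OR product: (g1, h1) ~ (g2, h2) iff at least one coordinate pair is adjacent.
    P = nx.Graph()
    P.add_nodes_from(itertools.product(G.nodes, H.nodes))
    for (g1, h1), (g2, h2) in itertools.combinations(list(P.nodes), 2):
        if G.has_edge(g1, g2) or H.has_edge(h1, h2):
            P.add_edge((g1, h1), (g2, h2))
    return P

C5 = nx.cycle_graph(5)
C5_sq = strong_product(C5, C5)

# The classical size-5 independent set {(i, 2i mod 5)} in C5 x C5.
code = [(i, (2 * i) % 5) for i in range(5)]
assert all(not C5_sq.has_edge(u, v) for u, v in itertools.combinations(code, 2))
print("alpha(C5 x C5) >= 5, hence Theta(C5) >= sqrt(5)")

# Complementation exchanges the two products: the complement of a strong
# product is the OR product of the complements (used again in Section II-B).
lhs = nx.complement(C5_sq)
rhs = or_product(nx.complement(C5), nx.complement(C5))
assert set(map(frozenset, lhs.edges)) == set(map(frozenset, rhs.edges))
```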


Now, it is only natural to consider the more general problem where a sender wishes to communicate a source sequence with confusion graph G over a noisy channel with confusion graph H. Suppose the sender has a source sequence of length k, and can use the channel n times. The sender would like to map the source sequence to the channel inputs in a way that the receiver will always be able to output a list of source sequences that are all confusable with the correct one, and are guaranteed to contain it. In graph theoretic terms, we are looking for a mapping φ from the vertices of G^k to the vertices of H^n, such that for any g, g' ∈ G^k where g ≁ g', the images φ(g) ≁ φ(g') in H^n. We will call such a mapping non-adjacency preserving. Does such a mapping exist? The answer depends on k and n. This leads us to define the information ratio between H and G to be
$$\mathrm{Ir}(H/G) \stackrel{\mathrm{def}}{=} \sup\{k/n : \exists\ \text{a non-adjacency preserving mapping from } G^k \text{ to } H^n\}.$$
The information ratio can hence be thought of as the maximal number of source symbols per channel use that can be reliably conveyed to the receiver, in the above sense.
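The defining property is easy to verify mechanically for small graphs and explicit maps. Below is a minimal illustrative checker (an assumption-laden sketch, using Python with networkx; the map φ is passed as a dictionary); as a trivial example, the identity map on C_5 with k = n = 1 is non-adjacency preserving, which is the "uncoded" map used in Example 4:

```python
import itertools
import networkx as nx

def is_non_adjacency_preserving(Gk, Hn, phi):
    # phi: dict from vertices of G^k to vertices of H^n.
    # Whenever g and g' are distinct and non-adjacent in G^k (distinguishable
    # source sequences), their images must again be distinct and non-adjacent
    # in H^n, so that the receiver never confuses them.
    for g, gp in itertools.combinations(list(Gk.nodes), 2):
        if not Gk.has_edge(g, gp):
            if phi[g] == phi[gp] or Hn.has_edge(phi[g], phi[gp]):
                return False
    return True

C5 = nx.cycle_graph(5)
identity = {v: v for v in C5.nodes}
print(is_non_adjacency_preserving(C5, C5, identity))  # True
```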

Related work. The question we consider is reminiscent of the joint source-channel coding (JSCC) problem studied in the classical (non-zero-error) information theoretic setup, where a source is to be communicated over a noisy channel under some end-to-end distortion constraint (expected distortion, excess distortion exponent, etc.) [6]. Our problem differs from these classical JSCC setups in a number of aspects. First, our setting is combinatorial in nature and does not allow any errors; in this sense, a more closely related study is [7], [8], where the authors consider JSCC over an adversarial channel. But more importantly, the way we measure success cannot be cast in a per-symbol distortion JSCC framework. The natural distortion for our setup is one where confusable symbols are assigned zero distortion and non-confusable symbols are assigned infinite distortion. However, this results in a different (more degenerate) setup; for example, if G has a vertex that is connected to all the other vertices, then the receiver can always reconstruct this vertex with no communication cost, and admit zero distortion. A better way to think about our setup is perhaps that of list decoding with structural constraints. Unlike the pure communication setup where the receiver must commit to one message, here the receiver can output a list of possible messages, but this list must have a certain "similarity structure" that is captured by the source confusion graph G.

In a related work [9], Körner and Marton introduced and studied the relative capacity R(H|G) of a graph pair (G, H), defined (originally in terms of OR products) as the maximal ratio k/n such that H^n contains an induced subgraph that is isomorphic to G^k. This is a stronger requirement than ours; the information-ratio setup is concerned with mappings (from G^k to H^n) that are only required to preserve non-adjacency, whereas the relative capacity setup considers mappings that must preserve both adjacency and non-adjacency. The relative capacity R(H|G) is therefore a lower bound on the information ratio Ir(H/G). We note that the relative capacity was determined in [9] for the special case where G is an empty graph, where it can be easily seen to equal the associated information ratio. The main contribution in [9] is a general upper bound on the relative capacity, and its ramifications in a certain problem of graph dimension. This upper bound is not informative in the associated information ratio setup.

In another related work [10], Polyanskiy studied certain mappings of the Hamming graph H(m, d), which is a graph over the Hamming cube {0, 1}^m where two vertices are connected if their Hamming distance is at most d. He investigated conditions for the non-existence of (α, β)-mappings, which he defined as non-adjacency preserving mappings between H(m, αm) and H(ℓ, βℓ) (for a fixed n = k = 1 in our notation, i.e., no graph products). He then provided impossibility results in the limit of m → ∞ for a fixed ratio m/ℓ, via a derivation of general conditions for the existence of graph homomorphisms. This problem is also closely related to the combinatorial JSCC setup mentioned above.

Notation. We denote by Ḡ the complement of the graph G, i.e., g_1 ∼ g_2 in Ḡ if and only if g_1 ≁ g_2 in G. We write K_s to mean a complete graph over s vertices and C_n to mean a cycle with n vertices. We write K(n, r) to mean the Kneser graph, whose vertices are all r-subsets of {1, 2, ..., n}, and where two vertices are adjacent if and only if they correspond to disjoint subsets.

II. MAIN RESULTS

The following is a road map for the full version of the paper in [11]. All proofs are omitted due to space limitations.

A. Lower Bounds

We recall the two extremes of the information ratio problem, which follow directly from the definitions.

Proposition 1 (Zero-Error Channel Coding). Ir(H/K̄_2) = log Θ(H).

Proposition 2 (Zero-Error Source Compression).
$$\mathrm{Ir}(\overline{K}_2/G) = \frac{1}{\log \bar\chi_f(G)}.$$
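The quantity χ̄_f(·) in Proposition 2 is the fractional chromatic number of the complement and, as noted in the introduction, it is the value of a linear program over independent sets. The following brute-force sketch (an illustration only, assuming Python with networkx and scipy; it enumerates all independent sets, so it is feasible only for tiny graphs) reproduces χ̄_f(C_5) = χ_f(C̄_5) = 2.5:

```python
import itertools
import networkx as nx
from scipy.optimize import linprog

def fractional_chromatic_number(F):
    # chi_f(F) = min sum_S x_S  s.t.  sum_{S containing v} x_S >= 1 for all v,
    # x_S >= 0, where S ranges over the independent sets of F.
    nodes = list(F.nodes)
    indep_sets = [set(c) for r in range(1, len(nodes) + 1)
                  for c in itertools.combinations(nodes, r)
                  if not any(F.has_edge(u, v)
                             for u, v in itertools.combinations(c, 2))]
    # linprog minimizes c^T x subject to A_ub x <= b_ub, so ">=" is negated.
    c = [1.0] * len(indep_sets)
    A_ub = [[-1.0 if v in S else 0.0 for S in indep_sets] for v in nodes]
    b_ub = [-1.0] * len(nodes)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
    return res.fun

C5 = nx.cycle_graph(5)
print(fractional_chromatic_number(nx.complement(C5)))  # ~2.5 (C5 is self-complementary)
```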

A simple lower bound on the information ratio can be obtained by a separation scheme, namely where the source is first optimally compressed into a message using log χ̄_f(G) bits per source symbol, and then this message is optimally sent over the channel using C(H) = log Θ(H) bits per channel use. This yields the following lower bound.

Theorem 3 (Separation scheme).
$$\mathrm{Ir}(H/G) \ge \frac{\log\Theta(H)}{\log\bar\chi_f(G)}.$$

Noting that Θ(K̄_2) = χ̄_f(K̄_2) = 2, we see that separation is (trivially) optimal when G = K̄_2 or H = K̄_2. However, and in contrast to the classical JSCC case mentioned above [6], separation is not always optimal.

Example 4 (Separation can be strictly suboptimal). Let G = H = C_5 be the pentagon (a cycle with five vertices). Clearly, the "uncoded" (identity) mapping from G to H is non-adjacency preserving. Thus Ir(H/G) ≥ 1. However, the separation scheme only achieves log Θ(C_5)/log χ̄_f(C_5) = log √5 / log 2.5 = 0.878 < 1.
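For concreteness, the 0.878 figure is just the ratio of the two classical pentagon constants; with logarithms in base 2,
$$\frac{\log\Theta(C_5)}{\log\bar\chi_f(C_5)} = \frac{\log\sqrt{5}}{\log(5/2)} = \frac{\tfrac{1}{2}\log 5}{\log 5 - 1} \approx \frac{1.161}{1.322} \approx 0.878,$$
strictly below the rate 1 achieved by the uncoded mapping.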


We prove various algebraic identities and inequalities for the information ratio. To that end, one may find it instructive to think of the information ratio, very informally, as
$$\text{``}\ \mathrm{Ir}(H/G) \approx \frac{\log|H|}{\log|G|}\ \text{''}$$
This statement is meant to imply that, loosely speaking, the information ratio behaves like the ratio of logarithmic graph "sizes", in terms of satisfying algebraic identities/inequalities similar to the ones satisfied by positive real numbers, where the strong product G ⊠ H is thought of as "multiplication", and the disjoint union G + H is thought of as "addition". More accurately, we prove the following relations. First, the product of reciprocal information ratios cannot exceed unity.

Proposition 5. Ir(H/G) Ir(G/H) ≤ 1.

Second, the information ratio is super-multiplicative w.r.t. the strong product.

Theorem 6.
$$\mathrm{Ir}(G \boxtimes H/F) \ge \mathrm{Ir}(G/F) + \mathrm{Ir}(H/F). \qquad (1)$$

Remark 7. Recall that C(H) = log Θ(H). By Proposition 1, when F = K̄_2, Theorem 6 recovers Shannon's lower bound on the capacity of the strong product of two graphs,
$$C(G \boxtimes H) \ge C(G) + C(H). \qquad (2)$$
Shannon conjectured that equality in (2) holds [1], which was then disproved by Alon [12]. There are graphs for which the inequality can be strict in (2), and hence also in (1). In Theorems 12 and 29, we discuss conditions under which equality in (1) holds.

Theorem 8.
$$\mathrm{Ir}(F/G \boxtimes H) \ge \frac{\mathrm{Ir}(F/G)\,\mathrm{Ir}(F/H)}{\mathrm{Ir}(F/G) + \mathrm{Ir}(F/H)}. \qquad (3)$$

Remark 9. By Proposition 2, when F = K̄_2, Theorem 8 holds with equality. This is the well-known fact that χ̄_f(G ⊠ H) = χ̄_f(G) χ̄_f(H). In Theorems 12 and 29, we discuss other conditions under which equality in (3) holds.

Furthermore, the following information ratio power inequality holds w.r.t. disjoint union.

Theorem 10.
$$\bar\chi_f(F)^{\mathrm{Ir}(G+H/F)} \ge \bar\chi_f(F)^{\mathrm{Ir}(G/F)} + \bar\chi_f(F)^{\mathrm{Ir}(H/F)}. \qquad (4)$$

Remark 11. Recall that C(H) = log Θ(H). By Proposition 2, when F = K̄_2, Theorem 10 recovers Shannon's lower bound on the capacity of the disjoint union of two graphs,
$$2^{C(G+H)} \ge 2^{C(G)} + 2^{C(H)}. \qquad (5)$$
This suggests that we can informally think of the source graph F as the "logarithm base". Shannon conjectured that equality in (5) holds, which was then disproved by Alon [12]. There are graphs for which the inequality in (5) can be strict, and hence also in (4).

In special cases, the inequalities in Theorems 6, 8, and 10 can be tight, which leads to the following identities.

Theorem 12 (Identities).
$$\mathrm{Ir}(G \boxtimes H/G) = 1 + \mathrm{Ir}(H/G),$$
$$\mathrm{Ir}(G/G \boxtimes H) = \frac{\mathrm{Ir}(G/H)}{1 + \mathrm{Ir}(G/H)},$$
$$\mathrm{Ir}(G + G/G) = 1 + \frac{1}{\log\bar\chi_f(G)},$$
$$\mathrm{Ir}(G/G + G) = \frac{\log\Theta(G)}{1 + \log\Theta(G)}.$$

B. Upper Bounds

A simple but useful observation is that the information ratio between two graphs can be equivalently defined in terms of homomorphisms between the respective complement graphs. A homomorphism φ from G to H is a mapping of V(G) to (a subset of) V(H) that preserves adjacency, i.e., where g_1 ∼ g_2 in G implies φ(g_1) ∼ φ(g_2) in H. This relation is written G → H. It is therefore immediately clear that Ir(H/G) is the supremum of k/n such that there exists a homomorphism from the complement of G^k to the complement of H^n, or equivalently, Ḡ^{∨k} → H̄^{∨n}. We use the homomorphic definition in conjunction with some known results on hom-monotone functions, i.e., functions that are monotone w.r.t. homomorphisms, to derive several upper bounds on the information ratio.

Theorem 13 (Upper bounds).
$$\mathrm{Ir}(H/G) \le \min\left\{ \frac{\log\Theta(H)}{\log\Theta(G)},\ \frac{\log\vartheta(H)}{\log\vartheta(G)},\ \frac{\log\bar\chi_f(H)}{\log\bar\chi_f(G)} \right\},$$
where ϑ(·) is the Lovász theta function [13].
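The middle bound in Theorem 13 is the one that can be evaluated numerically as a semidefinite program. The sketch below (an illustration, not from the paper; it assumes Python with cvxpy and its bundled SCS solver) uses the standard formulation ϑ(G) = max{Σ_{i,j} X_{ij} : X ⪰ 0, tr X = 1, X_{ij} = 0 for every edge ij of G}, and returns ≈ 2.236 = √5 for the pentagon:

```python
import cvxpy as cp
import networkx as nx

def lovasz_theta(G):
    # theta(G) = max sum_{i,j} X_ij  s.t.  X is PSD, trace(X) = 1,
    # and X_ij = 0 whenever {i, j} is an edge of G.
    nodes = list(G.nodes)
    n = len(nodes)
    idx = {v: i for i, v in enumerate(nodes)}
    X = cp.Variable((n, n), symmetric=True)
    constraints = [X >> 0, cp.trace(X) == 1]
    constraints += [X[idx[u], idx[v]] == 0 for u, v in G.edges]
    problem = cp.Problem(cp.Maximize(cp.sum(X)), constraints)
    problem.solve(solver=cp.SCS)
    return problem.value

print(lovasz_theta(nx.cycle_graph(5)))  # ~2.236, i.e., sqrt(5)
```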

Unlike the case of a single graph, in which there is an order among the three graph invariants Θ(G) ≤ ϑ(G) ≤ χ̄_f(G), there is in general no order among the three upper bounds for the information ratio.

Example 14 (No order among the upper bounds). Let G be a strongly regular graph with parameters (27, 16, 10, 8), i.e., a graph with 27 vertices such that every vertex has 16 neighbors, every adjacent pair of vertices has 10 common neighbors, and every nonadjacent pair has 8 common neighbors [14, pp. 464–465]. This is also called the Schläfli graph [15]. It is known that Θ(G) = 3, ϑ(G) = 3, χ̄_f(G) = 4.5, 6 ≤ Θ(Ḡ) ≤ 7, and ϑ(Ḡ) = χ̄_f(Ḡ) = 9 [15], [16]. For the pair (K̄_2, Ḡ), the upper bound in terms of capacity is the tightest:
$$\frac{\log\Theta(\overline{G})}{\log\Theta(\overline{K}_2)} \le \log 7, \qquad \frac{\log\vartheta(\overline{G})}{\log\vartheta(\overline{K}_2)} = \frac{\log\bar\chi_f(\overline{G})}{\log\bar\chi_f(\overline{K}_2)} = \log 9.$$
For the pair (Ḡ, G), the upper bound in terms of Lovász's theta function is the tightest:
$$\frac{\log\Theta(G)}{\log\Theta(\overline{G})} \ge 0.56, \qquad \frac{\log\vartheta(G)}{\log\vartheta(\overline{G})} = 0.5, \qquad \frac{\log\bar\chi_f(G)}{\log\bar\chi_f(\overline{G})} = 0.68.$$
For the pair (G, Ḡ), the upper bound in terms of fractional chromatic number is the tightest:
$$\frac{\log\Theta(\overline{G})}{\log\Theta(G)} \ge 1.63, \qquad \frac{\log\vartheta(\overline{G})}{\log\vartheta(G)} = 2, \qquad \frac{\log\bar\chi_f(\overline{G})}{\log\bar\chi_f(G)} = 1.46.$$

Theorem 15. The upper and lower bounds for Ir(H/G) coincide when any one of the following conditions is satisfied:
1) Θ(H) = χ̄_f(H);
2) Θ(G) = χ̄_f(G);
3) Θ(H) = ϑ(H) and χ̄_f(G) = ϑ(G).

Our upper bounds can be used in conjunction with specific mappings (e.g., separation/uncoded) to find the exact information ratio in several special cases.
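As a quick worked instance of how these bounds pin down an exact value (an illustration using only the pentagon constants quoted in Example 4): for G = H = C_5 all three ratios in Theorem 13 are trivially equal to 1, while the uncoded mapping of Example 4 gives a matching lower bound,
$$1 \le \mathrm{Ir}(C_5/C_5) \le \min\left\{\frac{\log\Theta(C_5)}{\log\Theta(C_5)},\ \frac{\log\vartheta(C_5)}{\log\vartheta(C_5)},\ \frac{\log\bar\chi_f(C_5)}{\log\bar\chi_f(C_5)}\right\} = 1,$$
so Ir(C_5/C_5) = 1 even though the separation bound of Theorem 3 only gives 0.878.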



Example 16. For any graph F and m_1, m_2 ∈ ℕ,
$$\mathrm{Ir}(F^{m_1}/F^{m_2}) = \frac{m_1}{m_2}.$$

Example 17. For any graph F and m_1, m_2 ∈ ℕ,
$$\mathrm{Ir}\big(F^{m_1}+F^{m_1}/F^{m_2}+F^{m_2}\big) = \begin{cases} \dfrac{1+m_1\log\bar\chi_f(F)}{1+m_2\log\bar\chi_f(F)} & \text{if } m_1 \le m_2,\\[2mm] \dfrac{1+m_1\log\Theta(F)}{1+m_2\log\Theta(F)} & \text{if } m_1 > m_2. \end{cases}$$



We refer to [11, Section 4] for two other upper bounds.

C. Information Equivalence

The graph homomorphism approach leads to further interesting observations. We say that two graphs G and H are homomorphically equivalent, denoted G ↔ H, if both G → H and H → G. This induces an equivalence relation on the family of (finite, simple, undirected) graphs. It is well known that for any graph G, there exists a unique (up to isomorphism) representative G• of the equivalence class of G, called the core of G [17]. Loosely speaking, the core is the graph with the least number of vertices in the equivalence class. Examples of cores include complete graphs, odd cycles, and Kneser graphs. Using the notion of a core, we can show that the information ratio Ir(G/H) depends only on the cores of the complements Ḡ and H̄, hence cores are sufficient for the purpose of computing the information ratio.

Theorem 18. $\mathrm{Ir}(H/G) = \mathrm{Ir}\!\left(\overline{(\overline{H})^{\bullet}} \,/\, \overline{(\overline{G})^{\bullet}}\right)$.

This can simplify the computation, as illustrated below.

Example 19. Let G = K_{m_1} + K_{m_2} + ⋯ + K_{m_s} and H = K_{M_1} + K_{M_2} + ⋯ + K_{M_t} be disjoint unions of cliques. Then Ir(H/G) = log t / log s. To see this, note that Ḡ is a complete s-partite graph. The core of Ḡ is K_s. Similarly, the core of H̄ is K_t. By Theorem 18, Ir(H/G) = Ir(K̄_t/K̄_s) = log t / log s.

Example 20. Recall that (u_1, v_1) ∼ (u_2, v_2) in the tensor product F_1 × F_2 if u_1 ∼ u_2 in F_1 and v_1 ∼ v_2 in F_2. If F_1 → F_2, then for any G and H,
$$\mathrm{Ir}(F_1 + F_2/G) = \mathrm{Ir}(F_2/G),$$
$$\mathrm{Ir}(F_1 \times F_2/G) = \mathrm{Ir}(F_1/G),$$
$$\mathrm{Ir}(H/F_1 + F_2) = \mathrm{Ir}(H/F_2),$$
$$\mathrm{Ir}(H/F_1 \times F_2) = \mathrm{Ir}(H/F_1).$$
To see this, note that if F_1 → F_2, then F_1 + F_2 ↔ F_2 and F_1 × F_2 ↔ F_1.
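Since cores and homomorphic equivalence do the heavy lifting in this section, a brute-force homomorphism test is a convenient companion for small graphs. The sketch below (an illustration only, assuming Python with networkx; it tries all vertex maps, so it is exponential in the number of vertices) confirms, for instance, that C_4 and P_4 are homomorphically equivalent, a fact used again in Example 34:

```python
import itertools
import networkx as nx

def has_homomorphism(G, H):
    # Brute force: is there a map phi: V(G) -> V(H) sending every edge of G
    # to an edge of H?
    g_nodes, h_nodes = list(G.nodes), list(H.nodes)
    for images in itertools.product(h_nodes, repeat=len(g_nodes)):
        phi = dict(zip(g_nodes, images))
        if all(H.has_edge(phi[u], phi[v]) for u, v in G.edges):
            return True
    return False

C4 = nx.cycle_graph(4)
P4 = nx.path_graph(4)
print(has_homomorphism(C4, P4), has_homomorphism(P4, C4))  # True True
```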

Remark 21 (Deficiency of the homomorphic equivalence). Clearly, if the complements Ḡ and H̄ are homomorphically equivalent (i.e., have the same core), then Ir(H/G) = Ir(G/H) = 1. However, the reverse implication does not always hold. For example, for G = K̄(6, 2) and H = K̄(12, 4), it is known that Θ(G) = χ̄_f(G) = Θ(H) = χ̄_f(H) = 3 [5], [18], and thus Ir(G/H) = Ir(H/G) = 1. However, it is also known that K(6, 2) ↛ K(12, 4) and K(12, 4) ↛ K(6, 2) [19]. The reason for this deficiency of the homomorphic equivalence is that it is possible for Ḡ and H̄ not to be homomorphically equivalent, yet for their OR powers to become asymptotically "almost" so. This observation leads us to define a new equivalence relation on graphs: we say that G and H are information–equivalent, denoted G ≡ H, if Ir(H/G) = Ir(G/H) = 1. As noted above, it can be shown that the information equivalence is a coarsening of the homomorphic equivalence of the complement graphs, and captures their asymptotic similarity.

Proposition 22. F̄_1 ↔ F̄_2 implies F_1 ≡ F_2. The converse is not true in general.


We define the notion of the spectrum of a graph, which is a characterization of the graph structure in terms of information ratios. Fix an enumeration of the set of all non-isomorphic cores {Γ_i}_{i=1}^∞. For any graph G, the sequence
$$\sigma_S(G) \stackrel{\mathrm{def}}{=} \{\mathrm{Ir}(\Gamma_i/G)\}_{i=1}^{\infty}$$
is called the source spectrum of G, and the sequence
$$\sigma_C(G) \stackrel{\mathrm{def}}{=} \{\mathrm{Ir}(G/\Gamma_i)\}_{i=1}^{\infty}$$
is called the channel spectrum of G. In the sequel, when we sum multiple spectra, or compare spectra, the operations will be meant element-wise in a natural way.

Theorem 23. The following statements are equivalent:
1) F_1 ≡ F_2.
2) The source spectra are identical: σ_S(F_1) = σ_S(F_2).
3) The channel spectra are identical: σ_C(F_1) = σ_C(F_2).


Information equivalence enjoys several nice properties.

Proposition 24. Equipped with the function
$$d(G, H) \stackrel{\mathrm{def}}{=} -\log\big(\min\{\mathrm{Ir}(G/H), \mathrm{Ir}(H/G)\}\big),$$
the set of information equivalence classes forms a metric space. This metric is contractive w.r.t. the strong product.
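As a small worked check of this metric (an illustration, not from the paper), restrict attention to the edgeless graphs K̄_s with s ≥ 2; these are the all-ones case of Example 19, which gives Ir(K̄_t/K̄_s) = log t / log s, and hence
$$d(\overline{K}_s, \overline{K}_t) = -\log\min\left\{\frac{\log s}{\log t}, \frac{\log t}{\log s}\right\} = \bigl|\log\log t - \log\log s\bigr|,$$
which is indeed a metric on this subfamily (the pullback of the distance on the real line under s ↦ log log s).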


Theorem 25. d(G ⊠ F, H ⊠ F) ≤ d(G, H).


There exists a natural information partial order ⪯ on the set of information equivalence classes, defined by L(G) ⪯ L(H) if Ir(H/G) ≥ 1.



Theorem 26. The following statements are equivalent:
1) L(F_1) ⪯ L(F_2).
2) σ_S(F_2) ≤ σ_S(F_1).
3) σ_C(F_1) ≤ σ_C(F_2).


The functions Θ(G), ϑ(G), and χ̄_f(G) are all monotonically non-decreasing w.r.t. the partial order ⪯.

Proposition 27 (Information-monotone functions). Suppose that L(F_1) ⪯ L(F_2). Then
1) Θ(F_1) ≤ Θ(F_2);
2) ϑ(F_1) ≤ ϑ(F_2);
3) χ̄_f(F_1) ≤ χ̄_f(F_2).

In particular, Θ(F_1) = Θ(F_2) whenever F_1 ≡ F_2. This is a generalization of Shannon's condition, which states that Θ(F_1) = Θ(F_2) whenever F̄_1 ↔ F̄_2 [1]. It is well known that for any positive integer s, α(G) = χ(Ḡ) = s if and only if Ḡ ↔ K_s. The following proposition provides an asymptotic version of this statement.

Proposition 28. G ≡ K̄_s if and only if Θ(G) = χ̄_f(G) = s.

Recall from Theorem 15 that if Θ(G) = χ̄_f(G), then the separation scheme is optimal. Proposition 28 states that when Θ(G) = χ̄_f(G), the graph G is information–equivalent to an empty graph, which is intuitively why separation incurs no loss in this case, regardless of the structure of the channel graph H.

We further say that G and H are weakly information–equivalent, denoted G ≡_w H, if Ir(G/H) Ir(H/G) = 1. This is a coarsening of the information equivalence that is insensitive to strong products, in the sense that now G^k ≡_w G^n for any k, n. This provides the most general conditions under which Theorems 6 and 8 hold with equality.
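A worked instance of the last sentence (a small check combining the definition with Example 16):
$$\mathrm{Ir}(F^{k}/F^{n})\cdot \mathrm{Ir}(F^{n}/F^{k}) = \frac{k}{n}\cdot\frac{n}{k} = 1, \qquad\text{so } F^{k} \equiv_w F^{n}\ \text{for all } k, n \in \mathbb{N},$$
whereas for k ≠ n the individual ratios differ from 1, so by Example 16 the two powers are in general not information–equivalent.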


Theorem 29. Suppose that at least one pair of G, H, F is weakly information–equivalent, i.e., either G ≡_w H, or F ≡_w G, or F ≡_w H. Then
$$\mathrm{Ir}(G \boxtimes H/F) = \mathrm{Ir}(G/F) + \mathrm{Ir}(H/F),$$
$$\mathrm{Ir}(F/G \boxtimes H) = \frac{\mathrm{Ir}(F/G)\,\mathrm{Ir}(F/H)}{\mathrm{Ir}(F/G) + \mathrm{Ir}(F/H)}.$$

D. Information–Critical Graphs

We discuss a notion of graph criticality induced by the information ratio. A graph F is called information–critical if there exists an edge e ∈ E(F) that, once removed, changes all the information ratios, i.e., Ir(H/(F \ e)) < Ir(H/F) and Ir((F \ e)/G) > Ir(F/G) for all G, H.

Theorem 30. The following statements are equivalent:
1) F is information–critical.
2) σ_S(F \ e) < σ_S(F) for some edge e ∈ E(F).
3) σ_C(F \ e) > σ_C(F) for some edge e ∈ E(F).
4) Ir((F \ e)/F) > 1 for some edge e ∈ E(F).

Here are simple sufficient conditions for criticality.

Proposition 31. If there exists an edge e ∈ E(F) such that Θ(F \ e) > χ̄_f(F), then F is information–critical.
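To illustrate Proposition 31 on the smallest interesting example (a check using only standard facts about paths together with the pentagon constants quoted in Example 4): removing any edge e from F = C_5 leaves the path P_5, and
$$\Theta(C_5 \setminus e) = \alpha(P_5) = 3 > 2.5 = \bar\chi_f(C_5),$$
so C_5 is information–critical; here Θ(P_5) = α(P_5) because P_5 is a perfect graph.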

Proposition 32. Suppose F̄ is a connected triangle-free graph with at least three vertices, and χ̄_f(F) < 3. Then F is information–critical.

Example 33. The following graphs are information–critical:
1) C̄_n, n ≥ 4;
2) K̄(n, r), where r does not divide n.

Example 34 (A noncritical graph). Consider C_4, the cycle over 4 vertices. Removing any edge e from C_4 results in a path over 4 vertices, denoted by P_4. Clearly C_4 ↔ P_4, and thus it holds that Ir(C_4/F) = Ir(P_4/F) = Ir((C_4 \ e)/F) and Ir(F/C_4) = Ir(F/P_4) = Ir(F/(C_4 \ e)) for any graph F and any edge e. Therefore C_4 is not information–critical.

Finally, we present a few open problems in [11, Section 9].

REFERENCES

[1] C. Shannon, "The zero error capacity of a noisy channel," IRE Trans. Inf. Theory, vol. 2, no. 3, pp. 8–19, Sep. 1956.
[2] N. Alon and E. Lubetzky, "The Shannon capacity of a graph and the independence numbers of its powers," IEEE Trans. Inf. Theory, vol. 52, no. 5, pp. 2172–2176, May 2006.
[3] J. Körner, "Coding of an information source having ambiguous alphabet and the entropy of graphs," in Transactions of the Sixth Prague Conference on Information Theory, Statistical Decision Functions, Random Processes (Tech. Univ., Prague, 1971; dedicated to the memory of Antonín Špaček). Prague: Academia, 1973, pp. 411–425.
[4] A. Pirnazar and D. H. Ullman, "Girth and fractional chromatic number of planar graphs," J. Graph Theory, vol. 39, no. 3, pp. 201–217, 2002.
[5] E. R. Scheinerman and D. H. Ullman, Fractional Graph Theory: A Rational Approach to the Theory of Graphs. New York: Dover, 2011.
[6] C. E. Shannon, "Coding theorems for a discrete source with a fidelity criterion," in IRE Int. Conv. Rec., vol. 7, part 4, 1959, pp. 142–163; reprinted with changes in R. E. Machol (Ed.), Information and Decision Processes. New York: McGraw-Hill, 1960, pp. 93–126.
[7] Y. Kochman, A. Mazumdar, and Y. Polyanskiy, "The adversarial joint source-channel problem," in Proc. IEEE Int. Symp. Inf. Theory, Jul. 2012, pp. 2112–2116.
[8] Y. Kochman, A. Mazumdar, and Y. Polyanskiy, "Results on combinatorial joint source-channel coding," in Proc. IEEE Inf. Theory Workshop, Sep. 2012, pp. 10–14.
[9] J. Körner and K. Marton, "Relative capacity and dimension of graphs," Discrete Math., vol. 235, no. 1–3, pp. 307–315, 2001 (Combinatorics, Prague, 1998).
[10] Y. Polyanskiy, "On metric properties of maps between Hamming spaces and related graph homomorphisms," J. Combin. Theory Ser. A, vol. 145, pp. 227–251, 2017.
[11] L. Wang and O. Shayevitz, "Graph information ratio," 2016. Preprint available at https://arxiv.org/abs/1612.09343.
[12] N. Alon, "The Shannon capacity of a union," Combinatorica, vol. 18, no. 3, pp. 301–310, 1998.
[13] L. Lovász, "On the Shannon capacity of a graph," IEEE Trans. Inf. Theory, vol. 25, no. 1, pp. 1–7, Jan. 1979.
[14] D. B. West, Introduction to Graph Theory, 2nd ed. Upper Saddle River, NJ: Prentice Hall, 2001.
[15] A. E. Brouwer, "Schläfli graph." [Online]. Available: http://www.win.tue.nl/aeb/drg/graphs/Schlaefli.html
[16] W. Haemers, "On some problems of Lovász concerning the Shannon capacity of a graph," IEEE Trans. Inf. Theory, vol. 25, no. 2, pp. 231–232, Mar. 1979.
[17] G. Hahn and C. Tardif, "Graph homomorphisms: structure and symmetry," in Graph Symmetry (Montreal, PQ, 1996), ser. NATO Adv. Sci. Inst. Ser. C Math. Phys. Sci., vol. 497. Dordrecht: Kluwer Acad. Publ., 1997, pp. 107–166.
[18] A. E. Brouwer and A. Schrijver, "Uniform hypergraphs," in Packing and Covering in Combinatorics, ser. Mathematical Centre Tracts, vol. 106, A. Schrijver, Ed. Amsterdam: Mathematisch Centrum, 1979, pp. 39–73.
[19] S. Stahl, "n-tuple colorings and associated graphs," J. Combinatorial Theory Ser. B, vol. 20, no. 2, pp. 185–203, 1976.


Graph Information Ratio

Abstract—We introduce the notion of information ratio. Ir(H/G) between two (simple, undirected) graphs G and H, which characterizes the maximal number of source symbols per channel use that can be reliably sent over a channel with confusion graph H, where reliability is measured w.r.t. a source confusion graph G. Many ...

137KB Sizes 24 Downloads 248 Views

Recommend Documents

RATIO AND PROPORTION Ratio Ratio of two ... -
the product of the extremes = the product of the means. i.e. ad = bc. 2. Compounded ratio of the ratios (a : b), (c : d), (e : f) is (ace : bdf). 3. Duplicate ratio of (a : b) ...

Constrained Information-Theoretic Tripartite Graph Clustering to ...
1https://www.freebase.com/. 2We use relation expression to represent the surface pattern of .... Figure 1: Illustration of the CTGC model. R: relation set; E1: left.

Constrained Information-Theoretic Tripartite Graph Clustering to ...
bDepartment of Computer Science, University of Illinois at Urbana-Champaign. cMicrosoft Research, dDepartment of Computer Science, Rensselaer ...

Ratio simplifying.pdf
34) An estate is shared between Tom, Thom and Thomas in the ratio. 2:3:5. ... 37) In the Queen's Imperial State Crown, the ratio of diamonds to. emeralds is 3:4.

Pursuit on a Graph Using Partial Information
instrumented node, the UGS therein informs the pursuer if ... If this happens, the. UGS is triggered and this information is instantaneously relayed to the pursuer, thereby enabling capture. On the other hand, if the evader reaches one of the exit no

Information Fusion in Navigation Systems via Factor Graph Based ...
Jan 5, 2013 - (pdf) of all states, a computationally-expensive process in the general ..... dition, in order to avoid re-calculating ∆xi→j, the effect of re-linearization ... obtained every 50 IMU measurements and bias nodes are added at the same

Ratio Units Discussion.pdf
Page 1 of 1. Ratio Units Discussion.pdf. Ratio Units Discussion.pdf. Open. Extract. Open with. Sign In. Details. Comments. General Info. Type. Dimensions. Size.

Knowledge Graph Identification
The web is a vast repository of knowledge, but automatically extracting that ... Early work on the problem of jointly identifying a best latent KB from a collec- ... limitations, and we build on and improve the model of Jiang et al. by including ....

Graph Theory.PDF
(c) Find the chromatic number of the Graph 4. given below. If the chromatic number is k, ... MMTE-001 3. Page 3 of 5. Main menu. Displaying Graph Theory.PDF.

Graph Algorithms
DOWNLOAD in @PDF Algorithms in C++ Part 5: Graph Algorithms: Graph Algorithms Pt.5 By #A#. Download EBOOK EPUB KINDLE. Book detail.

Robust Graph Mode Seeking by Graph Shift
ample, in World Wide Web, dense subgraphs might be communities or link spam; in telephone call graph, dense subgraphs might be groups of friends or families. In these situations, the graphs are usually very sparse in global, but have many dense subgr

Ratio & Proportion Recipe Project.pdf
Choose one recipe from the internet, cookbook or home. 2. The recipe must have at least 8 ingredients. 3. The original recipe must serve between 4 and 10 ...

Mole Ratio Worksheet Answers.pdf
There was a problem previewing this document. Retrying... Download. Connect more apps... Try one of the apps below to open or edit this item. Mole Ratio Worksheet Answers.pdf. Mole Ratio Worksheet Answers.pdf. Open. Extract. Open with. Sign In. Main

High Order to Trade Ratio (OTR). - NSE
4 days ago - Sub: High Order to Trade Ratio (OTR). This has reference to SEBI Circular No. SEBI/HO/MRD/DP/CIR/P/2018/62 dated April 09, 2018.

Graph Coloring.pdf
Sign in. Loading… Whoops! There was a problem loading more pages. Retrying... Whoops! There was a problem previewing this document. Retrying.

Practice Profile: 5-to-1 Ratio
By investing students in the value of the classroom through creating positive interactions, teachers can encourage better behavior and stronger feelings of student belonging in the classroom. Further, when students feel connected and more positive th

Graph Theory Notes - CiteSeerX
To prove the minimality of the set MFIS(X), we will show that for any set N ..... Prove that for a non-empty regular bipartite graph the number of vertices in both.

Graph Theory Notes - CiteSeerX
Let us define A = {v1,...,vm} and B = V (G)−A. We split the sum m. ∑ i=1 di into two parts m. ∑ i=1 di = C + D, where C is the contribution of the edges with both ...

Graph Theory.pdf
Page. 1. /. 2. Loading… Page 1 of 2. Page 1 of 2. Page 2 of 2. Page 2 of 2. Main menu. Displaying Graph Theory.pdf. Page 1 of 2.

Graph representations
Models for small world? ▫ Erdos-Renyi model. ▫ n nodes, each node has a probability p of ... Barabasi-Albert model. ▫ Graph not static, but grows with time.

graph-contraction.pdf
... loading more pages. Retrying... graph-contraction.pdf. graph-contraction.pdf. Open. Extract. Open with. Sign In. Main menu. Displaying graph-contraction.pdf.

Graph Theory.pdf
Define proper coloring and determine chromatic number of the graph given below. (02). b. State Hall's matching condition. (02). c. For the bipartite graph given ...

graph-search.pdf
There was a problem previewing this document. Retrying... Download. Connect more apps... Try one of the apps below to open or edit this item. graph-search.

Graph Algorithms
Read Algorithms in C++ Part 5: Graph Algorithms: Graph Algorithms Pt.5 - Online. Book detail. Title : Read Algorithms in C++ Part 5: Graph q. Algorithms: Graph ...