A Constrained Channel Coding Approach to Joint Communication and Channel Estimation

Wenyi Zhang, Satish Vedantam, and Urbashi Mitra
Ming Hsieh Department of Electrical Engineering, University of Southern California
{wenyizha, vedantam, ubli}@usc.edu

Abstract — A joint communication and channel state estimation problem is investigated, in which reliable information transmission over a noisy channel, and high-fidelity estimation of the channel state, are simultaneously sought. The tradeoff between the achievable information rate and the estimation distortion is quantified by formulating the problem as a constrained channel coding problem, and the resulting capacity-distortion function characterizes the fundamental limit of the joint communication and channel estimation problem. The analytical results are illustrated through case studies, and further issues such as multiple cost constraints, channel uncertainty, and capacity per unit distortion are also briefly discussed.

I. INTRODUCTION

In this paper, we consider the problem of joint communication and channel estimation over a channel with a time-varying channel state. We consider a noisy channel with a random channel state that evolves with time in a memoryless fashion and is available to neither the transmitter nor the receiver. The objective is to have the receiver recover both the information transmitted from the transmitter and the state of the channel over which the information was transmitted. The problem setting may prove relevant for situations such as environment monitoring in sensor networks [1], underwater acoustic applications [2], and cognitive radio [3]. A distinct feature of our problem formulation is that both communication and channel estimation are required.

The interplay between information measures and estimation (minimum mean-squared error (MMSE) in particular) has long been investigated; see, e.g., [4] and references therein. Previously, however, estimation served only to facilitate information transmission, rather than being a separate goal. For example, a common strategy in block interference channels [5] is channel estimation via training [6]. The purpose of channel training is only to increase the information rate for communication, and thus the quality of the channel estimate is not traded off against the information rate, as we consider in this paper. The problem formulation in [7], [8] bears some similarity to the one we consider, in that the receiver is interested in both communication and channel estimation. It differs from our work in a critical way: the channel state is assumed non-causally known at the transmitter. In contrast, neither the transmitter nor the receiver knows the channel state in our problem formulation.

Intuitively, there exists a tradeoff between a channel's capability to transfer information and its capability to exhibit state. Increasing randomness in the channel inputs increases information transfer while reducing the receiver's ability to estimate the channel. In contrast, deterministic signaling facilitates channel estimation at the expense of zero information transfer. In this paper, we show that the optimal tradeoff can be formulated as a channel coding problem, with the channel input distribution constrained by an average "estimation cost" constraint.

The rest of this paper is organized as follows.
Section II introduces the channel model and the capacity-distortion function, and Section III formulates the equivalent constrained channel coding problem. Section IV illustrates the application of the capacity-distortion function through several simple examples. Section V briefly discusses related issues including multiple cost constraints, channel uncertainty, and capacity per unit distortion. Finally, Section VI concludes the paper.

II. CHANNEL MODEL

We consider the channel model in Figure 1. For a length-$n$ block of channel inputs, a message $M$ is selected equiprobably from $\{1, \ldots, e^{nR}\}$ and is encoded by the encoder, generating the corresponding channel inputs $\{X_1, \ldots, X_n\}$. We provide the following definition.

Definition 1: (Encoder) An encoder is defined by a function $f_n: \mathcal{M} = \{1, \ldots, e^{nR}\} \to \mathcal{X}^n$, for each $n \in \mathbb{N}$.

[Fig. 1. Channel model for joint communication and channel estimation: the encoder maps $M$ to the inputs $X_i$; the channel $P(y|x,s)$ with state $S_i$ produces the outputs $Y_i$; the joint decoder and estimator outputs $\hat{M}$ and $\hat{S}_i$.]

The channel is described by a transition function $P(y|x,s)$, which is the probability distribution of the channel output $Y$ conditioned on the channel input $X$ and the channel state $S$. Upon receiving the length-$n$ block of channel outputs, the joint decoder and estimator (defined below) declares $\hat{M} \in \{1, \ldots, e^{nR}\}$ as the decoded message, along with a length-$n$ block of estimates of the channel state. For technical purposes, in this paper we assume that the random channel state evolves with time in a memoryless fashion. We note that this model encompasses the block interference channel model, because we can treat a block as a super-symbol and thus convert a block interference channel into a memoryless channel.

Definition 2: (Joint decoder and estimator) A joint decoder and estimator is defined by a pair of functions $g_n: \mathcal{Y}^n \to \mathcal{M}$ and $h_n: \mathcal{Y}^n \to \mathcal{S}^n$, for each $n \in \mathbb{N}$.

This definition differs from that of the conventional channel decoder (e.g., [9]) in that it explicitly requires estimation of the channel state $S$ at the receiver. The quality of estimation is measured by the distortion function $d: \mathcal{S} \times \mathcal{S} \to \mathbb{R}^+ \cup \{0\}$. That is, if $\hat{S}_i$ is the $i$th element of $h_n(Y^n)$, then $d(S_i, \hat{S}_i)$ denotes the distortion at time $i$, $i = 1, \ldots, n$. For technical convenience, we assume that $d(\cdot,\cdot)$ is bounded from above, so that there exists a finite $T > 0$ with $d(s, s') \le T < \infty$ for any $s, s' \in \mathcal{S}$. Note that for length-$n$ block coding schemes, the average distortion is given by

$$\bar{d}(S^n, \hat{S}^n) = \frac{1}{n} \sum_{i=1}^{n} d(S_i, \hat{S}_i). \qquad (1)$$

Finally, we have the following definitions.

Definition 3: (Achievable rate) A nonnegative number $R(D)$ is an achievable rate if there exists a sequence of encoders and corresponding joint decoders and estimators such that (a) the average probability of decoding error $P_e^{(n)} = \frac{1}{\lceil e^{nR(D)} \rceil} \sum_{m=1}^{\lceil e^{nR(D)} \rceil} \Pr[\hat{M} \ne m \mid M = m]$ tends to zero as $n \to \infty$; and (b) the average distortion in channel state estimation satisfies

$$\limsup_{n \to \infty} \mathbb{E}\, \bar{d}(S^n, \hat{S}^n) \le D. \qquad (2)$$

Definition 4: (Capacity-distortion function) The capacity-distortion function is defined as

$$C(D) = \sup_{f_n, g_n, h_n} R(D). \qquad (3)$$

Remark: The reader may want to distinguish between the capacity-distortion function and the rate-distortion function in lossy source coding [9]. The capacity-distortion function is defined with respect to a state-dependent channel, and seeks to characterize the fundamental tradeoff between the rate of information transmission and the distortion of state estimation. In contrast, the rate-distortion function is defined with respect to a source distribution, and seeks to characterize the fundamental tradeoff between the rate of its lossy description and the achievable distortion due to that description.
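As a concrete illustration (not part of the original development), the following Python sketch simulates a memoryless state-dependent channel and evaluates the block-average distortion of (1) for two candidate estimators. The particular channel law $P(y|x,s)$, taken here to be the scalar multiplicative channel studied later in Section IV-B, the Hamming distortion, and all function names are illustrative assumptions.

# A minimal numerical sketch of the channel model in Fig. 1: a memoryless
# state-dependent channel with i.i.d. state, and the block-average distortion
# of Eq. (1). The channel Y = S*X and the Hamming distortion are assumptions
# chosen for concreteness (they match the example of Section IV-B).
import numpy as np

rng = np.random.default_rng(0)

def simulate_block(x, r):
    """Pass a length-n input block x through Y_i = S_i * X_i with P(S=1) = r."""
    s = (rng.random(len(x)) < r).astype(int)   # i.i.d. memoryless state
    y = s * x                                  # channel output
    return s, y

def average_distortion(s, s_hat):
    """Block-average Hamming distortion, Eq. (1)."""
    return np.mean(s != s_hat)

# Example: an all-ones input block lets the receiver read the state off Y,
# while an all-zeros block forces it to guess (here, guess S_hat = 0).
n, r = 10_000, 0.3
s, y = simulate_block(np.ones(n, dtype=int), r)
print(average_distortion(s, y))                # ~0: Y reveals S when X = 1
s, y = simulate_block(np.zeros(n, dtype=int), r)
print(average_distortion(s, np.zeros(n)))      # ~r: best blind guess when X = 0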
III. A CONSTRAINED CHANNEL CODING FORMULATION

In this section, we show that the joint communication and channel estimation problem can be equivalently formulated as a constrained channel coding problem. For this purpose, the following minimum conditional distortion will be important. The minimum conditional distortion function is defined for each possible realization of the channel input $X$ as

$$d^*(x) = \inf_{h_0: \mathcal{X} \times \mathcal{Y} \to \mathcal{S}} \mathbb{E}\left[ d\big(S, h_0(x, Y)\big) \right], \qquad (4)$$

where the expectation is with respect to the channel state $S$ and the channel output $Y$ conditioned upon the channel input $X = x$, and $h_0: \mathcal{X} \times \mathcal{Y} \to \mathcal{S}$ denotes an arbitrary one-shot estimator of $S$ given the channel input and output. The following theorem establishes the constrained channel coding formulation.

Theorem 1: The capacity-distortion function for the channel model in Figure 1 is given by

$$C(D) = \sup_{P_X \in \mathcal{P}_D} I(X; Y), \qquad (5)$$

where

$$\mathcal{P}_D = \left\{ P_X : \sum_{x \in \mathcal{X}} P_X(x)\, d^*(x) \le D \right\}. \qquad (6)$$

Remark: Theorem 1 applies to general input/output/state alphabets. If $X$ is a continuous random variable, the summation in (6) should be understood as an integral over $\mathcal{X}$.

In order to prove Theorem 1, we shall employ the following lemmas.

Lemma 1: For any $(f_n, g_n, h_n)$-sequence that achieves $C(D)$, as $n \to \infty$, the achieved average distortion (2) is (in probability) equal to the average distortion with $\hat{S}^n$ replaced by

$$\hat{S}^n = h_n^*(X^n, Y^n), \qquad (7)$$

where $h_n^*(X^n, Y^n)$ denotes the block-$n$ estimator that achieves the minimum average distortion conditioned upon both the block-$n$ channel inputs and outputs.

Proof: For each $n$, let us replace the estimator $h_n$ by $h_n^*$ in (7), with its first argument being the channel inputs $\hat{X}^n$ corresponding to the decoded message $\hat{M}$. When $\hat{M} = M$, the minimum average distortion is achieved by $h_n^*$; when $\hat{M} \ne M$, the increment in the average distortion due to replacing $h_n$ by $h_n^*$ is bounded from above because $d(\cdot,\cdot) \le T < \infty$. By Definitions 3 and 4, as $n \to \infty$, the average probability of decoding error $P_e^{(n)} \to 0$. Hence, as $n \to \infty$, the minimum average distortion is achieved by $h_n^*(\hat{X}^n, Y^n)$, which is further equal to (7), in probability. Q.E.D.

Lemma 1 shows that the joint decoder and estimator can utilize the reliably decoded channel inputs for channel state estimation. The next lemma, Lemma 2, further shows that the length-$n$ block estimator can be decomposed into $n$ one-shot estimators, one for each channel use.

Lemma 2: For any $(f_n, g_n, h_n)$-sequence that achieves $C(D)$, as $n \to \infty$, the achieved average distortion (2) is (in probability) equal to that achieved by

$$\hat{S}_i = h_0^*(X_i, Y_i), \quad i = 1, \ldots, n, \qquad (8)$$

where $h_0^*(X_i, Y_i)$ denotes the one-shot estimator that achieves the minimum expected distortion for $S_i$ conditioned upon both the channel input $X_i$ and output $Y_i$.

Proof: From Lemma 1, as $n \to \infty$, $h_n(Y^n)$ is in probability equivalent to $h_n^*(X^n, Y^n)$. The decomposition (8) then follows because the channel is memoryless. For each fixed $n$, we have

$$P(S^n \mid X^n, Y^n) = \frac{P(X^n, Y^n, S^n)}{P(X^n, Y^n)} = \frac{\prod_{i=1}^{n} P(S_i, X_i, Y_i)}{\sum_{S^n} \prod_{i=1}^{n} P(S_i, X_i, Y_i)} = \frac{\prod_{i=1}^{n} P(S_i, X_i, Y_i)}{\prod_{i=1}^{n} \sum_{S_i} P(Y_i \mid X_i, S_i) P(S_i) P_X(X_i)} = \prod_{i=1}^{n} \frac{P(S_i, X_i, Y_i)}{P(Y_i \mid X_i) P_X(X_i)} = \prod_{i=1}^{n} P(S_i \mid X_i, Y_i). \qquad (9)$$

As we take $n \to \infty$, the lemma is established. Q.E.D.
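For discrete alphabets, the estimation cost in (4) can be computed directly, since the optimal one-shot estimator minimizes the posterior expected distortion for each $(x, y)$ pair. The following Python sketch (an illustration under assumed array conventions, not a prescribed implementation) evaluates $d^*(x)$ from $P(y|x,s)$, the state distribution, and a distortion matrix, and checks it on the scalar multiplicative channel of Section IV-B.

# A minimal sketch of computing the minimum conditional distortion d*(x) of
# Eq. (4) for discrete alphabets. P_y_given_xs is indexed [x, s, y], P_S is the
# state distribution, and dist[s, s_hat] is the distortion matrix; these names
# and conventions are assumptions made for illustration.
import numpy as np

def d_star(P_y_given_xs, P_S, dist):
    """d*(x) = sum_y P(y|x) * min_{s_hat} sum_s P(s|x,y) d(s, s_hat)."""
    nx, ns, ny = P_y_given_xs.shape
    d = np.zeros(nx)
    for x in range(nx):
        joint = P_S[:, None] * P_y_given_xs[x]          # P(s, y | x), shape (ns, ny)
        P_y = joint.sum(axis=0)                         # P(y | x)
        for y in range(ny):
            if P_y[y] == 0:
                continue
            post = joint[:, y] / P_y[y]                 # posterior P(s | x, y)
            # The optimal one-shot estimate minimizes posterior expected distortion.
            d[x] += P_y[y] * min(post @ dist[:, s_hat] for s_hat in range(ns))
    return d

# Scalar multiplicative channel of Section IV-B: Y = S*X, Hamming distortion.
r = 0.3
P = np.zeros((2, 2, 2))
for x in (0, 1):
    for s in (0, 1):
        P[x, s, s * x] = 1.0
print(d_star(P, np.array([1 - r, r]), np.array([[0., 1.], [1., 0.]])))
# expected: d*(0) = r, d*(1) = 0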
Proof of Theorem 1: From Lemmas 1 and 2, we can rewrite the average distortion constraint (2) as

$$\limsup_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} \mathbb{E}\, d(S_i, \hat{S}_i) \le D \;\Rightarrow\; \limsup_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} \mathbb{E}\, d\big(S_i, h_0^*(X_i, Y_i)\big) \le D. \qquad (10)$$

Utilizing (4) and the fact that the channel is memoryless, we can further deduce from (10) that

$$\mathbb{E}\, d^*(X) \le D. \qquad (11)$$

So now the constraints in Definition 3 reduce to requiring $P_e^{(n)} \to 0$ as $n \to \infty$, subject to the constraint (11). This is exactly the problem of channel coding with a cost constraint on the input distribution, and Theorem 1 directly follows from standard proofs; see, e.g., [10]. Q.E.D.

Discussion: (1) The proof of Theorem 1 suggests that the joint decoder and estimator first decode the transmitted message in a "non-coherent" fashion, and then utilize the reconstructed channel inputs along with the channel outputs to estimate the channel states. As the coding block length grows large, such a two-stage procedure becomes asymptotically optimal.

(2) For each $x \in \mathcal{X}$, $d^*(x)$ quantifies its associated minimum distortion. Alternatively, $d^*(x)$ can be viewed as the "estimation cost" of signaling with $x$. Hence the average distortion constraint in (6) regulates the input distribution so that the signaling is estimation-efficient. We emphasize that $d^*(x)$ depends on the channel through the distribution of the channel state $S$, and thus differs from other usual costs such as symbol energies or time durations.

(3) A key condition that leads to the constrained channel coding formulation is that the channel is memoryless. Due to the memoryless property, we can decompose a block estimator into multiple one-shot estimators without loss of asymptotic optimality. If the channel state evolves with time in a correlated fashion, then such a decomposition is generally suboptimal.

IV. ILLUSTRATIVE EXAMPLES

In this section, we discuss several simple examples to illustrate the application of Theorem 1.

A. Uniform Estimation Costs

A special case is that in which $d^*(x) = d_0$ for all $x \in \mathcal{X}$. For such channels, the average cost constraint in (6) exhibits a singular behavior. If $D < d_0$, then the joint communication and channel estimation problem is infeasible; otherwise, $\mathcal{P}_D$ consists of all possible input distributions, and thus the capacity-distortion function $C(D)$ is equal to the unconstrained capacity of the channel. One of the simplest channels with uniform estimation costs is the additive channel $Y_i = X_i + S_i$: once the receiver reliably decodes $M$, it can simply subtract $X_i$ from $Y_i$.
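A minimal sketch of this observation, under the assumption of binary $X$ and $S$ (the alphabets and random seed are illustrative only): once $X^n$ is recovered, subtracting it from $Y^n$ recovers the state exactly, so every input letter has the same estimation cost $d^*(x) = d_0 = 0$ and the distortion constraint never binds.

# A small self-contained sketch for the additive channel Y_i = X_i + S_i of
# Section IV-A with X, S in {0, 1} (an illustrative assumption): subtracting
# the decoded input recovers the state perfectly, at any information rate.
import numpy as np

rng = np.random.default_rng(1)
n, r = 1000, 0.3
x = rng.integers(0, 2, n)                 # reliably decoded inputs
s = (rng.random(n) < r).astype(int)       # i.i.d. state
y = x + s                                 # additive channel output
s_hat = y - x                             # subtract off the known input
print(np.mean(s_hat != s))                # 0.0: zero distortion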
B. A Scalar Multiplicative Channel

Consider the following scalar multiplicative channel

$$Y_i = S_i X_i, \qquad (12)$$

where all the alphabets are binary, $\mathcal{X} = \mathcal{Y} = \mathcal{S} = \{0, 1\}$, and the multiplication is in the conventional sense for real numbers. The reader may interpret $S$ as the status of an informed jamming source, a fading level, or the status of another transmitter. Setting $S$ to its "effective status" $S = 0$ shuts down the link between $X$ and $Y$; otherwise, the link $X \to Y$ is essentially noiseless. We take the distortion measure to be the Hamming distance: $d(s, \hat{s}) = 1$ if $\hat{s} \ne s$ and zero otherwise.

The tradeoff between communication and channel estimation is straightforward to observe from the nature of the channel: for good estimation of $S$, we want $X = 1$ as often as possible, whereas this reduces the achieved information rate. In this example, we assume that $P(S = 1) = r \le 1/2$. We shall optimize over $P(X = 1)$, denoted by $p \in [0, 1]$. The channel mutual information is $I(X; Y) = H_2(pr) - p \cdot H_2(r)$, where $H_2(\cdot)$ denotes the binary entropy function $H_2(t) = -t \log t - (1 - t) \log(1 - t)$. For $x = 0$, the optimal one-shot estimator is $\hat{S} = 0$ (note that $P(S = 1) = r \le 1/2$), and the resulting minimum conditional distortion is $d^*(0) = r$. For $x = 1$, the optimal one-shot estimator is $\hat{S} = Y = S$, leading to $d^*(1) = 0$. Therefore the input distribution should satisfy $(1 - p) r \le D$. After some manipulation, we find the optimal solution as follows. If $D \ge r - \left[1 + e^{H_2(r)/r}\right]^{-1}$, then

$$p^* = \frac{1}{r} \left[1 + e^{H_2(r)/r}\right]^{-1}, \qquad C(D) = H_2(p^* r) - p^*\, H_2(r);$$

otherwise

$$p^* = 1 - \frac{D}{r}, \qquad C(D) = H_2(r - D) - \left(1 - \frac{D}{r}\right) H_2(r).$$

From the solution, we observe the following. For relatively large $D$, the average distortion constraint is not active, and thus the optimal input distribution coincides with that for the unconstrained channel capacity. As the estimation distortion constraint $D$ falls below a threshold, the average distortion constraint becomes active, and the capacity-distortion function $C(D)$ deviates from the unconstrained channel capacity. We can show from the expression of $C(D)$ that, as $D \to 0$,

$$C(D) = \frac{-\log(1 - r)}{r}\, D + o(D). \qquad (13)$$

Figure 2 depicts $C(D)$ versus $D$ for different values of $r$; the tradeoff between communication rate and estimation distortion is clearly visible.

[Fig. 2. Capacity-distortion function $C(D)$ versus $D$ for the scalar multiplicative channel, for $r = 0.1, 0.2, 0.3, 0.4, 0.5$.]
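The closed-form solution above can be checked numerically. The following Python sketch (not part of the paper; the function names and the grid search are assumptions) evaluates $C(D)$ both from the closed form and by a brute-force search over $p = P(X = 1)$ subject to the estimation-cost constraint $(1 - p) r \le D$, using natural logarithms.

# A numerical sketch of C(D) for the scalar multiplicative channel of
# Section IV-B: closed form vs. a brute-force search over p = P(X = 1)
# restricted to the feasible set (1 - p) r <= D.  Rates are in nats.
import numpy as np

def H2(t):
    t = np.clip(t, 1e-12, 1 - 1e-12)
    return -t * np.log(t) - (1 - t) * np.log(1 - t)

def C_closed_form(D, r):
    p_unc = (1.0 / r) / (1.0 + np.exp(H2(r) / r))   # unconstrained maximizer
    p = p_unc if D >= r * (1 - p_unc) else 1 - D / r
    return H2(p * r) - p * H2(r)

def C_brute_force(D, r, grid=10_000):
    p = np.linspace(0, 1, grid)
    feasible = (1 - p) * r <= D                     # constraint (6) via d*(x)
    rate = H2(p * r) - p * H2(r)
    return rate[feasible].max()

r = 0.3
for D in (0.01, 0.05, 0.2):
    print(D, C_closed_form(D, r), C_brute_force(D, r))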
C. A Block Multiplicative Channel

A generalization of the scalar multiplicative channel is the following block multiplicative channel

$$Y_i = S_i X_i, \qquad (14)$$

where $X$ and $Y$ are length-$K$ blocks, so that the super-symbols of the block memoryless channel have alphabets $\mathcal{X}^K = \mathcal{Y}^K = \{0, 1\}^K$. The channel state $S \in \mathcal{S} = \{0, 1\}$ remains fixed within each block and changes in a memoryless fashion across blocks. We again adopt the Hamming distance as the distortion measure.

For such a channel, there are $2^K$ possible vectors for an input super-symbol. However, all of them except the all-zero vector $x = 0$ play symmetric roles, because they all lead to the same conditional distribution for $Y$ as well as the same minimum conditional distortion $d^*(x) = 0$, $\forall x \ne 0$. So, from the concavity of the channel mutual information in the input distribution, the optimal input distribution should take the following form:

$$P_X(0) = 1 - p, \qquad P_X(x) = \frac{p}{2^K - 1}, \quad \forall x \ne 0.$$

We can find that the channel mutual information per channel use is

$$\frac{I(X; Y)}{K} = \frac{1}{K} \left\{ H_2(pr) + p \left[ r \log(2^K - 1) - H_2(r) \right] \right\}, \qquad (15)$$

and that the average distortion constraint is

$$(1 - p) r \le D, \qquad (16)$$

the same as in the scalar multiplicative channel case. After some manipulation, we find that the resulting optimal solution for general $K \ge 1$ is as follows.

Case 1, $2^K > 1 + (1 - r)^{-1/r}$: $p^* = 1$ and

$$C(D) = \frac{r \log(2^K - 1)}{K}.$$

Case 2, $2^K \le 1 + (1 - r)^{-1/r}$: if $D \ge r - \left[1 + \frac{1}{2^K - 1}\, e^{H_2(r)/r}\right]^{-1} \ge 0$, then

$$p^* = \frac{1}{r} \left[1 + \frac{1}{2^K - 1}\, e^{H_2(r)/r}\right]^{-1};$$

otherwise $p^* = 1 - D/r$. In either subcase,

$$C(D) = \frac{1}{K} \left\{ H_2(p^* r) + p^* \left[ r \log(2^K - 1) - H_2(r) \right] \right\}.$$

Case 1 arises because, if the channel block length $K$ is sufficiently large that $2^K > 1 + (1 - r)^{-1/r}$, then the $p^*$ given by Case 2 would exceed one, which is impossible for a valid probability. In Case 1, we have $P_X(0) = 0$, and all the nonzero symbols are selected with equal probability $1/(2^K - 1)$. In fact, Case 1 kicks in for rather small values of $K$. In our channel model we have assumed $r \in [0, 1/2]$. For $r$ smaller than $0.175$, Case 1 arises for $K \ge 2$; for $r$ larger than $0.175$, Case 1 arises for $K \ge 3$.

In the scalar multiplicative channel ($K = 1$), we have seen that $C(D)$ scales linearly to zero as $D \to 0$; see (13). For $K > 1$, however, we have

$$C(0) = \frac{r \log(2^K - 1)}{K} > 0. \qquad (17)$$

For comparison, consider a suboptimal approach based upon training, which transmits $X = 1$ in the first channel use of each channel block. The receiver can thus perfectly estimate the channel state $S$ and achieve $D = 0$. The encoder can then use the remaining $(K - 1)$ channel uses in each block to encode information, and the resulting achievable rate is

$$R(0) = \frac{r \log\!\left(2^{K-1}\right)}{K}. \qquad (18)$$

Comparing $C(0)$ and $R(0)$, we notice that their ratio approaches one as $K \to \infty$, consistent with the intuition that training usually leads to negligible rate loss for channels with long coherence blocks.

V. FURTHER ISSUES

In this section, we briefly discuss a few issues related to the capacity-distortion function formulation.

A. Multiple Estimators and Other Cost Constraints

In certain applications, multiple cost constraints may be present. For example, the receiver may be simultaneously interested in two or more different distortion measures, or the transmitter may have an average energy constraint on the channel input in addition to the average distortion constraint. The multiple cost constraints should be simultaneously satisfied by replacing the feasible set of input distributions $\mathcal{P}_D$ in (6) with the intersection of multiple feasible sets, one for each cost constraint.

For either single or multiple cost constraints, the capacity-distortion function can be defined following Section II, formulated as a constrained channel coding problem following Section III, and computed using efficient algorithms such as the Blahut-Arimoto algorithm [11], [12] for discrete alphabets.
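As an illustration of such a computation (a sketch under assumed conventions, not an algorithm prescribed by this paper), the following Python code runs a Blahut-Arimoto style iteration in which the estimation cost $d^*(x)$ enters through a Lagrange multiplier $\lambda \ge 0$; sweeping $\lambda$ traces out points $(\mathbb{E}[d^*(X)], I(X;Y))$ on the $C(D)$ curve, shown here for the scalar multiplicative channel of Section IV-B.

# A sketch of a Blahut-Arimoto style computation of C(D) for discrete
# alphabets, with the estimation cost d*(x) handled via a Lagrange multiplier.
# Array conventions, names, and the channel instance are assumptions.
import numpy as np

def blahut_arimoto_cost(W, cost, lam, n_iter=2000):
    """W[x, y] = P(y|x), cost[x] = d*(x); returns (E[d*(X)], I(X;Y)) in nats."""
    nx, _ = W.shape
    q = np.full(nx, 1.0 / nx)                          # input distribution P_X

    def kl_rows(q):
        out = q @ W                                    # output distribution P_Y
        ratio = np.where(W > 0, W / np.maximum(out, 1e-300), 1.0)
        return np.sum(W * np.log(ratio), axis=1)       # D(P(.|x) || P_Y)

    for _ in range(n_iter):
        q = q * np.exp(kl_rows(q) - lam * cost)        # Blahut update with cost term
        q /= q.sum()
    kl = kl_rows(q)
    return float(q @ cost), float(q @ kl)

# Scalar multiplicative channel with r = 0.3: rows of W are P(y|x).
r = 0.3
W = np.array([[1.0, 0.0],
              [1.0 - r, r]])
cost = np.array([r, 0.0])                              # d*(0) = r, d*(1) = 0
for lam in (0.0, 2.0, 10.0):
    print(lam, blahut_arimoto_cost(W, cost, lam))

Setting $\lambda = 0$ recovers the unconstrained capacity point, while larger $\lambda$ pushes the input distribution toward estimation-efficient letters, i.e., toward smaller average distortion.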
B. Uncertainty in Channel State Statistics

The constrained channel coding formulation in Section III can also be extended to the case in which the distribution of the channel state $S$ is uncertain. For such a compound channel setting, we assume that the joint channel distribution $P_\theta(x, s, y) = P(y|x, s)\, P_X(x)\, P_{S,\theta}(s)$ is parametrized by an unknown parameter $\theta \in \Theta$, induced by the parametrized distribution of $S$, $P_{S,\theta}(s)$. If all the alphabets $\mathcal{X}$, $\mathcal{Y}$, and $\mathcal{S}$ are discrete, we can show, following the proof in [13], that the capacity-distortion function of the compound channel is

$$\sup_{P_X \in \mathcal{P}_D} \inf_{\theta \in \Theta} I_\theta(X; Y), \qquad (19)$$

where

$$\mathcal{P}_D = \left\{ P_X : \sum_{x \in \mathcal{X}} P_X(x)\, d_\theta^*(x) \le D, \; \forall \theta \in \Theta \right\}. \qquad (20)$$

In $I_\theta(X; Y)$ and $d_\theta^*(x)$, the subscript $\theta$ denotes that they are evaluated with respect to $P_\theta(x, s, y)$.

C. Capacity Per Unit Distortion

In light of the definition of channel capacity per unit cost for general cost-constrained channels [14], we can analogously define the capacity per unit distortion and show that it is equal to

$$C_d = \sup_{P_X} \frac{I(X; Y)}{\mathbb{E}[d^*(X)]}.$$

The capacity per unit distortion quantifies the maximum efficiency, measured by the ratio between the amount of transmitted information and the incurred distortion in channel state estimation. From [14], if $d^*(x) = 0$ for at least two different input letters, then $C_d = \infty$; if there exists a unique $x_0 \in \mathcal{X}$ with $d^*(x_0) = 0$, then $C_d$ is also given by

$$C_d = \sup_{x \in \mathcal{X},\, x \ne x_0} \frac{D\!\left(P_{Y|x} \,\middle\|\, P_{Y|x_0}\right)}{d^*(x)}, \qquad (21)$$

where $D(\cdot\|\cdot)$ denotes the Kullback-Leibler divergence between two distributions. Note that in $P_{Y|X}$ we marginalize over the channel state $S$. Given (21), we can conveniently evaluate $C_d$ for various channels. For example, the scalar multiplicative channel in Section IV-B has $C_d = \frac{-\log(1-r)}{r}$, which matches the slope of $C(D)$ at $D \to 0$ in (13). In contrast, block multiplicative channels in Section IV-C with $K \ge 2$ have $C_d = \infty$, because all input letters except $0$ lead to $d^*(\cdot) = 0$.
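For discrete channels with a unique zero-cost letter $x_0$, (21) is straightforward to evaluate numerically. The following Python sketch (array layout and names are assumptions) marginalizes the state out of $P(y|x,s)$, computes the Kullback-Leibler divergences, and verifies the closed-form value for the scalar multiplicative channel.

# A sketch evaluating the capacity per unit distortion via Eq. (21) for a
# discrete channel with a unique zero-cost input letter x0.  P_{Y|x} is
# obtained by marginalizing the state out of P(y|x,s).
import numpy as np

def capacity_per_unit_distortion(P_y_given_xs, P_S, cost):
    """cost[x] = d*(x); assumes exactly one x0 with cost[x0] == 0."""
    P_y_given_x = np.einsum("s,xsy->xy", P_S, P_y_given_xs)   # marginalize S
    x0 = int(np.argmin(cost))
    best = 0.0
    for x in range(len(cost)):
        if x == x0:
            continue
        p, q = P_y_given_x[x], P_y_given_x[x0]
        mask = p > 0
        kl = float(np.sum(p[mask] * np.log(p[mask] / q[mask])))   # D(P_{Y|x} || P_{Y|x0})
        best = max(best, kl / cost[x])
    return best

# Scalar multiplicative channel: x0 = 1, and C_d = -log(1 - r) / r.
r = 0.3
P = np.zeros((2, 2, 2))
for x in (0, 1):
    for s in (0, 1):
        P[x, s, s * x] = 1.0
print(capacity_per_unit_distortion(P, np.array([1 - r, r]), np.array([r, 0.0])))
print(-np.log(1 - r) / r)   # closed-form check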
VI. CONCLUSIONS

In this paper, we introduce a joint communication and channel estimation problem for state-dependent channels, and characterize its fundamental tradeoff by formulating it as a channel coding problem with the input distribution constrained by an average "estimation cost" constraint. The resulting capacity-distortion function permits a systematic investigation of a channel's capability for communication and state estimation. Future research topics include specializing the general framework to particular channel models in realistic applications, and generalizing the results to multiuser systems and to channels with correlated state processes.

ACKNOWLEDGMENT

This work has been supported in part by NSF OCE0520324, the Annenberg Foundation, and the University of Southern California.

REFERENCES

[1] R. Szewczyk, E. Osterweil, J. Polastre, M. Hamilton, A. Mainwaring, and D. Estrin, "Habitat Monitoring with Sensor Networks," Communications of the ACM, vol. 47, no. 6, pp. 34–40, Jun. 2004.
[2] M. Stojanovic, "Recent Advances in High-Speed Underwater Acoustic Communications," IEEE J. Oceanic Eng., vol. 21, no. 2, pp. 125–137, Apr. 1996.
[3] S. Haykin, "Cognitive Radio: Brain-Empowered Wireless Communications," IEEE J. Select. Areas Commun., vol. 23, no. 2, pp. 201–220, Feb. 2005.
[4] D. Guo, S. Shamai (Shitz), and S. Verdú, "Mutual Information and Minimum Mean-Square Error in Gaussian Channels," IEEE Trans. Inform. Theory, vol. 51, no. 4, pp. 1261–1281, Apr. 2005.
[5] R. McEliece and W. Stark, "Channels with Block Interference," IEEE Trans. Inform. Theory, vol. 30, no. 1, pp. 44–53, Jan. 1984.
[6] B. Hassibi and B. Hochwald, "How Much Training is Needed in Multiple-Antenna Wireless Links?" IEEE Trans. Inform. Theory, vol. 49, no. 4, pp. 951–963, Apr. 2003.
[7] A. Sutivong, M. Chiang, T. M. Cover, and Y.-H. Kim, "Channel Capacity and State Estimation for State-Dependent Gaussian Channels," IEEE Trans. Inform. Theory, vol. 51, no. 4, pp. 1486–1496, Apr. 2005.
[8] T. M. Cover, Y.-H. Kim, and A. Sutivong, "Simultaneous Communication of Data and State," [Online] Available at ArXiv, 2007.
[9] T. M. Cover and J. A. Thomas, Elements of Information Theory, John Wiley & Sons, Inc., 1991.
[10] R. G. Gallager, Information Theory and Reliable Communication, Wiley, 1968.
[11] R. E. Blahut, "Computation of Channel Capacity and Rate-Distortion Functions," IEEE Trans. Inform. Theory, vol. 18, no. 4, pp. 460–478, Jul. 1972.
[12] S. Arimoto, "An Algorithm for Calculating the Capacity of an Arbitrary Discrete Memoryless Channel," IEEE Trans. Inform. Theory, vol. 18, no. 1, pp. 14–20, Jan. 1972.
[13] D. Blackwell, L. Breiman, and A. J. Thomasian, "The Capacity of a Class of Channels," The Annals of Mathematical Statistics, vol. 30, no. 4, pp. 1229–1241, Dec. 1959.
[14] S. Verdú, "On Channel Capacity Per Unit Cost," IEEE Trans. Inform. Theory, vol. 36, no. 5, pp. 1019–1030, Sep. 1990.