Multipath Channels of Unbounded Capacity

The capacity of discrete-time, noncoherent, multipath fading channels is considered. It is shown that if the variances of the path gains decay faster than exponentially, then capacity is unbounded in the transmit power.

Authors: Tobias Koch, Amos Lapidoth

ETH Zurich, Switzerland
Sternwartstrasse 7, CH-8092 Zurich
Email: {tkoch, lapidoth}@isi.ee.ethz.ch

Index Terms — Channel capacity, information rates, multipath channels, fading channels, noncoherent.

1. INTRODUCTION

This paper studies the capacity of multipath (frequency-selective) fading channels. A noncoherent channel model is considered where neither transmitter nor receiver is cognizant of the fading's realization, but both are aware of its statistics. Our focus is on the high signal-to-noise ratio (SNR) regime.

For the special case of noncoherent frequency-flat fading channels (where we have only one path), it was shown by Lapidoth & Moser [1] that if the fading process is of finite entropy rate, then at high SNR capacity grows double-logarithmically with the SNR. This is in stark contrast to the logarithmic growth of the capacity of coherent fading channels (where the realization of the fading is known to the receiver) [2]. Thus, communicating over noncoherent flat-fading channels at high SNR is power inefficient.

Recently, it has been demonstrated that communicating over noncoherent multipath fading channels at high SNR is not merely power inefficient, but may be even worse: if the delay spread is large in the sense that the variances of the path gains decay exponentially or slower, then capacity is bounded in the SNR; see [3, Thm. 1]. For such channels, capacity does not tend to infinity as the SNR tends to infinity. In contrast, if the variances of the path gains decay faster than double-exponentially, then capacity is unbounded in the SNR; see [3, Thm. 2].
This condition is certainly satisfied if the number of paths is finite, i.e., if the channel output is only influenced by the present and by the L previous channel inputs. (Here only the variances of the first (L+1) path gains are positive, while the other variances are zero.) It was shown in [4] that in this case capacity is not only unbounded in the SNR, but its growth with the SNR is also independent of the number of paths and equals the growth of the capacity of noncoherent frequency-flat fading channels, i.e.,

$$\lim_{\mathsf{SNR} \to \infty} \frac{C(\mathsf{SNR})}{\log\log \mathsf{SNR}} = 1.$$

Thus, for finite L, the capacity pre-loglog is unaffected by the number of paths L.

The above results demonstrate that whether the capacity of a multipath channel is unbounded in the SNR depends critically on the decay rate of the variances of the path gains. However, [3, Thm. 1] only accounts for decay rates that are exponential or slower, whereas [3, Thm. 2] only regards decay rates that are faster than double-exponential. Thus, [3, Thm. 1] & [3, Thm. 2] fail to characterize the capacity of channels for which the variances of the path gains decay faster than exponentially but slower than double-exponentially. In this paper, we bridge this gap by showing that if the variances of the path gains decay faster than exponentially, then capacity is unbounded in the SNR.

1.1. Channel Model

Let ℂ and ℕ denote the set of complex numbers and the set of positive integers, respectively. We consider a discrete-time multipath fading channel whose channel output Y_k ∈ ℂ at time k ∈ ℕ, corresponding to the time-1 through time-k channel inputs x_1, ..., x_k ∈ ℂ, is given by

$$Y_k = \sum_{\ell=0}^{k-1} H^{(\ell)}_k x_{k-\ell} + Z_k, \qquad k \in \mathbb{N}. \tag{1}$$

Here {Z_k} models additive noise, and H^{(ℓ)}_k denotes the time-k gain of the ℓ-th path.
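As a concrete illustration of the input-output relation (1) (ours, not from the paper), the following sketch simulates the channel for one block of inputs. The i.i.d. complex-Gaussian draw of the path gains and the helper name `simulate_multipath` are simplifying assumptions; the paper only requires zero-mean stationary, possibly non-Gaussian gains.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_multipath(x, alphas, sigma2=1.0):
    """Simulate one realization of (1): Y_k = sum_l H_k^(l) x_{k-l} + Z_k.
    Path gains are drawn i.i.d. circularly-symmetric complex Gaussian with
    variances `alphas` (an illustrative simplification)."""
    n = len(x)
    y = np.zeros(n, dtype=complex)
    for k in range(1, n + 1):            # time index k = 1..n
        acc = 0.0 + 0.0j
        for l in range(k):               # paths l = 0..k-1
            if l < len(alphas) and alphas[l] > 0:
                h = np.sqrt(alphas[l] / 2) * (rng.standard_normal()
                                              + 1j * rng.standard_normal())
                acc += h * x[k - 1 - l]
        z = np.sqrt(sigma2 / 2) * (rng.standard_normal()
                                   + 1j * rng.standard_normal())
        y[k - 1] = acc + z
    return y

x = rng.standard_normal(16) + 1j * rng.standard_normal(16)
alphas = [np.exp(-l**2) for l in range(16)]   # faster-than-exponential decay
y = simulate_multipath(x, alphas)
print(y.shape)
```

With this decay profile only the first few paths contribute noticeably, which is exactly the regime Theorem 2 below addresses.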
We assume that {Z_k} is a sequence of independent and identically distributed (IID), zero-mean, variance-σ², circularly-symmetric, complex Gaussian random variables. For each path ℓ ∈ ℕ₀ (where ℕ₀ denotes the set of nonnegative integers), we assume that {H^{(ℓ)}_k, k ∈ ℕ} is a zero-mean, complex stationary process. We denote its variance and its differential entropy rate by

$$\alpha_\ell \triangleq \mathsf{E}\Big[\big|H^{(\ell)}_k\big|^2\Big], \quad \ell \in \mathbb{N}_0 \tag{2}$$

and

$$h_\ell \triangleq \lim_{n \to \infty} \frac{1}{n} h\big(H^{(\ell)}_1, \ldots, H^{(\ell)}_n\big), \quad \ell \in \mathbb{N}_0. \tag{3}$$

Without loss of generality we assume that α₀ > 0. We further assume that

$$\sum_{\ell=0}^{\infty} \alpha_\ell \triangleq \alpha < \infty \tag{4}$$

and

$$\inf_{\ell \in \mathcal{L}} h_\ell > -\infty, \tag{5}$$

where the set 𝓛 is defined as 𝓛 ≜ {ℓ ∈ ℕ₀ : α_ℓ > 0}.

We finally assume that the processes {H^{(0)}_k, k ∈ ℕ}, {H^{(1)}_k, k ∈ ℕ}, ... are independent ("uncorrelated scattering"); that they are jointly independent of {Z_k}; and that the joint law of ({Z_k}, {H^{(0)}_k, k ∈ ℕ}, {H^{(1)}_k, k ∈ ℕ}, ...) does not depend on the input sequence {x_k}.

We consider a noncoherent channel model where neither transmitter nor receiver is cognizant of the realization of {H^{(ℓ)}_k, k ∈ ℕ}, ℓ ∈ ℕ₀, but both are aware of their law. We do not assume that the path gains are Gaussian.

1.2. Channel Capacity

Let A^n_m denote the sequence A_m, ..., A_n. We define the capacity as

$$C(\mathsf{SNR}) \triangleq \lim_{n \to \infty} \frac{1}{n} \sup I\big(X_1^n; Y_1^n\big), \tag{6}$$

where the supremum is over all joint distributions on X_1, ..., X_n satisfying the power constraint

$$\frac{1}{n} \sum_{k=1}^{n} \mathsf{E}\big[|X_k|^2\big] \le P, \tag{7}$$

and where SNR is defined as

$$\mathsf{SNR} \triangleq \frac{P}{\sigma^2}. \tag{8}$$

By Fano's inequality, no rate above C(SNR) is achievable. (See [5] for a definition of an achievable rate.) We do not claim that there is a coding theorem associated with (6), i.e., that C(SNR) is achievable.
A coding theorem will hold, for example, if there are only (L+1) paths (for some L < ∞), and if the processes corresponding to these paths {H^{(0)}_k, k ∈ ℕ}, ..., {H^{(L)}_k, k ∈ ℕ} are jointly ergodic; see [6].

In [3], a necessary and a sufficient condition for C(SNR) to be bounded in SNR was derived:

Theorem 1. Consider the above channel model. Then

$$\Big(\lim_{\ell \to \infty} \frac{\alpha_{\ell+1}}{\alpha_\ell} > 0\Big) \implies \Big(\sup_{\mathsf{SNR} > 0} C(\mathsf{SNR}) < \infty\Big) \tag{9}$$

and

$$\Big(\lim_{\ell \to \infty} \frac{1}{\ell} \log\log\frac{1}{\alpha_\ell} = \infty\Big) \implies \Big(\sup_{\mathsf{SNR} > 0} C(\mathsf{SNR}) = \infty\Big), \tag{10}$$

where we define a/0 ≜ ∞ for every a > 0 and 0/0 ≜ 0.

Proof. For the first condition (9) see [3, Thm. 1], and for the second condition (10) see [3, Thm. 2].

For example, when α_ℓ = e^{−ℓ}, then capacity is bounded, and when α_ℓ = exp(−exp(ℓ^κ)) for some κ > 1, then capacity is unbounded. Roughly speaking, when {α_ℓ} decays exponentially or slower, then C(SNR) is bounded in SNR, and when {α_ℓ} decays faster than double-exponentially, then C(SNR) is unbounded in SNR.

1.3. Main Result

Our main result is an improved achievability result. We derive a weaker condition that suffices to guarantee that capacity is unbounded in the SNR.

Theorem 2. Consider the above channel model. Then

$$\Big(\lim_{\ell \to \infty} \frac{1}{\ell} \log\frac{1}{\alpha_\ell} = \infty\Big) \implies \Big(\sup_{\mathsf{SNR} > 0} C(\mathsf{SNR}) = \infty\Big), \tag{11}$$

where we define 1/0 ≜ ∞.

Proof. See Section 2.

By noting that

$$\Big(\lim_{\ell \to \infty} \frac{\alpha_{\ell+1}}{\alpha_\ell} = 0\Big) \implies \Big(\lim_{\ell \to \infty} \frac{1}{\ell} \log\frac{1}{\alpha_\ell} = \infty\Big)$$

we obtain from Theorems 1 & 2 the immediate corollary:

Corollary 3. Consider the above channel model. Then

$$i) \quad \Big(\lim_{\ell \to \infty} \frac{\alpha_{\ell+1}}{\alpha_\ell} > 0\Big) \implies \Big(\sup_{\mathsf{SNR} > 0} C(\mathsf{SNR}) < \infty\Big) \tag{12}$$

$$ii) \quad \Big(\lim_{\ell \to \infty} \frac{\alpha_{\ell+1}}{\alpha_\ell} = 0\Big) \implies \Big(\sup_{\mathsf{SNR} > 0} C(\mathsf{SNR}) = \infty\Big), \tag{13}$$

where we define a/0 ≜ ∞ for every a > 0 and 0/0 ≜ 0.

For example, when α_ℓ = exp(−ℓ^κ) for some κ > 1, then capacity is unbounded.
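To make the decay-rate conditions concrete, the following small numeric sketch (ours, not from the paper) evaluates the quantity (1/ℓ) log(1/α_ℓ), whose divergence Theorem 2 requires, for the example profiles α_ℓ = e^{−ℓ} and α_ℓ = e^{−ℓ²}. Working directly with −log α_ℓ avoids floating-point underflow for very small α_ℓ.

```python
def theorem2_test(neg_log_alpha, ells):
    """Evaluate (1/l) * log(1/alpha_l) for each l, in the log domain:
    `neg_log_alpha(l)` returns -log(alpha_l)."""
    return [neg_log_alpha(l) / l for l in ells]

ells = [10, 100, 1000]
# alpha_l = e^{-l}: exactly exponential decay, the test is constant (= 1),
# so Theorem 2 does not apply -- and by (9) capacity is in fact bounded.
exp_decay = theorem2_test(lambda l: float(l), ells)
# alpha_l = e^{-l^2}: faster-than-exponential decay, the test grows like l,
# so Theorem 2 gives unbounded capacity.
super_exp = theorem2_test(lambda l: float(l * l), ells)
print(exp_decay)   # [1.0, 1.0, 1.0]
print(super_exp)   # [10.0, 100.0, 1000.0]
```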
Theorem 2 and Corollary 3 demonstrate that when {α_ℓ} decays faster than exponentially, then C(SNR) is unbounded in SNR, thus bridging the gap between (9) and (10).

2. PROOF OF THEOREM 2

In order to prove Theorem 2, we shall derive in Section 2.1 a lower bound on capacity and then show in Section 2.2 that this bound can be made arbitrarily large, provided that

$$\lim_{\ell \to \infty} \frac{1}{\ell} \log\frac{1}{\alpha_\ell} = \infty.$$

2.1. Capacity Lower Bound

To derive a lower bound on capacity, we evaluate (1/n) I(X_1^n; Y_1^n) for the following distribution on the inputs {X_k}. Let L(P) be such that

$$\sum_{\ell=L(P)+1}^{\infty} \alpha_\ell \cdot P \le \sigma^2. \tag{14}$$

To shorten notation, we shall write in the following L instead of L(P). Let τ ∈ ℕ be some positive integer that possibly depends on L, and let

$$\mathbf{X}_b = \big(X_{b(L+\tau)+1}, \ldots, X_{(b+1)(L+\tau)}\big).$$

We choose {X_b} to be IID with

$$\mathbf{X}_b = \big(\underbrace{0, \ldots, 0}_{L}, \tilde{X}_{b\tau+1}, \ldots, \tilde{X}_{(b+1)\tau}\big),$$

where X̃_{bτ+1}, ..., X̃_{(b+1)τ} is a sequence of independent, zero-mean, circularly-symmetric, complex random variables with log|X̃_{bτ+ν}|² being uniformly distributed over the interval [log P^{(ν−1)/τ}, log P^{ν/τ}], i.e., for each ν = 1, ..., τ

$$\log|\tilde{X}_{b\tau+\nu}|^2 \sim \mathcal{U}\big[\log P^{(\nu-1)/\tau}, \log P^{\nu/\tau}\big].$$

(Here and throughout this proof we assume that P > 1.)

Let κ ≜ ⌊n/(L+τ)⌋ (where ⌊a⌋ denotes the largest integer that is less than or equal to a), and let Y_b denote the vector (Y_{b(L+τ)+1}, ..., Y_{(b+1)(L+τ)}). By the chain rule for mutual information [5, Thm. 2.5.2] we have

$$I\big(X_1^n; Y_1^n\big) \ge I\big(\mathbf{X}_0^{\kappa-1}; \mathbf{Y}_0^{\kappa-1}\big) = \sum_{b=0}^{\kappa-1} I\big(\mathbf{X}_b; \mathbf{Y}_0^{\kappa-1} \,\big|\, \mathbf{X}_0^{b-1}\big) \ge \sum_{b=0}^{\kappa-1} I(\mathbf{X}_b; \mathbf{Y}_b), \tag{15}$$

where the first inequality follows by restricting the number of observables; and where the last inequality follows by restricting the number of observables and by noting that {X_b} is IID.
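The input distribution of this coding scheme, L guard zeros followed by τ symbols whose log-squared-magnitude is uniform on staggered intervals, can be sketched as follows; the helper name `draw_block` is ours for illustration.

```python
import math
import random

random.seed(1)

def draw_block(L, tau, P):
    """Draw one input block X_b: L guard zeros followed by tau independent
    circularly-symmetric symbols with log|X|^2 uniform on
    [log P^{(v-1)/tau}, log P^{v/tau}] for v = 1..tau."""
    block = [0.0j] * L
    for v in range(1, tau + 1):
        lo = (v - 1) / tau * math.log(P)
        hi = v / tau * math.log(P)
        log_mag2 = random.uniform(lo, hi)
        phase = random.uniform(0.0, 2 * math.pi)      # circular symmetry
        mag = math.sqrt(math.exp(log_mag2))
        block.append(mag * complex(math.cos(phase), math.sin(phase)))
    return block

block = draw_block(L=3, tau=4, P=100.0)
print(len(block))   # L + tau = 7 symbols
```

The guard zeros keep each block's nonzero symbols clear of inter-block interference from the first L paths, which is what makes the per-symbol bound (16) below possible.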
We continue by lower bounding each summand on the right-hand side (RHS) of (15). We use again the chain rule and that reducing observations cannot increase mutual information to obtain

$$I(\mathbf{X}_b; \mathbf{Y}_b) = \sum_{\nu=1}^{\tau} I\big(\tilde{X}_{b\tau+\nu}; \mathbf{Y}_b \,\big|\, \tilde{X}_{b\tau+1}^{b\tau+\nu-1}\big) \ge \sum_{\nu=1}^{\tau} I\big(\tilde{X}_{b\tau+\nu}; Y_{b(L+\tau)+L+\nu} \,\big|\, \tilde{X}_{b\tau+1}^{b\tau+\nu-1}\big) \ge \sum_{\nu=1}^{\tau} I\big(\tilde{X}_{b\tau+\nu}; Y_{b(L+\tau)+L+\nu}\big), \tag{16}$$

where we have additionally used in the last inequality that X̃_{bτ+1}, ..., X̃_{(b+1)τ} are independent. Defining

$$W_{b\tau+\nu} \triangleq \sum_{\ell=1}^{b(L+\tau)+L+\nu-1} H^{(\ell)}_{b(L+\tau)+L+\nu} X_{b(L+\tau)+L+\nu-\ell} + Z_{b(L+\tau)+L+\nu}, \tag{17}$$

each summand on the RHS of (16) can be written as

$$I\big(\tilde{X}_{b\tau+\nu}; Y_{b(L+\tau)+L+\nu}\big) = I\big(\tilde{X}_{b\tau+\nu}; H^{(0)}_{b(L+\tau)+L+\nu} \tilde{X}_{b\tau+\nu} + W_{b\tau+\nu}\big). \tag{18}$$

A lower bound on (18) follows from the following lemma.

Lemma 4. Let the random variables X, H, and W have finite second moments. Assume that both X and H are of finite differential entropy. Finally, assume that X is independent of H; that X is independent of W; and that X ⊸−− H ⊸−− W forms a Markov chain. Then

$$I(X; HX + W) \ge h(X) - \mathsf{E}\big[\log|X|^2\big] + \mathsf{E}\big[\log|H|^2\big] - \mathsf{E}\Bigg[\log\bigg(\pi e \Big(\sigma_H + \frac{\sigma_W}{|X|}\Big)^2\bigg)\Bigg], \tag{19}$$

where σ_W² ≥ 0 and σ_H² > 0 denote the variances of W and H. (Note that the assumptions that X and H have finite second moments and are of finite differential entropy guarantee that E[log|X|²] and E[log|H|²] are finite; see [1, Lemma 6.7e].)

Proof. See [7, Lemma 4].

It can be easily verified that for the channel model given in Section 1.1 and for the above coding scheme the lemma's conditions are satisfied. We therefore obtain from Lemma 4

$$I\big(\tilde{X}_{b\tau+\nu}; H^{(0)}_{b(L+\tau)+L+\nu} \tilde{X}_{b\tau+\nu} + W_{b\tau+\nu}\big) \ge h\big(\tilde{X}_{b\tau+\nu}\big) - \mathsf{E}\big[\log|\tilde{X}_{b\tau+\nu}|^2\big] + \mathsf{E}\Big[\log\big|H^{(0)}_{b(L+\tau)+L+\nu}\big|^2\Big] - \mathsf{E}\Bigg[\log\bigg(\pi e \Big(\sqrt{\alpha_0} + \frac{\sqrt{\mathsf{E}[|W_{b\tau+\nu}|^2]}}{|\tilde{X}_{b\tau+\nu}|}\Big)^2\bigg)\Bigg]. \tag{20}$$

Using that the differential entropy of a circularly-symmetric random variable is given by (see [1, Eqs. (320) & (316)])

$$h\big(\tilde{X}_{b\tau+\nu}\big) = \mathsf{E}\big[\log|\tilde{X}_{b\tau+\nu}|^2\big] + h\big(\log|\tilde{X}_{b\tau+\nu}|^2\big) + \log \pi, \tag{21}$$

and evaluating h(log|X̃_{bτ+ν}|²) for our choice of X̃_{bτ+ν}, yields for the first two terms on the RHS of (20)

$$h\big(\tilde{X}_{b\tau+\nu}\big) - \mathsf{E}\big[\log|\tilde{X}_{b\tau+\nu}|^2\big] = \log\log P^{1/\tau} + \log \pi. \tag{22}$$

We next upper bound

$$\frac{\mathsf{E}\big[|W_{b\tau+\nu}|^2\big]}{|\tilde{X}_{b\tau+\nu}|^2} = \sum_{\ell=1}^{L} \alpha_\ell \frac{\mathsf{E}\big[|X_{b(L+\tau)+L+\nu-\ell}|^2\big]}{|\tilde{X}_{b\tau+\nu}|^2} + \sum_{\ell=L+1}^{b(L+\tau)+L+\nu-1} \alpha_\ell \frac{\mathsf{E}\big[|X_{b(L+\tau)+L+\nu-\ell}|^2\big]}{|\tilde{X}_{b\tau+\nu}|^2} + \frac{\sigma^2}{|\tilde{X}_{b\tau+\nu}|^2}. \tag{23}$$

To this end we note that for our choice of {X_k} and by the assumption that P > 1, we have

$$\mathsf{E}\big[|X_\ell|^2\big] \le P, \quad \ell \in \mathbb{N}, \tag{24}$$

$$\mathsf{E}\big[|X_{b(L+\tau)+L+\nu-\ell}|^2\big] \le P^{(\nu-\ell)/\tau}, \quad \ell = 1, \ldots, L, \tag{25}$$

and

$$|\tilde{X}_{b\tau+\nu}|^2 \ge P^{(\nu-1)/\tau} \ge 1, \tag{26}$$

from which we obtain

$$\frac{\mathsf{E}\big[|X_{b(L+\tau)+L+\nu-\ell}|^2\big]}{|\tilde{X}_{b\tau+\nu}|^2} \le \frac{P^{(\nu-\ell)/\tau}}{P^{(\nu-1)/\tau}} \le 1, \quad \ell = 1, \ldots, L \tag{27}$$

and

$$\frac{\mathsf{E}\big[|X_{b(L+\tau)+L+\nu-\ell}|^2\big]}{|\tilde{X}_{b\tau+\nu}|^2} \le P, \quad L < \ell < b(L+\tau)+L+\nu. \tag{28}$$

Applying (26)–(28) to (23) yields

$$\frac{\mathsf{E}\big[|W_{b\tau+\nu}|^2\big]}{|\tilde{X}_{b\tau+\nu}|^2} \le \sum_{\ell=1}^{L} \alpha_\ell + \sum_{\ell=L+1}^{b(L+\tau)+L+\nu-1} \alpha_\ell \cdot P + \sigma^2 \le \alpha + \sum_{\ell=L+1}^{\infty} \alpha_\ell \cdot P + \sigma^2 \le \alpha + 2\sigma^2, \tag{29}$$

with α being defined in (4). Here the second inequality follows because α_ℓ, ℓ ∈ ℕ₀, and P are nonnegative, and the last inequality follows from (14).

By combining (20) with (22) & (29), and by noting that by the stationarity of {H^{(0)}_k, k ∈ ℕ}

$$\mathsf{E}\Big[\log\big|H^{(0)}_{b(L+\tau)+L+\nu}\big|^2\Big] = \mathsf{E}\Big[\log\big|H^{(0)}_1\big|^2\Big],$$

we obtain the lower bound

$$I\big(\tilde{X}_{b\tau+\nu}; H^{(0)}_{b(L+\tau)+L+\nu} \tilde{X}_{b\tau+\nu} + W_{b\tau+\nu}\big) \ge \log\log P^{1/\tau} + \mathsf{E}\Big[\log\big|H^{(0)}_1\big|^2\Big] - 1 - 2\log\big(\sqrt{\alpha_0} + \sqrt{\alpha + 2\sigma^2}\big). \tag{30}$$

Note that the RHS of (30) depends neither on ν nor on b.
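The step from (21) to (22) uses that the differential entropy of a uniform distribution equals the logarithm of its interval length, so h(log|X̃_{bτ+ν}|²) = log((1/τ) log P) = log log P^{1/τ}. A quick numeric sanity check of this identity (ours, not from the paper):

```python
import math

def h_uniform(a, b):
    """Differential entropy (in nats) of a uniform distribution on [a, b]."""
    return math.log(b - a)

P, tau = 1e4, 8
# log|X|^2 is uniform on an interval of length (1/tau) * log P, so its
# differential entropy is log((1/tau) * log P) = log log P^{1/tau}:
lhs = h_uniform(0.0, math.log(P) / tau)
rhs = math.log(math.log(P ** (1.0 / tau)))
print(abs(lhs - rhs) < 1e-12)   # True
```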
We therefore have from (30), (16), and (15)

$$I\big(X_1^n; Y_1^n\big) \ge \kappa\tau \log\log P^{1/\tau} + \kappa\tau \Upsilon, \tag{31}$$

where we define Υ as

$$\Upsilon \triangleq \mathsf{E}\Big[\log\big|H^{(0)}_1\big|^2\Big] - 1 - 2\log\big(\sqrt{\alpha_0} + \sqrt{\alpha + 2\sigma^2}\big). \tag{32}$$

Dividing the RHS of (31) by n, and computing the limit as n tends to infinity, yields the lower bound on capacity

$$C(\mathsf{SNR}) \ge \frac{\tau}{L+\tau} \log\log P^{1/\tau} + \frac{\tau}{L+\tau} \Upsilon, \quad P > 1, \tag{33}$$

where we have used that lim_{n→∞} κ/n = 1/(L+τ).

2.2. Unbounded Capacity

We next show that

$$\lim_{\ell \to \infty} \frac{1}{\ell} \log\frac{1}{\alpha_\ell} = \infty \tag{34}$$

implies that the RHS of (33) can be made arbitrarily large. To this end we note that by (34) we can find for every 0 < ρ < 1 an ℓ₀ ∈ ℕ such that

$$\alpha_\ell < \rho^\ell, \quad \ell > \ell_0. \tag{35}$$

We therefore have

$$\sum_{\ell=\ell'+1}^{\infty} \alpha_\ell < \sum_{\ell=\ell'+1}^{\infty} \rho^\ell = \frac{\rho^{\ell'+1}}{1-\rho}, \quad \ell' \ge \ell_0. \tag{36}$$

We choose L so that it satisfies

$$\frac{\rho^{L+1}}{1-\rho} P \le \sigma^2, \tag{37}$$

i.e., we choose

$$L = \left\lceil \frac{\log\big(\mathsf{SNR}\,\rho/(1-\rho)\big)}{\log(1/\rho)} \right\rceil \tag{38}$$

(where ⌈a⌉ denotes the smallest integer that is greater than or equal to a). We shall argue next that this choice also satisfies (14). Indeed, we have by (38) that L tends to infinity as SNR → ∞, which implies that, for sufficiently large SNR, L is greater than ℓ₀. It follows then from (36) and (37) that

$$\sum_{\ell=L+1}^{\infty} \alpha_\ell \cdot P < \frac{\rho^{L+1}}{1-\rho} P \le \sigma^2. \tag{39}$$

We continue by evaluating the RHS of (33) for our choice of L in (38) and for τ = L:

$$C(\mathsf{SNR}) \ge \frac{\tau}{L+\tau} \log\log P^{1/\tau} + \frac{\tau}{L+\tau} \Upsilon = \frac{1}{2} \log\Big(\frac{\log P}{L}\Big) + \frac{1}{2} \Upsilon. \tag{40}$$

Taking the limit as SNR tends to infinity yields

$$\lim_{\mathsf{SNR} \to \infty} C(\mathsf{SNR}) \ge \lim_{\mathsf{SNR} \to \infty} \frac{1}{2} \log\Bigg(\frac{\log(\mathsf{SNR} \cdot \sigma^2)}{\log\big(\mathsf{SNR}\,\rho/(1-\rho)\big)/\log(1/\rho)}\Bigg) + \frac{1}{2} \Upsilon = \frac{1}{2} \log\log\frac{1}{\rho} + \frac{1}{2} \Upsilon. \tag{41}$$

As this can be made arbitrarily large by choosing ρ sufficiently small, we conclude that lim_{ℓ→∞} (1/ℓ) log(1/α_ℓ) = ∞ implies that C(SNR) is unbounded in SNR.

3. SUMMARY

We studied the capacity of discrete-time, noncoherent, multipath fading channels.
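The bound (40) with τ = L and L chosen as in (38) can be evaluated numerically to see the convergence toward the limit in (41). Everything here is an illustrative sketch: the helper name `lower_bound` and the placeholder value Υ = 0 are ours, and Υ in general depends on the fading law via (32).

```python
import math

def lower_bound(snr, rho, sigma2=1.0, upsilon=0.0):
    """Evaluate the capacity lower bound (40) for tau = L, with L chosen
    as in (38). `upsilon` stands in for the constant Upsilon of (32)."""
    P = snr * sigma2
    L = math.ceil(math.log(snr * rho / (1 - rho)) / math.log(1 / rho))
    return 0.5 * math.log(math.log(P) / L) + 0.5 * upsilon

rho = 0.1
# The SNR -> infinity value from (41): (1/2) log log(1/rho) + Upsilon/2.
limit = 0.5 * math.log(math.log(1 / rho))
for snr in (1e3, 1e6, 1e12, 1e24):
    print(snr, lower_bound(snr, rho), limit)
```

Shrinking `rho` (a faster guaranteed decay of the α_ℓ) raises the limiting value, which is exactly why the bound can be made arbitrarily large under (34).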
It was shown that if the variances of the path gains decay faster than exponentially, then capacity is unbounded in the SNR. This complements previous results obtained in [3] and [4]. The overall picture looks as follows:

• If the number of paths is infinite in the sense that the channel output is influenced by the present and by all previous channel inputs, and if the variances of the path gains decay exponentially or slower, then capacity is bounded even as the SNR grows without bound.

• If the number of paths is infinite but the variances of the path gains decay faster than exponentially, then capacity tends to infinity as SNR → ∞.

• If the number of paths is finite, then, irrespective of the number of paths, the capacity pre-loglog is 1. Thus, in this case the multipath behavior has no significant effect on the high-SNR capacity.

We thus see that the high-SNR behavior of the capacity of noncoherent multipath fading channels depends critically on the assumed channel model. Consequently, when studying such channels at high SNR, the channel modeling is crucial, as slight changes in the model might lead to completely different capacity results.

4. ACKNOWLEDGMENT

The authors wish to thank Olivier Leveque and Nihar Jindal for their comments, which were the inspiration for the proof of Theorem 2.

5. REFERENCES

[1] A. Lapidoth and S. M. Moser, "Capacity bounds via duality with applications to multiple-antenna systems on flat fading channels," IEEE Trans. Inform. Theory, vol. 49, no. 10, pp. 2426–2467, Oct. 2003.

[2] T. H. E. Ericson, "A Gaussian channel with slow fading," IEEE Trans. Inform. Theory, vol. 16, no. 3, pp. 353–355, May 1970.

[3] T. Koch and A. Lapidoth, "Multipath channels of bounded capacity," in Proc. Inform. Theory Workshop (ITW), Porto, Portugal, May 5–9, 2008.

[4] ——, "On multipath fading channels at high SNR," in Proc. IEEE Int. Symposium on Inf. Theory, Toronto, Canada, July 6–11, 2008.

[5] T. M. Cover and J. A. Thomas, Elements of Information Theory. John Wiley & Sons, 1991.

[6] Y.-H. Kim, "A coding theorem for a class of stationary channels with feedback," IEEE Trans. Inform. Theory, vol. 54, no. 4, pp. 1488–1499, Apr. 2008.

[7] A. Lapidoth, "On the asymptotic capacity of stationary Gaussian fading channels," IEEE Trans. Inform. Theory, vol. 51, no. 2, pp. 437–446, Feb. 2005.
