Giannakis, G.B., "Cyclostationary Signal Analysis," Digital Signal Processing Handbook, Ed. Vijay K. Madisetti and Douglas B. Williams, Boca Raton: CRC Press LLC, 1999. © 1999 by CRC Press LLC.

17 Cyclostationary Signal Analysis

Georgios B. Giannakis, University of Virginia

17.1 Introduction
17.2 Definitions, Properties, Representations
17.3 Estimation, Time-Frequency Links, Testing: Estimating Cyclic Statistics • Links with Time-Frequency Representations • Testing for Cyclostationarity
17.4 CS Signals and CS-Inducing Operations: Amplitude Modulation • Time Index Modulation • Fractional Sampling and Multivariate/Multirate Processing • Periodically Varying Systems
17.5 Application Areas: CS Signal Extraction • Identification and Modeling
17.6 Concluding Remarks
Acknowledgments
References

17.1 Introduction

Processes encountered in statistical signal processing, communications, and time series analysis applications are often assumed stationary. The plethora of available algorithms testifies to the need for processing and spectral analysis of stationary signals (see, e.g., [42]). Due to the varying nature of physical phenomena and certain man-made operations, however, time-invariance and the related notion of stationarity are often violated in practice. Hence, the study of time-varying systems and nonstationary processes is well motivated.

Research in nonstationary signals and time-varying systems has led both to the development of adaptive algorithms and to several elegant tools, including short-time (or running) Fourier transforms, time-frequency representations such as the Wigner-Ville distribution (a member of Cohen's class of distributions), Loeve's and Karhunen's expansions (leading to the notion of evolutionary spectra), and time-scale representations based on wavelet expansions (see [37, 45] and references therein). Adaptive algorithms derived from stationary models assume slow variations in the underlying system. On the other hand, time-frequency and time-scale representations promise applicability to general nonstationarities and provide useful visual cues for preprocessing. When it comes to nonstationary signal analysis and estimation in the presence of noise, however, they assume availability of multiple independent realizations.

In fact, it is impossible to perform spectral analysis, detection, and estimation tasks on signals involving generally unknown nonstationarities when only a single data record is available. For instance, consider extracting a deterministic signal s(n) observed in stationary noise v(n), using regression techniques based on the nonstationary data x(n) = s(n) + v(n), n = 0, 1, ..., N−1. Unless s(n) is finitely parameterized by a d_θs × 1 vector θ_s (with d_θs < N), the problem is ill-posed, because adding a new datum, say x(n_0), adds a new unknown, s(n_0), to be determined. Thus, only structured nonstationarities can be handled when rapid variations are present; and only for classes of finitely parameterized nonstationary processes can reliable statistical descriptors be computed using a single time series. One such class is that of (wide-sense) cyclostationary processes, which are characterized by the periodicity they exhibit in their mean, correlation, or spectral descriptors. An overview of cyclostationary signal analysis and its applications is the main goal of this section.
Periodicity is omnipresent in physical as well as man-made processes, and cyclostationary signals occur in various real-life problems entailing phenomena and operations of repetitive nature: communications [15], geophysical and atmospheric sciences (hydrology [66], oceanography [14], meteorology [35], and climatology [4]), rotating machinery [43], econometrics [50], and biological systems [48].

In 1961 Gladysev [34] introduced key representations of cyclostationary time series, while in 1969 Hurd's thesis [38] offered an excellent introduction to continuous-time cyclostationary processes. Since 1975 [22], Gardner and co-workers have contributed to the theory of continuous-time cyclostationary signals, and especially their applications to communications engineering. Gardner [15] adopts a "non-probabilistic" viewpoint of cyclostationarity (see [19] for an overview and also [36] and [18] for comments on this approach). Responding to a recent interest in digital periodically varying systems and cyclostationary time series, the exposition here is probabilistic and focuses on discrete-time signals and systems, with emphasis on their second-order statistical characterization and their applications to signal processing and communications.

The material in the remaining sections is organized as follows: Section 17.2 provides definitions, properties, and representations of cyclostationary processes, along with their relations with stationary and general classes of nonstationary processes. Testing a time series for cyclostationarity and retrieval of possibly hidden cycles, along with single-record estimation of cyclic statistics, are the subjects of Section 17.3. Typical signal classes and operations inducing cyclostationarity are delineated in Section 17.4 to motivate the key uses and selected applications described in Section 17.5. Finally, Section 17.6 concludes and presents trade-offs, topics not covered, and future directions.

17.2 Definitions, Properties, Representations

Let x(n) be a discrete-index random process (i.e., a time series) with mean μ_x(n) := E{x(n)} and covariance c_xx(n; τ) := E{[x(n) − μ_x(n)][x(n+τ) − μ_x(n+τ)]}. For x(n) complex valued, let also c̄_xx(n; τ) := c_xx*(n; τ), where * denotes complex conjugation, and n, τ are in the set of integers Z.

DEFINITION 17.1 Process x(n) is (wide-sense) cyclostationary (CS) iff there exists an integer P such that μ_x(n) = μ_x(n + lP), c_xx(n; τ) = c_xx(n + lP; τ), or c̄_xx(n; τ) = c̄_xx(n + lP; τ), for all n, l ∈ Z. The smallest of all such P's is called the period.

Being periodic, these quantities all accept Fourier Series expansions over complex harmonic cycles, with the set of cycles defined as A_cxx := {α_k = 2πk/P, k = 0, ..., P−1}; e.g., c_xx(n; τ) and its Fourier coefficients, called cyclic correlations, are related by

$$c_{xx}(n;\tau) \;=\; \sum_{k=0}^{P-1} C_{xx}\!\Big(\tfrac{2\pi}{P}k;\,\tau\Big)\, e^{\,j\frac{2\pi}{P}kn} \;\;\overset{\mathrm{FS}}{\longleftrightarrow}\;\; C_{xx}\!\Big(\tfrac{2\pi}{P}k;\,\tau\Big) \;=\; \frac{1}{P}\sum_{n=0}^{P-1} c_{xx}(n;\tau)\, e^{-j\frac{2\pi}{P}kn}. \qquad (17.1)$$

Strict-sense cyclostationarity, or periodic (non-)stationarity, can also be defined in terms of probability distributions or density functions when these functions vary periodically (in n). But the focus in engineering is on periodically and almost periodically correlated¹ time series, since real data are often zero-mean, correlated, and with unknown distributions.

¹ The term cyclostationarity is due to Bennett [3]. Cyclostationary processes in economics and atmospheric sciences are also referred to as seasonal time series [50].
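Equation (17.1) suggests a simple single-record recipe when the period P is known: estimate c_xx(n; τ) by averaging the lag product over all samples that share the same phase n mod P, and then take a length-P DFT over the phase index. The sketch below is our own illustration (the modulated-noise test signal and all names are assumptions, not code from the chapter); it uses the lag product as a stand-in for the ensemble correlation, anticipating the single-record estimators of Section 17.3.

```python
import numpy as np

def cyclic_correlation(x, P, tau):
    """Estimate C_xx(2*pi*k/P; tau) for k = 0..P-1 from a single record x:
    average the lag product x(n)x(n+|tau|) over each phase n mod P, then take
    a length-P DFT over the phase index (the discrete pair in (17.1))."""
    x = np.asarray(x, dtype=float)
    tau = abs(tau)
    prod = x[:len(x) - tau] * x[tau:]                       # lag product x(n)x(n+tau)
    c_n = np.array([prod[i::P].mean() for i in range(P)])   # estimate of c_xx(i; tau), i = n mod P
    return np.fft.fft(c_n) / P                              # C_xx(2*pi*k/P; tau), k = 0..P-1

# Toy CS signal: white noise whose variance is modulated with period P = 4.
rng = np.random.default_rng(0)
P, N = 4, 40_000
gain = np.sqrt(1.0 + 0.8 * np.cos(2 * np.pi * np.arange(N) / P))
x = gain * rng.standard_normal(N)
print(np.round(np.abs(cyclic_correlation(x, P, tau=0)), 3))
# Expected (up to estimation error): 1.0 at k = 0, 0.4 at k = 1 and k = 3, ~0 at k = 2,
# i.e., the cycles alpha = 0 and alpha = +-2*pi/4 present in the time-varying variance.
```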
Almost periodicity is very common in discrete time because sampling a continuous-time periodic process will rarely yield a discrete-time periodic signal; e.g., sampling cos(ω_c t + θ) every T_s seconds results in cos(ω_c n T_s + θ), for which an integer period exists only if ω_c T_s = 2π/P for some integer P. When this fails, 2π/(ω_c T_s) is only "almost" an integer period, and such signals accept generalized (or limiting) Fourier expansions (see also Eq. (17.2) and [9] for rigorous definitions of almost periodic functions).

DEFINITION 17.2 Process x(n) is (wide-sense) almost cyclostationary (ACS) iff its mean and correlation(s) are almost periodic sequences. For x(n) zero-mean and real, the time-varying and cyclic correlations are defined as the generalized Fourier Series pair

$$c_{xx}(n;\tau) \;=\; \sum_{\alpha_k\in A_{c_{xx}}} C_{xx}(\alpha_k;\tau)\, e^{\,j\alpha_k n} \;\;\overset{\mathrm{FS}}{\longleftrightarrow}\;\; C_{xx}(\alpha_k;\tau) \;=\; \lim_{N\to\infty}\frac{1}{N}\sum_{n=0}^{N-1} c_{xx}(n;\tau)\, e^{-j\alpha_k n}. \qquad (17.2)$$

The set of cycles, A_cxx(τ) := {α_k : C_xx(α_k; τ) ≠ 0, −π < α_k ≤ π}, must be countable, and the limit is assumed to exist at least in the mean-square sense [9, Thm. 1.15]. Definition 17.2 and Eq. (17.2) for ACS processes subsume the CS Definition 17.1 and Eq. (17.1); note that the latter require an integer period and a finite set of cycles. In the α-domain, ACS signals exhibit lines, but not necessarily at harmonically related cycles. The following example illustrates the cyclic quantities defined thus far.

EXAMPLE 17.1: Harmonic in multiplicative and additive noise

Let

$$x(n) = s(n)\cos(\omega_0 n) + v(n), \qquad (17.3)$$

where s(n), v(n) are assumed real, stationary, and mutually independent. Such signals appear when communicating through flat-fading channels, and with weather radar or sonar returns when, in addition to sensor noise v(n), backscattering, target scintillation, or fluctuating propagation media give rise to random amplitude variations modeled by s(n) [33]. We will consider two cases:

Case 1: μ_s ≠ 0. The mean in (17.3) is μ_x(n) = μ_s cos(ω_0 n) + μ_v, and the cyclic mean is

$$C_x(\alpha) := \lim_{N\to\infty}\frac{1}{N}\sum_{n=0}^{N-1}\mu_x(n)\, e^{-j\alpha n} = \frac{\mu_s}{2}\big[\delta(\alpha-\omega_0)+\delta(\alpha+\omega_0)\big] + \mu_v\,\delta(\alpha), \qquad (17.4)$$

where in (17.4) we used the definition of Kronecker's delta

$$\lim_{N\to\infty}\frac{1}{N}\sum_{n=0}^{N-1} e^{\,j\alpha n} = \delta(\alpha) := \begin{cases} 1, & \alpha = 0,\\ 0, & \text{else}. \end{cases} \qquad (17.5)$$

Signal x(n) in (17.3) is thus (first-order) cyclostationary with set of cycles A_cx = {±ω_0, 0}. If X_N(ω) := Σ_{n=0}^{N−1} x(n) exp(−jωn), then from (17.4) we find C_x(α) = lim_{N→∞} N^{−1} E{X_N(α)}; thus, the cyclic mean can be interpreted as an averaged DFT, and ω_0 can be retrieved by picking the peak of |X_N(ω)| for ω ≠ 0.

Case 2: μ_s = 0. From (17.3) we find the correlation c_xx(n; τ) = c_ss(τ)[cos(2ω_0 n + ω_0 τ) + cos(ω_0 τ)]/2 + c_vv(τ). Because c_xx(n; τ) is periodic in n, x(n) is (second-order) CS with cyclic correlation [cf. (17.2) and (17.5)]

$$C_{xx}(\alpha;\tau) = \frac{c_{ss}(\tau)}{4}\Big[\delta(\alpha+2\omega_0)\, e^{\,j\omega_0\tau} + \delta(\alpha-2\omega_0)\, e^{-j\omega_0\tau}\Big] + \Big[\frac{c_{ss}(\tau)}{2}\cos(\omega_0\tau) + c_{vv}(\tau)\Big]\delta(\alpha). \qquad (17.6)$$

The set of cycles is A_cxx(τ) = {±2ω_0, 0}, provided that c_ss(τ) ≠ 0 and c_vv(τ) ≠ 0. The set A_cxx(τ) is lag-dependent in the sense that some cycles may disappear while others may appear for different τ's. To illustrate the τ-dependence, let s(n) be an MA process of order q. Clearly, c_ss(τ) = 0 for |τ| > q, and thus A_cxx(τ) = {0} for |τ| > q.
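The lag dependence just described is easy to see numerically. Below is a minimal sketch (our own construction, not from the chapter): s(n) is chosen as an MA(2) process, and the single-record average (1/N) Σ_n x(n)x(n+τ) e^{−jαn}, the natural finite-data version of (17.2), is evaluated at the cycle α = 2ω_0 for several lags.

```python
import numpy as np

# Example 17.1, Case 2, with s(n) an MA(q) process: the cycle alpha = 2*w0
# belongs to A_cxx(tau) only for |tau| <= q, since c_ss(tau) = 0 beyond lag q.
rng = np.random.default_rng(7)
N, w0, q = 200_000, 2 * np.pi * 0.09, 2
e = rng.standard_normal(N + q)
s = e[q:] + 0.8 * e[q - 1:-1] + 0.5 * e[q - 2:-2]      # MA(2): c_ss(tau) = 0 for |tau| > 2
x = s * np.cos(w0 * np.arange(N)) + 0.3 * rng.standard_normal(N)

def C_hat(x, alpha, tau):
    """Single-record average (1/N) sum_n x(n) x(n+tau) e^{-j*alpha*n}."""
    M = len(x) - tau
    return np.mean(x[:M] * x[tau:tau + M] * np.exp(-1j * alpha * np.arange(M)))

for tau in range(6):
    print(tau, round(abs(C_hat(x, 2 * w0, tau)), 4))
# |C_xx(2*w0; tau)| is clearly nonzero for tau = 0, 1, 2 (about c_ss(tau)/4, cf. (17.6))
# and falls to the O(1/sqrt(N)) estimation floor for tau = 3, 4, 5.
```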
The CS process in (17.3) is just one example of signals involving products and sums of stationary processes, such as s(n), with (almost) periodic deterministic sequences d(n) or CS processes x(n). For such signals, the following properties are useful:

Property 1 Finite sums and products of ACS signals are ACS. If x_i(n) is CS with period P_i, then for constants λ_i, y_1(n) := Σ_{i=1}^{I_1} λ_i x_i(n) and y_2(n) := Π_{i=1}^{I_2} λ_i x_i(n) are also CS. Unless cycle cancellations occur among the x_i(n) components, the period of y_1(n) and y_2(n) equals the least common multiple of the P_i's. Similarly, finite sums and products of stationary processes with deterministic (almost) periodic signals are also ACS processes.

As examples of random-deterministic mixtures, consider

$$x_1(n) = s(n) + d(n) \quad\text{and}\quad x_2(n) = s(n)\,d(n), \qquad (17.7)$$

where s(n) is zero-mean, stationary, and d(n) is deterministic (almost) periodic with Fourier Series coefficients D(α). The time-varying correlations are, respectively,

$$c_{x_1x_1}(n;\tau) = c_{ss}(\tau) + d(n)\,d(n+\tau) \quad\text{and}\quad c_{x_2x_2}(n;\tau) = c_{ss}(\tau)\,d(n)\,d(n+\tau). \qquad (17.8)$$

Both are (almost) periodic in n, with cyclic correlations

$$C_{x_1x_1}(\alpha;\tau) = c_{ss}(\tau)\,\delta(\alpha) + D_2(\alpha;\tau) \quad\text{and}\quad C_{x_2x_2}(\alpha;\tau) = c_{ss}(\tau)\,D_2(\alpha;\tau), \qquad (17.9)$$

where D_2(α; τ) = Σ_β D(β) D(α−β) exp[j(α−β)τ], since the Fourier Series coefficients of the product d(n)d(n+τ) are given by the convolution of each component's coefficients in the α-domain. To reiterate the dependence on τ, notice that if d(n) is a periodic ±1 sequence, then c_{x_2x_2}(n; 0) = c_ss(0) d²(n) = c_ss(0), and hence the periodicity disappears at τ = 0.

ACS signals appear often in nature with the underlying periodicity hidden, unknown, or inaccessible. In contrast, CS signals are often man-made and arise as a result of, e.g., oversampling (by a known integer factor P) digital communication signals, or sampling a spatial waveform with P antennas (see also Section 17.4).

Both CS and ACS definitions could also be given in terms of the Fourier Transforms (τ → ω) of c_xx(n; τ) and C_xx(α; τ), namely the time-varying and the cyclic spectra, which we denote by S_xx(n; ω) and S_xx(α; ω). Suppose c_xx(n; τ) and C_xx(α; τ) are absolutely summable w.r.t. τ for all n in Z and α_k in A_cxx(τ). We can then define and relate time-varying and cyclic spectra as follows:

$$S_{xx}(n;\omega) := \sum_{\tau=-\infty}^{\infty} c_{xx}(n;\tau)\, e^{-j\omega\tau} \;=\; \sum_{\alpha_k\in A_{S_{xx}}} S_{xx}(\alpha_k;\omega)\, e^{\,j\alpha_k n}, \qquad (17.10)$$

$$S_{xx}(\alpha_k;\omega) := \sum_{\tau=-\infty}^{\infty} C_{xx}(\alpha_k;\tau)\, e^{-j\omega\tau} \;=\; \lim_{N\to\infty}\frac{1}{N}\sum_{n=0}^{N-1} S_{xx}(n;\omega)\, e^{-j\alpha_k n}. \qquad (17.11)$$

Absolute summability w.r.t. τ implies vanishing memory as the lag separation increases, and many real-life signals satisfy these so-called mixing conditions [5, Ch. 2]. Power signals are not absolutely summable, but it is possible to define cyclic spectra equivalently [for real-valued x(n)] as

$$S_{xx}(\alpha_k;\omega) := \lim_{N\to\infty}\frac{1}{N}\, E\{X_N(\omega)\, X_N(\alpha_k-\omega)\}, \qquad X_N(\omega) := \sum_{n=0}^{N-1} x(n)\, e^{-j\omega n}. \qquad (17.12)$$

If x(n) is complex ACS, then one also needs S̄_xx(α_k; ω) := lim_{N→∞} N^{−1} E{X*_N(−ω) X_N(α_k − ω)}. Both S_xx and S̄_xx reveal the presence of spectral correlation. This must be contrasted with stationary processes, whose spectral components X_N(ω_1), X_N(ω_2) are known to be asymptotically uncorrelated unless |ω_1 ± ω_2| = 0 (mod 2π) [5, Ch. 4]. Specifically, we have from (17.12) that:

Property 2 If x(n) is ACS or CS, the N-point Fourier transform X_N(ω_1) is correlated with X_N(ω_2) for |ω_1 ± ω_2| = α_k (mod 2π), with α_k ∈ A_Sxx.
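Property 2 and (17.12) can be checked by brute force with a small Monte Carlo experiment. The sketch below (our own; the signal, probe frequencies, and sample sizes are arbitrary choices) averages X_N(ω_1)X_N(ω_2) over independent realizations of the Example 17.1 signal: the average is of order S_xx(2ω_0; ω_1) when ω_1 + ω_2 equals the cycle 2ω_0, and close to zero otherwise.

```python
import numpy as np

# Spectral correlation of the CS signal x(n) = s(n)cos(w0*n) + v(n) (Case 2):
# E{X_N(w1) X_N(w2)} / N is nonzero only when w1 + w2 hits a cycle (0 or +-2*w0).
rng = np.random.default_rng(2)
N, w0, R = 1024, 2 * np.pi * 0.1, 400          # record length, frequency, realizations
n = np.arange(N)
w1 = 2 * np.pi * 0.27                           # arbitrary probe frequency

def XN(x, w):                                   # finite Fourier transform X_N(w)
    return np.sum(x * np.exp(-1j * w * n))

acc_cycle, acc_off = 0.0 + 0.0j, 0.0 + 0.0j
for _ in range(R):
    x = rng.standard_normal(N) * np.cos(w0 * n) + 0.3 * rng.standard_normal(N)
    acc_cycle += XN(x, w1) * XN(x, 2 * w0 - w1)         # w1 + w2 = 2*w0, a cycle
    acc_off   += XN(x, w1) * XN(x, 2 * w0 - w1 + 0.3)   # w1 + w2 misses every cycle
print(abs(acc_cycle) / (R * N), abs(acc_off) / (R * N))
# For this white, unit-variance s(n), the first value approaches S_xx(2*w0; w1) = 1/4;
# the second is far smaller and keeps shrinking as R (and N) grow.
```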
Before dwelling further on the spectral characterization of ACS processes, it is useful to note the diversity of tools available for processing. Stationary signals are analyzed with time-invariant correlations (lag-domain analysis) or with power spectral densities (frequency-domain analysis). However, CS, ACS, and generally nonstationary signals entail four variables: (n, τ, α, ω) := (time, lag, cycle, frequency). Grouping two variables at a time, four domains of analysis become available, and their relationship is summarized in Fig. 17.1. Note that the pairs (n; τ) ↔ (α; τ) and (n; ω) ↔ (α; ω) have τ or ω fixed and are Fourier Series pairs, whereas (n; τ) ↔ (n; ω) and (α; τ) ↔ (α; ω) have n or α fixed and are related by Fourier Transforms.

FIGURE 17.1: Four domains for analyzing cyclostationary signals.

Further insight on the links between stationary and cyclostationary processes is gained through the uniform shift (or phase) randomization concept. Let x(n) be CS with period P, and define y(n) := x(n + θ), where θ is uniformly distributed in [0, P) and independent of x(n). With c_yy(n; τ) := E_θ{E_x[x(n+θ) x(n+τ+θ)]}, we find

$$c_{yy}(n;\tau) = \frac{1}{P}\sum_{p=0}^{P-1} c_{xx}(p;\tau) := C_{xx}(0;\tau) := c_{yy}(\tau), \qquad (17.13)$$

where the first equality follows because θ is uniform and the second uses the CS definition in (17.1). Noting that c_yy is not a function of n, we have established (see also [15, 38]):

Property 3 A CS process x(n) can be mapped to a stationary process y(n) using a shift θ, uniformly distributed over its period, and the transformation y(n) := x(n + θ).

Such a mapping is often used with harmonic signals; e.g., x(n) = A exp[j(2πn/P + θ)] + v(n) is, according to Property 2, a CS signal, but it can be stationarized by uniform phase randomization. An alternative trick for stationarizing signals which involve complex harmonics is conjugation. Indeed, c̄_xx(n; τ) = A² exp(−j2πτ/P) + c_vv(τ) is not a function of n. But why deal with CS or ACS processes at all if conjugation or phase randomization can render them stationary?

Revisiting Case 2 of Example 17.1 offers a partial answer when the goal is to estimate the frequency ω_0. Phase randomization of x(n) in (17.3) leads to a stationary y(n) with correlation found by substituting α = 0 in (17.6). This leads to c_yy(τ) = (1/2) c_ss(τ) cos(ω_0 τ) + c_vv(τ), and shows that if s(n) has multiple spectral peaks, or if s(n) is broadband, then multiple peaks or smearing of the spectral peak hamper estimation of ω_0 (in fact, it is impossible to estimate ω_0 from the spectrum of y(n) if s(n) is white). In contrast, picking the peak of C_xx(α; τ) in (17.6) yields ω_0, provided that ω_0 ∈ (0, π) so that spectral folding is prevented [33]. Equation (17.13) provides a more general answer. Phase randomization restricts a CS process to only one cycle, namely α = 0. In other words, the cyclic correlation C_xx(α; τ) contains the "stationarized correlation" C_xx(0; τ) plus additional information in the cycles α ≠ 0.

Since CS and ACS processes form a superset of stationary ones, it is useful to know how a stationary process can be viewed as a CS process. Note that if x(n) is stationary, then c_xx(n; τ) = c_xx(τ), and on using (17.2) and (17.5) we find

$$C_{xx}(\alpha;\tau) = c_{xx}(\tau)\left[\lim_{N\to\infty}\frac{1}{N}\sum_{n=0}^{N-1} e^{-j\alpha n}\right] = c_{xx}(\tau)\,\delta(\alpha). \qquad (17.14)$$

Intuitively, (17.14) is justified if we think of stationarity as reflecting "zero time-variation" in the correlation c_xx(τ).
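The trade-off in the last two paragraphs can be seen directly on simulated data. The following sketch (our own; parameter values are arbitrary) takes Case 2 of Example 17.1 with a white s(n), the worst case for the stationarized process, and scans the lag-0 cyclic correlation over cycles α via an FFT of x²(n); this evaluates the single-record average (1/N) Σ_n x²(n) e^{−jαn} on the grid α_k = 2πk/N.

```python
import numpy as np

# Case 2 of Example 17.1 with white s(n): the stationarized y(n) = x(n + theta)
# has a correlation with no trace of w0, yet the lag-0 cyclic correlation of the
# original record still peaks at alpha = +-2*w0.
rng = np.random.default_rng(3)
N, w0 = 8192, 2 * np.pi * 0.13
n = np.arange(N)
x = rng.standard_normal(N) * np.cos(w0 * n) + 0.5 * rng.standard_normal(N)

mag = np.abs(np.fft.fft(x * x)) / N              # |(1/N) sum_n x^2(n) e^{-j*alpha_k*n}|
alphas = 2 * np.pi * np.arange(N) / N
mask = (alphas > 0.05) & (alphas < np.pi)        # drop alpha = 0 (stationary part) and alpha > pi
print(2 * w0, alphas[mask][np.argmax(mag[mask])])   # the surviving peak sits near 2*w0
# After phase randomization, c_yy(tau) = 0.5*c_ss(tau)*cos(w0*tau) + c_vv(tau); with
# white s(n) and v(n) this is a Kronecker delta, so y(n) retains no information on w0.
```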
Formally, (17.14) implies:

Property 4 Stationary processes can be viewed as ACS or CS with cyclic correlation C_xx(α; τ) = c_xx(τ) δ(α).

Separation of information-bearing ACS signals from stationary ones (e.g., noise) is desired in many applications and can be achieved, based on Property 4, by excluding the cycle α = 0.

Next, it is of interest to view CS signals as special cases of general nonstationary processes with 2-D correlation r_xx(n_1, n_2) := E{x(n_1) x(n_2)} and 2-D spectral density S_xx(ω_1, ω_2) := FT[r_xx(n_1, n_2)], which are assumed to exist.² Two questions arise: What are the implications of periodicity in the (ω_1, ω_2) plane, and how do the cyclic spectra in (17.10) through (17.12) relate to S_xx(ω_1, ω_2)? The answers are summarized in Fig. 17.2, which illustrates that the support of CS processes in the (ω_1, ω_2) plane consists of 2P − 1 parallel lines (with unity slope) intersecting the axes at equidistant points 2π/P apart from each other. More specifically, we have [34]:

² Nonstationary processes with Fourier-transformable 2-D correlations are called harmonizable processes.

FIGURE 17.2: Support of the 2-D spectrum S_xx(ω_1, ω_2) for CS processes.

Property 5 A CS process with period P is a special case of a nonstationary (harmonizable) process with 2-D spectral density given by

$$S_{xx}(\omega_1,\omega_2) = \sum_{k=-(P-1)}^{P-1} S_{xx}\!\Big(\tfrac{2\pi}{P}k;\,\omega_1\Big)\,\delta_D\!\Big(\omega_2-\omega_1+\tfrac{2\pi}{P}k\Big), \qquad (17.15)$$

where δ_D denotes the Dirac delta.

For stationary processes, only the k = 0 term survives in (17.15) and we obtain S_xx(ω_1, ω_2) = S_xx(0; ω_1) δ_D(ω_2 − ω_1); i.e., the spectral mass is concentrated on the diagonal of Fig. 17.2. The well-structured spectral support of CS processes will be used to test for the presence of cyclostationarity and to estimate the period P. Furthermore, the superposition of lines parallel to the diagonal hints towards representing CS processes as a superposition of stationary processes. Next we will examine two such representations, introduced by Gladysev [34] (see also [22, 38, 49], and [56]).

We can uniquely write n_0 = nP + i and express x(n_0) = x(nP + i), where the remainder i takes values 0, 1, ..., P−1. For each i, define the subprocess x_i(n) := x(nP + i). In multirate processing, the P × 1 vector x(n) := [x_0(n) ... x_{P−1}(n)]' constitutes the so-called polyphase decomposition of x(n) [51, Ch. 12]. As shown in Fig. 17.3, each x_i(n) is formed by downsampling an advanced copy of x(n). On the other hand, combining upsampled and delayed x_i(n)'s, we can synthesize the CS process as

$$x(n) = \sum_{i=0}^{P-1}\sum_{l} x_i(l)\,\delta(n-i-lP). \qquad (17.16)$$

FIGURE 17.3: Representation 1: (a) analysis, (b) synthesis.

We maintain that the subprocesses {x_i(n)}_{i=0}^{P−1} are (jointly) stationary, and thus x(n) is vector stationary. Suppose for simplicity that E{x(n)} = 0, and start with E{x_{i_1}(n) x_{i_2}(n+τ)} = E{x(nP + i_1) x(nP + τP + i_2)} = c_xx(i_1 + nP; i_2 − i_1 + τP). Because x(n) is CS, we can drop nP, and c_xx becomes independent of n, establishing that x_{i_1}(n), x_{i_2}(n) are (jointly) stationary with correlation

$$c_{x_{i_1}x_{i_2}}(\tau) = c_{xx}(i_1;\; i_2 - i_1 + \tau P), \qquad i_1, i_2 \in [0, P-1]. \qquad (17.17)$$
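Representation 1 is straightforward to exercise in code. The sketch below (our own; all names and the test signal are assumptions) splits a CS record into its P decimated components x_i(n) = x(nP + i), resynthesizes x(n) exactly as in (17.16), and shows that each component has a time-invariant variance even though the variance of x(n) itself oscillates with period P.

```python
import numpy as np

def polyphase_analysis(x, P):
    """Decimated components x_i(n) = x(nP + i), i = 0..P-1 (Representation 1)."""
    L = (len(x) // P) * P                  # trim to an integer number of periods
    return [x[i:L:P] for i in range(P)]

def polyphase_synthesis(components, P):
    """Rebuild x(n) by upsampling, delaying, and summing, as in (17.16)."""
    x = np.zeros(P * len(components[0]))
    for i, xi in enumerate(components):
        x[i::P] = xi                       # place x_i(l) at n = i + lP
    return x

rng = np.random.default_rng(4)
P = 3
x = (1 + 0.9 * np.cos(2 * np.pi * np.arange(3000) / P)) * rng.standard_normal(3000)
xi = polyphase_analysis(x, P)
print(np.allclose(polyphase_synthesis(xi, P), x))    # True: perfect reconstruction
print([round(float(np.var(c)), 3) for c in xi])      # per-phase variances, set by the phase i,
                                                     # each constant over time
```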
Using (17.17), it can be shown that the auto- and cross-spectra of x_{i_1}(n), x_{i_2}(n) can be expressed in terms of the cyclic spectra of x(n) as [56]

$$S_{x_{i_1}x_{i_2}}(\omega) = \frac{1}{P}\sum_{k_1=0}^{P-1}\sum_{k_2=0}^{P-1} S_{xx}\!\Big(\tfrac{2\pi}{P}k_1;\,\tfrac{\omega-2\pi k_2}{P}\Big)\, e^{\,j\big[\big(\frac{\omega-2\pi k_2}{P}\big)(i_2-i_1) + \frac{2\pi}{P}k_1 i_1\big]}. \qquad (17.18)$$

To invert (17.18), we Fourier transform (17.16) and use (17.12) to obtain [for x(n) real]

$$S_{xx}\!\Big(\tfrac{2\pi}{P}k;\,\omega\Big) = \sum_{i_1=0}^{P-1}\sum_{i_2=0}^{P-1} S_{x_{i_1}x_{i_2}}(\omega)\, e^{\,j\omega(i_2-i_1)}\, e^{-j\frac{2\pi}{P}k i_2}. \qquad (17.19)$$

Based on (17.16) through (17.19), we infer that cyclostationary signals with period P can be analyzed as stationary P × 1 multichannel processes and vice versa. In summary, we have:

Representation 1 (Decimated Components) A CS process x(n) can be represented as a P-variate stationary multichannel process x(n) with components x_i(n) = x(nP + i), i = 0, 1, ..., P − 1. Cyclic spectra and stationary auto- and cross-spectra are related as in (17.18) and (17.19).

An alternative means of decomposing a CS process into stationary components is to split the (−π, π] spectral support of X_N(ω) into bands, each of width 2π/P [22]. As shown in Fig. 17.4, this can be accomplished by passing modulated copies of x(n) through an ideal low-pass filter H_0(ω) with spectral support (−π/P, π/P]. The resulting subprocesses x̄_m(n) can be shifted up in frequency and recombined to synthesize the CS process as x(n) = Σ_{m=0}^{P−1} x̄_m(n) exp(−j2πmn/P). Within each band, frequencies are separated by less than 2π/P and, according to Property 2, there is no correlation between the spectral components X̄_{m,N}(ω_1) and X̄_{m,N}(ω_2); hence, the x̄_m(n) components are stationary, with auto- and cross-spectra having nonzero support over −π/P < ω < π/P. They are related to the cyclic spectra as follows:

$$S_{\bar x_{m_1}\bar x_{m_2}}(\omega) = S_{xx}\!\Big(\tfrac{2\pi}{P}(m_1-m_2);\;\omega+\tfrac{2\pi}{P}m_1\Big), \qquad |\omega| < \frac{\pi}{P}. \qquad (17.20)$$

Equation (17.20) suggests that cyclostationary signal analysis is linked with stationary subband processing.

FIGURE 17.4: Representation 2: (a) analysis, (b) synthesis.

Representation 2 (Subband Components) A CS process x(n) can be represented as a superposition of P stationary narrowband subprocesses according to x(n) = Σ_{m=0}^{P−1} x̄_m(n) exp(−j2πmn/P). Auto- and cross-spectra of x̄_m(n) can be found from the cyclic spectra of x(n) as in (17.20).

Because ideal low-pass filters cannot be designed, the subband decomposition seems less practical. However, using Representation 1 and exploiting results from uniform DFT filter banks, it is possible, using FIR low-pass filters, to obtain stationary subband components (see, e.g., [51, Ch. 12]). We will not pursue this approach further, but Representation 1 will be used next for estimating time-varying correlations of CS processes based on a single data record.

17.3 Estimation, Time-Frequency Links, Testing

The time-varying and cyclic quantities introduced in (17.1), (17.2), and (17.10) through (17.12) entail ideal expectations (i.e., ensemble averages), and unless reliable estimators can be devised from finite (and often noisy) data records, their usefulness in practice is questionable. For stationary processes with (at least asymptotically) vanishing memory,³ sample correlations and spectral density estimators converge to their ensemble counterparts as the record length N → ∞. Constructing reliable (i.e., consistent) estimators for nonstationary processes, however, is challenging and generally impossible. Indeed, capturing time-variations calls for short observation windows, whereas variance reduction demands long records for sample averages to converge to their ensemble counterparts. Fortunately, ACS and CS signals belong to the class of processes with "well-structured" time-variations that, under suitable mixing conditions, allow consistent single-record estimators.
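As a quick check of this consistency claim, the sketch below (our own construction) estimates the time-varying variance c_xx(n; 0) of a simple CS process by averaging over the decimated components of Representation 1 (one sample average per phase n mod P) and shows the error shrinking as the record length grows, roughly at the 1/√N rate expected of sample averages under mixing conditions.

```python
import numpy as np

# Single-record estimation of c_xx(n; 0) for a CS process with period P = 4:
# the phase-wise sample variances converge to the true periodic variance.
rng = np.random.default_rng(5)
P = 4
true_var = 1 + 0.8 * np.cos(2 * np.pi * np.arange(P) / P)     # c_xx(n; 0), n = 0..P-1

for N in (1_000, 10_000, 100_000, 1_000_000):
    x = np.tile(np.sqrt(true_var), N // P) * rng.standard_normal(N)
    c_hat = np.array([np.mean(x[i::P] ** 2) for i in range(P)])   # phase-wise averages
    print(N, float(np.max(np.abs(c_hat - true_var))))             # error ~ 1/sqrt(N)
```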
The key is to note that although c_xx(n; τ) and S_xx(n; ω) are time-varying, they are expressed in terms of cyclic quantities, C_xx(α_k; τ) and S_xx(α_k; ω), which are time-invariant. Indeed, in (17.2) and (17.10) the time-variation is assigned to the Fourier basis.

³ Well-separated samples of such processes are asymptotically independent. Sufficient (so-called mixing) conditions include absolute summability of cumulants and are satisfied by many real-life signals (see [5, 12, Ch. 2]).

[...] statistics, as discussed in Section 17.4.3.

FIGURE 17.11: Spectral densities and cyclic correlation signals in Example 17.4.

FIGURE 17.12: Cyclic spectrum of x(n) in Example 17.4.

EXAMPLE 17.5: Diversity for channel estimation

Suppose we sample the output of the receiver's filter every T_0/2 seconds, to obtain x(n) samples obeying (17.35) with P = 2 (see also Fig. 17.8). In the absence of noise, [...] (17.36), (17.37). Although similar, the order of the FIR channel h in (17.35) is, due to oversampling, P times larger than that of (17.31). Cyclic spectra in (17.32) and (17.37) carry phase information about the underlying H, which is not the case with spectra of stationary processes (P = 1). Interestingly, (17.35) can be used also to model spread spectrum [...]

[...] (we oversample, or fractionally sample, (17.30) by a factor P). With x(n) := r_c(nT_0/P), we obtain (see also Fig. 17.8)

$$x(n) = \sum_{l} w(l)\, h(n-lP) + v(n), \qquad (17.35)$$

where now h(n) := h_c(nT_0/P − ) and v(n) := v_c(nT_0/P). Figure 17.8 shows the continuous-time model and the multirate discrete-time equivalent of (17.35). With P = 1, (17.35) reduces [...]

FIGURE 17.8: (a) Fractionally sampled communications model and (b) multirate equivalent.

[...] random signals, IEEE Trans. on Signal Processing, 131–146, 1993.
[58] Schell, S.V., An overview of sensor array processing for cyclostationary signals, in Cyclostationarity in Communications and Signal Processing, Gardner, W.A., Ed., IEEE Press, New York, 1994, 168–239.
[59] Spooner, C.M. and Gardner, W.A., The cumulant theory of cyclostationary time-series: development and applications, IEEE Trans. on Signal [...]

[...] E{x(n)w*(n + τ)}, which is given by [...] (17.41). If the n-dependence is dropped from (17.40) and (17.41), one recovers the well-known auto- and cross-correlation expressions of stationary processes passing through linear TI systems. Relying on definitions (17.2), (17.11), and (17.37), the auto- and cross-cyclic correlations and cyclic spectra can be found as [...] (17.42),

$$\bar C_{xw}(\alpha;\tau) = \sum_{\beta}\sum_{l} H(\beta;l)\, e^{-j(\alpha-\beta)l}\, \bar C_{ww}(\alpha-\beta;\, l+\tau), \qquad (17.43)$$

$$\bar S_{xx}(\alpha;\omega) = \sum_{\beta_1,\beta_2} H(\beta_1;\,\alpha+\beta_2-\beta_1-\omega)\, H^{*}(\beta_2;-\omega)\, \bar S_{ww}(\alpha-\beta_1+\beta_2;\,\omega), \qquad (17.44)$$

$$\bar S_{xw}(\alpha;\omega) = \sum_{\beta} H(\beta;\,\alpha-\beta-\omega)\, \bar S_{ww}(\alpha-\beta;\,\omega). \qquad (17.45)$$

Simpler expressions are obtained as special cases of (17.42) through (17.45) when w(n) is stationary; e.g., cyclic auto- and cross-spectra reduce to [...]

[...] similar to those developed for stationary signals and time-invariant systems. CS signal analysis exploits two extra features not available with scalar stationary signal processing, namely: (1) the ability to separate signals on the basis of their cycles, and (2) the diversity offered by means of cycles. Of course, the cycles must be known or estimated, as we discussed in Section 17.3. Suppose x(n) = s(n) + v(n), where [...]

[...] variance σ_v² = 0.1.
Figure 17.5a shows |C_xx(α; 0)| peaking at α = ±2(π/8), ±2(π/4), and 0, as expected, while Fig. 17.5b depicts ρ_xx(ω_1, ω_2) computed as in (17.29) with M = 64. The parallel lines in Fig. 17.5b are seen at |ω_1 − ω_2| = 0, π/8, π/4, revealing the periods present. One can easily verify from (17.11) that C_xx(α; 0) = (2π)^{−1} ∫_{−π}^{π} S_xx(α; ω) dω. It also follows from (17.15) that S_xx(α; ω) = [...]

[...] ozone data, J. Time Series Analysis, 15, 127–150, 1994.
[5] Brillinger, D.R., Time Series: Data Analysis and Theory, McGraw-Hill, New York, 1981.
[6] Castedo, L. and Figueiras-Vidal, A.R., An adaptive beamforming technique based on cyclostationary signal properties, IEEE Trans. on Signal Processing, 43, 1637–1650, 1995.
[7] Chen, C.-K. and Gardner, W.A., Signal-selective time-difference-of-arrival [...]

[...] C_xx(α; τ).

17.4 CS Signals and CS-Inducing Operations

We have already seen in Examples 17.1 and 17.2 that amplitude or index transformations of repetitive nature give rise to one class of CS signals. A second category consists of outputs of repetitive (e.g., periodically varying) systems excited by CS or even stationary inputs. Finally, it is possible to have [...]

FIGURE 17.7: Cycle [...]
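The excerpts above revolve around the fractionally sampled model (17.35), x(n) = Σ_l w(l) h(n − lP) + v(n), as a prime CS-inducing operation. The sketch below is our own illustration of that point (the channel taps, oversampling factor P, and noise level are arbitrary assumptions): oversampling by P makes the received signal cyclostationary with period P, which shows up both in its phase-dependent variance and in a nonzero cyclic correlation at the cycle 2π/P.

```python
import numpy as np

# Fractionally sampled digital communication signal, cf. (17.35):
# x(n) = sum_l w(l) h(n - lP) + v(n), with a white symbol sequence w(l).
rng = np.random.default_rng(6)
P, Nsym = 2, 50_000
h = np.array([1.0, 0.6, 0.3, -0.2])            # assumed FIR channel sampled at T0/P
w = rng.choice([-1.0, 1.0], size=Nsym)         # white, BPSK-like symbols
x = np.zeros(Nsym * P + len(h))
for l in range(Nsym):                           # upsample by P and convolve with h
    x[l * P:l * P + len(h)] += w[l] * h
x = x[:Nsym * P] + 0.1 * rng.standard_normal(Nsym * P)

print([round(float(np.var(x[i::P])), 3) for i in range(P)])   # two distinct variance levels
alpha = 2 * np.pi / P
C_hat = np.mean(x * x * np.exp(-1j * alpha * np.arange(len(x))))
print(round(float(abs(C_hat)), 3))              # clearly nonzero: cycle alpha = 2*pi/P is present
# With P = 1 (symbol-rate sampling) x(n) is stationary and no cycle other than alpha = 0 survives,
# which is the sense in which oversampling "induces" cyclostationarity.
```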
