Steven H. Isabelle et al., "Nonlinear Maps," 2000 CRC Press LLC. <http://www.engnetbase.com>.

72 Nonlinear Maps

Steven H. Isabelle, Massachusetts Institute of Technology
Gregory W. Wornell, Massachusetts Institute of Technology

72.1 Introduction
72.2 Eventually Expanding Maps and Markov Maps • Eventually Expanding Maps
72.3 Signals From Eventually Expanding Maps
72.4 Estimating Chaotic Signals in Noise
72.5 Probabilistic Properties of Chaotic Maps
72.6 Statistics of Markov Maps
72.7 Power Spectra of Markov Maps
72.8 Modeling Eventually Expanding Maps with Markov Maps
References

72.1 Introduction

One-dimensional nonlinear systems, although simple in form, are applicable in a surprisingly wide variety of engineering contexts. As models for engineering systems, their richly complex behavior has provided insight into the operation of, for example, analog-to-digital converters [1], nonlinear oscillators [2], and power converters [3]. As realizable systems, they have been proposed as random number generators [4] and as signal generators for communication systems [5, 6]. As analytic tools, they have served as mirrors for the behavior of more complex, higher dimensional systems [7, 8, 9].

Although one-dimensional nonlinear systems are, in general, hard to analyze, certain useful classes of them are relatively well understood. These systems are described by the recursion
$$x[n] = f(x[n-1]) \tag{72.1a}$$
$$y[n] = g(x[n]), \tag{72.1b}$$
initialized by a scalar initial condition x[0], where f(·) and g(·) are real-valued functions that describe the evolution of a nonlinear system and the observation of its state, respectively. The dependence of the sequence x[n] on its initial condition is emphasized by writing x[n] = f^n(x[0]), where f^n(·) represents the n-fold composition of f(·) with itself.

Without further restrictions on the form of f(·) and g(·), this class of systems is too large to easily explore. However, systems and signals corresponding to certain "well-behaved" maps f(·) and observation functions g(·) can be rigorously analyzed. Maps of this type often generate chaotic signals—loosely speaking, bounded signals that are neither periodic nor transient—under easily verifiable conditions. These chaotic signals, although completely deterministic, are in many ways analogous to stochastic processes. In fact, one-dimensional chaotic maps illustrate in a relatively simple setting that the distinction between deterministic and stochastic signals is sometimes artificial and can be profitably emphasized or deemphasized according to the needs of an application. For instance, problems of signal recovery from noisy observations are often best approached with a deterministic emphasis, while certain signal generation problems [10] benefit most from a stochastic treatment.

72.2 Eventually Expanding Maps and Markov Maps

Although signal models of the form (72.1) have simple, one-dimensional state spaces, they can behave in a variety of complex ways that model a wide range of phenomena. This flexibility comes at a cost, however; without some restrictions on its form, this class of models is too large to be analytically tractable. Two tractable classes of models that appear quite often in applications are eventually expanding maps and Markov maps.

72.2.1 Eventually Expanding Maps

Eventually expanding maps—which have been used to model sigma-delta modulators [11], switching power converters [3], other switched flow systems [12], and signal generators [6, 13]—have three defining features: they are piecewise smooth, they map the unit interval to itself, and they have some iterate with slope that is everywhere greater than unity. Maps with these features generate time series that are chaotic, but on average well behaved.
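A quick numerical illustration of the recursion (72.1): the sketch below iterates a map from a scalar initial condition and records the observed sequence. The tent-map choice for f(·), the identity observation g(·), and the function names are illustrative assumptions, not constructs from the chapter.

```python
def iterate_map(f, g, x0, n_samples):
    """Generate y[n] = g(x[n]) with x[n] = f(x[n-1]), per Eq. (72.1)."""
    x, ys = x0, []
    for _ in range(n_samples):
        ys.append(g(x))
        x = f(x)
    return ys

# Example: a symmetric tent map on [0, 1], observed directly (g = identity).
f = lambda x: 2 * x if x < 0.5 else 2 * (1 - x)
g = lambda x: x
print(iterate_map(f, g, x0=0.37, n_samples=8))
```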
For reference, the formal definition is as follows, where the restriction to the unit interval is convenient but not necessary:

DEFINITION 72.1 A nonsingular map f : [0, 1] → [0, 1] is called eventually expanding if

1. There is a set of partition points 0 = a_0 < a_1 < ··· < a_N = 1 such that, restricted to each of the intervals V_i = [a_{i−1}, a_i), called partition elements, the map f(·) is monotonic, continuous, and differentiable.
2. The function 1/|f′(x)| is of bounded variation [14]. (In some definitions, this smoothness condition on the reciprocal of the derivative is replaced with a more restrictive bounded slope condition, i.e., there exists a constant B such that |f′(x)| < B for all x.)
3. There exists a real λ > 1 and an integer m such that
$$\left| \frac{d}{dx} f^m(x) \right| \ge \lambda$$
wherever the derivative exists. This is the eventually expanding condition.

Every eventually expanding map can be expressed in the form
$$f(x) = \sum_{i=1}^{N} f_i(x)\,\chi_i(x) \tag{72.2}$$
where each f_i(·) is continuous, monotonic, and differentiable on the interior of the ith partition element, and the indicator function χ_i(x) is defined by
$$\chi_i(x) = \begin{cases} 1 & x \in V_i, \\ 0 & x \notin V_i. \end{cases} \tag{72.3}$$
This class is broad enough to include, for example, discontinuous maps and maps with discontinuous or unbounded slope. Eventually expanding maps also include a class that is particularly amenable to analysis—the Markov maps.

Markov maps are analytically tractable and broadly applicable to problems of signal estimation, signal generation, and signal approximation. They are defined as eventually expanding maps that are piecewise-linear and have some extra structure.

DEFINITION 72.2 A map f : [0, 1] → [0, 1] is an eventually expanding, piecewise-linear, Markov map if f is an eventually expanding map with the following additional properties:

1. The map is piecewise-linear, i.e., there is a set of partition points 0 = a_0 < a_1 < ··· < a_N = 1 such that, restricted to each of the intervals V_i = [a_{i−1}, a_i), called partition elements, the map f(·) is affine, i.e., the functions f_i(·) on the right side of (72.2) are of the form f_i(x) = s_i x + b_i.
2. The map has the Markov property that partition points map to partition points, i.e., for each i, f(a_i) = a_j for some j.

Every Markov map can be expressed in the form
$$f(x) = \sum_{i=1}^{N} (s_i x + b_i)\,\chi_i(x), \tag{72.4}$$
where s_i ≠ 0 for all i. Fig. 72.1 shows the Markov map
$$f(x) = \begin{cases} (1-a)x/a + a & 0 \le x \le a \\ (1-x)/(1-a) & a < x \le 1 \end{cases} \tag{72.5}$$
which has partition points {0, a, 1} and partition elements V_1 = [0, a) and V_2 = [a, 1).

FIGURE 72.1: An example of a piecewise-linear Markov map with two partition elements.

Markov maps generate signals with two useful properties: when suitably quantized, they are indistinguishable from signals generated by Markov chains; and they are close, in a sense, to signals generated by more general eventually expanding maps [15]. These two properties lead to applications of Markov maps for generating random numbers and approximating other signals. The analysis underlying these types of applications depends on signal representations that provide insight into the structure of chaotic signals.
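To make Definition 72.2 concrete, this short sketch implements the two-element Markov map of Eq. (72.5), checks numerically that every partition point maps to a partition point, and iterates it from an arbitrary initial condition. The parameter value a = 0.4 and the helper names are illustrative choices, not part of the original text.

```python
import numpy as np

def markov_map(x, a=0.4):
    """The piecewise-linear Markov map of Eq. (72.5)."""
    return (1 - a) * x / a + a if x <= a else (1 - x) / (1 - a)

a = 0.4
partition = [0.0, a, 1.0]  # partition points {0, a, 1}

# Markov property: each partition point maps (up to roundoff) to a partition point.
for point in partition:
    image = markov_map(point, a)
    assert any(abs(image - q) < 1e-12 for q in partition), "not a Markov map"

# Iterate the map to produce a chaotic sequence x[n] as in Eq. (72.1a).
x, orbit = 0.3141, []
for _ in range(10):
    orbit.append(x)
    x = markov_map(x, a)
print(np.round(orbit, 4))
```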
72.3 Signals From Eventually Expanding Maps

There are several general representations for signals generated by eventually expanding maps. Each provides different insights into the structure of these signals and proves useful in different applications.

First, and most obviously, a sequence generated by a particular map is completely determined by (and is thus represented by) its initial condition x[0]. This representation allows certain signal estimation problems to be recast as problems of estimating the scalar initial condition. Second, and less obviously, the quantized signal y[n] = g(x[n]), for n ≥ 0, generated by (72.1) with g(·) defined by
$$g(x) = i, \quad x \in V_i, \tag{72.6}$$
uniquely specifies the initial condition x[0] and hence the entire state sequence x[n]. Such quantized sequences y[n] are called the symbolic dynamics associated with f(·) [7]. Certain properties of a map, such as the collection of initial conditions leading to periodic points, are most easily described in terms of its symbolic dynamics. Finally, a hybrid representation of x[n] combining the initial condition and symbolic representations,
$$H[N] = \{ g(x[0]), \ldots, g(x[N]), x[N] \},$$
is often useful.
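The symbolic and hybrid representations are easy to compute in practice. The sketch below quantizes an orbit of the Markov map (72.5) through the symbol map of Eq. (72.6) and assembles the hybrid representation H[N]; the map, the parameter a = 0.4, and the helper names are carried over from the earlier illustrative sketch.

```python
def markov_map(x, a=0.4):
    # the two-branch Markov map of Eq. (72.5)
    return (1 - a) * x / a + a if x <= a else (1 - x) / (1 - a)

def symbol(x, a=0.4):
    # quantizer g(x) = i for x in V_i, Eq. (72.6): V_1 = [0, a), V_2 = [a, 1]
    return 1 if x < a else 2

# Generate the state sequence x[n] and its symbolic dynamics y[n] = g(x[n]).
N, x = 20, 0.3141
xs, ys = [], []
for _ in range(N + 1):
    xs.append(x)
    ys.append(symbol(x))
    x = markov_map(x)

# The hybrid representation H[N] of the text: the symbols plus the final state.
H = ys + [xs[-1]]
print("symbols y[0..N]:", ys)
print("final state x[N]:", H[-1])
```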
72.4 Estimating Chaotic Signals in Noise

The hybrid signal representation described in the previous section can be applied to a classical signal processing problem—estimating a signal in white Gaussian noise. For example, suppose the problem is to estimate a chaotic sequence x[n], n = 0, ..., N−1, from the noisy observations
$$r[n] = x[n] + w[n], \quad n = 0, \ldots, N-1 \tag{72.7}$$
where w[n] is a stationary, zero-mean white Gaussian noise sequence with variance σ_w², and x[n] is generated by iterating (72.1) from an unknown initial condition. Because w[n] is white and Gaussian, the maximum likelihood estimation problem is equivalent to the constrained minimum distance problem
$$\min_{\{x[n]\,:\,x[i]=f(x[i-1])\}} \; \varepsilon[N] = \sum_{k=0}^{N} \left( r[k] - x[k] \right)^2 \tag{72.8}$$
and to the scalar problem
$$\min_{x[0] \in [0,1]} \; \varepsilon[N] = \sum_{k=0}^{N} \left( r[k] - f^k(x[0]) \right)^2 . \tag{72.9}$$
Thus, the maximum-likelihood problem can, in principle, be solved by first estimating the initial condition, then iterating (72.1) to generate the remaining estimates. However, the initial condition is often difficult to estimate directly because the likelihood function (72.9), which is highly irregular with fractal characteristics, is unsuitable for gradient-descent type optimization [16]. Another solution divides the domain of f(·) into subintervals and then solves a dynamic programming problem [17]; however, this solution is, in general, suboptimal and computationally expensive.

Although the maximum likelihood problem described above need not, in general, have a computationally efficient recursive solution, it does have one when, for example, the map f(·) is a symmetric tent map of the form
$$f(x) = \beta - 1 - \beta |x|, \quad x \in [-1, 1] \tag{72.10}$$
with parameter 1 < β ≤ 2 [5]. This algorithm solves for the hybrid representation of the initial condition, from which an estimate of the entire signal can be determined. The hybrid representation is of the form
$$H[N] = \{ y[0], \ldots, y[N], x[N] \},$$
where each y[i] takes one of two values which, for convenience, we define as y[i] = sgn(x[i]). Since each y[n] can independently take one of two values, there are 2^N feasible solutions to this problem, and a direct search for the optimal solution is thus impractical even for moderate values of N.

The resulting algorithm has computational complexity that is linear in the length of the observation, N. This efficiency is the result of a special separation property possessed by the map [10]: given y[0], ..., y[i−1] and y[i+1], ..., y[N], the estimate of the parameter y[i] is independent of y[i+1], ..., y[N]. The algorithm is as follows. Denoting by φ̂[n|m] the ML estimate of any sequence φ[n] given r[k] for 0 ≤ k ≤ m, the ML solution is of the form
$$\hat{x}[n|n] = \frac{\left(\beta^2 - 1\right) \beta^{2n}\, r[n] + \left(\beta^{2n} - 1\right) \hat{x}[n|n-1]}{\beta^{2(n+1)} - 1} \tag{72.11}$$
$$\hat{y}[n|N] = \operatorname{sgn} \hat{x}[n|n] \tag{72.12}$$
$$\hat{x}_{ML}[n|n] = L_\beta\!\left(\hat{x}[n|n]\right), \tag{72.13}$$
where x̂[n|n−1] = f(x̂[n−1|n−1]), the initialization is x̂[0|0] = r[0], and the function L_β(·), defined by
$$L_\beta(x) = \begin{cases} x & x \in (-1, \beta - 1) \\ -1 & x \le -1 \\ \beta - 1 & x \ge \beta - 1, \end{cases} \tag{72.14}$$
serves to restrict the ML estimates to the interval x ∈ (−1, β−1). The smoothed estimates x̂_ML[n|N] are obtained by converting the hybrid representation to the initial condition and then iterating the estimated initial condition forward.
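The following sketch realizes the forward recursion (72.11)–(72.13) for the symmetric tent map and adds one plausible smoothing pass. The text only outlines the conversion from the hybrid representation back to the signal estimate, so the branch-inversion smoother and the clipping inside the prediction step are implementation assumptions.

```python
import numpy as np

def tent(x, beta):
    # symmetric tent map of Eq. (72.10)
    return beta - 1.0 - beta * abs(x)

def clip_L(x, beta):
    # the limiter L_beta of Eq. (72.14)
    return min(max(x, -1.0), beta - 1.0)

def ml_estimate(r, beta):
    """Forward recursion (72.11)-(72.13) followed by a backward smoothing pass."""
    N = len(r) - 1
    x_filt = np.zeros(N + 1)                 # x_hat[n|n]
    x_filt[0] = r[0]                         # initialization from the text
    for n in range(1, N + 1):
        # x_hat[n|n-1] = f(x_hat[n-1|n-1]); clipping keeps the argument in
        # the map's domain (an implementation assumption).
        x_pred = tent(clip_L(x_filt[n - 1], beta), beta)
        num = (beta**2 - 1) * beta**(2 * n) * r[n] + (beta**(2 * n) - 1) * x_pred
        x_filt[n] = num / (beta**(2 * (n + 1)) - 1)   # Eq. (72.11)
    y = np.sign(x_filt)                      # y_hat[n|N], Eq. (72.12)
    # Smoothing: drive the clipped final estimate backward through the branch
    # of the tent map selected by each symbol; inverting Eq. (72.10) gives
    # x[n-1] = y[n-1] * (beta - 1 - x[n]) / beta.
    x_sm = np.zeros(N + 1)
    x_sm[N] = clip_L(x_filt[N], beta)
    for n in range(N, 0, -1):
        x_sm[n - 1] = y[n - 1] * (beta - 1.0 - x_sm[n]) / beta
    return x_sm

# Quick check on synthetic data.
rng = np.random.default_rng(0)
beta, N = 1.9, 50
x = np.zeros(N + 1)
x[0] = 0.37
for n in range(1, N + 1):
    x[n] = tent(x[n - 1], beta)
r = x + 0.05 * rng.standard_normal(N + 1)
print("rms error:", np.sqrt(np.mean((ml_estimate(r, beta) - x) ** 2)))
```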
72.5 Probabilistic Properties of Chaotic Maps

Almost all waveforms generated by a particular eventually expanding map have the same average behavior [18], in the sense that the time average
$$\bar{h}(x[0]) = \lim_{n \to \infty} \frac{1}{n} \sum_{k=0}^{n-1} h(x[k]) = \lim_{n \to \infty} \frac{1}{n} \sum_{k=0}^{n-1} h\!\left(f^k(x[0])\right) \tag{72.15}$$
exists and is essentially independent of the initial condition x[0] for sufficiently well-behaved functions h(·). This result, which is reminiscent of results from the theory of stationary stochastic processes [19], forms the basis for a probabilistic interpretation of chaotic signals, which in turn leads to analytic methods for characterizing their time-average behavior.

To explore the link between chaotic and stochastic signals, first consider the stochastic process generated by iterating (72.1) from a random initial condition x[0] with probability density function p_0(·). Denote by p_n(·) the density of the nth iterate x[n]. Although, in general, the members of the sequence p_n(·) will differ, there can exist densities, called invariant densities, that are time-invariant, i.e.,
$$p_0(\cdot) = p_1(\cdot) = \cdots = p_n(\cdot) \triangleq p(\cdot). \tag{72.16}$$
When the initial condition x[0] is chosen randomly according to an invariant density, the resulting stochastic process is stationary [19] and its ensemble averages depend on the invariant density. Even when the initial condition is not random, invariant densities play an important role in describing the time-average behavior of chaotic signals. This role depends on, among other things, the number of invariant densities that a map possesses.

A general one-dimensional nonlinear map may possess many invariant densities. For example, eventually expanding maps with N partition elements have at least one and at most N invariant densities [20]. However, maps can often be decomposed into collections of maps, each with only one invariant density [19], and little generality is lost by concentrating on maps with only one invariant density. In this special case, the results that relate the invariant density to the average behavior of chaotic signals are more intuitive.

The invariant density, although introduced through the device of a random initial condition, can also be used to study the behavior of individual signals. Individual signals are connected to ensembles of signals, which correspond to random initial conditions, through a classical result due to Birkhoff, which asserts that the time average h̄(x[0]) defined by Eq. (72.15) exists whenever f(·) has an invariant density. When f(·) has only one invariant density, the time average is independent of the initial condition for almost all (with respect to the invariant density p(·)) initial conditions and equals
$$\lim_{n \to \infty} \frac{1}{n} \sum_{k=0}^{n-1} h(x[k]) = \lim_{n \to \infty} \frac{1}{n} \sum_{k=0}^{n-1} h\!\left(f^k(x[0])\right) = \int h(x)\, p(x)\, dx, \tag{72.17}$$
where the integral is performed over the domain of f(·) and where h(·) is measurable.

Birkhoff's theorem leads to a relative frequency interpretation of time-averages of chaotic signals. To see this, consider the time-average of the indicator function χ̃_{[s−ε, s+ε]}(x), which is zero everywhere but in the interval [s−ε, s+ε], where it is equal to unity. Using Birkhoff's theorem with Eq. (72.17) yields
$$\lim_{n \to \infty} \frac{1}{n} \sum_{k=0}^{n-1} \tilde{\chi}_{[s-\epsilon,\, s+\epsilon]}(x[k]) = \int \tilde{\chi}_{[s-\epsilon,\, s+\epsilon]}(x)\, p(x)\, dx \tag{72.18}$$
$$= \int_{[s-\epsilon,\, s+\epsilon]} p(x)\, dx \tag{72.19}$$
$$\approx 2\epsilon\, p(s), \tag{72.20}$$
where Eq. (72.20) follows from Eq. (72.19) when ε is small and p(·) is sufficiently smooth. The time-average (72.18) is exactly the fraction of time that the sequence x[n] takes values in the interval [s−ε, s+ε]. Thus, from (72.20), the value of the invariant density at any point s is approximately proportional to the relative frequency with which x[n] takes values in a small neighborhood of the point. Motivated by this relative frequency interpretation, the probability that an arbitrary function h(x[n]) falls into an arbitrary set A can be defined by
$$\Pr\{ h(x) \in A \} = \lim_{n \to \infty} \frac{1}{n} \sum_{k=0}^{n-1} \tilde{\chi}_A\!\left(h(x[k])\right). \tag{72.21}$$
Using this definition of probability, it can be shown that for any Markov map, the symbol sequence y[n] defined in Section 72.3 is indistinguishable from a Markov chain in the sense that
$$\Pr\{ y[n] \mid y[n-1], \ldots, y[0] \} = \Pr\{ y[n] \mid y[n-1] \} \tag{72.22}$$
holds for all n [21]. The first order transition probabilities can be shown to be of the form
$$\Pr( y[n] \mid y[n-1] ) = \frac{\left| V_{y[n]} \right|}{\left| s_{y[n-1]} \right| \left| V_{y[n-1]} \right|},$$
where the s_i are the slopes of the map f(·) as in Eq. (72.4) and |V_{y[n]}| denotes the length of the interval V_{y[n]}.

As an example, consider the asymmetric tent map
$$f(x) = \begin{cases} x/a & 0 \le x \le a \\ (1-x)/(1-a) & a < x \le 1 \end{cases}$$
with parameter in the range 0 < a < 1 and a quantizer g(·) of the form (72.6). The previous results establish that y[n] = g(x[n]) is equivalent to a sample sequence from the Markov chain with transition probability matrix
$$P = \begin{bmatrix} a & a \\ 1-a & 1-a \end{bmatrix},$$
where [P]_{ij} = Pr{y[n] = i | y[n−1] = j}. Thus, the symbolic sequence appears to have been generated by independent flips of a biased coin with the probability of heads, say, equal to a. When the parameter takes the value a = 1/2, this corresponds to a sequence of independent equally likely bits. Thus, a sequence of Bernoulli random variables can be constructed from a deterministic sequence x[n]. Based on this remarkable result, a circuit that generates statistically independent bits for cryptographic applications has been designed [4].

Some of the deeper probabilistic properties of chaotic signals depend on the integral (72.17), which in turn depends on the invariant density. For some maps, invariant densities can be determined explicitly. For example, the tent map (72.10) with β = 2 has invariant density
$$p(x) = \begin{cases} 1/2 & -1 \le x \le 1 \\ 0 & \text{otherwise} \end{cases}$$
as can be readily verified using elementary results from the theory of derived distributions of functions of random variables [22]. More generally, all Markov maps have invariant densities that are piecewise-constant functions of the form
$$p(x) = \sum_{i=1}^{N} c_i\, \chi_i(x) \tag{72.23}$$
where the c_i are real constants that can be determined from the map's parameters [23]. This makes Markov maps especially amenable to analysis.
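The coin-flip behavior of the quantized asymmetric tent map is easy to check numerically. The sketch below, with illustrative parameter a = 0.3, estimates the empirical first-order transition frequencies of the symbol sequence and compares them with the matrix above; a finite-precision orbit only approximates a true orbit, so close rather than exact agreement is expected.

```python
import numpy as np

def asym_tent(x, a=0.3):
    # asymmetric tent map from the example above
    return x / a if x <= a else (1.0 - x) / (1.0 - a)

a, n_steps = 0.3, 200_000
x = np.pi / 7  # arbitrary initial condition in (0, 1)
symbols = np.empty(n_steps, dtype=int)
for n in range(n_steps):
    symbols[n] = 1 if x < a else 2     # quantizer of Eq. (72.6)
    x = asym_tent(x, a)

# Empirical transition frequencies Pr{y[n] = i | y[n-1] = j}.
counts = np.zeros((2, 2))
for prev, cur in zip(symbols[:-1], symbols[1:]):
    counts[cur - 1, prev - 1] += 1
P_hat = counts / counts.sum(axis=0, keepdims=True)
print("empirical P:\n", np.round(P_hat, 3))
print("predicted P:\n", np.array([[a, a], [1 - a, 1 - a]]))
```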
72.6 Statistics of Markov Maps

The transition probabilities computed above may be viewed as statistics of the sequence x[n]. These statistics, which are important in a variety of applications, have the attractive property that they are defined by integrals having, for Markov maps, readily computable, closed-form solutions. This property holds more generally—Markov maps generate sequences for which a large class of statistics can be determined in closed form. These analytic solutions have two primary advantages over empirical solutions computed by time averaging: they circumvent some of the numerical problems that arise when simulating the long sequences of chaotic data that are necessary to generate reliable averages; and they often provide insight into aspects of chaotic signals, such as dependence on a parameter, that could not be easily determined by empirical averaging.

Statistics that can be readily computed include correlations of the form
$$R_{f;\, h_0, h_1, \ldots, h_r}[k_1, \ldots, k_r] = \lim_{L \to \infty} \frac{1}{L} \sum_{n=0}^{L-1} h_0(x[n])\, h_1\!\left(x[n+k_1]\right) \cdots h_r\!\left(x[n+k_r]\right) \tag{72.24}$$
$$= \int h_0(x)\, h_1\!\left(f^{k_1}(x)\right) \cdots h_r\!\left(f^{k_r}(x)\right) p(x)\, dx, \tag{72.25}$$
where the h_i(·)'s are suitably well-behaved but otherwise arbitrary functions, the k_i's are nonnegative integers, the sequence x[n] is generated by Eq. (72.1), and p(·) is the invariant density. This class of statistics includes as important special cases the autocorrelation function and all higher-order moments of the time-series. Of primary importance in determining these statistics is a linear transformation called the Frobenius-Perron (FP) operator, which enters into the computation of these correlations in two ways. First, it suggests a method for determining an invariant density. Second, it provides a "change of variables" within the integral that leads to simple expressions for correlation statistics.

The definition of the FP operator can be motivated by using the device of a random initial condition x[0] with density p_0(x), as in Section 72.5. The FP operator describes the time evolution of this initial probability density. More precisely, it relates the initial density to the densities p_n(·) of the random variables x[n] = f^n(x[0]) through the equation
$$p_n(x) = P_f^n\, p_0(x) \tag{72.26}$$
where P_f^n denotes the n-fold self-composition of P_f. This definition of the FP operator, although phrased in terms of its action on probability densities, can be extended to all integrable functions. This extended operator, which is also called the FP operator, is linear and continuous. Its properties are closely related to the statistical structure of signals generated by chaotic maps (see [9] for a thorough discussion of these issues). For example, the evolution equation (72.26) implies that an invariant density of a map is a fixed point of its FP operator, that is, it satisfies
$$p(x) = P_f\, p(x). \tag{72.27}$$
This relation can be used to determine explicitly the invariant densities of Markov maps [23], which may in turn be used to compute more general statistics.

Using the change of variables property of the FP operator, the correlation statistic (72.25) can be expressed as the ensemble average
$$R_{f;\, h_0, h_1, \ldots, h_r}[k_1, \ldots, k_r] = \tag{72.28}$$
$$\int h_r(x)\, P_f^{k_r - k_{r-1}} \left\{ h_{r-1}(x) \cdots P_f^{k_2 - k_1} \left\{ h_1(x)\, P_f^{k_1} \{ h_0(x)\, p(x) \} \right\} \cdots \right\} dx. \tag{72.29}$$
Although such integrals are, for general one-dimensional nonlinear maps, difficult to evaluate, closed-form solutions exist when f(·) is a Markov map—a development that depends on an explicit expression for the FP operator.

The FP operator of a Markov map has a simple, finite-dimensional matrix representation when it operates on certain piecewise polynomial functions. Any function of the form
$$h(x) = \sum_{i=0}^{K} \sum_{j=1}^{N} a_{ij}\, x^i\, \chi_j(x)$$
can be represented by an N(K+1) dimensional coordinate vector with respect to the basis
$$\left[ \theta_1(x), \theta_2(x), \ldots, \theta_{N(K+1)}(x) \right] = \left[ \chi_1(x), \ldots, \chi_N(x),\; x\chi_1(x), \ldots, x\chi_N(x),\; \ldots,\; x^K \chi_1(x), \ldots, x^K \chi_N(x) \right]. \tag{72.30}$$
The action of the FP operator on any such function can be expressed as a matrix-vector product: when the coordinate vector of h(x) is h, the coordinate vector of q(x) = P_f h(x) is
$$\mathbf{q} = \mathbf{P}_K\, \mathbf{h},$$
where P_K is the square N(K+1) dimensional, block upper-triangular matrix
$$\mathbf{P}_K = \begin{bmatrix} P_{00} & P_{01} & \cdots & \cdots & P_{0K} \\ 0 & P_{11} & P_{12} & \cdots & P_{1K} \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & \cdots & P_{KK} \end{bmatrix}, \tag{72.31}$$
and where each nonzero N × N block is of the form
$$P_{ij} = \binom{j}{i}\, P_0\, B^{j-i}\, S^j \quad \text{for } j \ge i. \tag{72.32}$$
The N × N matrices B and S are diagonal with elements B_ii = −b_i and S_ii = 1/s_i, respectively, while P_0 = P_00 is the N × N matrix with elements
$$[P_0]_{ij} = \begin{cases} 1/|s_j| & i \in I_j, \\ 0 & \text{otherwise} \end{cases} \tag{72.33}$$
(here I_j denotes the set of indices i for which V_i lies in the image f(V_j)).

The invariant density of a Markov map, which is needed to compute the correlation statistic (72.25), can be determined as the solution of an eigenvector problem. It can be shown that such invariant densities are piecewise constant functions, so that the fixed point equation (72.27) reduces to the matrix expression
$$P_0\, \mathbf{p} = \mathbf{p}.$$
Due to the properties of the matrix P_0, this equation always has a solution that can be chosen to have nonnegative components. It follows that the correlation statistic (72.29) can always be expressed as
$$R_{f;\, h_0, h_1, \ldots, h_r}[k_1, \ldots, k_r] = \mathbf{g}_1^T M\, \mathbf{g}_2 \tag{72.34}$$
where M is a basis correlation matrix with elements
$$[M]_{ij} = \int \theta_i(x)\, \theta_j(x)\, dx, \tag{72.35}$$
and the g_i are the coordinate vectors of the functions
$$g_1(x) = h_r(x) \tag{72.36}$$
$$g_2(x) = P_f^{k_r - k_{r-1}} \left\{ h_{r-1}(x) \cdots P_f^{k_2 - k_1} \left\{ h_1(x)\, P_f^{k_1} \{ h_0(x)\, p(x) \} \right\} \cdots \right\}. \tag{72.37}$$
By the previous discussion, the coordinate vectors g_1 and g_2 can be determined using straightforward matrix-vector operations. Thus, expression (72.34) provides a practical way of exactly computing the integral (72.29), and reveals some important statistical structure of signals generated by Markov maps.
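For the map of Eq. (72.5) with the piecewise-constant case K = 0, this machinery is small enough to carry out explicitly. The sketch below builds P_0 from Eq. (72.33), recovers the invariant density as the eigenvector of P_0 at eigenvalue 1 (the fixed-point condition P_0 p = p), and checks the resulting closed-form mean against a long time average. The parameter a = 0.4 is an illustrative choice, and the image sets I_j are worked out by hand for this particular map.

```python
import numpy as np

a = 0.4
# Markov map of Eq. (72.5): branch slopes and partition-element lengths.
s = np.array([(1 - a) / a, -1.0 / (1 - a)])   # s_1, s_2
lengths = np.array([a, 1 - a])                # |V_1|, |V_2|

# Images: f(V_1) = V_2 and f(V_2) = V_1 ∪ V_2, so I_1 = {2}, I_2 = {1, 2}.
covers = [[2], [1, 2]]
P0 = np.zeros((2, 2))
for j, I_j in enumerate(covers):
    for i in I_j:
        P0[i - 1, j] = 1.0 / abs(s[j])        # Eq. (72.33)

# Invariant density: eigenvector of P0 at eigenvalue 1, per Eq. (72.27).
w, V = np.linalg.eig(P0)
p = np.real(V[:, np.argmin(abs(w - 1.0))])
p /= p @ lengths                              # normalize the density to integrate to 1

# Closed-form mean E[x] = sum_i p_i * integral of x over V_i, versus a time average.
edges = np.array([0.0, a, 1.0])
mean_cf = sum(p[i] * (edges[i + 1]**2 - edges[i]**2) / 2 for i in range(2))

f = lambda x: (1 - a) * x / a + a if x <= a else (1 - x) / (1 - a)
x, acc, n = 0.2718, 0.0, 100_000
for _ in range(n):
    acc += x
    x = f(x)
print("closed-form mean:", mean_cf, " time-average:", acc / n)
```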
72.7 Power Spectra of Markov Maps

An important statistic in the context of many engineering applications is the power spectrum. The power spectrum associated with a Markov map is defined as the Fourier transform of its autocorrelation sequence
$$R_{xx}[k] = \int x\, f^k(x)\, p(x)\, dx \tag{72.38}$$
[…] method of driving white noise through a linear time-invariant filter with a rational system function, but also by iterating deterministic nonlinear dynamics. For this reason it is natural to view chaotic signals corresponding to Markov maps as "chaotic ARMA (autoregressive moving-average) processes". Special cases correspond to the "chaotic white noise" described in [5] and the first order autoregressive […]

72.8 Modeling Eventually Expanding Maps with Markov Maps

[…] good approximation to f(·), and then finding the statistics of the Markov map using the techniques described in Section 72.6.

References

[1] Feely, O. and Chua, L.O., Nonlinear dynamics of a class of analog-to-digital converters, Intl. J. Bifurcation and Chaos in Appl. Sci. Eng., 325, June 1992.
[2] Tang, Y.S., Mees, A.I. and Chua, L.O., Synchronization and chaos, IEEE Trans. Circuits and Systems, CAS-30(9), 620–626, […]
[…] 27, 1172–1173, 1991.
[4] Espejo, S., Martin, J.D. and Rodriguez-Vazquez, A., Design of an analog/digital truly random number generator, in 1990 IEEE International Symposium on Circuits and Systems, 1368–1371, 1990.
[5] Papadopoulos, H.C. and Wornell, G.W., Maximum likelihood estimation of a class of chaotic signals, IEEE Trans. Inform. Theory, 41, 312–317, Jan. 1995.
[6] Chen, B. and Wornell, G.W., Efficient […]
[…]
[15] Isabelle, S.H., A Signal Processing Framework for the Analysis and Application of Chaos, Ph.D. thesis, M.I.T., Cambridge, MA, Feb. 1995. Also RLE Tech. Rep. No. 593, Feb. 1995.
[16] Myers, C., Kay, S. and Richard, M., Signal separation for nonlinear dynamical systems, in Proc. Intl. Conf. Acoust., Speech, Signal Processing, 1992.
[17] Kay, S. and Nagesha, V., Methods for chaotic signal estimation, IEEE Trans. Signal Processing, […]
[…]
[23] […] transformations which have unique absolutely continuous invariant measures, Trans. Am. Math. Soc., 255, 243–262, 1979.
[24] Sakai, H. and Tokumaru, H., Autocorrelations of a certain chaos, IEEE Trans. Acoust., Speech, Signal Processing, 28(5), 588–590, 1980.
