Calculating the Lyapunov Exponent of a Time Series (with Python code)


I found this method during my Master's, while recreating the results of an interesting paper on how some standard tests for chaos fail to distinguish chaos from stochasticity (Stochastic neural network model for spontaneous bursting in hippocampal slices, Biswal and Dasgupta, 2002).

Procedure

The Lyapunov exponent is a measure of sensitive dependence on initial conditions, i.e. how quickly two nearby states diverge.

Now consider two points in the time series, t_i and t_j, whose values are very close. That means the system reached nearly the same state at the i-th and j-th iterations. Now consider the two sequences t_i, t_(i+1), t_(i+2), … and t_j, t_(j+1), t_(j+2), … We wish to know how these two sequences diverge from each other. For this, consider the distance between them after k steps: d(k) = |t_(i+k) - t_(j+k)|. (This is for a 1D time series; for higher dimensions you can define d(k) to be the Euclidean distance and modify the code accordingly — a sketch of this appears after the code below.) If the system is chaotic, d(k) will initially rise exponentially with k. One can therefore plot ln d(k) against k and apply a linear fit; the slope is an estimate of the Lyapunov exponent.

(Since the system is bounded, though, the two nearby states will not diverge indefinitely. Their exponential divergence stops after some length, so we must fit the straight line only within this initial region.)

Now, this was for a single pair of initial states. The Lyapunov exponent is an average of this divergence exponent over all nearby initial pairs. So define <ln d(k)>, where <..> denotes averaging over all starting pairs t_i, t_j such that the initial distance d(0) = |t_i - t_j| is less than some fixed small value. The program finds all such initial pairs, calculates <ln d(k)>, plots it against k, and the slope of the initial linear part gives us the Lyapunov exponent.

Python Code

The following code reads the time series from a text file named ‘timeseries.txt’. The file must contain only the time-series values in a single column, with no serial numbers or any other text before or after. The program asks for the starting diameter within which to limit the initial pairs, and it displays a running count of how many such pairs it is finding, so you can adjust the diameter based on this.

It outputs a text file, ‘lyapunov.txt’ with two columns, k and <ln d(k)>, which you can then plot and fit in the correct region by visual inspection.
from math import log

# distance between the i-th and j-th values of the (1D) series
def d(series,i,j):
    return abs(series[i]-series[j])

f=open('timeseries.txt', 'r')
series=[float(i) for i in f.read().split()]
f.close()
N=len(series)
eps=float(input('Initial diameter bound: '))
dlist=[[] for i in range(N)]  # dlist[k] collects ln d(k) over all nearby pairs
n=0  # number of nearby pairs found
for i in range(N):
    for j in range(i+1,N):
        if d(series,i,j) < eps:
            n+=1
            print(n)
            for k in range(min(N-i,N-j)):
                dk=d(series,i+k,j+k)
                if dk > 0:  # skip exact repeats, where ln d(k) is undefined
                    dlist[k].append(log(dk))
f=open('lyapunov.txt','w')
for i in range(len(dlist)):
    if len(dlist[i]):
        # write k and <ln d(k)>, averaged over all nearby starting pairs
        print(i, sum(dlist[i])/len(dlist[i]), file=f)
f.close()
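For a multivariate time series, as mentioned above, only the distance function needs to change to a Euclidean distance. A minimal sketch, assuming series has been loaded as a list of equal-length state vectors (a layout the file-reading above would also need to be adapted for):

from math import sqrt

# Euclidean distance between the i-th and j-th state vectors
def d(series,i,j):
    return sqrt(sum((a-b)**2 for a,b in zip(series[i],series[j])))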

The following is the plot and fit of the resulting data from a logistic map series with an appropriately chosen initial diameter.

Lyapunov Exponent of Logistic Map

I deliberately did not automate the plotting and fitting part, because (a) it's tedious and hard to write in a way that runs on most installations, and (b) human eyes will do a much more reliable job of identifying where the linear portion ends.
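That said, if you do want to compute the slope after eyeballing the plot, a least-squares fit over the linear region is short. A minimal sketch using numpy, where kmax (the end of the linear region) is an assumption you read off the plot yourself:

import numpy as np

k, lnd = np.loadtxt('lyapunov.txt', unpack=True)
kmax = 50  # end of the initial linear region, chosen by visual inspection
slope, intercept = np.polyfit(k[:kmax], lnd[:kmax], 1)
print('Lyapunov exponent estimate:', slope)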

R code for multivariate random-walk Metropolis sampling


I couldn’t find a simple R code for random-walk Metropolis sampling (the symmetric-proposal special case of Metropolis-Hastings sampling) from a multivariate target distribution in arbitrary dimensions, so I wrote one. This is also my first R code.
It requires the package MASS, whose mvrnorm function is used to sample from the multivariate normal proposal distribution. If you are using R on Windows, you can download the package zip for Windows from CRAN, and use Packages > Install package(s) from local zip files… in the GUI to install the package.
The reason I couldn’t write the code for a general Metropolis algorithm (i.e. for any arbitrary symmetric proposal distribution, not just normal) or a more general Metropolis-Hastings algorithm (with any arbitrary proposal distribution, symmetric or not) is that generating the proposal point would then require sampling from an arbitrary proposal distribution. This is only easy for a few standard distributions, but hard in general (which is the point of using such algorithms in the first place).

I. Function

The following is the function that does the random-walk Metropolis sampling when supplied with the required arguments. Notes about the arguments follow the code.

rwmetro <- function(target,N,x,VCOV,burnin=0)
{
    require(MASS)   # for mvrnorm, the multivariate normal sampler
    samples <- x
    for (i in 2:(burnin+N))
    {
        # propose a move from a multivariate normal centered at the current point
        prop <- mvrnorm(n = 1, x, VCOV)
        # accept the proposal with probability min(1, target(prop)/target(x))
        if (runif(1) < min(1, target(prop)/target(x)))
            x <- prop
        samples <- rbind(samples,x)
    }
    # drop the burn-in rows and return the N retained samples
    samples[(burnin+1):(N+burnin),]
}

II. Arguments

  1. target function
    The function defining the multivariate target distribution, written as a function of an n-vector, where n is the number of variables on which the distribution is defined. The different variables of your distribution must be written as x[1], x[2] etc.
    Example
    An example is the following, defining the function f(x,y)=exp(-5*abs(x^2+y^2-1)) in two dimensions:

    ring2D <- function(x)	# x is a vector
    {
    	exp(-5*abs(x[1]^2+x[2]^2-1))
    }
    
  2. N integer
    The final sample size (i.e., excluding the burn-in length).
    Example
    100
  3. x numeric vector
    The starting point (vector) of your Metropolis-Hastings chain. This vector needs to be the same length as the dimension of the target distribution. Your target function must be able to accept this vector as an argument.
    Example
    c(-0.1,20.5)
  4. VCOV numeric matrix
The variance-covariance matrix of the multivariate normal used as the proposal distribution for random-walk Metropolis. It must be an n×n matrix, where n is the dimension of the target distribution, i.e. the length of the vectors accepted by target and the length of x. You can vary the entries of this matrix and observe your results to see what works better for sampling your target distribution.
    Example
    The following line defines a matrix in two dimensions with .01 variance for each variable and no covariance between them.

    vcov2D <- .01*diag(2)
    
  5. burnin (optional) integer
The ‘burn-in’ length for the chain. The number specified is the number of initial samples discarded. If nothing is specified, it is taken to be 0.
    Example
    20

III. Output
numeric matrix
The output is a matrix where each row is a sample from your target distribution, excluding the initial burn-in samples. The number of rows is thus the sample size, and the number of columns is equal to the dimension of the target distribution. You can use this matrix however you want, to save, visualize or analyze your sample.
Example
An example output of sampling a 2D distribution with sample size 5. Each row is an x,y sample.

[1,] 0.12961923 0.03708061
[2,] 0.10765053 -0.02798036
[3,] 0.01112930 -0.07255766
[4,] 0.06049625 -0.04546265
[5,] 0.1832164 -0.1244354

IV. Usage Example
Let’s take the target distribution we used as an example above, f(x,y)=exp(-5*abs(x^2+y^2-1)). This looks like a ring of radius 1 rising from the x-y plane and centered at the origin. Here is a gnuplot surface plot of the distribution (because I found it frustratingly hard to figure out a level plot in R):

Target Distribution in Gnuplot
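(If you would rather stay in R, a rough level plot is possible in base R after all; a minimal sketch, evaluating the same density on a grid with outer() and drawing it with image():)

xs <- seq(-1.5, 1.5, length.out=200)
ys <- seq(-1.5, 1.5, length.out=200)
z <- outer(xs, ys, function(x, y) exp(-5*abs(x^2 + y^2 - 1)))
image(xs, ys, z, col=heat.colors(100), xlab="x", ylab="y")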
Let’s generate a sample of size 40,000 from this distribution with the starting point (0,0) and without any burn-in length, and with the variance-covariance matrix we defined before. This is done by calling the function with the correct arguments:

ringsample2D <- rwmetro(ring2D,40000,c(0,0), vcov2D)

This assumes that you have already defined the target ring2D and the matrix vcov2D as explained above, before this function call.
ringsample2D now contains a random sample from the distribution.
With the following one-line R code, I made a scatter plot of the sample from the ringsample2D matrix. (Remember that the samples are the rows, so the x and y values are the first and second columns.)

plot(ringsample2D[,1], ringsample2D[,2], xlim=c(-1.5,1.5),ylim=c(-1.5,1.5), main="Metropolis-Hastings Sample",xlab="x", ylab="y", pch='.')

The following is the plot:
Metropolis-Hastings Sample, scatter plot

Putting all the code together in sequence, this is what the full code for defining the arguments, drawing the sample, and making the plot for this example looks like:

# Define arguments
ring2D <- function(x)    # x is a vector
{
    exp(-5*abs(x[1]^2+x[2]^2-1))
}

vcov2D <- .01*diag(2)

#Define the sampling function
rwmetro <- function(target,N,x,VCOV,burnin=0)
{
    require(MASS)   # for mvrnorm, the multivariate normal sampler
    samples <- x
    for (i in 2:(burnin+N))
    {
        # propose a move from a multivariate normal centered at the current point
        prop <- mvrnorm(n = 1, x, VCOV)
        # accept the proposal with probability min(1, target(prop)/target(x))
        if (runif(1) < min(1, target(prop)/target(x)))
            x <- prop
        samples <- rbind(samples,x)
    }
    # drop the burn-in rows and return the N retained samples
    samples[(burnin+1):(N+burnin),]
}

# Call the sampling function with the arguments
ringsample2D <- rwmetro(ring2D,40000,c(0,0),vcov2D)

# Use the sample
plot(ringsample2D[,1], ringsample2D[,2], xlim=c(-1.5,1.5),ylim=c(-1.5,1.5), main="Metropolis-Hastings Sample",xlab="x", ylab="y", pch='.')

The following is another sample, this time for a 3-variable function, f(x,y,z)=exp(-5*abs(x^2+y^2+z^2-1)). This is just the 3D version of the previous function, and its density is mostly concentrated on a spherical shell of radius 1 and of some thickness, centered at (0,0,0). Notice how we never have to spare a thought about normalizing these functions to get the sample, which is one of the advantages of the Metropolis-Hastings algorithm.
The target function is now:

ring3D <- function(x)	# x is a vector
{
	exp(-5*abs(x[1]^2+x[2]^2+x[3]^2-1))
}

and the variance-covariance matrix is, say, .01 down the diagonal again:

vcov3D <- .01*diag(3)

Let’s go for a sample of 20000 points after a burn-in of 20000, starting from (0,0,0). The function is then called as:

ringsample3D <- rwmetro(ring3D,20000,c(0,0,0),vcov3D,20000)

ringsample3D is now a 20000×3 matrix, and I use the following to get a rotatable 3D plot of it (requires the rgl package):

require(rgl)
plot3d(ringsample3D[,1], ringsample3D[,2], ringsample3D[,3], xlab='x',ylab='y', zlab='z',col="red", size=1)

Metropolis Sample

I would have preferred a spinning 3D gif, but I’m too tired to work that out right now.
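For a braver (or less tired) soul: rgl can generate one. A minimal sketch, assuming the plot3d scene above is still open and ImageMagick is installed for rgl to assemble the gif:

require(rgl)
# spin the scene about the z-axis and write the frames out as an animated gif
movie3d(spin3d(axis=c(0,0,1), rpm=10), duration=6, movie="ringsample3D_spin")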

Partners meet halfway: a simple correlation study of an undergrad lab class


Last semester I taught two classes of an Introductory Physics (Electrodynamics) for Engineers lab course at the University of Texas at Austin. The first day that my students came in, most of them did not know each other. They just came in and sat at tables of two, and these pretty much became their permanent seats in the room for the rest of the course (that's human habit). They were supposed to work with their partner on the experiments in the first part of the class, but individually on their worksheets in the second part, so choosing their seats also fixed the partners. It is thus a fair assumption that the pairing was essentially random, at least as far as the academic ability of the students was concerned.

I noticed, though, that both partners in a group tended to move towards a common average in their performance over the course. If one partner was diligent, it would motivate (and sometimes force) the other to work harder, while the sloppiness and uncaring attitude of a partner would also negatively affect the performance of the other. I was happy to notice that the ‘uplifting’ effect was usually more dominant than the ‘down-dragging’. In any case, this caused the performance of partners to pull closer, irrespective of their starting academic ability.

At the end of the semester, once I had the final scores that I handed out to them, I decided to do a simple correlation test. The following are two scatter plots of pairs of student scores, one for each of my classes, each pair ordered as (better score, worse score) (which is why all points lie below the bottom-left to top-right diagonal of the plot). The linear fits through them are also shown. Although I had suspected that there would be a significant correlation between partners' scores, I was surprised at how high the correlation coefficients turned out to be. (Note that if you pair an even number of numbers randomly, on average you should expect a linear correlation coefficient of zero.)

[Scatter plot: partner score pairs with linear fit, class 1]

[Scatter plot: partner score pairs with linear fit, class 2]
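If you want to run the same test on scores of your own, here is a minimal sketch in Python; the file name and its two-column (better score, worse score) layout are assumptions:

import numpy as np

# hypothetical file: one row per lab pair, columns (better score, worse score)
pairs = np.loadtxt('scores.txt')
better, worse = pairs[:,0], pairs[:,1]
r = np.corrcoef(better, worse)[0,1]              # Pearson correlation coefficient
slope, intercept = np.polyfit(better, worse, 1)  # linear fit through the pairs
print('r =', r)
print('fit: worse =', slope, '* better +', intercept)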

I had sensed this trend by the middle of the semester, and I had this crazy idea of using it to try and pull up the performance of the students who were not doing too well: sort the students by their scores so far into the course, and re-assign partners so as to pair the best student with the worst, the second-best with the second-worst, and so on. But habit is a deadly force. If I told my students in the middle of the course that they had to say goodbye to a partner with whom they already had some steady chemistry going, and accept new partners that I had chosen for them, all hell would break loose. So it had to rest in my head as a theoretical result with possible practical applications, to be taken up by a braver soul than me. But hey, here's the data, so you can quote it if you want to try it and someone doesn't take the idea too well.

End of the Road


Time and darkness, the greatest equalizers. One takes everything to the same form, while the other makes form irrelevant.

Lost are we the children of electricity. We go to sleep in our impure darknesses clothed in flickering electronic lights, our silent digital sentinels watching through the fake night. For this is not the pure night our fathers came home to over geological ages, this is not the pure ancient darkness of the universe they learned to hold close as they slept.

And learn the lessons from this we shall. For this only means that the final equalization into formless darkness and endless time shall be abrupt, unfamiliar, uncaring, for the universe slows down for nobody. And no mercy or patience shall be shown to excuse our brick, iron and electronic hideaways where we fled life all our lives.

That is what is waiting at the end of the road. All our roads end at that one point. Formless darkness. Endless time.

And all your lights and fireworks and expensive digital sentinels are but a successful commercial distraction while you wait. Congratulations, you’re alive.

Fever


I’ve never had hallucinations before when I was sick.

Some invisible form was holding my knees, my feet, pulling them, dragging me slowly by my legs downwards. It was so real. And I knew that the more I let myself float into these imaginings the more real and physical they would become, the more inescapable. Just like a drug trip.

I would struggle to try and wake myself. It’s important, terribly important, I would tell myself in my head in my half-awake blurred darkness. If you don’t try and wake and remember the real life where you are now, you are going to drown in this, and he will have you. He will pull you down. This is going to get scarier and he will get more real, and there will be real damage. This is all in your head, but your entire life has all been in your head, right? So don’t underestimate how real something in your head can be. Don’t give in, wake up.

And I would jolt awake every time, dizzy from the pain of immediate transfer into reality, having to take it all in again. My room, the time of the day, sounds of repairmen in the bathroom fixing the heater. All over again. Why did I have to take in the same reality so many times to know it was real? It was all so tiring that sometimes I wondered if it was worth it to not just go with the hallucinations.

Who is he? I blearily scanned my room. Daylight lay only faint fingers through the closed blinds on the soft, hazy darkness. There was no one. I was so scared in my head. I couldn’t decide if I wouldn’t be less scared if I actually saw someone to explain the touching and the pulling.

Then I would lie and breathe deeply, half-panting, and start to drowse again back into his gently but steadily pulling arms.

Later I would sit at the electronic keyboard, in a state medically known as wide awake, feeling weak but fine otherwise, and forget the notes I play every day. I could only tell that it didn’t sound the way it sounds every day. I would look at my fingers on the keys, pressing down where I had trusted muscle memory to so naturally lead them, and hear the plain wrongness of the notes, and draw a complete blank on what I was doing wrong. And then, as I sat poring over my fingers as though my unaccepting, disbelieving stare would heal them of their malady, I would be so scared again inside my head, as I felt the inklings of something happening within me that I had never known before.
