Is there a symmetric alternative to Kullback-Leibler divergence?

Beyond the symmetrized KL-divergence, Information Theoretic Learning proposed several symmetric "distances" between distributions. The idea is to treat pdfs as functions in an $L^2$ space, so you can compute the Euclidean distance $\int (p(x)-q(x))^2\,dx$, the Cauchy-Schwarz distance, and so on. These distances can even be estimated directly from data using Parzen windows. Check the link above for more; a small sketch follows below.
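
As a rough illustration (not from the linked material), here is a minimal sketch of how such Parzen-window estimators look in 1-D with Gaussian kernels. It uses the fact that the convolution of two Gaussians of width $\sigma$ is a Gaussian of variance $2\sigma^2$, so the cross term $\int \hat p(x)\hat q(x)\,dx$ has a closed form as an average over pairwise differences. The kernel width `sigma` and the function names are my own choices, not a reference implementation.

```python
import numpy as np

def cross_information_potential(x, y, sigma):
    """Estimate \int p_hat(t) q_hat(t) dt for 1-D samples x, y, where
    p_hat, q_hat are Parzen estimates with Gaussian kernels of width sigma.
    Convolving the two kernels gives a Gaussian of variance 2*sigma**2,
    so the integral is the mean of that wider Gaussian over all pairwise
    differences x_i - y_j."""
    diffs = x[:, None] - y[None, :]
    var = 2.0 * sigma ** 2
    gauss = np.exp(-diffs ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    return gauss.mean()

def euclidean_pdf_distance(x, y, sigma=0.5):
    """Squared L2 distance between the Parzen estimates:
    \int (p_hat - q_hat)^2 = V_p - 2 V_pq + V_q."""
    v_p = cross_information_potential(x, x, sigma)
    v_q = cross_information_potential(y, y, sigma)
    v_pq = cross_information_potential(x, y, sigma)
    return v_p - 2.0 * v_pq + v_q

def cauchy_schwarz_divergence(x, y, sigma=0.5):
    """D_CS(p, q) = -log( V_pq^2 / (V_p V_q) ), zero iff the estimates match."""
    v_p = cross_information_potential(x, x, sigma)
    v_q = cross_information_potential(y, y, sigma)
    v_pq = cross_information_potential(x, y, sigma)
    return -np.log(v_pq ** 2 / (v_p * v_q))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=500)   # samples from p
    y = rng.normal(1.0, 1.0, size=500)   # samples from q
    print(euclidean_pdf_distance(x, y))      # symmetric in x and y
    print(cauchy_schwarz_divergence(x, y))   # symmetric in x and y
```

Both quantities are symmetric by construction, since swapping the two sample sets only swaps $V_p$ and $V_q$ and leaves $V_{pq}$ unchanged.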