Is there a general way to prove series and products are modular?
The following functions $$\eta(q)=q^{1/24}(q;q)_\infty$$ $$E_{n}(z)=\sum_{\omega \in \Lambda\setminus \lbrace0\rbrace}\omega^{-n},\qquad \Lambda=\mathbb{Z}z+\mathbb{Z}$$ $$G(q)=q^{-1/60} \sum_{n \ge0} \frac{q^{n^2}}{(q;q)_n}$$ $$H(q)=q^{11/60} \sum_{n \ge0} \frac{q^{n^2+n}}{(q;q)_n}$$ and many other functions (Nahm sums etc.) are modular forms. I think I have seen a couple of bashy proofs of some of these being modular forms.
Is there a general method for proving such sums and products are modular forms? If so, could you be so kind as to show the technique in practice on some subset of the above (or similar)?
For clarity, I mean modular over $\text{SL}(2,\mathbb{Z})$.
Solution 1:
Of the functions you list, two are $q$-series, one is an Eisenstein series and one is the eta function. A general strategy, in the form of a computational procedure for proving or disproving modularity at this level of generality, does not exist. The subject is simply too rich and elaborate to be amenable to a canonical line of attack. However, useful principles exist, along with a fascinating theory that underlies the identities used to prove the modularity of at least the $q$-series you presented.
What do we know about modular forms for a specific $\Gamma<\textrm{SL}(2,\mathbb{Z})$ and a specific weight? First of all, they form finite-dimensional vector spaces. We have explicit generators, such as Eisenstein series, theta series etc., whose modularity is proved by methods (which I will mention below) that are interesting in their own right and generalizable. We have operators on these spaces that are diagonalized in nice bases of Hecke eigenforms. See http://maths.dur.ac.uk/~dma0hg/ModForms.pdf for a good, free introduction. The point is that spaces of modular forms have a very rich and complex structure, and membership in that elite club should never be taken lightly.
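For the full modular group this structure is completely explicit, which makes the finite-dimensionality concrete: the graded ring of modular forms is $$M_*(\mathrm{SL}(2,\mathbb{Z}))=\bigoplus_{k} M_k=\mathbb{C}[E_4,E_6],\qquad \dim M_k=\begin{cases}\lfloor k/12\rfloor & k\equiv 2 \pmod{12},\\ \lfloor k/12\rfloor+1 & \text{otherwise},\end{cases}$$ for even $k\ge0$ (here $E_4,E_6$ are the Eisenstein series normalized to have constant term $1$, and $M_k=0$ for odd or negative $k$). In particular $M_8$ is one-dimensional, a fact exploited below.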
Let me start with a few basic methods for showing modularity, leading up to a suite of structural statements related to the modularity of your series.
The most basic method to prove the modularity of a function is to have it exhibited as a sum over representatives of $\Gamma$ in the first place. Then an application of an action of $\Gamma$ will simply permute the sum and, hopefully, alter the summands by a factor of automorphy, as in the case of Eisenstein series. Of course we are not done: we need to prove the analytic conditions involved in modularity, which are crucial and non-trivial. Fortunately, if we can prove these conditions for a few functions, and then show that more complicated ones are functionally related to those in nice ways, the growth conditions become much easier to establish.
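Concretely, for even $k>2$ write the lattice sum from the question as $$E_k(z)=\sum_{(m,n)\in\mathbb{Z}^2\setminus\{(0,0)\}}\frac{1}{(mz+n)^k},\qquad \Lambda=\mathbb{Z}z+\mathbb{Z}.$$ For $\gamma=\begin{pmatrix}a&b\\c&d\end{pmatrix}\in\mathrm{SL}(2,\mathbb{Z})$ we have $m\frac{az+b}{cz+d}+n=\frac{(ma+nc)z+(mb+nd)}{cz+d}$, and $(m,n)\mapsto(ma+nc,\,mb+nd)$ is a bijection of $\mathbb{Z}^2\setminus\{(0,0)\}$, so $$E_k\!\left(\frac{az+b}{cz+d}\right)=(cz+d)^k\,E_k(z).$$ The absolute convergence that justifies this rearrangement is exactly where the condition $k>2$ enters, and it is the simplest instance of the analytic conditions mentioned above.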
If a function is presented as a Fourier series over $\mathbb{Z}$ (so we at least know it is invariant under $z\to z+1$), a uniform method to detect modularity is the (not necessarily straightforward!) application of the Poisson summation formula. This is one of the ways to prove modularity for the theta function, for instance. Although this tool may seem somewhat ad hoc, it is not: the Poisson summation formula (and its big adelic brother, about which you can read in any exposition of Tate's thesis) is one instance of a trace formula, a highly structural statement linking the geometry of orbits of a group such as $\Gamma$ acting on a suitable space and certain algebraic representations of an overlying algebraic group (such as $\textrm{SL}(2,\mathbb{R})$). This is also the first indication that certain aspects of Lie groups may have something to do with modularity.
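For the classical theta function the argument runs as follows (a sketch). Let $\theta(z)=\sum_{n\in\mathbb{Z}}e^{\pi i n^2 z}$ for $\mathrm{Im}\,z>0$ and apply Poisson summation to $f(x)=e^{\pi i x^2 z}$, whose Fourier transform is $\hat f(\xi)=(-iz)^{-1/2}e^{\pi i \xi^2(-1/z)}$: $$\theta(z)=\sum_{n\in\mathbb{Z}}f(n)=\sum_{n\in\mathbb{Z}}\hat f(n)=(-iz)^{-1/2}\,\theta(-1/z),\qquad\text{i.e.}\qquad \theta(-1/z)=\sqrt{-iz}\,\theta(z).$$ Together with the obvious $\theta(z+2)=\theta(z)$ this gives modularity of weight $1/2$ (with a multiplier) on the theta group, the subgroup of $\mathrm{SL}(2,\mathbb{Z})$ generated by $z\mapsto z+2$ and $z\mapsto -1/z$.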
Once we have proven the modularity of some basic functions directly, by Poisson summation or another Fourier/complex-analytic method, the next step is to see if we can write our functions 'in terms' of the previous ones. What 'in terms' means can differ wildly from function to function; for instance, although the $M_k$ are finite-dimensional, it is hopeless to expect a cheap proof that some series is modular by exhibiting it as a linear combination of Eisenstein series (usually it goes the other way round: we show a function is modular and then we stand gaping at the results as finite-dimensionality gives us all sorts of impossible-seeming equations for the arithmetically significant Fourier coefficients of each side).
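A standard illustration of that last phenomenon: $\dim M_8(\mathrm{SL}(2,\mathbb{Z}))=1$, and both $E_4^2$ and $E_8$ (normalized to have constant term $1$) lie in $M_8$, so $E_4^2=E_8$. Writing $$E_4(z)=1+240\sum_{n\ge1}\sigma_3(n)q^n,\qquad E_8(z)=1+480\sum_{n\ge1}\sigma_7(n)q^n,$$ and comparing coefficients yields the far-from-obvious arithmetic identity $$\sigma_7(n)=\sigma_3(n)+120\sum_{m=1}^{n-1}\sigma_3(m)\,\sigma_3(n-m).$$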
One notion of 'in terms' that is directly relevant to the functions you have is via differential equations (of hypergeometric type, for instance), leading to relationships between various $q$-series and (modified) theta functions that allow one to prove the modularity of the former via the latter. This is a tricky business, combining elementary combinatorial techniques illustrated, for example, in http://www.math.uiuc.edu/~berndt/articles/q.pdf with the method you can find in section 5.4 of http://people.mpim-bonn.mpg.de/zagier/files/doi/10.1007/978-3-540-74119-0_1/fulltext.pdf. There are entire books (e.g. the survey Partitions, q-Series, and Modular Forms) relevant to these techniques, but they are heavy-going and I would not recommend them at this stage.
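The simplest instance of such a sum-to-product relation is one of Euler's identities, which already ties a $q$-series directly to the eta product: $$\sum_{n\ge0}\frac{z^n}{(q;q)_n}=\frac{1}{(z;q)_\infty}\qquad(|q|,|z|<1),$$ so that, setting $z=q$, $$\sum_{n\ge0}\frac{q^n}{(q;q)_n}=\frac{1}{(q;q)_\infty}=\frac{q^{1/24}}{\eta(q)}.$$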
In the techniques above, heavy use is made of certain functional equations involving series, products, continued fractions, integral representations and more. These functional equations sometimes give modularity immediately or play a crucial role in the process. Examples include the Jacobi triple product identity, the Rogers-Ramanujan identities (which immediately give modularity for some $q$-series) and other classical identities of that form.
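To spell out how such an identity yields modularity for the first $q$-series in the question: the first Rogers-Ramanujan identity states $$\sum_{n\ge0}\frac{q^{n^2}}{(q;q)_n}=\prod_{n\ge0}\frac{1}{(1-q^{5n+1})(1-q^{5n+4})},$$ and the right-hand side, multiplied by $q^{-1/60}$, is a quotient of eta-like products whose modularity (on a suitable congruence subgroup and with a multiplier, rather than on all of $\mathrm{SL}(2,\mathbb{Z})$) follows from that of $\eta$ and theta functions via the Jacobi triple product. If you just want to convince yourself of such an identity numerically before hunting for a proof, a few lines of truncated power-series arithmetic suffice (a minimal sketch; the truncation order N is arbitrary):

```python
# Sanity-check the first Rogers-Ramanujan identity as a truncated power series in q.
# Series are lists of integer coefficients [c_0, c_1, ..., c_{N-1}].

N = 40  # truncation order: compare coefficients of q^0, ..., q^{N-1}

def mul(a, b):
    """Multiply two truncated power series."""
    c = [0] * N
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                if i + j < N:
                    c[i + j] += ai * bj
    return c

def geom(k):
    """Truncated series of 1/(1 - q^k) = 1 + q^k + q^{2k} + ..."""
    c = [0] * N
    for i in range(0, N, k):
        c[i] = 1
    return c

# Left side: sum_{n >= 0} q^{n^2} / (q;q)_n  with  (q;q)_n = (1-q)(1-q^2)...(1-q^n)
lhs = [0] * N
n = 0
while n * n < N:
    term = [0] * N
    term[n * n] = 1
    for k in range(1, n + 1):
        term = mul(term, geom(k))
    lhs = [x + y for x, y in zip(lhs, term)]
    n += 1

# Right side: prod_{n >= 0} 1 / ((1 - q^{5n+1})(1 - q^{5n+4}))
rhs = [0] * N
rhs[0] = 1
for k in range(1, N):
    if k % 5 in (1, 4):
        rhs = mul(rhs, geom(k))

assert lhs == rhs   # coefficients agree up to q^{N-1}
print(lhs[:10])     # [1, 1, 1, 1, 2, 2, 3, 3, 4, 5]: partitions into parts = 1, 4 (mod 5)
```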
It turns out that many of these functional equations have a common origin: the representation theory of semisimple Lie algebras and related algebraic objects (such as vertex operator algebras, Kac-Moody algebras etc.), and in particular formulas for the characters of such representations in terms of combinatorial data. If there is a unifying principle that links modularity to series of the form you present above, this is it. To illustrate this unifying principle, you should read about the Macdonald identities, e.g. at http://en.wikipedia.org/wiki/Macdonald_identities.
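To get the flavor: in the simplest cases the Macdonald identities specialize to classical facts such as the Jacobi triple product and Jacobi's formula for the cube of the eta product, $$\prod_{n\ge1}(1-q^{2n})(1+zq^{2n-1})(1+z^{-1}q^{2n-1})=\sum_{n\in\mathbb{Z}}z^{n}q^{n^2},\qquad \prod_{n\ge1}(1-q^n)^3=\sum_{n\ge0}(-1)^n(2n+1)q^{n(n+1)/2},$$ while the general identities give analogous lattice-sum expansions for powers such as $\eta(q)^{\dim\mathfrak{g}}$ attached to each affine root system.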
A good book on the link between modularity and representation theory, written in a casual manner, is one I am currently reading myself: Moonshine beyond the Monster. It covers many (too many) topics, and although it can be vague and somewhat inaccurate, it conveys the general ideas with good prose and includes plenty of references to authoritative material. Plus, the sections on modularity and the affine-algebra interpretations are among the best written in the book.