Has the notion of a complex number of dimensions ever been described? And what about negative dimensionality?

The notion of associating a number $a \in \mathbb{R}_{\geq 0}$ to any metric space is captured by the definition of the "Hausdorff dimension". I was wondering whether work has been done on spaces which (seem to) have a complex number of dimensions associated with them. Does this concept exist? If so, when is it useful, if at all?

Inspired by the comments, I am also interested in whether the concept of negative dimensionality has already been explored.

Thanks in advance.


Negative dimension is actually much easier to talk about than complex dimension. Super vector spaces are a natural collection of objects that can have negative dimension; given a super vector space $(V_0, V_1)$ we can define its dimension to be $\dim V_0 - \dim V_1$, and this definition has many nice properties; see this blog post, for example.
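As a toy illustration of this definition (the function names here are my own, not from any library), the super dimension is just a signed count, and it behaves well under direct sum and tensor product, mirroring ordinary dimension:

```python
# Sketch: a super vector space recorded as its pair of (even, odd) dimensions.
# Names are illustrative, not from any standard library.

def super_dim(dim_even: int, dim_odd: int) -> int:
    """Dimension of the super vector space (V_0, V_1)."""
    return dim_even - dim_odd

def direct_sum(a, b):
    # (V_0 + W_0, V_1 + W_1): even and odd parts add separately.
    return (a[0] + b[0], a[1] + b[1])

def tensor(a, b):
    # even part: even*even + odd*odd; odd part: even*odd + odd*even.
    return (a[0] * b[0] + a[1] * b[1], a[0] * b[1] + a[1] * b[0])
```

The point of the sign convention is that `super_dim` is additive under `direct_sum` and multiplicative under `tensor`, which is one of the "nice properties" mentioned above.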

More generally, there is a natural notion of dimension in any (braided?) monoidal category with duals (see Traces in symmetric monoidal categories by Ponto and Shulman for a definition and thorough discussion). It includes as special cases many notions of Euler characteristic, and in particular is frequently negative, although it is not always a number; in general it takes values in the monoid $\text{End}(I)$, where $I$ is the identity object. (If the category is preadditive with the monoidal product distributing over addition of morphisms, then $\text{End}(I)$ is a ring, and one can ask whether it is isomorphic to a subring of $\mathbb{C}$.)


Algebraic stacks are a far-reaching generalization of algebraic varieties. If an algebraic variety is considered as a stack, then its dimension as stack is the same as its dimension as variety. However there are many stacks that do not correspond to varieties, and some of these have negative dimension.

Specifically, if $V$ is a variety and $G$ is an algebraic group acting on $V$, then we can always form the quotient stack $[V/G]$ (which in most cases won't be a variety). Then we have $$\dim([V/G])=\dim(V)-\dim(G)$$ which may well be negative. For instance, if you let $G$ act trivially on a point $P$, then the quotient stack $[P/G]$, known as the classifying stack of $G$, will have dimension $\dim([P/G])=-\dim(G)$.
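To make the formula concrete, here is the standard first example written out (this is just the statement above specialized to $G = \mathrm{GL}_n$):

```latex
% Take V = \mathrm{pt} (a point, so \dim V = 0) and let G = \mathrm{GL}_n
% (an algebraic group of dimension n^2) act trivially. Then
\dim\left([\mathrm{pt}/\mathrm{GL}_n]\right)
  = \dim(\mathrm{pt}) - \dim(\mathrm{GL}_n)
  = 0 - n^2
  = -n^2 .
```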

The same game can be played with differentiable stacks, smooth manifolds, and Lie groups acting on those manifolds. Also: topological stacks, topological manifolds, and topological groups acting on those manifolds.


One notion of complex dimension that has been used extensively has to do with self-similar sets. A $t$-neighborhood (i.e. points within distance $t$) of such a set may have volume $v(t)$ bounded above and below by constant multiples of $t^d$, where $d$ is the dimension of the boundary and $t$ is small, but such that $t^{-d} v(t)$ is oscillatory and non-convergent as $t$ goes to zero. In such a case the oscillatory information may sometimes be described using a complex power of $t$. If $v(t)=t^{d+ci}= t^d \exp(ic \log t)$ one could think of the dimension being $d+ci$. In general sets will have many complex dimensions. There are applications to Weyl asymptotics and connections to number theory (spectral zeta functions).

For details see the work of Michel Lapidus (on fractal strings) or the paper of Erin Pearse on complex dimensions of self-similar systems.
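A quick numerical sketch of why a complex exponent encodes oscillation (my own illustration; the value $c = 2\pi/\log 3$ is the standard oscillatory period for the middle-thirds Cantor string, used here as an assumption):

```python
import math

# t^(d+ic) = t^d * exp(ic log t), so t^(-d) v(t) oscillates like
# cos(c log t): periodic in log t, i.e. invariant under t -> t / e^(2*pi/c).
# For the middle-thirds Cantor set the natural period corresponds to
# c = 2*pi/log(3): the oscillation repeats each time t shrinks by a factor of 3.

c = 2 * math.pi / math.log(3)

def oscillation(t: float) -> float:
    """Real part of t^(ic), the log-periodic factor in t^(-d) v(t)."""
    return math.cos(c * math.log(t))

# Multiplicative period: shrinking t by a factor of 3 returns the same value,
# so the function keeps oscillating and never converges as t -> 0.
for t in (0.1, 0.01, 0.005):
    assert math.isclose(oscillation(t), oscillation(t / 3), abs_tol=1e-9)
```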


Interesting question!

The Hausdorff dimension is defined as an infimum over a subset of the nonnegative reals. That means it's nonnegative and real by definition, so if you're looking for a way for a space to have negative or complex dimension, you're either going to have to generalize the Hausdorff dimension in some way or use some other definition of dimensionality.

There are some other ways to define dimensionality.

The Lebesgue covering dimension is a nonnegative integer by definition.

The inductive dimension is defined by induction. If you start with a number that isn't a nonnegative integer, then the induction won't bottom out with a point of dimension zero.

The Nöbeling-Pontryagin theorem and Menger-Nöbeling theorem may constrain what you can do here. They would seem to imply that your space has to be either not normal or not second countable, or else it would be related in certain ways to $\mathbb{R}^n$, which makes it seem unlikely that it could have negative or complex dimension.

To find some motivation for a generalized notion of dimension that could be negative or complex:

(1) If it's negative, that suggests that you want to study situations where it's sensible to subtract dimensions of spaces.

(2) If it's complex, that suggests thinking about when it's sensible to multiply dimensions, because the complex numbers have no interest unless you can multiply them. Also, how would a space with dimension $i$ differ from a space with dimension $-i$?


I thought about allowing a negative value of $d$ in the definition given in the article linked in the OP. My definition does not extend the Hausdorff dimension; rather, it considers an alternate interpretation of dimension taking nonpositive values instead of nonnegative ones.

It seems to me that the definitions of Hausdorff dimension offer little leeway for generalizing to negative dimensions: you would have to admit negative values of $d$ in the definition of $\dim_H(X)$, and I don't see that playing very well with the infimum that's stuck in there.

As it stands in the article you linked, we have this $C_H^d$ function (the "Hausdorff content") that takes a subset $D$ of a metric space $X$ and spits out the infimum of the sums of the form $\displaystyle\sum_{i} r_i^d$, where $\{r_i\}$ are the radii of some collection of open balls that cover $D$.

In the article, the sums can grow without bound, but to compensate we take the infimum (to "filter out the big sums"). So I think it's reasonable that allowing a negative value of $d$ means we probably want to look at a function like this:

$$C_{\tilde{H}}^d(D) = \displaystyle\sup_{\{r_i\}} \left\{ \displaystyle\sum_i r_i^d \right\},$$

where $\{r_i\}$ ranges over covers of $D$ just as it does for $C_H^d$, and $d < 0$ is held fixed. This time we use a $\sup$ over covers to compensate for the unboundedness of the sums as the radii shrink (to "filter out the small sums").

Now in the article, they use the function $C_H^d$ to define another function $\dim_H$ (the Hausdorff dimension) by $\dim_H(D) = \inf\{d \geq 0 \colon C_H^d(D) = 0\}$. Since the set of admissible $d$'s is unbounded above, they filter out the big ones by taking the infimum.
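As a small numerical sketch of how that infimum singles out $d = 1$ for a line segment (my own illustration; these particular covers only give upper bounds on the content, but they show where the transition happens):

```python
# Cover [0, 1] by n balls of radius r_i = 1/(2n); the candidate sum for the
# Hausdorff content at exponent d is then n * (1/(2n))**d.

def cover_sum(n: int, d: float) -> float:
    return n * (1 / (2 * n)) ** d

# For d > 1 the sums shrink to 0 as n grows (so C_H^d([0,1]) = 0);
# for d < 1 they blow up; at d = 1 every such cover gives exactly 1/2.
print(cover_sum(10**6, 1.5))   # small, and shrinking with n
print(cover_sum(10**6, 0.5))   # large, and growing with n
print(cover_sum(10**6, 1.0))   # exactly 0.5 for every n
# Hence dim_H([0,1]) = inf{d >= 0 : C_H^d([0,1]) = 0} = 1.
```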

What we should do is now clear: define the nonpositive Hausdorff dimension

$$\dim_{\tilde{H}}(D) = \sup \{d < 0 \colon C_{\tilde{H}}^d(D) = 0 \}.$$

Does this result in anything new? Is the nonpositive Hausdorff dimension of a set simply the negative of the usual Hausdorff dimension? Is the nonpositive Hausdorff dimension of a line (which normally has Hausdorff dimension $1$) simply $-1$, or is it something more interesting? Some conjectures to investigate:

Conjecture: $\dim_{\tilde{H}}(\emptyset) = 0$?

Conjecture (now a theorem, using the modified $C_{\tilde{H}}^d$ from the addendum below): $\dim_{\tilde{H}}(\{1,2,3\}) = 0$

Conjecture: $\dim_{\tilde{H}}(\text{line}) = -1$?

Conjecture: $\dim_{\tilde{H}}(\text{plane}) = -2$?

Conjecture: $\dim_{\tilde{H}}(\text{Cantor set}) = -\frac{\log(2)}{\log(3)}$?

If these conjectures are true, then I suspect that this alternative formulation of dimension is pretty boring in that it doesn't tell us anything we didn't already know. If not, I'm interested in seeing what happens.

Of course, this idea could easily generalize to complex values of $d$. We may have to throw a modulus in somewhere, but it could work similarly. We could either define genuinely complex-valued dimensions from complex $d$'s, or cast things back down to the reals by taking moduli while still using complex $d$'s. There are lots of possibilities for finding something interesting.
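One wrinkle with the modulus idea (my own observation, not from the article): for $r > 0$ we have $|r^{a+bi}| = r^a$, so taking moduli termwise erases the imaginary part of $d$ entirely; the complex part would have to enter the definition before the modulus is taken to carry any information.

```python
# |r^(a+bi)| = r^a * |exp(i*b*log r)| = r^a, because |exp(i*theta)| = 1.
# So a "sum of moduli" version of the content only sees Re(d).
r, d = 0.3, 1.5 + 2.0j
assert abs(abs(r ** d) - r ** d.real) < 1e-12
```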


Addendum:

I took the time to show that $\dim_{\tilde{H}}(\{1\}) = 0$. While doing this, I decided that we should be using the following definition of $C_{\tilde{H}}^d$ (note: I added a minus sign to the previous definition):

$$C_{\tilde{H}}^d(D) = \displaystyle\sup_{\{r_i\}} \left\{ -\displaystyle\sum_i r_i^d \right\}.$$

Here we are taking the negative of the sum (why this matters will become apparent shortly). For now, set $d = -1$; the argument is the same for arbitrary $d < 0$. Then,

$$\begin{array}{ll} C_{\tilde{H}}^{-1}(\{1\}) &= \displaystyle\sup_{\{r_i\}} \left\{ -\displaystyle\sum_i r_i^{-1} \right\} \\ &= \displaystyle\sup_{r_1 > 0} \left\{ - \frac{1}{r_1} \right\}, \\ \end{array}$$

where $r_1$ is the radius of any open ball that covers $\{1\}$. We are seeking the supremum of a set of negative numbers, so we want $\frac{1}{r_1}$ as small as possible (so that $-\frac{1}{r_1}$ is as large as possible), which gives $C_{\tilde{H}}^{-1}(\{1\}) = 0$. (NOTE: this is why I amended the definition of $C_{\tilde{H}}^d$ with the minus sign -- without it, we would instead deduce $C_{\tilde{H}}^{-1}(\{1\}) = \infty$ by taking $r_1$ arbitrarily small! That would be a big problem, because then no $d$ would satisfy $C_{\tilde{H}}^d(\{1\}) = 0$, and the dimension would be undefined.)

Now notice that we get this same behavior no matter which $d \in (-\infty,0)$ we choose, yielding $C_{\tilde{H}}^d(\{1\}) = 0$ for all $d < 0$. So by definition,

$$\dim_{\tilde{H}}(\{1\}) = \sup \{d < 0 \colon C_{\tilde{H}}^d(\{1\}) = 0 \} = 0,$$

since $0$ is the supremum of $(-\infty,0)$. $\blacksquare$

The same argument works for the set $\{1,2,3\}$, so the conjecture above is resolved, under the assumption that we use the modified $C_{\tilde{H}}^d$ (with the minus sign).
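Here's a quick numerical sanity check of the computation above (my own sketch; it just traces the behavior of a single-ball cover under the modified content):

```python
# For fixed d < 0, a singleton is covered by one ball of radius r,
# contributing -(r**d) to the modified content. This quantity is negative
# for every finite r but increases toward 0 as r grows, so the supremum
# over covers is 0, matching the hand computation.

def candidate(r: float, d: float) -> float:
    return -(r ** d)

vals = [candidate(r, -1.0) for r in (1, 10, 100, 1000)]
assert vals == sorted(vals)          # increasing toward 0 as r grows
assert all(v < 0 for v in vals)      # never actually reaches 0
# Hence C^d({1}) = 0 for every d < 0, and dim = sup{d < 0 : ...} = 0.
```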

Since I am unfamiliar with the techniques for showing that the Hausdorff dimension of a line is $1$, I cannot easily proceed further.