I am interested in examples of calculus on "strange" spaces. For example, you can take the derivative of a regular expression[1][2]. The concept also extends past regular languages to more general formal languages[3].
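To make [1] concrete, here is a minimal Haskell sketch of Brzozowski's derivative (the constructor and function names are my own, and no simplification of the resulting expressions is performed); note how the concatenation case mirrors the product rule:

```haskell
-- A sketch of Brzozowski derivatives; constructor names are mine, and no
-- simplification of the resulting expressions is performed.
data Regex
  = Empty            -- matches no string
  | Epsilon          -- matches only the empty string
  | Chr Char         -- matches a single character
  | Alt Regex Regex  -- union
  | Cat Regex Regex  -- concatenation
  | Star Regex       -- Kleene star
  deriving Show

-- nullable r is True iff r matches the empty string.
nullable :: Regex -> Bool
nullable Empty     = False
nullable Epsilon   = True
nullable (Chr _)   = False
nullable (Alt r s) = nullable r || nullable s
nullable (Cat r s) = nullable r && nullable s
nullable (Star _)  = True

-- deriv c r matches exactly those words w such that (c : w) is matched by r.
-- The concatenation case is the analogue of the product rule.
deriv :: Char -> Regex -> Regex
deriv _ Empty     = Empty
deriv _ Epsilon   = Empty
deriv c (Chr d)   = if c == d then Epsilon else Empty
deriv c (Alt r s) = Alt (deriv c r) (deriv c s)
deriv c (Cat r s)
  | nullable r    = Alt (Cat (deriv c r) s) (deriv c s)
  | otherwise     = Cat (deriv c r) s
deriv c (Star r)  = Cat (deriv c r) (Star r)

-- A word is matched iff the derivative along it is nullable.
matches :: Regex -> String -> Bool
matches r = nullable . foldl (flip deriv) r
```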

You can also do calculus on abstract data types; here is an example in Haskell[4]. Differential equations are type-inference equations. You can also Taylor-expand types[5].
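Here is a small Haskell sketch of the usual slogan here, "the derivative of a type is its type of one-hole contexts" (the type and function names below are my own):

```haskell
-- The derivative of a container type as its type of one-hole contexts;
-- the type and function names here are my own.

-- Pairs: P(a) = a^2, so dP/da = a + a: the hole is either the left or the
-- right slot, and the remaining slot still holds an element.
data PairHole a = HoleLeft a | HoleRight a
  deriving Show

plugPair :: a -> PairHole a -> (a, a)
plugPair x (HoleLeft  r) = (x, r)
plugPair x (HoleRight l) = (l, x)

-- Lists: L(a) = 1 + a * L(a), formally L(a) = 1/(1 - a), and
-- dL/da = 1/(1 - a)^2 = L(a) * L(a): a one-hole context in a list is the
-- pair of element lists before and after the hole, i.e. a list zipper.
data ListHole a = ListHole [a] [a]
  deriving Show

plugList :: a -> ListHole a -> [a]
plugList x (ListHole before after) = reverse before ++ [x] ++ after

-- Every way of punching a hole in a list, paired with the removed element.
holes :: [a] -> [(a, ListHole a)]
holes = go []
  where
    go _      []       = []
    go before (x : xs) = (x, ListHole before xs) : go (x : before) xs
```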

I am looking for more examples of this. Note that I am only interested in cases where the calculus is similar enough to "normal" calculi (e.g. the calculus of functions of a complex variable, functional calculus, etc.). At the very least, the operators must be linear; for example, the arithmetic derivative is not interesting to me because it is not linear.
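For concreteness, the arithmetic derivative fails additivity: $(2+3)'=5'=1$ while $2'+3'=2$. A tiny Haskell sketch of that check (the helper names are my own):

```haskell
-- The arithmetic derivative, defined by p' = 1 for primes p together with
-- the Leibniz rule (ab)' = a'b + ab'. Helper names are my own.
arithDeriv :: Integer -> Integer
arithDeriv n
  | n <= 1    = 0
  | otherwise = m + p * arithDeriv m   -- n = p * m with p prime, so n' = p'm + pm' = m + pm'
  where
    p = head [d | d <- [2 ..], n `mod` d == 0]  -- smallest prime factor of n
    m = n `div` p

-- Failure of additivity (hence of linearity):
--   arithDeriv (2 + 3)           == 1
--   arithDeriv 2 + arithDeriv 3  == 2
```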

The examples I gave are all from computer science, but I am interested in more general answers.

  1. Brzozowski, "Derivatives of Regular Expressions"
  2. Owens, "Regular-expression derivatives reexamined"
  3. Might, "Parsing with Derivatives"
  4. "The Algebra of Algebraic Data Types", Part 3
  5. "The Algebra of Algebraic Data Types", Part 2

Here are some (partial) examples:

Calculus on normed vector spaces:

  • “Calculus on Normed Vector Spaces” by Rodney Coleman.
  • Differentiation in Fréchet spaces.

p-adic analysis:

  • “p-Adic Analysis and Lie Groups” by Peter Schneider.
  • “An Introduction to p-adic Numbers and p-adic Analysis” by Andrew Baker.
  • “p-adic Numbers, p-adic Analysis, and Zeta-Functions” by Neal Koblitz.

Haar integral of a function on a locally compact topological group:

  • Wikipedia article.
  • MSE question about Haar measure.

Formal derivative in ring theory (see the sketch below).
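The last item is easy to make concrete: the formal derivative is defined purely algebraically, with no limits, and it is linear and satisfies the Leibniz rule. A minimal Haskell sketch (the representation is my own, with a `Num` coefficient type standing in for a commutative ring):

```haskell
-- Formal derivative of a polynomial a0 + a1*x + a2*x^2 + ..., stored as its
-- coefficient list [a0, a1, a2, ...]. No limits are involved: the rule
-- d/dx (a_k x^k) = k a_k x^(k-1) is applied purely symbolically, and the
-- resulting operator is linear and satisfies the Leibniz rule.
formalDeriv :: Num a => [a] -> [a]
formalDeriv []         = []
formalDeriv (_ : rest) = zipWith (\k c -> fromInteger k * c) [1 ..] rest

-- Example over the integers:
--   formalDeriv [5, 0, 3, 2]   -- 5 + 3x^2 + 2x^3
--   == [0, 6, 6]               -- 6x + 6x^2
```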


Have you looked at differential posets? (Definitions below from Wikipedia.)

Let $P$ be a locally finite graded poset with unique minimal element. An element $x$ of $P$ is said to cover another element $y$ of $P$ if $x>y$ and the rank of $x$ is one more than that of $y$--or equivalently, if $x>y$ but there is no element $z$ of $P$ with $x > z > y$.

Then $P$ is $r$-differential if

  • For all $x \in P$, exactly $r$ more elements cover $x$ than are covered by $x$.
  • For $x, y \in P$ distinct, the number of elements covering both $x$ and $y$ and the number of elements covered by both $x$ and $y$ are the same.

These conditions have an equivalent form as a sort of differential identity involving linear operators on a vector space with basis the elements of $P$. Let $U$ take the basis vector $x \in P$ to the sum of elements covering $x$, and let $D$ take $x$ to the sum of elements covered by $x$. It can be checked that, for basis vectors $x,y$ belonging to the same level of $P$, the scalar $y$-component of $(DU - UD)x$ is the number of elements covering both $x$ and $y$ minus the number of elements covered by both $x$ and $y$--or, when $x=y$, just the number of elements covering $x$ minus the number covered by $x$.

Thus $P$ is $r$-differential if and only if $$DU-UD = rI.$$

The operator $D$ can be thought of as a differential operator, decreasing the rank of a (non-minimal) element by one just as the derivative decreases the degree of a non-constant polynomial by one. Similarly, $U$ can be thought of as multiplication by a variable. For $1$-differential posets, the above identity is similar to a case of the product rule: $$\frac{\partial }{\partial x}x f(x) - x\frac{\partial }{\partial x}f(x) = f(x).$$

An interesting $1$-differential poset is Young's lattice, wherein "differential calculus" on the space of Young diagrams (equivalently, integer partitions) can reveal combinatorial identities about Young tableaux. Namely, it is an easy consequence of the "product rule" that $D^nU^n(\varnothing)=n!\,\varnothing$ for $\varnothing$ the minimal element of a $1$-differential poset--just like $\frac{\partial^n }{\partial x^n}x^n=n!$ in calculus. Interpreting standard Young tableaux as covering sequences of Young diagrams, that equation becomes the remarkable Young-Frobenius identity, as Mitchell Lee explains on pp. 8-11 here.
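Since $U$ and $D$ are just linear maps on the vector space spanned by Young diagrams, the identity $D^nU^n(\varnothing)=n!\,\varnothing$ (which unpacks to $\sum_\lambda (f^\lambda)^2 = n!$ over partitions $\lambda$ of $n$) can be verified directly for small $n$. A minimal Haskell sketch, with a partition encoding and helper names of my own choosing:

```haskell
import qualified Data.Map.Strict as M

-- Partitions as weakly decreasing lists of positive integers; [] is the
-- empty diagram.
type Partition = [Int]

-- Formal Z-linear combinations of partitions.
type Vec = M.Map Partition Integer

-- Extend a "sum over covers" map linearly to combinations.
linear :: (Partition -> [Partition]) -> Vec -> Vec
linear f v = M.fromListWith (+) [ (q, c) | (p, c) <- M.toList v, q <- f p ]

-- Covers of p in Young's lattice: add a box to any row where it still fits,
-- or start a new row of length 1.
up :: Partition -> [Partition]
up p = [ addBox i | i <- [1 .. length p], canAdd i ] ++ [p ++ [1]]
  where
    canAdd 1 = True
    canAdd i = p !! (i - 2) > p !! (i - 1)
    addBox i = take (i - 1) p ++ [p !! (i - 1) + 1] ++ drop i p

-- Elements covered by p: remove a box from any row where that leaves a
-- valid Young diagram.
down :: Partition -> [Partition]
down p = [ removeBox i | i <- [1 .. length p], canRemove i ]
  where
    canRemove i = i == length p || p !! (i - 1) > p !! i
    removeBox i = filter (> 0) (take (i - 1) p ++ [p !! (i - 1) - 1] ++ drop i p)

uOp, dOp :: Vec -> Vec
uOp = linear up
dOp = linear down

-- Coefficient of the empty diagram in D^n U^n (empty diagram); by the
-- identity above it should equal n!.
checkYF :: Int -> Integer
checkYF n =
  M.findWithDefault 0 [] (iterate dOp (iterate uOp (M.singleton [] 1) !! n) !! n)

main :: IO ()
main = mapM_ (\n -> putStrLn (show n ++ ": " ++ show (checkYF n))) [0 .. 6]
-- prints 0: 1, 1: 1, 2: 2, 3: 6, 4: 24, 5: 120, 6: 720
```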