Element-wise (or pointwise) operations notation?

Solution 1:

I've seen several conventions, including $\cdot$, $\circ$, $*$, $\otimes$, and $\odot$. However, most of these have overloaded meanings (see http://en.wikipedia.org/wiki/List_of_mathematical_symbols).

  • $\times$ (\times) -- cross product or Cartesian product.
  • $*$ (*) -- convolution.
  • $\cdot$ (\cdot) -- dot product.
  • $\bullet$ (\bullet) -- dot product.
  • $\otimes$ (\otimes) -- tensor product.
  • $\circ$ (\circ) -- function composition. Not a problem for vectors, but can be ambiguous for matrices.

Thus, in my personal experience, the best choice I've found is:

  • $\odot$ (\odot) -- to me the dot makes it look naturally like a multiplication operation (unlike other suggestions I've seen, such as $\diamond$), so it is relatively easy to parse visually, and it does not have an overloaded meaning as far as I know.

Also:

  • This question comes up often in multi-dimensional signal processing, so I don't think just trying to avoid vector multiplies is an appropriate notational solution. One important example is when you map from discrete coordinates to continuous coordinates by $x = i \odot \Delta + b$, where $i$ is an index vector, $\Delta$ is the sample spacing (say in mm), $b$ is an offset vector, and $x$ is the vector of spatial coordinates (in mm). If sampling is not isotropic, then $\Delta$ is a vector and element-wise multiplication is a natural thing to want to do. While in the above example I could avoid the problem by writing $x_k = i_k \Delta_k + b_k$, having a symbol for element-wise multiplication lets us mix and match matrix multiplies and element-wise multiplies, for example $y = A(i \odot \Delta) + b$ (a NumPy sketch of this appears right after this list).
  • Another alternative notation I've seen for $z = x \odot y$ for vectors is $z = \operatorname{diag}(x)\, y$. While this technically works for vectors, I find the $\odot$ notation to be far more intuitive. Furthermore, the "diag" approach only works for vectors -- it doesn't work for the Hadamard product of two matrices.
  • Often I have to play nicely with documents that other people have written, so changing the overloaded operator (like changing dot products to $\left< \cdot , \cdot \right>$ notation) often isn't an option, unfortunately.
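
To make the coordinate-mapping example above concrete, here is a minimal NumPy sketch (the names `i`, `Delta`, `b`, and `A` are just illustrative placeholders). In NumPy, `*` is already element-wise, so it plays the role of $\odot$, while `@` is the matrix product:

    import numpy as np

    i = np.array([3, 5, 2])            # index vector (discrete coordinates)
    Delta = np.array([0.5, 0.5, 2.0])  # anisotropic sample spacing in mm
    b = np.array([10.0, -4.0, 0.0])    # offset vector in mm
    A = np.eye(3)                      # e.g. an affine/rotation matrix

    x = i * Delta + b                  # element-wise product plus offset
    y = A @ (i * Delta) + b            # mixing a matrix multiply with an element-wise multiply

    # For vectors, the diag(x) y formulation from the bullet above gives the same result:
    assert np.allclose(i * Delta, np.diag(i) @ Delta)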

Thus I recommend $\odot$, as it is the only option I have come across that seems to have no immediate drawbacks.

Solution 2:

The element-wise product of matrices is known as the Hadamard product and can be notated as $A \circ B$.

Some basic properties of the Hadamard product are described in this section of an open-source linear algebra text. Wikipedia also mentions it in the article on Matrix Multiplication, under the alternate name Schur product.

As for the significance of element-wise multiplication in signal processing, we encounter it frequently in time-windowing operations, as well as in pointwise multiplication of DFT spectra, which is equivalent to convolution in time.
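
As a quick check of that equivalence, here is a minimal NumPy sketch (the signal values are arbitrary): pointwise multiplication of the DFTs matches circular convolution in time.

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0])
    h = np.array([0.5, -1.0, 0.0, 2.0])
    n = len(x)

    # Circular convolution computed directly in the time domain
    y_time = np.array([sum(x[m] * h[(k - m) % n] for m in range(n)) for k in range(n)])

    # Pointwise (element-wise) multiplication of the DFT spectra
    y_freq = np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)).real

    assert np.allclose(y_time, y_freq)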

I wouldn't say this notation has completely caught on; in many cases $A \cdot B$ is used (as in the statement of the convolution theorem above).

Searching for Hadamard Product on Math.SE will get you some other examples. (sorry, couldn't add as many links as I planned.)

Solution 3:

If you need to do this, it is generally a good idea to write your "vectors" as functions $I \to \mathbb{R}$ (or whatever) where $I$ is your index set, and then to say that $fg(i) = f(i) g(i)$ is the pointwise product. Mathematicians don't have a special notation for this because

  • nobody takes the pointwise product of vectors (in the geometric sense) because it's not invariant under change of coordinates, and
  • if you are taking the pointwise product of functions then it is generally clear that you are doing this from context (e.g. if $f, g$ are continuous functions $X \to \mathbb{R}$ where $X$ is a topological space).

If you are working in a context where both dot products and pointwise products make sense, the answer is to change your notation for dot products to something like $\langle f, g \rangle$.
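
Here is a minimal Python sketch of this viewpoint, with "vectors" represented as plain functions on an index set (all names are illustrative):

    # "Vectors" as functions from an index set I to the reals
    I = range(3)

    f = lambda i: float(i + 1)      # represents (1, 2, 3)
    g = lambda i: float(2 * i)      # represents (0, 2, 4)

    def pointwise(f, g):
        # (fg)(i) = f(i) * g(i)
        return lambda i: f(i) * g(i)

    fg = pointwise(f, g)
    print([fg(i) for i in I])       # [0.0, 4.0, 12.0]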

Solution 4:

I'd recommend $f^I$ for the elementwise version of $f$, following standard conventions in category theory. In particular, observe that given a set $I$, the function $\mathbf{Set} \rightarrow \mathbf{Set}$ given by $X \mapsto X^I$ becomes a functor as follows. For all functions $f : X \rightarrow Y$, the corresponding function $f^I : X^I \rightarrow Y^I$ is defined by composition: $$f^I(\tilde{x}) = f \circ \tilde{x}$$
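
A minimal Python sketch of this functorial reading, where an element of $X^I$ is a function from the index set to $X$ and $f^I$ acts by post-composition (all names are illustrative):

    def elementwise(f):
        # Send f : X -> Y to f^I : X^I -> Y^I by composition, f^I(x_tilde) = f o x_tilde
        return lambda x_tilde: (lambda i: f(x_tilde(i)))

    I = range(4)
    x_tilde = lambda i: i - 1.5              # an element of R^I, i.e. a "vector"
    square_I = elementwise(lambda t: t * t)  # lift t -> t^2 to act elementwise

    print([square_I(x_tilde)(i) for i in I])  # [2.25, 0.25, 0.25, 2.25]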