Isomorphisms: preserve structure, operation, or order?

Everyone always says that isomorphisms preserve structure... but given the (multiple) definitions of isomorphism, I fail to see how the definitions match the intuitive meaning, which is that two sets are "basically the same" once you ignore naming and notation.

Here are the different definitions I've come across:

Order Isomorphism

Let $A$ be a (totally) ordered set with ordering $\le$ and $B$ be a (totally) ordered set with ordering $\preceq$. An isomorphism of $A$ onto $B$ is a bijection $f:A\to B$ that satisfies $$a \le b \iff f(a) \preceq f(b)$$ for all $a,b \in A$.
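For instance (an illustration of mine, not part of the definition): the exponential map $f(x) = e^x$ is an order isomorphism from $(\mathbb{R},\le)$ onto $((0,\infty),\le)$, since it is a bijection and $$a \le b \iff e^a \le e^b$$ for all $a,b \in \mathbb{R}$.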

Group Isomorphism

Let $A$ be a group with operation $\ast$ and $B$ be a group with operation $\#$. An isomorphism of $A$ onto $B$ is a bijection $f:A\to B$ that satisfies $$a \ast b = c \iff f(a) \# f(b) = f(c)$$ for all $a,b,c \in A$; or, put more simply, $$f(a\ast b) = f(a) \# f(b)$$ for all $a,b \in A$.
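The same map illustrates the group case: $\exp$ is a bijection from $(\mathbb{R},+)$ onto $((0,\infty),\times)$ satisfying $$e^{a+b} = e^a \times e^b,$$ so these two groups are isomorphic even though their operations look quite different.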

Field Isomorphism

Let $A$ be a field with operations $\#$ and $\ast$ and $B$ be a field with operations $+$ and $\times$. An isomorphism of $A$ onto $B$ is a bijection $f:A\to B$ that satisfies $$f(a \# b) = f(a) + f(b)$$ and $$f(a \ast b) = f(a) \times f(b)$$ for all $a,b \in A$.
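A standard example is complex conjugation: $f(z) = \bar{z}$ is a bijection from $\mathbb{C}$ onto itself satisfying $$\overline{z+w} = \bar{z} + \bar{w} \quad\text{and}\quad \overline{zw} = \bar{z}\,\bar{w},$$ so it is a field isomorphism of $\mathbb{C}$ with itself (an automorphism).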

Homomorphism

The same as an isomorphism but not necessarily a bijection.
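For example, the parity map $f:\mathbb{Z}\to\mathbb{Z}/2\mathbb{Z}$ given by $f(n) = n \bmod 2$ satisfies $$f(m+n) = f(m) + f(n)$$ for all $m,n$, but is far from injective, so it is a group homomorphism that is not an isomorphism.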

An example of why this is confusing: under the standard set-theoretic constructions, it is technically not the case that $\mathbb{N}\subseteq\mathbb{Z}$. For example, in the natural numbers, $0$ is the empty set $\varnothing$, which has no elements; in the integers, however, $0$ is the equivalence class $[(0,0)]$ of ordered pairs of equal natural numbers, which has infinitely many elements. These sets are not by any means equal, but for all intents and purposes we consider $\mathbb{N}$ to be a subset of $\mathbb{Z}$, because the set of non-negative integers is "basically the same as" the set of natural numbers. That is, the natural numbers have an "isomorphic copy" in the integers. The question is: which type of isomorphism?

Which of these definitions of isomorphism is "correct", and how does it equate to the intuitive meaning that the two sets are "basically the same"? Furthermore, what's the point of homomorphism, why is it useful, and how is its intuitive meaning similar or different from the intuitive meaning of isomorphism?


None of them is incorrect; they are all correct within their magisteria. That's why we talk about "group isomorphism", "order isomorphism", "field isomorphism", etc. We omit the specific kind when it is understood from context. Asking which is "correct" and which is "incorrect" is like asking which of the many "Joneses" in the telephone directory is "the real Jones", and which ones are impostors. And just as a particular surname may be common, if in your place of work there is only one person with that surname, they may be referred to as "Jones" without fear of confusion.

When you are working with groups, you are interested in group homomorphisms and group isomorphisms. When you are working with semigroups, you are interested in semigroup homomorphisms and semigroup isomorphisms. When you are working with ordered sets, you are interested in order homomorphisms and order isomorphisms. Etc.

The reason that an isomorphism corresponds to the "essentially the same object" idea is that a bijection works like a "relabeling" of the objects. Consider the act of translating numbers from English to Spanish. Addition of numbers should not depend on which language we are speaking, in the sense that since "two" corresponds to dos, and "three" corresponds to tres, we should expect "five" (which is "two" plus "three") to correspond to whatever dos más tres (namely, cinco) corresponds to. The properties that the numbers and addition of numbers have do not depend on what we call the numbers. So, numbers-under-addition is "essentially the same, except for the names we use" as números-bajo-suma. An isomorphism is a way of saying that the only differences between the two objects, as far as the particular structure we are interested in is concerned, are the "names" we give to the objects and the operations.

Your example deals with a very specific construction of $\mathbb{Z}$ in terms of $\mathbb{N}$. The identification in fact carries a lot of structure: it is a bijection that respects (i) order; (ii) addition; (iii) multiplication; and (iv) any operation derived from these. There are other ways of defining $\mathbb{Z}$ that do include $\mathbb{N}$ as a subset. The point of the isomorphism is that it does not matter how we construct $\mathbb{Z}$ and how we construct $\mathbb{N}$: in the end we have a set with certain properties, sitting inside another set that has further properties, and these properties are maintained regardless of how we constructed these objects.
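To make the identification concrete (writing $E$ for the inclusion map, notation of mine): with $\mathbb{Z}$ constructed as equivalence classes of pairs of natural numbers under $(a,b)\sim(c,d) \iff a+d = b+c$, the embedding is $E(n) = [(n,0)]$, and one checks directly that $$E(m+n) = E(m) + E(n),\qquad E(mn) = E(m)\cdot E(n),\qquad m\le n \iff E(m) \le E(n),$$ so $E$ is a bijection onto the non-negative integers that respects order, addition, and multiplication all at once.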

Correction: It is not quite correct that "homomorphism is the same as isomorphism but not necessarily a bijection". For example, in the case of "order homomorphism", the condition only requires that $a\leq b\implies f(a)\preceq f(b)$; the converse is not required (though it is required for isomorphisms). It is better to say that an isomorphism of partially ordered sets is a homomorphism that has an inverse that is also a homomorphism; this gives you the biconditional as a consequence, rather than as a premise, and it fits into the general scheme better. (We do the same thing with topological spaces, where the concept of isomorphism is that of homeomorphism: a continuous map with an inverse that is also continuous. The definition also works in the context of groups, rings, semigroups, fields, etc., but it turns out that in those cases, being a bijective homomorphism suffices to guarantee that the set-theoretic inverse is also a homomorphism.)
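A concrete counterexample (an illustration of mine): let $A = \{x,y\}$ carry only the reflexive relations $x\le x$ and $y\le y$ (no order between $x$ and $y$), and let $B = \{x,y\}$ with $x\preceq y$. The identity map $A\to B$ is a bijective order homomorphism, since the only related pairs in $A$ are sent to related pairs in $B$; but its inverse is not a homomorphism, because $x\preceq y$ in $B$ while $x\not\le y$ in $A$. So a bijective homomorphism of ordered sets need not be an isomorphism.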

As to homomorphisms vs. isomorphisms: there is a very fruitful philosophy that maps between objects are more important than the objects themselves. If you look at vector spaces, you can stare at particular vector spaces and get a lot of nice things, but vector spaces don't truly come into their own (in terms of power, applicability, usefulness, etc.) until you introduce linear transformations (which are homomorphisms of vector spaces). Groups are ubiquitous, but it is homomorphisms between groups (which allow you to consider representations, group actions, and many other things) that make them impressively useful. The real numbers, as a metric space, are very nice; but it is continuous functions (homomorphisms of metric/topological spaces) that are the cornerstone of their applicability to physics and other contexts. Homomorphisms are "functions that play nice with the structure we are interested in." Functions are very useful, structure may be very useful, and homomorphisms are a way of getting the best of both worlds: functions and structure.
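To pick one concrete example of this usefulness: the determinant $\det: GL_n(\mathbb{R}) \to \mathbb{R}^\times$ satisfies $$\det(AB) = \det(A)\det(B),$$ so it is a group homomorphism. It is very far from injective for $n \ge 2$, yet it compresses a complicated group into a very simple one in a way that respects the multiplication, which is exactly what makes it so useful.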

Homomorphisms are more powerful than isomorphisms alone, because isomorphisms don't really let us "change" the structure; they only let us change the names we give to the objects in the structure. It is homomorphisms that truly allow us to switch contexts. If you only allowed continuous bijections from $\mathbb{R}$ to $\mathbb{R}$ with continuous inverses (which is what an isomorphism of metric spaces is), you would get a lot of interesting stuff, but nowhere near what you get when you allow yourself to consider all continuous functions.

This philosophy (concentrate on maps between objects, not on objects themselves) is at the very core of Category Theory (as Pete Clark mentions in the comments), and of modern takes on many subjects.


Added. As Jackson Walters points out, I've downplayed the importance of isomorphisms above. One of the fundamental problems in almost every area of mathematics is: "When are two objects that may look different actually the same?" This is usually stated as a "classification problem," and it usually comes down to asking whether there is an easy way to tell whether two given objects are isomorphic without having to explicitly exhibit an isomorphism, or whether there is some "tractable" complete list of all objects of interest up to isomorphism. An example of the former is the theorem that two vector spaces over the same field are isomorphic if and only if they have the same dimension. Examples of the latter are the Fundamental Theorem of Finitely Generated Abelian Groups (which tells you that every finitely generated abelian group is isomorphic to one with a particularly nice structure) and, to a lesser extent, the classification of finite simple groups (which tells you that there are some families plus a finite list that together contain all finite simple groups). Isomorphisms are important, especially as a way of simplifying the study of objects by helping us look at what things "are" instead of what they look like.
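For instance, the dimension criterion tells us immediately that $\mathbb{R}^4$ and the space $M_2(\mathbb{R})$ of $2\times 2$ real matrices are isomorphic as real vector spaces, since $$\dim_{\mathbb{R}} \mathbb{R}^4 = \dim_{\mathbb{R}} M_2(\mathbb{R}) = 4;$$ no explicit isomorphism needs to be exhibited.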


Each is a correct example of an isomorphism, though they live in different categories. The idea that an isomorphism between two objects means the objects are "the same" up to some scheme for renaming the internal elements is a legitimate, albeit limited, interpretation. Suppose we collect "all of the things that could be said" about elements of a group $G$, for example, and use an isomorphism $f$ as a way of renaming the elements of $G$: if we take any of our collected true statements about $G$ and replace every instance of a specific $g\in G$ with $f(g)$, we end up with a true statement about the target object. The converse also holds: any true statement involving the elements of the target object translates into a true statement about the elements of the source object. On this view, two structures are "the same" when everything that can be said about them (or their constituent elements) is precisely the same (up to naming, of course).
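A small worked instance of this transfer: if $f:G\to H$ is a group isomorphism and $G$ is abelian, then for any $h_1 = f(a)$ and $h_2 = f(b)$ in $H$, $$h_1 h_2 = f(a)f(b) = f(ab) = f(ba) = f(b)f(a) = h_2 h_1,$$ and since $f$ is surjective, every element of $H$ arises this way; so $H$ is abelian. The statement "the group is commutative" carries across the isomorphism.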

It is kind of like changing the names of characters or places in a story. The plot structure, and indeed various literary features of the story beyond the plot itself, will stay the same. In this way we can imagine it is "the same" story. On the flip side, category theory allows for loftier abstractions of structures: objects do not even need to be made up of elements for us to do category theory with them. So our interpretation of isomorphisms here is restricted in scope to the familiar cases where our objects are sets with additional algebraic or relational structure defined on them.

For many intents and purposes, a morphism can be thought of as putting one structure inside another structure. It may be the case that we fold up and compress the source object when we put it inside the target object, and it may also be the case that our placement of the source inside the target does not fill up the entire target object but only a portion of it. This is where injectivity and surjectivity come into play. An isomorphism occurs when our placement is a perfect overlay of source onto target: no folding was necessary, and we fill up all of the room in the target.

To be specific: a failure of injectivity ("folding-up," or "compression") occurs when distinct elements of the source object are sent to a single element in the target object, and a failure of surjectivity occurs when there are elements of the target object with no corresponding element in the source.
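For example (illustrations of mine): the doubling map $n \mapsto 2n$ from $(\mathbb{Z},+)$ to itself is injective but not surjective, so the copy of $\mathbb{Z}$ sits inside itself without filling it; while the parity map $n \mapsto n \bmod 2$ from $\mathbb{Z}$ onto $\mathbb{Z}/2\mathbb{Z}$ is surjective but not injective, folding infinitely many integers onto each of the two elements.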

This is just some of my personal intuition on the topic.