Polynomials over a field

Throughout, we fix a commutative field (simply: field) \K. The polynomials will be constructed as a certain associative unital algebra over \K (“\K-aua”), together with a distinguished element called the formal variable.

\K itself, viewed as a vector space over itself with the usual multiplication in \K as the third law, is a \K-aua.

Morphisms between two \K-auas are linear mappings that preserve the third law and map the unit of the first algebra onto the unit of the second one.

All that will be said of the polynomials over \K will be based on the universal property given below. The polynomials will thus be defined uniquely up to an isomorphism.


The polynomials over \K are any \K-aua A together with a distinguished element X of A such that the following universal property holds:

For any \K-aua A' and any a' \in A', there exists exactly one morphism \phi: A \to A' such that \phi(X) = a'.
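For intuition, once a basis of powers of X is available (as established below), the unique morphism \phi is concretely "substitute a' for X": it sends \sum a_i X^i to \sum a_i a'^i. A minimal sketch in Python, assuming a polynomial is stored as its list of coordinates in the basis (X^n), lowest degree first; the function name eval_at is our own:

```python
def eval_at(coeffs, a, one):
    """Apply the unique algebra morphism sending X to `a`.

    `coeffs[i]` is the coefficient of X^i.  `one` is the unit of the
    target algebra.  Horner's scheme computes sum(coeffs[i] * a**i)
    using only the operations of the target algebra.
    """
    result = 0 * one              # zero of the target algebra
    for c in reversed(coeffs):
        result = result * a + c * one
    return result

# phi(X) = a', and phi is linear and multiplicative:
p = [1, 2, 3]                     # 1 + 2X + 3X^2
print(eval_at(p, 2, 1))           # 1 + 2*2 + 3*4 = 17
print(eval_at([0, 1], 5, 1))      # phi(X) = 5
```

Taking the target algebra to be \K itself recovers ordinary evaluation of a polynomial at a point; taking it to be a matrix algebra recovers evaluation of a polynomial at a matrix.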

Uniqueness up to an isomorphism

Let (A_1, X_1) and (A_2, X_2) both be polynomials over \K. The universal property implies the existence of a morphism \phi_1: A_1 \to A_2 such that \phi_1(X_1) = X_2, and of a morphism \phi_2: A_2 \to A_1 such that \phi_2(X_2) = X_1.

We then have (\phi_2 \circ \phi_1)(X_1) = \phi_2(X_2) = X_1. Hence \phi_2 \circ \phi_1 is a morphism A_1 \to A_1 that maps X_1 to X_1. Now \Id_{A_1} is another such morphism. By virtue of the universal property of (A_1, X_1), there is exactly one such morphism; it follows that \phi_2 \circ \phi_1 = \Id_{A_1}, that is, the \K-aua category identity on A_1.

The same reasoning shows that \phi_1 \circ \phi_2 is the \K-aua category identity on A_2.

Hence \phi_1 is an isomorphism A_1 \to A_2 that maps the formal variable of A_1 to that of A_2.


Let A be an associative unital algebra over \K, and X an element of A. Then (A, X) represents the polynomials over \K if and only if the family (X^n)_{n \in \N}, with X^0 = 1_A and X^{n+1} = X X^n, is a family-base of the \K-vector space A.
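The condition on the powers of X is genuinely restrictive. For instance, in the \K-aua of 2×2 matrices, the powers of any matrix M satisfy a linear relation (Cayley–Hamilton: M^2 = \mathrm{tr}(M)\,M - \det(M)\,I), so no pair (M_2(\K), M) can represent the polynomials. A quick numerical check of this relation, with a hypothetical helper mat_mul:

```python
def mat_mul(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

M = [[1, 2], [3, 4]]
I = [[1, 0], [0, 1]]
tr = M[0][0] + M[1][1]                        # trace = 5
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]   # determinant = -2
M2 = mat_mul(M, M)                            # [[7, 10], [15, 22]]

# Cayley-Hamilton: M^2 - tr*M + det*I = 0, a vanishing linear
# combination of X^0, X^1, X^2 with not all coefficients zero --
# so the powers of M are not linearly independent.
relation = [[M2[i][j] - tr * M[i][j] + det * I[i][j] for j in range(2)]
            for i in range(2)]
print(relation)   # [[0, 0], [0, 0]]
```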

Let us first suppose that the latter condition holds on (A, X), and prove the universal property.

Let A' be a \K-aua and a' an element of A'.

Let us suppose that \phi is a morphism A \to A' such that \phi(X) = a'. Then for all n \in \N, we have \phi(X^n) = a'^n. Since \phi is a \K-aua morphism, it is in particular a linear mapping. A linear mapping is determined by the images of a basis, and (X^n)_{n \in \N} is a basis; hence \phi is necessarily the unique linear mapping A \to A' such that \forall n \in \N, \phi(X^n) = a'^n. Conversely, it is easy to check that this linear mapping does preserve the multiplication in A and maps 1_A to 1_{A'}, and is hence a \K-aua morphism.

Thus the universal property is proven for (A, X).

Let us now suppose the universal property for (A, X), and show that (X^n)_{n \in \N} is a family-base of the \K-vector space A.

Let us first show that (X^n)_{n \in \N} is linearly independent.

Let (A', \alpha) be the free \K-vector space on the set \N. This entails that (\alpha(n))_{n \in \N} is a family-base of A'. To make A' into an associative unital algebra, we must define a multiplication law that is bilinear, and also associative.

A bilinear law can be defined by the images of all the ordered pairs of base elements. Hence we may define the multiplication in A' by:

\forall n_1, n_2 \in \N, \alpha(n_1) \cdot \alpha(n_2) = \alpha(n_1 + n_2)

We then have (\alpha(n_1) \cdot \alpha(n_2)) \cdot \alpha(n_3) = \alpha(n_1 + n_2) \cdot \alpha(n_3) = \alpha(n_1 + n_2 + n_3) = \alpha(n_1) \cdot \alpha(n_2 + n_3) = \alpha(n_1) \cdot (\alpha(n_2) \cdot \alpha(n_3)).

Hence the multiplication is associative with regard to the elements of the base; it is easy to check, by bilinearity, that this carries over to arbitrary elements, which makes A' an associative algebra.

\alpha(0) is a unit, since \alpha(0) \cdot \alpha(n) = \alpha(n) = \alpha(n) \cdot \alpha(0) for all n; hence A' is an associative unital algebra.
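Concretely, A' is the space of finitely supported \K-sequences indexed by \N, with \alpha(n) the sequence that is 1 at n and 0 elsewhere; the multiplication just defined extends bilinearly to the familiar convolution of coefficient sequences. A sketch of this construction, assuming elements of A' are stored as dicts mapping n to a nonzero coefficient:

```python
def alpha(n):
    """Basis vector alpha(n): the sequence supported at n."""
    return {n: 1}

def mul(p, q):
    """Bilinear extension of alpha(n1) * alpha(n2) = alpha(n1 + n2).

    On arbitrary finitely supported sequences this is exactly the
    convolution product of coefficient sequences.
    """
    out = {}
    for n1, c1 in p.items():
        for n2, c2 in q.items():
            out[n1 + n2] = out.get(n1 + n2, 0) + c1 * c2
    return {n: c for n, c in out.items() if c != 0}

# Associativity and the unit, on basis vectors:
print(mul(mul(alpha(1), alpha(2)), alpha(3)))   # {6: 1}
print(mul(alpha(0), alpha(5)))                  # {5: 1}
```

This A' is of course the usual coefficient model of the polynomials themselves; here it serves only as an auxiliary algebra for the independence argument.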

Applying the universal property of (A, X), we find that there exists exactly one \K-aua morphism \phi: A \to A' such that \phi(X) = \alpha(1).

We then have \phi(X^0) = \phi(1_A) = 1_{A'} = \alpha(0); \phi(X^1) = \phi(X) = \alpha(1); and, for any n > 1, \phi(X^n) = \phi(X \cdot \ldots \cdot X) = \phi(X) \cdot \ldots \cdot \phi(X) = \alpha(1) \cdot \ldots \cdot \alpha(1) = \alpha(1 + \ldots + 1) = \alpha(n). Hence for all n \in \N, \phi(X^n) = \alpha(n).

Let us consider a linear combination of the family (X^n)_{n \in \N} that vanishes; that is, a finite sequence of elements of \K, (a_0, a_1, \ldots, a_n), such that \sum_{i = 0}^n a_i X^i = 0_A. Taking the image of this linear combination by \phi, we obtain \sum_{i = 0}^n a_i \alpha(i) = 0_{A'}. Since the \alpha(i) form a basis, all the a_i must be zero.

Thus the family (X^n)_{n \in \N} is linearly independent.

Lastly, we must show that this family generates the \K-vector space A.

Let B be the sub-aua of A generated by \{X\}.

B is itself a \K-aua and X \in B, so by the universal property of (A, X) there exists a unique \K-aua morphism \phi: A \to B such that \phi(X) = X.

Now let \phi' be \phi composed with the inclusion of B into A; in other words, \phi': A \to A, x \mapsto \phi(x). It is clear that \phi' too is a \K-aua morphism, and it is such that \phi'(X) = X.

But there is exactly one \K-aua morphism A \to A that maps X to X, and \Id_A is such a morphism.

Hence \phi' = \Id_A.

The image of \phi' is that of \phi, and is contained in B. But the image of \Id_A is A itself. Hence B = A.

Thus the sub-aua of A generated by \{X\} is A itself. Since a product of powers of X is again a power of X, this sub-aua is the set of linear combinations of all the powers of X; hence the sub-vector space generated by X^0, X^1, X^2, \ldots is A.

Thus (X^n)_{n \in \N} is a family-base of the \K-vector space A.