Coprimality with a product of integers

The theorem

If n > 0 and a, b_1, \ldots b_n \in \Z are such that a is separately coprime with each b_i, then a is coprime with their product b_1 b_2 \ldots b_n.

The proof

Proof: by Bézout's identity. Since a is coprime with each b_i, we can write, for each i = 1, \ldots n:

p_i a + q_i b_i = 1

Taking the product of these n equalities, we get:

\prod_{i = 1}^n (p_i a + q_i b_i) = 1

Among the 2^n terms of this product, all but one contain a as a factor. The one that doesn’t have a as a factor is q_1 b_1 q_2 b_2 \ldots q_n b_n. Hence we obtain an expression of the form:

(\ldots) a + (q_1 \ldots q_n) (b_1 \ldots b_n) = 1

Hence a and b_1 b_2 \ldots b_n are coprime. \blacksquare
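
As an illustration with small numbers (chosen only for the example), take a = 5, b_1 = 2 and b_2 = 3. Possible Bézout relations are

1 \cdot 5 + (-2) \cdot 2 = 1 \qquad 2 \cdot 5 + (-3) \cdot 3 = 1

Multiplying them and grouping, as above, all the terms that contain 5 as a factor gives

(-7) \cdot 5 + (-2)(-3) \cdot (2 \cdot 3) = -35 + 36 = 1

so that 5 is indeed coprime with the product 2 \cdot 3 = 6.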

In the case of polynomials

The same result holds with polynomials: if P and Q_1, \ldots Q_n are polynomials over a field \K, such that P is coprime with each Q_i, then P is coprime with their product Q_1 Q_2 \ldots Q_n.

The proof is identical.
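
For instance (an illustrative choice), take P = X, Q_1 = X + 1 and Q_2 = X - 1 in \K[X]. From the Bézout relations

(-1) \cdot X + 1 \cdot (X + 1) = 1 \qquad 1 \cdot X + (-1) \cdot (X - 1) = 1

the same expansion yields

X \cdot X + (-1) \cdot (X + 1)(X - 1) = X^2 - (X^2 - 1) = 1

so that X is coprime with the product (X + 1)(X - 1) = X^2 - 1.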

Degree and Euclidean division of polynomials

For the definition and characterization of the polynomials over a field \K, see Polynomials over a field.

Definition of the degree of a polynomial

The degree of a polynomial P, written \deg P, is an element of \{-\infty\} \cup \N. The symbol -\infty is taken to have the usual properties of order and of addition relative to natural numbers: -\infty < n and (-\infty) + n = n + (-\infty) = -\infty for every n \in \N, and (-\infty) + (-\infty) = -\infty.

The family (X^0, X^1, X^2, \ldots) is a \K-vector space basis of \K[X]. For any P \in \K[X], there exists a unique (a_0, a_1, \ldots) \in \K^\N, with only finitely many nonzero entries, such that P = \sum_n a_n X^n. If all the a_n are zero, that is, if P is the zero element of \K[X], then \deg P is defined as -\infty. If not all the a_n are zero, then since only finitely many are nonzero, there is a greatest value of n such that a_n \neq 0_\K; \deg P is defined as this greatest value.
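
A few immediate examples: \deg(X^2 + X) = 2, the degree of any nonzero constant polynomial is 0, and \deg 0_{\K[X]} = -\infty.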

Elementary properties of the degree

The following are easily checked to hold for any polynomials P and Q (even when one or both are zero):

\deg (P + Q) \le \sup(\deg P, \deg Q)

if \deg P \neq \deg Q, then \deg (P + Q) = \sup(\deg P, \deg Q)

\deg PQ = \deg P + \deg Q
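
For example (polynomials chosen only as an illustration), with P = X^2 + 1 and Q = -X^2 + X we get \deg(P + Q) = \deg(X + 1) = 1 < 2 = \sup(\deg P, \deg Q), so the first inequality can be strict when the leading terms cancel. The conventions on -\infty make the last formula hold even when a factor is zero: \deg(P \cdot 0) = \deg 0 = -\infty = \deg P + (-\infty).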

Euclidean division

The notion of the degree of a polynomial allows us to formulate the following property of Euclidean division:

For all polynomials A and B, with B nonzero, there exists exactly one pair of polynomials (Q, R) such that A = Q B + R with \deg R < \deg B.
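
For instance (a computation given only as an illustration), with A = X^3 + 1 and B = X^2 + 1 we have

X^3 + 1 = X (X^2 + 1) + (1 - X)

so that Q = X and R = 1 - X, with indeed \deg R = 1 < 2 = \deg B.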

For the proof, we fix the nonzero polynomial B and proceed by induction on the degree of A.

More specifically, B being given, we consider the following proposition dependent on n \in \{-\infty\} \cup \N:

\mathcal P(n) iff for all polynomials A such that \deg A \le n, there exist polynomials  Q, R, with \deg R < \deg B, such that A = Q B + R.

\mathcal P(-\infty) is trivially true (for \deg A \le -\infty means that A is zero, and Q = 0, R = 0 satisfies A = Q B + R with \deg R < \deg B).

Let us suppose \mathcal P(n) true for some n \in \{-\infty\} \cup \N, and consider a polynomial A whose degree is less than or equal to the successor of n, that is, to 0 if n = -\infty, or to n + 1 otherwise.

If the degree of A is less than or equal to n, then \mathcal P(n) applies and yields the wanted Q and R.

If the degree of A is less than that of B, then we can directly write A = Q B + R with Q = 0 and R = A.

There remains the case in which \deg A is the successor of n and \deg A \ge \deg B.
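
In that remaining case, one can conclude with the usual reduction of the leading term (a sketch, with notation introduced here): write a for the leading coefficient of A and b for that of B, and set

A' = A - \frac{a}{b} X^{\deg A - \deg B} B

The leading terms cancel, so \deg A' < \deg A, that is, \deg A' \le n. By \mathcal P(n), we can write A' = Q' B + R with \deg R < \deg B, and therefore

A = \left( Q' + \frac{a}{b} X^{\deg A - \deg B} \right) B + R

which establishes \mathcal P at the successor of n, and hence, by induction, the existence of the pair (Q, R) for every A.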

Polynomials over a field

In this context, we consider a commutative field (simply: field) \K. The polynomials will be constructed as a certain associative unital algebra over \K (“\K-aua”), together with a distinguished element called the formal variable.

\K itself, viewed as a vector space over itself with the usual multiplication in \K as the third law, is a \K-aua.

Morphisms between two \K-auas are linear mappings that preserve the third law (the multiplication) and map the unit of the first algebra onto the unit of the second one.

All that will be said of the polynomials over \K will be based on the universal property given below. The polynomials will thus be defined uniquely up to an isomorphism.

Definition

The polynomials over \K are given by any \K-aua A together with a distinguished element X of A such that the following universal property holds:

For any \K-aua A' and any a \in A', there exists exactly one morphism \phi: A \to A' such that \phi(X) = a.
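
For instance, taking for A' the \K-aua \K itself (see above) and for a an element of \K, the universal property provides a unique morphism A \to \K mapping X to a: this is the evaluation at a that is used further below.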

Uniqueness up to isomorphism

Let (A_1, X_1) and (A_2, X_2) both represent the polynomials over \K. The universal property implies the existence of a morphism \phi_1: A_1 \to A_2 such that \phi_1(X_1) = X_2, and of a morphism \phi_2: A_2 \to A_1 such that \phi_2(X_2) = X_1.

We then have (\phi_2 \circ \phi_1)(X_1) = X_1. Hence \phi_2 \circ \phi_1 is a morphism from A_1 to A_1 that maps X_1 to X_1. Now \Id_{A_1} is another such morphism. By virtue of the universal property of (A_1, X_1), it follows that \phi_2 \circ \phi_1 = \Id_{A_1}, that is, \phi_2 \circ \phi_1 is the identity on A_1 in the category of \K-auas.

The same reasoning shows that \phi_1 \circ \phi_2 is the \K-aua category identity on A_2.

Hence \phi_1 is an isomorphism A_1 \to A_2 that maps the formal variable of A_1 to that of A_2.

Characterization

Let A be an associative unital algebra over \K, and X an element of A. Then (A, X) represents the polynomials over \K if and only if the family (X^n)_{n \in \N}, with X^0 = 1_A and X^{n+1} = X X^n, is a basis of the \K-vector space A.

Let us first suppose that the latter condition holds on (A, X), and prove the universal property.

Let A' be a \K-aua and a' an element of A'.

Let us suppose that \phi is a morphism A \to A' such that \phi(X) = a'. Then for all n \in \N, we have \phi(X^n) = a'^n. Since \phi is a \K-aua morphism, it is in particular a linear mapping. Since a linear mapping is determined by the images of a basis, and (X^n)_{n \in \N} is a basis, \phi is necessarily the unique linear mapping A \to A' such that \forall n \in \N, \phi(X^n) = a'^n. Conversely, it is easy to check that this linear mapping does preserve the multiplication in A, and is hence a \K-aua morphism.

Thus the universal property is proven for (A, X).

Let us now suppose the universal property for (A, X), and show that (X^n)_{n \in \N} is a basis of the \K-vector space A.

Let us first show that (X^n)_{n \in \N} is linearly independent.

Let (A', \alpha) be a free \K-vector space on the set \N. This entails that (\alpha(n))_{n \in \N} is a basis of A'. To make A' into an associative unital algebra, we must define a multiplication law on A' that is bilinear and associative, and that has a unit.

A bilinear law can be defined by its values on all ordered pairs of basis elements. Hence we may define the multiplication in A' by:

\forall n_1, n_2 \in \N, \alpha(n_1) \cdot \alpha(n_2) = \alpha(n_1 + n_2)

We then have (\alpha(n_1) \cdot \alpha(n_2)) \cdot \alpha(n_3) = \alpha(n_1 + n_2) \cdot \alpha(n_3) = \alpha(n_1 + n_2 + n_3) = \alpha(n_1) \cdot \alpha(n_2 + n_3) = \alpha(n_1) \cdot (\alpha(n_2) \cdot \alpha(n_3)).

Hence the multiplication is associative on the elements of the basis; it is easy to check that this carries over to arbitrary elements, which makes A' an associative algebra.
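
Concretely (a sketch of that verification), for arbitrary elements written in the basis, bilinearity gives

\left( \sum_i a_i \alpha(i) \right) \cdot \left( \sum_j b_j \alpha(j) \right) = \sum_{i, j} a_i b_j \, \alpha(i + j)

and the associativity of a triple product of such sums then reduces, term by term, to the associativity already checked on the basis elements.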

\alpha(0) is a unit, since \alpha(0) \cdot \alpha(n) = \alpha(n) = \alpha(n) \cdot \alpha(0) for every n (and hence for arbitrary elements, by linearity); hence A' is an associative unital algebra.

Applying the universal property of (A, X), we find that there exists exactly one \K-aua morphism \phi: A \to A' such that \phi(X) = \alpha(1).

We then have \phi(X^0) = \phi(1_A) = 1_{A'} = \alpha(0); \phi(X^1) = \phi(X) = \alpha(1); and, for any n > 1, \phi(X^n) = \phi(X \cdot \ldots \cdot X) = \phi(X) \cdot \ldots \cdot \phi(X) = \alpha(1) \cdot \ldots \cdot \alpha(1) = \alpha(1 + \ldots + 1) = \alpha(n). Hence for all n \in \N, \phi(X^n) = \alpha(n).

Let us consider a linear combination of the family (X^n)_{n \in \N} that vanishes; that is, a finite sequence (a_0, a_1, \ldots a_n) of elements of \K such that \sum_{i = 0}^n a_i X^i = 0_A. Taking the image of this linear combination by \phi, we obtain \sum_{i = 0}^n a_i \alpha(i) = 0_{A'}. Since the \alpha(i) form a basis, all the a_i must be zero.

Thus the family (X^n)_{n \in \N} is linearly independent.

Lastly, we must show that this family generates the \K-vector space A.

Let B be the sub-aua of A generated by \{X\}.

By the universal property of (A, X), applied to the \K-aua B and to its element X, there exists a unique aua morphism \phi: A \to B such that \phi(X) = X.

Let then \phi' be \phi, but with all of A as codomain; in other words, \phi': A \to A, x \mapsto \phi(x). It is clear that \phi' too is an aua morphism, and it is such that \phi'(X) = X.

But there is exactly one aua morphism A \to A that maps X to X; and \Id_A is such.

Hence \phi' = \Id_A.

The image of \phi' is that of \phi, and is a subset of B. But the image of \Id_A is A itself. Hence B = A.

Thus the sub-aua of A generated by \{X\} is A itself. This sub-aua is the set of linear combinations of all the powers of X; hence the sub-vector space generated by X^0, X^1, X^2, \ldots is A.

Thus (X^n)_{n \in \N} is a basis of the \K-vector space A.

The composition of polynomials

Definition

Let \K be a commutative field. The polynomials \K[X] form a commutative unital algebra over \K containing an element X such that the following universal property holds:

For any associative unital algebra A over \K (commutative or otherwise) and any element a \in A, there exists exactly one unital algebra morphism \phi: \K[X] \to A such that \phi(X) = a.

If P and Q are elements of \K[X], let \gamma_Q: \K[X] \to \K[X] be the unique morphism such that \gamma_Q(X) = Q. Then the composition P(Q) of P and Q is defined by P(Q) = \gamma_Q(P).
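
For instance (an illustrative computation), with P = X^2 + 1 and Q = X + 1, the fact that \gamma_Q is a morphism gives

P(Q) = \gamma_Q(X^2 + 1) = \gamma_Q(X)^2 + 1 = (X + 1)^2 + 1 = X^2 + 2X + 2

which is the expected result of substituting Q for X in P.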

Effect of composition of polynomials on the corresponding functions

In terms of polynomial functions, the composition of polynomials corresponds to the composition (via “\circ”) of the associated functions.

Let us first define the function that is associated with a polynomial.

The set \K^\K of functions \K \to \K is made into a commutative and unital associative algebra by the usual pointwise operations: addition ((f + g)(x) = f(x) + g(x)), multiplication by a scalar ((af)(x) = a f(x)) and internal multiplication ((fg)(x) = f(x) g(x)).

Hence, following the universal property of the algebra of polynomials, there exists a unique morphism \phi: \K[X] \to \K^\K such that \phi(X) = \Id_\K.

By definition, the function p associated with a polynomial P is simply p = \phi(P).

We wish to show that if P and Q are polynomials and p and q the respective associated functions, the function associated with P(Q) is p \circ q.

Let Q be a polynomial and q = \phi(Q). Let us consider the mapping \alpha: \K[X] \to \K^\K, P \mapsto \phi(P) \circ q.

It is easy to check that \alpha is a morphism.
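
For instance, multiplicativity can be checked as follows: for any P_1, P_2 \in \K[X],

\alpha(P_1 P_2) = \phi(P_1 P_2) \circ q = (\phi(P_1) \phi(P_2)) \circ q = (\phi(P_1) \circ q)(\phi(P_2) \circ q) = \alpha(P_1) \alpha(P_2)

since, pointwise, ((\phi(P_1) \phi(P_2)) \circ q)(x) = \phi(P_1)(q(x)) \, \phi(P_2)(q(x)) for every x \in \K; linearity and the preservation of the unit are checked in the same way.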

Furthermore, \alpha(X) = \phi(X) \circ q = \Id_\K \circ q = q.

On the other hand, \phi \circ \gamma_Q is also a morphism from \K[X] to \K^\K, and (\phi \circ \gamma_Q)(X) = \phi(\gamma_Q(X)) = \phi(Q) = q.

Thus both \alpha and \phi \circ \gamma_Q are morphisms \K[X] \to \K^\K, and they agree at X. The universal property of polynomials entails that they are equal.

Hence for any P \in \K[X], (\phi \circ \gamma_Q)(P) = \alpha(P).

That is, \phi(P(Q)) = \phi(P) \circ \phi(Q); in other words, the function associated with P(Q) is p \circ q, as announced.

Effect on the evaluation of a polynomial

If P \in \K[X] and a \in \K, the evaluation of P at a is, by definition, the value at P of the unique morphism \epsilon_a: \K[X] \to \K such that \epsilon_a(X) = a.

Given polynomials P and Q and a \in \K, we wish to evaluate P(Q) at a, and, specifically, show that \epsilon_a(P(Q)) = \epsilon_{\epsilon_a(Q)}(P).

Since P(Q) = \gamma_Q(P), we have \epsilon_a(P(Q)) = (\epsilon_a \circ \gamma_Q)(P).

Both \epsilon_a \circ \gamma_Q and \epsilon_{\epsilon_a(Q)} are morphisms \K[X] \to \K. Both map X to \epsilon_a(Q). Hence, by the universal property, they are equal:

\epsilon_a \circ \gamma_Q = \epsilon_{\epsilon_a(Q)}

which entails (\epsilon_a \circ \gamma_Q)(P) = \epsilon_{\epsilon_a(Q)}(P)

hence: \epsilon_a(P(Q)) =  \epsilon_{\epsilon_a(Q)}(P), as announced.

In particular: if P \in \K[X] and a, b \in \K, then \epsilon_a(P(X + b)) = \epsilon_{\epsilon_a(X + b)}(P) = \epsilon_{\epsilon_a(X) + \epsilon_a(b)}(P) = \epsilon_{a + b}(P).
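
As a numerical illustration (values chosen here, with \K = \Q for concreteness): for P = X^2, Q = X + 1 and a = 2, we get \epsilon_2(P(Q)) = \epsilon_2((X + 1)^2) = (2 + 1)^2 = 9, while \epsilon_{\epsilon_2(Q)}(P) = \epsilon_3(X^2) = 9, in agreement with the formula.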

Associativity

If P, Q and R are polynomials in \K[X], we wish to show that:

P(Q(R)) = (P(Q))(R)

Using the definition of composition, this translates into:

\gamma_{\gamma_R(Q)}(P) = \gamma_R(\gamma_Q(P))

That is:

\gamma_{\gamma_R(Q)}(P) = (\gamma_R \circ \gamma_Q)(P)

We need to show that for any Q and R in \K[X], the two \K[X] \to \K[X] mappings \gamma_{\gamma_R(Q)} and \gamma_R \circ \gamma_Q are identical.

Both are endomorphisms of \K[X].

We have:

  • \gamma_{\gamma_R(Q)}(X) = \gamma_R(Q) by definition of \gamma_{\gamma_R(Q)}.
  • (\gamma_R \circ \gamma_Q)(X) = \gamma_R(\gamma_Q(X)) = \gamma_R(Q), by definition of \gamma_Q.

Hence both morphisms agree at X. By virtue of the universal property of polynomials, they are equal.

Hence for any P \in \K[X], we indeed have \gamma_{\gamma_R(Q)}(P) = (\gamma_R \circ \gamma_Q)(P), that is, P(Q(R)) = (P(Q))(R).
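
For instance, with P = X^2, Q = X + 1 and R = X - 1: on the one hand Q(R) = X, so P(Q(R)) = X^2; on the other hand P(Q) = (X + 1)^2, so (P(Q))(R) = ((X - 1) + 1)^2 = X^2, as expected.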

Effect of composition on the evaluation of polynomials

todo

Derivation of the composition of polynomials

todo