Today's idea came from an anonymous user at math.stackexchange.com. We'll see, using an interesting group action, why the intersection of two subgroups of finite index again has finite index.
Let $(G, +)$ be a group, and $H$ and $K$ subgroups. Let $G/H$ and $G/K$ be the sets of left cosets of $H$ and $K$, respectively. Consider the diagonal action of $G$ on $G/H \times G/K$.
(In other words, $g$ acting on $(g' + H, g'' + K)$ is $((g + g') + H, (g + g'') + K)$. One can quickly check this is a well-defined group action.)
The stabilizer of the pair $(H, K)$ is the set of all $g \in G$ such that $g + H = H$ and $g + K = K$; in other words, the stabilizer of $(H, K)$ is $H \cap K$.
Now suppose $H$ and $K$ are subgroups of finite index. Then the orbit of $(H, K)$ has to be finite, since $G/H \times G/K$ is finite. By the orbit-stabilizer theorem, it follows that $[G : H \cap K]$ is finite.
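As a quick sanity check (not in the original post), here is a small Python computation for the concrete case $G = \mathbb Z/24\mathbb Z$, $H = \langle 2 \rangle$, $K = \langle 3 \rangle$; all names in the snippet are mine.

```python
# Diagonal action of G = Z/24 on G/H x G/K, with H = <2> (index 2) and K = <3> (index 3).
n = 24
G = range(n)
H = frozenset(h for h in G if h % 2 == 0)
K = frozenset(k for k in G if k % 3 == 0)

def coset(g, S):
    return frozenset((g + s) % n for s in S)

orbit = {(coset(g, H), coset(g, K)) for g in G}
stabilizer = [g for g in G if coset(g, H) == H and coset(g, K) == K]

print(len(orbit))                          # 6, the index [G : H ∩ K]
print(sorted(stabilizer))                  # [0, 6, 12, 18], i.e. H ∩ K
print(len(orbit) * len(stabilizer) == n)   # orbit-stabilizer: True
```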
Friday, October 17, 2014
Thursday, October 9, 2014
Sum of powers in finite fields
Today's proof is from A Course in Arithmetic by Serre. Here we'll let $0^0 = 1$.
Let $u \geq 0$ be an integer, and let $K$ be a finite field of order $q = p^l$ (and hence of characteristic $p$). Then the sum
$$S^u = \displaystyle\sum_{x \in K} x^u$$ is equal to $-1$ if $u \geq 1$ and $u$ is divisible by $q - 1$; otherwise, the sum equals zero.
If $u = 0$, then every term equals $1$, so we immediately get $S^u = p^l \cdot 1 = 0$, since $K$ has characteristic $p$.
If $u \geq 1$ and $u$ is divisible by $q - 1$, we have $0^u = 0$ and $x^u = 1$ for all $x \neq 0$ (since $K^\times$ has order $q - 1$). Therefore $S^u = (q - 1) \cdot 1 = -1$.
If $u \geq 1$ and $u$ is not divisible by $q - 1$, then there exists $y \in K^\times$ such that $y^u \neq 1$. (We can let $y$ be a generator of $K^\times$.) Since multiplication by $y$ permutes $K^\times$,
$$S^u = \displaystyle\sum_{x \in K^\times} x^u = \displaystyle\sum_{x \in K^\times} (yx)^u = y^u \displaystyle\sum_{x \in K^\times} x^u = y^u S^u,$$ so $(1 - y^u)S^u = 0$ and hence $S^u = 0$.
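The post treats general $q = p^l$; below is a quick numerical sanity check (mine, not Serre's) for the prime-field case $l = 1$, $p = 7$, where $q - 1 = 6$.

```python
# Sum of x^u over GF(7), using Python's convention pow(0, 0, p) == 1 (matching 0^0 = 1).
p = 7

def S(u):
    return sum(pow(x, u, p) for x in range(p)) % p

for u in range(13):
    print(u, S(u))   # S(u) == 6 (i.e. -1 mod 7) exactly when u >= 1 and 6 | u, else 0
```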
Wednesday, October 8, 2014
Fundamental theorem of algebra
Today we'll prove the fundamental theorem of algebra: every nonconstant polynomial (in one variable) with complex coefficients has a root in $\mathbb C$.
The main result we'll be using today is Liouville's theorem, stated below.
Result 1. A bounded entire function is constant.
We won't prove Liouville's theorem, but we will prove the following results.
Result 2. The function $\mathbb C \rightarrow \mathbb C$ given by a polynomial in $z$ is entire. (A function is said to be entire if it has domain $\mathbb C$ and is holomorphic in all of $\mathbb C$.)
Result 3. If $f : U \rightarrow\mathbb C$ is holomorphic on an open subset $U$ of $\mathbb C$ and has no zeroes, then the function $g$ such that $g(z) = 1/f(z)$ is holomorphic on $U$.
Result 4. If $p$ is a nonconstant polynomial, then $|p(z)| \rightarrow \infty$ as $|z| \rightarrow \infty$.
Before we prove these, though, we will present our proof of the fundamental theorem of algebra, as given on page 87 in the complex analysis book by Greene and Krantz.
Suppose a nonconstant polynomial $p$ did not have a root in $\mathbb C$. Then the function $g$ where $g(z) = 1/p(z)$ is entire (by results 2 and 3). By result 4, we know $|p(z)| \rightarrow \infty$ as $|z| \rightarrow \infty$, meaning $|g(z)| \rightarrow 0$ as $|z| \rightarrow \infty$. In particular, $g$ is bounded outside some closed disk, and it is bounded on that (compact) disk by continuity, so $g$ is bounded and entire. By Liouville's theorem, $g$ is constant, hence so is $p$, a contradiction.
With our cute proof out of the way, we'll now focus on some of our messier results.
Proof of result 2: What we have to prove here depends on our initial definition of holomorphic. We'll assume the result (or definition) that says it suffices to prove that the polynomial function is complex differentiable. By the linearity of differentiation, it suffices to show that each individual $z \mapsto z^k$ for integers $k \geq 0$ is complex differentiable.
Now, the identity map is continuous, so $\lim_{z \to z_0} z = z_0$. Using this and the properties of limits gives us
$$\lim_{z \to z_0} \left( z^{k - 1} + z^{k - 2} z_0 + \cdots + z_0^{k - 1} \right) = kz_0^{k - 1}.$$Then we see that
$$\displaystyle\lim_{z \rightarrow z_0} \frac{z^k - z_0^k}{z - z_0}$$exists for all $k \geq 1$ (the case $k = 0$ is trivial, since the numerator is identically zero), because
$$\frac{z^k - z_0^k}{z - z_0} = z^{k - 1} + z^{k - 2} z_0 + \cdots + z_0^{k - 1}.$$
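Here is a small numeric illustration (not in the post) of the difference quotient $(z^k - z_0^k)/(z - z_0)$ approaching $k z_0^{k-1}$; the specific values of $k$ and $z_0$ are arbitrary choices of mine.

```python
# Difference quotient for z^k at z0, compared with the claimed derivative k*z0^(k-1).
k, z0 = 5, 2 + 1j
for h in (1e-3, 1e-6):
    z = z0 + h * (1 + 1j)
    quotient = (z**k - z0**k) / (z - z0)
    print(abs(quotient - k * z0**(k - 1)))   # shrinks toward 0 as h shrinks
```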
Proof of result 3: Let $z_0 \in U$. Then
\[
\begin{align*}
\displaystyle\lim_{z \rightarrow z_0} \frac{ \frac1{f(z)} - \frac1{f(z_0)} }{z - z_0} &= \lim_{z \rightarrow z_0} \frac{f(z_0) - f(z)}{z - z_0} \cdot \frac 1{f(z)f(z_0)}\\
&= \frac{-1}{f(z_0)^2} \cdot f'(z_0).
\end{align*}\](In the last step we used that $f$, being holomorphic, is continuous at $z_0$, and that $f(z_0) \neq 0$.) Since $g$ has a complex derivative at every $z_0 \in U$, $g$ is holomorphic.
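A similar spot-check (again mine, with an arbitrary sample function) confirms that the derivative of $1/f$ at $z_0$ is $-f'(z_0)/f(z_0)^2$.

```python
# Difference quotient of g = 1/f at z0, for the sample function f(z) = z^2 + 1.
f  = lambda z: z**2 + 1
fp = lambda z: 2 * z            # f', computed by hand for this sample f
z0 = 1 + 2j                     # f(z0) = -2 + 4j, which is nonzero
for h in (1e-4, 1e-7):
    z = z0 + h
    quotient = (1 / f(z) - 1 / f(z0)) / (z - z0)
    print(abs(quotient - (-fp(z0) / f(z0) ** 2)))   # tends to 0 with h
```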
Proof of result 4: Write $p(z) = a_n z^n + \cdots + a_1 z + a_0$ with $n \geq 1$ and $a_n \neq 0$. For $z \neq 0$, we can factor $p(z)$ as
$$z^n \left( a_n + \frac {a_{n - 1}}z + \cdots + \frac{a_0}{z^n} \right).$$ By repeatedly applying the triangle inequality,
\[
\begin{align*}
\left|a_n + \frac{a_{n - 1}}{z} + \cdots + \frac{a_0}{z^n}\right|
&\geq |a_n| - \left|\frac{a_{n - 1}}z\right| - \cdots - \left|\frac{a_0}{z^n}\right|\\
&= |a_n| - \left( \frac{|a_{n - 1}|}{|z|}+ \cdots + \frac{|a_0|}{|z|^n} \right).
\end{align*}\]As $|z| \to \infty$, the right side of the above inequality approaches $|a_n|$. So if we pick any $\epsilon$ with $0 < \epsilon < |a_n|$, there exists a $K$ such that $|z| > K$ implies the right side of the above inequality is greater than $|a_n| - \epsilon$.
So, when $|z| > K$,
$$|p(z)| \geq |z|^n \left( |a_n| - \epsilon \right),$$which becomes arbitrarily large as $|z| \to \infty$. This concludes our proofs.
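The bound above can be spot-checked numerically. The snippet below (my own illustration, with an arbitrary sample polynomial and an arbitrary choice of $\epsilon$) evaluates $|p(z)|$ at a few points with $|z| = 1000$, assumed to be beyond the $K$ of the proof for this polynomial.

```python
# Checking |p(z)| >= |z|^n (|a_n| - eps) at a few random points of large modulus.
import cmath
import random

coeffs = [2, -3, 0, 1, 5]          # p(z) = 2z^4 - 3z^3 + z + 5 (sample choice)
n, a_n = len(coeffs) - 1, coeffs[0]
eps = 1.0                          # any eps with 0 < eps < |a_n|

def p(z):
    return sum(c * z ** (n - i) for i, c in enumerate(coeffs))

for _ in range(5):
    z = cmath.rect(1e3, random.uniform(0, 2 * cmath.pi))   # |z| = 1000
    print(abs(p(z)) >= abs(z) ** n * (abs(a_n) - eps))     # True each time
```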
Saturday, October 4, 2014
Automorphisms of reals fixing rationals
Our goal today is to show that $\text{Aut}(\mathbb R / \mathbb Q) = 1$. We follow the outline suggested in the exercises of section 14.1 of Dummit and Foote.
Let $\sigma$ be an automorphism of the reals fixing the rationals. Let $k \in \mathbb R$, and observe that
$$\sigma(k^2) = \sigma(k)\sigma(k) = \sigma(k)^2.$$
Any positive real can be written as $k^2$ for some nonzero $k \in \mathbb R$, so applying $\sigma$ gives $\sigma(k)^2$ with $\sigma(k) \neq 0$ (as $\sigma$ is injective), which is again positive. In other words, $\sigma$ sends positive reals to positive reals.
Next, suppose $a, b \in \mathbb R$ with $a < b$. Then $b - a > 0$, so $\sigma(b) - \sigma(a) = \sigma(b - a) > 0$ by the previous paragraph; that is, $\sigma(b) > \sigma(a)$. Therefore $\sigma$ must be increasing.
Now, let $m$ be a positive integer. Since $\sigma$ is increasing and fixes the rationals $\pm\frac 1m$, we get
$$-\frac 1m < b - a < \frac 1m \implies -\frac 1 m < \sigma(b) - \sigma(a) < \frac 1 m,$$
so $\sigma$ is continuous. Every real number is the limit of a sequence of rationals, so since $\sigma$ is continuous and is the identity on the rationals, it must also be the identity on every real number.
Thursday, October 2, 2014
The multiplicative group of a finite field is cyclic
Today our goal is to show (following Dummit and Foote's approach) that any finite subgroup $H$ of the multiplicative group of a field $F$ is cyclic. The title of this post follows as a corollary.
We will use the fundamental theorem of finitely generated abelian groups. This tells us that our subgroup is isomorphic to
$$\mathbb Z/n_1 \mathbb Z \times \cdots \times \mathbb Z / n_k \mathbb Z$$
for some positive integer $k$ and integers $n_k \mid n_{k - 1} \mid \cdots \mid n_1$, each $n_i \geq 2$. (If $H$ is trivial it is certainly cyclic, so we assume it is not.)
Recall that if $G$ is a cyclic group, say $\mathbb Z / n \mathbb Z$, and $d \mid |G|$, then $G$ contains precisely $d$ elements of order dividing $d$, namely $0, n/d, 2n/d, ..., (d - 1)n / d$.
Now, if $k \geq 2$, then since $n_k \mid n_{k - 1}$, each of the last two direct factors contains $n_k$ elements of order dividing $n_k$, so the direct product contains more than $n_k$ elements of order dividing $n_k$. But then the polynomial $x^{n_k} - 1$ would have more than $n_k$ roots in the field $F$, contradicting the fact that a polynomial of degree $n_k$ over a field has at most $n_k$ roots. Therefore $k = 1$, so $H$ is cyclic.
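As a brute-force illustration (not in Dummit and Foote), the snippet below exhibits generators of $(\mathbb Z/p\mathbb Z)^\times$ for a few small primes, confirming the corollary in those cases.

```python
# For a few small primes p, list a few generators of the multiplicative group mod p.
def order(a, p):
    k, x = 1, a % p
    while x != 1:
        x = (x * a) % p
        k += 1
    return k

for p in (5, 7, 11, 13, 17):
    generators = [a for a in range(2, p) if order(a, p) == p - 1]
    print(p, generators[:3])   # each group (Z/pZ)^x has a generator, hence is cyclic
```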
Wednesday, October 1, 2014
A special case of Dirichlet's Theorem
Our goal here is to prove that, for any fixed positive integer $m$, there are infinitely many primes $p$ with $p \equiv 1 \pmod m$. We will follow (in a different order) the approach outlined in the exercises of section 13.6 of Dummit and Foote.
We first state two results.
Result 1. Let $P(x) \in \mathbb Z[x]$ be a monic polynomial of degree at least one. Then there are infinitely many distinct prime divisors of the integers $P(1), P(2), P(3), ...$.
Result 2. Let $a \in \mathbb Z$, and let $\Phi_m$ denote the $m$-th cyclotomic polynomial. If $p$ is an odd prime dividing $\Phi_m(a)$, then either $p$ divides $m$ or $p \equiv 1 \pmod m$.
Let's suppose these are true, and consider $\Phi_m(1), \Phi_m(2), \ldots$. Together, these have infinitely many prime divisors by result 1. By result 2, each of these prime divisors must equal $2$, divide $m$, or be congruent to $1$ mod $m$. Since only finitely many primes can equal $2$ or divide $m$, infinitely many primes are congruent to $1$ mod $m$.
We proved the desired result assuming result 1 and result 2, so now it just suffices to prove these.
Proof of result 1:
Suppose $p_1, ..., p_k$ are the only primes dividing $P(1), P(2), \ldots$. Let $N$ be such that $P(N) \neq 0$, and let $a = P(N)$. Now let $Q(x) = a^{-1} P(N + ap_1 \cdots p_k x)$. Expanding $P(N + ap_1 \cdots p_k x)$ in powers of $x$ with the binomial theorem, the constant term is $P(N) = a$ and every other coefficient is divisible by $a p_1 \cdots p_k$; dividing by $a$, we see that $Q(x)$ has integer coefficients, constant term $1$, and all other coefficients divisible by $p_1 \cdots p_k$. In particular, $Q(n) \equiv 1 \pmod {p_1 \cdots p_k}$ for all positive integers $n$. Since each of $Q(x) = 0$, $Q(x) = 1$, and $Q(x) = -1$ has only finitely many solutions, there is a positive integer $M$ with $Q(M) \neq 0, \pm 1$; and since $Q(M) \equiv 1 \pmod{p_1 \cdots p_k}$, every prime factor of $Q(M)$ is different from $p_1, ..., p_k$.
It now follows that $P(N + a p_1...p_kM) = aQ(M)$ has a prime factor different from $p_1, ..., p_k$, completing the proof of result 1.
Proof of result 2:
Let $p$ be an odd prime not dividing $m$, and suppose $\Phi_m(a) \equiv 0 \pmod p$. Since $\Phi_m(x)$ divides $x^m - 1$, this gives $a^m \equiv 1 \pmod p$, so $a$ is invertible mod $p$. We claim that the order of $a$ mod $p$ is $m$.
Suppose the order is $d$; since $a^m \equiv 1 \pmod p$, we have $d \mid m$. Then $\Phi_d(a) \equiv 0 \pmod p$: indeed, $a$ is a root of $x^d - 1 = \prod_{c \mid d} \Phi_c(x)$ mod $p$, and it cannot be a root of $\Phi_c$ for $c < d$, since $\Phi_c(x)$ divides $x^c - 1$ and $a^c \not\equiv 1 \pmod p$. Now, if $d \neq m$, we have
$$x^m - 1 = \prod_{c \mid m} \Phi_c(x) = \Phi_m(x) \Phi_d(x)h(x)$$
for some polynomial $h(x)$, so $x^m - 1$ has $a$ as a multiple root mod $p$. This is a contradiction, because $x^m - 1$ is separable mod $p$: its derivative $mx^{m - 1}$ is relatively prime to it, as $p \nmid m$. Therefore $d = m$.
Now that we know that the order of $a$ mod $p$ is $m$, we also know $m \mid p - 1$, so $p \equiv 1 \pmod m$, completing the proof of result 2.
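As a brute-force check of result 2 (my own illustration, not part of the exercises), the snippet below evaluates $\Phi_m(a)$ via the standard Möbius product formula $\Phi_m(a) = \prod_{d \mid m} (a^d - 1)^{\mu(m/d)}$ and verifies the conclusion for $m = 5$ and small $a$, collecting primes congruent to $1$ mod $5$ along the way.

```python
# Verify result 2 for m = 5: every odd prime factor of Phi_5(a) divides 5 or is 1 mod 5.
from fractions import Fraction

def mobius(n):
    result, d = 1, 2
    while d * d <= n:
        if n % d == 0:
            n //= d
            if n % d == 0:
                return 0           # n has a square factor
            result = -result
        d += 1
    return -result if n > 1 else result

def phi_m(m, a):
    # Value of the m-th cyclotomic polynomial at an integer a >= 2, via the Moebius product.
    value = Fraction(1)
    for d in range(1, m + 1):
        if m % d == 0:
            value *= Fraction(a ** d - 1) ** mobius(m // d)
    return int(value)

def prime_factors(n):
    factors, d = set(), 2
    while d * d <= n:
        while n % d == 0:
            factors.add(d)
            n //= d
        d += 1
    if n > 1:
        factors.add(n)
    return factors

m, found = 5, set()
for a in range(2, 60):
    for p in prime_factors(phi_m(m, a)):
        assert p == 2 or m % p == 0 or p % m == 1   # result 2 holds in every case
        if p % m == 1:
            found.add(p)
print(sorted(found)[:8])   # several distinct primes congruent to 1 mod 5
```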