02-11-2013, 06:20 PM
Post: #1
Artin Emil: Galois Theory
1 Linear Algebra

1.1 Definition A field is a set $F$ with operations $+,\;\cdot\;$ (called addition and multiplication) on its
$\quad\quad$ elements such that $(F,+)$ forms an abelian group, $(F\setminus \{0\},\; \cdot)\;$ forms a group (not
$\quad\quad$ necessarily commutative), where $0,\; 1$ are the respective identity elements, and the two
$\quad\quad$ operations are connected by the distributive laws. It follows that
$\quad\quad$ $0x = 0\cdot x = x\cdot 0 = x0 = 0, \; 1x = x1 = x \quad (\forall x\in F)$.
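
Example $\;$ For instance, the rational numbers $\mathbb{Q}$ and the real numbers $\mathbb{R}$ with the usual operations are
$\quad\quad$ (commutative) fields. The quaternions $a + bi + cj + dk\;$ ($a,b,c,d\in\mathbb{R}$, with $i^2=j^2=k^2=-1$ and
$\quad\quad$ $ij = k = -ji$) form a field in the above sense whose multiplication is not commutative.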

1.2 Definition If $V$ is an additive abelian group with elements $A,\, B,\ldots$, $\;F$ is a field with elements
$\quad\quad$ $a,\,b,\ldots$, and a product $aA \in V$ is defined, then $V$ is called a (left) vector space over $F$ if \[\begin{array}{l}1)\; a(A + B) = aA + aB\\ 2) \;(a + b)A = aA + bA \\ 3)\; a(bA) = (ab)A \\ 4)\; 1A = A\end{array}\] $\quad\quad$ Clearly $0A = a\mathbf{0} = \mathbf{0}$, where $\mathbf{0}$ is the zero element of $V$.

$\quad\quad$ Sometimes the product of elements of $F$ and $V$ is written in the form $Aa$, in which case $V$ is
$\quad\quad$ called a right vector space over $F$. If left and right vector spaces do not occur simultaneously
$\quad\quad$ in a discussion, we shall simply use the term "vector space".
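
Example $\;$ $F$ itself is a (left) vector space over $F$ if $aA$ is taken to be the field product: axioms
$\quad\quad$ 1) – 4) then reduce to the distributive and associative laws of 1.1.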

1.3 Definition Let $F$ be a field and $a_{ij}\in F\; (i=\overline{1,m},\; j=\overline{1,n})$. The equations in $F$

$\quad\quad (1)\quad\quad\quad\quad\quad\quad \begin{array}{rlll} a_{11}x_1+& a_{12}x_2 +\cdots + & a_{1n}x_n =& 0 \\ \;\vdots\quad & \;\vdots\quad & \;\vdots\quad & \\ a_{m1}x_1+& a_{m2}x_2 +\cdots + & a_{mn}x_n =& 0\end{array}$

$\quad\quad$ are called linear homogeneous equations. Elements $x_1,\ldots, x_n \in F\;$ for which (1) holds are called a
$\quad\quad$ solution of the system. If $x_i = 0\;(\forall i)$ the solution is called trivial; otherwise it is non-trivial.

1.4 Theorem A system of linear homogeneous equations always has a non-trivial solution if $n > m$
$\quad\quad$ (i.e., the number of unknowns exceeds the number of equations).
Proof Clearly the theorem is true if $n \le 2$. Assume $n > 2$ and that the theorem holds whenever the
$\quad\quad$ number of unknowns is less than $n$ (and still exceeds the number of equations). If $a_{in} = 0\;(\forall i)$,
$\quad\quad$ then (1) already has the non-trivial solution $x_n = 1,\; x_i = 0\;(i<n)$. Otherwise, without loss of
$\quad\quad$ generality, assume that $a_{mn} \ne 0$. Write $L_i = \sum_{j=1}^n a_{ij}x_j$ and let $L_i^* = L_i - a_{in} a_{mn}^{-1}L_m$; the
$\quad\quad$ coefficient of $x_n$ in $L_i^*$ vanishes, so $L_i^* = 0 \; (i = \overline{1,m-1})$ is a system of $m-1$ equations in the
$\quad\quad$ $n-1$ unknowns $x_1,\ldots,x_{n-1}$. By the induction hypothesis it has a non-trivial solution $x_1,\ldots,x_{n-1}$,
$\quad\quad$ which then determines $x_n$ uniquely from $L_m = 0$ (since $a_{mn}\ne 0$). The $x_i\;(i=\overline{1,n})$ so determined
$\quad\quad$ form a non-trivial solution of (1), since
$\quad\quad$ $L_i = L_i^* + a_{in}a_{mn}^{-1} L_m = 0 + a_{in}a_{mn}^{-1}\, 0 = 0\;(i < m)$. This completes the proof by induction. $\square$

Remark If the equations are written in the form $\sum x_j a_{ij} = 0$, the theorem still holds, by a slightly
$\quad\quad$ modified proof.
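
Example $\;$ To see the elimination step of the proof at work, take $m = 2,\; n = 3$ over $\mathbb{Q}$:
\[ L_1 = x_1 + x_2 + 2x_3 = 0, \qquad L_2 = x_1 - x_2 + x_3 = 0 .\]
$\quad\quad$ Here $a_{23} = 1 \ne 0$ and $L_1^* = L_1 - 2\cdot 1^{-1}L_2 = -x_1 + 3x_2$, which no longer involves $x_3$. The single
$\quad\quad$ equation $L_1^* = 0$ has the non-trivial solution $x_1 = 3,\; x_2 = 1$, and $L_2 = 0$ then gives $x_3 = x_2 - x_1 = -2$;
$\quad\quad$ indeed $(3,\,1,\,-2)$ also satisfies $L_1 = 0$.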

1.5 Definition Given $A_i \in V$ (a vector space over a field $F$), if $\sum_{i=1}^n a_i A_i = \mathbf{0} \implies a_i = 0\; (\forall i)$,
$\quad\quad$ we call $A_1,\ldots, A_n$ independent. $A_1,\ldots,A_n$ are dependent if they are not independent.

$\quad\quad$ The dimension $\dim(V)$ of a vector space $V$ over a field $F$ is the maximum number of
$\quad\quad$ independent elements in $V$.

$\quad\quad$ $G = \{A_1,\ldots,A_m\}$ is a generating system of $V$ if $V = \text{span}(G) = \{\sum_{i=1}^m a_i A_i \mid a_i \in F\}$.
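
Example $\;$ If $A \ne \mathbf{0}$ then $\{A\}$ is independent, since $aA = \mathbf{0}$ with $a \ne 0$ would give
$\quad\quad$ $A = a^{-1}(aA) = \mathbf{0}$. On the other hand, for any $a \in F$ the pair $A,\, aA$ is dependent, because
$\quad\quad$ $a\cdot A + (-1)\cdot(aA) = \mathbf{0}$ and the coefficient $-1$ is not $0$.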

1.6 Theorem If $V = \text{span}(G)$, then $\dim(V) = \max\{|B| \mid B\subset G \text{ is independent}\}$.

Proof Without loss of generality assume $G = \{A_1,\ldots,A_m\}$, let $n = \dim(V)$, let the right-hand side above be $r$,
$\quad\quad$ and let $\{A_1,\ldots, A_r \}$ be independent. By the definition of dimension, $n \ge r$. Moreover
$\quad\quad$ $V = \text{span}(\{A_1,\ldots,A_r\})$: adjoining any further element of $G$ to $A_1,\ldots,A_r$ gives more than $r$
$\quad\quad$ elements of $G$, hence a dependent set, so every element of $G$ is a linear combination of $A_1,\ldots,A_r$.
$\quad\quad$ Now if $\; s > r\;$ and $B_j = \sum_{i=1}^r a_{ij}A_i \in V,\quad j=\overline{1,s}$,
$\quad\quad$ then $\sum_{j=1}^s x_j B_j = \mathbf{0}$ if and only if $\;\mathbf{0} = \sum_{j=1}^s x_j \sum_{i=1}^r a_{ij}A_i = \sum_{i=1}^r \left(\sum_{j=1}^s x_j a_{ij} \right)A_i$,
$\quad\quad$ that is, if and only if $\quad \displaystyle{\sum_{j=1}^s x_j a_{ij} = 0\quad (i = \overline{1,r})}\quad$ since $\quad \{A_1,\ldots,A_r\}\,$ is independent.
$\quad\quad$ By Theorem 1.4 (in the form of the Remark) this system has a non-trivial solution $x_1,\ldots, x_s$, so $B_1,\ldots, B_s$ are dependent.
$\quad\quad$ Hence no more than $r$ elements of $V$ can be independent, i.e. $n \le r$; therefore $n = r$. $\quad\square$
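
Example $\;$ Suppose $A_1,\, A_2$ are independent and $G = \{A_1,\, A_2,\, A_1 + A_2\}$. Then $\text{span}(G) = \text{span}(\{A_1, A_2\})$,
$\quad\quad$ and no three elements of $G$ are independent since $1\cdot A_1 + 1\cdot A_2 + (-1)(A_1 + A_2) = \mathbf{0}$; by the
$\quad\quad$ theorem, $\dim(\text{span}(G)) = 2$.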

1.7 Definitions Let $V$ be a vector space over a field $F$. We call $S \subset V$ a subspace of $V$
$\quad\quad$ if $a,b \in F,\; A,B \in S \implies aA + bB \in S$.

$\quad\quad$ An $s$-tuple $A = (a_1,\ldots,a_s)\; (a_i\in F,\; i = \overline{1,s})$ is called a row vector. For row vectors we adopt the laws
$\quad\quad\quad (\alpha)\;$ $((a_1,\ldots,a_s) = (b_1,\ldots,b_s)) \Longleftrightarrow (a_i=b_i \quad(i=\overline{1,s}))$

$\quad\quad\quad (\beta)\;$ $(a_1,\ldots,a_s) + (b_1,\ldots,b_s) = (a_1 +b_1,\ldots, a_s +b_s)$

$\quad\quad\quad (\gamma)\;$ $k(a_1,\ldots,a_s) = (ka_1,\ldots, ka_s)\quad (k\in F)$

$\quad\quad$ $s$-tuples in the column form $\begin{bmatrix}a_1\\a_2\\ \vdots \\a_s \end{bmatrix}$ are called column vectors.
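
Example $\;$ Over $F = \mathbb{Q}$, for instance, $(1,\,2,\,0) + (3,\,-1,\,5) = (4,\,1,\,5)$ and $3(1,\,2,\,0) = (3,\,6,\,0)$, in
$\quad\quad$ accordance with $(\beta)$ and $(\gamma)$.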

1.8 Theorem The row (column) vector space $F^n$ of all $n$-tuples from a field $F$ is a vector space of
$\quad\quad$ dimension $n$ over $F$.
Proof $G = \{\mathbf{e}_i = (a_{i1},\ldots,a_{in}) \mid a_{ij} = \delta_{ij},\; 1\le i,j \le n\}$ is independent and $F^n = \text{span}(G)$,
$\quad\quad$ where $\delta_{ij} = \begin{cases} 1 & i = j, \\ 0 & i \ne j, \end{cases}\quad$ so $\dim(F^n) = n$ by Theorem 1.6. $\quad\square$
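
Example $\;$ The set $S = \{(a_1,\ldots,a_{n-1},\, 0) \mid a_i \in F\}$ of row vectors with last coordinate $0$ is a subspace of
$\quad\quad$ $F^n$ in the sense of 1.7, and $\mathbf{e}_1,\ldots,\mathbf{e}_{n-1}$ form a generating system of $S$, so $\dim(S) = n-1$ by
$\quad\quad$ Theorem 1.6.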

1.9 Definitions
$\quad\quad$ The $m\times n$ array of elements of $F\quad\quad \begin{bmatrix}a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}\quad \quad$ is called an $(m\times n)$ matrix.

$\quad\quad$ By the right row rank of a matrix we mean the maximum number of independent row vectors
$\quad\quad$ among the rows of the matrix when multiplication by field elements is from the right. Similarly,
$\quad\quad$ we define the left row rank, the right column rank and the left column rank.
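
Example $\;$ Over the (commutative) field $\mathbb{Q}$, the matrix $\begin{bmatrix} 1 & 2 \\ 3 & 6\end{bmatrix}$ has row rank $1$, since $(3,6) = 3(1,2)$,
$\quad\quad$ and column rank $1$, since the second column is twice the first; its rank is $1$, in agreement with
$\quad\quad$ Theorem 1.10 below.
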
02-15-2013, 07:13 PM
Post: #2
Linear Algebra II: Artin Emil: Galois Theory

1.10 Theorem In any matrix the right column rank equals the left row rank and the left column rank
$\quad\quad$ equals the right row rank. If the field is commutative, the four numbers are equal and are called
$\quad\quad$ the rank of the matrix.
Proof Let $C_j,\, R_i$ be the column and row vectors of the matrix. Then $\sum_j C_j x_j = \mathbf{0}$ is equivalent to \[\begin{array}{ccl}a_{11}x_1 & + a_{12}x_2 + \cdots + & a_{1n}x_n = 0 \\ \;\vdots & &\quad\vdots \\ a_{m1}x_1 &+ a_{m2}x_2 + \cdots +& a_{mn}x_n = 0 \end{array} \tag{1}\]$\quad\quad$ which in turn is equivalent to the subsystem formed by a maximal set of (left-)independent row vectors
Find all posts by this user
Quote this message in a reply