Theorem
- $A + B = B + A$
- $(A + B) + C = A + (B + C)$
- $A + 0 = A$
- $r(A + B) = rA + rB$ for any scalar $r$
- $(r + s)A = rA + sA$ for any scalars $r$ and $s$
- $r(sA) = (rs)A$ for any scalars $r$ and $s$
Proof (a):
Let $A$ and $B$ be $m \times n$ matrices. The $(i, j)$-entry of $A + B$ is $a_{ij} + b_{ij} = b_{ij} + a_{ij}$, which is the $(i, j)$-entry of $B + A$.
So $A + B = B + A$.
Proof (b):
Let $A$, $B$, and $C$ be $m \times n$ matrices. The $(i, j)$-entry of $(A + B) + C$ is $(a_{ij} + b_{ij}) + c_{ij} = a_{ij} + (b_{ij} + c_{ij})$, which is the $(i, j)$-entry of $A + (B + C)$.
So $(A + B) + C = A + (B + C)$.
Proof (c):
Let $A$ be an $m \times n$ matrix and let $0$ denote the $m \times n$ zero matrix. The $(i, j)$-entry of $A + 0$ is $a_{ij} + 0 = a_{ij}$.
So $A + 0 = A$.
Proof (d):
Let $A$ and $B$ be $m \times n$ matrices and let $r$ be a scalar. The $(i, j)$-entry of $r(A + B)$ is $r(a_{ij} + b_{ij}) = ra_{ij} + rb_{ij}$, which is the $(i, j)$-entry of $rA + rB$.
So $r(A + B) = rA + rB$.
Proof (e):
Let $A$ be an $m \times n$ matrix and let $r$ and $s$ be scalars. The $(i, j)$-entry of $(r + s)A$ is $(r + s)a_{ij} = ra_{ij} + sa_{ij}$, which is the $(i, j)$-entry of $rA + sA$.
So $(r + s)A = rA + sA$.
Proof (f):
Let $A$ be an $m \times n$ matrix and let $r$ and $s$ be scalars. The $(i, j)$-entry of $r(sA)$ is $r(sa_{ij}) = (rs)a_{ij}$, which is the $(i, j)$-entry of $(rs)A$.
So $r(sA) = (rs)A$.
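These entrywise properties of matrix addition and scalar multiplication can be sanity-checked numerically. A minimal Python sketch, with matrices as lists of lists (the helper names `add` and `scale` are my own, not from the text):

```python
# Entrywise matrix addition and scalar multiplication on lists of lists.
def add(A, B):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def scale(r, A):
    return [[r * x for x in row] for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = [[9, 0], [1, 2]]
Z = [[0, 0], [0, 0]]   # the 2x2 zero matrix
r, s = 3, -2

assert add(A, B) == add(B, A)                                 # commutativity
assert add(add(A, B), C) == add(A, add(B, C))                 # associativity
assert add(A, Z) == A                                         # additive identity
assert scale(r, add(A, B)) == add(scale(r, A), scale(r, B))   # r(A+B) = rA+rB
assert scale(r + s, A) == add(scale(r, A), scale(s, A))       # (r+s)A = rA+sA
assert scale(r, scale(s, A)) == scale(r * s, A)               # r(sA) = (rs)A
print("all addition/scalar identities hold")
```

Each assertion mirrors one entrywise argument: both sides reduce to the same scalar arithmetic in every position.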
Theorem
- $(A^T)^T = A$
- $(A + B)^T = A^T + B^T$
- For any scalar $r$, $(rA)^T = rA^T$
- $(AB)^T = B^T A^T$
Proof (a):
Let $A$ be an $m \times n$ matrix. By the definition of the transpose, the $(i, j)$-entry of $A^T$ is $a_{ji}$.
Taking the transpose again, the $(i, j)$-entry of $(A^T)^T$ becomes $a_{ij}$.
Then $(A^T)^T = A$.
Proof (b):
Let $A$ and $B$ be $m \times n$ matrices. The $(i, j)$-entry of $A + B$ is $a_{ij} + b_{ij}$.
Taking the transpose, the $(i, j)$-entry of $(A + B)^T$ is $a_{ji} + b_{ji}$.
The $(i, j)$-entry of $A^T + B^T$ is also $a_{ji} + b_{ji}$.
So $(A + B)^T = A^T + B^T$.
Proof (c):
Let $A$ be an $m \times n$ matrix and $r$ be a scalar. The $(i, j)$-entry of $rA$ is $ra_{ij}$.
Taking the transpose, the $(i, j)$-entry of $(rA)^T$ is $ra_{ji}$.
The $(i, j)$-entry of $rA^T$ is $ra_{ji}$.
Then $(rA)^T = rA^T$.
Proof (d):
Let $A$ be an $m \times n$ matrix and $B$ be an $n \times p$ matrix. The $(i, j)$-entry of $AB$ is $\sum_{k=1}^{n} a_{ik} b_{kj}$.
Taking the transpose, the $(i, j)$-entry of $(AB)^T$ is $\sum_{k=1}^{n} a_{jk} b_{ki}$.
The $(i, j)$-entry of $B^T A^T$ is $\sum_{k=1}^{n} (B^T)_{ik} (A^T)_{kj}$.
By the definition of the transpose, $(B^T)_{ik} = b_{ki}$ and $(A^T)_{kj} = a_{jk}$.
Then the $(i, j)$-entry of $B^T A^T$ is $\sum_{k=1}^{n} b_{ki} a_{jk} = \sum_{k=1}^{n} a_{jk} b_{ki}$, the same as that of $(AB)^T$.
So $(AB)^T = B^T A^T$.
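The four transpose identities can be checked on concrete rectangular matrices; a self-contained Python sketch with list-of-lists matrices (the helper names are my own):

```python
def transpose(A):
    # Swap rows and columns: the (i, j)-entry becomes the (j, i)-entry.
    return [list(col) for col in zip(*A)]

def add(A, B):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def scale(r, A):
    return [[r * x for x in row] for row in A]

def matmul(A, B):
    # (i, j)-entry is the sum over k of a_ik * b_kj.
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2, 3], [4, 5, 6]]     # 2x3
B = [[7, 8, 9], [0, 1, 2]]     # 2x3
P = [[1, 0], [2, 1], [3, 4]]   # 3x2
r = 5

assert transpose(transpose(A)) == A                                   # (a)
assert transpose(add(A, B)) == add(transpose(A), transpose(B))        # (b)
assert transpose(scale(r, A)) == scale(r, transpose(A))               # (c)
assert transpose(matmul(A, P)) == matmul(transpose(P), transpose(A))  # (d)
print("transpose identities hold")
```

Note in (d) that the order of the factors reverses; transposing a 2x2 product of a 2x3 and a 3x2 matrix forces this, since the transposed factors only compose in the reversed order.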
Theorem
- $A(BC) = (AB)C$ (associative law of multiplication)
- $A(B + C) = AB + AC$ (left distributive law)
- $(B + C)A = BA + CA$ (right distributive law)
- $r(AB) = (rA)B = A(rB)$ for any scalar $r$
- $I_m A = A = A I_n$ (identity for matrix multiplication)
Proof (a):
Let $A$ be an $m \times n$ matrix, $B$ an $n \times p$ matrix, and $C$ a $p \times q$ matrix.
Define the product $BC$ as: the $(k, l)$-entry of $BC$ is $\sum_{j=1}^{p} b_{kj} c_{jl}$.
Now, multiply $A$ with $BC$. The $(i, l)$-entry of $A(BC)$ is:
$$\sum_{k=1}^{n} a_{ik} \left( \sum_{j=1}^{p} b_{kj} c_{jl} \right) = \sum_{k=1}^{n} \sum_{j=1}^{p} a_{ik} b_{kj} c_{jl}.$$
Next, compute $AB$ first. The $(i, j)$-entry of $AB$ is:
$$\sum_{k=1}^{n} a_{ik} b_{kj}.$$
Now multiply $AB$ with $C$. The $(i, l)$-entry of $(AB)C$ is:
$$\sum_{j=1}^{p} \left( \sum_{k=1}^{n} a_{ik} b_{kj} \right) c_{jl} = \sum_{j=1}^{p} \sum_{k=1}^{n} a_{ik} b_{kj} c_{jl}.$$
Since the entries of $A(BC)$ and $(AB)C$ are identical, we have:
$$A(BC) = (AB)C.$$
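The double-sum argument can be checked on concrete rectangular matrices. A minimal Python sketch (the `matmul` helper is my own, implementing the entrywise product formula on list-of-lists matrices):

```python
def matmul(A, B):
    # (i, j)-entry is the sum over k of a_ik * b_kj.
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2, 3], [4, 5, 6]]    # 2x3
B = [[1, 0], [2, 1], [0, 3]]  # 3x2
C = [[2, 1, 0], [1, 0, 1]]    # 2x3

# Associativity: the grouping does not change the 2x3 result.
assert matmul(A, matmul(B, C)) == matmul(matmul(A, B), C)
print("A(BC) == (AB)C")
```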
Proof (b):
Let $A$ be an $m \times n$ matrix, and $B$ and $C$ be $n \times p$ matrices, so that $A(B + C)$ is defined.
By definition of matrix addition, the $(i, j)$-entry of $A(B + C)$ is:
$$\sum_{k=1}^{n} a_{ik} (b_{kj} + c_{kj}).$$
Distributing:
$$\sum_{k=1}^{n} a_{ik} b_{kj} + \sum_{k=1}^{n} a_{ik} c_{kj},$$
which is the $(i, j)$-entry of $AB + AC$. Thus $A(B + C) = AB + AC$.
Proof (c):
Let $B$ and $C$ be $m \times n$ matrices, and $A$ an $n \times p$ matrix, so that $(B + C)A$ is defined. The $(i, j)$-entry of $(B + C)A$ is $\sum_{k=1}^{n} (b_{ik} + c_{ik}) a_{kj}$.
Distributing:
$$\sum_{k=1}^{n} b_{ik} a_{kj} + \sum_{k=1}^{n} c_{ik} a_{kj},$$
which is the $(i, j)$-entry of $BA + CA$. Thus $(B + C)A = BA + CA$.
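Both distributive laws can be verified on small matrices; a Python sketch under the same list-of-lists representation (helper names are my own):

```python
def add(A, B):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def matmul(A, B):
    # (i, j)-entry is the sum over k of a_ik * b_kj.
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 0], [1, 2]]
C = [[2, 3], [4, 1]]

assert matmul(A, add(B, C)) == add(matmul(A, B), matmul(A, C))  # left distributive
assert matmul(add(B, C), A) == add(matmul(B, A), matmul(C, A))  # right distributive
print("distributive laws hold")
```

Both laws are needed separately because matrix multiplication is not commutative: $AB$ and $BA$ generally differ, so neither distributive law follows from the other.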
Proof (d):
Let $A$ be an $m \times n$ matrix and $B$ an $n \times p$ matrix, so that $AB$ is defined, and let $r$ be a scalar.
By definition of scalar multiplication, the $(i, j)$-entry of $r(AB)$ is:
$$r \sum_{k=1}^{n} a_{ik} b_{kj}.$$
Distributing $r$:
$$\sum_{k=1}^{n} (r a_{ik}) b_{kj},$$
which is the $(i, j)$-entry of $(rA)B$, and similarly:
$$\sum_{k=1}^{n} a_{ik} (r b_{kj}),$$
which is the $(i, j)$-entry of $A(rB)$.
Thus, $r(AB) = (rA)B = A(rB)$.
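The scalar can be pulled into either factor; a quick numerical check in the same list-of-lists style (helpers are my own):

```python
def scale(r, A):
    return [[r * x for x in row] for row in A]

def matmul(A, B):
    # (i, j)-entry is the sum over k of a_ik * b_kj.
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2, 3], [4, 5, 6]]    # 2x3
B = [[1, 0], [2, 1], [0, 3]]  # 3x2
r = 7

assert scale(r, matmul(A, B)) == matmul(scale(r, A), B)  # r(AB) = (rA)B
assert scale(r, matmul(A, B)) == matmul(A, scale(r, B))  # r(AB) = A(rB)
print("scalar pull-through holds")
```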
Proof (e):
Let $A$ be an $m \times n$ matrix, and let $I_m$ and $I_n$ be the identity matrices of size $m$ and $n$, respectively.
The columns of $I_n$ can be written as $e_1, e_2, \dots, e_n$, where $e_j$ is the $j$-th standard basis vector.
Multiplying $A$ with $I_n$:
$$A I_n = A \begin{bmatrix} e_1 & e_2 & \cdots & e_n \end{bmatrix} = \begin{bmatrix} A e_1 & A e_2 & \cdots & A e_n \end{bmatrix} = \begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix} = A,$$
since $A e_j$ is the $j$-th column $a_j$ of $A$.
Similarly:
the $(i, j)$-entry of $I_m A$ is $\sum_{k=1}^{m} (I_m)_{ik} a_{kj} = a_{ij}$, so $I_m A = A$.
Thus, $I_m A = A$ and $A I_n = A$.
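A final numerical check of the identity laws, including the observation that multiplying by a standard basis vector picks out a column (list-of-lists sketch; helper names are my own):

```python
def identity(n):
    # n x n matrix with 1 on the diagonal and 0 elsewhere.
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    # (i, j)-entry is the sum over k of a_ik * b_kj.
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2, 3], [4, 5, 6]]  # 2x3

assert matmul(identity(2), A) == A  # I_m A = A
assert matmul(A, identity(3)) == A  # A I_n = A

# A e_1 picks out the first column of A (e_1 as a 3x1 column vector).
e1 = [[1], [0], [0]]
assert matmul(A, e1) == [[1], [4]]
print("identity laws hold")
```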