Thus, $P_W$ is given by
$$P_W(\vec{x}) = \frac{\langle \vec{x}, \vec{w}_1\rangle}{\langle \vec{w}_1, \vec{w}_1\rangle}\,\vec{w}_1 + \frac{\langle \vec{x}, \vec{w}_2\rangle}{\langle \vec{w}_2, \vec{w}_2\rangle}\,\vec{w}_2 + \frac{\langle \vec{x}, \vec{w}_3\rangle}{\langle \vec{w}_3, \vec{w}_3\rangle}\,\vec{w}_3.$$
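The vectors $\vec{w}_1, \vec{w}_2, \vec{w}_3$ come from an earlier part of the problem not reproduced here, but the formula itself can be sketched generically. The NumPy helper below (the name `proj_W` and the example vectors are hypothetical, chosen only for illustration) projects onto the span of any pairwise-orthogonal family:

```python
import numpy as np

def proj_W(x, ws):
    """Orthogonal projection of x onto Span(ws).

    Assumes the vectors in ws are pairwise orthogonal, as the
    formula above requires."""
    x = np.asarray(x, dtype=float)
    return sum((x @ w) / (w @ w) * np.asarray(w, dtype=float) for w in ws)

# Illustrative example: project e1 onto the span of two orthogonal vectors.
e1 = np.array([1.0, 0.0, 0.0, 0.0])
ws = [np.array([1.0, 1.0, 0.0, 1.0]), np.array([-1.0, 1.0, 1.0, 0.0])]
p = proj_W(e1, ws)
```

The defining properties of an orthogonal projection follow directly: projecting twice changes nothing, and the residual $x - P_W(x)$ is orthogonal to every $\vec{w}_i$.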
Using this formula, we compute
$$P_W(\vec{e}_1) = \tfrac{1}{3}\vec{w}_1 - \tfrac{1}{3}\vec{w}_2 + \tfrac{1}{15}\vec{w}_3 = \frac{1}{15}\begin{bmatrix} 11 \\ 2 \\ -6 \\ 2 \end{bmatrix}, \qquad
P_W(\vec{e}_2) = \tfrac{1}{3}\vec{w}_1 + \tfrac{1}{3}\vec{w}_2 + \tfrac{2}{15}\vec{w}_3 = \frac{1}{15}\begin{bmatrix} 2 \\ 14 \\ 3 \\ -1 \end{bmatrix},$$
$$P_W(\vec{e}_3) = \tfrac{0}{3}\vec{w}_1 + \tfrac{1}{3}\vec{w}_2 - \tfrac{1}{15}\vec{w}_3 = \frac{1}{15}\begin{bmatrix} -6 \\ 3 \\ 6 \\ 3 \end{bmatrix}, \qquad
P_W(\vec{e}_4) = \tfrac{1}{3}\vec{w}_1 + \tfrac{0}{3}\vec{w}_2 - \tfrac{3}{15}\vec{w}_3 = \frac{1}{15}\begin{bmatrix} 2 \\ -1 \\ 3 \\ 14 \end{bmatrix}.$$
Thus, the desired matrix is
$$\frac{1}{15}\begin{bmatrix} 11 & 2 & -6 & 2 \\ 2 & 14 & 3 & -1 \\ -6 & 3 & 6 & 3 \\ 2 & -1 & 3 & 14 \end{bmatrix}.$$
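As a sanity check (not part of the original solution), a NumPy sketch can confirm that the matrix above really is an orthogonal projection matrix: it should be idempotent ($P^2 = P$), symmetric, and have trace equal to $\dim W = 3$:

```python
import numpy as np

# The matrix of P_W computed above.
P = np.array([[11,  2, -6,  2],
              [ 2, 14,  3, -1],
              [-6,  3,  6,  3],
              [ 2, -1,  3, 14]]) / 15

# A projection matrix satisfies P @ P == P and P == P.T;
# its trace equals the dimension of the subspace projected onto.
print(np.allclose(P @ P, P))        # -> True
print(np.allclose(P, P.T))          # -> True
print(np.isclose(np.trace(P), 3))   # -> True
```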
3. Let $V = C([0,2])$ be the vector space (over $\mathbb{R}$) of continuous functions from $[0,2]$ to $\mathbb{R}$. Define
$$\langle f, g \rangle = \int_0^1 f(t)\,g(t)\,dt \qquad \text{for } f, g \in V.$$
Prove that $\langle \cdot, \cdot \rangle$ is not an inner product on $V$.
Proof. Define
$$f(t) = \begin{cases} 0 & \text{if } t \le 1, \\ t - 1 & \text{if } t > 1, \end{cases}$$
so that $f$ is continuous, and hence $f \in C([0,2])$. Clearly $f \neq \vec{0}$, since $f$ is not everywhere 0 on $[0,2]$. However,
$$\langle f, f \rangle = \int_0^1 f^2\,dt = \int_0^1 0\,dt = 0,$$
contradicting the positivity axiom for inner products. So $\langle \cdot, \cdot \rangle$ is not an inner product.
QED
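The failure mode in the proof can be illustrated numerically (a sketch, not part of the original argument): the pairing only sees values of $f$ on $[0,1]$, where this $f$ vanishes identically, even though $f$ is nonzero on $(1,2]$:

```python
import numpy as np

# The counterexample from the proof: 0 on [0,1], t - 1 on (1,2].
f = lambda t: np.where(t <= 1, 0.0, t - 1)

t_inner = np.linspace(0, 1, 1001)   # the pairing integrates only over [0,1]
t_full = np.linspace(0, 2, 2001)    # the full domain [0,2]

# f is identically zero on [0,1], so every sample of f^2 there is 0 ...
inner_ff = float(np.sum(f(t_inner) ** 2))
# ... yet f is visibly nonzero on the full domain.
max_f = float(f(t_full).max())

print(inner_ff, max_f)  # -> 0.0 1.0
```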
4. Decide whether or not
$$C = \begin{bmatrix} 1 & 0 & 0 \\ -1 & 1 & 1 \\ 1 & 0 & 0 \end{bmatrix}$$
is diagonalizable.
Answer. The characteristic polynomial is $\det(C - \lambda I) = -\lambda(1 - \lambda)^2$; so the characteristic polynomial splits, and the only eigenvalues are $\lambda = 0, 1$.
To finish testing for diagonalizability, we need to check that each eigenspace has the appropriate
dimension.
The eigenvalue $\lambda = 0$ has algebraic multiplicity 1, so its eigenspace automatically has dimension 1, and we don’t need to check explicitly. [But since the first and third rows of $C$ are the same, clearly the eigenspace of 0, i.e., the null space of $C$, is nontrivial. And if you really want to be specific, it’s $\mathrm{Span}\{[0\ 1\ {-1}]\}$.]
For $\lambda = 1$, we compute
$$C - I = \begin{bmatrix} 0 & 0 & 0 \\ -1 & 0 & 1 \\ 1 & 0 & -1 \end{bmatrix},$$
which has rank 1 and therefore nullity 2, since each row is a multiple of $[1\ 0\ {-1}]$. Thus, the eigenspace of 1 has the proper dimension of 2, and therefore $C$ is diagonalizable.
[If you want to be more specific, the eigenspace of 1 is $\mathrm{Span}\{[1\ 0\ 1],\ [0\ 1\ 0]\}$. So our three eigenvectors $\{[1\ 0\ 1],\ [0\ 1\ 0],\ [0\ 1\ {-1}]\}$ form a basis for $F^3$, thus diagonalizing $C$.]
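The diagonalization claimed above can be checked directly (a sketch, not part of the original answer): putting the three eigenvectors into the columns of a matrix $P$, the conjugate $P^{-1} C P$ should come out diagonal with the eigenvalues $1, 1, 0$ in the corresponding order:

```python
import numpy as np

C = np.array([[ 1, 0, 0],
              [-1, 1, 1],
              [ 1, 0, 0]], dtype=float)

# Columns: the two eigenvectors for lambda = 1, then the one for lambda = 0.
P = np.column_stack([[1, 0, 1], [0, 1, 0], [0, 1, -1]]).astype(float)

D = np.linalg.inv(P) @ C @ P
print(np.round(D))  # -> diag(1, 1, 0), up to rounding
```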
5. Let $A \in M_{3\times 3}(\mathbb{R})$ be symmetric. Suppose that
$$\vec{v}_1 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} \qquad \text{and} \qquad \vec{v}_2 = \begin{bmatrix} 1 \\ -1 \\ 2 \end{bmatrix}$$
are eigenvectors of $A$ with eigenvalues $\lambda_1 = 3$ and $\lambda_2 = -2$, respectively. Suppose also that $\det(A) = 6$. Find $A$.
Answer. Call the third eigenvalue $\lambda_3$. [At the moment, it’s conceivable that $\lambda_3$ might be a repeat of one of the two we already know.] Since the product of the eigenvalues equals $\det(A)$, we have $-6\lambda_3 = 6$, and hence $\lambda_3 = -1$.
Since $A$ is symmetric, any eigenvector $\vec{v}_3$ for $\lambda_3 = -1$ must be orthogonal to the other two. That is, setting
$$B = \begin{bmatrix} 1 & 1 & 0 \\ 1 & -1 & 2 \end{bmatrix},$$
we must have $B\vec{v}_3 = \vec{0}$. Solving this system gives
$$\mathrm{Ker}(B) = \mathrm{Span}\left\{ \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix} \right\},$$
so we may choose $\vec{v}_3 = \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix}$.
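A one-line check (not in the original solution) confirms that this $\vec{v}_3$ is indeed in the kernel of $B$, i.e., orthogonal to both given eigenvectors:

```python
import numpy as np

# Rows of B are v1 and v2; B @ v3 = 0 says v3 is orthogonal to both.
B = np.array([[1,  1, 0],
              [1, -1, 2]], dtype=float)
v3 = np.array([1, -1, -1], dtype=float)

print(B @ v3)  # -> [0. 0.]
```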
Dividing each of our three eigenvectors by its length, we get that
$$Q = \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{6} & 1/\sqrt{3} \\ 1/\sqrt{2} & -1/\sqrt{6} & -1/\sqrt{3} \\ 0 & 2/\sqrt{6} & -1/\sqrt{3} \end{bmatrix}$$
is an orthogonal matrix whose columns are the eigenvectors of $A$.
Thus, putting the eigenvalues in a diagonal matrix in the same order, we have
$$A = Q \begin{bmatrix} 3 & 0 & 0 \\ 0 & -2 & 0 \\ 0 & 0 & -1 \end{bmatrix} Q^t = \cdots = \frac{1}{6}\begin{bmatrix} 5 & 13 & -2 \\ 13 & 5 & 2 \\ -2 & 2 & -10 \end{bmatrix}.$$
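The elided product $Q\,\mathrm{diag}(3,-2,-1)\,Q^t$ can be carried out numerically as a check (a sketch, not part of the original answer), and the result should have the stated eigenvectors, eigenvalues, and determinant:

```python
import numpy as np

# The three eigenvectors found above, normalized into the columns of Q.
v1 = np.array([1.0,  1.0,  0.0])
v2 = np.array([1.0, -1.0,  2.0])
v3 = np.array([1.0, -1.0, -1.0])
Q = np.column_stack([v / np.linalg.norm(v) for v in (v1, v2, v3)])

# A = Q D Q^t with the eigenvalues in the matching order.
A = Q @ np.diag([3.0, -2.0, -1.0]) @ Q.T

expected = np.array([[ 5, 13,  -2],
                     [13,  5,   2],
                     [-2,  2, -10]]) / 6
print(np.allclose(A, expected))  # -> True
```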