Proof: if an n-dimensional real vector p is orthogonal to every n-dimensional real vector, then p must be the zero vector


Suppose p = (a1, a2, a3, ..., an).
Since p is orthogonal to every real vector, we may in particular take the unit coordinate vector (1, 0, 0, ..., 0).
Then a1*1 + a2*0 + ... + an*0 = a1 = 0.
Next take the unit coordinate vector (0, 1, 0, ..., 0) to get a2 = 0.
Continuing in this way, we get a1 = a2 = ... = an = 0.
Conversely, the inner product of (0, 0, ..., 0) with any (b1, b2, ..., bn) is 0, so the zero vector does satisfy the hypothesis.
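
To see the argument numerically, here is a small sketch (using NumPy; the dimension n = 4 and the vector p are arbitrary illustrative choices, not part of the problem): dotting p with the i-th unit coordinate vector simply reads off the i-th component, so a vector orthogonal to all of them must have every component equal to 0.

import numpy as np

n = 4
p = np.array([3.0, -1.0, 0.5, 2.0])
for i in range(n):
    e_i = np.zeros(n)          # i-th unit coordinate vector
    e_i[i] = 1.0
    print(np.dot(p, e_i))      # equals p[i]; if all of these are 0, p is the zero vector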



Let the n-dimensional vectors α1, α2, ..., αn be linearly independent. Prove that if the n-dimensional vector β is orthogonal to every αi (i = 1, 2, ..., n), then β = 0.


Take the vectors to be column vectors. If the n-dimensional vector β is orthogonal to every αi, then
αi' * β = 0 (where αi' denotes the transpose of αi),
that is,
α1'*β=0
α2'*β=0
...
αn'*β=0
Let A be the square matrix of order n whose rows are αi', i = 1, 2, ..., n.
Then the equations above become A * β = 0, a linear system in which the entries of β are the unknowns.
Since α1, α2, ..., αn are linearly independent, |A| ≠ 0.
By Cramer's rule, the system has only the zero solution,
so β = 0.
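
As a numerical sketch of this step (using NumPy; the dimension n = 5 and the random matrix are arbitrary choices, a random square matrix being invertible with probability 1):

import numpy as np

n = 5
A = np.random.randn(n, n)                 # rows play the role of α1', ..., αn'
assert abs(np.linalg.det(A)) > 1e-10      # |A| != 0, i.e. rows linearly independent
beta = np.linalg.solve(A, np.zeros(n))    # the unique solution of A * beta = 0
print(beta)                               # the zero vector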



Proof: if the n-dimensional vectors α and β are orthogonal, then for any real numbers k, l, the vectors kα and lβ are orthogonal


Because (α, β) = 0,
(kα, lβ) = kl(α, β) = 0.
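
A quick numerical check (a sketch with NumPy; the particular vectors and scalars are arbitrary):

import numpy as np

alpha = np.array([1.0, 2.0, 0.0])
beta  = np.array([-2.0, 1.0, 3.0])       # alpha . beta = 0
k, l = 4.5, -0.7
print(np.dot(alpha, beta))               # 0
print(np.dot(k * alpha, l * beta))       # k*l*(alpha . beta) = 0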



Let x be an n-dimensional column vector with x^T * x = 1, let A = E - 2x * x^T (E is the identity matrix of order n), and prove that A is an orthogonal matrix


Verify directly from the definition of an orthogonal matrix: since (x * x^T)^T = x * x^T, the matrix A is symmetric, so
A^T * A = A * A = (E - 2x * x^T)(E - 2x * x^T) = E - 4x * x^T + 4x * (x^T * x) * x^T = E - 4x * x^T + 4x * x^T = E.
Hence A^T * A = E and A is orthogonal.
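
For a concrete check, here is a small numerical sketch (using NumPy; the dimension n = 6 and the random unit vector x are arbitrary choices):

import numpy as np

n = 6
x = np.random.randn(n, 1)
x /= np.linalg.norm(x)                   # enforce x^T * x = 1
A = np.eye(n) - 2 * x @ x.T              # A = E - 2 x x^T
print(np.allclose(A.T @ A, np.eye(n)))   # True: A is orthogonal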