How to Determine Pairwise Orthogonality of Vector Groups in Linear Algebra

Orthogonality problem in linear algebra: the column vectors α = (1, 3, 2, 4)^T and β = (k, -1, -3, 2k)^T are orthogonal. Find k.

1*k + 3*(-1) + 2*(-3) + 4*(2k) = 0,
9k - 9 = 0,
k = 1.
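
As a quick numerical check of this computation (a minimal Python/NumPy sketch; the vectors are the ones from the problem, and k = 1 is the value found above):

import numpy as np

k = 1
alpha = np.array([1, 3, 2, 4])
beta = np.array([k, -1, -3, 2 * k])

# Orthogonal vectors have a zero dot product.
print(np.dot(alpha, beta))  # 0 when k = 1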

Prove that a vector group consisting of a single zero vector is linearly dependent, and that a vector group consisting of a single nonzero vector is linearly independent.

Doesn't this follow directly from the definition?
If the vector group consists of the single zero vector 0, then the constant 1 satisfies 1*0 = 0; a coefficient that is not zero produces the zero vector as a linear combination, so the group is linearly dependent.
If the vector group consists of a single nonzero vector v, then kv = 0 forces k = 0, which is exactly the definition of linear independence.
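
A small NumPy sketch of the same point, assuming linear independence is tested by comparing the rank of the stacked vectors with the number of vectors (the example vectors are arbitrary):

import numpy as np

zero_vec = np.array([[0.0, 0.0, 0.0]])      # group containing only the zero vector
nonzero_vec = np.array([[1.0, 2.0, 3.0]])   # group containing one nonzero vector

# A vector group is linearly independent iff the rank of the matrix
# formed by the vectors equals the number of vectors in the group.
print(np.linalg.matrix_rank(zero_vec))     # 0 < 1 -> linearly dependent
print(np.linalg.matrix_rank(nonzero_vec))  # 1 = 1 -> linearly independent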

Let the n-dimensional vectors a1, a2, ..., ar be pairwise orthogonal nonzero vectors. Prove that a1, a2, ..., ar are linearly independent.

Proof: suppose k1a1 + k2a2 + ... + krar = 0. Taking the dot product with ai gives ai·(k1a1 + k2a2 + ... + krar) = 0 for i = 1, 2, ..., r (*). Since a1, a2, ..., ar are pairwise orthogonal and nonzero, ai·aj = 0 for i ≠ j and ai·ai = |ai|^2 ≠ 0, so (*) reduces to 0 + 0 + ... + ki|ai|^2 + ... + 0 = 0, i.e. ki|ai|^2 = 0. Because |ai|^2 > 0, every ki = 0, and therefore a1, a2, ..., ar are linearly independent.
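
A numerical illustration of the statement just proved (a sketch only; the pairwise orthogonal nonzero vectors in R^4 below are made up for the example):

import numpy as np

# Three pairwise orthogonal nonzero vectors in R^4.
a1 = np.array([1.0, 1.0, 0.0, 0.0])
a2 = np.array([1.0, -1.0, 0.0, 0.0])
a3 = np.array([0.0, 0.0, 1.0, 2.0])
A = np.vstack([a1, a2, a3])

# Pairwise orthogonality: the Gram matrix A A^T is diagonal.
print(A @ A.T)
# Linear independence: the rank equals the number of vectors.
print(np.linalg.matrix_rank(A))  # 3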

The n-dimensional column vectors α1, α2, α3, ..., α(n-1) are linearly independent, and the nonzero vectors β1, β2 are each orthogonal to all of them. Prove that β1, β2 are linearly dependent and that α1, α2, α3, ..., α(n-1), β1 are linearly independent.

Suppose α1, α2, α3, ..., α(n-1), β1 were linearly dependent. Since α1, α2, ..., α(n-1) are linearly independent, β1 could then be expressed linearly by them: β1 = k1α1 + k2α2 + k3α3 + ... + k(n-1)α(n-1). Because α1, ..., α(n-1) are all orthogonal to β1, i.e. αi·β1 = 0 (i = 1, ..., n-1), taking the dot product of this expression with β1 gives β1·β1 = k1(α1·β1) + ... + k(n-1)(α(n-1)·β1) = 0, so β1 = 0, which contradicts the assumption that β1 is nonzero. Hence α1, α2, ..., α(n-1), β1 are linearly independent...
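
A numerical illustration of both claims (a sketch; the three linearly independent vectors in R^4 are arbitrary, and the orthogonal complement is read off from the SVD):

import numpy as np

# n = 4 and n - 1 = 3 linearly independent vectors alpha_1, alpha_2, alpha_3.
alphas = np.array([[1.0, 0.0, 2.0, 0.0],
                   [0.0, 1.0, 1.0, 0.0],
                   [1.0, 1.0, 0.0, 1.0]])

# Vectors orthogonal to every alpha_i form the null space of this matrix,
# which is (n - rank)-dimensional, i.e. 1-dimensional here.
_, s, vt = np.linalg.svd(alphas)
print(alphas.shape[1] - np.sum(s > 1e-10))  # 1

beta1 = vt[-1]        # spans the orthogonal complement
beta2 = -2.5 * beta1  # any other vector orthogonal to all alpha_i is a multiple of beta1
print(np.allclose(alphas @ beta1, 0), np.allclose(alphas @ beta2, 0))  # True True

# alpha_1, alpha_2, alpha_3, beta1 are linearly independent: rank 4.
print(np.linalg.matrix_rank(np.vstack([alphas, beta1])))  # 4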

Let the n-dimensional vectors a1, a2 be linearly independent and a3, a4 be linearly independent. If each of a1, a2 is orthogonal to each of a3, a4, prove that a1, a2, a3, a4 are linearly independent.

Restated (writing b1, b2 for a3, a4): the n-dimensional vectors a1, a2 are linearly independent, b1, b2 are linearly independent, and each of a1, a2 is orthogonal to each of b1, b2. Prove that a1, a2, b1, b2 are linearly independent.
Let x1a1 + x2a2 + y1b1 + y2b2 = 0; we show that x1 = x2 = y1 = y2 = 0.
Then x1a1 + x2a2 = -y1b1 - y2b2.
Since each of a1, a2 is orthogonal to each of b1, b2,
x1a1 + x2a2 is orthogonal to b1 and b2,
and therefore also to -y1b1 - y2b2; but the two sides are equal, so this vector is orthogonal to itself.
A real vector orthogonal to itself has zero length, so x1a1 + x2a2 = -y1b1 - y2b2 = 0 (premise: real vectors).
Since a1 and a2 are linearly independent, x1a1 + x2a2 = 0 gives x1 = x2 = 0.
Since b1 and b2 are linearly independent, -y1b1 - y2b2 = 0 gives y1 = y2 = 0.
So a1, a2, b1, b2 are linearly independent.
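
The same conclusion checked numerically (a sketch; in this made-up example a1, a2 live in the first two coordinates and b1, b2 in the last two, so each ai is orthogonal to each bj):

import numpy as np

a1 = np.array([1.0, 1.0, 0.0, 0.0])
a2 = np.array([1.0, 2.0, 0.0, 0.0])   # a1, a2 linearly independent
b1 = np.array([0.0, 0.0, 1.0, 1.0])
b2 = np.array([0.0, 0.0, 1.0, 3.0])   # b1, b2 linearly independent

M = np.vstack([a1, a2, b1, b2])
# Each ai is orthogonal to each bj, and the four vectors together have rank 4.
print(np.linalg.matrix_rank(M))  # 4 -> a1, a2, b1, b2 are linearly independent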