
Linear dependence. Basis. The dimension of a linear space. The existence of a basis of a vector space. Completing a linearly independent system of a subspace to a basis of the entire space.

Definition. A system of elements x1, ..., xq of a linear space V is called linearly dependent if there exist numbers α1, ..., αq, not all equal to zero, such that

α1 x1 + ... + αq xq = 0. (1)

If equality (1) holds only for α1 = ... = αq = 0, then the system of elements x1, ..., xq is called linearly independent.

The following statements hold.

Theorem 1. A system of elements x1, ..., xq (q ≥ 2) is linearly dependent if and only if at least one of its elements can be represented as a linear combination of the others.

Proof. Suppose first that the system x1, ..., xq is linearly dependent. Assume for definiteness that in equality (1) the coefficient αq is different from zero. Moving all terms except the last to the right-hand side and dividing by αq ≠ 0, we obtain that the element xq is a linear combination of the elements x1, ..., x_{q-1}:

xq = -(α1/αq) x1 - ... - (α_{q-1}/αq) x_{q-1}.

Conversely, if one of the elements is equal to a linear combination of the others, then, moving it to the left-hand side, we obtain a linear combination in which not all coefficients are zero (the moved element enters with coefficient -1 ≠ 0). Hence the system x1, ..., xq is linearly dependent.

Theorem 2. Let the system of elements x1, ..., xq be linearly independent and y = α1 x1 + ... + αq xq. Then the coefficients α1, ..., αq are determined by the element y uniquely.

Proof. Let also y = β1 x1 + ... + βq xq. Subtracting one equality from the other, we obtain (α1 - β1) x1 + ... + (αq - βq) xq = 0. From the linear independence of the elements x1, ..., xq it follows that αi - βi = 0 and, hence, αi = βi for all i.

Theorem 3. A system of elements containing a linearly dependent subsystem is linearly dependent.

Proof. Let the first q elements of the system x1, ..., xq, x_{q+1}, ..., xm be linearly dependent. Then there is a linear combination of these elements equal to zero in which not all of the coefficients α1, ..., αq are zero. Adding the elements x_{q+1}, ..., xm with zero coefficients, we obtain a linear combination of the whole system that equals zero and in which not all coefficients are zero.

Example. Vectors from V3 are linearly dependent if and only if they are coplanar (Fig. 5).

Definition. An ordered system of elements e1, ..., en of a linear space V is called a basis of this linear space if the elements e1, ..., en are linearly independent and each element of V can be represented as their linear combination. Ordering means here that each element is assigned a definite (ordinal) number; from one system of n elements one can build n! ordered systems.

Example. Let a1, a2, a3 be a triple of non-coplanar vectors from V3 (Fig. 6). Then the ordered triples (a1, a2, a3), (a2, a3, a1), etc., are different bases.

Let c = (e1 ... en) be a basis of the space V. Then for any element x from V there is a set of numbers ξ1, ..., ξn such that x = ξ1 e1 + ... + ξn en. By Theorem 2 the numbers ξ1, ..., ξn — the coordinates of the element x in the basis c — are determined uniquely.

Let us see what happens to the coordinates of elements under linear operations. If x = ξ1 e1 + ... + ξn en and y = η1 e1 + ... + ηn en, then x + y = (ξ1 + η1) e1 + ... + (ξn + ηn) en, and λx = (λξ1) e1 + ... + (λξn) en for any number λ. Thus, when elements are added, their corresponding coordinates are added, and when an element is multiplied by a number, all its coordinates are multiplied by this number.

It is convenient to write the coordinates of an element as a column; for example, x(c) = (ξ1, ..., ξn)^T is the coordinate column of the element x in the basis c.

Decompose an arbitrary system of elements x1, ..., xq with respect to the basis c and consider the coordinate columns of the elements x1, ..., xq in this basis.

Theorem 4. A system of elements x1, ..., xq is linearly dependent if and only if the system of their coordinate columns in some basis is linearly dependent.

Proof. Let at least one of the coefficients αk in equality (1) differ from zero.
Writing this in more detail and passing in equality (1) to coordinate columns with respect to the basis c, we obtain, due to the uniqueness of the decomposition of an element with respect to the basis,

α1 x1(c) + ... + αq xq(c) = 0, (2)

i.e. a linear combination of the coordinate columns of the elements x1, ..., xq is equal to the zero column (with the same coefficients α1, ..., αq). This means that the system of coordinate columns is linearly dependent. Conversely, if equality (2) holds, then, carrying out the reasoning in reverse order, we obtain formula (1).

Thus, the vanishing of some nontrivial (at least one of the coefficients is different from zero) linear combination of elements of a linear space is equivalent to the fact that a nontrivial linear combination of their coordinate columns (with the same coefficients) is equal to the zero column.

Theorem 5. Let a basis of the linear space V consist of n elements. Then any system of m elements, where m > n, is linearly dependent.

Proof. By Theorem 3 it suffices to consider the case m = n + 1. Let x1, ..., x_{n+1} be arbitrary elements of the space V. Decompose each element with respect to the basis c and write the coordinates of the elements x1, ..., x_{n+1} in the form of a matrix K, taking as its j-th column the coordinate column of the element xj. We obtain a matrix of n rows and n + 1 columns. Since the rank of the matrix K does not exceed the number of its rows, the columns of the matrix K (there are n + 1 of them) are linearly dependent. And since these are the coordinate columns of the elements, by Theorem 4 the system of elements x1, ..., x_{n+1} is also linearly dependent.

Corollary. All bases of a linear space V consist of the same number of elements.

Proof. Let the basis c consist of n elements and the basis c' of n' elements. By the theorem just proved, from the linear independence of the system e'1, ..., e'_{n'} we conclude that n' ≤ n. Interchanging the bases c and c', by the same theorem we obtain n ≤ n'. Thus n = n'.

Definition. The dimension of a linear space V is the number of elements of a basis of this space; it is denoted dim V.

Example 1. A basis of the coordinate space R^n is formed by the elements e1 = (1, 0, ..., 0), e2 = (0, 1, ..., 0), ..., en = (0, 0, ..., 1). The system e1, ..., en is linearly independent: from the equality α1 e1 + ... + αn en = (α1, ..., αn) = (0, ..., 0) we obtain α1 = ... = αn = 0. In addition, any element ξ = (ξ1, ..., ξn) from R^n can be written as a linear combination of these elements: ξ = ξ1 e1 + ... + ξn en. Thus the dimension of the space R^n is equal to n.

Example 2. A homogeneous linear system having nonzero solutions has a fundamental system of solutions (FSS). The FSS is a basis of the linear space of solutions of the homogeneous system. The dimension of this linear space is equal to the number of elements of the FSS, i.e. n - r, where r is the rank of the matrix of coefficients of the homogeneous system and n is the number of unknowns.

Example 3. The dimension of the linear space M_n of polynomials of degree at most n is equal to n + 1.

Proof. Since any polynomial P(t) of degree at most n has the form P(t) = a0 + a1 t + ... + an t^n, it suffices to show the linear independence of the elements e1 = 1, e2 = t, ..., e_{n+1} = t^n. Consider the equality

a0 + a1 t + ... + an t^n = 0, (3)

where t is arbitrary. Setting t = 0, we obtain a0 = 0. Differentiating equality (3) with respect to t and again setting t = 0, we obtain a1 = 0. Continuing this process, we successively convince ourselves that a0 = a1 = ... = an = 0. This means that the system of elements e1 = 1, ..., e_{n+1} = t^n is linearly independent. Consequently, the desired dimension is equal to n + 1.
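To make Theorems 4 and 5 concrete, here is a minimal numerical sketch (the function name and data are illustrative assumptions, not taken from the text): a system is linearly dependent exactly when the matrix of its coordinate columns has rank less than the number of elements.

```python
# Illustrative sketch: test linear dependence through coordinate columns
# (Theorem 4), and observe that any 3 elements of a 2-dimensional space
# are dependent (Theorem 5).  Data chosen for the example.
import numpy as np

def is_linearly_dependent(vectors):
    """vectors: coordinate columns of the elements in some basis."""
    A = np.column_stack(vectors)                  # n x q matrix of coordinate columns
    return np.linalg.matrix_rank(A) < A.shape[1]  # rank < q  <=>  dependent

x1, x2, x3 = np.array([1., 0.]), np.array([0., 1.]), np.array([2., 3.])
print(is_linearly_dependent([x1, x2]))      # False: an independent pair
print(is_linearly_dependent([x1, x2, x3]))  # True: 3 > n = 2, as Theorem 5 predicts
```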
Convention. Everywhere further in this chapter it is assumed, unless otherwise specified, that the dimension of the linear space V is equal to n.

It is clear that if W is a subspace of the n-dimensional linear space V, then dim W ≤ n. We show that in the n-dimensional linear space V there are linear subspaces of any dimension k ≤ n. Let c = (e1 ... en) be a basis of the space V. It is easy to verify that the linear span of the elements e1, ..., ek has dimension k.

Theorem 6 (on completion to a basis). Let the system of elements a1, ..., ak of a linear space V of dimension n be linearly independent and k < n. Then there exist elements a_{k+1}, ..., an in the space V such that the system a1, ..., an is a basis of V.

Proof. Let b be an arbitrary element of the linear space V. If the system a1, ..., ak, b is linearly dependent, then in a nontrivial linear combination equal to zero the coefficient of b must be nonzero (due to the linear independence of the system a1, ..., ak), so b is a linear combination

b = β1 a1 + ... + βk ak. (4)

If a decomposition of the form (4) could be written for every element b of the space V, then the original system a1, ..., ak would be a basis by definition. But, due to the condition k < n, this is impossible. Therefore there must be an element a_{k+1} ∈ V such that the extended system a1, ..., ak, a_{k+1} is linearly independent. If k + 1 = n, this system is a basis of the space V. If k + 1 < n, the previous arguments are repeated for the system a1, ..., a_{k+1}. In this way, any given linearly independent system of elements can be completed to a basis of the whole space V.

Example. Complete the system of two vectors a1 = (1, 2, 0, 1), a2 = (-1, 1, 0, ...) of the space R^4 to a basis of this space.

Solution. Take in the space R^4 two more vectors a3 and a4 (for example, suitable coordinate unit vectors) and show that the system of vectors a1, a2, a3, a4 is a basis of R^4: the rank of the matrix whose rows are the coordinates of the vectors a1, a2, a3, a4 is equal to four. This means that the rows of the matrix, and therefore the vectors a1, a2, a3, a4, are linearly independent.

This approach is used in general: to complete a linearly independent system of elements to a basis of the space, the matrix formed by their coordinate rows is reduced by elementary row transformations to trapezoidal form, and then supplemented by n - k rows of the form (0 ... 0 1 0 ... 0) so that the rank of the resulting matrix is equal to n.

The following assertion also holds.

Theorem 7. Let W be a linear subspace of the linear space V. Then dim W ≤ dim V, and dim W = dim V if and only if W = V.

Replacing the basis. Let c = (e1 ... en) and c' = (e'1 ... e'n) be bases of the linear space V. Decompose the elements of the basis c' with respect to the basis c:

e'j = s1j e1 + s2j e2 + ... + snj en, j = 1, ..., n. (1)

It is convenient to write these relations in matrix form: c' = cS. The matrix S = (sij) is called the matrix of the transition from the basis c to the basis c'.

Properties of the transition matrix.

1. det S ≠ 0. The proof is carried out by contradiction: from the equality det S = 0 the linear dependence of the columns of the matrix S would follow. These columns are the coordinate columns of the elements e'1, ..., e'n in the basis c. Therefore (by Theorem 4) the elements e'1, ..., e'n would be linearly dependent. The latter contradicts the fact that c' is a basis. Hence the assumption det S = 0 is false.

2. If ξ1, ..., ξn and ξ'1, ..., ξ'n are the coordinates of an element x in the bases c and c' respectively, then

ξi = si1 ξ'1 + ... + sin ξ'n, i = 1, ..., n,

i.e. x(c) = S x(c'). Indeed, replacing in x = ξ'1 e'1 + ... + ξ'n e'n the elements e'j by their expressions (1), we obtain, due to the uniqueness of the decomposition of an element with respect to a basis, the formulas above; passing to the matrix form of the equalities found, we verify property 2.

3. S^{-1} is the matrix of the transition from the basis c' to the basis c.
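The transition-matrix properties can be checked numerically. The sketch below uses a basis pair chosen by hand (an illustrative assumption, not the text's data): the j-th column of S holds the coordinates of e'_j in the old basis, det S ≠ 0, x(c) = S x(c'), and S^{-1} converts coordinates back.

```python
# Illustrative check of properties 1-3 of the transition matrix.
import numpy as np

S = np.array([[1., 1.],
              [0., 1.]])          # e'_1 = e1, e'_2 = e1 + e2 (columns = coords in c)
assert np.linalg.det(S) != 0      # property 1: det S != 0

x_new = np.array([2., 3.])        # coordinates of x in the basis c'
x_old = S @ x_new                 # property 2: x(c) = S x(c')
print(x_old)                      # [5. 3.]
print(np.linalg.inv(S) @ x_old)   # property 3: S^{-1} leads back to [2. 3.]
```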

Let V be a vector space over a field P, and let S be a system of vectors from V.

Definition 1. A basis of a system of vectors S is an ordered linearly independent subsystem b1, b2, ..., br of the system S such that every vector of the system S is a linear combination of the vectors b1, b2, ..., br.

Definition 2. The rank of a system of vectors S is the number of vectors in a basis of the system S. It is denoted r = rang S.

If S = {0}, then the system has no basis, and it is assumed that rang S = 0.

Example 1. Let the system of vectors a1 = (1, 2), a2 = (2, 3), a3 = (3, 5), a4 = (1, 3) be given. The vectors a1, a2 form a basis of this system, since they are linearly independent (see Example 3.1) and a3 = a1 + a2, a4 = 3a1 - a2. The rank of this system of vectors is two.
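A quick numerical check of this example (the check itself is an editorial sketch; the data are those of Example 1):

```python
# Verify: the rank of the system is 2, a3 = a1 + a2, a4 = 3*a1 - a2.
import numpy as np

a1, a2, a3, a4 = map(np.array, [(1, 2), (2, 3), (3, 5), (1, 3)])
print(np.linalg.matrix_rank(np.column_stack([a1, a2, a3, a4])))  # 2
print(np.array_equal(a3, a1 + a2))      # True
print(np.array_equal(a4, 3*a1 - a2))    # True
```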

Theorem 1 (theorem on bases). Let S be a finite system of vectors from V, S ≠ {0}. Then the following statements hold.

1° Any linearly independent subsystem of the system S can be completed to a basis of S.

2° The system S has a basis.

3° Any two bases of the system S contain the same number of vectors, i.e. the rank of the system does not depend on the choice of basis.

4° If r = rang S, then any r linearly independent vectors of S form a basis of the system S.

5° If r = rang S, then any k > r vectors of the system S are linearly dependent.

6° Any vector a ∈ S is uniquely linearly expressed through the vectors of a basis, i.e., if b1, b2, ..., br is a basis of the system S, then

a = α1 b1 + α2 b2 + ... + αr br;   α1, α2, ..., αr ∈ P,   (1)

and such a representation is unique.

By virtue of 5°, a basis is a maximal linearly independent subsystem of the system S, and the rank of the system S is the number of vectors in such a subsystem.

The representation of the vector a in the form (1) is called the decomposition of the vector a with respect to the basis vectors, and the numbers α1, α2, ..., αr are called the coordinates of the vector a in this basis.

Proof. 1° Let b1, b2, ..., bk be a linearly independent subsystem of the system S. If every vector of the system S is linearly expressed through the vectors of our subsystem, then by definition it is a basis of the system S.

If there is a vector in the system S that is not linearly expressed through the vectors b1, b2, ..., bk, we denote it by b_{k+1}. Then the system b1, b2, ..., bk, b_{k+1} is linearly independent. If every vector of the system S is linearly expressed through the vectors of this subsystem, then by definition it is a basis of the system S.

If there is a vector in the system S that is not linearly expressed through b1, b2, ..., bk, b_{k+1}, we repeat the reasoning. Continuing this process, we either arrive at a basis of the system S or increase the number of vectors in a linearly independent system by one. Since the system S contains a finite number of vectors, the second alternative cannot continue indefinitely, and at some step we obtain a basis of the system S.

2° S is a finite system of vectors and S ≠ {0}. Hence there is a vector b1 ≠ 0 in the system S, which by itself forms a linearly independent subsystem of S. By part 1°, it can be completed to a basis of the system S. Thus the system S has a basis.

3° Suppose that the system S has two bases:

b1, b2, ..., br, (2)

c1, c2, ..., cs. (3)

By the definition of a basis, the system of vectors (2) is linearly independent and (2) ⊆ S. Further, by the definition of a basis, every vector of system (2) is a linear combination of vectors of system (3). Then, by the main theorem on two systems of vectors, r ≤ s. Similarly one proves that s ≤ r. From these two inequalities it follows that r = s.

4° Let r = rang S and let a1, a2, ..., ar be a linearly independent subsystem of S. We show that it is a basis of the system S. If it is not a basis, then by part 1° it can be completed to a basis, and we would obtain a basis a1, a2, ..., ar, a_{r+1}, ..., a_{r+t} containing more than r vectors, which contradicts part 3°.

5° If k vectors a1, a2, ..., ak (k > r) of the system S were linearly independent, then by part 1° this system of vectors could be completed to a basis, and we would obtain a basis containing more than r vectors. This contradicts what was proved in part 3°.

6° Let b1, b2, ..., br be a basis of the system S. By the definition of a basis, any vector a ∈ S is a linear combination of the basis vectors:

a = α1 b1 + α2 b2 + ... + αr br.

To prove the uniqueness of this representation, suppose that there is another one:

a = β1 b1 + β2 b2 + ... + βr br.

Subtracting one equality from the other, we find:

0 = (α1 - β1) b1 + (α2 - β2) b2 + ... + (αr - βr) br.

Since the basis b1, b2, ..., br is a linearly independent system, all the coefficients αi - βi = 0, i = 1, 2, ..., r. Consequently αi = βi, i = 1, 2, ..., r, and the uniqueness is proved.


Lectures on algebra and geometry. Semester 2.

Lecture 23. The basis of a vector space.

Summary: criterion for the linear dependence of a system of nonzero vectors; subsystems of a system of vectors; generating system of vectors; minimal generating system and maximal linearly independent system; basis of a vector space and its equivalent definitions; dimension of a vector space; finite-dimensional vector space and the existence of its basis; completion to a basis.

p.1. Criterion for the linear dependence of the system of nonzero vectors.

Theorem. A system of nonzero vectors is linearly dependent if and only if there is a vector of the system that is linearly expressed through the preceding vectors of this system.

Proof. Let the system a1, a2, ..., an consist of nonzero vectors and be linearly dependent. Consider the system of one vector: a1. Since a1 ≠ 0, the system a1 is linearly independent. Adjoin to it the vector a2; if the resulting system a1, a2 is linearly independent, adjoin the next one: a1, a2, a3. And so on. We continue until we obtain a linearly dependent system a1, a2, ..., ak, where k ≤ n. Such a number k will be found, since the original system a1, ..., an is linearly dependent by hypothesis.

So, by construction we have obtained a linearly dependent system a1, ..., ak such that the system a1, ..., a_{k-1} is linearly independent.

The system a1, ..., ak represents the zero vector nontrivially, i.e. there is a nonzero set of scalars λ1, ..., λk such that

λ1 a1 + λ2 a2 + ... + λk ak = 0,

where the scalar λk ≠ 0.

Indeed, otherwise, if λk = 0, we would have a nontrivial representation of the zero vector by the linearly independent system a1, ..., a_{k-1}, which is impossible.

Dividing the last equality by the nonzero scalar λk, we can express from it the vector ak:

ak = -(λ1/λk) a1 - ... - (λ_{k-1}/λk) a_{k-1}.

Since the converse statement is obvious, the theorem is proved.
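The construction in this proof is easy to run numerically. A sketch under the assumption that the vectors are given by their coordinates (the function name and data are illustrative):

```python
# Find the first vector that is linearly expressed through the
# preceding vectors of the system, via rank tests.
import numpy as np

def first_dependent_index(vectors):
    """Return the first k with vectors[k] in span(vectors[0..k-1]), else None."""
    height = len(vectors[0])
    prev = np.empty((height, 0))                      # span of "no vectors yet"
    for k, v in enumerate(vectors):
        rank_prev = np.linalg.matrix_rank(prev) if prev.size else 0
        cur = np.column_stack([prev, v])
        if np.linalg.matrix_rank(cur) == rank_prev:   # rank did not grow, so
            return k                                  # v is expressed through predecessors
        prev = cur
    return None                                       # the system is linearly independent

vs = [np.array([1., 0., 0.]), np.array([0., 1., 0.]), np.array([1., 1., 0.])]
print(first_dependent_index(vs))  # 2: the third vector is the sum of the first two
```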

p.2. Subsystems of a system of vectors.

Definition. Any non-empty subset of a system of vectors a1, a2, ..., an is called a subsystem of this system of vectors.

Example. Let a1, a2, ..., a10 be a system of 10 vectors. Then, for instance, the systems of vectors a1, a3, a10 and a2, a4, a6, a8 are subsystems of this system of vectors.

Theorem. If a system of vectors contains a linearly dependent subsystem, then the system of vectors itself is also linearly dependent.

Proof. Let a system of vectors a1, ..., an be given and, for definiteness, let its subsystem a1, ..., ak, where k ≤ n, be linearly dependent. Then this subsystem represents the zero vector nontrivially:

λ1 a1 + ... + λk ak = 0,

where among the coefficients λ1, ..., λk there is at least one not equal to zero. But then the following equality is a nontrivial representation of the zero vector:

λ1 a1 + ... + λk ak + 0·a_{k+1} + ... + 0·an = 0,

whence, by definition, the linear dependence of the system a1, ..., an follows, Q.E.D.

The theorem is proved.

Corollary. Any subsystem of a linearly independent system of vectors is linearly independent.

Proof. Suppose the contrary: let some subsystem of the given system be linearly dependent. Then the linear dependence of the whole system follows from the theorem, which contradicts the hypothesis.

The corollary is proved.

p.3. Systems of columns of the arithmetic vector space of columns.

From the results of the previous paragraphs, the following statements are obtained as special cases.

1) A system of columns is linearly dependent if and only if at least one column of the system is linearly expressed through the other columns of this system.

2) A system of columns is linearly independent if and only if no column of the system is linearly expressed through the other columns of this system.

3) A system of columns containing a zero column is linearly dependent.

4) A system of columns containing two equal columns is linearly dependent.

5) A system of columns containing two proportional columns is linearly dependent.

6) A system of columns containing a linearly dependent subsystem is linearly dependent.

7) Any subsystem of a linearly independent system of columns is linearly independent.

The only notion that may require clarification here is that of proportional columns.

Definition. Two nonzero columns A = (a1, a2, ..., an)^T and B = (b1, b2, ..., bn)^T are called proportional if there is a scalar λ such that B = λA, i.e.

b1 = λa1, b2 = λa2, ..., bn = λan.

Example. A system of columns whose first two columns are proportional, for instance (1, 2)^T and (2, 4)^T, is linearly dependent.

Comment. We already know (see Lecture 21) that a determinant is equal to zero if the system of its columns (rows) is linearly dependent. Later the converse statement will be proved: if a determinant is equal to zero, then the system of its columns and the system of its rows are linearly dependent.
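A numerical illustration of the already-proved direction (the matrix is chosen for the example):

```python
# A matrix with two proportional columns has linearly dependent columns,
# hence zero determinant.
import numpy as np

A = np.array([[1., 2., 0.],
              [2., 4., 1.],
              [3., 6., 5.]])      # second column = 2 * first column
print(np.linalg.det(A))           # 0.0 (up to rounding)
print(np.linalg.matrix_rank(A))   # 2 < 3: the column system is dependent
```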

p.4. The basis of a vector space.

Definition. A system of vectors e1, e2, ..., en of a vector space V over a field K is called a generating (spanning) system of vectors of this vector space if it represents any of its vectors, i.e. if for every vector x ∈ V there is a set of scalars λ1, ..., λn such that x = λ1 e1 + ... + λn en.

Definition. A system of vectors of a vector space is called a minimal generating system if, upon removal of any vector from this system, it ceases to be a generating system.

Comment. From the definition it follows immediately that if a generating system of vectors is not minimal, then there is at least one vector of the system upon whose removal the remaining system of vectors will still be generating.

Lemma (on a linearly dependent generating system).

If a system of vectors is generating and one of its vectors is linearly expressed through the other vectors of the system, then this vector can be removed from the system and the remaining system of vectors will be generating.

Proof. Let the system e1, e2, ..., en be linearly dependent and generating, and let one of its vectors be linearly expressed through the other vectors of this system.

For definiteness, and for ease of notation, assume that

en = λ1 e1 + ... + λ_{n-1} e_{n-1}.

Since e1, ..., en is a generating system, for any vector x there is a set of scalars μ1, ..., μn such that

x = μ1 e1 + ... + μ_{n-1} e_{n-1} + μn en.

From here we get

x = (μ1 + μn λ1) e1 + ... + (μ_{n-1} + μn λ_{n-1}) e_{n-1},

i.e. any vector x is linearly expressed through the vectors of the system e1, ..., e_{n-1}, which means that it is a generating system, Q.E.D.

Corollary 1. A linearly dependent generating system of vectors is not minimal.

Proof. Follows immediately from the lemma and the definition of a minimal generating system of vectors.

Corollary 2. A minimal generating system of vectors is linearly independent.

Proof. Assuming the contrary, we arrive at a contradiction with Corollary 1.

Definition. A system of vectors of a vector space is called a maximal linearly independent system if, upon adjoining any vector to this system, it becomes linearly dependent.

Comment. From the definition it follows immediately that if a system is linearly independent but not maximal, then there is a vector upon whose adjunction to the system a linearly independent system is obtained.

Definition. A basis of a vector space V over a field K is an ordered system of its vectors that represents any vector of the vector space in a unique way.

In other words, a system of vectors e1, e2, ..., en of the vector space V over the field K is called its basis if for every vector x ∈ V there is a unique set of scalars λ1, ..., λn such that x = λ1 e1 + ... + λn en.
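For the arithmetic space of columns, the unique set of scalars in this definition can be computed by solving a linear system; the basis below is an illustrative choice, not part of the text:

```python
# Coordinates of x in a basis of R^2: the unique solution of E @ lam = x,
# where the columns of E are the basis vectors.
import numpy as np

E = np.array([[1., 1.],
              [0., 1.]])      # basis e1 = (1,0)^T, e2 = (1,1)^T as columns
x = np.array([3., 2.])
lam = np.linalg.solve(E, x)   # unique, since det E != 0
print(lam)                    # [1. 2.]: x = 1*e1 + 2*e2
```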

Theorem (on four equivalent definitions of a basis).

Let e1, e2, ..., en be an ordered system of vectors of a vector space. Then the following statements are equivalent:

1. The system e1, ..., en is a basis.

2. The system e1, ..., en is a linearly independent and generating system of vectors.

3. The system e1, ..., en is a maximal linearly independent system of vectors.

4. The system e1, ..., en is a minimal generating system of vectors.

Proof.

(1 ⇒ 2). Let the system of vectors e1, ..., en be a basis. From the definition of a basis it follows immediately that this system of vectors is a generating system of the vector space, so we only need to prove its linear independence.

Suppose that this system of vectors is linearly dependent. Then there are two representations of the zero vector, a trivial one and a nontrivial one, which contradicts the definition of a basis.

(2 ⇒ 3). Let the system of vectors e1, ..., en be linearly independent and generating. We need to prove that this linearly independent system is maximal.

Suppose the contrary: let this linearly independent system of vectors not be maximal. Then, by the comment above, there is a vector that can be adjoined to this system so that the resulting system of vectors remains linearly independent. On the other hand, the vector adjoined to the system can be represented as a linear combination of the original system of vectors, since that system is generating.

So in the new, extended system of vectors one of the vectors is linearly expressed through the other vectors of this system. Such a system of vectors is linearly dependent. We have obtained a contradiction.

(3 ⇒ 4). Let the system of vectors e1, ..., en of the vector space be maximal linearly independent. We prove that it is a minimal generating system.

a) First we prove that it is a generating system.

Note that, due to linear independence, the system e1, ..., en does not contain the zero vector. Let x be an arbitrary nonzero vector. Adjoin it to the given system of vectors: e1, ..., en, x. The resulting system of nonzero vectors is linearly dependent, since the initial system of vectors is maximal linearly independent. Hence, by the criterion of p.1, this system contains a vector that is linearly expressed through the preceding ones. In the original linearly independent system e1, ..., en none of the vectors can be expressed through the preceding ones; therefore only the vector x can be linearly expressed through the preceding ones. Thus the system e1, ..., en represents any nonzero vector. It remains to note that this system obviously represents the zero vector as well, i.e. the system e1, ..., en is generating.

b) Now we prove its minimality. Suppose the contrary. Then one of the vectors of the system could be removed so that the remaining system of vectors would still be a generating system; consequently, the removed vector would be linearly expressed through the remaining vectors of the system, which contradicts the linear independence of the original system of vectors.

(4 ⇒ 1). Let the system of vectors e1, ..., en of the vector space be a minimal generating system. Then it represents any vector of the vector space; we need to prove the uniqueness of the representation.

Suppose the contrary: let some vector x be linearly expressed through the vectors of this system in two different ways:

x = λ1 e1 + ... + λn en and x = μ1 e1 + ... + μn en.

Subtracting one equality from the other, we get:

0 = (λ1 - μ1) e1 + ... + (λn - μn) en.

By Corollary 2, the system e1, ..., en is linearly independent, i.e. it represents the zero vector only trivially, so all the coefficients of this linear combination must be zero: λi = μi, i = 1, ..., n.

Thus any vector x is linearly expressed through the vectors of this system in a unique way, Q.E.D.

The theorem is proved.

p.5. The dimension of the vector space.

Theorem 1 (on the number of vectors in linearly independent and generating systems of vectors). The number of vectors in any linearly independent system of vectors does not exceed the number of vectors in any generating system of vectors of the same vector space.

Proof. Let a1, a2, ..., ak be an arbitrary linearly independent system of vectors and b1, b2, ..., bm an arbitrary generating system of the same vector space. Suppose that k > m.

Since b1, ..., bm is a generating system, it represents any vector of the space, including the vector a1. Adjoin it to this system. We obtain a linearly dependent and generating system of vectors: a1, b1, b2, ..., bm. Then there is a vector of this system that is linearly expressed through the preceding vectors of this system and, by the lemma of p.4, it can be removed from the system, and the remaining system of vectors will still be generating. It cannot be the vector a1, since a1 has no predecessors in this system; hence it is one of the vectors b1, ..., bm. Removing it and renumbering, we obtain a generating system a1, b1, ..., b_{m-1}.

Since this system is generating, it represents the vector a2 and, adjoining a2 after a1, we again obtain a linearly dependent and generating system: a1, a2, b1, ..., b_{m-1}.

Next, everything is repeated. There is a vector in this system that is linearly expressed through the preceding ones, and it cannot be one of the vectors a1, a2, since the original system a1, ..., ak is linearly independent and a2 is not linearly expressed through a1. So it can only be one of the vectors b1, ..., b_{m-1}. Removing it from the system, we obtain, after renumbering, a generating system a1, a2, b1, ..., b_{m-2}. Continuing this process, after m steps we obtain the generating system of vectors a1, a2, ..., am, where m < k by our assumption. Then this system, being generating, also represents the vector a_{m+1}, which contradicts the linear independence of the system a1, ..., ak.

Theorem 1 is proved.

Theorem 2 (on the number of vectors in a basis). Any two bases of a vector space contain the same number of vectors.

Proof. Let e1, ..., en and f1, ..., fm be two arbitrary bases of a vector space. Any basis is a linearly independent and generating system of vectors.

Since the first system is linearly independent and the second is generating, then, by Theorem 1, n ≤ m.

Similarly, the second system is linearly independent and the first is generating, so m ≤ n. Hence it follows that n = m, Q.E.D.

Theorem 2 is proved.

This theorem allows us to introduce the following definition.

Definition. The dimension of a vector space V over a field K is the number of vectors in its basis.

Notation: dim V or dim_K V.

p.6. The existence of a vector space basis.

Definition. A vector space is called finite-dimensional if it has a finite generating system of vectors.

Comment. We will study only finite-dimensional vector spaces. Although we already know quite a lot about the basis of a finite-dimensional vector space, we have as yet no certainty that a basis of such a space exists at all. All previously obtained properties were derived under the assumption that a basis exists. The following theorem closes this question.

Theorem. (On the existence of the basis of the finite-dimensional vector space.) Any finite-dimensional vector space has a basis.

Proof. By hypothesis there is a finite generating system of vectors of the given finite-dimensional vector space V: e1, e2, ..., en.

Note at once that if the generating system of vectors is empty, i.e. contains not a single vector, then by definition the vector space is taken to be the zero space, i.e. V = {0}. In this case, by definition, the basis of the zero vector space is taken to be the empty basis, and its dimension is taken to be equal to zero.

Let V now be a nonzero vector space and e1, ..., en a finite generating system of nonzero vectors. If it is linearly independent, then everything is proved, since a linearly independent and generating system of vectors of a vector space is its basis. If this system of vectors is linearly dependent, then one of the vectors of this system is linearly expressed through the remaining ones and can be removed from the system, and the remaining system of vectors, by the lemma of p.4, will still be generating.

Renumber the remaining system of vectors: e1, ..., e_{n-1}. Then the reasoning is repeated: if this system is linearly independent, it is a basis. If not, then again there is a vector in this system that can be removed, and the remaining system will be generating.

Repeating this process, we cannot end up with an empty system of vectors, because in the most extreme case we would arrive at a generating system of one nonzero vector, which is linearly independent and hence a basis. Therefore at some step we arrive at a linearly independent and generating system of vectors, i.e. at a basis.

The theorem is proved.
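The proof's procedure can be sketched in code (an editorial illustration, assuming vectors are given by coordinates): instead of removing dependent vectors one by one, we equivalently keep each vector that strictly enlarges the span.

```python
# Extract a basis from a finite generating system by a rank test.
import numpy as np

def extract_basis(vectors):
    basis, rank = [], 0
    for v in vectors:
        if np.linalg.matrix_rank(np.column_stack(basis + [v])) > rank:
            basis.append(v)      # v enlarges the span, keep it
            rank += 1
    return basis                 # linearly independent and generating

gen = [np.array([1., 1., 0.]), np.array([2., 2., 0.]),
       np.array([0., 0., 1.]), np.array([1., 1., 1.])]
print(len(extract_basis(gen)))   # 2: a basis of the subspace spanned by gen
```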

Lemma (on systems of vectors in an n-dimensional vector space). Let dim V = n. Then:

1. Any system of n + 1 vectors of this space is linearly dependent.

2. Any linearly independent system of n vectors of this space is its basis.

Proof. 1) A basis is a maximal linearly independent system and, by hypothesis, contains n vectors, so the number of vectors in any linearly independent system cannot exceed n; hence any system of n + 1 vectors is linearly dependent.

2) As follows from what has just been proved, any linearly independent system of n vectors of this vector space is maximal, and therefore is a basis.

The lemma is proved.

Theorem (on completion to a basis). Any linearly independent system of vectors of a vector space can be completed to a basis of this space.

Proof. Let V be a vector space of dimension n and let a1, ..., ak be some linearly independent system of its vectors. Then k ≤ n.

If k = n, then by the previous lemma this system is a basis, and there is nothing to prove.

If k < n, then this system is not a maximal linearly independent system (otherwise it would be a basis, which is impossible, since k < n). Therefore there is a vector a_{k+1} such that the system a1, ..., ak, a_{k+1} is linearly independent.

If now k + 1 = n, then the system a1, ..., a_{k+1} is a basis.

If k + 1 < n, everything is repeated. The process of completing the system cannot continue indefinitely, because at each step we obtain a linearly independent system of vectors of the space, and by the previous lemma the number of vectors in such a system cannot exceed the dimension of the space. Consequently, at some step we arrive at a basis of this space.

The theorem is proved.
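A sketch of the completion process in coordinates (the candidate vectors and data are assumptions for illustration; any vectors keeping the system independent would do):

```python
# Complete a linearly independent system in R^n to a basis by trying
# the standard unit vectors and keeping those that preserve independence.
import numpy as np

def complete_to_basis(vectors, n):
    basis = list(vectors)
    for i in range(n):
        if len(basis) == n:
            break
        e = np.zeros(n); e[i] = 1.0               # candidate unit vector
        if np.linalg.matrix_rank(np.column_stack(basis + [e])) == len(basis) + 1:
            basis.append(e)                       # still linearly independent
    return basis

a1, a2 = np.array([1., 2., 0., 1.]), np.array([-1., 1., 1., 0.])
B = complete_to_basis([a1, a2], 4)
print(np.linalg.matrix_rank(np.column_stack(B)))  # 4: a basis of R^4
```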

p.7. Example.

1. Let K be an arbitrary field and let K^n be the arithmetic vector space of columns of height n. Then dim K^n = n. To prove this, consider the system of columns e1, e2, ..., en of this space defined below (the canonical basis): it is linearly independent and generating, and hence a basis consisting of n vectors.


Definition. The basis

e1 = (1, 0, ..., 0)^T, e2 = (0, 1, ..., 0)^T, ..., en = (0, 0, ..., 1)^T

of the arithmetic vector space K^n of columns of height n is called canonical or natural.
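Illustration (editorial sketch): the canonical basis vectors are the columns of the identity matrix, and the coordinates of any column in this basis are simply its entries.

```python
import numpy as np

n = 3
E = np.eye(n)                 # columns e1, e2, e3: the canonical basis of R^3
x = np.array([7., -2., 4.])
print(np.linalg.solve(E, x))  # [ 7. -2.  4.]: coordinates coincide with entries
```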


