Canonical Form of Reduced 3-by-3 Matrix with One Characteristic Root and with Some Zero Subdiagonal Elements

A canonical form is constructed for a reduced matrix of order 3 with one characteristic root and with some zero subdiagonal elements. Thereby, the problem of classification with respect to semiscalar equivalence is solved for a distinguished class of polynomial matrices.


Introduction
Let a matrix $A(x) \in M(n, \mathbb{C}[x])$ have a unit first invariant factor and only one characteristic root (not counting multiplicity). Without loss of generality, we assume that this unique characteristic root is zero; the invariant factors of $A(x)$ are then $1, x^{n_1}, x^{n_2}$. Consider the transformation $A(x) \to S A(x) Q(x) = B(x)$, where $S \in GL(n, \mathbb{C})$ and $Q(x) \in GL(n, \mathbb{C}[x])$. In accordance with [1] (see also [2]), the matrices $A(x)$, $B(x)$ are called semiscalarly equivalent (abbreviation: ss.e.; notation: $A(x) \approx B(x)$). In [3], the author proved that in the class $\{S A(x) Q(x)\}$, where $A(x) \in M(3, \mathbb{C}[x])$, there exists a matrix of the reduced form (1).

Here $\underline{\deg}$ denotes the junior degree of a polynomial. The junior degree of a polynomial $a(x) \in \mathbb{C}[x]$, $a(x) \not\equiv 0$, is the least degree of a monomial (with nonzero coefficient) of this polynomial; notation: $\underline{\deg}\, a$. The monomial of degree $\underline{\deg}\, a$ and its coefficient are called the junior term and the junior coefficient, respectively. We denote the junior degree of the polynomial $a(x) \equiv 0$ by the symbol $+\infty$; for the subdiagonal elements $a_1(x)$, $a_2(x)$, $a_3(x)$ of the reduced matrix we write $m_i = \underline{\deg}\, a_i$. If both elements $a_1(x)$, $a_2(x)$ of the matrix $A(x)$ are nonzero, then we may take their junior coefficients to be one. In the opposite case, we may take the junior coefficients of the nonzero subdiagonal elements of the matrix $A(x)$ to be one. Such a matrix $A(x)$ is called in [3] a reduced matrix.

The purpose of this paper is to construct the canonical form of the matrix $B(x) \in M(3, \mathbb{C}[x])$ in the class $\{S A(x) Q(x)\}$ of ss.e. matrices. For this purpose we start from the reduced matrix $A(x)$ of the form (1). This article is a continuation of the work [3]. The case $a_1(x) = a_2(x) = a_3(x) \equiv 0$ is trivial: then the Smith form is canonical for the matrix $A(x)$ in the class $\{S A(x) Q(x)\}$. If $n_1 = n_2$, then $a_2(x) \equiv 0$; this case is considered in the author's paper [4]. For this reason, in the sequel we take $n_1 \neq n_2$. In this paper we consider the case when some of the elements $a_1(x)$, $a_2(x)$, $a_3(x)$ of the matrix $A(x)$ are equal to zero and at least one of them is different from zero.
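To make the notion of junior degree concrete, here is a small illustrative sketch (the function names are ours, not from the paper): polynomials are represented as coefficient lists, and the zero polynomial receives junior degree $+\infty$, encoded as `math.inf`.

```python
import math

def junior_degree(coeffs):
    """Junior degree: least degree with a nonzero coefficient.

    coeffs[k] is the coefficient of x**k.  For the zero
    polynomial the junior degree is +infinity by convention.
    """
    for k, c in enumerate(coeffs):
        if c != 0:
            return k
    return math.inf

def junior_coefficient(coeffs):
    """Junior coefficient, or None for the zero polynomial."""
    d = junior_degree(coeffs)
    return None if d == math.inf else coeffs[d]

# a(x) = 2x^2 + 5x^4: junior degree 2, junior coefficient 2
a = [0, 0, 2, 0, 5]
print(junior_degree(a), junior_coefficient(a))   # 2 2
print(junior_degree([0, 0, 0]))                  # inf
```

The junior term of $a(x)$ above is $2x^2$; normalizing a reduced matrix means dividing so that this junior coefficient becomes 1.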
Recall that the vanishing of any of the subdiagonal elements of the matrix $A(x)$ is an invariant of semiscalar equivalence (see [3]). In the class of ss.e. matrices there is a reduced matrix $B(x)$ in which the nonzero subdiagonal element does not contain the distinguished monomial, $b_2(x) = a_2(x)/\alpha_1$, and the remaining subdiagonal element is identically zero. The matrix $B(x)$ is uniquely defined.
Uniqueness. It suffices to give the proof for one of the cases $k = 1$, $k = 2$, or $k = 3$; the proof in the two other cases is analogous. Assume that the reduced matrices $A(x)$ and $B(x)$ have the forms (1) and (2), respectively.

The Canonical Form of a Reduced Matrix with One Zero Subdiagonal Element
Let us now consider the ss.e. transformations with the left transforming matrix of one of the forms (7)-(9), applied to the reduced matrices $A(x)$ and $B(x)$ of the forms (1) and (2). We determine the auxiliary polynomials $a_{32}(x)$ and $a_{33}(x)$, and form the columns $\bar a_3$, $\bar a_{32}$, and $\bar a_{33}$ from the coefficients of the polynomials $a_3(x)$, $a_{32}(x)$, and $a_{33}(x)$, respectively. In doing so, we place in the first positions of these columns the coefficients of the monomials of degree $\underline{\deg}\, a_3$; below we arrange every succeeding coefficient in order of growth of the degrees of the monomials, up to the $(n_2-1)$-coefficient. At the same time we do not omit zero coefficients. Take into consideration the polynomials $a_{22}(x) :\equiv (a_2(x))^2 \pmod{x^{n_2-n_1}}$ and $a_{23}(x)$. With the coefficients of the polynomials $a_2(x)$, $a_3(x)$, $a_{22}(x)$, and $a_{23}(x)$ we form the columns $\bar a_2$, $\bar a_{30}$, $\bar a_{22}$, and $\bar a_{23}$, respectively, of height $n_2 - \underline{\deg}\, a_2$. Here we place in the first positions the coefficients of the monomials of degree $\underline{\deg}\, a_2 - n_1$; further we put all coefficients of the monomials of higher degrees up to degree $n_2 - n_1 - 1$. From these columns we build for $A(x)$ the coefficient matrices; evidently, each column in these matrices is composed of the coefficients of the monomials of the same degrees. By complete analogy we build the matrices for $B(x)$. (1) In the polynomial $a_3(x)$ the $(2m_3)$- and $m_3$-monomials, respectively, are absent (here $m_i = \underline{\deg}\, a_i$). In the first column of the matrix (26), composed of the coefficients of the polynomials $a_3(x)$, $a_2(x)$, there are zero elements, which correspond to the maximal system of the first linearly independent rows of the corresponding submatrix.

Existence. (1) Let $m_3 < m_2$ and $m_3 < m_1$, and let $\delta_1$ denote the $(2m_3)$-coefficient of the polynomial $a_3(x)$ in the matrix $A(x)$. Let us apply to the matrix $A(x)$ ss.e.t.-III, setting $s_{13} = \delta_1$ in the left transforming matrix (see (9)). In the obtained reduced matrix $B(x)$ the element $b_3(x)$ satisfies congruence (19), where $a_{21}(x) \equiv 0$. From (19) we deduce that $b_3(x)$ fulfills condition (1) of the theorem.
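The column construction just described can be sketched as follows. Since the forms (1) and (2) are not reproduced in this excerpt, the coefficient window from the junior degree up to $n_2 - 1$ and all names here are illustrative assumptions, not the paper's notation.

```python
import math

def junior_degree(coeffs):
    """Least degree with a nonzero coefficient; inf for 0."""
    for k, c in enumerate(coeffs):
        if c != 0:
            return k
    return math.inf

def coefficient_column(coeffs, start, stop):
    """Column of the coefficients of x**start .. x**stop (inclusive),
    in order of growing degree, keeping zero coefficients."""
    padded = list(coeffs) + [0] * max(0, stop + 1 - len(coeffs))
    return padded[start:stop + 1]

# a3(x) = x + 3x^3; window from the junior degree up to n2 - 1, n2 = 5.
a3 = [0, 1, 0, 3]
col = coefficient_column(a3, junior_degree(a3), 5 - 1)
print(col)   # [1, 0, 3, 0]
```

Note that the column height equals $n_2 - \underline{\deg}\, a_3$ (here $5 - 1 = 4$), matching the heights prescribed in the text.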
(3) Let $m_3 > m_2$ and $m_3 < m_1$. We may assume that the $(2m_3)$-monomial is absent in $a_3(x)$; this can always be achieved, as was shown above in part (1). From the last congruence it follows that the $n_1$-monomial is absent in $b_2(x)$ and $b_2(x) \equiv a_2(x) \pmod{x^{n_1}}$. If $r_0 = 1$, then everything is proved; that is, $B(x)$ is the matrix to be found. In the opposite case we may assume that already in $A(x)$ the polynomial $a_2(x)$ does not contain the $n_1$-monomial. Let the row $\|v_{21}\ 0\ v_{23}\|$ of the matrix $K_0$ be the first row linearly independent of $\|v_{11}\ 0\ v_{13}\|$, and let the corresponding row of the matrix built for $B(x)$ be formed of the $d_2$-coefficients. We find the (unique) solution $(s_{10}, s_{30})$ of the corresponding linear equation. By ss.e.t.-I-II we pass from $A(x)$ to the reduced matrix $B(x)$.
Here in the matrix $S$ (see (20)) we set $s_{12} = s_{30}$ and $s_{23} = s_{10}$. As is seen from congruence (29), the divisor $\tilde b_2(x)$ of the element $b_2(x)$ in $B(x)$ contains neither the $d_1$- nor the $d_2$-coefficient. If the matrix $K_3$ is nonzero, then the row $u_1 = \|1\ 0\ 1\|$ of the $(m_2+m_3)$-coefficients is its first nonzero row. At the first step we apply to $A(x)$ ss.e.t.-II; in addition, in the left transforming matrix of the form (8) we set, in place of $s_{23}$, the coefficient of the monomial of degree $m_2+m_3$ in the polynomial $a_3(x)$ (see part (2)). In the obtained reduced matrix the polynomial in position (3,1) does not contain the $(m_2+m_3)$-monomial. If $2m_3 < n_2$, then we consider the row $u_2 = \|{*}\ 1\ 0\|$, formed of the $(2m_3)$-coefficients, in the matrix analogous to $K_3$; this row is the first one linearly independent of $u_1$. We take the second step: ss.e.t.-III of the matrix obtained after the first step. For this purpose we set the $(2m_3)$-coefficient of the polynomial in position (3,1) of that matrix in place of $s_{13}$ in the left transforming matrix of the form (9) (see part (1)). In the matrix obtained after the second step, the polynomial in position (3,1) contains neither the $(m_2+m_3)$- nor the $(2m_3)$-monomial. In order not to introduce new notations, we consider that in the matrix $A(x)$ its element $a_3(x)$ already possesses this property. If $m_3 < n_2 - n_1$, then in the submatrix $K_2$ of the matrix $K_0$ the row $u_3 = \|{*}\ 0\ 1\|$ of the $m_3$-coefficients is the first row linearly independent of the collection $u_1, u_2$. Then we make the third step: ss.e.t.-I of the matrix $A(x)$. For this purpose, in the left transforming matrix of the form (7), instead of $s_{12}$ we set the $m_3$-coefficient (see part (3)). After this step we obtain a matrix in which the element in position (3,1) is unchanged and the polynomial in position (3,2) does not contain the $(m_3+n_1)$-monomial. We obtain the required matrix.
If after the first step it turns out that $2m_3 \ge n_2$ but $m_3 < n_2 - n_1$, then we immediately take the third step. Next, if $m_2 + m_3 < n_2 - n_1$, then the row $\|{*}\ 1\ {*}\|$ of the $(m_2+m_3)$-coefficients is the first row linearly independent of the collection $u_1, u_3$ in the submatrix analogous to $K_0$. Then we take the fourth step. It is ss.e.t.-III with a left transforming matrix (see (9)) in which, instead of $s_{13}$, we put the $(m_2+m_3+n_1)$-coefficient of the polynomial in position (3,2) of the matrix obtained after the third step. After that, the element in position (3,1) of the resulting matrix will not change, and the desired element will stand in position (3,2). The obtained matrix is the required one.
If $2m_3 \ge n_2$, then from (33) we have $b_3(x) = a_3(x)$. Recall that in (33), (35) $s_{23} = 0$; also $s_{12} = 0$ if $m_3 < n_2 - n_1$. Therefore, if $m_2 + m_3 \ge n_2 - n_1$, then everything has been proved. Otherwise, if $m_2 + m_3 < n_2 - n_1$, then in the columns $\bar a_2$, $\bar b_2$ the corresponding first $r_2 = m_3$ elements coincide, and in each of the matrices $K_2$, $\widetilde K_2$ their $(r_2+1)$-th row $\|{*}\ 1\ {*}\|$ is the first one linearly independent of the collection $u_1, u_3$.

We construct the columns $\bar a_3$, $\bar a_{31}$, and $\bar a_{33}$ of height $n_2 - \underline{\deg}\, a_3$ from the coefficients of the polynomials $a_3(x)$, $a_{31}(x)$, and $a_{33}(x)$, respectively. In these columns, in the first places, we put the coefficients of the monomials of degree $\underline{\deg}\, a_3$. Then, in order of increasing degrees of the monomials, we place the rest of the coefficients, together with the zero ones, up to the $(n_2-1)$-coefficients. We create the columns $\bar a_1$, $\bar a_{03}$, and $\bar a_{13}$, $\bar a_{11}$ of height $n_1 - \underline{\deg}\, a_1$ from the coefficients of the polynomials $a_1(x)$, $a_3(x)$ and from the coefficients of the corresponding auxiliary polynomials, respectively. Here, in the first places, we put the coefficients of the monomials of degree $\underline{\deg}\, a_1$, and then, in order of increasing degrees of the monomials, we place the remaining coefficients up to the monomials of degree $n_1 - 1$ inclusive. For $A(x)$ we construct matrices of the form (38). Obviously, each row in these matrices consists of the coefficients of the monomials of the same degree.
Quite similarly, for $B(x)$ we construct the matrices (39). (3) In $a_1(x)$ the $m_3$-monomial is absent, and in the first of the polynomials $a_i(x)$, $i = 1, 3$, for which $m_i + m_3 < n_i$, the $(m_i+m_3)$-monomial is absent. (4) In the first column of the matrix (39), composed of the coefficients of the polynomials $a_3(x)$, $a_1(x)$, there are zero elements corresponding to the maximal system of the first linearly independent rows of the corresponding submatrix. The matrix $B(x)$ is uniquely defined.

Proof.
Existence. (1) The proof is completely analogous to the proof of condition (1) in Theorem 2.
(2) We can assume that in the matrix $A(x)$ the element $a_3(x)$ does not contain the $(2m_3)$-monomial; otherwise, we act as in part (1).
(3) Let $m_3 > m_1$ and $m_3 < n_2 - n_1$. If $m_3 \ge n_1$, then there is no $m_3$-monomial in $a_1(x)$, since $\deg a_1 < n_1$. Otherwise, we denote the (nonzero) coefficient of the $m_3$-monomial in $a_1(x)$ by $\delta_2$ and apply to $A(x)$ ss.e.t.-II; thus, in the left transforming matrix of the form (8) we put $s_{23} = -\delta_2$. In the resulting reduced matrix $B(x)$ the element $b_1(x)$ satisfies congruence (14), and $b_3(x) = a_3(x)$. From (14) it is clear that there is no $m_3$-monomial in $b_1(x)$. In order not to introduce new notation, we assume that already in $A(x)$ the polynomial $a_1(x)$ does not contain an $m_3$-monomial. If $m_i + m_3 \ge n_i$, $i = 1, 3$, then everything has already been proved, and the matrix $A(x)$ is the desired one. Otherwise, we find the first of the two values $i = 1, 3$ for which $m_i + m_3 < n_i$. From the corresponding congruence it follows that there is no $(m_i + m_3)$-monomial in $b_i(x)$. If $i = 1$, then from the same congruence it is also evident that in $b_1(x)$, as in $a_1(x)$, there is no $m_3$-monomial; the same is proved by the congruence if $i = 3$.
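Several of the steps above verify congruences of the form $b(x) \equiv a(x) \pmod{x^k}$, i.e., agreement of all coefficients below degree $k$. A minimal illustrative check, with polynomials as coefficient lists (the function name is ours, not from the paper):

```python
def congruent_mod_xk(a, b, k):
    """True iff a(x) = b(x) (mod x**k): the coefficients of
    x**0 .. x**(k-1) coincide (missing entries count as zero)."""
    get = lambda p, i: p[i] if i < len(p) else 0
    return all(get(a, i) == get(b, i) for i in range(k))

# 1 + 2x + 9x^2 and 1 + 2x + 7x^2 agree below degree 2 but not below 3.
print(congruent_mod_xk([1, 2, 9], [1, 2, 7], 2))  # True
print(congruent_mod_xk([1, 2, 9], [1, 2, 7], 3))  # False
```

Padding with zeros lets polynomials of different lengths be compared directly, matching the convention that absent monomials have zero coefficients.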
(4) If $K_0 = 0$, then everything has already been proved, and the matrix $A(x)$ is the desired one. If $K_0 \ne 0$ but $K_3 = 0$, then the element $a_3(x)$ in $A(x)$ satisfies condition (4); this element will not change under all subsequent ss.e.t. Then $m_1 + m_3 \ge n_2$. This means that in $K_1$ the second column is zero. Let $\|v_{11}\ 0\ v_{13}\|$ be the first nonzero row of the matrix $K_1$, and let the corresponding row of the analogous matrix be composed of the coefficients of the monomials of degree $d_1$. We find some solution $(s_{10}, s_{30})$ of the corresponding equation. It follows from this that there is no $d_1$-monomial in $b_1(x)$ and $b_1(x) \equiv a_1(x) \pmod{x^{d_1}}$. If $r_0 = 1$, then all subsequent rows in $K_1$ are linearly dependent on $\|v_{11}\ 0\ v_{13}\|$, and the matrix $B(x)$ is the one sought. In order not to introduce new notations, we assume that the element $a_1(x)$ in $A(x)$ does not contain a $d_1$-monomial. Let $\|v_{21}\ 0\ v_{23}\|$ be the first row of the matrix $K_0$ linearly independent of the row $\|v_{11}\ 0\ v_{13}\|$, and let the corresponding row of the analogous matrix be composed of the coefficients of the monomials of degree $d_2 > d_1$. From the analogous equation the same conclusion follows. Therefore, $B(x)$ is the desired matrix.
Next let us consider the situation when $K_3 \ne 0$. Then $e_1 := m_1 + m_3 < n_2$, and the row $w_1 = \|0\ 0\ 1\|$ of the $e_1$-coefficients is the first nonzero row of the matrix $K_3$. We do the first step. It is ss.e.t.-I of the matrix $A(x)$, in which we put the $e_1$-coefficient of the polynomial $a_3(x)$ instead of $s_{12}$ in the left transforming matrix (see (7)). In the obtained reduced matrix $B(x)$, the element $b_3(x)$ does not contain an $e_1$-monomial (see part (2)). If $e_2 := 2m_3 < n_2$, then the first row of $K_3$ linearly independent of $w_1$ is the row $w_2 = \|0\ 1\ {*}\|$ of the $e_2$-coefficients, $e_2 > e_1$. We do the second step. This is ss.e.t.-III of the matrix $B(x)$, in which, in the left transforming matrix of the form (9), we put the $e_2$-coefficient of the polynomial $b_3(x)$ instead of $s_{13}$. In the matrix obtained after the second step, the element in position (3,1) contains neither the $e_1$- nor the $e_2$-monomial. In order not to introduce new notations for the matrices resulting from the transformations, we assume that the element $a_3(x)$ in $A(x)$ already has this property. If $e_3 := m_3 < n_1$, then in the matrix $K_0$ the row $w_3 = \|1\ 0\ {*}\|$ of the $e_3$-coefficients is the first row linearly independent of the collection $w_1, w_2$. To $A(x)$, apply ss.e.t.-II. In this case, in place of $s_{23}$ in the left transforming matrix (see (8)) we put the $e_3$-coefficient of the polynomial $a_1(x)$, taken with the opposite sign. This will be the third step. In the resulting matrix, the element in position (2,1) does not contain an $e_3$-monomial. Since the element in position (3,1) does not change at the same time, such a matrix is the desired one.
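Each numbered step above removes one designated monomial from a subdiagonal entry by subtracting a scalar multiple of a suitable polynomial whose junior coefficient is normalized to one. The concrete transforming matrices (7)-(9) are not reproduced in this excerpt, so the following is only an assumed model of such an elimination step, with names of our choosing:

```python
def eliminate_monomial(target, other, degree):
    """Remove the x**degree term of `target` by subtracting
    c * `other`, where `other` has junior term 1 * x**degree.
    Both polynomials are coefficient lists; returns a new list."""
    n = max(len(target), len(other))
    t = list(target) + [0] * (n - len(target))
    o = list(other) + [0] * (n - len(other))
    assert o[degree] == 1, "junior coefficient must be normalized to 1"
    c = t[degree]  # offending coefficient, as set in the transforming matrix
    return [ti - c * oi for ti, oi in zip(t, o)]

# Remove the x^2 term of 4x + 5x^2 using x^2 + 7x^3.
res = eliminate_monomial([0, 4, 5], [0, 0, 1, 7], 2)
print(res)   # [0, 4, 0, -35]
```

As in the text, the operation leaves all lower-degree coefficients untouched and may alter higher-degree ones, which is why the steps are ordered by increasing degree.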
If after the first step it turns out that $e_2 \ge n_2$ but $e_3 < n_1$, then we immediately do the third step. Further, if $e_4 := m_1 + m_3 < n_1$, then the row $\|{*}\ 1\ {*}\|$ of the $e_4$-coefficients in the matrix analogous to $K_0$ is the first row linearly independent of the collection $w_1, w_3$. We do the fourth step. It will be ss.e.t.-III of the matrix obtained after the third step. In this case, in the left transforming matrix (see (9)), instead of $s_{13}$ we put the $e_4$-coefficient of the polynomial in position (2,1). As a result, the element in position (3,1) will not change, and the (2,1)-element will be free of the $e_3$- and $e_4$-monomials. The resulting matrix is the required one.

Uniqueness. Let the reduced matrices $A(x)$ and $B(x)$ of the forms (1), (2) satisfy condition (1) of the theorem. Then, taking into account Corollary 1 and the Remark of [3], we can write the corresponding congruences. If $m_i + m_3 \ge n_i$ for each $i = 1, 3$, then immediately we have $B(x) = A(x)$. Otherwise, if we compare the $(m_i+m_3)$-coefficients on both sides of (48) for the first of the two values of the index $i$ such that $m_i + m_3 < n_i$, then we obtain $s_{13} = 0$. In any case $B(x) = A(x)$.
If $m_1 + m_3 \ge n_2$ and $d_1 < n_1$, then in the matrices $K_0$, $\widetilde K_0$ we have $K_3 = \widetilde K_3 = 0$, and the second columns in $K_1$, $\widetilde K_1$ are zero. In this case, from (46) and (49) we have $b_3(x) = a_3(x)$ and $b_1(x) \equiv a_1(x) \pmod{x^{d_1}}$, respectively. Therefore, in the subcolumns $\bar a_1$, $\bar b_1$ the first $r = d_1 - \underline{\deg}\, a_1$ elements coincide, and the corresponding rows in the matrices $K_1$, $\widetilde K_1$ are zero. In these matrices the $(r+1)$-th rows are the first nonzero rows. They are the same, and we denote them by $v_1 = \|v_{11}\ 0\ v_{13}\|$. The elements in $\bar a_1$, $\bar b_1$ corresponding to these rows are zero (see (8)). So we really have $b_1(x) \equiv a_1(x) \pmod{x^{r+1}}$. This means that the $(r+2)$-th rows in the matrices $K_1$, $\widetilde K_1$ coincide. We denote them by $v_2 = \|v_{21}\ 0\ v_{23}\|$. From (49) it is clear that $v_{11}s_{23} + v_{13}s_{13} = 0$. If $v_1$, $v_2$ are linearly independent, then in $\bar a_1$, $\bar b_1$ there are zero elements corresponding to $v_2$; therefore, on the basis of (49), we have $v_{21}s_{23} + v_{23}s_{13} = 0$. Hence $s_{23} = s_{13} = 0$, and everything is proved. If $v_1$, $v_2$ are linearly dependent, then $v_{21}s_{23} + v_{23}s_{13} = 0$, and this, on the basis of (49), means that the $(r+2)$-th elements in $\bar a_1$, $\bar b_1$ coincide. Therefore, the following, $(r+3)$-th, rows in $K_1$, $\widetilde K_1$ coincide. Denote them by $v_3 = \|v_{31}\ 0\ v_{33}\|$ and again consider two situations: when $v_1$, $v_3$ are linearly independent and when they are linearly dependent. In the first case we will have $s_{23} = s_{13} = 0$, and the proof is finished. In the second case the $(r+3)$-th elements in $\bar a_1$, $\bar b_1$ coincide, which means that the $(r+4)$-th rows in $K_1$, $\widetilde K_1$ coincide. Continuing these considerations, at some point we will obtain $s_{23} = s_{13} = 0$, or eventually we will have $\bar a_1 = \bar b_1$. In any case $B(x) = A(x)$.
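The uniqueness argument above repeatedly asks whether a row is linearly independent of the rows already collected. A small sketch of that test via ranks, over floating-point entries (the function names and tolerance are our illustrative choices):

```python
def rank(rows):
    """Rank of a small matrix by Gaussian elimination."""
    m = [list(map(float, r)) for r in rows]
    rk, ncols = 0, (len(m[0]) if m else 0)
    for col in range(ncols):
        piv = next((i for i in range(rk, len(m)) if abs(m[i][col]) > 1e-12), None)
        if piv is None:
            continue
        m[rk], m[piv] = m[piv], m[rk]
        for i in range(len(m)):
            if i != rk and abs(m[i][col]) > 1e-12:
                f = m[i][col] / m[rk][col]
                m[i] = [x - f * y for x, y in zip(m[i], m[rk])]
        rk += 1
    return rk

def first_independent_row(rows, collection):
    """Index of the first row not in the span of `collection`, or None."""
    for i, r in enumerate(rows):
        if rank(collection + [r]) > rank(collection):
            return i
    return None

v1 = [1, 0, 1]
rows = [[2, 0, 2], [3, 1, 0]]   # first row is a multiple of v1
print(first_independent_row(rows, [v1]))   # 1
```

This mirrors the text's procedure of scanning the rows of a coefficient matrix for "the first row linearly independent of the collection" accumulated so far.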

Conclusions
The matrix $B(x)$, established by each of Theorems 1-4, can be considered canonical for the class $\{S A(x) Q(x)\}$ of ss.e. matrices. It can be applied to the classification of sets of numerical matrices (over the field $\mathbb{C}$) with respect to simultaneous similarity; in this context, the works [12-15] should be noted. From the proofs of Theorems 1-4 one can construct an algorithm for finding the transforming matrices (a left nonsingular numerical matrix and a right invertible polynomial matrix) that reduce $A(x)$ to the canonical matrix $B(x)$.

Data Availability
The data used to support the findings of this study are included within the article.

Conflicts of Interest
The author declares that there are no conflicts of interest regarding the publication of this paper.