Rank
You are given a matrix that contains only 0s and 1s. Furthermore, there are exactly two 1s in every column of this matrix. Calculate the rank of this matrix.
The rank of a matrix is the maximum number k such that there exists a set of k linearly independent rows. A set {v1, ..., vk} is called linearly independent if and only if for any real numbers a1, ..., ak such that a1^2 + ... + ak^2 ≠ 0, the inequality a1 * v1 + ... + ak * vk ≠ 0 holds.
For the rows, we assume the use of the standard rules of multiplication by a scalar and addition:
a (u(1), ..., u(m)) = (au(1), ..., au(m)),
(u(1), ..., u(m)) + (v(1), ..., v(m)) = (u(1) + v(1), ..., u(m) + v(m)).
A zero row is 0 = (0, ..., 0).
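As a small illustration of the definition above (the vectors here are chosen for this sketch, not taken from the problem): the rows u = (1, 2) and v = (2, 4) are linearly dependent, because the coefficients a1 = 2, a2 = -1 satisfy a1^2 + a2^2 = 5 ≠ 0 while a1 * u + a2 * v is the zero row.

```python
def combine(coeffs, rows):
    """Return a1*v1 + ... + ak*vk using the component-wise rules above."""
    m = len(rows[0])
    return tuple(sum(a * v[j] for a, v in zip(coeffs, rows)) for j in range(m))

u, v = (1, 2), (2, 4)
print(combine((2, -1), (u, v)))  # (0, 0): u and v are linearly dependent
```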
Input
The first line contains two integers n and m (2 ≤ n ≤ 2 * 10^5, 1 ≤ m ≤ 2 * 10^5): the number of rows and columns in the matrix. Each of the next 2m lines contains two integers r and c (1 ≤ r ≤ n, 1 ≤ c ≤ m) specifying the position of a 1 in the matrix. Here r is the row and c is the column. Any cell not mentioned in this list contains a zero. It is guaranteed that all pairs (r, c) are distinct and that there are exactly two 1s in each column of the matrix.
Output
Print the only number: the rank of the given matrix.
Example
Input
3 3
1 1
2 1
2 2
3 2
1 3
3 3
Output
3
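One common way to reason about such matrices (an interpretation assumed by this sketch, not stated in the problem) is to view each column as an edge joining its two rows, so the input is the vertex-edge incidence matrix of a graph on n vertices. Over the reals, the rank of such an incidence matrix equals n minus the number of bipartite connected components, where an isolated row counts as a bipartite component. A minimal sketch under that assumption, using a disjoint-set union that tracks the parity of each vertex relative to its root:

```python
def matrix_rank(n, ones):
    """Rank over the reals of an n-row 0/1 matrix whose columns each contain
    exactly two 1s, given as a list of 1-based (r, c) positions of the 1s."""
    # Group the 1-positions by column to recover the two rows per column (an edge).
    by_col = {}
    for r, c in ones:
        by_col.setdefault(c, []).append(r - 1)

    par = list(range(n))   # DSU parent
    sign = [0] * n         # side of a vertex relative to its parent (0/1)
    odd = [False] * n      # root flag: component contains an odd cycle

    def find(x):
        """Return (root of x, parity of x relative to the root)."""
        s = 0
        while par[x] != x:
            s ^= sign[x]
            x = par[x]
        return x, s

    for a, b in by_col.values():
        ra, sa = find(a)
        rb, sb = find(b)
        if ra == rb:
            if sa == sb:                 # same side -> odd cycle -> not bipartite
                odd[ra] = True
        else:
            par[ra] = rb
            sign[ra] = sa ^ sb ^ 1       # endpoints of an edge get opposite sides
            odd[rb] = odd[rb] or odd[ra]

    bipartite = sum(1 for x in range(n) if find(x)[0] == x and not odd[x])
    return n - bipartite

print(matrix_rank(3, [(1, 1), (2, 1), (2, 2), (3, 2), (1, 3), (3, 3)]))  # 3
```

On the sample above the three columns form a triangle on the three rows; a triangle is not bipartite, so no component is bipartite and the rank is 3 - 0 = 3, matching the expected output.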