In a binary matrix (all elements are 0 or 1), every row is sorted in ascending order (all 0s to the left of all 1s). Find the index of the leftmost column that contains a 1.
Example 1:
Input:
[[0, 0, 0, 1],
[0, 0, 1, 1],
[0, 1, 1, 1],
[0, 0, 0, 0]]
Output: 1
Example 2:
Input:
[[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0]]
Output: -1
Expected: a solution better than O(r * c).
Notes:
This question is mainly about the optimization.
The naive method checks every cell, which is O(r * c) and does not meet the requirement.
One improvement is to binary-search each row, since every row is sorted: find the lower bound (the first 1) in each row, then take the smallest column index among them. The time complexity is O(r * log(c)).
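A minimal sketch of this row-wise binary search (the helper name leftmostByBinarySearch is illustrative, not part of the original problem):

#include <vector>
#include <algorithm>
using namespace std;

// For each row, binary-search the first 1 and keep the smallest column index seen.
int leftmostByBinarySearch(const vector<vector<int>>& mt) {
    int best = -1;
    for (const auto& row : mt) {
        // lower_bound returns an iterator to the first element >= 1,
        // i.e., the first 1 in the sorted row (row.end() if the row is all 0s).
        auto it = lower_bound(row.begin(), row.end(), 1);
        if (it != row.end()) {
            int col = it - row.begin();
            if (best == -1 || col < best) best = col;
        }
    }
    return best;  // -1 when the matrix contains no 1
}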
Can we do better?
The answer is yes (at least when r is large compared to c): we can do a greedy search.
Start from the top-right corner.
If the current element is 1, move left until the element to its left is 0 (or we hit the left boundary).
If the current element is 0, move down to the next row.
Repeat this process until we move past the last row.
At the end of the search, the current column is the leftmost column having at least one 1.
The time complexity is O(r + c): each step moves either one row down or one column left, so there are at most r + c steps in total.
Why does greedy work?
The answer is that at each step, the current column is the leftmost column containing a 1 among the rows visited so far; once the last row has been processed, it is the leftmost column overall. On Example 1, for instance, the walk moves left to column 3 in row 0, to column 2 in row 1, and to column 1 in row 2; row 3 is all 0s and leaves it unchanged, so the answer is 1.
See the code below:
#include <vector>
using namespace std;

class Solution {
public:
    int mostLeftColumn(vector<vector<int>>& mt) {
        // An empty matrix contains no 1, so report -1.
        if(mt.empty() || mt.front().empty()) return -1;
        int m = mt.size(), n = mt.front().size();
        // Start at row 0, just past the top-right corner.
        int i = 0, j = n;
        while(i < m && j > 0) {
            if(mt[i][j-1] == 1) --j;  // a 1 here: the answer is column j-1 or further left
            else ++i;                 // a 0 here: this row cannot improve the answer, move down
        }
        return j == n ? -1 : j;       // j never moved means no 1 was found anywhere
    }
};
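As a quick sanity check against the two examples above (a minimal driver, assuming it is compiled together with the Solution class):

#include <iostream>

int main() {
    Solution s;
    vector<vector<int>> a = {{0,0,0,1},{0,0,1,1},{0,1,1,1},{0,0,0,0}};
    vector<vector<int>> b = {{0,0,0,0},{0,0,0,0},{0,0,0,0},{0,0,0,0}};
    cout << s.mostLeftColumn(a) << endl;  // prints 1
    cout << s.mostLeftColumn(b) << endl;  // prints -1
    return 0;
}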