**Author:** Trung Nguyen

**Tester and Editorialist:** Alei Reyes

## Problem Link

Practice

## Difficulty

Easy

## Prerequisites

Implementation

## Problem

A matrix is non-decreasing if every row and every column is non-decreasing. Given a matrix with some unknown entries, fill in the unknowns to obtain a non-decreasing matrix.

## Explanation

Let A[i][j] denote the element at row i and column j of matrix A. Since the matrix is non-decreasing, the following two conditions must hold:

- A[i][j] ≥ A[i][j-1], in row i elements are non-decreasing.
- A[i][j] ≥ A[i-1][j], in column j elements are non-decreasing.
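As a sketch (not part of the original editorial), the two conditions above can be checked directly with a single scan of the matrix:

```python
def is_non_decreasing(A):
    """Return True if every row and every column of A is non-decreasing."""
    n, m = len(A), len(A[0])
    for i in range(n):
        for j in range(m):
            if j > 0 and A[i][j] < A[i][j - 1]:   # row condition
                return False
            if i > 0 and A[i][j] < A[i - 1][j]:   # column condition
                return False
    return True
```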

Let i be the first row of A that contains a -1, and let j be the column of the leftmost -1 in that row. It is always safe to replace A[i][j] with the minimum possible value; choosing anything larger could make it impossible to find a valid value for another -1 below or to the right. So one possible solution (and the lexicographically smallest one) is to set A[i][j] = max { A[i][j-1], A[i-1][j] }.

After filling some of the unknown positions in A, it may turn out that one of the fixed values of A is smaller than an element above it or to its left. In that case there is no solution, because every unknown was already assigned the smallest value allowed by the constraints.
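The whole procedure can be sketched as follows (a hypothetical Python implementation, assuming known entries are non-negative so that -1 marks an unknown, and using 0 as the default for out-of-bounds neighbors):

```python
def fill_matrix(A):
    """Replace each -1 with max(left, up), scanning row by row.
    Returns the completed matrix, or None if no valid completion exists.
    Assumes known entries are non-negative, so -1 marks an unknown."""
    n, m = len(A), len(A[0])
    for i in range(n):
        for j in range(m):
            left = A[i][j - 1] if j > 0 else 0
            up = A[i - 1][j] if i > 0 else 0
            if A[i][j] == -1:
                # The smallest value satisfying both the row and column condition.
                A[i][j] = max(left, up)
            elif A[i][j] < left or A[i][j] < up:
                # A fixed entry is smaller than an already-minimal neighbor:
                # no completion can exist.
                return None
    return A
```

Because each unknown is assigned the minimum value consistent with its neighbors, any violation detected during the scan proves that no other choice of values could have succeeded either.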