Search a 2D Matrix II

Write an efficient algorithm that searches for a value in an m x n matrix. This matrix has the following properties:

  • Integers in each row are sorted in ascending order from left to right.
  • Integers in each column are sorted in ascending order from top to bottom.

For example,

Consider the following matrix:

[
  [1,   4,  7, 11, 15],
  [2,   5,  8, 12, 19],
  [3,   6,  9, 16, 22],
  [10, 13, 14, 17, 24],
  [18, 21, 23, 26, 30]
]

Given target = 5, return true.

Given target = 20, return false.

Because the rows and the columns of the matrix are each sorted, every comparison can rule out an entire row or an entire column. Start at the last element of the first column (the bottom-left corner) and compare it with the target: if it is smaller than the target, move right to the next column; if it is larger, move up one row in the same column. Repeat until the target is found, returning true, or until the search walks off the matrix, returning false. For example, searching for target = 5 in the matrix above visits 18 → 10 → 3 → 6 → 5 and returns true. Since each step discards one row or one column, the search takes at most O(m + n) comparisons.


#include <vector>
using namespace std;

bool searchMatrix(vector<vector<int>>& matrix, int target) {
    int row = matrix.size();
    if (row == 0) return false;
    int column = matrix[0].size();
    int x = row - 1;    // start at the bottom-left corner: last row, first column
    int y = 0;
    while (x >= 0 && y < column)
    {
        if (matrix[x][y] == target) return true;
        else if (matrix[x][y] > target)
            x--;        // current value is too large: move up one row
        else
            y++;        // current value is too small: move right one column
    }
    return false;
}
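
To sanity-check the function against the examples above, a minimal driver along the following lines can be compiled together with searchMatrix; it should print true for target = 5 and false for target = 20. The main function below is only an illustrative sketch and is not part of the original solution.

#include <iostream>

int main() {
    vector<vector<int>> matrix = {
        {1,   4,  7, 11, 15},
        {2,   5,  8, 12, 19},
        {3,   6,  9, 16, 22},
        {10, 13, 14, 17, 24},
        {18, 21, 23, 26, 30}
    };
    cout << boolalpha << searchMatrix(matrix, 5)  << endl;   // expected: true
    cout << boolalpha << searchMatrix(matrix, 20) << endl;   // expected: false
    return 0;
}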