12

Suppose we have an infinite-dimensional real vector $y=(y_1,y_2,\dots)$ and an infinite-dimensional real matrix $C=(c_{ij})$, $i,j\in\mathbb{N}$. Let $C_k$ be the leading $k\times k$ submatrix of $C$, $C_k=(c_{ij})_{i,j=1}^{k}$, and $y^k$ the subvector of $y$ formed by its first $k$ coordinates, $y^k=(y_1,\dots,y_k)$. Assuming each $C_k$ is invertible, define

$$\theta^k=C_k^{-1}y^k$$

My question (which is probably too general) is: what conditions should $C$ and $y$ satisfy so that the pointwise limits

$$\lim_{k\to\infty}\theta^k_i$$

exist? I have a feeling that this could be solved easily by applying the theory of linear operators, but I cannot figure out how to reformulate the problem.

To make the question less general, we can assume that $y\in\ell_2$ and that $C_k$ is a symmetric positive-definite matrix for each $k$.
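For concreteness, here is a minimal numerical sketch of the truncation scheme. The matrix $c_{ij}=2^{-|i-j|}$ (symmetric positive-definite for every $k$) and the vector $y_i=1/i\in\ell_2$ are arbitrary illustrative choices, not part of the question itself.

```python
import numpy as np

def truncated_solution(k):
    """Solve C_k theta^k = y^k for the leading k-by-k truncation,
    with the illustrative choices c_ij = 2**(-|i-j|) and y_i = 1/i."""
    idx = np.arange(1, k + 1)
    C_k = 0.5 ** np.abs(idx[:, None] - idx[None, :])  # k-by-k submatrix of C
    y_k = 1.0 / idx                                    # first k coordinates of y
    return np.linalg.solve(C_k, y_k)                   # theta^k = C_k^{-1} y^k

# The leading coordinates of theta^k appear to stabilise as k grows.
for k in [5, 20, 80, 320]:
    print(k, truncated_solution(k)[:4])
```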

This question is related to this one I've asked on MathOverflow.

Update: @fedja's answer below gives a possible sketch of a proof. It requires, though, that $\|C_k^{-1}\|$ be a bounded sequence (in the matrix norm $\|\cdot\|_2$). If we suppose that $C$ is a linear operator on $\ell_2$, does the property that each submatrix $C_k$ is positive-definite ensure that the sequence $\|C_k^{-1}\|$ is bounded?

This question can be further rephrased in the following way. For a symmetric positive-definite matrix $A$, denote its minimal and maximal eigenvalues by $\lambda_{\min}$ and $\lambda_{\max}$. Then $\|A\|_2=\lambda_{\max}$ and $\|A^{-1}\|_2=\lambda_{\min}^{-1}$. With this in mind, the previous question is equivalent to asking whether $\lambda_{\min}(C_k)$ is bounded away from zero.
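Continuing the illustrative example above, one can track $\lambda_{\min}(C_k)$, and hence $\|C_k^{-1}\|_2$, as $k$ grows. For that particular matrix it stays bounded away from zero, but that is a property of the example, not a general fact.

```python
import numpy as np

# Track lambda_min(C_k) and ||C_k^{-1}||_2 = 1/lambda_min(C_k) for the
# illustrative matrix c_ij = 2**(-|i-j|) from the sketch above.
for k in [5, 20, 80, 320]:
    idx = np.arange(1, k + 1)
    C_k = 0.5 ** np.abs(idx[:, None] - idx[None, :])
    lam_min = np.linalg.eigvalsh(C_k)[0]   # eigvalsh returns eigenvalues in ascending order
    print(k, lam_min, 1.0 / lam_min)       # here lambda_min stays bounded away from 0
```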

  • 0
    Note that not only does the number of equations go to infinity, you also change the already existing equations themselves in the process (by adding more variables). (2011-07-04)
  • 0
    I am not sure if this will help you in any way, but Theorem 3.18 in the book [Hanson, Yakovlev] Operator Theory for Electromagnetics is about something similar, although it deals with convergence in $\ell_2$, not pointwise convergence. http://books.google.com/books?id=fS1CippwF84C&pg=PA176&dq=%22matrix+operator%22+%22theorem+3.18%22&hl=en&ei=3FMVTo7_H43G-QbXy-gT&sa=X&oi=book_result&ct=result&resnum=1&ved=0CCgQ6AEwAA#v=onepage&q=%22matrix%20operator%22%20%22theorem%203.18%22&f=false (2011-07-07)
  • 0
    @mpiktas: I suppose my answer was not exactly what you were hoping for, as you offered a bounty after my answer was there. (I myself wrote that it is closer to a comment than to an answer.) That's why I decided to add a new bounty - perhaps this will attract someone's attention and you'll get a better answer. (2011-07-19)
  • 0
    @Martin, thanks. Your answer gave useful additional information, so I gave the bounty to you in order not to waste it. Thanks again for your help! (2011-07-19)
  • 1
    There is quite a lot of literature devoted to the question of how and to what extent a bounded operator $C$ on a Banach or Hilbert space inherits properties of an approximating sequence $(C_n)_n$ of operators with finite-dimensional range. One can study questions like yours in the abstract framework of Banach- and $C^*$-algebras; see for instance Hagen, Roch, Silbermann, _$C^*$-algebras and numerical analysis_. (2011-07-19)
  • 0
    @lvb, thanks. The Google references I found sound very promising; I will try to get the book. (2011-07-19)
  • 0
    Since for positive-definite matrices $\lambda_{\min}=\inf (Cx,x)$ over the unit sphere, we do indeed have $\lambda_{\min}(C_k)\ge \lambda_{\min}$ in this case (restricting to vectors with only the first $k$ coordinates non-zero cannot decrease the infimum). So bounded positive-definite matrices with bounded inverses are fine. (2011-07-20)
  • 0
    @fedja, what I am actually asking (which is not very clear, I admit) is whether we can deduce that $\lambda_{\min}=\inf(Cx,x)>0$ if we know something about $\lambda_{\min}(C_k)$, or, to be more precise, what we should require from $\lambda_{\min}(C_k)$ so that $\lambda_{\min}=\inf(Cx,x)>0$. I think lvb guessed best what kind of answer I am looking for. In my actual problem I know a lot about $C_k$, and I only need to make sure that $C$ inherits the "good" properties of $C_k$ when $k$ goes to infinity. (2011-07-25)
  • 0
    If $C$ is self-adjoint and bounded, then $\lambda_{\min}=\inf_k\lambda_{\min}(C_k)$. I agree that it was not at all clear that you were asking about going from $C_k$ to $C$, not vice versa. Why don't you just tell us what $C$ is and what exactly you want from it? (2011-07-26)

2 Answers

3

I'm not sure what exactly you are looking for, but if $\|C\|<+\infty$, $\sup_k \|C_k^{-1}\|<+\infty$, and $y\in\ell^2\cap C\ell^2$, then the convergence holds and, moreover, the $\theta_k$ converge in $\ell^2$ to the (unique) solution $\theta$ of $C\theta=y$. The reason is very simple. If $P_k$ is the orthogonal projection onto the first $k$ coordinates, then $C_k$ is the non-trivial part of $P_kCP_k$, so, if $\theta_k$ is extended by zeroes, we have $P_kCP_k\theta_k=P_ky$. On the other hand, $|P_kCP_k\theta-P_ky|\le \|C\|\cdot|P_k\theta-\theta|$. Thus, $|\theta_k-P_k\theta|\le\|C_k^{-1}\|\cdot|P_kCP_k\theta_k-P_kCP_k\theta|=\|C_k^{-1}\|\cdot|P_ky-P_kCP_k\theta|\le\|C_k^{-1}\|\cdot\|C\|\cdot |P_k\theta-\theta|\to 0$ as $k\to\infty$.
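A rough numerical check of this argument, using illustrative choices that satisfy the assumptions (the bounded operator $c_{ij}=2^{-|i-j|}$, whose truncations have uniformly bounded inverses, and $y_i=1/i$); a very large truncation stands in for the exact solution $\theta$.

```python
import numpy as np

def solve_truncated(k):
    """theta_k = C_k^{-1} y^k for c_ij = 2**(-|i-j|), y_i = 1/i (illustration only)."""
    idx = np.arange(1, k + 1)
    C_k = 0.5 ** np.abs(idx[:, None] - idx[None, :])
    return np.linalg.solve(C_k, 1.0 / idx)

theta = solve_truncated(2000)                      # proxy for the exact solution theta
for k in [10, 50, 250, 1000]:
    theta_k = solve_truncated(k)
    print(k, np.linalg.norm(theta_k - theta[:k]))  # roughly |theta_k - P_k theta|; shrinks as k grows
```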

  • 0
    The last term in the inequality should be $P_k\theta-\theta$. Good idea, I've already upvoted it. But upon closer inspection I have a problem with the term $\|C_k^{-1}\|$: since it depends on $k$, it should be proved that it is bounded. (2011-07-19)
2

This is closer to a comment than an answer, but it would be too long for a comment. Basically, a few thoughts that might or might not help. (Just some things this reminded me of, not necessarily helpful for your problem.)

Let us rewrite your equation as $y^k=C_k\theta^k$. If I consider $y^k$ and $\theta^k$ as infinite sequences (padding them with zeroes), then I have

$$z^k=C\theta^k,$$

where the vector $z^k$ has the same values as $y^k$ on the first $k$ coordinates.

If we additionally assume that $y^k\to y$ (which implies $z^k\to y$) and $\theta^k\to\theta$ pointwise, and that the operator corresponding to $C$ is in some sense continuous, then we get

$$y=C\theta.$$

Hence a necessary condition is that $y$ is in the range of $C$.

My guess is that the following condition should be sufficient for the continuity of $C$:

$$\|C\| = \sup_n \sum_k |c_{nk}| < \infty$$
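As a quick sanity check of this condition on a concrete example (the matrix $c_{ij}=2^{-|i-j|}$ used in the sketches above, chosen only for illustration), the row sums stay below $1+2(1/2+1/4+\dots)=3$, so the supremum is finite.

```python
import numpy as np

# Row sums sum_k |c_{nk}| for a large truncation of the illustrative matrix
# c_ij = 2**(-|i-j|); every row sum stays below 3, so for this example
# sup_n sum_k |c_{nk}| is finite.
idx = np.arange(1, 501)
C = 0.5 ** np.abs(idx[:, None] - idx[None, :])
print(np.abs(C).sum(axis=1).max())   # about 3
```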


I am not sure that this really helps, but the last equation $$y=C\theta$$ is related to summability theory. In fact, the matrix methods studied in summability theory are defined precisely in this way: you transform a sequence using a given matrix and then take a limit.

For an overview of some known results you can consult, e.g., Boos: Classical and Modern Methods in Summability; you can also find some notes here: http://thales.doa.fmph.uniba.sk/sleziak/texty/rozne/trf/boos/

From a functional-analytic viewpoint: Morrison's Functional Analysis.

  • 0
    More references on summability theory can be found in this answer: http://math.stackexchange.com/questions/52139/reference-request-introduction-to-mathematical-theory-of-regularization/52144#52144 (2011-07-19)