Sparse Reconstruction with Compressed Sensing

Author: Ngoc Ly (SysEn 5800 Fall 2021)


Introduction

The signal of interest <math>x \in \mathbb{R}^N</math> is often not exactly sparse but only approximately sparse.

<math>\Phi \in \mathbb{R}^{M \times N}</math>, with <math>M \ll N</math>, is a random Gaussian or Bernoulli sensing matrix.

<math>y \in \mathbb{R}^M</math> is the vector of observed samples.

<math>e \in \mathbb{R}^M</math> is a noise vector with <math>\|e\|_2 \leq \eta</math>.

The measurement model is <math>y = \Phi x + e</math>. How can we reconstruct <math>x</math> from <math>y</math>? The goal is to reconstruct the original signal <math>x_0 \in \mathbb{R}^N</math> given only <math>y</math> and <math>\Phi</math>.

For recovery to succeed, the sensing matrix <math>\Phi</math> must satisfy the restricted isometry property (RIP); random Gaussian and Bernoulli matrices satisfy it with high probability (cite).

Let <math>[N] = \{1, \ldots, N\}</math> be an index set; <math>[N]</math> enumerates the columns of <math>\Phi</math> and the entries of <math>x</math>.

Since <math>M \ll N</math>, <math>\Phi x = y</math> is an underdetermined system with infinitely many solutions. Minimizing the <math>\ell_2</math> norm does not work because the minimum-norm solution spreads energy across all coordinates and is generically dense, as the sketch below illustrates. In compressed sensing, compressibility is synonymous with sparsity.
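A minimal sketch of this failure (dimensions are illustrative assumptions): the minimum <math>\ell_2</math>-norm solution <math>\Phi^T (\Phi \Phi^T)^{-1} y</math> of the underdetermined system is fully dense even though the true signal has only 3 nonzeros.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
M, N, k = 20, 100, 3
Phi = rng.normal(size=(M, N)) / np.sqrt(M)

# True 3-sparse signal and its noiseless measurements.
x0 = np.zeros(N)
x0[rng.choice(N, size=k, replace=False)] = 1.0
y = Phi @ x0

# Minimum l2-norm solution of the underdetermined system Phi x = y.
x_l2 = Phi.T @ np.linalg.solve(Phi @ Phi.T, y)
print("nonzeros in x0:", np.count_nonzero(x0))
print("nonzeros in min-norm solution:", np.count_nonzero(np.abs(x_l2) > 1e-10))
</syntaxhighlight>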

The problem formulation is to recover the sparse signal <math>x_0 \in \mathbb{R}^N</math>. The support of <math>x_0</math> is <math>\operatorname{supp}(x_0) = \{ i \in [N] : x_0(i) \neq 0 \}</math>. We say <math>x</math> is <math>k</math>-sparse when <math>|\operatorname{supp}(x)| \leq k</math>. We seek the solution with the smallest support, i.e. we minimize <math>|\operatorname{supp}(x)|</math>; a minimal instance of this setup is sketched below.
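A minimal numpy sketch of this setup (a <math>k</math>-sparse signal, a Gaussian sensing matrix, and noisy measurements), with illustrative dimensions <math>M = 50</math>, <math>N = 200</math>, <math>k = 5</math>:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
M, N, k = 50, 200, 5

# k-sparse signal: k nonzero entries on a random support.
x0 = np.zeros(N)
support = rng.choice(N, size=k, replace=False)
x0[support] = rng.normal(size=k)

# Random Gaussian sensing matrix, scaled so that E||Phi x||^2 = ||x||^2.
Phi = rng.normal(0.0, 1.0 / np.sqrt(M), size=(M, N))

# Noisy measurements y = Phi x0 + e.
e = rng.normal(0.0, 0.01, size=M)
y = Phi @ x0 + e
print(y.shape, np.count_nonzero(x0))
</syntaxhighlight>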

Restricted Isometry Property (RIP)

The RIP was introduced by Candès and Tao. Random Gaussian and Bernoulli matrices satisfy the RIP with high probability. Let <math>\Phi \in \mathbb{R}^{M \times N}</math> satisfy the RIP, and let <math>[N]</math> be an index set.

Before defining the RIP we need the restricted isometry constant (RIC). The RIC is the smallest <math>\delta_s</math> for which the RIP condition below holds for every support set <math>S \subseteq [N]</math> with <math>|S| \leq s</math>.

For a support set <math>S \subseteq [N]</math> with <math>|S| = s</math>, the restriction of <math>x \in \mathbb{R}^N</math> to <math>S</math>, written <math>x|_S</math>, is the <math>s</math>-sparse vector that keeps only the entries indexed by <math>S</math>; likewise <math>\Phi|_S</math> is the submatrix of <math>\Phi</math> whose columns are indexed by <math>i \in S</math>:

<math>(x|_S)_i = \begin{cases} x_i & \text{if } i \in S \\ 0 & \text{otherwise} \end{cases}</math>

The RIP is then defined by

<math>(1 - \delta_s) \|x\|_2^2 \leq \|\Phi x\|_2^2 \leq (1 + \delta_s) \|x\|_2^2</math>

for all <math>s</math>-sparse <math>x</math>.
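Computing <math>\delta_s</math> exactly requires checking all <math>\binom{N}{s}</math> supports, but sampling random <math>s</math>-sparse vectors gives an empirical lower bound. A minimal sketch with illustrative dimensions:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
M, N, s, trials = 50, 200, 5, 2000

# Gaussian sensing matrix, scaled so that E||Phi x||^2 = ||x||^2.
Phi = rng.normal(0.0, 1.0 / np.sqrt(M), size=(M, N))

ratios = []
for _ in range(trials):
    support = rng.choice(N, size=s, replace=False)
    x = np.zeros(N)
    x[support] = rng.normal(size=s)
    ratios.append(np.linalg.norm(Phi @ x) ** 2 / np.linalg.norm(x) ** 2)

# Smallest delta covering all sampled ratios; a lower bound on the true RIC.
delta_hat = max(1 - min(ratios), max(ratios) - 1)
print(f"empirical lower bound on delta_{s}: {delta_hat:.3f}")
</syntaxhighlight>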

Three lemmas (Blumensath & Davies, Iterative hard thresholding for compressed sensing, p. 267)

Lemma 1 (Blumensath & Davies 2009, Iterative hard thresholding for compressed sensing). For all index sets <math>\Gamma</math> and all <math>\Phi</math> for which the RIP holds with <math>s = |\Gamma|</math>, where <math>\Gamma = \operatorname{supp}(x)</math>:

<math>\|\Phi_\Gamma^T y\|_2 \leq \sqrt{1 + \delta_{|\Gamma|}} \, \|y\|_2</math>

<math>(1 - \delta_{|\Gamma|}) \|x_\Gamma\|_2^2 \leq \|\Phi_\Gamma^T \Phi_\Gamma x_\Gamma\|_2^2 \leq (1 + \delta_{|\Gamma|}) \|x_\Gamma\|_2^2</math>

and

<math>\|(I - \Phi_\Gamma^T \Phi_\Gamma) x_\Gamma\|_2 \leq \delta_{|\Gamma|} \|x_\Gamma\|_2</math>

Moreover, if <math>\Gamma \cap \Lambda = \emptyset</math>, then

<math>\|\Phi_\Gamma^T \Phi_\Lambda x_\Lambda\|_2 \leq \delta_s \|x_\Lambda\|_2</math>

Lemma 2 (Needell & Tropp, Proposition 3.5 in CoSaMP: Iterative signal recovery from incomplete and inaccurate samples). If <math>\Phi</math> satisfies the RIP, i.e. <math>\|\Phi x_s\|_2 \leq \sqrt{1 + \delta_s} \, \|x_s\|_2</math> for all <math>x_s</math> with <math>\|x_s\|_0 \leq s</math>, then for all <math>x</math>

<math>\|\Phi x\|_2 \leq \sqrt{1 + \delta_s} \, \|x\|_2 + \frac{\sqrt{1 + \delta_s}}{\sqrt{s}} \, \|x\|_1</math>

Lemma 3 (Needell & Tropp, Proposition 3.5 in CoSaMP: Iterative signal recovery from incomplete and inaccurate samples). Let <math>x_s</math> be the best <math>s</math>-term approximation of <math>x</math>, and let <math>x_r = x - x_s</math>, so that

<math>y = \Phi x + e = \Phi x_s + \Phi x_r + e = \Phi x_s + \tilde{e}</math>

If <math>\Phi</math> satisfies the RIP for sparsity <math>s</math>, then for all <math>x</math> the norm of the effective error <math>\tilde{e}</math> is bounded by

<math>\|\tilde{e}\|_2 \leq \sqrt{1 + \delta_s} \, \|x - x_s\|_2 + \frac{\sqrt{1 + \delta_s}}{\sqrt{s}} \, \|x - x_s\|_1 + \|e\|_2</math>
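The decomposition in Lemma 3 is exact algebra, and the bound can be evaluated numerically once a RIP constant is assumed. A minimal sketch; the dimensions and the value <math>\delta_s = 0.4</math> are illustrative assumptions, not verified constants for this matrix, and the two middle terms of the bound are exactly the Lemma 2 bound applied to the tail <math>x - x_s</math>:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
M, N, s = 50, 200, 10
Phi = rng.normal(0.0, 1.0 / np.sqrt(M), size=(M, N))

# A compressible (not exactly sparse) signal with decaying coefficients.
x = rng.permutation(np.sign(rng.normal(size=N)) * np.arange(1, N + 1) ** -1.5)
e = 0.01 * rng.normal(size=M)
y = Phi @ x + e

# Best s-term approximation: keep the s largest-magnitude entries.
x_s = np.zeros(N)
keep = np.argsort(np.abs(x))[-s:]
x_s[keep] = x[keep]

e_tilde = Phi @ (x - x_s) + e               # effective error
assert np.allclose(y, Phi @ x_s + e_tilde)  # exact decomposition of Lemma 3

delta_s = 0.4  # assumed RIP constant, for illustration only
bound = (np.sqrt(1 + delta_s) * np.linalg.norm(x - x_s)
         + np.sqrt(1 + delta_s) / np.sqrt(s) * np.linalg.norm(x - x_s, 1)
         + np.linalg.norm(e))
print(np.linalg.norm(e_tilde), "<=", bound)
</syntaxhighlight>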

Theory

Gel'fand n-widths

Errors <math>E(S, \Phi, D)</math>

Definition (Mutual Coherence). Let <math>A \in \mathbb{R}^{M \times N}</math>; the mutual coherence <math>\mu_A</math> is defined by

<math>\mu_A = \max_{i \neq j} \frac{|\langle a_i, a_j \rangle|}{\|a_i\| \, \|a_j\|}</math>

where <math>a_i</math> denotes the <math>i</math>-th column of <math>A</math>.

We want <math>\mu_A</math> to be small: a small mutual coherence means the columns of <math>A</math> are nearly orthonormal, which helps <math>A</math> satisfy the RIP.
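A short numpy sketch of this definition (the random matrix here is only an illustrative assumption):

<syntaxhighlight lang="python">
import numpy as np

def mutual_coherence(A: np.ndarray) -> float:
    """Largest normalized inner product between distinct columns of A."""
    G = A / np.linalg.norm(A, axis=0)   # normalize each column
    gram = np.abs(G.T @ G)              # |<a_i, a_j>| / (||a_i|| ||a_j||)
    np.fill_diagonal(gram, 0.0)         # ignore the i == j terms
    return gram.max()

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 200))
print(f"mutual coherence: {mutual_coherence(A):.3f}")
</syntaxhighlight>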

Numerical Example

Basis Pursuit
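Basis pursuit recovers the signal by solving <math>\min_x \|x\|_1</math> subject to <math>\Phi x = y</math>. A minimal sketch, assuming noiseless measurements, that recasts the problem as a linear program via the standard substitution <math>-t \leq x \leq t</math>; the dimensions and the use of scipy.optimize.linprog are illustrative assumptions, not part of the original article:

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(Phi: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Solve min ||x||_1 s.t. Phi x = y as a linear program."""
    M, N = Phi.shape
    # Variables z = [x, t]; minimize sum(t) with -t <= x <= t and Phi x = y.
    c = np.concatenate([np.zeros(N), np.ones(N)])
    I = np.eye(N)
    A_ub = np.block([[I, -I], [-I, -I]])        # x - t <= 0 and -x - t <= 0
    b_ub = np.zeros(2 * N)
    A_eq = np.hstack([Phi, np.zeros((M, N))])   # Phi x = y, t unconstrained
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=[(None, None)] * N + [(0, None)] * N)
    return res.x[:N]

# Tiny demo: recover a 3-sparse signal from 30 Gaussian measurements.
rng = np.random.default_rng(0)
M, N, k = 30, 100, 3
Phi = rng.normal(size=(M, N)) / np.sqrt(M)
x0 = np.zeros(N)
x0[rng.choice(N, size=k, replace=False)] = rng.normal(size=k)
x_hat = basis_pursuit(Phi, Phi @ x0)
print("recovery error:", np.linalg.norm(x_hat - x0))
</syntaxhighlight>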

Iterative Hard Thresholding (IHT)
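IHT (Blumensath & Davies) iterates <math>x^{n+1} = H_k\left(x^n + \Phi^T (y - \Phi x^n)\right)</math>, where the hard-thresholding operator <math>H_k</math> keeps the <math>k</math> largest-magnitude entries and zeroes the rest. A minimal sketch; normalizing <math>\Phi</math> to unit spectral norm and the iteration count are illustrative assumptions:

<syntaxhighlight lang="python">
import numpy as np

def iht(Phi: np.ndarray, y: np.ndarray, k: int, iters: int = 200) -> np.ndarray:
    """Iterative hard thresholding: x <- H_k(x + Phi^T (y - Phi x))."""
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        x = x + Phi.T @ (y - Phi @ x)        # gradient step on ||y - Phi x||^2 / 2
        small = np.argsort(np.abs(x))[:-k]   # all but the k largest entries
        x[small] = 0.0                       # hard threshold H_k
    return x

# Tiny demo: recover a 5-sparse signal from 50 Gaussian measurements.
rng = np.random.default_rng(0)
M, N, k = 50, 200, 5
Phi = rng.normal(size=(M, N))
Phi /= np.linalg.norm(Phi, 2)   # the analysis assumes ||Phi||_2 <= 1
x0 = np.zeros(N)
x0[rng.choice(N, size=k, replace=False)] = rng.normal(size=k)
x_hat = iht(Phi, Phi @ x0, k)
print("recovery error:", np.linalg.norm(x_hat - x0))
</syntaxhighlight>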

Solver
