EE6132 Programming Assignment 2: Filtering and Hybrid Images


1 Filtering
In signal processing, filtering is the process of removing unwanted features or components from a signal. Most often, this means removing or suppressing some frequencies or frequency bands. A filtering operation can be linear or non-linear, and space-variant or space-invariant.
Below, we will look at some operations that can be done on a signal. For the operations below, consider a discrete-time, 1D signal $X$ of length 16, where $X = \{x_0, x_1, x_2, \dots, x_{15}\}$. The output of the filtering process is a signal $Y = \{y_0, y_1, y_2, \dots, y_{15}\}$, the same length as $X$. Implement the following filtering operations on the signal $X$ defined by $x_k = 3 + \sin(2\pi k / 15)$ for $k \in \{0, 1, \dots, 15\}$ and $x_k = 0$ otherwise.
For the filters that are linear and space-invariant, verify that the convolutional implementation of the filter gives the same output as the direct implementation.
a) $y_k = x_{k+1} - x_k$
b) $y_k = x_k - \bar{X}$, where $\bar{X} = \frac{1}{L+1}\sum_{i=0}^{L} x_i$
c) $y_k = \mathrm{median}(\{x_l : l \in [k-2, k+2]\})$
d) $y_k = x_{k+0.5} - x_{k-0.5}$
e) $y_k = |x_{k+0.5} - x_{k-0.5}|$
f) $y_k = \frac{1}{5}\sum_{i=k-2}^{k+2} x_i$
(Note: Linearly interpolate the neighboring samples to obtain signal values such as $x_{k+0.5}$.)
Tasks
1. Implement each of the filtering operations to obtain the desired output. Each of the outputs has to be the same size as the input signal.
2. For each operation, determine whether it is linear and space-invariant.
3. For those operations that are linear and space-invariant, propose an equivalent convolution operation to implement the filtering process and implement it as well (a sketch is given after this list).
4. For filters that are implemented via convolution, verify visually that the results are the same.
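To illustrate what is being asked, here is a minimal NumPy sketch of a few of the operations and of one convolutional equivalent, using the handout's convention that the signal is zero outside $k = 0, \dots, 15$. The function names and the boundary bookkeeping are our own, not prescribed by the assignment.

```python
import numpy as np

# The input signal: x_k = 3 + sin(2*pi*k/15) for k = 0..15, and 0 outside this range.
k = np.arange(16)
x = 3 + np.sin(2 * np.pi * k / 15)

def filter_a(x):
    """(a) y_k = x_{k+1} - x_k, treating x_16 as 0."""
    xp = np.append(x, 0.0)
    return xp[1:] - xp[:-1]

def filter_d(x):
    """(d) y_k = x_{k+0.5} - x_{k-0.5}, half-samples obtained by linear interpolation."""
    xp = np.pad(x, 1)                            # x_{-1} = x_16 = 0
    x_plus_half  = 0.5 * (xp[1:-1] + xp[2:])     # x_{k+0.5} = (x_k + x_{k+1}) / 2
    x_minus_half = 0.5 * (xp[:-2] + xp[1:-1])    # x_{k-0.5} = (x_{k-1} + x_k) / 2
    return x_plus_half - x_minus_half

def filter_f(x):
    """(f) y_k = (1/5) * sum of x_{k-2} .. x_{k+2}, with zeros outside the signal."""
    xp = np.pad(x, 2)
    return np.array([xp[i:i + 5].mean() for i in range(len(x))])

# Filter (f) is linear and space-invariant, so it can also be written as a convolution
# with a length-5 kernel of 1/5's; mode='same' keeps the output the same length as x.
y_direct = filter_f(x)
y_conv = np.convolve(x, np.ones(5) / 5, mode='same')
print(np.allclose(y_direct, y_conv))             # expected: True
```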
2 Filtering in Fourier space
Filtering operations that are linear and space-invariant can be represented as a convolution operation. Once such a representation is obtained, we can use the convolution property of the Fourier transform to efficiently implement the filter in the Fourier domain.
Tasks
1. For those filters above that are linear and space-invariant, implement them in the Fourier domain.
2. Verify that the output from the Fourier implementation is the same as that of the spatial-domain implementation. If there is any difference, explain why.
3. If, for any of the cases, the outputs from the spatial- and Fourier-domain implementations are different, suggest a modification to make the outputs the same. Implement the modification and re-verify that the results match.
3 Hybrid Images
We will write an image convolution function (image filtering) and use it to create hybrid images!¹ The technique was invented by Oliva, Torralba, and Schyns in 2006, and published in a paper at SIGGRAPH. High-frequency image content tends to dominate perception but, at a distance, only low-frequency (smooth) content is perceived. By blending high- and low-frequency content, we can create a hybrid image that is perceived differently at different distances.
A hybrid image is the sum of a low-pass filtered version of a first image and a high-pass filtered version of a second image. We must tune a free parameter for each image pair to control how much high frequency to remove from the first image and how much low frequency to leave in the second image. This is called the cut-off frequency. The paper suggests using two cut-off frequencies, one tuned for each image, and you are free to try this too; using a single cut-off frequency for both images should be sufficient. We will use a symmetric, zero-mean Gaussian filter for our filtering (low-pass and high-pass) operations. In our case, the cut-off frequency will represent the standard deviation of the Gaussian filter that will be used.
¹ Parts of this assignment are borrowed from this page.
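As a sketch of how such a Gaussian filter kernel might be generated (assuming NumPy; the function name and the kernel size are our own assumptions, since the handout does not fix them, and a width of roughly four standard deviations on each side is used here):

```python
import numpy as np

def gaussian_kernel(sigma, size=None):
    """Symmetric, zero-mean 2-D Gaussian kernel; sigma plays the role of the cut-off frequency."""
    if size is None:
        size = 2 * int(np.ceil(4 * sigma)) + 1       # odd size, covering roughly +/- 4 sigma
    ax = np.arange(size) - size // 2                  # coordinates centred at 0
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return kernel / kernel.sum()                      # normalise so the kernel sums to 1
```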
Process to generate a hybrid image:
a. Implement a my_filter.py function to implement the 2-D image filtering operation. Your function should:
– pad the input image with zeros before filtering
– accept any arbitrary filter kernel with which to convolve the image; if the filter has even dimension, raise an exception
– return a filtered image of the same spatial resolution as the input
– support filtering of both grayscale and colour images
b. Implement a function to generate a Gaussian kernel of a given standard deviation. The standard deviation represents the cut-off frequency of the filter.
c. Remove high frequencies from image1 by blurring image1 with the Gaussian filter.
d. Remove low frequencies from image2 using a two-step process. First, remove high frequencies from image2 using the process described in (c). Then subtract the low-pass filtered image2 from the original image to leave only the high-frequency components.
e. Each of the filtered images can have values that are smaller than 0.0 or larger than 1.0. In such cases, clip values smaller than 0.0 to 0.0 and values larger than 1.0 to 1.0. For the high-pass filtered image, add a constant value of 0.5 to the whole image before clipping the values to lie between 0.0 and 1.0.
f. Combine the two images to generate the hybrid image (a sketch of these steps is given after this list).
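The following is a minimal sketch of steps (a) and (c)-(f), assuming NumPy and float images with values in [0, 1]. The function names, the loop-based convolution, and the way the clipped images are combined are our own assumptions, not a prescribed implementation.

```python
import numpy as np

def my_filter(image, kernel):
    """Zero-padded 2-D filtering of a grayscale or colour image (step a)."""
    if kernel.shape[0] % 2 == 0 or kernel.shape[1] % 2 == 0:
        raise ValueError("Even-dimensioned kernels are not supported.")
    kernel = kernel[::-1, ::-1]                       # flip so the sliding window implements convolution
    squeeze = (image.ndim == 2)
    if squeeze:
        image = image[:, :, np.newaxis]               # treat grayscale as a 1-channel image
    ph, pw = kernel.shape[0] // 2, kernel.shape[1] // 2
    padded = np.pad(image, ((ph, ph), (pw, pw), (0, 0)))     # zero padding
    out = np.zeros(image.shape, dtype=float)
    for c in range(image.shape[2]):
        for i in range(image.shape[0]):
            for j in range(image.shape[1]):
                window = padded[i:i + kernel.shape[0], j:j + kernel.shape[1], c]
                out[i, j, c] = np.sum(window * kernel)
    return out[:, :, 0] if squeeze else out           # same spatial resolution as the input

def make_hybrid(image1, image2, kernel):
    """Steps (c)-(f): low frequencies of image1 plus high frequencies of image2."""
    low = my_filter(image1, kernel)                   # (c) blur image1
    high = image2 - my_filter(image2, kernel)         # (d) image2 minus its blurred copy
    low_vis = np.clip(low, 0.0, 1.0)                  # (e) clip to [0, 1]
    high_vis = np.clip(high + 0.5, 0.0, 1.0)          # (e) offset the high-pass image for display
    hybrid = np.clip(low + high, 0.0, 1.0)            # (f) combine and clip
    return hybrid, low_vis, high_vis

# Usage (assuming the gaussian_kernel sketch given earlier; sigma = 7 is an arbitrary starting point):
# hybrid, low_vis, high_vis = make_hybrid(image1, image2, gaussian_kernel(sigma=7))
```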
Once the hybrid images are successfully created, you can view them from different distances to perceive different images. A useful way to visualize a hybrid image is by progressively downsampling it. By progressive downsampling, we remove a part of the frequency content of the signal. Which part of the frequency content is removed, and why? How does this affect the visualization of the hybrid images at different resolutions?
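A rough sketch of such a progressive-downsampling visualization is given below (the provided helpers.py file has its own routine for this; the function here is purely illustrative and simply keeps every second sample at each level):

```python
import numpy as np

def progressive_downsample(image, levels=4):
    """Return the hybrid image at successively halved resolutions (illustrative only)."""
    scales = [np.asarray(image, dtype=float)]
    for _ in range(levels):
        scales.append(scales[-1][::2, ::2])           # keep every second row and column
    return scales
```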
Tasks
1. There are 7 different pairs of images in the data directory provided. For each pair, you have to generate 2 different hybrid images by considering the first image in the directory as image1 and as image2, respectively. These pairs of images can be color or grayscale, and also of different resolutions. Your code should be able to handle all the different cases.
2. Use the function provided in the helpers.py file to visualize all the hybrid images at different resolutions.
3. Don't forget to tune the cut-off frequency for each pair of hybrid images to get the most visually pleasing results.