April 26, 2019
1 CSE 152: Intro to Computer Vision - Spring 2019 Assignment 2
1.1.1 Assignment published on Friday, April 26, 2019
1.1.2 Due on Wednesday, May 8, 2019 at 11:59pm
1.2 Instructions
This assignment must be completed individually. Review the academic integrity and collaboration
policies on the course website.
All solutions should be written in this notebook.
If you want to modify the skeleton code, you may do so. It has merely been provided as a
framework for your solution.
You may use Python packages for basic linear algebra (e.g. NumPy or SciPy for basic operations),
but you may not use packages that directly solve the problem. If you are unsure
about using a specific package or function, ask the instructor and/or teaching assistants for
clarification.
You must submit this notebook exported as a PDF. You must also submit this notebook as an
.ipynb file. Submit both files (.pdf and .ipynb) on Gradescope. You must mark the PDF
pages associated with each question in Gradescope. If you fail to do so, we may dock
points.
It is highly recommended that you begin working on this assignment early.
Late policy: a penalty of 10% per day after the due date.
1.3 Problem 1: Stereo and Disparity [3 pts]
Consider two cameras whose (virtual) image planes are the z=1 plane, and whose focal points are
at (-20, 0, 0) and (20, 0, 0). We'll call a point in the first camera (x, y), and a point in the second
camera (u, v). Points in each camera are relative to the camera center. So, for example, if (x, y) =
(0, 0), this is really the point (-20, 0, 1) in world coordinates, while if (u, v) = (0, 0) this is the point
(20, 0, 1).
a) Suppose the point (x, y) = (12, 12) is matched to the point (u, v) = (1, 12). What is the 3D
location of this point?
b) Consider points that lie on the line x + z = 0, y = 0. Use the same stereo setup as before.
Write an analytic expression giving the disparity of a point on this line after it projects onto
the two images, as a function of its position in the right image. So your expression should
only involve the variables u and d (for disparity). Your expression only needs to be valid for
points on the line that are in front of the cameras, i.e. with z > 1.
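For reference, a minimal sketch of the projection geometry under this setup (parallel cameras,
focal length 1, baseline 40): a world point (X, Y, Z) with Z > 0 projects to x = (X + 20)/Z and
y = Y/Z in the left image, and to u = (X - 20)/Z and v = Y/Z in the right image. The disparity is
therefore d = x - u = 40/Z, and the depth of a matched point is Z = 40/d.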
1.4 Problem 2: Sparse Stereo Matching [20 pts]
In this problem we will play around with sparse stereo matching methods. You will work on two
image pairs, a warrior figure and a figure from the Matrix movies. These files both contain two
images, two camera matrices, and associated sets of corresponding points (extracted by manually
clicking the images).
For the problems below, you will complete functions to demonstrate results on warrior image
pairs (warrior1.png, warrior2.png). In all cases, you should apply the same procedures on the
matrix image pair (matrix1.png, matrix2.png) as well. (Provide the same thing for BOTH matrix
and warrior.) Note that the matrix image pair is harder, in the sense that matching algorithms will
not work quite as well on it. You should expect good results, however, on warrior.
In [ ]: import numpy as np
import matplotlib.pyplot as plt
from skimage import io
In [ ]: def rgb2gray(rgb):
            """ Convert rgb image to grayscale.
            """
            return np.dot(rgb[...,:3], [0.299, 0.587, 0.114])

        # convert points from euclidean to homogeneous
        def to_homog(points):
            points = np.concatenate((points, np.ones((1, points.shape[1]))), axis=0)
            return points

        # convert points from homogeneous to euclidean
        def from_homog(points_homog):
            z = points_homog[-1,:]
            points_homog = points_homog[:2,:]/z
            return points_homog
In [ ]: # plot matching result
        def show_matching_result(img1, img2, matching):
            fig = plt.figure(figsize=(8, 8))
            # two dino images are of different sizes, resize one before use
            plt.imshow(np.hstack((img1, img2)), cmap='gray')
            for p1, p2 in matching:
                plt.scatter(p1[0], p1[1], s=35, edgecolors='r', facecolors='none')
                plt.scatter(p2[0] + img1.shape[1], p2[1], s=35, edgecolors='r', facecolors='none')
                plt.plot([p1[0], p2[0] + img1.shape[1]], [p1[1], p2[1]])
            plt.show()
1.4.1 Epipolar Geometry
Using the fundamental_matrix function and the corresponding points provided in cor1.npy and
cor2.npy, calculate the fundamental matrix. Note that estimation of the fundamental matrix is
ill-conditioned; we need to normalize coordinates before computing the fundamental matrix in
order to remedy this problem. The fundamental_matrix function contains code for normalization,
so you just need to complete the compute_fundamental function by implementing the eight-point
algorithm.
Next, using this fundamental matrix, implement plot_epipolar_lines to plot the epipolar lines in
both images of each pair. For this part you may want to complete the function compute_fundamental
first and then use plot_epipolar_lines to draw epipolar lines on both images. Below, we provide
some example results on the dino image pair; your results should look similar. Include your results
for both matrix and warrior, as in the figure below.
In [ ]: import numpy as np
        import matplotlib.pyplot as plt

        imgs = []
        for i in range(2):
            img = io.imread('p4/warrior/warrior' + str(i) + '.png')
            #img = io.imread('p4/matrix/matrix' + str(i) + '.png')
            imgs.append(rgb2gray(img))

        cor1 = np.load("./p4/warrior/cor1.npy")
        cor2 = np.load("./p4/warrior/cor2.npy")

        matching = [(cor1[:,i], cor2[:,i]) for i in range(cor1.shape[1])]
        show_matching_result(imgs[0],imgs[1], matching)
        # Remember to show your result for matrix image pair
1.4.2 Compute the Fundamental Matrix [5 pts]
Please complete the compute_fundamental function. You only need to write the part between
"Your Code Here!" and "Your Code End!"
In [ ]: def compute_fundamental(x1,x2):
            """ Computes the fundamental matrix from corresponding points
                (x1,x2 3*n arrays) using the 8 point algorithm.
                Each row in the A matrix below is constructed as
                [x*x', x*y', x, y*x', y*y', y, x', y', 1]
            """
            n = x1.shape[1]
            if x2.shape[1] != n:
                raise ValueError("Number of points don't match.")

            '''
            Your Code Here!
            '''
            # build matrix for equations
            A = np.zeros((n,9))
            '''
            Your Code End!
            '''

            # compute linear least square solution
            U,S,V = np.linalg.svd(A)
            F = V[-1].reshape(3,3)

            # constrain F
            # make rank 2 by zeroing out last singular value
            U,S,V = np.linalg.svd(F)
            S[2] = 0
            F = np.dot(U,np.dot(np.diag(S),V))

            return F/F[2,2]

        def fundamental_matrix(x1,x2):
            n = x1.shape[1]
            if x2.shape[1] != n:
                raise ValueError("Number of points don't match.")

            # normalize image coordinates
            x1 = x1 / x1[2]
            mean_1 = np.mean(x1[:2],axis=1)
            S1 = np.sqrt(2) / np.std(x1[:2])
            T1 = np.array([[S1,0,-S1*mean_1[0]],[0,S1,-S1*mean_1[1]],[0,0,1]])
            x1 = np.dot(T1,x1)

            x2 = x2 / x2[2]
            mean_2 = np.mean(x2[:2],axis=1)
            S2 = np.sqrt(2) / np.std(x2[:2])
            T2 = np.array([[S2,0,-S2*mean_2[0]],[0,S2,-S2*mean_2[1]],[0,0,1]])
            x2 = np.dot(T2,x2)

            # compute F with the normalized coordinates
            F = compute_fundamental(x1,x2)

            # reverse normalization
            F = np.dot(T1.T,np.dot(F,T2))

            return F/F[2,2]
1.4.3 Epipolar Lines [5 pts]
In [ ]: def plot_epipolar_lines(F,img1,img2, cor1, cor2):
            """Plot epipolar lines on image given fundamental matrix, image, corners

            Args:
                F: Fundamental matrix
                img1: Image 1.
                img2: Image 2.
                cor1: Corners in homogeneous image coordinate in image 1 (3xn)
                cor2: Corners in homogeneous image coordinate in image 2 (3xn)
            """
            """
            Your Code Here !!!
            """

In [ ]: F = fundamental_matrix(cor1,cor2)
        plot_epipolar_lines(F,imgs[0],imgs[1],cor1,cor2)
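A minimal sketch of the line computation involved, assuming the convention implied by the row layout
in compute_fundamental above (x1^T F x2 = 0): the epipolar line in image 1 associated with a corner
x2 is l1 = F x2, and the line in image 2 associated with x1 is l2 = F^T x1. The helper below,
draw_epipolar_lines_on_image, only illustrates drawing a line a*x + b*y + c = 0 across an image; it
is not necessarily the intended structure of plot_epipolar_lines.

    import numpy as np
    import matplotlib.pyplot as plt

    def draw_epipolar_lines_on_image(img, lines, points):
        # lines: 3xn array of (a, b, c) with a*x + b*y + c = 0; points: 3xn corners
        h, w = img.shape[:2]
        plt.imshow(img, cmap='gray')
        for (a, b, c), p in zip(lines.T, points.T):
            xs = np.array([0.0, w])
            ys = -(a * xs + c) / b  # assumes b != 0 (non-vertical line)
            plt.plot(xs, ys, 'g-')
            plt.scatter(p[0] / p[2], p[1] / p[2], s=35, edgecolors='r', facecolors='none')
        plt.xlim([0, w])
        plt.ylim([h, 0])
        plt.show()

    # e.g. lines in image 1 come from corners in image 2, and vice versa:
    #   draw_epipolar_lines_on_image(img1, F.dot(cor2), cor1)
    #   draw_epipolar_lines_on_image(img2, F.T.dot(cor1), cor2)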
1.4.4 Image Rectification
An interesting case for epipolar geometry occurs when the two images are parallel to each other.
In this case there is no rotation component between the two images, and the essential matrix is
E = [T]×R = [T]×. Also, if you observe the epipolar lines l and l' for parallel images, they are
horizontal and, consequently, corresponding epipolar lines share the same vertical coordinate.
Therefore the process of making images parallel becomes useful for discerning the relationships
between corresponding points in images. Rectifying a pair of images can also be done for
uncalibrated camera images (i.e. we do not require the camera matrix of intrinsic parameters).
Using the fundamental matrix we can find the pair of epipolar lines l_i and l_i' for each of the
correspondences. The intersection of these lines gives us the respective epipoles e and e'. Now,
to make the epipolar lines parallel, we need to map the epipoles to infinity; hence, we need to
find a homography that maps the epipoles to infinity. The method to find the homography has been
implemented for you. You can read more about the method used to estimate the homography in the
paper "Theory and Practice of Projective Rectification" by Richard Hartley.
In this part you first need to complete the function compute_epipole, which calculates the epipoles
of the two images from a given fundamental matrix.
Using the compute_epipole function and the given compute_matching_homographies function, we get
HL and HR. Then we can complete the function image_rectification to find the rectified images and
plot the parallel epipolar lines using the plot_epipolar_lines function from above. You need to
run this for both the matrix and the warrior images. A sample output is provided below:
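A minimal sketch of the underlying computation, assuming the same x1^T F x2 = 0 convention as above:
the epipole e2 in image 2 is the right null vector of F (F e2 = 0) and the epipole e1 in image 1 is
the left null vector (F^T e1 = 0), both of which can be read off an SVD of F. The helper name
epipoles_from_F is hypothetical and is not necessarily how compute_epipole is meant to be structured.

    import numpy as np

    def epipoles_from_F(F):
        # right null vector: F e2 = 0; left null vector: F.T e1 = 0
        U, S, Vt = np.linalg.svd(F)
        e2 = Vt[-1]
        e1 = U[:, -1]
        return e1 / e1[2], e2 / e2[2]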
1.4.5 Compute Epipole [5 pts]
In [ ]: def compute_epipole(F):
            '''
            This function computes the epipoles for a given fundamental matrix.

            input:
              F--> Fundamental matrix
            output:
              e1--> epipole in image 1
              e2--> epipole in image 2
            '''
            """
            Your Code Here!!!
            """
            return e1/e1[2], e2/e2[2]
1.4.6 Image Rectification [5 pts]
In [ ]: def compute_matching_homographies(e2, F, im2, points1, points2):
            '''This function computes the homographies to get the rectified images

            input:
              e2--> epipole in image 2
              F--> the Fundamental matrix
              im2--> image 2
              points1--> corner points in image 1
              points2--> corresponding corner points in image 2
            output:
              H1--> Homography for image 1
              H2--> Homography for image 2
            '''
            # calculate H2
            width = im2.shape[1]
            height = im2.shape[0]

            T = np.identity(3)
            T[0][2] = -1.0 * width / 2
            T[1][2] = -1.0 * height / 2

            e = T.dot(e2)
            e1_prime = e[0]
            e2_prime = e[1]
            if e1_prime >= 0:
                alpha = 1.0
            else:
                alpha = -1.0

            R = np.identity(3)
            R[0][0] = alpha * e1_prime / np.sqrt(e1_prime**2 + e2_prime**2)
            R[0][1] = alpha * e2_prime / np.sqrt(e1_prime**2 + e2_prime**2)
            R[1][0] = - alpha * e2_prime / np.sqrt(e1_prime**2 + e2_prime**2)
            R[1][1] = alpha * e1_prime / np.sqrt(e1_prime**2 + e2_prime**2)

            f = R.dot(e)[0]
            G = np.identity(3)
            G[2][0] = - 1.0 / f

            H2 = np.linalg.inv(T).dot(G.dot(R.dot(T)))

            # calculate H1
            e_prime = np.zeros((3, 3))
            e_prime[0][1] = -e2[2]
            e_prime[0][2] = e2[1]
            e_prime[1][0] = e2[2]
            e_prime[1][2] = -e2[0]
            e_prime[2][0] = -e2[1]
            e_prime[2][1] = e2[0]

            v = np.array([1, 1, 1])
            M = e_prime.dot(F) + np.outer(e2, v)

            points1_hat = H2.dot(M.dot(points1.T)).T
            points2_hat = H2.dot(points2.T).T

            W = points1_hat / points1_hat[:, 2].reshape(-1, 1)
            b = (points2_hat / points2_hat[:, 2].reshape(-1, 1))[:, 0]

            # least square problem
            a1, a2, a3 = np.linalg.lstsq(W, b)[0]
            HA = np.identity(3)
            HA[0] = np.array([a1, a2, a3])

            H1 = HA.dot(H2).dot(M)
            return H1, H2

        def image_rectification(im1,im2,points1,points2):
            '''This function provides the rectified images along with the new corner
            points as outputs for a given pair of images with corner correspondences

            input:
              im1--> image 1
              im2--> image 2
              points1--> corner points in image 1
              points2--> corner points in image 2
            output:
              rectified_im1--> rectified image 1
              rectified_im2--> rectified image 2
              new_cor1--> new corners in the rectified image 1
              new_cor2--> new corners in the rectified image 2
            '''
            '''
            Your Code Here!!!
            '''
            return rectified_ims[0],rectified_ims[1],new_cors[0],new_cors[1]
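A minimal sketch, for reference, of one way to warp an image with a 3x3 homography using skimage,
which may be useful when completing image_rectification. The helper name warp_with_homography is
hypothetical, it assumes a grayscale image, and the handling of the output frame is deliberately
simplified (it keeps the original image shape).

    import numpy as np
    from skimage import transform

    def warp_with_homography(img, H):
        # H acts on homogeneous (x, y, 1) pixel coordinates; warp() needs the map
        # from output coordinates back to input coordinates, hence inv(H)
        tf = transform.ProjectiveTransform(matrix=np.linalg.inv(H))
        return transform.warp(img, inverse_map=tf, output_shape=img.shape)

    # corner points transform directly: new_cor = H.dot(cor), then divide each
    # column by its last homogeneous coordinate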
In [ ]: # find the rectified images and plot the parallel epipolar lines
rectified_im1,rectified_im2,new_cor1,new_cor2 = image_rectification(imgs[0],imgs[1],cor1,cor2)
newF = fundamental_matrix(new_cor1,new_cor2)
plot_epipolar_lines(newF, rectified_im1,rectified_im2, new_cor1, new_cor2)
1.5 Problem 3: RANSAC for Estimating the Fundamental Matrix [17 pts]
We will now use SIFT to detect and match features, then use RANSAC to eliminate outliers that
do not conform to a fundamental matrix model. For this problem, we are providing matched SIFT
points in text files that you may simply read as input.
1.5.1 Visualization of matching points [2 pts]
Use the provided matched SIFT points in the two images road1.png (left image) and road2.png
(right image). Visualize the matched features by drawing lines between the left and right images.
You may use the provided show_matching_result function. The data in points1.txt are the keypoints
in the left image and the data in points2.txt are the keypoints in the right image. Each row has
the x and y coordinates of a point, and corresponding rows in the two files are matching points.
Randomly select and visualize 20 of the matched points.
In [ ]: import numpy as np
        from skimage import io
        import matplotlib.pyplot as plt
        import random

        x1 = np.loadtxt("points1.txt").T
        x2 = np.loadtxt("points2.txt").T

        roadimgs = []
        for i in range(2):
            img = io.imread('road' + str(i+1) + '.png')
            roadimgs.append(rgb2gray(img))

        # Your code here
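A minimal sketch of one way the visualization could be done, reusing x1, x2, roadimgs and random
from the cell above together with show_matching_result from Problem 2; this is an illustration, not
necessarily the intended solution.

    # sample 20 of the provided matches and draw them
    idx = random.sample(range(x1.shape[1]), 20)
    matching = [(x1[:, i], x2[:, i]) for i in idx]
    show_matching_result(roadimgs[0], roadimgs[1], matching)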
1.5.2 Estimation of fundamental matrix using SIFT matches and plotting epipolar lines [3 pts]
Estimate the fundamental matrix using the SIFT matches. Plot the epipolar lines for 5 randomly
selected keypoints. You may use the fundamental_matrix and plot_epipolar_lines functions to do this.
Note that it is normal not to get a correct result here, due to the noisy matching pairs.
In [ ]: # Your code here
1.5.3 RANSAC with 8-point algorithm [10 pts]
Use RANSAC with the 8-point algorithm to remove outliers and re-estimate the fundamental
matrix with the inliers. Visualize the inlier matches by drawing lines between the left and right
images. Plot the epipolar lines for 5 randomly selected keypoints.
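A minimal sketch of the RANSAC scheme that compute_fundamental_RANSAC below is meant to implement,
reusing compute_fundamental from Problem 2 and scoring candidate models by the algebraic epipolar
residual |x1^T F x2|. The name ransac_fundamental_sketch is hypothetical; treat this as one possible
approach under those assumptions, not the required solution.

    import numpy as np

    def ransac_fundamental_sketch(x1, x2, thresh, n_iter):
        # x1, x2: 3xn homogeneous correspondences (already normalized)
        n = x1.shape[1]
        best_F, best_inliers, history = None, np.array([], dtype=int), []
        for _ in range(n_iter):
            sample = np.random.choice(n, 8, replace=False)
            F = compute_fundamental(x1[:, sample], x2[:, sample])
            # algebraic epipolar residual |x1_i^T F x2_i| for every match
            residuals = np.abs(np.sum(x1 * F.dot(x2), axis=0))
            inliers = np.where(residuals < thresh)[0]
            if len(inliers) > len(best_inliers):
                best_F, best_inliers = F, inliers
            history.append(len(best_inliers))
        # re-estimate the model on all inliers of the best sample
        if len(best_inliers) >= 8:
            best_F = compute_fundamental(x1[:, best_inliers], x2[:, best_inliers])
        return best_F, best_inliers, history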
In [ ]: from tqdm import tqdm

        def compute_fundamental_RANSAC(cor1, cor2, epiConstThres, nSample):
            """
            Input:
              - cor1, cor2: corners in image1 and image2
              - epiConstThres: threshold for accepting inliers
              - nSample: number of iterations for RANSAC
            Output:
              - bestF: best fundamental matrix
              - bestInliersIdx: under bestF, the indices of the inlier matching points
              - bestInliersNumList: the best number of inliers so far at each iteration,
                with length nSample
            """
            """ YOUR CODE HERE !!!"""
            bestInliersNumList = []
            return bestF, bestInliersIdx, bestInliersNumList

        def fundamental_matrix_RANSAC(x1,x2, epiConstThres, nSample):
            n = x1.shape[1]
            if x2.shape[1] != n:
                raise ValueError("Number of points don't match.")

            # normalize image coordinates
            x1 = x1 / x1[2]
            mean_1 = np.mean(x1[:2],axis=1)
            S1 = np.sqrt(2) / np.std(x1[:2])
            T1 = np.array([[S1,0,-S1*mean_1[0]],[0,S1,-S1*mean_1[1]],[0,0,1]])
            x1 = np.dot(T1,x1)

            x2 = x2 / x2[2]
            mean_2 = np.mean(x2[:2],axis=1)
            S2 = np.sqrt(2) / np.std(x2[:2])
            T2 = np.array([[S2,0,-S2*mean_2[0]],[0,S2,-S2*mean_2[1]],[0,0,1]])
            x2 = np.dot(T2,x2)

            # compute F with the normalized coordinates
            bestF, bestInliersIdx, bestInliersNumList = compute_fundamental_RANSAC(x1,x2,epiConstThres,nSample)

            # reverse normalization
            bestF = np.dot(T1.T,np.dot(bestF,T2))

            return bestF/bestF[2,2], bestInliersIdx, bestInliersNumList

        # calculating F using RANSAC
        epiConstThres = 0.1
        nSample = 3000
        np.random.seed(10)

        # x1_h, x2_h: homogeneous versions of the 2xn matches loaded in 1.5.1
        # (using to_homog from Problem 2)
        x1_h = to_homog(x1)
        x2_h = to_homog(x2)

        F, bestInliersIdx, bestInliersNumList = fundamental_matrix_RANSAC(x1_h, x2_h, epiConstThres, nSample)
        inlierPts1 = x1_h[:,bestInliersIdx]
        inlierPts2 = x2_h[:,bestInliersIdx]

        chooseidx = np.random.choice(inlierPts1.shape[1], 5, replace=False)
        plot_epipolar_lines(F, roadimgs[0], roadimgs[1], inlierPts1[:,chooseidx], inlierPts2[:,chooseidx])

        print('Number of inliers as iteration increases:')
        plt.plot(np.arange(len(bestInliersNumList)), bestInliersNumList, 'b-')
1.5.4 Conceptual question [2 pts]
Conceptually, can you guess approximately where the epipole should lie for the two images
above? Explain your reasoning. Do your epipolar lines above match that intuition?
Your answer here.
1.6 Submission Instructions
Remember to submit a PDF version of this notebook to Gradescope. Please make sure the contents
in each cell are clearly shown in your final PDF file.
There are multiple options for converting the notebook to PDF:
1. You can find the export option at File → Download as → PDF via LaTeX.
2. You can first export as HTML and then convert to PDF.
In [ ]: