Using numpy and matplotlib, I'm trying to fit a straight line to the following data points with numpy.polyfit and plot the fitted polynomial:
x = [0, 5, 10, 15, 20]
y = [0, 0.07, 0.14, 0.2, 0.27]
Using this code:
import numpy as np
import matplotlib.pyplot as plt

x = [0, 5, 10, 15, 20]
y = [0, 0.07, 0.14, 0.2, 0.27]

poly = np.polyfit(x, y, 1)   # fit a first-degree (linear) polynomial
f = np.poly1d(poly)          # wrap the coefficients in a callable polynomial

plt.plot(f)
plt.show()
The variable f in the above code is the polynomial 0.0134 x + 0.002, so plotting it should give a straight line rising gently to the right. But when I plot it, I get this instead:

[screenshot of the resulting plot]
What could be wrong with the code?
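
For reference, I believe evaluating the polynomial myself at explicit x values, roughly like the sketch below (the dense xs grid is just something I added for plotting), would give the rising line I expect:

import numpy as np
import matplotlib.pyplot as plt

x = [0, 5, 10, 15, 20]
y = [0, 0.07, 0.14, 0.2, 0.27]

f = np.poly1d(np.polyfit(x, y, 1))   # same linear fit as above

xs = np.linspace(0, 20, 100)         # dense grid spanning the data range
plt.plot(x, y, 'o')                  # the original data points
plt.plot(xs, f(xs), '-')             # evaluate f at xs to get the y values
plt.show()

So why doesn't passing f directly to plt.plot give that same line?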
