
In Python 2.7.6 with matplotlib and scikit-learn 0.17.0, when I draw polynomial regression lines on a scatter plot, the curves come out really messy, like this:

[screenshot: scatter plot with jagged, self-crossing polynomial curves]

The script is as follows: it reads two columns of floating-point data, makes a scatter plot, and fits the regressions:

import pandas as pd
import matplotlib.pyplot as plt
from sklearn.cross_validation import train_test_split  # sklearn.model_selection in newer versions
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import Ridge

df = pd.read_csv("boston_real_estate_market_clean.csv")

LSTAT = df['LSTAT'].as_matrix()
LSTAT = LSTAT.reshape(LSTAT.shape[0], 1)

MEDV = df['MEDV'].as_matrix()
MEDV = MEDV.reshape(MEDV.shape[0], 1)

# Train test set split
X_train1, X_test1, y_train1, y_test1 = train_test_split(LSTAT, MEDV, test_size=0.3, random_state=1)

# Polynomial regression, nth order

plt.scatter(X_test1, y_test1, s=10, alpha=0.3)

for degree in [1, 2, 3, 4, 5]:
    model = make_pipeline(PolynomialFeatures(degree), Ridge())
    model.fit(X_train1, y_train1)
    y_plot = model.predict(X_test1)
    plt.plot(X_test1, y_plot, label="degree %d" % degree
             + '; $q^2$: %.2f' % model.score(X_train1, y_train1)
             + '; $R^2$: %.2f' % model.score(X_test1, y_test1))


plt.legend(loc='upper right')

plt.show()
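For what it's worth, the jaggedness reproduces without the CSV. A minimal sketch on made-up data (not the Boston set) shows that plt.plot simply connects points in the order they are given, and the split leaves the x values unsorted:

```python
# Minimal repro on synthetic data (not the Boston CSV): plt.plot connects
# points in the order given, so unsorted x values draw zig-zag segments.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge

rng = np.random.RandomState(1)
X = rng.uniform(0, 10, (100, 1))                # unsorted, like train_test_split output
y = np.sin(X).ravel() + rng.normal(0, 0.2, 100)

model = make_pipeline(PolynomialFeatures(3), Ridge())
model.fit(X, y)
y_plot = model.predict(X)

# Consecutive x values move both up and down, so the plotted line
# would jump back and forth along the x axis.
print(np.any(np.diff(X.ravel()) < 0))           # True
```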

I guess the reason is that X_test1 and y_plot are not sorted properly?

X_test1 is a numpy array like this:

[[  5.49]
 [ 16.65]
 [ 17.09]
 ....
 [ 25.68]
 [ 24.39]]

y_plot is a numpy array like this:

[[ 29.78517812]
 [ 17.16759833]
 [ 16.86462359]
 [ 23.18680265]
...[ 37.7631725 ]]

I tried to sort with this:

[X_test1, y_plot] = zip(*sorted(zip(X_test1, y_plot), key=lambda y_plot: y_plot[0]))

plt.plot(X_test1, y_plot, label="degree %d" % degree
         + '; $q^2$: %.2f' % model.score(X_train1, y_train1)
         + '; $R^2$: %.2f' % model.score(X_test1, y_test1))

The curve looks normal now, but the result is weird: R² comes out negative.
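That negative R² is a clue in itself. A quick synthetic check (made-up numbers, hypothetical arrays) shows that scoring against labels that are no longer paired with their features drives R² well below the good-fit value, typically negative:

```python
# Hypothetical demo: r2_score collapses once labels are shuffled
# out of alignment with the predictions they belong to.
import numpy as np
from sklearn.metrics import r2_score

rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 10, 200))
y = 2.0 * X + rng.normal(0, 1.0, 200)          # labels paired with X
y_pred = 2.0 * X                               # good predictions for *this* pairing

r2_good = r2_score(y, y_pred)                  # close to 1: pairing intact
r2_broken = r2_score(rng.permutation(y), y_pred)  # pairing broken: drops sharply
print(r2_good, r2_broken)
```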

[screenshot: smooth curves, but negative $R^2$ values in the legend]

Could anyone show me what the real issue is, or how to sort properly here? Thank you!

2 Comments
  • That's especially weird, as any real number squared should be positive... imaginary numbers?! (Feb 18, 2016)
  • Have you tried reversing the sort using reverse=True as an argument to sorted? No idea if it will work, but worth a try. (Feb 18, 2016)

1 Answer


While the plot is now correct, the sorting broke the pairing of X_test1 with y_test1: y_test1 was never sorted the same way, so model.score(X_test1, y_test1) compares mismatched rows, which is why R² goes negative. The cleanest fix is to sort right after the split; then y_plot, which is computed later, is automatically in the right order. (Untested example, using numpy as np:)

X_train1, X_test1, y_train1, y_test1 = train_test_split(LSTAT, MEDV, test_size=0.3, random_state=1)

# X_test1 is an (n, 1) column, so flatten before argsort:
# np.argsort on a 2-D array sorts along the last axis by default
sorted_index = np.argsort(X_test1.ravel())
X_test1 = X_test1[sorted_index]
y_test1 = y_test1[sorted_index]
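Put together, a self-contained sketch of this fix on synthetic stand-in data (not the Boston CSV; train_test_split is imported from model_selection here, which replaces cross_validation in newer scikit-learn):

```python
# Sketch of the fix on synthetic stand-in data: sort the test set once,
# right after the split, so X_test, y_test, and y_plot all stay paired.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge

rng = np.random.RandomState(1)
LSTAT = rng.uniform(1, 38, (200, 1))                 # (n, 1) columns, as in the question
MEDV = 50.0 - LSTAT + rng.normal(0, 3.0, (200, 1))   # strong linear relation plus noise

X_train, X_test, y_train, y_test = train_test_split(
    LSTAT, MEDV, test_size=0.3, random_state=1)

# Flatten before argsort: argsort on an (n, 1) array sorts along the rows.
order = np.argsort(X_test.ravel())
X_test, y_test = X_test[order], y_test[order]

model = make_pipeline(PolynomialFeatures(2), Ridge())
model.fit(X_train, y_train)
y_plot = model.predict(X_test)                       # already in plotting order

print(np.all(np.diff(X_test.ravel()) >= 0))          # True: x is now monotone
print(model.score(X_test, y_test) > 0)               # True: pairing intact, sensible R^2
```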