I've just started learning machine learning with Siraj Raval's videos on YouTube, and I tried the challenge from the video "Intro – The Math of Intelligence": performing linear regression via gradient descent on a dataset from kaggle.com. Here is my code:
    """Example of a linear-regression model.

    Uses the dataset from https://www.kaggle.com/alopez247/pokemon
    to find the relationship between the variables "Total" and "HP".
    """
    import numpy as np
    import pandas as pd
    from matplotlib import pyplot as plt
    import sys
    import os

    data = pd.read_csv("./pokemon_alopez247.csv")
    d = {"Total": data['Total'], "HP": data['HP']}
    smallData = pd.DataFrame(d)
    test = smallData.values
    epsilon = 0.001


    def compute_error_for_line(b, m, points):
        """Return the error of the line for the given points."""
        totalError = 0
        for i in range(0, len(points)):
            x = test[i, 0]
            y = test[i, 1]
            totalError += (y - (m * x + b)) ** 2
        return totalError / float(len(points))


    def step_gradient(b_current, m_current, points, learningRate):
        """Return the new b and m values."""
        b_gradient = 0
        m_gradient = 0
        N = float(len(points))
        for i in range(0, len(points)):
            x = points[i, 0]
            y = points[i, 1]
            error = y - ((m_current * x) + b_current)
            b_gradient += -(2 / N) * error
            m_gradient += -(2 / N) * x * error
        new_b = b_current - (learningRate * b_gradient)
        new_m = m_current - (learningRate * m_gradient)
        return [new_b, new_m]


    def main():
        """Fit the line and plot it here."""
        plt.figure(num=None, figsize=(20, 10), dpi=80,
                   facecolor='w', edgecolor='k')
        plt.axis([0, 780, 0, 260])
        plt.ylabel("Total")
        plt.xlabel("HP")
        plt.scatter(test[:, [1]], test[:, [0]], c='r', s=1)
        m = 0.3
        b = -30
        x = np.arange(800)
        y = m * x + b
        for i in range(30):
            error = compute_error_for_line(b, m, test)
            print("error :", error)
            if error > epsilon:
                y = m * x + b
                plt.plot(x, y)
                b, m = step_gradient(b, m, test, 0.0001)
                print("b , m :", b, ",", m)
                plt.pause(0.01)
        plt.show()
        plt.pause(0.001)


    if __name__ == '__main__':
        try:
            main()
        except KeyboardInterrupt:
            print('Interrupted')
            try:
                sys.exit(0)
            except SystemExit:
                os._exit(0)
Here is the output:
    error : 193676.072288
    b , m : -29.91451362 , 6.46934413315
    /usr/local/lib/python3.5/dist-packages/matplotlib/backend_bases.py:2445: MatplotlibDeprecationWarning: Using default event loop until function specific to this GUI is implemented
      warnings.warn(str, mplDeprecation)
    error : 16427.2683093
    b , m : -29.9134163218 , 6.04491523016
    error : 15588.2873385
    b , m : -29.9065147511 , 6.07401898958
    error : 15583.8939554
    b , m : -29.9000125838 , 6.07192788394
    error : 15583.4489928
    b , m : -29.8934831191 , 6.07198242461
    error : 15583.0227312
    b , m : -29.8869557061 , 6.07188938575
    error : 15582.5965792
    b , m : -29.8804283262 , 6.07180649992
    error : 15582.1704489
    b , m : -29.8739011182 , 6.07172291798
    error : 15581.74434
    b , m : -29.8673740726 , 6.07163938615
    error : 15581.3182523
    b , m : -29.86084719 , 6.0715558531
    error : 15580.8921858
    b , m : -29.8543204704 , 6.07147232236
    error : 15580.4661407
    b , m : -29.8477939138 , 6.0713887937
    error : 15580.0401168
    b , m : -29.8412675201 , 6.07130526712
    error : 15579.6141143
    b , m : -29.8347412894 , 6.07122174263
    error : 15579.1881329
    b , m : -29.8282152217 , 6.07113822022
    error : 15578.7621729
    b , m : -29.821689317 , 6.0710546999
    error : 15578.3362341
    b , m : -29.8151635752 , 6.07097118166
    error : 15577.9103166
    b , m : -29.8086379963 , 6.07088766551
    error : 15577.4844204
    b , m : -29.8021125804 , 6.07080415145
    error : 15577.0585455
    b , m : -29.7955873275 , 6.07072063947
    error : 15576.6326918
    b , m : -29.7890622375 , 6.07063712957
    error : 15576.2068594
    b , m : -29.7825373104 , 6.07055362176
    error : 15575.7810482
    b , m : -29.7760125462 , 6.07047011604
    error : 15575.3552583
    b , m : -29.769487945 , 6.0703866124
    error : 15574.9294897
    b , m : -29.7629635067 , 6.07030311084
    error : 15574.5037423
    b , m : -29.7564392314 , 6.07021961138
    error : 15574.0780162
    b , m : -29.7499151189 , 6.07013611399
    error : 15573.6523114
    b , m : -29.7433911694 , 6.07005261869
    error : 15573.2266278
    b , m : -29.7368673827 , 6.06996912548
    error : 15572.8009655
    b , m : -29.730343759 , 6.06988563435
    [Finished in 73.209s]
The output suggests everything is going as planned. But look at the plot: the first blue line sits near the original values, yet each subsequent line drifts further and further away from the data! I tried rewriting compute_error_for_line and step_gradient, but it still doesn't work. Thanks for reading this far.
So, how do I find the parameters of the line that best fits my sample space?
A link to my csv file is here (the file expires in 22 hours).
Answer:
plt.scatter(test[:, [1]], test[:, [0]], c='r', s=1)
It looks like you have swapped the x and y values. Your model is fit as HP = m * Total + b, with Total in column 0 and HP in column 1, but the scatter plot puts column 1 on the x-axis. If you change [1] to [0] and vice versa, the plot will look fine.
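Here is a minimal sketch of the fix. Since I don't have your csv, I generate a small synthetic array with the same column layout as your `test` (column 0 = Total, column 1 = HP); the coefficients 0.13 and 5 are made up for illustration. The `np.polyfit` call at the end is just a closed-form least-squares sanity check you can compare your gradient-descent b and m against:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the sketch runs headless
from matplotlib import pyplot as plt

# Synthetic stand-in for `test`: column 0 = Total, column 1 = HP
rng = np.random.default_rng(0)
total = rng.uniform(180, 780, size=100)
hp = 0.13 * total + 5 + rng.normal(0, 10, size=100)  # made-up relationship
test = np.column_stack([total, hp])

# The model is HP = m * Total + b, so Total (column 0) goes on the
# x-axis and HP (column 1) on the y-axis -- indices swapped vs. the question:
plt.figure()
plt.xlabel("Total")
plt.ylabel("HP")
plt.scatter(test[:, 0], test[:, 1], c="r", s=1)

# Closed-form least squares as a reference for the gradient-descent result
m, b = np.polyfit(test[:, 0], test[:, 1], 1)
x = np.arange(800)
plt.plot(x, m * x + b)
```

With the axes matching the fit, the regression line should track the point cloud instead of drifting away from it.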