I am trying to predict a value with a linear regression model. However, when I use sklearn's .predict function, I can't find a way to feed in the X data without running into a data type error.
from sklearn import linear_model

KitchenQual_X = KitchenQual_df[["OverallQual", "YearBuilt", "YearRemodAdd", "GarageCars", "GarageArea"]]
KitchenQual_Y = KitchenQual_df["dummy_KitchenQual"]

regr_KitchenQual = linear_model.LinearRegression()
regr_KitchenQual.fit(KitchenQual_X, KitchenQual_Y)

print("Predicted missing KitchenQual value: " + regr_KitchenQual.predict(df_both[["OverallQual", "YearBuilt", "YearRemodAdd", "GarageCars", "GarageArea"]].loc[[1555]]))
When I run the code in my Kaggle notebook, I get the following error:
---------------------------------------------------------------------------
UFuncTypeError                            Traceback (most recent call last)
<ipython-input-206-1f022a48e21c> in <module>
----> 1 print("Predicted missing KitchenQual value: " + regr_KitchenQual.predict(df_both[["OverallQual", "YearBuilt", "YearRemodAdd", "GarageCars", "GarageArea"]].loc[[1555]]))

UFuncTypeError: ufunc 'add' did not contain a loop with signature matching types (dtype('<U37'), dtype('<U37')) -> dtype('<U37')
Any help would be greatly appreciated 🙂
Answer:
Assuming your dependent variable is continuous, here are your steps repeated with sample data:
from sklearn import linear_model
import numpy as np
import pandas as pd

KitchenQual_df = pd.DataFrame(np.random.normal(0, 1, (2000, 6)))
KitchenQual_df.columns = ["OverallQual", "YearBuilt", "YearRemodAdd", "GarageCars", "GarageArea", "dummy_KitchenQual"]

KitchenQual_X = KitchenQual_df[["OverallQual", "YearBuilt", "YearRemodAdd", "GarageCars", "GarageArea"]]
KitchenQual_Y = KitchenQual_df["dummy_KitchenQual"]

regr_KitchenQual = linear_model.LinearRegression()
regr_KitchenQual.fit(KitchenQual_X, KitchenQual_Y)

pred = regr_KitchenQual.predict(KitchenQual_df[["OverallQual", "YearBuilt", "YearRemodAdd", "GarageCars", "GarageArea"]].loc[[1555]])
The prediction result is an array, and you cannot use + to concatenate a string with an array directly. The following negative examples give you the same error:

"a" + np.array(['b','c'])
"a" + np.array([1,2])

UFuncTypeError: ufunc 'add' did not contain a loop with signature matching types (dtype('<U1'), dtype('<U1')) -> dtype('<U1')
You can do it like this instead:
print("Predicted missing KitchenQual value: " + str(pred[0]))
Predicted missing KitchenQual value: -0.11176904834490986
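As a side note, here is a minimal alternative sketch (assuming Python 3.6+ and the same pred array from above): an f-string formats the numeric prediction into the message without an explicit str() call, which sidesteps the string/array concatenation issue entirely.

# Alternative sketch: f-string formatting converts the single-element
# prediction to text implicitly, so no manual str() conversion is needed.
print(f"Predicted missing KitchenQual value: {pred[0]}")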