I am using the data below to make predictions with an XGBoost regression model. The problem is that the regressor predicts the same output for every input, and I am not sure why.
```python
import numpy as np
import pandas as pd
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error as mse, r2_score

data = pd.read_csv("depthwise_data.csv", delimiter=',', header=None, skiprows=1,
                   names=['input_size', 'input_channels', 'conv_kernel',
                          'conv_strides', 'running_time'])

X = data[['input_size', 'input_channels', 'conv_kernel', 'conv_strides']]
Y = data[["running_time"]]

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(Y), test_size=0.2, random_state=42)

y_train_log = np.log(y_train)
y_test_log = np.log(y_test)

xgb_depth_conv = xgb.XGBRegressor(objective='reg:squarederror', n_estimators=1000,
                                  seed=123, tree_method='hist', max_depth=10)
xgb_depth_conv.fit(X_train, y_train_log)

y_pred_train = xgb_depth_conv.predict(X_train)
#y_pred_test = xgb_depth_conv.predict(X_test)

X_data = [[8, 576, 3, 2]]  # instance
X_test = np.log(X_data)
y_pred_test = xgb_depth_conv.predict(X_test)
print(np.exp(y_pred_test))

MSE_test, MSE_train = mse(y_test_log, y_pred_test), mse(y_train_log, y_pred_train)
R_squared = r2_score(y_pred_test, y_test_log)
print("MSE-Train = {}".format(MSE_train))
print("MSE-Test = {}".format(MSE_test))
print("R-Squared: ", np.round(R_squared, 2))
```
Output for the first instance:

```python
X_data = [[8, 576, 3, 2]]
print(np.exp(y_pred_test))
# [0.7050679]
```
Output for the second instance:

```python
X_data = [[4, 960, 3, 1]]
print(np.exp(y_pred_test))
# [0.7050679]
```
Answer:
Your problem is in this line:

```python
X_test = np.log(X_data)
```

Why do you apply `np.log` to the test instance when you never applied it to the training samples? The model was trained on raw feature values, so a log-transformed instance falls outside the feature ranges the trees were split on, and every such instance lands in the same leaves, hence the identical prediction. If you drop `np.log` entirely, even from the target (y), you get very good results. I tested this myself with the data you provided.