I am trying to build an XGBoost binary classification model. I set up the training and test data and do the following to fit the model:
clf_xgb = xgb.XGBClassifier(objective = 'binary: logistic', missing = None, seed = 42)

clf_xgb.fit(X_train,
            y_train,
            eval_set = [(X_test, y_test)],
            eval_metric = 'aucpr',
            early_stopping_rounds = 10,
            verbose = True)
When I run this code, I get the following error message:
XGBoostError                              Traceback (most recent call last)
<ipython-input-32-2a6f36907545> in <module>
----> 1 clf_xgb.fit(X_train,
      2             y_train,
      3             eval_set = [(X_test, y_test)],
      4             eval_metric = 'aucpr',
      5             early_stopping_rounds=10,

D:\Softwares\anaconda\lib\site-packages\xgboost\core.py in inner_f(*args, **kwargs)
    434         for k, arg in zip(sig.parameters, args):
    435             kwargs[k] = arg
--> 436         return f(**kwargs)
    437
    438     return inner_f

D:\Softwares\anaconda\lib\site-packages\xgboost\sklearn.py in fit(self, X, y, sample_weight, base_margin, eval_set, eval_metric, early_stopping_rounds, verbose, xgb_model, sample_weight_eval_set, base_margin_eval_set, feature_weights, callbacks)
   1174         )
   1175
-> 1176         self._Booster = train(
   1177             params,
   1178             train_dmatrix,

D:\Softwares\anaconda\lib\site-packages\xgboost\training.py in train(params, dtrain, num_boost_round, evals, obj, feval, maximize, early_stopping_rounds, evals_result, verbose_eval, xgb_model, callbacks)
    187     Booster : a trained booster model
    188     """
--> 189     bst = _train_internal(params, dtrain,
    190                           num_boost_round=num_boost_round,
    191                           evals=evals,

D:\Softwares\anaconda\lib\site-packages\xgboost\training.py in _train_internal(params, dtrain, num_boost_round, evals, obj, feval, xgb_model, callbacks, evals_result, maximize, verbose_eval, early_stopping_rounds)
     74                            show_stdv=False, cvfolds=None)
     75
---> 76     bst = callbacks.before_training(bst)
     77
     78     for i in range(start_iteration, num_boost_round):

D:\Softwares\anaconda\lib\site-packages\xgboost\callback.py in before_training(self, model)
    374         '''Function called before training.'''
    375         for c in self.callbacks:
--> 376             model = c.before_training(model=model)
    377             msg = 'before_training should return the model'
    378             if self.is_cv:

D:\Softwares\anaconda\lib\site-packages\xgboost\callback.py in before_training(self, model)
    513
    514     def before_training(self, model):
--> 515         self.starting_round = model.num_boosted_rounds()
    516         return model
    517

D:\Softwares\anaconda\lib\site-packages\xgboost\core.py in num_boosted_rounds(self)
   2005         rounds = ctypes.c_int()
   2006         assert self.handle is not None
-> 2007         _check_call(_LIB.XGBoosterBoostedRounds(self.handle, ctypes.byref(rounds)))
   2008         return rounds.value
   2009

D:\Softwares\anaconda\lib\site-packages\xgboost\core.py in _check_call(ret)
    208     """
    209     if ret != 0:
--> 210         raise XGBoostError(py_str(_LIB.XGBGetLastError()))
    211
    212

XGBoostError: [12:05:23] C:\Users\Administrator\workspace\xgboost-win64_release_1.4.0\src\objective\objective.cc:26: Unknown objective function: `binary: logistic`
Objective candidate: survival:aft
Objective candidate: binary:hinge
Objective candidate: multi:softmax
Objective candidate: multi:softprob
Objective candidate: rank:pairwise
Objective candidate: rank:ndcg
Objective candidate: rank:map
Objective candidate: count:poisson
Objective candidate: survival:cox
Objective candidate: reg:gamma
Objective candidate: reg:tweedie
Objective candidate: reg:squarederror
Objective candidate: reg:squaredlogerror
Objective candidate: reg:logistic
Objective candidate: reg:pseudohubererror
Objective candidate: binary:logistic
Objective candidate: binary:logitraw
Objective candidate: reg:linear
Can anyone explain what is going on and how I can fix this error? I am using a Jupyter Notebook with Python 3 and the latest version of the XGBoost library.
Answer:
Removing the space from 'binary: logistic' should fix it: the objective name is 'binary:logistic'. According to the documentation, there is no space in the middle.
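For reference, here is a minimal, self-contained sketch of the corrected call. The synthetic dataset from make_classification and the train_test_split step are placeholders standing in for the question's own X_train / X_test / y_train / y_test; the remaining arguments mirror the question and follow the XGBoost 1.4 API shown in the traceback (newer releases expect eval_metric and early_stopping_rounds in the constructor instead of fit()).

import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Placeholder data so the snippet runs on its own; substitute your real
# X_train / X_test / y_train / y_test.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# 'binary:logistic' -- no space after the colon.
clf_xgb = xgb.XGBClassifier(objective='binary:logistic',
                            missing=None,
                            seed=42)

clf_xgb.fit(X_train,
            y_train,
            eval_set=[(X_test, y_test)],
            eval_metric='aucpr',          # accepted by fit() on XGBoost 1.4.x
            early_stopping_rounds=10,
            verbose=True)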