Has anyone tried to get the same results from ElasticNetCV in Python and cv.glmnet in R? I have figured out how to match ElasticNet in Python with glmnet in R, but I can't reproduce the results with the cross-validation methods…
Steps to reproduce in Python:

Preprocessing:
```python
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet, ElasticNetCV
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
import pandas as pd

data = make_regression(n_samples=100000, random_state=0)
X, y = data[0], data[1]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=.25)

pd.DataFrame(X_train).to_csv('X_train.csv', index=None)
pd.DataFrame(X_test).to_csv('X_test.csv', index=None)
pd.DataFrame(y_train).to_csv('y_train.csv', index=None)
pd.DataFrame(y_test).to_csv('y_test.csv', index=None)
```
Model:
```python
model = ElasticNet(
    alpha=1.0,
    l1_ratio=0.5,
    fit_intercept=True,
    normalize=True,
    precompute=False,
    max_iter=100000,
    copy_X=True,
    tol=0.0000001,
    warm_start=False,
    positive=False,
    random_state=0,
    selection='cyclic')
model.fit(X=X_train, y=y_train)
y_pred = model.predict(X=X_test)
print(mean_squared_error(y_true=y_test, y_pred=y_pred))
```
Output: 42399.49815189786
```python
model = ElasticNetCV(
    l1_ratio=0.5,
    eps=0.001,
    n_alphas=100,
    alphas=None,
    fit_intercept=True,
    normalize=True,
    precompute=False,
    max_iter=100000,
    tol=0.0000001,
    cv=10,
    copy_X=True,
    verbose=0,
    n_jobs=-1,
    positive=False,
    random_state=0,
    selection='cyclic')
model.fit(X=X_train, y=y_train)
y_pred = model.predict(X=X_test)
print(mean_squared_error(y_true=y_test, y_pred=y_pred))
```
Output: 39354.729173913176
Steps to reproduce in R:

Preprocessing:
```r
library(glmnet)
X_train <- read.csv(path)
X_test <- read.csv(path)
y_train <- read.csv(path)
y_test <- read.csv(path)

fit <- glmnet(x=as.matrix(X_train), y=as.matrix(y_train))
y_pred <- predict(fit, newx = as.matrix(X_test))
y_error <- y_test - y_pred
mean(as.matrix(y_error)^2)
```
Output: 42399.5
```r
fit <- cv.glmnet(x=as.matrix(X_train), y=as.matrix(y_train))
y_pred <- predict(fit, newx = as.matrix(X_test))
y_error <- y_test - y_pred
mean(as.matrix(y_error)^2)
```
Output: 37.00207
Answer:

Thanks a lot for providing the example. I am working on a laptop, so I had to reduce the number of samples to 100:
```python
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet, ElasticNetCV
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
import pandas as pd

data = make_regression(n_samples=100, random_state=0)
X, y = data[0], data[1]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=.25)
```
When you predict with glmnet, you need to specify a lambda; otherwise it returns the predictions for all lambdas. So in R:
```r
fit <- glmnet(x=as.matrix(X_train), y=as.matrix(y_train))
y_pred <- predict(fit, newx = as.matrix(X_test))
dim(y_pred)
# [1] 25 89
```
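As an aside (not part of the original answer), sklearn exposes a comparable path behaviour through `enet_path`, which fits the model over a whole grid of penalties at once, much like glmnet's `predict()` without a lambda argument. A minimal sketch on the same kind of synthetic data:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import enet_path

X, y = make_regression(n_samples=100, random_state=0)
# enet_path returns one coefficient vector per alpha on the grid,
# analogous to glmnet returning one prediction column per lambda
alphas, coefs, _ = enet_path(X, y, l1_ratio=1, n_alphas=89)
print(coefs.shape)  # (n_features, n_alphas)
```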
When you run cv.glmnet, it selects the best lambda from the cross-validation, namely lambda.1se, so it gives you only one set of predictions, and that yields the MSE you are after:
```r
fit <- cv.glmnet(x=as.matrix(X_train), y=as.matrix(y_train))
y_pred <- predict(fit, newx = as.matrix(X_test))
y_error <- y_test - y_pred
mean(as.matrix(y_error)^2)
# [1] 22.03504
dim(y_error)
# [1] 25 1
fit$lambda.1se
# [1] 1.278699
```
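Note that sklearn's ElasticNetCV has no lambda.1se analogue: it keeps the alpha that minimizes the mean CV error, i.e. the counterpart of glmnet's lambda.min, which is one built-in source of discrepancy between the two libraries. The selected value can be inspected after fitting (a sketch, assuming the same synthetic data):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNetCV

X, y = make_regression(n_samples=100, random_state=0)
# alpha_ is the CV-selected penalty; it corresponds to glmnet's
# lambda.min, not the lambda.1se that predict.cv.glmnet uses by default
model = ElasticNetCV(l1_ratio=1, cv=10).fit(X, y)
print(model.alpha_)
```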
If we pick the lambda in glmnet closest to the one chosen by cv.glmnet, we get a result in the correct range:
```r
fit <- glmnet(x=as.matrix(X_train), y=as.matrix(y_train))
# pick the lambda on the path closest to cv.glmnet's lambda.1se
sel <- which.min(abs(fit$lambda - 1.278699))
y_pred <- predict(fit, newx = as.matrix(X_test))[, sel]
mean(as.matrix((y_test - y_pred)^2))
# [1] 20.0775
```
Before we compare against sklearn, we need to make sure we are testing over the same range of lambdas:
```r
L <- c(0.01, 0.05, 0.1, 0.2, 0.5, 1, 2)
fit <- cv.glmnet(x=as.matrix(X_train), y=as.matrix(y_train), lambda=L)
y_pred <- predict(fit, newx = as.matrix(X_test))
y_error <- y_test - y_pred
mean(as.matrix(y_error)^2)
# [1] 0.003065869
```
So we expect a result in the region of 0.003065869. Now we run sklearn with the same lambdas; in ElasticNet, lambda is called alpha. The alpha in glmnet is actually your l1_ratio, see the documentation. And the normalize option should be set to False, because:
If True, the regressors X will be normalized before regression by subtracting the mean and dividing by the l2-norm. If you wish to standardize, please use sklearn.preprocessing.StandardScaler before calling fit on an estimator with normalize=False.
So we just run it with CV:
```python
model = ElasticNetCV(l1_ratio=1, fit_intercept=True,
                     alphas=[0.01, 0.05, 0.1, 0.2, 0.5, 1, 2])
model.fit(X=X_train, y=y_train)
y_pred = model.predict(X=X_test)
mean_squared_error(y_true=y_test, y_pred=y_pred)
# 0.0018007824874741929
```
The result is roughly in the same range as the R result.

If you do the same with ElasticNet, you will get the same result as long as you specify alpha.
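That last point can be sketched as follows: refitting a plain ElasticNet at the alpha that ElasticNetCV selected should reproduce the CV model's final coefficients (a sketch on the same synthetic data, not part of the original answer):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet, ElasticNetCV

X, y = make_regression(n_samples=100, random_state=0)
cv_model = ElasticNetCV(l1_ratio=1,
                        alphas=[0.01, 0.05, 0.1, 0.2, 0.5, 1, 2]).fit(X, y)
# Refit a plain ElasticNet with alpha fixed to the CV-selected value;
# ElasticNetCV's final model is itself a full-data fit at that alpha,
# so the coefficients should coincide
fixed = ElasticNet(l1_ratio=1, alpha=cv_model.alpha_).fit(X, y)
print(np.allclose(cv_model.coef_, fixed.coef_))
```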