Naming features in the importance plot after preprocessing

First, we get the list of feature names before preprocessing:

import xgboost as xgb
dtrain = xgb.DMatrix(X, label=y)
dtrain.feature_names  # the names carried by the DMatrix (the real column names when X is a pandas DataFrame)
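If X is a plain NumPy array after preprocessing, the DMatrix only has generic names (f0, f1, ... or None, depending on the version), so you can attach the original column names yourself through the feature_names argument. A minimal sketch, assuming feature_name_list holds the column names from before preprocessing:

dtrain = xgb.DMatrix(X, label=y, feature_names=feature_name_list)  # attach the real names explicitly
dtrain.feature_names  # now returns the original names instead of f0, f1, ...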

Then map the generic keys returned by get_fscore() back to those names:

bst.get_fscore()  # importance scores keyed by the generic names f0, f1, ...
mapper = {'f{0}'.format(i): v for i, v in enumerate(dtrain.feature_names)}  # 'f0' -> first real name, ...
mapped = {mapper[k]: v for k, v in bst.get_fscore().items()}  # real name -> importance score
mapped
xgb.plot_importance(mapped, color='red')  # plot_importance also accepts a plain dict
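For reference, here is a self-contained sketch of the whole recipe on a toy dataset; the dataset, objective, and number of boosting rounds are only illustrative and not part of the original answer:

import xgboost as xgb
import matplotlib.pyplot as plt
from sklearn.datasets import load_breast_cancer

data = load_breast_cancer()
X, y = data.data, data.target
feature_name_list = list(data.feature_names)

dtrain = xgb.DMatrix(X, label=y)
bst = xgb.train({'objective': 'binary:logistic'}, dtrain, num_boost_round=20)

# map the generic f0, f1, ... keys back to the real column names
mapper = {'f{0}'.format(i): name for i, name in enumerate(feature_name_list)}
mapped = {mapper[k]: v for k, v in bst.get_fscore().items()}

xgb.plot_importance(mapped, color='red')
plt.show()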

That's all.


For xgboost 0.82, the answer is quite simple: just overwrite the feature_names attribute of the trained model with the list of feature name strings.

trained_xgbmodel.feature_names = feature_name_list  # one name string per training feature
xgboost.plot_importance(trained_xgbmodel)
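If the model was trained through the scikit-learn wrapper rather than as a raw Booster, the same idea can be applied to the underlying booster. A sketch under that assumption (model and feature_name_list are placeholders):

booster = model.get_booster()              # underlying Booster of an XGBClassifier/XGBRegressor
booster.feature_names = feature_name_list  # overwrite the generic names
xgboost.plot_importance(booster)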
