Grid-search for parameters using AUC as the scoring metric

You can make your own scorer:

from sklearn.metrics import make_scorer
from sklearn.metrics import roc_curve, auc
from sklearn.pipeline import Pipeline
from sklearn.decomposition import TruncatedSVD
from sklearn.model_selection import GridSearchCV, StratifiedShuffleSplit
import xgboost as xgb

# define the scoring function
def custom_auc(ground_truth, predictions):
    # only one column of predictions ("0" and "1") is needed; you can get an error
    # here if you try to pass both columns at once
    fpr, tpr, _ = roc_curve(ground_truth, predictions[:, 1], pos_label=1)
    return auc(fpr, tpr)

# wrap it as a standard sklearn scorer
my_auc = make_scorer(custom_auc, greater_is_better=True, needs_proba=True)

pipeline = Pipeline(
    [("transformer", TruncatedSVD(n_components=70)),
     ("classifier", xgb.XGBClassifier(scale_pos_weight=1.0, learning_rate=0.1,
                                      max_depth=5, n_estimators=50, min_child_weight=5))])

parameters_grid = {'transformer__n_components': [60, 40, 20]}

grid_cv = GridSearchCV(pipeline, parameters_grid, scoring=my_auc, n_jobs=-1,
                       cv=StratifiedShuffleSplit(n_splits=5, test_size=0.3, random_state=0))
grid_cv.fit(X, y)
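
After the search completes, the best parameter setting and its cross-validated AUC can be read from the fitted object. A minimal sketch (X_new here is a hypothetical held-out dataset with the same feature layout as X):

# best parameter combination found and its mean cross-validated AUC
print(grid_cv.best_params_)
print(grid_cv.best_score_)

# the refit pipeline can score new data directly
# (X_new is an assumed hold-out set for illustration)
probabilities = grid_cv.predict_proba(X_new)[:, 1]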

For more information, check the documentation here: sklearn make_scorer


You can simply use:

clf = GridSearchCV(clf, parameters, scoring='roc_auc')
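
For context, here is a minimal self-contained sketch of that call; the estimator, parameter grid, and toy data are assumptions chosen only for illustration:

from sklearn.model_selection import GridSearchCV
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

# toy binary-classification data for illustration
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

clf = RandomForestClassifier(random_state=0)
parameters = {'n_estimators': [50, 100], 'max_depth': [3, 5]}

# the built-in 'roc_auc' scorer computes AUC from predicted probabilities
clf = GridSearchCV(clf, parameters, scoring='roc_auc')
clf.fit(X, y)
print(clf.best_params_, clf.best_score_)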

Use the code below, which will give you the full list of available scoring parameters:

import sklearn

sklearn.metrics.SCORERS.keys()
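
If your scikit-learn version no longer exposes SCORERS (it was deprecated in newer releases), the same list should be available via get_scorer_names(); a quick sketch:

from sklearn.metrics import get_scorer_names

# equivalent listing of the built-in scoring strings
print(get_scorer_names())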

Select the appropriate scoring parameter that you want to use.

In your case, the code below will work:

clf = GridSearchCV(clf, parameters, scoring='roc_auc')