R/models.R
Retrieve one or more Gains/Lift tables from H2O objects.
h2o.gainsLift(object, ...)

# S4 method for H2OModel
h2o.gainsLift(object, newdata, valid = FALSE, xval = FALSE, ...)

# S4 method for H2OModelMetrics
h2o.gainsLift(object)
object | Either an H2OModel object or an H2OModelMetrics object.
… | Further arguments to be passed to/from this method.
newdata | An H2OFrame object that can be scored on. Requires a valid response column.
valid | Retrieve the validation metric.
xval | Retrieve the cross-validation metric.
Calling this function on an H2OModel object returns a Gains/Lift table corresponding to the predict function. The H2OModelMetrics version of this function accepts only H2OBinomialMetrics objects.
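The returned table behaves like a data.frame and can be inspected or plotted directly. The following is a minimal sketch, assuming a fitted binomial model (as in the example below) and that the table exposes cumulative_data_fraction and cumulative_lift columns; these column names are assumptions and may differ across H2O versions.

# A minimal sketch: inspect the Gains/Lift table and plot cumulative lift.
# Column names "cumulative_data_fraction" and "cumulative_lift" are assumed.
gl <- h2o.gainsLift(model)
head(gl)
plot(gl$cumulative_data_fraction, gl$cumulative_lift, type = "l",
     xlab = "Cumulative data fraction", ylab = "Cumulative lift")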
See also: predict for generating prediction frames; h2o.performance for creating H2OModelMetrics objects.
library(h2o)
h2o.init()

prosPath <- system.file("extdata", "prostate.csv", package = "h2o")
hex <- h2o.uploadFile(prosPath)
hex[, 2] <- as.factor(hex[, 2])

model <- h2o.gbm(x = 3:9, y = 2, distribution = "bernoulli",
                 training_frame = hex, validation_frame = hex, nfolds = 3)

h2o.gainsLift(model)                  ## extract training metrics
h2o.gainsLift(model, valid = TRUE)    ## extract validation metrics (here: the same)
h2o.gainsLift(model, xval = TRUE)     ## extract cross-validation metrics
h2o.gainsLift(model, newdata = hex)   ## score on new data (here: the same)

# Generating a ModelMetrics object
perf <- h2o.performance(model, hex)
h2o.gainsLift(perf)                   ## extract from existing metrics object