API Reference

Scikit-Optimize, or skopt, is a simple and efficient library for minimizing (very) expensive and noisy black-box functions. It implements several methods for sequential model-based optimization, and aims to be accessible and easy to use in many contexts.

skopt: module

Base classes

BayesSearchCV(estimator, search_spaces[, ...])

Bayesian optimization over hyperparameters.

Optimizer(dimensions[, base_estimator, ...])

Run a Bayesian optimization loop.

Space(dimensions)

Initialize a search space from given specifications.

Functions

dummy_minimize(func, dimensions[, n_calls, ...])

Random search by uniform sampling within the given bounds.

dump(res, filename[, store_objective])

Store an skopt optimization result into a file.

expected_minimum(res[, n_random_starts, ...])

Compute the minimum over the predictions of the last surrogate model.

expected_minimum_random_sampling(res[, ...])

Minimum search by naive random sampling; returns the parameters that gave the minimum function value.

forest_minimize(func, dimensions[, ...])

Sequential optimization using decision trees.

gbrt_minimize(func, dimensions[, ...])

Sequential optimization using gradient boosted trees.

gp_minimize(func, dimensions[, ...])

Bayesian optimization using Gaussian Processes.

load(filename, **kwargs)

Reconstruct a skopt optimization result from a file persisted with skopt.dump.
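
A minimal sketch of the top-level workflow (the one-dimensional objective below is purely illustrative):

    from skopt import dump, gp_minimize, load

    def objective(x):
        # Toy objective; any callable taking a list of parameter values works.
        return (x[0] - 0.5) ** 2

    res = gp_minimize(
        objective,
        [(-2.0, 2.0)],   # one Real dimension, given as a bounds tuple
        n_calls=15,
        random_state=0,
    )
    print(res.x, res.fun)       # best location and best observed value

    dump(res, "result.pkl")     # persist the result ...
    res2 = load("result.pkl")   # ... and reconstruct it later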

skopt.acquisition: Acquisition

User guide: See the Acquisition section for further details.

acquisition.gaussian_acquisition_1D(X, model)

A wrapper around the acquisition function that is called by fmin_l_bfgs_b.

acquisition.gaussian_ei(X, model[, y_opt, ...])

Use the expected improvement to calculate the acquisition values.

acquisition.gaussian_lcb(X, model[, kappa, ...])

Use the lower confidence bound to estimate the acquisition values.

acquisition.gaussian_pi(X, model[, y_opt, ...])

Use the probability of improvement to calculate the acquisition values.
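
A short sketch of scoring candidate points with the acquisition functions; the surrogate and its observations are made up for illustration:

    import numpy as np
    from skopt.acquisition import gaussian_ei, gaussian_lcb
    from skopt.learning import GaussianProcessRegressor

    # Fit a surrogate on a handful of illustrative observations.
    X = np.array([[0.0], [0.5], [1.0]])
    y = np.array([1.0, 0.2, 0.8])
    model = GaussianProcessRegressor().fit(X, y)

    # Score candidates: larger EI (or smaller LCB) marks a more promising point.
    X_cand = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
    ei = gaussian_ei(X_cand, model, y_opt=y.min())
    lcb = gaussian_lcb(X_cand, model, kappa=1.96)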

skopt.benchmarks: A collection of benchmark problems.

User guide: See the Benchmarks section for further details.

Functions

benchmarks.bench1(x)

A benchmark function for test purposes.

benchmarks.bench1_with_time(x)

Same as bench1 but returns the computation time (constant).

benchmarks.bench2(x)

A benchmark function for test purposes.

benchmarks.bench3(x)

A benchmark function for test purposes.

benchmarks.bench4(x)

A benchmark function for test purposes.

benchmarks.bench5(x)

A benchmark function for test purposes.

benchmarks.branin(x[, a, b, c, r, s, t])

The Branin-Hoo function is defined on the square \(x_1 \in [-5, 10],\; x_2 \in [0, 15]\).

benchmarks.hart6(x[, alpha, P, A])

The six-dimensional Hartmann function is defined on the unit hypercube.
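
The benchmarks are plain callables, so they can be evaluated directly or handed to any of the minimizers; a quick sketch:

    from skopt import forest_minimize
    from skopt.benchmarks import branin

    # branin takes a point (x1, x2); its global minimum is roughly 0.398.
    print(branin((9.42478, 2.475)))

    res = forest_minimize(branin, [(-5.0, 10.0), (0.0, 15.0)],
                          n_calls=20, random_state=0)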

skopt.callbacks: Callbacks

Monitor and influence the optimization procedure via callbacks.

Callbacks are callables that are invoked after each iteration of the optimizer and are passed the results “so far”. A callback can monitor progress, or stop the optimization early by returning True.

User guide: See the Callbacks section for further details.

callbacks.CheckpointSaver(checkpoint_path, ...)

Save the current state after each iteration with skopt.dump.

callbacks.DeadlineStopper(total_time)

Stop the optimization before running out of a fixed budget of time.

callbacks.DeltaXStopper(delta)

Stop the optimization when |x1 - x2| < delta.

callbacks.DeltaYStopper(delta[, n_best])

Stop the optimization if the n_best minima are within delta.

callbacks.EarlyStopper()

Decide to continue or not given the results so far.

callbacks.TimerCallback()

Log the elapsed time between each iteration of the minimization loop.

callbacks.VerboseCallback(n_total[, n_init, ...])

Callback to control the verbosity.
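
Callbacks are passed to the minimizers via the callback argument; a minimal sketch (the objective is illustrative):

    from skopt import gp_minimize
    from skopt.callbacks import CheckpointSaver, DeadlineStopper

    def objective(x):
        return x[0] ** 2    # toy objective

    saver = CheckpointSaver("./checkpoint.pkl")  # dump the result so far after each call
    deadline = DeadlineStopper(total_time=60.0)  # stop after roughly 60 seconds

    res = gp_minimize(objective, [(-1.0, 1.0)], n_calls=20,
                      callback=[saver, deadline], random_state=0)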

skopt.learning: Machine learning extensions for model-based optimization.

User guide: See the Learning section for further details.

learning.ExtraTreesRegressor([n_estimators, ...])

ExtraTreesRegressor that supports conditional standard deviation.

learning.GaussianProcessRegressor([kernel, ...])

GaussianProcessRegressor with a tunable noise level.

learning.GradientBoostingQuantileRegressor([...])

Predict several quantiles with one estimator.

learning.RandomForestRegressor([...])

RandomForestRegressor that supports conditional standard deviation.
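
What these extensions add over their scikit-learn counterparts is a predictive standard deviation; a sketch with made-up data:

    import numpy as np
    from skopt.learning import ExtraTreesRegressor

    rng = np.random.RandomState(0)
    X = rng.rand(50, 2)          # illustrative training data
    y = X[:, 0] + X[:, 1]

    reg = ExtraTreesRegressor(n_estimators=30, random_state=0).fit(X, y)

    # Unlike the plain scikit-learn estimator, predict can also return a
    # per-point standard deviation, which the surrogate-based optimizers need.
    mean, std = reg.predict(X[:5], return_std=True)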

skopt.optimizer: Optimizer

User guide: See the Optimizer, an ask-and-tell interface section for further details.

optimizer.Optimizer(dimensions[, ...])

Run a Bayesian optimization loop.

optimizer.base_minimize(func, dimensions, ...)

Base routine used by the sequential model-based minimizers.

optimizer.dummy_minimize(func, dimensions[, ...])

Random search by uniform sampling within the given bounds.

optimizer.forest_minimize(func, dimensions)

Sequential optimization using decision trees.

optimizer.gbrt_minimize(func, dimensions[, ...])

Sequential optimization using gradient boosted trees.

optimizer.gp_minimize(func, dimensions[, ...])

Bayesian optimization using Gaussian Processes.
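
Besides the *_minimize functions, the Optimizer class exposes an ask-and-tell loop in which you run the evaluations yourself; a minimal sketch with an illustrative objective:

    from skopt import Optimizer

    opt = Optimizer([(-2.0, 2.0)], base_estimator="GP", random_state=0)

    for _ in range(12):
        x = opt.ask()            # next point to evaluate
        y = (x[0] - 0.5) ** 2    # evaluate the objective however you like
        opt.tell(x, y)           # feed the observation back

    res = opt.get_result()       # same OptimizeResult as the *_minimize functions
    print(res.x, res.fun)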

skopt.plots: Plotting functions.

User guide: See the Plotting tools section for further details.

plots.partial_dependence(space, model, i[, ...])

Calculate the partial dependence for dimensions i and j with respect to the objective value, as approximated by model.

plots.partial_dependence_1D(space, model, i, ...)

Calculate the partial dependence for a single dimension.

plots.partial_dependence_2D(space, model, i, ...)

Calculate the partial dependence for two dimensions in the search space.

plots.plot_convergence(*args, **kwargs)

Plot one or several convergence traces.

plots.plot_evaluations(result[, bins, ...])

Visualize the order in which points were sampled during optimization.

plots.plot_gaussian_process(res, **kwargs)

Plot the optimization results and the Gaussian process for 1-D objective functions.

plots.plot_objective(result[, levels, ...])

Plot a 2-D matrix of so-called partial dependence plots of the objective function.

plots.plot_objective_2D(result, ...[, ...])

Create and return a Matplotlib figure and axes with a landscape contour plot of the last fitted model for the two given dimensions of the search space, overlaid with all samples from the optimization results.

plots.plot_histogram(result, ...[, bins, ...])

Create and return a Matplotlib figure with a histogram of the samples from the optimization results for a given dimension of the search space.

plots.plot_regret(*args, **kwargs)

Plot one or several cumulative regret traces.
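
The plotting helpers all consume an OptimizeResult; a short sketch (the 2-D objective is illustrative):

    import matplotlib.pyplot as plt
    from skopt import gp_minimize
    from skopt.plots import plot_convergence, plot_evaluations, plot_objective

    def objective(x):
        return (x[0] - 0.5) ** 2 + (x[1] + 0.5) ** 2   # toy 2-D objective

    res = gp_minimize(objective, [(-1.0, 1.0), (-1.0, 1.0)],
                      n_calls=25, random_state=0)

    plot_convergence(res)    # best value found so far vs. number of calls
    plot_evaluations(res)    # order in which points were sampled
    plot_objective(res)      # partial-dependence matrix of the surrogate
    plt.show()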

skopt.utils: Utility functions.

User guide: See the Utility functions section for further details.

utils.cook_estimator(base_estimator[, space])

Cook a default estimator.

utils.cook_initial_point_generator(...)

Cook a default initial point generator.

utils.dimensions_aslist(search_space)

Convert a dict representation of a search space into a list of dimensions, ordered by sorted(search_space.keys()).

utils.expected_minimum(res[, ...])

Compute the minimum over the predictions of the last surrogate model.

utils.expected_minimum_random_sampling(res)

Minimum search by naive random sampling; returns the parameters that gave the minimum function value.

utils.dump(res, filename[, store_objective])

Store an skopt optimization result into a file.

utils.load(filename, **kwargs)

Reconstruct a skopt optimization result from a file persisted with skopt.dump.

utils.point_asdict(search_space, point_as_list)

Convert the list representation of a point from a search space to the dictionary representation, where keys are dimension names and values correspond to the values of those dimensions in the list.

utils.point_aslist(search_space, point_as_dict)

Convert a dictionary representation of a point from a search space to the list representation.

utils.use_named_args(dimensions)

Wrapper / decorator for an objective function that uses named arguments, making it compatible with optimizers that pass a single list of parameters.
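
A sketch tying several of these helpers together; the search space and objective are illustrative:

    from skopt import gp_minimize
    from skopt.space import Integer, Real
    from skopt.utils import dimensions_aslist, point_asdict, use_named_args

    search_space = {
        "depth": Integer(1, 5, name="depth"),
        "learning_rate": Real(1e-4, 1e-1, prior="log-uniform", name="learning_rate"),
    }
    dimensions = dimensions_aslist(search_space)  # ordered by sorted(search_space.keys())

    @use_named_args(dimensions)
    def objective(depth, learning_rate):
        # Stand-in for a real training run.
        return (learning_rate - 0.01) ** 2 + (depth - 3) ** 2

    res = gp_minimize(objective, dimensions, n_calls=15, random_state=0)
    print(point_asdict(search_space, res.x))  # best point as a name -> value dict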

skopt.sampler: Samplers

Utilities for generating initial sequences.

User guide: See the Sampling methods section for further details.

sampler.Lhs([lhs_type, criterion, iterations])

Latin hypercube sampling.

sampler.Sobol([skip, randomize])

Generates a new quasirandom Sobol' vector with each call.

sampler.Halton([min_skip, max_skip, primes])

Creates Halton sequence samples.

sampler.Hammersly([min_skip, max_skip, primes])

Creates Hammersley sequence samples.
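
Each sampler generates points from a list of dimensions; a short sketch:

    from skopt.sampler import Lhs, Sobol
    from skopt.space import Space

    space = Space([(-2.0, 2.0), (0.0, 10.0)])

    lhs = Lhs(lhs_type="classic", criterion=None)
    x = lhs.generate(space.dimensions, n_samples=8, random_state=0)

    sobol = Sobol()
    y = sobol.generate(space.dimensions, n_samples=8, random_state=0)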

skopt.space.space: Space

User guide: See the Space section for further details.

space.space.Categorical(categories[, prior, ...])

Search space dimension that can take on categorical values.

space.space.Dimension()

Base class for search space dimensions.

space.space.Integer(low, high[, prior, ...])

Search space dimension that can take on integer values.

space.space.Real(low, high[, prior, base, ...])

Search space dimension that can take on any real value.

space.space.Space(dimensions)

Initialize a search space from given specifications.

space.space.check_dimension(dimension[, ...])

Turn a provided dimension description into a dimension object.
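
A brief sketch of building a mixed search space and sampling from it (the dimension names are illustrative):

    from skopt.space import Categorical, Integer, Real, Space

    space = Space([
        Real(1e-6, 1e-2, prior="log-uniform", name="lr"),
        Integer(1, 10, name="depth"),
        Categorical(["relu", "tanh"], name="activation"),
    ])

    points = space.rvs(n_samples=3, random_state=0)  # random points from the space
    print(space.bounds)   # per-dimension bounds / category lists
    print(space.n_dims)   # 3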

skopt.space.transformers: Transformers

User guide: See the Transformers section for further details.

space.transformers.CategoricalEncoder()

OneHotEncoder that can handle categorical variables.

space.transformers.Identity()

Identity transform.

space.transformers.LogN(base)

Base N logarithm transform.

space.transformers.Normalize(low, high[, is_int])

Scales each dimension into the interval [0, 1].

space.transformers.Pipeline(transformers)

A lightweight pipeline to chain transformers.

space.transformers.Transformer()

Base class for all 1-D transformers.

space.transformers.LabelEncoder([X])

LabelEncoder that can handle categorical variables.

space.transformers.StringEncoder([dtype])

StringEncoder transform.
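
Transformers map dimension values to and from the representation the surrogate models work in; a small sketch with illustrative values:

    from skopt.space.transformers import CategoricalEncoder, Normalize

    # Scale values from [0, 100] into [0, 1] and back again.
    norm = Normalize(0, 100)
    print(norm.transform([0, 25, 100]))    # -> 0.0, 0.25, 1.0
    print(norm.inverse_transform([0.25]))  # -> 25.0

    # One-hot encode a categorical dimension.
    enc = CategoricalEncoder()
    enc.fit(["red", "green", "blue"])
    print(enc.transform(["green"]))        # one row of the one-hot matrix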