Models

Gaussian Process regression models for Bayesian optimization. ALchemist supports two backends: BoTorch (PyTorch-based) and scikit-learn.


BoTorch Model

PyTorch-based Gaussian Process implementation with GPU support and advanced features.

alchemist_core.models.botorch_model.BoTorchModel(training_iter=50, random_state=42, kernel_options=None, cat_dims=None, search_space=None, input_transform_type='none', output_transform_type='none')

Bases: BaseModel

Initialize the BoTorchModel with custom options.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| training_iter | int | Maximum iterations for model optimization. | 50 |
| random_state | int | Random seed for reproducibility. | 42 |
| kernel_options | dict | Dictionary with kernel options such as "cont_kernel_type" and "matern_nu". | None |
| cat_dims | list[int] \| None | List of column indices that are categorical. | None |
| search_space | list | Optional search space list. | None |
| input_transform_type | str | Type of input scaling ("none", "normalize", "standardize"). | 'none' |
| output_transform_type | str | Type of output scaling ("none", "standardize"). | 'none' |
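
A minimal instantiation sketch based on the signature above; the specific kernel option values ("Matern", nu=2.5) and the choice of categorical column are illustrative assumptions, not requirements:

```python
from alchemist_core.models.botorch_model import BoTorchModel

# Assumed example: Matern kernel for continuous dimensions (nu=2.5),
# column 2 treated as categorical, inputs normalized, outputs standardized.
model = BoTorchModel(
    training_iter=100,
    random_state=42,
    kernel_options={"cont_kernel_type": "Matern", "matern_nu": 2.5},
    cat_dims=[2],
    input_transform_type="normalize",
    output_transform_type="standardize",
)
```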

train(exp_manager, **kwargs)

Train the model using an ExperimentManager instance.

predict(X, return_std=False, **kwargs)

Make predictions using the trained model.

get_hyperparameters()

Get model hyperparameters.
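
A typical train/predict cycle might look like the sketch below. Here `exp_manager` stands in for an ExperimentManager instance that already holds the training data (its construction is not shown), and `X_candidates` is a hypothetical array of candidate inputs:

```python
import numpy as np

# exp_manager: an ExperimentManager already populated with experiments
# (construction omitted; see the ExperimentManager documentation).
model.train(exp_manager)

# Hypothetical candidate points with the same column layout as the training data.
X_candidates = np.array([[0.1, 0.5, 0], [0.9, 0.2, 1]])

# Posterior mean and standard deviation at the candidate points.
mean, std = model.predict(X_candidates, return_std=True)

# Inspect the fitted model hyperparameters.
print(model.get_hyperparameters())
```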


Sklearn Model

Scikit-learn-based Gaussian Process implementation for CPU-only workflows.

alchemist_core.models.sklearn_model.SklearnModel(kernel_options, n_restarts_optimizer=30, random_state=42, optimizer='L-BFGS-B', input_transform_type='none', output_transform_type='none')

Bases: BaseModel

Initialize the SklearnModel with kernel options and scaling transforms.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| kernel_options | dict | Dictionary with keys: "kernel_type" (one of "RBF", "Matern", "RationalQuadratic"); if "Matern" is selected, a "matern_nu" key should also be provided. | required |
| n_restarts_optimizer | int | Number of restarts for the optimizer. | 30 |
| random_state | int | Random state for reproducibility. | 42 |
| optimizer | str | Optimization method for hyperparameter tuning. | 'L-BFGS-B' |
| input_transform_type | str | Type of input scaling ("none", "standard", "minmax", "robust"). | 'none' |
| output_transform_type | str | Type of output scaling ("none", "standard"). | 'none' |
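
A minimal instantiation sketch following the key names documented above; the Matern kernel and nu=2.5 are illustrative choices:

```python
from alchemist_core.models.sklearn_model import SklearnModel

# Assumed example: Matern kernel (nu=2.5) with standardized inputs and outputs.
model = SklearnModel(
    kernel_options={"kernel_type": "Matern", "matern_nu": 2.5},
    n_restarts_optimizer=30,
    random_state=42,
    optimizer="L-BFGS-B",
    input_transform_type="standard",
    output_transform_type="standard",
)
```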

train(experiment_manager, **kwargs)

Train the model using the ExperimentManager.

predict(X, return_std=False, **kwargs)

Make predictions using the trained model.

Parameters:

| Name | Description | Default |
| --- | --- | --- |
| X | Input features | required |
| return_std | Whether to return standard deviations | False |

Returns:

- If return_std is False: a numpy array of predictions.
- If return_std is True: a tuple of (predictions, standard deviations).
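
A short sketch of handling both return modes, assuming `model` has already been trained and `X` is a hypothetical array of candidate inputs:

```python
import numpy as np

X = np.array([[0.3, 0.7], [0.6, 0.1]])  # hypothetical candidate inputs

# Mean predictions only.
y_pred = model.predict(X)

# Mean predictions plus per-point standard deviations.
y_pred, y_std = model.predict(X, return_std=True)
```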

get_hyperparameters()

Get model hyperparameters.


See Also