huntermcgushion / hyperparameter_hunter

Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries

License: MIT License
- optimization_core.InformedOptimizationProtocol methods:
  - _set_hyperparameter_space
  - _prepare_estimator
  - _execute_experiment
  - _get_current_hyperparameters
  - _find_similar_experiments
  - _validate_parameters
- callbacks.aggregators.AggregatorEpochsElapsed needs to work with repeated cross-validation schemes
- recorders.DescriptionRecorder.format_result: get_config() returns the final learning rate, rather than the initial one, so experiment description files are misleading by not displaying the actual value used (e.g. with ReduceLROnPlateau, which dropped the lr down to 0.0001)
- Call parameterize_compiled_keras_model immediately after initializing the model, then store the results, then use them in the DescriptionRecorder - in experiments.BaseCVExperiment.cv_run_workflow, or in models.KerasModel.initialize_model/fit
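To illustrate why parameters should be captured immediately after initialization rather than after fitting, here is a minimal pure-Python sketch. DummyOptimizer and fit_with_lr_schedule are illustrative stand-ins, not Keras classes:

```python
class DummyOptimizer:
    """Stand-in for a Keras optimizer; lr may be mutated during training."""
    def __init__(self, lr=0.01):
        self.lr = lr

    def get_config(self):
        return {"lr": self.lr}


def fit_with_lr_schedule(optimizer):
    # Simulate ReduceLROnPlateau dropping the learning rate during training
    optimizer.lr = 0.0001


opt = DummyOptimizer(lr=0.01)
initial_config = opt.get_config()  # captured BEFORE training - what the description should record
fit_with_lr_schedule(opt)
final_config = opt.get_config()    # what a post-fit get_config() reports

print(initial_config)  # {'lr': 0.01}
print(final_config)    # {'lr': 0.0001} - misleading if recorded instead
```

Capturing the config at initialization time and handing that snapshot to the recorder sidesteps any mutation the training callbacks perform.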
- keras.callbacks.Callback
- key_handler.KeyMaker.handle_complex_types.visit function
- experiment_core.ExperimentMeta
- Make the model_init_params kwarg of experiments.BaseExperiment.__init__ optional
- optimization_core.UninformedOptimizationProtocol._get_current_hyperparameters
- examples/environment_params_path_example.py
- leaderboards.GlobalLeaderboard.add_entry, where aliased metric names should be merged together based on their equivalent hashes. For example, using roc_auc_score, then using the same function under an alias (like 'roc'), would produce two separate columns: 'roc_auc_score' and 'roc'
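A minimal sketch of merging aliased metric columns by an equivalence hash. The metric_hash helper and the bytecode-hashing choice are illustrative assumptions, not the library's actual key logic:

```python
import hashlib


def metric_hash(fn):
    """Hash a metric function by its raw bytecode so aliases of the same
    function collapse to one identity. Note: co_code ignores constants and
    names, so this is only a sketch of the idea, not a robust equivalence."""
    return hashlib.sha256(fn.__code__.co_code).hexdigest()


def roc_auc_score(y_true, y_pred):
    # Placeholder body; the real metric would come from sklearn
    return 0.5


# The same function registered under two names - previously two columns
columns = {"roc_auc_score": roc_auc_score, "roc": roc_auc_score}

merged = {}
for name, fn in columns.items():
    merged.setdefault(metric_hash(fn), []).append(name)

print(len(merged))  # 1 - both aliases share one hash, so one leaderboard column
```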
- confusion_matrix callback that uses callbacks.bases.lambda_callback
- confusion_matrix callback in examples.lambda_callback_example
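A pure-Python sketch of the kind of confusion-matrix aggregation such a callback could perform. The on_fold_end hook and its signature are assumptions for illustration, not the actual lambda_callback interface:

```python
from collections import Counter


def confusion_matrix(y_true, y_pred, labels=(0, 1)):
    """Tiny pure-Python confusion matrix: rows = true label, cols = predicted."""
    counts = Counter(zip(y_true, y_pred))
    return [[counts[(t, p)] for p in labels] for t in labels]


def on_fold_end(y_true, y_pred, aggregate):
    # A lambda-style callback body: compute the matrix for this fold's
    # predictions and stash it in a shared aggregate
    aggregate.append(confusion_matrix(y_true, y_pred))


aggregate = []
on_fold_end([0, 1, 1, 0], [0, 1, 0, 0], aggregate)
print(aggregate[0])  # [[2, 0], [1, 1]]
```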
- models.Model:
  - __init__
  - extra_params
  - fit
  - predict
- leaderboards.Leaderboard.__init__
- leaderboards.GlobalLeaderboard.add_entry (See Leaderboard implementation)
- __init__ method of key_handler.CrossExperimentKeyMaker, and key_handler.HyperparameterKeyMaker
- key_handler.HyperparameterKeyMaker.filter_parameters_to_hash (See key_handler.KeyMaker implementation)
- current_hyperparameters_list equivalent to optimization_core.UninformedOptimizationProtocol (See optimization_core.InformedOptimizationProtocol for proper implementation)
- optimization_core.BaseOptimizationProtocol for logging in the _optimization_loop method (in which pertinent flag comments are located)
- UninformedOptimizationProtocol, which is not good at all
- Make optimization_core.SKOptimizationProtocol.__init__.n_random_starts actually do something when specified, e.g. when n_random_starts-many result files cannot be located
- Methods of experiments.BaseExperiment that generally shouldn't be used by class instances:
  - additional_preparation_steps
  - initial_preprocessing
  - validate_parameters
  - validate_environment
  - clean_up
  - generate_experiment_id
  - generate_hyperparameter_key
  - create_script_backup
  - initialize_random_seeds
  - random_seed_initializer
  - update_model_params
- In optimization_core.SKOptimizationProtocol.__init__, add support for kwarg acquisition_function in ['EIps', 'PIps']
- exception_handler: EnvironmentInactiveError, EnvironmentInvalidError, RepeatedExperimentError
- hyperparameter_hunter/library_helpers directory
- optimization_core.BaseOptimizationProtocol.add_default_options when completed by #31
- importer.hook_keras_layer near top of __init__.py (See examples.keras_example.py for current usage - This will need to be removed)
- hook_keras_layer does not raise any exceptions if Keras has not been installed
- key_handler.KeyMaker:
  - __init__ (should be extensive)
  - handle_complex_types's visit function
  - key_type
- models.KerasModel:
  - __init__ (initialization_params and extra_params kwargs)
  - initialize_model
  - fit
  - get_input_dim
  - validate_keras_params
  - initialize_keras_neural_network
- recorders.BaseRecorder.__init__
- reporting.OptimizationReporter methods:
  - print_column_name
  - print_target_value
  - print_input_values
  - reset_timer
  - print_summary
  - print_header (parameters section)
- optimization_core.InformedOptimizationProtocol:
  - _execute_experiment
  - _find_similar_experiments
- optimization_core.InformedOptimizationProtocol.optimizer is "tell"-ed the utility value of a set of hyperparameters as-is, which will cause problems if target_metric should be minimized - i.e. if target_metric is some loss measure
- reporting functions:
  - format_frame_source
  - stringify_frame_source
  - add_time_to_content
  - format_fold_run
  - format_evaluation_results
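The sign issue noted above for minimized target_metrics can be sketched as follows. The utility_for_optimizer helper and the direction strings are illustrative names, not the library's or scikit-optimize's API; the only point is that the value "tell"-ed to the optimizer must agree in sign with whether target_metric measures loss:

```python
def utility_for_optimizer(score, direction):
    """Convert a metric value into the utility reported to an optimizer that
    always maximizes: loss-like metrics must be negated, or the optimizer
    will steer toward the worst hyperparameters."""
    if direction == "minimize":
        return -score
    return score


# A loss of 0.31 becomes utility -0.31; an AUC of 0.92 passes through unchanged
print(utility_for_optimizer(0.31, "minimize"))  # -0.31
print(utility_for_optimizer(0.92, "maximize"))  # 0.92
```

(A skopt-style Optimizer minimizes by default, so a real integration may invert the convention, but the sign decision still has to be made somewhere.)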
- experiment_id (or first several characters) to reporting.OptimizationReporter.print_result, per todo comment
- environment.Environment.__init__ kwargs: train_dataset, holdout_dataset, and test_dataset
- eval_set in models.XGBoostModel.fit per todo comment
- models.XGBoostModel.fit has been commented out, meaning models.Model.fit is being used; models.XGBoostModel.fit should still accommodate eval_set and eval_metric arguments
- reporting.OptimizationReporter:
  - print_column_name
  - print_target_value
  - print_input_values
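As a hedged sketch of how a fit method could accommodate eval_set and eval_metric, here is a minimal forwarding wrapper. DummyModel and fit_with_eval_set are illustrative stand-ins, not the library's actual classes; a real XGBoost estimator accepts the same kwargs in its .fit:

```python
class DummyModel:
    """Stand-in for an XGBoost-style estimator; records the kwargs it was fit with."""
    def fit(self, X, y, **kwargs):
        self.fit_kwargs = kwargs
        return self


def fit_with_eval_set(model, X, y, X_val=None, y_val=None, eval_metric=None):
    """Forward eval_set/eval_metric to the underlying .fit only when provided,
    so models without validation data keep the plain fit signature."""
    extra = {}
    if X_val is not None and y_val is not None:
        extra["eval_set"] = [(X_val, y_val)]
    if eval_metric is not None:
        extra["eval_metric"] = eval_metric
    return model.fit(X, y, **extra)


m = fit_with_eval_set(DummyModel(), [[0], [1]], [0, 1],
                      X_val=[[2]], y_val=[1], eval_metric="auc")
print(m.fit_kwargs["eval_metric"])  # auc
```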
- optimization_core.BaseOptimizationProtocol methods:
  - _optimization_loop
  - _update_current_hyperparameters
  - _set_hyperparameter_space
  - _get_current_hyperparameters
  - search_space_size (See InformedOptimizationProtocol implementation)
- optimization_core.BaseOptimizationProtocol.add_default_options method
- Update the BaseOptimizationProtocol.hyperparameter_space attribute (space.Space) to reflect new default options being added to original dimensions (if InformedOptimizationProtocol)
- add_default_options should leverage the default hyperparameter search ranges added in #30 for the hyperparameter provided as input and optimization_core.BaseOptimizationProtocol.model_initializer
- Make dimensions the first argument of optimization_core.InformedOptimizationProtocol.__init__
- Change dimensions to a required argument, rather than an optional kwarg
- environment.Environment.__init__ params with kwargs and the order of precedence for environment_params_path arguments
- environment_params_path
- reporting.ReportingHandler:
  - validate_parameters
  - configure_reporting_type
  - initialize_logging
  - configure_console_logger_handler
  - configure_heartbeat_logger_handler
- skopt.optimizer.Optimizer.__init__ is copied almost verbatim by optimization_utils.AskingOptimizer.__init__, which is far from ideal
- Have AskingOptimizer use hyperparameter_hunter.space.Space, rather than skopt.space.Space
- Either get skopt.optimizer.Optimizer.__init__ to use the updated Space, or override the particular section of skopt.optimizer.Optimizer.__init__ in which skopt.space.Space is used
- Otherwise, changes to skopt.optimizer.Optimizer.__init__ will be completely lost, and will need to be manually recreated
- __repeated_ask_kwargs, as noted in the pertinent todo comments and the original optimization_utils.AskingOptimizer.__init__, which is commented out above the current monstrosity
- experiments.BaseExperiment._generate_hyperparameter_key: build_fns can yield the same architecture
- exception_handler.<handle_exception, hook_exception_handler> functions
- importer.hook_keras_layer, check Keras version
- Interceptor to insert in meta_path
- examples/custom_metrics_example.py
- reporting.OptimizationReporter.print_summary method
- metrics.get_clean_predictions
- target_metric kwarg of experiments.BaseExperiment.__init__
- Mark experiments.BaseExperiment.__init__ kwargs as experimental while in development: preprocessing_pipeline, preprocessing_params
- reporting.ReportingHandler methods:
  - validate_parameters
  - configure_reporting_type
  - initialize_logging
  - configure_console_logger_handler
  - configure_heartbeat_logger_handler
  - _logging_log
  - _logging_debug
  - _logging_warn
- Reorder environment.Environment.__init__ to place metrics_map/metrics_params closer to the top, since one of them is required
- filter_parameters_to_hash method of the following classes:
  - key_handler.KeyMaker
  - key_handler.HyperparameterKeyMaker
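The idea behind filter_parameters_to_hash can be sketched as follows. The ignore-list and the make_key helper are illustrative assumptions, not the library's actual filtering rules:

```python
import hashlib
import json


def filter_parameters_to_hash(params, ignored=("verbose", "n_jobs", "silent")):
    """Drop parameters that don't affect results (illustrative ignore-list)
    before hashing, so equivalent experiments share a hyperparameter key."""
    return {k: v for k, v in params.items() if k not in ignored}


def make_key(params):
    # Deterministic serialization (sorted keys) -> stable hash for lookups
    filtered = filter_parameters_to_hash(params)
    return hashlib.sha256(json.dumps(filtered, sort_keys=True).encode()).hexdigest()


a = make_key({"max_depth": 4, "n_jobs": 1})
b = make_key({"max_depth": 4, "n_jobs": 8})
print(a == b)  # True - differing n_jobs doesn't change the key
```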
- Add random_state to space.Space.__init__
- Set space.Space.space_random_state to the given random_state, so repeated calls to space.Space.rvs don't get messed up, per flag comment
- space.Space.space_random_state is not changeable
- models, unless keras.models.load_model required by models.KerasModel.fit
- models.KerasModel is actually in use
- importer inside hyperparameter_hunter.__init__
- models.KerasModel.fit
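The random_state items above can be sketched with the stdlib random module; the class here only mirrors the names in those items and is not the library's implementation:

```python
import random


class Space:
    """Sketch: seed a dedicated random state at construction so repeated
    rvs() calls are reproducible across identically-configured Spaces."""
    def __init__(self, low, high, random_state=None):
        self.low, self.high = low, high
        # Dedicated generator: global random.seed() calls elsewhere can't
        # perturb this Space's draws
        self.space_random_state = random.Random(random_state)

    def rvs(self, n=1):
        return [self.space_random_state.uniform(self.low, self.high)
                for _ in range(n)]


s1 = Space(0.0, 1.0, random_state=32)
s2 = Space(0.0, 1.0, random_state=32)
print(s1.rvs(3) == s2.rvs(3))  # True - same seed, same draws
```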
- initialize_folds method implemented by the following classes:
  - experiments.BaseCVExperiment
  - experiments.CrossValidationExperiment
  - experiments.RepeatedCVExperiment
  - experiments.StandardCVExperiment
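A rough sketch of what an initialize_folds implementation produces, as plain index splits; a real implementation would typically delegate to sklearn splitter classes and reshuffle between repeats:

```python
def initialize_folds(n_samples, n_splits, n_repeats=1):
    """Yield (train, validation) index lists for standard CV, repeated
    n_repeats times for repeated CV. Illustrative only: no shuffling, and
    any remainder samples beyond n_splits * fold_size are left out."""
    indices = list(range(n_samples))
    fold_size = n_samples // n_splits
    for _ in range(n_repeats):
        for f in range(n_splits):
            validation = indices[f * fold_size:(f + 1) * fold_size]
            train = [i for i in indices if i not in validation]
            yield train, validation


folds = list(initialize_folds(6, 3, n_repeats=2))
print(len(folds))   # 6 folds total: 3 splits x 2 repeats
print(folds[0][1])  # [0, 1] - first validation fold
```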