FAQ and additional resources
============================

Extending Selene
----------------

The main modules that users may want to extend are:

* ``selene_sdk.samplers.OnlineSampler``
* ``selene_sdk.samplers.file_samplers.FileSampler``
* ``selene_sdk.sequences.Sequence``
* ``selene_sdk.targets.Target``

Please refer to the documentation for these classes.
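
As a rough illustration, the extension points above are base classes that you subclass and fill in. The sketch below shows a toy ``selene_sdk.targets.Target`` subclass; it assumes the abstract hook is ``get_feature_data`` (the method implemented by the built-in ``GenomicFeatures`` target), so check the ``Target`` documentation for the exact signature expected by the sampler you use.

.. code-block:: python

    import numpy as np

    from selene_sdk.targets import Target


    class ConstantTarget(Target):
        """Toy target that returns the same label vector for every query.

        This only illustrates the shape of the interface; a real subclass
        would look up labels for the queried genomic region.
        """

        def __init__(self, n_features):
            self.n_features = n_features

        def get_feature_data(self, chrom, start, end):
            # A real implementation would fetch labels for chrom:start-end.
            return np.zeros(self.n_features)

The other classes listed above follow the same pattern: subclass them and implement the abstract methods described in their documentation.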

If you encounter a bug or have a feature request, please post to our GitHub `issues <https://github.com/FunctionLab/selene/issues>`_. E-mail [email protected] if you are interested in becoming a contributor to Selene.

Join our `Google group <https://groups.google.com/forum/#!forum/selene-sdk>`_ if you have questions about the package, case studies, or model development.

Exporting a Selene-trained model to Kipoi
-----------------------------------------

We have provided an example of how to prepare a model for upload to `Kipoi's model zoo <http://kipoi.org/>`_ using a model trained during case study 2. You can use `this example <https://github.com/FunctionLab/selene/tree/master/manuscript/case2/3_kipoi_export>`_ as a starting point for preparing your own model for Kipoi. We have also provided a script that can help automate parts of the process.

We are also working on an export function that will be built into Selene and accessible through the CLI.

Hyperparameter optimization
---------------------------

Hyperparameter optimization is the process of finding the set of hyperparameters that yields an optimal model with respect to a predefined score (e.g. minimizing a loss function).
Hyperparameters are the variables that govern the training process (i.e. they are held constant during training, in contrast to the model parameters, which are optimized/"tuned" by the training process itself).
Hyperparameter tuning works by running multiple trials of a single training run with different values for your chosen hyperparameters, each drawn from some specified range. Some examples of hyperparameters:

* learning rate
* number of hidden units
* convolutional kernel size

You can select hyperparameters manually or automatically.
For automatic hyperparameter optimization, you can look into grid search or random search.
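
To make the distinction concrete, the sketch below generates candidate settings both ways: grid search enumerates every combination of a fixed set of values, while random search samples each hyperparameter independently for each trial. The particular values and ranges are arbitrary placeholders, not recommendations.

.. code-block:: python

    import itertools
    import random

    # Grid search: enumerate every combination of the listed values.
    grid = {
        "lr": [1e-4, 1e-3, 1e-2],
        "kernel_size": [4, 8, 16],
    }
    grid_candidates = [
        dict(zip(grid, values)) for values in itertools.product(*grid.values())
    ]

    # Random search: sample each hyperparameter independently per trial.
    random_candidates = [
        {
            "lr": 10 ** random.uniform(-5, -2),
            "kernel_size": random.choice([4, 8, 16]),
        }
        for _ in range(10)
    ]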

Some resources that may be useful:

* `Hyperopt: Distributed Asynchronous Hyper-parameter Optimization <https://github.com/hyperopt/hyperopt>`_
* `skorch: a scikit-learn compatible neural network library that wraps PyTorch <https://github.com/dnouri/skorch>`_
* `Tune: scalable hyperparameter search <https://ray.readthedocs.io/en/latest/tune.html>`_
* `Spearmint <https://github.com/JasperSnoek/spearmint>`_
* `Weights & Biases <https://www.wandb.com/>`_
* `Comet.ml <https://www.comet.ml/>`_

To use hyperparameter optimization with a model being developed in Selene, you could implement a method that runs Selene (via a command-line call) with a given set of hyperparameters and then monitors the validation performance based on the output written to ``selene_sdk.train_model.validation.txt``.
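
A minimal sketch of that approach follows. It assumes you launch Selene on a YAML configuration file from the command line (the ``python -m selene_sdk`` invocation below is a placeholder; substitute however you normally start training) and that each run writes its validation metrics as a tab-delimited table to ``selene_sdk.train_model.validation.txt`` in the run's output directory. The ``loss`` column name and the config/output paths are assumptions to adapt to your setup.

.. code-block:: python

    import csv
    import subprocess
    from pathlib import Path


    def run_trial(config_path, output_dir):
        """Run one Selene training job and return its best validation loss."""
        # Placeholder launch command: replace with however you invoke Selene
        # on a configuration file.
        subprocess.run(["python", "-m", "selene_sdk", str(config_path)], check=True)

        # `output_dir` must match the output directory named in the config.
        validation_file = Path(output_dir) / "selene_sdk.train_model.validation.txt"
        with open(validation_file) as handle:
            rows = list(csv.DictReader(handle, delimiter="\t"))
        # The "loss" column name is an assumption; inspect the file to see
        # which metrics are logged for your model.
        return min(float(row["loss"]) for row in rows)


    # Hypothetical driver: one pre-written config per candidate setting
    # (templating the configs is not shown here).
    candidates = {
        "lr_1e-3": "configs/lr_1e-3.yml",
        "lr_1e-4": "configs/lr_1e-4.yml",
    }
    scores = {
        name: run_trial(path, output_dir=f"outputs/{name}")
        for name, path in candidates.items()
    }
    best = min(scores, key=scores.get)
    print(f"best setting: {best} (validation loss {scores[best]:.4f})")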