Doc improvements and sundry small items #805

Merged · 10 commits · Jun 21, 2021
4 changes: 0 additions & 4 deletions README.md
@@ -8,10 +8,6 @@
<img src="https://github.com/alan-turing-institute/MLJ.jl/workflows/CI/badge.svg"
alt="Build Status">
</a>
<a href="https://slackinvite.julialang.org/">
<img src="https://img.shields.io/badge/chat-on%20slack-yellow.svg"
alt="#mlj">
</a>
<a href="https://alan-turing-institute.github.io/MLJ.jl/dev/">
<img src="https://img.shields.io/badge/docs-stable-blue.svg"
alt="Documentation">
2 changes: 2 additions & 0 deletions docs/Project.toml
@@ -12,6 +12,7 @@ LossFunctions = "30fc2ffe-d236-52d8-8643-a9d8f7c094a7"
MLJBase = "a7f614a8-145f-11e9-1d2a-a57a1082229d"
MLJClusteringInterface = "d354fa79-ed1c-40d4-88ef-b8c7bd1568af"
MLJDecisionTreeInterface = "c6f25543-311c-4c74-83dc-3ea6d1015661"
MLJEnsembles = "50ed68f4-41fd-4504-931a-ed422449fee0"
MLJGLMInterface = "caf8df21-4939-456d-ac9c-5fefbfb04c0c"
MLJIteration = "614be32b-d00c-4edb-bd02-1eb411ab5e55"
MLJLinearModels = "6ee0df7b-362f-4a72-a706-9e79364fb692"
@@ -33,6 +34,7 @@ TypedTables = "9d95f2ec-7b3d-5a63-8d20-e2491e220bb9"
[compat]
Documenter = "0.26"
MLJBase = "0.18"
MLJEnsembles = "0.1"
MLJIteration = "0.3"
MLJModels = "0.14.4"
MLJScientificTypes = "0.4.6"
4 changes: 3 additions & 1 deletion docs/make.jl
@@ -12,6 +12,7 @@ import MLJSerialization
import MLJBase
import MLJTuning
import MLJModels
import MLJEnsembles
import MLJOpenML
import MLJScientificTypes
import MLJModelInterface
@@ -39,10 +40,10 @@ pages = [
"Weights" => "weights.md",
"Tuning Models" => "tuning_models.md",
"Learning Curves" => "learning_curves.md",
"Preparing Data" => "preparing_data.md",
"Transformers and other unsupervised models" => "transformers.md",
"Composing Models" => "composing_models.md",
"Controlling Iterative Models" => "controlling_iterative_models.md",
"Homogeneous Ensembles" => "homogeneous_ensembles.md",
"Generating Synthetic Data" => "generating_synthetic_data.md",
"OpenML Integration" => "openml_integration.md",
"Acceleration and Parallelism" => "acceleration_and_parallelism.md",
@@ -72,6 +73,7 @@ makedocs(
MLJBase,
MLJTuning,
MLJModels,
MLJEnsembles,
MLJScientificTypes,
MLJModelInterface,
ScientificTypes,
60 changes: 14 additions & 46 deletions docs/src/adding_models_for_general_use.md
@@ -752,47 +752,15 @@ method | return type | declarable return values
`is_pure_julia` | `Bool` | `true` or `false` | `false`
`supports_weights` | `Bool` | `true` or `false` | `false`

**New.** A final trait you can optionally implement is the
`hyperparameter_ranges` trait. It declares default `ParamRange` objects
for one or more of your model's hyperparameters. This is for use (in
the future) by tuning algorithms (e.g., grid generation). It does not
represent the full space of *allowed values*. This information is
encoded in your `clean!` method (or `@mlj_model` call).

The value returned by `hyperparameter_ranges` must be a tuple of
`ParamRange` objects (query `?range` for details) whose length is the
number of hyperparameters (fields of your model). Note that varying a
hyperparameter over a specified range should not alter any type
parameters in your model struct (this never applies to numeric
ranges). If it doesn't make sense to provide a range for a parameter,
a `nothing` entry is allowed. The fallback returns a tuple of
`nothing`s.

For example, a three parameter model of the form

```julia
mutable struct MyModel{D} <: Deterministic
    alpha::Float64
    beta::Int
    distribution::D
end
```
you might declare (order matters):

```julia
MMI.hyperparameter_ranges(::Type{<:MyModel}) =
    (range(Float64, :alpha, lower=0, upper=1, scale=:log),
     range(Int, :beta, lower=1, upper=Inf, origin=100, unit=50, scale=:log),
     nothing)
```

Here is the complete list of trait function declarations for `DecisionTreeClassifier`
([source](https://github.com/alan-turing-institute/MLJModels.jl/blob/master/src/DecisionTree.jl)):
Here is the complete list of trait function declarations for
`DecisionTreeClassifier`, whose core algorithms are provided by
DecisionTree.jl, but whose interface actually lives at
[MLJDecisionTreeInterface.jl](https://github.com/alan-turing-institute/MLJDecisionTreeInterface.jl).

```julia
MMI.input_scitype(::Type{<:DecisionTreeClassifier}) = MMI.Table(MMI.Continuous)
MMI.target_scitype(::Type{<:DecisionTreeClassifier}) = AbstractVector{<:MMI.Finite}
MMI.load_path(::Type{<:DecisionTreeClassifier}) = "MLJModels.DecisionTree_.DecisionTreeClassifier"
MMI.load_path(::Type{<:DecisionTreeClassifier}) = "MLJDecisionTreeInterface.DecisionTreeClassifier"
MMI.package_name(::Type{<:DecisionTreeClassifier}) = "DecisionTree"
MMI.package_uuid(::Type{<:DecisionTreeClassifier}) = "7806a523-6efd-50cb-b5f6-3fa6f1930dbb"
MMI.package_url(::Type{<:DecisionTreeClassifier}) = "https://github.com/bensadeghi/DecisionTree.jl"
@@ -803,17 +771,19 @@ Alternatively these traits can also be declared using `MMI.metadata_pkg` and `MM

```julia
MMI.metadata_pkg(DecisionTreeClassifier,name="DecisionTree",
uuid="7806a523-6efd-50cb-b5f6-3fa6f1930dbb",
url="https://github.com/bensadeghi/DecisionTree.jl",
julia=true)
    package_uuid="7806a523-6efd-50cb-b5f6-3fa6f1930dbb",
    package_url="https://github.com/bensadeghi/DecisionTree.jl",
    is_pure_julia=true)

MMI.metadata_model(DecisionTreeClassifier,
    input=MMI.Table(MMI.Continuous),
    target=AbstractVector{<:MMI.Finite},
    path="MLJModels.DecisionTree_.DecisionTreeClassifier")
    input_scitype=MMI.Table(MMI.Continuous),
    target_scitype=AbstractVector{<:MMI.Finite},
    load_path="MLJDecisionTreeInterface.DecisionTreeClassifier")
```

*Important.* Do not omit the `path` specifcation.
*Important.* Do not omit the `load_path` specification. If unsure what
it should be, post an issue at
[MLJ](https://github.com/alan-turing-institute/MLJ.jl/issues).
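
Not part of the diff, but for orientation: the `load_path` declared above is what MLJ's model registry uses to locate and import the implementing code when a user loads the model. A minimal user-side sketch, assuming the model is already registered and its interface package is installed:

```julia
using MLJ

# `@load` looks up the model in the registry and imports the code found at
# its declared load_path ("MLJDecisionTreeInterface.DecisionTreeClassifier"):
Tree = @load DecisionTreeClassifier pkg=DecisionTree verbosity=0

tree = Tree(max_depth=3)   # construct an instance, overriding one hyperparameter
```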

```@docs
MMI.metadata_pkg
@@ -823,8 +793,6 @@ MMI.metadata_pkg
MMI.metadata_model
```

You can test all your declarations of traits by calling `MLJBase.info_dict(SomeModel)`.
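
A minimal sketch of that check, assuming the installed MLJBase version still provides `info_dict`:

```julia
using MLJBase

# Returns a dictionary of the trait values declared for the model type,
# convenient for eyeballing the declarations before registering the model:
MLJBase.info_dict(DecisionTreeClassifier)
```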


### Iterative models and the update! method
