From 91ffceb1422b50ee8773299efe16a8b98d135f60 Mon Sep 17 00:00:00 2001
From: njtierney
Date: Fri, 16 Aug 2024 16:23:55 +1000
Subject: [PATCH 01/14] add a stub of an installation vignette

---
 vignettes/installation.Rmd | 77 ++++++++++++++++++++++++++++++++++++++
 1 file changed, 77 insertions(+)
 create mode 100644 vignettes/installation.Rmd

diff --git a/vignettes/installation.Rmd b/vignettes/installation.Rmd
new file mode 100644
index 00000000..c35511c8
--- /dev/null
+++ b/vignettes/installation.Rmd
@@ -0,0 +1,77 @@
+---
+title: "Installing Python Dependencies"
+output: rmarkdown::html_vignette
+vignette: >
+  %\VignetteIndexEntry{installation}
+  %\VignetteEngine{knitr::rmarkdown}
+  %\VignetteEncoding{UTF-8}
+---
+
+```{r, include = FALSE}
+knitr::opts_chunk$set(
+  collapse = TRUE,
+  comment = "#>"
+)
+```
+
+```{r setup}
+library(greta)
+```
+
+# Why we need to install dependencies
+
+# How to install python dependencies using `install_greta_deps()`
+
+The `install_greta_deps()` function helps install the Python dependencies (Google's [TensorFlow](https://www.tensorflow.org/) and [tensorflow-probability](https://github.com/tensorflow/probability)).
+
+By default, `install_greta_deps()` installs TF version 2.15.0 and TFP version 0.23.0, using Python 3.10. To change the versions of TF, TFP, or Python that you want to use, you specify the `python_deps` argument of `install_greta_deps()`, which uses `greta_python_deps()`. See `?install_greta_deps()` or `?greta_python_deps()` for more information.
+
+This helper function, `install_greta_deps()`, installs the exact python package versions needed. It also places these inside a conda environment, "greta-env-tf2". This isolates these exact python modules from other python installations, so that only `greta` will see them. This helps avoid installation issues, where previously you might update tensorflow on your computer and overwrite the current version needed by `greta`.
Using this "greta-env-tf2" conda environment means installing other python packages should not impact the Python packages needed by `greta`.
+
+If these python modules aren't yet installed, then when `greta` is used it provides instructions on how to install them for your system. If in doubt, follow those.
+
+## `greta_deps_tf_tfp`
+
+## More Detail on how greta installs python dependencies
+
+## Troubleshooting installation
+
+If the previous installation helper did not work, you can try the following:
+
+```{r install_tensorflow, eval = FALSE}
+reticulate::install_miniconda()
+reticulate::conda_create(
+  envname = "greta-env",
+  python_version = "3.7"
+  )
+reticulate::conda_install(
+  envname = "greta-env",
+  packages = c(
+    "numpy==1.16.4",
+    "tensorflow-probability==0.7.0",
+    "tensorflow==1.14.0"
+  )
+  )
+```
+
+This will install the python modules into a conda environment named "greta-env".
+
+You can also install these without a special conda environment "greta-env",
+like so:
+
+```{r install-deps-plain, eval = FALSE}
+reticulate::install_miniconda()
+reticulate::conda_install(
+  packages = c(
+    "numpy==1.16.4",
+    "tensorflow-probability==0.7.0",
+    "tensorflow==1.14.0"
+  )
+  )
+```
+
+
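The installation workflow this vignette stub describes can be sketched as follows. This is an illustrative example only: the argument names come from `install_greta_deps()` and `greta_python_deps()` as described above, and the version numbers shown are the stated defaults, so passing them explicitly is redundant in practice:

```r
# sketch: install greta's python dependencies with pinned versions
# (versions shown are the documented defaults; adjust as needed)
library(greta)

install_greta_deps(
  python_deps = greta_python_deps(
    tf_version = "2.15.0",
    tfp_version = "0.23.0",
    python_version = "3.10"
  )
)
```

After installation completes, restart R and run `library(greta)` again so the new "greta-env-tf2" conda environment is picked up.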
From fe3d6b0e1e528e53ed4e8ba3705fdeb28b5adff4 Mon Sep 17 00:00:00 2001
From: njtierney
Date: Fri, 16 Aug 2024 16:32:46 +1000
Subject: [PATCH 02/14] try fixing CRAN badge

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 3e9a722f..359adbb5 100644
--- a/README.md
+++ b/README.md
@@ -51,7 +51,7 @@ If these python modules aren't yet installed, when `greta` is used, it provides
 [![Codecov test coverage](https://codecov.io/gh/greta-dev/greta/branch/master/graph/badge.svg)](https://app.codecov.io/gh/greta-dev/greta?branch=master)
 [![R-CMD-check](https://github.com/greta-dev/greta/workflows/R-CMD-check/badge.svg)](https://github.com/greta-dev/greta/actions)
-[![cran version](http://www.r-pkg.org/badges/version/greta)](https://CRAN.R-project.org/package=greta)
+[![cran-version](http://www.r-pkg.org/badges/version/greta)](https://CRAN.R-project.org/package=greta)
 [![license](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/license/apache-2-0)
 [![doi](https://zenodo.org/badge/73758247.svg)](https://zenodo.org/badge/latestdoi/73758247)
 [![joss](https://joss.theoj.org/papers/10.21105/joss.01601/status.svg)](https://joss.theoj.org/papers/10.21105/joss.01601)

From 5e63d7732f743c79aaff8f40305eec4f94138b79 Mon Sep 17 00:00:00 2001
From: njtierney
Date: Tue, 20 Aug 2024 10:10:19 +1000
Subject: [PATCH 03/14] Update "get_started" to use more current information
 and to link to the "installation" vignette

---
 vignettes/get_started.Rmd  | 44 +------------------
 vignettes/installation.Rmd | 88 +++++++++++++++++++++++++++++++++++---
 2 files changed, 84 insertions(+), 48 deletions(-)

diff --git a/vignettes/get_started.Rmd b/vignettes/get_started.Rmd
index 38f7f643..c08b05df 100644
--- a/vignettes/get_started.Rmd
+++ b/vignettes/get_started.Rmd
@@ -50,52 +50,12 @@ library(greta)
 
 ### Helper functions to install TensorFlow
 
-Before you can fit models with `greta`, you will also need to have a working
installation of Google's [TensorFlow](https://www.tensorflow.org/) python package (version 1.14.0) and the [tensorflow-probability](https://github.com/tensorflow/probability) python package (version 0.7.0). In the future we will support different versions of Tensorflow and Tensorflow Probability, but currently we need these exact versions.
+Before you can fit models with `greta`, you will also need to have a working installation of Google's [TensorFlow](https://www.tensorflow.org/) python package (version >= 2.0.0) and the [tensorflow-probability](https://github.com/tensorflow/probability) python package (version >= 0.8.0).
 
-To assist with installing these Python packages, `greta` provides an installation helper, `install_greta_deps()`, which installs the exact pythons package versions needed. It also places these inside a "greta-env" conda environment. This isolates these exact python modules from other python installations, so that only `greta` will see them. This helps avoids installation issues, where previously you might update tensorflow on your computer and overwrite the current version needed by `greta`. Using this "greta-env" conda environment means installing other python packages should not be impact the Python packages needed by `greta`.
-
-If these python modules aren't yet installed, when `greta` is used, it provides instructions on how to install them for your system. If in doubt follow those.
+If these python modules aren't yet installed when `greta` is used, it suggests using `install_greta_deps()` to install the dependencies. We recommend using this function to install dependencies. For more detail on installation, see the vignette "installation".
-#### Standard installation - -If the previous installation helper did not work, you can try the following: - -```{r install_tensorflow, eval = FALSE} -reticulate::install_miniconda() -reticulate::conda_create( - envname = "greta-env", - python_version = "3.7" - ) -reticulate::conda_install( - envname = "greta-env", - packages = c( - "numpy==1.16.4", - "tensorflow-probability==0.7.0", - "tensorflow==1.14.0" - ) - ) -``` - -Which will install the python modules into a conda environment named "greta-env". - -You can also not install these not into a special conda environment "greta-env", -like so: - -```{r install-deps-plain, eval = FALSE} -reticulate::install_miniconda() -reticulate::conda_install( - packages = c( - "numpy==1.16.4", - "tensorflow-probability==0.7.0", - "tensorflow==1.14.0" - ) - ) -``` - - -
 ### DiagrammeR
 
diff --git a/vignettes/installation.Rmd b/vignettes/installation.Rmd
index c35511c8..9993501c 100644
--- a/vignettes/installation.Rmd
+++ b/vignettes/installation.Rmd
@@ -20,24 +20,56 @@ library(greta)
 
 # Why we need to install dependencies
 
+The greta package uses Google's [TensorFlow (TF)](https://www.tensorflow.org/) and [Tensorflow Probability (TFP)](https://github.com/tensorflow/probability) under the hood to do efficient, fast, and scalable linear algebra and MCMC. TF and TFP are python packages, and so need to be installed. This is different to how normal dependencies work with R packages, where the dependencies are automagically built and managed by CRAN. Unfortunately there isn't an automatic, reliable way to ensure that these are provided when you install greta, so we need to take an additional step to install them. We have tried very hard to make the process as easy as possible by providing a helper function, `install_greta_deps()`.
+
 # How to install python dependencies using `install_greta_deps()`
 
-The `install_greta_deps()` function helps install the Python dependencies (Google's [TensorFlow](https://www.tensorflow.org/) and [tensorflow-probability](https://github.com/tensorflow/probability)).
+We recommend running:
+
+```{r}
+#| eval: FALSE
+install_greta_deps()
+```
 
-By default, `install_greta_deps()` installs TF version 2.15.0 and TFP version 0.23.0, using Python 3.10. To change the versions of TF, TFP, or Python that you want to use, you specify the `python_deps` argument of `install_greta_deps()`, which uses `greta_python_deps()`. See `?install_greta_deps()` or `?greta_python_deps()` for more information.
+Then follow any prompts to install dependencies. You will then need to restart R and load `library(greta)` to start using greta.
 
-This helper function, `install_greta_deps()`, installs the exact python package versions needed. It also places these inside a conda environment, "greta-env-tf2".
This isolates these exact python modules from other python installations, so that only `greta` will see them. This helps avoid installation issues, where previously you might update tensorflow on your computer and overwrite the current version needed by `greta`. Using this "greta-env" conda environment means installing other python packages should not impact the Python packages needed by `greta`.
 
-If these python modules aren't yet installed, when `greta` is used, it provides instructions on how to install them for your system. If in doubt follow those.
+# How `install_greta_deps()` works
 
+The `install_greta_deps()` function installs the Python dependencies TF and TFP.
+By default it installs TF version 2.15.0 and TFP version 0.23.0. It places these inside a conda environment, "greta-env-tf2". For the default settings, this uses Python 3.10. Using a conda environment isolates these exact python modules from other python installations, so only `greta` will see them.
+
+We do this as it helps avoid installation issues, where previously you might update TF on your computer and overwrite the current version needed by `greta`. Using this "greta-env-tf2" conda environment means installing other python packages should not impact the Python packages needed by `greta`. This is part of the recommended way to [manage python dependencies in an R package](https://rstudio.github.io/reticulate/articles/python_dependencies.html), as recommended by the team at Posit.
+
+## Using different versions of TF, TFP, and Python
+
+The `install_greta_deps()` function takes three arguments:
+
+1. `python_deps`: Specify dependencies with `greta_python_deps()`
+2. `timeout`: time in minutes to wait for installation before failing/exiting
+3. 
`restart`: whether to restart R (`"force"` restarts R, `"no"` does not restart, `"ask"` (the default) asks the user)
 
-## `greta_deps_tf_tfp`
 
+You specify the versions of TF, TFP, or Python that you want to use with `greta_python_deps()`, which has arguments:
+
+- `tf_version`
+- `tfp_version`
+- `python_version`
+
+If you specify versions of TF/TFP/Python that are not compatible with each other, it will error before starting installation. We determined the appropriate versions of Python, TF, and TFP from https://www.tensorflow.org/install/source#tested_build_configurations and https://www.tensorflow.org/install/source_windows#tested_build_configurations, and by inspecting TFP release notes. We put this information together into a dataset, `greta_deps_tf_tfp`. You can inspect this with `View(greta_deps_tf_tfp)`.
+
+If you provide invalid installation versions, it will error and suggest some alternative installation versions.
+
+## How we install dependencies
+
+We create a separate R instance using [`callr`](https://callr.r-lib.org/) to install the python dependencies, using `reticulate` to talk to Python, and the R package `tensorflow` to install the tensorflow python module.
 
 ## More Detail on how greta installs python dependencies
 
 ## Troubleshooting installation
 
+### Using reinstallers
+### Destroying dependencies
+### Manual installation
+
 If the previous installation helper did not work, you can try the following:
 
 ```{r install_tensorflow, eval = FALSE}
@@ -75,3 +107,47 @@ reticulate::conda_install(
+
+
+# Morgue
+
+
+#### Standard installation
+
+If the previous installation helper did not work, you can try the following:
+
+```{r install_tensorflow, eval = FALSE}
+reticulate::install_miniconda()
+reticulate::conda_create(
+  envname = "greta-env",
+  python_version = "3.7"
+  )
+reticulate::conda_install(
+  envname = "greta-env",
+  packages = c(
+    "numpy==1.16.4",
+    "tensorflow-probability==0.7.0",
+    "tensorflow==1.14.0"
+  )
+  )
+```
+
+This will install the python modules into a conda environment named "greta-env".
+
+You can also install these without a special conda environment "greta-env",
+like so:
+
+```{r install-deps-plain, eval = FALSE}
+reticulate::install_miniconda()
+reticulate::conda_install(
+  packages = c(
+    "numpy==1.16.4",
+    "tensorflow-probability==0.7.0",
+    "tensorflow==1.14.0"
+  )
+  )
+```
+
+

From 0ea17e623011404dee602f8e8dfe4d53def6bc5c Mon Sep 17 00:00:00 2001
From: njtierney
Date: Fri, 16 Aug 2024 20:46:20 +1000
Subject: [PATCH 04/14] improves the text formatting

* uses {{{}}} to escape HTML
* use
  <pre> tags to escape code
* learnt from @maelle and https://mustache.github.io/mustache.5.html
---
 R/write-logfiles.R | 37 +++++++++++++++++++++++++++++++------
 1 file changed, 31 insertions(+), 6 deletions(-)

diff --git a/R/write-logfiles.R b/R/write-logfiles.R
index c618a63b..028f05ec 100644
--- a/R/write-logfiles.R
+++ b/R/write-logfiles.R
@@ -40,15 +40,23 @@ write_greta_install_log <- function(path = greta_logfile) {
     
       <h2>Miniconda Installation Notes</h2>
 
+      <pre>
+        <code>
+          {{{miniconda_notes}}}
+        </code>
+      </pre>
 
-      {{miniconda_notes}}
       <h2>Miniconda Installation Errors</h2>
 
-      {{miniconda_error}}
+      <pre>
+        <code>
+          {{{miniconda_error}}}
+        </code>
+      </pre>

Conda Environment

@@ -57,14 +65,22 @@ write_greta_install_log <- function(path = greta_logfile) {
       <h2>Conda Environment Notes</h2>
 
-      {{conda_create_notes}}
+      <pre>
+        <code>
+          {{{conda_create_notes}}}
+        </code>
+      </pre>
       <h2>Conda Environment Errors</h2>
 
-      {{conda_create_error}}
+      <pre>
+        <code>
+          {{{conda_create_error}}}
+        </code>
+      </pre>

Python Module Installation

@@ -73,14 +89,22 @@ write_greta_install_log <- function(path = greta_logfile) {
       <h2>Python Module Installation Notes</h2>
 
-      {{conda_install_notes}}
+      <pre>
+        <code>
+          {{{conda_install_notes}}}
+        </code>
+      </pre>
       <h2>Python Module Installation Errors</h2>
 
-      {{conda_install_error}}
+      <pre>
+        <code>
+          {{{conda_install_error}}}
+        </code>
+      </pre>
' @@ -98,3 +122,4 @@ write_greta_install_log <- function(path = greta_logfile) { path) } + From 89a7593263f41d9e28db2c9f854d413f6bf3ca69 Mon Sep 17 00:00:00 2001 From: njtierney Date: Fri, 16 Aug 2024 21:06:23 +1000 Subject: [PATCH 05/14] add `read_greta_install_log()` --- NAMESPACE | 1 + NEWS.md | 2 +- R/write-logfiles.R | 32 ++++++++++++++++++++++++++++++++ 3 files changed, 34 insertions(+), 1 deletion(-) diff --git a/NAMESPACE b/NAMESPACE index ae38e342..dd075d07 100644 --- a/NAMESPACE +++ b/NAMESPACE @@ -242,6 +242,7 @@ export(powell) export(proximal_adagrad) export(proximal_gradient_descent) export(rdist) +export(read_greta_install_log) export(reinstall_greta_deps) export(reinstall_greta_env) export(reinstall_miniconda) diff --git a/NEWS.md b/NEWS.md index bb9040a0..7ba51c10 100644 --- a/NEWS.md +++ b/NEWS.md @@ -42,7 +42,7 @@ This release provides a few improvements to installation in greta. It should now * Added checking suite to ensure you are using valid versions of TF, TFP, and Python(#666) * Added data `greta_deps_tf_tfp` (#666), which contains valid versions combinations of TF, TFP, and Python. * remove `greta_nodes_install/conda_*()` options as #493 makes them defunct. -* Added option to write to a single logfile with `greta_set_install_logfile()`, and `write_greta_install_log()`. 
(#493)
+* Added option to write to a single logfile with `greta_set_install_logfile()`, `write_greta_install_log()`, and `read_greta_install_log()` (#493)
 * Added `destroy_greta_deps()` function to remove miniconda and python conda environment
 
 ## Minor
diff --git a/R/write-logfiles.R b/R/write-logfiles.R
index 028f05ec..712dcc09 100644
--- a/R/write-logfiles.R
+++ b/R/write-logfiles.R
@@ -27,6 +27,10 @@ write_greta_install_log <- function(path = greta_logfile) {
   cli::cli_progress_step(
     msg = "Writing logfile to {.path {path}}",
     msg_done = "Logfile written to {.path {path}}"
+  )
+
+  cli::cli_progress_step(
+    msg = "Open with: {.fun read_greta_install_log}"
   )
 
   template <- '
@@ -123,3 +127,31 @@ write_greta_install_log <- function(path = greta_logfile) {
 }
 
+# returns NULL if no envvar
+sys_get_env <- function(envvar){
+  retrieved_envvar <- Sys.getenv(envvar)
+  env_exists <- nzchar(log_env)
+  if (env_exists){
+    envvar
+  } else {
+    envvar <- NULL
+  }
+
+  envvar
+}
+
+#' Read a greta logfile
+#'
+#' @param path file to read. Optional. 
If not specified, it will search for +#' the environment variable "GRETA_INSTALLATION_LOG" +#' +#' @return opens a URL in your default browser +#' @export +read_greta_install_log <- function(path = NULL){ + log_env <- sys_get_env("GRETA_INSTALLATION_LOG") + + path <- path %||% log_env + + browseURL(path) + +} From 933002acb05eeffc17fe6b2c99cc98dab776516d Mon Sep 17 00:00:00 2001 From: njtierney Date: Mon, 19 Aug 2024 12:14:32 +1000 Subject: [PATCH 06/14] address some breaking build changes from mis-specified documentation --- DESCRIPTION | 1 + NAMESPACE | 2 ++ R/greta_create_conda_env.R | 20 ++++++++++++++++++++ R/greta_install_miniconda.R | 11 +++++++++++ R/write-logfiles.R | 14 ++++++++++---- man/greta_create_conda_env.Rd | 27 +++++++++++++++++++++++++++ man/greta_install_miniconda.Rd | 20 ++++++++++++++++++++ man/read_greta_install_log.Rd | 23 +++++++++++++++++++++++ 8 files changed, 114 insertions(+), 4 deletions(-) create mode 100644 man/greta_create_conda_env.Rd create mode 100644 man/greta_install_miniconda.Rd create mode 100644 man/read_greta_install_log.Rd diff --git a/DESCRIPTION b/DESCRIPTION index 23be7270..2df04268 100644 --- a/DESCRIPTION +++ b/DESCRIPTION @@ -46,6 +46,7 @@ Imports: reticulate (>= 1.19.0), rlang, tensorflow (>= 2.8.0), + utils, whisker, yesno Suggests: diff --git a/NAMESPACE b/NAMESPACE index dd075d07..538c28dd 100644 --- a/NAMESPACE +++ b/NAMESPACE @@ -196,7 +196,9 @@ export(gamma) export(gpu_only) export(gradient_descent) export(greta_array) +export(greta_create_conda_env) export(greta_deps_receipt) +export(greta_install_miniconda) export(greta_notes_tf_num_error) export(greta_python_deps) export(greta_set_install_logfile) diff --git a/R/greta_create_conda_env.R b/R/greta_create_conda_env.R index 6542ada6..d40cc9b9 100644 --- a/R/greta_create_conda_env.R +++ b/R/greta_create_conda_env.R @@ -1,6 +1,26 @@ +#' Create conda environment for greta +#' +#' This function runs [reticulate::conda_create()] inside +#' 
[callr::r_process_options()], to create the conda environment, +#' "greta-env-tf2". This is used within [install_greta_deps()] as part of +#' setting up python dependencies. It uses a version of python that is +#' compatible with the versions of tensorflow and tensorflow-probability, +#' which is established with [greta_python_deps()]. We mostly recommend +#' users use [install_greta_deps()] to manage their python dependency +#' installation. +#' +#' +#' @param timeout time (minutes) until installation stops. Default is 5 minutes. +#' @param python_deps dependency specification, see [greta_python_deps()] for +#' more details. +#' +#' @return nothing - creates a conda environment for a specific python version +#' @export greta_create_conda_env <- function(timeout = 5, python_deps = greta_python_deps()) { + check_greta_python_deps(python_deps) + stdout_file <- create_temp_file("out-greta-conda") stderr_file <- create_temp_file("err-greta-conda") diff --git a/R/greta_install_miniconda.R b/R/greta_install_miniconda.R index c90a7dc1..ed523d6c 100644 --- a/R/greta_install_miniconda.R +++ b/R/greta_install_miniconda.R @@ -1,3 +1,14 @@ +#' Installs miniconda +#' +#' This installs miniconda using [reticulate::install_miniconda()] inside +#' [callr::r_process_options()]. Used internally by [install_greta_deps()]. +#' We mostly recommend users use [install_greta_deps()] to manage their +#' python dependency installation. +#' +#' @param timeout time (minutes) until installation stops. Default is 5 minutes. +#' +#' @return nothing - installs miniconda. 
+#' @export greta_install_miniconda <- function(timeout = 5) { stdout_file <- create_temp_file("out-miniconda") diff --git a/R/write-logfiles.R b/R/write-logfiles.R index 712dcc09..5dea578c 100644 --- a/R/write-logfiles.R +++ b/R/write-logfiles.R @@ -130,7 +130,7 @@ write_greta_install_log <- function(path = greta_logfile) { # returns NULL if no envvar sys_get_env <- function(envvar){ retrieved_envvar <- Sys.getenv(envvar) - env_exists <- nzchar(log_env) + env_exists <- nzchar(retrieved_envvar) if (env_exists){ envvar } else { @@ -139,11 +139,17 @@ sys_get_env <- function(envvar){ envvar } - #' Read a greta logfile #' +#' This is a convenience function to facilitate reading logfiles. It opens +#' a browser using [utils::browseURL()]. +#' #' @param path file to read. Optional. If not specified, it will search for -#' the environment variable "GRETA_INSTALLATION_LOG" +#' the environment variable "GRETA_INSTALLATION_LOG". To set +#' "GRETA_INSTALLATION_LOG" you can use +#' `Sys.setenv('GRETA_INSTALLATION_LOG'='path/to/logfile.html')`. Or use +#' [greta_set_install_logfile()] to set the path, e.g., +#' `greta_set_install_logfile('path/to/logfile.html')`. #' #' @return opens a URL in your default browser #' @export @@ -152,6 +158,6 @@ read_greta_install_log <- function(path = NULL){ path <- path %||% log_env - browseURL(path) + utils::browseURL(path) } diff --git a/man/greta_create_conda_env.Rd b/man/greta_create_conda_env.Rd new file mode 100644 index 00000000..6ba2f78e --- /dev/null +++ b/man/greta_create_conda_env.Rd @@ -0,0 +1,27 @@ +% Generated by roxygen2: do not edit by hand +% Please edit documentation in R/greta_create_conda_env.R +\name{greta_create_conda_env} +\alias{greta_create_conda_env} +\title{Create conda environment for greta} +\usage{ +greta_create_conda_env(timeout = 5, python_deps = greta_python_deps()) +} +\arguments{ +\item{timeout}{time (minutes) until installation stops. 
Default is 5 minutes.} + +\item{python_deps}{dependency specification, see \code{\link[=greta_python_deps]{greta_python_deps()}} for +more details.} +} +\value{ +nothing - creates a conda environment for a specific python version +} +\description{ +This function runs \code{\link[reticulate:conda-tools]{reticulate::conda_create()}} inside +\code{\link[callr:r_process_options]{callr::r_process_options()}}, to create the conda environment, +"greta-env-tf2". This is used within \code{\link[=install_greta_deps]{install_greta_deps()}} as part of +setting up python dependencies. It uses a version of python that is +compatible with the versions of tensorflow and tensorflow-probability, +which is established with \code{\link[=greta_python_deps]{greta_python_deps()}}. We mostly recommend +users use \code{\link[=install_greta_deps]{install_greta_deps()}} to manage their python dependency +installation. +} diff --git a/man/greta_install_miniconda.Rd b/man/greta_install_miniconda.Rd new file mode 100644 index 00000000..210a238e --- /dev/null +++ b/man/greta_install_miniconda.Rd @@ -0,0 +1,20 @@ +% Generated by roxygen2: do not edit by hand +% Please edit documentation in R/greta_install_miniconda.R +\name{greta_install_miniconda} +\alias{greta_install_miniconda} +\title{Installs miniconda} +\usage{ +greta_install_miniconda(timeout = 5) +} +\arguments{ +\item{timeout}{time (minutes) until installation stops. Default is 5 minutes.} +} +\value{ +nothing - installs miniconda. +} +\description{ +This installs miniconda using \code{\link[reticulate:install_miniconda]{reticulate::install_miniconda()}} inside +\code{\link[callr:r_process_options]{callr::r_process_options()}}. Used internally by \code{\link[=install_greta_deps]{install_greta_deps()}}. +We mostly recommend users use \code{\link[=install_greta_deps]{install_greta_deps()}} to manage their +python dependency installation. 
+} diff --git a/man/read_greta_install_log.Rd b/man/read_greta_install_log.Rd new file mode 100644 index 00000000..c8b0bb83 --- /dev/null +++ b/man/read_greta_install_log.Rd @@ -0,0 +1,23 @@ +% Generated by roxygen2: do not edit by hand +% Please edit documentation in R/write-logfiles.R +\name{read_greta_install_log} +\alias{read_greta_install_log} +\title{Read a greta logfile} +\usage{ +read_greta_install_log(path = NULL) +} +\arguments{ +\item{path}{file to read. Optional. If not specified, it will search for +the environment variable "GRETA_INSTALLATION_LOG". To set +"GRETA_INSTALLATION_LOG" you can use +\code{Sys.setenv('GRETA_INSTALLATION_LOG'='path/to/logfile.html')}. Or use +\code{\link[=greta_set_install_logfile]{greta_set_install_logfile()}} to set the path, e.g., +\code{greta_set_install_logfile('path/to/logfile.html')}.} +} +\value{ +opens a URL in your default browser +} +\description{ +This is a convenience function to facilitate reading logfiles. It opens +a browser using \code{\link[utils:browseURL]{utils::browseURL()}}. 
+} From d5d5af19b0d005f61c644e6488b9394e13b2653f Mon Sep 17 00:00:00 2001 From: njtierney Date: Mon, 19 Aug 2024 15:59:10 +1000 Subject: [PATCH 07/14] add some missing `skip_if_not(check_tf_version())` --- tests/testthat/test-message_if_using_gpu.R | 5 ++++- tests/testthat/test-print_calculate.R | 1 + tests/testthat/test_as_data.R | 1 - tests/testthat/test_calculate.R | 14 -------------- tests/testthat/test_distributions.R | 1 - tests/testthat/test_distributions_cholesky.R | 2 ++ tests/testthat/test_functions.R | 8 -------- tests/testthat/test_seed.R | 3 +++ 8 files changed, 10 insertions(+), 25 deletions(-) diff --git a/tests/testthat/test-message_if_using_gpu.R b/tests/testthat/test-message_if_using_gpu.R index 5fad85a6..33c1ec06 100644 --- a/tests/testthat/test-message_if_using_gpu.R +++ b/tests/testthat/test-message_if_using_gpu.R @@ -1,4 +1,5 @@ test_that("message_if_using_gpu gives the correct message for cpu or gpu use", { + skip_if_not(check_tf_version()) expect_snapshot( message_if_using_gpu(cpu_only()) ) @@ -8,6 +9,7 @@ test_that("message_if_using_gpu gives the correct message for cpu or gpu use", { }) test_that("message_if_using_gpu does not message when option set",{ + skip_if_not(check_tf_version()) withr::local_options( list("greta_gpu_message" = FALSE) ) @@ -19,6 +21,7 @@ test_that("message_if_using_gpu does not message when option set",{ }) test_that("message_if_using_gpu does message when option set",{ + skip_if_not(check_tf_version()) withr::local_options( list("greta_gpu_message" = TRUE) ) @@ -30,7 +33,7 @@ test_that("message_if_using_gpu does message when option set",{ }) test_that("is_using_gpu and is_using_cpu work",{ - + skip_if_not(check_tf_version()) expect_true(is_using_gpu(gpu_only())) expect_false(is_using_gpu(cpu_only())) diff --git a/tests/testthat/test-print_calculate.R b/tests/testthat/test-print_calculate.R index 69450e95..cc6831a4 100644 --- a/tests/testthat/test-print_calculate.R +++ b/tests/testthat/test-print_calculate.R @@ -1,4 
+1,5 @@ test_that("calculate print method is different for different inputs", { + skip_if_not(check_tf_version()) # ensure print method is the new MCMC one skip_on_cran() skip_on_ci() diff --git a/tests/testthat/test_as_data.R b/tests/testthat/test_as_data.R index 792bf0eb..27d06102 100644 --- a/tests/testthat/test_as_data.R +++ b/tests/testthat/test_as_data.R @@ -1,7 +1,6 @@ test_that("as_data coerces correctly", { skip_if_not(check_tf_version()) - # logical, integer and numeric # vector, matrix, array, dataframe diff --git a/tests/testthat/test_calculate.R b/tests/testthat/test_calculate.R index a7ee74ce..d64dff18 100644 --- a/tests/testthat/test_calculate.R +++ b/tests/testthat/test_calculate.R @@ -28,7 +28,6 @@ test_that("deterministic calculate works with correct lists", { test_that("stochastic calculate works with correct lists", { skip_if_not(check_tf_version()) - # nolint start # with y ~ N(100, 1 ^ 2), it should be very unlikely that y <= 90 # ( pnorm(90, 100, 1) = 7e-24 ) @@ -132,7 +131,6 @@ test_that("deterministic calculate works with greta_mcmc_list objects", { test_that("calculate with greta_mcmc_list doesn't mix up variables", { skip_if_not(check_tf_version()) - a <- normal(-100, 0.001) b <- normal(100, 0.001) c <- normal(0, 0.001) @@ -153,7 +151,6 @@ test_that("calculate with greta_mcmc_list doesn't mix up variables", { test_that("calculate with greta_mcmc_list doesn't lose track of new nodes", { skip_if_not(check_tf_version()) - z <- normal(0, 1) m <- model(z) draws <- mcmc(m, warmup = 100, n_samples = 100, verbose = FALSE) @@ -170,7 +167,6 @@ test_that("calculate with greta_mcmc_list doesn't lose track of new nodes", { test_that("stochastic calculate works with greta_mcmc_list objects", { skip_if_not(check_tf_version()) - samples <- 10 chains <- 2 @@ -225,7 +221,6 @@ test_that("stochastic calculate works with greta_mcmc_list objects", { test_that("calculate errors if the mcmc samples unrelated to target", { skip_if_not(check_tf_version()) - 
samples <- 10 chains <- 2 @@ -253,7 +248,6 @@ test_that("calculate errors if the mcmc samples unrelated to target", { test_that("stochastic calculate works with mcmc samples & new stochastics", { skip_if_not(check_tf_version()) - samples <- 10 chains <- 2 @@ -288,7 +282,6 @@ test_that("stochastic calculate works with mcmc samples & new stochastics", { test_that("calculate errors nicely if non-greta arrays are passed", { skip_if_not(check_tf_version()) - x <- c(1, 2) a <- normal(0, 1) y <- a * x @@ -308,7 +301,6 @@ test_that("calculate errors nicely if non-greta arrays are passed", { test_that("calculate errors nicely if values for stochastics not passed", { skip_if_not(check_tf_version()) - x <- as_data(c(1, 2)) a <- normal(0, 1) y <- a * x @@ -325,7 +317,6 @@ test_that("calculate errors nicely if values for stochastics not passed", { test_that("calculate errors nicely if values have incorrect dimensions", { skip_if_not(check_tf_version()) - x <- as_data(c(1, 2)) a <- normal(0, 1) y <- a * x @@ -339,7 +330,6 @@ test_that("calculate errors nicely if values have incorrect dimensions", { test_that("calculate works with variable batch sizes", { skip_if_not(check_tf_version()) - samples <- 100 x <- as_data(c(1, 2)) a <- normal(0, 1) @@ -367,7 +357,6 @@ test_that("calculate works with variable batch sizes", { test_that("calculate errors nicely with invalid batch sizes", { skip_if_not(check_tf_version()) - samples <- 100 x <- as_data(c(1, 2)) a <- normal(0, 1) @@ -390,7 +379,6 @@ test_that("calculate errors nicely with invalid batch sizes", { test_that("calculate returns a named list", { skip_if_not(check_tf_version()) - a <- as_data(randn(3)) b <- a^2 c <- sqrt(b) @@ -449,7 +437,6 @@ test_that("calculate works if distribution-free variables are fixed", { test_that("calculate errors if distribution-free variables are not fixed", { skip_if_not(check_tf_version()) - # fix variable a <- variable() y <- normal(a, 1) @@ -461,7 +448,6 @@ test_that("calculate errors if 
distribution-free variables are not fixed", { test_that("calculate errors if a distribution cannot be sampled from", { skip_if_not(check_tf_version()) - # fix variable y <- hypergeometric(5, 3, 2) expect_snapshot_error( diff --git a/tests/testthat/test_distributions.R b/tests/testthat/test_distributions.R index 301c107f..ab485133 100644 --- a/tests/testthat/test_distributions.R +++ b/tests/testthat/test_distributions.R @@ -775,7 +775,6 @@ test_that("wishart distribution errors informatively", { ) }) - test_that("lkj_correlation distribution errors informatively", { skip_if_not(check_tf_version()) diff --git a/tests/testthat/test_distributions_cholesky.R b/tests/testthat/test_distributions_cholesky.R index f2db7374..76a6d2fb 100644 --- a/tests/testthat/test_distributions_cholesky.R +++ b/tests/testthat/test_distributions_cholesky.R @@ -61,6 +61,7 @@ test_that("Cholesky factor of LJK_correlation should be a lower triangular matri }) test_that("Post-MCMC, Wishart distribution stays symmetric, chol remains lower tri",{ + skip_if_not(check_tf_version()) # From https://github.com/greta-dev/greta/issues/585 x <- wishart(df = 4, Sigma = diag(3)) m <- model(x) @@ -81,6 +82,7 @@ test_that("Post-MCMC, Wishart distribution stays symmetric, chol remains lower t }) test_that("Post-MCMC, LKJ distribution stays symmetric, chol remains lower tri",{ + skip_if_not(check_tf_version()) # From https://github.com/greta-dev/greta/issues/585 x <- lkj_correlation(eta = 3, dimension = 3) m <- model(x) diff --git a/tests/testthat/test_functions.R b/tests/testthat/test_functions.R index 0f15fc90..0a4f2d50 100644 --- a/tests/testthat/test_functions.R +++ b/tests/testthat/test_functions.R @@ -65,7 +65,6 @@ test_that("primitive functions work as expected", { test_that("cummax and cummin functions error informatively", { skip_if_not(check_tf_version()) - cumulative_funs <- list(cummax, cummin) x <- as_data(randn(10)) @@ -79,7 +78,6 @@ test_that("cummax and cummin functions error informatively", { 
test_that("complex number functions error informatively", { skip_if_not(check_tf_version()) - complex_funs <- list(Im, Re, Arg, Conj, Mod) x <- as_data(randn(25, 4)) @@ -145,7 +143,6 @@ test_that("kronecker works with greta and base array arguments", { test_that("aperm works as expected", { skip_if_not(check_tf_version()) - a <- randn(5, 4, 3, 2, 1) # default is to reverse dims @@ -201,7 +198,6 @@ test_that("cumulative functions work as expected", { test_that("apply works as expected", { skip_if_not(check_tf_version()) - # check apply.greta_array works like R's apply for X check_apply <- function(X, MARGIN, FUN) { # nolint check_op(apply, a, @@ -244,7 +240,6 @@ test_that("tapply works as expected", { test_that("cumulative functions error as expected", { skip_if_not(check_tf_version()) - a <- as_data(randn(1, 5)) b <- as_data(randn(5, 1, 1)) @@ -270,7 +265,6 @@ test_that("cumulative functions error as expected", { test_that("sweep works as expected", { skip_if_not(check_tf_version()) - stats_list <- list(randn(5), randn(25)) x <- randn(5, 25) @@ -291,7 +285,6 @@ test_that("sweep works as expected", { test_that("sweep works for numeric x and greta array STATS", { skip_if_not(check_tf_version()) - stats <- randn(5) ga_stats <- as_data(stats) x <- randn(5, 25) @@ -437,7 +430,6 @@ test_that("forwardsolve and backsolve error as expected", { test_that("tapply errors as expected", { skip_if_not(check_tf_version()) - group <- sample.int(5, 10, replace = TRUE) a <- ones(10, 1) b <- ones(10, 2) diff --git a/tests/testthat/test_seed.R b/tests/testthat/test_seed.R index dd5df49d..cb5c30a8 100644 --- a/tests/testthat/test_seed.R +++ b/tests/testthat/test_seed.R @@ -77,6 +77,7 @@ test_that("calculate produces the right number of samples", { test_that("calculate samples are the same when the argument seed is the same", { + skip_if_not(check_tf_version()) a <- normal(0, 1) y <- normal(a, 1) m <- model(y) @@ -92,6 +93,7 @@ test_that("calculate samples are the same when the argument 
seed is the same", { }) test_that("calculate samples are the same when the R seed is the same", { + skip_if_not(check_tf_version()) a <- normal(0, 1) y <- normal(a, 1) m <- model(y) @@ -109,6 +111,7 @@ test_that("calculate samples are the same when the R seed is the same", { }) test_that("mcmc samples are the same when the R seed is the same", { + skip_if_not(check_tf_version()) a <- normal(0, 1) y <- normal(a, 1) m <- model(y) From ca5a77528bc319b419fc030a54035d37c0572008 Mon Sep 17 00:00:00 2001 From: njtierney Date: Tue, 20 Aug 2024 09:42:34 +1000 Subject: [PATCH 08/14] Use `expect_snapshot(error = TRUE, ...)` over `expect_snapshot_error`, since this captures more information about the code that was run for the error, making it easier to debug/explore changes in snapshots. --- tests/testthat/_snaps/calculate.md | 118 ++++-- tests/testthat/_snaps/distributions.md | 346 +++++++++++++----- .../_snaps/extract_replace_combine.md | 90 ++++- tests/testthat/_snaps/functions.md | 210 ++++++++--- tests/testthat/_snaps/future.md | 24 +- tests/testthat/_snaps/iid_samples.md | 13 +- tests/testthat/_snaps/inference.md | 106 ++++-- tests/testthat/_snaps/install_greta_deps.md | 10 +- tests/testthat/_snaps/joint.md | 28 +- tests/testthat/_snaps/misc.md | 82 ++++- tests/testthat/_snaps/mixture.md | 57 ++- tests/testthat/_snaps/operators.md | 18 +- tests/testthat/_snaps/opt.md | 92 +++-- tests/testthat/_snaps/simulate.md | 36 +- tests/testthat/_snaps/syntax.md | 56 ++- tests/testthat/_snaps/transforms.md | 8 +- tests/testthat/_snaps/truncated.md | 18 +- tests/testthat/_snaps/variables.md | 72 ++-- tests/testthat/test-zzzzzz.R | 2 +- tests/testthat/test_calculate.R | 30 +- tests/testthat/test_distributions.R | 80 ++-- tests/testthat/test_extract_replace_combine.R | 28 +- tests/testthat/test_functions.R | 52 +-- tests/testthat/test_future.R | 8 +- tests/testthat/test_iid_samples.R | 4 +- tests/testthat/test_inference.R | 30 +- tests/testthat/test_install_greta_deps.R | 4 +- 
tests/testthat/test_joint.R | 8 +- tests/testthat/test_misc.R | 22 +- tests/testthat/test_mixture.R | 14 +- tests/testthat/test_operators.R | 4 +- tests/testthat/test_opt.R | 18 +- tests/testthat/test_simulate.R | 10 +- tests/testthat/test_syntax.R | 18 +- tests/testthat/test_transforms.R | 2 +- tests/testthat/test_truncated.R | 4 +- tests/testthat/test_variables.R | 12 +- 37 files changed, 1227 insertions(+), 507 deletions(-) diff --git a/tests/testthat/_snaps/calculate.md b/tests/testthat/_snaps/calculate.md index 0391021b..fc126fb4 100644 --- a/tests/testthat/_snaps/calculate.md +++ b/tests/testthat/_snaps/calculate.md @@ -1,8 +1,12 @@ # stochastic calculate works with greta_mcmc_list objects - `nsim` must be set to sample s not in MCMC samples - the greta arrays `y` have distributions and are not in the MCMC samples, so cannot be calculated from the samples alone. - Set `nsim` if you want to sample them conditionally on the MCMC samples + Code + calc_a <- calculate(a, y, values = draws) + Condition + Error in `calculate_greta_mcmc_list()`: + ! `nsim` must be set to sample s not in MCMC samples + the greta arrays `y` have distributions and are not in the MCMC samples, so cannot be calculated from the samples alone. + Set `nsim` if you want to sample them conditionally on the MCMC samples --- @@ -10,69 +14,125 @@ # calculate errors if the mcmc samples unrelated to target - the target s do not appear to be connected to those in the object + Code + calc_c <- calculate(c, values = draws) + Condition + Error in `calculate_greta_mcmc_list()`: + ! the target s do not appear to be connected to those in the object # stochastic calculate works with mcmc samples & new stochastics - `nsim` must be set to sample s not in MCMC samples - the target s are related to new variables that are not in the MCMC samples, so cannot be calculated from the samples alone. 
- Set `nsim` if you want to sample them conditionally on the MCMC samples + Code + calc_b <- calculate(b, values = draws) + Condition + Error in `calculate_greta_mcmc_list()`: + ! `nsim` must be set to sample <greta_array>s not in MCMC samples + the target <greta_array>s are related to new variables that are not in the MCMC samples, so cannot be calculated from the samples alone. + Set `nsim` if you want to sample them conditionally on the MCMC samples # calculate errors nicely if non-greta arrays are passed - `calculate()` arguments must be <greta_array>s - The following object passed to `calculate()` is not a <greta_array>: - "x" - Perhaps you forgot to explicitly name other arguments? + Code + calc_y <- calculate(y, x, values = list(x = c(2, 1))) + Condition + Error in `calculate()`: + ! `calculate()` arguments must be <greta_array>s + The following object passed to `calculate()` is not a <greta_array>: + "x" + Perhaps you forgot to explicitly name other arguments? --- - `calculate()` arguments must be <greta_array>s - The following object passed to `calculate()` is not a <greta_array>: - "list(x = c(2, 1))" - Perhaps you forgot to explicitly name other arguments? + Code + calc_y <- calculate(y, list(x = c(2, 1))) + Condition + Error in `calculate()`: + ! `calculate()` arguments must be <greta_array>s + The following object passed to `calculate()` is not a <greta_array>: + "list(x = c(2, 1))" + Perhaps you forgot to explicitly name other arguments? # calculate errors nicely if values for stochastics not passed - Please provide values for the following 1 <greta_array>: - `a` + Code + calc_y <- calculate(y, values = list(x = c(2, 1))) + Condition + Error in `lapply()`: + ! Please provide values for the following 1 <greta_array>: + `a` # calculate errors nicely if values have incorrect dimensions - a provided value has different number of elements than the <greta_array> + Code + calc_y <- calculate(y, values = list(a = c(1, 1))) + Condition + Error: + !
a provided value has different number of elements than the <greta_array> # calculate errors nicely with invalid batch sizes - `trace_batch_size` must be a single numeric value greater than or equal to 1 + Code + calc_y <- calculate(y, values = draws, trace_batch_size = 0) + Condition + Error in `calculate_greta_mcmc_list()`: + ! `trace_batch_size` must be a single numeric value greater than or equal to 1 --- - `trace_batch_size` must be a single numeric value greater than or equal to 1 + Code + calc_y <- calculate(y, values = draws, trace_batch_size = NULL) + Condition + Error in `calculate_greta_mcmc_list()`: + ! `trace_batch_size` must be a single numeric value greater than or equal to 1 --- - `trace_batch_size` must be a single numeric value greater than or equal to 1 + Code + calc_y <- calculate(y, values = draws, trace_batch_size = NA) + Condition + Error in `calculate_greta_mcmc_list()`: + ! `trace_batch_size` must be a single numeric value greater than or equal to 1 # calculate errors if distribution-free variables are not fixed - the target <greta_array>s are related to variables that do not have distributions so cannot be sampled + Code + calc_a <- calculate(a, y, nsim = 1) + Condition + Error in `calculate_list()`: + ! the target <greta_array>s are related to variables that do not have distributions so cannot be sampled # calculate errors if a distribution cannot be sampled from - Sampling is not yet implemented for "hypergeometric" distributions + Code + sims <- calculate(y, nsim = 1) + Condition + Error in `check_sampling_implemented()`: + ! Sampling is not yet implemented for "hypergeometric" distributions # calculate errors nicely if nsim is invalid - nsim must be a positive integer - However the value provided was: 0 + Code + calc_x <- calculate(x, nsim = 0) + Condition + Error in `calculate()`: + !
nsim must be a positive integer + However the value provided was: 0 --- - nsim must be a positive integer - However the value provided was: -1 + Code + calc_x <- calculate(x, nsim = -1) + Condition + Error in `calculate()`: + ! nsim must be a positive integer + However the value provided was: -1 --- - nsim must be a positive integer - However the value provided was: NA + Code + calc_x <- calculate(x, nsim = "five") + Condition + Error in `calculate()`: + ! nsim must be a positive integer + However the value provided was: NA diff --git a/tests/testthat/_snaps/distributions.md b/tests/testthat/_snaps/distributions.md index 43838138..f7372165 100644 --- a/tests/testthat/_snaps/distributions.md +++ b/tests/testthat/_snaps/distributions.md @@ -57,27 +57,43 @@ # poisson() and binomial() error informatively in glm - Wrong function name provided in another model - It looks like you're using greta's `poisson()` function in the family argument of another model. - Maybe you want to use `family = stats::poisson`,instead? + Code + glm(1 ~ 1, family = poisson) + Condition + Error in `family()`: + ! Wrong function name provided in another model + It looks like you're using greta's `poisson()` function in the family argument of another model. + Maybe you want to use `family = stats::poisson`,instead? --- - Wrong function name provided in another model - It looks like you're using greta's `binomial()` function in the family argument of another model. - Maybe you want to use `family = stats::binomial`,instead? + Code + glm(1 ~ 1, family = binomial) + Condition + Error in `family()`: + ! Wrong function name provided in another model + It looks like you're using greta's `binomial()` function in the family argument of another model. + Maybe you want to use `family = stats::binomial`,instead? --- - Wrong function name provided in another model - It looks like you're using greta's `poisson()` function in the family argument of another model. 
- Maybe you want to use `family = stats::poisson`,instead? + Code + glm(1 ~ 1, family = poisson()) + Condition + Error in `poisson()`: + ! Wrong function name provided in another model + It looks like you're using greta's `poisson()` function in the family argument of another model. + Maybe you want to use `family = stats::poisson`,instead? --- - Wrong function name provided in another model - It looks like you're using greta's `poisson()` function in the family argument of another model. - Maybe you want to use `family = stats::poisson`,instead? + Code + glm(1 ~ 1, family = poisson("sqrt")) + Condition + Error in `poisson()`: + ! Wrong function name provided in another model + It looks like you're using greta's `poisson()` function in the family argument of another model. + Maybe you want to use `family = stats::poisson`,instead? # wishart distribution errors informatively @@ -99,190 +115,334 @@ # lkj_correlation distribution errors informatively - `eta` must be a positive scalar value, or a scalar + Code + lkj_correlation(-1, dim) + Condition + Error in `initialize()`: + ! `eta` must be a positive scalar value, or a scalar --- - `eta` must be a positive scalar value, or a scalar + Code + lkj_correlation(c(3, 3), dim) + Condition + Error in `initialize()`: + ! `eta` must be a positive scalar value, or a scalar --- - `eta` must be a scalar - However `eta` had dimensions: 2x1 + Code + lkj_correlation(uniform(0, 1, dim = 2), dim) + Condition + Error in `initialize()`: + ! `eta` must be a scalar + However `eta` had dimensions: 2x1 --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + lkj_correlation(4, dimension = -1) + Condition + Error in `initialize()`: + ! 
`dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + lkj_correlation(4, dim = c(3, 3)) + Condition + Error in `initialize()`: + ! `dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + lkj_correlation(4, dim = NA) + Condition + Error in `initialize()`: + ! `dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: # multivariate_normal distribution errors informatively - the dimension of this distribution must be at least 2, but was 1 - multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? + Code + multivariate_normal(m_c, a) + Condition + Error in `check_dimension()`: + ! the dimension of this distribution must be at least 2, but was 1 + multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? --- - the dimension of this distribution must be at least 2, but was 1 - multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? + Code + multivariate_normal(m_d, a) + Condition + Error in `check_dimension()`: + ! the dimension of this distribution must be at least 2, but was 1 + multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? --- - Dimensions of parameters not compatible with multivariate distribution parameters of multivariate distributions cannot have more than two dimensions - object `x` has dimensions: 3x3x3 + Code + multivariate_normal(m_a, b) + Condition + Error in `lapply()`: + ! 
Dimensions of parameters not compatible with multivariate distribution parameters of multivariate distributions cannot have more than two dimensions + object `x` has dimensions: 3x3x3 --- - Object must be 2D square array - x But it had dimension: "3x2" + Code + multivariate_normal(m_a, c) + Condition + Error in `lapply()`: + ! Object must be 2D square array + x But it had dimension: "3x2" --- - distribution dimensions do not match implied dimensions - The distribution dimension should be 3, but parameters implied dimensions: 3 vs 4 - Multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? + Code + multivariate_normal(m_a, d) + Condition + Error in `check_dimension()`: + ! distribution dimensions do not match implied dimensions + The distribution dimension should be 3, but parameters implied dimensions: 3 vs 4 + Multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? --- - the dimension of this distribution must be at least 2, but was 1 - multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? + Code + multivariate_normal(0, 1) + Condition + Error in `check_dimension()`: + ! the dimension of this distribution must be at least 2, but was 1 + multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? --- - `n_realisations is not a positive scalar interger` - `n_realisations` must be a positive scalar integer giving the number of rows of the output - x We see `n_realisations` = `-1` having class: and length `1` + Code + multivariate_normal(m_a, a, n_realisations = -1) + Condition + Error in `check_n_realisations()`: + ! 
`n_realisations is not a positive scalar interger` + `n_realisations` must be a positive scalar integer giving the number of rows of the output + x We see `n_realisations` = `-1` having class: and length `1` --- - `n_realisations is not a positive scalar interger` - `n_realisations` must be a positive scalar integer giving the number of rows of the output - x We see `n_realisations` = `1` and `3` having class: and length `2` + Code + multivariate_normal(m_a, a, n_realisations = c(1, 3)) + Condition + Error in `check_n_realisations()`: + ! `n_realisations is not a positive scalar interger` + `n_realisations` must be a positive scalar integer giving the number of rows of the output + x We see `n_realisations` = `1` and `3` having class: and length `2` --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + multivariate_normal(m_a, a, dimension = -1) + Condition + Error in `check_multivariate_dims()`: + ! `dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + multivariate_normal(m_a, a, dimension = c(1, 3)) + Condition + Error in `check_multivariate_dims()`: + ! `dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: # multinomial distribution errors informatively - the dimension of this distribution must be at least 2, but was 1 - multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? + Code + multinomial(c(1), 1) + Condition + Error in `check_dimension()`: + ! the dimension of this distribution must be at least 2, but was 1 + multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? 
--- - `n_realisations is not a positive scalar interger` - `n_realisations` must be a positive scalar integer giving the number of rows of the output - x We see `n_realisations` = `-1` having class: and length `1` + Code + multinomial(10, p_a, n_realisations = -1) + Condition + Error in `check_n_realisations()`: + ! `n_realisations is not a positive scalar interger` + `n_realisations` must be a positive scalar integer giving the number of rows of the output + x We see `n_realisations` = `-1` having class: and length `1` --- - `n_realisations is not a positive scalar interger` - `n_realisations` must be a positive scalar integer giving the number of rows of the output - x We see `n_realisations` = `1` and `3` having class: and length `2` + Code + multinomial(10, p_a, n_realisations = c(1, 3)) + Condition + Error in `check_n_realisations()`: + ! `n_realisations is not a positive scalar interger` + `n_realisations` must be a positive scalar integer giving the number of rows of the output + x We see `n_realisations` = `1` and `3` having class: and length `2` --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + multinomial(10, p_a, dimension = -1) + Condition + Error in `check_multivariate_dims()`: + ! `dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + multinomial(10, p_a, dimension = c(1, 3)) + Condition + Error in `check_multivariate_dims()`: + ! `dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: # categorical distribution errors informatively - the dimension of this distribution must be at least 2, but was 1 - multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? 
+ Code + categorical(1) + Condition + Error in `check_dimension()`: + ! the dimension of this distribution must be at least 2, but was 1 + multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? --- - `n_realisations is not a positive scalar interger` - `n_realisations` must be a positive scalar integer giving the number of rows of the output - x We see `n_realisations` = `-1` having class: and length `1` + Code + categorical(p_a, n_realisations = -1) + Condition + Error in `check_n_realisations()`: + ! `n_realisations is not a positive scalar interger` + `n_realisations` must be a positive scalar integer giving the number of rows of the output + x We see `n_realisations` = `-1` having class: and length `1` --- - `n_realisations is not a positive scalar interger` - `n_realisations` must be a positive scalar integer giving the number of rows of the output - x We see `n_realisations` = `1` and `3` having class: and length `2` + Code + categorical(p_a, n_realisations = c(1, 3)) + Condition + Error in `check_n_realisations()`: + ! `n_realisations is not a positive scalar interger` + `n_realisations` must be a positive scalar integer giving the number of rows of the output + x We see `n_realisations` = `1` and `3` having class: and length `2` --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + categorical(p_a, dimension = -1) + Condition + Error in `check_multivariate_dims()`: + ! `dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + categorical(p_a, dimension = c(1, 3)) + Condition + Error in `check_multivariate_dims()`: + ! 
`dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: # dirichlet distribution errors informatively - the dimension of this distribution must be at least 2, but was 1 - multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? + Code + dirichlet(1) + Condition + Error in `check_dimension()`: + ! the dimension of this distribution must be at least 2, but was 1 + multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? --- - `n_realisations is not a positive scalar interger` - `n_realisations` must be a positive scalar integer giving the number of rows of the output - x We see `n_realisations` = `-1` having class: and length `1` + Code + dirichlet(alpha_a, n_realisations = -1) + Condition + Error in `check_n_realisations()`: + ! `n_realisations is not a positive scalar interger` + `n_realisations` must be a positive scalar integer giving the number of rows of the output + x We see `n_realisations` = `-1` having class: and length `1` --- - `n_realisations is not a positive scalar interger` - `n_realisations` must be a positive scalar integer giving the number of rows of the output - x We see `n_realisations` = `1` and `3` having class: and length `2` + Code + dirichlet(alpha_a, n_realisations = c(1, 3)) + Condition + Error in `check_n_realisations()`: + ! `n_realisations is not a positive scalar interger` + `n_realisations` must be a positive scalar integer giving the number of rows of the output + x We see `n_realisations` = `1` and `3` having class: and length `2` --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + dirichlet(alpha_a, dimension = -1) + Condition + Error in `check_multivariate_dims()`: + ! 
`dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + dirichlet(alpha_a, dimension = c(1, 3)) + Condition + Error in `check_multivariate_dims()`: + ! `dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: # dirichlet-multinomial distribution errors informatively - the dimension of this distribution must be at least 2, but was 1 - multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? + Code + dirichlet_multinomial(c(1), 1) + Condition + Error in `check_dimension()`: + ! the dimension of this distribution must be at least 2, but was 1 + multivariate distributions treat each row as a separate realisation - perhaps you need to transpose something? --- - `n_realisations is not a positive scalar interger` - `n_realisations` must be a positive scalar integer giving the number of rows of the output - x We see `n_realisations` = `-1` having class: and length `1` + Code + dirichlet_multinomial(10, alpha_a, n_realisations = -1) + Condition + Error in `check_n_realisations()`: + ! `n_realisations is not a positive scalar interger` + `n_realisations` must be a positive scalar integer giving the number of rows of the output + x We see `n_realisations` = `-1` having class: and length `1` --- - `n_realisations is not a positive scalar interger` - `n_realisations` must be a positive scalar integer giving the number of rows of the output - x We see `n_realisations` = `1` and `3` having class: and length `2` + Code + dirichlet_multinomial(10, alpha_a, n_realisations = c(1, 3)) + Condition + Error in `check_n_realisations()`: + ! 
`n_realisations is not a positive scalar interger` + `n_realisations` must be a positive scalar integer giving the number of rows of the output + x We see `n_realisations` = `1` and `3` having class: and length `2` --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + dirichlet_multinomial(10, alpha_a, dimension = -1) + Condition + Error in `check_multivariate_dims()`: + ! `dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: --- - `dimension` must be a positive scalar integer giving the dimension of the distribution - `dim(target)` returns: + Code + dirichlet_multinomial(10, alpha_a, dimension = c(1, 3)) + Condition + Error in `check_multivariate_dims()`: + ! `dimension` must be a positive scalar integer giving the dimension of the distribution + `dim(target)` returns: diff --git a/tests/testthat/_snaps/extract_replace_combine.md b/tests/testthat/_snaps/extract_replace_combine.md index 20195b98..de422ad7 100644 --- a/tests/testthat/_snaps/extract_replace_combine.md +++ b/tests/testthat/_snaps/extract_replace_combine.md @@ -1,59 +1,115 @@ # abind errors informatively - all s must have the same dimensions except on the `along` dimension - However, dimension 1 had varying sizes: 5 and 1 + Code + abind(a, b) + Condition + Error in `abind()`: + ! all s must have the same dimensions except on the `along` dimension + However, dimension 1 had varying sizes: 5 and 1 --- - `along` must be between 0 and 4 - Instead `along` was 5 + Code + abind(a, c, along = 5) + Condition + Error in `abind()`: + ! `along` must be between 0 and 4 + Instead `along` was 5 # assign errors on variable greta arrays - cannot replace values in a variable + Code + z[1] <- 3 + Condition + Error in `[<-`: + ! 
cannot replace values in a variable # rbind and cbind give informative error messages - all s must be have the same number of columns + Code + rbind(a, b) + Condition + Error in `rbind()`: + ! all s must be have the same number of columns --- - all s must be have the same number of rows + Code + cbind(a, b) + Condition + Error in `cbind()`: + ! all s must be have the same number of rows # replacement gives informative error messages - number of items to replace is not a multiple of replacement length + Code + x[1:2, , 1] <- seq_len(3) + Condition + Error in `[<-`: + ! number of items to replace is not a multiple of replacement length --- - subscript out of bounds + Code + x[1, 1, 3] <- 1 + Condition + Error: + ! subscript out of bounds --- - subscript out of bounds + Code + x[3] <- 1 + Condition + Error in `[<-`: + ! subscript out of bounds # extraction gives informative error messages - subscript out of bounds + Code + x[1, 1, 3] + Condition + Error: + ! subscript out of bounds --- - subscript out of bounds + Code + x[3] + Condition + Error in `x[3]`: + ! subscript out of bounds # dim<- errors as expected - length-0 dimension vector is invalid + Code + dim(x) <- pi[0] + Condition + Error in `dim<-`: + ! length-0 dimension vector is invalid --- - the dims contain missing values + Code + dim(x) <- c(1, NA) + Condition + Error in `dim<-`: + ! the dims contain missing values --- - the dims contain negative values: - `dim(x)` returns 3 and 4 + Code + dim(x) <- c(1, -1) + Condition + Error in `dim<-`: + ! the dims contain negative values: + `dim(x)` returns 3 and 4 --- - dims [product 13] do not match the length of object [12] + Code + dim(x) <- 13 + Condition + Error in `dim<-`: + ! 
dims [product 13] do not match the length of object [12] diff --git a/tests/testthat/_snaps/functions.md b/tests/testthat/_snaps/functions.md index ed75447b..d733e41f 100644 --- a/tests/testthat/_snaps/functions.md +++ b/tests/testthat/_snaps/functions.md @@ -1,50 +1,94 @@ # cummax and cummin functions error informatively - `cummax()` not yet implemented for greta + Code + fun(x) + Condition + Error: + ! `cummax()` not yet implemented for greta --- - `cummin()` not yet implemented for greta + Code + fun(x) + Condition + Error: + ! `cummin()` not yet implemented for greta # complex number functions error informatively - greta does not yet support complex numbers + Code + fun(x) + Condition + Error: + ! greta does not yet support complex numbers --- - greta does not yet support complex numbers + Code + fun(x) + Condition + Error: + ! greta does not yet support complex numbers --- - greta does not yet support complex numbers + Code + fun(x) + Condition + Error: + ! greta does not yet support complex numbers --- - greta does not yet support complex numbers + Code + fun(x) + Condition + Error: + ! greta does not yet support complex numbers --- - greta does not yet support complex numbers + Code + fun(x) + Condition + Error: + ! greta does not yet support complex numbers # cumulative functions error as expected - `x` must be a column vector - but `x` has dimensions 1x5 + Code + cumsum(a) + Condition + Error: + ! `x` must be a column vector + but `x` has dimensions 1x5 --- - `x` must be a column vector - but `x` has dimensions 5x1x1 + Code + cumsum(b) + Condition + Error: + ! `x` must be a column vector + but `x` has dimensions 5x1x1 --- - `x` must be a column vector - but `x` has dimensions 1x5 + Code + cumprod(a) + Condition + Error: + ! `x` must be a column vector + but `x` has dimensions 1x5 --- - `x` must be a column vector - but `x` has dimensions 5x1x1 + Code + cumprod(b) + Condition + Error: + ! 
`x` must be a column vector + but `x` has dimensions 5x1x1 # solve and sweep and kronecker error as expected @@ -155,49 +199,93 @@ # colSums etc. error as expected - invalid `dims` + Code + colSums(x, dims = 3) + Condition + Error in `rowcol_idx()`: + ! invalid `dims` --- - invalid `dims` + Code + rowSums(x, dims = 3) + Condition + Error in `rowcol_idx()`: + ! invalid `dims` --- - invalid `dims` + Code + colMeans(x, dims = 3) + Condition + Error in `rowcol_idx()`: + ! invalid `dims` --- - invalid `dims` + Code + rowMeans(x, dims = 3) + Condition + Error in `rowcol_idx()`: + ! invalid `dims` # forwardsolve and backsolve error as expected - `1` must equal `ncol(l)` for s + Code + forwardsolve(a, b, k = 1) + Condition + Error in `forwardsolve()`: + ! `1` must equal `ncol(l)` for s --- - `1` must equal `ncol(r)` for s + Code + backsolve(a, b, k = 1) + Condition + Error in `backsolve()`: + ! `1` must equal `ncol(r)` for s --- - `transpose` must be FALSE for s + Code + forwardsolve(a, b, transpose = TRUE) + Condition + Error in `forwardsolve()`: + ! `transpose` must be FALSE for s --- - `transpose` must be FALSE for s + Code + backsolve(a, b, transpose = TRUE) + Condition + Error in `backsolve()`: + ! `transpose` must be FALSE for s # tapply errors as expected - `x` must be 2D with one column - However `x` has dimensions 10x2 + Code + tapply(b, group, "sum") + Condition + Error in `tapply()`: + ! `x` must be 2D with one column + However `x` has dimensions 10x2 --- - `INDEX` cannot be a + Code + tapply(a, as_data(group), "sum") + Condition + Error in `check_not_greta_array()`: + ! `INDEX` cannot be a # ignored options are errored/warned about - the "digits" argument of `round()` cannot be set for s - s can only be rounded to the nearest integer, so the "digits" argument cannot be set + Code + round(x, 2) + Condition + Error: + ! 
the "digits" argument of `round()` cannot be set for s + s can only be rounded to the nearest integer, so the "digits" argument cannot be set --- @@ -217,43 +305,79 @@ # incorrect dimensions are errored about - only 2D arrays can be transposed + Code + t(x) + Condition + Error in `t()`: + ! only 2D arrays can be transposed --- - `perm` must be a reordering of the dimensions: 1, 2, and 3 - but was: 2 and 1 + Code + aperm(x, 2:1) + Condition + Error in `aperm()`: + ! `perm` must be a reordering of the dimensions: 1, 2, and 3 + but was: 2 and 1 --- - only two-dimensional, square, symmetric s can be Cholesky decomposed - `dim(x)` returns: 3, 3, and 3 + Code + chol(x) + Condition + Error in `chol()`: + ! only two-dimensional, square, symmetric s can be Cholesky decomposed + `dim(x)` returns: 3, 3, and 3 --- - only two-dimensional, square, symmetric s can be Cholesky decomposed - `dim(x)` returns: 3 and 4 + Code + chol(y) + Condition + Error in `chol()`: + ! only two-dimensional, square, symmetric s can be Cholesky decomposed + `dim(x)` returns: 3 and 4 --- - `chol2symm()` must have two-dimensional, square, upper-triangular s - `dim(x)` returns: 3, 3, and 3 + Code + chol2symm(x) + Condition + Error in `chol2symm()`: + ! `chol2symm()` must have two-dimensional, square, upper-triangular s + `dim(x)` returns: 3, 3, and 3 --- - `chol2symm()` must have two-dimensional, square, upper-triangular s - `dim(x)` returns: 3 and 4 + Code + chol2symm(y) + Condition + Error in `chol2symm()`: + ! `chol2symm()` must have two-dimensional, square, upper-triangular s + `dim(x)` returns: 3 and 4 --- - only two-dimensional, square, symmetric s can be eigendecomposed + Code + eigen(x) + Condition + Error in `eigen()`: + ! only two-dimensional, square, symmetric s can be eigendecomposed --- - only two-dimensional, square, symmetric s can be eigendecomposed + Code + eigen(y) + Condition + Error in `eigen()`: + ! 
only two-dimensional, square, symmetric s can be eigendecomposed --- - `x1` and `x2` must have the same number of columns - However `ncol(x1)` = 1 and `ncol(x2)` = 4 + Code + rdist(x, y) + Condition + Error in `rdist()`: + ! `x1` and `x2` must have the same number of columns + However `ncol(x1)` = 1 and `ncol(x2)` = 4 diff --git a/tests/testthat/_snaps/future.md b/tests/testthat/_snaps/future.md index be1a13b4..a8e4cb57 100644 --- a/tests/testthat/_snaps/future.md +++ b/tests/testthat/_snaps/future.md @@ -45,17 +45,33 @@ --- - parallel mcmc samplers cannot be run with `plan(multicore)` + Code + check_future_plan() + Condition + Error: + ! parallel mcmc samplers cannot be run with `plan(multicore)` --- - parallel mcmc samplers cannot be run with a fork cluster + Code + check_future_plan() + Condition + Error in `test_if_forked_cluster()`: + ! parallel mcmc samplers cannot be run with a fork cluster --- - parallel mcmc samplers cannot be run with `plan(multicore)` + Code + mcmc(m, verbose = FALSE) + Condition + Error in `run_samplers()`: + ! parallel mcmc samplers cannot be run with `plan(multicore)` --- - parallel mcmc samplers cannot be run with a fork cluster + Code + mcmc(m, verbose = FALSE) + Condition + Error in `test_if_forked_cluster()`: + ! parallel mcmc samplers cannot be run with a fork cluster diff --git a/tests/testthat/_snaps/iid_samples.md b/tests/testthat/_snaps/iid_samples.md index 38551aa0..08646684 100644 --- a/tests/testthat/_snaps/iid_samples.md +++ b/tests/testthat/_snaps/iid_samples.md @@ -1,8 +1,17 @@ # distributions without RNG error nicely - Sampling is not yet implemented for "hypergeometric" distributions + Code + compare_iid_samples(hypergeometric, rhyper, parameters = list(m = 11, n = 8, k = 5)) + Condition + Error in `check_sampling_implemented()`: + ! 
Sampling is not yet implemented for "hypergeometric" distributions --- - Sampling is not yet implemented for truncated "f" distributions + Code + compare_iid_samples(f, rtf, parameters = list(df1 = 4, df2 = 1, truncation = c( + 2, 3))) + Condition + Error in `dag$draw_sample()`: + ! Sampling is not yet implemented for truncated "f" distributions diff --git a/tests/testthat/_snaps/inference.md b/tests/testthat/_snaps/inference.md index 06c4128d..1fa4c7f7 100644 --- a/tests/testthat/_snaps/inference.md +++ b/tests/testthat/_snaps/inference.md @@ -1,21 +1,40 @@ # bad mcmc proposals are rejected - The log density could not be evaluated at these initial values - Try using these initials as the `values` argument in `calculate()` to see what values of subsequent s these initial values lead to. + Code + draws <- mcmc(m, chains = 1, n_samples = 2, warmup = 0, verbose = FALSE, + initial_values = initials(z = 1e+120)) + Condition + Error in `self$check_valid_parameters()`: + ! The log density could not be evaluated at these initial values + Try using these initials as the `values` argument in `calculate()` to see what values of subsequent s these initial values lead to. --- - Could not find reasonable starting values after 20 attempts. - Please specify initial values manually via the `initial_values` argument + Code + mcmc(m, chains = 1, n_samples = 1, warmup = 0, verbose = FALSE) + Condition + Error in `self$check_reasonable_starting_values()`: + ! Could not find reasonable starting values after 20 attempts. + Please specify initial values manually via the `initial_values` argument # mcmc handles initial values nicely - The number of provided initial values does not match chains - 3 sets of initial values were provided, but there are 2 chains + Code + draws <- mcmc(m, warmup = 10, n_samples = 10, verbose = FALSE, chains = 2, + initial_values = inits) + Condition + Error in `prep_initials()`: + ! 
The number of provided initial values does not match chains + 3 sets of initial values were provided, but there are 2 chains --- - The initial values provided have different dimensions than the named s + Code + draws <- mcmc(m, warmup = 10, n_samples = 10, verbose = FALSE, chains = 2, + initial_values = inits) + Condition + Error in `FUN()`: + ! The initial values provided have different dimensions than the named s --- @@ -66,18 +85,29 @@ # model errors nicely - `model()` arguments must be s - The following object passed to `model()` is not a : - "a" - + Code + model(a, b) + Condition + Error in `model()`: + ! `model()` arguments must be s + The following object passed to `model()` is not a : + "a" # initials works - initial values must be numeric + Code + initials(a = FALSE) + Condition + Error in `initials()`: + ! initial values must be numeric --- - All initial values must be named + Code + initials(FALSE) + Condition + Error in `initials()`: + ! All initial values must be named --- @@ -93,36 +123,68 @@ # prep_initials errors informatively - `initial_values` must be an initials object created with `initials()`, or a simple list of initials objects + Code + mcmc(m, initial_values = FALSE, verbose = FALSE) + Condition + Error in `prep_initials()`: + ! `initial_values` must be an initials object created with `initials()`, or a simple list of initials objects --- - `initial_values` must be an initials object created with `initials()`, or a simple list of initials objects + Code + mcmc(m, initial_values = list(FALSE), verbose = FALSE) + Condition + Error in `prep_initials()`: + ! `initial_values` must be an initials object created with `initials()`, or a simple list of initials objects --- - Some s passed to `initials()` are not associated with the model: - `g` + Code + mcmc(m, chains = 1, initial_values = initials(g = 1), verbose = FALSE) + Condition + Error in `check_greta_arrays_associated_with_model()`: + ! 
Some s passed to `initials()` are not associated with the model: + `g` --- - Initial values can only be set for variable s + Code + mcmc(m, chains = 1, initial_values = initials(f = 1), verbose = FALSE) + Condition + Error in `check_nodes_all_variable()`: + ! Initial values can only be set for variable s --- - Initial values can only be set for variable s + Code + mcmc(m, chains = 1, initial_values = initials(z = 1), verbose = FALSE) + Condition + Error in `check_nodes_all_variable()`: + ! Initial values can only be set for variable s --- - Some provided initial values are outside the range of values their variables can take + Code + mcmc(m, chains = 1, initial_values = initials(b = -1), verbose = FALSE) + Condition + Error in `unsupported_error()`: + ! Some provided initial values are outside the range of values their variables can take --- - Some provided initial values are outside the range of values their variables can take + Code + mcmc(m, chains = 1, initial_values = initials(d = -1), verbose = FALSE) + Condition + Error in `unsupported_error()`: + ! Some provided initial values are outside the range of values their variables can take --- - Some provided initial values are outside the range of values their variables can take + Code + mcmc(m, chains = 1, initial_values = initials(e = 2), verbose = FALSE) + Condition + Error in `unsupported_error()`: + ! Some provided initial values are outside the range of values their variables can take # samplers print informatively diff --git a/tests/testthat/_snaps/install_greta_deps.md b/tests/testthat/_snaps/install_greta_deps.md index 77fffd6e..853032d9 100644 --- a/tests/testthat/_snaps/install_greta_deps.md +++ b/tests/testthat/_snaps/install_greta_deps.md @@ -1,4 +1,12 @@ # install_greta_deps errors appropriately - Stopping as installation of greta dependencies took longer than 0.001 minutes You can increase the timeout time by increasing the `timeout` argument. 
For example, to wait 5 minutes: `install_greta_deps(timeout = 5)` Alternatively, you can perform the entire installation with: `reticulate::install_miniconda()` Then: `reticulate::conda_create(envname = 'greta-env-tf2', python_version = '3.8')` Then: `reticulate::py_install( packages = c( 'numpy', 'tensorflow', 'tensorflow-probability' ), pip = TRUE )` Then, restart R, and load greta with: `library(greta)` + Code + install_greta_deps(timeout = 0.001) + Message + i Installing python modules into 'greta-env-tf2' conda environment, this may ... + x Installing python modules into 'greta-env-tf2' conda environment, this may ... + + Condition + Error in `new_install_process()`: + ! Stopping as installation of greta dependencies took longer than 0.001 minutes You can increase the timeout time by increasing the `timeout` argument. For example, to wait 5 minutes: `install_greta_deps(timeout = 5)` Alternatively, you can perform the entire installation with: `reticulate::install_miniconda()` Then: `reticulate::conda_create(envname = 'greta-env-tf2', python_version = '3.8')` Then: `reticulate::py_install( packages = c( 'numpy', 'tensorflow', 'tensorflow-probability' ), pip = TRUE )` Then, restart R, and load greta with: `library(greta)` diff --git a/tests/testthat/_snaps/joint.md b/tests/testthat/_snaps/joint.md index cee49ad3..73d3cf5c 100644 --- a/tests/testthat/_snaps/joint.md +++ b/tests/testthat/_snaps/joint.md @@ -1,18 +1,34 @@ # joint of fixed and continuous distributions errors - Cannot construct a joint distribution from a combination of discrete and continuous distributions + Code + joint(bernoulli(0.5), normal(0, 1)) + Condition + Error in `initialize()`: + ! Cannot construct a joint distribution from a combination of discrete and continuous distributions # joint with insufficient distributions errors - `joint()` must be passed at least 2 distributions - The number of distributions passed was: 1 + Code + joint(normal(0, 2)) + Condition + Error in `initialize()`: + ! 
`joint()` must be passed at least 2 distributions + The number of distributions passed was: 1 --- - `joint()` must be passed at least 2 distributions - The number of distributions passed was: 0 + Code + joint() + Condition + Error in `initialize()`: + ! `joint()` must be passed at least 2 distributions + The number of distributions passed was: 0 # joint with non-scalar distributions errors - `joint()` only accepts probability distributions over scalars + Code + joint(normal(0, 2, dim = 3), normal(0, 1, dim = 3)) + Condition + Error in `initialize()`: + ! `joint()` only accepts probability distributions over scalars diff --git a/tests/testthat/_snaps/misc.md b/tests/testthat/_snaps/misc.md index ed413f69..426c5b65 100644 --- a/tests/testthat/_snaps/misc.md +++ b/tests/testthat/_snaps/misc.md @@ -1,12 +1,16 @@ # check_tf_version works - x The expected python packages are not available - i We recommend installing them (in a fresh R session) with: - `install_greta_deps()` - or - `reinstall_greta_deps()` - (Note: Your R session should not have initialised Tensorflow yet.) - i For more information, see `?install_greta_deps` + Code + check_tf_version("error") + Condition + Error in `check_tf_version()`: + ! x The expected python packages are not available + i We recommend installing them (in a fresh R session) with: + `install_greta_deps()` + or + `reinstall_greta_deps()` + (Note: Your R session should not have initialised Tensorflow yet.) + i For more information, see `?install_greta_deps` --- @@ -33,43 +37,83 @@ # define and mcmc error informatively - none of the s in the model are associated with a probability density, so a model cannot be defined + Code + model(variable()) + Condition + Error in `model()`: + ! none of the s in the model are associated with a probability density, so a model cannot be defined --- - none of the s in the model are associated with a probability density, so a model cannot be defined + Code + model(x) + Condition + Error in `model()`: + ! 
none of the s in the model are associated with a probability density, so a model cannot be defined --- - could not find any non-data s + Code + model() + Condition + Error in `model()`: + ! could not find any non-data s --- - Model contains a discrete random variable that doesn't have a fixed value, so inference cannot be carried out. + Code + model(bernoulli(0.5)) + Condition + Error in `check_unfixed_discrete_distributions()`: + ! Model contains a discrete random variable that doesn't have a fixed value, so inference cannot be carried out. --- - none of the s in the model are unknown, so a model cannot be defined + Code + model(x) + Condition + Error in `model()`: + ! none of the s in the model are unknown, so a model cannot be defined --- - Data s cannot be sampled - `x` is a data (s) + Code + draws <- mcmc(m, verbose = FALSE) + Condition + Error in `mcmc()`: + ! Data s cannot be sampled + `x` is a data (s) # check_dims errors informatively - incompatible dimensions: 3x3, 2x2 + Code + greta:::check_dims(a, c) + Condition + Error: + ! incompatible dimensions: 3x3, 2x2 # disjoint graphs are checked - the model contains 2 disjoint graphs one or more of these sub-graphs does not contain any s that are associated with a probability density, so a model cannot be defined + Code + m <- model(a, b, c) + Condition + Error in `model()`: + ! the model contains 2 disjoint graphs one or more of these sub-graphs does not contain any s that are associated with a probability density, so a model cannot be defined --- - the model contains 2 disjoint graphs one or more of these sub-graphs does not contain any s that are unknown, so a model cannot be defined + Code + m <- model(a, b, d) + Condition + Error in `model()`: + ! the model contains 2 disjoint graphs one or more of these sub-graphs does not contain any s that are unknown, so a model cannot be defined # cleanly() handles TF errors nicely - greta hit a tensorflow error: - Error in other_stop(): Fetchez la vache! 
+ Code + cleanly(other_stop()) + Condition + Error in `cleanly()`: + ! greta hit a tensorflow error: + Error in other_stop(): Fetchez la vache! diff --git a/tests/testthat/_snaps/mixture.md b/tests/testthat/_snaps/mixture.md index 5a44f127..b362bcf0 100644 --- a/tests/testthat/_snaps/mixture.md +++ b/tests/testthat/_snaps/mixture.md @@ -1,35 +1,64 @@ # mixtures of fixed and continuous distributions errors - Cannot construct a mixture distribution from a combination of discrete and continuous distributions + Code + mixture(bernoulli(0.5), normal(0, 1), weights = weights) + Condition + Error in `initialize()`: + ! Cannot construct a mixture distribution from a combination of discrete and continuous distributions # mixtures of multivariate and univariate errors - Cannot construct a mixture from a combination of multivariate and univariate distributions + Code + mixture(multivariate_normal(zeros(1, 3), diag(3)), normal(0, 1, dim = c(1, 3)), + weights = weights) + Condition + Error in `initialize()`: + ! Cannot construct a mixture from a combination of multivariate and univariate distributions # mixtures of supports errors - Component distributions must have the same support - However the component distributions have different support: - "0 to Inf vs. -Inf to Inf" + Code + mixture(normal(0, 1, truncation = c(0, Inf)), normal(0, 1), weights = weights) + Condition + Error in `initialize()`: + ! Component distributions must have the same support + However the component distributions have different support: + "0 to Inf vs. -Inf to Inf" --- - Component distributions must have the same support - However the component distributions have different support: - "0 to Inf vs. -Inf to Inf" + Code + mixture(lognormal(0, 1), normal(0, 1), weights = weights) + Condition + Error in `initialize()`: + ! Component distributions must have the same support + However the component distributions have different support: + "0 to Inf vs. 
-Inf to Inf" # incorrectly-shaped weights errors - The first dimension of weights must be the number of distributions in the mixture (2) - However it was 1 + Code + mixture(normal(0, 1), normal(0, 2), weights = weights) + Condition + Error in `initialize()`: + ! The first dimension of weights must be the number of distributions in the mixture (2) + However it was 1 # mixtures with insufficient distributions errors - `mixture()` must be passed at least 2 distributions - The number of distributions passed was: 1 + Code + mixture(normal(0, 2), weights = weights) + Condition + Error in `initialize()`: + ! `mixture()` must be passed at least 2 distributions + The number of distributions passed was: 1 --- - `mixture()` must be passed at least 2 distributions - The number of distributions passed was: 0 + Code + mixture(weights = weights) + Condition + Error in `initialize()`: + ! `mixture()` must be passed at least 2 distributions + The number of distributions passed was: 0 diff --git a/tests/testthat/_snaps/operators.md b/tests/testthat/_snaps/operators.md index 866f04bb..832be565 100644 --- a/tests/testthat/_snaps/operators.md +++ b/tests/testthat/_snaps/operators.md @@ -1,13 +1,21 @@ # %*% errors informatively - Incompatible dimensions: "3x4" vs "1x4" + Code + a %*% b + Condition + Error in `a %*% b`: + ! Incompatible dimensions: "3x4" vs "1x4" --- - Only two-dimensional s can be matrix-multiplied - Dimensions for each are: - `x`: "3x4" - `y`: "2x2x2" + Code + a %*% c + Condition + Error in `a %*% c`: + ! Only two-dimensional s can be matrix-multiplied + Dimensions for each are: + `x`: "3x4" + `y`: "2x2x2" # %*% works when one is a non-greta array diff --git a/tests/testthat/_snaps/opt.md b/tests/testthat/_snaps/opt.md index 4e5fe356..d0a22729 100644 --- a/tests/testthat/_snaps/opt.md +++ b/tests/testthat/_snaps/opt.md @@ -25,56 +25,92 @@ # opt fails with defunct optimisers - The optimiser, `powell()`, is defunct and has been removed in greta 0.5.0. 
- Please use a different optimiser. - See `?optimisers` for detail on which optimizers are removed. + Code + o <- opt(m, optimiser = powell()) + Condition + Error in `optimiser_defunct_error()`: + ! The optimiser, `powell()`, is defunct and has been removed in greta 0.5.0. + Please use a different optimiser. + See `?optimisers` for detail on which optimizers are removed. --- - The optimiser, `momentum()`, is defunct and has been removed in greta 0.5.0. - Please use a different optimiser. - See `?optimisers` for detail on which optimizers are removed. + Code + o <- opt(m, optimiser = momentum()) + Condition + Error in `optimiser_defunct_error()`: + ! The optimiser, `momentum()`, is defunct and has been removed in greta 0.5.0. + Please use a different optimiser. + See `?optimisers` for detail on which optimizers are removed. --- - The optimiser, `cg()`, is defunct and has been removed in greta 0.5.0. - Please use a different optimiser. - See `?optimisers` for detail on which optimizers are removed. + Code + o <- opt(m, optimiser = cg()) + Condition + Error in `optimiser_defunct_error()`: + ! The optimiser, `cg()`, is defunct and has been removed in greta 0.5.0. + Please use a different optimiser. + See `?optimisers` for detail on which optimizers are removed. --- - The optimiser, `newton_cg()`, is defunct and has been removed in greta 0.5.0. - Please use a different optimiser. - See `?optimisers` for detail on which optimizers are removed. + Code + o <- opt(m, optimiser = newton_cg()) + Condition + Error in `optimiser_defunct_error()`: + ! The optimiser, `newton_cg()`, is defunct and has been removed in greta 0.5.0. + Please use a different optimiser. + See `?optimisers` for detail on which optimizers are removed. --- - The optimiser, `l_bfgs_b()`, is defunct and has been removed in greta 0.5.0. - Please use a different optimiser. - See `?optimisers` for detail on which optimizers are removed. 
+ Code + o <- opt(m, optimiser = l_bfgs_b()) + Condition + Error in `optimiser_defunct_error()`: + ! The optimiser, `l_bfgs_b()`, is defunct and has been removed in greta 0.5.0. + Please use a different optimiser. + See `?optimisers` for detail on which optimizers are removed. --- - The optimiser, `tnc()`, is defunct and has been removed in greta 0.5.0. - Please use a different optimiser. - See `?optimisers` for detail on which optimizers are removed. + Code + o <- opt(m, optimiser = tnc()) + Condition + Error in `optimiser_defunct_error()`: + ! The optimiser, `tnc()`, is defunct and has been removed in greta 0.5.0. + Please use a different optimiser. + See `?optimisers` for detail on which optimizers are removed. --- - The optimiser, `cobyla()`, is defunct and has been removed in greta 0.5.0. - Please use a different optimiser. - See `?optimisers` for detail on which optimizers are removed. + Code + o <- opt(m, optimiser = cobyla()) + Condition + Error in `optimiser_defunct_error()`: + ! The optimiser, `cobyla()`, is defunct and has been removed in greta 0.5.0. + Please use a different optimiser. + See `?optimisers` for detail on which optimizers are removed. --- - The optimiser, `slsqp()`, is defunct and has been removed in greta 0.5.0. - Please use a different optimiser. - See `?optimisers` for detail on which optimizers are removed. + Code + o <- opt(m, optimiser = slsqp()) + Condition + Error in `optimiser_defunct_error()`: + ! The optimiser, `slsqp()`, is defunct and has been removed in greta 0.5.0. + Please use a different optimiser. + See `?optimisers` for detail on which optimizers are removed. # TF opt with `gradient_descent` fails with bad initial values - Detected numerical overflow during optimisation - Please try one of the following: - i Using different initial values - i Using another optimiser. 
(E.g., instead of `gradient_descent()`, try `adam()`) + Code + o <- opt(m, hessian = TRUE, optimiser = gradient_descent()) + Condition + Error in `self$run_minimiser()`: + ! Detected numerical overflow during optimisation + Please try one of the following: + i Using different initial values + i Using another optimiser. (E.g., instead of `gradient_descent()`, try `adam()`) diff --git a/tests/testthat/_snaps/simulate.md b/tests/testthat/_snaps/simulate.md index bc8d24ff..f29df338 100644 --- a/tests/testthat/_snaps/simulate.md +++ b/tests/testthat/_snaps/simulate.md @@ -1,23 +1,43 @@ # simulate errors if distribution-free variables are not fixed - the target s are related to variables that do not have distributions so cannot be sampled + Code + sims <- simulate(m) + Condition + Error in `calculate_list()`: + ! the target s are related to variables that do not have distributions so cannot be sampled # simulate errors if a distribution cannot be sampled from - Sampling is not yet implemented for "hypergeometric" distributions + Code + sims <- simulate(m) + Condition + Error in `check_sampling_implemented()`: + ! Sampling is not yet implemented for "hypergeometric" distributions # simulate errors nicely if nsim is invalid - nsim must be a positive integer - However the value provided was: 0 + Code + simulate(m, nsim = 0) + Condition + Error: + ! nsim must be a positive integer + However the value provided was: 0 --- - nsim must be a positive integer - However the value provided was: -1 + Code + simulate(m, nsim = -1) + Condition + Error: + ! nsim must be a positive integer + However the value provided was: -1 --- - nsim must be a positive integer - However the value provided was: NA + Code + simulate(m, nsim = "five") + Condition + Error: + ! 
nsim must be a positive integer + However the value provided was: NA diff --git a/tests/testthat/_snaps/syntax.md b/tests/testthat/_snaps/syntax.md index d349a233..9b72d38a 100644 --- a/tests/testthat/_snaps/syntax.md +++ b/tests/testthat/_snaps/syntax.md @@ -1,39 +1,75 @@ # `distribution<-` errors informatively - right hand side must be a + Code + distribution(y) <- x + Condition + Error in `distribution<-`: + ! right hand side must be a --- - right hand side must have a distribution + Code + distribution(y) <- as_data(x) + Condition + Error in `distribution<-`: + ! right hand side must have a distribution --- - right hand side must have a distribution + Code + distribution(y) <- variable() + Condition + Error in `distribution<-`: + ! right hand side must have a distribution --- - left and right hand sides have different dimensions. - The distribution must have dimension of either "3x3x2" or "1x1",but instead has dimension "3x3x1" + Code + distribution(y) <- normal(0, 1, dim = c(3, 3, 1)) + Condition + Error in `distribution<-`: + ! left and right hand sides have different dimensions. + The distribution must have dimension of either "3x3x2" or "1x1",but instead has dimension "3x3x1" --- - left hand side already has a distribution assigned + Code + distribution(y_) <- normal(0, 1) + Condition + Error in `distribution<-`: + ! left hand side already has a distribution assigned --- - right hand side has already been assigned fixed values + Code + distribution(y2) <- y1 + Condition + Error in `distribution<-`: + ! right hand side has already been assigned fixed values --- - distributions can only be assigned to data s + Code + distribution(z) <- normal(0, 1) + Condition + Error in `distribution<-`: + ! distributions can only be assigned to data s --- - distributions can only be assigned to data s + Code + distribution(z2) <- normal(0, 1) + Condition + Error in `distribution<-`: + ! 
distributions can only be assigned to data s --- - distributions can only be assigned to data s + Code + distribution(z2) <- normal(0, 1) + Condition + Error in `distribution<-`: + ! distributions can only be assigned to data s # distribution() errors informatively diff --git a/tests/testthat/_snaps/transforms.md b/tests/testthat/_snaps/transforms.md index 879c6453..1141020a 100644 --- a/tests/testthat/_snaps/transforms.md +++ b/tests/testthat/_snaps/transforms.md @@ -1,5 +1,9 @@ # imultilogit errors informatively - `x must be two dimensional` - However, `x` has dimensions: 3x4x3 + Code + imultilogit(x) + Condition + Error in `imultilogit()`: + ! `x must be two dimensional` + However, `x` has dimensions: 3x4x3 diff --git a/tests/testthat/_snaps/truncated.md b/tests/testthat/_snaps/truncated.md index 5379a6b5..a40c6aaa 100644 --- a/tests/testthat/_snaps/truncated.md +++ b/tests/testthat/_snaps/truncated.md @@ -1,11 +1,19 @@ # bad truncations error - lower bound must be 0 or higher - lower bound is: -1 + Code + lognormal(0, 1, truncation = c(-1, Inf)) + Condition + Error in `initialize()`: + ! lower bound must be 0 or higher + lower bound is: -1 --- - lower and upper bounds must be between 0 and 1 - lower bound is: -1 - upper bound is: 2 + Code + beta(1, 1, truncation = c(-1, 2)) + Condition + Error in `initialize()`: + ! lower and upper bounds must be between 0 and 1 + lower bound is: -1 + upper bound is: 2 diff --git a/tests/testthat/_snaps/variables.md b/tests/testthat/_snaps/variables.md index a9d68148..8eaef750 100644 --- a/tests/testthat/_snaps/variables.md +++ b/tests/testthat/_snaps/variables.md @@ -1,44 +1,68 @@ # variable() errors informatively - lower and upper must be numeric - lower has class: numeric - lower has length: 1 - upper has class: logical - upper has length: 1 + Code + variable(upper = NA) + Condition + Error in `initialize()`: + ! 
lower and upper must be numeric + lower has class: numeric + lower has length: 1 + upper has class: logical + upper has length: 1 --- - lower and upper must be numeric - lower has class: numeric - lower has length: 1 - upper has class: function - upper has length: 1 + Code + variable(upper = head) + Condition + Error in `initialize()`: + ! lower and upper must be numeric + lower has class: numeric + lower has length: 1 + upper has class: function + upper has length: 1 --- - lower and upper must be numeric - lower has class: logical - lower has length: 1 - upper has class: numeric - upper has length: 1 + Code + variable(lower = NA) + Condition + Error in `initialize()`: + ! lower and upper must be numeric + lower has class: logical + lower has length: 1 + upper has class: numeric + upper has length: 1 --- - lower and upper must be numeric - lower has class: function - lower has length: 1 - upper has class: numeric - upper has length: 1 + Code + variable(lower = head) + Condition + Error in `initialize()`: + ! lower and upper must be numeric + lower has class: function + lower has length: 1 + upper has class: numeric + upper has length: 1 --- - incompatible dimensions: 3x1, 2x1 + Code + variable(lower = 0:2, upper = 1:2) + Condition + Error in `initialize()`: + ! incompatible dimensions: 3x1, 2x1 --- - upper bounds must be greater than lower bounds - lower is: 1 - upper is: 1 + Code + variable(lower = 1, upper = 1) + Condition + Error in `initialize()`: + ! 
upper bounds must be greater than lower bounds + lower is: 1 + upper is: 1 # constrained variable constructors error informatively diff --git a/tests/testthat/test-zzzzzz.R b/tests/testthat/test-zzzzzz.R index df70a60d..9c681b00 100644 --- a/tests/testthat/test-zzzzzz.R +++ b/tests/testthat/test-zzzzzz.R @@ -10,7 +10,7 @@ # mockery::stub(check_tf_version, 'have_tf', FALSE) # mockery::stub(check_tf_version, 'have_tfp', FALSE) # -# expect_snapshot_error( +# expect_snapshot(error = TRUE, # check_tf_version("error") # ) # diff --git a/tests/testthat/test_calculate.R b/tests/testthat/test_calculate.R index d64dff18..c30ca954 100644 --- a/tests/testthat/test_calculate.R +++ b/tests/testthat/test_calculate.R @@ -185,7 +185,7 @@ test_that("stochastic calculate works with greta_mcmc_list objects", { ) # this should error without nsim being specified (y is stochastic) - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_a <- calculate(a, y, values = draws) ) @@ -240,7 +240,7 @@ test_that("calculate errors if the mcmc samples unrelated to target", { c <- normal(0, 1) - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_c <- calculate(c, values = draws) ) }) @@ -270,7 +270,7 @@ test_that("stochastic calculate works with mcmc samples & new stochastics", { # this should error without nsim being specified (b is stochastic and not # given by draws) - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_b <- calculate(b, values = draws) ) @@ -287,12 +287,12 @@ test_that("calculate errors nicely if non-greta arrays are passed", { y <- a * x # it should error nicely - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_y <- calculate(y, x, values = list(x = c(2, 1))) ) # and a hint for this common error - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_y <- calculate(y, list(x = c(2, 1))) ) @@ -306,7 +306,7 @@ test_that("calculate errors nicely if values for stochastics not passed", { y <- a * x # it should error nicely - 
expect_snapshot_error( + expect_snapshot(error = TRUE, calc_y <- calculate(y, values = list(x = c(2, 1))) ) @@ -322,7 +322,7 @@ test_that("calculate errors nicely if values have incorrect dimensions", { y <- a * x # it should error nicely - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_y <- calculate(y, values = list(a = c(1, 1))) ) }) @@ -365,13 +365,13 @@ test_that("calculate errors nicely with invalid batch sizes", { draws <- mcmc(m, warmup = 0, n_samples = samples, verbose = FALSE) # variable valid batch sizes - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_y <- calculate(y, values = draws, trace_batch_size = 0) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_y <- calculate(y, values = draws, trace_batch_size = NULL) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_y <- calculate(y, values = draws, trace_batch_size = NA) ) }) @@ -440,7 +440,7 @@ test_that("calculate errors if distribution-free variables are not fixed", { # fix variable a <- variable() y <- normal(a, 1) - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_a <- calculate(a, y, nsim = 1) ) }) @@ -450,7 +450,7 @@ test_that("calculate errors if a distribution cannot be sampled from", { # fix variable y <- hypergeometric(5, 3, 2) - expect_snapshot_error( + expect_snapshot(error = TRUE, sims <- calculate(y, nsim = 1) ) }) @@ -460,15 +460,15 @@ test_that("calculate errors nicely if nsim is invalid", { x <- normal(0, 1) - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_x <- calculate(x, nsim = 0) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_x <- calculate(x, nsim = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, calc_x <- calculate(x, nsim = "five") ) }) diff --git a/tests/testthat/test_distributions.R b/tests/testthat/test_distributions.R index ab485133..9b974b87 100644 --- a/tests/testthat/test_distributions.R +++ b/tests/testthat/test_distributions.R @@ -733,21 +733,21 @@ 
test_that("poisson() and binomial() error informatively in glm", { skip_if_not(check_tf_version()) # if passed as an object - expect_snapshot_error( + expect_snapshot(error = TRUE, glm(1 ~ 1, family = poisson) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, glm(1 ~ 1, family = binomial) ) # if executed alone - expect_snapshot_error( + expect_snapshot(error = TRUE, glm(1 ~ 1, family = poisson()) ) # if given a link - expect_snapshot_error( + expect_snapshot(error = TRUE, glm(1 ~ 1, family = poisson("sqrt")) ) }) @@ -785,27 +785,27 @@ test_that("lkj_correlation distribution errors informatively", { "greta_array" )) - expect_snapshot_error( + expect_snapshot(error = TRUE, lkj_correlation(-1, dim) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, lkj_correlation(c(3, 3), dim) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, lkj_correlation(uniform(0, 1, dim = 2), dim) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, lkj_correlation(4, dimension = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, lkj_correlation(4, dim = c(3, 3)) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, lkj_correlation(4, dim = NA) ) }) @@ -835,11 +835,11 @@ test_that("multivariate_normal distribution errors informatively", { )) # bad means - expect_snapshot_error( + expect_snapshot(error = TRUE, multivariate_normal(m_c, a) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, multivariate_normal(m_d, a) ) @@ -850,39 +850,39 @@ test_that("multivariate_normal distribution errors informatively", { )) # bad sigmas - expect_snapshot_error( + expect_snapshot(error = TRUE, multivariate_normal(m_a, b) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, multivariate_normal(m_a, c) ) # mismatched parameters - expect_snapshot_error( + expect_snapshot(error = TRUE, multivariate_normal(m_a, d) ) # scalars - expect_snapshot_error( + expect_snapshot(error = TRUE, multivariate_normal(0, 1) ) # bad n_realisations - expect_snapshot_error( 
+ expect_snapshot(error = TRUE, multivariate_normal(m_a, a, n_realisations = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, multivariate_normal(m_a, a, n_realisations = c(1, 3)) ) # bad dimension - expect_snapshot_error( + expect_snapshot(error = TRUE, multivariate_normal(m_a, a, dimension = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, multivariate_normal(m_a, a, dimension = c(1, 3)) ) }) @@ -917,25 +917,25 @@ test_that("multinomial distribution errors informatively", { )) # scalars - expect_snapshot_error( + expect_snapshot(error = TRUE, multinomial(c(1), 1) ) # bad n_realisations - expect_snapshot_error( + expect_snapshot(error = TRUE, multinomial(10, p_a, n_realisations = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, multinomial(10, p_a, n_realisations = c(1, 3)) ) # bad dimension - expect_snapshot_error( + expect_snapshot(error = TRUE, multinomial(10, p_a, dimension = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, multinomial(10, p_a, dimension = c(1, 3)) ) }) @@ -958,25 +958,25 @@ test_that("categorical distribution errors informatively", { )) # scalars - expect_snapshot_error( + expect_snapshot(error = TRUE, categorical(1), ) # bad n_realisations - expect_snapshot_error( + expect_snapshot(error = TRUE, categorical(p_a, n_realisations = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, categorical(p_a, n_realisations = c(1, 3)) ) # bad dimension - expect_snapshot_error( + expect_snapshot(error = TRUE, categorical(p_a, dimension = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, categorical(p_a, dimension = c(1, 3)) ) }) @@ -1000,25 +1000,25 @@ test_that("dirichlet distribution errors informatively", { )) # scalars - expect_snapshot_error( + expect_snapshot(error = TRUE, dirichlet(1), ) # bad n_realisations - expect_snapshot_error( + expect_snapshot(error = TRUE, dirichlet(alpha_a, n_realisations = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, 
dirichlet(alpha_a, n_realisations = c(1, 3)) ) # bad dimension - expect_snapshot_error( + expect_snapshot(error = TRUE, dirichlet(alpha_a, dimension = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, dirichlet(alpha_a, dimension = c(1, 3)) ) }) @@ -1066,25 +1066,25 @@ test_that("dirichlet-multinomial distribution errors informatively", { )) # scalars - expect_snapshot_error( + expect_snapshot(error = TRUE, dirichlet_multinomial(c(1), 1) ) # bad n_realisations - expect_snapshot_error( + expect_snapshot(error = TRUE, dirichlet_multinomial(10, alpha_a, n_realisations = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, dirichlet_multinomial(10, alpha_a, n_realisations = c(1, 3)) ) # bad dimension - expect_snapshot_error( + expect_snapshot(error = TRUE, dirichlet_multinomial(10, alpha_a, dimension = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, dirichlet_multinomial(10, alpha_a, dimension = c(1, 3)) ) }) diff --git a/tests/testthat/test_extract_replace_combine.R b/tests/testthat/test_extract_replace_combine.R index f6611d46..9c84664d 100644 --- a/tests/testthat/test_extract_replace_combine.R +++ b/tests/testthat/test_extract_replace_combine.R @@ -388,11 +388,11 @@ test_that("abind errors informatively", { b <- ones(1, 1, 3) c <- ones(5, 1, 1) - expect_snapshot_error( + expect_snapshot(error = TRUE, abind(a, b) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, abind(a, c, along = 5) ) @@ -425,7 +425,7 @@ test_that("assign errors on variable greta arrays", { skip_if_not(check_tf_version()) z <- normal(0, 1, dim = 5) - expect_snapshot_error( + expect_snapshot(error = TRUE, z[1] <- 3 ) }) @@ -436,11 +436,11 @@ test_that("rbind and cbind give informative error messages", { a <- as_data(randn(5, 1)) b <- as_data(randn(1, 5)) - expect_snapshot_error( + expect_snapshot(error = TRUE, rbind(a, b) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, cbind(a, b) ) }) @@ -449,16 +449,16 @@ test_that("replacement gives 
informative error messages", { skip_if_not(check_tf_version()) x <- ones(2, 2, 2) - expect_snapshot_error( + expect_snapshot(error = TRUE, x[1:2, , 1] <- seq_len(3) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, x[1, 1, 3] <- 1 ) x <- ones(2) - expect_snapshot_error( + expect_snapshot(error = TRUE, x[3] <- 1 ) }) @@ -467,12 +467,12 @@ test_that("extraction gives informative error messages", { skip_if_not(check_tf_version()) x <- ones(2, 2, 2) - expect_snapshot_error( + expect_snapshot(error = TRUE, x[1, 1, 3] ) x <- ones(2) - expect_snapshot_error( + expect_snapshot(error = TRUE, x[3] ) }) @@ -603,19 +603,19 @@ test_that("dim<- errors as expected", { x <- zeros(3, 4) - expect_snapshot_error( + expect_snapshot(error = TRUE, dim(x) <- pi[0] ) - expect_snapshot_error( + expect_snapshot(error = TRUE, dim(x) <- c(1, NA) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, dim(x) <- c(1, -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, dim(x) <- 13 ) }) diff --git a/tests/testthat/test_functions.R b/tests/testthat/test_functions.R index 0a4f2d50..d04c43f7 100644 --- a/tests/testthat/test_functions.R +++ b/tests/testthat/test_functions.R @@ -69,7 +69,7 @@ test_that("cummax and cummin functions error informatively", { x <- as_data(randn(10)) for (fun in cumulative_funs) { - expect_snapshot_error( + expect_snapshot(error = TRUE, fun(x) ) } @@ -82,7 +82,7 @@ test_that("complex number functions error informatively", { x <- as_data(randn(25, 4)) for (fun in complex_funs) { - expect_snapshot_error( + expect_snapshot(error = TRUE, fun(x) ) } @@ -244,19 +244,19 @@ test_that("cumulative functions error as expected", { b <- as_data(randn(5, 1, 1)) - expect_snapshot_error( + expect_snapshot(error = TRUE, cumsum(a) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, cumsum(b) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, cumprod(a) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, cumprod(b) ) @@ -384,19 +384,19 @@ 
test_that("colSums etc. error as expected", { x <- as_data(randn(3, 4, 5)) - expect_snapshot_error( + expect_snapshot(error = TRUE, colSums(x, dims = 3) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, rowSums(x, dims = 3) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, colMeans(x, dims = 3) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, rowMeans(x, dims = 3) ) @@ -409,19 +409,19 @@ test_that("forwardsolve and backsolve error as expected", { b <- as_data(randn(5, 25)) c <- chol(a) - expect_snapshot_error( + expect_snapshot(error = TRUE, forwardsolve(a, b, k = 1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, backsolve(a, b, k = 1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, forwardsolve(a, b, transpose = TRUE) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, backsolve(a, b, transpose = TRUE) ) @@ -435,12 +435,12 @@ test_that("tapply errors as expected", { b <- ones(10, 2) # X must be a column vector - expect_snapshot_error( + expect_snapshot(error = TRUE, tapply(b, group, "sum") ) # INDEX can't be a greta array - expect_snapshot_error( + expect_snapshot(error = TRUE, tapply(a, as_data(group), "sum") ) }) @@ -495,7 +495,7 @@ test_that("ignored options are errored/warned about", { skip_if_not(check_tf_version()) x <- ones(3, 3) - expect_snapshot_error( + expect_snapshot(error = TRUE, round(x, 2) ) @@ -523,39 +523,39 @@ test_that("incorrect dimensions are errored about", { x <- ones(3, 3, 3) y <- ones(3, 4) - expect_snapshot_error( + expect_snapshot(error = TRUE, t(x) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, aperm(x, 2:1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, chol(x) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, chol(y) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, chol2symm(x) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, chol2symm(y) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, eigen(x) ) - 
expect_snapshot_error( + expect_snapshot(error = TRUE, eigen(y) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, rdist(x, y) ) }) diff --git a/tests/testthat/test_future.R b/tests/testthat/test_future.R index a7203478..0dde1f16 100644 --- a/tests/testthat/test_future.R +++ b/tests/testthat/test_future.R @@ -41,7 +41,7 @@ test_that("mcmc errors for invalid parallel plans", { ) future::plan(future::multicore) - expect_snapshot_error( + expect_snapshot(error = TRUE, check_future_plan() ) @@ -49,7 +49,7 @@ test_that("mcmc errors for invalid parallel plans", { if (.Platform$OS.type != "windows"){ cl <- parallel::makeCluster(2L, type = "FORK") future::plan(future::cluster, workers = cl) - expect_snapshot_error( + expect_snapshot(error = TRUE, check_future_plan() ) } @@ -89,13 +89,13 @@ test_that("mcmc errors for invalid parallel plans", { withr::defer(future::plan(op)) future::plan(future::multicore) - expect_snapshot_error( + expect_snapshot(error = TRUE, mcmc(m, verbose = FALSE) ) cl <- parallel::makeForkCluster(2L) future::plan(future::cluster, workers = cl) - expect_snapshot_error( + expect_snapshot(error = TRUE, mcmc(m, verbose = FALSE) ) diff --git a/tests/testthat/test_iid_samples.R b/tests/testthat/test_iid_samples.R index 23388b88..8c2d547a 100644 --- a/tests/testthat/test_iid_samples.R +++ b/tests/testthat/test_iid_samples.R @@ -284,7 +284,7 @@ test_that("distributions without RNG error nicely", { skip_if_not(check_tf_version()) # univariate - expect_snapshot_error( + expect_snapshot(error = TRUE, compare_iid_samples(hypergeometric, rhyper, parameters = list(m = 11, n = 8, k = 5) @@ -292,7 +292,7 @@ test_that("distributions without RNG error nicely", { ) # truncated RNG not implemented - expect_snapshot_error( + expect_snapshot(error = TRUE, compare_iid_samples(f, rtf, parameters = list( diff --git a/tests/testthat/test_inference.R b/tests/testthat/test_inference.R index 9c0e6962..b70f2f15 100644 --- a/tests/testthat/test_inference.R +++ 
b/tests/testthat/test_inference.R @@ -15,7 +15,7 @@ test_that("bad mcmc proposals are rejected", { ) expect_match(out, "100% bad") - expect_snapshot_error( + expect_snapshot(error = TRUE, draws <- mcmc(m, chains = 1, n_samples = 2, @@ -30,7 +30,7 @@ test_that("bad mcmc proposals are rejected", { z <- normal(-1e120, 1e-120) distribution(x) <- normal(z, 1e-120) m <- model(z, precision = "single") - expect_snapshot_error( + expect_snapshot(error = TRUE, mcmc(m, chains = 1, n_samples = 1, warmup = 0, verbose = FALSE) ) @@ -119,7 +119,7 @@ test_that("mcmc handles initial values nicely", { # too many sets of initial values inits <- replicate(3, initials(z = rnorm(1)), simplify = FALSE) - expect_snapshot_error( + expect_snapshot(error = TRUE, draws <- mcmc(m, warmup = 10, n_samples = 10, verbose = FALSE, chains = 2, initial_values = inits @@ -128,7 +128,7 @@ test_that("mcmc handles initial values nicely", { # initial values have the wrong length inits <- replicate(2, initials(z = rnorm(2)), simplify = FALSE) - expect_snapshot_error( + expect_snapshot(error = TRUE, draws <- mcmc(m, warmup = 10, n_samples = 10, verbose = FALSE, chains = 2, initial_values = inits @@ -258,7 +258,7 @@ test_that("model errors nicely", { # model should give a nice error if passed something other than a greta array a <- 1 b <- normal(0, a) - expect_snapshot_error( + expect_snapshot(error = TRUE, model(a, b) ) }) @@ -302,11 +302,11 @@ test_that("initials works", { skip_if_not(check_tf_version()) # errors on bad objects - expect_snapshot_error( + expect_snapshot(error = TRUE, initials(a = FALSE) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, initials(FALSE) ) @@ -329,39 +329,39 @@ test_that("prep_initials errors informatively", { m <- model(z) # bad objects: - expect_snapshot_error( + expect_snapshot(error = TRUE, mcmc(m, initial_values = FALSE, verbose = FALSE) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, mcmc(m, initial_values = list(FALSE), verbose = FALSE) ) # an 
unrelated greta array g <- normal(0, 1) - expect_snapshot_error( + expect_snapshot(error = TRUE, mcmc(m, chains = 1, initial_values = initials(g = 1), verbose = FALSE) ) # non-variable greta arrays - expect_snapshot_error( + expect_snapshot(error = TRUE, mcmc(m, chains = 1, initial_values = initials(f = 1), verbose = FALSE) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, mcmc(m, chains = 1, initial_values = initials(z = 1), verbose = FALSE) ) # out of bounds errors - expect_snapshot_error( + expect_snapshot(error = TRUE, mcmc(m, chains = 1, initial_values = initials(b = -1), verbose = FALSE) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, mcmc(m, chains = 1, initial_values = initials(d = -1), verbose = FALSE) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, mcmc(m, chains = 1, initial_values = initials(e = 2), verbose = FALSE) ) }) diff --git a/tests/testthat/test_install_greta_deps.R b/tests/testthat/test_install_greta_deps.R index d078d615..2a239f14 100644 --- a/tests/testthat/test_install_greta_deps.R +++ b/tests/testthat/test_install_greta_deps.R @@ -1,13 +1,13 @@ test_that("install_greta_deps errors appropriately", { skip_if_not(check_tf_version()) - expect_snapshot_error( + expect_snapshot(error = TRUE, install_greta_deps(timeout = 0.001) ) }) # test_that("reinstall_greta_deps errors appropriately", { # skip_if_not(check_tf_version()) -# expect_snapshot_error( +# expect_snapshot(error = TRUE, # reinstall_greta_deps(timeout = 0.001) # ) # }) diff --git a/tests/testthat/test_joint.R b/tests/testthat/test_joint.R index e8923f6b..c2750d98 100644 --- a/tests/testthat/test_joint.R +++ b/tests/testthat/test_joint.R @@ -78,7 +78,7 @@ test_that("fixed discrete joint distributions can be sampled from", { test_that("joint of fixed and continuous distributions errors", { skip_if_not(check_tf_version()) - expect_snapshot_error( + expect_snapshot(error = TRUE, joint( bernoulli(0.5), normal(0, 1) @@ -89,11 +89,11 @@ test_that("joint of fixed 
and continuous distributions errors", { test_that("joint with insufficient distributions errors", { skip_if_not(check_tf_version()) - expect_snapshot_error( + expect_snapshot(error = TRUE, joint(normal(0, 2)) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, joint() ) }) @@ -101,7 +101,7 @@ test_that("joint with insufficient distributions errors", { test_that("joint with non-scalar distributions errors", { skip_if_not(check_tf_version()) - expect_snapshot_error( + expect_snapshot(error = TRUE, joint( normal(0, 2, dim = 3), normal(0, 1, dim = 3) diff --git a/tests/testthat/test_misc.R b/tests/testthat/test_misc.R index ebbafbc1..22865be2 100644 --- a/tests/testthat/test_misc.R +++ b/tests/testthat/test_misc.R @@ -5,7 +5,7 @@ test_that("check_tf_version works", { true_version <- tf$`__version__` tf$`__version__` <- "0.9.0" # nolint - expect_snapshot_error( + expect_snapshot(error = TRUE, check_tf_version("error") ) expect_snapshot_warning( @@ -69,26 +69,26 @@ test_that("define and mcmc error informatively", { x <- as_data(randn(10)) # no model with non-probability density greta arrays - expect_snapshot_error( + expect_snapshot(error = TRUE, model(variable()) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, model(x) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, model() ) # can't define a model for an unfixed discrete variable - expect_snapshot_error( + expect_snapshot(error = TRUE, model(bernoulli(0.5)) ) # no parameters here, so define or dag should error distribution(x) <- normal(0, 1) - expect_snapshot_error( + expect_snapshot(error = TRUE, model(x) ) @@ -108,7 +108,7 @@ test_that("define and mcmc error informatively", { # can't draw samples of a data greta array z <- normal(x, 1) m <- model(x, z) - expect_snapshot_error( + expect_snapshot(error = TRUE, draws <- mcmc(m, verbose = FALSE) ) }) @@ -142,7 +142,7 @@ test_that("check_dims errors informatively", { ) # with two differently shaped arrays it shouldn't - expect_snapshot_error( 
+ expect_snapshot(error = TRUE, greta:::check_dims(a, c) ) @@ -165,14 +165,14 @@ test_that("disjoint graphs are checked", { # c is unrelated and has no density c <- variable() - expect_snapshot_error( + expect_snapshot(error = TRUE, m <- model(a, b, c) ) # d is unrelated and known d <- as_data(randn(3)) distribution(d) <- normal(0, 1) - expect_snapshot_error( + expect_snapshot(error = TRUE, m <- model(a, b, d) ) @@ -220,7 +220,7 @@ test_that("cleanly() handles TF errors nicely", { expect_s3_class(cleanly(inversion_stop()), "error") expect_s3_class(cleanly(cholesky_stop()), "error") - expect_snapshot_error( + expect_snapshot(error = TRUE, cleanly(other_stop()) ) diff --git a/tests/testthat/test_mixture.R b/tests/testthat/test_mixture.R index 173cb916..5dbdbcab 100644 --- a/tests/testthat/test_mixture.R +++ b/tests/testthat/test_mixture.R @@ -43,7 +43,7 @@ test_that("mixtures of fixed and continuous distributions errors", { skip_if_not(check_tf_version()) weights <- uniform(0, 1, dim = 2) - expect_snapshot_error( + expect_snapshot(error = TRUE, mixture( bernoulli(0.5), normal(0, 1), @@ -56,7 +56,7 @@ test_that("mixtures of multivariate and univariate errors", { skip_if_not(check_tf_version()) weights <- uniform(0, 1, dim = 2) - expect_snapshot_error( + expect_snapshot(error = TRUE, mixture( multivariate_normal(zeros(1, 3), diag(3)), normal(0, 1, dim = c(1, 3)), @@ -71,7 +71,7 @@ test_that("mixtures of supports errors", { weights <- c(0.5, 0.5) # due to truncation - expect_snapshot_error( + expect_snapshot(error = TRUE, mixture( normal(0, 1, truncation = c(0, Inf)), normal(0, 1), @@ -80,7 +80,7 @@ test_that("mixtures of supports errors", { ) # due to bounds - expect_snapshot_error( + expect_snapshot(error = TRUE, mixture( lognormal(0, 1), normal(0, 1), @@ -93,7 +93,7 @@ test_that("incorrectly-shaped weights errors", { skip_if_not(check_tf_version()) weights <- uniform(0, 1, dim = c(1, 2)) - expect_snapshot_error( + expect_snapshot(error = TRUE, mixture( normal(0, 1), 
normal(0, 2), @@ -107,14 +107,14 @@ test_that("mixtures with insufficient distributions errors", { weights <- uniform(0, 1) - expect_snapshot_error( + expect_snapshot(error = TRUE, mixture( normal(0, 2), weights = weights ) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, mixture(weights = weights) ) diff --git a/tests/testthat/test_operators.R b/tests/testthat/test_operators.R index df51ab70..94a38fac 100644 --- a/tests/testthat/test_operators.R +++ b/tests/testthat/test_operators.R @@ -99,11 +99,11 @@ test_that("%*% errors informatively", { b <- ones(1, 4) c <- ones(2, 2, 2) - expect_snapshot_error( + expect_snapshot(error = TRUE, a %*% b ) - expect_snapshot_error( + expect_snapshot(error = TRUE, a %*% c ) }) diff --git a/tests/testthat/test_opt.R b/tests/testthat/test_opt.R index b88d1d19..e8ee49ac 100644 --- a/tests/testthat/test_opt.R +++ b/tests/testthat/test_opt.R @@ -86,14 +86,14 @@ test_that("opt fails with defunct optimisers", { m <- model(z) # check that the right ones error about defunct - expect_snapshot_error(o <- opt(m, optimiser = powell())) - expect_snapshot_error(o <- opt(m, optimiser = momentum())) - expect_snapshot_error(o <- opt(m, optimiser = cg())) - expect_snapshot_error(o <- opt(m, optimiser = newton_cg())) - expect_snapshot_error(o <- opt(m, optimiser = l_bfgs_b())) - expect_snapshot_error(o <- opt(m, optimiser = tnc())) - expect_snapshot_error(o <- opt(m, optimiser = cobyla())) - expect_snapshot_error(o <- opt(m, optimiser = slsqp())) + expect_snapshot(error = TRUE,o <- opt(m, optimiser = powell())) + expect_snapshot(error = TRUE,o <- opt(m, optimiser = momentum())) + expect_snapshot(error = TRUE,o <- opt(m, optimiser = cg())) + expect_snapshot(error = TRUE,o <- opt(m, optimiser = newton_cg())) + expect_snapshot(error = TRUE,o <- opt(m, optimiser = l_bfgs_b())) + expect_snapshot(error = TRUE,o <- opt(m, optimiser = tnc())) + expect_snapshot(error = TRUE,o <- opt(m, optimiser = cobyla())) + expect_snapshot(error = TRUE,o <- 
opt(m, optimiser = slsqp())) }) test_that("opt accepts initial values for TF optimisers", { @@ -190,7 +190,7 @@ test_that("TF opt with `gradient_descent` fails with bad initial values", { distribution(x) <- normal(z, sd) m <- model(z) - expect_snapshot_error( + expect_snapshot(error = TRUE, o <- opt(m, hessian = TRUE, optimiser = gradient_descent()) ) diff --git a/tests/testthat/test_simulate.R b/tests/testthat/test_simulate.R index 9adb39b4..00c0ab3d 100644 --- a/tests/testthat/test_simulate.R +++ b/tests/testthat/test_simulate.R @@ -24,7 +24,7 @@ test_that("simulate errors if distribution-free variables are not fixed", { a <- variable() y <- normal(a, 1) m <- model(y) - expect_snapshot_error( + expect_snapshot(error = TRUE, sims <- simulate(m) ) }) @@ -39,7 +39,7 @@ test_that("simulate errors if a distribution cannot be sampled from", { m <- lognormal(0, 1) distribution(y) <- hypergeometric(m, 3, 2) m <- model(y) - expect_snapshot_error( + expect_snapshot(error = TRUE, sims <- simulate(m) ) }) @@ -51,15 +51,15 @@ test_that("simulate errors nicely if nsim is invalid", { x <- normal(0, 1) m <- model(x) - expect_snapshot_error( + expect_snapshot(error = TRUE, simulate(m, nsim = 0) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, simulate(m, nsim = -1) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, simulate(m, nsim = "five") ) }) diff --git a/tests/testthat/test_syntax.R b/tests/testthat/test_syntax.R index 03ea8a51..cbcb8429 100644 --- a/tests/testthat/test_syntax.R +++ b/tests/testthat/test_syntax.R @@ -36,28 +36,28 @@ test_that("`distribution<-` errors informatively", { x <- randn(1) # not a greta array with a distribution on the right - expect_snapshot_error( + expect_snapshot(error = TRUE, distribution(y) <- x ) - expect_snapshot_error( + expect_snapshot(error = TRUE, distribution(y) <- as_data(x) ) # no density on the right - expect_snapshot_error( + expect_snapshot(error = TRUE, distribution(y) <- variable() ) # non-scalar and wrong 
dimensions - expect_snapshot_error( + expect_snapshot(error = TRUE, distribution(y) <- normal(0, 1, dim = c(3, 3, 1)) ) # double assignment of distribution to node y_ <- as_data(y) distribution(y_) <- normal(0, 1) - expect_snapshot_error( + expect_snapshot(error = TRUE, distribution(y_) <- normal(0, 1) ) @@ -66,25 +66,25 @@ test_that("`distribution<-` errors informatively", { y2 <- as_data(y) d <- normal(0, 1) distribution(y1) <- d - expect_snapshot_error( + expect_snapshot(error = TRUE, distribution(y2) <- y1 ) # assignment to a variable z <- variable() - expect_snapshot_error( + expect_snapshot(error = TRUE, distribution(z) <- normal(0, 1) ) # assignment to an op z2 <- z^2 - expect_snapshot_error( + expect_snapshot(error = TRUE, distribution(z2) <- normal(0, 1) ) # assignment to another distribution u <- uniform(0, 1) - expect_snapshot_error( + expect_snapshot(error = TRUE, distribution(z2) <- normal(0, 1) ) diff --git a/tests/testthat/test_transforms.R b/tests/testthat/test_transforms.R index f00c2ca0..0cf2a1ff 100644 --- a/tests/testthat/test_transforms.R +++ b/tests/testthat/test_transforms.R @@ -31,7 +31,7 @@ test_that("imultilogit errors informatively", { x <- ones(3, 4, 3) - expect_snapshot_error( + expect_snapshot(error = TRUE, imultilogit(x) ) diff --git a/tests/testthat/test_truncated.R b/tests/testthat/test_truncated.R index 3f0ef1ab..1803e41e 100644 --- a/tests/testthat/test_truncated.R +++ b/tests/testthat/test_truncated.R @@ -623,11 +623,11 @@ test_that("truncated chi squared has correct densities", { test_that("bad truncations error", { skip_if_not(check_tf_version()) - expect_snapshot_error( + expect_snapshot(error = TRUE, lognormal(0, 1, truncation = c(-1, Inf)) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, beta(1, 1, truncation = c(-1, 2)) ) }) diff --git a/tests/testthat/test_variables.R b/tests/testthat/test_variables.R index 0d87eb22..aff5fd5c 100644 --- a/tests/testthat/test_variables.R +++ b/tests/testthat/test_variables.R @@ 
-2,29 +2,29 @@ test_that("variable() errors informatively", { skip_if_not(check_tf_version()) # bad types - expect_snapshot_error( + expect_snapshot(error = TRUE, variable(upper = NA) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, variable(upper = head) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, variable(lower = NA) ) - expect_snapshot_error( + expect_snapshot(error = TRUE, variable(lower = head) ) # good types, bad values - expect_snapshot_error( + expect_snapshot(error = TRUE, variable(lower = 0:2, upper = 1:2) ) # lower not below upper - expect_snapshot_error( + expect_snapshot(error = TRUE, variable(lower = 1, upper = 1) ) }) From 1561d5966801f95e040113f320fc40879da6c1f6 Mon Sep 17 00:00:00 2001 From: njtierney Date: Tue, 20 Aug 2024 13:06:36 +1000 Subject: [PATCH 09/14] Some small name changes in installation(ers) * change function greta_python_deps() --> greta_deps_spec() * argument: python_deps --> deps * minor changes to logfile writing --- NAMESPACE | 4 +- NEWS.md | 2 +- R/data-deps-tf-tfp.R | 2 +- R/greta_create_conda_env.R | 12 +- R/greta_install_python_deps.R | 22 +-- R/install_greta_deps.R | 112 +++++++------- R/reinstallers.R | 4 +- R/write-logfiles.R | 2 +- man/greta_create_conda_env.Rd | 6 +- man/greta_deps_receipt.Rd | 2 +- ...reta_python_deps.Rd => greta_deps_spec.Rd} | 34 ++--- man/greta_deps_tf_tfp.Rd | 2 +- man/install_greta_deps.Rd | 10 +- ...ython_deps.Rd => print.greta_deps_spec.Rd} | 6 +- ...reta_python_deps.md => greta_deps_spec.md} | 90 +++++------- tests/testthat/test-greta_python_deps.R | 138 ------------------ tests/testthat/test_greta_deps_spec.R | 138 ++++++++++++++++++ 17 files changed, 288 insertions(+), 298 deletions(-) rename man/{greta_python_deps.Rd => greta_deps_spec.Rd} (55%) rename man/{print.greta_python_deps.Rd => print.greta_deps_spec.Rd} (72%) rename tests/testthat/_snaps/{greta_python_deps.md => greta_deps_spec.md} (65%) delete mode 100644 tests/testthat/test-greta_python_deps.R create mode 
100644 tests/testthat/test_greta_deps_spec.R

diff --git a/NAMESPACE b/NAMESPACE
index 538c28dd..eeb4e5b0 100644
--- a/NAMESPACE
+++ b/NAMESPACE
@@ -106,9 +106,9 @@ S3method(mean,greta_array)
 S3method(min,greta_array)
 S3method(plot,greta_model)
 S3method(print,greta_array)
+S3method(print,greta_deps_spec)
 S3method(print,greta_mcmc_list)
 S3method(print,greta_model)
-S3method(print,greta_python_deps)
 S3method(print,initials)
 S3method(print,optimiser)
 S3method(print,sampler)
@@ -198,9 +198,9 @@ export(gradient_descent)
 export(greta_array)
 export(greta_create_conda_env)
 export(greta_deps_receipt)
+export(greta_deps_spec)
 export(greta_install_miniconda)
 export(greta_notes_tf_num_error)
-export(greta_python_deps)
 export(greta_set_install_logfile)
 export(greta_sitrep)
 export(hmc)
diff --git a/NEWS.md b/NEWS.md
index 7ba51c10..87a05e19 100644
--- a/NEWS.md
+++ b/NEWS.md
@@ -33,7 +33,7 @@ The following optimisers are removed, as they are no longer supported by Tensorf
 This release provides a few improvements to installation in greta. It should now provide more information about installation progress, and be more robust. The intention is, it should _just work_, and if it doesn't fail gracefully with some useful advice on problem solving.
 
 * Added option to restart R + run `library(greta)` after installation (#523)
-* Added installation deps object, `greta_python_deps()` to help simplify specifying package versions (#664)
+* Added installation deps object, `greta_deps_spec()` to help simplify specifying package versions (#664)
 * removed `method` and `conda` arguments from `install_greta_deps()` as they were not used.
 * removed `manual` argument in `install_greta_deps()`.
diff --git a/R/data-deps-tf-tfp.R b/R/data-deps-tf-tfp.R
index 67cf4347..4c73f536 100644
--- a/R/data-deps-tf-tfp.R
+++ b/R/data-deps-tf-tfp.R
@@ -7,7 +7,7 @@
 #' , and by inspecting
 #' .
 #'
-#' We recommend using the default versions provided in `greta_python_deps()`.
+#' We recommend using the default versions provided in `greta_deps_spec()`. #' #' @format ## `greta_deps_tf_tfp` #' A data frame with 63 rows and 5 columns: diff --git a/R/greta_create_conda_env.R b/R/greta_create_conda_env.R index d40cc9b9..8e374313 100644 --- a/R/greta_create_conda_env.R +++ b/R/greta_create_conda_env.R @@ -5,21 +5,21 @@ #' "greta-env-tf2". This is used within [install_greta_deps()] as part of #' setting up python dependencies. It uses a version of python that is #' compatible with the versions of tensorflow and tensorflow-probability, -#' which is established with [greta_python_deps()]. We mostly recommend +#' which is established with [greta_deps_spec()]. We mostly recommend #' users use [install_greta_deps()] to manage their python dependency #' installation. #' #' #' @param timeout time (minutes) until installation stops. Default is 5 minutes. -#' @param python_deps dependency specification, see [greta_python_deps()] for +#' @param deps dependency specification, see [greta_deps_spec()] for #' more details. 
#' #' @return nothing - creates a conda environment for a specific python version #' @export greta_create_conda_env <- function(timeout = 5, - python_deps = greta_python_deps()) { + deps = greta_deps_spec()) { - check_greta_python_deps(python_deps) + check_greta_deps_spec(deps) stdout_file <- create_temp_file("out-greta-conda") stderr_file <- create_temp_file("err-greta-conda") @@ -31,7 +31,7 @@ greta_create_conda_env <- function(timeout = 5, python_version = python_version ) }, - args = list(python_version = python_deps$python_version), + args = list(python_version = deps$python_version), stdout = stdout_file, stderr = stderr_file ) @@ -43,7 +43,7 @@ greta_create_conda_env <- function(timeout = 5, timeout = timeout, cli_start_msg = glue::glue( "Creating 'greta-env-tf2' conda environment using python \\ - v{python_deps$python_version}, this may take a minute" + v{deps$python_version}, this may take a minute" ), cli_end_msg = "greta-env-tf2 environment created!" ) diff --git a/R/greta_install_python_deps.R b/R/greta_install_python_deps.R index e8e9c0ba..2c50def3 100644 --- a/R/greta_install_python_deps.R +++ b/R/greta_install_python_deps.R @@ -1,26 +1,26 @@ greta_install_python_deps <- function(timeout = 5, - python_deps = greta_python_deps()) { + deps = greta_deps_spec()) { stdout_file <- create_temp_file("out-python-deps") stderr_file <- create_temp_file("err-python-deps") callr_conda_install <- callr::r_process_options( - func = function(python_deps) { + func = function(deps) { cli::cli_progress_step( - msg = "Installing TF (v{python_deps$tf_version})", - msg_done = "Installed TF (v{python_deps$tf_version})!", - msg_failed = "Error installing TF (v{python_deps$tf_version})" + msg = "Installing TF (v{deps$tf_version})", + msg_done = "Installed TF (v{deps$tf_version})!", + msg_failed = "Error installing TF (v{deps$tf_version})" ) tensorflow::install_tensorflow( - version = python_deps$tf_version, + version = deps$tf_version, envname = "greta-env-tf2", method = 
"conda" ) - dep_tfp <- glue::glue("tensorflow-probability=={python_deps$tfp_version}") + dep_tfp <- glue::glue("tensorflow-probability=={deps$tfp_version}") cli::cli_progress_step( - msg = "Installing TFP (v{python_deps$tfp_version})", - msg_done = "Installed TFP (v{python_deps$tfp_version})!", - msg_failed = "Error installing TFP (v{python_deps$tfp_version})" + msg = "Installing TFP (v{deps$tfp_version})", + msg_done = "Installed TFP (v{deps$tfp_version})!", + msg_failed = "Error installing TFP (v{deps$tfp_version})" ) reticulate::py_install( packages = dep_tfp, @@ -29,7 +29,7 @@ greta_install_python_deps <- function(timeout = 5, method = "conda" ) }, - args = list(python_deps = python_deps), + args = list(deps = deps), stdout = stdout_file, stderr = stderr_file ) diff --git a/R/install_greta_deps.R b/R/install_greta_deps.R index 2d305804..3861c2fc 100644 --- a/R/install_greta_deps.R +++ b/R/install_greta_deps.R @@ -14,11 +14,11 @@ #' not specified the `GRETA_INSTALLATION_LOG` environmental variable, the #' installation notes will indicate how to use [write_greta_install_log()]. #' -#' @param python_deps object created with [greta_python_deps()] where you +#' @param deps object created with [greta_deps_spec()] where you #' specify python, TF, and TFP versions. By default these are TF 2.15.0, #' TFP 0.23.0, and Python 3.10. These versions must be compatible -#' with each other. If they are not, [greta_python_deps()] will error with -#' more information and suggestions. See ?[greta_python_deps()] for more +#' with each other. If they are not, [greta_deps_spec()] will error with +#' more information and suggestions. See ?[greta_deps_spec()] for more #' information, and see the data object `greta_deps_tf_tfp` #' (`?greta_deps_tf_tfp``). 
#' @@ -70,12 +70,12 @@ #' @importFrom callr r_process #' @importFrom cli cli_alert_success #' @importFrom cli cli_ul -install_greta_deps <- function(python_deps = greta_python_deps(), +install_greta_deps <- function(deps = greta_deps_spec(), timeout = 5, restart = c("ask", "force", "no"), ...) { - check_greta_python_deps(python_deps) + check_greta_deps_spec(deps) restart <- rlang::arg_match( arg = restart, @@ -93,7 +93,7 @@ install_greta_deps <- function(python_deps = greta_python_deps(), if (!have_greta_conda_env()) { greta_create_conda_env( timeout = timeout, - python_deps = python_deps + deps = deps ) } @@ -102,7 +102,7 @@ install_greta_deps <- function(python_deps = greta_python_deps(), # suggest using `reinstall_greta_deps()` greta_install_python_deps( timeout = timeout, - python_deps = python_deps + deps = deps ) # TODO @@ -204,14 +204,14 @@ restart_or_not <- function(restart){ #' Specify python dependencies for greta #' #' A helper function for specifying versions of Tensorflow (TF), Tensorflow -#' Probability (TFP), and Python. Defaulting to 2.15.0, 0.23.0, and 3.10. -#' You can specify the version that you want to install, but it will check -#' if these are compatible. That is, if you specify versions of TF/TFP/Python -#' which do not work with each other, it will error and give a suggested -#' version to install. It does this by using a dataset, `greta_deps_tf_tfp`, -#' to check if the versions of TF, TFP, and Python specified are compatible -#' on your operating system. You can inspect this dataset with -#' `View(greta_deps_tf_tfp)`. +#' Probability (TFP), and Python. Defaulting to 2.15.0, 0.23.0, and 3.10, +#' respectively. You can specify the version that you want to install, but +#' it will check if these are compatible. That is, if you specify versions of +#' TF/TFP/Python which do not work with each other, it will error and give +#' a suggested version to install. 
It does this by using a dataset, +#' `greta_deps_tf_tfp`, to check if the versions of TF, TFP, and Python +#' specified are compatible on your operating system. You can inspect +#' this dataset with `View(greta_deps_tf_tfp)`. #' #' @param tf_version Character. Tensorflow (TF) version in format #' major.minor.patch. Default is "2.15.0". @@ -224,24 +224,24 @@ restart_or_not <- function(restart){ #' @export #' #' @examples -#' greta_python_deps() -#' greta_python_deps(tf_version = "2.15.0") -#' greta_python_deps(tf_version = "2.15.0", tfp_version = "0.23.0") -#' greta_python_deps(tf_version = "2.15.0", python_version = "3.10") -#' greta_python_deps( +#' greta_deps_spec() +#' greta_deps_spec(tf_version = "2.15.0") +#' greta_deps_spec(tf_version = "2.15.0", tfp_version = "0.23.0") +#' greta_deps_spec(tf_version = "2.15.0", python_version = "3.10") +#' greta_deps_spec( #' tf_version = "2.15.0", #' tfp_version = "0.23.0", #' python_version = "3.10" #' ) #' # this will fail #' \dontrun{ -#' greta_python_deps( +#' greta_deps_spec( #' tf_version = "2.11.0", #' tfp_version = "0.23.0", #' python_version = "3.10" #' ) #' } -greta_python_deps <- function(tf_version = "2.15.0", +greta_deps_spec <- function(tf_version = "2.15.0", tfp_version = "0.23.0", python_version = "3.10"){ @@ -253,7 +253,7 @@ greta_python_deps <- function(tf_version = "2.15.0", deps_obj <- structure( deps_list, - class = c("greta_python_deps", "data.frame") + class = c("greta_deps_spec", "data.frame") ) # check for envvar to silence these checks @@ -267,11 +267,11 @@ greta_python_deps <- function(tf_version = "2.15.0", } -check_greta_python_deps <- function(deps, +check_greta_deps_spec <- function(deps, call = rlang::caller_env()) { - if (!inherits(deps, "greta_python_deps")) { + if (!inherits(deps, "greta_deps_spec")) { cli::cli_abort( - message = "{.arg deps} must be created by {.fun greta_python_deps}.", + message = "{.arg deps} must be created by {.fun greta_deps_spec}.", call = call ) } @@ -282,7 +282,7 
@@ check_greta_python_deps <- function(deps, #' @param x greta python deps #' @param ... extra args, not used #' @export -print.greta_python_deps <- function(x, ...){ +print.greta_deps_spec <- function(x, ...){ print.data.frame(x) } @@ -291,7 +291,7 @@ print.greta_python_deps <- function(x, ...){ #' To assist with capturing and sharing python dependencies, we provide a way #' to capture the dependencies currently used. #' -#' @return `greta_python_deps()` object +#' @return `greta_deps_spec()` object #' @export #' #' @examples @@ -300,7 +300,7 @@ print.greta_python_deps <- function(x, ...){ #' } greta_deps_receipt <- function(){ - greta_python_deps( + greta_deps_spec( tf_version = version_tf(), tfp_version = version_tfp(), python_version = as.character(reticulate::py_version()) @@ -308,18 +308,18 @@ greta_deps_receipt <- function(){ } -check_greta_deps_range <- function(python_deps, - deps, +check_greta_deps_range <- function(deps, + module, call = rlang::caller_env()){ - greta_tf_tfp <- greta_deps_tf_tfp[[deps]] - version_provided <- numeric_version(python_deps[[deps]]) + greta_tf_tfp <- greta_deps_tf_tfp[[module]] + version_provided <- numeric_version(deps[[module]]) - version_name <- switch(deps, + version_name <- switch(module, tf_version = "TF", tfp_version = "TFP") - latest_version <- switch(deps, + latest_version <- switch(module, tf_version = numeric_version("2.15.0"), tfp_version = numeric_version("0.23.0")) @@ -345,7 +345,7 @@ check_greta_deps_range <- function(python_deps, valid <- version_provided %in% greta_tf_tfp if (!valid) { - closest_value <- closest_version(version_provided, greta_deps_tf_tfp[[deps]]) + closest_value <- closest_version(version_provided, greta_deps_tf_tfp[[module]]) } if (!valid){ @@ -354,7 +354,7 @@ check_greta_deps_range <- function(python_deps, message = c("{.val {version_name}} version provided does not match \\ supported versions", "The version {.val {version_provided}} was not in \\ - {.val {greta_deps_tf_tfp[[deps]]}}", + 
{.val {greta_deps_tf_tfp[[module]]}}", "i" = "The nearest valid version that is supported by \\ {.pkg greta} is: {.val {closest_value}}", "i" = "Valid versions of TF, TFP, and Python are in \\ @@ -366,15 +366,15 @@ check_greta_deps_range <- function(python_deps, } } -check_greta_tf_range <- function(python_deps, call = rlang::caller_env()) { - check_greta_deps_range(python_deps = python_deps, - deps = "tf_version", +check_greta_tf_range <- function(deps, call = rlang::caller_env()) { + check_greta_deps_range(deps = deps, + module = "tf_version", call = call) } -check_greta_tfp_range <- function(python_deps, call = rlang::caller_env()) { - check_greta_deps_range(python_deps = python_deps, - deps = "tfp_version", +check_greta_tfp_range <- function(deps, call = rlang::caller_env()) { + check_greta_deps_range(deps = deps, + module = "tfp_version", call = call) } @@ -405,12 +405,12 @@ check_greta_python_range <- function(version_provided, } -check_greta_deps_config <- function(python_deps, +check_greta_deps_config <- function(deps, call = rlang::caller_env()){ - check_greta_python_deps(python_deps) + check_greta_deps_spec(deps) - python_deps <- python_deps |> + deps <- deps |> lapply(numeric_version) |> as.data.frame() @@ -431,21 +431,21 @@ check_greta_deps_config <- function(python_deps, } config_matches <- os_matches |> - subset(tfp_version == python_deps$tfp_version) |> - subset(tf_version == python_deps$tf_version) |> - subset(python_deps$python_version >= python_version_min) |> - subset(python_deps$python_version <= python_version_max) + subset(tfp_version == deps$tfp_version) |> + subset(tf_version == deps$tf_version) |> + subset(deps$python_version >= python_version_min) |> + subset(deps$python_version <= python_version_max) no_matches <- nrow(config_matches) == 0 # Build logic to prioritise valid TFP over others if (no_matches){ - tfp_matches <- subset(os_matches, tfp_version == python_deps$tfp_version) - tf_matches <- subset(os_matches, tf_version == 
python_deps$tf_version) + tfp_matches <- subset(os_matches, tfp_version == deps$tfp_version) + tf_matches <- subset(os_matches, tf_version == deps$tf_version) py_matches <- os_matches |> - subset(python_deps$python_version >= python_version_min) |> - subset(python_deps$python_version <= python_version_max) + subset(deps$python_version >= python_version_min) |> + subset(deps$python_version <= python_version_max) config_matches <- data.frame( tfp_match = nrow(tfp_matches) > 0, @@ -495,10 +495,10 @@ check_greta_deps_config <- function(python_deps, suggested_py <- as.character(max(suggested_match$python_version_max)) cli::cli_abort( - message = c("Provided {.code greta_python_deps} does not match valid \\ + message = c("Provided {.code greta_deps_spec} does not match valid \\ installation combinations.", "See below for a suggested config to use:", - "{.code greta_python_deps(\\ + "{.code greta_deps_spec(\\ tf_version = {.val {suggested_tf}}, \\ tfp_version = {.val {suggested_tfp}}, \\ python_version = {.val {suggested_py}}\\ diff --git a/R/reinstallers.R b/R/reinstallers.R index bffcde10..9444d98f 100644 --- a/R/reinstallers.R +++ b/R/reinstallers.R @@ -78,13 +78,13 @@ reinstall_miniconda <- function(timeout = 5){ #' # issues with installing greta dependencies #' reinstall_greta_deps() #' } -reinstall_greta_deps <- function(python_deps = greta_python_deps(), +reinstall_greta_deps <- function(deps = greta_deps_spec(), timeout = 5, restart = c("ask", "force", "no")){ remove_greta_env() remove_miniconda() install_greta_deps( - python_deps = python_deps, + deps = deps, timeout = timeout, restart = restart ) diff --git a/R/write-logfiles.R b/R/write-logfiles.R index 5dea578c..a4bff917 100644 --- a/R/write-logfiles.R +++ b/R/write-logfiles.R @@ -37,7 +37,7 @@ write_greta_install_log <- function(path = greta_logfile) {

Greta installation logfile

Created: {{sys_date}}

Use this logfile to explore potential issues in installation with greta

-Try searching the text for "error" with Cmd/Ctrl+F
+Try opening this in a browser and searching the text for "error" with Cmd/Ctrl+F

Miniconda

diff --git a/man/greta_create_conda_env.Rd b/man/greta_create_conda_env.Rd index 6ba2f78e..e71548ea 100644 --- a/man/greta_create_conda_env.Rd +++ b/man/greta_create_conda_env.Rd @@ -4,12 +4,12 @@ \alias{greta_create_conda_env} \title{Create conda environment for greta} \usage{ -greta_create_conda_env(timeout = 5, python_deps = greta_python_deps()) +greta_create_conda_env(timeout = 5, deps = greta_deps_spec()) } \arguments{ \item{timeout}{time (minutes) until installation stops. Default is 5 minutes.} -\item{python_deps}{dependency specification, see \code{\link[=greta_python_deps]{greta_python_deps()}} for +\item{deps}{dependency specification, see \code{\link[=greta_deps_spec]{greta_deps_spec()}} for more details.} } \value{ @@ -21,7 +21,7 @@ This function runs \code{\link[reticulate:conda-tools]{reticulate::conda_create( "greta-env-tf2". This is used within \code{\link[=install_greta_deps]{install_greta_deps()}} as part of setting up python dependencies. It uses a version of python that is compatible with the versions of tensorflow and tensorflow-probability, -which is established with \code{\link[=greta_python_deps]{greta_python_deps()}}. We mostly recommend +which is established with \code{\link[=greta_deps_spec]{greta_deps_spec()}}. We mostly recommend users use \code{\link[=install_greta_deps]{install_greta_deps()}} to manage their python dependency installation. 
} diff --git a/man/greta_deps_receipt.Rd b/man/greta_deps_receipt.Rd index c7f1e344..135ed8f1 100644 --- a/man/greta_deps_receipt.Rd +++ b/man/greta_deps_receipt.Rd @@ -7,7 +7,7 @@ greta_deps_receipt() } \value{ -\code{greta_python_deps()} object +\code{greta_deps_spec()} object } \description{ To assist with capturing and sharing python dependencies, we provide a way diff --git a/man/greta_python_deps.Rd b/man/greta_deps_spec.Rd similarity index 55% rename from man/greta_python_deps.Rd rename to man/greta_deps_spec.Rd index 1f2e41b8..6b2e421b 100644 --- a/man/greta_python_deps.Rd +++ b/man/greta_deps_spec.Rd @@ -1,10 +1,10 @@ % Generated by roxygen2: do not edit by hand % Please edit documentation in R/install_greta_deps.R -\name{greta_python_deps} -\alias{greta_python_deps} +\name{greta_deps_spec} +\alias{greta_deps_spec} \title{Specify python dependencies for greta} \usage{ -greta_python_deps( +greta_deps_spec( tf_version = "2.15.0", tfp_version = "0.23.0", python_version = "3.10" @@ -25,28 +25,28 @@ data frame of valid dependencies } \description{ A helper function for specifying versions of Tensorflow (TF), Tensorflow -Probability (TFP), and Python. Defaulting to 2.15.0, 0.23.0, and 3.10. -You can specify the version that you want to install, but it will check -if these are compatible. That is, if you specify versions of TF/TFP/Python -which do not work with each other, it will error and give a suggested -version to install. It does this by using a dataset, \code{greta_deps_tf_tfp}, -to check if the versions of TF, TFP, and Python specified are compatible -on your operating system. You can inspect this dataset with -\code{View(greta_deps_tf_tfp)}. +Probability (TFP), and Python. Defaulting to 2.15.0, 0.23.0, and 3.10, +respectively. You can specify the version that you want to install, but +it will check if these are compatible. 
That is, if you specify versions of +TF/TFP/Python which do not work with each other, it will error and give +a suggested version to install. It does this by using a dataset, +\code{greta_deps_tf_tfp}, to check if the versions of TF, TFP, and Python +specified are compatible on your operating system. You can inspect +this dataset with \code{View(greta_deps_tf_tfp)}. } \examples{ -greta_python_deps() -greta_python_deps(tf_version = "2.15.0") -greta_python_deps(tf_version = "2.15.0", tfp_version = "0.23.0") -greta_python_deps(tf_version = "2.15.0", python_version = "3.10") -greta_python_deps( +greta_deps_spec() +greta_deps_spec(tf_version = "2.15.0") +greta_deps_spec(tf_version = "2.15.0", tfp_version = "0.23.0") +greta_deps_spec(tf_version = "2.15.0", python_version = "3.10") +greta_deps_spec( tf_version = "2.15.0", tfp_version = "0.23.0", python_version = "3.10" ) # this will fail \dontrun{ -greta_python_deps( +greta_deps_spec( tf_version = "2.11.0", tfp_version = "0.23.0", python_version = "3.10" diff --git a/man/greta_deps_tf_tfp.Rd b/man/greta_deps_tf_tfp.Rd index 1d93feb5..f5cf04ac 100644 --- a/man/greta_deps_tf_tfp.Rd +++ b/man/greta_deps_tf_tfp.Rd @@ -27,6 +27,6 @@ machines. It was constructed from \url{https://github.com/tensorflow/probability/releases}. } \details{ -We recommend using the default versions provided in \code{greta_python_deps()}. +We recommend using the default versions provided in \code{greta_deps_spec()}. } \keyword{datasets} diff --git a/man/install_greta_deps.Rd b/man/install_greta_deps.Rd index 52ac3c54..45495395 100644 --- a/man/install_greta_deps.Rd +++ b/man/install_greta_deps.Rd @@ -6,24 +6,24 @@ \title{Install Python dependencies for greta} \usage{ install_greta_deps( - python_deps = greta_python_deps(), + deps = greta_deps_spec(), timeout = 5, restart = c("ask", "force", "no"), ... 
) reinstall_greta_deps( - python_deps = greta_python_deps(), + deps = greta_deps_spec(), timeout = 5, restart = c("ask", "force", "no") ) } \arguments{ -\item{python_deps}{object created with \code{\link[=greta_python_deps]{greta_python_deps()}} where you +\item{deps}{object created with \code{\link[=greta_deps_spec]{greta_deps_spec()}} where you specify python, TF, and TFP versions. By default these are TF 2.15.0, TFP 0.23.0, and Python 3.10. These versions must be compatible -with each other. If they are not, \code{\link[=greta_python_deps]{greta_python_deps()}} will error with -more information and suggestions. See ?\code{\link[=greta_python_deps]{greta_python_deps()}} for more +with each other. If they are not, \code{\link[=greta_deps_spec]{greta_deps_spec()}} will error with +more information and suggestions. See ?\code{\link[=greta_deps_spec]{greta_deps_spec()}} for more information, and see the data object \code{greta_deps_tf_tfp} (`?greta_deps_tf_tfp``).} diff --git a/man/print.greta_python_deps.Rd b/man/print.greta_deps_spec.Rd similarity index 72% rename from man/print.greta_python_deps.Rd rename to man/print.greta_deps_spec.Rd index df63d637..c9053dc8 100644 --- a/man/print.greta_python_deps.Rd +++ b/man/print.greta_deps_spec.Rd @@ -1,10 +1,10 @@ % Generated by roxygen2: do not edit by hand % Please edit documentation in R/install_greta_deps.R -\name{print.greta_python_deps} -\alias{print.greta_python_deps} +\name{print.greta_deps_spec} +\alias{print.greta_deps_spec} \title{Print method for greta python deps} \usage{ -\method{print}{greta_python_deps}(x, ...) +\method{print}{greta_deps_spec}(x, ...) 
} \arguments{ \item{x}{greta python deps} diff --git a/tests/testthat/_snaps/greta_python_deps.md b/tests/testthat/_snaps/greta_deps_spec.md similarity index 65% rename from tests/testthat/_snaps/greta_python_deps.md rename to tests/testthat/_snaps/greta_deps_spec.md index 0550c6b3..7ed58aed 100644 --- a/tests/testthat/_snaps/greta_python_deps.md +++ b/tests/testthat/_snaps/greta_deps_spec.md @@ -73,10 +73,10 @@ x The version provided was "3.14". i Try: "3.11" -# greta_python_deps fails appropriately +# greta_deps_spec fails appropriately Code - greta_python_deps() + greta_deps_spec() Output tf_version tfp_version python_version 1 2.15.0 0.23.0 3.10 @@ -84,8 +84,7 @@ --- Code - greta_python_deps(tf_version = "2.14.0", tfp_version = "0.22.1", - python_version = "3.9") + greta_deps_spec(tf_version = "2.14.0", tfp_version = "0.22.1", python_version = "3.9") Output tf_version tfp_version python_version 1 2.14.0 0.22.1 3.9 @@ -93,8 +92,7 @@ --- Code - greta_python_deps(tf_version = "2.12.0", tfp_version = "0.20.0", - python_version = "3.9") + greta_deps_spec(tf_version = "2.12.0", tfp_version = "0.20.0", python_version = "3.9") Output tf_version tfp_version python_version 1 2.12.0 0.20.0 3.9 @@ -102,10 +100,9 @@ --- Code - greta_python_deps(tf_version = "2.16.1", tfp_version = "0.11.0", - python_version = "3.8") + greta_deps_spec(tf_version = "2.16.1", tfp_version = "0.11.0", python_version = "3.8") Condition - Error in `greta_python_deps()`: + Error in `greta_deps_spec()`: ! greta Does not yet support TF > 2.15.0 i See for more information x The provided version was 2.16.1 @@ -117,9 +114,9 @@ --- Code - greta_python_deps(tf_version = "1.9.0", tfp_version = "0.11.0", python_version = "3.8") + greta_deps_spec(tf_version = "1.9.0", tfp_version = "0.11.0", python_version = "3.8") Condition - Error in `greta_python_deps()`: + Error in `greta_deps_spec()`: ! 
"TF" version provided does not match supported versions The version 1.9.0 was not in 2.15.0, 2.14.0, 2.14.0, 2.13.0, 2.12.0, 2.11.0, 2.10.0, 2.9.1, 2.8.0, 2.7.0, 2.6.0, 2.6.0, 2.5.0, 2.4.0, 2.4.0, 2.4.0, 2.3.0, 2.3.0, ..., 2.1.0, and 2.0.0 i The nearest valid version that is supported by greta is: 2.0.0 @@ -130,10 +127,9 @@ --- Code - greta_python_deps(tf_version = "2.15.0", tfp_version = "0.24.0", - python_version = "3.10") + greta_deps_spec(tf_version = "2.15.0", tfp_version = "0.24.0", python_version = "3.10") Condition - Error in `greta_python_deps()`: + Error in `greta_deps_spec()`: ! greta Does not yet support TFP > 0.23.0 i See for more information x The provided version was 0.24.0 @@ -145,9 +141,9 @@ --- Code - greta_python_deps(tf_version = "2.15.0", tfp_version = "0.6.0", python_version = "3.10") + greta_deps_spec(tf_version = "2.15.0", tfp_version = "0.6.0", python_version = "3.10") Condition - Error in `greta_python_deps()`: + Error in `greta_deps_spec()`: ! "TFP" version provided does not match supported versions The version 0.6.0 was not in 0.23.0, 0.22.1, 0.22.0, 0.21.0, 0.20.0, 0.19.0, 0.18.0, 0.17.0, 0.16.0, 0.15.0, 0.14.1, 0.14.0, 0.13.0, 0.12.2, 0.12.1, 0.12.0, 0.11.1, 0.11.0, ..., 0.9.0, and 0.8.0 i The nearest valid version that is supported by greta is: 0.8.0 @@ -158,9 +154,9 @@ --- Code - greta_python_deps(tf_version = "2.9.1", tfp_version = "0.23.0", python_version = "3.13") + greta_deps_spec(tf_version = "2.9.1", tfp_version = "0.23.0", python_version = "3.13") Condition - Error in `greta_python_deps()`: + Error in `greta_deps_spec()`: ! Python version must be between "3.3"-"3.11" x The version provided was "3.13". i Try: "3.11" @@ -168,9 +164,9 @@ --- Code - greta_python_deps(tf_version = "2.9.1", tfp_version = "0.23.0", python_version = "2.6") + greta_deps_spec(tf_version = "2.9.1", tfp_version = "0.23.0", python_version = "2.6") Condition - Error in `greta_python_deps()`: + Error in `greta_deps_spec()`: ! 
Python version must be between "3.3"-"3.11" x The version provided was "2.6". i Try: "3.3" @@ -178,13 +174,12 @@ --- Code - greta_python_deps(tf_version = "2.15.0", tfp_version = "0.23.0", - python_version = "3.8") + greta_deps_spec(tf_version = "2.15.0", tfp_version = "0.23.0", python_version = "3.8") Condition - Error in `greta_python_deps()`: - ! Provided `greta_python_deps` does not match valid installation combinations. + Error in `greta_deps_spec()`: + ! Provided `greta_deps_spec` does not match valid installation combinations. See below for a suggested config to use: - `greta_python_deps(tf_version = "2.15.0", tfp_version = "0.23.0", python_version = "3.11")` + `greta_deps_spec(tf_version = "2.15.0", tfp_version = "0.23.0", python_version = "3.11")` i Valid versions of TF, TFP, and Python are in `greta_deps_tf_tfp` i Inspect with: `View(greta_deps_tf_tfp)` @@ -192,13 +187,12 @@ --- Code - greta_python_deps(tf_version = "2.15.0", tfp_version = "0.22.0", - python_version = "3.10") + greta_deps_spec(tf_version = "2.15.0", tfp_version = "0.22.0", python_version = "3.10") Condition - Error in `greta_python_deps()`: - ! Provided `greta_python_deps` does not match valid installation combinations. + Error in `greta_deps_spec()`: + ! Provided `greta_deps_spec` does not match valid installation combinations. See below for a suggested config to use: - `greta_python_deps(tf_version = "2.14.0", tfp_version = "0.22.0", python_version = "3.11")` + `greta_deps_spec(tf_version = "2.14.0", tfp_version = "0.22.0", python_version = "3.11")` i Valid versions of TF, TFP, and Python are in `greta_deps_tf_tfp` i Inspect with: `View(greta_deps_tf_tfp)` @@ -206,13 +200,12 @@ --- Code - greta_python_deps(tf_version = "2.14.0", tfp_version = "0.21.0", - python_version = "3.8") + greta_deps_spec(tf_version = "2.14.0", tfp_version = "0.21.0", python_version = "3.8") Condition - Error in `greta_python_deps()`: - ! 
Provided `greta_python_deps` does not match valid installation combinations. + Error in `greta_deps_spec()`: + ! Provided `greta_deps_spec` does not match valid installation combinations. See below for a suggested config to use: - `greta_python_deps(tf_version = "2.13.0", tfp_version = "0.21.0", python_version = "3.11")` + `greta_deps_spec(tf_version = "2.13.0", tfp_version = "0.21.0", python_version = "3.11")` i Valid versions of TF, TFP, and Python are in `greta_deps_tf_tfp` i Inspect with: `View(greta_deps_tf_tfp)` @@ -220,13 +213,12 @@ --- Code - greta_python_deps(tf_version = "2.13.0", tfp_version = "0.20.0", - python_version = "3.8") + greta_deps_spec(tf_version = "2.13.0", tfp_version = "0.20.0", python_version = "3.8") Condition - Error in `greta_python_deps()`: - ! Provided `greta_python_deps` does not match valid installation combinations. + Error in `greta_deps_spec()`: + ! Provided `greta_deps_spec` does not match valid installation combinations. See below for a suggested config to use: - `greta_python_deps(tf_version = "2.12.0", tfp_version = "0.20.0", python_version = "3.11")` + `greta_deps_spec(tf_version = "2.12.0", tfp_version = "0.20.0", python_version = "3.11")` i Valid versions of TF, TFP, and Python are in `greta_deps_tf_tfp` i Inspect with: `View(greta_deps_tf_tfp)` @@ -234,13 +226,12 @@ --- Code - greta_python_deps(tf_version = "2.15.0", tfp_version = "0.17.0", - python_version = "3.8") + greta_deps_spec(tf_version = "2.15.0", tfp_version = "0.17.0", python_version = "3.8") Condition - Error in `greta_python_deps()`: - ! Provided `greta_python_deps` does not match valid installation combinations. + Error in `greta_deps_spec()`: + ! Provided `greta_deps_spec` does not match valid installation combinations. 
See below for a suggested config to use: - `greta_python_deps(tf_version = "2.9.1", tfp_version = "0.17.0", python_version = "3.10")` + `greta_deps_spec(tf_version = "2.9.1", tfp_version = "0.17.0", python_version = "3.10")` i Valid versions of TF, TFP, and Python are in `greta_deps_tf_tfp` i Inspect with: `View(greta_deps_tf_tfp)` @@ -248,10 +239,9 @@ --- Code - greta_python_deps(tf_version = "2.17.0", tfp_version = "0.23.0", - python_version = "3.8") + greta_deps_spec(tf_version = "2.17.0", tfp_version = "0.23.0", python_version = "3.8") Condition - Error in `greta_python_deps()`: + Error in `greta_deps_spec()`: ! greta Does not yet support TF > 2.15.0 i See for more information x The provided version was 2.17.0 @@ -263,9 +253,9 @@ --- Code - greta_python_deps(tf_version = "2.9.0", tfp_version = "0.17.0", python_version = "3.8") + greta_deps_spec(tf_version = "2.9.0", tfp_version = "0.17.0", python_version = "3.8") Condition - Error in `greta_python_deps()`: + Error in `greta_deps_spec()`: ! 
"TF" version provided does not match supported versions The version 2.9.0 was not in 2.15.0, 2.14.0, 2.14.0, 2.13.0, 2.12.0, 2.11.0, 2.10.0, 2.9.1, 2.8.0, 2.7.0, 2.6.0, 2.6.0, 2.5.0, 2.4.0, 2.4.0, 2.4.0, 2.3.0, 2.3.0, ..., 2.1.0, and 2.0.0 i The nearest valid version that is supported by greta is: 2.8.0 diff --git a/tests/testthat/test-greta_python_deps.R b/tests/testthat/test-greta_python_deps.R deleted file mode 100644 index d7968bc4..00000000 --- a/tests/testthat/test-greta_python_deps.R +++ /dev/null @@ -1,138 +0,0 @@ -test_that("greta python range detection works correctly",{ - skip_if_not(check_tf_version()) - # correct ranges - expect_snapshot(check_greta_python_range("3.11")) - expect_snapshot(check_greta_python_range("3.9")) - expect_snapshot(check_greta_python_range("3.3")) - expect_snapshot(check_greta_python_range("3.8.2")) - expect_snapshot(check_greta_python_range("3.3.3")) - # incorrect ranges - expect_snapshot( - error = TRUE, - check_greta_python_range("3.1") - ) - expect_snapshot( - error = TRUE, - check_greta_python_range("3.12") - ) - expect_snapshot( - error = TRUE, - check_greta_python_range("2.7") - ) - expect_snapshot( - error = TRUE, - check_greta_python_range("3.1.1") - ) - expect_snapshot( - error = TRUE, - check_greta_python_range("3.14") - ) -}) - - -test_that("greta_python_deps fails appropriately", { - skip_if_not(check_tf_version()) - # skip on windows as there are small differences in version recommendations - skip_on_os(os = "windows") - # default - expect_snapshot(greta_python_deps()) - # some correct ranges - expect_snapshot( - greta_python_deps(tf_version = "2.14.0", - tfp_version = "0.22.1", - python_version = "3.9") - ) - expect_snapshot( - greta_python_deps(tf_version = "2.12.0", - tfp_version = "0.20.0", - python_version = "3.9") - ) - # TF above range - expect_snapshot( - error = TRUE, - greta_python_deps(tf_version = "2.16.1", - tfp_version = "0.11.0", - python_version = "3.8") - ) - # TF below range - expect_snapshot( - 
error = TRUE, - greta_python_deps(tf_version = "1.9.0", - tfp_version = "0.11.0", - python_version = "3.8") - ) - # TFP above range - expect_snapshot( - error = TRUE, - greta_python_deps(tf_version = "2.15.0", - tfp_version = "0.24.0", - python_version = "3.10") - ) - # TFP below range - expect_snapshot( - error = TRUE, - greta_python_deps(tf_version = "2.15.0", - tfp_version = "0.6.0", - python_version = "3.10") - ) - # Python above range - expect_snapshot( - error = TRUE, - greta_python_deps(tf_version = "2.9.1", - tfp_version = "0.23.0", - python_version = "3.13") - ) - # Python below range - expect_snapshot( - error = TRUE, - greta_python_deps(tf_version = "2.9.1", - tfp_version = "0.23.0", - python_version = "2.6") - ) - # Only Python is not valid - # TODO - suggest changing python version in error message - expect_snapshot( - error = TRUE, - greta_python_deps(tf_version = "2.15.0", - tfp_version = "0.23.0", - python_version = "3.8") - ) - # Only TF is not valid - expect_snapshot( - error = TRUE, - greta_python_deps(tf_version = "2.15.0", - tfp_version = "0.22.0", - python_version = "3.10") - ) - expect_snapshot( - error = TRUE, - greta_python_deps(tf_version = "2.14.0", - tfp_version = "0.21.0", - python_version = "3.8") - ) - expect_snapshot( - error = TRUE, - greta_python_deps(tf_version = "2.13.0", - tfp_version = "0.20.0", - python_version = "3.8") - ) - # Only TFP is not valid - expect_snapshot( - error = TRUE, - greta_python_deps(tf_version = "2.15.0", - tfp_version = "0.17.0", - python_version = "3.8") - ) - expect_snapshot( - error = TRUE, - greta_python_deps(tf_version = "2.17.0", - tfp_version = "0.23.0", - python_version = "3.8") - ) - expect_snapshot( - error = TRUE, - greta_python_deps(tf_version = "2.9.0", - tfp_version = "0.17.0", - python_version = "3.8") - ) -}) diff --git a/tests/testthat/test_greta_deps_spec.R b/tests/testthat/test_greta_deps_spec.R new file mode 100644 index 00000000..804b52e4 --- /dev/null +++ 
b/tests/testthat/test_greta_deps_spec.R @@ -0,0 +1,138 @@ +test_that("greta python range detection works correctly",{ + skip_if_not(check_tf_version()) + # correct ranges + expect_snapshot(check_greta_python_range("3.11")) + expect_snapshot(check_greta_python_range("3.9")) + expect_snapshot(check_greta_python_range("3.3")) + expect_snapshot(check_greta_python_range("3.8.2")) + expect_snapshot(check_greta_python_range("3.3.3")) + # incorrect ranges + expect_snapshot( + error = TRUE, + check_greta_python_range("3.1") + ) + expect_snapshot( + error = TRUE, + check_greta_python_range("3.12") + ) + expect_snapshot( + error = TRUE, + check_greta_python_range("2.7") + ) + expect_snapshot( + error = TRUE, + check_greta_python_range("3.1.1") + ) + expect_snapshot( + error = TRUE, + check_greta_python_range("3.14") + ) +}) + + +test_that("greta_deps_spec fails appropriately", { + skip_if_not(check_tf_version()) + # skip on windows as there are small differences in version recommendations + skip_on_os(os = "windows") + # default + expect_snapshot(greta_deps_spec()) + # some correct ranges + expect_snapshot( + greta_deps_spec(tf_version = "2.14.0", + tfp_version = "0.22.1", + python_version = "3.9") + ) + expect_snapshot( + greta_deps_spec(tf_version = "2.12.0", + tfp_version = "0.20.0", + python_version = "3.9") + ) + # TF above range + expect_snapshot( + error = TRUE, + greta_deps_spec(tf_version = "2.16.1", + tfp_version = "0.11.0", + python_version = "3.8") + ) + # TF below range + expect_snapshot( + error = TRUE, + greta_deps_spec(tf_version = "1.9.0", + tfp_version = "0.11.0", + python_version = "3.8") + ) + # TFP above range + expect_snapshot( + error = TRUE, + greta_deps_spec(tf_version = "2.15.0", + tfp_version = "0.24.0", + python_version = "3.10") + ) + # TFP below range + expect_snapshot( + error = TRUE, + greta_deps_spec(tf_version = "2.15.0", + tfp_version = "0.6.0", + python_version = "3.10") + ) + # Python above range + expect_snapshot( + error = TRUE, + 
greta_deps_spec(tf_version = "2.9.1", + tfp_version = "0.23.0", + python_version = "3.13") + ) + # Python below range + expect_snapshot( + error = TRUE, + greta_deps_spec(tf_version = "2.9.1", + tfp_version = "0.23.0", + python_version = "2.6") + ) + # Only Python is not valid + # TODO - suggest changing python version in error message + expect_snapshot( + error = TRUE, + greta_deps_spec(tf_version = "2.15.0", + tfp_version = "0.23.0", + python_version = "3.8") + ) + # Only TF is not valid + expect_snapshot( + error = TRUE, + greta_deps_spec(tf_version = "2.15.0", + tfp_version = "0.22.0", + python_version = "3.10") + ) + expect_snapshot( + error = TRUE, + greta_deps_spec(tf_version = "2.14.0", + tfp_version = "0.21.0", + python_version = "3.8") + ) + expect_snapshot( + error = TRUE, + greta_deps_spec(tf_version = "2.13.0", + tfp_version = "0.20.0", + python_version = "3.8") + ) + # Only TFP is not valid + expect_snapshot( + error = TRUE, + greta_deps_spec(tf_version = "2.15.0", + tfp_version = "0.17.0", + python_version = "3.8") + ) + expect_snapshot( + error = TRUE, + greta_deps_spec(tf_version = "2.17.0", + tfp_version = "0.23.0", + python_version = "3.8") + ) + expect_snapshot( + error = TRUE, + greta_deps_spec(tf_version = "2.9.0", + tfp_version = "0.17.0", + python_version = "3.8") + ) +}) From f2dd7cd45f75503dac2832767a77718ea4798a81 Mon Sep 17 00:00:00 2001 From: njtierney Date: Tue, 20 Aug 2024 15:21:44 +1000 Subject: [PATCH 10/14] update to use greta_deps_spec(), and other small tweaks --- README.md | 2 +- vignettes/installation.Rmd | 84 ++++++++++++++------------------------ 2 files changed, 31 insertions(+), 55 deletions(-) diff --git a/README.md b/README.md index 359adbb5..4c655fa0 100644 --- a/README.md +++ b/README.md @@ -42,7 +42,7 @@ devtools::install_github("greta-dev/greta") The `install_greta_deps()` function helps install the Python dependencies (Google's [TensorFlow](https://www.tensorflow.org/) and 
[tensorflow-probability](https://github.com/tensorflow/probability)).
 
-By default, `install_greta_deps()` installs versions TF 2.15.0, and TFP version 0.23.0, using python 3.10. To change the versions of TF, TFP, or python that you want to use, you specify the `python_deps` argument of `install_greta_deps()`, which used `greta_python_deps()`. See `?install_greta_deps()` or `?greta_python_deps()` for more information.
+By default, `install_greta_deps()` installs TF version 2.15.0 and TFP version 0.23.0, using Python 3.10. To change the versions of TF, TFP, or Python, specify the `deps` argument of `install_greta_deps()`, which uses `greta_deps_spec()`. See `?install_greta_deps()` or `?greta_deps_spec()` for more information.
 
 This helper function, `install_greta_deps()`, installs the exact pythons package versions needed. It also places these inside a conda environment, "greta-env-tf2". This isolates these exact python modules from other python installations, so that only `greta` will see them. This helps avoids installation issues, where previously you might update tensorflow on your computer and overwrite the current version needed by `greta`. Using this "greta-env-tf2" conda environment means installing other python packages should not be impact the Python packages needed by `greta`.
 
diff --git a/vignettes/installation.Rmd b/vignettes/installation.Rmd
index 9993501c..6c29408f 100644
--- a/vignettes/installation.Rmd
+++ b/vignettes/installation.Rmd
@@ -44,11 +44,11 @@ We do this as it helps avoids installation issues, where previously you might up
 
 The `install_greta_deps()` function takes three arguments:
 
-1. `python_deps`: Specify dependencies with `greta_python_deps()`
+1. `deps`: Specify dependencies with `greta_deps_spec()`
 2. `timeout`: time in minutes to wait in installation before failing/exiting
 3. 
`restart`: whether to restart R ("force" - restart R, "no", will not restart, "ask" (default) - ask the user)
 
-You specify the version of TF TFP, or python that you want to use with `greta_python_deps()`, which has arguments:
+You specify the version of TF, TFP, or Python that you want to use with `greta_deps_spec()`, which has arguments:
 
 - `tf_version`
 - `tfp_version`
@@ -58,96 +58,72 @@ If you specify versions of TF/TFP/Python that are not compatible with each other
 
 If you provide an invalid installation versions, it will error and suggest some alternative installation versions.
 
-## How we install dependencies.
+## How we install dependencies
 
-We create a separate R instances using [`callr`]() to install python dependencies using `reticulate` to talk to Python, and the R package `tensorflow`, for installing the tensorflow python module.
+This is for users who want to know more about the installation process of dependencies in greta.
 
-## More Detail on how greta installs python dependencies
+We create a separate R instance using [`callr`](https://callr.r-lib.org/index.html) to install python dependencies using `reticulate` to talk to Python, and the R package `tensorflow`, for installing the tensorflow python module. We use `callr` so that we can ensure the installation of python dependencies happens in a clean R session that doesn't have python or reticulate already loaded.
 
-## Troubleshooting installation
+If miniconda isn't installed, we install miniconda. You can think of miniconda as a lightweight version of python with minimal dependencies.
 
-### Using reinstallers
-### destroying dependencies
-### manual installation
+If "greta-env-tf2" isn't found, we create a new conda environment named "greta-env-tf2", for a version of python that works with the specified versions of TF and TFP. 
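Putting the arguments described above together, a pinned installation might look like the following sketch (not run here; the versions shown are the defaults this patch documents, and it assumes greta and the helpers from this PR are installed):

```r
# Sketch: pin the dependency versions greta should use, then hand the
# spec to the installer. These are the defaults named in this patch.
library(greta)

deps <- greta_deps_spec(
  tf_version     = "2.15.0",
  tfp_version    = "0.23.0",
  python_version = "3.10"
)

install_greta_deps(
  deps    = deps,
  timeout = 5,      # minutes to wait before failing/exiting
  restart = "ask"   # "force", "no", or "ask" (the default)
)
```

If the requested TF/TFP/Python combination is invalid, `greta_deps_spec()` errors and suggests nearby supported versions, as shown in the snapshot tests above.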
-If the previous installation helper did not work, you can try the following: +Then we install the TF and TFP python modules, using the versions specified in `greta_deps_spec()`. -```{r install_tensorflow, eval = FALSE} -reticulate::install_miniconda() -reticulate::conda_create( - envname = "greta-env", - python_version = "3.7" - ) -reticulate::conda_install( - envname = "greta-env", - packages = c( - "numpy==1.16.4", - "tensorflow-probability==0.7.0", - "tensorflow==1.14.0" - ) - ) -``` +After installation, we ask users if they want to restart R. This only happens in interactive sessions, and only if the user is in RStudio. This is to avoid potential issues where this script might be used in batch scripts online. -Which will install the python modules into a conda environment named "greta-env". +## Troubleshooting installation -You can also not install these not into a special conda environment "greta-env", -like so: +Installation doesn't always go to plan. Sometimes -```{r install-deps-plain, eval = FALSE} -reticulate::install_miniconda() -reticulate::conda_install( - packages = c( - "numpy==1.16.4", - "tensorflow-probability==0.7.0", - "tensorflow==1.14.0" - ) - ) -``` +### Using a logfile - +We write a logfile -
+### Using reinstallers +You -# Morgue +### Destroying dependencies -#### Standard installation +### Manual installation If the previous installation helper did not work, you can try the following: ```{r install_tensorflow, eval = FALSE} reticulate::install_miniconda() reticulate::conda_create( - envname = "greta-env", - python_version = "3.7" + envname = "greta-env-tf2", + python_version = "3.10" ) reticulate::conda_install( - envname = "greta-env", + envname = "greta-env-tf2", packages = c( - "numpy==1.16.4", - "tensorflow-probability==0.7.0", - "tensorflow==1.14.0" + "tensorflow-probability==0.23.0", + "tensorflow==2.15.0" ) ) ``` -Which will install the python modules into a conda environment named "greta-env". +Which will install the python modules into a conda environment named "greta-env-tf2". -You can also not install these not into a special conda environment "greta-env", -like so: +You can also not install these not into a special conda environment like so: ```{r install-deps-plain, eval = FALSE} reticulate::install_miniconda() reticulate::conda_install( packages = c( - "numpy==1.16.4", - "tensorflow-probability==0.7.0", - "tensorflow==1.14.0" + "tensorflow-probability==0.23.0", + "tensorflow==2.15.0" ) ) ``` +
+ + + From c3189ee17138085e60b5cec6818a431a15bc6c4f Mon Sep 17 00:00:00 2001 From: njtierney Date: Wed, 21 Aug 2024 14:00:45 +1000 Subject: [PATCH 11/14] * rename `read_greta_install_log()` --> `open_greta_install_log()` * Update installation vignette --- NAMESPACE | 2 +- NEWS.md | 2 +- R/write-logfiles.R | 2 +- ...stall_log.Rd => open_greta_install_log.Rd} | 6 +-- vignettes/installation.Rmd | 38 ++++++++++++------- 5 files changed, 31 insertions(+), 19 deletions(-) rename man/{read_greta_install_log.Rd => open_greta_install_log.Rd} (88%) diff --git a/NAMESPACE b/NAMESPACE index eeb4e5b0..1c8276ea 100644 --- a/NAMESPACE +++ b/NAMESPACE @@ -236,6 +236,7 @@ export(nelder_mead) export(newton_cg) export(normal) export(ones) +export(open_greta_install_log) export(opt) export(ordered_variable) export(pareto) @@ -244,7 +245,6 @@ export(powell) export(proximal_adagrad) export(proximal_gradient_descent) export(rdist) -export(read_greta_install_log) export(reinstall_greta_deps) export(reinstall_greta_env) export(reinstall_miniconda) diff --git a/NEWS.md b/NEWS.md index 87a05e19..a3fa0dcc 100644 --- a/NEWS.md +++ b/NEWS.md @@ -42,7 +42,7 @@ This release provides a few improvements to installation in greta. It should now * Added checking suite to ensure you are using valid versions of TF, TFP, and Python(#666) * Added data `greta_deps_tf_tfp` (#666), which contains valid versions combinations of TF, TFP, and Python. * remove `greta_nodes_install/conda_*()` options as #493 makes them defunct. 
-* Added option to write to a single logfile with `greta_set_install_logfile()`, and `write_greta_install_log()`, and `read_greta_install_log()` (#493) +* Added option to write to a single logfile with `greta_set_install_logfile()`, and `write_greta_install_log()`, and `open_greta_install_log()` (#493) * Added `destroy_greta_deps()` function to remove miniconda and python conda environment ## Minor diff --git a/R/write-logfiles.R b/R/write-logfiles.R index a4bff917..571da5f5 100644 --- a/R/write-logfiles.R +++ b/R/write-logfiles.R @@ -153,7 +153,7 @@ sys_get_env <- function(envvar){ #' #' @return opens a URL in your default browser #' @export -read_greta_install_log <- function(path = NULL){ +open_greta_install_log <- function(path = NULL){ log_env <- sys_get_env("GRETA_INSTALLATION_LOG") path <- path %||% log_env diff --git a/man/read_greta_install_log.Rd b/man/open_greta_install_log.Rd similarity index 88% rename from man/read_greta_install_log.Rd rename to man/open_greta_install_log.Rd index c8b0bb83..9f65bef8 100644 --- a/man/read_greta_install_log.Rd +++ b/man/open_greta_install_log.Rd @@ -1,10 +1,10 @@ % Generated by roxygen2: do not edit by hand % Please edit documentation in R/write-logfiles.R -\name{read_greta_install_log} -\alias{read_greta_install_log} +\name{open_greta_install_log} +\alias{open_greta_install_log} \title{Read a greta logfile} \usage{ -read_greta_install_log(path = NULL) +open_greta_install_log(path = NULL) } \arguments{ \item{path}{file to read. Optional. 
If not specified, it will search for
diff --git a/vignettes/installation.Rmd b/vignettes/installation.Rmd
index 6c29408f..67ed4d5f 100644
--- a/vignettes/installation.Rmd
+++ b/vignettes/installation.Rmd
@@ -1,5 +1,5 @@
 ---
-title: "Installing Python Dependencies"
+title: "Installing Dependencies"
 output: rmarkdown::html_vignette
 vignette: >
   %\VignetteIndexEntry{installation}
@@ -20,7 +20,9 @@ library(greta)
 
 # Why we need to install dependencies
 
-The greta package uses Google's [TensorFlow (TF)](https://www.tensorflow.org/) and [Tensorflow Probability (TFP)](https://github.com/tensorflow/probability)) under the hood to do efficient, fast, and scalable linear algebra and MCMC. TF and TFP are python packages, and so are required to be installed. This is different to how normal dependencies work with R packages, where the dependencies are automagically built and managed by CRAN. Unfortunately there isn't an automatic, reliable way to ensure that these are provided along when you install greta, so we need to take an additional step to install them. We have tried very hard to make the process as easy as possible by providing a helper function, `install_greta_deps()`.
+The greta package uses Google's [TensorFlow (TF)](https://www.tensorflow.org/) and [Tensorflow Probability (TFP)](https://github.com/tensorflow/probability) under the hood to do efficient, fast, and scalable linear algebra and MCMC. TF and TFP are python packages, and so are required to be installed. This is different to how normal dependencies work with R packages, where the dependencies are automagically built and managed by CRAN.
+
+Unfortunately, there isn't an automatic, reliable way to ensure that these are provided along with greta when you install it, so we need to take an additional step to install them. We have tried very hard to make the process as easy as possible by providing a helper function, `install_greta_deps()`.
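The default workflow this section describes can be sketched end to end (not run here; `greta_sitrep()` and the restart prompt are as described later in this vignette):

```r
# Sketch: one-off setup after installing the greta R package.
library(greta)

install_greta_deps()  # installs the default TF/TFP versions into "greta-env-tf2"
# ...restart R when prompted...
greta_sitrep()        # then confirm greta can see its python dependencies
```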
# How to install python dependencies using `install_greta_deps()`
 
@@ -60,9 +62,9 @@ If you provide an invalid installation versions, it will error and suggest some
 
 ## How we install dependencies
 
-This is for users who want to know more about the installation process of dependencies in greta.
+This section is for users who want to know more about the installation process of dependencies in greta.
 
-We create a separate R instance using [`callr`](https://callr.r-lib.org/index.html) to install python dependencies using `reticulate` to talk to Python, and the R package `tensorflow`, for installing the tensorflow python module. We use `callr` so that we can ensure the installation of python dependencies happens in a clean R session that doesn't have python or reticulate already loaded.
+We create a separate R instance using [`callr`](https://callr.r-lib.org/index.html) to install python dependencies using `reticulate` to talk to Python, and the R package `tensorflow`, for installing the tensorflow python module. We use `callr` so that we can ensure the installation of python dependencies happens in a clean R session that doesn't have python or reticulate already loaded. It also means that we can hide the large amounts of text output to the console that happens when installation is running - these are written to a logfile during installation that you can read with `open_greta_install_log()`.
 
 If miniconda isn't installed, we install miniconda. You can think of miniconda as a lightweight version of python with minimal dependencies.
 
@@ -74,20 +76,30 @@ After installation, we ask users if they want to restart in
 
 ## Troubleshooting installation
 
-Installation doesn't always go to plan. Sometimes
-
-### Using a logfile
-
-We write a logfile
+Installation doesn't always go to plan. Here are some approaches to getting your dependencies working. 
-### Using reinstallers +- Check you have restarted R after installing dependencies + - After you have installed dependencies with `install_greta_deps()`, you will be prompted to restart R. To use greta you must restart R after installing dependencies as this allows greta to connect to the installed python dependencies. -You +- Use `greta_sitrep()` to check dependencies. + - `greta_sitrep()` will provide information about your installed version of Python, TF, TFP, and whether a conda environment is used. This can be helpful to troubleshoot some installation issues. -### Destroying dependencies +- Check the installation logfile + - During installation we write a logfile, which records all of the steps taken during installation. This can sometimes provide useful clues as to what might have gone awry during installation. You can open the logfile with `open_greta_install_log()`, which opens the logfile in a browser window, and scroll through it to try and find errors or things that went wrong during installation. We recommend viewing this with `open_greta_install_log()` and then searching with Ctrl/Cmd+F for things like "error/Error/ERROR/warn/etc" to find problems. There might not be a clear solution to the problem, but the logfile might provide clues to the problem that you can share on a forum or issue on the greta github. +- Reinstall greta dependencies with `reinstall_greta_deps()` + - Sometimes we just need to "turn it off and on again". Use `reinstall_greta_deps()` to remove miniconda, and the greta conda environment, and install them again. -### Manual installation +- Manually remove python installation + - You can manually remove python installation by doing: + - `remove_greta_env()` + - `remove_miniconda()` + - or `destroy_greta_deps()`, which does both of these steps. 
+  - Then install the dependencies with: `install_greta_deps()`
+  - Note that this is functionally what `reinstall_greta_deps()` does, but sometimes it can be useful to split them out into separate steps.
+
+- Check internet access
+  - Installing these dependencies requires an internet connection, and sometimes the internet service provider (or perhaps your IT department) blocks downloads from sites like conda. In the past we have encountered this issue and have found that it can be avoided by doing re-installation with `reinstall_greta_deps()`.
 
 If the previous installation helper did not work, you can try the following:
 
From bbd7142722d3c510e28846ed27d6009ff65a7f4c Mon Sep 17 00:00:00 2001
From: njtierney 
Date: Wed, 21 Aug 2024 16:53:04 +1000
Subject: [PATCH 12/14] fix duplicate msg arg

---
 R/write-logfiles.R | 1 -
 1 file changed, 1 deletion(-)

diff --git a/R/write-logfiles.R b/R/write-logfiles.R
index 7923364a..dbcfa165 100644
--- a/R/write-logfiles.R
+++ b/R/write-logfiles.R
@@ -30,7 +30,6 @@ write_greta_install_log <- function(path = greta_logfile) {
   )
 
   cli::cli_progress_step(
-    msg = "Open with: {.fun read_greta_logfile}"
     msg = "Open with: {.run open_greta_install_log()}"
   )
 
From 7ccfeace9ec617cd15f1da98ad1eba63263156ef Mon Sep 17 00:00:00 2001
From: njtierney 
Date: Wed, 21 Aug 2024 16:58:46 +1000
Subject: [PATCH 13/14] read_greta_install_log --> open_greta_install_log

---
 NAMESPACE                     |  1 -
 man/open_greta_install_log.Rd | 18 ++++++++----------
 man/read_greta_install_log.Rd | 21 ---------------------
 3 files changed, 8 insertions(+), 32 deletions(-)
 delete mode 100644 man/read_greta_install_log.Rd

diff --git a/NAMESPACE b/NAMESPACE
index 2292f530..1c8276ea 100644
--- a/NAMESPACE
+++ b/NAMESPACE
@@ -245,7 +245,6 @@ export(powell)
 export(proximal_adagrad)
 export(proximal_gradient_descent)
 export(rdist)
-export(read_greta_install_log)
 export(reinstall_greta_deps)
 export(reinstall_greta_env)
 export(reinstall_miniconda)
diff --git a/man/open_greta_install_log.Rd 
b/man/open_greta_install_log.Rd index 9f65bef8..1c7a58e3 100644 --- a/man/open_greta_install_log.Rd +++ b/man/open_greta_install_log.Rd @@ -4,20 +4,18 @@ \alias{open_greta_install_log} \title{Read a greta logfile} \usage{ -open_greta_install_log(path = NULL) -} -\arguments{ -\item{path}{file to read. Optional. If not specified, it will search for -the environment variable "GRETA_INSTALLATION_LOG". To set -"GRETA_INSTALLATION_LOG" you can use -\code{Sys.setenv('GRETA_INSTALLATION_LOG'='path/to/logfile.html')}. Or use -\code{\link[=greta_set_install_logfile]{greta_set_install_logfile()}} to set the path, e.g., -\code{greta_set_install_logfile('path/to/logfile.html')}.} +open_greta_install_log() } \value{ opens a URL in your default browser } \description{ This is a convenience function to facilitate reading logfiles. It opens -a browser using \code{\link[utils:browseURL]{utils::browseURL()}}. +a browser using \code{\link[utils:browseURL]{utils::browseURL()}}. It will search for +the environment variable "GRETA_INSTALLATION_LOG" or default to +\code{tools::R_user_dir("greta")}. To set +"GRETA_INSTALLATION_LOG" you can use +\code{Sys.setenv('GRETA_INSTALLATION_LOG'='path/to/logfile.html')}. Or use +\code{\link[=greta_set_install_logfile]{greta_set_install_logfile()}} to set the path, e.g., +\code{greta_set_install_logfile('path/to/logfile.html')}. } diff --git a/man/read_greta_install_log.Rd b/man/read_greta_install_log.Rd deleted file mode 100644 index d1db5f91..00000000 --- a/man/read_greta_install_log.Rd +++ /dev/null @@ -1,21 +0,0 @@ -% Generated by roxygen2: do not edit by hand -% Please edit documentation in R/write-logfiles.R -\name{read_greta_install_log} -\alias{read_greta_install_log} -\title{Read a greta logfile} -\usage{ -read_greta_install_log() -} -\value{ -opens a URL in your default browser -} -\description{ -This is a convenience function to facilitate reading logfiles. It opens -a browser using \code{\link[utils:browseURL]{utils::browseURL()}}. 
It will search for -the environment variable "GRETA_INSTALLATION_LOG" or default to -\code{tools::R_user_dir("greta")}. To set -"GRETA_INSTALLATION_LOG" you can use -\code{Sys.setenv('GRETA_INSTALLATION_LOG'='path/to/logfile.html')}. Or use -\code{\link[=greta_set_install_logfile]{greta_set_install_logfile()}} to set the path, e.g., -\code{greta_set_install_logfile('path/to/logfile.html')}. -} From d481c9f56f2f0ea9c2268765eba5d26d43ba60af Mon Sep 17 00:00:00 2001 From: njtierney Date: Wed, 21 Aug 2024 23:42:41 +1000 Subject: [PATCH 14/14] update to open_greta_install_log in docs --- NEWS.md | 2 +- R/install_greta_deps.R | 2 +- man/install_greta_deps.Rd | 2 +- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/NEWS.md b/NEWS.md index b27511bd..190f28f2 100644 --- a/NEWS.md +++ b/NEWS.md @@ -44,7 +44,7 @@ This release provides a few improvements to installation in greta. It should now * remove `greta_nodes_install/conda_*()` options as #493 makes them defunct. * Added option to write to a single logfile with `greta_set_install_logfile()`, and `write_greta_install_log()`, and `open_greta_install_log()` (#493) * Added `destroy_greta_deps()` function to remove miniconda and python conda environment -* Improved `write_greta_install_log()` and `read_greta_install_log()` to use `tools::R_user_dir()` to always write to a file location. `read_greta_install_log()` will open one found from an environment variable or go to the default location. (#703) +* Improved `write_greta_install_log()` and `open_greta_install_log()` to use `tools::R_user_dir()` to always write to a file location. `open_greta_install_log()` will open one found from an environment variable or go to the default location. 
(#703) ## Minor diff --git a/R/install_greta_deps.R b/R/install_greta_deps.R index a0798224..61175f99 100644 --- a/R/install_greta_deps.R +++ b/R/install_greta_deps.R @@ -12,7 +12,7 @@ #' `tools::R_user_dir("greta")` as the directory to save a logfile named #' "greta-installation-logfile.html". To see installation notes or errors, #' after installation you can open the logfile with -#' [read_greta_install_log()], or you can navigate to the logfile and open +#' [open_greta_install_log()], or you can navigate to the logfile and open #' it in a browser. #' #' @param deps object created with [greta_deps_spec()] where you diff --git a/man/install_greta_deps.Rd b/man/install_greta_deps.Rd index 936ef724..d4ab5032 100644 --- a/man/install_greta_deps.Rd +++ b/man/install_greta_deps.Rd @@ -52,7 +52,7 @@ location with \code{GRETA_INSTALLATION_LOG} using \code{tools::R_user_dir("greta")} as the directory to save a logfile named "greta-installation-logfile.html". To see installation notes or errors, after installation you can open the logfile with -\code{\link[=read_greta_install_log]{read_greta_install_log()}}, or you can navigate to the logfile and open +\code{\link[=open_greta_install_log]{open_greta_install_log()}}, or you can navigate to the logfile and open it in a browser. By default, if using RStudio, it will now ask you if you want to restart