From e733af194d78df9292e35d3428b2d7a1b3c8a8d1 Mon Sep 17 00:00:00 2001
From: Sebastian Fischer
Date: Mon, 9 Oct 2023 16:24:19 +0200
Subject: [PATCH 1/2] docs: improve documentation

---
 R/DataBackendLazy.R       | 7 +++++++
 R/PipeOpModule.R          | 2 +-
 R/PipeOpTorch.R           | 4 ++--
 man/mlr_backends_lazy.Rd  | 7 +++++++
 man/mlr_pipeops_module.Rd | 2 +-
 man/mlr_pipeops_torch.Rd  | 4 ++--
 6 files changed, 20 insertions(+), 6 deletions(-)

diff --git a/R/DataBackendLazy.R b/R/DataBackendLazy.R
index 9612e564..7ce2ca6a 100644
--- a/R/DataBackendLazy.R
+++ b/R/DataBackendLazy.R
@@ -21,6 +21,13 @@
 #'
 #' Beware that accessing the backend's hash also contructs the backend.
 #'
+#' **Important**
+#'
+#' When the constructor generates `factor()` variables, it is important that the ordering of the levels in the data
+#' corresponds to the ordering of the levels in the `col_info` argument.
+#' Because the ordering of the levels depends on the locale, it is recommended to fix the locale in the
+#' `constructor` function, e.g. by using the `C` locale via `withr::with_locale()`.
+#'
 #' @param constructor (`function()`)\cr
 #' A function with no arguments, whose return value must be the actual backend.
 #' This function is called the first time the field `$backend` is accessed.
diff --git a/R/PipeOpModule.R b/R/PipeOpModule.R
index 5f41fa61..0d07aac8 100644
--- a/R/PipeOpModule.R
+++ b/R/PipeOpModule.R
@@ -8,7 +8,7 @@
 #' represents a neural network architecture. Such a graph can also be used to create a [`nn_graph`] which inherits
 #' from [`nn_module`].
 #'
-#' In most cases it is easier to create such a network by creating a isomorphic graph consisting
+#' In most cases it is easier to create such a network by creating a structurally related graph consisting
 #' of nodes of class [`PipeOpTorchIngress`] and [`PipeOpTorch`]. This graph will then generate the graph consisting
 #' of `PipeOpModule`s as part of the [`ModelDescriptor`].
 #'
diff --git a/R/PipeOpTorch.R b/R/PipeOpTorch.R
index ee035846..6a831211 100644
--- a/R/PipeOpTorch.R
+++ b/R/PipeOpTorch.R
@@ -5,7 +5,7 @@
 #' @description
 #' `PipeOpTorch` is the base class for all [`PipeOp`]s that represent neural network layers in a [`Graph`].
 #' During **training**, it generates a [`PipeOpModule`] that wraps an [`nn_module`][torch::nn_module] and attaches it
-#' to the isomorphic architecture, which is also represented as a [`Graph`] consisting mostly of [`PipeOpModule`]s
+#' to the architecture, which is also represented as a [`Graph`] consisting mostly of [`PipeOpModule`]s
 #' an [`PipeOpNOP`]s.
 #'
 #' While the former [`Graph`] operates on [`ModelDescriptor`]s, the latter operates on [tensors][torch_tensor].
@@ -74,7 +74,7 @@
 #' `id` and the `channel` name of the sending `PipeOp` in the slot `.pointer`.
 #'
 #' The new graph in the [`model_descriptor_union`] represents the current state of the neural network
-#' architecture. It is isomorphic to the subgraph that consists of all pipeops of class `PipeOpTorch` and
+#' architecture. It is structurally similar to the subgraph that consists of all pipeops of class `PipeOpTorch` and
 #' [`PipeOpTorchIngress`] that are ancestors of this `PipeOpTorch`.
 #'
 #' For the output, a shallow copy of the [`ModelDescriptor`] is created and the `.pointer` and
diff --git a/man/mlr_backends_lazy.Rd b/man/mlr_backends_lazy.Rd
index 71b45b4a..a6de2e74 100644
--- a/man/mlr_backends_lazy.Rd
+++ b/man/mlr_backends_lazy.Rd
@@ -24,6 +24,13 @@ Information that is available before the backend is constructed is:
 }
 
 Beware that accessing the backend's hash also contructs the backend.
+
+\strong{Important}
+
+When the constructor generates \code{factor()} variables, it is important that the ordering of the levels in the data
+corresponds to the ordering of the levels in the \code{col_info} argument.
+Because the ordering of the levels depends on the locale, it is recommended to fix the locale in the
+\code{constructor} function, e.g. by using the \code{C} locale via \code{withr::with_locale()}.
 }
 \examples{
 # We first define a backend constructor
diff --git a/man/mlr_pipeops_module.Rd b/man/mlr_pipeops_module.Rd
index 5d2cb04b..83601e6f 100644
--- a/man/mlr_pipeops_module.Rd
+++ b/man/mlr_pipeops_module.Rd
@@ -10,7 +10,7 @@ By doing so, this allows to assemble \code{PipeOpModule}s in a computational \co
 represents a neural network architecture. Such a graph can also be used to create a \code{\link{nn_graph}} which inherits
 from \code{\link{nn_module}}.
 
-In most cases it is easier to create such a network by creating a isomorphic graph consisting
+In most cases it is easier to create such a network by creating a structurally related graph consisting
 of nodes of class \code{\link{PipeOpTorchIngress}} and \code{\link{PipeOpTorch}}. This graph will then generate the graph consisting
 of \code{PipeOpModule}s as part of the \code{\link{ModelDescriptor}}.
 }
diff --git a/man/mlr_pipeops_torch.Rd b/man/mlr_pipeops_torch.Rd
index 7a938eb4..458fcf7a 100644
--- a/man/mlr_pipeops_torch.Rd
+++ b/man/mlr_pipeops_torch.Rd
@@ -7,7 +7,7 @@
 \description{
 \code{PipeOpTorch} is the base class for all \code{\link{PipeOp}}s that represent neural network layers in a \code{\link{Graph}}.
 During \strong{training}, it generates a \code{\link{PipeOpModule}} that wraps an \code{\link[torch:nn_module]{nn_module}} and attaches it
-to the isomorphic architecture, which is also represented as a \code{\link{Graph}} consisting mostly of \code{\link{PipeOpModule}}s
+to the architecture, which is also represented as a \code{\link{Graph}} consisting mostly of \code{\link{PipeOpModule}}s
 an \code{\link{PipeOpNOP}}s.
 
 While the former \code{\link{Graph}} operates on \code{\link{ModelDescriptor}}s, the latter operates on \link[=torch_tensor]{tensors}.
@@ -84,7 +84,7 @@ This is possible because every incoming \code{\link{ModelDescriptor}} contains t
 \code{id} and the \code{channel} name of the sending \code{PipeOp} in the slot \code{.pointer}.
 
 The new graph in the \code{\link{model_descriptor_union}} represents the current state of the neural network
-architecture. It is isomorphic to the subgraph that consists of all pipeops of class \code{PipeOpTorch} and
+architecture. It is structurally similar to the subgraph that consists of all pipeops of class \code{PipeOpTorch} and
 \code{\link{PipeOpTorchIngress}} that are ancestors of this \code{PipeOpTorch}.
 
 For the output, a shallow copy of the \code{\link{ModelDescriptor}} is created and the \code{.pointer} and

From d32f305cbd1e0c8b2730be880533866d4ba88bd3 Mon Sep 17 00:00:00 2001
From: Sebastian Fischer
Date: Tue, 10 Oct 2023 14:42:32 +0200
Subject: [PATCH 2/2] deps: mlr3 version

---
 DESCRIPTION | 5 ++---
 1 file changed, 2 insertions(+), 3 deletions(-)

diff --git a/DESCRIPTION b/DESCRIPTION
index b0017f9a..8000415b 100644
--- a/DESCRIPTION
+++ b/DESCRIPTION
@@ -33,7 +33,7 @@ Description: Deep Learning library that extends the mlr3 framework by building
     defined in 'mlr3pipelines'.
 License: LGPL (>= 3)
 Depends:
-    mlr3 (>= 0.16.0),
+    mlr3 (>= 0.16.1),
     mlr3pipelines (>= 0.5.0),
     torch (>= 0.11.0),
     R (>= 3.5.0)
@@ -63,8 +63,7 @@ Suggests:
     testthat (>= 3.0.0),
     zip
 Remotes:
-    r-lib/zip,
-    mlr-org/mlr3@col_info
+    r-lib/zip
 Config/testthat/edition: 3
 NeedsCompilation: no
 ByteCompile: no
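The locale advice introduced in PATCH 1/2 can be sketched in R as follows. This snippet is illustrative and not part of the patch: the toy `constructor()` and its data frame are invented for the example, and a real `DataBackendLazy` constructor would return an actual backend rather than a plain `data.frame`.

```r
# Sketch: pinning LC_COLLATE inside a lazy-backend constructor so that the
# default level order of factor() (the sorted unique values) does not depend
# on the user's locale and therefore matches the levels recorded in col_info.
library(withr)

constructor <- function() {
  withr::with_locale(c(LC_COLLATE = "C"), {
    # Under C collation, character sorting follows ASCII order, so uppercase
    # letters come before lowercase and the levels are "A", "a", "b" on
    # every system.
    data.frame(
      id  = 1:3,
      grp = factor(c("b", "A", "a"))
    )
  })
}

levels(constructor()$grp)  # "A" "a" "b" regardless of the session locale
```

Without the pinned locale, a session using e.g. `en_US` collation may instead order the levels as `"a", "A", "b"`, silently disagreeing with a `col_info` that was recorded under the `C` locale.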