Bump required mlr3learners version #196

Open · MichaelChirico wants to merge 1 commit into main

Conversation

MichaelChirico

I am not sure what version is actually required. I am just surfacing from an incredibly painful few weeks of intermittent debugging, which was finally resolved by updating {mlr3learners}.

Here's the type of stack trace I was dealing with 🫨

── Error (test-double_ml_pliv_exception_handling.R:46:3): Unit tests for deprecation warnings of PLIV ──
Error in `predict.ranger.forest(forest, data, predict.all, num.trees, type, 
    se.method, seed, num.threads, verbose, object$inbag.counts, 
    ...)`: Error: One or more independent variables not found in data.
Backtrace:
     ▆
  1. ├─testthat::expect_warning(dml_obj$tune(par_grids), regexp = msg) at test-double_ml_pliv_exception_handling.R:46:3
  2. │ └─testthat:::quasi_capture(...)
  3. │   ├─testthat (local) .capture(...)
  4. │   │ └─base::withCallingHandlers(...)
  5. │   └─rlang::eval_bare(quo_get_expr(.quo), quo_get_env(.quo))
  6. └─dml_obj$tune(par_grids)
  7.   └─super$tune(param_set, tune_settings, tune_on_folds)
  8.     └─private$nuisance_tuning(...)
  9.       └─private$nuisance_tuning_partialX(...)
 10.         └─DoubleML:::dml_tune(...)
 11.           └─base::lapply(...)
 12.             └─DoubleML (local) FUN(X[[i]], ...)
 13.               └─DoubleML:::tune_instance(tune_settings$tuner, x)
 14.                 └─tuner$optimize(tuning_instance)
 15.                   └─mlr3tuning:::.__Tuner__optimize(...)
 16.                     └─bbotk::optimize_default(inst, self, private)
 17.                       ├─base::tryCatch(...)
 18.                       │ └─base (local) tryCatchList(expr, classes, parentenv, handlers)
 19.                       │   └─base (local) tryCatchOne(expr, names, parentenv, handlers[[1L]])
 20.                       │     └─base (local) doTryCatch(return(expr), name, parentenv, handler)
 21.                       └─private$.optimize(inst)
 22.                         └─mlr3tuning:::.__TunerGridSearch__.optimize(...)
 23.                           └─inst$eval_batch(data[inds])
 24.                             └─bbotk:::.__OptimInstance__eval_batch(...)
 25.                               └─self$objective$eval_many(xss_trafoed)
 26.                                 └─bbotk:::.__Objective__eval_many(...)
 27.                                   ├─mlr3misc::invoke(private$.eval_many, xss, .args = self$constants$values)
 28.                                   │ └─base::eval.parent(expr, n = 1L)
 29.                                   │   └─base::eval(expr, p)
 30.                                   │     └─base::eval(expr, p)
 31.                                   └─private$.eval_many(xss, resampling = `<list>`)
 32.                                     └─mlr3tuning:::.__ObjectiveTuning__.eval_many(...)
 33.                                       └─mlr3::benchmark(...)
 34.                                         └─mlr3:::future_map(...)
 35.                                           └─future.apply::future_mapply(...)
 36.                                             └─future.apply:::future_xapply(...)
 37.                                               ├─future::value(fs)
 38.                                               └─future:::value.list(fs)
 39.                                                 ├─future::resolve(...)
 40.                                                 └─future:::resolve.list(...)
 41.                                                   └─future (local) signalConditionsASAP(obj, resignal = FALSE, pos = ii)
 42.                                                     └─future:::signalConditions(...)

At the root here is a problem with monorepos -- we typically update package versions only incrementally. I was trying to update data.table (1.14.x -> 1.15.5), and it set off a chain reaction of required updates, at the end of which was this issue: an {mlr3learners} update was required deep in the stack of {DoubleML} tests.

All this is to explain why I have not dug into exactly which minimal version requirement solves the issue; that is left as an exercise for the reader 😉
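
For concreteness, the change itself is just a version floor on the mlr3learners entry in DESCRIPTION, along the lines of the snippet below (treat 0.6.0 as a best guess for the floor rather than a verified minimum, per the above):

    mlr3learners (>= 0.6.0)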

@PhilippBach
Member

Dear @MichaelChirico,

Thanks for opening the PR. I'll check this in more detail.

Best,
Philipp

@PhilippBach
Member

Dear @MichaelChirico,

I'm trying to replicate the warning. Could you provide any information or a minimal reproducible example of when this is actually issued? Thanks 🙏

@MichaelChirico
Author

MichaelChirico commented Jun 5, 2024

Hmm, I know it's a pain, but that may be tough.

Does the test suite pass for you with data.table 1.15.4 and mlr3learners 0.3.0 installed?
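
If it helps, here's roughly how I'd pin those versions and run the suite locally (untested as written; it assumes the remotes package and a checkout of this repo):

    # install the specific versions in question (pulled from the CRAN archive)
    remotes::install_version("data.table", version = "1.15.4")
    remotes::install_version("mlr3learners", version = "0.3.0")

    # then, from the root of a DoubleML checkout, run the test suite
    testthat::test_local(".")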

@PhilippBach mentioned this pull request Jun 6, 2024
@PhilippBach
Member

PhilippBach commented Jun 6, 2024

Sorry for yet another question: which R version are you using? R 4.4.0 or an earlier version?

@PhilippBach
Member

Dear @MichaelChirico, I've spent some time on this and I'm also not sure what causes the error... I played around with different R and mlr3learners versions and could not really figure out why in some cases I get the same error and in others I don't...

Anyway, thanks for opening this PR and for letting us know about this issue. I'm also wondering whether this is an issue in DoubleML or rather in the mlr3verse. I think I'd have to do a bit more research and also run some checks in our testing workflows. Right now, based on what I have (not) found out, I would not add the version requirement (>= 0.6.0) on mlr3learners for DoubleML.

So I'd keep this PR open and come back to it once I've found out more. Thanks!

@MichaelChirico
Author

We are on R 4.1.x, I think 4.1.3.
