Package: torchopt 0.1.4
torchopt: Advanced Optimizers for Torch
Optimizers for the 'torch' deep learning library. These functions implement recent results from the literature that are not part of the optimizers offered in 'torch'. Prospective users should test these optimizers with their own data, since performance depends on the specific problem being solved. The package includes the following optimizers: (a) 'adabelief' by Zhuang et al. (2020), <arxiv:2010.07468>; (b) 'adabound' by Luo et al. (2019), <arxiv:1902.09843>; (c) 'adahessian' by Yao et al. (2021), <arxiv:2006.00719>; (d) 'adamw' by Loshchilov & Hutter (2019), <arxiv:1711.05101>; (e) 'madgrad' by Defazio and Jelassi (2021), <arxiv:2101.11075>; (f) 'nadam' by Dozat (2019), <https://openreview.net/pdf/OM0jvwB8jIp57ZJjtNEZ.pdf>; (g) 'qhadam' by Ma and Yarats (2019), <arxiv:1810.06801>; (h) 'radam' by Liu et al. (2019), <arxiv:1908.03265>; (i) 'swats' by Keskar and Socher (2018), <arxiv:1712.07628>; (j) 'yogi' by Zaheer et al. (2019), <https://papers.nips.cc/paper/8186-adaptive-methods-for-nonconvex-optimization>.
Authors:
torchopt_0.1.4.tar.gz
torchopt_0.1.4.zip (r-4.5) | torchopt_0.1.4.zip (r-4.4) | torchopt_0.1.4.zip (r-4.3)
torchopt_0.1.4.tgz (r-4.4-any) | torchopt_0.1.4.tgz (r-4.3-any)
torchopt_0.1.4.tar.gz (r-4.5-noble) | torchopt_0.1.4.tar.gz (r-4.4-noble)
torchopt_0.1.4.tgz (r-4.4-emscripten) | torchopt_0.1.4.tgz (r-4.3-emscripten)
torchopt.pdf | torchopt.html
torchopt/json (API)
NEWS
# Install 'torchopt' in R:
install.packages('torchopt', repos = c('https://e-sensing.r-universe.dev', 'https://cloud.r-project.org'))
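Once installed, the optimizers follow the standard 'torch' optimizer interface (`zero_grad()` / `step()`), so they can replace the built-in optimizers directly. A minimal sketch, assuming 'torch' and 'torchopt' are installed (the quadratic objective and hyperparameters here are illustrative, not from the package docs):

```r
library(torch)
library(torchopt)

# A simple convex objective with its minimum at the origin.
x <- torch_tensor(c(5, 5), requires_grad = TRUE)

# optim_adamw takes a list of parameters, like the built-in torch optimizers.
opt <- optim_adamw(params = list(x), lr = 0.1)

for (i in seq_len(200)) {
  opt$zero_grad()          # clear gradients from the previous step
  loss <- torch_sum(x^2)   # f(x) = ||x||^2
  loss$backward()          # compute gradients
  opt$step()               # apply the AdamW update
}

print(as.numeric(x))  # values should move toward 0
```

Any of the other exported optimizers (`optim_adabelief`, `optim_madgrad`, `optim_yogi`, ...) can be swapped in with the same pattern, subject to each optimizer's own hyperparameters.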
Bug tracker: https://github.com/e-sensing/torchopt/issues
Topics: deep-learning, numerical-optimization
Last updated 1 year ago from 399f27b52a. Checks: OK: 7. Indexed: yes.
Target | Result | Date |
---|---|---|
Doc / Vignettes | OK | Oct 27 2024 |
R-4.5-win | OK | Oct 27 2024 |
R-4.5-linux | OK | Oct 27 2024 |
R-4.4-win | OK | Oct 27 2024 |
R-4.4-mac | OK | Oct 27 2024 |
R-4.3-win | OK | Oct 27 2024 |
R-4.3-mac | OK | Oct 27 2024 |
Exports: optim_adabelief, optim_adabound, optim_adahessian, optim_adamw, optim_madgrad, optim_nadam, optim_qhadam, optim_radam, optim_swats, optim_yogi, test_optim
Dependencies: bit, bit64, callr, cli, coro, desc, ellipsis, glue, jsonlite, magrittr, processx, ps, R6, Rcpp, rlang, safetensors, torch, withr
Readme and manuals
Help Manual
Help page | Topics |
---|---|
Adabelief optimizer | optim_adabelief |
Adabound optimizer | optim_adabound |
Adahessian optimizer | optim_adahessian |
AdamW optimizer | optim_adamw |
MADGRAD optimizer | optim_madgrad |
Nadam optimizer | optim_nadam |
QHAdam optimization algorithm | optim_qhadam |
RAdam optimizer | optim_radam |
SWATS optimizer | optim_swats |
Yogi optimizer | optim_yogi |
Test optimization function | test_optim |
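The `test_optim()` helper visualizes an optimizer's trajectory on a standard 2-D test surface, which is how the package suggests comparing optimizers before committing to one. A hedged sketch; the argument names (`optim`, `test_fn`, `opt_hparams`, `steps`) are assumptions based on the package documentation and may differ across versions:

```r
library(torchopt)

# Plot optim_adamw descending the Beale test function.
# NOTE: argument names below are assumptions from the package README,
# not verified against this exact release.
set.seed(42)
test_optim(
  optim       = optim_adamw,       # optimizer constructor to evaluate
  test_fn     = "beale",           # built-in 2-D test surface
  opt_hparams = list(lr = 0.05),   # hyperparameters passed to the optimizer
  steps       = 400                # number of optimization steps to trace
)
```

Running the same call with a different `optim` argument (e.g. `optim_madgrad`) gives a side-by-side sense of how the trajectories differ on the same surface.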