[Security] load --> safe_load (#1987)
* load --> safe_load

* retrigger ci

* Update setup.py

* fix for fairscale

* up-pin fairscale version

* Update test_core.sh

(cherry picked from commit 5f4bb24)
sxjscience authored and gradientsky committed Jul 28, 2022
1 parent 1a62b44 commit 23a37e7
Showing 4 changed files with 5 additions and 3 deletions.
.github/workflow_scripts/test_core.sh (1 change: 1 addition & 0 deletions)

@@ -6,6 +6,7 @@ source $(dirname "$0")/env_setup.sh
 
 setup_build_env
 install_core_all_tests
+python3 -m pip install ray_lightning==0.2.0  # TODO Change this line once we support ray_lightning 0.3.0
 
 cd core/
 python3 -m pytest --junitxml=results.xml --runslow tests
core/setup.py (2 changes: 1 addition & 1 deletion)

@@ -49,7 +49,7 @@
 
 tests_require = [
     'pytest',
-    'ray_lightning>=0.2.0,<0.3.0',  # test ray lightning resource calculation
+    # TODO(Re-enable ray_lightning once it released 0.3.0) 'ray_lightning>=0.2.0,<0.3.0'
 ]
 
 all_requires = []
multimodal/setup.py (2 changes: 1 addition & 1 deletion)

@@ -30,7 +30,7 @@
     'torch>=1.9,<1.13',
     'torchvision<0.14.0',
     'torchtext<0.14.0',
-    'fairscale>=0.4.5,<0.5.0',
+    'fairscale>=0.4.5,<=0.4.6',
     'scikit-image>=0.19.1,<0.20.0',
     'smart_open>=5.2.1,<5.3.0',
     'pytorch_lightning>=1.6.0,<1.7.0',
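As context for the tightened fairscale requirement above, a minimal sketch of how the old and new specifiers differ; the use of the packaging library here is an assumption of this illustration and not part of the commit. Version 0.4.7 satisfied the old >=0.4.5,<0.5.0 bound but is excluded by the new >=0.4.5,<=0.4.6 bound.

    from packaging.specifiers import SpecifierSet

    # Old and new fairscale constraints taken from this diff.
    old_spec = SpecifierSet(">=0.4.5,<0.5.0")
    new_spec = SpecifierSet(">=0.4.5,<=0.4.6")

    for version in ("0.4.5", "0.4.6", "0.4.7"):
        print(version, version in old_spec, version in new_spec)
    # 0.4.5 True True
    # 0.4.6 True True
    # 0.4.7 True False   <- now rejected by the tightened pin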
multimodal/src/autogluon/multimodal/predictor.py (3 changes: 2 additions & 1 deletion)

@@ -1121,7 +1121,8 @@ def _top_k_average(
         best_k_models_yaml_path = os.path.join(save_path, BEST_K_MODELS_FILE)
         if os.path.exists(best_k_models_yaml_path):
             with open(best_k_models_yaml_path, "r") as f:
-                best_k_models = yaml.load(f, Loader=yaml.Loader)
+                best_k_models = yaml.safe_load(f)
+
         else:
             # In some cases, the training ends up too early (e.g., due to time_limit) so that there is
             # no saved best_k model checkpoints. In that scenario, we won't perform any model averaging.
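The predictor.py change is the security fix named in the commit title: a full-Loader yaml.load becomes yaml.safe_load. A minimal illustrative sketch of why this matters; the YAML strings below are made up for this example and are not taken from the repository. safe_load only constructs plain Python types, while yaml.load(f, Loader=yaml.Loader) also honors tags such as !!python/object/apply, which can execute arbitrary code when the document comes from an untrusted source.

    import yaml

    trusted_doc = "best.ckpt: 0.87"
    # Hypothetical malicious document: parsing it with yaml.load(f, Loader=yaml.Loader)
    # would call os.system while constructing the value.
    malicious_doc = "!!python/object/apply:os.system ['echo arbitrary command']"

    print(yaml.safe_load(trusted_doc))  # {'best.ckpt': 0.87}

    try:
        yaml.safe_load(malicious_doc)   # safe_load refuses python/* tags
    except yaml.constructor.ConstructorError as exc:
        print("rejected by safe_load:", exc)

For a plain mapping of checkpoint paths to scores, such as the best_k_models file read here, safe_load returns the same result the full loader did, so the behavior of _top_k_average is unchanged.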
