TypeError: 'NoneType' object is not iterable #22

Closed
AlexYiLuo opened this issue Sep 26, 2023 · 1 comment
Comments

@AlexYiLuo

Hello,

I have been getting the following error and am not sure what went wrong. Could you help with troubleshooting? Thank you very much!

neural-admixture train --k 5 --supervised --populations_path label.txt --name RUN_NAME --data_path phase3_kin_prune.bed --save_dir SAVE_PATH
INFO:neural_admixture.entry:Neural ADMIXTURE - Version 1.3.0
INFO:neural_admixture.entry:[CHANGELOG] Default P initialization was changed to 'pckmeans' in version 1.3.0.
INFO:neural_admixture.entry:[CHANGELOG] Warmup training for initialization of Q was added in version 1.3.0 to improve training stability (only for pckmeans).
INFO:neural_admixture.entry:[CHANGELOG] Convergence check changed so it is performed after 15 epochs in version 1.3.0 to improve training stability.
INFO:neural_admixture.entry:[CHANGELOG] Default learning rate was changed to 1e-5 instead of 1e-4 in version 1.3.0 to improve training stability.
INFO:neural_admixture.src.utils:Reading data...
INFO:neural_admixture.src.snp_reader:Input format is BED.
Mapping files: 100%|███████████████████████████████████████████████████████████████████████████| 3/3 [00:01<00:00, 1.61it/s]
INFO:neural_admixture.src.utils:Data contains 2492 samples and 842416 SNPs.
INFO:neural_admixture.src.utils:Data loaded.
INFO:neural_admixture.src.train:Job args: Namespace(learning_rate=1e-05, max_epochs=50, initialization='pckmeans', optimizer='adam', save_every=10, l2_penalty=0.0005, activation='gelu', seed=42, k=5, min_k=None, max_k=None, hidden_size=64, init_file=None, freeze_decoder=False, supervised=True, validation_data_path='', populations_path='label.txt', validation_populations_path='', wandb_log=False, wandb_user=None, wandb_project=None, pca_path=None, pca_components=2, tol=1e-06, save_dir='SAVE_PATH', data_path='phase3_kin_prune.bed', name='RUN_NAME', batch_size=400, supervised_loss_weight=0.05, warmup_epochs=10)
INFO:neural_admixture.src.train:Will use GPU.
INFO:neural_admixture.src.train:Initializing...
WARNING:neural_admixture.src.train:Initialization filename not provided. Going to store it to SAVE_PATH/RUN_NAME.pkl
INFO:neural_admixture.model.initializations:Running supervised initialization...
INFO:neural_admixture.model.initializations:Weights initialized in 19.29151487350464 seconds.
INFO:neural_admixture.src.train:Variants: 842416
INFO:neural_admixture.src.train:Optimizer successfully loaded.
INFO:neural_admixture.src.train:Going to train 1 head: K=5.
INFO:neural_admixture.src.train:Fitting...
INFO:neural_admixture.model.neural_admixture:Will stop optimization when difference in objective function between two subsequent iterations is < 1e-06 or after 50 epochs.
INFO:neural_admixture.model.neural_admixture:Going to train on supervised mode.
INFO:neural_admixture.model.neural_admixture:Bringing training data into memory...
INFO:neural_admixture.model.neural_admixture:Running warmup epochs...
Warmup epoch 1: 0%| | 0/7 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "/home/yluo/.conda/envs/ndam/bin/neural-admixture", line 8, in <module>
    sys.exit(main())
  File "/home/yluo/.conda/envs/ndam/lib/python3.9/site-packages/neural_admixture/entry.py", line 18, in main
    sys.exit(train.main(arg_list[2:]))
  File "/home/yluo/.conda/envs/ndam/lib/python3.9/site-packages/neural_admixture/src/train.py", line 162, in main
    model, device, _ = fit_model(trX, args, valX, trY, valY)
  File "/home/yluo/.conda/envs/ndam/lib/python3.9/site-packages/neural_admixture/src/train.py", line 116, in fit_model
    actual_num_epochs = model.launch_training(trX, optimizer, loss_f, num_max_epochs, device, valX=valX,
  File "/home/yluo/.conda/envs/ndam/lib/python3.9/site-packages/neural_admixture/model/neural_admixture.py", line 135, in launch_training
    _, _ = self._run_warmup_epoch(trX, Q_inits, opt_warmup, loss_f_warmup, batch_size, device, shuffle, epoch_num=wep+1)
  File "/home/yluo/.conda/envs/ndam/lib/python3.9/site-packages/neural_admixture/model/neural_admixture.py", line 284, in _run_warmup_epoch
    Q_inits=[Y.to(device) for Y in Ys])
TypeError: 'NoneType' object is not iterable

@AlbertDominguez AlbertDominguez added the bug Something isn't working label Sep 27, 2023
@AlbertDominguez AlbertDominguez self-assigned this Sep 27, 2023
@AlbertDominguez
Collaborator

Thanks for reporting this! The error was caused by the network trying to run the warmup training, which should only happen when using PCK-Means initialization, not in supervised mode as in your case.
This is fixed in the latest release (v1.3.1), which skips warmup unless the initialization is PCK-Means. Please run pip install -U neural-admixture to update the library!
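For reference, here is a minimal, self-contained sketch of the failure mode and the kind of guard described above. The names (run_warmup, Q_inits, supervised_mode) are illustrative assumptions, not the actual neural-admixture source:

def run_warmup(Q_inits, warmup_epochs=2):
    """Pretend warmup: iterates over the per-head Q initializations."""
    for epoch in range(warmup_epochs):
        batches = [q for q in Q_inits]  # raises TypeError if Q_inits is None
        print(f"warmup epoch {epoch + 1}: {len(batches)} head(s)")

supervised_mode = True
# In supervised mode no PCK-Means Q initializations are produced:
Q_inits = None if supervised_mode else [[0.2, 0.8], [0.5, 0.5]]

# Pre-1.3.1 behaviour: warmup ran unconditionally and crashed on None.
# run_warmup(Q_inits)  # -> TypeError: 'NoneType' object is not iterable

# Fixed behaviour: skip warmup unless the initialization produced Q_inits.
if Q_inits is not None:
    run_warmup(Q_inits)
else:
    print("Skipping warmup (no PCK-Means Q initializations available).")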
