During distance normalization, the radar array is expanded all at once, which is untenable for medium (0.1 GB) to large files because the expanded array can exceed available RAM (speaking from experience). Lines 96 and 99 of arrayops.py (at bb3bb2c) are the culprits and should be modified.
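To illustrate the problem (a minimal sketch, not readgssi's actual code): np.repeat over the whole array allocates the fully expanded copy before any reduction happens, so peak memory scales with the repeat factor times the file size.

```python
import numpy as np

# Illustrative only: expanding the whole radar array at once, as the
# current code does. The array and repeat factor here are stand-ins.
arr = np.zeros((512, 1000), dtype=np.float32)  # stand-in radar array
repeats = 75                                   # example expansion factor

# np.repeat allocates the entire expanded array up front:
expanded = np.repeat(arr, repeats, axis=1)

# Peak memory is ~repeats x the original array size.
print(expanded.nbytes / arr.nbytes)  # → 75.0
```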
Solution: chunk-ify distance normalization by breaking the original array into int(norm_vel['normalized'].mean()) + 1 parts and appending each normalized chunk to a processed array to return. That way, max memory usage should only be ~3x the file size, instead of norm_vel['normalized'].mean() * filesize (which is far too often something extreme like 75x).
This will require a slicing for loop around the np.repeat() and readgssi.arrayops.reducex() functions to build the new array block by block.
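The proposed slicing loop might look like the sketch below. This is an assumption-laden outline, not readgssi's implementation: reduce_traces() is a hypothetical stand-in for readgssi.arrayops.reducex(), and the chunk-count and column-count parameters are placeholders for the values the real code would derive from norm_vel.

```python
import numpy as np

def reduce_traces(arr, target_cols):
    # Hypothetical stand-in for readgssi.arrayops.reducex():
    # downsample along the trace axis to target_cols columns.
    idx = np.linspace(0, arr.shape[1] - 1, target_cols).astype(int)
    return arr[:, idx]

def distnorm_chunked(arr, repeats, n_chunks, cols_per_chunk):
    """Expand and reduce one slice at a time, so peak memory is
    roughly one expanded chunk rather than repeats * filesize."""
    chunks = []
    # Slice boundaries along the trace (column) axis.
    bounds = np.linspace(0, arr.shape[1], n_chunks + 1).astype(int)
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        # Only this slice is ever expanded in memory at once.
        expanded = np.repeat(arr[:, lo:hi], repeats, axis=1)
        chunks.append(reduce_traces(expanded, cols_per_chunk))
    # Tack the normalized chunks together into the processed array.
    return np.concatenate(chunks, axis=1)
```

In the issue's terms, n_chunks would be int(norm_vel['normalized'].mean()) + 1, so each chunk expands to roughly the original file size and the working set stays within a small multiple of it.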