When trying to run this:
import numpy as np
import glob

npfiles = glob.glob("*.npy")
npfiles.sort()

for file in npfiles:
    with open(file, 'rb') as f:
        array = np.load(f, allow_pickle=True)

        # On the first file, test doesn't exist yet, so create it;
        # afterwards, append each new array to it.
        try:
            test
        except NameError:
            test = array
        else:
            test = np.append(test, array)

array.shape
This is the error I receive:
MemoryError: Unable to allocate 1.28 GiB for an array with shape (24115,) and data type <U14262
I'm trying to simply load several .npy files into memory and append them to one another. Their total size is only 1.58GB and I have 64GB of physical RAM.
I found Stack Overflow posts describing the same problem where the cause was under-allocated virtual memory, and users fixed it by raising the maximum page file size to increase the allotted virtual memory. Since I am using Windows 10, I increased virtual memory by doing this.
I've rebooted my PC and still experience the issue. I am baffled.
What is test.size at each iteration? Using np.append iteratively, you need space for at least 2 full copies. np.concatenate makes a new array, copying values from all the arguments. List append with one final np.array(...) is more efficient, though probably won't help with overall memory use in your case.
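A minimal sketch of the list-collect-then-concatenate approach the comment suggests. It assumes the .npy files hold 1-D arrays that can be joined along axis 0; the variable names mirror the question's code but are otherwise illustrative:

    import glob
    import numpy as np

    npfiles = sorted(glob.glob("*.npy"))

    # Collect each loaded array in a Python list; list.append only stores a
    # reference, so nothing is copied while iterating.
    chunks = []
    for file in npfiles:
        chunks.append(np.load(file, allow_pickle=True))

    # Allocate the combined array once at the end instead of reallocating
    # and copying the growing result on every iteration.
    test = np.concatenate(chunks)

    print(test.shape)

Because the data is copied only once into the final array, peak usage stays near the size of the inputs plus one copy of the result, rather than two full, growing copies per np.append call.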