I am running into what appears to be an out-of-memory error.
With 500 episodes loaded, everything runs fine: the VAE trains and the loss decreases.
With 2000 episodes, I get the following:
```
Traceback (most recent call last):
  File "vae_train.py", line 77, in <module>
    dataset = create_dataset(dataset)
  File "vae_train.py", line 60, in create_dataset
    data = np.zeros((M, 64, 64, 3), dtype=np.uint8)
MemoryError
```
The repo uses 10k episodes, but I cannot load even 2k on my 16GB machine. Am I missing something? If memory really is the issue, how much RAM is needed to replicate the paper with the code here?
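For what it's worth, a back-of-the-envelope estimate of the buffer that `create_dataset` allocates suggests 16GB is indeed too small. This is a rough sketch, assuming each episode contributes about 1000 frames (an assumption about this repo's rollouts, not something confirmed from the traceback); the actual frame count per episode may differ:

```python
# Estimate the size of np.zeros((M, 64, 64, 3), dtype=np.uint8)
# as allocated in create_dataset (vae_train.py, line 60).
FRAME_BYTES = 64 * 64 * 3  # one uint8 RGB frame = 12288 bytes

def buffer_gib(episodes, frames_per_episode=1000):
    """GiB needed for the frame buffer; frames_per_episode is an assumption."""
    m = episodes * frames_per_episode
    return m * FRAME_BYTES / 2**30

for n in (500, 2000, 10000):
    print(f"{n:>6} episodes -> ~{buffer_gib(n):.1f} GiB")
# ->    500 episodes -> ~5.7 GiB
# ->   2000 episodes -> ~22.9 GiB
# ->  10000 episodes -> ~114.4 GiB
```

Under that assumption, 500 episodes (~5.7 GiB) fits in 16GB but 2000 (~22.9 GiB) does not, which matches the behavior above, and the full 10k-episode run would need on the order of 120GB unless the data is loaded lazily (e.g. via `np.memmap` or per-batch loading).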