

Python process is not managing memory as expected, resulting in the process getting killed.

I'm making an app in Python that manages huge image data (hundreds of 32-bit 3000x3000 px images). I'm trying to manage the data in the most storage-efficient and memory-efficient way: following OOP principles, saving the data in optimized formats, loading the data in minimal batches, and keeping almost all variables out of the "main" scope. However, I'm facing a problem that I'm unable to diagnose.

After running one method, memory usage skyrockets from 40% to 80%. This method opens multiple stacks of images in napari, so it is expected to use that much memory (nevertheless, I should optimize it). The issue arises when exiting this method: the memory is not freed. This means that running the method twice, or performing any other intensive work afterwards, fills up the memory and makes the program crash.

I've tried running gc.collect() from the main scope and confirmed in the debugger that no napari-related object exists after the method has executed. I've also printed the local and global variables from the "main" scope before and after running this method:

Before the issue:

Maybe there is some variable not shown by locals().items() or globals().items(), or maybe I simply don't understand how Python allocates memory at all.
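One way to narrow this down is to measure how much Python-level memory actually survives the method call, rather than watching the OS process size (which can stay high even after Python frees objects, because the allocator does not always return pages to the OS). Below is a minimal sketch using the standard-library tracemalloc module; open_stacks is a hypothetical stand-in for the napari-opening method, not the real code:

```python
import gc
import tracemalloc

tracemalloc.start()

def open_stacks():
    # Hypothetical stand-in for the method that opens image stacks:
    # allocates ~50 MB of local data, then lets it go out of scope.
    data = [bytearray(10**6) for _ in range(50)]
    return None

before, _ = tracemalloc.get_traced_memory()
open_stacks()
gc.collect()  # force a full collection after the method returns
after, _ = tracemalloc.get_traced_memory()

# If Python-level objects were truly released, the retained delta is tiny
# even though the OS-reported RSS may still look inflated.
print(f"retained after method + gc.collect(): {after - before} bytes")
tracemalloc.stop()
```

If the retained delta is small but the process RSS stays at 80%, the memory is being held by the allocator or by native (non-Python) buffers, not by forgotten Python variables; if the delta is large, tracemalloc's snapshot comparison can point at the allocating lines.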
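Since locals()/globals() only show names in one scope, a more direct check for "some variable not shown" is to scan every object the garbage collector tracks. The sketch below counts live instances of a class via gc.get_objects(); ImageStack is a hypothetical placeholder for whatever napari layer or stack type you suspect is lingering:

```python
import gc

def count_instances(cls):
    # Count live objects of the given class among everything the GC tracks,
    # regardless of which scope (if any) still names them.
    return sum(1 for obj in gc.get_objects() if isinstance(obj, cls))

class ImageStack:  # hypothetical stand-in for a napari layer/stack object
    def __init__(self):
        self.pixels = bytearray(10**6)

stacks = [ImageStack() for _ in range(3)]
print(count_instances(ImageStack))  # 3 live instances

del stacks
gc.collect()
print(count_instances(ImageStack))  # 0 once all references are gone
```

If the count stays above zero after the method exits, gc.get_referrers(obj) on one of the survivors will show what is still holding it (a cache, a Qt signal connection, a closure, etc.).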
