Imports using too much RAM/swap

I am creating an importer to load large numbers of .dae files, where each file is one building. Each import consumes about 300 MB of RAM, so I quickly run out of RAM and swap space: importing 100 buildings consumes more than 32 GB. Once the import finishes, almost all of that memory is released.

Why is so much memory being consumed during the import, and how can I get my plugin to release the temporary allocations before the batch loading process is over? Right now I am limited to loading fewer than 250 buildings before the system runs out of memory.

Each .dae file has an associated texture map. The .dae files average less than 1 MB in size and contain from 20 to 2000 faces. The texture maps average about 0.5 MB.

The import process is basically this, plus some stuff to read the list of .dae files and origins:

# Given an array of model path/origin pairs, import each model
# and place it in the scene.
def self.load_models(origins)
    model = Sketchup.active_model
    origins.each do |org|
        self.load_model(org[0], org[1], model)  # place object-centered model at offset
    end
end

# Import a model file into SketchUp.
def self.load_model(path, pos, model)
    options = {
        :merge_coplanar_faces => false,
        :show_summary => false
    }
    defs_before = model.definitions.to_a
    success = model.import(path, options)
    if success
        # Instantiate the newly imported model at its specified location
        defs_after = model.definitions.to_a - defs_before
        model.active_entities.add_instance(defs_after.first, pos)
    end
end

I am using SketchUp Pro 2019 (Version 19.3.252) on a MacBook Pro running macOS Catalina (Version 10.15.2) with 16 GB of RAM.

Try wrapping the imports in undo operations, followed by running Ruby garbage collection.
See …

Thanks for the suggestion. I did try using

Sketchup.active_model.start_operation("Import")
...
Sketchup.active_model.commit_operation

around each individual file import, and also around the entire batch import. I also tried manually invoking the garbage collector. None of these things seemed to have much impact on the memory allocation, which grows until the import plugin finishes, then drops back down to a reasonable size for the final loaded models.
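
For reference, the per-file version looked roughly like this (the wrapper and operation names are just illustrative; load_model is the method shown above):

# Roughly the pattern I tried: wrap each import in its own operation
# (the second argument disables UI updates) and then force a Ruby
# garbage-collection pass. Neither made a noticeable difference.
def self.load_model_in_operation(path, pos, model)
    model.start_operation("Import Building", true)
    self.load_model(path, pos, model)
    model.commit_operation
    GC.start
end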

You’re invoking the shipped DAE importer, right - not a custom one?
In that case the Ruby garbage collector shouldn’t have any impact as the import is done in C/C++ code. I would assume you’d see the same memory usage when using the importer manually…

During import a certain amount of temporary data will be used - though 32 GB sounds excessive. But is that what the SketchUp process consumes after consecutive imports?

Are you trying to load all files into a single SketchUp model? Or convert each DAE into separate SKP files?

You’re invoking the shipped DAE importer, right - not a custom one?

Yes. And my observations match what you are saying about garbage collection.

Are you trying to load all files into a single SketchUp model? Or convert each DAE into separate SKP files?

I am trying to load all the DAE files into a single SketchUp scene. I have been able to do it by running my importer on smaller numbers of files. If I import 50 DAE models at a time with my batch importer, it keeps the swap usage under 32 GB. This is the scene with all of the models imported (317 DAE files total):

I should try to be more scientific about my observations. I am looking at the SketchUp process memory usage in the Memory tab of the macOS Activity Monitor.

  • Empty SketchUp window: 130 MB
  • SketchUp window with full scene loaded from SKP file: 9.8 GB

It seems to get worse with repetition, across consecutive invocations of my import plugin:

  • First 100 files: memory usage peaks at about 20 GB, then drops back down to less than 3 GB.
  • Next 50 files: peaks at 25 GB, drops back down to 4.75 GB.
  • Next 50 files: peaks at 32 GB, drops back down to 3.5 GB.
  • Next 50 files: peaks at 38 GB, drops back down to 4 GB.

So it looks like the per-file impact gets worse in later invocations. I could only guess as to why.

Another observation is that the number of ports allocated to the process goes up as files get loaded, and they don’t get released until the plugin finishes. I am guessing they are open files that aren’t released right away; maybe they could be explicitly closed after each file is imported. The extra memory could be I/O buffers, but it is much larger than the files being read.

I have an MP4 screen capture of this whole process occurring. I could attach it here if the forum will accept a 243 MB file.

One thing that comes to my mind is using a timer to throttle the import: https://ruby.sketchup.com/UI.html#start_timer-class_method

Keep an array of all files to process, shift one off per timer tick, and stop the timer once all imports have been processed.

I don’t think a timer would really help.

One possible workaround would be to invoke the batch importer from another plugin that would feed it small batches. This would only be successful if the resources were freed between invocations of the inner plugin. Also, I don’t know if it’s possible to invoke one plugin from another in this fashion.

Isn’t “invoking in small batches” exactly what a timer allows you to do?

Sorry if I misunderstood its purpose. If the goal is to iterate over a big list of files and feed them to my batch importer in small chunks, then I could use a timer or just a regular iterator. It would need to invoke the plugin in such a way that the storage could be reclaimed between invocations, and I don’t know how to do that. If I just call the import methods from within the timer block, wouldn’t that just be running them within the one timer plugin?

The difference between a timer and a regular iterator is that a regular iterator runs synchronously, thus blocking the main thread, while with a timer, you can run your batch asynchronously-ISH.

files = [... lots of files here]
timer_busy = false

timer = UI.start_timer(1, true) {
  if !timer_busy
    timer_busy = true
    if files.count > 0
      file = files.shift
      ##
      # IMPORT FILE HERE
      ##
    else
      UI.stop_timer(timer)
    end

    timer_busy = false
  end
}

Something like this. Not tested. It would be best to make this part of a class instance or module, so that files, timer_busy and timer persist as instance variables.
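
A rough, untested sketch of what that could look like (BatchImporter and the method names are just placeholders; load_model is the import method from earlier in the thread):

module BatchImporter
  # Untested sketch: keep the queue, busy flag and timer id in
  # module-level instance variables so they persist between timer ticks.
  def self.start(origins)
    @queue = origins.dup   # array of [path, position] pairs
    @busy  = false
    @timer = UI.start_timer(1, true) { process_next }
  end

  def self.process_next
    return if @busy
    @busy = true
    if @queue.empty?
      UI.stop_timer(@timer)
    else
      path, pos = @queue.shift
      # call your existing import method here, e.g.
      load_model(path, pos, Sketchup.active_model)
    end
    @busy = false
  end
end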

One thing that may be worth trying is to save the model, close it, and then reopen it from within the script every so often. Maybe SketchUp cleans up memory when the model is closed…?

counter = 0
origins.each do |org|
    self.load_model(org[0], org[1], model)  # place object-centered model at offset
    if (counter += 1) % 10 == 0
        # SAVE, CLOSE, REOPEN
    end
end
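
The SAVE, CLOSE, REOPEN step could look something like this (untested; it assumes the model has already been saved once so that model.path is set, and uses Model#save, Model#close and Sketchup.open_file):

# Untested sketch of the save/close/reopen step. Assumes the model
# already has a file path, i.e. it has been saved at least once.
path = model.path
model.save                      # flush the imported geometry to disk
model.close(true)               # close, ignoring unsaved changes
Sketchup.open_file(path)        # reopen the saved file
model = Sketchup.active_model   # refresh the reference after reopening

Whether closing the model actually releases the memory is the thing to verify in Activity Monitor.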

@kengey I tried an experiment with your timer approach and it does indeed cause the resources to be reclaimed between invocations of the importer. I cannot say why this would be so; I did not expect it.

So this is a viable workaround for me. I hope it does not keep the SketchUp developers from fixing the root problem; one should be able to invoke the import plugin repeatedly without exhausting system resources.

The native DAE importer is written in C/C++ - Ruby isn’t involved at all here. Even though the import is kicked off by model.import, it just delegates to the internal C/C++ code, so Ruby garbage collection doesn’t come into play. Memory is allocated during import and released when done - but that isn’t garbage collection.

Are you able to share enough of these files with us for the purpose of reproduction? You can share them privately if you want.