I’m currently saving my app preferences quite frequently. On my machine with a solid state drive it saves in about 100ms.
My code is already posted here:
SketchUp freezes while the file is being written. What I’m worried about is that someone with a slow computer will end up experiencing noticeable lag while the settings are written to disk. In a ‘normal program’ (C#), the settings could be written using an async method. I understand SketchUp doesn’t support threads. Are there any options to prevent the UI from freezing?
I’m wondering if there is something else taking time here. Saving a text file with a few settings shouldn’t be much of an issue. What else happens when saving the file? Is the data formatted in any special way or gathered from any special place?
Ok, I’ll be honest: it’s a design flaw. Every time I edit an opening I update a list of recent openings for quick future use. I was writing that to the same file as my list of saved openings, which is about 2,000 objects. This still ‘only’ took 100ms on my computer. I was worried about performance, though, on slower setups.
It has the exact same results as doing it without a timer.
UI.start_timer(0.1, false) do
  t = Time.now
  File.open(path2, 'wb') { |f| f.write(a) }
  File.delete(path2)
  File.open(path2, 'wb') { |f| f.write(a) }
  File.delete(path2)
  File.open(path2, 'wb') { |f| f.write(a) }
  File.delete(path2)
  File.open(path2, 'wb') { |f| f.write(a) }
  File.delete(path2)
  File.open(path2, 'wb') { |f| f.write(a) }
  File.delete(path2)
  File.open(path2, 'wb') { |f| f.write(a) }
  puts "#{Time.now - t} seconds to save openings."
end
2.513278 seconds to save openings.
# SketchUp froze for the duration.
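One thing that could be tried, since a timer alone doesn’t help: instead of one big blocking write, write the data in chunks on a repeating timer, so SketchUp gets a chance to process UI events between chunks. This is only a sketch of the idea; the method name and chunk size are my own choices, not SketchUp API, and it assumes the code runs inside SketchUp where `UI.start_timer` and `UI.stop_timer` are available.

```ruby
CHUNK_SIZE = 64 * 1024 # arbitrary; tune to taste

# Write `data` to `path` one chunk per timer tick, then stop the timer.
def write_in_chunks(path, data)
  io = File.open(path, "wb")
  offset = 0
  timer_id = UI.start_timer(0.01, true) do
    if offset < data.bytesize
      io.write(data.byteslice(offset, CHUNK_SIZE))
      offset += CHUNK_SIZE
    else
      io.close
      UI.stop_timer(timer_id)
    end
  end
end
```

The total wall-clock time will actually be a bit longer than a single write, but each individual freeze is only one chunk long, which is the part the user notices.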
I read the JSON in from the file on startup. Then I simply modify the hash as needed and write it all to the file in a single write. Appending wouldn’t really be useful in this case. In a ‘real program’ I’d use an SQLite database.
Well you could modify the way you are doing things a bit.
You could write the opening objects each as a json line, and write by appending.
When writing the whole hash to file …
# @hash is a Hash of hashes accessed via a key.
#
# Each nested hash (rec) will be converted to an
# individual JSON record string.
#
require "json"

records = @hash.map { |key, rec| %[{"#{key}" : #{rec.to_json}}] }
File.open(filepath, mode: "w:UTF-8") do |file|
  file.write(records.join("\n") << "\n")
end
And then when appending a record …
record = %[{"#{key}" : #{@hash[key].to_json}}]
File.open(filepath, mode: "a:UTF-8") do |file|
  file.write(record << "\n")
end
Be careful that each record line ends with a newline character; otherwise the next appended record gets glued onto the previous line and won’t parse.
When you read, you do …
# Each line is a JSON record with a newline character at the end.
# Each record (after chomping off the newline) is parsed into a
# little temporary hash and immediately merged into @hash.
#
require "json"

IO.foreach(filepath) do |line|
  line.chomp!
  @hash.merge!(JSON.parse(line))
end
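A minimal roundtrip of that pattern, using a throwaway file and made-up opening data (the keys and widths are just illustrative):

```ruby
require "json"
require "tmpdir"

filepath = File.join(Dir.mktmpdir, "openings.jsonl")

# Write two records, one JSON object per line (note the trailing newline).
hash = { "door_a" => { "width" => 900 }, "door_b" => { "width" => 800 } }
records = hash.map { |key, rec| %[{"#{key}" : #{rec.to_json}}] }
File.open(filepath, "w:UTF-8") { |f| f.write(records.join("\n") << "\n") }

# Append a third record later without rewriting the file.
File.open(filepath, "a:UTF-8") { |f| f.write(%[{"door_c" : {"width":750}}\n]) }

# Read everything back, merging line by line.
result = {}
IO.foreach(filepath) { |line| result.merge!(JSON.parse(line.chomp)) }
# result now holds all three openings.
```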
I’m not really adding much info (like a log). Rather I’m editing info. So for example I save the 10 most recently used openings. Appending wouldn’t be useful because I’m always modifying the whole stack.
Well, yeah, in a DB you can append records, but it’s also easy to update data.
UPDATE recent SET openings = @openings WHERE opening_type = @opening_type;
With the Ruby text file I don’t really see a way to edit other than reading the whole thing, modifying the relevant portion, and writing it all back to the file. Since I expect users to edit/add/delete openings infrequently, I could store the openings in a separate file from the recently used list. That would really cut down on the file size.
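That read/modify/write cycle is small enough to wrap in a helper. A sketch, assuming the file holds one JSON hash; the method name and file layout are made up:

```ruby
require "json"

# Load the whole settings hash, let the caller edit it, then write it
# back to disk in a single write.
def update_settings(path)
  settings = File.exist?(path) ? JSON.parse(File.read(path)) : {}
  yield settings
  File.write(path, settings.to_json)
  settings
end
```

Called as `update_settings(path) { |s| s["recent"] = recent_list }`, it rewrites the whole file each time, which matches how the data is actually edited.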
Do these settings need to be machine-wide? The only reason I can see to store data in a file is if the data needs to be persistent across multiple models. If all it contains is data about a specific model or an entity in that model: use attribute dictionaries.
If the data needs to be shared between multiple models: how many reads/writes are you doing?
Regarding SQLite for doing entire reads/writes… that would even be slower, since there is the overhead of everything else a DB contains, such as indexing. DBs are only useful if you are actually querying data.
Something I tend to do quite often when data needs to be persistent is to use a file per attribute/setting/object. So instead of having one monolith, chunk your data up into pieces if possible.
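A sketch of that idea: one small JSON file per key inside a settings directory, so saving one setting never rewrites the rest. The class name and path layout are invented for illustration, and it assumes keys are filesystem-safe strings:

```ruby
require "json"

# One JSON file per key; only the touched key's file is rewritten,
# so each individual save stays small.
class FileStore
  def initialize(dir)
    @dir = dir
    Dir.mkdir(dir) unless Dir.exist?(dir)
  end

  # Returns the stored value for key, or nil if never saved.
  def [](key)
    path = path_for(key)
    File.exist?(path) ? JSON.parse(File.read(path)) : nil
  end

  # Writes only this key's file.
  def []=(key, value)
    File.write(path_for(key), value.to_json)
  end

  private

  def path_for(key)
    File.join(@dir, "#{key}.json")
  end
end
```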
Yes this is setup information that needs to be stored between sessions. I use attribute dictionaries to store everything related to the model.
I think having multiple files is going to be the best solution. My current setup has about 2,000 openings saved. I might have a file per opening type to speed things up.