I'm writing a Python GUI for some measurement devices, and my code will be used and adapted by others too, while I keep watching over it.
Now I have a dict of dicts:
d = {
    'KEY1': {'key1': val1, 'key2': val2},
    'KEY2': {'key3': val3, 'key4': val4},
    ...
}
etc., or possibly even deeper nesting, and the other users will have to access the values.
Now I'm thinking, for clarity, of writing access functions where they only need to provide the inner keys:
def get(self, key):
    for KEY, VALUES in d.iteritems():
        if key in VALUES:
            return d[KEY][key]
However, that means every access scans through all the dicts.
Without trying it: is this probably fast enough to be negligible, or is it a completely horrible thing to do?
All the dicts have about 10 entries each.
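For reference, a runnable Python 3 sketch of the accessor described above (values and the KeyError fallback are illustrative additions):

```python
# Hypothetical nested config dict, as in the question.
d = {
    'KEY1': {'key1': 10, 'key2': 20},
    'KEY2': {'key3': 30, 'key4': 40},
}

def get(key):
    """Scan every inner dict and return the first value found for `key`."""
    for values in d.values():  # d.itervalues() in Python 2
        if key in values:
            return values[key]
    raise KeyError(key)

print(get('key3'))  # 30
```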
>>55952289
Throw it in Mongo/CouchDB?
>>55952307
I should elaborate on that a bit: it would probably speed up query times a lot and separate code and data a little better. I'd benchmark it to see if it's really an issue, though.
>>55952339
The code also executes pyvisa and pyqt4 commands, so I think those are the real bottleneck functions. I just want to know if the programming karma god will hunt me down if I run unnecessary for loops for the sake of providing a clearer API.
Adding more software beyond Python modules isn't a good idea, I think.
With dicts that small, the time it takes is negligible.
If for some reason you want to optimize it anyway, you could put all the inner keys into one set and check `if key in key_set` first. That way you only do one lookup.
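A quick sketch of that set idea (values are illustrative; as the next post notes, it turns out not to fit the question):

```python
# Hypothetical nested dict, as in the question.
d = {
    'KEY1': {'key1': 10, 'key2': 20},
    'KEY2': {'key3': 30, 'key4': 40},
}

# Precompute the union of all inner keys once.
all_keys = set()
for inner in d.values():
    all_keys |= set(inner)

# Single O(1) membership test instead of scanning every dict.
print('key3' in all_keys)  # True
```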
>>55953077
Whoops, I misread your question.
Scrap the set suggestion. Since you only ever do lookups on the nested dicts, you might as well merge them all into one flat dict.
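A minimal sketch of that merge (illustrative values; this assumes the inner keys are unique across the nested dicts, otherwise later ones silently overwrite earlier ones):

```python
# Hypothetical nested dict, as in the question.
d = {
    'KEY1': {'key1': 10, 'key2': 20},
    'KEY2': {'key3': 30, 'key4': 40},
}

# Flatten once, then every access is a single O(1) dict lookup.
flat = {}
for inner in d.values():
    flat.update(inner)

print(flat['key3'])  # 30
```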