Quote:
Originally Posted by Blackhorse 70V
Most users don't share 10,000 files.
wow.
it's hard to imagine someone seriously arguing that sound engineering principles (like scalability) should actively be discouraged just because the current load factors don't require them!
that's the same brain-dead contempt for architectural planning that nearly tanked twitter as it tried to scale up from code hacked together with spit & rubber bands.
that engineering team was eventually replaced with pros who had actually studied SW engineering (god forbid anyone would ever use UML to get it right the first time).
anyways, once short-cuts become the accepted work ethic in one part of the code, that attitude usually ends up percolating through to the rest of it. any job worth doing is worth doing well.
also, i do realize that there's a certain platform-based work ethic behind my criticism of such lackadaisical standards: in the windows community (and even in the linux world), the standard is usually 'good enough' ... but mac users expect 'insanely great'.
btw: the re-indexing is brutal even with only 1,000 files!
ps: just what kind of examination is being made of each file to (re)build this index? is the standard metadata in the headers not sufficient? ... or does LM literally read through every byte inside the file?! (and if so, what else is it scouring for?)
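for what it's worth, here's a rough python sketch of why that question matters. this is purely hypothetical (i have no idea how LM actually builds its index); it just contrasts a cheap header-only pass against a full-content fingerprint, which is the difference between touching a few KB per file and streaming every byte of every file:

```python
# hypothetical sketch -- not LM's actual indexing code.
# HEADER_BYTES is an assumption: that any useful metadata lives in the
# first few KB of the file.
import hashlib
import os

HEADER_BYTES = 4096

def index_header_only(path):
    """Cheap pass: stat the file and read only the leading header block."""
    st = os.stat(path)
    with open(path, "rb") as f:
        header = f.read(HEADER_BYTES)
    # a real indexer would parse tags/metadata out of `header` here
    return {
        "size": st.st_size,
        "mtime": st.st_mtime,
        "header_digest": hashlib.sha1(header).hexdigest(),
    }

def index_full_scan(path):
    """Expensive pass: read every byte to fingerprint the whole file."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return {"content_digest": h.hexdigest()}
```

if the index only needs the header pass, re-indexing 1,000 (or 10,000) files should be nearly instant; if it's doing something closer to the full scan, that would explain why it feels so brutal.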