prevent stuck processes with large folder manipulations
Brian Awood
bawood at umich.edu
Sat Jan 2 21:38:58 EST 2010
On Friday 01 January 2010 @ 15:45, Paul Dekkers wrote:
>
> similar processes were killed. And the new archive folder now ended up
> with several duplicates, containing millions of messages instead of tens
> of thousands. (We'll have to see how to dedup that; any ideas are
> appreciated, otherwise I'll write something for that.)
I forgot to add: we used to use a Perl script called dupseek to clean
these up. It has some nice optimizations that make it quite fast.
http://freshmeat.net/projects/dupseek/
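
If you'd rather roll your own, here is a minimal sketch (in Python rather
than Perl) of the kind of optimization dupseek relies on: group files by
size first, so only same-size candidates ever get read and hashed. This is
an illustration, not dupseek's actual code, and the spool path you pass in
is whatever directory holds your message files.

    #!/usr/bin/env python3
    # Sketch: find duplicate message files under a directory.
    # Files with a unique size are skipped without being read;
    # only same-size candidates are hashed to confirm duplicates.
    import hashlib
    import os
    import sys
    from collections import defaultdict

    def find_duplicates(root):
        # Pass 1: group candidate files by size (cheap stat only).
        by_size = defaultdict(list)
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    by_size[os.path.getsize(path)].append(path)
                except OSError:
                    continue  # skip files that vanish mid-scan

        # Pass 2: hash only within same-size groups.
        dups = []
        for paths in by_size.values():
            if len(paths) < 2:
                continue
            by_hash = defaultdict(list)
            for path in paths:
                h = hashlib.sha256()
                with open(path, "rb") as f:
                    for chunk in iter(lambda: f.read(65536), b""):
                        h.update(chunk)
                by_hash[h.hexdigest()].append(path)
            dups.extend(g for g in by_hash.values() if len(g) > 1)
        return dups

    if __name__ == "__main__":
        for group in find_duplicates(sys.argv[1]):
            print("duplicates:", *group)

It only reports duplicate groups; deciding which copy to keep and unlinking
the rest is deliberately left to the operator.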
-Brian