hardlink-1.0-alt4 There were some files with films (large video files) inside the directory I ran hardlink on, and it failed to process them, saying that mmap ran out of memory. As a result they were not merged with their duplicates (exact copies). I believe this is a major bug, because it means that the really large files are not treated, and getting rid of their duplicates could save a lot of space. Workaround: * compare the files on which hardlink fails, and delete/hardlink them manually; * fdupes-1.40-alt2 and duff-0.4-alt1 were able to process these files, and they did detect the duplicates, although they work much more slowly. (I liked the interface of fdupes more than that of duff, because it displays progress.)
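The manual workaround above could be sketched roughly as follows: verify byte-for-byte that the failing file really is an exact copy, then replace the duplicate with a hard link. This is only an illustrative sketch, not part of hardlink; the function name and paths are made up for the example.

```python
import filecmp
import os

def hardlink_duplicate(original, duplicate):
    """If `duplicate` has the same contents as `original`, replace it
    with a hard link to `original`; return True on success.
    (Hypothetical helper illustrating the manual workaround.)"""
    # shallow=False forces a full content comparison, read in chunks,
    # so it does not need to mmap the whole file at once.
    if not filecmp.cmp(original, duplicate, shallow=False):
        return False
    os.unlink(duplicate)
    os.link(original, duplicate)  # both paths must be on the same filesystem
    return True
```

After a successful call, both paths share one inode, so the duplicate's disk space is reclaimed.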
hardlink could fall back to more complex or slower comparison methods for large files, rather than simply letting mmap fail.
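Such a fallback could look like the following sketch: when a file is too large to map in full, compare it against the candidate duplicate in fixed-size chunks with plain reads. This is an assumption about how the fallback might work, not hardlink's actual code; the function name and chunk size are illustrative.

```python
import os

def files_equal_chunked(path_a, path_b, chunk_size=1 << 20):
    """Return True if the two files have identical contents,
    reading at most chunk_size bytes of each at a time
    (hypothetical fallback for files too large to mmap)."""
    # Cheap pre-check: different sizes can never be duplicates.
    if os.path.getsize(path_a) != os.path.getsize(path_b):
        return False
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        while True:
            a = fa.read(chunk_size)
            b = fb.read(chunk_size)
            if a != b:
                return False
            if not a:  # both files exhausted at the same point
                return True
```

Memory use stays bounded by the chunk size regardless of file size, at the cost of extra read() system calls compared with a single mmap.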
hardlink-1.0-alt5 -> sisyphus: * Sat Nov 26 2011 Dmitry V. Levin <ldv@altlinux> 1.0-alt5 - Merged with hardlink-1.0-owl1 (closes: #26632).
(In reply to comment #0) > Workaround: * use hardlinkpy?
Thanks, now it works fine. To mike: I searched for these tools with "apt-cache search duplicate", so I didn't see it (hardlinkpy) as an alternative.