Unique fingerprints of files?

Gunnar writes about dh-make-drupal and says:

Now, I hate having non-Debian-packaged files spilled over my /usr/share partition. Drupal modules want to be installed in /usr/share/drupal5/modules/module_name (or s/5/6/ for Drupal6, to which I have not yet migrated).

Well, there are modules in /usr/share/drupal*/modules (basically the same applies to themes/), but my understanding is that this place is intended for Drupal core modules/themes. For your site's modules/themes, /etc/drupal/*/sites/foobar/modules can/ought to be used instead. Of course you can symlink to wherever you want.

Additionally, there is /etc/drupal/6/sites/all, where you can put common/system-wide modules and themes directories that can be used by all Drupal sites. I don't know whether Drupal5 supports this, but Drupal6 at least does; however, you will run into trouble when using this method with Debian packages because of #513522.

So, I would recommend not storing packaged modules in /usr/share/drupal*/modules or themes/, but rather using something like /usr/share/drupal-contrib/*/modules and themes/, together with a tool that symlinks them appropriately into /etc/drupal/*/sites/*/modules, similar in spirit to update-rc.d. A rough sketch of what such a helper could look like follows below.
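
To illustrate, here is a minimal, untested sh sketch of such a helper; the drupal-contrib layout and the script name are hypothetical examples of mine, not an existing tool:

    #!/bin/sh
    # drupal-link-module (hypothetical): link a packaged contrib module
    # into one site's modules directory.
    # Usage: drupal-link-module <drupal-version> <site> <module>
    set -e
    VERSION="$1"; SITE="$2"; MODULE="$3"
    SRC="/usr/share/drupal-contrib/$VERSION/modules/$MODULE"
    DEST="/etc/drupal/$VERSION/sites/$SITE/modules"
    [ -d "$SRC" ] || { echo "no such module: $SRC" >&2; exit 1; }
    mkdir -p "$DEST"
    ln -sfn "$SRC" "$DEST/$MODULE"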

Just my 2 ¢… 😉


8 thoughts on “Unique fingerprints of files?”

  1. Not that it would be very efficient, but you could simply check both the MD5 and SHA1 hash for equality. (or MD5 + filesize, or …)
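
    A quick untested sketch of that idea, assuming GNU coreutils and the openssl command-line tool are available:

        # Combined fingerprint: MD5 + SHA-1 + file size. Two different
        # files colliding in all three at once is practically impossible.
        fingerprint() {
            md5=$(md5sum "$1" | cut -d' ' -f1)
            sha1=$(openssl sha1 "$1" | awk '{print $NF}')
            size=$(stat -c '%s' "$1")
            printf '%s-%s-%s\n' "$md5" "$sha1" "$size"
        }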

  2. I fear that checking both sha1 and md5sum would take too long, but checking the file hash *and* the file size *and* maybe the file permissions is on my todo list in case there's no other method to get a really unique fingerprint. 😉

    Well, of course I should check the filenames for those duplicate hashes as well, maybe… 😉
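
    For what it's worth, a rough bash sketch of such a composite key (untested; GNU md5sum and stat assumed, and filenames with embedded newlines would break it):

        # One hash pass plus two cheap stat fields: size and permission bits.
        find . -type f | while read -r f; do
            hash=$(md5sum "$f" | cut -d' ' -f1)
            meta=$(stat -c '%s-%a' "$f")   # size-permissions
            echo "$hash-$meta  $f"
        done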

  3. Do you really have MD5 collisions on your hard drive? Given how long it took to generate a single collision in 2004, that sounds strange. Did you compare the duplicate entries to check whether they really are different files with the same MD5 sum? Maybe the problem lies somewhere else.

  4. No, I haven't compared the duplicate files in depth. They just caught my eye when I ran the second script: some files were printed out repeatedly. I'll look at them tomorrow. Is there a way to print out only the duplicated md5sums, something like the inverse of uniq or sort -u? (See the sketch at the end of this comment.)
    Ah, bummer… diff should do the trick of course… 😉

    But well, the problem still exists: when multiple identical md5sums merely reference the same file (hardlinks/symlinks), how do I deal with that easily?
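
    GNU uniq can actually do the duplicate-only listing, and stat can expose the hardlink case; a small untested sketch:

        # Print all lines whose hash (first 32 chars) occurs more than once:
        # -w32 compares only the hash column, -D prints every repeated line.
        find . -type f -exec md5sum {} + | sort | uniq -w32 -D

        # Hardlinks masquerading as duplicates share one inode number:
        find . -type f -exec md5sum {} + | sort | uniq -w32 -D |
        while read -r hash file; do
            stat -c '%i %n' "$file"    # same inode = same underlying file
        done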

  5. Thanks for the pointers!
    e2salvage isn't suitable since it's an XFS filesystem, but testdisk claims to support XFS, so I'll give it a try on a backup… we'll see… 😉

  6. Automatically restore files from lost+found
    I already wrote about it two days ago, but the problem then was finding a way of uniquely identifying files. Using md5sum and openssl sha1 seemed to deliver multiple files for each hash, which confused me. A closer look later revealed that there were…

Comments are closed.