[Tfug] When RTFM is not enough

tfug@tfug.org tfug@tfug.org
Sun Jul 7 20:28:02 2002


On Sun, 7 Jul 2002, Bowie J. Poag wrote:

> Terrible, terrible, terrible. Stop bloating! Your system will hate you! Try
> this instead:
>
> bashism: for i in `locate foo`; do ls -l $i; done
>
> english:  For each instance of "foo" found, show me the details of it with ls.
>
>
> Replace "ls -l $i" with whatever command you're interested in performing on
> $i. ls is a good choice, since it provides alot of different search options
> including listings by inode, date, permissions, wildcarding, ownership, size,
> etc. Nothing stops you from tossing in an awk or grep statement either.
>
> Dammit.

And what if you don't know what your file is called? That trick won't
handle that as is, so we'd have to modify it. Something like the
following could be done:

for i in `du -ak / | cut -f2`; do ls -l $i 2> /dev/null >> list; done

(The cut -f2 strips off du's size column so we only iterate over
paths.) That stores the properties of the files, too. However, ls -l $i
lists the contents of directories when that's what $i is. I couldn't
find a way to make du -ak not report dirs (in which case dirs wouldn't
be searchable), and I couldn't find a way for ls -l not to list their
contents (other than ls -l'ing the parent dir and grepping for it,
which could return more than one file, etc.), so you're stuck with
ls -l looking inside dirs along with du -ak looking in them, and your
file ends up extremely bloated.
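For what it's worth, ls does have a -d flag that lists a directory
entry itself instead of its contents, which would sidestep that
problem. A minimal sketch, run against a throwaway /tmp/demo tree
here rather than / so it finishes quickly:

```shell
# Sketch: list every path du reports, without descending into dirs.
# /tmp/demo is a fabricated demo tree, not anything from the thread.
rm -rf /tmp/demo
mkdir -p /tmp/demo/sub
echo hello > /tmp/demo/file.txt

# cut -f2 drops du's size column (du separates fields with a tab);
# ls -ld prints the entry itself even when $i is a directory.
du -ak /tmp/demo | cut -f2 | while read i; do
    ls -ld "$i"
done > /tmp/demo.list

cat /tmp/demo.list
```

With that, each file and each dir shows up exactly once in the list
instead of dirs getting expanded twice.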

The solution could be a script such as this:

#!/usr/bin/perl

my @filelist = `du -ak /`;
my $output = "";

foreach my $file (@filelist)
{
        $file =~ s/^\d+\s+//;   # strip du's size column
        chomp $file;

        if (-d $file)	# it's a dir
        {
                $output .= $file . "\n";
        }
        else
        {
                $output .= `ls -l $file`;
        }
}

open (DAT, "> list") or die "can't write list: $!";
print DAT $output;
close DAT;


Of course, that script chokes on weird filenames (containing
parentheses, #s, and other shell metacharacters), but that's not too
hard to fix.
Another bug is that the script doesn't detect symlinks to directories, so
those end up getting ls -l'ed as well. Other than those minor bugs, that
script generates a list of ls -l's on all files and dirs. Of course, ls
-l'ing everything takes forever. IIRC (and I might not), find is _much_
faster on that.
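For comparison, GNU find can produce much the same listing in a single
pass with its -ls action, no per-file fork of ls required. A sketch
against a fabricated demo tree (not /):

```shell
# Sketch: find's -ls action prints an ls-style long line for every
# file and directory it visits. /tmp/demo2 is a made-up demo tree.
rm -rf /tmp/demo2
mkdir -p /tmp/demo2/sub
echo hello > /tmp/demo2/file.txt

find /tmp/demo2 -ls > /tmp/demo2.list
cat /tmp/demo2.list
```

Since find does the stat()ing itself instead of spawning ls for every
path, it avoids most of the overhead the script above pays.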

w00t! That's most of the functionality of find duplicated after only an
hour of R&D! Ah, but there's more. That's just the non-interactive
part of it. You'd need to know how to grep for what you want, or write
another script to do that for you.
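The interactive half could be as little as a grep over the generated
list, something like this (the list contents here are fabricated for
illustration):

```shell
# Sketch: search a saved ls -l listing for a name fragment.
# Fake a tiny list file standing in for the one built above.
printf '%s\n' \
  '-rw-r--r-- 1 yan users 12 Jul 7 2002 /home/yan/foo.txt' \
  '-rw-r--r-- 1 yan users 34 Jul 7 2002 /home/yan/bar.txt' > /tmp/list

grep foo /tmp/list
```

Only the foo.txt line comes back, permissions and all, which is about
all the "interactivity" this scheme offers.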

So perhaps find *is* the better way in this case after all?
(Especially if you're the only one using the machine and don't mind
using your hardware, waiting a bit, and causing the destruction of the
world.)

- Yan