File size distribution

From Rosetta Code

Revision as of 15:32, 9 October 2016

File size distribution is a draft programming task. It is not yet considered ready to be promoted as a complete task, for reasons that should be found in its talk page.

Beginning from the current directory, or optionally from a directory specified as a command-line argument, determine how many files there are of various sizes in a directory hierarchy. My suggestion is to sort by logarithm of file size, since a few bytes here or there, or even a factor of two or three, may not be that significant. Don't forget that empty files may exist, to serve as markers. Is your file system predominantly devoted to a large number of smaller files, or a smaller number of huge files?
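The suggestion to group by logarithm of file size amounts to bucketing each size by its order of magnitude. A minimal sketch of such a bucket function (the name `log_bucket` is illustrative, not part of the solutions below; empty files get their own bucket since log of zero is undefined):

```python
from math import log10

def log_bucket(size):
    """Order-of-magnitude bucket: -1 for empty files, else floor(log10(size))."""
    return int(log10(size)) if size > 0 else -1

print(log_bucket(0), log_bucket(999), log_bucket(1000))  # -> -1 2 3
```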
=={{header|Python}}==
The distribution is stored in a '''collections.Counter''' object (a dictionary that yields 0 for missing keys, which is convenient when incrementing). Anything could be done with this object; here the number of files is printed for each size in increasing order. No checks are made during the directory walk: in practice, safeguards would be needed, or the program will fail on the first unreadable file or directory (because of permissions, or overly deep paths, for instance).
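For instance, a Counter increments cleanly even for sizes it has never seen (a small illustration, separate from the solution below):

```python
from collections import Counter

h = Counter()
h[1024] += 1   # missing key behaves as 0, so this stores 1
h[1024] += 1
print(h[1024])  # -> 2
print(h[4096])  # -> 0, and no KeyError
```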

<lang python>import sys, os
from collections import Counter

def dodir(path):
    global h

    for name in os.listdir(path):
        p = os.path.join(path, name)

        if os.path.islink(p):
            pass
        elif os.path.isfile(p):
            h[os.stat(p).st_size] += 1
        elif os.path.isdir(p):
            dodir(p)
        else:
            pass

def main(arg):
    global h
    h = Counter()
    for dir in arg:
        dodir(dir)

    s = n = 0
    for k, v in sorted(h.items()):
        print("Size %d -> %d file(s)" % (k, v))
        n += v
        s += k * v
    print("Total %d bytes for %d files" % (s, n))

main(sys.argv[1:] or ['.'])  # default to the current directory</lang>
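As noted above, the walk has no safeguards. One possible hardening, sketched here with `os.walk` (the `size_histogram` helper and its log-bucketing are illustrative additions, not part of the original solution; `os.walk` also avoids the recursion-depth limit of the explicit recursion above):

```python
import os
from collections import Counter
from math import log10

def size_histogram(root="."):
    """Counter keyed by order-of-magnitude bucket of file size (-1 = empty file)."""
    h = Counter()
    # onerror=... swallows unreadable directories instead of aborting the walk
    for dirpath, dirnames, filenames in os.walk(root, onerror=lambda err: None):
        for name in filenames:
            p = os.path.join(dirpath, name)
            if os.path.islink(p):
                continue          # skip symlinks, as the original does
            try:
                size = os.stat(p).st_size
            except OSError:
                continue          # unreadable file: skip it, don't crash
            h[int(log10(size)) if size > 0 else -1] += 1
    return h
```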


=={{header|zkl}}==
<lang zkl>pipe:=Thread.Pipe();
   // hoover all files in tree, don't return directories
fcn(pipe,dir){ File.globular(dir,"*",True,8,pipe); }
   .launch(pipe,vm.arglist[0]);  // thread

dist,N,SZ,maxd:=List.createLong(50,0),0,0,0;
foreach fnm in (pipe){
   sz,szd:=File.len(fnm), sz.numDigits;
   dist[szd]+=1;
   N+=1; SZ+=sz; maxd=maxd.max(szd);
}
println("Found %d files, %,d bytes, %,d mean.".fmt(N,SZ,SZ/N));

scale:=50.0/(0.0).max(dist);
szchrs,idx,comma:=",nnn"*20, -1, Walker.cycle(0,0,1).next;
println("%15s %s (* = %.2f)".fmt("File size","Number of files",1.0/scale));
foreach sz,cnt in ([0..].zip(dist[0,maxd])){
   println("%15s : %s".fmt(szchrs[idx,*], "*"*(scale*cnt).round().toInt()));
   idx-=1 + comma();
}</lang>

Output:
$ zkl flSzDist.zkl ..
Found 1832 files, 108,667,806 bytes, 59,316 mean.
      File size   Number of files (* = 13.44)
              n : *
             nn : ***
            nnn : ********
          n,nnn : **********************************
         nn,nnn : **************************************************
        nnn,nnn : ********************************
      n,nnn,nnn : *******

$ zkl flSzDist.zkl /media/Tunes/
Found 4320 files, 67,627,849,052 bytes, 15,654,594 mean.
      File size   Number of files (* = 69.84)
              n : 
             nn : 
            nnn : 
          n,nnn : *
         nn,nnn : 
        nnn,nnn : 
      n,nnn,nnn : *
     nn,nnn,nnn : **************************************************
    nnn,nnn,nnn : ********
  n,nnn,nnn,nnn : *
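The scale line in the output above (`* = 13.44` means each asterisk stands for roughly 13.44 files) comes from fitting the largest bucket into 50 columns. A minimal sketch of the same scaling in Python (the `bars` helper and the sample counts are illustrative):

```python
def bars(counts, width=50):
    """Scale counts so the largest bucket is `width` asterisks wide."""
    scale = width / max(counts)
    return ["*" * round(c * scale) for c in counts]

# one row per digit-length bucket, labelled like the zkl output
for digits, row in enumerate(bars([1, 3, 8, 34, 50, 32, 7]), start=1):
    print("%15s : %s" % ("n" * digits, row))
```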