Before I re-invent this particular wheel, has anybody got a nice routine for calculating the size of a directory using Python? It would be very nice if the routine would format the size nicely in MB/GB etc.
This walks all sub-directories, summing file sizes:
import os

def get_size(start_path='.'):
    total_size = 0
    for dirpath, dirnames, filenames in os.walk(start_path):
        for f in filenames:
            fp = os.path.join(dirpath, f)
            # skip if it is symbolic link
            if not os.path.islink(fp):
                total_size += os.path.getsize(fp)
    return total_size

print(get_size(), 'bytes')
And a one-liner for fun using os.listdir (does not include sub-directories):

import os
sum(os.path.getsize(f) for f in os.listdir('.') if os.path.isfile(f))
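If you do want sub-directories included, a pathlib-based variant (my own sketch, not from the answers above) does the same recursively:

```python
from pathlib import Path

# recursively sum the sizes of all regular files under the current directory
total = sum(p.stat().st_size for p in Path('.').rglob('*') if p.is_file())
print(total, 'bytes')
```

Path.rglob('*') walks the tree for you, and is_file() filters out directories; note that, unlike the os.walk version above, this follows symbolic links by default.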
Updated to use os.path.getsize, which is clearer than the os.stat().st_size method. Thanks to ghostdog74 for pointing this out!
os.stat - st_size gives the size in bytes, and can also be used to get other file-related information.
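The question also asked for output formatted in MB/GB; a minimal formatting helper (the name human_readable is my own, not part of any answer above) could look like this:

```python
def human_readable(num_bytes, decimal_places=1):
    """Format a byte count using binary unit prefixes (KiB, MiB, ...)."""
    size = float(num_bytes)
    for unit in ('B', 'KiB', 'MiB', 'GiB', 'TiB'):
        # stop once the value fits under 1024, or we run out of prefixes
        if size < 1024 or unit == 'TiB':
            return f"{size:.{decimal_places}f} {unit}"
        size /= 1024

print(human_readable(123456789))  # prints '117.7 MiB'
```

Combine it with get_size() above, e.g. human_readable(get_size()).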
If you use Python 3.4 or earlier, consider using the more efficient walk method provided by the third-party scandir package. In Python 3.5 and later, this package has been incorporated into the standard library, and os.walk has received the corresponding increase in performance.
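As a sketch of what a scandir-style traversal looks like (the helper name get_tree_size is my own; this assumes Python 3.6+, where os.scandir is in the standard library and supports the context-manager protocol):

```python
import os

def get_tree_size(path='.'):
    """Recursively sum file sizes using os.scandir."""
    total = 0
    with os.scandir(path) as it:
        for entry in it:
            if entry.is_dir(follow_symlinks=False):
                total += get_tree_size(entry.path)
            elif entry.is_file(follow_symlinks=False):
                # the stat result is usually cached from the directory scan,
                # avoiding an extra system call per file
                total += entry.stat(follow_symlinks=False).st_size
    return total

print(get_tree_size('.'), 'bytes')
```

The speedup comes from DirEntry caching file-type and stat information gathered while reading the directory, instead of issuing a separate stat call for every file.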
Some of the approaches suggested so far implement recursion, others employ a shell or will not produce neatly formatted results. When your code is one-off for Linux platforms, you can get formatting as usual, recursion included, by shelling out to du. The following du.py is simple, efficient, and works for files and multilevel directories:

#!/usr/bin/python3
import subprocess

def du(path):
    """disk usage in human readable format (e.g. '2,1GB')"""
    # take the first whitespace-separated token of the output (the size)
    return subprocess.check_output(['du', '-sh', path]).split()[0].decode('utf-8')

if __name__ == "__main__":
    print(du('.'))
$ chmod 750 du.py
$ ./du.py
2,9M
A bit late, 5 years on, but because this question is still in the hit lists of search engines, it might be of help...