You're staring at a No space left on device error. Again. That sinking feeling hits – where did all your disk space go? I've been there too. Last year, my server crashed during peak traffic because the log directory ballooned to 40GB overnight. That's when I realized mastering Linux directory size checks isn't optional – it's survival.
Why You Can't Ignore Directory Sizing
Disk space issues creep up like uninvited guests. One minute everything's fine, the next minute your database crashes or system updates fail. I once spent three hours troubleshooting a failed upgrade only to discover /var was 99% full. Knowing how to check directory sizes in Linux saves headaches and downtime. We'll cover:
• Deep analysis tools for storage audits
• Sorting and filtering tricks
• Permission workarounds
• Hidden gotchas that waste hours
The du Command: Your Swiss Army Knife
Let's start with du (disk usage). Basic but essential. I use this daily on my Ubuntu servers. The bread-and-butter syntax:
du -sh /path/to/directory
That -s gives the summary total, -h makes it human-readable (MB/GB). Want specifics? Check these variations:
Essential du Variations
| Command | Use Case | Personal Preference |
|---|---|---|
| du -sh * | Show all items in current dir | My morning "coffee check" |
| du -h --max-depth=1 | Depth-limited scanning | Saves hours on large trees |
| du -ch /var/log/* \| grep total | Get cumulative total only | For reports |
| sudo du -sh /root | Size with root access | (Annoying but necessary) |
Sorting is where du shines. Found a space hog? Try:
du -h --max-depth=1 / | sort -h
That sort -h handles human-readable sizes. Crucial. I learned this after manually converting KB/MB for 20 minutes once. Never again.
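If you only care about the biggest offenders, tacking tail onto the pipeline keeps the noise down (the /var path here is just an example):
du -h --max-depth=1 /var | sort -h | tail -5   # five largest entries, biggest last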
One warning: a full scan from the root (du /) can take hours. Use --max-depth or target specific directories.
When du Isn't Enough: Advanced Tools
Sometimes du feels like using a hammer for surgery. Enter these lifesavers:
ncdu: The Visual Explorer
Install with sudo apt install ncdu (Debian/Ubuntu) or sudo yum install ncdu (RHEL/CentOS, where you'll likely need the EPEL repository enabled). Why I love it:
ncdu /home
You get an interactive interface showing directory percentages. Hit d to delete right there. Found 8GB of obsolete Docker images this way last week.
| Keybind | Action |
|---|---|
| ↑↓ | Navigate directories |
| Enter | Drill down |
| d | Delete selected |
| g | Toggle percentages |
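A couple of ncdu invocations I reach for beyond the plain scan. Flags can vary slightly between versions, so treat these as a sketch and check ncdu --help on your box:
ncdu -x /                      # stay on one filesystem, skip mounted shares
ncdu -o /tmp/scan.json /home   # export the scan to a file (path is just an example)
ncdu -f /tmp/scan.json         # browse that export later without rescanning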
Baobab: For GUI Lovers
Installed by default on most GNOME systems. Launch with baobab. It shows ring charts – perfect for visual learners. My junior devs prefer it for quick directory size checks.
Special Case Survival Guide
Real-world sizing isn't textbook-perfect. Here's how I handle messy situations:
Cross-Filesystem Headaches
du loves traversing mounted drives. To stop it:
du -h --exclude="/mnt/*" --max-depth=1 /
Or use the -x flag:
sudo du -xh /
That -x saved me from scanning a 10TB network share last month. Lifesaver.
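If you want to see up front which mounts -x will be skipping, findmnt (part of util-linux on most distros) can list them. The filesystem types here are just examples:
findmnt -t nfs,nfs4,cifs   # network mounts that du -x won't descend into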
Permission Denied Errors
You'll see this scanning system directories. Solutions:
- Elevate carefully: sudo du -sh /var/log
- Redirect stderr: du -sh /* 2>/dev/null (combined into one line below)
- Find workarounds: use df -h for partition-level checks
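The combined one-liner I actually type, with /var standing in for whatever directory you're chasing:
sudo du -xh --max-depth=2 /var 2>/dev/null | sort -h | tail -10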
Symbolic Link Traps
By default, du doesn't follow symlinks; it counts the link itself, not its target. To follow them:
du -Lh /path
Warning: This can cause double-counting! I learned this the hard way when my backup script reported 200% disk usage.
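A quick sanity check is to run both modes side by side; /srv/app below is just a placeholder path:
du -sh  /srv/app    # links counted as tiny link files
du -shL /srv/app    # link targets counted too; a big jump means symlinks are inflating the number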
Pro Filtering Techniques
Finding needles in haystacks requires sharp filters. My go-to combos:
Exclude Specific Patterns
du -h --exclude='*.log' --exclude='cache' /var
Find + du Power Combo
find /home -type d -exec du -sh {} + | sort -h
To list directories whose contents exceed 1GB, skip find -size (on a directory it measures the directory entry itself, not its contents) and lean on GNU du's threshold option:
sudo du -h --threshold=1G / 2>/dev/null | sort -h
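find -size still earns its keep for individual large files. A sketch that stays on one filesystem (-xdev) and lists anything over 1GB:
find / -xdev -type f -size +1G -exec ls -lh {} + 2>/dev/null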
Disk Usage vs Actual Size: The Hidden Gap
Why does du report 12GB when the file sizes in ls -l add up to only 9GB (or sometimes the reverse)? Two culprits:
- Block size overhead: files consume at least one filesystem block (typically 4K), even a 1-byte file, which pushes du higher
- Sparse files: virtual machine images love these, and their unallocated holes pull du lower than the apparent size
Check real vs allocated space:
du -h --apparent-size /path   # Actual file sizes
du -h /path                   # Disk blocks used
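You can see the gap for yourself with a throwaway sparse file (truncate ships with GNU coreutils; the filename is arbitrary):
truncate -s 1G sparse.img            # a 1GB file that allocates almost no blocks
du -h sparse.img                     # shows roughly 0 on most filesystems
du -h --apparent-size sparse.img     # shows 1.0G
rm sparse.img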
FAQs: Your Burning Questions Answered
Why does checking directory size hang on large directories?
Likely traversing network mounts or millions of files. Use --max-depth=1 first. On my NAS, scanning the root directory takes 2 hours – depth limiting cuts it to 3 minutes.
How to exclude hidden directories when checking size?
Use pattern exclusion: du -h --exclude=".*" /home. Though honestly, plain dotfiles rarely eat much space; hidden directories like .cache are another story.
Which is faster for size checks: du or ls?
ls -l is faster because it only reads file metadata, but it can't total a directory tree. du actually walks the tree and counts disk blocks. For directory totals, du is irreplaceable.
How to track directory size changes over time?
Log outputs: date >> size_log.txt; du -sh /target >> size_log.txt. I have cron jobs doing this for critical directories.
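Here's a minimal sketch of the kind of cron job I mean. The watched directories and log path are just examples, so adjust to taste:
#!/bin/sh
# Drop into /etc/cron.daily/ to log directory sizes once a day
for dir in /var/log /var/lib/docker /home; do
    printf '%s %s\n' "$(date -Is)" "$(du -sh "$dir" 2>/dev/null)" >> /var/log/dirsize.log
done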
Tool Comparison: When to Use What
| Tool | Best For | Speed | Depth | My Verdict |
|---|---|---|---|---|
| du -sh | Quick single-dir checks | ★★★★★ | Shallow | Daily driver |
| du --max-depth | Multi-level analysis | ★★★☆☆ | Custom | Weekly audits |
| ncdu | Interactive exploration | ★★★★☆ | Deep | Crisis mode |
| find + du | Targeted hunting | ★★☆☆☆ | Precise | Forensics |
Personal Pitfalls to Avoid
After 10 years managing Linux systems, here's what I wish I knew:
1. Don't run a bare du on /
Without root it floods your terminal with permission errors, and without limits it crawls every mount. Use sudo du -x --max-depth=3 / | sort -h instead.
2. Remember sort -h exists
I manually converted sizes for years before discovering it. Don't be like me.
3. Check mount points first
90% of "disappeared" space is sitting on a separate partition. Run df -h before diving into directory-level checks (quick sketch below).
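The check itself takes seconds, and sorting on the use column surfaces the fullest partitions first (a rough sketch; the percentages sort fine numerically):
df -h                             # all mounted filesystems at a glance
df -hP | sort -k5 -n | tail -5    # the five fullest partitions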
Putting It All Together: My Workflow
When alerts scream "DISK SPACE CRITICAL":
- Run df -h to locate full partitions
- Quick scan: sudo du -h --max-depth=1 /problem | sort -h
- For deep dives: ncdu /problem
- Clean with interactive deletion
Monthly maintenance? Automated scripts logging directory sizes. Catching growth trends early prevents 3AM emergencies.
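For the catching-it-early part, a tiny df-based check in cron nags you before the pager does. A sketch, with the 90% threshold picked arbitrarily:
#!/bin/sh
# Print any partition above 90% use; pipe the output to whatever alerting you already have
df -hP | awk 'NR > 1 && $5+0 >= 90 { print $6 " is at " $5 }'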
Mastering these Linux directory size commands transforms storage management from panic-driven to proactive. Start with du -sh, graduate to ncdu, and always – always – remember sort -h.