freebsd-update install no space left on device
Date: Mon, 28 Jul 2025 00:04:34 UTC
I was doing an upgrade from 14.2 to 14.3 and started receiving a long list of 'no space left on device' errors after the reboot, while running install the second time. It did finish afterward, see below. I didn't expect this, as I had just upgraded what I thought was a mirror of this system and didn't have the issue there. Here is what the system looked like afterward...

> root@monitor1:/usr/local/etc # df -h
> Filesystem            Size    Used   Avail Capacity  Mounted on
> zroot/ROOT/default    2.0G    1.9G     70M     97%   /
> devfs                 1.0K      0B    1.0K      0%   /dev
> /dev/gpt/efiboot0     260M    1.3M    259M      1%   /boot/efi
> zroot/tmp              70M    144K     70M      0%   /tmp
> zroot/usr/ports        70M     96K     70M      0%   /usr/ports
> zroot/var/tmp          70M     96K     70M      0%   /var/tmp
> zroot/var/audit        70M     96K     70M      0%   /var/audit
> zroot/var/crash        70M     96K     70M      0%   /var/crash
> zroot/var/log          92M     22M     70M     24%   /var/log
> zroot                  70M     96K     70M      0%   /zroot
> zroot/home             70M     96K     70M      0%   /home
> zroot/var/mail         70M    212K     70M      0%   /var/mail
> zroot/home/admin       70M    156K     70M      0%   /home/admin
> zroot/usr/src          70M     96K     70M      0%   /usr/src

But then I looked at last night's daily run output and found zroot/ROOT/default was 5.7G, and yes, I double-checked to make sure I was looking at the correct daily run output email...

> Disk status:
> Filesystem            Size    Used   Avail Capacity  Mounted on
> zroot/ROOT/default    5.7G    4.2G    1.5G     73%   /
> devfs                 1.0K      0B    1.0K      0%   /dev
> /dev/gpt/efiboot0     260M    1.3M    259M      1%   /boot/efi
> zroot/usr/ports       1.5G     96K    1.5G      0%   /usr/ports
> zroot/var/audit       1.5G     96K    1.5G      0%   /var/audit
> zroot/var/crash       1.5G     96K    1.5G      0%   /var/crash
> zroot/var/log         1.5G     22M    1.5G      1%   /var/log
> zroot/home            1.5G     96K    1.5G      0%   /home
> zroot/tmp             1.5G    144K    1.5G      0%   /tmp
> zroot/var/mail        1.5G    212K    1.5G      0%   /var/mail
> zroot/usr/src         1.5G     96K    1.5G      0%   /usr/src
> zroot                 1.5G     96K    1.5G      0%   /zroot
> zroot/var/tmp         1.5G     96K    1.5G      0%   /var/tmp
> zroot/home/admin      1.5G    156K    1.5G      0%   /home/admin

How on earth did the size of zroot/ROOT/default change? More importantly at this point, am I going to start seeing any system issues, and what is the best way to remedy this? The zpool is 7.5G...

> root@monitor1:/usr/local/etc # zpool list
> NAME    SIZE  ALLOC   FREE  CKPOINT  EXPANDSZ   FRAG    CAP  DEDUP    HEALTH  ALTROOT
> zroot  7.50G  7.20G   310M        -         -    84%    95%  1.00x    ONLINE  -

This is a VM on a TrueNAS server. The mirror server on a separate host shows the same zpool at 89% CAP. These VMs are not that old; they were spun up on FreeBSD 14.1.
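My first guess is that the upgrade left snapshots or a backup boot environment behind under zroot/ROOT and that is what is eating the pool, but I haven't confirmed it yet. Something along these lines should show where the space actually went (this assumes the stock zroot/bectl layout the installer creates; nothing below is specific to my setup):

  # Per-dataset breakdown of data vs. snapshots vs. children
  zfs list -o space -r zroot

  # Any snapshots left over from the upgrade
  zfs list -t snapshot -o name,used,referenced -s used

  # Boot environments, including any backup created before the upgrade
  bectl list -a

  # freebsd-update stages its downloaded files here, which also lives on /
  du -sh /var/db/freebsd-update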
Here is how the freebsd-update install ended...

> install: c34fc2d9bef9bef3a5244a4bcb085e4167b09135c0d5388e79129eaab937a3ca: No such file or directory
> rm: c34fc2d9bef9bef3a5244a4bcb085e4167b09135c0d5388e79129eaab937a3ca: No such file or directory
> /usr/sbin/freebsd-update: cannot create 436480fa5540b23a3d090c53bb19ab30b5290f09364ac9a844dd75c7324dbc87: No space left on device
> install: 436480fa5540b23a3d090c53bb19ab30b5290f09364ac9a844dd75c7324dbc87: No such file or directory
> rm: 436480fa5540b23a3d090c53bb19ab30b5290f09364ac9a844dd75c7324dbc87: No such file or directory
> /usr/sbin/freebsd-update: cannot create b8b57186732467b24bcb1ced083cce181fb70ad4764feed7b6fec2eb3b8693a9: No space left on device
> install: b8b57186732467b24bcb1ced083cce181fb70ad4764feed7b6fec2eb3b8693a9: No such file or directory
> rm: b8b57186732467b24bcb1ced083cce181fb70ad4764feed7b6fec2eb3b8693a9: No such file or directory
> /usr/sbin/freebsd-update: cannot create 4f6891535bc9cc281fb941adfc0d11240dd6dd782f0b973074a954dfa715a57d: No space left on device
> install: 4f6891535bc9cc281fb941adfc0d11240dd6dd782f0b973074a954dfa715a57d: No such file or directory
> rm: 4f6891535bc9cc281fb941adfc0d11240dd6dd782f0b973074a954dfa715a57d: No such file or directory
> chflags: ///usr/lib/libl.a: No such file or directory
> chflags: ///usr/lib/libln.a: No such file or directory
>
> Restarting sshd after upgrade
> Performing sanity check on sshd configuration.
> Stopping sshd.
> Waiting for PIDS: 853.
> Performing sanity check on sshd configuration.
> Starting sshd.
> done.

--
Robert
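P.S. Unless someone tells me this is the wrong approach, my rough plan once I see where the space went is below; the boot environment and snapshot names are only placeholders for whatever bectl list and zfs list actually report on this box. Failing that, I assume the fallback is to grow the VM's disk on the TrueNAS side and expand the pool.

  # names below are placeholders - substitute what bectl list / zfs list report
  bectl destroy -o 14.2-RELEASE-backup          # drop an old boot environment plus its origin snapshot
  zfs destroy zroot/ROOT/default@pre-upgrade    # or an individual leftover snapshot
  zpool list zroot                              # CAP should fall well below 95%
  freebsd-update fetch                          # confirm nothing from the 14.3 upgrade is still pending
  # run "freebsd-update install" again only if fetch reports outstanding updates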