ZFS deduplication
Romain Tartière
romain at blogreen.org
Wed Jan 23 14:37:31 UTC 2013
Hello,
My machine has run on full ZFS for quite a few years now without major
problems, but it suddenly started reporting inconsistent "statistics". It
all started with `zfs list` reporting far too much available space:
> # zfs list data
> NAME USED AVAIL REFER MOUNTPOINT
> data 117G 5,73E 4,52G legacy
"data" being a mirror of two 500 GB disks, the available space is
obviously wrong. The pool is healthy according to `zpool status`:
> # zpool status data
> pool: data
> state: ONLINE
> scan: scrub repaired 0 in 5h1m with 0 errors on Tue Jan 1 09:52:50 2013
> config:
>
> NAME STATE READ WRITE CKSUM
> data ONLINE 0 0 0
> mirror-0 ONLINE 0 0 0
> ada0p3 ONLINE 0 0 0
> ada1p3 ONLINE 0 0 0
>
> errors: No known data errors
However, `zpool list` reports an absurd deduplication ratio (it used
to be around 1.4x, as far as I can recall):
> zpool list data
> NAME SIZE ALLOC FREE CAP DEDUP HEALTH ALTROOT
> data 460G 101G 359G 21% 1386985.39x ONLINE -
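(For what it's worth, I assume the zdb(8) DDT summary and histogram would
be the way to check whether the deduplication table itself agrees with
this ratio, something along these lines, though I have not verified how
trustworthy zdb output is on a live pool:)
> # zdb -D data
> # zdb -DD data
If I read the manual page correctly, the first form prints an overall DDT
summary including the dedup ratio, and the second a histogram of entries
by reference count.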
At first I thought a process might have created bazillions of files with
the same content, but df(1) does not report any filesystem with an
abnormal number of used inodes. Likewise, df(1) does not show a
filesystem holding a huge amount of data that could come from a single
file whose content is one block of data repeated over and over (but maybe
df(1) simply cannot tell me that; see the note after the listing below):
> df -i
> Filesystem 1K-blocks Used Avail Capacity iused ifree %iused Mounted on
> data 6455005842309380 4735434 6455005837573946 0% 45457 281474976665199 0% /
> devfs 1 1 0 100% 0 0 100% /dev
> procfs 4 4 0 100% 1 0 100% /proc
> fdescfs 1 1 0 100% 4 22497 0% /dev/fd
> data/tmp 6455005837580729 6783 6455005837573946 0% 2087 281474976708569 0% /tmp
> data/usr 6455005853245211 15671265 6455005837573946 0% 532871 281474976177785 0% /usr
> tank 1920937431 490426844 1430510587 26% 1315413 2861021175 0% /usr/home
> data/poudriere/data 6455005839432671 1858725 6455005837573946 0% 11958 281474976698698 0% /usr/local/poudriere/data
> data/poudriere/8_3_RELEASE_amd64 6455005838384969 811023 6455005837573946 0% 56480 281474976654176 0% /usr/local/poudriere/jails/8_3_RELEASE_amd64
> data/poudriere/9_0_RELEASE_amd64 6455005838615454 1041508 6455005837573946 0% 66523 281474976644133 0% /usr/local/poudriere/jails/9_0_RELEASE_amd64
> data/poudriere/jails/9_1_RELEASE_amd64 6455005838652760 1078814 6455005837573946 0% 67766 281474976642890 0% /usr/local/poudriere/jails/9_1_RELEASE_amd64
> data/poudriere/ports/default 6455005838064055 490109 6455005837573946 0% 151864 281474976558792 0% /usr/local/poudriere/ports/default/ports
> data/poudriere/ports/gnome 6455005838069283 495337 6455005837573946 0% 152758 281474976557898 0% /usr/local/poudriere/ports/gnome/ports
> data/poudriere/ports/mono 6455005838062663 488717 6455005837573946 0% 152411 281474976558245 0% /usr/local/poudriere/ports/mono/ports
> data/poudriere/ports/romain 6455005838064956 491010 6455005837573946 0% 152030 281474976558626 0% /usr/local/poudriere/ports/romain/ports
> data/usr/main 6455005838086216 512270 6455005837573946 0% 164164 281474976546492 0% /usr/ports
> data/var 6455005856718703 19144757 6455005837573946 0% 158141 281474976552515 0% /var
> data/var/cache 6455005837574041 95 6455005837573946 0% 17 281474976710639 0% /var/cache
> data/var/cache/portshaker 6455005837587068 13122 6455005837573946 0% 4658 281474976705998 0% /var/cache/portshaker
> data/var/cache/portshaker/bsd_sharp 6455005837578549 4603 6455005837573946 0% 1695 281474976708961 0% /var/cache/portshaker/bsd_sharp
> data/var/cache/portshaker/e 6455005837575230 1284 6455005837573946 0% 977 281474976709679 0% /var/cache/portshaker/e
> data/var/cache/portshaker/freebsd_texlive 6455005837574185 239 6455005837573946 0% 124 281474976710532 0% /var/cache/portshaker/freebsd_texlive
> data/var/cache/portshaker/freebsd_texlive_ports 6455005837578532 4586 6455005837573946 0% 289 281474976710367 0% /var/cache/portshaker/freebsd_texlive_ports
> data/var/cache/portshaker/freebsd_texlive_ports_marcuscom 6455005837574066 120 6455005837573946 0% 29 281474976710627 0% /var/cache/portshaker/freebsd_texlive_ports_marcuscom
> data/var/cache/portshaker/freebsd_texlive_releng 6455005837611826 37880 6455005837573946 0% 29335 281474976681321 0% /var/cache/portshaker/freebsd_texlive_releng
> data/var/cache/portshaker/ports 6455005838062115 488169 6455005837573946 0% 152039 281474976558617 0% /var/cache/portshaker/ports
> data/var/cache/portshaker/ports/distfiles 6455005842833837 5259891 6455005837573946 0% 2605 281474976708051 0% /var/cache/portshaker/ports/distfiles
> data/var/cache/portshaker/ports/packages 6455005856315476 18741530 6455005837573946 0% 16799 281474976693857 0% /var/cache/portshaker/ports/packages
> data/var/cache/portshaker/redports 6455005837574488 542 6455005837573946 0% 312 281474976710344 0% /var/cache/portshaker/redports
> data/var/cache/portshaker/redports:fneufneu 6455005837575482 1536 6455005837573946 0% 317 281474976710339 0% /var/cache/portshaker/redports:fneufneu
> data/var/cache/portshaker/redports:romain 6455005837574295 349 6455005837573946 0% 226 281474976710430 0% /var/cache/portshaker/redports:romain
> data/var/cache/portshaker/redports:virtualbox 6455005837574717 771 6455005837573946 0% 444 281474976710212 0% /var/cache/portshaker/redports:virtualbox
> data/var/cache/portshaker/redports:xbmc 6455005837573977 31 6455005837573946 0% 7 281474976710649 0% /var/cache/portshaker/redports:xbmc
> data/var/cache/portshaker/romain 6455005837577011 3065 6455005837573946 0% 930 281474976709726 0% /var/cache/portshaker/romain
> data/var/cache/portshaker/woot 6455005837574065 119 6455005837573946 0% 75 281474976710581 0% /var/cache/portshaker/woot
> data/var/jails 6455005841336901 3762955 6455005837573946 0% 517515 281474976193141 0% /var/jails
> data/var/jails/base 6455005838403648 829702 6455005837573946 0% 11659 281474976698997 0% /var/jails/base
> data/var/jails/r230911 6455005837857185 283239 6455005837573946 0% 11581 281474976699075 0% /var/jails/r230911
> data/var/jails/texlive-test 6455005844601398 7027452 6455005837573946 0% 496201 281474976214455 0% /var/jails/texlive-test
> data/var/jails/texlive.home.sigabrt.org 6455005850841853 13267907 6455005837573946 0% 377896 281474976332760 0% /var/jails/texlive.home.sigabrt.org
> data/var/jails/texlive.home.sigabrt.org/ports 6455005838230119 656173 6455005837573946 0% 152331 281474976558325 0% /var/jails/texlive.home.sigabrt.org/usr/ports
> data/var/jails/xbmc 6455005840442396 2868450 6455005837573946 0% 228330 281474976482326 0% /var/jails/xbmc
> devfs 1 1 0 100% 0 0 100% /var/named/dev
> devfs 1 1 0 100% 0 0 100% /var/jails/texlive.home.sigabrt.org/dev
> devfs 1 1 0 100% 0 0 100% /var/jails/texlive/dev
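As for ruling out a runaway dataset, I suppose the ZFS space breakdown
would be more meaningful than df(1) here; something like this is what I
had in mind (a sketch, nothing clever):
> # zfs list -o space -r data
That should split each dataset's usage into snapshots, the dataset
itself, refreservation and children, if I understand the space accounting
properties correctly.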
Does anyone have an idea about what is going on? Is there any way to find
out which deduplicated blocks are referenced the most, and in which
file(s), so that I can locate the source of the problem? Or is there any
way to prove that this reporting is simply wrong?
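What I naively have in mind is dumping the individual DDT entries and
looking for the one with the insane reference count, roughly like this
(a guess; the output is probably huge, and I have no idea how to map a
DDT entry back to a file):
> # zdb -DDDD data > /tmp/ddt.txt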
I'm running 9.1-STABLE r245578, with zpool v28 (I guess; I'm not sure of
the right way to confirm that) and ZFS v5 filesystems everywhere except a
few old snapshots (v4 and v3, according to `zfs get version`).
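Perhaps something as simple as the following would confirm the pool
version; I mention it only as a guess I have not double-checked on this
pool:
> # zpool get version data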
Thanks!
Romain
--
Romain Tartière <romain at blogreen.org> http://romain.blogreen.org/
pgp: 8234 9A78 E7C0 B807 0B59 80FF BA4D 1D95 5112 336F (ID: 0x5112336F)
(plain text =non-HTML= PGP/GPG encrypted/signed e-mail much appreciated)