ZFS L2arc 16.0E size

Frank de Bot (lists) lists at searchy.net
Mon Feb 16 00:30:51 UTC 2015


Hello,

I have a FreeBSD 10.1 system with a raidz2 ZFS configuration and two SSDs
for L2ARC. It is running '10.1-STABLE FreeBSD 10.1-STABLE #0 r278805'.
I'm currently running tests before the system goes to production, but I
have run into the following issue: after a while, the L2ARC devices report
16.0E of free space and start 'consuming' more than they can hold:

cache                        -      -      -      -      -      -
  gpt/l2arc1              107G  16.0E      0      2      0  92.7K
  gpt/l2arc2             68.3G  16.0E      0      1      0  60.8K

It ran fine for a while: data was evicted from the cache so it could be
filled with newer data, and free space always hovered around 200-300 MB.

I've read about similar issues that should have been fixed in various
commits, but I'm running the latest stable 10.1 kernel right now. (One of
the most recent similar reports is
https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=197164 .)
Another similar issue, reported against FreeNAS at
https://bugs.freenas.org/issues/5347 , was suggested to be a hardware
problem, but I have two servers that both show the same behaviour. One
has a Crucial M500 drive and the other an M550. Both have a 64G partition
for L2ARC.

What is really going on here?


Regards,


Frank de Bot


More information about the freebsd-stable mailing list