[Bug 262617] ZFS pool on USB drive does not mount correctly on startup

From: <bugzilla-noreply_at_freebsd.org>
Date: Thu, 17 Mar 2022 12:25:28 UTC
https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=262617

            Bug ID: 262617
           Summary: ZFS pool on USB drive does not mount correctly on
                    startup
           Product: Base System
           Version: 13.0-RELEASE
          Hardware: amd64
                OS: Any
            Status: New
          Severity: Affects Many People
          Priority: ---
         Component: kern
          Assignee: bugs@FreeBSD.org
          Reporter: donaldcallen@gmail.com

Created attachment 232513
  --> https://bugs.freebsd.org/bugzilla/attachment.cgi?id=232513&action=edit
dmesg

I have a 2TB Seagate Barracuda drive with a SATA/USB adapter that I use for
backups and archives. I created a ZFS pool (named Primary) on this drive using
the entire drive, not a partition, as the ZFS docs recommend. This drive was
previously used with a Linux system and had a GPT partition scheme with one
partition containing an ext4 filesystem.
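
For reference, a whole-disk pool like this is typically created with something
along the lines of the following (the da0 device node is an assumption; it can
vary between boots):

zpool create Primary /dev/da0

As far as I understand it, zpool create on a raw disk writes its labels at the
start of the drive but does not necessarily clear the backup GPT table kept in
the last sectors, so remnants of the old partitioning can survive.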

If I have this drive connected to my system when I power it on, the first thing
to note is that I get some chatter in /var/log/messages about a corrupted GPT
partition table:

Mar 17 07:59:38 pangloss kernel: GEOM: da0: the primary GPT table is corrupt or
invalid.
Mar 17 07:59:38 pangloss kernel: GEOM: da0: using the secondary instead --
recovery strongly advised.

da0 is the device node assigned to the disk.
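
To see what GEOM and ZFS each think is on the disk, these read-only commands
can be used (da0 assumed, as above):

gpart show da0
zdb -l /dev/da0

gpart show reports the partition scheme GEOM tasted, and zdb -l dumps the ZFS
vdev labels; together they should confirm whether the pool's labels and a
stale backup GPT coexist on the same device.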

When the system comes up, 'df' indicates that the pool is mounted where it is
supposed to be and the space utilization numbers are correct. But 'ls' of the
mountpoint returns nothing at all; none of the files are accessible. As root, I
unmount the pool and remount it, and then the contents appear correctly.
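
For concreteness, the workaround is roughly this sequence (the /Primary
mountpoint is an assumption based on the pool name; the actual path is
whatever 'zfs get mountpoint Primary' reports):

zfs unmount Primary
zfs mount Primary
ls /Primary

Running 'mount | grep Primary' before the unmount might also show whether the
dataset is genuinely mounted at that point or whether df is reporting
something stale.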

If the drive is not connected to the system at boot time and I connect it
later, I get similar messages about a corrupt primary GPT table. If I try to
mount the pool with

zfs mount Primary

the mount fails. I don't have the error messages in front of me as I write
this; I will try to reproduce the failure and report the exact sequence in a
subsequent comment.
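
One plausible explanation (an assumption on my part, not confirmed) is that a
pool on a drive attached after boot is never imported automatically, so 'zfs
mount' has no imported pool to operate on. The usual sequence for a
hot-plugged pool would be:

zpool import
zpool import Primary

The first form only lists pools available for import; the second actually
imports the pool, which by default also mounts its datasets.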

zpool status Primary

said that the pool did not exist. I then rebooted the system with the drive
plugged in and got the same behavior described above (blank filesystem at
first, everything OK after an unmount/remount).

dmesg attached.

-- 
You are receiving this mail because:
You are the assignee for the bug.