Gvinum configuration changed after a simple reboot

Olivier Cochard olivier at freenas.org
Sat Mar 25 13:50:02 UTC 2006


Hi all!

It's me again, torturing gvinum :-)

I've found a strange problem when creating a gvinum RAID 5 volume.

I can reproduce this problem every time.

I'm working on a VMware Workstation PC running 6.1-PRERELEASE with 3
hard drives:
- ad0: 102 MB
- ad1: 204 MB
- ad3: 306 MB

And I want to test a RAID 5 volume with these 3 disks.

Then I create this config file:

$ cat /var/etc/raid.conf
drive disk_ad0 device /dev/ad0s1a
drive disk_ad1 device /dev/ad1s1a
drive disk_ad3 device /dev/ad3s1a
volume raid
plex org raid5 256k
sd length 0 drive disk_ad0
sd length 0 drive disk_ad1
sd length 0 drive disk_ad3
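
Then I create the volume from this file; assuming the standard gvinum
syntax, the command is something like:

$ gvinum create /var/etc/raid.conf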

With this configuration I should obtain a 204 MB RAID 5 volume: with 3
disks, RAID 5 capacity is (number of disks - 1) * size of the smallest
disk, i.e. 2 * 102 MB = 204 MB. And gvinum confirms that with this
output:

Software RAID information and status:
3 drives:
D disk_ad3              State: up    /dev/ad3s1a    A: 0/306 MB (0%)
D disk_ad1              State: up    /dev/ad1s1a    A: 0/204 MB (0%)
D disk_ad0              State: up    /dev/ad0s1a    A: 0/102 MB (0%)

1 volume:
V raid                  State: up    Plexes:       1    Size:        204 MB

1 plex:
P raid.p0            R5 State: up    Subdisks:     3    Size:        204 MB

3 subdisks:
S raid.p0.s2            State: up    D: disk_ad3     Size:        306 MB
S raid.p0.s1            State: up    D: disk_ad1     Size:        204 MB
S raid.p0.s0            State: up    D: disk_ad0     Size:        102 MB

I can run newfs on it without problem and put files on it...
All seems OK.
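
For the record, the steps I run on the volume are along these lines
(/mnt being just an example mount point):

$ newfs /dev/gvinum/raid
$ mount /dev/gvinum/raid /mnt
$ cp /boot/kernel/kernel /mnt/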

But... when I reboot the VMware workstation, I get this:

Software RAID status:
3 drives:
D disk_ad0              State: up    /dev/ad0s1a    A: 0/102 MB (0%)
D disk_ad1              State: up    /dev/ad1s1a    A: 0/204 MB (0%)
D disk_ad3              State: up    /dev/ad3s1a    A: 0/306 MB (0%)

1 volume:
V raid                  State: up    Plexes:       1    Size:        613 MB

1 plex:
P raid.p0            R5 State: up    Subdisks:     3    Size:        613 MB

3 subdisks:
S raid.p0.s0            State: up    D: disk_ad0     Size:        102 MB
S raid.p0.s1            State: up    D: disk_ad1     Size:        204 MB
S raid.p0.s2            State: up    D: disk_ad3     Size:        306 MB

The total size of the RAID volume has changed... and I can't use it
anymore!! The volume has to be deleted and recreated.
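
To delete it, something like the following should work (the -r flag
also removes the subordinate plex and subdisks; gvinum resetconfig
would wipe the whole configuration instead):

$ gvinum rm -r raid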

$ fsck -t ufs /dev/gvinum/raid
Cannot find file system superblock
fsck_ufs: /dev/gvinum/raid: can't read disk label
** /dev/gvinum/raid
ioctl (GCINFO): Inappropriate ioctl for device
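
In case it helps with debugging: I suppose the on-disk configuration
could be dumped and compared before and after the reboot, to see
whether the subdisk lengths are being rewritten:

$ gvinum printconfig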

Any ideas?

Thanks,

Olivier

--
Olivier Cochard
FreeNAS main developer
http://www.freenas.org
Skype: callto://ocochard
