Hi Chris,

I think you are running into this bug:

http://bugs.opensolaris.org/bugdatabase/view_bug.do?bug_id=6538600
pool version is not updated in the label config after zpool upgrade and
before reboot

Exporting and importing the pool or a reboot seems to clear it up.
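For what it's worth, a minimal sketch of the export/import workaround. The pool name "tank" below is just a placeholder; note that the root pool cannot be exported while the system is booted from it, which is why a reboot is the option there.

```shell
# Export and re-import a non-root pool so the on-disk label
# config is rewritten with the current pool version.
zpool export tank
zpool import tank

# The version recorded in the label should now match the
# running pool version.
zdb tank | grep -w version
```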

Thanks,

Cindy

On 09/27/10 23:06, Chris Mosetick wrote:
The strange behavior I witnessed on the machine that had its hostname renamed was never resolved or investigated further. Luckily it was an experiment/test machine. As for the zpool headers not getting updated after a zpool upgrade, I filed a bug on the Illumos bug tracker:

http://illumos.org/issues/217

Somehow my formatting on the bug entry got mangled. :)

Luckily this bug appears to affect only zdb. The pools and file systems are in fact upgraded after you initiate the upgrade. FYI, I have witnessed the same behavior when upgrading pools created on a clean OpenSolaris b134 machine to OpenIndiana b147, zpool version 28.

I'm under the impression that this bug would not be difficult to fix, but given that zdb does not seem to be well documented, maybe it would in fact be hard to track down?



2010/9/27 Réfi Richárd <miam...@gmail.com <mailto:miam...@gmail.com>>

    Hi,

    Did you solve your issue?

    RR

    On Fri, Sep 10, 2010 at 2:56 AM, Chris Mosetick <cmoset...@gmail.com
    <mailto:cmoset...@gmail.com>> wrote:

        Not sure what the best list to send this to is right now, so I
        have selected a few, apologies in advance.

        A couple of questions.  First, I have a physical host (call him
        bob) that was just installed with b134 a few days ago.  I
        upgraded to b145 yesterday using the instructions on the
        Illumos wiki.  The pool has been upgraded (to version 27) and
        the ZFS file systems have been upgraded (to version 5).

        ch...@bob:~# zpool upgrade rpool
        This system is currently running ZFS pool version 27.
        Pool 'rpool' is already formatted using the current version.

        ch...@bob:~# zfs upgrade rpool
        7 file systems upgraded

        The file systems have been upgraded according to "zfs get
        version rpool"

        Looks ok to me.

        However, I now get an error when I run zdb -D.  I can't
        remember exactly when I turned dedup on, but I moved some data
        onto rpool, and "zpool list" shows a 1.74x dedup ratio.

        ch...@bob:~# zdb -D rpool
        zdb: can't open 'rpool': No such file or directory

        Also, running zdb by itself returns the expected output, but it
        still says my rpool is version 22.  Is that expected?  I never
        ran zdb before the upgrade, since it was a clean install from
        the b134 iso to go straight to b145.  One thing I will mention
        is that the hostname of the machine was changed too (using
        these instructions
        <http://wiki.genunix.org/wiki/index.php/Change_hostname_HOWTO>).
        bob used to be eric.  I don't know if that matters, but I can't
        open up "Users and Groups" from Gnome anymore ("unable to su"),
        so something is still not right there.

        Moving on, I have another fresh install of b134 from iso inside
        a VirtualBox virtual machine, on a totally different physical
        machine.  This machine is named weston and was upgraded to b145
        using the same Illumos wiki instructions.  His name has never
        changed.  When I run the same zdb -D command I get the expected
        output.

        ch...@weston:~# zdb -D rpool
        DDT-sha256-zap-unique: 11 entries, size 558 on disk, 744 in core
        dedup = 1.00, compress = 7.51, copies = 1.00, dedup * compress /
        copies = 7.51

        However, after zpool and zfs upgrades _on both machines_, they
        still say the rpool is version 22.  Is that expected/correct?  I
        added a new virtual disk to the vm weston to see what would
        happen if I made a new pool on the new disk.

        ch...@weston:~# zpool create test c5t1d0

        Well, the new "test" pool shows version 27, but rpool is still
        listed at 22 by zdb.  Is this expected/correct behavior?  See
        the output below for the rpool and test pool version numbers
        according to zdb on the host weston.
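        For what it's worth, here is a quick way to compare the two
        views side by side (just a sketch; "zpool get" reports the
        live pool version, while zdb reads the cached label config):

```shell
# Authoritative pool version, straight from the running pool:
zpool get version rpool

# Version as recorded in the label config that zdb reads; this
# can lag behind until an export/import or a reboot:
zdb rpool | grep -w version
```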

        Can anyone provide any insight into what I'm seeing?  Do I need
        to delete my b134 boot environments for rpool to show as version
        27 in zdb?  Why does zdb -D rpool give me can't open on the host
        bob?

        Thank you in advance,

        -Chris

        ch...@weston:~# zdb
        rpool:
            version: 22
            name: 'rpool'
            state: 0
            txg: 7254
            pool_guid: 17616386148370290153
            hostid: 8413798
            hostname: 'weston'
            vdev_children: 1
            vdev_tree:
                type: 'root'
                id: 0
                guid: 17616386148370290153
                create_txg: 4
                children[0]:
                    type: 'disk'
                    id: 0
                    guid: 14826633751084073618
                    path: '/dev/dsk/c5t0d0s0'
                    devid:
        'id1,s...@sata_____vbox_harddisk____vbf6ff53d9-49330fdb/a'
                    phys_path: '/p...@0,0/pci8086,2...@d/d...@0,0:a'
                    whole_disk: 0
                    metaslab_array: 23
                    metaslab_shift: 28
                    ashift: 9
                    asize: 32172408832
                    is_log: 0
                    create_txg: 4
        test:
            version: 27
            name: 'test'
            state: 0
            txg: 26
            pool_guid: 13455895622924169480
            hostid: 8413798
            hostname: 'weston'
            vdev_children: 1
            vdev_tree:
                type: 'root'
                id: 0
                guid: 13455895622924169480
                create_txg: 4
                children[0]:
                    type: 'disk'
                    id: 0
                    guid: 7436238939623596891
                    path: '/dev/dsk/c5t1d0s0'
                    devid:
        'id1,s...@sata_____vbox_harddisk____vba371da65-169e72ea/a'
                    phys_path: '/p...@0,0/pci8086,2...@d/d...@1,0:a'
                    whole_disk: 1
                    metaslab_array: 30
                    metaslab_shift: 24
                    ashift: 9
                    asize: 3207856128
                    is_log: 0
                    create_txg: 4

        _______________________________________________
        Developer mailing list
        develo...@lists.illumos.org <mailto:develo...@lists.illumos.org>
        http://lists.illumos.org/m/listinfo/developer




------------------------------------------------------------------------

_______________________________________________
opensolaris-discuss mailing list
opensolaris-discuss@opensolaris.org