Re: ZFS PANIC: HELP.
On 02/25/2022 2:11 am, Alexander Leidinger wrote:
> Quoting Larry Rosenman (from Thu, 24 Feb 2022 20:19:45 -0600):
>> I tried a scrub -- it panic'd on a fatal double fault. Suggestions?
>
> The safest / cleanest (but not fastest) approach is data export and pool
> re-creation. If you export dataset by dataset (instead of all of them
> recursively), you can even see which dataset is causing the issue. If
> this per-dataset export narrows the issue down to a dataset you don't
> care about (as in: 1) no problem re-creating it from scratch, or 2) a
> backup is available), you could delete that dataset (or each such
> dataset) and re-create it in place (= not re-creating the entire pool).
>
> Bye,
> Alexander.

I'm running this script:

#!/bin/sh
for FS in $(zfs list -H | awk '{print $1}')
do
	FN=$(echo ${FS} | sed -e 's@/@_@g')
	sudo zfs send -vecLep ${FS}@REPAIR_SNAP | ssh l...@freenas.lerctr.org "cat > ${FN}"
done

How will I know a "problem" dataset?

-- 
Larry Rosenman                  http://www.lerctr.org/~ler
Phone: +1 214-642-9640          E-Mail: l...@lerctr.org
US Mail: 5708 Sabbia Dr, Round Rock, TX 78665-2106
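One way to answer the "how will I know" question: a corrupt dataset should make its zfs send exit non-zero (or panic the box, at which point the last dataset printed is your suspect), so checking each send's exit status and collecting the failures identifies the problem dataset(s). Below is a minimal sketch of that pattern. Note the zfs/ssh pipeline itself is replaced by a stand-in function (send_one) with simulated dataset names, since the real command only makes sense on the affected pool; the snapshot name @REPAIR_SNAP and the receiving host are taken from the script above.

```shell
#!/bin/sh
# Per-dataset send with failure tracking: each dataset is sent on its
# own, so a failing dataset identifies itself by a non-zero exit status.
#
# send_one is a stand-in for illustration; in the real script its body
# would be something like:
#   sudo zfs send -vecLep "$1@REPAIR_SNAP" | ssh l...@freenas.lerctr.org "cat > $2"
send_one() {
    case "$1" in
        */broken) return 1 ;;   # simulate one corrupt dataset
        *)        return 0 ;;
    esac
}

FAILED=""
# Stand-in list; the real script would use: $(zfs list -H -o name)
for FS in zroot/usr zroot/var zroot/broken
do
    FN=$(echo "${FS}" | sed -e 's@/@_@g')   # flatten name for the remote file
    if ! send_one "${FS}" "${FN}"; then
        echo "SEND FAILED: ${FS}" >&2
        FAILED="${FAILED} ${FS}"
    fi
done
echo "problem dataset(s):${FAILED}"
```

Because the pipe's overall exit status is that of the last command (ssh/cat), a real version may also want `set -o pipefail` where the shell supports it, so a zfs send failure isn't masked by a successful remote cat.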
Error building openoffice in current
gmake[1]: Leaving directory '/usr/ports/editors/openoffice-4/work/aoo-4.1.11/main/sw'
sw deliver
deliver -- version: 275594
COPY: build.lst -> /usr/ports/editors/openoffice-4/work/aoo-4.1.11/main/solver/4111/unxfbsdx.pro/inc/sw/build.lst
LOG: writing /usr/ports/editors/openoffice-4/work/aoo-4.1.11/main/solver/4111/unxfbsdx.pro/inc/sw/deliver.log
Module 'sw' delivered successfully. 1 files copied, 0 files unchanged

1 module(s): testtools need(s) to be rebuilt

Reason(s):
ERROR: error 65280 occurred while making /usr/ports/editors/openoffice-4/work/aoo-4.1.11/main/testtools/source/bridgetest

When you have fixed the errors in that module you can resume the build by running:

	build --from testtools

I get this error when trying to update openoffice-4. Any help appreciated.

Filippo
Re: ZFS PANIC: HELP.
Quoting Larry Rosenman (from Thu, 24 Feb 2022 20:19:45 -0600):

> I tried a scrub -- it panic'd on a fatal double fault. Suggestions?

The safest / cleanest (but not fastest) approach is data export and pool
re-creation. If you export dataset by dataset (instead of all of them
recursively), you can even see which dataset is causing the issue. If
this per-dataset export narrows the issue down to a dataset you don't
care about (as in: 1) no problem re-creating it from scratch, or 2) a
backup is available), you could delete that dataset (or each such
dataset) and re-create it in place (= not re-creating the entire pool).

Bye,
Alexander.

-- 
http://www.Leidinger.net  alexan...@leidinger.net: PGP 0x8F31830F9F2772BF
http://www.FreeBSD.org    netch...@freebsd.org  : PGP 0x8F31830F9F2772BF