--- Comment #16 from ds <> ---
Adding some additional testing I performed.

Just to rule out an anomaly in the test system described above (your internal test on Windows Server 2016 Hyper-V didn't reveal a problem with virtual SCSI drives, and I can't replicate that since I don't have the server version of Windows), I repeated the test on another PC running the same OS platform as my test platform (Windows 10 Pro 64-bit, but an older 3rd-gen Intel CPU: i5-3570K, 16 GB RAM). That system successfully runs a VM image of NAS4Free 10.3, based on FreeBSD 10.3, with a ZFS volume of 4 drives using RAID-Z1 resiliency.

I created a new VM with the newer NAS4Free 11.1 image and imported the config file from the 10.3 NAS image. That all ran fine, but the SCSI-defined disk drives attached to the VM were no longer visible - the same problem I reported on my test platform. I've done enough testing to know that this process doesn't damage the contents of an existing ZFS volume, so I wasn't concerned about going back to 10.3 afterwards (which worked fine).
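For reference, here is a sketch of the checks one could run inside the FreeBSD/NAS4Free guest to confirm whether the Hyper-V virtual SCSI disks are visible and whether the existing ZFS pool survived the image swap. These are standard FreeBSD tools (camcontrol, kldstat, zpool); the pool name "tank" is a placeholder, since the original report doesn't name the pool.

```shell
# List the disks the kernel actually sees (da0, da1, ... should appear
# if the Hyper-V virtual SCSI drives attached)
camcontrol devlist
sysctl kern.disks

# Did the Hyper-V synthetic SCSI driver (storvsc) attach? Note it may be
# compiled into GENERIC rather than loaded as a module, so also check dmesg.
kldstat | grep -i hv
dmesg | grep -i storvsc

# Verify the pool is intact after switching images back
zpool import          # lists pools available for import
zpool status -v      # after import: confirm the RAID-Z1 vdev is ONLINE
zpool scrub tank      # optional: full integrity check of pool data
```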

If NAS4Free 11 with FreeBSD 11 runs fine under the Server 2016 version of Hyper-V but not under the desktop version (which is from the same series - Server 2016 latest and W10 Pro Creators Update), then it begs the question of what differs between those two platforms as it relates to virtual machines with integration services, and why FreeBSD 11 works with them differently than 10.3 did.

In the NAS4Free support forum, someone else also reported running into this problem with the VMware ESX hypervisor (version unknown), which led me to think that Hyper-V was not the cause.
