Wednesday, March 2, 2016

ESXi 6.0 Bug: Deprecated VMFS volume warning reported by ESXi hosts (adding iSCSI LUN)

Today we needed to add a new iSCSI LUN to one of our vCenter 6.0 environments and found a bug in ESXi 6.0.

After creating the LUN on the NetApp, we presented it to the hosts.
Adding the iSCSI datastore on the first host, everything was OK.
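On the remaining hosts, the newly presented LUN can be picked up with a storage rescan from the ESXi shell. A minimal sketch (the grep pattern is just a convenience; iSCSI devices typically show up with naa.* identifiers):

```shell
# Rescan all storage adapters so the host sees the newly presented LUN
esxcli storage core adapter rescan --all

# Verify the new device is visible
esxcli storage core device list | grep -i naa

# List mounted filesystems and their VMFS versions
esxcli storage filesystem list
```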

But when the rest of the hosts recognized the new datastore, we got a warning on all of them.

"Deprecated VMFS volume(s) found on the host. Please consider upgrading volume(s) to the latest version"

Troubleshooting in the host log at /var/log/hostd.log, I found this:

warning hostd[2EFC2B70] [Originator@6876 sub=Hostsvc.DatastoreSystem opID=7878B682-0000041D-2b-bb-41-25e0 user=vpxuser] VMFS volume [/vmfs/volumes/56d6cfc8-c7b45bfc-0cd5-984be167ca4c] of version [0] is not supported.
warning hostd[2EFC2B70] [Originator@6876 sub=Hostsvc.DatastoreSystem opID=7878B682-0000041D-2b-bb-41-25e0 user=vpxuser] UpdateConfigIssues: Deprecated VMFS filesystems detected. These volumes should be upgraded to the latest version 
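To quickly check whether a host is affected, you can search hostd.log for the "version [0]" marker. A small sketch; on the host itself you would grep the live log, while the sample line below (copied from the log above) shows what the match looks like:

```shell
# On the ESXi host, the live check would be:
#   grep -i "deprecated VMFS" /var/log/hostd.log
#
# Sample log line for illustration (from the excerpt above):
LOG_LINE='warning hostd[2EFC2B70] VMFS volume [/vmfs/volumes/56d6cfc8-c7b45bfc-0cd5-984be167ca4c] of version [0] is not supported.'

# "of version [0]" is the telltale sign: the filesystem version was
# unknown during the initial detection
if echo "$LOG_LINE" | grep -q 'of version \[0\]'; then
  echo "host is affected"
fi
```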

It seems ESXi 6.0 has a bug when adding an iSCSI LUN to a host: while the volume is being mounted (still in the unmounted state), the filesystem version is not yet known during the initial detection, so it cannot be matched in that initial state and the host raises this warning.

There is no fix from VMware for this issue yet.

The KB about this bug is here: VMware KB 2109735

Since there is no fix at the moment, the workaround is to restart the management agents on the affected hosts. This will clear the warning message.

To restart the management agents we can go through the Direct Console User Interface (DCUI) and just choose the Restart Management Agents option.

Note: This option can disconnect your ESXi host temporarily from vCenter.

Or we can log in to the ESXi host via SSH (my preferred option) and restart them from the console.

Just use this:

/etc/init.d/hostd restart
/etc/init.d/vpxa restart

Do this on all affected hosts (those showing the warning message) and it will clear.
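With several affected hosts, the two restart commands can be scripted instead of opening a session on each host by hand. A sketch, assuming SSH is enabled on the hosts; the host names in HOSTS are placeholders for your own ESXi hosts:

```shell
#!/bin/sh
# Restart hostd and vpxa on every affected host over SSH.
# HOSTS is a placeholder list - replace with your own ESXi host names.
HOSTS="esxi01 esxi02 esxi03"

for h in $HOSTS; do
  echo "Restarting management agents on $h"
  ssh root@"$h" '/etc/init.d/hostd restart && /etc/init.d/vpxa restart'
done
```

Expect each host to drop out of vCenter for a short while as hostd and vpxa come back up.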

Hope this can help you fixing this bug.

Note: Share this article if you think it is worth sharing.
