I'm having trouble reading data from a certain VMFS5 datastore. I've successfully migrated many machines of varying sizes from another datastore, also VMFS5.
The problem seems to affect any file between 8 KB and roughly 512 GB (the 512 GB is a guess: a 500 GB image doesn't work, but 718 GB and larger images do).
My test case is a small text file that I appended to until it became unreadable through vmfs-fuse.
Test case: a 2784-byte file is readable with vmfs-fuse. On a VMware node I copy the file to a new file; it still reads correctly. I then append the original file to the copy; the data is still correct. I append once more and end up with an 8352-byte file whose content is 'garbage', although the md5sum stays the same when I re-read the file.
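The append sequence can be re-run locally on an ordinary filesystem just to pin down the byte counts involved (the file names and the use of /dev/urandom are illustrative; this does not touch VMFS itself):

```shell
set -e
head -c 2784 /dev/urandom > original.bin   # stand-in for the 2784-byte text file
cp original.bin copy.bin                   # copy: still readable through vmfs-fuse
cat original.bin >> copy.bin               # first append: 5568 bytes, still readable
cat original.bin >> copy.bin               # second append: 8352 bytes, garbage via vmfs-fuse
wc -c < copy.bin                           # prints 8352
```

Note that 8352 bytes is the first size in this sequence past the 8 KB (8192-byte) mark, which matches the lower bound of the affected range.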
Editing the file and reducing its size does not make it readable again.
With vmfs-fuse in debug mode nothing obvious (to me) changes; it still reports the same node id.
I've tried applying pull request #14 (which I needed anyway for 256 GB+ images), but it does not help with the 8 KB file issue.
Any suggestions on how to further debug this issue would be greatly appreciated.
Below is the debug log of a successful and a failed read of the file.
Start vmfs-fuse -d
cd /to/dir
Copy the test file on the VMware node to a new file, then read it through vmfs-fuse.
Append the data on the VMware node once more and read through vmfs-fuse.
Append the data on the VMware node once more and read through vmfs-fuse.
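For reference, the session behind those steps could look like the sketch below. The extent device (/dev/sdb1), mountpoint (/mnt/vmfs), and file name (copy.txt) are my assumptions for illustration, not values from the log; the script skips the mount when vmfs-fuse is not installed:

```shell
if command -v vmfs-fuse >/dev/null 2>&1; then
    vmfs-fuse -d /dev/sdb1 /mnt/vmfs &   # -d: FUSE debug output on stderr
    sleep 1                              # give the mount a moment to come up
    md5sum /mnt/vmfs/copy.txt            # re-run after each append on the VMware node;
    md5sum /mnt/vmfs/copy.txt            # a stable-but-wrong hash points at the reader,
    status="mounted"                     # not at flaky I/O
else
    echo "vmfs-fuse not available here"  # sketch only; run on the host with the datastore
    status="skipped"
fi
```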