Human vs. computer in RAID recovery.

It is something of a human vs. computer battle.

As far as I am aware, there is no way to rebuild an HFS+ RAID from file-system analysis. NTFS is simple because it has good system counters to use, same as EXT. But HFS+ requires good knowledge of RAID distributions.

As you see, people rely on some property of the filesystem being recovered to produce a strong signal that can be used to determine the correct RAID configuration. NTFS provides plenty of these; FAT provides even more. With EXT, hddguy probably knows more than I do, because I am not aware of any strong signal in EXT. By and large, humans prefer to find a small bit of data with a high signal-to-noise ratio and use that. This is understandable, because it limits the amount of effort involved.

RAID recovery software, on the other hand, mostly works with weaker signals. Weak signals have a far worse SNR, but they are plentiful. For example, you can calculate an entropy value for any block of data. The obvious strength of a computer is that it can quickly process large arrays of data, and the software exploits this in full: it crunches through a sheer number of weak signals to produce the RAID layout.
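To give a sense of what such a weak signal looks like, here is a minimal sketch of per-block Shannon entropy, one of the measures mentioned above. The block size and function name are illustrative, not taken from any particular recovery tool:

```python
import math
from collections import Counter

def block_entropy(block: bytes) -> float:
    """Shannon entropy of a block, in bits per byte (0.0 to 8.0).

    Low values suggest sparse or repetitive data; values near 8.0
    suggest compressed or encrypted content. Any single block tells
    you little, but entropy profiles across many disks and offsets
    can hint at stripe boundaries and member order.
    """
    if not block:
        return 0.0
    n = len(block)
    counts = Counter(block)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Examples of the signal on synthetic 512-byte blocks:
print(block_entropy(b"\x00" * 512))          # zero-filled block -> 0.0
print(block_entropy(b"AB" * 256))            # two-symbol pattern -> 1.0
print(block_entropy(bytes(range(256)) * 2))  # uniform bytes -> 8.0
```

A recovery tool would evaluate something like this over millions of blocks and combine the results statistically, which is exactly the brute-force advantage a computer has over a human looking for one strong clue.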
