Hash is a value in Screaming Frog's SEO Spider that is used for detecting exact duplicate content.
Using a hashing algorithm, the crawler computes an identifying value from the content of each URL it encounters. Because the value is derived from the content itself, any page whose content verbatim matches another page's will receive the same hash, flagging the pair as exact duplicates.
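As a minimal sketch of the idea (Screaming Frog's exact implementation isn't specified here, so the MD5 algorithm and the URLs and page bodies below are illustrative assumptions):

```python
import hashlib

def content_hash(html: str) -> str:
    # A hash is deterministic: identical input always yields the same digest.
    return hashlib.md5(html.encode("utf-8")).hexdigest()

# Hypothetical crawl results: /b is a verbatim copy of /a, /c differs.
pages = {
    "https://example.com/a": "<html><body>Widget guide</body></html>",
    "https://example.com/b": "<html><body>Widget guide</body></html>",
    "https://example.com/c": "<html><body>Gadget guide</body></html>",
}

hashes = {url: content_hash(body) for url, body in pages.items()}

# /a and /b share a hash (exact duplicates); /c gets a different one.
assert hashes["https://example.com/a"] == hashes["https://example.com/b"]
assert hashes["https://example.com/a"] != hashes["https://example.com/c"]
```

Grouping URLs by their hash value is all it takes to surface every set of exact duplicates in a crawl.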
The Importance Of Hash Values To Your SEO Strategy
Hash values are effectively unique to each page because they are computed from the page's content: only an exact copy of another page's content will produce a matching value.
When running a crawl, make sure any URLs that share a hash value are either updated to have different content, canonicalized, or noindexed.
If none of those options makes sense, 301 redirect the worse-performing URL to the better-performing one. This avoids duplicate content issues while still sending users to the page with the better user experience.
Note that pages with similar content will not necessarily have similar hash values; the check is designed to catch exact matches only.
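To illustrate why near-duplicates slip past this check, here is a small sketch (again assuming MD5 purely for illustration): changing even a single character produces a completely different digest, so two almost-identical pages get unrelated hash values.

```python
import hashlib

a = "<p>Our complete guide to on-page SEO.</p>"
b = "<p>Our complete guide to on-page SEO!</p>"  # one character changed

hash_a = hashlib.md5(a.encode("utf-8")).hexdigest()
hash_b = hashlib.md5(b.encode("utf-8")).hexdigest()

# Near-identical content, but the digests do not match at all.
assert hash_a != hash_b
```

This is why the Hash column only surfaces exact duplicates; finding near-duplicates requires a separate similarity check.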