Duplicate Document Cleanup - Best Practice

We used the migrator tool to import a little under 2,500 document files. The CSV data file was missing some key information, yet the migrator ran (very slowly) without errors over a weekend, and hundreds of files were created in the vault. The following week, users noticed thousands of duplicate documents in their class group. We destroyed all documents associated with two classes, reconfigured, and reran the importer. That seemed to resolve the issue, but we are still seeing thousands of duplicates in other classes in the group.

We have a search that returns all duplicates for the class group, but that search does not show a total count of the duplicates on the first page (in either the new or the old client we see "1-xxx of 10,000+"). Following support documents from 2023, I was unable to adjust the maximum results setting in Admin.
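As a workaround for the capped on-screen count, one option is to export the search results (or the original import data) to a CSV and count duplicates offline. This is a minimal sketch, not a vendor-supported method; it assumes the export contains a column (here hypothetically named "Name") that identifies each document.

```python
import csv
from collections import Counter

def count_duplicates(path, key="Name"):
    """Count duplicate values of `key` in a CSV export of search results.

    Returns (dupes, total_extra) where `dupes` maps each duplicated
    value to its occurrence count, and `total_extra` is the number of
    redundant copies (occurrences beyond the first).
    """
    with open(path, newline="", encoding="utf-8") as f:
        counts = Counter(row[key] for row in csv.DictReader(f))
    dupes = {name: n for name, n in counts.items() if n > 1}
    total_extra = sum(n - 1 for n in dupes.values())
    return dupes, total_extra
```

The `total_extra` figure gives a concrete number of documents to destroy, which the truncated "10,000+" display does not.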

We need guidance on how to proceed with correcting and cleaning up the remaining duplicates.

We are on SaaS, desktop client 26.2.15718.4 (New/Old)...