I wrote a file duplication processor which gets the MD5 hash of each file, adds it to a hashmap, then takes all of the files with the same hash and adds them to a hashmap called dupeList. But when scanning large directories such as C:\Program Files\, it throws the following error:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.nio.file.Files.read(Unknown Source)
at java.nio.file.Files.readAllBytes(Unknown Source)
at com.embah.FileDupe.Utils.FileUtils.getMD5Hash(FileUtils.java:14)
at com.embah.FileDupe.FileDupe.getDuplicateFiles(FileDupe.java:43)
at com.embah.FileDupe.FileDupe.getDuplicateFiles(FileDupe.java:68)
at ImgHandler.main(ImgHandler.java:14)
I'm sure it's because it handles so many files, but I'm not sure of a better way to handle it. I'm trying to get this working so I can sift through all my kids' baby pictures and remove duplicates before I put them on my external hard drive for long-term storage. Thanks everyone for the help!
My code:
import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.MessageDigest;
import javax.xml.bind.DatatypeConverter;

public class FileUtils {
    public static String getMD5Hash(String path){
        try {
            byte[] bytes = Files.readAllBytes(Paths.get(path)); //LINE STACK THROWS ERROR
            byte[] hash = MessageDigest.getInstance("MD5").digest(bytes);
            String hexHash = DatatypeConverter.printHexBinary(hash);
            return hexHash;
        } catch(Exception e){
            System.out.println("Having problem with file: " + path);
            return null;
        }
    }
}
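From what I've been reading, the usual fix for this is to hash through a stream so only a small buffer is ever in memory, instead of readAllBytes pulling the whole file into the heap at once. This is just a sketch of what I think that would look like (untested; the class name StreamingFileUtils and the 8 KB buffer size are my own choices for illustration):

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.MessageDigest;
import javax.xml.bind.DatatypeConverter;

public class StreamingFileUtils {
    //Hash the file a buffer at a time so a multi-GB file never sits in the heap.
    public static String getMD5Hash(String path){
        try (InputStream in = Files.newInputStream(Paths.get(path))) {
            MessageDigest md = MessageDigest.getInstance("MD5");
            byte[] buf = new byte[8192];
            int n;
            while((n = in.read(buf)) != -1){
                md.update(buf, 0, n); //feed the digest incrementally
            }
            return DatatypeConverter.printHexBinary(md.digest());
        } catch(Exception e){
            System.out.println("Having problem with file: " + path);
            return null;
        }
    }
}

Would a drop-in replacement along these lines be the right direction?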
import java.io.File;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class FileDupe {
    public static Map<String, List<String>> getDuplicateFiles(String dirs){
        Map<String, List<String>> allEntrys = new HashMap<>(); //<hash, file locations>
        Map<String, List<String>> dupeEntrys = new HashMap<>();
        File fileDir = new File(dirs);
        if(fileDir.isDirectory()){
            ArrayList<File> nestedFiles = getNestedFiles(fileDir.listFiles());
            for(File file : nestedFiles){
                String path = file.getAbsolutePath();
                String hash = FileUtils.getMD5Hash(path);
                if(hash == null)
                    continue;
                List<String> seen = allEntrys.get(hash);
                if(seen == null || !seen.contains(path)) //containsValue(path) could never match a List value
                    put(allEntrys, hash, path);
            }
        }
        allEntrys.forEach((hash, locs) -> {
            if(locs.size() > 1){
                dupeEntrys.put(hash, locs);
            }
        });
        return dupeEntrys;
    }
    public static Map<String, List<String>> getDuplicateFiles(String... dirs){
        Map<String, List<String>> dupeMap = new HashMap<>();
        for(String dir : dirs){ //merge each directory's dupes into one map
            getDuplicateFiles(dir).forEach((hash, locs) ->
                dupeMap.merge(hash, locs, (a, b) -> { a.addAll(b); return a; })); //combine lists instead of overwriting
        }
        return dupeMap;
    }
    protected static ArrayList<File> getNestedFiles(File[] fileDir){
        ArrayList<File> files = new ArrayList<File>();
        return getNestedFiles(fileDir, files);
    }

    protected static ArrayList<File> getNestedFiles(File[] fileDir, ArrayList<File> allFiles){
        if(fileDir == null) //listFiles() returns null for unreadable directories
            return allFiles;
        for(File file : fileDir){
            if(file.isDirectory()){
                getNestedFiles(file.listFiles(), allFiles);
            } else {
                allFiles.add(file);
            }
        }
        return allFiles;
    }
    protected static <KEY, VALUE> void put(Map<KEY, List<VALUE>> map, KEY key, VALUE value) {
        map.computeIfAbsent(key, k -> new ArrayList<>()).add(value); //create the list on first use
    }
}
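Side note: while searching, I also saw suggestions to (a) walk the tree with Files.walk instead of recursing over listFiles, and (b) group files by size first, since files of different sizes can't be identical, so most files never need hashing at all. A rough sketch of both ideas together (the class name SizePrefilter is mine, just for illustration):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class SizePrefilter {
    //Group regular files under dir by size and keep only groups of 2+,
    //which are the only candidates worth hashing.
    public static Map<Long, List<Path>> sizeGroups(String dir) throws IOException {
        try (Stream<Path> walk = Files.walk(Paths.get(dir))) {
            Map<Long, List<Path>> bySize = walk
                .filter(Files::isRegularFile)
                .collect(Collectors.groupingBy(p -> {
                    try { return Files.size(p); }
                    catch(IOException e){ return -1L; } //unreadable; filtered out later
                }));
            bySize.values().removeIf(group -> group.size() < 2); //unique sizes can't be dupes
            return bySize;
        }
    }
}

No idea if that's the standard approach, but it seems like it would cut the hashing down a lot.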
import java.util.List;
import java.util.Map;
import java.util.Scanner;
import com.embah.FileDupe.FileDupe;

public class ImgHandler {
    private static Scanner s = new Scanner(System.in);

    public static void main(String[] args){
        System.out.print("Please enter locations to scan for duplicates\nSeparate locations via semicolon(;)\nLocations: ");
        String[] locList = s.nextLine().split(";");
        Map<String, List<String>> dupes = FileDupe.getDuplicateFiles(locList);
        System.out.println(dupes.size() + " dupes detected!");
        dupes.forEach((hash, locs) -> {
            System.out.println("Hash: " + hash);
            locs.forEach((loc) -> System.out.println("\tLocation: " + loc));
        });
    }
}
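For now I can postpone the crash by giving the JVM a bigger heap when I launch it, e.g.

java -Xmx2g ImgHandler

but I assume a large enough file would still blow past whatever limit I pick, so I'd rather fix how the files are read.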