Today I had to do something a bit ugly. I had received a data dump from a third-party vendor with directories and file names longer than the 260-character maximum that Windows/PowerShell can handle (MAX_PATH). Because of this we needed to flatten the structure by pulling all of the files out and placing them into a single folder to be imported. My typical PowerShell script for this doesn't work, and while you can get PowerShell to cope using symlinks and other workarounds, I didn't have time: as always, this needed to be done yesterday, and it was 15+ GB of data.
Bring out good old batch, with for loops and robocopy mixed in!
REM Expand the destination to a fully qualified path (the quotes are kept in the variable)
for %%F in (%destination%) do set destination="%%~fF"
REM Recurse through every folder under the source, skip the destination itself, and flat-copy the listed file types with no retries on failures
for /r "%source%" %%F in (.) do if "%%~fF" neq %destination% ROBOCOPY "%%F" %destination% *.pdf *.docx *.doc *.wmv *.xls *.xlsx /COPY:DATSO /R:0
I used mapped drives to make the paths a little easier to deal with, kicked this off, and it worked flawlessly. Keep in mind we are copying, not moving, these files, so the originals stay behind as duplicates. Also pay attention to the /COPY:DATSO option (data, attributes, timestamps, security, owner), as you may need to tweak it depending on your requirements, and /R:0 tells robocopy not to retry failed copies.
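If you'd rather have a scripted alternative, the same flatten-and-copy idea can be sketched in Python. This is a hypothetical equivalent, not the batch approach above: the `flatten_tree` name and the collision-renaming behavior are my own additions (robocopy would instead overwrite or skip same-named files), and `shutil.copy2` preserves timestamps, roughly matching the D, A, and T portion of /COPY:DATSO but not the NTFS security or owner info.

```python
import shutil
from pathlib import Path

# File types to pull out, mirroring the robocopy filter above
EXTENSIONS = {".pdf", ".docx", ".doc", ".wmv", ".xls", ".xlsx"}

def flatten_tree(source, destination):
    """Copy matching files from anywhere under source into one flat folder.

    Name collisions get a numeric suffix instead of being overwritten.
    """
    src, dest = Path(source), Path(destination)
    dest.mkdir(parents=True, exist_ok=True)
    for f in src.rglob("*"):
        if dest in f.parents:          # skip the destination if it sits under source
            continue
        if f.is_file() and f.suffix.lower() in EXTENSIONS:
            target = dest / f.name
            n = 1
            while target.exists():     # rename duplicates: report.pdf -> report (1).pdf
                target = dest / f"{f.stem} ({n}){f.suffix}"
                n += 1
            shutil.copy2(f, target)    # copy2 keeps file data and timestamps
```

On Windows, prefixing both paths with `\\?\` lets Python bypass the 260-character limit entirely, which is the part batch can't easily do.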
Hope that helps!