Download many files in parallel? (Linux/Python?)


I have a big list of remote file locations and the local paths where they should end up. Each file is small, but there are many of them. I am generating the list within Python.

I would like to download as many of these files as possible in parallel, prior to unpacking and processing them. What is the best library or Linux command-line utility for me to use? I attempted to implement this using multiprocessing.Pool, but it did not work with the FTP library.
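For reference, here is a minimal sketch of the kind of thing I am after, done purely in Python: a thread pool via concurrent.futures driving ftplib (threads rather than multiprocessing.Pool). The host, login, and file list below are placeholders, not my real setup.

    # A minimal sketch, assuming Python 3 and plain FTP; the host, login and
    # the (remote, local) file list are placeholders, not real values.
    import ftplib
    import os
    from concurrent.futures import ThreadPoolExecutor

    FTP_HOST = "ftp.example.com"   # placeholder
    FTP_USER = "anonymous"         # placeholder
    FTP_PASS = ""                  # placeholder

    # Each entry: (remote path on the server, local path where it should end up)
    file_pairs = [
        ("/remote/dir/file1.dat", "downloads/file1.dat"),
        ("/remote/dir/file2.dat", "downloads/file2.dat"),
    ]

    def fetch(pair):
        remote_path, local_path = pair
        local_dir = os.path.dirname(local_path)
        if local_dir:
            os.makedirs(local_dir, exist_ok=True)
        # One connection per call keeps the threads from sharing a socket.
        with ftplib.FTP(FTP_HOST, FTP_USER, FTP_PASS) as ftp:
            with open(local_path, "wb") as f:
                ftp.retrbinary("RETR " + remote_path, f.write)
        return local_path

    # The files are small, so a modest pool of threads downloading them
    # concurrently hides most of the per-file latency.
    with ThreadPoolExecutor(max_workers=8) as pool:
        for done in pool.map(fetch, file_pairs):
            print("downloaded", done)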

I looked at pycurl, and it seemed to be what I wanted, but I could not get it to run on Windows 7 x64.

I use pscp for things like this, and call it using subprocess.Popen.

For example:

    import subprocess

    # Copy files from the Linux machine to the Windows machine with pscp.
    pscp_command = '''"c:\program files\putty\pscp.exe" -pw <pwd> -p -scp -unsafe <file location on linux machine, including machine name and login; wildcards allowed> <where you want the files to go on the windows machine>'''
    p = subprocess.Popen(pscp_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = p.communicate()
    p.wait()

Of course, I'm assuming Linux --> Windows here.
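If you need many copies running at once rather than one wildcarded pscp call, a simple variation (a sketch only; the pscp options mirror the command above, and the transfer list is a placeholder) is to launch several pscp processes from a thread pool:

    # A sketch, assuming Python 3 (or the futures backport on Python 2);
    # the password and the (remote, destination) pairs are placeholders.
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    PSCP = '"c:\\program files\\putty\\pscp.exe" -pw <pwd> -p -scp -unsafe'

    # (remote spec on the linux machine, destination folder on the windows machine)
    transfers = [
        ("user@linuxbox:/data/run1/*", "c:\\downloads\\run1"),
        ("user@linuxbox:/data/run2/*", "c:\\downloads\\run2"),
    ]

    def copy(pair):
        remote, local = pair
        cmd = "%s %s %s" % (PSCP, remote, local)
        p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        stdout, stderr = p.communicate()
        return remote, p.returncode, stderr

    # Run a handful of pscp processes at a time; each copies one batch of files.
    with ThreadPoolExecutor(max_workers=4) as pool:
        for remote, returncode, stderr in pool.map(copy, transfers):
            if returncode != 0:
                print("failed:", remote, stderr)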

