I have managed to get my first Python script to work. It downloads a list of ZIP files from a URL and then extracts the ZIP files and writes them to disk. I am now at a loss on how to achieve the next step.

My primary goal is to download and extract the ZIP file and pass the contents (CSV data) via a TCP stream. I would prefer not to actually write any of the ZIP or extracted files to disk if I can get away with it.

Here is my current script (abridged), which works but unfortunately has to write the files to disk:

```python
import os
import pickle

# retrieve list of URLs from the webservers
# remove entries older than 5 days (to maintain speed)

# open logfile for downloaded data and save to local variable
downloadedLog = pickle.load(open('downloaded.pickle', 'rb'))

# check for extraction directories existence
...

# skip anything already downloaded or already present on disk
if url in downloadedLog or os.path.isfile(outputFilename):
    ...

print("Saving extracted file to", outputFilename)

# file successfully downloaded and extracted; store into local log and filesystem log
pickle.dump(downloadedLog, open('downloaded.pickle', "wb"))
```
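One way to avoid the disk entirely is to hold the downloaded archive in a `io.BytesIO` buffer and hand that to `zipfile.ZipFile`, which accepts any file-like object. A minimal sketch of that approach is below; the function names (`extract_members`, `download_and_extract`, `stream_member`) and the length-prefixed wire format are illustrative assumptions, not part of the original script, and the receiving end of the TCP stream would need to agree on whatever framing is actually used.

```python
import io
import socket
import urllib.request
import zipfile


def extract_members(zip_bytes):
    """Unpack an in-memory ZIP archive without touching the filesystem.

    Returns a dict mapping each member name to its raw bytes.
    """
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        return {name: archive.read(name) for name in archive.namelist()}


def download_and_extract(url):
    """Download a ZIP file straight into memory and unpack it."""
    with urllib.request.urlopen(url) as response:
        return extract_members(response.read())


def stream_member(sock, name, payload):
    """Send one extracted CSV over an already-connected TCP socket.

    Uses a simple ad-hoc framing (member name, then payload size in
    bytes, then the payload itself) purely as an illustration.
    """
    header = "{0}\n{1}\n".format(name, len(payload)).encode("utf-8")
    sock.sendall(header + payload)
```

The existing bookkeeping (the `downloadedLog` pickle and the `url in downloadedLog` check) would carry over unchanged; only the write-to-disk and re-read steps go away, since the CSV bytes are already in memory when they are sent.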