Thursday Exercise 2.1: Using a Web Proxy for Large Shared Input

Continuing the series of exercises blasting mouse genetic sequences, the objective of this exercise is to use a web proxy to stage the large database, which will be downloaded into each of many jobs that use the split input files from the last exercise (Exercise 1.3).


Place the Large File on the Proxy

First, you'll need to put the pdbaa_files.tar.gz file onto the Stash web directory. Use the following command:

user@login $ cp pdbaa_files.tar.gz ~/stash/public/

Test a download of the file

Once the file is placed in the ~/stash/public directory, it can be downloaded from a URL of the form http://<stash-web-server>/~<USERNAME>/pdbaa_files.tar.gz, where <stash-web-server> is the address of the Stash web server and <USERNAME> is your username on the submit server (make sure to keep the ~ before your username!).

Using the above convention, and from a different directory on the submit server (any directory will do), you can test the download of your pdbaa_files.tar.gz file with a command like the following:

user@login $ wget http://<stash-web-server>/~<USERNAME>/pdbaa_files.tar.gz

Replace <USERNAME> with your own username (but keep the ~!). You may realize that you have been using wget to download files from a web proxy in many of the previous exercises at the school!
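If you want to be confident that the copy served over HTTP arrived intact, you can compare it byte-for-byte against the original in ~/stash/public. Here is a minimal sketch; it uses small stand-in files so it runs anywhere, and in the real exercise you would compare your downloaded ./pdbaa_files.tar.gz against ~/stash/public/pdbaa_files.tar.gz instead:

```shell
# Sketch: verify that a fetched copy is byte-identical to the original.
# Stand-in files are used here; in the exercise, compare your downloaded
# ./pdbaa_files.tar.gz against ~/stash/public/pdbaa_files.tar.gz instead.
printf 'demo data' > original.dat
cp original.dat fetched.dat
if cmp -s original.dat fetched.dat; then
    echo "download matches original"
else
    echo "files differ -- re-download"
fi
rm -f original.dat fetched.dat
```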

Run a New Test Job

Now, you'll repeat the last exercise (with a single input query file) but have HTCondor download the pdbaa_files.tar.gz file from the web proxy, instead of having the file transferred from the submit server.

Modify the submit file and wrapper script

In the wrapper script, we have to add some special lines so that the job can pull the large file through the HTTP proxy. Specifically, add commands to download, unpack, and clean up the data file:


# Set the http_proxy environment which wget uses
export http_proxy=$OSG_SQUID_LOCATION

# Copy the pdbaa_files.tar.gz to the worker node
# Add the -S argument, so we can see if it was a cache HIT or MISS
wget -S http://<stash-web-server>/~<USERNAME>/pdbaa_files.tar.gz

tar xvzf pdbaa_files.tar.gz

./blastx -db pdbaa -query mouse.fa -out mouse.fa.result

rm pdbaa*

Be sure to replace <USERNAME> with your own user name.

The new wget line will download pdbaa_files.tar.gz through the nearest HTTP proxy cache, because wget checks the http_proxy environment variable and routes the request through the Squid cache closest to the worker node.

In your submit file, you will need to remove pdbaa_files.tar.gz from transfer_input_files, because the tarball is now transferred via the HTTP proxy instead of from the submit server!
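For reference, the relevant part of the submit file might now look like the sketch below. The executable and input-file names here are examples drawn from this exercise series; check them against your own files:

```
# Sketch of the changed submit-file lines; file names are examples.
executable = blast_wrapper.sh
# pdbaa_files.tar.gz is gone from this list -- the wrapper now
# fetches it through the HTTP proxy with wget instead.
transfer_input_files = blastx, mouse.fa

queue
```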

Submit the test job

You may wish to first remove the log, result, output, and error files from the previous tests, which will be overwritten when the new test job completes.

user@login $ rm *.error *.out *.result *.log

Submit the test job!

When the job starts, the wrapper will download the pdbaa_files.tar.gz file from the web proxy. If the job runs for longer than two minutes, you can assume that it is working and will complete successfully, and continue with the rest of the exercise.

After the job completes, examine the error file generated by the submission. At the top of the file, you will find something like:

--2019-07-11 18:29:14--  http://<stash-web-server>/~<USERNAME>/pdbaa_files.tar.gz
Resolving <proxy-host> (<proxy-host>)... <proxy-ip>
Connecting to <proxy-host> (<proxy-host>)|<proxy-ip>|:3128... connected.
Proxy request sent, awaiting response... 
  HTTP/1.0 200 OK
  Server: nginx/1.10.2
  Date: Thu, 11 Jul 2019 18:29:12 GMT
  Content-Type: application/octet-stream
  Content-Length: 22105124
  Last-Modified: Mon, 09 Jul 2018 23:22:11 GMT
  ETag: "5b43ee23-1514c24"
  Accept-Ranges: bytes
  Age: 2
  X-Cache: HIT from <proxy-host>
  Via: 1.1 <proxy-host> (squid/frontier-squid-2.7.STABLE9-27.1.osg33.el6)
  Connection: keep-alive
  Proxy-Connection: keep-alive
Length: 22105124 (21M) [application/octet-stream]

Notice the X-Cache line: it says the request was a cache HIT from the proxy. Yay! You successfully used a proxy to cache data near your worker node! Note that the name of the cache may be different in your output.
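Once several jobs have run, you may want a quick tally of HITs versus MISSes across all of their error files. A sketch of one way to do it; the demo below writes stand-in .error files so it is self-contained, while in practice you would just run the grep over your real *.error files:

```shell
# Demo: count cache HITs vs MISSes recorded by wget -S in the error files.
# Stand-in files below; on the submit server, grep your real *.error files.
printf 'X-Cache: HIT from proxy-a\n'  > job1.error
printf 'X-Cache: MISS from proxy-a\n' > job2.error
printf 'X-Cache: HIT from proxy-a\n'  > job3.error
# Prints each distinct X-Cache line prefixed with how many times it appeared
grep -h 'X-Cache' job1.error job2.error job3.error | sort | uniq -c
rm -f job1.error job2.error job3.error
```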

Run all 100 Jobs!

If all of the previous tests have gone okay, you can prepare to run all 100 jobs that will use the split input files. To make sure you're not going to generate too much data, use the size of files from the previous test to calculate how much total data you're going to add to the thur-blast-split directory for 100 jobs.
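One way to do that arithmetic is to multiply the size of the single test's result file by 100. A sketch in shell; it creates a stand-in file so it runs anywhere, and you would point result_file at your real mouse.fa.result instead:

```shell
# Estimate the total output of 100 jobs from the size of one test result.
# A stand-in ~220 kB file is created here; use your real result file instead.
head -c 220000 /dev/zero > demo.result
result_file=demo.result
jobs=100
bytes=$(stat -c %s "$result_file")        # GNU stat; use 'stat -f %z' on BSD
total_mb=$(( bytes * jobs / 1024 / 1024 ))
echo "roughly ${total_mb} MB of results for ${jobs} jobs"
rm -f demo.result
```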

Make sure you remove pdbaa_files.tar.gz from the transfer_input_files in the split submit file.

Submit all 100 jobs! They may take a while to all complete, but it will still be faster than the many hours it would have taken to blast the single, large mouse_rna.fa file without splitting it up. In the meantime, as long as the first several jobs are running for longer than two minutes, you can move on to the next exercise.