Most administrators of the Evolution™ Payroll Service-Bureau software are familiar with the iSystems support site, where updates and patches are published. Visiting that site requires the username and password associated with the service bureau, and the browser has likely remembered these credentials for convenient login each time.
Downloading files to a Windows middle-tier system is normally done with a standard browser (IE or Firefox), but fetching files (such as SQL patch files) to the Linux system is usually a bit more work.
Though it's possible to download files via a browser to the Windows system and then transfer them to Linux, it's easiest to fetch them directly from the iSystems site while on the Linux machine itself.
This Evo Tip discusses the wget command, how it interacts with authentication, and how to create a simple wrapper script to make this easier.
The popular tool wget is found on essentially every Linux machine, and it fetches a file based on a full web URL provided on the command line. By way of example, we'll attempt to fetch the evo-sysver tool from this website:
$ cd /tmp
$ wget http://www.unixwiz.net/evo/evo-sysver
--14:40:33--  http://www.unixwiz.net/evo/evo-sysver
           => `evo-sysver'
Connecting to www.unixwiz.net:80... connected!
HTTP request sent, awaiting response... 200 OK
Length: 353 [text/plain]

100%[=================================================>] 5,886  --.--K/s

14:40:33 (28.73 KB/s) - `evo-sysver' saved [353/353]

$ chmod +x evo-sysver
Now the file — evo-sysver — is left in the current directory (as well as made executable).
A complication arises when we attempt to fetch a file from a password-protected resource, such as the iSystems support site (here, a SYSTEM database update):
$ wget http://is1.isystemsllc.com/sbupdate/9.0_Killington/9-0-8-91_system.zip
...
Authorization failed.
...
The remote website is requesting the username and password, but wget is unable to provide them, so the attempted transfer fails. To remedy this, we provide our credentials on the command line, along with the URL, and find that wget is able to retrieve the file this time:
$ wget --http-user="stevepay" --http-passwd="abc123" \
       http://is1.isystemsllc.com/sbupdate/9.0_Killington/9-0-8-91_system.zip
--13:13:39--  http://is1.isystemsllc.com/sbupdate/9.0_Killington/9-0-8-91_system.zip
           => `9-0-8-91_system.zip'
Resolving is1.isystemsllc.com... 184.108.40.206
Connecting to is1.isystemsllc.com|220.127.116.11|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 17,825,481 (17M) [application/zip]

100%[======================================>] 17,825,481  160.14K/s    ETA 00:00

13:15:29 (158.39 KB/s) - `9-0-8-91_system.zip' saved [17825481/17825481]
Though running the wget command with these parameters works, it's tedious to type all this each time a file needs to be downloaded (especially for consultants who have to keep track of many customer login accounts). We can automate a bit of this by creating a trivial wrapper script that includes these credentials.
We normally call it evo-wget, and put it in the /usr/local/bin/ directory so it's available to all users on the system. The script consists of a single line: the wget command with the username and password parameters, plus the "$@" token, which forwards any additional parameters given to evo-wget (such as the URL).
# cd /usr/local/bin
# vi evo-wget

    exec wget --http-user="stevepay" --http-passwd="abc123" "$@"

# chmod a+x evo-wget
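The quoting of "$@" matters: with the double quotes, each argument is forwarded to wget intact, even if it contains spaces, whereas a bare $@ would split such arguments apart. A small self-contained sketch illustrates the pattern (show_args is a hypothetical stand-in for wget that just prints each argument it receives on its own line):

```shell
#!/bin/sh
# show_args stands in for wget, printing one argument per line in <brackets>.
show_args() {
    for arg in "$@"; do
        printf '<%s>\n' "$arg"
    done
}

# Same pattern as evo-wget: fixed options first, then "$@" to forward
# whatever the caller supplied, with each argument kept whole.
wrapper() {
    show_args --http-user="stevepay" "$@"
}

wrapper http://example.com/file.zip "a file name with spaces"
```

Running this prints the fixed option, the URL, and the space-containing argument as three separate items; had the wrapper used an unquoted $@, the last argument would have been split into five.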
Now, this script will fetch any URL from the iSystems support site, automatically passing the username and password, making it much easier to use casually:
# evo-wget http://is1.isystemsllc.com/sbupdate/9.0_Killington/9-0-8-91_system.zip
The file will be fetched into the current directory.
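As an alternative to embedding credentials in a wrapper script, wget can also read them from a per-user startup file, ~/.wgetrc, via its http_user and http_passwd settings. A sketch, using the same example credentials as above:

```
# ~/.wgetrc — read automatically by wget at startup
http_user = stevepay
http_passwd = abc123
```

Since this file holds a cleartext password, it should be readable only by its owner (chmod 600 ~/.wgetrc); the same caution applies to the evo-wget script itself.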
First published: 2007/07/11