Tuesday, January 27, 2015

Basic Windows Security


Windows is still a hotbed for viruses and malware. Fortunately, there are some decent yet not well-known tools already built into Windows, and more that are freely available directly from Microsoft.

First, if you have a serious problem you are best off repairing the system from another OS, either a bootable USB (E.g. Linux/Knoppix) or another machine entirely.

I recommend following some online hardening guides; however, you may cause headaches if you shut down too many services, so take plenty of well-named restore points along the way and test the functionality of whatever programs you plan to use.

This is a useful guide: Hardening Windows 8.1

Note: ** Do Not ** download the software recommended in the guide, Software Restriction Policy 1.2. Although it is hosted on Sourceforge, multiple scanners have flagged it with a Trojan (Artemis). In general, be extremely careful about downloading any software from popular file-sharing sites, including but not limited to CNET, Sourceforge, etc. If you have the option to download directly from the author / developer / owner website, always choose that option. Furthermore, if MD5 or SHA1 hash sums are published, check that they match *before* you install the program. And scan everything!

A good line of defense will have multiple tiers: E.g. antivirus (E.g. McAfee) -> anti-spyware (E.g. Malwarebytes Anti-Malware, SpyHunter, Spybot Search & Destroy), and then even another level such as herdProtect Anti-Malware. If you have to use CNET to obtain software, check it with everything you have before using it, and preferably sandbox/jail it or run it from within a VM to see what it does before letting it near your real system.
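
Hash checking is easy from the sandboxed Cygwin mentioned below or from any Linux box; here is a minimal sketch (the file name and the published sum are placeholders):

$ md5sum downloaded-setup.exe
$ sha1sum downloaded-setup.exe
   ### Or let md5sum do the comparison by pasting in the sum published by the author:
$ echo "<published-md5-sum>  downloaded-setup.exe" | md5sum -c -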

After all the viruses, malware and adware have been removed, you can cautiously begin to connect your system back to the internet.

Download and install EMET directly from Microsoft (the latest version as of this update, April 4, 2015, is EMET 5.2) - preferably downloaded from another machine while the repaired system stays offline: Microsoft - Enhanced Mitigation Experience Toolkit 5.2

Microsoft EMET is free and is key to making your Windows box more secure; when turned up to the maximum protection level, EMET thwarts a huge variety of exploitation techniques and tends to be bypassed only by the latest-and-greatest threats out there.

After EMET is installed and its settings are configured to Always On, the first things to run are some of the free repair tools built into Windows.

First up is System File Checker; run the following from an elevated command prompt - read more at Microsoft Support - Use SFC to repair missing or corrupted system files
C:\> sfc /scannow

Next, run the Deployment Image Servicing and Management (DISM) tool to repair any Windows image corruption. The /online flag tells DISM to service the running OS, and /restorehealth pulls its repair source from Windows Update by default - you can read more about it at Microsoft TechNet - Repair a Windows Image:
C:\> dism /online /cleanup-image /restorehealth

There are further ways to stay secure. As the guide I linked to above recommends, there is a program called Sandboxie that is really remarkable, and it is free for one sandbox. It is so useful that it is one of the few programs I use that I decided to buy for the extended features.

Sandboxie jails applications, so you can install and run applications inside the sandbox without them being able to tamper with your real system files. When you install a program you can see exactly what it does: which registry entries it would have made and where it would put files. While it runs you can see every file it accesses, and if you don't like something, a single click wipes out everything it did without affecting your real system.

Sandboxie is also EMET aware and is actively being developed (as of Jan 27, 2015).

On my system I have sandboxed Firefox and Cygwin, to name a few useful ones. I recommend keeping Firefox sandboxed at all times, installing the NoScript add-on on its maximum protection settings (whitelisting and opening up whatever features you need along the way), and adding a decent anti-keylogger (as of Feb 2, 2015, QFX KeyScrambler works well and is free).

Furthermore, I recommend you image your system (E.g. using Clonezilla or Macrium Reflect) and consider using virtual machines (E.g. VirtualBox and VMware Player are both free) for anything really risky before you try it on your real system.

Lastly, if you really want to take it to the next level, you can familiarize yourself with the way the Military/DoD and government agencies secure their Windows computers, here: IASE
Windows 8 STIG - Version 1, Release 8 (Last Updated: Jan 23, 2015)
Windows Operating Systems Overview (Last Updated: Jan 23, 2015)

Monday, January 26, 2015

Alias lscmd - colorized grep for cmd data (w/ ps & lsof)

alias lscmd='_(){ CP="\\033[1;3" && Y="${CP}3m" && R="${CP}1m" && P="${CP}5m" && B="${CP}4m" && C="${CP}6m" && G="${CP}2m" && RST="${CP}0m" && if [[ $# -eq 0 ]] || [[ ${1} =~ ^- ]]; then echo -en "${Y}Usage: lscmd <cmd to grep for>${RST}\n"; else CHR1="${1:0:1}"; PSOUT=$(ps -ef | grep "[${CHR1}]${1:1}" 2>/dev/null); local -i PIDCHOICE=1; MATCHCOUNT=$(echo "${PSOUT}"|wc -l);if [[ "${MATCHCOUNT}" -gt 1 ]]; then PSOUT=$(echo "${PSOUT}" | cat -n); echo "${PSOUT}" | awk "{printf \"${Y}%s ${RST}\",\$1;\$1=\"\";printf \"${G}%s${RST}\n\",\$0}"; read -p "$(echo -en "${Y}Multiple matches found, enter number of process: ${RST}${G}")" -u 0 PIDCHOICE; echo -en "${RST}"; fi; if [[ -z "${PSOUT}" ]]; then echo -e "${Y}lscmd:: ${RST}${R}No Matches Found ${RST}${Y}(${RST}${G}grep${RST}${Y}\x27ing${RST} ${G}ps${RST}${Y}) for:${RST} ${B}${1}${RST}"; else TARGETPID=$(echo "${PSOUT}"|sed -n "${PIDCHOICE}p" | awk "{print \$2}"); CMDHEAD=$(/bin/ps --no-headers -p "${TARGETPID}" -o comm,pid -ww | awk "{printf \"${B}Command shortname: ${RST}${C}%s ${RST} ${B}Pid: ${RST}${C}%s${RST}\",\$1,\$2}");CMDLONG=$(/bin/ps --no-headers -p "${TARGETPID}" -o cmd -ww); CMDFILES=$(lsof -p "${TARGETPID}" 2>/dev/null | sed "1s#.*#\\${B}&\\${RST}#;1!s#^.*\$#\\${G}&\\${RST}#g"); OUTPUT1="${CMDHEAD}";OUTPUT2="${B}Full Command: ${RST}${Y}${CMDLONG}${RST}"; OUTPUT3="${CMDFILES}";echo -en "${OUTPUT1}\n${OUTPUT2}\n${OUTPUT3}${RST}\n"|more; fi; fi; }; _'

This command will grep ps for the term provided. If multiple matches are found, the user is presented with a numbered list of matching commands and a prompt to enter the number of the choice (not the pid) of the process they wish to see data on.

/tmp/test$ lscmd
Usage: lscmd <cmd to grep for>
/tmp/test$ lscmd yate
1 1000 9999 30794 6 Jan25 pts/1 01:02:07 yate
2 1000 28125 15877 1 01:11 pts/3 00:17:47 clients/yate-qt4 -c ./conf.d -m ./modules -e ./share
Multiple matches found, enter number of process: 1
Command shortname: yate Pid: 9999
Full Command: yate
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
yate 9999 cronkilla rtd DIR 8,1 4096 391664 /
yate 9999 cronkilla txt REG 8,1 6088 485695 /usr/bin/yate
yate 9999 cronkilla mem REG 8,1 39576 634867 /usr/lib/yate/jingle/jinglefeatures.yate
yate 9999 cronkilla mem REG 8,1 105952 634890 /usr/lib/yate/server/ysigchan.yate
yate 9999 cronkilla mem REG 8,1 31144 634881 /usr/lib/yate/server/mrcpspeech.yate
yate 9999 cronkilla mem REG 8,1 18696 634876 /usr/lib/yate/server/dbwave.yate
yate 9999 cronkilla mem REG 8,1 31088 634875 /usr/lib/yate/server/dbpbx.yate
yate 9999 cronkilla mem REG 8,1 18856 634869 /usr/lib/yate/server/accfile.yate
yate 9999 cronkilla mem REG 8,1 54960 634889 /usr/lib/yate/server/yradius.yate
yate 9999 cronkilla mem REG 8,1 22832 634882 /usr/lib/yate/server/park.ya
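
Under the hood the alias boils down to roughly this pipeline; a simplified, uncolored sketch using the yate example above (no multi-match menu, it just takes the first match):

$ TARGETPID=$(ps -ef | grep "[y]ate" | awk '{print $2}' | head -1)
$ ps --no-headers -p "${TARGETPID}" -o comm,pid,cmd -ww
$ lsof -p "${TARGETPID}"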

Sunday, January 25, 2015

Unpack Mac Packages in Bash

alias unpkg='_(){ if [[ ! $# -eq 1 ]] || [[ ! -f "${1}" ]]; then echo "Usage: unpkg <~file~.pkg>"; else 7z x "${1}"; for a in *.pkg; do if [[ -d "$a" ]]; then cd "$a"; cat Payload | gunzip -dc |cpio -i; cd -; fi; done; fi; }; _'

This alias is not recursive and will only process one level of pkgs; if further pkg files exist you will have to re-run the alias on them as you find them. It requires 7z (apt-get install p7zip-full).

It's also recommended to create a new directory and copy the pkg file into it before running, especially when you don't know what may be inside - it also makes it easier to see what was extracted.
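
E.g. a typical run might look like this (the package name is just a placeholder):

$ mkdir /tmp/unpack && cp SomeInstaller.pkg /tmp/unpack && cd /tmp/unpack
$ unpkg SomeInstaller.pkg
$ ls -R    ### inspect what was extracted; re-run unpkg on any nested pkg files you find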

Thursday, January 22, 2015

MHDDFS - Span FS's Transparently via FUSE module

$ sudo apt-get install mhddfs 
      ...
   ### now a quick simple test in /tmp, according to the documents this
   ### utility can span different fs's and even different fs types
$ mkdir d1 d2 d3 virtual 
$ ls
 . .. virtual d3 d2 d1
$ sudo mhddfs d1,d2,d3 virtual -o allow_other 
mhddfs: directory '/tmp/test/d1' added to list
mhddfs: directory '/tmp/test/d2' added to list
mhddfs: directory '/tmp/test/d3' added to list
mhddfs: mount to: /tmp/test/virtual
mhddfs: move size limit 4294967296 bytes
$ dd if=/dev/zero of=/tmp/test/virtual/my.file bs=1M count=100 
100+0 records in
100+0 records out
104857600 bytes (105 MB) copied, 0.670829 s, 156 MB/s
$ du -sh *
100M d1
0 d2
0 d3
100M virtual
$ df -h virtual 
Filesystem Size Used Avail Use% Mounted on
/tmp/test/d1;/tmp/test/d2;/tmp/test/d3 940M 102M 838M 11% /tmp/test/virtual
Cool..
(See https://romanrm.net/mhddfs for a more extensive overview.)
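
To tear the test mount down, fusermount handles the FUSE unmount; and if you want a pool mounted at boot, an /etc/fstab line using the fuse helper syntax should do it - a sketch with placeholder paths (see the overview linked above for details):

$ fusermount -u virtual
   ### /etc/fstab entry (placeholder paths):
mhddfs#/mnt/disk1,/mnt/disk2 /mnt/pool fuse defaults,allow_other 0 0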

Friday, January 16, 2015

Colorized DD Sequential Write Benchmark Alias

alias bm='_(){ CP="\\033[1;3"; Y="${CP}3m"; C="${CP}6m"; R="${CP}1m"; G="${CP}2m"; RST="${CP}0m"; if [ ! -w "." ]; then echo -en "${R}Current directory is not writable aborting benchmark.\n"; else SIZE=${1:-500}; FS=$(df . |sed "1d"|awk "{print \$1}"); LGFILE="./largefile43bm52342"; AVAILSPC=$(df -BM .|sed "1d"|awk "{print \$4}"|sed "s/M$//"); [[ ! -f "${LGFILE}" ]] && ((${SIZE}<$AVAILSPC)) && { DDRES=$(dd if=/dev/zero of="${LGFILE}" bs=1M count=${SIZE} 2>&1); DDRES=$(echo "${DDRES}" |tr "\n" " "| cut -d " " -f14-); rm "${LGFILE}"; echo -en "${Y}Sequential Write Benchmark for${RST} ${C}${FS}${RST}${Y}:${RST} ${G}(${SIZE}MB @) ${DDRES}${RST}\n"; } || { [[ -f "${LGFILE}" ]] && echo -en "${R}A file with the same name as the test file already exists: ${LGFILE}\nAborting benchmark, choose a new location on the same FS.\n(Where a file with the same name as the test file does not exist.)\n${RST}" || echo -en "${R}Not enough space on ${FS} for benchmark,\nAvail space: ${AVAILSPC}MB, Attempted ${SIZE}MB..\nTry again with a smaller size.\n${RST}"; }; fi; }; _' 

It optionally takes one argument, the size in MB of the test file; if no argument is provided it defaults to 500MB. It is fairly well behaved and will check that there is enough space on the FS where it is run. Its output looks like this:

Sequential Write Benchmark for /dev/sdb1: (500MB @) 178.3 MB/s
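
For reference, the core of the alias is just a sequential dd write of zeroes to the current filesystem; a bare-bones equivalent without the checks or colors looks like this (the last line of dd's output carries the MB/s figure):

$ dd if=/dev/zero of=./largefile43bm52342 bs=1M count=500 2>&1 | tail -1
$ rm ./largefile43bm52342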

Thursday, January 8, 2015

Bash Localized Variable Occlusion

function update() { 
local -i VAR=45
VAR+=-1
VAR+=$1
echo $VAR
} 
 
$ VAR=3
$ update VAR 
 
And the output is??  
.. 88  
Why? Because $1 is the string VAR, and inside the function the local VAR takes precedence over the caller's VAR. When the integer assignment dereferences $1, it resolves to the local VAR (now at 44), which is added to itself, giving 88.
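
If the goal was to add the caller's value of VAR (3) rather than the shadowed local, indirect expansion sidesteps the occlusion - a quick sketch (the function and local names here are just for illustration):

function update_byname() { 
local -i RESULT=45
RESULT+=-1
RESULT+=${!1}   ### ${!1} expands to the value of the variable named by $1
echo $RESULT
} 
 
$ VAR=3
$ update_byname VAR 
47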

Sunday, January 4, 2015

Sanitizing dangerous yet useful commands

Some commands are useful, yet used improperly they can be very dangerous.

One example is the rar command: the compression level is excellent, and the recovery record feature, which allows damaged blocks to be rebuilt within a given threshold tolerance, is very useful against bitrot and other anomalous occurrences.

However, some command options in rar are downright scary, like the -vd option: "-vd  Erase disk contents before creating volume. All files and directories on the target disk will be erased when '-vd' is used".

The -vd option is bad enough by itself, but the fact that the innocuous and useful listing command vt (list contents, verbose and technical) is just one keystroke away is what gives chills.

The last thing I want to have happen to my 4TB hard drive is to have it wiped out because of a typo while I am trying to backup more of my important data to it.

So what to do about it? The answer was easy: create a simple script that washes the bad options out using sed, then wrap it with shc and set the permissions to 111 (execute only).

This concept can be expanded and adapted for any command; this is only a simple example, rar.bsh:

/tmp/rar.bsh
#!/bin/bash 

########################################################################
### These rar options (except for lt) are diabolical in my opinion and 
### I have banned them from my system, modify this however you wish:        
###
### -vd  Erase disk contents before creating volume. All files and 
###       directories on the target disk will be erased when '-vd' is used.  
###       The switch applies only to removable media, the hard disk cannot 
###       be erased using this switch.  
###
### -df  Delete files after archiving. This switch in combination with the 
###        command "A" performs the same action as the command "M".
###
### v[t]  Verbosely list archive [technical] because "what#$?.." "was 
###       that add -vd???"
### l[t]  List content of archive [technical]. Files are listed as the 'v' 
###         command with the exception of the file path, i.e. the file name 
###         is displayed.   
###
###   d   Delete files from archive.
##########################################################################
ORIG_CMD=''
   ### Quote/requote parameter quotes for eval
for arg in "$@"; do ORIG_CMD="$ORIG_CMD \"${arg//\"/\\\"}\""; done 
  ## Get rid of -vd -df -d and rewrite vt to lt and tell the user about it 
SANITIZED_RAR_CMD="$(sed 's/-vd//g; s/ d //; s/ vd //; s/-vt/lt/g; 
                     s/vt/lt/g; s/-df//g;' <<< "${ORIG_CMD}")" 
if "$ORIG_CMD" == "$SANITIZED_RAR_CMD" ]; then 
    ## Choose your own not easily guessed file name 
 eval "/usr/bin/.hide/rar_old_234290842348_ ${SANITIZED_RAR_CMD}"
else 
 echo "You tried something bad it was rewritten: $0 $SANITIZED_RAR_CMD" 
 read -p "(Press enter to continue or Ctl-C to break)..." -u RESPONSE 
 eval "/usr/bin/.hide/rar_old_234290842348_ ${SANITIZED_RAR_CMD}"   
fi 
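
To see the rewrite step on its own (the archive name is just a placeholder), the sed expression turns a verbose/technical listing request into a plain listing:

$ sed 's/-vd//g; s/ d //; s/ vd //; s/-vt/lt/g; s/vt/lt/g; s/-df//g;' <<< 'vt archive.rar'
lt archive.rar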


Next obtain a copy of SHC: SHC by Francisco Javier Rosales García

Create a binary executable of the shell script; be sure to use the -T (Traceable) flag or it will create problems:

$ shc -T -f rar.bsh 
$ ls 
rar.bsh rar.bsh.x 

The script now has been made into an executable, and has the .x extension. 

Now as root, create a new directory to hide the old rar and new script executables, 
rename the old rar executable to a unique name for added security: 
# mkdir /usr/bin/.hide 
# mv -nv /usr/bin/rar /usr/bin/.hide/rar_old_234290842348_ 
# mv -nv rar.bsh.x  /usr/bin/.hide/ 
# chmod 111 /usr/bin/.hide 
# chmod 111 /usr/bin/.hide/rar_old_234290842348_ /usr/bin/.hide/rar.bsh.x
# ln -s /usr/bin/.hide/rar.bsh.x /usr/bin/rar

Done! 

Now all users should be able to still use rar, yet no one has to worry about the scary options 
being accidentally typed in and causing trouble! 

Saturday, January 3, 2015

Find Commands of Open Files - lsof | ps alias

alias lsofcmd=$'_(){ CP="\\033[1;3" && Y="${CP}3m" && C="${CP}6m" && G="${CP}2m" && RST="${CP}0m" && IFS=$(echo -en "\\n\\b") && for a in $(lsof -w ${1} | awk \'{printf "Short_Cmd: %s Pid: %s File: ",$1,$2;for(i=9; i<=NF; i++)printf "%s ",$i;printf "\\n";}\' | sed \'1d\'); do echo -en "${Y}[[[[${C}${a}\b${Y}]]]]=>${RST}${G}$(/bin/ps --no-headers -p $(awk \'{print $4}\' <<< "${a}") -o cmd -ww)${RST}\n"; done; }; _' 

E.g.  $ lsofcmd /dev/sdb1
                                                                         . . .
[[[[Short_Cmd: wget Pid: 29355 File: /var/host/media/removable/MyBook/software/Qt/submodules/qt.mirror.constant.com/archive/qt/4.5/qt-win-opensource-src-4.5.0.zip]]]] => wget -E -r -N -Kk --random-wait --content-disposition --no-check-certificate -p --restrict-file-names=windows,lowercase,ascii --header User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:19.2) Gecko/20100101 Firefox/19.0 http://download.qt.io/archive/qt/5.4/5.4.0/submodules/ 

This is a colorized version; it should work on practically any terminal since it only uses the 8 basic ANSI colors (range 30-37) and looks decent on either a white or black background. If you want to see some good tips on Bash colorization have a look here:  FLOZz'
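
For a quick look at that basic range on your own terminal, a one-liner like this prints each of the bold 30-37 foreground colors:

$ for c in {30..37}; do echo -e "\033[1;${c}m Color ${c} \033[0m"; done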

Friday, January 2, 2015

Recursive Bash script to clone any Sourceforge project

Source code
#!/usr/bin/env bash 

#############################################################################
###########################################################################
### Created by A. Danischewski (c) 20150102 v0.03
   ### 
    ## Usage: getsf.bsh <URL|"local.html"|"/full/path/to/local.html"> 
    ##   Files will be downloaded to the current directory or a newly created 
    ##   subdirectory if there are any folders.
    ##
    ## Some Sourceforge projects don't provide convenient links to download 
    ## their software - no tar balls, no zip files, no svn or git links - yet many 
    ## directories and files. Downloading each one is a lot of clicking and 
    ## waiting, and that time can be applied more productively. 
    ## 
    ## With this code you can recursively download entire Sourceforge projects. 
    ## It will dump all files to the directory from where it is run and 
    ## create subdirectories and load them for each folder link.  
    ## 
    ## Warning: this program is simple and not very tested - it is likely to break 
    ## at any point in time since it relies on the current Sourceforge website 
    ## layout, but it is simple and straightforward enough that I thought I should 
    ## share it for the concept of recursion. Your mileage will vary based on SF 
    ## website updates.  
    ##   
    ## If you fire this up one day and it's broken, just take the string 
    ## $(lynx -dump -listonly -nonumbers "${1}"  | grep files | sed 's#/download##' \
    ## | grep http | grep -v timeline | grep -v \? | grep -v 'files/$' | uniq)
    ## and make that work to provide the URLs in the file list.
    ## 
    ## Make sure that the http://sourceforge.net/projects URL hasn't changed and 
    ## if need be adjust according to wherever the new pages live.  
    ##
    ## Then modify this: curl -b cookie_file -o "${DIRNM}.html" -L "$a"
    ## and make that work and you should have it basically functioning again. 
    ##
    ## Note: If you provide Sourceforge a useragent browser string (-A) 
    ##       this will break - the Sourceforge webserver has useragent logic
    ##       and it doesn't mind (at the time of this posting) providing files 
    ##       to the Curl agent. This is a case of - disguise yourself as 
    ##       a web browser and the website becomes ~less~ friendly. 
    ##    
    ## This program is free software: you can redistribute it and/or modify
    ## it under the terms of the GNU General Public License as published by
    ## the Free Software Foundation, either version 3 of the License, or
    ## (at your option) any later version.
    ## 
    ## This program is distributed in the hope that it will be useful,
    ## but WITHOUT ANY WARRANTY; without even the implied warranty of
    ## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    ## GNU General Public License for more details.
    ## You should have received a copy of the GNU General Public License
    ## along with this program.  If not, see <http://www.gnu.org/licenses/>.
###########################################################################
#############################################################################

### You can uncomment the next line if it's broken to better see what's going on
#set -x 

declare -r STARTURL="${1}"
declare -r STARTCMD="${STARTCMD:-${0/#./${PWD}}}"   ### Expand a leading "." to the full path.
declare -r BASEREF='<base href="http://sourceforge.net/projects">'

  ### To keep the code simple and flexible by allowing Lynx to handle the initial 
  ### input we lose the ability to prefilter to eliminate parent directories - 
  ### curl -> lynx -stdin | sed could avoid this too but this is pretty easy.  
  ### We concatenate the directories we process to a variable and export it 
  ### to all the child processes to check against - that way we don't loop. 
TMPPARENT="$(echo "${STARTURL##*/projects/}" | sed 's#/#_#g;s/_$//')"   
if [[ -n "${PARENT}" ]] && ! grep -Fq "${TMPPARENT}" <<< "${PARENT}"; then 
 PARENT="${PARENT}${TMPPARENT}"
elif [[ -z "${PARENT}" ]]; then 
 PARENT="${TMPPARENT}" 
fi 
export PARENT STARTCMD PWD   ### Export PARENT, STARTCMD (in case run w/relative) and 
                             ### PWD to be safe or the shell may complain.

  ### As far as I know Lynx doesn't provide a base url option, so we need to add a base tag to 
  ### avoid relative path links expanding to file:///. 
  ### If we are starting with a file on the filesystem make sure that our base tag is present - 
  ### this is the case when you download an html file and then pull from it on the fs.
if [[ -f "${STARTURL}" ]] && ! grep -Fq "${BASEREF}" "${STARTURL}"; then  
       sed -i "7 i ${BASEREF}" "${STARTURL}"
fi   

function usage() {
cat <<EOF

 Description: Clone a SourceForge project

   This software wants the html of a Sourceforge project *files* url. 
   Not the project homepage, although if you do that by accident it will get it 
   and a bunch more stuff that you probably don't need or want.  
   
   Instead it is designed to start at the Files tab; it can be the first Files page, 
   in which case you will get the whole project, or you can choose a subfolder to 
   start from.  

   Files will be recursively downloaded to the current directory and 
   newly created subdirectories for any SF folders.

   This software requires Lynx and Curl (sudo apt-get install curl lynx). 
    
 Usage: ${0##*/} < "http://SF/Files/URL" | "local.html" | "/fullpath/to/local.html" > 

   E.g. ${0##*/} "http://sourceforge.net/projects/files/<..target project..>" 
   
EOF
exit 1
}

[[ ! -z "${1}" ]] && [[ "${1}" =~ ^- ]] || [[ -z "${1}" ]] && usage

  ### Arg 1 <string> URL|.html file to process.  
function process_dir() { 
 for a in $(lynx -dump -listonly -nonumbers "${1}" | grep files | \
   sed 's#/download##' | grep http | grep -v timeline | grep -v \? | \
   grep -v 'files/$' | uniq); do 
   FILENM="${a##*/}"
   FILENM="${FILENM%.*}"
   if [[ -z "${FILENM}" ]]; then 
       ### Process as a directory. 
       ### Remove "http[s]://sourceforge.net/projects/" and under_bar the /'s 
       ### for the dir name.  
     DIRNM="$(echo "${a##*/projects/}" | sed 's#/#_#g;s/_$//')"     

     if [[ -d "${DIRNM}" ]] || grep -Fq "${DIRNM}" <<< "${PARENT}"; then 
       ### If the directory exists presume this entry was already processed - probably 
       ### the starting parent or a parent link in SF to a link already traversed. If you 
       ### need to restart or refresh it, rename the directory and download it again, 
       ### or if it's really big you can add options to curl to not clobber. 
       echo "Pid:$$ Skipping ${DIRNM} ..."
       continue
     else 
       mkdir "${DIRNM}" 
       cd "${DIRNM}"
       echo "Pid:$$ Saving ${a} to ${PWD}/${DIRNM}.html ..."
       curl -b cookie_file -o "${DIRNM}.html" -L "$a"
       sed -i "7 i ${BASEREF}" "${DIRNM}.html" ## Inject base tag for Lynx
       PARENT="${PARENT}${DIRNM}" ### Add newly processed directory to PARENT string. 
       "${STARTCMD}" "${DIRNM}.html" ### Recursively process directories. 
       cd - 1>/dev/null
    fi 
  else 
    echo "Pid:$$ Saving ${a} to ${PWD}/${a##*/} ..."
    curl  -b cookie_file -o "${a##*/}" -L "$a"
  fi 
 done
}

process_dir "${STARTURL}" 
echo "Pid:$$ Done!!" 
exit 0


Source Code: getsf.bsh    MD5 0743699a1350c1c3d86d665bb1d3c75a
(Google Drive incorrectly identifies this as a bin file - it's not, it's a bash text file)
Also, if you copy and paste this text it may not match the MD5 because of the blog's html double spacing; you can pipe it through sed and the MD5 sum should match: sed "{:a;N;\$!b a};s/[^\n]\n[^\n]//g;s/\n\n/\n/g;s/\(.*\)\(\n.*\)$/\1\n\2/g"

You may want to try this alias out; it seems to work on any SF project: E.g. $ gsfn nasm  
alias gsfn='_(){ getsf.bsh "http://sourceforge.net/projects/${1}/files/?source=navbar";}; _'