m.(A.TT)/er
Learning By Way Of Ignorance
Sunday 07/03/11 02:20:52 PM

Lately I've stumbled upon a variety of tools that are included in most Linux distributions and that I had never needed, was never made aware of, or just never found.

One example would be comm :

$ whatis comm
comm                 (1)  - compare two sorted files line by line
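
For the unfamiliar (me), comm prints three columns: lines only in the first file, lines only in the second, and lines common to both. A quick, made-up example (columns two and three are tab-indented):

$ printf 'apple\nbanana\ncherry\n' > a.txt
$ printf 'banana\ncherry\ndate\n'  > b.txt

$ comm a.txt b.txt
apple
		banana
		cherry
	date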

Another [embarrassing] example is scp's ability to copy files/directories from one remote machine to another remote machine - in one step - without the files/directories ever being copied locally:

$ ssh host2 'ls -la /tmp/foolish'
total 8
drwxr-xr-x 2 mm   mm   4096 Jul  3 12:22 .
drwxrwxrwt 9 root root 4096 Jul  3 12:23 ..

$ scp host1:/etc/hosts host2:/tmp/foolish/

$ ssh host2 'ls -la /tmp/foolish'
total 12
drwxr-xr-x 2 mm   mm   4096 Jul  3 12:24 .
drwxrwxrwt 9 root root 4096 Jul  3 12:24 ..
-rw-r--r-- 1 mm   mm    221 Jul  3 12:24 hosts

Lacking a personal life and always seeking to incorporate new tools, I've developed a serious jones - a habit if you will - of listing packages and looking up binaries and scripts that I don't recognize (I have a crush on every package with 'utils' in the name). I would also stumble on GNU gold while hanging out in /usr/bin or /usr/sbin. This would always result in me wanting to find what packages my new friends were members of and immediately sift through the contents.

Considering most of the hosts that I work...err...hang out with are based on Red Hat Enterprise Linux (or RHEL in ComputerNerdish), I would search through RPM package listings. In the past I have worked extensively with RPMs - I've managed hosts and their contents with rpm - I've managed Apt-RPM and Yum package repositories - I've built and currently build RPMs. In fact, I just recently finished an automated package build and distribution solution that works with Subversion and will be releasing it to the wild very soon.

In spite of all this *braggadocio* I was under the impression that the only way to query which RPM a file or directory belonged to was by running rpm -qf /path/to/file and the only way to list the contents of that package was to run rpm -ql packagename. That is, until I incorporated some slick functions into my .bashrc that replaced the two-step operation of looking up and listing packages and found myself listing the contents of the package rpm:

$ wrl rpm

Listing Contents of rpm-4.4.2.3-18.el5:

/bin/rpm
/etc/cron.daily/rpm
/etc/logrotate.d/rpm
/etc/rpm
/usr/bin/gendiff
/usr/bin/rpm2cpio
/usr/bin/rpmdb
/usr/bin/rpmquery
...

Doh! I didn't even have to look at rpmquery's usage and I knew all the work I had just finished was sorta Elisha Gray'd (except I didn't come to a timely innovation). Nevertheless it was mildly fun and educational to reinvent the wheel (some of rpmquery's functionality), so I thought I would share what I came up with.

Shell Functions As A Tool Test Drive

I spend a lot of time writing tools to automate or simplify stuff I make my minions (computers) do. I've found that I can non-intrusively test the functionality of a potential tool by incorporating it into my environment via functions in my .bashrc.

Let's look at the first two functions, which together accomplish looking up the package name for any file that is either in my PATH or specified with a full path.

From $HOME/.bashrc:

function Validate {
    # Directories pass through as-is.
    [ -d "$1" ] && Results="$1" && return 0
    # Figure out what kind of command this is (file, alias, function, builtin...).
    local ItsType=$(type -t $1)
    # Real, on-disk executables: stash the full path in Results and succeed.
    [[ "${ItsType}" == +(file) ]] && Results=$(type -P $1)  && return 0
    # Anything shell-internal (or unknown) is rejected.
    [[ "${ItsType}" == +(alias|keyword|function|builtin) ]] && return 1
}

function Which_RPM {
    Validate $1  && Target=${Results} || return 1
    Validate rpm && Rpm=${Results}    || return 1
    ${Rpm} -qf ${Target}
}

Validate

The first function is used to verify the existence, path and type of any non-builtin executables. This is used primarily for portability and secondarily for security. It serves as a functional resource (either directly or indirectly) for the rest of the functions covered here.
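
For illustration, here is roughly how it behaves from the command line (the /bin/rpm path is what this RHEL5 box reports; shell internals like cd are rejected):

$ Validate rpm && echo ${Results}
/bin/rpm

$ Validate cd || echo "not a real executable"
not a real executable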

Which_RPM

Unlike the first, the second function accomplishes one of the tasks that I set out to achieve; namely, to return the package name for any base (i.e. not the full path) file name that happens to be in my $PATH. It will also return the package name for any file or directory that is specified by passing the full path on the command line. The first two operations utilize Validate to check the location of the argument passed on the command line and to check the location of the rpm executable. The last line combines the two to get the desired result.

One thing I want to point out is my naming convention for variables and functions.

Variables : Clear, purpose named, CamelCase styled, curly brace bound

Validated Commands : Variable name matches the command it stores, with the first letter capitalized, curly brace bound

Functions : Clear, purpose named, CamelCase like, with underscored word separation

All-caps variables reduce readability, so I avoid them. The use of underscores is also something I avoided in the past - they look terrible with lowercase and they waste keystrokes (this is solved later on). However, I have found that capitalized, underscore-separated names look pretty good. Finally, the underscores aid in distinguishing the quasi-namespace hierarchy, which nicely separates functions from other stuff (like variables born of my crazy variable naming convention).

$ Which_RPM monit
monit-5.1.1-1.el5.rf

$ Which_RPM /etc/postfix        
postfix-2.3.3-2.1.el5_2

The next functional goal is to be able to list the contents of the packages that Which_RPM returns.

Which_RPM_List

function Which_RPM_List {
    Validate rpm && Rpm=${Results} || return 1
    for SomeFiles in $(echo $@)
    do
        WhichRPMResult=$(Which_RPM ${SomeFiles})
        echo -e "\nListing Contents of ${WhichRPMResult}:\n"
        ${Rpm} -ql ${WhichRPMResult}
    done
}

The first thing to note is that although Which_RPM_List calls Which_RPM we still have to Validate the rpm executable. Bash has no mechanism or concept of inheritance AND variable scoping is fairly restrictive when it comes to functions. Normally the right solution to this problem is to declare a global variable for anything that is used or shared by more than one function. But I wanted to see if I could accomplish everything in referenceable, modular units. Also notice that the name of the function follows the standard of establishing a namespace. Let's see this function in action:

$ Which_RPM_List monit

Listing Contents of monit-5.1.1-1.el5.rf:

/etc/monit.conf
/etc/monit.d
/etc/rc.d/init.d/monit
/usr/bin/monit
/usr/share/doc/monit-5.1.1
/usr/share/doc/monit-5.1.1/CHANGES.txt
/usr/share/doc/monit-5.1.1/COPYING
/usr/share/doc/monit-5.1.1/LICENSE
/usr/share/doc/monit-5.1.1/README
/usr/share/doc/monit-5.1.1/README.DEVELOPER
/usr/share/doc/monit-5.1.1/README.SSL
/usr/share/man/man1/monit.1.gz
/var/lib/monit

At this point, I was fairly happy, but I immediately realized that Which_RPM will not deal with more than one argument passed to it. Sure, I could go back and change it so that it can deal with multiple files or dirs, returning multiple package names, but I won't. Out of laziness and curiosity I decided to create another function that will use Which_RPM to do its thing on an entire iterated list. Here's what we got:

Which_RPM_Plural

function Which_RPM_Plural {
    Validate basename && Basename=${Results} || return 1
    for SomeFiles in $(echo $@)
    do
        WhichRPMResult=$(Which_RPM ${SomeFiles}) || continue
        JustBase=$(${Basename} ${SomeFiles})
        echo "${JustBase} :: ${WhichRPMResult}"
    done
}

Seems pretty straightforward, even if it is not as efficient as it could be. One of the things I wanted to make sure it could deal with was passing everything in a directory, as well as Bash brace-expandable paths:

$ Which_RPM_Plural /etc/profile.d/*
bash_completion.sh :: bash-completion-20060301-1.el5.rf
colorls.csh :: coreutils-5.97-23.el5_4.2
colorls.sh :: coreutils-5.97-23.el5_4.2
glib2.csh :: glib2-2.12.3-4.el5_3.1
glib2.sh :: glib2-2.12.3-4.el5_3.1
lang.csh :: initscripts-8.45.30-2.el5.centos
lang.sh :: initscripts-8.45.30-2.el5.centos
less.csh :: less-436-2.el5
less.sh :: less-436-2.el5
vim.csh :: vim-enhanced-7.0.109-6.el5
vim.sh :: vim-enhanced-7.0.109-6.el5
which-2.sh :: which-2.16-7

$ Which_RPM_Plural /sbin/{fsck,service,chkconfig}
fsck :: e2fsprogs-1.39-23.el5
service :: initscripts-8.45.30-2.el5.centos
chkconfig :: chkconfig-1.3.30.2-2.el5

Yeah, this is nice, except it's butt ugly. Never fear, another function is here:

function Which_RPM_Plural_Formatted {
    Validate column && Column=${Results} || return 1
    Which_RPM_Plural $@ | ${Column} -t
}

And tonight's winning lottery numbers are:

Which_RPM_Plural_Formatted

$ Which_RPM_Plural_Formatted /etc/profile.d/*
bash_completion.sh  ::  bash-completion-20060301-1.el5.rf
colorls.csh         ::  coreutils-5.97-23.el5_4.2
colorls.sh          ::  coreutils-5.97-23.el5_4.2
glib2.csh           ::  glib2-2.12.3-4.el5_3.1
glib2.sh            ::  glib2-2.12.3-4.el5_3.1
lang.csh            ::  initscripts-8.45.30-2.el5.centos
lang.sh             ::  initscripts-8.45.30-2.el5.centos
less.csh            ::  less-436-2.el5
less.sh             ::  less-436-2.el5
vim.csh             ::  vim-enhanced-7.0.109-6.el5
vim.sh              ::  vim-enhanced-7.0.109-6.el5
which-2.sh          ::  which-2.16-7

$ Which_RPM_Plural_Formatted /sbin/{fsck,service,chkconfig}
fsck       ::  e2fsprogs-1.39-23.el5
service    ::  initscripts-8.45.30-2.el5.centos
chkconfig  ::  chkconfig-1.3.30.2-2.el5

At this point I should be happy, but I am far from it. Why? Well, while I like the long, quasi namespaced, formatted function names, they make for terrible command line usage (terrible to type). We could set some aliases to shorter names OR we can bust out some more functions. YEAH! (obviously I am intentionally getting carried away):

function wr {
    Which_RPM "$@"
}

function wrl {
    Which_RPM_List "$@"
}

function wrpf {
    Which_RPM_Plural_Formatted "$@"
}

This is where the quasi-namespaced function names really pay off. I simply name each shortcut function using the lowercased first character of each word in the function name it substitutes for. Now to show off them shorty function names from the command line. But first, tab completion:

wrpf

$ wr
wr        write     wrjpgcom  wrl       wrpf      
$ wr

1 $ wrpf /etc/profile.d/*
bash_completion.sh  ::  bash-completion-20060301-1.el5.rf
colorls.csh         ::  coreutils-5.97-23.el5_4.2
colorls.sh          ::  coreutils-5.97-23.el5_4.2
glib2.csh           ::  glib2-2.12.3-4.el5_3.1
glib2.sh            ::  glib2-2.12.3-4.el5_3.1
lang.csh            ::  initscripts-8.45.30-2.el5.centos
lang.sh             ::  initscripts-8.45.30-2.el5.centos
less.csh            ::  less-436-2.el5
less.sh             ::  less-436-2.el5
vim.csh             ::  vim-enhanced-7.0.109-6.el5
vim.sh              ::  vim-enhanced-7.0.109-6.el5
which-2.sh          ::  which-2.16-7

Alas, thorough satisfaction is achieved except for the part where I found rpmquery already provides more or less everything I came up with:

rpmquery -f

$ rpmquery -f /sbin/{fsck,service,chkconfig}
e2fsprogs-1.39-23.el5
initscripts-8.45.30-2.el5.centos
chkconfig-1.3.30.2-2.el5

Bourne/Bash Have Hash

Thursday 08/04/11

For a while I have been using function Validate in order to ensure the existence of all non-builtin commands needed by a script. This also aided in making sure my scripts were slightly more secure by validating that the command I expect to be a binary is actually a binary and not an alias to a malicious script, or even a malicious function established in the current shell. The benefits of function Validate don't exactly come without tradeoff. While one of my goals will always be to maximize the usage of language builtins, some of my scripts make use of a fair amount of non-builtin commands. For instance, Ambit uses the following non-builtin commands:

mkdir uniq sort sed cat touch host rm grep 
column head comm tail mv tr egrep logger

None of this stuff is 'exotic' or gleaned from packages that are not usually part of a typical base *NIX install, but it is still a high number of commands to validate. Luckily, the validation process happens only once and has proven to execute very fast. But...

As someone who has overlooked important information when learning a subject, I have developed a habit of going back and re-reading material (over and over sometimes). Today I was looking through Bash's Parameter Expansion documentation when I stumbled on Bash's hash builtin:

$ help hash
hash: hash [-lr] [-p pathname] [-dt] [name ...]
    For each NAME, the full pathname of the command is determined and
    remembered.  If the -p option is supplied, PATHNAME is used as the
    full pathname of NAME, and no path search is performed.  The -r
    option causes the shell to forget all remembered locations.  The -d
    option causes the shell to forget the remembered location of each NAME.
    If the -t option is supplied the full pathname to which each NAME
    corresponds is printed.  If multiple NAME arguments are supplied with
    -t, the NAME is printed before the hashed full pathname.  The -l option
    causes output to be displayed in a format that may be reused as input.
    If no arguments are given, information about remembered commands is 
    displayed.

Doh!
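
For the curious, here's a rough sketch of what that buys (not Ambit's actual validation code): hash looks up and remembers every name in one shot, failing if any is missing, and hash -t hands back the remembered path (paths will vary by system):

$ hash mkdir uniq sort sed grep column comm logger || echo "missing a required command"

$ hash -t sed
/bin/sed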

Ascendancy Of Greatness...Accurately Predicted
Saturday 03/19/11 at 10:54:29 PM

For those that are just learning who Jon Jones is after his dominating win tonight over Mauricio Rua (to become the UFC Light Heavyweight Champion), know that yours truly and an old friend predicted the coming of the aforementioned Jones. Here's a Facebook back-and-forth between myself and Jeremiah Mason 8 months ago:

August 1, 2010 at 10:27pm

Mike Marschall -> Jeremiah Mason

This is so frustrating:

"UFC prospect shines Jon Jones (R) was too quick for Vladimir Matyushenko. Cagewriter: Test passed"

The guy is already one of the best in the division and unless he has a complete meltdown he will be the best light heavyweight of all time. I thought everyone knew that until I started seeing the "prospect" label.

August 2, 2010 at 2:45pm

Jeremiah Mason -> Mike Marschall

They need him on main cards..i know there trying to build name reconition, but he's got it all...i think he's more polished than Ryan Bader, yet he's on the main card against little Nog...maybe there just keeping him back to see if anyone comes up with an answer forhim...Matuchinco/Hamil/Bonner(allegedly) world class wrestlers destroyed...give him a wc striker and i guarentee Bones is to fast for them..his only thing that needs to be tested is his chin and cardio, how he reacts if game plan A isn't working..I tell everyone who will listen, jump on this train while u can, cause once it gets going, there will be no room

August 2, 2010 at 2:45pm

Mike Marschall

Yeah there is no doubt they are trying to milk it to maximize things for everyone. He would smash Bader. Yeah the bandwagon is going to get full really fast. Baring some freak accident, I can almost promise he will be the richest fighter in MMA before he is done and he will definitely be considered the greatest of all time. He is talented, mature and marketable.

[Unbelievably] Simple Enterprise Job Scheduling/Management
Saturday 03/19/11 at 03:08:05 AM

Jobber is a simple solution for centrally managing scheduled jobs across many systems. Jobber uses UNIX/Linux cron in addition to isitme (part of Jobber) to determine which user (EUID - effective user id) a job will run as, what time a job will run at and what hosts a job will run on. Jobber makes use of /etc/crontab as the central configuration resource for establishing all jobs. This file can then be distributed to all hosts via a variety of methods (Puppet/Chef, SCP, Rsync, etc.).

Download

Latest Jobber Source

How Jobber Works

Once jobber and isitme are copied to all hosts that need centralized job control, something similar to the following can be added to /etc/crontab:

    Time      User           Service/
                             Host      Job

*/5 * * * *  Puppet  jobber  puppet   pmaster
*/5 * * * *  Puppet  jobber  puppet   pupdate
*/2 * * * *  Apache  jobber  apache   ApachePush
*/5 * * * *  Yum     jobber  yum      YumPush

The /etc/crontab file allows for job entries to be established with an additional user field. This is ideal for those that like to establish 'job' or 'role' accounts for jobs that make use of automated authentication and process execution. Other than the user field, job entries in /etc/crontab are standard cron entries that establish an execution schedule and a job to run. In the case of a Jobber-controlled job, cron runs jobber with two command line options:

  1. The hostname or service name representing the host[s] that will execute the job. Here we define 'service name' as a group of hosts that fulfill the same need (often cooperatively) and are likely identically configured.
  2. The job {binary,script} to be executed.

Jobber then acts as a wrapper with 3 primary functions:

  1. Call isitme, passing the hostname or service name to see if the current host running jobber matches the hostname or one of the hosts that make up the service.
  2. Execute the job {binary,script} that has been passed to jobber.
  3. Log (to syslog) the job name and the status of the job.

A job's status is either "Executed", "Failed" or "Skipped". If the current host matches the hostname or one of the hosts in the service passed to Jobber then the job will execute. If it does not then the job is skipped. This means every host that receives the centrally managed /etc/crontab file will attempt every scheduled job, but only the matching hosts actually run it.
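
To make the flow concrete, here is a minimal sketch of what a jobber-style wrapper looks like. This is illustrative only - not the actual Jobber source - and it assumes isitme exits 0 when the current host matches the hostname or service it is given:

#!/bin/bash
# jobber <host-or-service> <job>

Target=$1
Job=$2
JobName=$(basename ${Job})

# Hosts that are not part of the target host/service skip quietly.
if ! isitme ${Target}
then
    logger -t jobber "JOB=${JobName} STATUS=Skipped"
    exit 0
fi

# Run the job and log the outcome to syslog.
if ${Job}
then
    logger -t jobber "JOB=${JobName} STATUS=Executed"
else
    logger -t jobber "JOB=${JobName} STATUS=Failed"
fi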

Jobber logs all job execution to syslog (regardless of a job's final status). This + Massh results in a simple means to check job execution status no matter how many hosts are involved. The following shows that the job 'webpush' executed on 1.a.tt and not on the other two hosts:


1 $ massh -r att -c 'tail -5 /var/log/cron | grep JOB=webpush' -o | grep -v Failed
1.a.tt : 2011-03-19T07:45:02.378388-04:00 1 jobber[32491]: JOB=webpush STATUS=Executed
2.a.tt : 2011-03-19T04:45:01.512791-07:00 2 jobber[16464]: JOB=webpush STATUS=Skipped
3.a.tt : Mar 19 04:45:01 3 jobber[10349]: JOB=webpush STATUS=Skipped

Using Massh's -S (Success) option it is possible to verify the host by hostname only (Massh will only return the hostname[s] where the command ran successfully):


1 $ massh -r att -c 'tail -5 /var/log/cron | grep JOB=webpush.*E' -S 
1.a.tt

My (Not So) Humble Opinion
Monday 12/20/10 at 02:05:18 PM

Decided to post an email I wrote to a friend and former colleague where I explain my obvious disdain for Cloud Computing.

He wrote:

"...I know you are a cloud hater, but there is a shitload of opportunity to work on that stuff."

To which I eagerly replied:

I have misled almost everyone regarding my feelings about Cloud Computing. I do not have a problem with Cloud Computing as a concept. My problem is with the notion of Cloud Computing as a product or technology when it's just bullshit Marketecture for selling one of the following:

1) The excess computing capacity of companies that have created massive, global, horizontal environments (Amazon, Google, Yahoo, Facebook, Microsoft). These environments were designed to be simple to configure, manage and grow. The nature of such environments (huge scale) demands high levels of automation.

2) All the stuff in #1, but purpose built rather than a repackaging of excess capacity and existing tools. Hosting companies like RackSpace and Media Temple have created Clouds (ugh!) for the purpose of selling generalized computing capacity to the masses.

Cloud Computing is nothing more than what the aforementioned companies have constructed to address Internet capacity/demand for ~25 years. Cloud Computing (the term) was created so that sales people could turn around and sell excess capacity along with automation and management tools created by various operations teams at many of the world's large Internet companies. The Internet and Computing Industry press have made the term ubiquitous via constant, trumped-up buzz (my opinion of course). The idea of Computing as a Commodity (cpu cycles, storage) is not new, unless existing for 30 to 40 years is new. Combine this with the fact that there is no new technology represented by Cloud Computing and it is clear that Cloud Computing amounts to:

- A little technology recycling
- A lot of hot air and conceptual fluff
- An unmitigated marketing success

Those Whom I Admire and Learn From
Monday 09/27/10 at 08:36:43 PM

The following people (past and present) have profoundly influenced my thinking and helped shape my life:

The work, writing and speeches of the following people are currently consuming an unhealthy amount of my personal time:

The following people are members of the political establishment (politicians, campaign managers, political consultants, activists, media pundits, opinion journalists, ideology think tanks) that have gained my respect by either occasionally or consistently demonstrating the ability to tell the truth when it is ideologically and politically inconvenient. Deriving merit from such a list is absolutely sickening when considering it reflects the degenerate level of modern political discourse. Honesty in political discourse often means suppressing the urge to defend long-standing, deep, ideological precepts that often come with a great deal of personal investment. Those who demonstrate even fleeting honesty over ideological loyalty deserve objective consideration and cannot be summarily dismissed or indicted by an irrational faith in party, platform or movement.

Taking It Upon Myself To Make Sure People Are Awake
Friday 11/05/10 at 09:26:46 PM
girl: Hey you
mm: hi there 
girl: What part of FL are you visiting?
mm: Lauderdale 
mm: Miami
girl: I'm in Orlando.. way north
mm: yeah 
mm: way way north
girl: how long are you down for?
mm: wed to mon
girl: quick trip
mm: yeah
mm: I am contracting right now 
mm: cannot take losing all that money 
girl: lol - traveling is good once in awhile 
girl: as long as you have some buffer for fun time....
mm: oh yeah 
mm: I do 
mm: I am also doing a talk
mm: http://www.flux.org/announcements.php?show=143
girl: sweet
mm: yeah
mm: this will be like the 4th or 5th one I have done for that group
girl: big group?
mm: I think it is around 700 members 
mm: usually get around 70 to 100 people for my talks 
girl: good that they keep coming back 
mm: typically the best attended talks from what I have been told 
mm: yeah 
girl: but of course - I'm sure you rock it 
mm: they get people from big companies from time to time 
mm: but nobody that is actually a member of the group
mm: and nobody that is actually working in an environment as big as
MY COMPANY's NAME OMITTED
girl: you mean as the speaker?
mm: yeah 
mm: I'm sure you have a lot more insight by working in that environment
mm: definitely 
mm: if there is one thing you really cannot simulate is real world, 
massive load
mm: like Peter North massive
girl: lol...
A Conversation With A *Legal* Immigrant
Wednesday 10/20/10 at 12:07:16 AM
friend: compiling 100 page proof for immigration
friend: they are asking for a lot of this
friend: shit
mm: sucks man
mm: our immigration system is so fucking broken
mm: we have no fucking idea how many people we need
mm: and who we need
friend: so true
friend: oh well
mm: I know that the state of CA is completely dependent on
migrants from MX and the smartest people from China, Japan,
India, Korea and Russia
mm: otherwise this place goes to fucking hell
mm: there is nothing good about this place outside of the
Universities, which created the tech industry, which is
completely propping CA up
mm: without Tech (including aerospace), which is to say,
without the smartest people from the rest of the world, CA
is a 3rd world nanny state with no money and no tax base
mm: here is my amnesty idea
mm: give amnesty to everyone that has even started the proper
process to immigrate
mm: let them all come in
mm: and kick out all the cheaters
mm: give incentive for doing it right
mm: fuck those that did it wrong
mm: it is insane that you have been dealing with this shit
since before I knew you and the whole time we have been friends
friend: was on the phone with AMEX extended warranty
friend: :)
friend: yeah man
friend: i've been sticking to the rule
friend: and i get fucked
mm: dude it is so fucked
friend: if I cheated the system might have been easier
mm: tomorrow morning I can drive one block and pick from 100
illegals to work  a construction site
mm: they stand outside the Home Depot that is as close as I
can throw a rock
mm: and they have no fear in LA since this is a Sanctuary
City which is blatantly violating federal law
mm: but there will be no case brought from the justice department
mm: it is pure fucked up politics
mm: good people who follow the rules are getting fucked
mm: while people who break the law are getting worked over,
abused, treated like slaves - yet they also get to stay
mm: they break the law, the employers break the law, the
employers work over the illegals and the gov't cares about a
state law that allows cops to "ask for papers" after they stop
you for something else and think you might be illegal
mm: imagine the idea of going to another country and thinking
you should not have to carry your papers if you are not a citizen
mm: INSANE!
mm: you were saying....
friend: hehe
friend: dude you type a lot
mm: yeah
friend: but
friend: you got it right
mm: I am trying to break records
friend: thanks for supporting me
mm: man I will do anything to support you
mm: if you ever need anything in writing from a dopey fuck like
me, just let me know
mm: yeah, yes
friend: m.a.tt/er/immigration
friend: haha
mm: hehe
friend: anyways man, i need to sleep now
mm: I could post this on there
friend: sure thing
mm: I might
friend: i will tweet it
mm: let me think about it
mm: I could get killed in this state
friend: hahaha
mm: this place is like communist russia
mm: I need to get the fuck out of here
friend: go ahead
friend: go to idaho
friend: anyways
mm: no way
friend: c ya tomorrow
mm: FL
mm: later
mm: love FL
mm: miss it
The Folly Of Over Emphasizing Race
Friday 10/08/10 at 01:57:01 AM

Regarding South Carolina's Nikki Haley - who has a chance to become "the second Indian American Republican governor in [the South], joining Louisiana's Bobby Jindal":

"She and her state have come a long way since, at about age 5, in her home town of Bamberg, she and her sister entered the Little Miss Bamberg pageant. It usually crowned a white and a black queen. The flummoxed judges disqualified both Randhawa girls."

- George Will

The rest of the article can be found here.

Fear, Anxiety, Adrenaline, Confidence, Aggression
Wednesday 10/06/10 at 11:18:49 PM

When I come out I have supreme confidence but I'm scared to death, I'm afraid of everything, I'm afraid of losing, I'm afraid of everything. The closer I get to the ring the more confidence I get, the closer the more confident, the closer the more confident. All during my training i've been afraid of this man. I thought this man might be capable of beating me. I've dreamed of him beating me. Once I get to the ring I'm a God. No one can beat me.

- Mike Tyson

This is what it is like for a man who knows when and whom he is fighting - whether it is an arranged fight with a date and time, or someone you know that you *have* to fight (to settle something that cannot be settled any other way). This is the struggle every man goes through. Every man wins and loses the fight long before he fights.

Come'n Getchu Sum!
Thursday 11/01/12 at 07:12:12 PM

The best part is that I did not have to beg anyone to post it. I was thinking I might have to 'put out' or something. My virginity is spared again. Anyway...

If you use RPMforge via {YUM,APT} or whatever, you can start Massh'ing immediately.

Name       : massh
Arch       : noarch
Version    : 1.0.3
Release    : 1.el5.rf
Size       : 13 k
Repo       : rpmforge
Summary    : Parallel shell and copy
URL        : http://m.a.tt/er/massh.html
License    : GPL
Description: Massh is a full featured mass ssh'er that 
           : can also push files and push/run
           : scripts on remote hosts.  Ambit is 
           : a string set expander or ranged interpolator
           : used to define sets of hosts that Massh and 
           : Pingz can use. Pingz is a masspinger/resolver 
           : that can use Ambit ranges or files for host sets.
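
With the repo enabled, grabbing it and taking it for a spin looks something like this (yum shown; 'att' is the same Ambit host range used in the Jobber post above, so substitute your own):

$ yum install massh

$ massh -r att -c 'uptime'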
