Issue 1, January 2004
Modern History of Linux
Copyright (C) 2004 by Steve Litt. All rights reserved.
Materials from guest authors copyrighted by them and licensed for perpetual
use to Linux Productivity Magazine. All rights reserved to the copyright
holder, except for items specifically marked otherwise (certain free software
source code, GNU/GPL, etc.). All material herein provided "As-Is". User assumes
all risk and responsibility for any outcome.
Anyone who says you can have a lot
of widely dispersed people hack away on
a complicated piece of code and avoid total anarchy has never managed a
software project. -- Andy Tanenbaum
(2/5/1992 comp.os.minix post during the Minix/Linux flamefest)
By Steve Litt
Time flies when you're having fun. While contemplating this month's magazine
it hit me like a ton of bricks -- I've used GNU/Linux for 5 years. What changes those years have brought!
A quick perusal of the web showed a glut of GNU/Linux historical information
from 1991-1992, but less for the later years. So I thought it might be fun
to write of modern GNU/Linux history from the perspective of someone starting
in 1998. Welcome aboard...
Linux Productivity Magazine
By Steve Litt
Loyal readers, I need your help.
For months I've publicized Linux Productivity Magazine, expanding it from
a new magazine to a mainstay read by thousands. There's a limit to what I
can do alone, but if you take one minute to help, the possibilities are boundless.
If you like this magazine, please report it to one of the Linux magazines.
Tell them the URL, why you like it, and ask them to link to it.
I report it to them, but they don't take it very seriously when an author
blows his own horn. When a hundred readers report the magazine, they'll sit
up and take notice.
Reporting is simple enough. Just click on one of these links, and report
the magazine. It will take less than 5 minutes.
If you really like this magazine, please take 5 minutes to help bring it
to a wider audience. Submit it to one of the preceding sites.
GNU/Linux, open source and free software
By Steve Litt
Linux is a kernel. The operating system often described as "Linux" is that
kernel combined with software from many different sources. One of the most
prominent, and oldest of those sources, is the GNU project.
"GNU/Linux" is probably the most accurate moniker one can give to this
operating system. Please be aware that in all of Troubleshooters.Com,
when I say "Linux" I really mean "GNU/Linux". I completely believe that without
the GNU project, without the GNU Manifesto and the GNU/GPL license it spawned,
the operating system the press calls "Linux" never would have happened.
I'm part of the press and there are times when it's easier to say "Linux"
than explain to certain audiences that "GNU/Linux" is the same as what the
press calls "Linux". So I abbreviate. Additionally, I abbreviate in the same
way one might abbreviate the name of a multi-partner law firm. But make no
mistake about it. In any article in Troubleshooting Professional Magazine,
in the whole of Troubleshooters.Com, and even in the technical books I write,
when I say "Linux", I mean "GNU/Linux".
There are those who think the FSF is making too big a deal of this. Nothing
could be further from the truth. The GNU Manifesto and the GNU General Public
License it spawned are the only reason we can enjoy this wonderful alternative
to proprietary operating systems, and the only reason proprietary operating
systems aren't even more flaky than they are now.
For practical purposes, the license requirements of "free software" and "open
source" are almost identical. Generally speaking, a license that complies
with one complies with the other. The difference between these two is a difference
in philosophy. The "free software" crowd believes the most important aspect
is freedom. The "open source" crowd believes the most important aspect is
the practical marketplace advantage that freedom produces.
I think they're both right. I wouldn't use the software without the freedom
guaranteeing me the right to improve the software, and the guarantee that
my improvements will not later be withheld from me. Freedom is essential.
And so are the practical benefits. Because tens of thousands of programmers
feel the way I do, a huge amount of free software/open source is available,
and its quality exceeds that of most proprietary software.
In summary, I use the terms "Linux" and "GNU/Linux" interchangeably, with
the former being an abbreviation for the latter. I usually use the terms "free
software" and "open source" interchangeably, as from a licensing perspective
they're very similar. Occasionally I'll prefer one or the other depending
on whether I'm writing about freedom or business advantage.
1985: Unix Envy
By Steve Litt
Skating in and out of Santa Monica traffic on my way to work, I reflected
on my career. A young hotshot C programmer with a year's experience, I was
getting harder to describe as a "junior" programmer. Trouble was, there
was little career path for a C programmer working with the RT/11 operating
system on PDP/11 computers. If only I were in the Unix world.
Unix C programmer positions plastered the employment ads. Headhunters called
asking about Unix and C. Everyone wanted Unix, not RT/11. With Unix experience
you could name your figure. But you needed Unix experience to get a Unix
job. The Unix computers of the day cost $7000.00 and up, so buying one wasn't
an option. Oh how nice it would have been to have a Unix machine.
On the other side of the country, a young hotshot geek named Richard Stallman
was busy with the GNU project -- a project to create a clean-room Unix
lookalike capable of running on cheap machines. The thought of his work would
have excited me to no end, but I hadn't heard of it. Very few technologists had.
The GNU project had started with a paper called the GNU Manifesto, which
advocated free software, specifically a free UNIX workalike, which Stallman
then began creating with the help of others. In the Manifesto, Stallman prophetically
described the process of getting this to happen, including ideas on licensing
(one must pass on source and all rights to the receiver, etc.).
Stallman countered objections to the new paradigm, including the obvious
"programmers deserve the fruits of their labor". He stated that although
programmers would make less in a free-software economy, they'd still be well
paid. He described an idealized world of no-charge software creating harmony
among programmers, and freeing business from being held hostage by software
vendors. Yes, it must have sounded Pollyannaishly idealistic in those stock
market soaring Reagan/Thatcher days.
1985 is the copyright date of version 1 of the GNU General Public License
(GPL), which Stallman wrote to guarantee the freedom to use, modify, redistribute,
and redistribute modifications to software, forever and ever, amen. The creation of
the GNU GPL was probably the most important event in modern software history,
but in 1985 nobody knew of it. Time moved on...
1991: Secret Origins
By Steve Litt
Professor Andrew S. Tanenbaum was a hero to the working-class Geek kid. Professor
Tanenbaum maintained a proprietary source distribution of a tiny Unix workalike,
whose primary purpose was the teaching of operating systems. It was called
MINIX, and it could run on hand-me-down mid-1980's computers, so the average
university Geek could afford the hardware to run it. The software -- $180.00
for the license and the source. Better yet, university professors who licensed
MINIX could give copies to all their students.
The MINIX Usenet newsgroup sported 43,000 inhabitants, many of whom submitted
improvements to MINIX. Many hoped MINIX would transcend its academic beginnings
to become a really useful OS. Some were frustrated that Professor Tanenbaum
rejected many improvements in order to keep MINIX simple enough for a student
to master in a single semester.
One such inhabitant was a kid from Finland. This kid, whose name was Linus
Torvalds, wrote a Unix-like kernel, based as closely as he could on the POSIX
standard, and then semi-announced its existence on the MINIX newsgroup in
an 8/25/1991 post titled "What would you like to see most in minix?". By October
5 he was trolling the list for users and beta testers. On 1/5/1992 he brought
out version 0.12, a stable and usable version that could run several GNU
utilities, and on January 29, 1992, the infamous Tanenbaum/Torvalds flamewar
erupted, witnessed by 43,000 MINIX newsgroup inhabitants.
Those 43,000 represented the tiniest sliver of technologists. The vast majority
of technologists knew nothing of the events on the MINIX newsgroup. Instead,
they were busy incorporating MS-DOS based commodity computers into businesses
of all sizes, and wondering what to do about Microsoft's new Windows 3.1
"operating system". Unix, and all expensive big iron operating systems, for
that matter, were on the decline.
I was part of that vast majority. I slammed out office automation code, with
front-ends written in Clarion 2.1 and back ends written in Borland Turbo
C++. Life was grand. No more Unix envy. But the very same Microsoft that
had given me the opportunity to write professional code on my kitchen table
commodity computer was itself entertaining visions of big iron...
1995: Wishful Thinking
By Steve Litt
A co-worker mentioned he was going to set up a "Linux" box at home. I asked
"what's Linux", and he said it was a free version of Unix that could run
on a 386. I didn't believe him.
Why should I? Who in their right mind would spend programmer-years writing
an operating system they then give away?
It was 1995, and the entire country had emerged from the nasty Gulf
War recession. We were busy making money -- making up for lost time. Anyone
who could write code was profitably employed. Who would program for free?
1995 was the year when Microsoft owned the world. After capturing management's
imagination with Windows 3.0 and 3.1, in 1995 they issued the much more stable
and useful Windows 95. In 1995 Windows was ubiquitous, and Microsoft Office
was fast taking marketshare from WordPerfect. Corporate America demanded
their programmers produce GUI programs, even for such mundane tasks as batch
processing and text input. I began to fear that Microsoft's monopoly-based
absolute power would corrupt absolutely, and to wonder what that would mean
for my career.
1995 was the year when the Internet escaped the bounds of the military and
universities to become available to anyone with a computer and a spare $20.00
to $30.00 to spend on an ISP. Mere mortals began creating websites. My first
website, Litts Tips, went online.
If I had bothered performing an AltaVista search on the word "Linux", I would
have found a thriving community. The first Linux Expo happened in 1995 at
North Carolina State University. You could now buy Linux CD's from Yggdrasil,
Slackware and Red Hat. A magazine called "Linux Journal" was thriving, and
there was at least one tech book about Linux. Linux had been ported to the
64 bit DEC Alpha servers. Linux had thousands upon thousands of users, and
a few had found ways to make money. A company called VA Linux now sold systems
with preinstalled Linux. A company called "Cygnus" had been supporting free
software since 1989, and was now a serious entity.
But I didn't search. I just thought "wouldn't it be nice if that were true",
and continued programming Turbo C++. Millions of other technologists were
just like me -- oblivious.
1997
By Steve Litt
On a work-finding tour of the Culver City section of Los Angeles, I happened
upon a tiny company selling a $4000.00 "ISP in a Box". In answer to my questions,
the owner mentioned it ran Linux. My ears perked up. Let's see, sale price, $4000.00.
Cost of goods sold, 0 (excluding hardware).
By 1997 the not so secret secret was that you could do a lot with Linux.
Tiny ISP's sprung up with no more than GNU/Linux, a commodity box, and a
pipe to the Internet. The Linux Expo show in North Carolina had now been
joined by Atlanta Linux Showcase, which started as a huge Installfest in
1996, and graduated to a real show in 1997.
By 1997 I loathed Microsoft -- their arrogance, their horrible development
environments, and their monopoly enforced ubiquity in the "enterprise". I
became one of the millions of technologists poised to make a move on Linux...
1998
By Steve Litt
1998 was the year millions of people discovered GNU/Linux. I was one of them.
For me it was a year-long process. For the Linux Community, the year brought
milestone after milestone.
The Internet revolution was going full bore. The March 9-13 Spring Internet
World '98 show at the Los Angeles Convention Center featured a tiny booth
from Red Hat Software, where they sold Red Hat installation CD's for $10.00.
I was tempted. Sorely tempted. At last I could have kitchen table Unix. Or
could I? With my huge workload, that CD would probably become shelfware. And I needed
both my computers for my Windows programming. Making a mental note to someday
get Linux, I passed on the offer.
Others were more proactive. January 1998 saw the premier edition of Linux
Weekly News magazine, and Netscape announced they would release the source
to Netscape Navigator under a free software license.
February saw the introduction of the Cobalt Qube Linux disk space appliance,
and the Linux user community won InfoWorld's Tech Support Award. In March
Ralph Nader asked the large PC vendors to offer an OS alternative to Microsoft,
specifically mentioning Linux as a possible alternative.
In late March I couldn't think of a theme for the April issue of my monthly
magazine, "Troubleshooting Professional Magazine". Enraged over Microsoft's
conversion of the noble programming profession into glitch ridden Visual
Basic "apps", I themed the April issue "Corporationally Incorrect". The entire
magazine was a rant -- so unprofessional. Stats soared.
So I made the May issue "Free Software (Corporationally Incorrect, Part 2)".
Once again a rant, but this time documenting free software and Linux, and
even featuring a guest article by Forrest M. Hoffman and William W. Hargrove
of the Oak Ridge National Laboratory describing how they had built the
Stone SouperComputer -- one of the first recorded Linux based clustered
supercomputers. This time the magazine's stats for the month exceeded every
other Troubleshooters.Com page. That's how I became a Linux advocate before
I'd ever used GNU/Linux. And that's how I got my first commercial reason
to install GNU/Linux.
While I was writing magazines, GNU/Linux was making history. The Google search
engine, based on Linux, went live in May.
May 1998 was special for one more reason -- the Microsoft antitrust suit was filed.
In July Informix and Oracle released database software for Linux. Linux was
now a big iron, enterprise player. September marked the start of LinuxToday.Com.
September was very special for me. Like so many families before us, our family
migrated to find a safer place to raise our kids. We moved from Los Angeles
to Orlando, Florida. My wife and kids went first, and I followed in my old,
1967 Dodge Coronet, packed floor to ceiling, and roof too, with everything
from my business -- file cabinets, shelves, printers, backups, and my two
computers. Once in Orlando, 2500 miles away from my nearest client, I had
some time on my hands. I called up Red Hat and ordered the software I should
have bought at Internet World '98 the preceding spring. It arrived a week
later -- Red Hat 5.1, complete with a nice blue booklet explaining everything.
So began a regular daily routine -- work getting new contacts, authoring
Troubleshooters.Com, and doing some programming on my main machine, all the
while repeatedly installing Linux on my secondary machine. That week I performed
about 40 installs, each time changing my responses and seeing the changes
in the /etc directory. I'd do my real work on my Windows machine,
and when the installer asked for info, I'd switch chairs and enter the info,
then go back to my real work. After about twenty such installs, I bought
2 network cards, 2 cat 5 cables and a hub, and networked the thing. That's
how I learned about networking.
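For readers who never lived through that era, wiring two boxes together came down to a couple of commands. Here's a minimal sketch in the style of the era's net-tools commands; the interface name and the private 192.168.1.x addresses are assumptions for illustration, not details from my actual setup:

```shell
# Assign a private address to the first network card
# (eth0 and 192.168.1.2 are assumed for this example).
ifconfig eth0 192.168.1.2 netmask 255.255.255.0 up

# Older kernels needed the local network route added explicitly;
# later ones add it automatically when the interface comes up.
route add -net 192.168.1.0 netmask 255.255.255.0 dev eth0

# Verify the link to the other machine through the hub.
ping -c 3 192.168.1.1
```

Do the mirror image on the second machine (say, 192.168.1.1), and the two boxes can see each other through the hub.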
Meanwhile, storm clouds gathered above Microsoft -- an antitrust trial loomed.
Microsoft could not squish Linux like they had every other competitor --
for the time being they needed to "play nice".
Remembering the huge readership of the April and May Troubleshooting Professional
issues, I made the November issue "Linux", and in that issue I went through
a complete Red Hat 5.1 setup: basic installation, dual boot installation,
partitioning, X configuration, and LILO. Then a few tasks such
as simple commands, a C program, a perl program, and rescue floppies. Then
I went into network config -- adapter setup, Apache, CGI, DNS, and last but
not least, a simple Perl DBI::DBD webapp. The November issue broke all records
for a Troubleshooters.Com web page, and attracted numerous visits for the
next three years.
A Linux User Group called ELUG met in the library in downtown Orlando. I
piled my secondary computer, my monitor, hub, network cables, and install
CD's in the trunk of my '67 Coronet, and went to ELUG.
Geniuses everywhere. No UNIX envy here -- many of these guys had used UNIX
since the 1980's, and they knew everything. Guys whipped out install disks and
installed different distros. Everyone connected to each other. One guy went
up into the ceiling, fished out a phone wire, and hooked up to the net.
This intellectual frenzy was orchestrated by the group's leader, Jeff Rose.
Jeff was Central Florida's greatest ever Linux advocate, and his leadership
was legendary. When Jeff suggested something, ten ELUG members would jump
to get it done. ELUG was on its way to incorporating. Everywhere you looked,
ELUG was either getting publicity, or publicizing Linux. Jeff Rose had strategic
alliances with SVLUG, and ELUG was world famous. By December I had been appointed
ELUG's publicity person.
What a December it was. We all knew Linux was headed for stardom, and we
intended to be part of it.
1999: Fame and Glory
By Steve Litt
If you were in IT back in 1999, you remember it. You'll tell your children
and grandchildren about it. Everyone had jobs and money. Everyone did interesting
things. If you didn't like the way your boss looked at you, you'd get another
job during your lunch break.
Of all the places to be in IT, Linux was the best. The Internet powered the
new economy, and GNU/Linux powered the Internet. Sure, you didn't make as
much money in GNU/Linux, but for once you could produce a product you could
be proud of.
Winter and spring featured such events as the release of Samba 2.0, "Windows
Refund Day" where geeks demanded refunds on the unused Windows OS bundled
with their computers, and the debut of Linux Magazine. "Open source" became
the buzzword of the day. The first LinuxWorld
Conference and Expo, held in San Jose, claimed an attendance of 12,000.
In February I was named to the ELUG board of directors. In April 1999 Macmillan
hired me to write four chapters of Red Hat Linux 6 Unleashed: DNS, Samba,
Python and TCL. In late April I wrote the May 1999 issue of "Troubleshooting
Professional Magazine", themed "Where Have All the Heroes Gone", which discussed
heroes of the computer age, including the free software heroes. That issue
attracted over 20,000 visits in a single month -- to this day a record for
a single Troubleshooters.Com page. Then things got better...
On May 17, 1999 I aimed my car north and drove to Raleigh, North Carolina
for the fifth annual Linux Expo show. Everyone was there -- everyone you
read about at the Linux magazines, everyone you heard about at your LUG meetings.
I walked in and guess what -- they all knew me!
"Hey, aren't you the guy who wrote 'Where Have All the Heroes Gone'?" Robin
Miller pulled me aside and complimented the Troubleshooters.Com Universal
Troubleshooting Process page, and then introduced me to the crew at Andover.Net,
which would someday become OSDN, which itself would be bought by VA Linux,
after which Robin Miller would become the Editor in Chief of OSDN.
I walked with a bounce in my step. I represented the great ELUG as one of
its directors. I was a contributing author to Red Hat Linux 6 Unleashed,
and hobnobbed with many other Macmillan authors and editors. I was the author
of the much quoted "Where have all the heroes gone".
Don't get the idea that I was something special. EVERYONE there was becoming
famous. This was 1999, and Linux was the place to be noticed.
In June 1999, for legal reasons beyond the scope of this article, ELUG changed
its name to LEAP (Linux Enthusiasts and Professionals), and the ELUG board
of directors became the LEAP Executive Committee. The Exec Committee continued
the Board of Directors' work on creating bylaws for a corporation. In August
1999 LEAP incorporated and held its first officer election. I was elected
a director at large, and Chris Young was elected our first president. But
the name change had set off a civil war, with a few ELUGgers, including Jeff
Rose, forming a second ELUG organization. The flames of war were fanned by
five email flamers wanting to be heard at any cost.
The cost was two years' progress. With his unbelievable leadership, Jeff
Rose quickly built the membership of the second ELUG to a level beyond that
of LEAP. Orlando is too small a city, with too little a Geek presence, to
support two LUGs. Everyone knew that only one LUG would survive. The two
LUGs sniped at each other until one died, and one shone brightly.
The year continued. GNU/Linux was ported to the 64 bit Intel Merced processor.
SGI swore its allegiance to Linux. Burlington Coat Factory converted their
systems to Linux. Red Hat IPO'ed in August, rising to $50/share, later rising
to $135/share. In December VA Linux IPO'ed, rising from a starting price
of $30/share to $250/share that same day.
By late 1999 it was clear to everyone that Microsoft wouldn't easily get
out of their little antitrust problem. Judge Jackson was no pushover. On
November 5, 1999, Judge Jackson issued a finding of fact that Microsoft was
indeed a monopoly, so the next step was a finding of law, and if Microsoft
was again found guilty, a remedy.
During the last quarter of the year I busied myself as the main author of
Samba Unleashed. There's something about having your name on the front of
a 1200 page book!
1999 had indeed been a year of fame and glory. What would the new millennium
bring? Read on...
2000: Business as Usual
By Steve Litt
2000 was a year of solidifying our position and making money. Everywhere
you looked, people made money with GNU/Linux. As long as you looked at corporations,
and not the rank and file Geeks who brought Linux to the forefront. GNU/Linux
was now the second most popular operating system. VA Linux bought Andover.Net.
Apache gained 60% of the web server market share. HP, Intel, IBM and NEC
formed the "Open Source Development Lab" to help programmers test their big-iron
targeted Linux code.
The LEAP/ELUG civil war continued, with LEAP making slow gains. ELUG was
based entirely on the personality and incredible leadership of Jeff Rose.
LEAP was based on bylaws, a 7 member Exec Committee, and a membership that
understood and believed in these things. Both groups were wholly committed
to GNU/Linux. Around May or June Jeff Rose left Orlando. Two subsequent ELUG
leaders did not have Jeff's magnetism, and ELUG lost its meeting place in
June or July, and became a mailing list only LUG. They dwindled quickly,
after which LEAP concentrated on building, which we did with abandon. Nevertheless,
the whole civil war was a travesty. We lost over a year of growth, and worse,
we lost Jeff Rose, who in any sane world would have been LEAP's first president.
By our second election in August 2000, LEAP was for all practical purposes
Orlando's only LUG. Chris Young had shepherded LEAP through a dangerous and
difficult year. Phil Barnett was elected our second president, and under
Phil LEAP became a powerhouse.
Samba Unleashed hit the bookstores in April. I didn't become a millionaire,
but I did OK.
Geeks didn't become millionaires, but they did just fine, thanks to a red
hot economy whose pundits thought of Linux as the holy grail. Computer trade
mags leapfrogged each other to become the latest to praise Linux.
In April 2000 Judge Jackson found that Microsoft was not only a monopoly,
but they had violated the Sherman Antitrust Act. On June 7, 2000, Judge Jackson
issued his remedy, ruling that Microsoft should be broken in two. The GNU/Linux
world, long suffering from Microsoft's dirty tricks, rejoiced.
Unfortunately, in June Judge Jackson allowed Microsoft to delay the divestiture
(breakup) until their appeal had been heard. Jackson allowed Microsoft to
appeal directly to the Supreme Court, but the Supreme Court wouldn't hear
the case, sending it to the appeals court instead.
In December IBM pledged to invest a billion dollars in Linux during 2001.
What a wonderful world.
But late 2000 brought a couple disconcerting facts. There was strong evidence
that the eight year old economic expansion was winding down. And there was
a very contested presidential election in which the Supreme Court finally
appointed an administration that many believed would be more loyal to its
political contributors than to the nation's antitrust legislation...
2001: The Bubble Bursts
By Steve Litt
Bush took office in January 2001, and the justice department continued the
Microsoft case. Maybe the skeptics were wrong.
But Judge Jackson had made some out-of-court statements leading the appeals
court to believe that his punishment was flawed by personal beliefs, so although
the appeals court upheld Microsoft's guilt, they ordered a new penalty phase.
In August 2001 Judge Colleen Kollar-Kotelly was appointed to replace Judge
Jackson. Judge Kollar-Kotelly had little high tech background -- the Linux
world took this as ominous. Meanwhile, Microsoft resumed their dirty tricks,
bundling products like Instant Messaging, Media Player, and digital camera
software, in addition to Internet Explorer.
In 2001, Microsoft attempted to influence Congress into banning or limiting
free software with rants from Jim Allchin, Craig Mundie, and Steve Ballmer.
They attempted to hijack the information and intellectual property of others
through their original outrageous MS Passport license, and I'm proud to say
that I was one of the webmasters who boycotted all web surfers coming in
through Passport. Within a few days Microsoft backed down.
The dot-com bust began early in the year, and continued. GNU/Linux became
an outcast. The same trade press that had leapfrogged each other to print
the next pro-Linux story now leapfrogged each other to print the latest anti-Linux
story. Sensing which way the wind was blowing, the trade press praised Microsoft
and their products to the heavens.
As if the economy wasn't enough, that punk bin Laden and his defective friends
knocked down the trade towers, and many considered the U.S. response overly
mild. The brief outpouring of patriotism that followed couldn't mask the impotence
felt by a nation wondering if we had the will to defend ourselves. The economy,
which had already been headed down, plunged further. Tax cuts failed to stem
the onrush of recession.
Jobs were lost. Lots and lots of IT jobs. We thought it was bad then, but
we had no idea.
As the tide retreated from the business hype surrounding Linux, GNU/Linux
reverted to its true self, its best self, its grassroots self. Linux Expo
never saw the new century. Atlanta Linux Showcase became Annual Linux Showcase
in Atlanta in 2000, then in California in 2001, then nowhere in 2002, but
regional shows rushed in to fill the void. Florida had three great local
shows, including LEAP's Linux presentation at the Orlando CTS show.
On a personal level, in March 2001 I switched to Linux for my daily work,
thus bringing my actions in line with my advocacy. The April 2001 Troubleshooting
Professional Magazine detailed the trials and rewards of the switch. In August
2001 I was elected as LEAP's 3rd president. LEAP presented a highly acclaimed
GNU/Linux presence at the Orlando CTS show in May. LEAP was famous, financially
stable, and growing.
As 2001 ended, it appeared the worst might be over. We invaded Afghanistan
on October 7, and swore in a new Afghan government on December 21. The stock
market had partially recovered from the September 11 panic. In our relief
we might have been forgiven for ignoring the December 2 news item that Enron
had filed for bankruptcy...
2002: OOR
By Steve Litt
What's OOR? Is it a programming paradigm? No, it stands for Outsourcing,
Offshoring and Recession. Perhaps it's
a programming paradigm after all. 2002 was the year computer programmers improved
their productivity by flipping burgers, driving cabs, and working as schoolteachers.
Millions of U.S. jobs were lost, and IT was especially hard hit.
Enron's late 2001 bankruptcy, and the 2002 revelations it spawned, knocked
the stock market back to 9/12/2001 levels. Money was lost, more people were
fired. The few jobs created seemed to go to foreigners -- either H1B/L1 visas
or jobs emailed to India. Worldcom soon joined Enron in revealing fraudulent
accounting practices leading to bankruptcy.
Worldcom was especially destructive. For years all the telecom companies had
lowballed prices and overbuilt infrastructure trying to compete with Worldcom,
who appeared to be making a healthy profit while charging customers next
to nothing. In trying to compete with Worldcom, most telecoms lost huge
sums, trashing their stock prices and inviting job-losing consolidation.
Layoffs were rampant.
Economic cycles are nothing new. If you're over 30, you've seen other economic
cycles come and go. Good times will come back again.
Offshoring is something new, at least to IT and engineering. Those
jobs are never coming back. Entire IT departments shut down and turned off
the lights, leaving one guy whose job is to power cycle the server when requested
by the technologists in India, China, Russia or wherever. American technologists
were retained just long enough to train their pennies-on-the-dollar foreign
replacements.
Those jobs are never coming back. If you ever wonder why the U.S. graduates
so many more lawyers than engineers, look no further than the relative ways those
two professions are rewarded. If current "free trade" attitudes prevail, there
won't be enough U.S. technologists to support a future war effort, and our
enemies will realize this. How will that affect our bargaining position in
future conflicts?
How did the army of laid off computer professionals continue to sharpen their
skills without a job? Open source! People flipped burgers during the day,
and slammed out open source at night.
LEAP continued to thrive, serving employed and unemployed technologists alike.
In August I handed over the presidency to Phil Barnett, who served his second
term as president. LEAP members exchanged knowledge and employment tips. Many
LEAPsters worked on open source projects. LEAP's activities were duplicated
by LUGs in every city.
How did corporations deal with the now 2 year old bad economy? Increasingly,
they turned to open source, depriving would-be monopolist Microsoft of revenue.
How did Microsoft respond? They altered licenses in ways increasingly costly
to their customers. How did customers respond? Increasingly, they turned to
open source.
On Friday, November 1, 2002, at around 4:20 in the afternoon, the United States
of America granted full and unconditional monopoly powers to Microsoft Corporation.
Microsoft was granted a complete pardon for what had been ruled an illegal
monopolization of the web browser market. Microsoft was granted complete
power to monopolize any other markets they choose. Microsoft was granted
the power to restart the dirty tricks against competitors that they had put
on hold during the antitrust trial. The only penalty in the "settlement" proposed
by the Bush administration and rubberstamped by Judge Kollar-Kotelly is that
if Microsoft violates the settlement, they get two additional years tacked
on to the 5 year settlement. It's like telling a bank robber "you'll be on
probation for 5 years, and if you rob another bank we'll extend your probation
for two more years".
Microsoft now has the power to use monopolistic tactics, no matter how outrageous,
against open source software. And worst for us, I fear that as long as the
current administration is in Washington, Microsoft can use the United States
Government to attempt to shut down the use of open source in the United States.
In a perverse way, perhaps the tech industry implosion was a good thing.
In better times, Microsoft might have achieved the monopoly they so sought.
The joblessness and cost consciousness brought on by the funereal economy
supported open source...
2003: GNU/Linux Survives
By Steve Litt
2003 was upbeat. Just listen to the media. The radio bragged that job losses
were decreasing. It would be years before our unemployment levels returned
to a reasonable rate, especially in IT, and we were still losing jobs, but
we weren't losing them as fast. Toward the end of the year, we were technically
in a boom market. The stock market once again hit the 10,000 mark it first
achieved in the spring of 1999. A huge construction boom roared.
Few of my tech buddies shared in the joy. It seemed like newly created jobs
were emailed to India, while layoffs were borne by Americans.
At LEAP meetings, more than ever, the most asked question was "does anyone
know of any jobs?". They were no longer demanding open source jobs. They no
longer demanded anything. They wanted a job, any job. And all too often the
response was either "no, I'm barely holding onto my job", or "I was going
to ask you the same question".
More than ever, LEAP served as a tech hangout. The unemployed and the hoping-to-stay-employed
learned at Thursday meetings and frolicked at Saturday Installfests. Recession-fueled
workplace drudgery was countered by an increasingly fun and productive
LEAP presence. In August Max Lang ascended to the LEAP presidency. Max was
the first LEAP president not on the original founding LEAP executive committee.
LEAP had proven itself stable enough to spawn a second generation of leaders.
In 2003, open source proved a real stumbling block to Microsoft's monopoly
aspirations. IBM repeatedly hammered home the cost and productivity advantages
of GNU/Linux. Cities, states and countries switched from Microsoft to open source.
Imagine being Bill Gates in 2003. You've been unable to use your usual revenue
starvation techniques against GNU/Linux, and your pleas to the government
for protection against this "intellectual property cancer" appear to have
fallen on deaf ears. Clearly, you need a new strategy.
Your new strategy must be subtle. It's true that the antitrust "settlement"
of 11/1/2002 granted you full monopoly powers, but Judge Kollar-Kotelly made
it clear that she would monitor the situation. Perhaps you can find a company
with a barely supportable intellectual property claim to GNU/Linux source
code. Perhaps you can fund a flurry of anti-Linux litigation threats. Perhaps
you could keep that company, which had been previously bleeding red ink,
securely in the black by a cash infusion of a few million dollars in the
form of a "license" from that company. Chump change to you, but a massive
marketing headache to the Linux world via Stupid Courtroom Obfuscations.
If that was Microsoft's intent in purchasing the SCO license, this new anti-Linux
defense appeared no more successful against GNU/Linux than its whine to
the government and revenue starvation predecessors. The Samba
project roared in with its Samba 3.0, which replaces not only Microsoft file/print
server functionality, but a heck of a lot of Microsoft's authentication functionality.
IBM keeps marketing GNU/Linux, with quite a bit of success. Sun is now selling
GNU/Linux machines, complete with the StarOffice replacement for MS Office.
Outside the United States, Microsoft falls on hard times. Various German
municipalities have dumped MS for open source. Israel follows suit. Microsoft
bribed India with an insulting $400 million, and India buys more MS product,
for now. I wouldn't want to be Bill Gates.
There's a sizeable class of businesses and individuals now comfortably ensconced
in open source. Troubleshooters.Com is one of them. In the 2 years since my
Windows to Linux conversion, I've set up scripts and apps to automate everything
from book writing to order fulfillment to web publishing. My UMENU software,
a 2 day fun-project in 1999, is now the top level of my user interface. VimOutliner,
once a compromise to perform outlining on a GNU/Linux box, is now by far
the best outliner I've ever used.
And open source supports a new kind of technology job...
2004: Into the Future
By Steve Litt
Entire corporate IT departments are outsourced and offshored. Throughout America,
Fortune 500 IT jobs evaporate. It looks to me like the days of cushy IT jobs are over.
But when you think about it, IT people rarely had it cushy. I did RT/11 in
a Unix world. I did DOS when DOS wasn't cool. There were a few Cobol programmers
who managed to parlay a single language into lifetime employment, but they're
a rarity. Most of us have had many occasions to learn quick and lowball our rates.
There's a LEAP guy who's the IT department for several dentists' offices. He
probably doesn't have paid health insurance and a subsidized 401K, but he
appears to make a good living. He uses open source when he can, Windows when
the client demands it.
Another LEAPster quit his programming job and now serves as the one man IT
department for a religious school. He's Linuxizing, thereby probably saving
his employer more than his salary, and making his own job easier.
Look what these two have in common:
- They work for small businesses.
- They must perform as generalists.
- Their customers are too small to hire an outsourcing agency to offshore their jobs.
This might be the IT job of the future, and it might be pretty good. In this
era of the small business generalist, there's less pressure to continually
learn new and bizarre programming tools. The small business generalist isn't
susceptible to replacement by someone who just happens to have more experience
in the new technology of the month. Gone are the days of age discrimination
-- small businesses often have the head guy talk to the new prospect, and
the head guy often respects experience. Gone are the days of having some fool
stand between you and the users you're accommodating -- this is a small business.
This reminds me very much of the DOS revolution of the mid 1980's. For a long
time, we DOS types were underpaid, often creating programs for individuals
or companies so small we prayed we'd get paid at all. But we carried our
own health insurance, wrote our own ticket, and felt free to turn down deals
we didn't like. We were hired gun programmers, and we loved life.
The 2004 small business IT generalist is similar. There's freedom to move
in places where the outsourcers and offshorers enjoy no economic advantage.
Our pay and benefits won't be obscenely high, which means our incentive to
move elsewhere will exceed our employers' incentive to fire us.
If history repeats itself, we're setting ourselves up to be the technical
elite. Remember that in 1985 DOS guys worked for little companies, receiving
no benefits and little money. The Unix and mainframe crew reaped the rewards.
By 1991 the tables had turned, with DOS guys hired to do quick apps in the
large companies, and the big iron Unix and mainframe guys scratching for work.
The DOS guys achieved that 6 year turnaround because departments in corporations,
long ignored by the stodgy big iron IT guys, found out that the DOS guys
knew how to convert a wish into a system in a matter of weeks, not years.
Imagine the surprise of the corporations when they discover that the offshore
crew still takes years to deliver their oh-so-perfectly-architected enterprise
apps, while the small business IT hired guns can use Perl/Python/PHP to add
another access to the company database in a matter of days.
We are the future of American IT, and open source is part of our value proposition.
Life After Windows: The Need for Speed
Life After Windows is a regular Linux Productivity Magazine column,
by Steve Litt, bringing you observations and tips subsequent to Troubleshooters.Com's
Windows to Linux conversion.
By Steve Litt
The preceding article mentioned the coming of age of the open source equipped
small business IT generalist. This is a guy who must compete on price against
third world workers. He's got to be quick, and his systems have to be quick.
The new IT professional must rely on himself, not on expert co-workers. This
requires a generalist knowledge of the technology, and also requires knowledge
and use of a valid Troubleshooting Process. Luckily, the very website you're
now reading can teach you the Universal Troubleshooting
Process, which is ideal for the new small business IT generalist.
The new IT professional must rely on himself, not on expert co-workers. This
requires just-in-time learning of technologies that become essential in satisfying
her employer or client. The incredibly productive Rapid Learning Process is
well documented on this website.
Remember the days of huge projects consisting of huge teams producing huge
apps over huge timeframes for huge corporations? If the app was overdue and
over budget, or if it was cancelled, that wasn't your problem. You laughed all
the way to the bank. You can probably still get that work if you're willing
to move to certain developing countries and learn their languages. If you're
not willing to make that move, you better learn to develop quickly.
For 15 years we've heard talk of Rapid Application Development (RAD) and Reusability,
usually from folks who considered 6 months "rapid" and whose idea of reusability
was to spend 3 months developing an architecture that would promote reuse,
if only the maintenance programmers would take the time to read the documentation.
We hired guns have different definitions. To us, "rapid" means a day, which
is about the amount of time a small business will hire us on our own word.
When we walk out at the end of the day, we must leave them with something
that works. Not the app of their dreams, but something showing enough utility
and promise that they'll bring us back to enhance it.
To us, "reusability" refers to grep, cut, head,
tail, tee, sort, sed and awk.
These are tools tested throughout the ages, that do one thing and do it right.
These tools have very thin interfaces and no known bugs or security issues.
To us, "reusability" refers to Python, Perl and PHP, as well as module repositories
such as CPAN. These languages are years old, hugely tested, and don't depend
on memory-tromping pointers. These languages enforce no particular programming
paradigm, so anyone can use them to quickly slam out an app.
Not every app needs to run quickly. Certainly a database front end needn't
run quickly -- the bottleneck is the user's typing speed. Once the user hits
the transmit button, speed is important. No user wants to wait over 30 seconds
for an update to complete. Good database design usually assures timely response.
Likewise, sizing hardware and the wire to suit the user load is vital. Beyond
that, sometimes the needed functionality involves huge numbers of records.
In those cases, you must design for speed. Here are some tips:
When in Doubt, Screen 'em Out
The more records you can eliminate early in the game, the faster your app
runs. For instance, when selecting certain records, try to do it on the server
end (with a SQL statement) rather than on the client end (with program logic).
If the required logic is too complex for SQL alone, consider filtering the
output (on the server end) through the grep command. The grep
command is insanely fast -- much faster than sed, which in turn
is much faster than awk, which in turn is twice as fast as perl.
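The server-end screening advice above can be sketched in Python (one of the quick languages this article recommends), using the standard sqlite3 module. The table and page names are made up for illustration; the point is that the WHERE clause screens records out before they ever reach the client loop.

```python
import sqlite3

# Build a small in-memory table of hypothetical log hits.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hits (page TEXT, status INTEGER)")
conn.executemany(
    "INSERT INTO hits VALUES (?, ?)",
    [("/index.htm", 200), ("/logo.gif", 200), ("/dead.htm", 404), ("/faq.htm", 200)],
)

# Slow way: drag every row to the client, then filter in program logic.
slow = [row for row in conn.execute("SELECT page, status FROM hits")
        if row[1] == 200 and row[0].endswith(".htm")]

# Fast way: let the server screen records out with the WHERE clause.
fast = conn.execute(
    "SELECT page, status FROM hits WHERE status = 200 AND page LIKE '%.htm'"
).fetchall()

print(fast)  # [('/index.htm', 200), ('/faq.htm', 200)]
```

On four rows the difference is invisible, but on a million-row table the WHERE version ships only the survivors across the connection.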
Take my home-grown web log analysis program. Troubleshooters.Com receives
over 2000 distinct-ip visits daily, and it's my job to track those visits
for marketing purposes (not to track bandwidth). On a daily basis, I record
the top 20 most visited Troubleshooters.Com pages, as well as tracking
certain pages of special interest (such as my bookstore pages), and
summarizing daily distinct IP visits and total page hits. December's log
file contained 930572 lines -- a lot to process on a Dual Celeron 450 with
512MB. Because my program must go back several months and summarize all of
them, it typically must at least consider millions of lines of log file.
You notice I said "at least consider". Graphic pages are not of interest
in marketing (I don't sell advertising). Java .class files aren't
of interest -- I can track the .htm files that spawn them. Also
not of interest are pages that aren't successfully viewed -- only result code 200 lines count.
Web log evaluation requires some regex parsing -- a slow process. Luckily,
only about 1/10 of the lines are non-graphic, non-Java, result code 200.
I screen out the rest before the regex parsing stage. Check this out:
cat `./logfilelist.cgi` | \
grep -v "\.gif " | \
grep -v "\.ico " | \
grep -v "\.png " | \
grep -v "\.js H" | \
grep -v "\.css" | \
grep -v "index.cgi" | \
grep " 200 " | \
grep -v "\.jpg " | \
grep "\"GET " | \
grep -v "\.class " | \
grep -v "65\.94\.113\.101" | \
In the preceding, the top partial line concats the monthly logs and feeds
the lines to the greps. The -v option inverts the match, meaning do
NOT pass matching lines. Therefore, the preceding first screens out .gif
files, then .ico, then .png, on and on. You might wonder
why I put them in that order. The answer is simple enough, I screen out the
most commonly occurring patterns first, so those lines don't proceed farther
down the pipeline. Doing it in that order screens out the most the quickest,
minimizing the total work.
At first glance these eleven greps might look like a lot of very slow work.
Indeed, if they happened consecutively, it would be slow. But the grep
command is optimized for piping, so that most of this work happens concurrently,
with each grep command being its own process. On multiprocessor machines
this is lightning fast, but even on single processor machines it's much faster
than consecutive processing. The eleven greps form a bucket brigade. Or perhaps
a better metaphor is an assembly line.
In an assembly line, each station performs one tiny task on one part, then
passes it on. You inspect often, so that if a part is bad, it is discarded
or sent back for rework, rather than having downstream stations continue
work on a part that can never be sold. On my eleven grep assembly line, bad
lines are discarded so that downstream stations (greps) don't process useless
information bound to be screened out anyway.
You might be tempted to run a single sed or awk script
to screen all eleven undesirable line types in a single process. Don't.
Grep is lightning fast, especially when properly pipelined.
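The same screen-early principle applies inside a single program. Here is a minimal Python sketch with made-up log lines: cheap substring tests discard the junk before the expensive regex stage ever runs, just as the greps discard lines before the parser sees them.

```python
import re

# Hypothetical log lines; only .htm GETs with result code 200 matter.
lines = [
    '1.2.3.4 "GET /index.htm HTTP/1.0" 200 5120',
    '1.2.3.4 "GET /logo.gif HTTP/1.0" 200 900',
    '5.6.7.8 "GET /dead.htm HTTP/1.0" 404 300',
    '5.6.7.8 "GET /faq.htm HTTP/1.0" 200 4800',
]

# Expensive stage: a regex that pulls the page out of the request field.
page_re = re.compile(r'"GET (\S+) HTTP')

pages = []
for line in lines:
    # Cheap substring screens first, mirroring the grep pipeline:
    # graphics and non-200 lines never reach the regex.
    if ".gif " in line or " 200 " not in line:
        continue
    m = page_re.search(line)
    if m:
        pages.append(m.group(1))

print(pages)  # ['/index.htm', '/faq.htm']
```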
Assembly Line Your Work
Unix (and therefore GNU/Linux) has a certain beauty. Part of that beauty
is the efficiency of process pipelines. Piped commands, when each command
immediately dispenses with a line after processing, can be cascaded for huge
performance. But they must pass on data quickly.
I wrote a whois parsing program to parse the output of whois
commands on a set of domain names, turning it into a report. For simplicity,
I did it as a piped series of small Perl programs. The performance was terrible...
Each Perl program collected all the lines, processed them, then put the processed
data in front and then passed the raw data to the next filter in the pipeline.
So each process needed to stand idle until the preceding processes processed
ALL the data.
A higher performance program would have figured out a way to screen out a
maximum amount of irrelevant data on the very first pass, one line at a time.
Then go from there. There are other tricks too...
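The pass-it-on-immediately idea can be sketched in Python with generators. The two stage functions and the whois-style lines below are hypothetical; the point is that each stage yields a line downstream the moment it handles it, instead of collecting ALL the data first.

```python
# Two hypothetical pipeline stages over whois-style text, written as
# generators so each line flows downstream as soon as it's read.

def keep_interesting(lines):
    # Stage 1: screen out blank lines and comments immediately.
    for line in lines:
        if line and not line.startswith("%"):
            yield line

def label(lines):
    # Stage 2: tag each surviving line; it never waits for the whole file.
    for line in lines:
        yield "KEPT: " + line

raw = ["% comment", "", "Registrar: Example", "% noise", "Status: ok"]

# The stages cascade like piped commands: no stage buffers all the data.
result = list(label(keep_interesting(raw)))
print(result)  # ['KEPT: Registrar: Example', 'KEPT: Status: ok']
```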
Timing is Everything
My whois parsing program took an intolerable 20 seconds to run. The user
had to wait all that time and wonder if it was hung, especially when it was
performed in a web interface.
How different life would have been if I had made the parsing part of the
download process. The whois databases cut you off if you make too many requests
during a period of time. Therefore, it is necessary to put a 5 second sleep
between each whois request. That 5 seconds would have been more
than enough time to parse the data from a single whois command.
To the extent possible, try to schedule processing during times requiring
waits for other reasons. Keystroke acquisition, response waits, whatever
-- try to schedule processing for those times.
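A sketch of the whois idea, hedged: fake_whois and parse below are stand-ins, and the delay is shortened from 5 seconds to keep the example quick. The parse happens inside the mandatory wait window, and only the leftover time is slept off.

```python
import time

DELAY = 0.2  # stand-in for the real 5-second courtesy delay between requests

def fake_whois(domain):
    # Hypothetical stand-in for running the whois command.
    return f"Domain: {domain}\nStatus: active"

def parse(text):
    # Hypothetical parse step: fold the record into a dict.
    return dict(line.split(": ", 1) for line in text.splitlines())

reports = []
for domain in ["example.com", "example.org"]:
    started = time.monotonic()
    raw = fake_whois(domain)
    reports.append(parse(raw))            # parse NOW, inside the wait window
    worked = time.monotonic() - started
    time.sleep(max(0.0, DELAY - worked))  # only sleep off the leftover time

print([r["Domain"] for r in reports])  # ['example.com', 'example.org']
```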
Do it in Memory
My original EMDL parser took 15 seconds to parse the EMDL file containing
my menu system. This parser involved several file writes. I rewrote the program
using Node.pm, a tiny knockoff of the most basic aspects of Document Object
Model. This enabled the entire menu structure to be held in memory. The resulting
parser took 2 seconds.
My original whois parsing program took a couple seconds, performing all operations
in memory on an array. When I replaced it with a simpler to understand series
of piped processes, processing time ballooned to 20 seconds. If you can, do
it in memory.
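A tiny Python sketch of the principle, with made-up hit counts: read the data once, then run as many passes as you like over the in-memory list, instead of paying for a fresh pipeline or temp file per pass.

```python
# Hypothetical hit counts per page, read once and held in memory.
hits = [("/index.htm", 120), ("/faq.htm", 45), ("/books.htm", 45), ("/old.htm", 2)]

# Every further pass works on the in-memory list -- no temp files,
# no re-reading the log, no extra processes.
total = sum(count for _, count in hits)
top = max(hits, key=lambda h: h[1])
watched = [page for page, _ in hits if page == "/books.htm"]

print(total, top[0], watched)  # 212 /index.htm ['/books.htm']
```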
Find the Right Data Structure
Finding the right data structure enhances both runtime speed and development
speed. My original EMDL parser was so convoluted that changing it required
5 hours of code review. I was the author, but it scared me to death. It took
15 seconds to run. So I rewrote it with Node.pm, which creates a tree of
nodes exactly modelling the hierarchy of a menu. The rewrite was fairly easy,
and now it's easy to modify or understand. Oh, and runtime is now only 2 seconds.
One of the easiest development methods is the Unix pipe. The entire interface
is stdin and stdout. Unfortunately, for complex logic, piping is slower at
runtime than in-memory operations, because complex logic cannot usually be
performed in a read one, pass one manner.
So why not simulate piping in a single program? For instance, in a parser,
read enough input to create one complete record, place that record in an
array, which becomes the raw data. Create a hash to hold the parsed data.
Then create a series of subroutines, each with the array and the hash as
read/write arguments. Each subroutine does one thing and does it well --
exactly like a Unix filter. Don't worry if several subroutines do almost
the same thing. The conceptual simplicity is worth the slight runtime hit,
and often the simplicity saves the convolutions that drown runtime performance
in a molasses of complexity.
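Here is a minimal Python sketch of that simulated-pipe pattern: raw lines in an array, parsed data in a hash (a dict here), and small subroutines that each do one thing, taking both structures as read/write arguments. The field names and stage functions are hypothetical.

```python
# Simulated pipe in one program: raw lines in an array, parsed data in
# a hash (dict), and one-job subroutines chained like Unix filters.

def grab_domain(raw, parsed):
    for line in raw:
        if line.startswith("Domain:"):
            parsed["domain"] = line.split(":", 1)[1].strip()

def grab_status(raw, parsed):
    for line in raw:
        if line.startswith("Status:"):
            parsed["status"] = line.split(":", 1)[1].strip()

raw = ["Domain: example.com", "Updated: 2003-11-01", "Status: active"]
parsed = {}

# Chain the "filters" in order, exactly like stages in a pipeline.
for stage in (grab_domain, grab_status):
    stage(raw, parsed)

print(parsed)  # {'domain': 'example.com', 'status': 'active'}
```

The two subroutines both loop over raw, which looks redundant, but as the article says, the conceptual simplicity is worth the slight runtime hit.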
When confronted with a new programming task, here are some of the questions
you should ask yourself right away:
- What is the desired output or result, and what data structure does it resemble?
- What data must be captured to achieve the desired output or result,
and what data structure does it resemble?
- What data must be stored persistently?
- In the tangible system modeled by the computer program, what lists exist?
(people, departments, students, clients, vendors, invoices, etc)
- What are the rules governing the input and output data?
- How can those rules be expressed in data, as opposed to algorithmically?
Based on the answers, try to find data structures mirroring the needed data.
For instance, Node.pm creates node trees, which exactly model outlines and
menus, making the authoring of outline and menu processing programs trivial.
Node.pm can also model linked lists and circular linked lists, making it
great for input form construction tools.
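Node.pm is the author's Perl module; the following is an analogous minimal Python sketch, not its actual API. A tree of nodes exactly models a menu hierarchy, and once the data structure mirrors the menu, rendering it is trivial.

```python
# Minimal node tree modeling a menu hierarchy (a Python analogue of the
# idea behind Node.pm, with a made-up menu).

class Node:
    def __init__(self, name):
        self.name = name
        self.children = []

    def add(self, name):
        child = Node(name)
        self.children.append(child)
        return child

    def walk(self, depth=0):
        # Yield one indented line per node -- the rendering falls out
        # of the structure.
        yield "  " * depth + self.name
        for child in self.children:
            yield from child.walk(depth + 1)

top = Node("Main Menu")
office = top.add("Office")
office.add("Word Processor")
office.add("Spreadsheet")
top.add("Internet").add("Browser")

print("\n".join(top.walk()))
```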
Raw line data can be represented by arrays. A data record can be a Perl hash,
with keys representing field names and values representing the contained
data. You could also represent a data record as an object.
If you need to code up an input form, perhaps the overriding data structure
should be a circular doubly linked list. Sometimes you need a database app
whose field names and lengths aren't known at compile time. This calls for
a second level of abstraction, where you have a table of fields, fieldnames,
fieldlengths, and possibly field validations. Code the app around that so
that a non-technical supervisor can set up an ad-hoc database. This comes
in handy for litigation support, for instance.
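That second level of abstraction can be sketched in Python: the fields live in a table rather than in code, and one generic routine validates any record against it. All field names and rules here are hypothetical.

```python
# Second level of abstraction: the fields are data, not code, so a
# non-technical supervisor could set up an ad-hoc database.
# Field names, lengths and validations are hypothetical.
fields = [
    {"name": "client",  "length": 30, "validate": lambda v: v.strip() != ""},
    {"name": "case_no", "length": 10, "validate": lambda v: v.isdigit()},
]

def check_record(record):
    # Generic validator: driven entirely by the field table.
    errors = []
    for f in fields:
        value = record.get(f["name"], "")
        if len(value) > f["length"]:
            errors.append(f["name"] + ": too long")
        elif not f["validate"](value):
            errors.append(f["name"] + ": invalid")
    return errors

print(check_record({"client": "Acme Dental", "case_no": "1234"}))  # []
print(check_record({"client": "", "case_no": "12x"}))  # both fields flagged
```

Adding a field means adding a row to the table, not touching the code.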
Stuff I Didn't Mention
You notice this article never mentioned assembler, C or C++, even though
those produce lightning fast code. Trouble is, they also facilitate subtle
bugs when you let your guard down. As a small business generalist, you don't
have time to debug assembler, C, or C++. You also don't want errant pointer
bugs discovered by the customer, nor do you want to leave the door open for
buffer overflow exploits. Use of non-pointer-based interpreters minimizes these risks.
I didn't mention Java. Java has an incredibly quick debugging phase, and
it certainly has no memory pitfalls. But whether from culture, or from the
language itself, it seems like Java requires days or weeks of design before
coding. That's a good thing in a huge project, but it's the kiss of death
if you're a small business generalist who must prove his worth on a daily basis.
I didn't mention the various "frameworks", "development environments", or
"RAD tools". These are all wonderful things in their place, but all too often
they're not very rapid. The only "RAD tool" I've seen live up to its name
is Clarion, and that's not made for GNU/Linux. "Frameworks", "development
environments", and "RAD tools" usually carry a huge learning curve, often
cost a lot of money, and generally aren't worth it for small projects. Or
big apps comprised of small projects.
With the advent of OOR (Outsourcing, Offshoring and Recession), IT professionals
in the developed world need a new way to market themselves. Increasingly we're
marketing ourselves as one man (or woman) IT departments for businesses too
small to hire an outsourcer. Our new clients aren't rich, and they need results
quickly. Today more than ever, we feel the need for speed:
- Troubleshoot quick
- Learn quick
- Develop quick
- Run quick
Complete books on the first two items can be purchased at the Troubleshooters.Com bookstore. Quick
development, and I mean days, not weeks, is achievable using Unix pipes, tried
and true Unix commands such as grep and sed, and coding
in a quick developing language like Perl, using either a web interface, a
TK interface, or an old fashioned CLI interface. Quick runtime performance
is gained by smart use of piping, smart use of in-memory processing, and data
design compatible with the expected input and desired output.
I'm interested in your input, especially any experiences you have with ultra-quick
coding and admin for small businesses. Email me with any insights.
Letters to the Editor
All letters become the property of the publisher (Steve Litt), and may
be edited for clarity or brevity. We especially welcome additions, clarifications,
corrections or flames from vendors whose products have been reviewed in this
magazine. We reserve the right to not publish letters we deem in bad
taste (bad language, obscenity, hate, lewd, violence, etc.).
Submit letters to the editor to Steve Litt's email address, and be sure
the subject reads "Letter to the Editor". We regret that we cannot return
your letter, so please make a copy of it for future reference.
How to Submit an Article
We anticipate two to five articles per issue, with issues coming out monthly.
We look for articles that pertain to GNU/Linux or open source. This can
be done as an essay, with humor, with a case study, or some other literary
device. A Troubleshooting poem would be nice. Submissions may mention a specific
product, but must be useful without the purchase of that product. Content
must greatly overpower advertising. Submissions should be between 250 and
2000 words long.
Any article submitted to Linux Productivity Magazine must be licensed
with the Open Publication License, which you can view at http://opencontent.org/openpub/.
You may elect the option to prohibit substantive modifications.
However, in order to publish your article in Linux Productivity Magazine,
you must decline the option to prohibit commercial use, because Linux Productivity
Magazine is a commercial publication.
Obviously, you must be the copyright holder and must be legally able to
so license the article. We do not currently pay for articles.
Troubleshooters.Com reserves the right to edit any submission for clarity
or brevity, within the scope of the Open Publication License. If you elect
to prohibit substantive modifications, we may elect to place editor's notes
outside of your material, or reject the submission, or send it back for modification.
Any published article will include a two sentence description of the author,
a hypertext link to his or her email, and a phone number if desired. Upon
request, we will include a hypertext link, at the end of the magazine issue,
to the author's website, providing that website meets the Troubleshooters.Com
criteria for links and that the author's
website first links to Troubleshooters.Com. Authors: please understand we
can't place hyperlinks inside articles. If we did, only the first article
would be read, and we can't place every article first.
Submissions should be emailed to Steve Litt's email address, with subject
line Article Submission. The first paragraph of your message should read
as follows (unless other arrangements are previously made in writing):
Copyright (c) 2003 by <your name>. This material
may be distributed only subject to the terms and conditions set forth in
the Open Publication License, version Draft v1.0, 8 June 1999 (Available
at http://www.troubleshooters.com/openpub04.txt/ (wordwrapped for readability
at http://www.troubleshooters.com/openpub04_wrapped.txt). The latest version
is presently available at http://www.opencontent.org/openpub/).
Open Publication License Option A [ is | is not] elected,
so this document [may | may not] be modified. Option B is not elected, so
this material may be published for commercial purposes.
After that paragraph, write the title, text of the article, and a two
sentence description of the author.
Why not Draft v1.0, 8 June 1999 OR LATER
The Open Publication License recommends using the word "or later" to describe
the version of the license. That is unacceptable for Troubleshooting Professional
Magazine because we do not know the provisions of that newer version, so
it makes no sense to commit to it. We all hope later versions will be better,
but there's always a chance that leadership will change. We cannot take the
chance that the disclaimer of warranty will be dropped in a later version.
All trademarks are the property of their respective owners. Troubleshooters.Com(R)
is a registered trademark of Steve Litt.
URLs Mentioned in this Issue