It’s really interesting how far IT services have come in the last 50 years as computer systems have become more and more prevalent in our business society. Things have obviously changed dramatically since the early mainframe days of the 60s, and since many of you were not around then to experience the changes, this article will give you a look back in time to see the state of IT services then and now.
In the 60s, IBM pretty much ruled the world with their mainframe computers. The early ones didn’t even have access through any type of terminals… they were all batch job oriented with all processing scheduled through the operator console. Competitors for IBM were Sperry-Univac, Burroughs, and Honeywell. Notice how many of those names are still around? It’s tough to maintain your edge in the computer business over decades of time.
As far as outsourced IT services were concerned, there were “service bureaus”. These were firms that owned large computer systems and would run the processing jobs for their clients on their systems and send the output back to the customer in the form of large green-bar paper printouts. This is the level of technology that we used to put men on the moon! Wow.
In the 70s, IBM and the others added cathode ray tube (CRT) terminal access to their capabilities list, and this ushered in the next phase of computing. These CRTs were monochrome “green screen” character-oriented devices that could display simple text with no color or graphics, but it was far better than operator-console-only job input.
The CRT era also led to another phase in computing, IBM’s Time Sharing Option (TSO). What this meant was that mainframes and service bureaus could provide shared time to customers for a fee as an IT service. When remotely connected, customers would use very basic modems to convert data to analog, send it over phone lines, and then reverse the process on the receiving end. The first modems ran at 300 baud, or 300 bits per second. Since the terminals were, in effect, electronic teletype machines, this data rate was not as bad as it would seem. Again, WAY better than console job submission.
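To put that 300 baud figure in perspective, here’s a quick back-of-envelope sketch. The 8N1 serial framing (one start bit, eight data bits, one stop bit) and the 80×24 screen size are my assumptions, not figures from the article:

```python
# Back-of-envelope throughput at 300 baud, assuming 8N1 framing:
# each character costs 10 bits (1 start + 8 data + 1 stop).
baud = 300
bits_per_char = 10
chars_per_sec = baud // bits_per_char   # 30 characters per second

# Assumed full "green screen" of text: 80 columns x 24 rows.
screen_chars = 80 * 24
seconds_per_screen = screen_chars / chars_per_sec

print(chars_per_sec)        # 30 characters/second
print(seconds_per_screen)   # 64.0 seconds to fill a whole screen
```

A full minute per screen sounds glacial today, but for a teletype-style terminal printing a line or two at a time, 30 characters per second kept up with the conversation just fine.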
In the 80s, alternative computer systems started to come on the scene in the form of minicomputers. These were machines like Digital Equipment’s (DEC) VAX, Wang’s VS2200, IBM’s System/34, and Burroughs’ B1900. These were complete, floor-standing systems that were oriented to CRT input as the primary mode of operation, and were not as powerful, or as expensive, as their mainframe counterparts. These systems were very fast (for their time), and really started the interactive paradigm shift in computing. Since they cost so much less than mainframes, more and more companies could afford to have their own internal system, and this became the norm for computing for the next 25 years or so. As a result, the traditional service bureau started to decline.
The 80s also saw the 1981 release of the machine that made micro-computers mainstream, the IBM Personal Computer (PC), and we all know where that went. It is really comical to look back on the original PC and its capabilities. Check this out… The original PC had an Intel 8088 CPU running at 4.77 MHz, 16KB of RAM, and two 160KB diskette drives. Wow. Compare that to today’s technology, with multi-core CPUs measured in the GHz range, 16GB of RAM, and 3TB disk drives. There are also thumb drives at 256GB. That is thousands to millions of times more power in each category.
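For fun, those then-vs-now ratios are easy to check with a few lines of arithmetic. The “modern” side uses the rough figures from the paragraph above (a 3 GHz core, 16GB of RAM, a 3TB drive), which are illustrative assumptions, not a specific machine:

```python
# Rough then-vs-now ratios for the original IBM PC versus a
# typical modern machine (modern specs assumed, not measured).
KB, GB, TB = 1024, 1024**3, 1024**4

clock_ratio = 3000 / 4.77            # 4.77 MHz vs ~3000 MHz per core
ram_ratio   = (16 * GB) / (16 * KB)  # 16 KB vs 16 GB
disk_ratio  = (3 * TB) / (320 * KB)  # two 160 KB diskettes vs a 3 TB drive

print(round(clock_ratio))   # ~629x the clock speed, per core alone
print(int(ram_ratio))       # 1,048,576x the RAM
print(round(disk_ratio))    # ~10,066,330x the storage
```

So “thousands of times more” actually undersells it: RAM is up a million-fold, and storage roughly ten million-fold.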
Here’s another tidbit for you. Digital Research was the original “Microsoft” in that they made the first good 8-bit OS for the early micros. It was called CP/M (Control Program for Microcomputers). When the 16-bit IBM PC came along, the OS choice was HOTLY contested between Microsoft’s Disk Operating System (MS-DOS—the forerunner to Windows) and CP/M-86, the 16-bit version of CP/M. IBM chose MS-DOS, and the rest is history. I’ll bet you’ve never even heard of Digital Research, have you? The same can be said of Zilog. The Zilog Z-80 was the CPU chip of choice for the early 8-bit micros, but when Intel’s 8088 debuted in the IBM PC, Intel’s success was guaranteed.
In the 90s and 2000s, the PC took off, the internet exploded, email became commonplace, smartphones and tablets arrived, and most importantly, the Local Area Network (LAN) became popular. In the beginning, there was more than just Ethernet (invented by Xerox, by the way). There was also Token Ring (from IBM) and ARCnet (from a minicomputer maker called Datapoint). Well, ARCnet was very limited, and Token Ring, in spite of the fact that it had the IBM name on it, was simply too expensive and complex, so the world standardized on Ethernet.
Then the internet took hold. It was based on the TCP/IP protocol suite, which ran naturally over Ethernet, and the two blossomed gloriously together. Add to this the fact that very high-speed bandwidth became available at very low prices, and you have a recipe for another paradigm shift.
All of these advancements have caused the industry to come full circle. It has gotten to the point where more and more tasks that were very complicated 10 years ago are easy now, thanks to powerful tools. A perfect example is the various website-creation programs that exist. If you saw all the HTML code that these easy-to-use programs generate behind the scenes, you would be shocked.
As a result of this, and the pressure the economy is exerting on companies to cut costs, there is a renewed interest in outsourcing some or all IT services to the modern-day service bureau: firms like Rackspace, Amazon, and the like. This is especially true for smaller businesses, since they can completely avoid buying and maintaining expensive servers and software, along with employing the key personnel to operate them. IT services are not yet a commodity like electricity and telephone, but they are getting closer and closer to that level, and for all these reasons, outsourced IT services will become the norm, especially for the smaller business, as time progresses.