I never do end-of-the-year things. For me, the end of the year is just another day. But as I read stories about the end of a decade and what has changed over the past 10 years, I began to think about the profound changes that have occurred in my own lifetime. I’ve witnessed the rapid changes brought about by the invention of the integrated circuit and, more importantly, the microprocessor. So I’m going to go back farther in time and describe what I have personally experienced in the world of computers.
The end of the 1970’s
I was a teenager during the last years of the 70’s. I had an interest in electronics by my Junior year of high school, and it was fortunate that my small rural school offered an electronics class. In fact, it was a two-year course that I took in my Junior and Senior years. The class covered the basics, but I was already deep into the subject, having read dozens of books on digital logic and basic electronics in the couple of years before I took the class.
By the time I was into my Senior year, I had already built a small computer using an 8085 CPU with 4k of memory. Its only way to communicate was through a teletype, a huge, clunky contraption that an electronics engineer who lived across the street had donated to me. I didn’t get very far with that computer because I couldn’t make the program in the EPROM work right. But I kept the machine and revisited it in the late 80’s, when I built my own EPROM programmer and connected the computer to a Macintosh instead of the teletype.
By the end of the 70’s, the Apple II and Commodore PET had been around for a couple of years. In 1979, Intel ran an ad in Byte magazine for a new microprocessor called the 8086. For five bucks, you could buy the technical manual and they would give you a CPU for free! I still have the book and the CPU.
The 80’s
The IBM PC came out in 1981, the year I graduated from high school. That machine was expensive. I joined the Navy in 1982 and ended up stationed at Pearl Harbor in 1983. I had trained to repair the UYK-20, a 16-bit general-purpose military minicomputer, and I was later sent to school to learn how to repair the SNAP-II computer (used for ship-board inventory tracking). This was about 1984, which is when the Macintosh showed up in stores. The Macintosh was expensive, but I had no bills and I saved my money to buy one. The machine was somewhat portable at a time when laptops as we know them didn’t exist. I bought the 128k version of the machine and then bought Microsoft Basic for it. The OS plus Microsoft Basic chewed up almost all of the 128k that was available, leaving only 10k for programs. I ran into that road-block pretty quickly and had to upgrade to the Fat Mac: to upgrade the machine, they would remove the old motherboard and put in a new one that had 512k of memory.
I received my honorable discharge from the Navy in 1988 and went back to college. By that time, I had discovered that I liked programming more than working with electronics. In the Navy I was an electronics technician, and I could see that the trend in electronics was toward high-density integrated circuits. The end result would be hardware that was impossible to repair. We are at the point now where resistors are nothing more than tiny cubes surface-mounted to a motherboard. Nobody pulls out a soldering iron and replaces such a device.
I started a degree in computer science at the University of Michigan. My years at UofM spanned 1988 to 1994, a stretch that saw a lot of changes in the computer industry. When I started college, I bought a low-end PC that contained a 386sx processor. It didn’t have a math co-processor, which was common for low-end machines at the time, and it had a 16-bit external data bus instead of the 386’s full 32-bit bus. The 386 had been out for a while (launched in 1985) and the DX2 processor craze was about to start.
The 90’s
What was the DX2 craze? It started with the release of the Intel 486 processor in 1989. CPU speeds were increasing rapidly by that time, and once they hit 33MHz, the DX2s came out. These were CPUs clocked at double the speed of the external data bus. The bus stayed at 33MHz because memory technology couldn’t keep up with the leaps in speed of the CPU itself; an internal cache kept the CPU fed with data so it could chug along at full speed (at least, that’s how it worked most of the time). The most common DX2 processor was the DX2-66, a 66MHz CPU, and it felt blazing fast. This was about 1993.
The early 90’s also saw the release of Mosaic and the World Wide Web. The University of Michigan had Internet access in the 80’s and 90’s, but you had to use tools like FTP and Telnet to do anything. There were a few Gopher servers out there that offered something similar to what the Web does today. Computer Science students mostly used FTP to get source code files, and maybe Usenet news for technology news. Email was the primary person-to-person communication method. The browser seemed magical.
Computer languages used at the university at the time were mostly C, C++ and Pascal. I used C to write an interactive web program. It was difficult to do because the program had to parse the raw text of the incoming request and spit out HTML text for the browser. Now there are languages specifically designed to manipulate HTML (and other web protocols).
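To give a flavor of what that looked like, here is a minimal CGI-style sketch in C. It’s a made-up example rather than my original program, but it shows the pattern: the web server hands you the query string as raw text in an environment variable, you pick it apart by hand, and you print every byte of the HTTP header and HTML response yourself.

```c
/* Hypothetical CGI-style "hello" program, illustrating the
 * parse-the-request, print-the-HTML pattern described above. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* The web server passes the query string (e.g. "name=world&x=1")
     * through the QUERY_STRING environment variable. */
    const char *query = getenv("QUERY_STRING");
    char name[64] = "stranger";

    if (query != NULL) {
        const char *p = strstr(query, "name=");
        if (p != NULL) {
            p += 5;                      /* skip past "name=" */
            size_t i = 0;
            while (*p != '\0' && *p != '&' && i < sizeof(name) - 1)
                name[i++] = *p++;        /* copy up to the next '&' */
            name[i] = '\0';
        }
    }

    /* Every byte of the response, headers and markup alike, is
     * printed by hand. */
    printf("Content-Type: text/html\r\n\r\n");
    printf("<html><body><h1>Hello, %s!</h1></body></html>\n", name);
    return 0;
}
```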
One of the early changes to the World Wide Web that I remember was when companies started to buy domain names to match their company names: names like www.ibm.com, www.apple.com, etc. This turned into a craze at the end of the 90’s, when every business was trying to get a web page up. At that time most web pages were static and contained only basic company and sales information. Search engines were becoming common, mostly Yahoo and Google at the time. I used Google because it had a clean webpage and loaded fast over my modem connection to the Internet.
The 2000’s
Ah, the turn of the century. This is where my Web career really got started. I worked for a small company that wanted to “webanize” their software. Yeah, that’s how they thought about it. They had a FoxPro database that they thought their software engineers could just slap a website onto and roll it out by next week! It took a lot of convincing to talk them down from that fantastic goal. I’m sure there was a programmer out there at the time who would have attempted such a ridiculous feat. I’m just glad we ended up building their site right.
This is where I got into PHP and Linux, as well as Oracle. That was a pretty solid combination of technologies for the time. When my team evaluated which languages and technologies to use, the company already had Microsoft SQL Server running on Windows NT version 3. It was a flaky setup that crashed once a day, and that was with only an internal Windows program using it. If we had exposed it to the Internet, it probably would have crashed hourly. Fortunately, we chose Oracle and wrote a new website from the ground up in PHP. The other web language we considered was Microsoft ASP, which is now referred to as Classic ASP. Our team was not too keen on Basic-based languages, which is what drove us to PHP.
By 2006, our company had shrunk and our software staff was down to two developers and an IT person. We actually went through a few IT people before we realized that it was going to be almost impossible to keep a Unix guy on staff full-time. A fortunate tax opportunity gave the company a large influx of cash, and I was asked what we could do with it to improve the system. I did some research and decided that we could switch to a Microsoft environment. We tested Windows Server 2003 with SQL Server, and it ran for two weeks without crashing under a test program that constantly hit it. That made me feel much more confident about Microsoft’s products. We also had to switch from Borland’s C++ Builder, which we had used for years, to Visual Studio and C#. Borland had been bought by another company and Builder 2006 was full of bugs that made the product unusable. So we ported our PHP system to C#, and we did it on the cheap. It was ugly and took some elbow grease to clean up the bugs, but the end product was much better.
PC hardware advanced fast during the 2000-2010 decade. CPU speeds went from 200MHz (which I had by the late 90’s with my Pentium Pro) all the way to 4GHz. It felt like the Pentium II and III went from 300MHz to over 1GHz (eventually 1,400MHz) in no time. I skipped the II and III and upgraded my Pentium Pro to a Pentium 4 processor (running just over 1GHz). That lasted me until the Core 2 Duo came out, a 4GHz CPU that lasted almost 6 years before my next upgrade.
By 2008, MVC and APIs had been introduced but were not widely adopted yet. Extreme programming had been around since the late 90’s, but it had not been widely adopted either. Unit testing was around, but breaking dependencies was a very difficult task, largely because IoC containers had not yet caught on. Service-Oriented Architecture was the big buzzword, along with “The Cloud.” At the time, it was still more expensive to switch to cloud servers than to rent a rack at a data center and provide your own hardware.
2010 to Now
In this past decade I’ve seen the introduction of the IoC container, wider adoption of cloud computing, and systems built around APIs. The PC hasn’t advanced as rapidly as I had grown accustomed to. I still own one of the fastest drives available (the Samsung 970 Pro series M.2 NVMe), which I purchased in 2014. The number of cores in CPUs is increasing, but the clock speed remains about the same. This is convenient for engineers like me, who use Docker and develop applications around APIs. For the average user, the extra cores are probably not translating into performance. I’m currently running a 4-core machine at 4GHz.
In the past few years Artificial Intelligence has started to have an impact. Most of this is due to progress in natural language processing and intelligent search algorithms.
I started my blog during this decade (2012 to be exact). My original blog was on BlogSpot, but I moved it to my own host because WordPress gives me better control over the look and feel. I originally started the blog to record software information that I had learned and knew I would need in the future. I still use my own blog as a reference for subjects I know I’ve recorded.
I traded in my flip-phone for my first iPhone (the iPhone 3). I bought my first iPad (the first model) when it came out. I wear those things out. They are primarily my library of books and my web-surfing devices. I’ve come to the point where a five-minute Internet outage is a crisis. I don’t have cable anymore and stream everything.
What Now?
If progress continues at the current rate, I see a future where A.I. really makes a large impact. Multi-processor systems will be useful for such applications, and sales of PCs with multiple processors will take off if A.I. produces something genuinely useful. So far, all we have are things like Alexa and Siri (and others). If A.I. advances to the point where we can converse with our computers and have them perform work for us, then people will want more powerful machines. Right now, I think gaming is probably the largest industry pushing technology.
Robots are starting to become useful. I suspect that before intelligent robots become a household appliance, there will be on-line intelligent programs. I don’t think A.I. will become intelligent enough to replace knowledge workers in the next decade, but I suspect there are niche knowledge areas where it will.
I think self-driving cars will be more commonplace by the end of the decade. It will probably be another 20 years or more before all cars are self-driving, but I suspect that over half will be by 2030. I wouldn’t mind owning one; I could get some extra sleep while my car drives me to my destination. The technology hasn’t reached that point yet, but it’s only a matter of time.
I think Internet speeds of over 1 gigabit to the door will be common. Satellite Internet service will become the new cell tower, and cell data bandwidths will get crazy. The new 5G systems should be adopted over the next 10 years. Mobile devices will become more useful when all of this happens.
On the downside, I suspect that software will become much more complicated to use. Poorly designed and buggy software is becoming more common as computers get faster and more powerful. There is such a rush to get the next product out the door that companies don’t spend enough time fixing bugs, and that is a problem. Government systems are behind in technology as well. That’s a bummer, because government systems are the most difficult to navigate. The lack of standards in data sharing is a problem too. It’s still too difficult to do your own taxes when complex situations occur (new jobs, moving to a new location, investments, etc.). A person should be able to log into one of these systems, give permission to pull in all of their records, and have the system figure out all the results at once. I suspect there will be little progress in this area. There seems to be no incentive to produce a quality government website, and I see the same trend in many company websites.
In addition to all that I have mentioned, I suspect there will be a technological advance that catches us all by surprise. More powerful computer systems will translate into something we can’t imagine yet. Will it be for entertainment? Information sharing? Will it give more power to governments? Will it improve our lives? Who knows. In the 1970’s, memory was expensive and I dreamed about someday owning a computer with 64 kilobytes. Now I know that bigger things are coming, and they’re going to be cheap. I’m just glad I’m in the industry. If I had skipped college and continued to work as an electronics technician, I would have watched the industry from the sidelines instead of being neck-deep in it. It’s been a fun ride, and I suspect the ride is just getting started.