Aardvark Daily

New Zealand's longest-running online daily news and commentary publication, now in its 25th year. The opinion pieces presented here are not purported to be fact but reasonable effort is made to ensure accuracy.

Content copyright © 1995 - 2019 to Bruce Simpson (aka Aardvark), the logo was kindly created for Aardvark Daily by the folks at aardvark.co.uk




Poor old Intel

23 June 2020

Intel were right there at the start of the microprocessor revolution.

The Intel 4004 was one of the first (if not "the" first) microprocessor chips, and the concept of a programmable logic device changed the shape of electronics forever.

As a much younger man, I used to make my living by designing, building and repairing bespoke electronic systems. This was not only great fun but also earned me quite a healthy income. Whenever someone wanted something that was not available "off the shelf" they'd contact me and I'd knock up a unique solution to their problem, using readily available components.

The problem with this approach was that every solution had to be built from the ground up. There was little scope for creating standard building blocks and simply gluing them together to create a solution. A full-duplex 2-way radio system had nothing in common with an automatic IR-operated fan unit designed to flush a room of waste gases whenever someone entered a toilet cubicle, for instance.

This made these systems quite expensive and often involved a great deal of work, crafting custom printed circuit boards, enclosures and such.

Likewise, much of the commercial and industrial electronics I was maintaining for various companies was totally unique to the particular application. The task of sitting down with a pile of schematics and working out what was going on each time a fault occurred was quite an overhead.

But then the microprocessor age arrived, and everything changed.

Many of my bespoke systems soon consisted of a standard processor board plus interface circuitry to "the real world".

Instead of having to constantly design new hardware, a great many of the systems I built shared a common central core.

Much of the "hard work" could now be done with code rather than by cobbling together a unique architecture of resistors, diodes, transistors, capacitors and integrated circuits.

The same board that controlled a small press could also be reprogrammed to operate the sorting gates on an assembly line.

I found myself transitioning from being a hardware engineer to being a software engineer and the shift was far more rapid than I could ever have imagined. In fact, within a year or so, I was doing very little bespoke hardware design and most of my hours were spent behind a keyboard, simply configuring the modules I'd designed for the various roles that were required of them.

But I digress... this is a column about Intel.

Back in the late 1970s, Intel was one of the leading microprocessor manufacturers. Sure, there were others such as Motorola, National Semiconductor, Philips and RCA, but Intel had a very good share of the market with its 8080, a part that featured in many of the early "home computer" designs.

Then Zilog came out with the Z80, which was pretty much an 8080 on steroids. It had numerous advantages over the Intel part, not the least of which was inbuilt support for dynamic memory refresh.

Back in the day, there were two types of RAM -- static and dynamic. Static memory was easy to use; all you had to do was write to it or read from it. Unfortunately, it was also very expensive and of relatively low density. Dynamic RAM, however, was smaller and much cheaper to make but had the downside that it had to be constantly refreshed or it would forget what you'd written to it. With the Intel processors this required extra external circuitry to manage the refresh, whereas the Z80 had it all onboard, thus saving manufacturers PCB space and money.

The Z80 also had a number of macro-instructions based around block-type operations and more relative-addressing modes. This probably means very little to modern programmers who only use high-level languages but back in the day, all the really good software was written in assembler so these little bonuses made life much easier for Z80 programmers.
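For anyone who never wrote assembler, here's a feel for what one of those block operations bought you. The Z80's LDIR instruction copied an entire block of memory in a single opcode; the C sketch below (purely illustrative, and the function name is my own invention) shows roughly the loop that one instruction replaced -- a loop 8080 programmers had to write out by hand:

    /* Rough C equivalent of the Z80's LDIR block-transfer instruction.
     * One Z80 opcode copied BC bytes from address HL to address DE,
     * advancing both pointers as it went. Illustrative sketch only. */
    #include <stddef.h>

    void z80_ldir(unsigned char *de, const unsigned char *hl, size_t bc)
    {
        while (bc--)           /* repeat until the byte counter reaches zero */
            *de++ = *hl++;     /* copy one byte and advance both pointers */
    }

(On the real chip, LDIR also left the BC register pair at zero as a side effect, but the copy loop is the point here.)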

The final icing on the cake was that even the lowest-spec Z80 could be clocked at 4MHz, quite a bit more than the 8080 from Intel.

Intel did come out with a successor to the 8080 in the form of the 8085, but really it was too little, too late, and the world of CP/M computers was utterly dominated by Zilog.

It wasn't until the transition to 16-bit processors that Intel regained its former glory as "the" microprocessor manufacturer. When IBM decided on the Intel 8088 for its "Personal Computer", the die was cast and Intel went from strength to strength, buoyed by the demand from all the clone-makers who also wanted 8088 processors for their machines.

And so it was for many, many years -- Intel held the crown.

There were some upstart companies who tried to compete head-on with Intel in the processor market. Cyrix and AMD both created x86-compatible processors but neither was able to knock the king off his throne.

Even the biggest hold-out of all time, Apple, eventually switched to using Intel processors, having started with the MOS 6502 in the Apple II, moved to the 68000 series in the early Macs and then settled on PowerPC chips. Today's Macs have Intel processors.

Intel's dominance was complete.

Until recently, that is.

With the launch of the Ryzen series of processors, AMD has done what some thought was impossible -- dethroned Intel.

Sales of Intel desktop and laptop processors have reportedly crumbled in the face of the Ryzen challenge. AMD's new processors are significantly more power-efficient (thanks to a 7nm process) than Intel's and they are also delivering more instructions per clock cycle. This, combined with high core counts, has made the Ryzen the preferred choice for many high-end desktop applications.

But it's not just the desktop arena where AMD is stealing market share. Their EPYC family of enterprise processors is also outgunning Intel at almost every turn.

It would appear that Intel's fortunes are under threat, right?

Ah... but let's not forget that one of the biggest computer makers in the world is still using Intel processors, right? Of course Apple will ensure that plenty of Intel CPUs continue to be sold... correct?

Um, no.

Bad news on that front as well because Apple has announced that it's dumping Intel in favour of its own ARM-based processors very shortly.

I have to say that things look pretty bleak right now for Intel in the CPU marketplace.

They are a generation behind in technology (still very much reliant on 14nm) and their claim to build the highest-performance processors is rapidly being eroded by AMD's advances.

Once again, Intel has gone from "king" to "also-ran" in just a few short years.

I can't wait to see what happens next.

Ah, the computer industry, where the only constant is "change". I love it!

