Is this the end of Silicon?

Computing advanced from vacuum tubes to transistors thanks to Bell Labs. John Bardeen and Walter Brattain invented the transistor in 1947, and Bell Labs announced it in 1948. In 1954 Morris Tanenbaum created the first silicon transistor at Bell Labs, but Texas Instruments' engineers beat them to building and marketing the first commercial devices. Early commercial applications of these devices were in defense, construction, and science.


In 1971 Intel used its Silicon Gate Technology to deliver the first monolithic CPU integrated on a single chip: the Intel 4004, sold as part of a four-chip family. This was one of many advancements owed in part to silicon. Eventually silicon became ubiquitous in technology, ultimately lending its name to the region around San Jose: Silicon Valley.


Today the same company that shipped that first single-chip CPU has announced a new direction. Intel plans to move away from silicon at 7nm, and the new material of choice is indium gallium arsenide, a III-V semiconductor Intel looks to use in its MOSFETs. While that is years away, with 14nm processors being released soon and 10nm expected in 2016/2017, it highlights the inevitable transformation of today's technology into the history books.


The exponential growth in transistor counts that pushes CPUs toward new materials may be a niche case. Silicon will remain in circuit boards, disk controllers, motherboards, GPUs, and so on. This is simply a moment to reflect on how far technology has come in a single lifetime.



Robert Illing is a Field Solution Executive focused on helping organizations modernize their Data Centers.

Twitter: @robertilling · LinkedIn