VA3KIS > TECH 14.01.01 11:18l 138 Lines 8527 Bytes #-9177 (0) @ WW
BID : 32282_VE3YRA
Read: GUEST OE7FMI
Subj: SILICON'S LAST STAND.
Path: DB0ZKA<DB0GPP<DB0OFI<DB0PRT<DB0LX<DB0CZ<DB0GE<LX0PAC<LX0HST<HA3PG<
WB0TAX<VE3FJB<VE3YRA
Sent: 010114/0844Z @:VE3YRA.#SCON.ON.CAN.NOAM #:32282 [Aurora] FBB7.00g25
From: VA3KIS@VE3YRA.#SCON.ON.CAN.NOAM
To : TECH@WW
"The crystalline wafer has been at the heart of the processor for more than
30 years, but silicon's days are numbered. What will replace it?"
It was an inevitable declaration, a revision that fundamental physics was going
to extort from Gordon Moore sooner or later.
Moore, who laid out that remarkably accurate roadmap 35 years ago, concedes the
point.
"Chip doubling will go from every couple of years to five years." He made the
initial postulation in 1965. At the time, Moore worked at Fairchild
Semiconductor, one of Silicon Valley's first startup companies. This was three
years before he co-founded Intel with the late Bob Noyce.
He extrapolated that we would go from 60 components on a chip to 60,000 over
10 years. The initial law, named by Carver Mead, a professor at the California
Institute of Technology, predicted a doubling every year. But 10 years later,
Moore revised the timeline to every two years.
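Moore's extrapolation is simple compound doubling. A minimal sketch, using only
the figures quoted above, shows how 60 components become roughly 60,000 in a
decade at one doubling per year, and how the revised two-year period slows the
curve:

```python
# Moore's 1965 extrapolation: component count doubles every `doubling_period` years.
def components(start, years, doubling_period=1):
    """Components on a chip after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

print(components(60, 10))     # yearly doubling: 60 * 2**10 = 61440, roughly 60,000
print(components(60, 10, 2))  # revised two-year doubling: 60 * 2**5 = 1920
```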
The current revision is also necessary. Experts in the semiconductor industry
believe there are only about 10 to 15 years left before chipmakers run out of
ways to pack more transistors on a chip, the technique that makes chips faster.
You can imagine the difficulty if you try to print a line (on a silicon wafer)
that is smaller than the space between the atoms in silicon. Physics gets in
the way after a while.
The 42 million transistors on the new Pentium 4 chip are each 0.18 microns, or
180 nanometres, wide. A human hair is 100 microns wide. It will be possible to
shrink line widths to 0.025 microns, or 25 nanometres, by 2011.
More transistors mean faster chips. By then, a billion transistors on a chip
should be possible. That should produce chips that run at over 10 GHz.
Maybe you're going to get 12, 15, 20 GHz, and it could go even smaller. There
is proof of concept in the 20 nanometre range.
Experts know the numbers they use are speculative. High pointed to a 1989
research paper that looked ahead at what microprocessors might be like today.
At that time, conventional processors worked at 25 to 40 MHz. The prediction
was that Intel would reach 250 MHz by 2000, but here we are at the threshold
of 2 GHz, eight times faster than the vision of 11 years ago.
But if scientists lose miniaturization as a tool to boost chip speeds,
chipmakers will push the technology as far as they can by shrinking, then
start building bigger chips.
In a quest for more transistors, chipmakers could also start sandwiching chips
together, placing two chips face-to-face or stacking multiple chips. They could
also try to move data around the chip faster. Intel has been experimenting with
ways of pulsing data across chips using supercircuits made of industrial
diamonds.
As chips get smaller, the cost to build them rises. A new, more expensive
factory has to be built each time chip architecture changes.
There's a lot to be done if Moore's law is going to stay on course over the
next five years.
Line widths need to shrink to 100 nanometres by 2005. Chip engineers still face
some difficult problems in getting there.
As transistors get smaller, they need a high concentration of chemical
impurities called dopants added to the wafer to help hold electrical charge.
But at high concentrations, dopants clump together and become electrically
inactive. That problem has yet to be solved.
Fluctuations in dopant concentration are also a factor. Chips with larger
transistors are unaffected, but as transistors get smaller they become more
susceptible and could behave unpredictably.
And then there are "gates," the tiny barriers one or two nanometres wide that
control the flow of electrons in a transistor. A gate determines whether a chip
counts a 1 or a 0. An open gate lets an electron through to count as zero; a
closed gate blocks it to count as one. This is the basis of binary math, the
engine of all computer calculations.
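The gate-as-switch idea can be sketched in software. This is an illustration of
the binary math the article describes, not how a real chip is programmed:
treating closed as 1 and open as 0, the basic gate operations compose into
arithmetic such as addition:

```python
# Illustration: binary math built from gate-level logic operations.
# Closed gate counts as 1, open gate as 0, following the article's convention.
def half_adder(a, b):
    """Add two one-bit values using only XOR and AND, as logic gates would."""
    return a ^ b, a & b        # (sum bit, carry bit)

def add_bits(x, y):
    """Ripple-carry addition of two integers, one gate operation at a time."""
    while y:
        x, y = x ^ y, (x & y) << 1   # sum without carry; carry shifted left
    return x

print(add_bits(0b1010, 0b0110))  # 10 + 6 = 16
```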
But electrons occasionally go rogue. They bolt through a closed gate and appear
on the other side, a quantum physics oddity called "tunneling" that causes
errors in calculations.
Although there are no known solutions, chip experts still believe these
problems can be solved.
New manufacturing techniques will drive the next big breakthrough in chip
miniaturization.
The current method, optical lithography, uses ultraviolet light to record the
image of a circuit on silicon. Intel intends to replace that with extreme
ultraviolet lithography. The first working extreme ultraviolet tool will be
available next April.
By 2005, commercial machines using this technology will build chips that will
allow 0.07 micron (and smaller) chip technology.
Since their birth, microprocessors have been built around a clock. With each
tick, the entire state of the chip changes.
The clock speed of a microprocessor determines how many instructions per second
it can execute. A 1 GHz clock speed represents a billion cycles per second.
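As a rough sanity check on those figures, and assuming a simplistic one
instruction per cycle (real chips vary widely), clock speed translates to
instruction throughput like this:

```python
# Rough clock-speed arithmetic; assumes 1 instruction per cycle for illustration.
GHZ = 1_000_000_000            # cycles per second in one gigahertz

def instructions_per_second(clock_ghz, instructions_per_cycle=1):
    return clock_ghz * GHZ * instructions_per_cycle

print(instructions_per_second(1))    # 1 GHz -> 1,000,000,000 per second
print(instructions_per_second(10))   # the 10 GHz chips projected above
```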
Instead of individual binary logic, we will start looking at logic that's not
binary, like neural networks. A neural network is a type of artificial
intelligence that imitates the brain.
If you could make a device that looks more like an animal's brain, it would
work better in silicon than in carbon.
Instead of counting 1s and 0s, a neural network is a series of interconnected
processing elements, the computer equivalent of a brain's neurons and synapses.
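A minimal sketch of that idea: a toy network of processing elements whose
behaviour comes from weighted connections rather than a stored program. The
weights and threshold here are arbitrary values chosen for illustration only:

```python
# Toy neural "processing element": weighted inputs, threshold activation.
def neuron(inputs, weights, threshold=0.5):
    """Fire (1) if the weighted sum of inputs reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two neurons feeding a third: computation lives in the connections,
# not in a sequence of instructions.
layer1 = [neuron([1, 0], [0.6, 0.4]), neuron([1, 0], [0.2, 0.9])]
output = neuron(layer1, [0.7, 0.7])
print(output)
```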
"Biological-based computers, the next generation of computers?"
Using neurons to perform computing is exciting because, unlike silicon-based
chips, an upgrade to the next level of processor isn't necessary to get more
speed. A biological computer does its computations by making more connections
and adding more neurons.
But while scientists see evidence that biological computers work, they still
haven't found a way to program them.
While fascinating strides are being made with living cells, some researchers
are going even smaller, into the field of molecular computing.
Current computers use switches etched in silicon, but future computers might
use molecules, clusters of atoms. That would mean molecular electronics, or
moletronics, could replace the transistors, diodes and conductors in
conventional microelectronic circuitry.
Researchers have developed a one-molecule on-off switch that works at room
temperature. Strings of molecules would be assembled into simple logic gates
that function like today's silicon transistors.
The size advantage means a molecular computer would consume very little power.
It also has the potential for vastly greater computing power, though be
cautioned: it's a field in the initial stages of development. Estimates of when
the technology could be commercialized are pure speculation.
Plastic circuits aren't designed to replace silicon microprocessors, but they
could provide new display technologies, control electronic paper and work
alongside silicon microprocessors.
DNA computing is the most manageable form of molecular computing that we know
of.
Then there's quantum computing, where the sub-atomic world is used to do basic
math using bizarre and often counterintuitive principles of quantum physics.
A conventional computer does binary math using switches that are either on or
off. Quantum computing uses switches that are not only on or off, but also on
and off at the same time, and every state in between. This doesn't seem
possible in the everyday world, but it's real in quantum physics.
In conventional computing, a 1 or 0 is called a bit. In quantum computing, the
states are called qubits.
Because a qubit is not just a 1 or a 0 but can be both, as well as all states
in between, it becomes an enormously powerful way to do parallel computing:
many parts of a computer all work at once on a problem instead of waiting for
each other to finish before proceeding.
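A classical simulation can illustrate what "on and off at the same time" means.
A qubit's state is a pair of amplitudes, measurement yields 0 or 1 with
probabilities given by their squares, and n qubits span 2**n basis states at
once. This is a sketch of the arithmetic, not of real quantum hardware:

```python
import math

# A qubit state as two amplitudes: alpha for |0>, beta for |1>.
# Measurement probabilities are the squared amplitudes and must sum to 1.
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)   # equal superposition

p0, p1 = alpha ** 2, beta ** 2
assert abs(p0 + p1 - 1) < 1e-9                     # normalization check

# n qubits carry 2**n amplitudes simultaneously: the source of the
# massive parallelism described above.
n = 10
print(2 ** n)   # 1024 simultaneous basis states from just 10 qubits
```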
Research in quantum computing is in its infancy.
But quantum computers wouldn't necessarily replace silicon-based computers. If
the technology ever gets to the point of being useful, it could co-exist
peacefully with conventional computers, or serve as a special-purpose computer
that feeds data into a conventional computer.
Cryptology is an exciting and frightening application of quantum computing. A
quantum computer would have the power to break almost any code. In the wrong
hands, that would put military secrets at risk.
These are still theoretical technologies, but they will eventually make their
way into the computers of the future.
"Silicon is a technology that is cumulatively a $100 billion industry, and to
believe that something will come in and, in one leap, get ahead of that, I find
pretty hard to swallow."