Phone charger cables and the electronics industry

A new article in English for my readers 🙂

On December 23, 1947, John Bardeen, Walter Brattain and William Shockley made history. The three researchers at AT&T’s Bell Labs in New Jersey invented modern electronics. It all started with a long series of experiments in which an electric current was applied to all kinds of minerals. That day, the signal passing through a germanium crystal came out with greater power than it went in. The transistor was born.

The base material, germanium, would later be replaced by silicon. And, ten years later, the American Jack Kilby connected several transistors together to give birth to the first integrated circuit. Europe, too, had its own device, the transistron, developed also some seventy years ago by the German physicists Herbert Mataré and Heinrich Welker, in Aulnay-sous-Bois (Seine-Saint-Denis), France.

From there, everything went very quickly in the miniaturization of these components, which were the size of a fingernail – or, as the Americans put it, of a chip – hence their nickname. The transistor first replaced the bulky vacuum tubes (which needed a few seconds to warm up before working) in old radios – which took the name “transistor” – and then in TVs, hi-fi amplifiers and so on.

Moore’s Law
But the real acceleration came from the American firm Intel, which in 1971 marketed the first microprocessor: a single chip packed with transistors and able to run a computer program.

Intel co-founder Gordon Moore formulated an empirical law stating that the number of transistors on a chip doubles every two years. This exponential growth in computing power was matched by a race toward the infinitely small. From about two thousand transistors on the first microprocessor, we have gone, for the same manufacturing price, to more than seventeen billion today, with each transistor now measuring just a few nanometers. This is the key …
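As a rough back-of-the-envelope check of that doubling rule, here is a minimal Python sketch (my own illustration, not part of the original article) that projects the transistor count forward from the roughly two thousand transistors of 1971; the exact yearly figures are only an approximation of the trend.

```python
# Moore's law as stated above: the transistor count per chip doubles
# roughly every two years. The 1971 starting figure (~2,300 transistors
# on the first microprocessor) comes from the article; everything else
# is a back-of-the-envelope projection, not exact industry data.

START_YEAR = 1971
START_TRANSISTORS = 2_300        # first microprocessor
DOUBLING_PERIOD_YEARS = 2

def transistors(year: int) -> float:
    """Estimate transistors per chip in a given year under Moore's law."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2017):
    print(f"{year}: ~{transistors(year):,.0f} transistors")

# The 2017 projection lands around 19 billion, which is consistent with
# the "more than seventeen billion" figure mentioned above.
```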