I’ve always been a follower of Claude Shannon and the incredible work he did on Communication Theory (i.e. signal/noise) while at Bell Labs. He knew enough to refrain from over-explanation, and in doing so he also invented the broader discipline of Information Theory. He coined the term ‘bit’, and was just as influential to computers and information networks as Alan Turing. He also built the first juggling robot.

I just finished reading a quite comprehensive history of this subject — The Information by James Gleick. From Amazon’s description: “We live in the information age. But every era of history has had its own information revolution: the invention of writing, the composition of dictionaries, the creation of the charts that made navigation possible, the discovery of the electronic signal, the cracking of the genetic code. In ‘The Information’ James Gleick tells the story of how human beings use, transmit and keep what they know. From African talking drums to Wikipedia, from Morse code to the ‘bit’, it is a fascinating account of the modern age’s defining idea and a brilliant exploration of how information has revolutionised our lives.”

There’s also a great (and greatly simplified) video essay below about his work by the fantastic Adam Westbrook.

April 30th 2016 marks the centenary of his birth, and there are a number of celebrations marking the event. Many of his seminal papers (including the crucial A Mathematical Theory of Communication) are available here.

Admittedly, his work sounds a little dry. However, along with that of John von Neumann and George Boole, it ushered in the digital revolution as we know it today, and will continue to influence how we think about computers well into the future.

via amazon.co.uk and delve.tv