
February 27, 2011

"One hundred trillion bits of information"

In 1949, one year after Shannon published the rules of information theory, he drew up a table of the various stores of memory that then existed. The biggest memory in his table was the US Library of Congress, which he estimated to contain one hundred trillion bits of information. That was at the time a fair guess at the sum total of recorded human knowledge. Today a memory disc drive storing that amount of information weighs a few pounds and can be bought for about a thousand dollars. Information, otherwise known as data, pours into memories of that size or larger, in government and business offices and scientific laboratories all over the world. Gleick quotes the computer scientist Jaron Lanier describing the effect of the flood: "It's as if you kneel to plant the seed of a tree and it grows so fast that it swallows your whole town before you can even rise to your feet." -- How We Know by Freeman Dyson | The New York Review of Books
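For scale, a quick back-of-the-envelope conversion (mine, not from Dyson's piece): one hundred trillion bits is roughly 12.5 terabytes, which in 2011 was plausibly about a thousand dollars' worth of disk.

    # Rough conversion sketch (assumes decimal trillions and 8 bits per byte)
    bits = 100e12                  # Shannon's 1949 estimate for the Library of Congress
    terabytes = bits / 8 / 1e12    # bits -> bytes -> terabytes
    print(terabytes)               # 12.5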

No kidding, a really great article by Dyson.

Posted by Vanderleun at February 27, 2011 8:22 AM. This is an entry on the sideblog of American Digest: Check it out.
