On UC Berkeley’s How Much Information?, http://www2.sims.berkeley.edu/research/projects/how-much-info-2003/
While the study itself is fascinating (I’m honestly a little surprised to see a serious attempt at quantifying annual information creation), I would love to see a comparative analysis across multiple years, up to the present, especially if the number of files (for instance, for p2p sharing) were tracked as well.

The reason I’m interested in the number of files in addition to the size of the data is that, despite the qualification about compression near the end of the Executive Summary (http://www2.sims.berkeley.edu/research/projects/how-much-info-2003/execsum.htm), I expect the quality of video and audio created and transferred over the internet to increase substantially as (1) download and upload speeds rise with widespread high-speed connectivity, and (2) more outlets appear for higher-quality digitized products (I’m thinking here of iTunes offering prorated upgrades to higher-bitrate versions of songs already purchased, or of HD episodes of television programs sold through Amazon). As quality increases and transmitting larger and larger files becomes easier, capturing the number of discrete units of information, not just the overall volume, will matter, especially for our purposes of organizing and facilitating access to all of this information.
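To make the point concrete, here is a minimal sketch (in Python, with purely hypothetical figures, nothing drawn from the study) of how tracking file counts alongside total volume separates "more files" from "bigger files":

```python
# Hypothetical figures only -- not from the Berkeley study -- to illustrate why
# counting files matters: the same growth in total bytes can mean either
# "more files" or "bigger (higher-quality) files".
yearly_stats = {
    2000: {"files": 1_000_000, "terabytes": 50},
    2003: {"files": 1_100_000, "terabytes": 120},
}

for year, stats in sorted(yearly_stats.items()):
    # 1 TB = 1,000,000 MB; average size per file reveals the quality/bitrate trend
    avg_mb_per_file = stats["terabytes"] * 1_000_000 / stats["files"]
    print(f"{year}: {stats['files']:,} files, {stats['terabytes']} TB total, "
          f"~{avg_mb_per_file:.0f} MB per file")
```

If the file count barely moves while the total volume more than doubles, the growth is in per-file size (higher-bitrate audio and video), which is exactly the distinction that a single volume figure, compression-adjusted or not, cannot reveal on its own.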