For those wondering if this will make a difference to the incredibly complex and expensive process of data management:
The researchers used binary code to preserve the text, images and formatting of the book. While the amount of data is roughly what a 5¼-inch floppy disk once held, the density of the bits is nearly off the charts: 5.5 petabits, or about 5.5 million gigabits, per cubic millimeter.
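To put that density in more familiar units, here is a quick back-of-envelope conversion (my own arithmetic, not figures from the article), assuming decimal prefixes:

```python
# Convert the quoted density of 5.5 petabits per cubic millimeter
# into gigabits and terabytes per cubic millimeter (decimal prefixes).
bits_per_mm3 = 5.5e15                          # 5.5 petabits

gigabits_per_mm3 = bits_per_mm3 / 1e9          # gigabits per mm^3
terabytes_per_mm3 = bits_per_mm3 / 8 / 1e12    # terabytes per mm^3

print(f"{gigabits_per_mm3:,.0f} gigabits per cubic millimeter")   # ~5,500,000
print(f"{terabytes_per_mm3:,.1f} terabytes per cubic millimeter") # ~687.5
```

In other words, a single cubic millimeter at that density holds on the order of 700 terabytes.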
Talk about good use of space. Memory like this could support systems that are only theoretical today because conventional storage simply can't keep up: robots, smart systems, artificial intelligence in games, you name it. It blows the lid off capacity issues.
Reading and writing DNA is apparently slower than with other media, so for now it is being considered mainly for archival storage. But much the same was said about digital storage originally: it looked too bulky to be practical 50 years ago, and now it's universal. Higher-capacity memories like this are simply too valuable for efficient processing for the slow read/write methods not to be improved or replaced.
Keep an eye on this technology, because the line between biology and the other sciences is blurring daily.
Four grams, and it can hold every bit of data the world produces in a year? Looks like the future just dropped in to say hello, doesn't it?
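For the curious, here is a rough check of that four-gram claim. This is a sketch under stated assumptions, not figures from the article: an average nucleotide mass of about 330 daltons, the theoretical maximum of 2 bits per DNA base (a real encoding scheme may use less, so the actual mass needed could be a few times higher), and an assumed global annual data output on the order of 1.8 zettabytes.

```python
# Back-of-envelope capacity of 4 grams of DNA (all inputs are assumptions).
AVOGADRO = 6.022e23                  # molecules per mole
NUCLEOTIDE_MASS_G_PER_MOL = 330.0    # rough average mass of one DNA nucleotide
BITS_PER_BASE = 2.0                  # theoretical maximum for a 4-letter alphabet

bases_per_gram = AVOGADRO / NUCLEOTIDE_MASS_G_PER_MOL
bytes_per_gram = bases_per_gram * BITS_PER_BASE / 8

grams = 4.0
capacity_zettabytes = grams * bytes_per_gram / 1e21

print(f"{capacity_zettabytes:.1f} ZB in {grams:.0f} g of DNA")  # roughly 1.8 ZB

# For comparison, estimates circulating around the time of the article put
# the world's annual data production at roughly 1.8 zettabytes (assumed figure).
```

At about two bits per base, four grams works out to roughly 1.8 zettabytes, which is indeed in the ballpark of the annual-output estimates the claim appears to rest on.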
Read more here: http://www.digitaljournal.com/article/331093