#12
Check out this article: http://www.infoconomy.com/pages/stor...roup101451.adp

Basically: keep digital information for present-term access, understanding that it will need a migration of the back-end storage platform and a translation of the front-end software and data into whatever the current technical lingua franca happens to be. For long-term storage and DR, use microfilm rated at 250 years; that approach was mandated by law in both the UK and the US for long-term census records. So there's your answer: everyone is right.

Want to get really cool? Ideas are floating around for laser lithographs on ceramic disks at the microscopic level, storage that would last thousands of years. I personally like that, but I'm a geek with an interest in history.
#13
boatgeek wrote: "Ideas are floating around for laser lithographs on ceramic disks at the microscopic level, storage that would last thousands of years."

Or microscopic laser pits on nickel sheets. But seriously, folks: emulation and virtual machines are going to be the salvation for recovering ancient applications and data. As has been pointed out, having a database dump does you no good unless you can run the application; now it can be done in emulation. Current computers are so much faster than the machines we emulate that performance can be decent even if you have to emulate the entire instruction set. There will always be some service shop that will read your old media (if it's still readable) and burn it onto a CD-R (or whatever the media is years from now), and as long as you have the OS, application, and data you'll be good to go.

Take a look at this for a list of emulators for machines that haven't existed outside museums for decades: http://simh.trailing-edge.com/

I've owned (as a corporate manager) several of the machines on that list and played with an emulator. A machine that cost close to a million bucks in 1978 and drew about 30 kW runs slower than its emulator does on my PC, at least for a single user. The Hercules IBM System/370 emulator for the PC (http://www.conmicro.cx/hercules/) was used big-time by corporations in 1999 for testing Y2K conversions.

-- a d y k e s @ p a n i x . c o m   Don't blame me. I voted for Gore.
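Not part of the original post, just a minimal sketch of the "keep the OS, application and data, then boot them under an emulator" idea. It assumes the simh PDP-11 binary (pdp11) from the link above is installed and on the PATH; the boot script and disk image names are hypothetical.

import subprocess
from pathlib import Path

# Hypothetical preserved artefacts: a simh boot script and the disk image it attaches.
BOOT_SCRIPT = Path("archive/rt11-boot.ini")   # simh commands (set cpu ..., attach rk0 ..., boot rk0)
DISK_IMAGE = Path("archive/rt11-system.dsk")  # bit-for-bit copy of the original media

def boot_archived_system():
    """Launch the preserved OS and application under the simh PDP-11 emulator."""
    if not (BOOT_SCRIPT.exists() and DISK_IMAGE.exists()):
        raise FileNotFoundError("archived boot script or disk image is missing")
    # simh accepts an initialization script as its command-line argument.
    subprocess.run(["pdp11", str(BOOT_SCRIPT)], check=True)

if __name__ == "__main__":
    boot_archived_system()

The point of the sketch: as long as the media image and the emulator configuration survive, the whole stack can be brought back on commodity hardware.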
#14
Paul Rubin wrote: "[...] Sounds kind of complicated. Where's this data now, how is it stored, and how fast are you adding to it, and through what kind of system? 20 TB isn't really big storage these days. You could have a small tape library online and move incoming raw data to tape immediately while also making the online viewing copies on disk. HSM systems with automatic migration and retrieval are probably overkill."

It is kind of complicated. Currently we have 6 TB digitised and are adding 0.1 TB/week, and this is data that needs to be kept forever - the audio material is world heritage stuff. The driver for using HSM is twofold: 1) keeping multiple copies securely, including offsite, and 2) we know we have a 900 kg gorilla called video waiting in the wings. A rough growth projection is sketched below.
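A back-of-the-envelope projection, using only the figures quoted above (6 TB now, 0.1 TB/week) plus an assumed copy count; everything else about the setup is hypothetical.

# Rough growth projection for the audio archive.
CURRENT_TB = 6.0            # already digitised (figure from the post)
GROWTH_TB_PER_WEEK = 0.1    # ingest rate (figure from the post)
TARGET_TB = 20.0            # the "isn't really big these days" threshold mentioned above
COPIES = 3                  # assumption: primary plus onsite and offsite copies

weeks_to_target = (TARGET_TB - CURRENT_TB) / GROWTH_TB_PER_WEEK
yearly_growth_tb = GROWTH_TB_PER_WEEK * 52

print(f"Single copy reaches {TARGET_TB:.0f} TB in about {weeks_to_target:.0f} weeks "
      f"({weeks_to_target / 52:.1f} years)")
print(f"Raw growth is about {yearly_growth_tb:.1f} TB/year, "
      f"or {yearly_growth_tb * COPIES:.1f} TB/year of media across {COPIES} copies")

At that rate the audio alone stays modest for years; it's the video gorilla that will dominate the sizing.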
#15
The point about file formats is well made, but we've been through the same argument in detail already. We're choosing file formats which are publicly described and for which there are multiple (open source) clients. The idea is to ensure that we have the format description and enough example code to recreate viewers in the future. That's why we're using TIFF and BWF as the archival masters. I don't care about the MP3s as they are *derived* copies - we can just as easily use Ogg Vorbis, or whatever we're using in 2055, as long as we can parse the original compression-free datastream (a quick sanity check is sketched below).
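Not the archive's actual tooling, just a minimal sketch of the kind of check this implies: confirm a TIFF master starts with the documented magic bytes and a WAV/BWF master holds plain, uncompressed PCM before accepting it as an archival copy. The file names are hypothetical; only the Python standard library is used.

import wave

# Published TIFF signatures: little-endian ("II" + 42) or big-endian ("MM" + 42).
TIFF_MAGICS = (b"II*\x00", b"MM\x00*")

def looks_like_tiff(path):
    """True if the file begins with one of the documented TIFF byte-order marks."""
    with open(path, "rb") as f:
        return f.read(4) in TIFF_MAGICS

def is_uncompressed_wav(path):
    """True if the WAV/BWF file carries plain PCM (comptype NONE) per its fmt chunk."""
    with wave.open(path, "rb") as w:
        return w.getcomptype() == "NONE"

if __name__ == "__main__":
    # Hypothetical archival masters, for illustration only.
    print("image master ok:", looks_like_tiff("master_0001.tif"))
    print("audio master ok:", is_uncompressed_wav("master_0001.wav"))

Because both checks rely only on the public format descriptions, a viewer (or this check itself) can be recreated from scratch decades from now.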