Microsoft and Dell were involved - it's Windows.

Justin C. Haase
Solution Manager - Technical Services
Kingland Systems Corporation

-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Tom Liotta
Sent: Tuesday, March 20, 2007 4:28 PM
To: midrange-l@xxxxxxxxxxxx
Subject: Re: Accidentally Lost Data

Jerry Adams wrote:
My hypothesis is that there's no witch hunt because it leads (straight line) to the top.
Good point. And when it's a public agency, "the top" is the public. Funding for such critical elements as a sufficient number of tape cartridges can be hard to come by. You'd be surprised at how limited budgets can be. I wouldn't be surprised if there were no budget for DR testing, for example: no time made available, no approval for off-hours work to do the testing, no approval for extra cartridges to allow for a rotation, no schedule for the system to be unavailable. Why not? Because it comes from public funds and no one wants to pay for it. Ever had to speak to a legislative committee over a budget request? It can be ugly when trying to explain that something's going to cost money. How do you pin something to "the top" in a public agency?

In this case, though, there were multiple backups. The initial problem was that (1) a backup drive got reformatted accidentally in addition to the production drive. Then (2) the backup tapes were unreadable. (And there was the reasonably complete paper backup.)

There's not much detail on the "backup tapes". The plural "tapes" adds some confusion. Perhaps it was just confusion on the article author's part, perhaps each backup run takes multiple cartridges, or perhaps there were rotating backups and _none_ could be read. No info was given on the meaning of "unreadable" either. Media problems? Storage problems, such as excessive heat/moisture in the facility? Had the problem always existed, or was it something introduced after the last test for readability was done (if one ever was)?

The AK DoR did have an AS/400 up until at least a few years ago, IIRC. But this sounds more like Windows -- "backup drive". If Windows, then DR testing _should_ have been much more possible on a regular basis. A spare test server could be available and DR tests done in normal hours without disruption. Or the Department of Administration could supply DR test equipment on a rotating basis around other Departments.
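The "test for readability" mentioned above is the key step that apparently never happened here. As a generic illustration (not anything the agency actually ran -- the directory names and helper functions below are hypothetical), a minimal restore-verification pass might read every file back from the backup copy and compare checksums against the source, flagging anything missing or corrupted:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(source_dir: Path, backup_dir: Path) -> list[str]:
    """Compare every file under source_dir against its copy in
    backup_dir; return relative paths that are missing or differ."""
    problems = []
    for src in source_dir.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(source_dir)
        copy = backup_dir / rel
        if not copy.is_file():
            problems.append(f"missing: {rel}")
        elif sha256_of(src) != sha256_of(copy):
            problems.append(f"differs: {rel}")
    return problems
```

Run on a schedule against each cartridge in the rotation, something this simple would have caught "unreadable" media long before a restore was actually needed. The point isn't the code; it's that verification has to be budgeted for just like the cartridges themselves.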
(AK DoA used to hold significant control over general IT. That might have changed in recent years.)

But it still comes down to budgets. If nothing else, is it more cost effective simply to re-enter a database once in a while, or to have ongoing multiple backups and significant DR testing for every major database throughout the government? Actually, for $220,000, it wouldn't surprise me if the re-entry was not outrageously more expensive than funding all the various Departments and their agencies to do significant redundant backups and recovery testing year after year. (It seems strange that $220,000 doesn't seem out of bounds to me for such a monumental project.) But it's guesswork without knowing a lot more details.

Tom Liotta