|
Per our current understanding of the requirements put forward by the auditors, we need to analyze changes and actions made or taken by users outside of the actual production applications; that is, changes made from the command line and so on. We then need to retain this data (journal receivers) for SEVEN years. There are a couple of issues I would like your thoughts on:

1) I got bitten last week after having set up an "automatic" analysis, because one of the journal receivers became MASSIVE. One of the steps in my "automatic" methodology is to dump journal receivers to a data file so I can run those records through an SQL statement and report on the items that fall outside the range that has been set up. What happened was that the disk filled up. Is there a way to determine that you are about to do something stupid, like running out of disk, so you can stop before it happens?

2) As part of my retention routine, I have a tape that just sits in our development system, and I keep adding save files containing receivers from all our systems. This is not exactly ... safe ... because if something happened that destroyed that tape, we would have no backup. I suppose we could back up just a week's worth of information at a time, but by the time we got to SEVEN years, we would probably own the storage company... So: how can I back up what will be massive amounts of data, over a LONG period of time, and still keep the data safe?

TIA for any input,
Dave
612-371-1163
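On question 1, one common approach is a pre-flight check: estimate the size of the receiver dump and refuse to run it unless enough free space (plus a safety margin) remains. On IBM i you would get the real numbers from ASP utilization (e.g. WRKSYSSTS, or the QWCRSSTS Retrieve System Status API) before running DSPJRN to an outfile. Below is only a portable sketch of the idea, not IBM i-specific code; the function names and the 10% headroom figure are illustrative assumptions:

```python
import shutil

def safe_to_dump(est_bytes, free_bytes, total_bytes, headroom=0.10):
    """Return True if writing est_bytes would still leave at least
    `headroom` (a fraction of total capacity) free on the disk.
    Pure arithmetic so the policy is easy to test in isolation."""
    return free_bytes - est_bytes >= total_bytes * headroom

def check_path(est_bytes, target_path="/", headroom=0.10):
    """Apply the same policy to a real filesystem path.
    On IBM i the free/total figures would instead come from ASP
    usage (WRKSYSSTS / QWCRSSTS) before a DSPJRN ... OUTFILE run."""
    usage = shutil.disk_usage(target_path)
    return safe_to_dump(est_bytes, usage.free, usage.total, headroom)

# Example: only proceed with a 50 GB dump if it leaves 10% of disk free.
# if not check_path(50 * 1024**3):
#     raise SystemExit("Dump would exhaust disk; aborting before DSPJRN.")
```

The estimate itself could come from the receiver sizes reported by the system (receivers record their size when detached), multiplied by a fudge factor for the outfile's row overhead.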
This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].