Hi Dave
What do you want the backups for? How accurate do your backups have
to be? What constitutes a backup (i.e., you can recover to a point
in time using a daily save and applying journal receivers)? Is
timeliness of recovery important? Data only?
Most likely the answers to these questions will dictate what solutions
are viable.
It does seem to me that something based on remote journaling protects
your data by storing it at another site. You can easily apply journal
entries to the last save to get a point-in-time recovery rather
than saving libraries at particular points in time. Remote journaling
mostly saves you having to save the receivers to tape or transmit them offsite.
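Just as an illustration - MYLIB, MYFILE, MYJRN and MYRCV0001 are
made-up names, not anything from your setup - the basic commands for
journaling a file and rolling it forward look something like:

   /* Create a receiver and journal if they don't already exist */
   CRTJRNRCV JRNRCV(MYLIB/MYRCV0001)
   CRTJRN JRN(MYLIB/MYJRN) JRNRCV(MYLIB/MYRCV0001)

   /* Journal the physical file, capturing before and after images */
   STRJRNPF FILE(MYLIB/MYFILE) JRN(MYLIB/MYJRN) IMAGES(*BOTH)

   /* After restoring the last save, roll the file forward */
   APYJRNCHG JRN(MYLIB/MYJRN) FILE((MYLIB/MYFILE)) +
             FROMENT(*LASTSAVE) TOENT(*LAST)

FROMENT(*LASTSAVE) picks up from the last save of the file, and you
can use TOTIME instead of TOENT if you want a specific point in time.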
I guess it would be possible to achieve the same effect by using
triggers and writing a record to a data queue each time something
happened, then sending that to the other system. But then you need to
make sure all files are covered, plus deletes, commitment control, etc. It
seems to me it gets pretty hard pretty quickly, and the viability
depends on what you are trying to achieve.
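For what it's worth, the CL side of that idea is only a few commands -
RPLQ and RPLTRG are names I just made up, and you would still have to
write the trigger program itself plus something to ship the queue
entries to the other box (a DDM data queue is one option):

   /* Data queue the trigger program writes its entries to */
   CRTDTAQ DTAQ(MYLIB/RPLQ) MAXLEN(512)

   /* Attach the trigger program for every event you care about */
   ADDPFTRG FILE(MYLIB/MYFILE) TRGTIME(*AFTER) TRGEVENT(*INSERT) +
            PGM(MYLIB/RPLTRG)
   ADDPFTRG FILE(MYLIB/MYFILE) TRGTIME(*AFTER) TRGEVENT(*UPDATE) +
            PGM(MYLIB/RPLTRG)
   ADDPFTRG FILE(MYLIB/MYFILE) TRGTIME(*AFTER) TRGEVENT(*DELETE) +
            PGM(MYLIB/RPLTRG)

It's the trigger program and the "make sure nothing is missed" part
that gets hard, not the setup.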
In terms of doing backups to a remote machine using SAVRST*, as far
as I recall, save-while-active can be something of a flawed solution.
Partly it depends on what sync-ing the libraries you are saving
require and whether you can successfully save all the
libraries/objects to the *same* point in time. Having a master file
that's missing transactions, or vice versa, would be a worry... At one
point it was recommended that you shut your applications down to get
a reliable sync point. V5R may have improved this but you'll need to
check. In any event, I'm almost certain you need journaling and
commitment control for this to work reliably anyway, so if disk cost
or journaling performance myths were an issue you are back to where
you started.
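If you do go that route anyway, a rough sketch of a save-while-active
save would be something like the following - MYLIB, MYSAVF and SYSTEMB
are placeholders, and check the SAVACT options against your release
(the SAVRSTLIB form needs ObjectConnect, if I remember right):

   /* Save the whole library to a save file at one checkpoint */
   CRTSAVF FILE(QGPL/MYSAVF)
   SAVLIB LIB(MYLIB) DEV(*SAVF) SAVF(QGPL/MYSAVF) +
          SAVACT(*SYNCLIB) SAVACTWAIT(120) SAVACTMSGQ(QSYSOPR)

   /* Or, with ObjectConnect installed, save straight to the other box */
   SAVRSTLIB LIB(MYLIB) RMTLOCNAME(SYSTEMB)

SAVACT(*SYNCLIB) tries to bring everything in the library to a single
checkpoint, which is exactly the sync point issue I mentioned above.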
The other thing to consider is that you may have a huge burst of data
over the wire when you do your save - and keep in mind that you have
no save until that transmission completes successfully. You may even
compromise the integrity of the remote library as well, unless you
keep a backup copy as a safety measure while you do your save.
Remote journaling will keep you up to date the whole time and utilize
the wire more effectively. You may still have some work to do to
remove incomplete transactions, depending on your use of commitment control.
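Setting up the remote journal itself isn't much work - roughly
something like this, where SYSTEMB is a placeholder RDB directory
entry for the target machine and MYLIB/MYJRN is the local journal
from the earlier sketch:

   /* Register the remote journal on the target system */
   ADDRMTJRN RDB(SYSTEMB) SRCJRN(MYLIB/MYJRN) TGTJRN(MYLIB/MYJRN)

   /* Activate it; *ASYNC keeps local jobs from waiting on the wire */
   CHGRMTJRN RDB(SYSTEMB) SRCJRN(MYLIB/MYJRN) TGTJRN(MYLIB/MYJRN) +
             JRNSTATE(*ACTIVE) DELIVERY(*ASYNC)

Once it's active the entries arrive on the other box as they are
written, so there is nothing extra to save or transmit.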
I noticed a product the other day called RAP/400 that seemed aimed at
data replication only and was priced pretty low. I know nothing about
it, but it might be more what you are after. Having said that, most of
the HA vendors have a low-end product aimed at data replication or
low-end HA - iTera even has one that allows you to "vault" your journals
to a remote PC, which I thought was a cool alternative to HA.
Hope this helps
Regards
Evan Harris
At 12:36 a.m. 24/04/2007, you wrote:
I am looking for a way to "backup" or make copies of files and
libraries, without knocking people out of the system or slowing them
down to do so. I need a way to periodically back up transaction files
or updated master files throughout the day and send them to
an alternate box. HA software is not an option at this point. I know I
could use journaling but do not know that I want to go down that road,
or else I will push for the HA software.
Can I use CPYLIB, CPYF, CRTDUPOBJ, SAVOBJ, SAVLIB commands and achieve
the results I am looking for? I realize I might have to run them
multiple times throughout the day.
Thanks for the direction.
Dave