From an IFS perspective, consider using the i to back up a group of Windows
PCs. Using deduplication in the IFS would mean that, for 10 Windows 7
machines, there'd be just one set of Win7 files taking up disk space, one
set of commonly used apps, etc. If your average PC build has 50-70GB of OS
& apps before you get to the data, then across 10 machines that would
potentially save 450-630GB of space.
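
Roughly, the arithmetic behind that estimate (a back-of-the-envelope sketch;
it assumes the OS/app portion of every build is effectively identical):

    machines = 10
    for build_gb in (50, 70):
        reclaimable = (machines - 1) * build_gb   # every copy beyond the first
        print(f"{build_gb}GB build: about {reclaimable}GB reclaimable")
    # prints roughly 450GB and 630GB, matching the range above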

It may not improve backup times, though. That would depend on whether the
deduping algorithm can run faster than straight-up disk writes.
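
If you want a feel for that trade-off on your own hardware, a crude sketch
(not a rigorous benchmark; the 64MB test size and SHA-256 choice are
arbitrary) is to time a dedupe-style hash against a plain synced write:

    import hashlib, os, time

    chunk = os.urandom(64 * 1024 * 1024)         # 64MB of throwaway test data

    t0 = time.time()
    hashlib.sha256(chunk).hexdigest()            # the dedupe-style work
    t1 = time.time()
    with open("throwaway.bin", "wb") as f:       # the straight-up write
        f.write(chunk)
        f.flush()
        os.fsync(f.fileno())
    t2 = time.time()

    print(f"hash: {t1 - t0:.2f}s   write: {t2 - t1:.2f}s")
    os.remove("throwaway.bin")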

Using another example, look at Oracle JD Edwards EnterpriseOne. Each time
you do a package build (which generates the app code) it chews through a
few GB of disk writing out new spec files and whatnot. There will often be
just minor differences from build to build, meaning that a dedupe algorithm
based on sectors/clusters, rather than entire files/objects, would
radically reduce disk consumption over time.
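
To make the sector/cluster point concrete, here is a minimal sketch of
block-level deduplication (purely illustrative; the 4KB block size and
SHA-256 hashing are assumptions, not how any IBM i or appliance product
actually does it):

    import hashlib

    BLOCK_SIZE = 4096          # assumed fixed block size
    store = {}                 # hash -> block data (the single stored copy)

    def dedupe_write(path):
        """Record a file as a list of block hashes; identical blocks are stored once."""
        refs = []
        with open(path, "rb") as f:
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                digest = hashlib.sha256(block).hexdigest()
                store.setdefault(digest, block)   # keep the data only if it's new
                refs.append(digest)
        return refs

Two consecutive package builds that differ only slightly would share almost
all of their block hashes, so only the changed blocks consume new space.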

On Sat, Feb 9, 2013 at 7:01 AM, <rob@xxxxxxxxx> wrote:

Just how often do you think you have data duplicated in the IFS that you
would want deduplicated? For example, if I do a CRTDUPOBJ of the IIM file
into a test library, that is, in effect, duplicating something from
/qsys.lib/prod.lib/iim.file
to
/qsys.lib/test.lib/iim.file
What would deduplication do in that case?

Or do you want deduplication to only work on stream file directories?

IBM has something like this with Domino and DAOS. In that case, when
Marketing Dude sends out an email with PowerPointFromHell.ppt as an
attachment to 100 people in the company, Domino stores one copy as a
separate file (.nlo) in a different directory and every email just links
to that file.
Of course, as I type this up I'm in the process of bouncing one such
Domino server to diagnose file corruption of these .nlo files and/or their
links. (Oops, it just died with a "Panic" error.)
What's a little corruption when it saves a serious load of disk space? :-(
And if I try to OS-copy one mail file from one server to another, it
doesn't know where the links are, but you learn to deal with that sort of
thing using alternate methods.
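
For what it's worth, the single-copy-plus-links idea can be sketched like
this (a hypothetical illustration only, not how DAOS actually manages .nlo
files; the hash keying and directory name are assumptions):

    import hashlib, os, shutil

    ATTACH_DIR = "attach_store"     # assumed location for the single stored copies

    def store_attachment(src_path):
        """Keep one copy of an attachment, keyed by content hash, and
        return the link (path) that each message would reference."""
        with open(src_path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        dest = os.path.join(ATTACH_DIR, digest + ".blob")
        if not os.path.exists(dest):           # the first sender pays the storage
            os.makedirs(ATTACH_DIR, exist_ok=True)
            shutil.copyfile(src_path, dest)
        return dest                            # 100 messages, one file on disk

The fragility described above falls out of the same design: the mail files
only hold references, so copying one without the store it points at leaves
dangling links.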


Rob Berendt
--
IBM Certified System Administrator - IBM i 6.1
Group Dekko
Dept 1600
Mail to: 2505 Dekko Drive
Garrett, IN 46738
Ship to: Dock 108
6928N 400E
Kendallville, IN 46755
http://www.dekko.com





From: Matt Olson <Matt.Olson@xxxxxxxx>
To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxx>,
Date: 02/08/2013 05:49 PM
Subject: RE: IFS Deduplication
Sent by: midrange-l-bounces@xxxxxxxxxxxx



Interesting. Looks like it's a standard Xeon PC appliance with a crazy
amount of processing power (40 cores).

Perhaps they'll implement something like this at the software level
directly into the OS in V8R1.

Pretty neat feature to have without the SAN costs.

-----Original Message-----
From: DeLong, Eric [mailto:EDeLong@xxxxxxxxxxxxxxx]
Sent: Friday, February 08, 2013 4:39 PM
To: Midrange Systems Technical Discussion
Subject: RE: IFS Deduplication

Well, IBM does have deduplication product offerings, but not free... For
the most part, I see references to this product: IBM ProtecTIER
Deduplication Services...

https://www-304.ibm.com/partnerworld/wps/pub/overview/HW21Z

Looks like an appliance-based solution, I think.

-Eric DeLong

-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [
mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Matt Olson
Sent: Friday, February 08, 2013 4:04 PM
To: midrange-l@xxxxxxxxxxxx
Subject: IFS Deduplication

Does the IFS in V7R1 have built-in data de-duplication, like files on a
Windows Server 2012 file share can now have, since data de-duplication is
built into the new Windows Server 2012 OS?

If so, how can we activate it?

Matt


--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/midrange-l.




