The Teraplex center (I may have the name a bit off) at IBM regularly
dealt with multi-billion-row tables. Not only can DB2 handle it, the
Power Systems running DB2 (under either IBM i or AIX) can do it. Now, once
you get into the finer details you might find that other tooling is
better than DB2, but for most workloads it's very viable.
I have multiple customers with tables of 100,000,000+ rows, and aside
from backup/recovery we don't even notice the tables are that big.
Creating an index requires some thought as well, but that's all in a day's
work.
Jim Oberholtzer
Chief Technical Architect
Agile Technology Architects
On 9/4/2013 11:32 PM, Nathan Andelin wrote:
Let me admit up front that this message is something of a teaser.
While experimenting to better understand how IBM i might handle "big data", I inserted 130,000,000 rows into a table using an RPG program. The run took 260 seconds of elapsed time on a 2-core model 720 (P05 group), which works out to 500,000 rows per second.
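The program itself wasn't posted, but a minimal free-form RPG sketch of that kind of bulk-insert loop might look like the following. The file BIGTAB, its record format BIGTABR, and the ID and STAMP columns are hypothetical names, and real throughput depends heavily on record blocking, journaling, and disk configuration:

**free
// A minimal sketch, assuming a physical file BIGTAB with record format
// BIGTABR and two columns, ID and STAMP (all hypothetical names).
// High insert rates rely on record blocking, e.g. issuing
//   OVRDBF FILE(BIGTAB) SEQONLY(*YES 1000)
// before the call, so writes are buffered instead of done one at a time.

dcl-f BIGTAB usage(*output) usropn;

dcl-c MAX_ROWS 130000000;
dcl-ds row likerec(BIGTABR : *output) inz;
dcl-s i uns(10);

open BIGTAB;
for i = 1 to MAX_ROWS;
    row.id = i;                 // hypothetical key column
    row.stamp = %timestamp();   // hypothetical timestamp column
    write BIGTABR row;          // buffered by SEQONLY blocking
endfor;
close BIGTAB;

*inlr = *on;
return;

With blocking in place the database can fold many rows into each physical I/O, which is where rates like 500,000 rows per second become plausible on modest hardware.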
When I shared the results in a LinkedIn discussion, a couple of antagonists expressed doubt about my numbers. They didn't believe one could insert 500,000 rows per second even on a model 795, and they didn't view IBM i on Power as a viable platform for hosting and handling "big data".
Say you have a need to record millions or billions of records quickly and read them quickly. How would you do it?
-Nathan.