Our solution, implemented over five years ago, was to convert reports to .PCL files (now PDFs) and copy them to a per-user directory in the IFS that is auto-created if it doesn't exist. Each filename embeds the date and timestamp to ensure uniqueness. A web page uses Net.Data and a CL program to read the IFS directory and present the reports to the users. Buttons for each report let them 'print' or 'download' a report, 'copy' it to another user, or 'delete' it from their list. We also provide the raw spool file copied to an IFS object for importing into Excel (the 'text' button).

The main maintenance item is cleaning up reports that the users fail to delete themselves; a daily CL program purges reports older than x days. Once a month or so I manually purge user directories that haven't been used in three months. It only takes about a minute, so I haven't been too concerned with automating the process. The spool files themselves aren't kept for more than a couple of days; once a report is converted to IFS files, the spools aren't necessary.

With 1,200 users and thousands of reports generated daily, it's worked pretty well. Support issues are now next to nil, although there were some growing pains along the way. I've considered migrating to an email-based approach as an alternative, but that would involve directory syncing between the 400 and the corporate LDAP, which currently doesn't know about 400 users.
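For anyone curious about the mechanics, here is a minimal POSIX shell sketch of the same idea: auto-create a per-user directory, embed a timestamp in the filename for uniqueness, and purge files older than a retention threshold. This is not the actual CL/Net.Data code; the paths, the `REPORT_ROOT` location, and the 30-day default are hypothetical, and on the 400 the purge would be a CL program against the IFS rather than a shell script.

```shell
#!/bin/sh
# Sketch only: hypothetical root path and retention; the real system uses
# IFS directories on the 400 and a daily CL program for the purge.
REPORT_ROOT="${REPORT_ROOT:-/tmp/reports}"
RETAIN_DAYS="${RETAIN_DAYS:-30}"

store_report() {
    user="$1"; src="$2"
    dir="$REPORT_ROOT/$user"
    mkdir -p "$dir"                       # auto-create user directory if missing
    ts=$(date +%Y%m%d%H%M%S)              # date & timestamp embedded for uniqueness
    dest="$dir/$(basename "$src" .pdf).$ts.pdf"
    cp "$src" "$dest"
    echo "$dest"
}

purge_old_reports() {
    # clean up reports the users failed to delete themselves
    find "$REPORT_ROOT" -type f -mtime +"$RETAIN_DAYS" -exec rm -f {} +
}
```

The timestamped filename means two runs of the same report never collide, and the purge needs no database of its own: the file modification time is the only state it consults.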
This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].