Can you point me to a white paper or case study where adding an ETL layer to an ODS (containing retail store POS data) has enabled faster nightly data-quality processing on a high volume of records, so that retail store registers can have cleansed records ready at the start of each business day rather than facing an ongoing backlog?
Les Barbusinski's Answer: The performance of your ETL scripts depends on how you designed your processes and how they perform their I/O ... not on the tool used to generate them. You can get poorly performing ETL processes just as easily with a tool as without one. The benefit of using an ETL tool is that the process of transforming raw data into information is visual and intuitive, and can easily be amended or augmented.

I suspect the reason you're experiencing such poor performance is that you're executing a lot of random I/O calls against relational tables. If you redesigned your processes, decomposing them into smaller units of work that process sequential files rather than issuing SQL calls, your ETL processes would "scream." An ETL tool can easily switch modes and apply its transformation logic against a flat file rather than a relational table (or vice versa), whereas a hand-coded Java program or stored procedure would require a substantial rewrite. That's the real benefit of using an ETL tool.
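To make the "smaller units of work against sequential files" idea concrete, here is a minimal sketch of one such unit: a cleansing step that streams a flat file of POS extracts sequentially and writes a cleansed output file, with no per-row database calls. The column names and cleansing rules (drop rows with a missing SKU, normalize return quantities) are hypothetical, chosen purely for illustration:

```python
import csv

def cleanse_pos_records(in_path: str, out_path: str) -> int:
    """One small unit of work in a nightly ETL chain: read a flat file
    of POS records sequentially and write cleansed rows to a new file.

    Hypothetical layout: store_id, sku, qty, amount.
    Returns the number of rows written (excluding the header).
    """
    rows_out = 0
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            # Example rule 1: a record without a SKU cannot be posted
            # to a register, so drop it (or route it to a reject file).
            if not row["sku"].strip():
                continue
            # Example rule 2: normalize quantity to a non-negative value.
            row["qty"] = str(abs(int(row["qty"])))
            writer.writerow(row)
            rows_out += 1
    return rows_out
```

Because each unit reads one file and writes another, the whole nightly run becomes a chain of sequential-I/O steps that can be reordered, parallelized by store or region, or restarted from the last completed file, which is exactly the kind of restructuring the answer suggests for clearing the backlog.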