Extract Process Optimization

Optimization of the client’s general ledger extract process, which works with large datasets from different sources.


All Technologies Used

Oracle
Oracle SQL Trace

Motivation

The client approached Azati to streamline and optimize its general ledger data-extraction process, which handles large datasets from multiple sources and was suffering from poor performance and long run times.

Main Challenges

Challenge 1
Memory and Query Bottlenecks

Large data volumes overloaded memory and led to inefficient query execution, slowing the system's ability to process and extract data. Azati proposed optimizing the query design and improving the system's data-handling capacity to ensure more efficient memory usage and faster query execution.
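One common way to surface such bottlenecks with the Oracle SQL Trace tooling listed above is to enable extended tracing for the slow session and inspect the resulting trace file. This is an illustrative sketch, not the client's actual procedure:

```sql
-- Enable extended SQL Trace (wait events included) for the current
-- session; DBMS_MONITOR is the supported interface since Oracle 10g.
EXEC DBMS_MONITOR.SESSION_TRACE_ENABLE(waits => TRUE, binds => FALSE);

-- ... run the slow extract query here ...

-- Stop tracing once the workload has been captured.
EXEC DBMS_MONITOR.SESSION_TRACE_DISABLE;
```

The raw trace file can then be formatted with the tkprof utility to rank statements by elapsed time and wait events.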

Challenge 2
Statistics and Index Inefficiency

Poor query design and incorrect system statistics led to inefficient execution plans: the optimizer frequently chose full table scans instead of using indexes, causing delays and system bottlenecks. Azati suggested gathering accurate system statistics, optimizing index usage, and fine-tuning query design so the optimizer could select more efficient execution plans, significantly improving query performance.
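A minimal sketch of how stale object statistics can be refreshed and the resulting plan verified; the GL schema, table name, and predicate here are hypothetical, not taken from the client's system:

```sql
-- Refresh optimizer statistics for a (hypothetical) ledger table,
-- including its indexes (cascade => TRUE).
EXEC DBMS_STATS.GATHER_TABLE_STATS(
       ownname          => 'GL',
       tabname          => 'JOURNAL_LINES',
       cascade          => TRUE,
       estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE);

-- Confirm the optimizer now chooses an index access path
-- instead of a full table scan.
EXPLAIN PLAN FOR
  SELECT * FROM gl.journal_lines WHERE period_name = :p;
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```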

Key Features

  • Root Cause Analysis: Identified inefficiencies in query execution to understand the root cause of performance issues.
  • Optimization of System Statistics: Improved performance by gathering and validating system statistics for the database.
  • Improved Query Performance: Reduced query run time and improved system responsiveness by eliminating bottlenecks.

Our Approach

Performance Monitoring
Monitored system performance to identify inefficient query execution patterns.
Root Cause Analysis
Conducted root cause analysis to reveal that invalid system statistics were impacting query performance.
Statistics Validation and Adjustment
Developed a solution to gather and validate system statistics, adjusting them to reflect actual system performance.
Processing Type Balancing
Balanced the gathering of statistics between batch processing (DSS) and online transaction processing (OLTP).
Time-Based Statistics Collection
Scheduled and collected statistics for different time intervals (daytime, night-time, and 24-hour) to ensure representative data.
Optimization through Statistics Publishing
Analyzed and published new system statistics to optimize query execution plans.
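The statistics-gathering and publishing steps above can be sketched with the standard DBMS_STATS package; the interval length, schema, and table names are illustrative assumptions:

```sql
-- Gather workload system statistics over a representative window
-- (e.g. a 60-minute daytime interval); repeat for night-time and
-- 24-hour windows to capture both DSS and OLTP behaviour.
EXEC DBMS_STATS.GATHER_SYSTEM_STATS('interval', interval => 60);

-- Alternatively, bracket a batch or online window manually:
EXEC DBMS_STATS.GATHER_SYSTEM_STATS('start');
--   ... representative workload runs here ...
EXEC DBMS_STATS.GATHER_SYSTEM_STATS('stop');

-- Keep newly gathered object stats "pending" so they can be
-- validated before the optimizer uses them (names hypothetical):
EXEC DBMS_STATS.SET_TABLE_PREFS('GL', 'JOURNAL_LINES', 'PUBLISH', 'FALSE');
EXEC DBMS_STATS.GATHER_TABLE_STATS('GL', 'JOURNAL_LINES');

-- Trial the pending statistics in the current session only...
ALTER SESSION SET optimizer_use_pending_statistics = TRUE;

-- ...and publish them once the execution plans look good.
EXEC DBMS_STATS.PUBLISH_PENDING_STATS('GL', 'JOURNAL_LINES');
```

Pending statistics (available since Oracle 11g) let new statistics be analyzed and validated before being published to the optimizer, matching the validate-then-publish workflow described above.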

Project Impact

As a result of the optimization, system performance improved significantly: average process runtime fell from roughly 3 hours to just 10–15 minutes. The number of "direct path read" wait events was halved, leading to faster data extraction and improved system efficiency.
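A reduction in "direct path read" waits can be measured from the dynamic performance views (a sketch; requires SELECT privilege on V$SYSTEM_EVENT):

```sql
-- Snapshot cumulative 'direct path read' waits before and after a
-- run of the extract process; comparing the deltas shows how much
-- direct-path I/O the optimized queries avoid.
SELECT event,
       total_waits,
       ROUND(time_waited_micro / 1e6, 1) AS seconds_waited
FROM   v$system_event
WHERE  event = 'direct path read';
```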
