Automatically updating statistics

The Adaptive Server cost-based query processor estimates the query costs using the statistics for the tables, indexes, and columns named in a query. The query processor chooses the access method it determines has the least cost. However, this cost estimate cannot be accurate if the statistics are not accurate. You can run update statistics to ensure that the statistics are current, however, running update statistics has an associated cost because it consumes system resources such as CPU, buffer pools, sort buffers, and procedure cache.
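For example, statistics for a table can be refreshed manually with the update statistics command. This is a minimal sketch; the table name titles is a placeholder, not a table from this document:

```sql
-- Refresh statistics for a hypothetical user table named "titles".
-- This consumes CPU, buffer pools, sort buffers, and procedure
-- cache while it runs.
update statistics titles

-- A more expensive variant that refreshes statistics for all
-- columns of every index on the table:
update index statistics titles
```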

You can set update statistics to run automatically at a time that best suits your site, and avoid running it at times that hamper your system. Use the datachange function to determine the best time to run update statistics; datachange also helps ensure that you do not run update statistics unnecessarily. The Job Scheduler templates let you specify the objects, schedules, priority, and datachange thresholds that trigger update statistics, which ensures that critical resources are consumed only when doing so lets the query processor generate more efficient plans.
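As a sketch of how datachange is queried (the table name titles and column name price are placeholders): the function takes an object name, a partition name, and a column name, and returns the percentage of the data that has changed since statistics were last updated. Passing null for the partition or column arguments widens the scope to the whole table:

```sql
-- Percentage of rows changed in the whole (hypothetical) "titles"
-- table since update statistics last ran:
select datachange("titles", null, null)

-- Percentage of change for a single (hypothetical) column:
select datachange("titles", null, "price")
```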

Because update statistics is a resource-intensive task, base the decision to run it on a specific set of criteria, such as the amount of data that has changed since statistics were last updated and the availability of system resources.

Data change is a key metric that measures how much data has been altered since you last ran update statistics, and it is tracked by the datachange function. Using this metric together with criteria for resource availability, you can automate the process of running update statistics. Job Scheduler includes a mechanism to run update statistics automatically, as well as customizable templates you can use to determine when it runs. These inputs include all parameters to update statistics, the datachange threshold values, and the time at which to run update statistics. Job Scheduler runs update statistics at a low priority so that it does not affect critical jobs running concurrently.
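The threshold check that such a job performs can be sketched as a simple conditional. This is an illustration only, not the template's actual implementation; the table name titles and the 20 percent threshold are assumptions chosen for the example:

```sql
-- Run update statistics only when more than 20% of the
-- (hypothetical) "titles" table has changed since statistics
-- were last updated.
if datachange("titles", null, null) > 20
begin
    update statistics titles
end
```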