You can run the following query to get the table row count and the date each table was last analyzed. (From: oracle-db-tuning-l Groups.)
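The original query was lost in scraping; a minimal sketch against the standard Oracle data dictionary views `USER_TABLES` and `USER_INDEXES` (note that `NUM_ROWS` reflects the row count as of the last statistics gathering, not the live count):

```sql
-- Row count (as of last statistics gathering) and last analyzed date
-- for all tables owned by the current user.
SELECT table_name,
       num_rows,
       TO_CHAR(last_analyzed, 'YYYY-MM-DD HH24:MI:SS') AS last_analyzed
FROM   user_tables
ORDER  BY last_analyzed NULLS FIRST;

-- The same information for indexes:
SELECT index_name,
       table_name,
       TO_CHAR(last_analyzed, 'YYYY-MM-DD HH24:MI:SS') AS last_analyzed
FROM   user_indexes
ORDER  BY last_analyzed NULLS FIRST;
```

Tables that have never been analyzed sort first, since their `LAST_ANALYZED` is NULL. Use `DBA_TABLES`/`DBA_INDEXES` with an `OWNER` filter to check another schema.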
To check tables and indexes last analyzed date. Posted on September 10, by Sher khan. After the old statistics have been purged, set the desired statistics history retention.
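Purging statistics history and setting the retention can be done with the `DBMS_STATS` package; the 30-day value below is only an example, not a recommendation:

```sql
-- Purge statistics history older than 30 days (example value),
-- then set the retention period to 30 days.
BEGIN
  DBMS_STATS.PURGE_STATS(SYSTIMESTAMP - INTERVAL '30' DAY);
  DBMS_STATS.ALTER_STATS_HISTORY_RETENTION(30);
END;
/

-- Verify the current retention and the oldest history still available:
SELECT DBMS_STATS.GET_STATS_HISTORY_RETENTION FROM dual;
SELECT DBMS_STATS.GET_STATS_HISTORY_AVAILABILITY FROM dual;
```

The retention controls how far back `DBMS_STATS.RESTORE_TABLE_STATS` can restore old statistics, so shorten it only if you do not need that safety net.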
Husqvik
Thanks for the help, but what is the bottom line: can I just let Oracle do its thing in the background, or do I need to worry about running something like ANALYZE TABLE as in the old days? I added one more paragraph to the answer. A newly created database has a default maintenance window configuration, so statistics will be collected periodically according to the rules I described above. We usually care about statistics when we rapidly change table content, which can lead to execution plan changes and sub-optimal execution plans.
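You can confirm that the automatic statistics collection task is active, and when its maintenance windows open, with a quick dictionary query (requires access to the DBA views):

```sql
-- Is the automatic optimizer statistics task enabled?
SELECT client_name, status
FROM   dba_autotask_client
WHERE  client_name = 'auto optimizer stats collection';

-- When do the maintenance windows run, and for how long?
SELECT window_name, repeat_interval, duration
FROM   dba_scheduler_windows;
```

If the task shows `ENABLED` and the windows suit your workload, periodic gathering happens without any manual intervention.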
In most cases the periodic gathering is enough; in some cases it is much better to gather or compute statistics manually. You should know your data and requirements. Gathering statistics, especially histograms, can take a lot of time and resources on big tables. Yes, that is exactly the issue we are looking at: "Gathering statistics, especially histograms, can take a lot of time and resources on big tables." However, some bulk updates only include a handful of records.
In these cases there is a user expectation that the upload should not take very long ("I'm just inserting a few records, this shouldn't take long"). When the analyze is run after these small uploads, it takes a long time and fails this expectation. It sounds like the best solution would be to run the analyze only after large bulk updates. A slightly less optimal solution would be to run it once after our initial large upload, assuming that the data in the database at that point is representative of the larger data set.
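Gathering statistics only after the large load can be done explicitly with `DBMS_STATS.GATHER_TABLE_STATS`; the schema name, table name, and degree of parallelism here are placeholders:

```sql
-- Gather statistics for one table, run only after a large bulk load.
-- APP_OWNER / BIG_TABLE and degree => 4 are example values.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'APP_OWNER',
    tabname          => 'BIG_TABLE',
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
    method_opt       => 'FOR ALL COLUMNS SIZE AUTO',
    cascade          => TRUE,   -- also gather index statistics
    degree           => 4);
END;
/
```

Calling this from the bulk-load job itself, guarded by a row-count threshold, keeps the small "handful of records" uploads fast while still refreshing statistics after the big loads.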
If you have really large loads or incremental updates and you know your data, it is sometimes better to compute statistics on your own and set them manually, instead of paying for large full table scans or, even worse, histogram sorts.
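Setting statistics manually is done with `DBMS_STATS.SET_TABLE_STATS`; the numbers below are purely illustrative and would come from your own knowledge of the load:

```sql
-- Set table statistics from known values, avoiding a full scan.
-- numrows/numblks/avgrlen here are illustrative, not measured.
BEGIN
  DBMS_STATS.SET_TABLE_STATS(
    ownname => 'APP_OWNER',
    tabname => 'BIG_TABLE',
    numrows => 50000000,   -- known row count after the load
    numblks => 625000,     -- approximate block count
    avgrlen => 100);       -- average row length in bytes
END;
/
```

There are matching `SET_INDEX_STATS` and `SET_COLUMN_STATS` procedures if the optimizer also needs accurate index and column figures.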
Or you can lower the sample size used to calculate statistics on big tables, but there can be issues if the data is not evenly distributed. Here is a link for inspiration on how to do it: jonathanlewis. A lot has already been written about gathering statistics, so you can google around and decide what suits you best.
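Lowering the sample size means passing an explicit `estimate_percent` instead of `AUTO_SAMPLE_SIZE`; the 1% figure is just an example, and skipping histograms (`SIZE 1`) is the part that avoids the expensive sorts, at the cost of the optimizer not seeing data skew:

```sql
-- Sample roughly 1% of the rows instead of letting Oracle decide,
-- and skip histograms entirely. Skewed data may need a larger
-- sample or targeted histograms on specific columns instead.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'APP_OWNER',
    tabname          => 'BIG_TABLE',
    estimate_percent => 1,
    method_opt       => 'FOR ALL COLUMNS SIZE 1');  -- no histograms
END;
/
```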