
Purging and Archiving


  • Purging and Archiving

    I am looking for help in the area of custom purges and/or archiving. We have been live for 4 months now and we are seeing really poor performance. The Business Monitor, Saved Queries, and custom reports are timing out on a regular basis. I'm looking for anyone willing to share what they do for purging or additional indexing if relief can be found there. I really need a table structure so we can write a custom PL/SQL script. I want to make sure we don't miss anything.

    Here are some of our common tables with counts.
    449,172 OR records
    3,319,474 OR_refnum records
    5,036,227 OR_line records
    150,574 shipment records
    311,932 Shipment_stop records

    Let me know if anyone can help point me in the right direction.

  • #2
    Re: Purging and Archiving


    The numbers you quote are not excessive.

    Without answering your original question: you should look at the performance of your queries, making sure they use indexes wherever possible.

    Also, it's quite common for a customer's browser timeout policy to be set quite low, so the timeouts you see may be caused by that.

    Gary Cunningham


    • #3
      Re: Purging and Archiving

      Thanks for the info, Gary. Do you happen to know where that timeout setting is? I'll start there first.


      • #4
        Re: Purging and Archiving

        Hi Chad

        Unfortunately, no.

        I would imagine this is a network configuration setting; I'd suggest talking to your customer's network support team for more information.

        Gary Cunningham


        • #5
          Re: Purging and Archiving

          Hi Chad,
          Please check the following:
          1. When the 'glogowner' database schema was last analyzed. If it hasn't been done recently, I would advise running an analyze as soon as possible and then rechecking performance.
          2. The query performance directly against the database. If the response there is good, check for VPD policies.

          Please post your results for further assistance.
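          For step 1, a minimal sketch of re-gathering optimizer statistics with the standard DBMS_STATS package (the GLOGOWNER schema name comes from this thread; adjust the parameters for your environment):

          ```sql
          -- Gather fresh optimizer statistics for the OTM schema
          BEGIN
            DBMS_STATS.GATHER_SCHEMA_STATS(
              ownname          => 'GLOGOWNER',
              estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
              cascade          => TRUE  -- gather index statistics as well
            );
          END;
          /
          ```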



          • #6
            Re: Purging and Archiving

            We analyze the db every weekend. We do have an open SR with Oracle though on report performance that seems to keep pointing back to the VPD. Our reports run in 10 seconds when logged in as dba.admin but take around 7-10 minutes when logged in as a normal user. Oracle has not found a resolution at this time. I have been getting some relief when using vpd.add_active_table in the reports.

            Have you added indexes to the db or done any purging?



            • #7
              Re: Purging and Archiving

              Four months is about the time it takes for the order release refnum table to become the performance culprit. More than likely, if you take your saved queries and analyze them separately, you'll see full table scans. Assuming you have already taken the normal steps:
              1. Pin all the big packages (e.g. run: exec sys.dbms_shared_pool.keep('GLOGOWNER.VPD');)
              2. Have a job that rebuilds all the OTM indexes on an occasional basis.
              3. Run analyze on a weekly basis.

              Then try altering your saved queries. A good "trick" when querying reference data: instead of using "Begins With" "A", try using "Contains" and giving the code more information, like "ABC CORP". You'll notice a huge performance gain in the saved query, which in turn will improve the business monitor. In general, if the business monitor takes more than 5 seconds to come up, you definitely have a poorly performing query within it.
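              To make the "Contains" trick concrete, here is a rough sketch of the predicates such a saved query might generate (table and column names follow this thread's naming; the exact SQL OTM generates is an assumption):

              ```sql
              -- "Begins With" 'A': matches a huge slice of a 3M+ row refnum table
              SELECT order_release_gid
                FROM or_refnum
               WHERE or_refnum_value LIKE 'A%';

              -- "Contains" 'ABC CORP': far more selective, so far fewer rows
              -- flow into the rest of the query, even though the leading
              -- wildcard itself cannot use a normal B-tree index
              SELECT order_release_gid
                FROM or_refnum
               WHERE or_refnum_value LIKE '%ABC CORP%';
              ```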

              As you have also found out, tuning should be done as a domain user, not as DBA.ADMIN.

              Hope this helps.



              • #8
                Re: Purging and Archiving

                Can you explain more about pinning packages (i.e. running exec sys.dbms_shared_pool.keep('GLOGOWNER.VPD');)?

                We have 9 queries in our BM and it takes an average of 9 minutes to return.



                • #9
                  Re: Purging and Archiving

                  You should create a trigger in the database to pin the most commonly executed or largest packages. This is covered in the OTM Administration Guide:

                  CREATE OR REPLACE TRIGGER glogowner.db_startup_pin_packages
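                  A hedged sketch of what that startup trigger could look like (the exact set of packages to pin is installation-specific; consult the OTM Administration Guide):

                  ```sql
                  CREATE OR REPLACE TRIGGER glogowner.db_startup_pin_packages
                  AFTER STARTUP ON DATABASE
                  BEGIN
                    -- Pin the big VPD package so it stays in the shared pool
                    sys.dbms_shared_pool.keep('GLOGOWNER.VPD');
                    -- Repeat for other large, heavily executed packages, e.g.:
                    -- sys.dbms_shared_pool.keep('GLOGOWNER.SOME_OTHER_PKG');
                  END;
                  /
                  ```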

                  Additionally, what are the queries in your BM going after? Order Release? Shipments?

                  I would take a copy of the BM from someone who is having the issue, then add the queries back one by one to see which one is giving you trouble. Once you find the poorly performing query, and a fix for it, you'll have to make the same fix for all the other queries in the BM going after the same subject. Order release, in my experience, is a common cause of BM performance problems. I have had the same issues as well. We have 15+ queries in the BM; we were seeing 3-5 minutes to bring it up, and after tuning we are down to 5 seconds.


                  • #10
                    Re: Purging and Archiving

                    I have our dba checking into pinning those packages.

                    We have just over 3.5M records in the OR_refnum table. Most of our BM queries are SQL queries that take about 30 sec. just to run against the db.

                    Do you use the app. purge process?


                    • #11
                      Re: Purging and Archiving

                      I have written my own purge package to purge data from the most heavily used tables. The main consumers of disk space are the i_transaction, i_transmission, i_log, i_log_detail, audit_trail and audit_data_info tables. I pass a parameter into the process to purge data that is beyond X days old. For the i_transaction and i_transmission tables, I null out the XML_BLOB field; for the remaining tables, I delete the rows altogether. The time frame for keeping data is up to your own preference: some like to keep as long as 90 days, some as little as 21 days. I use the Oracle job scheduler to run the process once a week, on the weekends. For very heavy usage, this can be run nightly.
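                      A simplified sketch of a purge procedure along these lines (the `insert_date` column and exact table definitions are assumptions; verify against your schema before use):

                      ```sql
                      CREATE OR REPLACE PROCEDURE purge_otm_data (p_days_old IN NUMBER) AS
                      BEGIN
                        -- Null out the large XML payloads instead of deleting the rows
                        UPDATE i_transaction
                           SET xml_blob = NULL
                         WHERE insert_date < SYSDATE - p_days_old;
                        UPDATE i_transmission
                           SET xml_blob = NULL
                         WHERE insert_date < SYSDATE - p_days_old;

                        -- Delete aged rows outright from the log and audit tables
                        DELETE FROM i_log_detail    WHERE insert_date < SYSDATE - p_days_old;
                        DELETE FROM i_log           WHERE insert_date < SYSDATE - p_days_old;
                        DELETE FROM audit_data_info WHERE insert_date < SYSDATE - p_days_old;
                        DELETE FROM audit_trail     WHERE insert_date < SYSDATE - p_days_old;
                        COMMIT;
                      END purge_otm_data;
                      /
                      ```

                      Scheduled weekly (for example via DBMS_SCHEDULER), this keeps the transaction tables from growing unbounded.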

                      I have found that purging the data is good for keeping storage requirements down. However, I haven't seen any great performance benefit from it, particularly in the areas where you are having trouble. Your problems have little to do with the large transaction-type tables being in need of purging.

                      Regardless, purging is still a necessary evil that you should have your DBA setup.

                      Hope this helps a bit.

                      - Mike