Duplication of Data load


  • nikesh mohanlal
    replied
    Re: Duplication of Data load

    Hi,

    Many thanks for the information.

    Could you explain your approach in more detail? I am unable to follow it.
    Any supporting document would help a lot.

    Alternatively, is there a generic automated data-loading tool that can produce any number of records for any business object?
    Say, for example, I need 400,000 records to be created (duplicated) as XML and then uploaded into OTM.

    It would be great if anyone could let me know how the process of data duplication usually takes place in OTM.
    Is any specific tool used?
    I tried JMeter, but it did not work.

    Please share your experience.

    Thanks and regards,
    Nikesh.



  • chrisplough
    replied
    Re: Duplication of Data load

    No - I'm not advocating direct SQL updates. Regardless of what scripts you use (Perl or otherwise), I recommend loading the data through OTM's XML interfaces - so you'll need to format the XML and then post it via http to the integration servlet. This ensures that you won't have to worry about the information on a table level.

    --Chris
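A minimal Python sketch of the HTTP post Chris describes. The servlet path, host, and credentials below are placeholders, not confirmed values; check your own OTM instance for the correct integration URL and an integration user:

```python
# A minimal sketch of posting Transmission XML to OTM over HTTP.
# The servlet path, host, and credentials are placeholders -- check your
# own OTM instance for the correct integration URL and user.
import base64
import urllib.request

OTM_URL = "http://otm-host/GC3/glog.integration.servlet.WMServlet"  # placeholder

def build_request(xml_payload: str, user: str, password: str) -> urllib.request.Request:
    """Build an authenticated HTTP POST request carrying the XML payload."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        OTM_URL,
        data=xml_payload.encode("utf-8"),
        headers={"Content-Type": "text/xml",
                 "Authorization": f"Basic {token}"},
    )

def post_transmission(xml_payload: str, user: str, password: str) -> str:
    """POST the document and return the servlet's response body."""
    with urllib.request.urlopen(build_request(xml_payload, user, password)) as resp:
        return resp.read().decode("utf-8")
```

Posting through this interface (rather than SQL) lets OTM validate the payload and populate all dependent tables itself.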



  • nikesh mohanlal
    replied
    Re: Duplication of Data load

    So even with the Perl-script option, we would still need to populate the dependent tables and manage the related data.
    Any idea how to write a Perl script to create 'Contacts' records?

    Regards,
    Nikesh.



  • chrisplough
    replied
    Re: Duplication of Data load

    The only issue with direct SQL population is that you'll have to carefully manage the relevant data in related tables, since you're going in behind the application. You'll also need to restart the application to see the new data, and any required Agent actions won't be run.

    The runtime will depend greatly on your platform and hardware - so it could be anywhere from a few hours to longer.

    --Chris



  • nikesh mohanlal
    replied
    Re: Duplication of Data load

    Hi Chris,

    Many thanks for the instant reply.
    Well, I can think of writing a PL/SQL procedure with a cursor to populate the data into the DB, as it is a one-time job only.

    Any idea how long it will take to load 16,000 OrderRelease records into OTM, either as XML or through the CSV file option?

    Regards,
    Nikesh.



  • chrisplough
    replied
    Re: Duplication of Data load

    If you need to create a bunch of Order Releases with the same info, except for the GID - then I'd recommend creating a script and posting these to OTM via an http post. Alternately, if you don't want to deal with XML - you can create these as a CSV file (again, I'd script this - Perl, Python - whatever you prefer) and then upload it to OTM in order to create the data entities.

    This is a huge number of records, so the load may take quite some time, and you'll want to split up the data files to make processing easier.

    --Chris
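A short sketch of the scripted CSV approach, assuming hypothetical column names; the real header must match the CSV layout OTM expects for the target object:

```python
# Sketch: generate N near-identical CSV rows that differ only in the GID.
# The header and values below are illustrative placeholders; the real
# columns must match the CSV layout OTM expects for the target object.
import csv

def write_duplicated_rows(path: str, n: int, domain: str = "MYDOMAIN") -> None:
    """Write a header plus n rows, varying only the GID/XID columns."""
    header = ["ORDER_RELEASE_GID", "ORDER_RELEASE_XID", "ATTRIBUTE1"]  # placeholder
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(header)
        for i in range(1, n + 1):
            xid = f"XYZ_{i:06d}"
            writer.writerow([f"{domain}.{xid}", xid, "SAME_VALUE"])
```

Splitting the output into several smaller files (for example, one call per 50,000-row range) keeps each upload manageable, per the advice above.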



  • nikesh mohanlal
    started a topic Duplication of Data load

    Duplication of Data load

    Hi all,

    Version: 5.5

    I have a peculiar requirement.
    Let's say I create an OR named 'XYZ'.
    Now, I need to create some 100,000 records with the same details as 'XYZ', but each with a different OR GID.

    Can anyone let me know how to do this?
    Is any specific tool available that does this job?

    Regards,
    Nikesh.
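The duplication asked about above can be sketched in a script: stamp out copies of a single template XML fragment, varying only the GID. The fragment below is an illustrative placeholder, not the real OTM Release schema:

```python
# Sketch: generate N copies of one Order Release fragment, varying only the Xid.
# TEMPLATE is an illustrative placeholder, not the real OTM Release schema.
TEMPLATE = ("<Release><ReleaseGid><Gid>"
            "<DomainName>MYDOMAIN</DomainName><Xid>{xid}</Xid>"
            "</Gid></ReleaseGid></Release>")

def make_copies(n: int) -> list:
    """Return n XML fragments identical except for the running Xid."""
    return [TEMPLATE.format(xid=f"XYZ_{i:06d}") for i in range(1, n + 1)]
```

Each fragment would then be wrapped in a Transmission envelope and posted to the integration interface, as discussed in the replies above.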