Refreshing LTM Data

SP
Veteran Member
Posts: 122
    Does anyone have any experience or care to offer guidance regarding refreshing LTM data?  I have an Infor functional consultant who is advising that LTM productlines cannot be refreshed.

    We have two LTM productlines in one LMRK env.  Currently all setup and conversions are happening in one of the LTM productlines with data being fed to S3 via IPA (LPA).

    The Infor consultant is advising the users that any garbage setup data going into LTM is there forever and there is no way to simply take snapshots from time to time like we always have in the past when it was just S3.

    I have a copy of the 10-page "Data Copy Process for Landmark Applications" document published in November, but it seems much more involved than what I need.  I was really wondering if anyone has experience with this and would care to wax eloquent.   ;-)
    Woozy
    Veteran Member
    Posts: 709
      The short answer is that data refreshes are possible in LTM/Landmark.  However, they are considerably more difficult and involved than a plain database refresh because of the way the application links business classes together.  Your last sentence is the key - there isn't a simple way to do a refresh in Landmark.  It is possible, but it is very involved.

      Unfortunately, I can't really provide many details about how this is done, because we are a Managed Services client, so Infor manages the data refresh process for us.  However, I believe they use the standard Landmark command-line utilities to do the dump and load.

      We've been running LTM "live" since June of 2010, and were doing regular data refreshes starting in late 2009 for our cutover testing.  In that case, we did all our configuration setup in LTM (just config, not employee detail), then took a snapshot of the config data, and then did our cutover loads.  Once the cutover test load was complete and we figured out what configuration we missed, we purged all the data and reloaded our configuration snapshot.  Then we made the additional config changes, took another snapshot, and repeated the cycle.

      In our "live" environment, we regularly refresh data from PROD to our TEST, TRAIN, and DEV productlines.  One of the primary things we've learned is that a data refresh includes ALL environment and productline tables (including security, actions, workunits, etc), so we've had to use trial-and-error to determine which tables need to be purged after the refresh, or restored from a pre-refresh dump.  In the beginning we didn't purge actions or workunits, so flows that were in-process in PROD started timing out and sending "live" escalation emails from DEV, which caused a LOT of confusion. 

      We've also had to work with Managed Services to build a process to change all the email addresses in DEV to hcmdev@simplot.com (in our case) so if emails are sent they go to a common mailbox rather than the employee email account.  This impacts the ACTOR and EMPLOYEECONTACT business classes.  This is more difficult than a simple SQL update because sometimes the application uses email addresses that are stored in CLOB fields as XML, so AMS does a SQL update, then runs a command that fixes the CLOB values. 
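
      For what it's worth, the plain-SQL half of that step could be as small as the sketch below, run through sqlcmd from a batch file.  The server, database, and column names here are placeholders I made up for illustration (the real Landmark schema names will differ), and this deliberately does not touch the XML copies held in CLOB fields - that cleanup is the separate Infor-supplied command mentioned above.

      rem Hypothetical example only: mask email addresses after refreshing a DEV database.
      rem The column names and the LMKDEV database name are placeholders, not the real Landmark schema.
      rem This does NOT fix the XML stored in CLOB fields; Infor runs a separate utility for that afterward.
      sqlcmd -S %DBSERVER% -d LMKDEV -Q "UPDATE ACTOR SET EMAIL_ADDRESS = 'hcmdev@simplot.com'"
      sqlcmd -S %DBSERVER% -d LMKDEV -Q "UPDATE EMPLOYEECONTACT SET EMAIL_ADDRESS = 'hcmdev@simplot.com'"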

      I hope this helps.


      Kelly Meade
      J. R. Simplot Company
      Boise, ID
      Mark Larochelle
      Basic Member
      Posts: 9
        Here are the steps that I use, which were provided to me by Infor Professional Services. In the example below, an LTM production product line, lawtaprd, is refreshed to a testing product line, lawtaint. Note that the DBA is involved with this method, which saves you a lot of time; I can't really imagine this working within a reasonable window of time using strictly Landmark imports. If you are familiar with the Landmark cdexport/cdimport, dbexport/dbimport, and daexport/daimport commands, and can coordinate this task with your DBA, you will find your data refreshed but your configuration intact (the StartLM.bat and StopLM.bat scripts used to stop and start the Landmark environment along the way are shown below). Business owners did mention to me that they needed to update the Landmark/S3 integration forms in Rich Client to prevent the Landmark test environment from integrating back to Landmark production. For extra insurance, I made a hosts entry on the Landmark development server so the LSF/S3 URL loops back to itself (127.0.0.1). This is of course only an issue if you are federated and have LTM integration back to S3 HR.
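
        To make that last trick concrete, the hosts entry is just a one-liner on the Landmark development server; the hostname below is made up, so substitute whatever host your production LSF/S3 URL actually points at.

        rem Run once from an elevated prompt on the Landmark development server (hostname is a placeholder).
        rem Forces calls to the production LSF/S3 host to loop back so test traffic never reaches production.
        echo 127.0.0.1    lsfprod.example.com >> C:\Windows\System32\drivers\etc\hosts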

        E:\Supportstuff\Scripts>type StartLM.bat
        net start "LCM-Service SOPSNHFVHRD.NHFIRST.INT"
        net start "GridAgent - lmdev_grid - GridHost"
        D:
        cd D:\lmdev
        call enter.cmd
        cd E:\Supportstuff\Scripts
        call startlaw
        net start "IBM WebSphere Application Server V7.0 - SOPSNHFVHRDlmdev01"
        net start "IBM WebSphere Application Server V7.0 - SOPSNHFVHRDCellManager01"
        net start "IBM WebSphere Application Server V7.0 - lmdev_app1"
        net start W3SVC

        E:\Supportstuff\Scripts>type StopLM.bat
        net stop W3SVC
        net stop "IBM WebSphere Application Server V7.0 - lmdev_app1"
        net stop "IBM WebSphere Application Server V7.0 - SOPSNHFVHRDlmdev01"
        net stop "IBM WebSphere Application Server V7.0 - SOPSNHFVHRDCellManager01"
        D:
        cd D:\lmdev
        call enter.cmd
        cd E:\Supportstuff\Scripts
        call stoplaw
        net stop "GridAgent - lmdev_grid - GridHost"
        net stop "LCM-Service SOPSNHFVHRD.NHFIRST.INT"



        On the target (D:\lmdev\src) (11/3/13 – lawtaint refresh)
        ----------------
        adminlaw (11/3 7:00AM)
        Next 3 steps are completed with the script: E:\SupportStuff\Scripts\lawtaintRefresh.bat (sketched below the commands)
        cdexport -z lawtaint.cdexport.zip lawtaint (exporting customizations, and processflow) (11/3 AM)
        dbexport -Cz lawtaint.save.zip lawtaint ConfigurationParameter (8:00AM)
        dbexport -Cz Roaming.save.zip GEN RoamingUIProfile (11/3)
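
        For clarity, here is a guess at what a wrapper like lawtaintRefresh.bat boils down to, assuming it simply sets the Landmark environment (the same way StartLM.bat does) and runs the three exports above:

        rem lawtaintRefresh.bat (sketch): pre-refresh exports of configuration on the target productline.
        D:
        cd D:\lmdev
        call enter.cmd
        cd E:\SupportStuff\Scripts
        cdexport -z lawtaint.cdexport.zip lawtaint
        dbexport -Cz lawtaint.save.zip lawtaint ConfigurationParameter
        dbexport -Cz Roaming.save.zip GEN RoamingUIProfile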

        on the source (Production)
        -----------------
        Adminlaw (11/3 7:00AM)
        Next 2 steps run as local lawson account with the script: E:\SupportStuff\Scripts\Refresh.bat (sketched below)
        daexport -z lawtaprd.daexport.zip -e lawtaprd (11/3 7:41AM)
        cdexport -z lawtaprd.cdexport.authsecurity.zip --authsecurity lawtaprd (11/3 7:48AM)
        exportidentities ALL AllIdentities.xml (Not running because it hangs).
        export LAWTAPRD database using database utility (11/3 4:00AM)
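
        Likewise, Refresh.bat on the source side presumably just wraps the two export commands listed above.  A sketch, with the caveat that the Landmark install path on the production host may differ from the D:\lmdev used in the development scripts:

        rem Refresh.bat (sketch): production-side exports, run as the local lawson account.
        D:
        cd D:\lmdev
        call enter.cmd
        cd E:\SupportStuff\Scripts
        daexport -z lawtaprd.daexport.zip -e lawtaprd
        cdexport -z lawtaprd.cdexport.authsecurity.zip --authsecurity lawtaprd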

        on the target (Development)
        import LAWTAPRD data to LAWTAINT using database utilities (11/3, begin at 10:00AM, finished)
        Received environment from DBA (11/4 9:34AM).
        StartLM.bat (11/4 9:55AM).
        Adminlaw
        Disable ERPS security triggers (11/4 8:00AM)
        Copy zip files over from source (production) (11/4 8:19AM)

        The following can be run from E:\SupportStuff\Scripts\lawtaintImports (a consolidated sketch appears at the end of this post)

        daimport --sameenv -w -I lawtaprd.daexport.zip lawtaprd=lawtaint (11/4 10:01AM).

        cdimport -I lawtaint.cdexport.zip lawtaint (11/4 10:12AM)
        (Gives a significant number of unique constraint errors)
        cdimport --authsecurity -I lawtaprd.cdexport.authsecurity.zip lawtaint (11/4 10:20AM).
        (all gave unique constraint errors)
        dbdeletedata -d -Y lawtaint ConfigurationParameter (11/4 10:22AM)
        dbimport -C -z lawtaint.save.zip lawtaint ConfigurationParameter (11/4 10:23AM)
        scupdate -f lawtaint (11/4 10:23AM)
        datamaint --auditsnapshot -Uc lawtaint (11/4, 9:32AM) Cancelled. Needs to be re-run.
        Startlaw and recycle the rest of the development ERP system (10:25AM). (Only the Lawson account could log in.)
        Run StopLM.bat and StartLM.bat (11:04AM)
        Use FDM to re-point forms with LSF/S3 integration from prod to devnew.

        The DBA leaves the lawsonit and lawtait imports running. Once he had validated them Monday morning, he re-cycled the entire development environment.
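
        Pulling the command-line pieces of the import side together, here is a consolidated sketch of just the Landmark steps (the database import itself, the FDM re-pointing, and the stop/start recycles stay manual, and the unique constraint errors from the cdimport runs are expected, per the notes above):

        rem lawtaintImports (sketch): Landmark imports after the DBA has loaded the LAWTAPRD data into the lawtaint productline.
        D:
        cd D:\lmdev
        call enter.cmd
        cd E:\SupportStuff\Scripts\lawtaintImports
        daimport --sameenv -w -I lawtaprd.daexport.zip lawtaprd=lawtaint
        cdimport -I lawtaint.cdexport.zip lawtaint
        cdimport --authsecurity -I lawtaprd.cdexport.authsecurity.zip lawtaint
        dbdeletedata -d -Y lawtaint ConfigurationParameter
        dbimport -C -z lawtaint.save.zip lawtaint ConfigurationParameter
        scupdate -f lawtaint
        datamaint --auditsnapshot -Uc lawtaint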


        Srini Rao
        Veteran Member
        Posts: 148
          How long does it take to import the data?  It's taking 11 hours to finish the daimport.  Is there a way to improve the speed?