Last Post 08/22/2011 4:47 PM by  val
Copy data from Production environment to Test environment
 27 Replies
Linda Herrmann
01/19/2007 11:57 AM
    App 8.1.0, Env 8.0.3, Oracle 9.2.0
    We would like to copy the data only from our Production environment (productline "PROD") to our Test environment (productline "TEST"). These environments are on two separate UNIX servers. There is currently data in TEST, but it can & should be replaced with the PROD data.
    The goal is to give end-users a 'safe' environment in which they can create and test Crystal reports prior to moving into Production. We don't want to impact Production performance.
    We would be doing this weekly or bi-weekly, undecided as of now.
    Which process from the above post would best meet our needs?
    phay
    01/19/2007 12:36 PM
We only do this migration monthly, but we have done it weekly. The EXPORT/IMPORT using a NICKNAME back to DB2 on z/OS from UDB on AIX causes NO interruptions to anyone on z/OS. You can customize the EXPORT to pull just a week's worth of data. Use SPUFI on z/OS to generate the NICKNAME, EXPORT & IMPORT statements:
e.g.:
    SELECT 'CREATE NICKNAME DB2PNICK.' ||
    NAME || ' FOR DB2PSERV.' || STRIP(CREATOR) ||
    '.' || STRIP(NAME) || ' ; '
    FROM SYSIBM.SYSTABLES
    WHERE CREATOR = 'LAWPRD';
This generates statements like:
CREATE NICKNAME DB2PNICK.APINVOICE FOR DB2PSERV.LAWPRD.APINVOICE ;
EXPORT TO /dbdata/DBUDBPRD/exports/APINVOICE.IXF OF IXF SELECT * FROM DB2PNICK.APINVOICE WHERE BATCH_DATE > (CURRENT DATE - 1 WEEK) ;
IMPORT FROM /dbdata/DBUDBPRD/exports/APINVOICE.IXF OF IXF MODIFIED BY INDEXSCHEMA=LAWPRD CREATE INTO LAWPRD.APINVOICE IN LAWDAT01 INDEX IN LAWIND01 ;
Using EXPORT/IMPORT this way avoids a lot of REORGing to fix the locations of tables & indexes.
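A small driver script can generate those per-table statements as well. This is only a sketch: the table list, nickname schema, export directory, and tablespace names are placeholders taken from the example above, so substitute your own.

```shell
#!/bin/sh
# Generate EXPORT/IMPORT statements for a list of tables, mirroring the
# SPUFI-generated SQL above. SCHEMA, EXPDIR, and the tablespace names
# (LAWDAT01/LAWIND01) are assumptions -- adjust for your environment.
SCHEMA=LAWPRD
EXPDIR=/dbdata/DBUDBPRD/exports

gen_statements() {
    for TABLE in "$@"; do
        echo "EXPORT TO $EXPDIR/$TABLE.IXF OF IXF SELECT * FROM DB2PNICK.$TABLE ;"
        echo "IMPORT FROM $EXPDIR/$TABLE.IXF OF IXF MODIFIED BY INDEXSCHEMA=$SCHEMA CREATE INTO $SCHEMA.$TABLE IN LAWDAT01 INDEX IN LAWIND01 ;"
    done
}

# Example: generate statements for two tables
gen_statements APINVOICE APDISTRIB
```

Redirect the output to a file and run it with the DB2 command line processor once you have reviewed the generated statements.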
    John Henley
    01/19/2007 1:57 PM
    Linda, I split this topic off from the original post, since it's a little different.

    To reiterate, you want to copy your production data (UNIX server) to your test environment which is also a UNIX server. Since your production and test are in separate environments running on separate servers, you can't use Lawson's dbcopy, which only copies between product lines running in the same environment on the same server.

    There are a few other things we would want to look at:
1. Are you running the same database vendor/version on both servers?
    2. If so, are the schemas set up the same (i.e. for the sake of argument, let's assume your production database uses a schema named 'lawson').
    3. What is the size of the production database?

In some cases, it may be as simple as a detach/copy/reattach of the physical database files from production to the test server.

    If that's not possible (e.g. you can't detach the production environment), you would need to dump from the production environment to the test environment.

To do that you have a choice: use the native database tools, or the Lawson tools (expsysdb on the production server, then bldxxxddl and impexp on test).
    Thanks for using the LawsonGuru.com forums!
    John
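The Lawson-tools route can be summarized as a sequence. The sketch below only prints the intended order of operations: the exact arguments to expsysdb, bldora9ddl, and impexp vary by environment version, so treat every flag and path here as a placeholder and check the utility reference before running anything for real.

```shell
#!/bin/sh
# Dry-run sketch of the Lawson-tools path described above. Product line
# names and the dump file path are placeholders; the argument forms shown
# are assumptions, not documented syntax.
PRODLINE_SRC=PROD
PRODLINE_DST=TEST
DUMPFILE=/tmp/prod_dump

plan() {
    echo "# on the production server:"
    echo "expsysdb $PRODLINE_SRC $DUMPFILE"
    echo "# on the test server (Oracle 9 => bldora9ddl):"
    echo "bldora9ddl $PRODLINE_DST"
    echo "impexp $PRODLINE_DST $DUMPFILE"
}

plan
```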
    Sam Simpson
    01/19/2007 4:09 PM
    Linda,

We usually use metadumptbl/metaloadtbl or dbdump/dbload for small numbers of tables. But we call upon our DBA when it involves large dumps that may take several hours/days to do. Our DBA has some very good scripts that can copy a database from one schema to another.
    John Henley
    01/19/2007 4:45 PM
    Sam, just to clarify:

    metadumptbl/metaloadtbl are used to dump/load the table DEFINITIONS--not the data.

    dbdump/dbload are used to dump/load table DATA, table-by-table.

    expsysdb/impexp are used to dump/load table DATA, for an entire product line and/or specific systemcode(s).
    John
    Linda Herrmann
    01/22/2007 9:14 AM
    John -
1. Are you running the same database vendor/version on both servers? YES.

    2. If so, are the schemas set up the same (i.e. for the sake of argument, let's assume your production database uses a schema named 'lawson'). NO. Production schema=PROD, Test schema=TEST.

    3. What is the size of the production database? I don't know. But I will find out from our DBA.
    Production database = 2.6GB

    Our DBA states detaching from prod environment is not feasible.

    Thanks!!
    Linda Herrmann
    01/23/2007 12:52 PM
    John -

    Using Lawson tools (expsysdb on the production server/bldxxxddl then impexp on test) would mean bldora9ddl in our situation.
    Is this the order in which these tools need to be executed?
    1) expsysdb (prod server)
    2) bldora9ddl (test server)
    3) impexp (test server)

If so, is our database size of 2.6GB likely to present a problem?

    Thank you.
    John Henley
    01/23/2007 1:22 PM
Those are the correct steps. Size shouldn't be a problem (assuming your UNIX settings are 'modern' and you don't have a problem with files > 2GB).

    One additional option that can save you a little time, but which adds a couple of steps, is to use bldora9ddl to build the (empty) tables, then run bldora9ddl -UDI to DROP the indexes, load the data, then run bldora9ddl -URI to re-build the indexes. The advantage is that the load will go much faster since there are no indexes.

    However, given that your database is pretty small, I don't think that's worth the extra steps, and would just recommend importing with the indexes in place.
    John
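The drop-indexes variant John describes can be laid out as a checklist. As above, this is a dry-run sketch that only prints the steps; the -UDI/-URI flags come from the post itself, but the product line argument and dump path are placeholders.

```shell
#!/bin/sh
# Dry-run of the faster-load variant: build empty tables, drop indexes,
# load the data, then rebuild indexes. Flags -UDI and -URI are taken from
# the post above; the product line name and dump path are assumptions.
PRODLINE=TEST

fast_load_plan() {
    echo "bldora9ddl $PRODLINE              # create empty tables"
    echo "bldora9ddl -UDI $PRODLINE         # drop indexes"
    echo "impexp $PRODLINE /tmp/prod_dump   # load data (faster with no indexes)"
    echo "bldora9ddl -URI $PRODLINE         # rebuild indexes"
}

fast_load_plan
```

The speedup comes from the database not maintaining index entries row-by-row during the load; the rebuild at the end sorts and creates each index in one pass.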
    Xin Li
    01/28/2008 8:25 AM
You can use the Oracle tools exp and imp to unload and reload data from database to database. Your Oracle DBA should know how to do it.
    msjmg111
    03/11/2008 12:57 PM

Check out Lawson KB article 557058.  I had issues with expsysdb/impexp: errors, and the copy was taking too long.  I worked with Lawson, and the steps in the KB article have recently been updated.  One additional step we did that is not in the article: we created a userid that has only read access to the prod tables.  We use that userid in the ORACLE file to keep the process from inadvertently writing to our production system.

    Barry Ghotra
    12/10/2008 2:56 PM
I know I'm reopening an old topic, but has anyone tried this (John's recommendation or the KB article) for version 9 apps?
    a) If you simply copy the "lawson" schema (via exp/imp or other conventional methods) between two separate servers, will that do the trick?
    b) In version 9 (LSF9 with Lawson 9 apps), can you still do a dump of jobs and printfiles and load that into a new empty environment?

    Thanks.

    Barry Ghotra
    12/10/2008 3:10 PM
One more thing: if we are using Oracle on AIX, can't we do a TWO_TASK (listener) setup to access the database on the separate servers? If so, wouldn't dbcopy be the best and fastest approach?

    Still have questions about the scheduled jobs and printjobs!!

    Thanks.
    jdl
    12/11/2008 10:39 AM
I am interested in sqldbcopy - we use Oracle but haven't yet figured out how to get it to work. I have used both the dbcopy and the bldora9ddl methods. The latter is much quicker, but if anyone can list the details of sqldbcopy for an Oracle dbms, that would be great.
    John Henley
    12/11/2008 8:27 PM
The target user/schema has to have SELECT access to the source tables.

    Both databases have to exist on the same server.

    If both databases (source and target) don't exist in the same database instance, you can use a db_link (the -l parameter below):

> sqldbcopy -d --ss sourceschemaname --sd sourcedatabase --ds targetschemaname --dd targetdatabase --du targetusername --dp targetpassword -h 192.168.200.1:1521 -l dblinkifneeded ora10 sourceproductline targetproductline

    John
    Barry Ghotra
    12/11/2008 8:35 PM
Thanks. John, could you please give some feedback on this:
    If you need to migrate the entire environment from old to new hardware, will it work to re-create the entire environment and then:
    a) If you simply copy the "lawson" schema (via exp/imp or other conventional methods) between the two servers, will that do the trick?
    b) In version 9 (LSF9 with Lawson 9 apps), can you still do a dump of jobs and printfiles and load that into a new empty environment?

    Is there a way to safely copy users, jobs, etc. from lsf9 old to lsf9 new environment?


    Ben Coonfield
    12/12/2008 3:14 PM

    To migrate an entire LSF9 environment to new hardware, I would do it in four parts:

    Environment software (best to do a new install)

    ProductLine (Use productline copy procedure in the AMT)

    Database (I would use an Oracle export / import. There are many other methods but I find this fastest as we do this operation frequently.).

    Other GEN/LOGAN data.

    The last part is a bit challenging. I would migrate over different pieces by different methods, some custom.  Lawson provides utilities for some parts, but not for others.  There is still a Lawson utility to dump jobs which should work fine. (I usually have a few jobs that fail to load but this is due to invalidly defined jobs on the source system.) Other areas like tokendefs have dump/load utilities, but I usually need to do additional updates, such as changing the path names or product line name in user tokens.

Migrating users is more difficult. Lawson does not provide any utility to dump users in a format suitable for loading on another server. You should be able to do this with LDAP tools, but it is complex, and Lawson doesn't provide any assistance.  If you must preserve the old user passwords (and you have not done an LDAP bind to Active Directory), you will need to do this.  But if you can tolerate reassigning new passwords, you could rebuild the users (probably with the loadusers utility).

    It is possible to physically copy the entire disk storage from the old server to the new, if the hardware is sufficiently similar – copying not just Lawson but the entire operating system. But you need to redo a number of configuration files to adjust them for the difference in IP address & hostname. It helps if you give the new hardware the same hostname (or alias) you used on the old system. But this process is undocumented and unsupported so you may need to do some research.  If you can even reuse the old IP address this might be an easy process, but only if the new hardware is sufficiently similar, e.g. the same device drivers work, same size & layout of storage, etc.  That is probably not likely.

    John Henley
    12/12/2008 3:47 PM
    Excellent post!
    John
    Barry Ghotra
    12/12/2008 4:14 PM
    Hi Ben

    Thank you so much for the detailed explanation. I have the ability or luxury to make the new server environment completely identical, but I think I will stick with the new install as that's the cleanest.

I understand the LOGAN/GEN and Lawson export/import, which should be doable. What I'm not clear on is the product line copy. If I'm already copying the apps data by Lawson export/import and copying GEN/LOGAN data by rnddbdump or other means, why do I still need to copy the product line? Can't I simply create a new blank product line (even with the same name) in the newly installed environment, create empty tables, and simply import the Lawson data using the import method (ignore=Y for existing objects)?

    Would appreciate if you could clarify that.

    Thanks.
    Ben Coonfield
    12/15/2008 6:46 AM
Recall that the installation or upgrade of the environment and of the applications are completely separate processes. On the new hardware you will be reinstalling the environment software, but this is separate from the actual applications. The application code (4GL programs) is typically not reinstalled but copied. In principle you could reinstall the applications from scratch on the target system, but you would want to be sure you then installed exactly the same combination of MSPs & patches that you have on your old system. It is easier and more accurate to copy all of this over from your old system, which is what the productline copy procedure is intended to do.

    If you were to copy the database to a target product line that had a different combination of MSPs or CTPs applied you would probably have problems. This would almost certainly be the case if you reinstalled the applications on the target, rather than copying them. Those problems could be serious depending on exactly which CTPs differed between the two systems.

The productline copy really has two parts -- copying over all the definitions and code and recompiling everything, and copying over the database contents. The former is covered in detail in the AMT guide. The part relating to the database is commonly done at the same time, but since the details are DBMS-specific and most people use database-supplied tools rather than Lawson utilities for the database copy, that part is not covered in the AMT document.
    Barry Ghotra
    12/19/2008 5:12 PM
Ben, thanks for the detailed explanation; I totally agree with your comments. As I have the exact patches and the sequence I applied them in back in June, I was thinking of completely reinstalling the environment and then re-installing the apps, at that point migrating the lawson schema over using export/import (Oracle), and then just getting the jobs setup and the print jobs over. I only have one product line (with the financial suites) and about 25 users for this particular environment, so that seems the most straightforward way.

On the other note, as I have the luxury of literally having the same OS environment (hostname, file systems, base AIX OS version), I was contemplating copying the file systems over, recompiling the code, and running SSOCONFIG to point to a new ADAM install (as that will be on a newly named server). This seems faster, but do you see any issues with this approach? The GEN, LOGAN and LAWSON schemas will be exact database copies, along with the file paths etc.

Please advise, and again thank you for your precise replies.

    Barry Ghotra
    Ben Coonfield
    12/20/2008 8:54 AM
Copying over the filesystems is certainly faster -- not so much because of the productline copy, which doesn't take that long, but because of the environment reinstall, at least if you have a lot of add-ons to install, as I did (EDI, portal, design studio, RSS, MSCM, etc.). All of these, plus their patches, took far longer than the productline copy.

If you try copying the entire filesystems to make an exact clone, you will also need to make changes to the WebSphere & web server configuration. These changes are not documented anywhere by Lawson. I don't have access now to my notes on exactly what I changed, but I did find one PDF document on the internet about moving WebSphere to a new system that described the steps needed for WebSphere. This process is a bit easier if your web server is on the same box.

You will also need to review your web applications to see if any of them have configuration files that need updating. Most probably will not.

Personally, I find using the same productline name on two different systems to be undesirable; the productline copy process gives you an opportunity to change the name.

    Overall, the process to copy the entire system and reconfigure is much faster, once you identify all of the undocumented configuration changes needed. But there are many opportunities to miss something. On the other hand, doing this also means that much of the work for a disaster recovery plan is already done, as this is probably very similar to the process you would use for a disaster recovery situation. I would try it if you are OK preserving the exact disk layout, OS, & hostname, etc.
    Chris12
    06/09/2010 12:21 PM
    I use expsysdb, bld...., impexp
    Jeff White
    06/09/2010 12:33 PM
    For copying data, especially large amounts of data, we have found that Oracle's datapump is the way to go. F-A-S-T!
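For the Data Pump route, the core pair is expdp with a schema export and impdp with a schema remap. In the sketch below the connect strings, credentials, and the DP_DIR directory object are placeholders (a DBA would normally create the DIRECTORY object and grant access to it first); the script only prints the commands for review.

```shell
#!/bin/sh
# Sketch of an Oracle Data Pump copy between schemas. Schema names,
# connect identifiers, and the DP_DIR directory object are assumptions.
SRC_SCHEMA=LAWPROD
DST_SCHEMA=LAWTEST

datapump_plan() {
    # export the whole source schema on the prod instance
    echo "expdp system@PROD schemas=$SRC_SCHEMA directory=DP_DIR dumpfile=prod.dmp logfile=exp.log"
    # import into the test instance, remapping the schema name
    echo "impdp system@TEST remap_schema=$SRC_SCHEMA:$DST_SCHEMA directory=DP_DIR dumpfile=prod.dmp logfile=imp.log"
}

datapump_plan
```

Data Pump runs server-side and can parallelize, which is where the speed Jeff mentions comes from compared to the older client-side exp/imp.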
    beverly godwin
    04/14/2011 2:12 PM
We are being told that we cannot grab a copy of our production (Windows: 9.0.1.5) to put into our test (Windows: 9.0.1.7). We want to do a parallel test with payroll. Our test system was updated with ESPs and MSPs, and that is the best way for us to test this out.

    We are told that they have to be the same msp to do this. Is this true?

We actually did this last week (before being told that we could not), and GSC helped us with rebuilding the data dictionary and doing a reorg. It took about 15 minutes and they say we are good now, so I'm not sure why we couldn't just do that OOPS again to get fresh data into test.
    EricS
    04/14/2011 2:49 PM
Unless you can figure out what table changes were made between 9.0.1.5 and 9.0.1.7, and then figure out how any new/changed fields were populated when the upgrade ran so you can reproduce the upgrade process, I don't see how you could copy a database from one MSP to a database on another MSP. At least not while promising your users that all the data in the database is consistent and correct. All the reorg did was make the table structure consistent; I don't think that's enough.

    The best you could do would be to create a product line on 9.0.1.5, copy data into it, upgrade that to 9.0.1.7, then either test on that product line or copy the data into your existing 9.0.1.7 product line.

    I think GSC should have told you to bring up a 9.0.1.5 product line and upgrade it again.
    John Henley
    04/14/2011 3:26 PM
    EricS is absolutely correct.

    John
    Sampath S
    04/15/2011 4:28 AM
Be mindful of customized tables: if you copy the complete data along with the table structure, any new custom tables defined on your test server will be gone.
    val
    08/22/2011 4:47 PM
Here is how I refresh data in a product line on the test server with all data from the prod product line on the production server (our DB is Oracle):

On Lawson Test App Server:
    Delete tables and data in the test product line:
    bldora10ddl -UD

    On DB Server:
    export ORACLE_SID=PROD (or whatever your production instance SID is)
    Export data:
    exp file=prod.dmp owner=LAWPROD (<--prod database user name) statistics=none indexes=n log=exp.log

On DB Server:
    export ORACLE_SID=TEST
    Import data:
    imp file=prod.dmp fromuser=LAWPROD touser=LAWTEST (<--test database username) statistics=none indexes=n rows=y log=imp.log

On Lawson Test App Server:
    Rebuild indexes:
    bldora10ddl -URI

    On Lawson Test App Server:
    Verify the database is valid:
    verifyora10
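The steps above can be collected into a single script. This sketch runs in dry-run mode by default (it only prints the commands it would run); the SIDs, schema names, and file names are copied from the post and should be verified, and DRY_RUN set to 0, only on a system where bldora10ddl and verifyora10 actually exist.

```shell
#!/bin/sh
# Consolidated test-refresh sketch based on the procedure above.
# DRY_RUN=1 (the default) just prints each command instead of running it.
DRY_RUN=${DRY_RUN:-1}

run() {
    if [ "$DRY_RUN" = "1" ]; then echo "$*"; else "$@"; fi
}

# 1. On the test app server: drop tables and data
run bldora10ddl -UD

# 2. On the DB server: export from the prod schema
ORACLE_SID=PROD; export ORACLE_SID
run exp file=prod.dmp owner=LAWPROD statistics=none indexes=n log=exp.log

# 3. On the DB server: import into the test schema
ORACLE_SID=TEST; export ORACLE_SID
run imp file=prod.dmp fromuser=LAWPROD touser=LAWTEST statistics=none indexes=n rows=y log=imp.log

# 4. On the test app server: rebuild indexes and verify
run bldora10ddl -URI
run verifyora10
```

Keeping it in dry-run form first makes it easy to diff the printed commands against the post before touching either instance.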