hbharath
Starting Member
9 Posts
Posted - 2008-01-03 : 06:23:42
Hi all, I am migrating a database from Sybase to SQL Server 2000. I have already created the objects in SQL Server 2000, so I only have to populate them with data. I decided to generate INSERT scripts with the Aqua Data tool and run them against the SQL Server database. This works fine except for a few tables with 150,000 and 900,000 rows: the 150,000-row script fails with an insufficient memory error when I try to run it, and the 900,000-row script doesn't even open in Management Studio. Please help.

Bharath JrDBA

AndrewMurphy
Master Smack Fu Yak Hacker
2916 Posts
Posted - 2008-01-03 : 07:09:55
Can you break it down into smaller batches, using the SET ROWCOUNT or TOP constructs? You may also consider exporting the data to text files and re-importing it with the BCP tool.
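For example, a rough sketch of that round trip with bcp (untested; the paydb database name, file names, field terminator and server/login switches are placeholders to adapt):

rem Export from Sybase with its bcp utility, one character-mode text file per table
bcp paydb..pay_timesheet_tbl out pay_timesheet.txt -c -t "|" -U sa -P <password> -S SYBASE_SRV

rem Load the same file into the matching SQL Server 2000 table
bcp paydb.dbo.pay_timesheet_tbl in pay_timesheet.txt -c -t "|" -U sa -P <password> -S SQL2000_SRV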

hbharath
Starting Member
9 Posts
Posted - 2008-01-03 : 07:56:28
OK, let's take a SQL file which has 150,000 records. If I put SET ROWCOUNT 40000 at the top of the script, how will I identify the 40,001st record for the next execution?

Bharath JrDBA

AndrewMurphy
Master Smack Fu Yak Hacker
2916 Posts
Posted - 2008-01-03 : 09:26:45
set rowcount 40000

insert into table1
select table2.*
from table2
left join table1 on table1.id = table2.id
where table1.id is null

In English: insert into table1 all records from table2 not found in table1, given we match the records on the id columns, limited to 40,000 records. Running this multiple times will move table1 from a record count of 0 -> 40k -> 80k -> 120k -> 160k, etc.

Alternatively, if the id field is consecutive and increasing (a la identity fields), you could do something like:

set rowcount 40000
insert into table1
select * from table2
where id >= @looplimit    -- starting @looplimit at 1
order by id
set @looplimit = @looplimit + 40000
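A fuller sketch of that second approach as a single loop, assuming table2 has an increasing integer id column (the table names and the 40,000 batch size are just illustrative):

-- copy table2 into table1 in 40,000-row batches, tracking the highest id loaded so far
declare @lastid int
set @lastid = 0

set rowcount 40000
while 1 = 1
begin
    insert into table1
    select * from table2
    where id > @lastid
    order by id

    if @@rowcount = 0 break    -- nothing left to copy

    select @lastid = max(id) from table1
end
set rowcount 0                 -- restore normal row limits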

hbharath
Starting Member
9 Posts
Posted - 2008-01-03 : 09:51:18
Hmmm, but I am not inserting from a table; I am inserting from a file which contains INSERT statements generated with the tool I used against the Sybase server (shown in the example below). Or should I try taking a bcp out from the Sybase server? It showed a problem earlier for some tables. I found a tool which splits a large text file into smaller text files, but another DBA told me it might not be reliable.

Example:

INSERT INTO pay_timesheet_tbl(pay_calendar_id, employee_id, activity_id, project_id, project_type, client_id, reg_hours, ot_hours, comp_hours, vac_hours, sick_hours, hol_hours, lwop_hours, notes, company_id, business_unit_id, update_date, update_user, connect_user_id) VALUES(668, 70200271, 65350513, 31, 'COMRCL', 723, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, NULL, 0, 0, '20060118 15:28:42', 'ashta', NULL)
GO
INSERT INTO pay_timesheet_tbl(pay_calendar_id, employee_id, activity_id, project_id, project_type, client_id, reg_hours, ot_hours, comp_hours, vac_hours, sick_hours, hol_hours, lwop_hours, notes, company_id, business_unit_id, update_date, update_user, connect_user_id) VALUES(668, 70200273, 65350511, 35200097, 'SUBCON', 1103, 61.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, NULL, 0, 0, '20060118 15:28:42', 'ashta', NULL)
GO
INSERT INTO pay_timesheet_tbl(pay_calendar_id, employee_id, activity_id, project_id, project_type, client_id, reg_hours, ot_hours, comp_hours, vac_hours, sick_hours, hol_hours, lwop_hours, notes, company_id, business_unit_id, update_date, update_user, connect_user_id) VALUES(668, 70200274, 65350514, 15150091, 'COMRCL', 163, 72.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, NULL, 0, 0, '20060118 15:28:42', 'ashta', NULL)
GO
INSERT INTO pay_timesheet_tbl(pay_calendar_id, employee_id, activity_id, project_id, project_type, client_id, reg_hours, ot_hours, comp_hours, vac_hours, sick_hours, hol_hours, lwop_hours, notes, company_id, business_unit_id, update_date, update_user, connect_user_id) VALUES(668, 70200275, 65350519, 50200099, 'COMRCL', 1132, 40.00, 0.00, 0.00, 0.00, 0.00, 8.00, 32.00, NULL, 0, 0, '20060118 15:28:42', 'ashta', NULL)
GO
INSERT INTO pay_timesheet_tbl(pay_calendar_id, employee_id, activity_id, project_id, project_type, client_id, reg_hours, ot_hours, comp_hours, vac_hours, sick_hours, hol_hours, lwop_hours, notes, company_id, business_unit_id, update_date, update_user, connect_user_id) VALUES(668, 70200276, 65350517, 15150091, 'COMRCL', 163, 68.75, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, NULL, 0, 0, '20060118 15:28:42', 'ashta', NULL)
GO
INSERT INTO pay_timesheet_tbl(pay_calendar_id, employee_id, activity_id, project_id, project_type, client_id, reg_hours, ot_hours, comp_hours, vac_hours, sick_hours, hol_hours, lwop_hours, notes, company_id, business_unit_id, update_date, update_user, connect_user_id) VALUES(668, 70200278, 65350520, 27, 'COUNTY', 306, 64.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, NULL, 0, 0, '20060118 15:28:42', 'ashta', NULL)
GO

Bharath JrDBA

AndrewMurphy
Master Smack Fu Yak Hacker
2916 Posts
Posted - 2008-01-03 : 11:27:00
With BCP your input file should be data only, with a format file to explain to BCP how to parse the incoming records. I took it that you were doing a cross-server query.

"...other DBA...not reliable" - does he/she have any supporting info for that position?

Using the INSERT statements like that would be slow but workable, and for a once-off exercise it could be acceptable.
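For instance, a format file can be generated straight from the target table and then reused for the load (untested sketch; the paydb database name, file names, and the -T trusted-connection switch are placeholders to adapt):

rem Generate a format file describing pay_timesheet_tbl, then load a data-only file with it
bcp paydb.dbo.pay_timesheet_tbl format nul -c -f pay_timesheet.fmt -T -S SQL2000_SRV
bcp paydb.dbo.pay_timesheet_tbl in pay_timesheet.txt -f pay_timesheet.fmt -T -S SQL2000_SRV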

rmiao
Master Smack Fu Yak Hacker
7266 Posts
Posted - 2008-01-03 : 22:21:50
And you can set batch size in bcp.
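For example, with the -b switch the load commits every 10,000 rows instead of running as one huge transaction (same placeholder names as above):

bcp paydb.dbo.pay_timesheet_tbl in pay_timesheet.txt -f pay_timesheet.fmt -b 10000 -T -S SQL2000_SRV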

russell
Pyro-ma-ni-yak
5072 Posts
Posted - 2008-01-04 : 06:06:06
While BCP is much, much faster, you can just use sqlcmd to execute the file.

EDIT: my bad.. SQL 2000... it's osql, not sqlcmd.
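For example, something along these lines runs the generated script from the command line without ever opening it in an editor (server name, database name, and file names are placeholders):

osql -S SQL2000_SRV -E -d paydb -i pay_timesheet_inserts.sql -o pay_timesheet_inserts.log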

hbharath
Starting Member
9 Posts
Posted - 2008-01-04 : 10:33:14
Thanks for the guidance... I will be deploying it tomorrow on the production server. Wish me luck.

Bharath JrDBA

Harris00
Starting Member
19 Posts
Posted - 2010-09-02 : 16:19:55
How do I extract a Sybase 5.5 database table schema to a .sql file and run it on SQL Server to create the objects? I am trying to migrate Sybase 5.5 to SQL Server 2005 and need feedback on the best way to do it.