matty1stop (Starting Member, 37 Posts)
Posted - 2007-05-21 : 11:38:00

I have two issues I am trying to resolve.

1) I am attempting to add a foreign key relationship to a table with millions of records but keep experiencing a timeout problem.
2) Is it possible, or even advisable, to add a foreign key to a table in a linked database?

Any help you could provide with either of these issues would be greatly appreciated.

Thanks,
Matt
nr (SQLTeam MVY, 12543 Posts)
Posted - 2007-05-21 : 11:45:58

1) You can manually check the data, then add the FK with NOCHECK.
2) What?

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
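For reference, a minimal sketch of that check-then-add pattern; the table and column names here (Orders referencing Customers) are made up for illustration:

-- 1) Manually check for orphaned rows first (should return nothing):
SELECT o.OrderID, o.CustomerID
FROM dbo.Orders AS o
LEFT JOIN dbo.Customers AS c
    ON c.CustomerID = o.CustomerID
WHERE c.CustomerID IS NULL;

-- 2) Add the FK without validating existing rows, which skips the
--    full-table scan that causes the timeout:
ALTER TABLE dbo.Orders WITH NOCHECK
ADD CONSTRAINT FK_Orders_Customers
    FOREIGN KEY (CustomerID) REFERENCES dbo.Customers (CustomerID);

A constraint added WITH NOCHECK is marked untrusted, but new inserts and updates are still validated against it.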
matty1stop (Starting Member, 37 Posts)
Posted - 2007-05-21 : 12:55:29

Thanks for the reply.

1) Yes, that seems to have worked.
2) Sorry if I was unclear. I am moving data from a data warehouse to some aggregate tables in a datamart. The imports are running very slowly and I am trying to optimize them. I am using an INSERT ... SELECT statement to populate the datamart. Some of the source tables for the import are in the datamart and others are in the warehouse. Can I create a foreign key from a table in the datamart db where the primary key is in the warehouse db?
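(For context, the cross-database INSERT ... SELECT described here would look roughly like this; the database, table, and column names below are invented:)

INSERT INTO DataMart.dbo.FactSalesDaily (SaleDate, ProductID, TotalAmount)
SELECT s.SaleDate, s.ProductID, SUM(s.Amount)
FROM Warehouse.dbo.FactSales AS s
GROUP BY s.SaleDate, s.ProductID;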
nr (SQLTeam MVY, 12543 Posts)
Posted - 2007-05-21 : 13:06:31

You can't create a foreign key across databases (think about a restore).

INSERT ... SELECT will be slow. If you can't drop the table and do a SELECT INTO, then bcp the data out and in again, or use SSIS and a fast load.

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
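A rough sketch of the drop-and-SELECT INTO route, using the same invented object names as above:

-- Drop the aggregate table and rebuild it with one SELECT INTO, which can be
-- minimally logged under the simple or bulk-logged recovery model:
IF OBJECT_ID('DataMart.dbo.FactSalesDaily', 'U') IS NOT NULL
    DROP TABLE DataMart.dbo.FactSalesDaily;

SELECT s.SaleDate, s.ProductID, SUM(s.Amount) AS TotalAmount
INTO DataMart.dbo.FactSalesDaily
FROM Warehouse.dbo.FactSales AS s
GROUP BY s.SaleDate, s.ProductID;

Remember that SELECT INTO creates a bare table, so any indexes on the aggregate table have to be recreated afterwards; the bcp and SSIS fast-load options do a similar bulk insert without dropping the table.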
matty1stop (Starting Member, 37 Posts)
Posted - 2007-05-21 : 13:16:21

The main table in the warehouse currently has about 80 million records. The aggregate table in the data mart has 3 million (10 months of data). We are looking to update this data daily. You think the best way to do that is to drop (or truncate?) the table in the data mart and do a select into?

Thanks again for the advice.
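(As an aside: SELECT INTO creates the table itself, so it needs the DROP. If the table is kept instead, the daily reload could be sketched like this, again with invented names; TRUNCATE preserves the table definition and its indexes:)

TRUNCATE TABLE DataMart.dbo.FactSalesDaily;

-- TABLOCK takes a table-level lock up front, which reduces locking
-- overhead while bulk loading:
INSERT INTO DataMart.dbo.FactSalesDaily WITH (TABLOCK)
       (SaleDate, ProductID, TotalAmount)
SELECT s.SaleDate, s.ProductID, SUM(s.Amount)
FROM Warehouse.dbo.FactSales AS s
GROUP BY s.SaleDate, s.ProductID;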