
 DB snapshot


marat
Yak Posting Veteran

85 Posts

Posted - 2004-07-14 : 00:01:04
I have been asked by my manager to back up snapshots of a database's user tables
to DVD on a regular basis (say, once a week). It is supposed to run as a scheduled job on SQL Server 2000.
Obviously I should back up only the tables that have changed since the last backup.
Question: Is there any way, other than comparing tables (as DB tables or as flat files) or using triggers (on insert, update, or delete), to check whether a table has been updated since the last backup?
Thanks
marat

timmy
Master Smack Fu Yak Hacker

1242 Posts

Posted - 2004-07-14 : 00:12:06
Just run an incremental backup and then burn the resulting backup file to disc.

In answer to your question - no, there is no way SQL can tell you this out of the box. If it's that important and you don't like the first option, you'd be better off timestamping all your tables.
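For reference, SQL Server 2000 calls this kind of backup a differential backup. A minimal sketch of the two jobs involved - the database name and paths are placeholders, not taken from this thread:

```sql
-- Weekly full backup; the differential below is relative to this.
-- 'MyDb' and the disk paths are placeholder names.
BACKUP DATABASE MyDb
   TO DISK = 'D:\Backup\MyDb_full.bak'
   WITH INIT

-- Before each DVD burn: backs up only the pages changed
-- since the last full backup.
BACKUP DATABASE MyDb
   TO DISK = 'D:\Backup\MyDb_diff.bak'
   WITH DIFFERENTIAL, INIT
```

Restoring needs the last full backup plus the most recent differential, so both files belong on the disc.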

marat
Yak Posting Veteran

85 Posts

Posted - 2004-07-14 : 20:39:59
Thanks
I want to point out that I am dealing with big tables - 200 million rows,
with a flat-file size of 70 GB. Does it make sense to burn all tables (100 GB of data) to DVD???
I have serious doubts about that.



timmy
Master Smack Fu Yak Hacker

1242 Posts

Posted - 2004-07-14 : 21:26:39

I meant a database backup - not flat files. Exporting flat files would be a nightmare, especially if you have no idea when the rows were added or updated.

An incremental backup should be relatively compact, depending on the regularity of the backups.

marat
Yak Posting Veteran

85 Posts

Posted - 2004-07-14 : 22:00:18
We already have incremental backups of the database.
This is supposed to be an alternative archiving procedure, to quickly restore a table's data from DVD.

Kristen
Test

22859 Posts

Posted - 2004-07-14 : 22:07:54
So you've got a large number of relatively small tables? Otherwise I don't see how this is going to work ... might help to understand the problem you are trying to solve, rather than your manager's proposed solution!

Kristen

marat
Yak Posting Veteran

85 Posts

Posted - 2004-07-14 : 23:28:18
Tables will vary in size up to 70 GB.
The case is:
Our DB architect wants a sort of internal archiving system where table data is stored on DVDs as text files, so that any table can be retrieved for a given point in time without restoring the whole database via the standard methods provided by MS SQL.
For example, if she wants a table snapshot for some point in time in the past, it can be done by retrieving the DVD from the archive and extracting the table from it.
Importantly, this should be done for all databases on the server. The extracted data could be up to 150 GB in size for the server.
Isn't it mad???

derrickleggett
Pointy Haired Yak DBA

4184 Posts

Posted - 2004-07-14 : 23:34:04
I think you've all lost your minds. Fire the architect. Is that an option???

MeanOldDBA
derrickleggett@hotmail.com

When life gives you a lemon, fire the DBA.

timmy
Master Smack Fu Yak Hacker

1242 Posts

Posted - 2004-07-14 : 23:36:18
Then it looks like you need to timestamp your tables.
Then make sure the timestamp is being updated properly.
Then you can easily write a script that extracts only new and updated records.

You could always use the Darryl Kerrigan option - tell her she's dreamin'.....
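A sketch of the timestamp approach; all table and column names here are invented for illustration. SQL Server maintains a `timestamp` (rowversion) column automatically on every insert and update - but note it does not catch deletes, which would still need a trigger:

```sql
-- One-off: add an automatically-maintained version column.
ALTER TABLE MyTable ADD RowVer timestamp

-- Small control table recording the high-water mark of each export.
CREATE TABLE ExportLog
    (TableName sysname, LastRowVer binary(8), ExportedAt datetime)

-- Weekly job: extract only rows inserted or updated since the last export.
SELECT *
FROM MyTable
WHERE RowVer > (SELECT MAX(LastRowVer)
                FROM ExportLog
                WHERE TableName = 'MyTable')
```

After each export the job would store the new maximum `RowVer` back into `ExportLog`.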




derrickleggett
Pointy Haired Yak DBA

4184 Posts

Posted - 2004-07-14 : 23:39:26
You could make a database for every table and replicate every single table to its own database. Then you could just back up/restore those. Wouldn't that be fun? :)

MeanOldDBA
derrickleggett@hotmail.com

When life gives you a lemon, fire the DBA.

marat
Yak Posting Veteran

85 Posts

Posted - 2004-07-15 : 00:01:31
Timestamps aren't an option because data from the whole table has to be extracted; firing the architect isn't an option either, but it could become one.
This is a kind of "just do it and we will see what happens" job.
Some people think that developers can do everything from nothing.


timmy
Master Smack Fu Yak Hacker

1242 Posts

Posted - 2004-07-15 : 00:04:15
You might want to reconsider the incremental backup option.

If there is a problem, you can restore them to another db for analysis. Should be much easier. How often do they need to look at the archived data?

marat
Yak Posting Veteran

85 Posts

Posted - 2004-07-15 : 00:08:10
Tim, I totally agree with you: use MS SQL's standard database restore procedures rather than reinventing the wheel.

derrickleggett
Pointy Haired Yak DBA

4184 Posts

Posted - 2004-07-15 : 00:08:21
Well, when all else fails, do the impossible. SQLServerCentral has a script for you in their library that will let you bcp out tables to text files. You'll have to do a little work on it. You could set it up as a job and have all kinds of kewl logic in it (cough, cough).


http://www.sqlservercentral.com/Scripts/listscripts.asp?categorynm=Backup%20/%20Restore&categoryid=8
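Those scripts boil down to something like the following; the server, database, and path names are placeholders, and it assumes the job account is allowed to run `xp_cmdshell`:

```sql
-- Export one user table to a character-format text file via bcp.
-- -c = character format, -T = trusted connection, -S = server name.
DECLARE @cmd varchar(500)
SET @cmd = 'bcp MyDb.dbo.MyTable out D:\Export\MyTable.txt -c -T -S MyServer'
EXEC master..xp_cmdshell @cmd
```

Looping this over `sysobjects WHERE type = 'U'` turns it into an all-user-tables job.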

MeanOldDBA
derrickleggett@hotmail.com

When life gives you a lemon, fire the DBA.

timmy
Master Smack Fu Yak Hacker

1242 Posts

Posted - 2004-07-15 : 00:22:49
You could always bcp out the table and do a DOS compare with yesterday's output. If you really wanted to create a headache for yourself.


marat
Yak Posting Veteran

85 Posts

Posted - 2004-07-15 : 00:40:38
The strategy was to bcp the tables out and then use a DOS compare, or a VB ActiveX script with the file system object, to compare files and back up a file at the first instance of mismatched lines (which is supposed to be faster).
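As an alternative to diffing 70 GB flat files, SQL Server 2000 can compute an aggregate checksum in-database: if the value matches the one stored at the last export, the table has probably not changed and the export can be skipped. `MyTable` is a placeholder, and note that `BINARY_CHECKSUM` is not a cryptographic hash, so some changes can in rare cases go undetected:

```sql
-- Cheap change test: compare this against the value stored
-- at the last export; re-export only when it differs.
SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) AS TableChecksum
FROM MyTable
```

This trades a full table scan for avoiding the bcp export and file comparison entirely on unchanged tables.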

timmy
Master Smack Fu Yak Hacker

1242 Posts

Posted - 2004-07-15 : 00:56:30

I wouldn't recommend using the FileSystemObject to compare files of this size. You'd be better off writing a custom VB/C app that uses Win32 API functions. The FSO is very inefficient in this respect.

Kristen
Test

22859 Posts

Posted - 2004-07-15 : 09:27:11
I'm still having a struggle with this at a conceptual level.

So, I want the latest CUSTOMER table (let's say). Fine, that's on DVD #123. So I mount that somewhere and take a look. Hmmm ... there are ACCOUNT BALANCE figures here, but I need the transactions. What DVD are they on? #099? No, that's no good, the transactions don't match the account balance; must be the next one - ah, here we go, #124. No, the transactions are for a higher value; a transaction must have been added between when the CUSTOMER table was dumped and when the TRANSACTIONS table was. Blast.

So I ask an expert; and the expert says "What do you suppose 'Relational' means in RDBMS?"

Now, if the tables were for totally non-interconnected data I could see some sense in it. That's more like copying the files from your C drive to CD-ROM - you know, the ones that come off the digital camera that are not repeatable. We all back those up on a regular schedule. Hehehehe!

I think the correct solution to this is to have an archiving system ("TIMESTAMP" as others have been saying). Each TABLE needs a TRIGGER that saves the Old/New data to an "archive" table (probably best off in another database for this job).

The "other database" is backed up every day, and once a week [say] to DVD. Then there is some sort of clear-down that gets rid of old stuff. Basically this Other Database can grow up to, but no bigger than, the size of a DVD.
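A minimal sketch of such a trigger, with invented table and column names; the audit table mirrors the base table's columns plus an action flag and a change date:

```sql
-- Copies every change into the archive database: 'I/U' rows carry the
-- new values, 'D' rows carry the old values (updates) and deleted rows.
CREATE TRIGGER trg_MyTable_audit ON MyTable
FOR INSERT, UPDATE, DELETE
AS
BEGIN
    INSERT ArchiveDb.dbo.MyTable_audit (Action, ChangedAt, Col1, Col2)
    SELECT 'I/U', GETDATE(), Col1, Col2 FROM inserted

    INSERT ArchiveDb.dbo.MyTable_audit (Action, ChangedAt, Col1, Col2)
    SELECT 'D', GETDATE(), Col1, Col2 FROM deleted
END
```

One such trigger and audit table per base table; the archive database is then backed up and cleared down on its own schedule, as described above.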

What this allows the Audit People to do is to see ANY record changed in the last X days/months. Depending on what the rate of change of your data is X may be a significant period of time, and I reckon it will provide the Audit People with a MUCH better means of finding out:

1) Who did the fraud
2) How big the fraud is
3) What is affected by the fraud.

(Replace "Fraud" with "Cock-up" if appropriate)

FWIW we do this for one of our applications. Record retention is 20 years; all changes within the 20-year period MAY need to be retrieved, and changes within the last 18 months MUST be retrievable. There is no requirement for this to be instant - it's only used for s***-happened events - but in such circumstances it is deemed "affordable" to plough through tons of stuff to get to the bottom of what went wrong.

I'd be happy to write a bit more detail about how we go about this if that would be useful, but I'll save myself the bother until your architect has either a) indicated it's useful or b) left!

Kristen

marat
Yak Posting Veteran

85 Posts

Posted - 2004-07-15 : 19:34:38
Hi Kristen,
Just some clarification on how it could work for the case you described.
If we find a corrupted table (say, CUSTOMER), we need to find the most recent uncorrupted copy in the archive, and then retrieve the copies of the related tables archived on or before the date of that uncorrupted CUSTOMER copy. This could be done by searching an archive logging table (fields: table, date archived). I want to point out that if a table hasn't changed, it shouldn't be archived again. Of course, if the data damage is huge, it's better to use a database backup.
I guess that answers your question.

byrmol
Shed Building SQL Farmer

1591 Posts

Posted - 2004-07-15 : 20:01:01
Set up filegroups and files for your database and place the archivable tables on separate filegroups and files. Run incremental backups per filegroup. This way just the tables concerned can be restored.
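A sketch of the filegroup setup and a per-filegroup backup - all names here are placeholders:

```sql
-- Put archivable tables on their own filegroup.
ALTER DATABASE MyDb ADD FILEGROUP ArchiveFG
ALTER DATABASE MyDb ADD FILE
   (NAME = 'MyDb_archive', FILENAME = 'D:\Data\MyDb_archive.ndf')
   TO FILEGROUP ArchiveFG

-- Create (or rebuild) the big tables on that filegroup, e.g.:
-- CREATE TABLE BigTable (...) ON ArchiveFG

-- Back up just that filegroup.
BACKUP DATABASE MyDb
   FILEGROUP = 'ArchiveFG'
   TO DISK = 'D:\Backup\MyDb_ArchiveFG.bak'
```

One caveat: restoring a single filegroup in SQL Server 2000 requires applying subsequent transaction log backups to bring it consistent with the rest of the database.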

DavidM

"Always pre-heat the oven"

marat
Yak Posting Veteran

85 Posts

Posted - 2004-07-15 : 21:03:11
Thanks guys,
your answers just confirm my doubts.
