| Author |
Topic |
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-06-30 : 13:34:02
|
| Still coming like a freight train... the more I find out, the less I want to know. OK, so rocket scientist boy set up log shipping for a 2 GB database. I know nothing about it yet, but I do understand that there is a third server to manage the log shipping. Now the database is such a mess that he's scheduled monthly reindexes for all the tables. The log then becomes a gig, which he ships and applies to the standby box? WHY? I don't know; he dumps the database every night. It's all outside the corporate prod environment (which is why I expect they're moving it), so I mentioned that it all has to be brought in house before we can support it.

The questions are: can you "turn off" log shipping so it doesn't send the maintenance? Can you dump the database nightly and restore that to a checkpoint? And what's with this "watermark" term he keeps throwing around?

Seems like using a jackhammer to drive a nail if you ask me. I mean, if you had hundreds of servers I guess it would be nice to have something to manage it, but I've never had a problem doing it the "old fashioned" way.

Any thoughts/comments appreciated.

Brett
8-) |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2003-06-30 : 13:38:39
|
| Rocket scientist is only reindexing once per month? Not good.

The log has to be applied to the standby server. Every log has to be applied to the standby server; this is how transaction logs work. No, you cannot turn off log shipping so it doesn't send the maintenance job information. And no, you cannot dump the database nightly and restore that to a checkpoint. That defeats the purpose of log shipping. Of course you could do what you are asking, but then you would break log shipping and need to recreate it every single day.

Don't know what he means by watermark.

Tara |
 |
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-06-30 : 14:35:29
|
| I set up my own model based on a full nightly dump, and then I dump tran logs every 15 minutes. When maintenance occurs, it's dump / maint / dump; if all goes well, use the later dump. And we do this once a week.

How long does it take to apply a 1 GB tran log?

Brett
8-) |
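Roughly, that model is just two scheduled commands. A minimal sketch, assuming SQL Server 2000 syntax; the database name and paths are made up for illustration:

```sql
-- Nightly full dump (scheduled once a day)
BACKUP DATABASE MyAppDB
TO DISK = 'D:\Backup\MyAppDB_full.bak'
WITH INIT

-- Tran log dump (scheduled every 15 minutes)
BACKUP LOG MyAppDB
TO DISK = 'D:\Backup\MyAppDB_log.trn'
WITH NOINIT
```

The dump / maint / dump sequence is the same full-backup command run immediately before and after the reindex job.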
 |
|
|
nr
SQLTeam MVY
12543 Posts |
Posted - 2003-06-30 : 14:52:26
|
| I think Microsoft's supplied log shipping uses another database to keep track of what's going on, which is probably where the third server comes from. I never really saw the point in this: it's much easier just to copy any logs that turn up and apply them.

How long does it take to apply a 1 GB tran log? Too long. You have a 2 GB database, so if the log gets to 1 GB you might as well do a full backup and apply that.

==========================================
Cursors are useful if you don't know sql. DTS can be used in a similar way.
Beer is not cold and it isn't fizzy. |
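Copying logs over and applying them by hand is just a restore sequence. A sketch, with hypothetical names and paths; WITH STANDBY keeps the standby database readable between restores, while WITH NORECOVERY is the alternative if you never need to query it:

```sql
-- Seed the standby once from a full backup, leaving it able to accept logs
RESTORE DATABASE MyAppDB
FROM DISK = 'D:\LogShip\MyAppDB_full.bak'
WITH STANDBY = 'D:\LogShip\MyAppDB_undo.dat'

-- Then apply each log that turns up, in the order it was taken
RESTORE LOG MyAppDB
FROM DISK = 'D:\LogShip\MyAppDB_20030630_1500.trn'
WITH STANDBY = 'D:\LogShip\MyAppDB_undo.dat'
```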
 |
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-06-30 : 15:25:19
|
| Well, I guess that's my point (and my problem). For a system that supposedly no longer requires any maintenance (yeah, right) or any new development (yeah, right), they've allocated 1/4 FTE to support this F_CK'n mess.

Can't wait to see transition.

Brett
8-) |
 |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2003-06-30 : 15:32:38
|
| The third server is the monitoring server. It does not have to be a separate machine, though: you can have two servers involved in log shipping where one of them is also the monitoring server. The monitoring server is used purely to monitor log shipping; it is not the one that knows which transaction logs to apply. The maintenance plan knows about that part.

I disagree with Nigel about the 1GB part. It only takes about 10-30 minutes on our servers to apply a 1GB transaction log. The whole purpose of log shipping is disaster recovery. So what if the transaction log gets to 1GB during maintenance? Log shipping will automatically copy the file over to the standby server and then apply it. Do you really want to do a full backup after the maintenance, then do a restore, then set up log shipping again? If you apply the full backup, meaning restore it to the standby server, you now have to recreate log shipping.

Tara |
 |
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-06-30 : 15:38:21
|
| I guess my point is, I've never used log shipping, and never had any problems. Built my own, I guess.

It seems that there are some disadvantages to using log shipping. I mean, why bother to apply the maintenance when all you have to do is do the work (maint) once and then dump and restore? And what if you had lots of servers and all of the maint had to be performed (basically) over again? Why bother? Dump and restore. Done. MOO.

All of this from ignorance of a product I know nothing about... Any enlightenment on the pros of the product, I'm all ears.

Brett
8-) |
 |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2003-06-30 : 15:47:01
|
The disadvantage that you listed is not a disadvantage. So what if a transaction log file is 1GB from maintenance? Log shipping will automatically apply it for you on the standby server.

quote: Why bother.

Are you at work when the reindexing occurs? I know I'm not. Who is going to do the dump and restore? Yes, you could do it through jobs, but why not just let log shipping handle it for you?

Tara

Edited by - tduggan on 06/30/2003 15:49:39 |
 |
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-06-30 : 15:50:55
|
Neither am I... I schedule jobs that kick off procedures, and any failure beeps the prod dba (that is).

Well, the short story is rocket scientist boy built a rocket ship for a tiny database, holding it together with spit and chewing gum, and now he has to set it up in a glass house... Can't WAIT.

Brett
8-) |
 |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2003-06-30 : 15:57:31
|
| Ah, I see: if a problem occurs, the prod DBA gets paged. Well, here I'm the prod DBA (well, not only me, but others too). Log shipping is very reliable; if it weren't, we wouldn't be using it. I rarely get paged on log shipping problems. Typically, if we do get paged, it's because the restore can't happen because the transaction log file hasn't copied over yet (the file is in use by the copy). But we don't get paged on the first failure; we get paged when the standby server's database is out of synch by more than 45 minutes (a configurable value).

If you don't want the maintenance to get applied to the standby server, then don't use log shipping; use custom jobs instead. As a production DBA, though, I would recommend log shipping as the disaster recovery solution.

Tara |
 |
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-06-30 : 16:06:02
|
| Well, since they won't pay for any changes, I guess I'll have to start reading about it and get ready to support it. On a scale of 1 to 10, how difficult is it to configure and manage?

Brett
8-) |
 |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2003-06-30 : 16:10:39
|
| At first, log shipping can seem a bit tricky. But after you have created it a few times and have had to troubleshoot, you'll know it like the back of your hand. Log shipping is just full backups on a primary server, transaction log backups on the primary server, copying the transaction logs over to the secondary server, restoring the transaction logs on the secondary server, and monitoring of log shipping. That's really it. So you'll end up with 6 jobs plus one extra object in Enterprise Manager on the monitoring server. The 6 jobs are:

Full backup on primary
Transaction log backup on primary
Copy job on secondary
Restore job on secondary
Copy alert job on monitoring server
Restore alert job on monitoring server

Tara |
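The copy and restore pair on the secondary boil down to something like this. This is only a hedged sketch of the idea: the jobs Enterprise Manager actually creates call its own log shipping procedures, and the server, database, and path names here are hypothetical:

```sql
-- Copy job step: pull any new log backups from the primary's share
EXEC master..xp_cmdshell 'copy \\PRIMARY\LogShip\*.trn D:\LogShip\'

-- Restore job step: apply the copied log, keeping the standby loadable
RESTORE LOG MyAppDB
FROM DISK = 'D:\LogShip\MyAppDB_log.trn'
WITH STANDBY = 'D:\LogShip\undo.dat'
```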
 |
|
|
nr
SQLTeam MVY
12543 Posts |
Posted - 2003-06-30 : 16:24:46
|
quote: I disagree with Nigel about the 1GB part. It only takes about 10-30 minutes on our servers to apply the 1GB transaction log. The whole purpose of log shipping is disaster recovery. So what if the transaction log gets to 1GB during maintenance, log shipping will automatically copy the file over to the standby server and then apply it. Do you really want to do a full backup after the maintenance and then do a restore and then setup log shipping? If you apply the full backup, meaning restore to the standby server, you now have to recreate log shipping.Tara
I do this by just scheduling a tran log backup on the source server. The dest server extracts any log files that turn up in the directory and applies them. No monitoring (except checking that logs get applied, but that's part of all the other monitoring that goes on). No setup: if you apply a backup, just make sure the next log in the directory is the first one taken after the backup. There are just two independent jobs running on the two servers.

==========================================
Cursors are useful if you don't know sql. DTS can be used in a similar way.
Beer is not cold and it isn't fizzy. |
 |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2003-06-30 : 16:33:00
|
quote: I do this by just scheduling a tran log backup on the source server. The dest server extracts any log files that turn up in the directory and applies them.

How is that different from what log shipping is doing?

quote: No monitoring (except checking that logs get applied but that's part of all the other monitoring that goes on).

Monitoring is just a convenience thing. You don't have to use it; log shipping runs perfectly fine without it.

quote: No setup - if you apply a backup just make sure the next log in the directory is the first one taken after the backup.

True. But why bother applying the full backup? On our servers, applying the full backup takes about the same amount of time as applying a large transaction log (one caused by maintenance like reindexing), so why bother with the extra logic?

quote: There are just two independent jobs running on the two servers.

Yeah, but how many job steps are there?

Tara |
 |
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-06-30 : 16:47:52
|
quote: There are just two independent jobs running on the two servers.
Tara, it's a pretty simple model. And why do they discuss the databases getting out of synch and the need for them to be monitored? I know if I dump and restore they'll be in synch... not that I can probably get out of maintaining the log shipping he set up.

Brett
8-) |
 |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2003-06-30 : 16:54:47
|
| The monitoring is a quick and easy way to look at ALL of the log shipped databases at one time. You don't need to use it, though. We log ship about 20 databases, so it's much easier to look at the log shipping monitor than to look at the jobs.

Databases being out of synch occurs in your model too. Out of synch is the time between the backup of the transaction log and the restore of the transaction log. A threshold can be set on this value; we have it set up for 45 minutes, which is actually the default too. It is important to know if your databases are out of synch for disaster recovery situations.

Log shipping is just as simple as the model that you are using. It is actually simpler, because the maintenance plan does everything for you.

Tara |
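One way to gauge how far behind a standby is, without the monitor, is to query the standard msdb history tables on the secondary. A sketch assuming SQL Server 2000's msdb schema; check it against your own server before relying on it:

```sql
-- How stale is each log shipped database on this standby?
-- The backup_finish_date of the last restored log backup tells you
-- the point in time the standby is currently recovered to.
SELECT rh.destination_database_name,
       MAX(bs.backup_finish_date) AS last_restored_log_taken_at,
       DATEDIFF(minute, MAX(bs.backup_finish_date), GETDATE()) AS minutes_behind
FROM msdb.dbo.restorehistory rh
JOIN msdb.dbo.backupset bs
  ON bs.backup_set_id = rh.backup_set_id
WHERE rh.restore_type = 'L'  -- log restores only
GROUP BY rh.destination_database_name
```

Alert on minutes_behind exceeding your threshold (45 minutes in the setup above) and you have roughly what the out-of-synch alert does.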
 |
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-06-30 : 19:33:13
|
| Well, the "out of synch-ness" I was referring to was/is not a timing issue on my rocket scientist's part... but who knows, I can never get a straight answer...

Any good books that specialize in log shipping and replication?

Brett
8-) |
 |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2003-06-30 : 19:40:06
|
| I haven't read any books about log shipping, so I can't recommend any. I learned about it from my boss. He has dealt with log shipping for quite some time; he has done it using your model and now this way. He prefers the log shipping method due to its ease of setup and also because of the monitoring aspect. I agree with him. You'll see what I mean if you ever have over 5 log shipped databases to monitor.

After setting up log shipping a couple of times, you'll be a master at it too. It really isn't hard at all. Let me know what you think about log shipping after you become more comfortable with it.

Tara |
 |
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-06-30 : 19:46:21
|
quote: I agree with him.
That's a pretty good idea, considering he's your boss. I'll let you know how it shakes out.

Correct me if I'm wrong, but I have to upgrade to Enterprise Client, no?

And it's 7:45 EST... what the hell am I still doing here?

Brett
8-) |
 |
|
|
nr
SQLTeam MVY
12543 Posts |
Posted - 2003-06-30 : 21:05:35
|
quote: I haven't read any books about Log Shipping, so I can't recommend any. I learned about it from my boss. He has dealt with log shipping for quite some time. He has done it using your model and now this way. He prefers the log shipping method due to its ease of setting up and also because of the monitoring aspect. I agree with him. You'll see what I mean if you ever have over 5 log shipped databases to monitor.After setting up log shipping a couple of times, you'll be a master at it too. It really isn't hard at all. Let me know what you think about log shipping after you become more comfortable about it.Tara
Well, I've done 15 like this. For monitoring, I don't want to go to a separate screen; I include it with all the other monitoring, and all I have to do is see that the monitoring process is running and its alert is working. I just say "here's a database name, here's a directory," and the SP sets up the log shipping (well, if it can get through to the directory).

==========================================
Cursors are useful if you don't know sql. DTS can be used in a similar way.
Beer is not cold and it isn't fizzy. |
 |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2003-06-30 : 22:06:06
|
quote: Correct me if I'm wrong, but I have to upgrade to Enterprise Client, no?
No, you do not. There is no such thing as an Enterprise client. Enterprise Edition is for the server. Client tools aren't Enterprise or Standard; they're just client tools. So what you've got already is fine.

Tara |
 |
|
|