Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
Alikkin
Starting Member
6 Posts

Posted - 2008-02-07 : 16:27:20
I am serving ad units. In each ad unit I show somewhere between 3 and 10 article headlines. I track the headline impressions to get an idea of the headline click-through rate, and I save the output from the stats process in another table. I am currently evaluating the stats every hour, and then truncating the table every night at midnight.

The problem is that I get lots of impressions, and the database gets bogged down evaluating the data with queries such as:

SELECT COUNT(*) FROM articleimpressions WHERE articleid = x

But the issue isn't the reporting of the data so much as it's the capture. I had to add caching on the ad server because the database couldn't handle the number of inserts.

I thought about parsing the web server log files the next day, but the file sizes seem to be too large, and I can't process them all in one day (at least not on the hardware that I am using). I've thought about splitting the log files by hour, but was wondering if there may be a more "native" solution within SQL Server? Maybe a trigger, and/or a multi-threaded stored procedure that fires and forgets an INSERT statement to a linked server. But performance is the key here.

Thoughts?
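One pattern worth considering for the capture side (a hedged sketch only, not tested against this workload; all table and column names here are illustrative): write raw impressions into a narrow heap table with no indexes, so each insert is a minimal append, then fold the rows into a small summary table on a schedule and empty the heap. Reporting reads the summary, never the raw table.

```sql
-- Hypothetical capture table: a heap with no indexes or constraints,
-- so each INSERT does no index maintenance.
CREATE TABLE articleimpressions_staging (
    articleid INT      NOT NULL,
    hit_time  DATETIME NOT NULL DEFAULT GETDATE()
);

-- Hourly job: aggregate the raw rows into a summary table,
-- then empty the staging table (TRUNCATE is minimally logged).
INSERT INTO articleimpressions_summary (articleid, hit_hour, hits)
SELECT articleid,
       DATEADD(hour, DATEDIFF(hour, 0, hit_time), 0),  -- floor to the hour
       COUNT(*)
FROM articleimpressions_staging
GROUP BY articleid, DATEADD(hour, DATEDIFF(hour, 0, hit_time), 0);

TRUNCATE TABLE articleimpressions_staging;

-- Reporting then hits the small summary table instead of millions of rows:
SELECT SUM(hits)
FROM articleimpressions_summary
WHERE articleid = @x;
```

Caveat: as written there is a window between the INSERT...SELECT and the TRUNCATE where newly arrived rows would be lost; one common mitigation is to alternate between two staging tables (insert into B while aggregating and truncating A).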
CShaw
Yak Posting Veteran
65 Posts

Posted - 2008-02-07 : 21:04:23
I am interested in your question and your solution. Can you share a couple of pieces of information with us: what is the hardware configuration of the SQL Server, how many transactions are you pushing to it when it gets bogged down, and how many indexes are on the table?

Chris Shaw
www.SQLonCall.com
Alikkin
Starting Member
6 Posts

Posted - 2008-02-08 : 10:19:23
70-90 million inserts a day, on a 4-processor server with 4 GB of RAM; 2 indexes on the table.
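At that volume (roughly 1,000 inserts per second on average, higher at peak), per-row INSERT statements plus maintenance on those 2 indexes are often the bottleneck rather than the hardware. One common mitigation, sketched here under the assumption that the ad server can buffer to disk (the file path and rotation scheme are illustrative): append impressions to a local file, rotate it every minute or so, and load each rotated file in one bulk batch instead of tens of thousands of single-row INSERTs.

```sql
-- Hypothetical: the ad server writes one comma-separated line per
-- impression, rotates the file every minute, and a job bulk-loads
-- each rotated file in a single batch.
BULK INSERT articleimpressions
FROM 'D:\impressions\impressions_200802081019.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    TABLOCK   -- a table lock can enable minimal logging on a heap
);
```

Note that minimal logging for BULK INSERT has prerequisites (bulk-logged or simple recovery model, and ideally a heap target with TABLOCK); with 2 indexes on the target you may still pay index maintenance per batch, so dropping nonessential indexes on the capture table is worth testing.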