zafarthesultan
Starting Member
16 Posts
Posted - 2011-01-06 : 10:40:16
Hello everyone. I have a database with 300+ tables and 1000+ stored procedures, and I need to create indexes, but I am not sure what the correct policy should be. All these tables are updated regularly: new records keep being added and records are also deleted frequently. Where do I start? Every column of every table gets selected, updated, and inserted based on other columns. Can anyone throw some light on this? How many indexes should I create on each table, and what type, so that performance improves?
Luismmr_cu
Starting Member
2 Posts
Posted - 2011-01-06 : 11:18:20
Hi zafarthesultan, you need to consider how many records your tables store. If you only have a couple of hundred, you don't need indexes beyond the primary key, which is itself an index.

Use non-clustered indexes on tables where you have more than 100 000 records and you do a lot of "where" clauses in your "select" queries against those tables, but try to do it on the tables you update or delete the least. Do use indexes on varchar or date type fields; they are not recommended on numeric fields like money or float if you have a lot of different values.

After creating your indexes, create a job that runs from time to time updating statistics, so your indexes will be rebuilt after many inserts or deletes; otherwise your indexes won't work as wished.

Also do some googling, it always helps!
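A minimal sketch of the kind of non-clustered index the advice above describes. The table and column names here are hypothetical, not from the thread; adjust them to your own schema:

```sql
-- Hypothetical dbo.Orders table; names are illustrative only.
-- A non-clustered index to support WHERE filters on OrderDate.
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
    ON dbo.Orders (OrderDate)
    INCLUDE (CustomerID, TotalDue);  -- covering columns for common SELECTs
```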
GilaMonster
Master Smack Fu Yak Hacker
4507 Posts
Posted - 2011-01-07 : 03:09:25
quote: Originally posted by Luismmr_cu
Use non-clustered indexes in tables where you have more than 100 000 records and you do a lots of "where" in your "select" queries to those tables but consider to do it on the tables you update or delete the less.

Where did that number come from? Tables far smaller than 100 000 rows benefit from good indexes.

quote: Originally posted by Luismmr_cu
After created your indexes create a job that runs from time to time updating statistics so your indexes will be rebuilt after many inserts or deletes, otherwise your indexes wont work as wished.

Update statistics won't rebuild the indexes. You need a rebuild index job and maybe an update stats job.

--
Gail Shaw
SQL Server MVP
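The distinction Gail draws here can be illustrated with two T-SQL statements (the table name is hypothetical):

```sql
-- Hypothetical dbo.Orders table. These are two different operations:

-- Refreshes the optimizer's statistics only; index pages are untouched.
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Physically rebuilds all indexes on the table, removing fragmentation
-- (and, as a side effect, updates statistics on those indexes).
ALTER INDEX ALL ON dbo.Orders REBUILD;
```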
zafarthesultan
Starting Member
16 Posts
Posted - 2011-01-09 : 05:22:49
Thanks everyone. I think first I will have to analyze my queries, then decide which columns to create the indexes on. But from just going through a few of them, I found that those queries use almost every column of a particular table in the "where" clause. In that case, do I have to create indexes on every column of my table? And is there any tool available that analyzes the queries, rather than doing it manually? In some cases there may be thousands of stored procedures.
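One workload-based approach (an addition not mentioned in the thread): SQL Server ships the Database Engine Tuning Advisor, and it also exposes missing-index DMVs that suggest candidate indexes from the queries the server has actually executed. Treat the output as hints to review, not as indexes to create blindly:

```sql
-- Candidate indexes suggested by SQL Server's missing-index DMVs.
-- Suggestions are hints based on the observed workload; review each one
-- before creating anything.
SELECT  d.statement          AS table_name,
        d.equality_columns,
        d.inequality_columns,
        d.included_columns,
        s.user_seeks,
        s.avg_user_impact
FROM    sys.dm_db_missing_index_details      AS d
JOIN    sys.dm_db_missing_index_groups       AS g
        ON g.index_handle = d.index_handle
JOIN    sys.dm_db_missing_index_group_stats  AS s
        ON s.group_handle = g.index_group_handle
ORDER BY s.user_seeks * s.avg_user_impact DESC;
```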