Author: Simon.Yoffe (Starting Member, 2 Posts)
Posted: 2009-06-23 19:05:37
Hi,

I created a CLR function that reads data from the database to build a filter, then executes a SELECT command with the constructed filter and returns the result to SQL Server. I compared the performance of my CLR function against running the same code from an external .NET application, and found that the CLR function is much slower (about twice as slow).

I searched the internet for more information about SQLCLR performance and table-valued functions in the CLR and didn't find much. The only thing I found was a suggestion to try a regular connection string instead of "context connection=true", which improved performance, but not by much. Is there any suggestion that might improve performance in my situation?

I'm using SQL Server 2005, and reading 3605 rows, each row containing 51 columns. In pseudocode it looks like:

[SqlFunctionAttribute(FillRowMethodName="FillCompaniesRow", TableDefinition="...",
    DataAccess=DataAccessKind.Read, SystemDataAccess=SystemDataAccessKind.Read,
    IsDeterministic=true)]
public static System.Collections.IEnumerable fnGetFilteredCompanies(System.Guid filterId)
{
    System.Text.StringBuilder query = new System.Text.StringBuilder(@"SELECT ... FROM ...");
    QueryFilter.AddCondition(query, filterId);
    return ServerPipe.ExecuteTableQuery(query.ToString());
}

The AddCondition method does the following:
1) calls several stored procedures by creating an SqlCommand and calling command.ExecuteReader();
2) performs string manipulation to build the filter string.
It opens only one connection, which is used for all the commands. I used to work with the "context connection=true" connection string, but I found that a direct connection string works faster, as suggested on the web.

The ExecuteTableQuery method executes the query, again by creating an SqlCommand and calling command.ExecuteReader(). It opens another connection (the previous one is closed) and reads all the rows into a List<object>, where each element is an object[] containing all the column values. Finally, FillCompaniesRow just casts the input object to object[] and then casts each array element to the appropriate SqlType.

I executed SELECT * FROM MyFunction in Management Studio and got these results:

Client processing time:      78 110 94 78 125 47 31 110 125 63
Total execution time:        171 203 187 187 234 187 140 203 218 187
Wait time on server replies: 93 93 93 109 109 140 109 93 93 124

Average client processing time:      86.1
Average total execution time:        191.7
Average wait time on server replies: 105.6

I executed the same code from a .NET application on the SQL Server machine and got:

Run 0: total time 281
Run 1: total time 78
Run 2: total time 62
Run 3: total time 78
Run 4: total time 78
Run 5: total time 78
Run 6: total time 78
Run 7: total time 78
Run 8: total time 62
Run 9: total time 78
Average time: 95.1

I executed only the query + filter directly from Management Studio and got:

Client processing time:      78 78 78 78 78 109 124 78 78 62
Total execution time:        78 78 78 78 78 109 124 93 78 62
Wait time on server replies: 0 0 0 0 0 0 0 15 0 0

Average client processing time:      84.1
Average total execution time:        85.6
Average wait time on server replies: 1.5

So although the code builds the query at runtime by accessing the database, its execution time is almost the same as the execution time of the query itself, and it is more than twice as fast as the same code executed in SQLCLR, which I expected to be faster. What I'm trying to understand is why there is such a big difference in performance between SQLCLR and my external application.

Simon Yoffe
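[Editor's note] One likely contributor to the gap is that ExecuteTableQuery buffers all 3605 rows into a List&lt;object&gt; before SQL Server sees the first one, and the fill-row pass then walks that list a second time. A CLR TVF can instead stream rows: if the returned IEnumerable yields each row as the reader produces it, SQL Server calls the fill-row method once per yielded row and the intermediate list disappears. Below is a minimal sketch of that shape, not the poster's actual code: BuildFilteredQuery and DirectConnectionString are hypothetical stand-ins for QueryFilter.AddCondition and the direct connection string the poster mentions, and the TableDefinition/column casts are elided just as in the original.

```csharp
using System;
using System.Collections;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    // Stand-in for the poster's direct (non-context) connection string;
    // a loopback connection from SQLCLR requires the assembly to be
    // deployed with at least EXTERNAL_ACCESS permission.
    private const string DirectConnectionString = "...";

    [SqlFunction(FillRowMethodName = "FillCompaniesRow",
                 TableDefinition = "...",   // same 51-column definition as the original
                 DataAccess = DataAccessKind.Read,
                 SystemDataAccess = SystemDataAccessKind.Read)]
    public static IEnumerable fnGetFilteredCompanies(Guid filterId)
    {
        // Build the filtered query as before (stands in for QueryFilter.AddCondition).
        string query = BuildFilteredQuery(filterId);

        using (SqlConnection conn = new SqlConnection(DirectConnectionString))
        {
            conn.Open();
            using (SqlCommand cmd = new SqlCommand(query, conn))
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    object[] row = new object[reader.FieldCount];
                    reader.GetValues(row);
                    yield return row;   // stream: no List<object> buffering pass
                }
            }
        }
    }

    // One out parameter per column in TableDefinition; only the first is shown.
    public static void FillCompaniesRow(object obj, out SqlGuid id /* , ... */)
    {
        object[] row = (object[])obj;
        id = row[0] == DBNull.Value ? SqlGuid.Null : new SqlGuid((Guid)row[0]);
        // cast the remaining columns to their SqlTypes the same way
    }

    // Hypothetical helper; the real logic calls stored procedures to
    // assemble the WHERE clause before the query runs.
    private static string BuildFilteredQuery(Guid filterId)
    {
        return "SELECT ... FROM ... WHERE FilterId = '" + filterId.ToString() + "'";
    }
}
```

Streaming removes one full copy of the result set, but it would not explain the whole difference by itself; the in-process hosting overhead and the per-value SqlType conversions in the fill-row method also cost something.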