How to improve ASP.NET AJAX autocomplete performance

Time: 2022-06-01 18:00:14

My web site has a city, state, and zip code autocomplete feature.

If the user types 3 characters of a city into the textbox, then the top 20 cities starting with those characters are shown. As of now, the Autocomplete method in our application queries a SQL Server 2005 database which has around 900,000 records related to city, state, and zip.

But the response time to show the city list is very slow.

Hence, for performance optimization, would it be a good idea to store the location data in a Lucene index, or maybe in Active Directory, and then pull the data from there?

Which one would be faster, Lucene or Active Directory? And what are the pros and cons of each? Any suggestions, please?

Thanks a bunch!

3 Solutions

#1


Taking a nuclear option (like changing backing data stores) probably shouldn't be the first choice. Rather, you need to look at why the query is performing so slowly. I'd start by looking at the query's performance in SQL Profiler and its execution plan in SQL Server Management Studio, and see if I'm missing anything stupid like an index. After you cover that angle, check the web layer and ensure that you are not sending inordinate amounts of data or otherwise tossing a spanner in the works. Once you have established that you aren't killing yourself in the db or on the wire, then it is time to think about re-engineering.

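As a concrete sketch of that first step (the table and column names here are hypothetical, not from the original post): turn on I/O statistics, run the autocomplete query, and check whether the plan scans the whole table.

```sql
-- Hypothetical schema: Locations(city, state, zip), ~900k rows.
-- Run in Management Studio with "Include Actual Execution Plan"
-- enabled, or inspect logical reads directly:
SET STATISTICS IO ON;

SELECT TOP 20 city, state
FROM Locations
WHERE city LIKE 'spr%'   -- sample 3-character prefix
ORDER BY city;

-- If the plan shows a table scan (or clustered index scan) over all
-- rows, a nonclustered index lets the trailing-wildcard LIKE seek:
CREATE INDEX IX_Locations_City ON Locations (city);
```

Note that a LIKE with only a trailing wildcard is sargable; a leading wildcard (LIKE '%spr%') would force a scan regardless of the index.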
On a side note, my money would be on SQL Server handling the data end of this task better than either of those options. Lucene is better suited to full-text search, and AD is a poor database at best.

#2


I would cache the data into a separate table. Depending on how fresh you need that data to be, you can rebuild it as often as necessary.

--Create the table
SELECT DISTINCT city, state, zip INTO myCacheTable FROM theRealTable

--Rebuild the table anytime
TRUNCATE TABLE myCacheTable
INSERT INTO myCacheTable (city, state, zip) SELECT DISTINCT city, state, zip FROM theRealTable

Your AJAX calls can access myCacheTable instead, which will have far fewer rows than 900k.

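To keep those calls cheap, the cache table can also be indexed on the lookup column, and the Autocomplete method can run a parameterized prefix query against it (the index name and the @prefix parameter are my own illustration):

```sql
-- Index the cache table so the prefix LIKE can seek rather than scan.
CREATE INDEX IX_myCacheTable_city ON myCacheTable (city);

-- @prefix holds the characters the user has typed, e.g. 'spr'.
SELECT TOP 20 city, state, zip
FROM myCacheTable
WHERE city LIKE @prefix + '%'
ORDER BY city;
```

Rebuilding the cache with TRUNCATE/INSERT as above leaves this index in place, so there is no extra maintenance beyond the rebuild itself.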
#3


Adding to what Wyatt said, you first need to figure out which area is slow. Is the SQL query slow, or is the network connection between the browser and the server slow? Or is it something else?

And I completely agree with Wyatt that SQL Server is much more suitable for this task than Lucene or Active Directory.
