How to improve the performance of jQuery autocomplete

Time: 2022-06-05 06:22:43

I was planning to use jQuery autocomplete for a site and have implemented a test version. I'm now using an AJAX call to retrieve a new list of strings for every character input. The problem is that it gets rather slow: it takes about 1.5 s before the new list is populated. What is the best way to make the autocomplete fast? I'm using CakePHP and just doing a find with a limit of 10 items.

7 Solutions

#1


53  

This article, about how Flickr does autocomplete, is a very good read. I had a few "wow" moments reading it.

"This widget downloads a list of all of your contacts, in JavaScript, in under 200ms (this is true even for members with 10,000+ contacts). In order to get this level of performance, we had to completely rethink how we send data from the server to the client."

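As a rough sketch of the general idea (not Flickr's actual technique), you can download the whole list once and let the widget filter it in memory. The /contacts.json URL and #search selector below are made-up placeholders:

// Download the full list once, then filter it client-side on every keystroke.
// "/contacts.json" and "#search" are assumed placeholder names.
$.getJSON("/contacts.json", function (contacts) {
    $("#search").autocomplete({
        delay: 0,
        source: function (request, response) {
            // jQuery UI's built-in matcher; no server round trip per character.
            response($.ui.autocomplete.filter(contacts, request.term).slice(0, 10));
        }
    });
});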

#2


37  

Try preloading your list object instead of doing the query on the fly.

Also, the autocomplete widget has a 300 ms delay by default. Perhaps remove that delay:

$( ".selector" ).autocomplete({ delay: 0 });

#3


11  

A 1.5-second response time is a very long gap for an autocomplete service.

  1. First, optimize your query and database connections. Try keeping your database connection alive and use in-memory caching.
  2. If the service is heavily used, cache results so that repeated queries do not have to be re-fetched.
  3. Use a client-side cache (a JS object) to keep previous requests on the client. If the user backspaces and retypes, results can come from this frontend cache instead of the backend (see the sketch after this list).
  4. Regex filtering on the client side won't be costly, so you may give it a chance.
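
A hedged sketch of the client-side cache from point 3, close to the caching example in the jQuery UI docs; the /autocomplete URL and #search selector are placeholders:

// Cache results per term so repeated or retyped terms never hit the backend.
var cache = {};
$("#search").autocomplete({
    minLength: 2,
    source: function (request, response) {
        var term = request.term;
        if (term in cache) {
            response(cache[term]);            // served from the frontend cache
            return;
        }
        $.getJSON("/autocomplete", { q: term }, function (data) {
            cache[term] = data;               // remember this term's results
            response(data);
        });
    }
});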

#4


5  

Before doing any optimization, you should first analyze where the bottleneck is. Try to find out how long each step (input → request → DB query → response → display) takes. Maybe the CakePHP implementation adds a delay so that it does not send a request for every single character entered.

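One simple way to locate the bottleneck is to time the request from the browser; if the number logged below is close to the full 1.5 s, the time is going into the server and the query, otherwise look at client-side delays such as the widget's delay option. The /autocomplete URL and #search selector are placeholders:

// Measure the round-trip time of each autocomplete request.
$("#search").autocomplete({
    source: function (request, response) {
        var start = performance.now();
        $.getJSON("/autocomplete", { q: request.term }, function (data) {
            console.log("autocomplete round trip: " +
                Math.round(performance.now() - start) + " ms");
            response(data);
        });
    }
});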

#5


4  

The real speed issue in this case, I believe, is the time it takes to run the query on the database. If there is no way to improve the speed of the query itself, you could broaden each search to return more items (including some highly ranked results), run it only every other character, and filter the 20-30 results on the client side.

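One possible, hypothetical way to approximate that: fetch a wider result set, and as long as the user keeps typing an extension of the prefix already fetched, filter the cached results locally instead of querying again. The /autocomplete URL, the limit parameter, and the #search selector are assumptions:

// Fetch up to 30 matches once, then filter them client-side while the
// current term still starts with the prefix that was fetched.
var fetchedPrefix = null;
var fetchedResults = [];
$("#search").autocomplete({
    minLength: 2,
    source: function (request, response) {
        var term = request.term;
        if (fetchedPrefix !== null && term.indexOf(fetchedPrefix) === 0) {
            response($.ui.autocomplete.filter(fetchedResults, term).slice(0, 10));
            return;
        }
        $.getJSON("/autocomplete", { q: term, limit: 30 }, function (data) {
            fetchedPrefix = term;
            fetchedResults = data;
            response(data.slice(0, 10));
        });
    }
});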

This may improve the perceived performance, but at 1.5 seconds I would first try to improve the query speed.

Other than that, if you can give us some more information I may be able to give you a more specific answer.

Good luck!

#6


4  

A PHP/SQL server side is slow.

Don't use PHP/SQL. My autocomplete is written in C++ and uses hash tables for lookups. See its performance here.

That is on a Celeron 300 machine running FreeBSD and Apache/FastCGI.

And, as you can see, it runs quickly on huge dictionaries; 10,000,000 records isn't a problem.

It also supports priorities, dynamic translations, and other features.

#7


1  

Autocomplete itself is not slow, although your implementation certainly could be. The first thing I would check is the value of your delay option (see jQuery docs). Next, I would check your query: you might only be bringing back 10 records but are you doing a huge table scan to get those 10 records? Are you bringing back a ton of records from the database into a collection and then taking 10 items from the collection instead of doing server-side paging on the database? A simple index might help, but you are going to have to do some testing to be sure.
