
sql - Can LIMIT be applied to grouped results, but still get all the rows?

Problem description:

I have a table of books, a table of authors, and a table relating books to authors. A book can have more than one author, so when I run my big query for these results I might get more than one row per book. I then merge the results together in PHP, but the problem is that if I LIMIT - OFFSET the query for pagination, I might get fewer than 25 (desired) unique books per page.

Can anyone think of a (or is there a built-in) way to have the LIMIT affect a grouped-by query but still get all the results? I'd rather not do one grouped-by query and then do other queries to get each author because I lose the benefit of cached results.

If not, I'll probably do a pre-pass saving the cached results and then query each author separately.

Answer:

I had exactly this same problem in a different use case (a theater reservation system) and, after some research and testing, I used the pre-pass approach. It's fast and clean and works very well even with a large number of rows (in my case, over 600k). Hope it helps! :)
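For illustration, here is a minimal sketch of the pre-pass approach the question and this answer describe, using an in-memory SQLite database; the table and column names (books, authors, book_authors) are hypothetical stand-ins for the question's schema. The first query paginates over books only, so LIMIT counts unique books; the second query fetches all authors for just that page.

```python
import sqlite3

# Hypothetical schema mirroring the question: books, authors,
# and a join table relating them.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE book_authors (book_id INTEGER, author_id INTEGER);
    INSERT INTO books VALUES (1, 'Good Omens'), (2, 'Dune');
    INSERT INTO authors VALUES (1, 'Pratchett'), (2, 'Gaiman'), (3, 'Herbert');
    INSERT INTO book_authors VALUES (1, 1), (1, 2), (2, 3);
""")

# Pre-pass: page over *books* only, so LIMIT counts unique books.
page = conn.execute(
    "SELECT id, title FROM books ORDER BY id LIMIT 25 OFFSET 0"
).fetchall()
book_ids = [row[0] for row in page]

# Second query: fetch the authors for just those books in one round trip.
placeholders = ",".join("?" * len(book_ids))
author_rows = conn.execute(
    f"SELECT ba.book_id, a.name FROM book_authors ba "
    f"JOIN authors a ON a.id = ba.author_id "
    f"WHERE ba.book_id IN ({placeholders})",
    book_ids,
).fetchall()

# Merge in application code, as the question does in PHP.
merged = {bid: [] for bid in book_ids}
for bid, name in author_rows:
    merged[bid].append(name)

print(merged)  # e.g. book id -> list of author names
```

The same shape works from PHP: one paginated query for the book ids, one `IN (...)` query for their authors, then a merge loop.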

Answer:

There are two approaches you could use:

  • Using n+1 queries.
  • Emulate ROW_NUMBER() OVER (PARTITION BY your_group) in MySQL using variables and select only the rows with row number 25 or less.

The second is quite difficult to write correctly.
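The variable-based emulation this answer mentions is indeed fiddly, but in MySQL 8.0+ (and SQLite 3.25+) window functions are built in, which makes the second approach much simpler. A sketch below, on a hypothetical schema: here DENSE_RANK() (rather than the ROW_NUMBER() the answer names) numbers distinct books in the joined result, so filtering on it keeps every author row for the first N books.

```python
import sqlite3

# Hypothetical schema mirroring the question.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE book_authors (book_id INTEGER, author_id INTEGER);
    INSERT INTO books VALUES (1, 'Good Omens'), (2, 'Dune'), (3, 'SICP');
    INSERT INTO authors VALUES (1, 'Pratchett'), (2, 'Gaiman'),
                               (3, 'Herbert'), (4, 'Abelson'), (5, 'Sussman');
    INSERT INTO book_authors VALUES (1, 1), (1, 2), (2, 3), (3, 4), (3, 5);
""")

# DENSE_RANK numbers distinct books within the joined rows, so the
# outer filter pages by *book* while keeping all of each book's authors.
rows = conn.execute("""
    SELECT book_id, title, name FROM (
        SELECT b.id AS book_id, b.title, a.name,
               DENSE_RANK() OVER (ORDER BY b.id) AS rk
        FROM books b
        JOIN book_authors ba ON ba.book_id = b.id
        JOIN authors a ON a.id = ba.author_id
    )
    WHERE rk <= 2  -- first "page" of 2 books (25 in the question)
""").fetchall()

print(rows)
```

The single query returns three rows here: two author rows for the first book and one for the second, with the third book excluded.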

Answer:

There's already an accepted answer, but I think this may be useful.

GROUP_CONCAT lets you collapse values from multiple rows into a single field in a MySQL query. With it, you can concatenate each book's authors into one comma-separated field, so every book occupies exactly one row and LIMIT counts books.

SELECT GROUP_CONCAT(author) FROM books GROUP BY book_id;

http://dev.mysql.com/doc/refman/5.0/en/group-by-functions.html#function_group-concat
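A fuller, runnable sketch of this answer, with the authors pulled in through the join tables from the question (table and column names are hypothetical); GROUP_CONCAT also exists in SQLite, which is used here for the demonstration. Because each book collapses to one row, LIMIT/OFFSET paginate by book directly:

```python
import sqlite3

# Hypothetical schema mirroring the question.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE book_authors (book_id INTEGER, author_id INTEGER);
    INSERT INTO books VALUES (1, 'Good Omens'), (2, 'Dune');
    INSERT INTO authors VALUES (1, 'Pratchett'), (2, 'Gaiman'), (3, 'Herbert');
    INSERT INTO book_authors VALUES (1, 1), (1, 2), (2, 3);
""")

# One row per book: the authors are concatenated into a single field,
# so LIMIT/OFFSET now count unique books.
rows = conn.execute("""
    SELECT b.title, GROUP_CONCAT(a.name) AS author_list
    FROM books b
    JOIN book_authors ba ON ba.book_id = b.id
    JOIN authors a ON a.id = ba.author_id
    GROUP BY b.id
    ORDER BY b.id
    LIMIT 25 OFFSET 0
""").fetchall()

print(rows)
```

On the PHP side, `explode(',', $row['author_list'])` recovers the list; if author names could contain commas, pick a different delimiter via MySQL's `SEPARATOR` clause.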
