Make WordPress Core

Opened 10 years ago

Closed 10 years ago

Last modified 10 years ago

#27077 closed enhancement (invalid)

fix to avoid re-querying the database for large result sets via get_row

Reported by: openadvent
Owned by:
Milestone:
Priority: normal
Severity: normal
Version: 3.8.1
Component: Database
Keywords:
Focuses: performance
Cc:


When we have a large result set, we use get_row() in a loop with an offset to walk through the returned rows. But each call to get_row() seems to re-query the database. So I'm suggesting that wpdb detect a SELECT statement and, if it matches the last query, skip re-querying the database and just return the cached result.
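The pattern described above might look roughly like this sketch (the table, columns, and query are hypothetical); note that each iteration re-runs the same SELECT:

```php
<?php
// Hypothetical example: walking a large result set row by row via get_row().
// Each call re-executes the SELECT, so the database runs the same
// statement once per iteration.
global $wpdb;

$offset = 0;
while ( true ) {
    $row = $wpdb->get_row(
        "SELECT ID, post_title FROM {$wpdb->posts} WHERE post_status = 'publish'",
        OBJECT,
        $offset // row offset into the (re-executed) result set
    );
    if ( null === $row ) {
        break; // past the end of the result set
    }
    // ... process $row ...
    $offset++;
}
```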

Attachments (2)

wp-db3.php (48.0 KB) - added by openadvent 10 years ago.
wp-db3.2.php (47.2 KB) - added by openadvent 10 years ago.
updated wpdb


Change History (5)


#1 @nacin
10 years ago

  • Milestone Awaiting Review deleted
  • Resolution set to invalid
  • Status changed from new to closed

Hi openadvent, thanks for the report. In the future, if you can submit a patch formatted for SVN or Git, that would be great. There's more in the handbook.

This isn't something we're going to add to WordPress. If you're calling the database API, there is an expectation that the query will run and return the latest results. (Keep in mind, for example, other processes could be making DB changes).

You should optimize your code so it doesn't run DB statements it doesn't need. Rather than calling get_row(), get_col(), or get_var() with offsets, use get_results() and iterate over the results yourself.
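That advice could look something like the following sketch (the query is hypothetical): one get_results() call fetches everything, and plain PHP iteration replaces repeated offset calls:

```php
<?php
global $wpdb;

// One query fetches the whole result set into PHP...
$rows = $wpdb->get_results(
    "SELECT ID, post_title FROM {$wpdb->posts} WHERE post_status = 'publish'"
);

// ...and iteration happens in memory, with no further database round trips.
foreach ( $rows as $row ) {
    // ... process $row->ID, $row->post_title ...
}
```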

#2 @openadvent
10 years ago


get_results() uses too much memory for big result sets.

Unless you have another suggestion for get_row(), perhaps a differently named method could loop through records via offset.

Last edited 10 years ago by openadvent

#3 @nacin
10 years ago

Then put a LIMIT on it. get_row() for X results and get_results() for X results are going to use the same amount of memory.

If you want to loop through records by "chunking" them into multiple queries using LIMITs with offsets, then you can/should do so. You don't need an API for that; it's too restrictive for what needs to be handled on a case-by-case basis.
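The chunking approach described above might be sketched as follows (the chunk size and query are hypothetical); each query fetches one bounded batch, so memory usage stays proportional to the chunk size rather than to the full result set:

```php
<?php
global $wpdb;

$chunk_size = 500; // hypothetical batch size; tune per use case
$offset     = 0;

do {
    // Each iteration runs one bounded query instead of one query per row.
    $rows = $wpdb->get_results(
        $wpdb->prepare(
            "SELECT ID, post_title FROM {$wpdb->posts}
             WHERE post_status = 'publish'
             ORDER BY ID
             LIMIT %d OFFSET %d",
            $chunk_size,
            $offset
        )
    );

    foreach ( $rows as $row ) {
        // ... process $row ...
    }

    $offset += $chunk_size;
} while ( count( $rows ) === $chunk_size ); // a short batch means we're done
```

A stable ORDER BY matters here: without it, OFFSET-based paging can skip or repeat rows between chunks.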
