Tools: gitea utils does not load all pages

The issue is that the server's default `limit` appears to be 30, not 50. Because the
`page == 1` special case requests the URL without a `limit` parameter, the first
page contains only 30 items; the `len(result_page) < limit` check then sees
30 < 50 and `url_json_get_all_pages` returns before all pages have been loaded.
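
To see the failure mode concretely, here is a minimal, runnable sketch of the old
loop's logic; `fake_api`, the 30-item server default, and the 75-item dataset are
hypothetical stand-ins for the real Gitea API:

```python
from typing import List

SERVER_DEFAULT_LIMIT = 30  # hypothetical: what the server uses when no `limit` is given
TOTAL_ITEMS = 75           # hypothetical dataset size

def fake_api(page: int, limit: int) -> List[int]:
    """Stand-in for the Gitea API: returns one page of items."""
    start = (page - 1) * limit
    return list(range(TOTAL_ITEMS))[start:start + limit]

def get_all_pages_buggy(limit: int = 50) -> List[int]:
    result: List[int] = []
    page = 1
    while True:
        if page == 1:
            # Special case: the `limit` parameter is not sent, so the
            # server default (30) applies instead of the assumed 50.
            result_page = fake_api(page, SERVER_DEFAULT_LIMIT)
        else:
            result_page = fake_api(page, limit)
        if not result_page:
            break
        result.extend(result_page)
        if len(result_page) < limit:
            # 30 < 50 already on the first page, so the loop stops here
            # even though 45 items were never fetched.
            break
        page += 1
    return result

assert len(get_all_pages_buggy()) == 30  # only the first page comes back
```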

The fix is to simply remove the `limit` parameter; it wasn't used anyway.
Using it correctly is somewhat tricky when the limit cannot be passed in the
`page == 1` case (a sketch of the offset mismatch follows below). Dropping it
may result in a couple more API requests, but that is probably not a problem
in practice. If it does become a problem, we should figure out in which cases
the `page == 1` special case is actually required (it was not needed in the
cases that I tested).
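
As for why using the limit correctly would be tricky: assuming the server
computes the page offset as `(page - 1) * limit` (an assumption about Gitea's
paging made for illustration), a first page fetched at the server default and a
second page fetched at `limit=50` do not line up:

```python
def page_range(page: int, limit: int) -> range:
    # Assumes the server computes the page offset as (page - 1) * limit.
    start = (page - 1) * limit
    return range(start, start + limit)

print(page_range(1, 30))  # range(0, 30): page 1 under the server default of 30
print(page_range(2, 50))  # range(50, 100): items 30..49 would be skipped entirely
```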

Note that the first link below returns fewer results than the second (a quick
way to verify this is sketched after the links):
* https://projects.blender.org/api/v1/users/jacqueslucke/activities/feeds?only-performed-by=true&date=2024-04-22
* https://projects.blender.org/api/v1/users/jacqueslucke/activities/feeds?only-performed-by=true&date=2024-04-22&limit=50
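
A quick way to compare the two responses, using only the Python standard
library (the exact counts depend on the live data; the helper name
`count_results` is just for this example):

```python
import json
import urllib.request

def count_results(url: str) -> int:
    with urllib.request.urlopen(url) as response:
        return len(json.load(response))

base = (
    "https://projects.blender.org/api/v1/users/jacqueslucke/activities/feeds"
    "?only-performed-by=true&date=2024-04-22"
)
print(count_results(base))                # server default, appears to cap at 30
print(count_results(base + "&limit=50"))  # explicit limit of 50
```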

Pull Request: https://projects.blender.org/blender/blender/pulls/120948
Jacques Lucke 2024-04-28 01:06:51 +02:00
parent ab1da26e75
commit 3e814bc702

```diff
@@ -40,10 +40,8 @@ def url_json_get(url: str) -> Optional[Union[Dict[str, Any], List[Dict[str, Any]
 
 def url_json_get_all_pages(
         url: str,
-        limit: int = 50,
         verbose: bool = False,
 ) -> List[Dict[str, Any]]:
-    assert limit <= 50, "50 is the maximum limit of items per page"
     result: List[Dict[str, Any]] = []
     page = 1
     while True:
@@ -54,14 +52,14 @@ def url_json_get_all_pages(
         if page == 1:
             # XXX: In some cases, a bug prevents using the `page` and `limit` parameters if the page is 1
             result_page = url_json_get(url)
         else:
-            result_page = url_json_get(f"{url}&page={page}&limit={limit}")
+            result_page = url_json_get(f"{url}&page={page}")
         if not result_page:
             break
         assert isinstance(result_page, list)
         result.extend(result_page)
-        if len(result_page) < limit:
+        if len(result_page) == 0:
             break
         page += 1
```
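
Putting the two hunks together, the patched function ends up roughly as below.
This is a sketch, not the verbatim file: it relies on the module's
`url_json_get` helper shown in the diff context, and the `verbose` handling,
the final `return`, and the imports are filled in from context rather than
taken from the diff.

```python
from typing import Any, Dict, List

def url_json_get_all_pages(
        url: str,
        verbose: bool = False,
) -> List[Dict[str, Any]]:
    result: List[Dict[str, Any]] = []
    page = 1
    while True:
        if verbose:
            print(f"Loading page {page}...")  # assumed use of `verbose`
        if page == 1:
            # XXX: In some cases, a bug prevents using the `page` and
            # `limit` parameters if the page is 1.
            result_page = url_json_get(url)
        else:
            result_page = url_json_get(f"{url}&page={page}")
        if not result_page:
            break
        assert isinstance(result_page, list)
        result.extend(result_page)
        if len(result_page) == 0:
            break
        page += 1
    return result
```

With `limit` gone, the server's own page size decides how many items each
request returns, and the loop stops only when a page comes back empty, which is
correct regardless of what that page size is.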