Reading/writing packages sometimes starts failing #610

Closed
opened 2024-02-07 21:14:07 +00:00 by emmorts · 4 comments
emmorts commented 2024-02-07 21:14:07 +00:00 (Migrated from gitea.com)

After updating the chart from v9.5.1 to v10.1.1 I have started encountering an issue where *sometimes* fetching or pushing packages starts failing. To be fair, the issue could have been present even earlier and simply not occurred on my watch. When I first encountered it, re-deploying the package did resolve the fetching failures, but at this point even writing has started to fail. The log seems to point to a permission issue.

I am running the latest rootless image (i.e. `gitea/gitea:1.21.5-rootless`) in a K3s cluster.

The following is an excerpt from the log when pushing a package starts to fail (note the failing `stat` operations).

```
2024/02/07 14:24:06 ...eb/routing/logger.go:102:func1() [I] router: completed HEAD /v2/dev/foobar/blobs/sha256:13d88c480ae4865d5957822e258ba8d7fe83e41daed61ab06113f7609ef35291 for 10.42.2.80:56500, 500 Internal Server Error in 25.3ms @ container/container.go:483(container.HeadBlob)
2024/02/07 14:24:06 ...eb/routing/logger.go:102:func1() [I] router: completed POST /v2/dev/foobar/blobs/uploads/ for 10.42.2.80:56524, 202 Accepted in 16.2ms @ container/container.go:215(container.InitiateUploadBlob)
2024/02/07 14:24:09 ...eb/routing/logger.go:102:func1() [I] router: completed PATCH /v2/dev/foobar/blobs/uploads/ceznwwageoxqaedtd2ygjymyh for 10.42.2.80:56534, 202 Accepted in 487.2ms @ container/container.go:325(container.UploadBlob)
2024/02/07 14:24:09 ...eb/routing/logger.go:102:func1() [I] router: completed PUT /v2/dev/foobar/blobs/uploads/ceznwwageoxqaedtd2ygjymyh?digest=sha256%3A702a753aea7862c25ea09c58c6e1ed7911e2e38d0f519b2d7f7a7ad952906998 for 10.42.2.80:56512, 201 Created in 333.5ms @ container/container.go:370(container.EndUploadBlob)
2024/02/07 14:24:09 ...eb/routing/logger.go:102:func1() [I] router: completed HEAD /v2/dev/foobar/blobs/sha256:702a753aea7862c25ea09c58c6e1ed7911e2e38d0f519b2d7f7a7ad952906998 for 10.42.2.80:56534, 200 OK in 23.7ms @ container/container.go:483(container.HeadBlob)
2024/02/07 14:24:09 ...ntainer/container.go:89:apiError() [E] stat /data/packages/c5/7e/c57ee5000d61345aa3ee6684794a8110328e2274d9a5ae7855969d1a26394463: permission denied
2024/02/07 14:24:09 ...eb/routing/logger.go:102:func1() [I] router: completed HEAD /v2/dev/foobar/blobs/sha256:c57ee5000d61345aa3ee6684794a8110328e2274d9a5ae7855969d1a26394463 for 10.42.2.80:56512, 500 Internal Server Error in 25.8ms @ container/container.go:483(container.HeadBlob)
2024/02/07 14:24:09 ...eb/routing/logger.go:102:func1() [I] router: completed POST /v2/dev/foobar/blobs/uploads/ for 10.42.2.80:56524, 202 Accepted in 16.7ms @ container/container.go:215(container.InitiateUploadBlob)
2024/02/07 14:24:09 ...eb/routing/logger.go:102:func1() [I] router: completed POST /v2/dev/foobar/blobs/uploads/ for 10.42.2.80:56500, 202 Accepted in 14.2ms @ container/container.go:215(container.InitiateUploadBlob)
2024/02/07 14:24:09 ...eb/routing/logger.go:102:func1() [I] router: completed PATCH /v2/dev/foobar/blobs/uploads/zi19mehycgvpz7irkiokebutm for 10.42.2.80:56500, 202 Accepted in 22.4ms @ container/container.go:325(container.UploadBlob)
2024/02/07 14:24:09 ...eb/routing/logger.go:102:func1() [I] router: completed PUT /v2/dev/foobar/blobs/uploads/zi19mehycgvpz7irkiokebutm?digest=sha256%3A347e716bfaf1c423f0a5ff5d655375e87c9acaa26982b0a5d103592ff3794262 for 10.42.2.80:56512, 201 Created in 31.2ms @ container/container.go:370(container.EndUploadBlob)
2024/02/07 14:24:09 ...ntainer/container.go:89:apiError() [E] stat /data/packages/34/7e/347e716bfaf1c423f0a5ff5d655375e87c9acaa26982b0a5d103592ff3794262: permission denied
```

The mounted volume is provisioned from NFS (via [nfs-subdir-external-provisioner](https://github.com/kubernetes-sigs/nfs-subdir-external-provisioner)). The entire `/data` directory does seem to have proper permissions though, as seen in the screenshot.
![image](/attachments/00846199-6eb8-4f52-8685-da3e9c588c22)
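Permission errors like the failing `stat` calls above can also originate on the NFS server itself (UID squashing) rather than from the directory modes visible inside the container. A hypothetical export line for illustration only; the path, network range, and squash options are placeholders, not taken from this setup:

```
# /etc/exports on the NFS server (hypothetical path and network range).
# all_squash/root_squash with a mismatched anonuid/anongid can make files
# created by the rootless container user (UID/GID 1000) unreadable later.
/srv/nfs/k8s  10.42.0.0/16(rw,sync,no_subtree_check,all_squash,anonuid=1000,anongid=1000)
```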

At this point I'm a bit at a loss, as I don't really know how to rectify the issue, and whether this is an issue on my end, or something else.

justusbunsi commented 2024-02-07 21:36:46 +00:00 (Migrated from gitea.com)

Assuming you upgraded from chart version 10.1.0 to 10.1.1: the only change is the Gitea version. Based on your information and the logs you've provided, it looks like an issue within Gitea itself. Please file an issue in the Gitea main repo.

emmorts commented 2024-02-08 06:45:48 +00:00 (Migrated from gitea.com)

@justusbunsi Apologies, I forgot to mention that I upgraded from v9.5.1, though to be honest the issue could have been present even before that - I only noticed it after personally encountering it.

pat-s commented 2024-02-11 11:12:28 +00:00 (Migrated from gitea.com)

I don't think this is an issue with the chart. No changes happened between your releases that could have had any influence on the described issue.

I also use the chart for a Gitea instance that makes heavy use of packages, without any issue yet.
My guess would rather go in the direction of the storageClass permissions/settings of your RWX volume.

The chart doesn't do anything there; it just expects an RWX volume, everything else is on the user side.
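For reference, a minimal sketch of the values involved, assuming the chart's documented `persistence` and `podSecurityContext` keys (the storage class name is a placeholder):

```yaml
# values.yaml - minimal sketch; adjust to your chart version
persistence:
  enabled: true
  accessModes:
    - ReadWriteMany          # the RWX volume the chart expects
  storageClass: nfs-client   # placeholder name registered by your provisioner
podSecurityContext:
  fsGroup: 1000              # the rootless image runs as UID/GID 1000
```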

> 2024/02/07 14:24:09 ...ntainer/container.go:89:apiError() [E] stat /data/packages/34/7e/347e716bfaf1c423f0a5ff5d655375e87c9acaa26982b0a5d103592ff3794262: permission denied

Did you check the full path recursively for proper permissions? Your screenshot only shows `/data`.
Also, please avoid screenshots and use formatted code instead; it's easier to reply to and reference.
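For example, something along these lines from inside the Gitea pod (a sketch; `namei` comes from util-linux and may need to be installed, and the path is taken from the log above):

```sh
# show owner/permissions for every component of the failing path
namei -l /data/packages/34/7e/347e716bfaf1c423f0a5ff5d655375e87c9acaa26982b0a5d103592ff3794262

# list anything under /data/packages not owned by the rootless UID/GID 1000
find /data/packages ! -uid 1000 -o ! -gid 1000
```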

pat-s commented 2024-02-23 07:31:40 +00:00 (Migrated from gitea.com)

Closing here, see my last comment.
Digging to the root of this issue is likely out of scope for this repo. Opening a topic on the Gitea forum or asking in Discord might help to understand it better.
