Uploading artifacts too large archive – GitLab pipeline


FATAL: too large – GitLab pipeline 😎

I upgraded some Node.js versions in my GitLab CI pipeline. After the upgrade, the pipeline failed with an error that said ERROR: Uploading artifacts as “archive” to coordinator… too large archive 🔻

node_modules/: found 57656 matching files and directories
ERROR: Uploading artifacts as "archive" to coordinator... too large archive  id=31845 responseStatus=413 Payload Too Large status=413 token=YGCc9B7z
FATAL: too large
Cleaning up file based variables
00:00
ERROR: Job failed: command terminated with exit code 1
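
For context, here is a minimal sketch of the kind of .gitlab-ci.yml job that can produce this error. The job name, image, and script are hypothetical, but the artifacts block matches the node_modules/ upload shown in the log above:

# Hypothetical job: declaring the whole node_modules/ directory as an
# artifact easily blows past the default 100 MB artifact size limit.
build:
  image: node:20
  script:
    - npm ci
  artifacts:
    paths:
      - node_modules/
    expire_in: 1 day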

I spent some time debugging this error and eventually fixed it. The problem was the maximum artifact size limit: my GitLab instance was set to 100 MB, which is the default artifact upload size limit, and the node_modules/ archive was larger than that. Here are the steps to correct this error.

To change this setting, go into your GitLab Admin Area, then to Settings > CI/CD > Continuous Integration and Deployment > Maximum artifacts size (MB), and increase the value.

Note: this option is only available on self-managed GitLab instances, not on GitLab.com. The default value on GitLab.com SaaS is 1 GB.

You can set this limit at different levels:

1. Instance level

2. Project level

3. Group level
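
On self-managed GitLab you can also raise the instance-level limit through the Application settings API instead of the UI. A minimal sketch, assuming an admin personal access token; the hostname, token, and the 500 MB value below are placeholders:

# Raise the instance-wide maximum artifacts size to 500 MB
# (requires an admin personal access token).
curl --request PUT \
  --header "PRIVATE-TOKEN: <your-admin-token>" \
  "https://gitlab.example.com/api/v4/application/settings?max_artifacts_size=500"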

Click Save changes and re-run the pipeline. 😍
