Kassio Borges authored
The pipelines are grouped in stages, like CI jobs. Stages run in sequence, one after the other, while the pipelines within a stage run in parallel. Each pipeline runs on an individual `BulkImports::PipelineWorker` job, which enables:

- smaller/shorter jobs: background jobs can be interrupted during deploys or other unexpected infrastructure/ops events. To make jobs more resilient, it's desirable to keep them small wherever possible.
- faster imports: some pipelines can run in parallel, which reduces the total time of an import.
- (follow-up) network/rate limit handling: when a pipeline gets rate limited, it can be scheduled to retry after the rate limit timeout. https://gitlab.com/gitlab-org/gitlab/-/issues/262024
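The sequencing described above can be sketched as follows. This is a minimal illustration, not the actual GitLab implementation: the stage and pipeline names are hypothetical, and threads stand in for the separate `BulkImports::PipelineWorker` background jobs.

```ruby
# Hypothetical stage layout: an array of stages, each holding the pipeline
# names that may run in parallel within that stage.
STAGES = [
  ['GroupPipeline'],                      # stage 0: runs alone
  ['MembersPipeline', 'LabelsPipeline'],  # stage 1: these two run in parallel
  ['FinisherPipeline']                    # stage 2: runs after stage 1 finishes
].freeze

def run_import(stages)
  log = []
  stages.each_with_index do |pipelines, stage|
    # Each pipeline gets its own unit of work; in the real system this would
    # be an individual BulkImports::PipelineWorker job, not a thread.
    threads = pipelines.map do |pipeline|
      Thread.new { log << "stage #{stage}: #{pipeline}" }
    end
    # The next stage starts only after every pipeline in this stage is done.
    threads.each(&:join)
  end
  log
end
```

Keeping each pipeline in its own job is what makes interruptions cheap: if a deploy kills one small job, only that pipeline is retried rather than the whole import.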
e64aeb1b