
Parallel build of multi-stage Docker images #1548

Open
mitar opened this issue Jan 15, 2021 · 9 comments
Labels
area/performance (issues related to kaniko performance), categorized, differs-from-docker, kind/enhancement (New feature or request), priority/p2 (High impact feature/bug; will make a lot of users happy), works-with-docker

Comments

mitar commented Jan 15, 2021

Actual behavior

It looks like when I have a multi-stage Docker image with a few independent stages that build everything, and a last stage that only copies files over into the final image, those independent initial stages are not built in parallel.

Expected behavior

Independent initial stages could be built in parallel to speed up the overall build.

To Reproduce

I used a Dockerfile like:

FROM node:12-buster as node-builder

WORKDIR /app
COPY ./frontend .
RUN yarn install
RUN yarn build-staging
RUN yarn build-production

FROM golang:1.14-buster as go-builder

WORKDIR /app
COPY ./backend .
RUN make

FROM debian:buster

COPY --from=node-builder /app/dist-staging /app/staging/
COPY --from=node-builder /app/dist-production /app/production/
COPY --from=go-builder /app/backend /usr/local/bin/backend

Triage Notes for the Maintainers

Description Yes/No
Please check if this is a new feature you are proposing
Please check if the build works in docker but not in kaniko
Please check if this error is seen when you use --cache flag
Please check if your dockerfile is a multistage dockerfile

(A BuildKit-enabled docker build seems to parallelize this.)

@ahmet2mir

Hello,

We have a build with several build stages (compilation) and a final stage that gathers all the compiled binaries. It takes several minutes to build the image instead of a few.

It looks like this issue could be fixed fairly easily using some goroutines and a simple env var/parameter defining the max number of "workers" to use (like this loop).

As this issue got a lot of upvotes, I could work on it, but is this an accepted design, or is there a limitation I missed?
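The goroutine/max-worker idea above can be sketched with a buffered channel used as a semaphore. This is only a minimal illustration, not kaniko's actual code: `buildStage` is a hypothetical stand-in for the per-stage build logic, and it assumes the listed stages are fully independent of each other.

```go
package main

import (
	"fmt"
	"sync"
)

// buildStage is a hypothetical stand-in for kaniko's per-stage build logic.
func buildStage(name string) error {
	fmt.Println("building stage", name)
	return nil
}

// buildParallel builds independent stages concurrently, with at most
// maxWorkers builds running at once. The buffered channel acts as a
// semaphore: sending acquires a worker slot, receiving releases it.
func buildParallel(stages []string, maxWorkers int) int {
	sem := make(chan struct{}, maxWorkers)
	var wg sync.WaitGroup
	var mu sync.Mutex
	built := 0
	for _, s := range stages {
		wg.Add(1)
		go func(name string) {
			defer wg.Done()
			sem <- struct{}{}        // blocks while maxWorkers builds are in flight
			defer func() { <-sem }() // release the slot when done
			buildStage(name)
			mu.Lock()
			built++
			mu.Unlock()
		}(s)
	}
	wg.Wait()
	return built
}

func main() {
	n := buildParallel([]string{"node-builder", "go-builder"}, 2)
	fmt.Println(n, "stages built")
}
```

With `maxWorkers` exposed as a flag or environment variable, users could tune concurrency to the resources available in their build pod.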

Goopil commented Oct 29, 2021

This would be a great addition. Any news on this? Thanks in advance.

gabyx (Contributor) commented May 10, 2022

Hi, I am not an author of the project, but I am about to fix some standing issues regarding these,

which should be fixed first:
#2066

I think "main" still has some weird bugs with stages; most users do not use lots of stages, so some problems do not arise (due to hashing, etc.).

I guess @jason Hall could maybe clarify how to go about this, if it's even possible. I guess it should be, but it is not as trivial as stated above, since you are basically solving a DAG, because every stage has a parent. So, simply speaking:

  • building up the stage graph (graph node: stage index), then
  • defining an execution order over all nodes (coloring),
  • then building all stages with the same execution priority in parallel, one priority level after another,
  • maybe there are shortcuts depending on how the Dockerfile is defined and what is possible in the sense of a DAG...?
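The coloring step above can be sketched as assigning each stage a level: level 0 stages have no dependencies, and a level-k stage depends only on stages of level < k, so all stages sharing a level are independent and can build in parallel. This is just an illustrative sketch (the `Stage` type and dependency encoding are assumptions, not kaniko's internals); it relies on the fact that Dockerfile stages are declared in topological order, so a single forward pass suffices.

```go
package main

import "fmt"

// Stage is a simplified node in the build graph: a name plus the
// indices of earlier stages it depends on (via FROM or COPY --from=).
type Stage struct {
	Name string
	Deps []int
}

// levels colors each stage with a priority level and groups stage
// indices by level; each group can be built as one parallel wave.
func levels(stages []Stage) [][]int {
	level := make([]int, len(stages))
	for i, s := range stages { // stages appear after their dependencies,
		for _, d := range s.Deps { // so one forward pass computes levels
			if level[d]+1 > level[i] {
				level[i] = level[d] + 1
			}
		}
	}
	deepest := 0
	for _, l := range level {
		if l > deepest {
			deepest = l
		}
	}
	out := make([][]int, deepest+1)
	for i, l := range level {
		out[l] = append(out[l], i)
	}
	return out
}

func main() {
	// The Dockerfile from this issue: two independent builders plus a
	// final stage that copies from both.
	stages := []Stage{
		{Name: "node-builder"},
		{Name: "go-builder"},
		{Name: "final", Deps: []int{0, 1}},
	}
	for l, idxs := range levels(stages) {
		fmt.Println("level", l, "->", idxs)
	}
	// → level 0 -> [0 1]
	//   level 1 -> [2]
}
```

For the example Dockerfile, node-builder and go-builder land in level 0 and build concurrently, and the final stage builds alone in level 1.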

Metroxe commented Jun 6, 2022

This would speed up our builds tremendously. Honestly, an implementation identical to BuildKit's, where all stages start building immediately and block only when they reach a waiting dependency (if there is one), would be a suitable implementation.
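The BuildKit-style behavior described above can be sketched in Go by starting every stage in its own goroutine and giving each stage a "done" channel; a stage blocks only on the channels of its dependencies. This is a hedged sketch of the scheduling idea only, with hypothetical types, not BuildKit's or kaniko's actual code.

```go
package main

import (
	"fmt"
	"sync"
)

// stage describes a build stage and the stage names it must wait for
// (its FROM / COPY --from dependencies).
type stage struct {
	name string
	deps []string
}

// buildAll starts every stage at once; a stage blocks only when it
// reaches a dependency that has not finished yet.
func buildAll(stages []stage) []string {
	done := make(map[string]chan struct{}, len(stages))
	for _, s := range stages {
		done[s.name] = make(chan struct{})
	}
	var mu sync.Mutex
	var order []string
	var wg sync.WaitGroup
	for _, s := range stages {
		wg.Add(1)
		go func(s stage) {
			defer wg.Done()
			for _, d := range s.deps {
				<-done[d] // wait for each dependency to finish
			}
			// ... the stage's actual build steps would run here ...
			mu.Lock()
			order = append(order, s.name)
			mu.Unlock()
			close(done[s.name]) // unblock any dependent stages
		}(s)
	}
	wg.Wait()
	return order
}

func main() {
	order := buildAll([]stage{
		{name: "node-builder"},
		{name: "go-builder"},
		{name: "final", deps: []string{"node-builder", "go-builder"}},
	})
	fmt.Println(order) // "final" always comes last
}
```

Compared with the level-by-level approach, this lets a slow and a fast stage overlap freely: a dependent stage starts the moment its own dependencies finish, without waiting for an entire wave.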

gabyx (Contributor) commented Jun 6, 2022 via email

@ubershloder

This would be a great feature; it would save some time and resources.

@fernando-renzi

We need this, please! We have 3 stages with a Vue build; with Docker BuildKit they run in parallel, but with kaniko they run serially.

@aaron-prindle added the kind/enhancement (New feature or request), works-with-docker, differs-from-docker, priority/p2 (High impact feature/bug; will make a lot of users happy), and area/performance labels, and removed the works-with-docker label, on May 29, 2023
@Shocktrooper

Bumping this, as it would be a nice feature that brings kaniko closer to feature parity with other build tools.

@nero19960329

Is there any update? It's an important feature for kaniko.

10 participants