
Overhaul Cilium manifests to match the newer versions #8717

Merged
7 commits merged from cilium-update into kubernetes-sigs:master
May 11, 2022

Conversation

necatican
Contributor

@necatican necatican commented Apr 14, 2022

What type of PR is this?
/kind feature

What this PR does / why we need it:
This PR will overhaul the current Cilium installation steps, allow people to use more configuration options, and reflect the recent changes.

Which issue(s) this PR fixes:

Fixes #8716

Special notes for your reviewer:

Does this PR introduce a user-facing change?:

[Cilium] Update Cilium manifests and the default version to v1.11.3


@k8s-ci-robot k8s-ci-robot added do-not-merge/work-in-progress Indicates that a PR should not merge because it is a work in progress. kind/feature Categorizes issue or PR as related to a new feature. cncf-cla: yes Indicates the PR's author has signed the CNCF CLA. labels Apr 14, 2022
@k8s-ci-robot k8s-ci-robot added the size/L Denotes a PR that changes 100-499 lines, ignoring generated files. label Apr 14, 2022
@necatican necatican force-pushed the cilium-update branch 2 times, most recently from 44fc99e to 789c69b Compare April 15, 2022 13:44
@k8s-ci-robot k8s-ci-robot added size/XL Denotes a PR that changes 500-999 lines, ignoring generated files. and removed size/L Denotes a PR that changes 100-499 lines, ignoring generated files. labels Apr 15, 2022
@necatican necatican force-pushed the cilium-update branch 5 times, most recently from 2f2de96 to a75b90f Compare April 18, 2022 19:05
@necatican
Contributor Author

Hello, o/
These changes update the current configuration without any breaking changes. We will still have connectivity issues when using CiliumNetworkPolicy CRDs together with NodeLocal DNS (#8546), because our NodeLocal DNS cache runs on the host network and we are not deploying any Local Redirect Policies. The current configuration without this PR has the same issues.

I might have to join the army for a month, so I have been dealing with that and delegating or finishing up the work I have. I think this PR is okay to merge. I will deal with the molecule tests and that NodeLocal DNS problem once I am done with my army duty.

If anyone wants to complete the remaining tasks, there is some information about the Redirect Policy and NodeLocal DNS on Cilium's documentation.
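For reference, the Local Redirect Policy workaround mentioned above would look roughly like the sketch below, based on Cilium's documented node-local DNS example. The frontend IP (169.254.25.10, kubespray's default nodelocaldns address) and the k8s-app: node-local-dns label are assumptions here; it also requires local redirect policies to be enabled in the Cilium ConfigMap (enable-local-redirect-policy: "true").

```yaml
# Sketch only: redirect pod DNS traffic aimed at the nodelocaldns IP
# to the node-local-dns pod running on the same node.
apiVersion: "cilium.io/v2"
kind: CiliumLocalRedirectPolicy
metadata:
  name: nodelocaldns
  namespace: kube-system
spec:
  redirectFrontend:
    addressMatcher:
      ip: "169.254.25.10"        # assumed kubespray nodelocaldns address
      toPorts:
        - port: "53"
          protocol: UDP
        - port: "53"
          protocol: TCP
  redirectBackend:
    localEndpointSelector:
      matchLabels:
        k8s-app: node-local-dns  # assumed node-local-dns pod label
    toPorts:
      - port: "53"
        name: dns
        protocol: UDP
      - port: "53"
        name: dns-tcp
        protocol: TCP
```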

@necatican necatican changed the title WIP: Overhaul Cilium manifests to match the newer versions Overhaul Cilium manifests to match the newer versions May 3, 2022
@k8s-ci-robot k8s-ci-robot removed the do-not-merge/work-in-progress Indicates that a PR should not merge because it is a work in progress. label May 3, 2022
@cristicalin
Contributor

@necatican you might have to rebase on the latest master branch to get the CI to pass

@cristicalin
Contributor

Thank you @necatican for this significant update!

/lgtm

@k8s-ci-robot k8s-ci-robot added the lgtm "Looks good to me", indicates that a PR is ready to be merged. label May 10, 2022
@cristicalin
Contributor

/cc @floryut @oomichi

@k8s-ci-robot k8s-ci-robot requested a review from floryut May 10, 2022 09:51
@@ -110,7 +110,7 @@ flannel_cni_version: "v1.0.1"
cni_version: "v1.0.1"
weave_version: 2.8.1
pod_infra_version: "3.3"
cilium_version: "v1.11.1"
cilium_version: "v1.11.3"
Contributor

Just a question: there is a lot of Cilium code that supports older versions of Cilium (1.8 and earlier).
Do we still need to keep that code?

It is fine to keep it in this pull request to keep the change small.
I just wanted to make sure whether we plan to remove that code soon.

Contributor Author

Hello o/
This problem bothers me as well. I did not want to introduce any breaking changes to the current setup yet. However, we should probably set some boundaries and support only a few versions.

Cilium's documentation does not help with that issue. :/

Contributor

Thank you for your reply @necatican

I did not want to introduce any breaking changes to the current system yet.

Yeah, that makes sense.
It is fine to keep that code in this pull request.
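As a usage note on the version bump discussed above: v1.11.3 is only kubespray's new default, and deployers can pin a different release through group vars. A minimal sketch, with an illustrative inventory path:

```yaml
# inventory/mycluster/group_vars/k8s_cluster/k8s-cluster.yml (illustrative path)
kube_network_plugin: cilium
cilium_version: "v1.11.3"   # override here to pin a different release
```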

@oomichi
Contributor

oomichi commented May 11, 2022

/lgtm

@oomichi
Contributor

oomichi commented May 11, 2022

Oops, the lgtm label is already there. I still need to add:

/approve

@k8s-ci-robot
Contributor

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: necatican, oomichi

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@k8s-ci-robot k8s-ci-robot added the approved Indicates a PR has been approved by an approver from all required OWNERS files. label May 11, 2022
@k8s-ci-robot k8s-ci-robot merged commit 13443b0 into kubernetes-sigs:master May 11, 2022
@oomichi oomichi mentioned this pull request May 28, 2022
LuckySB pushed a commit to southbridgeio/kubespray that referenced this pull request Jun 30, 2023
…s#8717)

* [cilium] Separate templates for cilium, cilium-operator, and hubble installations

Signed-off-by: necatican <[email protected]>

* [cilium] Update cilium-operator templates

Signed-off-by: necatican <[email protected]>

* [cilium] Allow using custom args and mounting extra volumes for the Cilium Operator

Signed-off-by: necatican <[email protected]>

* [cilium] Update the cilium configmap to filter out the deprecated variables, and add the new variables

Signed-off-by: necatican <[email protected]>

* [cilium] Add an option to use Wireguard encryption on Cilium 1.10 and up

Signed-off-by: necatican <[email protected]>

* [cilium] Update cilium-agent templates

Signed-off-by: necatican <[email protected]>

* [cilium] Bump Cilium version to 1.11.3

Signed-off-by: necatican <[email protected]>
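The Wireguard commit above implies that encryption becomes opt-in through variables. A sketch of what enabling it might look like in group vars; the variable names are assumptions based on the commit description, so verify them against the cilium role defaults:

```yaml
# Assumed variable names; check roles/network_plugin/cilium defaults
# before using. Requires Cilium 1.10 or newer per the commit message.
cilium_encryption_enabled: true
cilium_encryption_type: wireguard
```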
LuckySB pushed a commit to southbridgeio/kubespray that referenced this pull request Oct 23, 2023
Successfully merging this pull request may close these issues.

[cilium] Update Cilium installations steps to match the new versions
4 participants