
Draft: [bpf] respect container device cfg for allow list #12974


Closed
wants to merge 1 commit

Conversation


@janLo janLo commented Sep 14, 2022

Description

This is a draft of the custom device support needed in #8396. I'm neither a Go developer nor very familiar with the Gitpod codebase; I just wanted to draft these changes to see whether the approach can work and to discuss it further, because we need custom device support desperately. The changes are untested (I wasn't able to build the service) and probably need to be redone properly. In particular, having to inject the runtime because of the limited cgroup plugin interface is unfortunate.

The issue these changes address is that, since fuse CgroupV2 support landed in #8769, Gitpod creates a very static cgroup device BPF program. This disregards any device settings a Kubernetes device plugin has applied to the container. These changes fetch the actual device allow-list from the container, combine it with the static defaults, and inject the fuse rule before the BPF program is created (see the sketch below).
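For illustration only, here is a minimal sketch of the intended merge step, assuming the container's allow-list is available as runtime-spec `LinuxDeviceCgroup` entries. The function and variable names are made up for this example and do not correspond to the actual ws-daemon code in this PR:

```go
package devices

import (
	specs "github.com/opencontainers/runtime-spec/specs-go"
)

// int64Ptr is a small helper for the pointer-typed Major/Minor fields.
func int64Ptr(v int64) *int64 { return &v }

// fuseRule allows read/write/mknod access to /dev/fuse (char device 10:229),
// which is what the fuse device enabler needs to add on top of the
// container's own rules.
var fuseRule = specs.LinuxDeviceCgroup{
	Allow:  true,
	Type:   "c",
	Major:  int64Ptr(10),
	Minor:  int64Ptr(229),
	Access: "rwm",
}

// mergeDeviceRules (hypothetical name) combines the device allow-list read
// from the container with the static defaults and appends the fuse rule.
// The merged list is what the cgroup device BPF program would be generated
// from, so rules added by a Kubernetes device plugin are no longer dropped.
func mergeDeviceRules(containerRules, defaultRules []specs.LinuxDeviceCgroup) []specs.LinuxDeviceCgroup {
	merged := make([]specs.LinuxDeviceCgroup, 0, len(containerRules)+len(defaultRules)+1)
	merged = append(merged, containerRules...)
	merged = append(merged, defaultRules...)
	merged = append(merged, fuseRule)
	return merged
}
```

The merged list would then feed whatever generates the cgroup device eBPF program, instead of rebuilding that program from the static defaults alone.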

Related Issue(s)

#8769
#8396

How to test

The easiest way is to use the smarter-device-manager and a custom MutatingAdmissionWebhook to inject device resources into the pod manifests.

Release Notes

[ws-daemon] Add support for Kubernetes device plugin allow-list

This changes the CGroupV2 fuse device enabler to respect any devices
configured for the container instead of statically recreating the BPF
program from defaults. This allows users to inject special devices
(GPU, KVM, ...) via a device manager plugin, e.g. with a mutating
admission controller.
@csweichel
Contributor

csweichel commented Sep 16, 2022

/werft run

👍 started the job as gitpod-build-cgroup-devices-fork.0
(with .werft/ from main)

@csweichel
Contributor

csweichel commented Sep 16, 2022

/werft run

👍 started the job as gitpod-build-cgroup-devices-fork.1
(with .werft/ from main)

@stale

stale bot commented Sep 28, 2022

This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the meta: stale This issue/PR is stale and will be closed soon label Sep 28, 2022
@janLo
Author

janLo commented Sep 28, 2022

Oh, I'd like to receive some feedback before it gets closed ;)

@stale stale bot removed the meta: stale This issue/PR is stale and will be closed soon label Sep 28, 2022
@stale

stale bot commented Oct 12, 2022

This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@janLo
Author

janLo commented Oct 12, 2022

I'm still interested.

@stale stale bot added the meta: stale This issue/PR is stale and will be closed soon label Oct 12, 2022
@janLo
Author

janLo commented Oct 18, 2022

Sorry to bother again, do you have any thoughts on this?

@stale stale bot removed the meta: stale This issue/PR is stale and will be closed soon label Oct 18, 2022
@stale

stale bot commented Oct 28, 2022

This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the meta: stale This issue/PR is stale and will be closed soon label Oct 28, 2022
@janLo
Author

janLo commented Oct 29, 2022

I'm still interested.

@stale stale bot removed the meta: stale This issue/PR is stale and will be closed soon label Oct 29, 2022
@stale

stale bot commented Nov 9, 2022

This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the meta: stale This issue/PR is stale and will be closed soon label Nov 9, 2022
@janLo
Author

janLo commented Nov 9, 2022

I am still interested.

@stale stale bot removed the meta: stale This issue/PR is stale and will be closed soon label Nov 9, 2022
@stale

stale bot commented Nov 22, 2022

This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the meta: stale This issue/PR is stale and will be closed soon label Nov 22, 2022
@stale stale bot closed this Nov 27, 2022
@janLo
Author

janLo commented Nov 27, 2022

I was still waiting for a reply.

Labels
do-not-merge/work-in-progress · meta: stale · release-note · size/L · team: workspace