kubeflow-github-action
The purpose of this action is to allow automated deployments of Kubeflow Pipelines on Google Cloud Platform (GCP). The action collects the pipeline from a Python file, compiles it, and uploads it to Kubeflow. The Kubeflow deployment must use IAP on GCP for this action to work.
To compile a pipeline and upload it to Kubeflow:
```yaml
name: Compile and Deploy Kubeflow pipeline
on: [push]

# Set environmental variables
jobs:
  build:
    runs-on: ubuntu-18.04
    steps:
      - name: checkout files in repo
        uses: actions/checkout@master
      - name: Submit Kubeflow pipeline
        id: kubeflow
        uses: NikeNano/kubeflow-github-action@master
        with:
          KUBEFLOW_URL: ${{ secrets.KUBEFLOW_URL }}
          ENCODED_GOOGLE_APPLICATION_CREDENTIALS: ${{ secrets.GKE_KEY }}
          GOOGLE_APPLICATION_CREDENTIALS: /tmp/gcloud-sa.json
          CLIENT_ID: ${{ secrets.CLIENT_ID }}
          PIPELINE_CODE_PATH: "example_pipeline.py"
          PIPELINE_FUNCTION_NAME: "flipcoin_pipeline"
          PIPELINE_PARAMETERS_PATH: "parameters.yaml"
          EXPERIMENT_NAME: "Default"
          RUN_PIPELINE: False
          VERSION_GITHUB_SHA: False
```
If you would also like to run the pipeline, use the following:
```yaml
name: Compile, Deploy and Run on Kubeflow
on: [push]

# Set environmental variables
jobs:
  build:
    runs-on: ubuntu-18.04
    steps:
      - name: checkout files in repo
        uses: actions/checkout@master
      - name: Submit Kubeflow pipeline
        id: kubeflow
        uses: NikeNano/kubeflow-github-action@master
        with:
          KUBEFLOW_URL: ${{ secrets.KUBEFLOW_URL }}
          ENCODED_GOOGLE_APPLICATION_CREDENTIALS: ${{ secrets.GKE_KEY }}
          GOOGLE_APPLICATION_CREDENTIALS: /tmp/gcloud-sa.json
          CLIENT_ID: ${{ secrets.CLIENT_ID }}
          PIPELINE_CODE_PATH: "example_pipeline.py"
          PIPELINE_FUNCTION_NAME: "flipcoin_pipeline"
          PIPELINE_PARAMETERS_PATH: "parameters.yaml"
          EXPERIMENT_NAME: "Default"
          RUN_PIPELINE: True
          VERSION_GITHUB_SHA: False
```
The repo also contains an example where the containers in the pipeline are versioned with the GitHub commit hash, which improves operations and makes errors easier to trace. However, this requires the pipeline to be wrapped in a function that takes a single argument:
```python
def pipeline(github_sha: str):
    ...
```
The containers are then versioned with the hash:
```python
pre_image = f"gcr.io/{project}/pre_image:{github_sha}"
train_forecast_image = f"gcr.io/{project}/train_forecast_image:{github_sha}"
```
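Putting the two snippets above together, a minimal sketch of such a pipeline function (plain Python, no KFP imports; the `project` value and image names are placeholders, not from the real repo):

```python
# Assumed GCP project id, for illustration only.
project = "my-gcp-project"


def pipeline(github_sha: str):
    # Tag every image in the pipeline with the commit that built it,
    # so a failing run can be traced back to an exact revision.
    pre_image = f"gcr.io/{project}/pre_image:{github_sha}"
    train_forecast_image = f"gcr.io/{project}/train_forecast_image:{github_sha}"
    return pre_image, train_forecast_image


print(pipeline("abc123"))
# → ('gcr.io/my-gcp-project/pre_image:abc123', 'gcr.io/my-gcp-project/train_forecast_image:abc123')
```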
For a full example, see here.
- `KUBEFLOW_URL`: The URL to your Kubeflow deployment.
- `GKE_KEY`: A service account with access to Kubeflow and rights to deploy (see here for an example). The credentials need to be base64 encoded:

  ```shell
  cat path-to-key.json | base64
  ```

- `GOOGLE_APPLICATION_CREDENTIALS`: The path where you would like to store the secrets, decoded from `GKE_KEY`.
- `CLIENT_ID`: The IAP client secret.
- `PIPELINE_CODE_PATH`: The full path to the Python file containing the pipeline.
- `PIPELINE_FUNCTION_NAME`: The name of the pipeline function in the `PIPELINE_CODE_PATH` file.
- `PIPELINE_PARAMETERS_PATH`: The path to the pipeline parameters file.
- `EXPERIMENT_NAME`: The name of the Kubeflow experiment within which the pipeline should run.
- `RUN_PIPELINE`: Set to "True" if you would also like to run the pipeline.
- `VERSION_GITHUB_SHA`: Set to "True" if the pipeline containers are versioned with the GitHub commit hash.
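The relationship between `GKE_KEY` and `GOOGLE_APPLICATION_CREDENTIALS` can be sketched in plain Python: the base64-encoded secret is decoded and written to the given path. This is an illustrative round-trip with a dummy key, not the action's actual implementation:

```python
import base64
import json
import os
import tempfile

# Dummy service-account key, standing in for the real path-to-key.json.
key_json = json.dumps({"type": "service_account", "project_id": "my-project"})

# What `cat path-to-key.json | base64` produces: the value stored as GKE_KEY.
encoded = base64.b64encode(key_json.encode("utf-8")).decode("ascii")

# Decode the secret and write it to GOOGLE_APPLICATION_CREDENTIALS
# (e.g. /tmp/gcloud-sa.json in the workflow examples above).
credentials_path = os.path.join(tempfile.gettempdir(), "gcloud-sa.json")
with open(credentials_path, "w") as f:
    f.write(base64.b64decode(encoded).decode("utf-8"))

# The decoded file is a valid JSON key again.
with open(credentials_path) as f:
    restored = json.load(f)
assert restored["project_id"] == "my-project"
```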
Coming soon: support for scheduling pipeline runs!