feat: generate_library.sh with postprocessing #1951
Changes from 188 commits
In the diff hunks below, when a changed line appears twice, the first occurrence is the previous version and the second is the updated one.

Changes to the GitHub Actions workflow that runs the library generation integration tests:
@@ -14,6 +14,7 @@ jobs:
      matrix:
        java: [ 8 ]
        os: [ ubuntu-22.04, macos-12 ]
        post_processing: [ 'true', 'false' ]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v3

@@ -22,11 +23,35 @@ jobs:
          java-version: ${{ matrix.java }}
          distribution: temurin
          cache: maven
      - uses: actions/setup-python@v4
        with:
          python-version: '3.11'
      - name: install docker (ubuntu)
        if: matrix.os == 'ubuntu-22.04'
        run: |
          set -x
          # install docker
          sudo apt install containerd -y
          sudo apt install -y docker.io docker-compose

          # launch docker
          sudo systemctl start docker
      - name: install docker (macos)
        if: matrix.os == 'macos-12'
        run: |
          brew update --preinstall
          brew install docker docker-compose qemu
          brew upgrade qemu
          colima start
          docker run --user $(id -u):$(id -g) --rm hello-world
      - name: Run integration tests
        run: |
          set -x
          git config --global user.email "[email protected]"
          git config --global user.name "Github Workflow"
          library_generation/test/generate_library_integration_test.sh \
            --googleapis_gen_url https://cloud-java-bot:${{ secrets.CLOUD_JAVA_BOT_GITHUB_TOKEN }}@github.com/googleapis/googleapis-gen.git
            --googleapis_gen_url https://cloud-java-bot:${{ secrets.CLOUD_JAVA_BOT_GITHUB_TOKEN }}@github.com/googleapis/googleapis-gen.git \
            --enable_postprocessing "${{ matrix.post_processing }}"
  unit_tests:
    strategy:
      matrix:
Changes to generate_library.sh:
@@ -1,7 +1,6 @@
#!/usr/bin/env bash

set -eo pipefail
set -x

# parse input parameters
while [[ $# -gt 0 ]]; do

@@ -61,10 +60,22 @@ case $key in
    include_samples="$2"
    shift
    ;;
  --enable_postprocessing)
    enable_postprocessing="$2"
    shift
    ;;
  --repository_path)
    repository_path="$2"
    shift
    ;;
  --os_architecture)
    os_architecture="$2"
    shift
    ;;
  --versions_file)
    versions_file="$2"
    shift
    ;;
  *)
    echo "Invalid option: [$1]"
    exit 1

@@ -74,6 +85,7 @@ shift # past argument or value
done

script_dir=$(dirname "$(readlink -f "$0")")
# source utility functions
source "${script_dir}"/utilities.sh
output_folder="$(get_output_folder)"

@@ -117,17 +129,20 @@ if [ -z "${include_samples}" ]; then
  include_samples="true"
fi

if [ -z "$enable_postprocessing" ]; then
  enable_postprocessing="true"
fi

if [ -z "${os_architecture}" ]; then
  os_architecture=$(detect_os_architecture)
fi

mkdir -p "${output_folder}/${destination_path}"
##################### Section 0 #####################
# prepare tooling
#####################################################
# the order of services entries in gapic_metadata.json is relevant to the
# order of proto file, sort the proto files with respect to their name to
# order of proto file, sort the proto files with respect to their bytes to
# get a fixed order.
folder_name=$(extract_folder_name "${destination_path}")
pushd "${output_folder}"
@@ -137,7 +152,7 @@ case "${proto_path}" in
    find_depth="-maxdepth 1"
    ;;
esac
proto_files=$(find "${proto_path}" ${find_depth} -type f -name "*.proto" | sort)
proto_files=$(find "${proto_path}" ${find_depth} -type f -name "*.proto" | LC_COLLATE=C sort)
# include or exclude certain protos in grpc plugin and gapic generator java.
case "${proto_path}" in
  "google/cloud")
@@ -280,5 +295,30 @@ popd # output_folder
#####################################################
pushd "${output_folder}/${destination_path}"
rm -rf java_gapic_srcjar java_gapic_srcjar_raw.srcjar.zip java_grpc.jar java_proto.jar temp-codegen.srcjar
popd
set +x
popd # destination path
##################### Section 5 #####################
# post-processing
#####################################################
source "${script_dir}/post_processing/postprocessing_utilities.sh"
if [ "${enable_postprocessing}" != "true" ];
then
  echo "post processing is disabled"
  exit 0
fi
if [ -z "${versions_file}" ]; then
  echo "no versions.txt argument provided. Please provide one in order to enable post-processing"
  exit 1
fi
workspace="${output_folder}/workspace"
if [ -d "${workspace}" ]; then
  rm -rdf "${workspace}"
fi

mkdir -p "${workspace}"

run_owlbot_postprocessor "${workspace}" \
  "${script_dir}" \
  "${output_folder}/${destination_path}" \
  "${repository_path}" \
  "${proto_path}" \
  "${versions_file}"

Review thread on the versions.txt requirement (attached to the "no versions.txt argument provided" check above):

Comment: It's OK to make it required for now since it's only for existing libraries. For new client libraries, I'm assuming we have to generate versions.txt first before calling this script? Is it generated by the new client library script currently? cc @JoeWang1127

Reply: @blakeli0 the version of a new library defaults to [...]. The new library script applies current versions, but this is for other referenced artifacts such as [...]. I think we still need a versions file for referenced artifacts other than the library being generated when dealing with a new monorepo library.
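Putting the new flags together, a hypothetical end-to-end invocation with post-processing enabled might look like this. The script location and the --proto_path/--destination_path options are assumed from the surrounding code rather than shown in this diff, and every path below is illustrative only.

# Sketch only: paths, library names and the spelling of the pre-existing
# options are assumptions, not taken from the PR.
library_generation/generate_library.sh \
  --proto_path "google/cloud/asset/v1" \
  --destination_path "google-cloud-asset-v1-java" \
  --repository_path "google-cloud-java/java-asset" \
  --versions_file "google-cloud-java/versions.txt" \
  --enable_postprocessing "true"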
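Regarding the versions.txt discussion above: the file is mounted into the post-processor container as /versions.txt (see postprocessing_utilities.sh below). A hypothetical excerpt, assuming the artifactId:released-version:current-version layout used by the Java monorepo tooling, with made-up version numbers:

# Format (assumed): artifactId:released-version:current-version
google-cloud-asset:3.20.0:3.20.1-SNAPSHOT
proto-google-cloud-asset-v1:3.20.0:3.20.1-SNAPSHOT
grpc-google-cloud-asset-v1:3.20.0:3.20.1-SNAPSHOT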
New file: post_processing/postprocessing_utilities.sh (sourced by generate_library.sh above):
@@ -0,0 +1,129 @@
#!/bin/bash
#
# Main functions to interact with owlbot post-processor and postprocessing
# scripts

# returns the metadata json path if given, or defaults to the one found in
# $repository_path
# Arguments
# 1 - repository_path: path from output_folder to the location of the library
# containing .repo-metadata. It assumes the existence of google-cloud-java in
# the output folder
# 2 - output_folder: root for the generated libraries, used in conjunction with
get_repo_metadata_json() {
  local repository_path=$1
  local output_folder=$2
  >&2 echo 'Attempting to obtain .repo-metadata.json from repository_path'
  local default_metadata_json_path="${output_folder}/${repository_path}/.repo-metadata.json"
  if [ -f "${default_metadata_json_path}" ]; then
    echo "${default_metadata_json_path}"
  else
    >&2 echo 'failed to obtain json from repository_path'
    exit 1
  fi
}

# returns the owlbot image sha contained in google-cloud-java. This is default
# behavior that may be overridden by a custom value in the future.
# Arguments
# 1 - output_folder: root for the generated libraries, used in conjunction with
# 2 - repository_root: usually "google-cloud-java". The .OwlBot.lock.yaml
# file is looked up in its .github folder
get_owlbot_sha() {
  local output_folder=$1
  local repository_root=$2
  if [ ! -d "${output_folder}/${repository_root}" ];
  then
    >&2 echo 'No repository to infer owlbot_sha was provided. This is necessary for post-processing'
    exit 1
  fi
  >&2 echo "Attempting to obtain owlbot_sha from monorepo folder"
  owlbot_sha=$(grep 'sha256' "${output_folder}/${repository_root}/.github/.OwlBot.lock.yaml" | cut -d: -f3)
  echo "${owlbot_sha}"
}

# Runs the owlbot post-processor docker image.
# Arguments
# 1 - workspace: the location of the grpc, proto and gapic libraries to be
# processed
# owlbot
# 4 - scripts_root: location of the generation scripts
# 5 - destination_path: used to transfer the raw grpc, proto and gapic libraries
# 6 - repository_path: path from output_folder to the location of the source of
# truth/pre-existing poms. This can either be a folder in google-cloud-java or
# the root of a HW library
# 7 - proto_path: googleapis path of the library. This is used to prepare the
# folder structure to run `owlbot-cli copy-code`
function run_owlbot_postprocessor {
  workspace=$1
  scripts_root=$2
  destination_path=$3
  repository_path=$4
  proto_path=$5
  versions_file=$6

  repository_root=$(echo "${repository_path}" | cut -d/ -f1)
  repo_metadata_json_path=$(get_repo_metadata_json "${repository_path}" "${output_folder}")
  owlbot_sha=$(get_owlbot_sha "${output_folder}" "${repository_root}")

  # read or infer owlbot sha

  cp "${repo_metadata_json_path}" "${workspace}"/.repo-metadata.json

  # call owl-bot-copy
  owlbot_staging_folder="${workspace}/owl-bot-staging"
  mkdir -p "${owlbot_staging_folder}"
  owlbot_postprocessor_image="gcr.io/cloud-devrel-public-resources/owlbot-java@sha256:${owlbot_sha}"

  # copy existing pom, owlbot and version files if the source of truth repo is present
  if [[ -d "${output_folder}/${repository_path}" ]]; then
    rsync -avm \
      --include='*/' \
      --include='*.xml' \
      --include='package-info.java' \
      --include='owlbot.py' \
      --include='.OwlBot.yaml' \
      --exclude='*' \
      "${output_folder}/${repository_path}/" \
      "${workspace}"
  fi

  echo 'Running owl-bot-copy'
  pre_processed_libs_folder="${destination_path}/pre-processed"
  mkdir -p "${pre_processed_libs_folder}/${proto_path}/$(basename "${destination_path}")"
  find "${destination_path}" -mindepth 1 -maxdepth 1 -type d -not -name 'pre-processed' \
    -exec cp -pr {} "${pre_processed_libs_folder}/${proto_path}/$(basename "${destination_path}")" \;
  pushd "${pre_processed_libs_folder}"
  # create an empty repository so owl-bot-copy can process this as a repo
  # (cannot process non-git-repositories)
  git init
  git commit --allow-empty -m 'empty commit'
  popd # pre_processed_libs_folder

  docker run --rm \
    --user $(id -u):$(id -g) \
    -v "${workspace}:/repo" \
    -v "${pre_processed_libs_folder}:/pre-processed-libraries" \
    -w /repo \
    --env HOME=/tmp \
    gcr.io/cloud-devrel-public-resources/owlbot-cli:latest \
    copy-code \
    --source-repo-commit-hash=none \
    --source-repo=/pre-processed-libraries \
    --config-file=.OwlBot.yaml

  echo 'running owl-bot post-processor'
  versions_file_arg=""
  if [ -f "${versions_file}" ]; then
    versions_file_arg="-v ${versions_file}:/versions.txt"
  fi
  # run the postprocessor
  docker run --rm \
    -v "${workspace}:/workspace" \
    ${versions_file_arg} \
    --user $(id -u):$(id -g) \
    "${owlbot_postprocessor_image}"
}

Review thread on the repository_path argument (attached to the run_owlbot_postprocessor documentation above):

Comment: I still don't fully understand the usefulness of [...]

Reply: @blakeli0 the pre-existing poms with their folder structure are indeed necessary for the postprocessor. We need [...] To clarify, the purpose of [...]

Reply: As discussed, I removed usage of [...] NOTE: The usage of owlbot copy works with a yaml file expecting a certain input folder structure. For example, that yaml has a [...]

Reply: I'm yet to update the readme.
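To make the copy-code step more concrete, the following sketch, using assumed example values, shows the staging layout that run_owlbot_postprocessor prepares. The path prefix mirrors googleapis-gen so that the copy rules in the repository's .OwlBot.yaml (passed via --config-file) can match the generated sources.

# Example values only; in reality these come from generate_library.sh arguments.
proto_path="google/cloud/asset/v1"
destination_path="google-cloud-asset-v1-java"

# The raw gapic/grpc/proto folders are copied under
#   <destination_path>/pre-processed/<proto_path>/<basename of destination_path>/
# e.g. google-cloud-asset-v1-java/pre-processed/google/cloud/asset/v1/google-cloud-asset-v1-java/
pre_processed_libs_folder="${destination_path}/pre-processed"
mkdir -p "${pre_processed_libs_folder}/${proto_path}/$(basename "${destination_path}")"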
Review thread on using a single path parameter:

Comment: Do you think it's possible to only have destination_path, rather than having two parameters with similar meanings? After enabling post-processing, we can create a temp folder to store the pre-processed library. In order to use the owlbot CLI, the temp folder needs to satisfy a path prefix, which can be inferred from proto_path.

Reply: The only interesting situation is with apigee-connect. We have the monorepo folder called apigee-connect, whereas the googleapis-gen folder is apigeeconnect. This could be solved by inspecting the repo_short entry in .repo-metadata.json. Edit: we will still need a path to such a json file, which is currently obtained from repository_path. I think we can do it the other way around: use only repository_path and infer destination_path from the json.

Reply: On a second note, it is also possible to get the destination path via googleapis-gen following proto_path. That will lead us to the destination_path already set in proto_path_list (e.g. google-cloud-asset-v1-java).

Comment: I agree with Joe as well, we should try to only use one path. I still don't quite get the need for an additional repo path, can you please explain more about it?

Reply: I added this logic: we infer destination_path from googleapis-gen at the specified proto_path.