Hi all!
I am looking for a tool (or a set of multiple tools) that suits the scenario described in the attached picture below. The goal is to establish a build pipeline which builds (from the existing repository on Bitbucket) and deploys (into AWS S3) continuously, on a trigger. Everything should happen in the cloud. GitHub Actions is not an option, because the cloud-hosted runners are way too small. We already use Jenkins, but we want to move away from it, as it is self-hosted.
I hope that somebody has experience with particular cloud-based CI/CD tools which could meet the described scenario. I would appreciate every hint or idea; I don't expect detailed solutions.
Let me describe the desired scenario in more detail:
The repository is currently stored on Bitbucket, along with three more directories which are needed later for the build process. Alternatively, those directories could be stored in an online storage or LFS service, pulled from there into the dedicated cache/working directory of the particular build agent, and stay persistent there.
The directories are large: one is currently about 60 GB, and some of the contained files are larger than 1 GB. Build tools (like CMake, MSBuild, g++, etc.) should also remain persistent inside the cache once installed, as should environment variables and configuration files.
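To put a number on why persistence matters here: re-fetching the 60 GB directory on every run would by itself blow the one-minute start-up budget mentioned below. A quick back-of-the-envelope sketch (the 1 Gbit/s link speed is just an assumption for illustration):

```python
def transfer_seconds(size_gb: float, gbit_per_s: float) -> float:
    """Time in seconds to move size_gb gigabytes over a gbit_per_s link."""
    return size_gb * 8 / gbit_per_s

# Re-downloading the 60 GB cache over an assumed 1 Gbit/s link:
print(transfer_seconds(60, 1))   # -> 480.0 seconds, i.e. about 8 minutes
# Even over 10 Gbit/s it is still the better part of a minute:
print(transfer_seconds(60, 10))  # -> 48.0 seconds
```

So any candidate tool really does need a persistent cache rather than a fresh download per build.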
The build agents (as shown in the attached picture) should be highly scalable in CPU cores, RAM, and number of agents. A build agent should fire up within one minute, build the artifact, and deploy it into an AWS S3 bucket which is already used by the current, self-hosted build pipeline.
Requirements for the agents and their cache/working directory:
- should allow particular directories to be persistent
- working directory/cache should be big enough (> 100 GB)
- no limitation on file size
- secure handling of secrets
- should be in the cloud
- should be scalable in CPU cores and RAM
- should support Linux/Windows/macOS/Android
- boot-up time of an agent should be < 1 min
- should support Bitbucket and AWS S3
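One S3-specific note on the "no limitation on file size" point: whichever tool you pick, S3 itself caps a single PUT at 5 GB, so anything larger must go through multipart upload (objects up to 5 TiB, parts between 5 MiB and 5 GiB, at most 10,000 parts). Most clients, e.g. the AWS CLI, handle this automatically, but if a tool rolls its own uploader, the part-size math looks like this (a minimal sketch; the helper name is my own):

```python
import math

MIB = 1024 * 1024
MAX_PARTS = 10_000       # S3 multipart upload: at most 10,000 parts per object
MIN_PART_SIZE = 5 * MIB  # S3 minimum part size (except for the last part)

def choose_part_size(total_bytes: int) -> int:
    """Smallest whole-MiB part size that fits total_bytes into <= 10,000 parts."""
    needed = max(MIN_PART_SIZE, math.ceil(total_bytes / MAX_PARTS))
    return math.ceil(needed / MIB) * MIB

# A 1 GiB artifact fits comfortably with the 5 MiB minimum:
print(choose_part_size(1 * 1024**3) // MIB)   # -> 5
# A 60 GiB archive needs slightly larger parts to stay under 10,000:
print(choose_part_size(60 * 1024**3) // MIB)  # -> 7
```

Worth checking during evaluation that the candidate tool's S3 deploy step does multipart uploads, otherwise the >1 GB files are fine but anything beyond 5 GB will fail.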