```
├── .editorconfig
├── .gitattributes
├── .github/
│   ├── CODEOWNERS
│   ├── CODE_OF_CONDUCT.md
│   ├── CONTRIBUTOR_AND_GUIDES/
│   │   ├── CODE-AUDIT.md
│   │   ├── CONTRIBUTING.md
│   │   ├── USER_SUBMITTED_GUIDES.md
│   │   ├── ct/
│   │   │   ├── AppName.md
│   │   │   ├── AppName.sh
│   │   ├── install/
│   │   │   ├── AppName-install.md
│   │   │   ├── AppName-install.sh
│   │   ├── json/
│   │   │   ├── AppName.json
│   │   │   ├── AppName.md
│   ├── DISCUSSION_TEMPLATE/
│   │   ├── request-script.yml
│   ├── FUNDING.yml
│   ├── ISSUE_TEMPLATE/
│   │   ├── bug_report.yml
│   │   ├── config.yml
│   │   ├── feature_request.yml
│   │   ├── frontend_report.yml
│   │   ├── task.yml
│   ├── autolabeler-config.json
│   ├── changelog-pr-config.json
│   ├── pull_request_template.md
│   ├── runner/
│   │   ├── docker/
│   │   │   ├── gh-runner-self.dockerfile
│   ├── workflows/
│   │   ├── auto-update-app-headers.yml
│   │   ├── autolabeler.yml
│   │   ├── changelog-pr.yml
│   │   ├── close-discussion.yml
│   │   ├── close-ttek-issues.yaml
│   │   ├── close_issue_in_dev.yaml
│   │   ├── crawl-versions.yaml
│   │   ├── create-docker-for-runner.yml
│   │   ├── delete-json-branch.yml
│   │   ├── frontend-cicd.yml
│   │   ├── github-release.yml
│   │   ├── script-test.yml
│   │   ├── script_format.yml
│   │   ├── scripts/
│   │   │   ├── app-test/
│   │   │   │   ├── pr-alpine-install.func
│   │   │   │   ├── pr-build.func
│   │   │   │   ├── pr-create-lxc.sh
│   │   │   │   ├── pr-install.func
│   │   │   ├── generate-app-headers.sh
│   │   │   ├── update-json.sh
│   │   │   ├── update_json_date.sh
│   │   ├── update-json-date.yml
│   │   ├── validate-filenames.yml
├── .gitignore
├── .vscode/
│   ├── .editorconfig
│   ├── extensions.json
│   ├── settings.json
├── CHANGELOG.md
```

## /.editorconfig

```editorconfig path="/.editorconfig"
; editorconfig.org
root = true

[*]
charset = utf-8
continuation_indent_size = 2
end_of_line = lf
indent_size = 2
indent_style = space
insert_final_newline = true
max_line_length = 120
tab_width = 2
; trim_trailing_whitespace = true ; disabled until files are cleaned up

[*.md]
trim_trailing_whitespace = false
```

## /.gitattributes

Binary file available at https://raw.githubusercontent.com/community-scripts/ProxmoxVE/refs/heads/main/.gitattributes

## /.github/CODEOWNERS

```github/CODEOWNERS path="/.github/CODEOWNERS"
#
# CODEOWNERS for ProxmoxVE
#
# Order is important; the last matching pattern takes the most
# precedence.

# Codeowners for specific folders and files
# Remember ending folders with /

# Set default reviewers
* @community-scripts/Contributor

# All changes in frontend
/frontend/ @community-scripts/Frontend-Dev
```

## /.github/CODE_OF_CONDUCT.md

# Contributor Covenant Code of Conduct

## Our Pledge

We as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation.

We pledge to act and interact in ways that contribute to an open, welcoming, diverse, inclusive, and healthy community.

## Our Standards

Examples of behavior that contributes to a positive environment for our community include:

* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes, and learning from the experience
* Focusing on what is best not just for us as individuals, but for the overall community

Examples of unacceptable behavior include:

* The use of sexualized language or imagery, and sexual attention or advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a professional setting

## Enforcement Responsibilities

Community leaders are responsible for clarifying and enforcing our standards of acceptable behavior and will take appropriate and fair corrective action in response to any behavior that they deem
inappropriate, threatening, offensive, or harmful.

Community leaders have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, and will communicate reasons for moderation decisions when appropriate.

## Scope

This Code of Conduct applies within all community spaces, and also applies when an individual is officially representing the community in public spaces. Examples of representing our community include using an official e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported to the community leaders responsible for enforcement. All complaints will be reviewed and investigated promptly and fairly.

All community leaders are obligated to respect the privacy and security of the reporter of any incident.

## Enforcement Guidelines

Community leaders will follow these Community Impact Guidelines in determining the consequences for any action they deem in violation of this Code of Conduct:

### 1. Correction

**Community Impact**: Use of inappropriate language or other behavior deemed unprofessional or unwelcome in the community.

**Consequence**: A private, written warning from community leaders, providing clarity around the nature of the violation and an explanation of why the behavior was inappropriate. A public apology may be requested.

### 2. Warning

**Community Impact**: A violation through a single incident or series of actions.

**Consequence**: A warning with consequences for continued behavior. No interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, for a specified period of time. This includes avoiding interactions in community spaces as well as external channels like social media.
Violating these terms may lead to a temporary or permanent ban.

### 3. Temporary Ban

**Community Impact**: A serious violation of community standards, including sustained inappropriate behavior.

**Consequence**: A temporary ban from any sort of interaction or public communication with the community for a specified period of time. No public or private interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, is allowed during this period. Violating these terms may lead to a permanent ban.

### 4. Permanent Ban

**Community Impact**: Demonstrating a pattern of violation of community standards, including sustained inappropriate behavior, harassment of an individual, or aggression toward or disparagement of classes of individuals.

**Consequence**: A permanent ban from any sort of public interaction within the community.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 2.0, available at [https://www.contributor-covenant.org/version/2/0/code_of_conduct.html][v2.0].

Community Impact Guidelines were inspired by [Mozilla's code of conduct enforcement ladder][Mozilla CoC].

For answers to common questions about this code of conduct, see the FAQ at [https://www.contributor-covenant.org/faq][FAQ]. Translations are available at [https://www.contributor-covenant.org/translations][translations].

[homepage]: https://www.contributor-covenant.org
[v2.0]: https://www.contributor-covenant.org/version/2/0/code_of_conduct.html
[Mozilla CoC]: https://github.com/mozilla/diversity
[FAQ]: https://www.contributor-covenant.org/faq
[translations]: https://www.contributor-covenant.org/translations

## /.github/CONTRIBUTOR_AND_GUIDES/CODE-AUDIT.md

# Exploring the Scripts and Steps Involved in an Application LXC Installation

1) [adguard.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/ct/adguard.sh): This script collects system parameters. (It also holds the function to update the application.)
2) [build.func](https://github.com/community-scripts/ProxmoxVE/blob/main/misc/build.func): Adds user settings and integrates collected information.
3) [create_lxc.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/ct/create_lxc.sh): Constructs the LXC container.
4) [adguard-install.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/install/adguard-install.sh): Executes functions from [install.func](https://github.com/community-scripts/ProxmoxVE/blob/main/misc/install.func) and installs the application.
5) [adguard.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/ct/adguard.sh) (again): Displays the completion message.

The installation process uses reusable scripts: [build.func](https://github.com/community-scripts/ProxmoxVE/blob/main/misc/build.func), [create_lxc.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/ct/create_lxc.sh), and [install.func](https://github.com/community-scripts/ProxmoxVE/blob/main/misc/install.func), which are not specific to any particular application.

To gain a better understanding, focus on reviewing [adguard-install.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/install/adguard-install.sh). This script contains the commands and configurations for installing and configuring AdGuard Home within the LXC container.

## /.github/CONTRIBUTOR_AND_GUIDES/CONTRIBUTING.md

# Community Scripts Contribution Guide

## **Welcome to the community-scripts Repository!** 📜

These documents outline the essential coding standards for all our scripts and JSON files. Adhering to these standards ensures that our codebase remains consistent, readable, and maintainable. By following these guidelines, we can improve collaboration, reduce errors, and enhance the overall quality of our project.
### Why Coding Standards Matter

Coding standards are crucial for several reasons:

1. **Consistency**: Consistent code is easier to read, understand, and maintain. It helps new team members quickly get up to speed and reduces the learning curve.
2. **Readability**: Clear and well-structured code is easier to debug and extend. It allows developers to quickly identify and fix issues.
3. **Maintainability**: Code that follows a standard structure is easier to refactor and update. It ensures that changes can be made with minimal risk of introducing new bugs.
4. **Collaboration**: When everyone follows the same standards, it becomes easier to collaborate on code. It reduces friction and misunderstandings during code reviews and merges.

### Scope of These Documents

These documents cover the coding standards for the following types of files in our project:

- **`install/$AppName-install.sh` Scripts**: These scripts are responsible for the installation of applications.
- **`ct/$AppName.sh` Scripts**: These scripts handle the creation and updating of containers.
- **`frontend/public/json/$AppName.json`**: These files store structured data and are used for the website.

Each section provides detailed guidelines on various aspects of coding, including shebang usage, comments, variable naming, function naming, indentation, error handling, command substitution, quoting, script structure, and logging. Additionally, examples are provided to illustrate the application of these standards.

By following the coding standards outlined in this document, we ensure that our scripts and JSON files are of high quality, making our project more robust and easier to manage. Please refer to this guide whenever you create or update scripts and JSON files to maintain a high standard of code quality across the project. 📚🔍

Let's work together to keep our codebase clean, efficient, and maintainable! 💪🚀

## Getting Started

Before contributing, please ensure that you have the following setup:

1.
**Visual Studio Code** (recommended for script development)
2. **Recommended VS Code Extensions:**
   - [Shell Syntax](https://marketplace.visualstudio.com/items?itemName=bmalehorn.shell-syntax)
   - [ShellCheck](https://marketplace.visualstudio.com/items?itemName=timonwong.shellcheck)
   - [Shell Format](https://marketplace.visualstudio.com/items?itemName=foxundermoon.shell-format)

### Important Notes

- Use [AppName.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/ct/AppName.sh) and [AppName-install.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/install/AppName-install.sh) as templates when creating new scripts.
- The final version of the script (the one you will push for review) must have all comments removed, except the ones in the file header.

---

# 🚀 The Application Script (ct/AppName.sh)

- You can find all coding standards, as well as the structure for this file, [here](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/ct/AppName.md).
- These scripts are responsible for container creation, setting the necessary variables, and handling the update of the application once installed.

---

# 🛠 The Installation Script (install/AppName-install.sh)

- You can find all coding standards, as well as the structure for this file, [here](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/install/AppName-install.md).
- These scripts are responsible for the installation of the application.

---

## 🚀 Building Your Own Scripts

Start with the [template script](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/install/AppName-install.sh).

---

## 🤝 Contribution Process

All PRs related to new scripts should be made against our Dev repository first, where we can test the scripts before they are pushed and merged in the official repository.
**Our Dev repo is `https://github.com/community-scripts/ProxmoxVED`**

You will need to adjust paths mentioned further down this document to match the repo you're pushing the scripts to.

### 1. Fork the repository

Fork to your GitHub account.

### 2. Clone your fork on your local environment

```bash
git clone https://github.com/yourUserName/ForkName
```

### 3. Create a new branch

```bash
git switch -c your-feature-branch
```

### 4. Change paths in build.func, install.func and AppName.sh

To be able to develop from your own branch you need to change:\
`https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main`\
to\
`https://raw.githubusercontent.com/[USER]/[REPOSITORY]/refs/heads/[BRANCH]`\
in the following files:

`misc/build.func`\
`misc/install.func`\
`ct/AppName.sh`

Example: `https://raw.githubusercontent.com/tremor021/ProxmoxVE/refs/heads/testbranch`

You also need to change:\
`https://raw.githubusercontent.com/community-scripts/ProxmoxVE/raw/main`\
to\
`https://raw.githubusercontent.com/[USER]/[REPOSITORY]/raw/[BRANCH]`\
in `misc/install.func` in order for the `update` shell command to work.

These changes apply only while you are writing and testing your scripts. Before opening a Pull Request, change all of the above-mentioned paths in `misc/build.func`, `misc/install.func` and `ct/AppName.sh` back to the original paths.

### 5. Commit changes (without build.func and install.func!)

```bash
git commit -m "Your commit message"
```

### 6. Push to your fork

```bash
git push origin your-feature-branch
```

### 7. Create a Pull Request

Open a Pull Request from your feature branch to the main branch on the Dev repository. You must only include your **$AppName.sh**, **$AppName-install.sh** and **$AppName.json** files in the pull request.
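The URL swap described above is mechanical, so it can be scripted. Below is a minimal sketch, not part of the repo's tooling: the `point_at_fork` helper and the placeholder user/repo/branch values are hypothetical, and GNU `sed -i` is assumed. The demonstration rewrites a scratch file rather than the real `misc/build.func`, `misc/install.func` and `ct/AppName.sh`.

```bash
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical helper: rewrite the community-scripts raw URL in each file
# passed after USER REPO BRANCH so that it points at your fork instead.
point_at_fork() {
  local user="$1" repo="$2" branch="$3"
  shift 3
  sed -i \
    "s|raw.githubusercontent.com/community-scripts/ProxmoxVE/main|raw.githubusercontent.com/${user}/${repo}/refs/heads/${branch}|g" \
    "$@"
}

# Demonstration against a scratch file:
tmp=$(mktemp)
echo 'source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)' >"$tmp"
point_at_fork yourUserName ProxmoxVE your-feature-branch "$tmp"
cat "$tmp"
rm -f "$tmp"
```

Remember that the swap has to be reversed (or the files re-cloned) before you open the Pull Request.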
---

## 📚 Pages

- [CT Template: AppName.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/ct/AppName.sh)
- [Install Template: AppName-install.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/install/AppName-install.sh)
- [JSON Template: AppName.json](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/json/AppName.json)

## /.github/CONTRIBUTOR_AND_GUIDES/USER_SUBMITTED_GUIDES.md

# User Submitted Guides

In order to contribute a guide on installing with Proxmox VE Helper Scripts, you should open a pull request that adds the guide to the `USER_SUBMITTED_GUIDES.md` file.

[Proxmox Automation with Proxmox Helper Scripts!](https://www.youtube.com/watch?v=kcpu4z5eSEU)

[Installing Home Assistant OS using Proxmox 8](https://community.home-assistant.io/t/installing-home-assistant-os-using-proxmox-8/201835)

[How To Separate Zigbee2MQTT From Home Assistant In Proxmox](https://smarthomescene.com/guides/how-to-separate-zigbee2mqtt-from-home-assistant-in-proxmox/)

[How To Install Home Assistant On Proxmox: The Easy Way](https://smarthomescene.com/guides/how-to-install-home-assistant-on-proxmox-the-easy-way/)

[Home Assistant: Installing InfluxDB (LXC)](https://www.derekseaman.com/2023/04/home-assistant-installing-influxdb-lxc.html)

[Home Assistant: Proxmox Quick Start Guide](https://www.derekseaman.com/2023/10/home-assistant-proxmox-ve-8-0-quick-start-guide-2.html)

[Home Assistant: Installing Grafana (LXC) with Let’s Encrypt SSL](https://www.derekseaman.com/2023/04/home-assistant-installing-grafana-lxc.html)

[Proxmox: Plex LXC with Alder Lake Transcoding](https://www.derekseaman.com/2023/04/proxmox-plex-lxc-with-alder-lake-transcoding.html)

[How To Backup Home Assistant In Proxmox](https://smarthomescene.com/guides/how-to-backup-home-assistant-in-proxmox/)

[Running Frigate on Proxmox](https://www.homeautomationguy.io/blog/running-frigate-on-proxmox)

[Frigate VM on Proxmox with PCIe Coral TPU](https://www.derekseaman.com/2023/06/home-assistant-frigate-vm-on-proxmox-with-pcie-coral-tpu.html)

[Moving Home Assistant’s Database To MariaDB On Proxmox](https://smarthomescene.com/guides/moving-home-assistants-database-to-mariadb-on-proxmox/)

[How-to: Proxmox VE 7.4 to 8.0 Upgrade](https://www.derekseaman.com/2023/06/how-to-proxmox-7-4-to-8-0-upgrade.html)

[iGPU Transcoding In Proxmox with Jellyfin](https://www.youtube.com/watch?v=XAa_qpNmzZs)

[Proxmox + NetData]()

[Proxmox Homelab Series]()

[The fastest installation of Docker and Portainer on Proxmox VE](https://lavr.site/en-fastest-install-docker-portainer-proxmox/)

[How To Setup Proxmox Backup Server Using Helper Scripts]()

## /.github/CONTRIBUTOR_AND_GUIDES/ct/AppName.md

# **AppName.sh Scripts**

`AppName.sh` scripts are found in the `/ct` directory. These scripts are responsible for the installation of the desired application. For this guide we take `/ct/snipeit.sh` as an example.

## Table of Contents

- [**AppName.sh Scripts**](#appnamesh-scripts)
  - [Table of Contents](#table-of-contents)
  - [1. **File Header**](#1-file-header)
    - [1.1 **Shebang**](#11-shebang)
    - [1.2 **Import Functions**](#12-import-functions)
    - [1.3 **Metadata**](#13-metadata)
  - [2 **Variables and function import**](#2-variables-and-function-import)
    - [2.1 **Default Values**](#21-default-values)
    - [2.2 **📋 App output \& base settings**](#22--app-output--base-settings)
    - [2.3 **🛠 Core functions**](#23--core-functions)
  - [3 **Update function**](#3-update-function)
    - [3.1 **Function Header**](#31-function-header)
    - [3.2 **Check APP**](#32-check-app)
    - [3.3 **Check version**](#33-check-version)
    - [3.4 **Verbosity**](#34-verbosity)
    - [3.5 **Backups**](#35-backups)
    - [3.6 **Cleanup**](#36-cleanup)
    - [3.7 **No update function**](#37-no-update-function)
  - [4 **End of the script**](#4-end-of-the-script)
  - [5. **Contribution checklist**](#5-contribution-checklist)

## 1. **File Header**

### 1.1 **Shebang**

- Use `#!/usr/bin/env bash` as the shebang.

```bash
#!/usr/bin/env bash
```

### 1.2 **Import Functions**

- Import the build.func file.
- When developing your own script, change the URL to your own repository.
> [!IMPORTANT]
> You also need to change all appearances of this URL in `misc/build.func` and `misc/install.func`

Example for development:

```bash
source <(curl -s https://raw.githubusercontent.com/[USER]/[REPO]/refs/heads/[BRANCH]/misc/build.func)
```

Final script:

```bash
source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
```

> [!CAUTION]
> Before opening a Pull Request, change the URLs to point to the community-scripts repo.

### 1.3 **Metadata**

- Add clear comments for script metadata, including author, copyright, and license information.

Example:

```bash
# Copyright (c) 2021-2025 community-scripts ORG
# Author: [YourUserName]
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: [SOURCE_URL]
```

> [!NOTE]
>
> - Add your username and source URL
> - For existing scripts, add "| Co-Author [YourUserName]" after the current author
> - Source is the URL of the GitHub repo containing the source files of the application you're installing (not the URL of your homepage or a blog)

---

## 2 **Variables and function import**

> [!NOTE]
> You need to have all of this set in your script, otherwise it will not work!

### 2.1 **Default Values**

- This section sets the default values for the container.
- `APP` needs to be set to the application name and must be equal to the filenames of your scripts.
- `var_tags`: You can set tags for the CT which show up in the Proxmox UI. Don't overdo it!
> [!NOTE]
> Description for all Default Values
>
> | Variable | Description | Notes |
> |----------|-------------|-------|
> | `APP` | Application name | Must match `ct/AppName.sh` |
> | `var_tags` | Proxmox display tags without spaces, separated only by `;` | Limit the number to 2 |
> | `var_cpu` | CPU cores | Number of cores |
> | `var_ram` | RAM | In MB |
> | `var_disk` | Disk capacity | In GB |
> | `var_os` | Operating system | alpine, debian, ubuntu |
> | `var_version` | OS version | e.g., 3.20, 11, 12, 20.04 |
> | `var_unprivileged` | Container type | 1 = Unprivileged, 0 = Privileged |

Example:

```bash
APP="SnipeIT"
var_tags="asset-management;foss"
var_cpu="2"
var_ram="2048"
var_disk="4"
var_os="debian"
var_version="12"
var_unprivileged="1"
```

## 2.2 **📋 App output & base settings**

```bash
header_info "$APP"
```

- `header_info`: Generates the ASCII header for APP

## 2.3 **🛠 Core functions**

```bash
variables
color
catch_errors
```

- `variables`: Processes input and prepares variables
- `color`: Sets icons, colors, and formatting
- `catch_errors`: Enables error handling

---

## 3 **Update function**

### 3.1 **Function Header**

- If applicable, write a function that updates the application and the OS in the container.
- Each update function starts with the same code:

```bash
function update_script() {
  header_info
  check_container_storage
  check_container_resources
```

### 3.2 **Check APP**

- Before doing anything update-wise, check if the app is installed in the container.

Example:

```bash
if [[ ! -d /opt/snipe-it ]]; then
  msg_error "No ${APP} Installation Found!"
  exit
fi
```

### 3.3 **Check version**

- Before updating, check if a new version exists.
- We use the `${APPLICATION}_version.txt` file created in `/opt` during the install to compare new versions against the currently installed version.

Example with a GitHub release:

```bash
RELEASE=$(curl -fsSL https://api.github.com/repos/snipe/snipe-it/releases/latest | grep "tag_name" | awk '{print substr($2, 3, length($2)-4) }')
if [[ ! -f /opt/${APP}_version.txt ]] || [[ "${RELEASE}" != "$(cat /opt/${APP}_version.txt)" ]]; then
  msg_info "Updating ${APP} to v${RELEASE}"
  #DO UPDATE
else
  msg_ok "No update required. ${APP} is already at v${RELEASE}."
fi
exit
}
```

### 3.4 **Verbosity**

- Use the appropriate flag (**-q** in the examples) for a command to suppress its output.

Example:

```bash
wget -q
unzip -q
```

- If a command does not come with this functionality, use `$STD` to suppress its output.

Example:

```bash
$STD php artisan migrate --force
$STD php artisan config:clear
```

### 3.5 **Backups**

- Back up user data if necessary.
- Move all user data back into the directory when the update is finished.

> [!NOTE]
> This is not meant to be a permanent backup

Example backup:

```bash
mv /opt/snipe-it /opt/snipe-it-backup
```

Example config restore:

```bash
cp /opt/snipe-it-backup/.env /opt/snipe-it/.env
cp -r /opt/snipe-it-backup/public/uploads/ /opt/snipe-it/public/uploads/
cp -r /opt/snipe-it-backup/storage/private_uploads /opt/snipe-it/storage/private_uploads
```

### 3.6 **Cleanup**

- Do not forget to remove any temporary files/folders such as zip files or temporary backups.

Example:

```bash
rm -rf /opt/v${RELEASE}.zip
rm -rf /opt/snipe-it-backup
```

### 3.7 **No update function**

- In case you cannot provide an update function, use the following code to provide user feedback.

```bash
function update_script() {
  header_info
  check_container_storage
  check_container_resources
  if [[ ! -d /opt/snipeit ]]; then
    msg_error "No ${APP} Installation Found!"
    exit
  fi
  msg_error "Currently we don't provide an update function for this ${APP}."
  exit
}
```

---

## 4 **End of the script**

- `start`: Launches the Whiptail dialogue
- `build_container`: Collects and integrates user settings
- `description`: Sets the LXC container description
- With `echo -e "${TAB}${GATEWAY}${BGN}http://${IP}${CL}"` you can point the user to the IP:PORT/folder needed to access the app.
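The version gate from section 3.3 can be factored into a small predicate. This is a sketch only: the version-file path is parameterized here so it can be exercised outside a container, whereas real scripts hard-code `/opt/${APP}_version.txt` and report through `msg_info`/`msg_ok` from build.func.

```bash
#!/usr/bin/env bash

# update_needed RELEASE VERSION_FILE
# True when no version file exists yet (first update run) or when the
# crawled release differs from the recorded one.
update_needed() {
  local release="$1" vfile="$2"
  [[ ! -f "$vfile" ]] || [[ "$release" != "$(cat "$vfile")" ]]
}

# Demonstration against a scratch file:
vfile=$(mktemp)
echo "2.1.0" >"$vfile"
update_needed "2.2.0" "$vfile" && echo "update required"
update_needed "2.1.0" "$vfile" || echo "already current"
rm -f "$vfile"
```

Checking `! -f` first means a missing version file (for example, after an install predating this convention) triggers an update rather than an error.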
```bash
start
build_container
description

msg_ok "Completed Successfully!\n"
echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
echo -e "${INFO}${YW} Access it using the following URL:${CL}"
echo -e "${TAB}${GATEWAY}${BGN}http://${IP}${CL}"
```

---

## 5. **Contribution checklist**

- [ ] Shebang is correctly set (`#!/usr/bin/env bash`).
- [ ] Correct link to *build.func*
- [ ] Metadata (author, license) is included at the top.
- [ ] Variables follow naming conventions.
- [ ] Update function exists.
- [ ] Update function checks if the app is installed and checks for a new version.
- [ ] Update function cleans up temporary files.
- [ ] Script ends with a helpful message for the user to reach the application.

## /.github/CONTRIBUTOR_AND_GUIDES/ct/AppName.sh

```sh path="/.github/CONTRIBUTOR_AND_GUIDES/ct/AppName.sh"
#!/usr/bin/env bash
source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
# Copyright (c) 2021-2025 community-scripts ORG
# Author: [YourUserName]
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: [SOURCE_URL]

# App Default Values
APP="[APP_NAME]"                  # Name of the app (e.g. Google, Adventurelog, Apache-Guacamole)
var_tags="[TAGS]"                 # Tags for Proxmox VE, maximum 2 pcs., no spaces allowed, separated by a semicolon ; (e.g. database | adblock;dhcp)
var_cpu="[CPU]"                   # Number of cores (1-X) (e.g. 4) - default is 2
var_ram="[RAM]"                   # Amount of used RAM in MB (e.g. 2048 or 4096)
var_disk="[DISK]"                 # Amount of used disk space in GB (e.g. 4 or 10)
var_os="[OS]"                     # Default OS (e.g. debian, ubuntu, alpine)
var_version="[VERSION]"           # Default OS version (e.g. 12 for debian, 24.04 for ubuntu, 3.20 for alpine)
var_unprivileged="[UNPRIVILEGED]" # 1 = unprivileged container, 0 = privileged container

header_info "$APP"
variables
color
catch_errors

function update_script() {
  header_info
  check_container_storage
  check_container_resources

  # Check if installation is present | -f for file, -d for folder
  if [[ ! -f [INSTALLATION_CHECK_PATH] ]]; then
    msg_error "No ${APP} Installation Found!"
    exit
  fi

  # Crawling the new version and checking whether an update is required
  RELEASE=$(curl -fsSL [RELEASE_URL] | [PARSE_RELEASE_COMMAND])
  if [[ "${RELEASE}" != "$(cat /opt/${APP}_version.txt)" ]] || [[ ! -f /opt/${APP}_version.txt ]]; then
    # Stopping Services
    msg_info "Stopping $APP"
    systemctl stop [SERVICE_NAME]
    msg_ok "Stopped $APP"

    # Creating Backup
    msg_info "Creating Backup"
    tar -czf "/opt/${APP}_backup_$(date +%F).tar.gz" [IMPORTANT_PATHS]
    msg_ok "Backup Created"

    # Execute Update
    msg_info "Updating $APP to v${RELEASE}"
    [UPDATE_COMMANDS]
    msg_ok "Updated $APP to v${RELEASE}"

    # Starting Services
    msg_info "Starting $APP"
    systemctl start [SERVICE_NAME]
    msg_ok "Started $APP"

    # Cleaning up
    msg_info "Cleaning Up"
    rm -rf [TEMP_FILES]
    msg_ok "Cleanup Completed"

    # Last Action
    echo "${RELEASE}" >/opt/${APP}_version.txt
    msg_ok "Update Successful"
  else
    msg_ok "No update required. ${APP} is already at v${RELEASE}"
  fi
  exit
}

start
build_container
description

msg_ok "Completed Successfully!\n"
echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
echo -e "${INFO}${YW} Access it using the following URL:${CL}"
echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:[PORT]${CL}"
```

## /.github/CONTRIBUTOR_AND_GUIDES/install/AppName-install.md

# **AppName-install.sh Scripts**

`AppName-install.sh` scripts are found in the `/install` directory. These scripts are responsible for the installation of the application. For this guide we take `/install/snipeit-install.sh` as an example.
## Table of Contents

- [**AppName-install.sh Scripts**](#appname-installsh-scripts)
  - [Table of Contents](#table-of-contents)
  - [1. **File header**](#1-file-header)
    - [1.1 **Shebang**](#11-shebang)
    - [1.2 **Comments**](#12-comments)
    - [1.3 **Variables and function import**](#13-variables-and-function-import)
  - [2. **Variable naming and management**](#2-variable-naming-and-management)
    - [2.1 **Naming conventions**](#21-naming-conventions)
  - [3. **Dependencies**](#3-dependencies)
    - [3.1 **Install all at once**](#31-install-all-at-once)
    - [3.2 **Collapse dependencies**](#32-collapse-dependencies)
  - [4. **Paths to application files**](#4-paths-to-application-files)
  - [5. **Version management**](#5-version-management)
    - [5.1 **Install the latest release**](#51-install-the-latest-release)
    - [5.2 **Save the version for update checks**](#52-save-the-version-for-update-checks)
  - [6. **Input and output management**](#6-input-and-output-management)
    - [6.1 **User feedback**](#61-user-feedback)
    - [6.2 **Verbosity**](#62-verbosity)
  - [7. **String/File Manipulation**](#7-stringfile-manipulation)
    - [7.1 **File Manipulation**](#71-file-manipulation)
  - [8. **Security practices**](#8-security-practices)
    - [8.1 **Password generation**](#81-password-generation)
    - [8.2 **File permissions**](#82-file-permissions)
  - [9. **Service Configuration**](#9-service-configuration)
    - [9.1 **Configuration files**](#91-configuration-files)
    - [9.2 **Credential management**](#92-credential-management)
    - [9.3 **Environment files**](#93-environment-files)
    - [9.4 **Services**](#94-services)
  - [10. **Cleanup**](#10-cleanup)
    - [10.1 **Remove temporary files**](#101-remove-temporary-files)
    - [10.2 **Autoremove and autoclean**](#102-autoremove-and-autoclean)
  - [11. **Best Practices Checklist**](#11-best-practices-checklist)
  - [Example: High-Level Script Flow](#example-high-level-script-flow)

## 1. **File header**

### 1.1 **Shebang**

- Use `#!/usr/bin/env bash` as the shebang.
```bash #!/usr/bin/env bash ``` ### 1.2 **Comments** - Add clear comments for script metadata, including author, copyright, and license information. - Use meaningful inline comments to explain complex commands or logic. Example: ```bash # Copyright (c) 2021-2025 community-scripts ORG # Author: [YourUserName] # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE # Source: [SOURCE_URL] ``` > [!NOTE]: > > - Add your username > - When updating/reworking scripts, add "| Co-Author [YourUserName]" > - Source is a URL of github repo containting source files of the application you're installing (not URL of your homepage or a blog) ### 1.3 **Variables and function import** - This sections adds the support for all needed functions and variables. ```bash source /dev/stdin <<<"$FUNCTIONS_FILE_PATH" color verb_ip6 catch_errors setting_up_container network_check update_os ``` --- ## 2. **Variable naming and management** ### 2.1 **Naming conventions** - Use uppercase names for constants and environment variables. - Use lowercase names for local script variables. Example: ```bash DB_NAME=snipeit_db # Environment-like variable (constant) db_user="snipeit" # Local variable ``` --- ## 3. **Dependencies** ### 3.1 **Install all at once** - Install all dependencies with a single command if possible Example: ```bash $STD apt-get install -y \ composer \ git \ nginx ``` ### 3.2 **Collapse dependencies** Collapse dependencies to keep the code readable. Example: Use ```bash php8.2-{bcmath,common,ctype} ``` instead of ```bash php8.2-bcmath php8.2-common php8.2-ctype ``` --- ## 4. **Paths to application files** If possible install the app and all necessary files in `/opt/` --- ## 5. 
**Version management** ### 5.1 **Install the latest release** - Always try to install the latest release. - Do not hardcode a version unless absolutely necessary. Example for a git release: ```bash RELEASE=$(curl -fsSL https://api.github.com/repos/snipe/snipe-it/releases/latest | grep "tag_name" | awk '{print substr($2, 3, length($2)-4) }') wget -q "https://github.com/snipe/snipe-it/archive/refs/tags/v${RELEASE}.zip" ``` ### 5.2 **Save the version for update checks** - Write the installed version into a file. - This is used by the update function in **AppName.sh** to check whether an update is needed. Example: ```bash echo "${RELEASE}" >"/opt/AppName_version.txt" ``` --- ## 6. **Input and output management** ### 6.1 **User feedback** - Use standard functions like `msg_info`, `msg_ok`, or `msg_error` to print status messages. - Each `msg_info` must be followed by a `msg_ok` before any other output is made. - Display meaningful progress messages at key stages. - Taking user input with `read -p` must happen outside of a `msg_info`...`msg_ok` block. Example: ```bash msg_info "Installing Dependencies" $STD apt-get install -y ... msg_ok "Installed Dependencies" read -p "Do you wish to enable HTTPS mode? (y/N): " httpschoice ``` ### 6.2 **Verbosity** - Use the appropriate flag (**-q** in the examples) for a command to suppress its output. Example: ```bash wget -q unzip -q ``` - If a command does not offer such a flag, use `$STD` (a custom standard redirection variable) to manage output verbosity. Example: ```bash $STD apt-get install -y nginx ``` --- ## 7. **String/File Manipulation** ### 7.1 **File Manipulation** - Use `sed` to replace placeholder values in configuration files. Example: ```bash sed -i -e "s|^DB_DATABASE=.*|DB_DATABASE=$DB_NAME|" \ -e "s|^DB_USERNAME=.*|DB_USERNAME=$DB_USER|" \ -e "s|^DB_PASSWORD=.*|DB_PASSWORD=$DB_PASS|" .env ``` --- ## 8. **Security practices** ### 8.1 **Password generation** - Use `openssl` to generate random passwords. 
- Use only alphanumeric characters to avoid introducing unexpected behaviour. Example: ```bash DB_PASS=$(openssl rand -base64 18 | tr -dc 'a-zA-Z0-9' | head -c13) ``` ### 8.2 **File permissions** Explicitly set secure ownership and permissions for sensitive files. Example: ```bash chown -R www-data: /opt/snipe-it chmod -R 755 /opt/snipe-it ``` --- ## 9. **Service Configuration** ### 9.1 **Configuration files** Use a heredoc (`cat <<EOF`) to write configuration files. Example: ```bash cat <<EOF >/etc/nginx/conf.d/snipeit.conf server { listen 80; root /opt/snipe-it/public; index index.php; } EOF ``` ### 9.2 **Credential management** Store the generated credentials in a file. Example: ```bash USERNAME=username PASSWORD=$(openssl rand -base64 18 | tr -dc 'a-zA-Z0-9' | head -c13) { echo "Application-Credentials" echo "Username: $USERNAME" echo "Password: $PASSWORD" } >> ~/application.creds ``` ### 9.3 **Environment files** Use a heredoc (`cat <<EOF`) to write environment files. Example: ```bash cat <<EOF >/path/to/.env VARIABLE="value" PORT=3000 DB_NAME="${DB_NAME}" EOF ``` ### 9.4 **Services** Enable affected services after configuration changes and start them right away. Example: ```bash systemctl enable -q --now nginx ``` --- ## 10. **Cleanup** ### 10.1 **Remove temporary files** Remove temporary files and downloads after use. Example: ```bash rm -rf /opt/v${RELEASE}.zip ``` ### 10.2 **Autoremove and autoclean** Remove unused dependencies to reduce disk space usage. Example: ```bash apt-get -y autoremove apt-get -y autoclean ``` --- ## 11. **Best Practices Checklist** - [ ] Shebang is correctly set (`#!/usr/bin/env bash`). - [ ] Metadata (author, license) is included at the top. - [ ] Variables follow naming conventions. - [ ] Sensitive values are dynamically generated. - [ ] Files and services have proper permissions. - [ ] Script cleans up temporary files. --- ### Example: High-Level Script Flow 1. Dependencies installation 2. Database setup 3. Download and configure application 4. Service configuration 5. 
Final cleanup ## /.github/CONTRIBUTOR_AND_GUIDES/install/AppName-install.sh ```sh path="/.github/CONTRIBUTOR_AND_GUIDES/install/AppName-install.sh" #!/usr/bin/env bash # Copyright (c) 2021-2025 community-scripts ORG # Author: [YourUserName] # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE # Source: [SOURCE_URL] # Import Functions and Setup source /dev/stdin <<< "$FUNCTIONS_FILE_PATH" color verb_ip6 catch_errors setting_up_container network_check update_os # Installing Dependencies msg_info "Installing Dependencies" $STD apt-get install -y \ [PACKAGE_1] \ [PACKAGE_2] \ [PACKAGE_3] msg_ok "Installed Dependencies" # Template: MySQL Database msg_info "Setting up Database" DB_NAME=[DB_NAME] DB_USER=[DB_USER] DB_PASS=$(openssl rand -base64 18 | tr -dc 'a-zA-Z0-9' | head -c13) $STD mysql -u root -e "CREATE DATABASE $DB_NAME;" $STD mysql -u root -e "CREATE USER '$DB_USER'@'localhost' IDENTIFIED WITH mysql_native_password AS PASSWORD('$DB_PASS');" $STD mysql -u root -e "GRANT ALL ON $DB_NAME.* TO '$DB_USER'@'localhost'; FLUSH PRIVILEGES;" { echo "${APPLICATION} Credentials" echo "Database User: $DB_USER" echo "Database Password: $DB_PASS" echo "Database Name: $DB_NAME" } >> ~/$APP_NAME.creds msg_ok "Set up Database" # Temp # Setup App msg_info "Setup ${APPLICATION}" RELEASE=$(curl -fsSL https://api.github.com/repos/[REPO]/releases/latest | grep "tag_name" | awk '{print substr($2, 2, length($2)-3) }') curl -fsSL -o "${RELEASE}.zip" "https://github.com/[REPO]/archive/refs/tags/${RELEASE}.zip" unzip -q "${RELEASE}.zip" mv "${APPLICATION}-${RELEASE}/" "/opt/${APPLICATION}" # # # echo "${RELEASE}" >/opt/${APPLICATION}_version.txt msg_ok "Setup ${APPLICATION}" # Creating Service (if needed) msg_info "Creating Service" cat <<EOF >/etc/systemd/system/${APPLICATION}.service [Unit] Description=${APPLICATION} Service After=network.target [Service] ExecStart=[START_COMMAND] Restart=always [Install] WantedBy=multi-user.target EOF systemctl enable -q --now 
${APPLICATION} msg_ok "Created Service" motd_ssh customize # Cleanup msg_info "Cleaning up" rm -f ${RELEASE}.zip $STD apt-get -y autoremove $STD apt-get -y autoclean msg_ok "Cleaned" ``` ## /.github/CONTRIBUTOR_AND_GUIDES/json/AppName.json ```json path="/.github/CONTRIBUTOR_AND_GUIDES/json/AppName.json" { "name": "AppName", "slug": "appname", "categories": [ 0 ], "date_created": "DATE CREATED", "type": "ct", "updateable": true, "privileged": false, "interface_port": DEFAULT-PORT, "documentation": null, "website": "LINK TO WEBSITE", "logo": "LINK TO LOGO", "description": "Description of the app", "install_methods": [ { "type": "default", "script": "ct/AppName.sh", "resources": { "cpu": 2, "ram": 2048, "hdd": 4, "os": "debian", "version": "12" } } ], "default_credentials": { "username": null, "password": null }, "notes": [] } ``` ## /.github/CONTRIBUTOR_AND_GUIDES/json/AppName.md # **AppName.json Files** `AppName.json` files are found in the `/json` directory. These files are used to provide information for the website. For this guide we take `/json/snipeit.json` as an example. ## Table of Contents - [**AppName.json Files**](#appnamejson-files) - [Table of Contents](#table-of-contents) - [1. JSON Generator](#1-json-generator) ## 1. JSON Generator Use the [JSON Generator](https://community-scripts.github.io/ProxmoxVE/json-editor) to create this file for your application. ## /.github/DISCUSSION_TEMPLATE/request-script.yml ```yml path="/.github/DISCUSSION_TEMPLATE/request-script.yml" title: "[Script request]: " labels: ["enhancement"] body: - type: input attributes: label: Application Name description: Enter the application name. placeholder: "e.g., Home Assistant" validations: required: true - type: input attributes: label: Website description: Official website or GitHub page. 
placeholder: "e.g., https://www.home-assistant.io/" validations: required: true - type: textarea attributes: label: Description description: Explain what the application does and why it should be added to Proxmox VE Helper-Scripts. placeholder: "e.g., Home Assistant is a popular open-source platform that brings all your smart home devices together in one place. Adding it to Proxmox VE Helper-Scripts would make setup and management on Proxmox easy, letting users quickly get a powerful, self-hosted smart home system up and running." validations: required: true - type: checkboxes attributes: label: Due Diligence options: - label: "I have searched existing [scripts](https://community-scripts.github.io/Proxmox/scripts) and found no duplicates." required: true - label: "I have searched existing [discussions](https://github.com/community-scripts/ProxmoxVE/discussions?discussions_q=) and found no duplicate requests." required: true - type: markdown attributes: value: "Thanks for submitting your request! The team will review it and reach out if we need more information." ``` ## /.github/FUNDING.yml ```yml path="/.github/FUNDING.yml" ko_fi: community_scripts github: community_scripts ``` ## /.github/ISSUE_TEMPLATE/bug_report.yml ```yml path="/.github/ISSUE_TEMPLATE/bug_report.yml" name: "🐞 Script Issue Report" description: Report a specific issue with a script. For other inquiries, please use the Discussions section. labels: ["bug"] body: - type: markdown attributes: value: | ## ⚠️ **IMPORTANT - READ FIRST** - 🔍 **Search first:** Before submitting, check if the issue has already been reported or resolved in [closed issues](https://github.com/community-scripts/ProxmoxVE/issues?q=is%3Aissue+is%3Aclosed). If found, comment on that issue instead of creating a new one. Alternatively, check the **[Discussions](https://github.com/community-scripts/ProxmoxVE/discussions)** under the *"Announcement"* or *"Guide"* categories for relevant information. 
- 🔎 If you encounter `[ERROR] in line 23: exit code *: while executing command "$@" > /dev/null 2>&1`, rerun the script with verbose mode before submitting the issue. - 📜 **Read the script:** Familiarize yourself with the script's content and its purpose. This will help you understand the issue better and provide more relevant information Thank you for taking the time to report an issue! Please provide as much detail as possible to help us address the problem efficiently. - type: input id: guidelines attributes: label: ✅ Have you read and understood the above guidelines? placeholder: "yes" validations: required: true - type: input id: script_name attributes: label: 📜 What is the name of the script you are using? placeholder: "e.g., NextcloudPi, Zigbee2MQTT" validations: required: true - type: input id: script_command attributes: label: 📂 What was the exact command used to execute the script? placeholder: "e.g., bash -c \"$(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/ct/zigbee2mqtt.sh)\" or \"update\"" validations: required: true - type: checkboxes validations: required: true attributes: label: ⚙️ What settings are you using? options: - label: Default Settings - label: Advanced Settings - type: markdown attributes: value: "💡 **Tip:** If you are using Advanced Settings, please test with Default Settings before submitting an issue." - type: dropdown id: linux_distribution attributes: label: 🖥️ Which Linux distribution are you using? options: - - Alpine - Debian 11 - Debian 12 - Ubuntu 20.04 - Ubuntu 22.04 - Ubuntu 24.04 - Ubuntu 24.10 validations: required: true - type: textarea id: issue_description attributes: label: 📝 Provide a clear and concise description of the issue. validations: required: true - type: textarea id: steps_to_reproduce attributes: label: 🔄 Steps to reproduce the issue. placeholder: "e.g., Step 1: ..., Step 2: ..." 
validations: required: true - type: textarea id: error_output attributes: label: ❌ Paste the full error output (if available). placeholder: "Include any relevant logs or error messages." validations: required: true - type: textarea id: additional_context attributes: label: 🖼️ Additional context (optional). placeholder: "Include screenshots, code blocks (use triple backticks \`\`\`), or any other relevant information." validations: required: false ``` ## /.github/ISSUE_TEMPLATE/config.yml ```yml path="/.github/ISSUE_TEMPLATE/config.yml" blank_issues_enabled: false contact_links: - name: 🌟 new Script request url: https://github.com/community-scripts/ProxmoxVE/discussions/new?category=request-script about: For feature/script requests, please use the Discussions section. - name: 🤔 Questions and Help url: https://github.com/community-scripts/ProxmoxVE/discussions about: For suggestions or questions, please use the Discussions section. - name: 💻 Discord url: https://discord.gg/jsYVk5JBxq about: Join our Discord server to chat with other users in the Proxmox Helper Scripts community. ``` ## /.github/ISSUE_TEMPLATE/feature_request.yml ```yml path="/.github/ISSUE_TEMPLATE/feature_request.yml" name: "✨ Feature Request" description: "Suggest a new feature or enhancement. (not for script requests)" labels: ["enhancement"] body: - type: markdown attributes: value: | # ✨ **Feature Request** Have an idea for a new feature? Share your thoughts below! - type: input id: feature_summary attributes: label: "🌟 Briefly describe the feature" placeholder: "e.g., Add support for XYZ" validations: required: true - type: textarea id: feature_description attributes: label: "📝 Detailed description" placeholder: "Explain the feature in detail" validations: required: true - type: textarea id: use_case attributes: label: "💡 Why is this useful?" 
placeholder: "Describe the benefit of this feature" validations: required: true ``` ## /.github/ISSUE_TEMPLATE/frontend_report.yml ```yml path="/.github/ISSUE_TEMPLATE/frontend_report.yml" name: "🌐 Website Issue Report" description: Report an issue, an optimization request, or a documentation issue specifically related to the website. labels: "website" body: - type: markdown attributes: value: | **IMPORTANT:** Failure to comply with the following guidelines may result in immediate closure. - Prior to submitting, kindly search the closed issues to check if the problem you are reporting has already been addressed and resolved. If you come across a closed issue that pertains to your problem, please leave a comment on that issue instead of creating a new one. - If the problem is related to a bug in the website, kindly check for browser compatibility and ensure the issue occurs in multiple browsers before submitting. - For suggestions, questions, or feature requests, please use the [Discussions section.](https://github.com/community-scripts/ProxmoxVE/discussions) - type: input id: guidelines attributes: label: Please verify that you have read and understood the guidelines. placeholder: 'yes' validations: required: true - type: dropdown id: issue_type validations: required: true attributes: label: What type of issue is this? options: - - Bug - Optimization - Documentation - Other - type: textarea id: bug_description attributes: label: A clear and concise description of the issue. validations: required: true - type: dropdown id: browser validations: required: true attributes: label: Which browser are you using? options: - - Chrome - Firefox - Safari - Edge - Other - type: markdown attributes: value: | **If the issue is browser-related**, please provide information on the version and platform (Windows, MacOS, Linux, etc.). - type: textarea id: screenshot attributes: label: If relevant, including screenshots or a code block can be helpful in clarifying the issue. 
placeholder: "Code blocks begin and conclude by enclosing the code with three backticks (\`\`\`) above and below it." validations: required: false - type: textarea id: reproduce attributes: label: Please provide detailed steps to reproduce the issue. placeholder: "First do this, then this ..." validations: required: false ``` ## /.github/ISSUE_TEMPLATE/task.yml ```yml path="/.github/ISSUE_TEMPLATE/task.yml" name: "🛠️ Task / General Request" description: "Request a general task, improvement, or refactor." labels: ["task"] body: - type: markdown attributes: value: | # 🛠️ **Task / General Request** Request a task that isn't a bug or feature request. - type: input id: task_summary attributes: label: "📌 Task summary" placeholder: "e.g., Refactor XYZ" validations: required: true - type: textarea id: task_details attributes: label: "📋 Task details" placeholder: "Explain what needs to be done" validations: required: true ``` ## /.github/autolabeler-config.json ```json path="/.github/autolabeler-config.json" { "new script": [ { "fileStatus": "added", "includeGlobs": ["ct/**", "install/**", "misc/**", "turnkey/**", "vm/**"], "excludeGlobs": [] } ], "update script": [ { "fileStatus": "modified", "includeGlobs": ["ct/**", "install/**", "misc/**", "turnkey/**", "vm/**"], "excludeGlobs": ["misc/build.func", "misc/install.func", "misc/api.func"] } ], "delete script": [ { "fileStatus": "removed", "includeGlobs": ["ct/**", "install/**", "misc/**", "turnkey/**", "vm/**"], "excludeGlobs": [] } ], "maintenance": [ { "fileStatus": null, "includeGlobs": [ "*.md", ".github/**", "misc/*.func", "ct/create_lxc.sh", "api/**" ], "excludeGlobs": [] } ], "core": [ { "fileStatus": null, "includeGlobs": ["misc/*.func", "ct/create_lxc.sh"], "excludeGlobs": [] } ], "website": [ { "fileStatus": null, "includeGlobs": ["frontend/**"], "excludeGlobs": [] } ], "api": [ { "fileStatus": null, "includeGlobs": ["api/**", "misc/api.func"], "excludeGlobs": [] } ], "github": [ { "fileStatus": null, 
"includeGlobs": [".github/**"], "excludeGlobs": [] } ], "json": [ { "fileStatus": "modified", "includeGlobs": ["frontend/public/json/**"], "excludeGlobs": [] } ], "high risk": [ { "fileStatus": null, "includeGlobs": [ "misc/build.func", "misc/install.func", "ct/create_lxc.sh" ], "excludeGlobs": [] } ], "documentation": [ { "fileStatus": null, "includeGlobs": ["*.md"], "excludeGlobs": [] } ] } ``` ## /.github/changelog-pr-config.json ```json path="/.github/changelog-pr-config.json" [ { "title": "🆕 New Scripts", "labels": ["new script"] }, { "title": "🚀 Updated Scripts", "labels": ["update script"], "subCategories": [ { "title": "🐞 Bug Fixes", "labels": ["bugfix"], "notes" : [] }, { "title": "✨ New Features", "labels": ["feature"], "notes" : [] }, { "title": "💥 Breaking Changes", "labels": ["breaking change"], "notes" : [] }, { "title": "🔧 Refactor", "labels": ["refactor"], "notes" : [] } ] }, { "title": "🧰 Maintenance", "labels": ["maintenance"], "subCategories": [ { "title": "🐞 Bug Fixes", "labels": ["bugfix"], "notes" : [] }, { "title": "✨ New Features", "labels": ["feature"], "notes" : [] }, { "title": "💥 Breaking Changes", "labels": ["breaking change"], "notes" : [] }, { "title": "📡 API", "labels": ["api"], "notes" : [] }, { "title": "💾 Core", "labels": ["core"], "notes" : [] }, { "title": "📂 Github", "labels": ["github"], "notes" : [] }, { "title" :"📝 Documentation", "labels": ["documentation"], "notes" : [] }, { "title" :"🔧 Refactor", "labels": ["refactor"], "notes" : [] } ] }, { "title": "🌐 Website", "labels": ["website"], "subCategories": [ { "title": "🐞 Bug Fixes", "labels": ["bugfix"], "notes" : [] }, { "title": "✨ New Features", "labels": ["feature"], "notes" : [] }, { "title": "💥 Breaking Changes", "labels": ["breaking change"], "notes" : [] }, { "title": "📝 Script Information", "labels": ["json"], "notes" : [] } ] }, { "title": "❔ Unlabelled", "labels": [] }, { "title": "💥 Breaking Changes", "labels": ["breaking change"] } ] ``` ## 
/.github/pull_request_template.md ## ✍️ Description ## 🔗 Related PR / Issue Link: # ## ✅ Prerequisites (**X** in brackets) - [ ] **Self-review completed** – Code follows project standards. - [ ] **Tested thoroughly** – Changes work as expected. - [ ] **No security risks** – No hardcoded secrets, unnecessary privilege escalations, or permission issues. --- ## 🛠️ Type of Change (**X** in brackets) - [ ] 🐞 **Bug fix** – Resolves an issue without breaking functionality. - [ ] ✨ **New feature** – Adds new, non-breaking functionality. - [ ] 💥 **Breaking change** – Alters existing functionality in a way that may require updates. - [ ] 🆕 **New script** – A fully functional and tested script or script set. - [ ] 🌍 **Website update** – Changes to website-related JSON files or metadata. - [ ] 🔧 **Refactoring / Code Cleanup** – Improves readability or maintainability without changing functionality. - [ ] 📝 **Documentation update** – Changes to `README`, `AppName.md`, `CONTRIBUTING.md`, or other docs. 
## /.github/runner/docker/gh-runner-self.dockerfile ```dockerfile path="/.github/runner/docker/gh-runner-self.dockerfile" FROM mcr.microsoft.com/dotnet/runtime-deps:8.0-jammy as build ARG TARGETOS ARG TARGETARCH ARG DOCKER_VERSION=27.5.1 ARG BUILDX_VERSION=0.20.1 ARG RUNNER_ARCH="x64" RUN apt update -y && apt install sudo curl unzip -y WORKDIR /actions-runner RUN RUNNER_VERSION=$(curl -s https://api.github.com/repos/actions/runner/releases/latest | grep "tag_name" | head -n 1 | awk '{print substr($2, 3, length($2)-4)}') \ && curl -f -L -o runner.tar.gz https://github.com/actions/runner/releases/download/v${RUNNER_VERSION}/actions-runner-linux-${RUNNER_ARCH}-${RUNNER_VERSION}.tar.gz \ && tar xzf ./runner.tar.gz \ && rm runner.tar.gz RUN RUNNER_CONTAINER_HOOKS_VERSION=$(curl -s https://api.github.com/repos/actions/runner-container-hooks/releases/latest | grep "tag_name" | head -n 1 | awk '{print substr($2, 3, length($2)-4)}') \ && curl -f -L -o runner-container-hooks.zip https://github.com/actions/runner-container-hooks/releases/download/v${RUNNER_CONTAINER_HOOKS_VERSION}/actions-runner-hooks-k8s-${RUNNER_CONTAINER_HOOKS_VERSION}.zip \ && unzip ./runner-container-hooks.zip -d ./k8s \ && rm runner-container-hooks.zip RUN export RUNNER_ARCH=${TARGETARCH} \ && if [ "$RUNNER_ARCH" = "amd64" ]; then export DOCKER_ARCH=x86_64 ; fi \ && if [ "$RUNNER_ARCH" = "arm64" ]; then export DOCKER_ARCH=aarch64 ; fi \ && curl -fLo docker.tgz https://download.docker.com/${TARGETOS}/static/stable/${DOCKER_ARCH}/docker-${DOCKER_VERSION}.tgz \ && tar zxvf docker.tgz \ && rm -rf docker.tgz \ && mkdir -p /usr/local/lib/docker/cli-plugins \ && curl -fLo /usr/local/lib/docker/cli-plugins/docker-buildx \ "https://github.com/docker/buildx/releases/download/v${BUILDX_VERSION}/buildx-v${BUILDX_VERSION}.linux-${TARGETARCH}" \ && chmod +x /usr/local/lib/docker/cli-plugins/docker-buildx FROM mcr.microsoft.com/dotnet/runtime-deps:8.0-jammy ENV DEBIAN_FRONTEND=noninteractive ENV 
RUNNER_MANUALLY_TRAP_SIG=1 ENV ACTIONS_RUNNER_PRINT_LOG_TO_STDOUT=1 ENV ImageOS=ubuntu22 RUN apt update -y \ && apt install -y --no-install-recommends sudo lsb-release gpg-agent software-properties-common curl jq unzip \ && rm -rf /var/lib/apt/lists/* RUN add-apt-repository ppa:git-core/ppa \ && apt update -y \ && apt install -y git \ && rm -rf /var/lib/apt/lists/* RUN adduser --disabled-password --gecos "" --uid 1001 runner \ && groupadd docker --gid 123 \ && usermod -aG sudo runner \ && usermod -aG docker runner \ && echo "%sudo ALL=(ALL:ALL) NOPASSWD:ALL" > /etc/sudoers \ && echo "Defaults env_keep += \"DEBIAN_FRONTEND\"" >> /etc/sudoers # Install own dependencies in final image RUN curl -fsSL https://deb.nodesource.com/setup_22.x | bash - \ && apt-get install -y nodejs \ && apt-get install -y gh jq git WORKDIR /home/runner COPY --chown=runner:docker --from=build /actions-runner . COPY --from=build /usr/local/lib/docker/cli-plugins/docker-buildx /usr/local/lib/docker/cli-plugins/docker-buildx RUN install -o root -g root -m 755 docker/* /usr/bin/ && rm -rf docker USER runner ``` ## /.github/workflows/auto-update-app-headers.yml ```yml path="/.github/workflows/auto-update-app-headers.yml" name: Auto Update .app-files on: push: branches: - main paths: - 'ct/**.sh' workflow_dispatch: jobs: update-app-files: runs-on: runner-cluster-htl-set permissions: contents: write pull-requests: write steps: - name: Generate a token id: generate-token uses: actions/create-github-app-token@v1 with: app-id: ${{ vars.APP_ID }} private-key: ${{ secrets.APP_PRIVATE_KEY }} # Step 1: Checkout repository - name: Checkout repository uses: actions/checkout@v2 # Step 2: Disable file mode changes detection - name: Disable file mode changes run: git config core.fileMode false # Step 3: Set up Git user for committing changes - name: Set up Git run: | git config --global user.name "GitHub Actions" git config --global user.email "github-actions[bot]@users.noreply.github.com" # Step 4: Install 
figlet - name: Install figlet run: sudo apt-get install -y figlet # Step 5: Run the updated generate-app-files.sh script - name: Run generate-app-files.sh run: | chmod +x .github/workflows/scripts/generate-app-headers.sh .github/workflows/scripts/generate-app-headers.sh env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Step 6: Check if there are any changes - name: Check if there are any changes run: | echo "Checking for changes..." git add -A # include untracked files git status if git diff --cached --quiet; then echo "No changes detected." echo "changed=false" >> "$GITHUB_ENV" else echo "Changes detected:" git diff --stat --cached echo "changed=true" >> "$GITHUB_ENV" fi # Step 7: Commit and create PR if changes exist - name: Commit and create PR if changes exist if: env.changed == 'true' run: | git commit -m "Update .app files" git checkout -b pr-update-app-files git push origin pr-update-app-files --force gh pr create --title "[core] update .app files" \ --body "This PR is auto-generated by a GitHub Action to update the .app files." \ --head pr-update-app-files \ --base main \ --label "automated pr" env: GH_TOKEN: ${{ steps.generate-token.outputs.token }} - name: Approve pull request if: env.changed == 'true' env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: | PR_NUMBER=$(gh pr list --head "pr-update-app-files" --json number --jq '.[].number') if [ -n "$PR_NUMBER" ]; then gh pr review $PR_NUMBER --approve fi - name: Re-approve pull request after update if: env.changed == 'true' env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: | PR_NUMBER=$(gh pr list --head "pr-update-app-files" --json number --jq '.[].number') if [ -n "$PR_NUMBER" ]; then gh pr review $PR_NUMBER --approve fi # Step 8: Output success message when no changes - name: No changes detected if: env.changed == 'false' run: echo "No changes to commit. Workflow completed successfully." 
``` ## /.github/workflows/autolabeler.yml ```yml path="/.github/workflows/autolabeler.yml" name: Auto Label Pull Requests on: pull_request_target: branches: ["main"] types: [opened, synchronize, reopened, edited] jobs: autolabeler: runs-on: runner-cluster-htl-set permissions: pull-requests: write env: CONFIG_PATH: .github/autolabeler-config.json steps: - name: Checkout repository uses: actions/checkout@v4 - name: Install dependencies run: npm install minimatch - name: Label PR based on file changes and PR template uses: actions/github-script@v7 with: script: | const fs = require('fs').promises; const path = require('path'); const { minimatch } = require('minimatch'); const configPath = path.resolve(process.env.CONFIG_PATH); const fileContent = await fs.readFile(configPath, 'utf-8'); const autolabelerConfig = JSON.parse(fileContent); const prNumber = context.payload.pull_request.number; const prBody = context.payload.pull_request.body.toLowerCase(); let labelsToAdd = new Set(); const prListFilesResponse = await github.rest.pulls.listFiles({ owner: context.repo.owner, repo: context.repo.repo, pull_number: prNumber, }); const prFiles = prListFilesResponse.data; // Apply labels based on file changes for (const [label, rules] of Object.entries(autolabelerConfig)) { const shouldAddLabel = prFiles.some((prFile) => { return rules.some((rule) => { const isFileStatusMatch = rule.fileStatus ? 
rule.fileStatus === prFile.status : true; const isIncludeGlobMatch = rule.includeGlobs.some((glob) => minimatch(prFile.filename, glob)); const isExcludeGlobMatch = rule.excludeGlobs.some((glob) => minimatch(prFile.filename, glob)); return isFileStatusMatch && isIncludeGlobMatch && !isExcludeGlobMatch; }); }); if (shouldAddLabel) { labelsToAdd.add(label); } } //if two labels or more are added, return if (labelsToAdd.size < 2) { const templateLabelMappings = { "🐞 **Bug fix**": "bugfix", "✨ **New feature**": "feature", "💥 **Breaking change**": "breaking change", "🔧 **Refactoring / Code Cleanup**": "refactor", }; for (const [checkbox, label] of Object.entries(templateLabelMappings)) { const escapedCheckbox = checkbox.replace(/([.*+?^=!:${}()|\[\]\/\\])/g, "\\$1"); const regex = new RegExp(`- \\[(x|X)\\]\\s*.*${escapedCheckbox}`, "i"); const match = prBody.match(regex); if (match) { console.log(`Match: ${match}`); labelsToAdd.add(label); } } } console.log(`Labels to add: ${Array.from(labelsToAdd).join(", ")}`); if (labelsToAdd.size > 0) { console.log(`Adding labels ${Array.from(labelsToAdd).join(", ")} to PR ${prNumber}`); await github.rest.issues.addLabels({ owner: context.repo.owner, repo: context.repo.repo, issue_number: prNumber, labels: Array.from(labelsToAdd), }); } ``` ## /.github/workflows/changelog-pr.yml ```yml path="/.github/workflows/changelog-pr.yml" name: Create Changelog Pull Request on: push: branches: ["main"] workflow_dispatch: jobs: update-changelog-pull-request: runs-on: runner-cluster-htl-set env: CONFIG_PATH: .github/changelog-pr-config.json BRANCH_NAME: github-action-update-changelog AUTOMATED_PR_LABEL: "automated pr" permissions: contents: write pull-requests: write steps: - name: Generate a token id: generate-token uses: actions/create-github-app-token@v1 with: app-id: ${{ vars.APP_ID }} private-key: ${{ secrets.APP_PRIVATE_KEY }} - name: Checkout code uses: actions/checkout@v4 with: fetch-depth: 0 - name: Get latest dates in changelog run: | 
DATES=$(grep -E '^## [0-9]{4}-[0-9]{2}-[0-9]{2}' CHANGELOG.md | head -n 2 | awk '{print $2}') LATEST_DATE=$(echo "$DATES" | sed -n '1p') SECOND_LATEST_DATE=$(echo "$DATES" | sed -n '2p') TODAY=$(date -u +%Y-%m-%d) echo "TODAY=$TODAY" >> $GITHUB_ENV if [[ "$LATEST_DATE" == "$TODAY" ]]; then echo "LATEST_DATE=$SECOND_LATEST_DATE" >> $GITHUB_ENV else echo "LATEST_DATE=$LATEST_DATE" >> $GITHUB_ENV fi - name: Get categorized pull requests id: get-categorized-prs uses: actions/github-script@v7 with: script: | async function main() { const fs = require('fs').promises; const path = require('path'); const configPath = path.resolve(process.env.CONFIG_PATH); const fileContent = await fs.readFile(configPath, 'utf-8'); const changelogConfig = JSON.parse(fileContent); const categorizedPRs = changelogConfig.map(obj => ({ ...obj, notes: [], subCategories: obj.subCategories ?? ( obj.labels.includes("update script") ? [ { title: "🐞 Bug Fixes", labels: ["bugfix"], notes: [] }, { title: "✨ New Features", labels: ["feature"], notes: [] }, { title: "💥 Breaking Changes", labels: ["breaking change"], notes: [] }, { title: "🔧 Refactor", labels: ["refactor"], notes: [] }, ] : obj.labels.includes("maintenance") ? [ { title: "🐞 Bug Fixes", labels: ["bugfix"], notes: [] }, { title: "✨ New Features", labels: ["feature"], notes: [] }, { title: "💥 Breaking Changes", labels: ["breaking change"], notes: [] }, { title: "📡 API", labels: ["api"], notes: [] }, { title: "Github", labels: ["github"], notes: [] }, { title: "📝 Documentation", labels: ["documentation"], notes: [] }, { title: "🔧 Refactor", labels: ["refactor"], notes: [] } ] : obj.labels.includes("website") ? 
[ { title: "🐞 Bug Fixes", labels: ["bugfix"], notes: [] }, { title: "✨ New Features", labels: ["feature"], notes: [] }, { title: "💥 Breaking Changes", labels: ["breaking change"], notes: [] }, { title: "Script Information", labels: ["json"], notes: [] } ] : [] ) })); const latestDateInChangelog = new Date(process.env.LATEST_DATE); latestDateInChangelog.setUTCHours(23, 59, 59, 999); const { data: pulls } = await github.rest.pulls.list({ owner: context.repo.owner, repo: "ProxmoxVE", base: "main", state: "closed", sort: "updated", direction: "desc", per_page: 100, }); const filteredPRs = pulls.filter(pr => pr.merged_at && new Date(pr.merged_at) > latestDateInChangelog && !pr.labels.some(label => ["invalid", "wontdo", process.env.AUTOMATED_PR_LABEL].includes(label.name.toLowerCase()) ) ); for (const pr of filteredPRs) { const prLabels = pr.labels.map(label => label.name.toLowerCase()); if (pr.user.login.includes("push-app-to-main[bot]")) { const scriptName = pr.title; try { const { data: relatedIssues } = await github.rest.issues.listForRepo({ owner: context.repo.owner, repo: "ProxmoxVED", state: "all", labels: ["Started Migration To ProxmoxVE"] }); const matchingIssue = relatedIssues.find(issue => issue.title.toLowerCase().includes(scriptName.toLowerCase()) ); if (matchingIssue) { const issueAuthor = matchingIssue.user.login; const issueAuthorUrl = `https://github.com/${issueAuthor}`; prNote = `- ${pr.title} [@${issueAuthor}](${issueAuthorUrl}) ([#${pr.number}](${pr.html_url}))`; } else { prNote = `- ${pr.title} ([#${pr.number}](${pr.html_url}))`; } } catch (error) { console.error(`Error fetching related issues: ${error}`); prNote = `- ${pr.title} ([#${pr.number}](${pr.html_url}))`; } }else{ prNote = `- ${pr.title} [@${pr.user.login}](https://github.com/${pr.user.login}) ([#${pr.number}](${pr.html_url}))`; } if (prLabels.includes("new script")) { const newScriptCategory = categorizedPRs.find(category => category.title === "New Scripts" || category.labels.includes("new 
script")); if (newScriptCategory) { newScriptCategory.notes.push(prNote); } } else { let categorized = false; const priorityCategories = categorizedPRs.slice(); for (const category of priorityCategories) { if (categorized) break; if (category.labels.some(label => prLabels.includes(label))) { if (category.subCategories && category.subCategories.length > 0) { const subCategory = category.subCategories.find(sub => sub.labels.some(label => prLabels.includes(label)) ); if (subCategory) { subCategory.notes.push(prNote); } else { category.notes.push(prNote); } } else { category.notes.push(prNote); } categorized = true; } } } } return categorizedPRs; } return await main(); - name: Update CHANGELOG.md uses: actions/github-script@v7 with: script: | const fs = require('fs').promises; const path = require('path'); const today = process.env.TODAY; const latestDateInChangelog = process.env.LATEST_DATE; const changelogPath = path.resolve('CHANGELOG.md'); const categorizedPRs = ${{ steps.get-categorized-prs.outputs.result }}; console.log(JSON.stringify(categorizedPRs, null, 2)); let newReleaseNotes = `## ${today}\n\n`; for (const { title, notes, subCategories } of categorizedPRs) { const hasSubcategories = subCategories && subCategories.length > 0; const hasMainNotes = notes.length > 0; const hasSubNotes = hasSubcategories && subCategories.some(sub => sub.notes && sub.notes.length > 0); if (hasMainNotes || hasSubNotes) { newReleaseNotes += `### ${title}\n\n`; } if (hasMainNotes) { newReleaseNotes += ` ${notes.join("\n")}\n\n`; } if (hasSubcategories) { for (const { title: subTitle, notes: subNotes } of subCategories) { if (subNotes && subNotes.length > 0) { newReleaseNotes += ` - #### ${subTitle}\n\n`; newReleaseNotes += ` ${subNotes.join("\n ")}\n\n`; } } } } const changelogContent = await fs.readFile(changelogPath, 'utf-8'); const changelogIncludesTodaysReleaseNotes = changelogContent.includes(`\n## ${today}`); const regex = changelogIncludesTodaysReleaseNotes ? 
new RegExp(`## ${today}.*(?=## ${latestDateInChangelog})`, "gs") : new RegExp(`(?=## ${latestDateInChangelog})`, "gs"); const newChangelogContent = changelogContent.replace(regex, newReleaseNotes); await fs.writeFile(changelogPath, newChangelogContent); - name: Check for changes id: verify-diff run: | git diff --quiet . || echo "changed=true" >> $GITHUB_ENV - name: Commit and push changes if: env.changed == 'true' run: | git config --global user.name "github-actions[bot]" git config --global user.email "github-actions[bot]@users.noreply.github.com" git add CHANGELOG.md git commit -m "Update CHANGELOG.md" git checkout -b $BRANCH_NAME || git checkout $BRANCH_NAME git push origin $BRANCH_NAME --force - name: Create pull request if not exists if: env.changed == 'true' env: GH_TOKEN: ${{ steps.generate-token.outputs.token }} run: | PR_EXISTS=$(gh pr list --head "${BRANCH_NAME}" --json number --jq '.[].number') if [ -z "$PR_EXISTS" ]; then gh pr create --title "[Github Action] Update CHANGELOG.md" \ --body "This PR is auto-generated by a Github Action to update the CHANGELOG.md file." 
\ --head $BRANCH_NAME \ --base main \ --label "$AUTOMATED_PR_LABEL" fi - name: Approve pull request if: env.changed == 'true' env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: | PR_NUMBER=$(gh pr list --head "${BRANCH_NAME}" --json number --jq '.[].number') if [ -n "$PR_NUMBER" ]; then gh pr review $PR_NUMBER --approve fi - name: Re-approve pull request after update if: env.changed == 'true' env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: | PR_NUMBER=$(gh pr list --head "${BRANCH_NAME}" --json number --jq '.[].number') if [ -n "$PR_NUMBER" ]; then gh pr review $PR_NUMBER --approve fi ``` ## /.github/workflows/close-discussion.yml ```yml path="/.github/workflows/close-discussion.yml" name: Close Discussion on PR Merge on: push: branches: - main permissions: contents: read discussions: write jobs: close-discussion: runs-on: runner-cluster-htl-set steps: - name: Checkout Repository uses: actions/checkout@v4 - name: Set Up Node.js uses: actions/setup-node@v4 with: node-version: "20" - name: Install Dependencies run: npm install zx @octokit/graphql - name: Close Discussion env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_SHA: ${{ github.sha }} GITHUB_REPOSITORY: ${{ github.repository }} run: | npx zx << 'EOF' import { graphql } from "@octokit/graphql"; (async function () { try { const token = process.env.GITHUB_TOKEN; const commitSha = process.env.GITHUB_SHA; const [owner, repo] = process.env.GITHUB_REPOSITORY.split("/"); if (!token || !commitSha || !owner || !repo) { console.log("Missing required environment variables."); process.exit(1); } const graphqlWithAuth = graphql.defaults({ headers: { authorization: `Bearer ${token}` }, }); // Find PR from commit SHA const searchQuery = ` query($owner: String!, $repo: String!, $sha: GitObjectID!) { repository(owner: $owner, name: $repo) { object(oid: $sha) { ... 
on Commit { associatedPullRequests(first: 1) { nodes { number body } } } } } } `; const prResult = await graphqlWithAuth(searchQuery, { owner, repo, sha: commitSha, }); const pr = prResult.repository.object.associatedPullRequests.nodes[0]; if (!pr) { console.log("No PR found for this commit."); return; } const prNumber = pr.number; const prBody = pr.body; const match = prBody.match(/#(\d+)/); if (!match) { console.log("No discussion ID found in PR body."); return; } const discussionNumber = match[1]; console.log(`Extracted Discussion Number: ${discussionNumber}`); // Fetch GraphQL discussion ID const discussionQuery = ` query($owner: String!, $repo: String!, $number: Int!) { repository(owner: $owner, name: $repo) { discussion(number: $number) { id } } } `; let discussionQLId; try { const discussionResponse = await graphqlWithAuth(discussionQuery, { owner, repo, number: parseInt(discussionNumber, 10), }); discussionQLId = discussionResponse.repository.discussion.id; if (!discussionQLId) { console.log("Failed to fetch discussion GraphQL ID."); return; } } catch (error) { console.error("Discussion not found or error occurred while fetching discussion:", error); return; } // Post comment const commentMutation = ` mutation($discussionId: ID!, $body: String!) { addDiscussionComment(input: { discussionId: $discussionId, body: $body }) { comment { id body } } } `; const commentResponse = await graphqlWithAuth(commentMutation, { discussionId: discussionQLId, body: `Merged with PR #${prNumber}`, }); const commentId = commentResponse.addDiscussionComment.comment.id; if (!commentId) { console.log("Failed to post the comment."); return; } console.log(`Comment Posted Successfully! Comment ID: ${commentId}`); // Mark comment as answer const markAnswerMutation = ` mutation($id: ID!) 
{ markDiscussionCommentAsAnswer(input: { id: $id }) { discussion { id title } } } `; await graphqlWithAuth(markAnswerMutation, { id: commentId }); console.log("Comment marked as answer successfully!"); } catch (error) { console.error("Error:", error); process.exit(1); } })(); EOF ``` ## /.github/workflows/close-ttek-issues.yaml ```yaml path="/.github/workflows/close-ttek-issues.yaml" name: Auto-Close tteck Issues on: issues: types: [opened] jobs: close_tteck_issues: runs-on: ubuntu-latest steps: - name: Auto-close if tteck script detected uses: actions/github-script@v7 with: script: | const issue = context.payload.issue; const content = `${issue.title}\n${issue.body}`; const issueNumber = issue.number; // Check for tteck script mention if (content.includes("tteck") || content.includes("tteck/Proxmox")) { const message = `Hello, it looks like you are referencing the **old tteck repo**. This repository is no longer used for active scripts. **Please update your bookmarks** and use: [https://helper-scripts.com](https://helper-scripts.com) Also make sure your Bash command starts with: \`\`\`bash bash <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/ct/...) 
\`\`\` This issue is being closed automatically.`; await github.rest.issues.createComment({ ...context.repo, issue_number: issueNumber, body: message }); // Optionally apply a label like "not planned" await github.rest.issues.addLabels({ ...context.repo, issue_number: issueNumber, labels: ["not planned"] }); // Close the issue await github.rest.issues.update({ ...context.repo, issue_number: issueNumber, state: "closed" }); } ``` ## /.github/workflows/close_issue_in_dev.yaml ```yaml path="/.github/workflows/close_issue_in_dev.yaml" name: Close Matching Issue on PR Merge on: pull_request: types: - closed jobs: close_issue: if: github.event.pull_request.merged == true runs-on: ubuntu-latest steps: - name: Checkout target repo (main) uses: actions/checkout@v4 with: repository: community-scripts/ProxmoxVE ref: main token: ${{ secrets.GITHUB_TOKEN }} - name: Extract and Process PR Title id: extract_title run: | title=$(echo "${{ github.event.pull_request.title }}" | sed 's/^New Script://g' | tr '[:upper:]' '[:lower:]' | sed 's/ //g' | sed 's/-//g') echo "Processed Title: $title" echo "title=$title" >> $GITHUB_ENV - name: Search for Issues with Similar Titles id: find_issue env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: | issues=$(gh issue list --repo community-scripts/ProxmoxVED --json number,title --jq '.[] | {number, title}') best_match_score=0 best_match_number=0 for issue in $(echo "$issues" | jq -r '. | @base64'); do _jq() { echo ${issue} | base64 --decode | jq -r ${1} } issue_title=$(_jq '.title' | tr '[:upper:]' '[:lower:]' | sed 's/ //g' | sed 's/-//g') issue_number=$(_jq '.number') match_score=$(echo "$title" | grep -o "$issue_title" | wc -l) if [ "$match_score" -gt "$best_match_score" ]; then best_match_score=$match_score best_match_number=$issue_number fi done if [ "$best_match_number" != "0" ]; then echo "issue_number=$best_match_number" >> $GITHUB_ENV else echo "No matching issue found." 
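          # ----------------------------------------------------------------
          # Illustrative aside (not part of the workflow): the normalization
          # used above strips the "New Script:" prefix, lowercases, and
          # removes spaces and hyphens, so a hypothetical PR titled
          # "New Script: My-App LXC" and an issue titled "My App LXC"
          # reduce to the same comparison key:
          demo_title=$(echo "New Script: My-App LXC" | sed 's/^New Script://g' | tr '[:upper:]' '[:lower:]' | sed 's/ //g' | sed 's/-//g')
          demo_issue=$(echo "My App LXC" | tr '[:upper:]' '[:lower:]' | sed 's/ //g' | sed 's/-//g')
          # both are now "myapplxc", so the grep-based scoring above matches
          # ----------------------------------------------------------------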
exit 0 fi - name: Comment on the Best-Matching Issue and Close It if: env.issue_number != '' env: GH_TOKEN: ${{ secrets.PAT_MICHEL }} run: | gh issue comment $issue_number --repo community-scripts/ProxmoxVED --body "Merged with #${{ github.event.pull_request.number }} in ProxmoxVE" gh issue close $issue_number --repo community-scripts/ProxmoxVED ``` ## /.github/workflows/crawl-versions.yaml ```yaml path="/.github/workflows/crawl-versions.yaml" name: Crawl Versions from newreleases.io on: workflow_dispatch: schedule: # Runs at 12:00 AM and 12:00 PM UTC - cron: "0 0,12 * * *" permissions: contents: write pull-requests: write jobs: crawl-versions: runs-on: runner-cluster-htl-set steps: - name: Checkout Repository uses: actions/checkout@v2 with: repository: community-scripts/ProxmoxVE ref: main - name: Generate a token id: generate-token uses: actions/create-github-app-token@v1 with: app-id: ${{ vars.APP_ID }} private-key: ${{ secrets.APP_PRIVATE_KEY }} - name: Crawl from newreleases.io env: token: ${{ secrets.NEWRELEASES_TOKEN }} run: | page=1 projects_file="project_json" output_file="frontend/public/json/versions.json" echo "[]" > $output_file while true; do echo "Start loop on page: $page" projects=$(curl -s -H "X-Key: $token" "https://api.newreleases.io/v1/projects?page=$page") total_pages=$(echo "$projects" | jq -r '.total_pages') if [ -z "$total_pages" ] || [ "$total_pages" -eq 0 ]; then echo "No pages available. Exiting." exit 1 fi if [ $page == $total_pages ]; then break fi if [ -z "$projects" ] || ! echo "$projects" | jq -e '.projects' > /dev/null; then echo "No more projects or invalid response. Exiting." 
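          # ----------------------------------------------------------------
          # Illustrative aside (not part of the workflow): the paging
          # pattern above, reduced to plain bash with a fake total of 3
          # pages and no API calls. The break fires when the counter
          # reaches the total, before that iteration's body runs:
          demo_page=1
          demo_total=3
          demo_processed=""
          while true; do
            if [ "$demo_page" -eq "$demo_total" ]; then break; fi
            demo_processed="${demo_processed}${demo_page} "
            demo_page=$((demo_page + 1))
          done
          # demo_processed is now "1 2 "
          # ----------------------------------------------------------------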
break fi echo "$projects" > "$projects_file" jq -r '.projects[] | "\(.id) \(.name)"' "$projects_file" | while read -r id name; do version=$(curl -s -H "X-Key: $token" "https://api.newreleases.io/v1/projects/$id/latest-release") version_data=$(echo "$version" | jq -r '.version // empty') date=$(echo "$version" | jq -r '.date // empty') if [ -n "$version_data" ]; then jq --arg name "$name" --arg version "$version_data" --arg date "$date" \ '. += [{"name": $name, "version": $version, "date": $date}]' "$output_file" > "$output_file.tmp" && mv "$output_file.tmp" "$output_file" fi done ((page++)) done - name: Commit JSON env: GH_TOKEN: ${{ steps.generate-token.outputs.token }} run: | git config --global user.email "github-actions[bot]@users.noreply.github.com" git config --global user.name "GitHub Actions[bot]" git checkout -b update_versions || git checkout update_versions git add frontend/public/json/versions.json if git diff --cached --quiet; then echo "No changes detected." echo "changed=false" >> "$GITHUB_ENV" exit 0 else echo "Changes detected:" git diff --stat --cached echo "changed=true" >> "$GITHUB_ENV" fi git commit -m "Update versions.json" git push origin update_versions --force gh pr create --title "[Github Action] Update versions.json" --body "Update versions.json, crawled from newreleases.io" --base main --head update_versions --label "automated pr" - name: Approve pull request if: env.changed == 'true' env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: | PR_NUMBER=$(gh pr list --head "update_versions" --json number --jq '.[].number') if [ -n "$PR_NUMBER" ]; then gh pr review $PR_NUMBER --approve fi - name: Re-approve pull request after update if: env.changed == 'true' env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: | PR_NUMBER=$(gh pr list --head "update_versions" --json number --jq '.[].number') if [ -n "$PR_NUMBER" ]; then gh pr review $PR_NUMBER --approve fi ``` ## /.github/workflows/create-docker-for-runner.yml ```yml 
path="/.github/workflows/create-docker-for-runner.yml" name: Build and Publish Docker Image on: push: branches: - main paths: - '.github/runner/docker/**' schedule: - cron: '0 0 * * *' jobs: build: runs-on: ubuntu-latest #To ensure it always builds we use the github runner with all the right tooling steps: - name: Checkout code uses: actions/checkout@v3 - name: Log in to GHCR uses: docker/login-action@v2 with: registry: ghcr.io username: ${{ github.actor }} password: ${{ secrets.GITHUB_TOKEN }} - name: Build Docker image run: | repo_name=${{ github.repository }} # Get repository name repo_name_lower=$(echo $repo_name | tr '[:upper:]' '[:lower:]') # Convert to lowercase docker build -t ghcr.io/$repo_name_lower/gh-runner-self:latest -f .github/runner/docker/gh-runner-self.dockerfile . - name: Push Docker image to GHCR run: | repo_name=${{ github.repository }} # Get repository name repo_name_lower=$(echo $repo_name | tr '[:upper:]' '[:lower:]') # Convert to lowercase docker push ghcr.io/$repo_name_lower/gh-runner-self:latest ``` ## /.github/workflows/delete-json-branch.yml ```yml path="/.github/workflows/delete-json-branch.yml" name: Delete JSON date PR Branch on: pull_request: types: [closed] branches: - main jobs: delete_branch: runs-on: runner-cluster-htl-set steps: - name: Checkout the code uses: actions/checkout@v3 - name: Delete PR Update Branch if: github.event.pull_request.merged == true && startsWith(github.event.pull_request.head.ref, 'pr-update-json-') run: | PR_BRANCH="${{ github.event.pull_request.head.ref }}" echo "Deleting branch $PR_BRANCH..." 
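          # ----------------------------------------------------------------
          # Illustrative aside (not part of the workflow): the step
          # condition above only fires for head refs with the
          # pr-update-json- prefix, which this glob reproduces for a
          # hypothetical branch name:
          demo_branch="pr-update-json-myapp"
          if [[ "$demo_branch" == pr-update-json-* ]]; then demo_match="yes"; else demo_match="no"; fi
          # demo_match is now "yes"; the explicit check against "main"
          # below is a second line of defense
          # ----------------------------------------------------------------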
# Avoid deleting the default branch (e.g., main) if [[ "$PR_BRANCH" != "main" ]]; then git push origin --delete "$PR_BRANCH" else echo "Skipping deletion of the main branch" fi ``` ## /.github/workflows/frontend-cicd.yml ```yml path="/.github/workflows/frontend-cicd.yml" # Based on https://github.com/actions/starter-workflows/blob/main/pages/nextjs.yml name: Frontend CI/CD on: push: branches: ["main"] paths: - frontend/** pull_request: branches: ["main"] types: [opened, synchronize, reopened, edited] paths: - frontend/** workflow_dispatch: permissions: contents: read concurrency: group: pages-${{ github.ref }} cancel-in-progress: false jobs: build: runs-on: runner-cluster-htl-set defaults: run: working-directory: frontend # Set default working directory for all run steps steps: - name: Checkout uses: actions/checkout@v4 - name: Setup Node uses: actions/setup-node@v4 with: node-version: "20" cache: npm cache-dependency-path: frontend/package-lock.json - name: Install dependencies run: npm ci --prefer-offline --legacy-peer-deps - name: Run tests run: npm run test - name: Configure Next.js for pages uses: actions/configure-pages@v5 with: static_site_generator: next - name: Build with Next.js run: npm run build - name: Upload artifact if: github.ref == 'refs/heads/main' uses: actions/upload-pages-artifact@v3 with: path: frontend/out deploy: runs-on: ubuntu-latest needs: build if: github.ref == 'refs/heads/main' permissions: pages: write id-token: write environment: name: github-pages url: ${{ steps.deployment.outputs.page_url }} steps: - name: Deploy to GitHub Pages id: deployment uses: actions/deploy-pages@v4 ``` ## /.github/workflows/github-release.yml ```yml path="/.github/workflows/github-release.yml" name: Create Daily Release on: schedule: - cron: '1 0 * * *' # Runs daily at 00:01 UTC workflow_dispatch: jobs: create-daily-release: runs-on: runner-cluster-htl-set permissions: contents: write steps: - name: Checkout repository uses: actions/checkout@v4 - name: 
Clean CHANGELOG (remove HTML header) run: sed -n '/^## /,$p' CHANGELOG.md > changelog_cleaned.md - name: Extract relevant changelog section run: | YESTERDAY=$(date -u --date="yesterday" +%Y-%m-%d) echo "Checking for changes on: $YESTERDAY" # Extract the section from "## $YESTERDAY" until the next "## YYYY-MM-DD" sed -n "/^## $YESTERDAY/,/^## [0-9]\{4\}-[0-9]\{2\}-[0-9]\{2\}/p" changelog_cleaned.md | head -n -1 > changelog_tmp_full.md # Truncate the extracted section to 5000 characters head -c 5000 changelog_tmp_full.md > changelog_tmp.md echo "=== Extracted Changelog ===" cat changelog_tmp.md echo "===========================" # Abort if no content was found if [ ! -s changelog_tmp.md ]; then echo "No changes found for $YESTERDAY, skipping release." exit 0 fi - name: Create GitHub release env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: | YESTERDAY=$(date -u --date="yesterday" +%Y-%m-%d) gh release create "$YESTERDAY" -t "$YESTERDAY" -F changelog_tmp.md ``` ## /.github/workflows/script-test.yml ```yml path="/.github/workflows/script-test.yml" name: Run Scripts on PVE Node for testing permissions: pull-requests: write on: pull_request_target: branches: - main paths: - "install/**.sh" - "ct/**.sh" jobs: run-install-script: runs-on: pvenode steps: - name: Checkout PR branch uses: actions/checkout@v4 with: ref: ${{ github.event.pull_request.head.ref }} repository: ${{ github.event.pull_request.head.repo.full_name }} fetch-depth: 0 - name: Add Git safe directory run: | git config --global --add safe.directory /__w/ProxmoxVE/ProxmoxVE - name: Set up GH_TOKEN env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: | echo "GH_TOKEN=${GH_TOKEN}" >> $GITHUB_ENV - name: Get Changed Files run: | CHANGED_FILES=$(gh pr diff ${{ github.event.pull_request.number }} --repo ${{ github.repository }} --name-only) CHANGED_FILES=$(echo "$CHANGED_FILES" | tr '\n' ' ') echo "Changed files: $CHANGED_FILES" echo "SCRIPT=$CHANGED_FILES" >> $GITHUB_ENV env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} - 
name: Get scripts id: check-install-script run: | ALL_FILES=() ADDED_FILES=() for FILE in ${{ env.SCRIPT }}; do if [[ $FILE =~ ^install/.*-install\.sh$ ]] || [[ $FILE =~ ^ct/.*\.sh$ ]]; then STRIPPED_NAME=$(basename "$FILE" | sed 's/-install//' | sed 's/\.sh$//') if [[ ! " ${ADDED_FILES[@]} " =~ " $STRIPPED_NAME " ]]; then ALL_FILES+=("$FILE") ADDED_FILES+=("$STRIPPED_NAME") # Mark this base file as added (without the path) fi fi done ALL_FILES=$(echo "${ALL_FILES[@]}" | xargs) echo "$ALL_FILES" echo "ALL_FILES=$ALL_FILES" >> $GITHUB_ENV - name: Run scripts id: run-install continue-on-error: true run: | set +e #run for each files in /ct for FILE in ${{ env.ALL_FILES }}; do STRIPPED_NAME=$(basename "$FILE" | sed 's/-install//' | sed 's/\.sh$//') echo "Running Test for: $STRIPPED_NAME" if grep -E -q 'read\s+-r\s+-p\s+".*"\s+\w+' "$FILE"; then echo "The script contains an interactive prompt. Skipping execution." continue fi if [[ $FILE =~ ^install/.*-install\.sh$ ]]; then CT_SCRIPT="ct/$STRIPPED_NAME.sh" if [[ ! -f $CT_SCRIPT ]]; then echo "No CT script found for $STRIPPED_NAME" ERROR_MSG="No CT script found for $FILE" echo "$ERROR_MSG" > result_$STRIPPED_NAME.log continue fi if grep -E -q 'read\s+-r\s+-p\s+".*"\s+\w+' "install/$STRIPPED_NAME-install.sh"; then echo "The script contains an interactive prompt. Skipping execution." continue fi echo "Found CT script for $STRIPPED_NAME" chmod +x "$CT_SCRIPT" RUNNING_FILE=$CT_SCRIPT elif [[ $FILE =~ ^ct/.*\.sh$ ]]; then INSTALL_SCRIPT="install/$STRIPPED_NAME-install.sh" if [[ ! -f $INSTALL_SCRIPT ]]; then echo "No install script found for $STRIPPED_NAME" ERROR_MSG="No install script found for $FILE" echo "$ERROR_MSG" > result_$STRIPPED_NAME.log continue fi echo "Found install script for $STRIPPED_NAME" chmod +x "$INSTALL_SCRIPT" RUNNING_FILE=$FILE if grep -E -q 'read\s+-r\s+-p\s+".*"\s+\w+' "ct/$STRIPPED_NAME.sh"; then echo "The script contains an interactive prompt. Skipping execution." 
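          # ----------------------------------------------------------------
          # Illustrative aside (not part of the workflow): the grep pattern
          # above flags interactive prompts of this shape (sample line,
          # not taken from any real script):
          demo_line='read -r -p "Enter a value: " answer'
          if echo "$demo_line" | grep -E -q 'read\s+-r\s+-p\s+".*"\s+\w+'; then demo_interactive="yes"; else demo_interactive="no"; fi
          # demo_interactive is now "yes", so such a script is skipped
          # ----------------------------------------------------------------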
continue fi fi git remote add community-scripts https://github.com/community-scripts/ProxmoxVE.git git fetch community-scripts rm -f .github/workflows/scripts/app-test/pr-build.func || true rm -f .github/workflows/scripts/app-test/pr-install.func || true rm -f .github/workflows/scripts/app-test/pr-alpine-install.func || true rm -f .github/workflows/scripts/app-test/pr-create-lxc.sh || true git checkout community-scripts/main -- .github/workflows/scripts/app-test/pr-build.func git checkout community-scripts/main -- .github/workflows/scripts/app-test/pr-install.func git checkout community-scripts/main -- .github/workflows/scripts/app-test/pr-alpine-install.func git checkout community-scripts/main -- .github/workflows/scripts/app-test/pr-create-lxc.sh chmod +x $RUNNING_FILE chmod +x .github/workflows/scripts/app-test/pr-create-lxc.sh chmod +x .github/workflows/scripts/app-test/pr-install.func chmod +x .github/workflows/scripts/app-test/pr-alpine-install.func chmod +x .github/workflows/scripts/app-test/pr-build.func sed -i 's|source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)|source .github/workflows/scripts/app-test/pr-build.func|g' "$RUNNING_FILE" echo "Executing $RUNNING_FILE" ERROR_MSG=$(./$RUNNING_FILE 2>&1 > /dev/null) echo "Finished running $FILE" if [ -n "$ERROR_MSG" ]; then echo "ERROR in $STRIPPED_NAME: $ERROR_MSG" echo "$ERROR_MSG" > result_$STRIPPED_NAME.log fi done set -e # Restore exit-on-error - name: Cleanup PVE Node run: | containers=$(pct list | tail -n +2 | awk '{print $0 " " $4}' | awk '{print $1}') for container_id in $containers; do status=$(pct status $container_id | awk '{print $2}') if [[ $status == "running" ]]; then pct stop $container_id pct destroy $container_id fi done - name: Post error comments run: | ERROR="false" SEARCH_LINE=".github/workflows/scripts/app-test/pr-build.func: line 255:" # Get all existing comments on the PR EXISTING_COMMENTS=$(gh pr view ${{ 
github.event.pull_request.number }} --repo ${{ github.repository }} --json comments --jq '.comments[].body') for FILE in ${{ env.ALL_FILES }}; do STRIPPED_NAME=$(basename "$FILE" | sed 's/-install//' | sed 's/\.sh$//') if [[ ! -f result_$STRIPPED_NAME.log ]]; then continue fi ERROR_MSG=$(cat result_$STRIPPED_NAME.log) if [ -n "$ERROR_MSG" ]; then CLEANED_ERROR_MSG=$(echo "$ERROR_MSG" | sed "s|$SEARCH_LINE.*||") COMMENT_BODY=":warning: The script _**$FILE**_ failed with the following message:
${CLEANED_ERROR_MSG}
" # Check if the comment already exists if echo "$EXISTING_COMMENTS" | grep -qF "$COMMENT_BODY"; then echo "Skipping duplicate comment for $FILE" else echo "Posting error message for $FILE" gh pr comment ${{ github.event.pull_request.number }} \ --repo ${{ github.repository }} \ --body "$COMMENT_BODY" ERROR="true" fi fi done echo "ERROR=$ERROR" >> $GITHUB_ENV ``` ## /.github/workflows/script_format.yml ```yml path="/.github/workflows/script_format.yml" name: Script Format Check permissions: pull-requests: write on: pull_request_target: branches: - main paths: - "install/*.sh" - "ct/*.sh" jobs: run-install-script: runs-on: pvenode steps: - name: Checkout PR branch (supports forks) uses: actions/checkout@v4 with: ref: ${{ github.event.pull_request.head.ref }} repository: ${{ github.event.pull_request.head.repo.full_name }} fetch-depth: 0 - name: Add Git safe directory run: | git config --global --add safe.directory /__w/ProxmoxVE/ProxmoxVE - name: Set up GH_TOKEN env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: | echo "GH_TOKEN=${GH_TOKEN}" >> $GITHUB_ENV - name: Get Changed Files run: | CHANGED_FILES=$(gh pr diff ${{ github.event.pull_request.number }} --repo ${{ github.repository }} --name-only) CHANGED_FILES=$(echo "$CHANGED_FILES" | tr '\n' ' ') echo "Changed files: $CHANGED_FILES" echo "SCRIPT=$CHANGED_FILES" >> $GITHUB_ENV env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} - name: Check scripts id: run-install continue-on-error: true run: | for FILE in ${{ env.SCRIPT }}; do STRIPPED_NAME=$(basename "$FILE" | sed 's/-install//' | sed 's/\.sh$//') echo "Running Test for: $STRIPPED_NAME" FILE_STRIPPED="${FILE##*/}" LOG_FILE="result_$FILE_STRIPPED.log" if [[ $FILE =~ ^ct/.*\.sh$ ]]; then FIRST_LINE=$(sed -n '1p' "$FILE") [[ "$FIRST_LINE" != "#!/usr/bin/env bash" ]] && echo "Line 1 was $FIRST_LINE | Should be: #!/usr/bin/env bash" >> "$LOG_FILE" SECOND_LINE=$(sed -n '2p' "$FILE") [[ "$SECOND_LINE" != "source <(curl -fsSL 
https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)" ]] && echo "Line 2 was $SECOND_LINE | Should be: source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)" >> "$LOG_FILE" THIRD_LINE=$(sed -n '3p' "$FILE") if ! [[ "$THIRD_LINE" =~ ^#\ Copyright\ \(c\)\ [0-9]{4}-[0-9]{4}\ community-scripts\ ORG$ || "$THIRD_LINE" =~ ^Copyright\ \(c\)\ [0-9]{4}-[0-9]{4}\ tteck$ ]]; then echo "Line 3 was $THIRD_LINE | Should be: # Copyright (c) 2021-2025 community-scripts ORG" >> "$LOG_FILE" fi EXPECTED_AUTHOR="# Author:" EXPECTED_LICENSE="# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE" EXPECTED_SOURCE="# Source:" EXPECTED_EMPTY="" for i in {4..7}; do LINE=$(sed -n "${i}p" "$FILE") case $i in 4) [[ $LINE == $EXPECTED_AUTHOR* ]] || printf "Line %d was: '%s' | Should start with: '%s'\n" "$i" "$LINE" "$EXPECTED_AUTHOR" >> $LOG_FILE ;; 5) [[ "$LINE" == "$EXPECTED_LICENSE" ]] || printf "Line %d was: '%s' | Should be: '%s'\n" "$i" "$LINE" "$EXPECTED_LICENSE" >> $LOG_FILE ;; 6) [[ $LINE == $EXPECTED_SOURCE* ]] || printf "Line %d was: '%s' | Should start with: '%s'\n" "$i" "$LINE" "$EXPECTED_SOURCE" >> $LOG_FILE ;; 7) [[ -z $LINE ]] || printf "Line %d was: '%s' | Should be empty\n" "$i" "$LINE" >> $LOG_FILE ;; esac done EXPECTED_PREFIXES=( "APP=" "var_tags=" "var_cpu=" # Must be a number "var_ram=" # Must be a number "var_disk=" # Must be a number "var_os=" # Must be debian, alpine, or ubuntu "var_version=" "var_unprivileged=" # Must be 0 or 1 ) for i in {8..15}; do LINE=$(sed -n "${i}p" "$FILE") INDEX=$((i - 8)) case $INDEX in 2|3|4) # var_cpu, var_ram, var_disk (must be numbers) if [[ "$LINE" =~ ^${EXPECTED_PREFIXES[$INDEX]}([0-9]+)$ ]]; then continue # Valid else echo "Line $i was '$LINE' | Should be: '${EXPECTED_PREFIXES[$INDEX]}'" >> "$LOG_FILE" fi ;; 5) # var_os (must be debian, alpine, or ubuntu) if [[ "$LINE" =~ ^var_os=(debian|alpine|ubuntu)$ ]]; then continue # Valid 
else echo "Line $i was '$LINE' | Should be: 'var_os=[debian|alpine|ubuntu]'" >> "$LOG_FILE" fi ;; 7) # var_unprivileged (must be 0 or 1) if [[ "$LINE" =~ ^var_unprivileged=[01]$ ]]; then continue # Valid else echo "Line $i was '$LINE' | Should be: 'var_unprivileged=[0|1]'" >> "$LOG_FILE" fi ;; *) # Other lines (must start with expected prefix) if [[ "$LINE" == ${EXPECTED_PREFIXES[$INDEX]}* ]]; then continue # Valid else echo "Line $i was '$LINE' | Should start with '${EXPECTED_PREFIXES[$INDEX]}'" >> "$LOG_FILE" fi ;; esac done for i in {16..20}; do LINE=$(sed -n "${i}p" "$FILE") EXPECTED=( "header_info \"$APP\"" "variables" "color" "catch_errors" "function update_script() {" ) [[ "$LINE" != "${EXPECTED[$((i-16))]}" ]] && echo "Line $i was $LINE | Should be: ${EXPECTED[$((i-16))]}" >> "$LOG_FILE" done cat "$LOG_FILE" elif [[ $FILE =~ ^install/.*-install\.sh$ ]]; then FIRST_LINE=$(sed -n '1p' "$FILE") [[ "$FIRST_LINE" != "#!/usr/bin/env bash" ]] && echo "Line 1 was $FIRST_LINE | Should be: #!/usr/bin/env bash" >> "$LOG_FILE" SECOND_LINE=$(sed -n '2p' "$FILE") [[ -n "$SECOND_LINE" ]] && echo "Line 2 should be empty" >> "$LOG_FILE" THIRD_LINE=$(sed -n '3p' "$FILE") if ! 
[[ "$THIRD_LINE" =~ ^#\ Copyright\ \(c\)\ [0-9]{4}-[0-9]{4}\ community-scripts\ ORG$ || "$THIRD_LINE" =~ ^Copyright\ \(c\)\ [0-9]{4}-[0-9]{4}\ tteck$ ]]; then echo "Line 3 was $THIRD_LINE | Should be: # Copyright (c) 2021-2025 community-scripts ORG" >> "$LOG_FILE" fi EXPECTED_AUTHOR="# Author:" EXPECTED_LICENSE="# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE" EXPECTED_SOURCE="# Source:" EXPECTED_EMPTY="" for i in {4..7}; do LINE=$(sed -n "${i}p" "$FILE") case $i in 4) [[ $LINE == $EXPECTED_AUTHOR* ]] || printf "Line %d was: '%s' | Should start with: '%s'\n" "$i" "$LINE" "$EXPECTED_AUTHOR" >> $LOG_FILE ;; 5) [[ "$LINE" == "$EXPECTED_LICENSE" ]] || printf "Line %d was: '%s' | Should be: '%s'\n" "$i" "$LINE" "$EXPECTED_LICENSE" >> $LOG_FILE ;; 6) [[ $LINE == $EXPECTED_SOURCE* ]] || printf "Line %d was: '%s' | Should start with: '%s'\n" "$i" "$LINE" "$EXPECTED_SOURCE" >> $LOG_FILE ;; 7) [[ -z $LINE ]] || printf "Line %d was: '%s' | Should be empty\n" "$i" "$LINE" >> $LOG_FILE ;; esac done [[ "$(sed -n '8p' "$FILE")" != 'source /dev/stdin <<< "$FUNCTIONS_FILE_PATH"' ]] && echo 'Line 8 should be: source /dev/stdin <<< "$FUNCTIONS_FILE_PATH"' >> "$LOG_FILE" for i in {9..14}; do LINE=$(sed -n "${i}p" "$FILE") EXPECTED=( "color" "verb_ip6" "catch_errors" "setting_up_container" "network_check" "update_os" ) [[ "$LINE" != "${EXPECTED[$((i-9))]}" ]] && echo "Line $i was $LINE | Should be: ${EXPECTED[$((i-9))]}" >> "$LOG_FILE" done [[ -n "$(sed -n '15p' "$FILE")" ]] && echo "Line 15 should be empty" >> "$LOG_FILE" [[ "$(sed -n '16p' "$FILE")" != 'msg_info "Installing Dependencies"' ]] && echo 'Line 16 should be: msg_info "Installing Dependencies"' >> "$LOG_FILE" LAST_3_LINES=$(tail -n 3 "$FILE") [[ "$LAST_3_LINES" != *"$STD apt-get -y autoremove"* ]] && echo 'Third to last line should be: $STD apt-get -y autoremove' >> "$LOG_FILE" [[ "$LAST_3_LINES" != *"$STD apt-get -y autoclean"* ]] && echo 'Second to last line should be: $STD apt-get -y 
autoclean' >> "$LOG_FILE" [[ "$LAST_3_LINES" != *'msg_ok "Cleaned"'* ]] && echo 'Last line should be: msg_ok "Cleaned"' >> "$LOG_FILE" cat "$LOG_FILE" fi done - name: Post error comments run: | ERROR="false" for FILE in ${{ env.SCRIPT }}; do FILE_STRIPPED="${FILE##*/}" LOG_FILE="result_$FILE_STRIPPED.log" echo $LOG_FILE if [[ ! -f $LOG_FILE ]]; then continue fi ERROR_MSG=$(cat $LOG_FILE) if [ -n "$ERROR_MSG" ]; then echo "Posting error message for $FILE" echo ${ERROR_MSG} gh pr comment ${{ github.event.pull_request.number }} \ --repo ${{ github.repository }} \ --body ":warning: The script _**$FILE**_ has the following formatting errors:
${ERROR_MSG}
" ERROR="true" fi done echo "ERROR=$ERROR" >> $GITHUB_ENV env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} - name: Fail if error if: ${{ env.ERROR == 'true' }} run: exit 1 ``` ## /.github/workflows/scripts/app-test/pr-alpine-install.func ```func path="/.github/workflows/scripts/app-test/pr-alpine-install.func" #!/usr/bin/env bash # Copyright (c) 2021-2025 community-scripts ORG # Author: Michel Roegl-Brunner (michelroegl-brunner) # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE color() { return } catch_errors() { set -Eeuo pipefail trap 'error_handler $LINENO "$BASH_COMMAND"' ERR } # This function handles errors error_handler() { local line_number="$1" local command="$2" SCRIPT_NAME=$(basename "$0") local error_message="$SCRIPT_NAME: Failure in line $line_number while executing command $command" echo -e "\n$error_message" exit 0 } verb_ip6() { STD="" return } msg_info() { local msg="$1" echo -ne "${msg}\n" } msg_ok() { local msg="$1" echo -e "${msg}\n" } msg_error() { local msg="$1" echo -e "${msg}\n" } RETRY_NUM=10 RETRY_EVERY=3 i=$RETRY_NUM setting_up_container() { while [ $i -gt 0 ]; do if [ "$(ip addr show | grep 'inet ' | grep -v '127.0.0.1' | awk '{print $2}' | cut -d'/' -f1)" != "" ]; then break fi echo 1>&2 -en "No Network! 
" sleep $RETRY_EVERY i=$((i - 1)) done if [ "$(ip addr show | grep 'inet ' | grep -v '127.0.0.1' | awk '{print $2}' | cut -d'/' -f1)" = "" ]; then echo 1>&2 -e "\n No Network After $RETRY_NUM Tries" echo -e "Check Network Settings" exit 1 fi msg_ok "Set up Container OS" msg_ok "Network Connected: $(hostname -i)" } network_check() { RESOLVEDIP=$(getent hosts github.com | awk '{ print $1 }') if [[ -z "$RESOLVEDIP" ]]; then msg_error "DNS Lookup Failure"; else msg_ok "DNS Resolved github.com to $RESOLVEDIP"; fi set -e } update_os() { msg_info "Updating Container OS" apk update apk upgrade msg_ok "Updated Container OS" } motd_ssh() { return } customize() { return } ``` ## /.github/workflows/scripts/app-test/pr-build.func ```func path="/.github/workflows/scripts/app-test/pr-build.func" #!/usr/bin/env bash # Copyright (c) 2021-2025 community-scripts ORG # Author: Michel Roegl-Brunner (michelroegl-brunner) # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE variables() { NSAPP=$(echo ${APP,,} | tr -d ' ') # This function sets the NSAPP variable by converting the value of the APP variable to lowercase and removing any spaces. var_install="${NSAPP}-install" # sets the var_install variable by appending "-install" to the value of NSAPP. 
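# Worked example (illustrative values only, not used by the script):
#   APP="Home Assistant"  ->  NSAPP="homeassistant"            (lowercased, spaces stripped)
#                         ->  var_install="homeassistant-install"
# i.e. the matching install script would be install/homeassistant-install.sh.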
} NEXTID=$(pvesh get /cluster/nextid) timezone=$(cat /etc/timezone) header_info() { return } base_settings() { # Default Settings CT_TYPE="1" DISK_SIZE="4" CORE_COUNT="1" RAM_SIZE="1024" VERBOSE="no" PW="" CT_ID=$NEXTID HN=$NSAPP BRG="vmbr0" NET="dhcp" GATE="" APT_CACHER="" APT_CACHER_IP="" DISABLEIP6="no" MTU="" SD="" NS="" MAC="" VLAN="" SSH="no" SSH_AUTHORIZED_KEY="" TAGS="community-script;" # Override default settings with variables from ct script CT_TYPE=${var_unprivileged:-$CT_TYPE} DISK_SIZE=${var_disk:-$DISK_SIZE} CORE_COUNT=${var_cpu:-$CORE_COUNT} RAM_SIZE=${var_ram:-$RAM_SIZE} VERB=${var_verbose:-$VERBOSE} TAGS="${TAGS}${var_tags:-}" # Since these 2 are only defined outside of default_settings function, we add a temporary fallback. TODO: To align everything, we should add these as constant variables (e.g. OSTYPE and OSVERSION), but that would currently require updating the default_settings function for all existing scripts if [ -z "$var_os" ]; then var_os="debian" fi if [ -z "$var_version" ]; then var_version="12" fi } color() { # Colors YW=$(echo "\033[33m") YWB=$(echo "\033[93m") BL=$(echo "\033[36m") RD=$(echo "\033[01;31m") BGN=$(echo "\033[4;92m") GN=$(echo "\033[1;92m") DGN=$(echo "\033[32m") # Formatting CL=$(echo "\033[m") UL=$(echo "\033[4m") BOLD=$(echo "\033[1m") BFR="\\r\\033[K" HOLD=" " TAB=" " # Icons CM="${TAB}✔️${TAB}${CL}" CROSS="${TAB}✖️${TAB}${CL}" INFO="${TAB}💡${TAB}${CL}" OS="${TAB}🖥️${TAB}${CL}" OSVERSION="${TAB}🌟${TAB}${CL}" CONTAINERTYPE="${TAB}📦${TAB}${CL}" DISKSIZE="${TAB}💾${TAB}${CL}" CPUCORE="${TAB}🧠${TAB}${CL}" RAMSIZE="${TAB}🛠️${TAB}${CL}" SEARCH="${TAB}🔍${TAB}${CL}" VERIFYPW="${TAB}🔐${TAB}${CL}" CONTAINERID="${TAB}🆔${TAB}${CL}" HOSTNAME="${TAB}🏠${TAB}${CL}" BRIDGE="${TAB}🌉${TAB}${CL}" NETWORK="${TAB}📡${TAB}${CL}" GATEWAY="${TAB}🌐${TAB}${CL}" DISABLEIPV6="${TAB}🚫${TAB}${CL}" DEFAULT="${TAB}⚙️${TAB}${CL}" MACADDRESS="${TAB}🔗${TAB}${CL}" VLANTAG="${TAB}🏷️${TAB}${CL}" ROOTSSH="${TAB}🔑${TAB}${CL}" CREATING="${TAB}🚀${TAB}${CL}" 
ADVANCED="${TAB}🧩${TAB}${CL}" } catch_errors() { set -Eeuo pipefail trap 'error_handler $LINENO "$BASH_COMMAND"' ERR } # This function handles errors error_handler() { local line_number="$1" local command="$2" SCRIPT_NAME=$(basename "$0") local error_message="$SCRIPT_NAME: Failure in line $line_number while executing command $command" echo -e "\n$error_message" exit 100 } msg_info() { local msg="$1" echo -ne "${msg}\n" } msg_ok() { local msg="$1" echo -e "${msg}\n" } msg_error() { local msg="$1" echo -e "${msg}\n" } start() { base_settings return } build_container() { # if [ "$VERB" == "yes" ]; then set -x; fi if [ "$CT_TYPE" == "1" ]; then FEATURES="keyctl=1,nesting=1" else FEATURES="nesting=1" fi TEMP_DIR=$(mktemp -d) pushd $TEMP_DIR >/dev/null if [ "$var_os" == "alpine" ]; then export FUNCTIONS_FILE_PATH="$(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/.github/workflows/scripts/app-test/pr-alpine-install.func)" else export FUNCTIONS_FILE_PATH="$(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/.github/workflows/scripts/app-test/pr-install.func)" fi export CACHER="$APT_CACHER" export CACHER_IP="$APT_CACHER_IP" export tz="" export DISABLEIPV6="$DISABLEIP6" export APPLICATION="$APP" export app="$NSAPP" export PASSWORD="$PW" export VERBOSE="$VERB" export SSH_ROOT="${SSH}" export SSH_AUTHORIZED_KEY export CTID="$CT_ID" export CTTYPE="$CT_TYPE" export PCT_OSTYPE="$var_os" export PCT_OSVERSION="$var_version" export PCT_DISK_SIZE="$DISK_SIZE" export tz="$timezone" export PCT_OPTIONS=" -features $FEATURES -hostname $HN -tags $TAGS $SD $NS -net0 name=eth0,bridge=$BRG$MAC,ip=$NET$GATE$VLAN$MTU -onboot 1 -cores $CORE_COUNT -memory $RAM_SIZE -unprivileged $CT_TYPE $PW " echo "Container ID: $CTID" # This executes create_lxc.sh and creates the container and .conf file bash -c "$(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/.github/workflows/scripts/app-test/pr-create-lxc.sh)" 
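# Illustrative note (hostname and storage values below are hypothetical): for an
# unprivileged container the exported options above are consumed by pr-create-lxc.sh
# and expand to roughly
#   pct create $CTID <storage>:vztmpl/<template> -features keyctl=1,nesting=1 \
#     -hostname homeassistant -net0 name=eth0,bridge=vmbr0,ip=dhcp -onboot 1 \
#     -cores 1 -memory 1024 -unprivileged 1
# Privileged containers (CT_TYPE=0) get only nesting=1, no keyctl.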
LXC_CONFIG=/etc/pve/lxc/${CTID}.conf if [ "$CT_TYPE" == "0" ]; then cat <<EOF >>$LXC_CONFIG # USB passthrough lxc.cgroup2.devices.allow: a lxc.cap.drop: lxc.cgroup2.devices.allow: c 188:* rwm lxc.cgroup2.devices.allow: c 189:* rwm lxc.mount.entry: /dev/serial/by-id dev/serial/by-id none bind,optional,create=dir lxc.mount.entry: /dev/ttyUSB0 dev/ttyUSB0 none bind,optional,create=file lxc.mount.entry: /dev/ttyUSB1 dev/ttyUSB1 none bind,optional,create=file lxc.mount.entry: /dev/ttyACM0 dev/ttyACM0 none bind,optional,create=file lxc.mount.entry: /dev/ttyACM1 dev/ttyACM1 none bind,optional,create=file EOF fi if [ "$CT_TYPE" == "0" ]; then if [[ "$APP" == "Channels" || "$APP" == "Emby" || "$APP" == "ErsatzTV" || "$APP" == "Frigate" || "$APP" == "Jellyfin" || "$APP" == "Plex" || "$APP" == "Scrypted" || "$APP" == "Tdarr" || "$APP" == "Unmanic" || "$APP" == "Ollama" ]]; then cat <<EOF >>$LXC_CONFIG # VAAPI hardware transcoding lxc.cgroup2.devices.allow: c 226:0 rwm lxc.cgroup2.devices.allow: c 226:128 rwm lxc.cgroup2.devices.allow: c 29:0 rwm lxc.mount.entry: /dev/fb0 dev/fb0 none bind,optional,create=file lxc.mount.entry: /dev/dri dev/dri none bind,optional,create=dir lxc.mount.entry: /dev/dri/renderD128 dev/dri/renderD128 none bind,optional,create=file EOF fi else if [[ "$APP" == "Channels" || "$APP" == "Emby" || "$APP" == "ErsatzTV" || "$APP" == "Frigate" || "$APP" == "Jellyfin" || "$APP" == "Plex" || "$APP" == "Scrypted" || "$APP" == "Tdarr" || "$APP" == "Unmanic" || "$APP" == "Ollama" ]]; then if [[ -e "/dev/dri/renderD128" ]]; then if [[ -e "/dev/dri/card0" ]]; then cat <<EOF >>$LXC_CONFIG # VAAPI hardware transcoding dev0: /dev/dri/card0,gid=44 dev1: /dev/dri/renderD128,gid=104 EOF else cat <<EOF >>$LXC_CONFIG # VAAPI hardware transcoding dev0: /dev/dri/card1,gid=44 dev1: /dev/dri/renderD128,gid=104 EOF fi fi fi fi # This starts the container and executes -install.sh msg_info "Starting LXC Container" pct start "$CTID" msg_ok "Started LXC Container" if [[ ! 
-f "/root/actions-runner/_work/ProxmoxVE/ProxmoxVE/install/$var_install.sh" ]]; then msg_error "No install script found for $APP" exit 1 fi if [ "$var_os" == "alpine" ]; then sleep 3 pct exec "$CTID" -- /bin/sh -c 'cat <<EOF >/etc/apk/repositories http://dl-cdn.alpinelinux.org/alpine/latest-stable/main http://dl-cdn.alpinelinux.org/alpine/latest-stable/community EOF' pct exec "$CTID" -- ash -c "apk add bash >/dev/null" fi lxc-attach -n "$CTID" -- bash -c "$(cat /root/actions-runner/_work/ProxmoxVE/ProxmoxVE/install/$var_install.sh)" } description() { IP=$(pct exec "$CTID" ip a s dev eth0 | awk '/inet / {print $2}' | cut -d/ -f1) } ``` ## /.github/workflows/scripts/app-test/pr-create-lxc.sh ```sh path="/.github/workflows/scripts/app-test/pr-create-lxc.sh" #!/usr/bin/env bash # Copyright (c) 2021-2025 community-scripts ORG # Author: Michel Roegl-Brunner (michelroegl-brunner) # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE color() { return } catch_errors() { set -Eeuo pipefail trap 'error_handler $LINENO "$BASH_COMMAND"' ERR } # This function handles errors error_handler() { local exit_code="$?" local line_number="$1" local command="$2" local error_message="Failure in line $line_number: exit code $exit_code: while executing command $command" echo -e "\n$error_message" exit 100 } verb_ip6() { return } msg_info() { local msg="$1" echo -ne "${msg}\n" } msg_ok() { local msg="$1" echo -e "${msg}\n" } msg_error() { local msg="$1" echo -e "${msg}\n" } VALIDCT=$(pvesm status -content rootdir | awk 'NR>1') if [ -z "$VALIDCT" ]; then msg_error "Unable to detect a valid Container Storage location." exit 1 fi VALIDTMP=$(pvesm status -content vztmpl | awk 'NR>1') if [ -z "$VALIDTMP" ]; then msg_error "Unable to detect a valid Template Storage location." 
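# Note on the probes above: `pvesm status -content <type>` prints a header row
# followed by one line per storage, so `awk 'NR>1'` strips the header and an
# empty result means no storage supports that content type. Hypothetical output:
#   Name   Type  Status  Total  Used  Avail  %
#   local  dir   active  ...    ...   ...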
exit 1 fi function select_storage() { local CLASS=$1 local CONTENT local CONTENT_LABEL case $CLASS in container) CONTENT='rootdir' CONTENT_LABEL='Container' ;; template) CONTENT='vztmpl' CONTENT_LABEL='Container template' ;; *) false || { msg_error "Invalid storage class." exit 201 } ;; esac # This Queries all storage locations local -a MENU while read -r line; do local TAG=$(echo $line | awk '{print $1}') local TYPE=$(echo $line | awk '{printf "%-10s", $2}') local FREE=$(echo $line | numfmt --field 4-6 --from-unit=K --to=iec --format %.2f | awk '{printf( "%9sB", $6)}') local ITEM="Type: $TYPE Free: $FREE " local OFFSET=2 if [[ $((${#ITEM} + $OFFSET)) -gt ${MSG_MAX_LENGTH:-} ]]; then local MSG_MAX_LENGTH=$((${#ITEM} + $OFFSET)) fi MENU+=("$TAG" "$ITEM" "OFF") done < <(pvesm status -content $CONTENT | awk 'NR>1') # Select storage location if [ $((${#MENU[@]} / 3)) -eq 1 ]; then printf ${MENU[0]} else msg_error "STORAGE ISSUES!" exit 202 fi } [[ "${CTID:-}" ]] || { msg_error "You need to set 'CTID' variable." exit 203 } [[ "${PCT_OSTYPE:-}" ]] || { msg_error "You need to set 'PCT_OSTYPE' variable." exit 204 } # Test if ID is valid [ "$CTID" -ge "100" ] || { msg_error "ID cannot be less than 100." exit 205 } # Test if ID is in use if pct status $CTID &>/dev/null; then echo -e "ID '$CTID' is already in use." unset CTID msg_error "Cannot use ID that is already in use." exit 206 fi TEMPLATE_STORAGE=$(select_storage template) || exit CONTAINER_STORAGE=$(select_storage container) || exit pveam update >/dev/null TEMPLATE_SEARCH=${PCT_OSTYPE}-${PCT_OSVERSION:-} mapfile -t TEMPLATES < <(pveam available -section system | sed -n "s/.*\($TEMPLATE_SEARCH.*\)/\1/p" | sort -t - -k 2 -V) [ ${#TEMPLATES[@]} -gt 0 ] || { msg_error "Unable to find a template when searching for '$TEMPLATE_SEARCH'." exit 207 } TEMPLATE="${TEMPLATES[-1]}" TEMPLATE_PATH="/var/lib/vz/template/cache/$TEMPLATE" if ! 
pveam list "$TEMPLATE_STORAGE" | grep -q "$TEMPLATE"; then [[ -f "$TEMPLATE_PATH" ]] && rm -f "$TEMPLATE_PATH" pveam download "$TEMPLATE_STORAGE" "$TEMPLATE" >/dev/null || { msg_error "A problem occurred while downloading the LXC template." exit 208 } fi grep -q "root:100000:65536" /etc/subuid || echo "root:100000:65536" >>/etc/subuid grep -q "root:100000:65536" /etc/subgid || echo "root:100000:65536" >>/etc/subgid PCT_OPTIONS=(${PCT_OPTIONS[@]:-${DEFAULT_PCT_OPTIONS[@]}}) [[ " ${PCT_OPTIONS[@]} " =~ " -rootfs " ]] || PCT_OPTIONS+=(-rootfs "$CONTAINER_STORAGE:${PCT_DISK_SIZE:-8}") if ! pct create "$CTID" "${TEMPLATE_STORAGE}:vztmpl/${TEMPLATE}" "${PCT_OPTIONS[@]}" &>/dev/null; then [[ -f "$TEMPLATE_PATH" ]] && rm -f "$TEMPLATE_PATH" pveam download "$TEMPLATE_STORAGE" "$TEMPLATE" >/dev/null || { msg_error "A problem occurred while re-downloading the LXC template." exit 208 } if ! pct create "$CTID" "${TEMPLATE_STORAGE}:vztmpl/${TEMPLATE}" "${PCT_OPTIONS[@]}" &>/dev/null; then msg_error "A problem occurred while trying to create container after re-downloading template." 
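# Design note: creation is retried exactly once. On the first `pct create`
# failure the cached template is deleted and re-downloaded (a corrupted or
# stale cache is the usual culprit); if the retry also fails, the script
# gives up with exit code 200.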
exit 200 fi fi ``` ## /.github/workflows/scripts/app-test/pr-install.func ```func path="/.github/workflows/scripts/app-test/pr-install.func" #!/usr/bin/env bash # Copyright (c) 2021-2025 community-scripts ORG # Author: Michel Roegl-Brunner (michelroegl-brunner) # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE color() { return } catch_errors() { set -Euo pipefail trap 'error_handler $LINENO "$BASH_COMMAND"' ERR } error_handler() { local line_number="$1" local command="$2" local error_message="Failure in line $line_number while executing command '$command'" echo -e "\n$error_message\n" >&2 exit 1 } verb_ip6() { STD="silent" silent() { "$@" >/dev/null 2>&1 || error_handler "${BASH_LINENO[0]}" "$*" } return } msg_info() { local msg="$1" echo -ne "${msg}\n" } msg_ok() { local msg="$1" echo -e "${msg}\n" } msg_error() { local msg="$1" echo -e "${msg}\n" } RETRY_NUM=10 RETRY_EVERY=3 setting_up_container() { sed -i "/$LANG/ s/\(^# \)//" /etc/locale.gen locale_line=$(grep -v '^#' /etc/locale.gen | grep -E '^[a-zA-Z]' | awk '{print $1}' | head -n 1) echo "LANG=${locale_line}" >/etc/default/locale locale-gen >/dev/null export LANG=${locale_line} echo $tz >/etc/timezone ln -sf /usr/share/zoneinfo/$tz /etc/localtime for ((i = RETRY_NUM; i > 0; i--)); do if [ "$(hostname -I)" != "" ]; then break fi sleep $RETRY_EVERY done if [ "$(hostname -I)" = "" ]; then echo 1>&2 -e "\nNo Network After $RETRY_NUM Tries" echo -e "Check Network Settings" exit 101 fi rm -rf /usr/lib/python3.*/EXTERNALLY-MANAGED systemctl disable -q --now systemd-networkd-wait-online.service } network_check() { RESOLVEDIP=$(getent hosts github.com | awk '{ print $1 }') if [[ -z "$RESOLVEDIP" ]]; then msg_error "DNS Lookup Failure"; else msg_ok "DNS Resolved github.com to $RESOLVEDIP"; fi set -e } update_os() { export DEBIAN_FRONTEND=noninteractive apt-get update >/dev/null 2>&1 apt-get -o Dpkg::Options::="--force-confold" -y dist-upgrade >/dev/null rm -rf 
/usr/lib/python3.*/EXTERNALLY-MANAGED } motd_ssh() { return } customize() { return } ``` ## /.github/workflows/scripts/generate-app-headers.sh ```sh path="/.github/workflows/scripts/generate-app-headers.sh" #!/usr/bin/env bash # Base directory for headers headers_dir="./ct/headers" # Ensure the headers directory exists and clear it mkdir -p "$headers_dir" rm -f "$headers_dir"/* # Find all .sh files in ./ct directory, sorted alphabetically find ./ct -type f -name "*.sh" | sort | while read -r script; do # Extract the APP name from the APP line app_name=$(grep -oP '^APP="\K[^"]+' "$script" 2>/dev/null) if [[ -n "$app_name" ]]; then # Define the output file name in the headers directory output_file="${headers_dir}/$(basename "${script%.*}")" # Generate figlet output figlet_output=$(figlet -w 500 -f slant "$app_name") # Check if figlet output is not empty if [[ -n "$figlet_output" ]]; then echo "$figlet_output" > "$output_file" echo "Generated: $output_file" else echo "Figlet failed for $app_name in $script" fi else echo "No APP name found in $script, skipping." fi done echo "Completed processing .sh files." ``` ## /.github/workflows/scripts/update-json.sh ```sh path="/.github/workflows/scripts/update-json.sh" #!/bin/bash FILE=$1 TODAY=$(date -u +"%Y-%m-%d") if [[ -z "$FILE" ]]; then echo "No file specified. Exiting." exit 1 fi if [[ ! -f "$FILE" ]]; then echo "File $FILE not found. Exiting." 
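# Usage sketch (the file path here is a hypothetical example):
#   ./update-json.sh frontend/public/json/myapp.json
# The script rewrites .date_created to today's UTC date only when it differs,
# so files that are already current produce no diff.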
exit 1 fi DATE_IN_JSON=$(jq -r '.date_created' "$FILE" 2>/dev/null || echo "") if [[ "$DATE_IN_JSON" != "$TODAY" ]]; then jq --arg date "$TODAY" '.date_created = $date' "$FILE" >tmp.json && mv tmp.json "$FILE" fi ``` ## /.github/workflows/scripts/update_json_date.sh ```sh path="/.github/workflows/scripts/update_json_date.sh" #!/usr/bin/env bash # Directory containing the JSON files json_dir="./json/*.json" current_date=$(date +"%Y-%m-%d") for json_file in $json_dir; do if [[ -f "$json_file" ]]; then current_json_date=$(jq -r '.date_created' "$json_file") if [[ "$current_json_date" != "$current_date" ]]; then echo "Updating $json_file with date $current_date" jq --arg date "$current_date" '.date_created = $date' "$json_file" >temp.json && mv temp.json "$json_file" git add "$json_file" git commit -m "Update date_created to $current_date in $json_file" else echo "Date in $json_file is already up to date." fi fi done git push origin HEAD ``` ## /.github/workflows/update-json-date.yml ```yml path="/.github/workflows/update-json-date.yml" name: Update JSON Date on: push: branches: - main paths: - 'frontend/public/json/**.json' workflow_dispatch: jobs: update-app-files: runs-on: runner-cluster-htl-set permissions: contents: write pull-requests: write steps: - name: Generate a token id: generate-token uses: actions/create-github-app-token@v1 with: app-id: ${{ vars.APP_ID }} private-key: ${{ secrets.APP_PRIVATE_KEY }} - name: Generate dynamic branch name id: timestamp run: echo "BRANCH_NAME=pr-update-json-$(date +'%Y%m%d%H%M%S')" >> $GITHUB_ENV - name: Set up GH_TOKEN env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: | echo "GH_TOKEN=${GH_TOKEN}" >> $GITHUB_ENV - name: Checkout Repository uses: actions/checkout@v4 with: fetch-depth: 2 # Ensure we have the last two commits - name: Get Previous Commit id: prev_commit run: | PREV_COMMIT=$(git rev-parse HEAD^) echo "Previous commit: $PREV_COMMIT" echo "prev_commit=$PREV_COMMIT" >> $GITHUB_ENV - name: Get Newly Added JSON 
Files id: new_json_files run: | git diff --name-only --diff-filter=A ${{ env.prev_commit }} HEAD | grep '^frontend/public/json/.*\.json$' > new_files.txt || true echo "New files detected:" cat new_files.txt || echo "No new files." - name: Disable file mode changes run: git config core.fileMode false - name: Set up Git run: | git config --global user.name "GitHub Actions" git config --global user.email "github-actions[bot]@users.noreply.github.com" - name: Change JSON Date id: change-json-date run: | current_date=$(date +"%Y-%m-%d") while IFS= read -r file; do # Skip empty lines [[ -z "$file" ]] && continue if [[ -f "$file" ]]; then echo "Processing $file..." current_json_date=$(jq -r '.date_created // empty' "$file") if [[ -z "$current_json_date" || "$current_json_date" != "$current_date" ]]; then echo "Updating $file with date $current_date" jq --arg date "$current_date" '.date_created = $date' "$file" > temp.json && mv temp.json "$file" else echo "Date in $file is already up to date." fi else echo "Warning: File $file not found!" fi done < new_files.txt rm new_files.txt - name: Check if there are any changes run: | echo "Checking for changes..." git add -A git status if git diff --cached --quiet; then echo "No changes detected." echo "changed=false" >> "$GITHUB_ENV" else echo "Changes detected:" git diff --stat --cached echo "changed=true" >> "$GITHUB_ENV" fi # Step 7: Commit and create PR if changes exist - name: Commit and create PR if changes exist if: env.changed == 'true' run: | git commit -m "Update date in json" git checkout -b ${{ env.BRANCH_NAME }} git push origin ${{ env.BRANCH_NAME }} gh pr create --title "[core] update date in json" \ --body "This PR is auto-generated by a GitHub Action to update the date in json." 
\ --head ${{ env.BRANCH_NAME }} \ --base main \ --label "automated pr" env: GH_TOKEN: ${{ steps.generate-token.outputs.token }} - name: Approve pull request if: env.changed == 'true' env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: | PR_NUMBER=$(gh pr list --head "${{ env.BRANCH_NAME }}" --json number --jq '.[].number') if [ -n "$PR_NUMBER" ]; then gh pr review $PR_NUMBER --approve fi - name: No changes detected if: env.changed == 'false' run: echo "No changes to commit. Workflow completed successfully." ``` ## /.github/workflows/validate-filenames.yml ```yml path="/.github/workflows/validate-filenames.yml" name: Validate filenames on: pull_request_target: paths: - "ct/*.sh" - "install/*.sh" - "frontend/public/json/*.json" jobs: check-files: name: Check changed files runs-on: runner-cluster-htl-set permissions: pull-requests: write steps: - name: Get pull request information if: github.event_name == 'pull_request_target' uses: actions/github-script@v7 id: pr with: script: | const { data: pullRequest } = await github.rest.pulls.get({ ...context.repo, pull_number: context.payload.pull_request.number, }); return pullRequest; - name: Checkout code uses: actions/checkout@v4 with: fetch-depth: 0 # Ensure the full history is fetched for accurate diffing ref: ${{ github.event_name == 'pull_request_target' && fromJSON(steps.pr.outputs.result).merge_commit_sha || '' }} - name: Get changed files id: changed-files run: | if ${{ github.event_name == 'pull_request_target' }}; then echo "files=$(git diff --name-only ${{ github.event.pull_request.base.sha }} ${{ steps.pr.outputs.result && fromJSON(steps.pr.outputs.result).merge_commit_sha }} | xargs)" >> $GITHUB_OUTPUT else echo "files=$(git diff --name-only ${{ github.event.before }} ${{ github.event.after }} | xargs)" >> $GITHUB_OUTPUT fi - name: "Validate filenames in ct and install directory" if: always() && steps.changed-files.outputs.files != '' id: check-scripts run: | CHANGED_FILES=$(printf "%s\n" ${{ 
steps.changed-files.outputs.files }} | { grep -E '^(ct|install)/.*\.sh$' || true; }) NON_COMPLIANT_FILES="" for FILE in $CHANGED_FILES; do # Explicitly skip the file "ct/create_lxc.sh" if [[ "$FILE" == "ct/create_lxc.sh" ]]; then continue fi BASENAME=$(basename "${FILE%.*}") if [[ ! "$BASENAME" =~ ^[a-z0-9-]+$ ]]; then NON_COMPLIANT_FILES="$NON_COMPLIANT_FILES $FILE" fi done if [ -n "$NON_COMPLIANT_FILES" ]; then echo "files=$NON_COMPLIANT_FILES" >> $GITHUB_OUTPUT echo "Non-compliant filenames found, change to lowercase:" for FILE in $NON_COMPLIANT_FILES; do echo "$FILE" done exit 1 fi - name: "Validate filenames in json directory." if: always() && steps.changed-files.outputs.files != '' id: check-json run: | CHANGED_FILES=$(printf "%s\n" ${{ steps.changed-files.outputs.files }} | { grep -E '^json/.*\.json$' || true; }) NON_COMPLIANT_FILES="" for FILE in $CHANGED_FILES; do BASENAME=$(basename "${FILE%.*}") if [[ ! "$BASENAME" =~ ^[a-z0-9-]+$ ]]; then NON_COMPLIANT_FILES="$NON_COMPLIANT_FILES $FILE" fi done if [ -n "$NON_COMPLIANT_FILES" ]; then echo "files=$NON_COMPLIANT_FILES" >> $GITHUB_OUTPUT echo "Non-compliant filenames found, change to lowercase:" for FILE in $NON_COMPLIANT_FILES; do echo "$FILE" done exit 1 fi - name: Post results and comment if: always() && steps.check-scripts.outputs.files != '' && steps.check-json.outputs.files != '' && github.event_name == 'pull_request_target' uses: actions/github-script@v7 with: script: | const result = "${{ job.status }}" === "success" ? "success" : "failure"; const nonCompliantFiles = { script: "${{ steps.check-scripts.outputs.files }}", JSON: "${{ steps.check-json.outputs.files }}", }; const issueNumber = context.payload.pull_request ? 
context.payload.pull_request.number : null; const commentIdentifier = "validate-filenames"; let newCommentBody = `\n### Filename validation\n\n`; if (result === "failure") { newCommentBody += ":x: We found issues in the following changed files:\n\n"; for (const [check, files] of Object.entries(nonCompliantFiles)) { if (files) { newCommentBody += `**${check.charAt(0).toUpperCase() + check.slice(1)} filename invalid:**\n${files .trim() .split(" ") .map((file) => `- ${file}`) .join("\n")}\n\n`; } } newCommentBody += "Please change the filenames to lowercase and use only alphanumeric characters and dashes.\n"; } else { newCommentBody += `:rocket: All files passed filename validation!\n`; } newCommentBody += `\n\n`; if (issueNumber) { const { data: comments } = await github.rest.issues.listComments({ ...context.repo, issue_number: issueNumber, }); const existingComment = comments.find( (comment) => comment.user.login === "github-actions[bot]", ); if (existingComment) { if (existingComment.body.includes(commentIdentifier)) { const re = new RegExp(String.raw`[\s\S]*?`, ""); newCommentBody = existingComment.body.replace(re, newCommentBody); } else { newCommentBody = existingComment.body + '\n\n---\n\n' + newCommentBody; } await github.rest.issues.updateComment({ ...context.repo, comment_id: existingComment.id, body: newCommentBody, }); } else { await github.rest.issues.createComment({ ...context.repo, issue_number: issueNumber, body: newCommentBody, }); } } ``` ## /.gitignore ```gitignore path="/.gitignore" # General OS files .DS_Store Thumbs.db # Editor & IDE files (keeping .vscode settings but ignoring unnecessary metadata) !.vscode/ .vscode/*.workspace .vscode/*.tmp # Log and Cache files logs/ *.log npm-debug.log* yarn-debug.log* yarn-error.log* # Python-specific exclusions __pycache__/ *.pyc *.pyo *.pyd *.venv/ venv/ env/ *.env # Node.js dependencies (frontend folder was excluded, but keeping this rule for reference) frontend/node_modules/ frontend/.svelte-kit/ 
frontend/.turbo/ frontend/.vite/ frontend/build/ # API and Backend specific exclusions api/.env api/__pycache__/ api/*.sqlite3 # Install scripts and temporary files install/tmp/ install/*.bak # VM and Container-specific exclusions vm/tmp/ vm/*.qcow2 vm/*.img vm/*.vmdk vm/*.iso vm/*.bak # Miscellaneous temporary or unnecessary files *.bak *.swp *.swo *.swn *.tmp *.backup # JSON configuration backups json/*.bak json/*.tmp json/.vscode/ # Ignore compiled binaries or packaged artifacts *.exe *.dll *.bin *.deb *.rpm *.tar.gz *.zip *.tgz # Ignore repository metadata or Git itself .git/ .gitignore .vscode/settings.json ``` ## /.vscode/.editorconfig ```editorconfig path="/.vscode/.editorconfig" ; editorconfig.org root = true [*] charset = utf-8 continuation_indent_size = 2 end_of_line = lf indent_size = 2 indent_style = space insert_final_newline = true max_line_length = 120 tab_width = 2 ; trim_trailing_whitespace = true ; disabled until files are cleaned up [*.md] trim_trailing_whitespace = false ``` ## /.vscode/extensions.json ```json path="/.vscode/extensions.json" { "recommendations": [ "bmalehorn.shell-syntax", "timonwong.shellcheck", "foxundermoon.shell-format", "editorconfig.editorconfig" ], "unwantedRecommendations": [] } ``` ## /.vscode/settings.json ```json path="/.vscode/settings.json" { "files.associations": { "*.func": "shellscript" }, "files.eol": "\n", "files.insertFinalNewline": true, "editor.formatOnSave": true, "editor.codeActionsOnSave": { "source.fixAll": "explicit" } } ```