[{"content":"In my ongoing journey to improve my Python development workflow, I\u0026rsquo;ve discovered the power of combining Nox with GitHub Actions. After setting up Nox to automate my local development tasks, I realized I was still manually running these commands every time I wanted to push code. That\u0026rsquo;s when I thought: \u0026ldquo;Why not automate this entire process in the cloud?\u0026rdquo;\nThis week, I\u0026rsquo;ll show you exactly how to build upon the Nox setup we covered previously and create a complete CI/CD pipeline using GitHub Actions. By the end of this tutorial, your Python projects will automatically run formatting, linting, and testing on every commit - no more manual work!\nTo make this easier to follow along, I\u0026rsquo;ve created a starter repository template that includes all the files we\u0026rsquo;ll be working with. You can find it here: Python CI/CD Tutorial Starter Repo. Feel free to use it as your starting point!\nWhat is a CI/CD Pipeline Before we get started, let\u0026rsquo;s define what CI/CD stands for: continuous integration / continuous deployment, the process of moving new features from your local machine to the production environment. CI/CD pipelines are very important for a codebase because they act as its quality control, inspecting every change before it lands. CI/CD is a core principle of DevOps and backend engineering.\nSetting Up Your Project Foundation Before we dive into GitHub Actions, let\u0026rsquo;s make sure your project has the essential files. 
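To orient yourself, here is the project layout this post builds toward (every filename is introduced in the sections that follow):

```text
your-project/
├── requirements.txt            # development dependencies
├── noxfile.py                  # Nox sessions (format, lint, tests)
├── ruff.toml                   # linting rules
├── .gitignore
└── .github/
    └── workflows/
        └── python-package.yml  # the GitHub Actions workflow
```
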
If you followed my previous Nox tutorial, you might already have some of these!\nFirst, you\u0026rsquo;ll need a requirements.txt file in your project root with these development dependencies:\nnox ruff black coverage Next, here\u0026rsquo;s the noxfile.py that should be familiar from my previous tutorial:\nimport nox @nox.session def format(session: nox.Session) -\u0026gt; None: \u0026#34;\u0026#34;\u0026#34;Format code with black.\u0026#34;\u0026#34;\u0026#34; session.install(\u0026#34;black\u0026#34;) session.run(\u0026#34;black\u0026#34;, \u0026#34;.\u0026#34;) @nox.session def lint(session: nox.Session) -\u0026gt; None: \u0026#34;\u0026#34;\u0026#34;Run linting checks.\u0026#34;\u0026#34;\u0026#34; session.install(\u0026#34;ruff\u0026#34;) session.run(\u0026#34;ruff\u0026#34;, \u0026#34;check\u0026#34;, \u0026#34;--fix\u0026#34;, \u0026#34;.\u0026#34;) @nox.session def tests(session: nox.Session) -\u0026gt; None: \u0026#34;\u0026#34;\u0026#34;Run the test suite with coverage.\u0026#34;\u0026#34;\u0026#34; session.install(\u0026#34;-r\u0026#34;, \u0026#34;requirements.txt\u0026#34;) session.install(\u0026#34;coverage[toml]\u0026#34;) # Run tests with coverage session.run( \u0026#34;coverage\u0026#34;, \u0026#34;run\u0026#34;, \u0026#34;-m\u0026#34;, \u0026#34;unittest\u0026#34;, \u0026#34;discover\u0026#34;, \u0026#34;-s\u0026#34;, \u0026#34;tests\u0026#34;, \u0026#34;-p\u0026#34;, # We use *_test.py pattern for our tests \u0026#34;*_test.py\u0026#34;, ) # Generate coverage report session.run(\u0026#34;coverage\u0026#34;, \u0026#34;report\u0026#34;, \u0026#34;--fail-under=90\u0026#34;) # Generate HTML report for local viewing session.run(\u0026#34;coverage\u0026#34;, \u0026#34;html\u0026#34;) Customize Your Linting\nI always like to include a ruff.toml file to specify exactly which linting rules I want my projects to follow. 
This keeps my code consistent across all my projects:\n[lint] select = [ # pycodestyle \u0026#34;E\u0026#34;, # Pyflakes \u0026#34;F\u0026#34;, # pyupgrade \u0026#34;UP\u0026#34;, # flake8-bugbear \u0026#34;B\u0026#34;, # flake8-simplify \u0026#34;SIM\u0026#34;, # isort \u0026#34;I\u0026#34;, ] One thing I learned the hard way early in my Python journey was the importance of a comprehensive .gitignore file. Trust me, you don\u0026rsquo;t want to accidentally commit your virtual environment or sensitive data! Here\u0026rsquo;s the template I use for all my Python projects:\n# Byte-compiled / optimized / DLL files __pycache__/ *.py[codz] *$py.class # C extensions *.so # Distribution / packaging .Python build/ develop-eggs/ dist/ downloads/ eggs/ .eggs/ lib/ lib64/ parts/ sdist/ var/ wheels/ share/python-wheels/ *.egg-info/ .installed.cfg *.egg MANIFEST # PyInstaller # Usually these files are written by a python script from a template # before PyInstaller builds the exe, so as to inject date/other infos into it. *.manifest *.spec # Installer logs pip-log.txt pip-delete-this-directory.txt # Unit test / coverage reports htmlcov/ .tox/ .nox/ .coverage .coverage.* .cache nosetests.xml coverage.xml *.cover *.py.cover .hypothesis/ .pytest_cache/ cover/ # Translations *.mo *.pot # Django stuff: *.log local_settings.py db.sqlite3 db.sqlite3-journal # Flask stuff: instance/ .webassets-cache # Scrapy stuff: .scrapy # Sphinx documentation docs/_build/ # PyBuilder .pybuilder/ target/ # Jupyter Notebook .ipynb_checkpoints # IPython profile_default/ ipython_config.py # pyenv # For a library or package, you might want to ignore these files since the code is # intended to run in multiple environments; otherwise, check them in: # .python-version # pipenv # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control. 
# However, in case of collaboration, if having platform-specific dependencies or dependencies # having no cross-platform support, pipenv may install dependencies that don\u0026#39;t work, or not # install all needed dependencies. #Pipfile.lock # UV # Similar to Pipfile.lock, it is generally recommended to include uv.lock in version control. # This is especially recommended for binary packages to ensure reproducibility, and is more # commonly ignored for libraries. #uv.lock # poetry # Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control. # This is especially recommended for binary packages to ensure reproducibility, and is more # commonly ignored for libraries. # https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control #poetry.lock #poetry.toml # pdm # Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control. # pdm recommends including project-wide configuration in pdm.toml, but excluding .pdm-python. # https://pdm-project.org/en/latest/usage/project/#working-with-version-control #pdm.lock #pdm.toml .pdm-python .pdm-build/ # pixi # Similar to Pipfile.lock, it is generally recommended to include pixi.lock in version control. #pixi.lock # Pixi creates a virtual environment in the .pixi directory, just like venv module creates one # in the .venv directory. It is recommended not to include this directory in version control. .pixi # PEP 582; used by e.g. 
github.com/David-OConnor/pyflow and github.com/pdm-project/pdm __pypackages__/ # Celery stuff celerybeat-schedule celerybeat.pid # SageMath parsed files *.sage.py # Environments .env .envrc .venv env/ venv/ ENV/ env.bak/ venv.bak/ # Spyder project settings .spyderproject .spyproject # Rope project settings .ropeproject # mkdocs documentation /site # mypy .mypy_cache/ .dmypy.json dmypy.json # Pyre type checker .pyre/ # pytype static type analyzer .pytype/ # Cython debug symbols cython_debug/ # PyCharm # JetBrains specific template is maintained in a separate JetBrains.gitignore that can # be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore # and can be added to the global gitignore or merged into this file. For a more nuclear # option (not recommended) you can uncomment the following to ignore the entire idea folder. #.idea/ # Abstra # Abstra is an AI-powered process automation framework. # Ignore directories containing user credentials, local state, and settings. # Learn more at https://abstra.io/docs .abstra/ # Visual Studio Code # Visual Studio Code specific template is maintained in a separate VisualStudioCode.gitignore # that can be found at https://github.com/github/gitignore/blob/main/Global/VisualStudioCode.gitignore # and can be added to the global gitignore or merged into this file. However, if you prefer, # you could uncomment the following to ignore the entire vscode folder # .vscode/ # Ruff stuff: .ruff_cache/ # PyPI configuration file .pypirc # Cursor # Cursor is an AI-powered code editor. `.cursorignore` specifies files/directories to # exclude from AI features like autocomplete and code analysis. 
Recommended for sensitive data # refer to https://docs.cursor.com/context/ignore-files .cursorignore .cursorindexingignore # Marimo marimo/_static/ marimo/_lsp/ __marimo__/ # Mac OS Metadata .DS_Store Optional: Local Development Setup\nIf you\u0026rsquo;re working with teammates or frequently clone repositories, consider adding a setup.sh script to automate virtual environment creation. This isn\u0026rsquo;t required for GitHub Actions (since it creates its own environments), but it\u0026rsquo;s incredibly handy for local development:\n#!/usr/bin/env bash python3 -m venv .virtualenv if [[ \u0026#34;$OSTYPE\u0026#34; == \u0026#34;msys\u0026#34; || \u0026#34;$OSTYPE\u0026#34; == \u0026#34;win32\u0026#34; ]]; then source .virtualenv/Scripts/activate else source .virtualenv/bin/activate fi pip install --upgrade pip pip install -r requirements.txt Creating Your First GitHub Actions Workflow Now comes the exciting part! With all our project files in place, it\u0026rsquo;s time to set up the CI/CD pipeline that will run automatically whenever you push code.\nFirst, let\u0026rsquo;s create the directory structure that GitHub Actions expects:\nmkdir -p .github/workflows The .github directory is like the control center for your repository - it\u0026rsquo;s where GitHub looks for all configuration files. The workflows subdirectory is specifically where we\u0026rsquo;ll define our automated processes.\nNow for the main event! Let\u0026rsquo;s create our workflow file. 
I like to name mine python-package.yml to keep things clear and descriptive:\n# This workflow will install Python dependencies, run tests and lint with a variety of Python versions # For more information on workflows see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python # For information on branch naming conventions: https://medium.com/@abhay.pixolo/naming-conventions-for-git-branches-a-cheatsheet-8549feca2534 name: Python package on: workflow_dispatch: push: branches: [\u0026#34;main\u0026#34;, \u0026#34;feature/*\u0026#34;, \u0026#34;hotfix/*\u0026#34;, \u0026#34;bugfix/*\u0026#34;, \u0026#34;release/*\u0026#34;, \u0026#34;stable/*\u0026#34;] paths: - \u0026#39;**/*.py\u0026#39; - \u0026#39;**/*.pyi\u0026#39; - \u0026#39;requirements.txt\u0026#39; - \u0026#39;noxfile.py\u0026#39; pull_request: branches: [\u0026#34;main\u0026#34;, \u0026#34;feature/*\u0026#34;, \u0026#34;hotfix/*\u0026#34;, \u0026#34;bugfix/*\u0026#34;, \u0026#34;release/*\u0026#34;, \u0026#34;stable/*\u0026#34;] paths: - \u0026#39;**/*.py\u0026#39; - \u0026#39;**/*.pyi\u0026#39; - \u0026#39;requirements.txt\u0026#39; - \u0026#39;noxfile.py\u0026#39; jobs: build: runs-on: ${{ matrix.os }} strategy: fail-fast: false matrix: python-version: [\u0026#34;3.12\u0026#34;] os: [ubuntu-latest, windows-latest, macOS-latest] steps: - uses: actions/checkout@v4 - name: Set up Python ${{ matrix.python-version }} uses: actions/setup-python@v3 with: python-version: ${{ matrix.python-version }} - name: Install nox run: | python -m pip install --upgrade pip python -m pip install nox - name: Run format with nox run: nox -s format env: PYTHONPATH: ${{ github.workspace }} - name: Run lint with nox run: nox -s lint env: PYTHONPATH: ${{ github.workspace }} - name: Run tests with nox run: nox -s tests env: PYTHONPATH: ${{ github.workspace }} Committing Your Workflow Now that we\u0026rsquo;ve created our workflow file, we need to commit and push it to GitHub so it can start 
working its magic:\ngit add . git commit -m \u0026#34;Add GitHub Actions CI/CD workflow\u0026#34; git push origin main Once you push this, head over to your GitHub repository and click on the \u0026ldquo;Actions\u0026rdquo; tab. You should see your new workflow listed there! If there are any syntax errors in your YAML, GitHub will let you know right away.\nWorkflow Validation\nIf your workflow doesn\u0026rsquo;t appear in the Actions tab, double-check that your YAML indentation is correct. YAML is very sensitive to spaces and tabs. You can also use online YAML validators to check your syntax before committing.\nSetting Up Branch Protection (Important!) Before we test our workflow, let\u0026rsquo;s set up branch protection rules. This ensures that our CI/CD jobs must pass before any code can be merged into the main branch - trust me, your future self will thank you for this!\nHere\u0026rsquo;s a basic ruleset I\u0026rsquo;ve created that works well for most projects:\n{ \u0026#34;id\u0026#34;: 7904611, \u0026#34;name\u0026#34;: \u0026#34;Basic Ruleset\u0026#34;, \u0026#34;target\u0026#34;: \u0026#34;branch\u0026#34;, \u0026#34;source_type\u0026#34;: \u0026#34;Repository\u0026#34;, \u0026#34;source\u0026#34;: \u0026#34;Lementknight/Python-Github-Actions-Ci-Cd-Tutorial-Starter-Repo\u0026#34;, \u0026#34;enforcement\u0026#34;: \u0026#34;active\u0026#34;, \u0026#34;conditions\u0026#34;: { \u0026#34;ref_name\u0026#34;: { \u0026#34;exclude\u0026#34;: [], \u0026#34;include\u0026#34;: [ \u0026#34;refs/heads/main\u0026#34; ] } }, \u0026#34;rules\u0026#34;: [ { \u0026#34;type\u0026#34;: \u0026#34;deletion\u0026#34; }, { \u0026#34;type\u0026#34;: \u0026#34;non_fast_forward\u0026#34; }, { \u0026#34;type\u0026#34;: \u0026#34;pull_request\u0026#34;, \u0026#34;parameters\u0026#34;: { \u0026#34;required_approving_review_count\u0026#34;: 0, \u0026#34;dismiss_stale_reviews_on_push\u0026#34;: false, \u0026#34;require_code_owner_review\u0026#34;: false, 
\u0026#34;require_last_push_approval\u0026#34;: false, \u0026#34;required_review_thread_resolution\u0026#34;: false, \u0026#34;automatic_copilot_code_review_enabled\u0026#34;: false, \u0026#34;allowed_merge_methods\u0026#34;: [ \u0026#34;merge\u0026#34;, \u0026#34;squash\u0026#34;, \u0026#34;rebase\u0026#34; ] } } ], \u0026#34;bypass_actors\u0026#34;: [] } You can save this JSON content to a file and import it to your repository following GitHub\u0026rsquo;s ruleset import guide.\nSet This Up First\nI recommend setting up these protection rules before creating your first pull request. This way, you\u0026rsquo;ll immediately see the CI/CD checks in action and ensure your workflow is properly configured.\nFeel free to adjust these rules based on your project\u0026rsquo;s needs. For more advanced protection options, check out GitHub\u0026rsquo;s guide on creating rulesets for repositories.\nTesting Your New CI/CD Pipeline Now for the fun part - let\u0026rsquo;s see it in action! Create a pull request to trigger your workflow. You\u0026rsquo;ll see GitHub Actions automatically run your formatting, linting, and testing jobs. It\u0026rsquo;s honestly satisfying to watch!\nAnd that\u0026rsquo;s it! You now have a robust CI/CD pipeline that automatically ensures your Python code meets your quality standards before it gets merged. No more \u0026ldquo;oops, I forgot to run the tests\u0026rdquo; moments!\nWhat\u0026rsquo;s Next? I\u0026rsquo;m planning to write about advanced GitHub Actions topics like deploying to PyPI, setting up matrix builds for multiple Python versions, and integrating with other tools like pre-commit hooks. If there\u0026rsquo;s a specific topic you\u0026rsquo;d like me to cover, let me know!\nIf you run into any issues setting up your CI/CD pipeline or have questions about any part of this process, feel free to reach out. 
I\u0026rsquo;m always happy to help fellow developers streamline their workflows.\nFeel free to reach out via the contact form.\n","permalink":"https://calebaguirreleon.com/posts/python-github-actions-ci-cd-tutorial/","summary":"\u003cp\u003eIn my ongoing journey to improve my Python development workflow, I\u0026rsquo;ve discovered the power of combining \u003ca href=\"/posts/nox-tutorial/\"\u003eNox\u003c/a\u003e with GitHub Actions. After setting up Nox to automate my local development tasks, I realized I was still manually running these commands every time I wanted to push code. That\u0026rsquo;s when I thought: \u0026ldquo;Why not automate this entire process in the cloud?\u0026rdquo;\u003c/p\u003e\n\u003cp\u003eThis week, I\u0026rsquo;ll show you exactly how to build upon the Nox setup we covered previously and create a complete CI/CD pipeline using GitHub Actions. By the end of this tutorial, your Python projects will automatically run formatting, linting, and testing on every commit - no more manual work!\u003c/p\u003e","title":"Setting Up Python CI/CD with GitHub Actions and Nox"},{"content":"In my pursuit to improve as a software engineer, I discovered tools like Black and Ruff that helped me write pythonic code following PEP 8 best practices. While linter and formatter commands are easy to remember, testing commands can become painfully cumbersome.\nFor example, in one of my current projects, testing requires this command:\ncoverage run -m unittest discover -s \u0026#34;tests\u0026#34; -p \u0026#34;*_test.py\u0026#34; Followed by:\ncoverage report --fail-under=100 #generates coverage report coverage html #creates HTML for coverage report Just typing those commands makes my fingers hurt! And that\u0026rsquo;s before dealing with virtual environment setup and dependencies. This is where Nox comes in.\nNox lets you automate your formatting, linting, and testing workflows by defining them in a noxfile.py. 
This file specifies different sessions to run, along with their dependencies and commands. With Nox, you get a consistent and reproducible environment that makes dependency management and test execution simpler.\nHere\u0026rsquo;s what a noxfile.py looks like for automating formatting:\nimport nox ... @nox.session def format(session: nox.Session) -\u0026gt; None: \u0026#34;\u0026#34;\u0026#34;Format code with black.\u0026#34;\u0026#34;\u0026#34; session.install(\u0026#34;black\u0026#34;) session.run(\u0026#34;black\u0026#34;, \u0026#34;.\u0026#34;) ... You import the nox module and use the @nox.session decorator to define a function. Notice that you install the black package within the session. This is because each Nox session runs in an isolated environment where packages and commands are contained within that session.\nRun it in the terminal like this:\nnox -s format Simple, right? Now you can run your formatting without remembering complex commands or arguments.\nCommon Nox Commands Here are some essential commands you\u0026rsquo;ll use frequently with Nox:\nnox --list # List all available sessions nox # Run all sessions nox -s format # Run a specific session nox -s format lint # Run multiple sessions nox -s tests -- # Pass arguments to the session (after --) Project Structure\nPlace your noxfile.py in the root directory of your project, alongside your requirements.txt or pyproject.toml file. This ensures Nox can easily find your project files and dependencies.\nBefore creating your noxfile.py, install Nox in your development environment:\npip install nox black ruff Virtual Environment Recommendation\nI prefer to put all of my dependencies in a requirements.txt file and install them in a virtual environment. 
This keeps my global Python environment clean and avoids version conflicts between projects.\nIf you\u0026rsquo;re using a requirements.txt file for your project dependencies, here\u0026rsquo;s a complete noxfile.py example:\nimport nox @nox.session def format(session: nox.Session) -\u0026gt; None: \u0026#34;\u0026#34;\u0026#34;Format code with black.\u0026#34;\u0026#34;\u0026#34; session.install(\u0026#34;black\u0026#34;) session.run(\u0026#34;black\u0026#34;, \u0026#34;.\u0026#34;) @nox.session def lint(session: nox.Session) -\u0026gt; None: \u0026#34;\u0026#34;\u0026#34;Run linting checks.\u0026#34;\u0026#34;\u0026#34; session.install(\u0026#34;ruff\u0026#34;) session.run(\u0026#34;ruff\u0026#34;, \u0026#34;check\u0026#34;, \u0026#34;--fix\u0026#34;, \u0026#34;.\u0026#34;) @nox.session def tests(session: nox.Session) -\u0026gt; None: \u0026#34;\u0026#34;\u0026#34;Run the test suite with coverage.\u0026#34;\u0026#34;\u0026#34; session.install(\u0026#34;-r\u0026#34;, \u0026#34;requirements.txt\u0026#34;) session.install(\u0026#34;coverage[toml]\u0026#34;) # Run tests with coverage session.run( \u0026#34;coverage\u0026#34;, \u0026#34;run\u0026#34;, \u0026#34;-m\u0026#34;, \u0026#34;unittest\u0026#34;, \u0026#34;discover\u0026#34;, \u0026#34;-s\u0026#34;, \u0026#34;tests\u0026#34;, \u0026#34;-p\u0026#34;, # We use *_test.py pattern for our tests \u0026#34;*_test.py\u0026#34;, ) # Generate coverage report session.run(\u0026#34;coverage\u0026#34;, \u0026#34;report\u0026#34;, \u0026#34;--fail-under=90\u0026#34;) # Generate HTML report for local viewing session.run(\u0026#34;coverage\u0026#34;, \u0026#34;html\u0026#34;) Tip\nIf you want to read more about nox, check out the official documentation.\nAnd that\u0026rsquo;s it! You now have a fully automated testing and development workflow using Nox. You can easily run your formatting, linting, and testing commands with simple terminal commands. 
This saves time and ensures your development environment stays consistent and reproducible across different machines.\nNext week I plan on explaining how you can use nox in your ci/cd pipeline to automate your testing and deployment processes. Stay tuned!\nIf you have any questions or topics you\u0026rsquo;d like me to cover, feel free to reach out!\nFeel free to reach out via the contact form.\n","permalink":"https://calebaguirreleon.com/posts/nox-tutorial/","summary":"\u003cp\u003eIn my pursuit to improve as a software engineer, I discovered tools like \u003ca href=\"https://black.readthedocs.io/en/stable/\"\u003eBlack\u003c/a\u003e and \u003ca href=\"https://docs.astral.sh/ruff/\"\u003eRuff\u003c/a\u003e that helped me write pythonic code following \u003ca href=\"https://peps.python.org/pep-0008/\"\u003ePEP 8\u003c/a\u003e best practices. While linter and formatter commands are easy to remember, testing commands can become painfully cumbersome.\u003c/p\u003e\n\u003cp\u003eFor example, in one of my current projects, testing requires this command:\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" class=\"chroma\"\u003e\u003ccode class=\"language-bash\" data-lang=\"bash\"\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003ecoverage run -m unittest discover -s \u003cspan class=\"s2\"\u003e\u0026#34;tests\u0026#34;\u003c/span\u003e -p \u003cspan class=\"s2\"\u003e\u0026#34;*_test.py\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cp\u003eFollowed by:\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" class=\"chroma\"\u003e\u003ccode class=\"language-bash\" data-lang=\"bash\"\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003ecoverage report --fail-under\u003cspan class=\"o\"\u003e=\u003c/span\u003e\u003cspan class=\"m\"\u003e100\u003c/span\u003e \u003cspan class=\"c1\"\u003e#generates coverage 
report\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003ecoverage html \u003cspan class=\"c1\"\u003e#creates HTML for coverage report\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cp\u003eJust typing those commands makes my fingers hurt! And that\u0026rsquo;s before dealing with virtual environment setup and dependencies. This is where Nox comes in.\u003c/p\u003e","title":"Nox Tutorial"},{"content":"Recently I downloaded Balatro on my MacBook Pro and I wanted to play mods on it, but doing so required running a shell script. I didn\u0026rsquo;t want to navigate to the directory on my terminal and run the script every time I wanted to play, so I decided to turn the command that I had to run into an Apple app. This is how I did it.\nStep 1: Create the Shell Script First, create a shell script that navigates to the directory where the Balatro mods are located and runs the script that launches them. Let\u0026rsquo;s name the file play_modded_balatro.sh. Here is an example of what the script might look like:\nplay_modded_balatro.sh #!/bin/bash cd \u0026#34;path/to/your/balatro/mods\u0026#34; ./balatro_mod_script.sh Step 2: Make the Script Executable After creating the shell script, you need to make it executable. Open your terminal and run the following command:\nchmod +x play_modded_balatro.sh Congratulations! You have now created a shell script that can be executed from the terminal. 
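Before moving on, it can save a confusing Finder error later to confirm the executable bit actually took. A quick check (assumes play_modded_balatro.sh, the script from Step 1, is in the current directory):

```shell
# Sanity check: is the script from Step 2 actually executable?
if [ -x play_modded_balatro.sh ]; then
  echo "ready to bundle"
else
  echo "not executable yet - run: chmod +x play_modded_balatro.sh" >&2
fi
```
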
Now let\u0026rsquo;s turn it into an app.\nBefore we can turn the shell script into an app, we need to create a directory structure that macOS expects for applications.\nHere is what the structure of the app will look like:\nApp_Name.app/ ├── Contents/ │ ├── Info.plist │ ├── MacOS/ │ │ └── play_modded_balatro.sh App_Name.app is the name of your application; it is the root directory of the application bundle, which contains the files macOS needs to recognize it as an application.\nInfo.plist is a property list file that contains metadata about the application, such as its name, version, and executable path.\nStep 3: Create the Directory Structure Open your terminal and navigate to the Applications directory. Then run the following commands to create the necessary directories:\ncd ~ cd ../../Applications # Navigate to the Applications directory mkdir -p App_Name.app/Contents/MacOS # Creates the directory structure for the app touch App_Name.app/Contents/Info.plist # Creates the Info.plist file cp \u0026lt;path_to_your_shell_script\u0026gt; App_Name.app/Contents/MacOS/ # Copies the shell script to the MacOS directory Note\nYou want to place the app in the Applications directory so that it can be easily accessed and launched like any other application on your Mac.\nStep 4: Edit the Info.plist File Now you need to edit the Info.plist file to provide the necessary metadata for your application. 
Open the Info.plist file in a text editor and add the following content:\n\u0026lt;?xml version=\u0026#34;1.0\u0026#34; encoding=\u0026#34;UTF-8\u0026#34;?\u0026gt; \u0026lt;!DOCTYPE plist PUBLIC \u0026#34;-//Apple//DTD PLIST 1.0//EN\u0026#34; \u0026#34;http://www.apple.com/DTDs/PropertyList-1.0.dtd\u0026#34;\u0026gt; \u0026lt;plist version=\u0026#34;1.0\u0026#34;\u0026gt; \u0026lt;dict\u0026gt; \u0026lt;key\u0026gt;CFBundleName\u0026lt;/key\u0026gt; \u0026lt;string\u0026gt;App_Name\u0026lt;/string\u0026gt; \u0026lt;key\u0026gt;CFBundleIdentifier\u0026lt;/key\u0026gt; \u0026lt;string\u0026gt;com.example.App_Name\u0026lt;/string\u0026gt; \u0026lt;key\u0026gt;CFBundleVersion\u0026lt;/key\u0026gt; \u0026lt;string\u0026gt;1.0\u0026lt;/string\u0026gt; \u0026lt;key\u0026gt;CFBundleShortVersionString\u0026lt;/key\u0026gt; \u0026lt;string\u0026gt;1.0\u0026lt;/string\u0026gt; \u0026lt;key\u0026gt;LSArchitecturePriority\u0026lt;/key\u0026gt; \u0026lt;string\u0026gt;arm64\u0026lt;/string\u0026gt; \u0026lt;/dict\u0026gt; \u0026lt;/plist\u0026gt; Note\nLSArchitecturePriority is set to arm64 here because the Balatro mod needs to run natively on Apple Silicon (not under Rosetta 2); adjust it if your script targets a different architecture. You can read more about the Info.plist file here.\nStep 5: Run the Application Now that you have created the app structure and edited the Info.plist file, you can run your application. Open Finder, navigate to the Applications directory, and double-click on App_Name.app.\nReferences:\nConverting a Shell Script Into a *.app File The Importance of Info.plist in iOS Development ","permalink":"https://calebaguirreleon.com/posts/how-to-turn-shell-script-into-apple-app/","summary":"\u003cp\u003eRecently I downloaded \u003ca href=\"https://www.playbalatro.com/\"\u003eBalatro\u003c/a\u003e on my MacBook Pro and I wanted to play mods on it, but doing so required running a shell script. 
I didn\u0026rsquo;t want to navigate to the directory on my terminal and run the script every time I wanted to play, so I decided to turn the command that I had to run into an Apple app. This is how I did it.\u003c/p\u003e","title":"How to Turn a Shell Script into an Apple App"},{"content":"About Me Bio My name is Caleb Aguirre-Leon and I am an NYC-based data engineer at Attention Arc. I have industry experience supporting enterprise clients including Google and IBM.\nWhen I am not programming, I am taking pictures of the world around me. You can see some of my work on my Flickr.\n","permalink":"https://calebaguirreleon.com/aboutme/","summary":"\u003ch1 id=\"about-me\"\u003eAbout Me\u003c/h1\u003e\n\u003ch2 id=\"bio\"\u003eBio\u003c/h2\u003e\n\u003c!-- Insert a proper description at some point --\u003e\n\u003cp\u003eMy name is Caleb Aguirre-Leon and I am an NYC-based data engineer at \u003ca href=\"https://attentionarc.com/\"\u003eAttention Arc\u003c/a\u003e. I have industry experience supporting enterprise clients including Google and IBM.\u003c/p\u003e\n\u003cp\u003eWhen I am not programming, I am taking pictures of the world around me. You can see some of my work on my \u003ca href=\"https://www.flickr.com/photos/lementknight/\"\u003eFlickr\u003c/a\u003e.\u003c/p\u003e","title":""},{"content":"Have a question about a post, or just want to say hi? Fill out the form below.\nName Email Message Send ","permalink":"https://calebaguirreleon.com/contact/","summary":"Get in touch","title":"Contact"}]