Merge branch 'main' into patch-1

Erwan MATHIEU 2024-04-12 10:16:32 +02:00 committed by GitHub
commit 4bd1c4c4e0
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
7956 changed files with 237740 additions and 241926 deletions


@@ -0,0 +1,76 @@
name: ❌ Slicing Failed
description: When you see the message Slicing failed with an unexpected error
labels: ["Type: Bug", "Status: Triage", "Slicing Error :collision:"]
body:
- type: markdown
attributes:
value: |
### ✨Try our improved Cura 5.7✨
Before filling out the report below, we want you to try the latest Cura 5.7 Beta.
This version of Cura has become significantly more reliable and has an updated slicing engine that will automatically send a report to the Cura Team for analysis.
#### [You can find the downloads here](https://github.com/Ultimaker/Cura/releases/tag/5.7.0-beta.1) ####
If you still encounter a crash, you are welcome to report the issue so we can use your model as a test case; you can find instructions on how to do that below.
### Project File
**⚠️ Before you continue, we need your project file to troubleshoot a slicing crash.**
It contains the printer and settings we need for troubleshooting.
![Alt Text](https://user-images.githubusercontent.com/40423138/240616958-5a9751f2-bd34-4808-9752-6fde2e27516e.gif)
To save a project file go to File -> Save project.
Please make sure to .zip your project file.
For big files, you may need to use [WeTransfer](https://wetransfer.com/) or similar file-sharing sites.
🤔 Before you share, please ask yourself: is this a model that can be shared?
Unfortunately we cannot help if this file is missing.
Do you have the project file? Then let's continue ⬇️
### Questions
- type: input
attributes:
label: Cura Version
placeholder: 5.6.0
validations:
required: true
- type: markdown
attributes:
value: |
We work hard on fixing slicing crashes. Our most recent release is 5.6.0.
If you are not on the latest version of Cura, [you can download it here](https://github.com/Ultimaker/Cura/releases/latest).
- type: input
attributes:
label: Operating System
description: Information about the operating system the issue occurs on. Include at least the operating system and, if relevant, the GPU.
placeholder: Windows 11 / MacOS Catalina / MX Linux
validations:
required: true
- type: input
attributes:
label: Printer
description: Which printer was selected in Cura?
validations:
required: true
- type: input
attributes:
label: Name abnormal settings
description: Are there any settings that you might have changed that caused the crash? Does your model slice when you select the default profiles?
placeholder:
validations:
- type: input
attributes:
label: Describe model location
description: Does your model slice if you rotate the model 90 degrees or if you move it away from the center of the buildplate?
placeholder:
validations:
- type: input
attributes:
label: Describe your model
description: Have you sliced your model successfully before? Is it watertight? Have you tried doing a quick [Mesh Fix with the Meshtools Plugin](https://marketplace.ultimaker.com/app/cura/plugins/fieldofview/MeshTools)?
validations:
required: true
- type: textarea
attributes:
label: Add your .zip here ⬇️
description: You can add the zip file and additional information that is relevant to the issue in the comments below.
validations:
required: true


@@ -1,40 +1,41 @@
name: Bug Report
name: 🪲 Bug Report
description: Create a report to help us fix issues.
labels: ["Type: Bug", "Status: Triage"]
body:
- type: markdown
attributes:
value: |
**Thank you for using Cura and wanting to report a bug.**
**Thank you for using Cura and wanting to report a bug. 🙏**
Before filing, please check if the issue already exists (either open or closed) by using the search bar on the issues page. If it does, comment there. Even if it's closed, we can reopen it based on your comment.
Before filing, [please check if the issue already exists](https://github.com/Ultimaker/Cura/issues?q=is%3Aissue) by using the search bar on the issues page.
If it does, comment there. Even if it's closed, we can reopen it based on your comment.
Also, please note the application version in the title of the issue "For example (3.2.1) Cannot connect to 3rd-party printer". Please do not write things like **Request** or **BUG** in the title, this is what labels are for.
Please include the Cura version in the title of the issue. For example, *"[5.4.0] Support Brim is missing in this model"*.
- type: input
attributes:
label: Application Version
label: Cura Version
description: The version of Cura this issue occurs with.
placeholder: 5.0.0
placeholder: 5.4.0
validations:
required: true
- type: input
attributes:
label: Platform
label: Operating System
description: Information about the operating system the issue occurs on. Include at least the operating system and, if relevant, the GPU.
placeholder: Windows 10
placeholder: Windows 11 / MacOS Catalina / MX Linux
validations:
required: true
- type: input
attributes:
label: Printer
description: Which printer was selected in Cura?
placeholder: Ultimaker S5
description: Which printer was selected in Cura? It also helps to mention if you made any firmware modifications to your printer.
placeholder: Ultimaker S7 / Creality CR-10 with Klipper
validations:
required: true
- type: textarea
attributes:
label: Reproduction steps
description: Tell us what you did!
description: Share what you did, so we can reproduce it
placeholder: |
1. Something you did
2. Something you did next
@@ -43,40 +44,39 @@ body:
- type: textarea
attributes:
label: Actual results
description: What happens after the above steps have been followed.
description: What happens after the above steps have been followed?
validations:
required: true
- type: textarea
attributes:
label: Expected results
description: What should happen after the above steps have been followed.
description: What should happen after the above steps have been followed?
validations:
required: true
- type: markdown
attributes:
value: |
Please be sure to add the following files:
* For slicing issues, upload a **project file** that clearly shows the bug.
To save a project file go to `File -> Save project`. Please make sure to .zip your project file. For big files you may need to use WeTransfer or similar file sharing sites.
G-code files are not project files!
* **Screenshots** of showing the problem, perhaps before/after images.
* A **log file** for crashes and similar issues.
You can find your log file here:
Windows: `%APPDATA%\cura\<Cura version>\cura.log` or usually `C:\Users\\<your username>\AppData\Roaming\cura\<Cura version>\cura.log`
MacOS: `$USER/Library/Application Support/cura/<Cura version>/cura.log`
Ubuntu/Linux: `$USER/.local/share/cura/<Cura version>/cura.log`
If the Cura user interface still starts, you can also reach this directory from the application menu in Help -> Show settings folder
- type: checkboxes
attributes:
label: Checklist of files to include
options:
- label: Log file
- label: Project file
### Please add the following files when they are related to...
* 🔵 **The quality of your print**
Please add **a Project File**. It contains the printer and settings we need for troubleshooting.
To save a project file go to File -> Save project.
Please make sure to .zip your project file. For big files, you may need to use [WeTransfer](https://wetransfer.com/) or similar file-sharing sites.
G-code files are not project files! Before you share, please ask yourself: is this a model that can be shared?
![Alt Text](https://user-images.githubusercontent.com/40423138/240616958-5a9751f2-bd34-4808-9752-6fde2e27516e.gif)
* 🔵 **Using and interacting with Cura**
Please add **screenshots** showing the issue.
Before/after images and arrows can help here.
* 🔵 **Unexpected crashes and behavior**
Please add **a log file** with information on what your Cura is doing.
You can find your log file here:
Windows: `%APPDATA%\cura\<Cura version>\cura.log`
MacOS: `$USER/Library/Application Support/cura/<Cura version>/cura.log`
Ubuntu/Linux: `$USER/.local/share/cura/<Cura version>/cura.log`
If the Cura user interface still starts, you can also reach this directory from the application menu in Help -> Show settings folder
- type: textarea
attributes:
label: Additional information & file uploads
description: You can add these files and additional information that is relevant to the issue in the comments below.
label: Add your .zip and screenshots here ⬇️
description: You can add the zip file and additional information that is relevant to the issue in the comments below.
validations:
required: true
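A side note on the log-file locations listed in the template above: below is a minimal sketch that resolves the per-OS Cura log path described there. It assumes a default installation, and the version folder name ("5.7") is only a placeholder.

```python
# Minimal sketch: resolve the Cura log path described in the template above.
# Assumes a default installation; the "5.7" version folder is a placeholder.
import os
import platform
from pathlib import Path

def cura_log_path(version: str = "5.7") -> Path:
    system = platform.system()
    if system == "Windows":
        base = Path(os.environ["APPDATA"])                       # %APPDATA%\cura\<version>\cura.log
    elif system == "Darwin":
        base = Path.home() / "Library" / "Application Support"   # macOS
    else:
        base = Path.home() / ".local" / "share"                  # Ubuntu/Linux
    return base / "cura" / version / "cura.log"

print(cura_log_path())
```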


@@ -1,4 +1,4 @@
name: Feature Request
name: 💡 Feature Request
description: Suggest an idea for this project.
labels: ["Type: New Feature", "Status: Triage"]
body:


@@ -28,6 +28,6 @@ This fixes... OR This improves... -->
<!-- Check if relevant -->
- [ ] My code follows the style guidelines of this project as described in [UltiMaker Meta](https://github.com/Ultimaker/Meta) and [Cura QML best practices](https://github.com/Ultimaker/Cura/wiki/QML-Best-Practices)
- [ ] I have read the [Contribution guide](https://github.com/Ultimaker/Cura/blob/main/contributing.md)
- [ ] I have read the [Contribution guide](https://github.com/Ultimaker/Cura/blob/main/CONTRIBUTING.md)
- [ ] I have commented my code, particularly in hard-to-understand areas
- [ ] I have uploaded any files required to test this change
- [ ] I have uploaded any files required to test this change


@@ -1,154 +0,0 @@
name: Create and Upload Conan package
on:
workflow_call:
inputs:
project_name:
required: true
type: string
recipe_id_full:
required: true
type: string
build_id:
required: true
type: number
build_info:
required: false
default: true
type: boolean
recipe_id_latest:
required: false
type: string
runs_on:
required: true
type: string
python_version:
required: true
type: string
conan_config_branch:
required: false
type: string
conan_logging_level:
required: false
type: string
conan_clean_local_cache:
required: false
type: boolean
default: false
conan_upload_community:
required: false
default: true
type: boolean
env:
CONAN_LOGIN_USERNAME_CURA: ${{ secrets.CONAN_USER }}
CONAN_PASSWORD_CURA: ${{ secrets.CONAN_PASS }}
CONAN_LOGIN_USERNAME_CURA_CE: ${{ secrets.CONAN_USER }}
CONAN_PASSWORD_CURA_CE: ${{ secrets.CONAN_PASS }}
CONAN_LOG_RUN_TO_OUTPUT: 1
CONAN_LOGGING_LEVEL: ${{ inputs.conan_logging_level }}
CONAN_NON_INTERACTIVE: 1
jobs:
conan-package-create:
runs-on: ${{ inputs.runs_on }}
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Setup Python and pip
uses: actions/setup-python@v4
with:
python-version: ${{ inputs.python_version }}
cache: 'pip'
cache-dependency-path: .github/workflows/requirements-conan-package.txt
- name: Install Python requirements for runner
run: pip install -r https://raw.githubusercontent.com/Ultimaker/Cura/main/.github/workflows/requirements-conan-package.txt
# Note the runner requirements are always installed from the main branch in the Ultimaker/Cura repo
- name: Use Conan download cache (Bash)
if: ${{ runner.os != 'Windows' }}
run: conan config set storage.download_cache="$HOME/.conan/conan_download_cache"
- name: Use Conan download cache (Powershell)
if: ${{ runner.os == 'Windows' }}
run: conan config set storage.download_cache="C:\Users\runneradmin\.conan\conan_download_cache"
- name: Cache Conan local repository packages (Bash)
uses: actions/cache@v3
if: ${{ runner.os != 'Windows' }}
with:
path: |
$HOME/.conan/data
$HOME/.conan/conan_download_cache
key: conan-${{ inputs.runs_on }}-${{ runner.arch }}-create-cache
- name: Cache Conan local repository packages (Powershell)
uses: actions/cache@v3
if: ${{ runner.os == 'Windows' }}
with:
path: |
C:\Users\runneradmin\.conan\data
C:\.conan
C:\Users\runneradmin\.conan\conan_download_cache
key: conan-${{ inputs.runs_on }}-${{ runner.arch }}-create-cache
- name: Install MacOS system requirements
if: ${{ runner.os == 'Macos' }}
run: brew install autoconf automake ninja
# NOTE: Due to what are probably github issues, we have to remove the cache and reconfigure before the rest.
# This is maybe because grub caches the disk it uses last time, which is recreated each time.
- name: Install Linux system requirements
if: ${{ runner.os == 'Linux' }}
run: |
sudo rm /var/cache/debconf/config.dat
sudo dpkg --configure -a
sudo add-apt-repository ppa:ubuntu-toolchain-r/test -y
sudo apt update
sudo apt upgrade
sudo apt install build-essential checkinstall libegl-dev zlib1g-dev libssl-dev ninja-build autoconf libx11-dev libx11-xcb-dev libfontenc-dev libice-dev libsm-dev libxau-dev libxaw7-dev libxcomposite-dev libxcursor-dev libxdamage-dev libxdmcp-dev libxext-dev libxfixes-dev libxi-dev libxinerama-dev libxkbfile-dev libxmu-dev libxmuu-dev libxpm-dev libxrandr-dev libxrender-dev libxres-dev libxss-dev libxt-dev libxtst-dev libxv-dev libxvmc-dev libxxf86vm-dev xtrans-dev libxcb-render0-dev libxcb-render-util0-dev libxcb-xkb-dev libxcb-icccm4-dev libxcb-image0-dev libxcb-keysyms1-dev libxcb-randr0-dev libxcb-shape0-dev libxcb-sync-dev libxcb-xfixes0-dev libxcb-xinerama0-dev xkb-data libxcb-dri3-dev uuid-dev libxcb-util-dev libxkbcommon-x11-dev pkg-config flex bison -y
- name: Install GCC-12 on ubuntu-22.04
if: ${{ startsWith(inputs.runs_on, 'ubuntu-22.04') }}
run: |
sudo apt install g++-12 gcc-12 -y
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-12 12
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-12 12
- name: Use GCC-10 on ubuntu-20.04
if: ${{ startsWith(inputs.runs_on, 'ubuntu-20.04') }}
run: |
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-10 10
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-10 10
- name: Create the default Conan profile
run: conan profile new default --detect
- name: Get Conan configuration from branch
if: ${{ inputs.conan_config_branch != '' }}
run: conan config install https://github.com/Ultimaker/conan-config.git -a "-b ${{ inputs.conan_config_branch }}"
- name: Get Conan configuration
if: ${{ inputs.conan_config_branch == '' }}
run: conan config install https://github.com/Ultimaker/conan-config.git
- name: Create the Packages
run: conan install ${{ inputs.recipe_id_full }} --build=missing --update
- name: Upload the Package(s)
if: always()
run: |
conan upload ${{ inputs.recipe_id_full }} -r cura --all -c


@@ -1,27 +1,6 @@
---
name: conan-package
# Exports the recipe, sources and binaries for Mac, Windows and Linux and upload these to the server such that these can
# be used downstream.
#
# It should run on pushes against main or CURA-* branches, but it will only create the binaries for main and release branches
on:
workflow_dispatch:
inputs:
create_binaries_windows:
required: true
default: false
description: 'create binaries Windows'
create_binaries_linux:
required: true
default: false
description: 'create binaries Linux'
create_binaries_macos:
required: true
default: false
description: 'create binaries Macos'
push:
paths:
- 'plugins/**'
@@ -32,118 +11,40 @@ on:
- 'packaging/**'
- '.github/workflows/conan-*.yml'
- '.github/workflows/notify.yml'
- '.github/workflows/requirements-conan-package.txt'
- '.github/workflows/requirements-runner.txt'
- 'requirements*.txt'
- 'conanfile.py'
- 'conandata.yml'
- 'GitVersion.yml'
- '*.jinja'
branches:
- main
- 'main'
- 'CURA-*'
- '[1-9].[0-9]'
- '[1-9].[0-9][0-9]'
tags:
- '[1-9].[0-9].[0-9]*'
- '[1-9].[0-9].[0-9]'
- '[1-9].[0-9][0-9].[0-9]*'
- 'PP-*'
- '[0-9].[0-9]*'
- '[0-9].[0-9][0-9]*'
env:
CONAN_LOGIN_USERNAME_CURA: ${{ secrets.CONAN_USER }}
CONAN_PASSWORD_CURA: ${{ secrets.CONAN_PASS }}
CONAN_LOGIN_USERNAME_CURA_CE: ${{ secrets.CONAN_USER }}
CONAN_PASSWORD_CURA_CE: ${{ secrets.CONAN_PASS }}
CONAN_LOG_RUN_TO_OUTPUT: 1
CONAN_LOGGING_LEVEL: ${{ inputs.conan_logging_level }}
CONAN_NON_INTERACTIVE: 1
CONAN_LOGIN_USERNAME_CURA: ${{ secrets.CONAN_USER }}
CONAN_PASSWORD_CURA: ${{ secrets.CONAN_PASS }}
permissions: {}
jobs:
conan-recipe-version:
permissions:
contents: read
uses: ultimaker/cura/.github/workflows/conan-recipe-version.yml@main
uses: ultimaker/cura-workflows/.github/workflows/conan-recipe-version.yml@main
with:
project_name: cura
conan-package-create-linux:
conan-package-export:
needs: [ conan-recipe-version ]
runs-on: 'ubuntu-latest'
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Setup Python and pip
uses: actions/setup-python@v4
with:
python-version: '3.10.x'
cache: 'pip'
cache-dependency-path: .github/workflows/requirements-conan-package.txt
- name: Install Python requirements for runner
run: pip install -r https://raw.githubusercontent.com/Ultimaker/Cura/main/.github/workflows/requirements-conan-package.txt
# Note the runner requirements are always installed from the main branch in the Ultimaker/Cura repo
- name: Use Conan download cache (Bash)
if: ${{ runner.os != 'Windows' }}
run: conan config set storage.download_cache="$HOME/.conan/conan_download_cache"
- name: Cache Conan local repository packages (Bash)
uses: actions/cache@v3
with:
path: |
$HOME/.conan/data
$HOME/.conan/conan_download_cache
key: conan-ubuntu-${{ runner.arch }}-create-cache
# NOTE: Due to what are probably github issues, we have to remove the cache and reconfigure before the rest.
# This is maybe because grub caches the disk it uses last time, which is recreated each time.
- name: Install Linux system requirements
if: ${{ runner.os == 'Linux' }}
run: |
sudo rm /var/cache/debconf/config.dat
sudo dpkg --configure -a
sudo add-apt-repository ppa:ubuntu-toolchain-r/test -y
sudo apt update
sudo apt upgrade
sudo apt install efibootmgr build-essential checkinstall libegl-dev zlib1g-dev libssl-dev ninja-build autoconf libx11-dev libx11-xcb-dev libfontenc-dev libice-dev libsm-dev libxau-dev libxaw7-dev libxcomposite-dev libxcursor-dev libxdamage-dev libxdmcp-dev libxext-dev libxfixes-dev libxi-dev libxinerama-dev libxkbfile-dev libxmu-dev libxmuu-dev libxpm-dev libxrandr-dev libxrender-dev libxres-dev libxss-dev libxt-dev libxtst-dev libxv-dev libxvmc-dev libxxf86vm-dev xtrans-dev libxcb-render0-dev libxcb-render-util0-dev libxcb-xkb-dev libxcb-icccm4-dev libxcb-image0-dev libxcb-keysyms1-dev libxcb-randr0-dev libxcb-shape0-dev libxcb-sync-dev libxcb-xfixes0-dev libxcb-xinerama0-dev xkb-data libxcb-dri3-dev uuid-dev libxcb-util-dev libxkbcommon-x11-dev pkg-config flex bison -y
- name: Install GCC-12
run: |
sudo apt install g++-12 gcc-12 -y
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-12 12
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-12 12
- name: Create the default Conan profile
run: conan profile new default --detect
- name: Get Conan configuration
run: conan config install https://github.com/Ultimaker/conan-config.git
- name: Create the Packages
run: conan create . ${{ needs.conan-recipe-version.outputs.recipe_id_full }} --build=missing --update -o ${{ needs.conan-recipe-version.outputs.project_name }}:devtools=True
- name: Create the latest alias
if: always()
run: conan alias ${{ needs.conan-recipe-version.outputs.recipe_id_latest }} ${{ needs.conan-recipe-version.outputs.recipe_id_full }}
- name: Upload the Package(s)
if: always()
run: |
conan upload ${{ needs.conan-recipe-version.outputs.recipe_id_full }} -r cura --all -c
conan upload ${{ needs.conan-recipe-version.outputs.recipe_id_latest }} -r cura -c
notify-create:
if: ${{ always() && (github.event_name == 'push' && (github.ref_name == 'main' || github.ref_name == 'master' || needs.conan-recipe-version.outputs.is_release_branch == 'true')) }}
needs: [ conan-recipe-version, conan-package-create-linux ]
uses: ultimaker/cura/.github/workflows/notify.yml@main
uses: ultimaker/cura-workflows/.github/workflows/conan-recipe-export.yml@main
with:
success: ${{ contains(join(needs.*.result, ','), 'success') }}
success_title: "New binaries created in ${{ github.repository }}"
success_body: "Created binaries for ${{ needs.conan-recipe-version.outputs.recipe_id_full }}"
failure_title: "Failed to create binaries in ${{ github.repository }}"
failure_body: "Failed to created binaries for ${{ needs.conan-recipe-version.outputs.recipe_id_full }}"
recipe_id_full: ${{ needs.conan-recipe-version.outputs.recipe_id_full }}
recipe_id_latest: ${{ needs.conan-recipe-version.outputs.recipe_id_latest }}
secrets: inherit
conan-package-create:
needs: [ conan-recipe-version, conan-package-export ]
uses: ultimaker/cura-workflows/.github/workflows/conan-package-create-linux.yml@main
with:
recipe_id_full: ${{ needs.conan-recipe-version.outputs.recipe_id_full }}
conan_extra_args: "-o cura:enable_i18n=True"
secrets: inherit


@@ -1,99 +0,0 @@
name: Export Conan Recipe to server
on:
workflow_call:
inputs:
recipe_id_full:
required: true
type: string
recipe_id_latest:
required: false
type: string
runs_on:
required: true
type: string
python_version:
required: true
type: string
conan_config_branch:
required: false
type: string
conan_logging_level:
required: false
type: string
conan_export_binaries:
required: false
type: boolean
conan_upload_community:
required: false
default: true
type: boolean
env:
CONAN_LOGIN_USERNAME_CURA: ${{ secrets.CONAN_USER }}
CONAN_PASSWORD_CURA: ${{ secrets.CONAN_PASS }}
CONAN_LOGIN_USERNAME_CURA_CE: ${{ secrets.CONAN_USER }}
CONAN_PASSWORD_CURA_CE: ${{ secrets.CONAN_PASS }}
CONAN_LOG_RUN_TO_OUTPUT: 1
CONAN_LOGGING_LEVEL: ${{ inputs.conan_logging_level }}
CONAN_NON_INTERACTIVE: 1
jobs:
package-export:
runs-on: ${{ inputs.runs_on }}
steps:
- name: Checkout project
uses: actions/checkout@v3
- name: Setup Python and pip
uses: actions/setup-python@v4
with:
python-version: ${{ inputs.python_version }}
cache: 'pip'
cache-dependency-path: .github/workflows/requirements-conan-package.txt
- name: Install Python requirements and Create default Conan profile
run: |
pip install -r https://raw.githubusercontent.com/Ultimaker/Cura/main/.github/workflows/requirements-conan-package.txt
conan profile new default --detect
# Note the runner requirements are always installed from the main branch in the Ultimaker/Cura repo
- name: Cache Conan local repository packages
uses: actions/cache@v3
with:
path: $HOME/.conan/data
key: ${{ runner.os }}-conan-export-cache
- name: Get Conan configuration from branch
if: ${{ inputs.conan_config_branch != '' }}
run: conan config install https://github.com/Ultimaker/conan-config.git -a "-b ${{ inputs.conan_config_branch }}"
- name: Get Conan configuration
if: ${{ inputs.conan_config_branch == '' }}
run: conan config install https://github.com/Ultimaker/conan-config.git
- name: Export the Package (binaries)
if: ${{ inputs.conan_export_binaries }}
run: conan create . ${{ inputs.recipe_id_full }} --build=missing --update
- name: Export the Package
if: ${{ !inputs.conan_export_binaries }}
run: conan export . ${{ inputs.recipe_id_full }}
- name: Create the latest alias
if: always()
run: conan alias ${{ inputs.recipe_id_latest }} ${{ inputs.recipe_id_full }}
- name: Upload the Package(s)
if: always()
run: |
conan upload ${{ inputs.recipe_id_full }} -r cura --all -c
conan upload ${{ inputs.recipe_id_latest }} -r cura -c


@@ -1,221 +0,0 @@
name: Get Conan Recipe Version
on:
workflow_call:
inputs:
project_name:
required: true
type: string
user:
required: false
default: ultimaker
type: string
additional_buildmetadata:
required: false
default: ""
type: string
outputs:
recipe_id_full:
description: "The full Conan recipe id: <name>/<version>@<user>/<channel>"
value: ${{ jobs.get-semver.outputs.recipe_id_full }}
recipe_id_latest:
description: "The full Conan recipe aliased (latest) id: <name>/(latest)@<user>/<channel>"
value: ${{ jobs.get-semver.outputs.recipe_id_latest }}
recipe_semver_full:
description: "The full semver <Major>.<Minor>.<Patch>-<PreReleaseTag>+<BuildMetaData>"
value: ${{ jobs.get-semver.outputs.semver_full }}
is_release_branch:
description: "is current branch a release branch?"
value: ${{ jobs.get-semver.outputs.release_branch }}
user:
description: "The conan user"
value: ${{ jobs.get-semver.outputs.user }}
channel:
description: "The conan channel"
value: ${{ jobs.get-semver.outputs.channel }}
project_name:
description: "The conan projectname"
value: ${{ inputs.project_name }}
jobs:
get-semver:
runs-on: ubuntu-latest
outputs:
recipe_id_full: ${{ steps.get-conan-broadcast-data.outputs.recipe_id_full }}
recipe_id_latest: ${{ steps.get-conan-broadcast-data.outputs.recipe_id_latest }}
semver_full: ${{ steps.get-conan-broadcast-data.outputs.semver_full }}
is_release_branch: ${{ steps.get-conan-broadcast-data.outputs.is_release_branch }}
user: ${{ steps.get-conan-broadcast-data.outputs.user }}
channel: ${{ steps.get-conan-broadcast-data.outputs.channel }}
steps:
- name: Checkout repo
uses: actions/checkout@v3
if: ${{ github.event.pull_request.head.repo.full_name == github.repository }}
with:
fetch-depth: 0
ref: ${{ github.head_ref }}
- name: Checkout repo PR
uses: actions/checkout@v3
if: ${{ github.event.pull_request.head.repo.full_name != github.repository }}
with:
fetch-depth: 0
ref: ${{ github.base_ref }}
- name: Setup Python and pip
uses: actions/setup-python@v4
with:
python-version: "3.10.x"
cache: 'pip'
cache-dependency-path: .github/workflows/requirements-conan-package.txt
- name: Install Python requirements and Create default Conan profile
run: |
pip install -r .github/workflows/requirements-conan-package.txt
pip install gitpython
- id: get-conan-broadcast-data
name: Get Conan broadcast data
run: |
import subprocess
import os
from conan.tools.scm import Version
from conan.errors import ConanException
from git import Repo
repo = Repo('.')
user = "${{ inputs.user }}".lower()
project_name = "${{ inputs.project_name }}"
event_name = "${{ github.event_name }}"
issue_number = "${{ github.ref }}".split('/')[2]
is_tag = "${{ github.ref_type }}" == "tag"
is_release_branch = False
ref_name = "${{ github.base_ref }}" if event_name == "pull_request" else "${{ github.ref_name }}"
buildmetadata = "" if "${{ inputs.additional_buildmetadata }}" == "" else "${{ inputs.additional_buildmetadata }}_"
# FIXME: for when we push a tag (such as a release)
channel = "testing"
if is_tag:
branch_version = Version(ref_name)
is_release_branch = True
channel = "_"
user = "_"
actual_version = f"{branch_version}"
else:
try:
branch_version = Version(repo.active_branch.name)
except ConanException:
branch_version = Version('0.0.0')
if ref_name == f"{branch_version.major}.{branch_version.minor}":
channel = 'stable'
is_release_branch = True
elif ref_name in ("main", "master"):
channel = 'testing'
else:
channel = "_".join(repo.active_branch.name.replace("-", "_").split("_")[:2]).lower()
if "pull_request" in event_name:
channel = f"pr_{issue_number}"
# %% Get the actual version
latest_branch_version = Version("0.0.0")
latest_branch_tag = None
for tag in repo.git.tag(merged = True).splitlines():
if str(tag).startswith("firmware") or str(tag).startswith("master"):
continue # Quick-fix for the versioning scheme name of the embedded team in fdm_materials(_private) repo
try:
version = Version(tag)
except ConanException:
continue
if version > latest_branch_version and version < Version("6.0.0"):
# FIXME: stupid old Cura tags 13.04 etc. keep popping up, also the fdm_material tags for firmware are messing with this
latest_branch_version = version
latest_branch_tag = repo.tag(tag)
if latest_branch_tag:
# %% Get the actual version
no_commits = 0
for commit in repo.iter_commits("HEAD"):
if commit == latest_branch_tag.commit:
break
no_commits += 1
latest_branch_version_prerelease = latest_branch_version.pre
if latest_branch_version.pre and not "." in str(latest_branch_version.pre):
# The prerelease did not contain a version number, default it to 1
latest_branch_version_prerelease = f"{latest_branch_version.pre}.1"
if event_name == "pull_request":
actual_version = f"{latest_branch_version.major}.{latest_branch_version.minor}.{latest_branch_version.patch}-{str(latest_branch_version_prerelease).lower()}+{buildmetadata}pr_{issue_number}_{no_commits}"
channel_metadata = f"{channel}_{no_commits}"
else:
if channel in ("stable", "_", ""):
channel_metadata = f"{no_commits}"
else:
channel_metadata = f"{channel}_{no_commits}"
if is_release_branch:
if latest_branch_version.pre == "" and branch_version > latest_branch_version:
actual_version = f"{branch_version.major}.{branch_version.minor}.0-beta.1+{buildmetadata}{channel_metadata}"
elif latest_branch_version.pre == "":
# An actual full release has been created, we are working on patch
bump_up_patch = int(str(latest_branch_version.patch)) + 1
actual_version = f"{latest_branch_version.major}.{latest_branch_version.minor}.{bump_up_patch}-beta.1+{buildmetadata}{channel_metadata}"
else:
# A beta release has been created; we are working toward the next beta or full release
bump_up_release_tag = int(str(latest_branch_version.pre).split('.')[1]) + 1
actual_version = f"{latest_branch_version.major}.{latest_branch_version.minor}.{latest_branch_version.patch}-{str(latest_branch_version.pre).split('.')[0]}.{bump_up_release_tag}+{buildmetadata}{channel_metadata}"
else:
max_branches_version = Version("0.0.0")
branches_no_commits = no_commits
for branch in repo.references:
try:
if "remotes/origin" in branch.abspath:
b_version = Version(branch.name.split("/")[-1])
if b_version < Version("10.0.0") and b_version > max_branches_version:
max_branches_version = b_version
branches_no_commits = repo.commit().count() - branch.commit.count()
except:
pass
if max_branches_version > latest_branch_version:
actual_version = f"{max_branches_version.major}.{int(str(max_branches_version.minor)) + 1}.0-alpha+{buildmetadata}{channel}_{branches_no_commits}"
else:
actual_version = f"{latest_branch_version.major}.{int(str(latest_branch_version.minor)) + 1}.0-alpha+{buildmetadata}{channel_metadata}"
# %% Set the environment output
output_env = os.environ["GITHUB_OUTPUT"]
content = ""
if os.path.exists(output_env):
with open(output_env, "r") as f:
content = f.read()
with open(output_env, "w") as f:
f.write(content)
f.writelines(f"name={project_name}\n")
f.writelines(f"version={actual_version}\n")
f.writelines(f"channel={channel}\n")
f.writelines(f"recipe_id_full={project_name}/{actual_version}@{user}/{channel}\n")
f.writelines(f"recipe_id_latest={project_name}/latest@{user}/{channel}\n")
f.writelines(f"semver_full={actual_version}\n")
f.writelines(f"is_release_branch={str(is_release_branch).lower()}\n")
summary_env = os.environ["GITHUB_STEP_SUMMARY"]
with open(summary_env, "w") as f:
f.writelines(f"# {project_name}\n")
f.writelines(f"name={project_name}\n")
f.writelines(f"version={actual_version}\n")
f.writelines(f"channel={channel}\n")
f.writelines(f"recipe_id_full={project_name}/{actual_version}@{user}/{channel}\n")
f.writelines(f"recipe_id_latest={project_name}/latest@{user}/{channel}\n")
f.writelines(f"semver_full={actual_version}\n")
f.writelines(f"is_release_branch={str(is_release_branch).lower()}\n")
shell: python
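For reference, the version script above ends by composing the Conan recipe ids it writes to its outputs (recipe_id_full and recipe_id_latest). A minimal sketch of that composition, using hypothetical version, user and channel values:

```python
# Sketch of the recipe-id composition used in the script above.
# The version, user and channel values below are hypothetical.
project_name = "cura"
actual_version = "5.7.0-beta.1+testing_42"   # computed semver in the real script
user, channel = "ultimaker", "testing"

recipe_id_full = f"{project_name}/{actual_version}@{user}/{channel}"
recipe_id_latest = f"{project_name}/latest@{user}/{channel}"

print(recipe_id_full)    # cura/5.7.0-beta.1+testing_42@ultimaker/testing
print(recipe_id_latest)  # cura/latest@ultimaker/testing
```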


@@ -1,151 +0,0 @@
name: Cura All Installers
run-name: ${{ inputs.cura_conan_version }} for exe ${{ inputs.build_windows_exe }}, msi ${{ inputs.build_windows_msi }}, dmg ${{ inputs.build_macos }}, pkg ${{ inputs.build_macos_installer }}, appimage ${{ inputs.build_linux }} - enterprise ${{ inputs.enterprise }}
on:
workflow_dispatch:
inputs:
cura_conan_version:
description: 'Cura Conan Version'
default: 'cura/latest@ultimaker/testing'
required: true
type: string
conan_args:
description: 'Conan args: e.g.: --require-override'
default: ''
required: false
type: string
conan_config:
description: 'Conan config branch to use'
default: ''
required: false
type: string
enterprise:
description: 'Build Cura as an Enterprise edition'
default: false
required: true
type: boolean
staging:
description: 'Use staging API'
default: false
required: true
type: boolean
installer:
description: 'Create the installer'
default: true
required: true
type: boolean
build_windows_exe:
description: 'Build for Windows exe'
default: false
required: true
type: boolean
build_windows_msi:
description: 'Build for msi+pkg'
default: true
required: true
type: boolean
build_linux:
description: 'Build for Linux'
default: true
required: true
type: boolean
build_macos:
description: 'Build dmg for MacOS'
default: true
required: true
type: boolean
# Run the nightly at 3:25 UTC on working days
schedule:
- cron: '25 3 * * 1-5'
jobs:
windows-installer-create-exe:
if: ${{ inputs.build_windows_exe }}
uses: ./.github/workflows/cura-installer.yml
with:
platform: 'windows-2022'
os_name: 'win64'
cura_conan_version: ${{ inputs.cura_conan_version }}
conan_args: ${{ inputs.conan_args }}
conan_config: ${{ inputs.conan_config }}
enterprise: ${{ inputs.enterprise }}
staging: ${{ inputs.staging }}
installer: ${{ inputs.installer }}
msi_installer: false
secrets: inherit
windows-installer-create-msi:
if: ${{ inputs.build_windows_msi }}
uses: ./.github/workflows/cura-installer.yml
with:
platform: 'windows-2022'
os_name: 'win64'
cura_conan_version: ${{ inputs.cura_conan_version }}
conan_args: ${{ inputs.conan_args }}
conan_config: ${{ inputs.conan_config }}
enterprise: ${{ inputs.enterprise }}
staging: ${{ inputs.staging }}
installer: ${{ inputs.installer }}
msi_installer: true
secrets: inherit
linux-installer-create:
if: ${{ inputs.build_linux }}
uses: ./.github/workflows/cura-installer.yml
with:
platform: 'ubuntu-20.04'
os_name: 'linux'
cura_conan_version: ${{ inputs.cura_conan_version }}
conan_args: ${{ inputs.conan_args }}
conan_config: ${{ inputs.conan_config }}
enterprise: ${{ inputs.enterprise }}
staging: ${{ inputs.staging }}
installer: ${{ inputs.installer }}
msi_installer: false
secrets: inherit
linux-modern-installer-create:
if: ${{ inputs.build_linux }}
uses: ./.github/workflows/cura-installer.yml
with:
platform: 'ubuntu-22.04'
os_name: 'linux-modern'
cura_conan_version: ${{ inputs.cura_conan_version }}
conan_args: ${{ inputs.conan_args }}
conan_config: ${{ inputs.conan_config }}
enterprise: ${{ inputs.enterprise }}
staging: ${{ inputs.staging }}
installer: ${{ inputs.installer }}
msi_installer: false
secrets: inherit
macos-dmg-create:
if: ${{ inputs.build_macos }}
uses: ./.github/workflows/cura-installer.yml
with:
platform: 'macos-11'
os_name: 'mac'
cura_conan_version: ${{ inputs.cura_conan_version }}
conan_args: ${{ inputs.conan_args }}
conan_config: ${{ inputs.conan_config }}
enterprise: ${{ inputs.enterprise }}
staging: ${{ inputs.staging }}
installer: ${{ inputs.installer }}
msi_installer: false
secrets: inherit
macos-installer-create:
if: ${{ inputs.build_macos }}
uses: ./.github/workflows/cura-installer.yml
with:
platform: 'macos-11'
os_name: 'mac'
cura_conan_version: ${{ inputs.cura_conan_version }}
conan_args: ${{ inputs.conan_args }}
conan_config: ${{ inputs.conan_config }}
enterprise: ${{ inputs.enterprise }}
staging: ${{ inputs.staging }}
installer: ${{ inputs.installer }}
msi_installer: true
secrets: inherit


@@ -1,372 +0,0 @@
name: Cura Installer
run-name: ${{ inputs.cura_conan_version }} for ${{ inputs.platform }} by @${{ github.actor }}
on:
workflow_call:
inputs:
platform:
description: 'Selected Installer OS'
default: 'ubuntu-20.04'
required: true
type: string
os_name:
description: 'OS Friendly Name'
default: 'linux'
required: true
type: string
cura_conan_version:
description: 'Cura Conan Version'
default: 'cura/latest@ultimaker/testing'
required: true
type: string
conan_args:
description: 'Conan args: e.g.: --require-override'
default: ''
required: false
type: string
conan_config:
description: 'Conan config branch to use'
default: ''
required: false
type: string
enterprise:
description: 'Build Cura as an Enterprise edition'
default: false
required: true
type: boolean
staging:
description: 'Use staging API'
default: false
required: true
type: boolean
installer:
description: 'Create the installer'
default: true
required: true
type: boolean
msi_installer:
description: 'Create the msi'
default: false
required: true
type: boolean
env:
CONAN_LOGIN_USERNAME_CURA: ${{ secrets.CONAN_USER }}
CONAN_PASSWORD_CURA: ${{ secrets.CONAN_PASS }}
CONAN_LOGIN_USERNAME_CURA_CE: ${{ secrets.CONAN_USER }}
CONAN_PASSWORD_CURA_CE: ${{ secrets.CONAN_PASS }}
CONAN_LOG_RUN_TO_OUTPUT: 1
CONAN_LOGGING_LEVEL: ${{ inputs.conan_logging_level }}
CONAN_NON_INTERACTIVE: 1
CODESIGN_IDENTITY: ${{ secrets.CODESIGN_IDENTITY }}
MAC_NOTARIZE_USER: ${{ secrets.MAC_NOTARIZE_USER }}
MAC_NOTARIZE_PASS: ${{ secrets.MAC_NOTARIZE_PASS }}
MACOS_CERT_P12: ${{ secrets.MACOS_CERT_P12 }}
MACOS_CERT_INSTALLER_P12: ${{ secrets.MACOS_CERT_INSTALLER_P12 }}
MACOS_CERT_USER: ${{ secrets.MACOS_CERT_USER }}
GPG_PRIVATE_KEY: ${{ secrets.GPG_PRIVATE_KEY }}
MACOS_CERT_PASSPHRASE: ${{ secrets.MACOS_CERT_PASSPHRASE }}
WIN_CERT_INSTALLER_CER: ${{ secrets.WIN_CERT_INSTALLER_CER }}
WIN_CERT_INSTALLER_CER_PASS: ${{ secrets.WIN_CERT_INSTALLER_CER_PASS }}
CURA_CONAN_VERSION: ${{ inputs.cura_conan_version }}
ENTERPRISE: ${{ inputs.enterprise }}
STAGING: ${{ inputs.staging }}
jobs:
cura-installer-create:
runs-on: ${{ inputs.platform }}
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Setup Python and pip
uses: actions/setup-python@v4
with:
python-version: '3.10.x'
cache: 'pip'
cache-dependency-path: .github/workflows/requirements-conan-package.txt
- name: Install Python requirements for runner
run: pip install -r https://raw.githubusercontent.com/Ultimaker/Cura/main/.github/workflows/requirements-conan-package.txt
# Note the runner requirements are always installed from the main branch in the Ultimaker/Cura repo
- name: Use Conan download cache (Bash)
if: ${{ runner.os != 'Windows' }}
run: conan config set storage.download_cache="$HOME/.conan/conan_download_cache"
- name: Use Conan download cache (Powershell)
if: ${{ runner.os == 'Windows' }}
run: conan config set storage.download_cache="C:\Users\runneradmin\.conan\conan_download_cache"
- name: Cache Conan local repository packages (Bash)
uses: actions/cache@v3
if: ${{ runner.os != 'Windows' }}
with:
path: |
$HOME/.conan/data
$HOME/.conan/conan_download_cache
key: conan-${{ runner.os }}-${{ runner.arch }}-installer-cache
- name: Cache Conan local repository packages (Powershell)
uses: actions/cache@v3
if: ${{ runner.os == 'Windows' }}
with:
path: |
C:\Users\runneradmin\.conan\data
C:\.conan
C:\Users\runneradmin\.conan\conan_download_cache
key: conan-${{ runner.os }}-${{ runner.arch }}-installer-cache
- name: Install MacOS system requirements
if: ${{ runner.os == 'Macos' }}
run: brew install autoconf automake ninja create-dmg # Delete create-dmg when deprecating dmg
- name: Hack needed specifically for ubuntu-22.04 from mid-Feb 2023 onwards
if: ${{ runner.os == 'Linux' && startsWith(inputs.platform, 'ubuntu-22.04') }}
run: sudo apt remove libodbc2 libodbcinst2 unixodbc-common -y
# NOTE: Due to what are probably github issues, we have to remove the cache and reconfigure before the rest.
# This is maybe because grub caches the disk it uses last time, which is recreated each time.
- name: Install Linux system requirements
if: ${{ runner.os == 'Linux' }}
run: |
sudo rm /var/cache/debconf/config.dat
sudo dpkg --configure -a
sudo add-apt-repository ppa:ubuntu-toolchain-r/test -y
sudo apt update
sudo apt upgrade
sudo apt install build-essential checkinstall libegl-dev zlib1g-dev libssl-dev ninja-build autoconf libx11-dev libx11-xcb-dev libfontenc-dev libice-dev libsm-dev libxau-dev libxaw7-dev libxcomposite-dev libxcursor-dev libxdamage-dev libxdmcp-dev libxext-dev libxfixes-dev libxi-dev libxinerama-dev libxkbfile-dev libxmu-dev libxmuu-dev libxpm-dev libxrandr-dev libxrender-dev libxres-dev libxss-dev libxt-dev libxtst-dev libxv-dev libxvmc-dev libxxf86vm-dev xtrans-dev libxcb-render0-dev libxcb-render-util0-dev libxcb-xkb-dev libxcb-icccm4-dev libxcb-image0-dev libxcb-keysyms1-dev libxcb-randr0-dev libxcb-shape0-dev libxcb-sync-dev libxcb-xfixes0-dev libxcb-xinerama0-dev xkb-data libxcb-dri3-dev uuid-dev libxcb-util-dev libxkbcommon-x11-dev pkg-config -y
wget --no-check-certificate --quiet https://github.com/AppImage/AppImageKit/releases/download/continuous/appimagetool-x86_64.AppImage -O $GITHUB_WORKSPACE/appimagetool
chmod +x $GITHUB_WORKSPACE/appimagetool
echo "APPIMAGETOOL_LOCATION=$GITHUB_WORKSPACE/appimagetool" >> $GITHUB_ENV
- name: Install GCC-12 on ubuntu-22.04
if: ${{ startsWith(inputs.platform, 'ubuntu-22.04') }}
run: |
sudo apt install g++-12 gcc-12 -y
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-12 12
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-12 12
- name: Use GCC-10 on ubuntu-20.04
if: ${{ startsWith(inputs.platform, 'ubuntu-20.04') }}
run: |
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-10 10
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-10 10
- name: Create the default Conan profile
run: conan profile new default --detect
- name: Configure GPG Key Linux (Bash)
if: ${{ runner.os == 'Linux' }}
run: echo -n "$GPG_PRIVATE_KEY" | base64 --decode | gpg --import
- name: Configure Macos keychain Developer Cert(Bash)
id: macos-keychain-developer-cert
if: ${{ runner.os == 'Macos' }}
uses: apple-actions/import-codesign-certs@v1
with:
keychain-password: ${{ secrets.MACOS_KEYCHAIN_PASSWORD }}
p12-file-base64: ${{ secrets.MACOS_CERT_P12 }}
p12-password: ${{ secrets.MACOS_CERT_PASSPHRASE }}
- name: Configure Macos keychain Installer Cert (Bash)
id: macos-keychain-installer-cert
if: ${{ runner.os == 'Macos' }}
uses: apple-actions/import-codesign-certs@v1
with:
keychain-password: ${{ secrets.MACOS_KEYCHAIN_PASSWORD }}
create-keychain: false # keychain is created in previous use of action.
p12-file-base64: ${{ secrets.MACOS_CERT_INSTALLER_P12 }}
p12-password: ${{ secrets.MACOS_CERT_PASSPHRASE }}
- name: Create PFX certificate from BASE64_PFX_CONTENT secret
if: ${{ runner.os == 'Windows' }}
id: create-pfx
env:
PFX_CONTENT: ${{ secrets.WIN_CERT_INSTALLER_CER }}
run: |
$pfxPath = Join-Path -Path $env:RUNNER_TEMP -ChildPath "cert.pfx";
$encodedBytes = [System.Convert]::FromBase64String($env:PFX_CONTENT);
Set-Content $pfxPath -Value $encodedBytes -AsByteStream;
echo "PFX_PATH=$pfxPath" >> $env:GITHUB_OUTPUT;
- name: Get Conan configuration from branch
if: ${{ inputs.conan_config != '' }}
run: conan config install https://github.com/Ultimaker/conan-config.git -a "-b ${{ inputs.conan_config }}"
- name: Get Conan configuration
if: ${{ inputs.conan_config == '' }}
run: conan config install https://github.com/Ultimaker/conan-config.git
- name: Create the Packages (Bash)
if: ${{ runner.os != 'Windows' }}
run: conan install $CURA_CONAN_VERSION ${{ inputs.conan_args }} --build=missing --update -if cura_inst -g VirtualPythonEnv -o cura:enterprise=$ENTERPRISE -o cura:staging=$STAGING --json "cura_inst/conan_install_info.json"
- name: Create the Packages (Powershell)
if: ${{ runner.os == 'Windows' }}
run: conan install $Env:CURA_CONAN_VERSION ${{ inputs.conan_args }} --build=missing --update -if cura_inst -g VirtualPythonEnv -o cura:enterprise=$Env:ENTERPRISE -o cura:staging=$Env:STAGING --json "cura_inst/conan_install_info.json"
- name: Set Environment variables for Cura (bash)
if: ${{ runner.os != 'Windows' }}
run: |
. ./cura_inst/bin/activate_github_actions_env.sh
. ./cura_inst/bin/activate_github_actions_version_env.sh
- name: Set Environment variables for Cura (Powershell)
if: ${{ runner.os == 'Windows' }}
run: |
echo "${Env:WIX}\bin" | Out-File -FilePath $env:GITHUB_PATH -Encoding utf8 -Append
.\cura_inst\Scripts\activate_github_actions_env.ps1
.\cura_inst\Scripts\activate_github_actions_version_env.ps1
- name: Unlock Macos keychain (Bash)
if: ${{ runner.os == 'Macos' }}
run: security unlock -p $TEMP_KEYCHAIN_PASSWORD signing_temp.keychain
env:
TEMP_KEYCHAIN_PASSWORD: ${{ steps.macos-keychain-developer-cert.outputs.keychain-password }}
# FIXME: This is a workaround to ensure that we use and pack a shared library for OpenSSL 1.1.1l. We currently compile
# OpenSSL statically for CPython, but our Python dependencies (such as PyQt6) require a shared library.
# Because Conan won't (easily) allow building the same library with two different options, we need to install it explicitly
# and do a manual copy to the VirtualEnv, such that PyInstaller can find it.
- name: Install OpenSSL shared
run: conan install openssl/1.1.1l@_/_ --build=missing --update -o openssl:shared=True -g deploy
- name: Copy OpenSSL shared (Bash)
if: ${{ runner.os != 'Windows' }}
run: |
cp ./openssl/lib/*.so* ./cura_inst/bin/ || true
cp ./openssl/lib/*.dylib* ./cura_inst/bin/ || true
- name: Copy OpenSSL shared (Powershell)
if: ${{ runner.os == 'Windows' }}
run: |
cp openssl/bin/*.dll ./cura_inst/Scripts/
cp openssl/lib/*.lib ./cura_inst/Lib/
- name: Create the Cura dist
run: pyinstaller ./cura_inst/UltiMaker-Cura.spec
- name: Output the file name and extension
id: filename
shell: python
run: |
import os
enterprise = "-Enterprise" if "${{ inputs.enterprise }}" == "true" else ""
installer_filename = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-${{ inputs.os_name }}"
if "${{ runner.os }}" == "Windows":
installer_ext = "msi" if "${{ inputs.msi_installer }}" == "true" else "exe"
elif "${{ runner.os }}" == "macOS":
installer_ext = "pkg" if "${{ inputs.msi_installer }}" == "true" else "dmg"
else:
installer_ext = "AppImage"
output_env = os.environ["GITHUB_OUTPUT"]
content = ""
if os.path.exists(output_env):
with open(output_env, "r") as f:
content = f.read()
with open(output_env, "w") as f:
f.write(content)
f.writelines(f"INSTALLER_FILENAME={installer_filename}\n")
f.writelines(f"INSTALLER_EXT={installer_ext}\n")
f.writelines(f"FULL_INSTALLER_FILENAME={installer_filename}.{installer_ext}\n")
- name: Summarize the used Conan dependencies
shell: python
run: |
import os
import json
from pathlib import Path
conan_install_info_path = Path("cura_inst/conan_install_info.json")
conan_info = {"installed": []}
if os.path.exists(conan_install_info_path):
with open(conan_install_info_path, "r") as f:
conan_info = json.load(f)
sorted_deps = sorted([dep["recipe"]["id"].replace('#', r' rev: ') for dep in conan_info["installed"]])
summary_env = os.environ["GITHUB_STEP_SUMMARY"]
content = ""
if os.path.exists(summary_env):
with open(summary_env, "r") as f:
content = f.read()
with open(summary_env, "w") as f:
f.write(content)
f.writelines("# ${{ steps.filename.outputs.FULL_INSTALLER_FILENAME }} uses:\n")
for dep in sorted_deps:
f.writelines(f"`{dep}`\n")
- name: Archive the artifacts (bash)
if: ${{ !inputs.installer && runner.os != 'Windows' }}
run: tar -zcf "./${{ steps.filename.outputs.INSTALLER_FILENAME }}.tar.gz" "./UltiMaker-Cura/"
working-directory: dist
- name: Archive the artifacts (Powershell)
if: ${{ !inputs.installer && runner.os == 'Windows' }}
run: Compress-Archive -Path ".\UltiMaker-Cura" -DestinationPath ".\${{ steps.filename.outputs.INSTALLER_FILENAME }}.zip"
working-directory: dist
- name: Create the Windows exe installer (Powershell)
if: ${{ inputs.installer && runner.os == 'Windows' && !inputs.msi_installer }}
run: |
python ..\cura_inst\packaging\NSIS\create_windows_installer.py ../cura_inst . "${{ steps.filename.outputs.FULL_INSTALLER_FILENAME }}"
working-directory: dist
- name: Create the Windows msi installer (Powershell)
if: ${{ inputs.installer && runner.os == 'Windows' && inputs.msi_installer }}
run: |
python ..\cura_inst\packaging\msi\create_windows_msi.py ..\cura_inst .\UltiMaker-Cura "${{ steps.filename.outputs.FULL_INSTALLER_FILENAME }}" "$Env:CURA_APP_NAME"
working-directory: dist
- name: Sign the Windows exe installer (Powershell)
if: ${{ inputs.installer && runner.os == 'Windows' && !inputs.msi_installer }}
env:
PFX_PATH: ${{ steps.create-pfx.outputs.PFX_PATH }}
run: |
& "C:/Program Files (x86)/Windows Kits/10/bin/10.0.17763.0/x86/signtool.exe" sign /f $Env:PFX_PATH /p "$Env:WIN_CERT_INSTALLER_CER_PASS" /fd SHA256 /t http://timestamp.digicert.com "${{ steps.filename.outputs.FULL_INSTALLER_FILENAME }}"
working-directory: dist
- name: Sign the Windows msi installer (Powershell)
if: ${{ inputs.installer && runner.os == 'Windows' && inputs.msi_installer }}
env:
PFX_PATH: ${{ steps.create-pfx.outputs.PFX_PATH }}
run: |
& "C:/Program Files (x86)/Windows Kits/10/bin/10.0.17763.0/x86/signtool.exe" sign /f $Env:PFX_PATH /p "$Env:WIN_CERT_INSTALLER_CER_PASS" /fd SHA256 /t http://timestamp.digicert.com "${{ steps.filename.outputs.FULL_INSTALLER_FILENAME }}"
working-directory: dist
- name: Create the Linux AppImage (Bash)
if: ${{ inputs.installer && runner.os == 'Linux' }}
run: python ../cura_inst/packaging/AppImage/create_appimage.py ./UltiMaker-Cura $CURA_VERSION_FULL "${{ steps.filename.outputs.FULL_INSTALLER_FILENAME }}"
working-directory: dist
- name: Create the MacOS dmg and/or pkg (Bash)
if: ${{ github.event.inputs.installer == 'true' && runner.os == 'Macos' }}
run: python ../cura_inst/packaging/MacOS/build_macos.py ../cura_inst . $CURA_CONAN_VERSION "${{ steps.filename.outputs.FULL_INSTALLER_FILENAME }}" "$CURA_APP_NAME"
working-directory: dist
- name: Upload the artifacts
uses: actions/upload-artifact@v3
with:
name: ${{ steps.filename.outputs.INSTALLER_FILENAME }}-${{ steps.filename.outputs.INSTALLER_EXT }}
path: |
dist/*.tar.gz
dist/*.zip
dist/${{ steps.filename.outputs.FULL_INSTALLER_FILENAME }}
dist/*.asc
retention-days: 5
notify-export:
if: ${{ always() }}
needs: [ cura-installer-create ]
uses: ultimaker/cura/.github/workflows/notify.yml@main
with:
success: ${{ contains(join(needs.*.result, ','), 'success') }}
success_title: "Create the Cura distributions"
success_body: "Installers for ${{ inputs.cura_conan_version }}"
failure_title: "Failed to create the Cura distributions"
failure_body: "Failed to create at least 1 installer for ${{ inputs.cura_conan_version }}"
secrets: inherit
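The "Output the file name and extension" step above publishes its values by reading back and rewriting the whole $GITHUB_OUTPUT file. For comparison, a minimal sketch of the append-only pattern GitHub Actions documents for step outputs; the values written here are hypothetical:

```python
# Sketch: step outputs can simply be appended to the file $GITHUB_OUTPUT
# points at (this only runs inside a workflow step). Values are hypothetical.
import os

with open(os.environ["GITHUB_OUTPUT"], "a") as f:
    f.write("INSTALLER_FILENAME=UltiMaker-Cura-5.7.0-linux\n")
    f.write("INSTALLER_EXT=AppImage\n")
```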

.github/workflows/installers.yml (new file, 272 lines)

@@ -0,0 +1,272 @@
name: All installers
run-name: ${{ inputs.cura_conan_version }} by @${{ github.actor }}
on:
workflow_dispatch:
inputs:
cura_conan_version:
description: 'Cura Conan Version'
default: 'cura/latest@ultimaker/testing'
required: true
type: string
conan_args:
description: 'Conan args: e.g.: --require-override'
default: ''
required: false
type: string
enterprise:
description: 'Build Cura as an Enterprise edition'
default: false
required: true
type: boolean
staging:
description: 'Use staging API'
default: false
required: true
type: boolean
nightly:
description: 'Upload to nightly release'
default: false
required: true
type: boolean
schedule:
# Daily at 4:15 CET (main-branch) and 5:15 CET (release-branch)
- cron: '15 3 * * *'
- cron: '15 4 * * *'
env:
CONAN_ARGS: ${{ inputs.conan_args || '' }}
ENTERPRISE: ${{ inputs.enterprise || false }}
STAGING: ${{ inputs.staging || false }}
jobs:
default_values:
uses: ultimaker/cura-workflows/.github/workflows/cura-installer-default-value.yml@main
with:
cura_conan_version: ${{ inputs.cura_conan_version }}
latest_release: '5.6'
latest_release_schedule_hour: 4
latest_release_tag: 'nightly'
windows-installer:
uses: ultimaker/cura-workflows/.github/workflows/cura-installer-windows.yml@main
needs: [ default_values ]
with:
cura_conan_version: ${{ needs.default_values.outputs.cura_conan_version }}
conan_args: ${{ github.event.inputs.conan_args }}
enterprise: ${{ github.event.inputs.enterprise == 'true' }}
staging: ${{ github.event.inputs.staging == 'true' }}
architecture: X64
operating_system: self-hosted-Windows-X64
secrets: inherit
linux-installer:
uses: ultimaker/cura-workflows/.github/workflows/cura-installer-linux.yml@main
needs: [ default_values ]
with:
cura_conan_version: ${{ needs.default_values.outputs.cura_conan_version }}
conan_args: ${{ github.event.inputs.conan_args }}
enterprise: ${{ github.event.inputs.enterprise == 'true' }}
staging: ${{ github.event.inputs.staging == 'true' }}
architecture: X64
operating_system: ubuntu-22.04
secrets: inherit
macos-installer:
uses: ultimaker/cura-workflows/.github/workflows/cura-installer-macos.yml@main
needs: [ default_values ]
with:
cura_conan_version: ${{ needs.default_values.outputs.cura_conan_version }}
conan_args: ${{ github.event.inputs.conan_args }}
enterprise: ${{ github.event.inputs.enterprise == 'true' }}
staging: ${{ github.event.inputs.staging == 'true' }}
architecture: X64
operating_system: self-hosted-X64
secrets: inherit
macos-arm-installer:
uses: ultimaker/cura-workflows/.github/workflows/cura-installer-macos.yml@main
needs: [ default_values ]
with:
cura_conan_version: ${{ needs.default_values.outputs.cura_conan_version }}
conan_args: ${{ github.event.inputs.conan_args }}
enterprise: ${{ github.event.inputs.enterprise == 'true' }}
staging: ${{ github.event.inputs.staging == 'true' }}
architecture: ARM64
operating_system: self-hosted-ARM64
secrets: inherit
# Run and update nightly release when the nightly input is set to true or if the schedule is triggered
update-nightly-release:
if: ${{ inputs.nightly || github.event_name == 'schedule' }}
runs-on: ubuntu-latest
needs: [ default_values, windows-installer, linux-installer, macos-installer, macos-arm-installer ]
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 1
- name: Download the run info
uses: actions/download-artifact@v2
with:
name: linux-run-info
- name: Set the run info as environment variables
run: |
. run_info.sh
- name: Output the file name and extension
id: filename
shell: python
run: |
import os
import datetime
enterprise = "-Enterprise" if "${{ github.event.inputs.enterprise }}" == "true" else ""
linux = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-linux-X64"
mac_x64_dmg = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-macos-X64"
mac_x64_pkg = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-macos-X64"
mac_arm_dmg = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-macos-ARM64"
mac_arm_pkg = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-macos-ARM64"
win_msi = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-win64-X64"
win_exe = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-win64-X64"
nightly_name = "UltiMaker-Cura-" + os.getenv('CURA_VERSION_FULL').split("+")[0]
nightly_creation_time = str(datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S"))
output_env = os.environ["GITHUB_OUTPUT"]
content = ""
if os.path.exists(output_env):
with open(output_env, "r") as f:
content = f.read()
with open(output_env, "w") as f:
f.write(content)
f.writelines(f"LINUX={linux}\n")
f.writelines(f"MAC_X64_DMG={mac_x64_dmg}\n")
f.writelines(f"MAC_X64_PKG={mac_x64_pkg}\n")
f.writelines(f"MAC_ARM_DMG={mac_arm_dmg}\n")
f.writelines(f"MAC_ARM_PKG={mac_arm_pkg}\n")
f.writelines(f"WIN_MSI={win_msi}\n")
f.writelines(f"WIN_EXE={win_exe}\n")
f.writelines(f"NIGHTLY_NAME={nightly_name}\n")
f.writelines(f"NIGHTLY_TIME={nightly_creation_time}\n")
- name: Download linux installer jobs artifacts
uses: actions/download-artifact@v2
with:
name: ${{ steps.filename.outputs.LINUX }}-AppImage
path: installers
- name: Download linux installer jobs asc artifacts
uses: actions/download-artifact@v2
with:
name: ${{ steps.filename.outputs.LINUX }}-asc
path: installers
- name: Rename Linux installer to nightlies
run: |
mv installers/${{ steps.filename.outputs.LINUX }}.AppImage installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-linux-X64.AppImage
mv installers/${{ steps.filename.outputs.LINUX }}.AppImage.asc installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-linux-X64.AppImage.asc
- name: Update nightly release for Linux
run: |
gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-linux-X64.AppImage --clobber
gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-linux-X64.AppImage.asc --clobber
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Download win msi installer jobs artifacts
uses: actions/download-artifact@v2
with:
name: ${{ steps.filename.outputs.WIN_MSI }}-msi
path: installers
- name: Download win exe installer jobs artifacts
uses: actions/download-artifact@v2
with:
name: ${{ steps.filename.outputs.WIN_EXE }}-exe
path: installers
- name: Rename Windows installers to nightlies
run: |
mv installers/${{ steps.filename.outputs.WIN_MSI }}.msi installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-win64-X64.msi
mv installers/${{ steps.filename.outputs.WIN_EXE }}.exe installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-win64-X64.exe
- name: Update nightly release for Windows
run: |
gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-win64-X64.msi --clobber
gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-win64-X64.exe --clobber
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Download MacOS (X64) dmg installer jobs artifacts
uses: actions/download-artifact@v2
with:
name: ${{ steps.filename.outputs.MAC_X64_DMG }}-dmg
path: installers
- name: Download MacOS (X64) pkg installer jobs artifacts
uses: actions/download-artifact@v2
with:
name: ${{ steps.filename.outputs.MAC_X64_PKG }}-pkg
path: installers
- name: Rename MacOS (X64) installers to nightlies
run: |
mv installers/${{ steps.filename.outputs.MAC_X64_DMG }}.dmg installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-X64.dmg
mv installers/${{ steps.filename.outputs.MAC_X64_PKG }}.pkg installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-X64.pkg
- name: Update nightly release for MacOS (X64)
run: |
gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-X64.dmg --clobber
gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-X64.pkg --clobber
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Download MacOS (ARM-64) dmg installer jobs artifacts
uses: actions/download-artifact@v2
with:
name: ${{ steps.filename.outputs.MAC_ARM_DMG }}-dmg
path: installers
- name: Download MacOS (ARM-64) pkg installer jobs artifacts
uses: actions/download-artifact@v2
with:
name: ${{ steps.filename.outputs.MAC_ARM_PKG }}-pkg
path: installers
- name: Rename MacOS (ARM-64) installers to nightlies
run: |
mv installers/${{ steps.filename.outputs.MAC_ARM_DMG }}.dmg installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-ARM64.dmg
mv installers/${{ steps.filename.outputs.MAC_ARM_PKG }}.pkg installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-ARM64.pkg
- name: Update nightly release for MacOS (ARM-64)
run: |
gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-ARM64.dmg --clobber
gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-ARM64.pkg --clobber
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: create the release notes
shell: python
run: |
import os
import datetime
from jinja2 import Template
with open(".github/workflows/release_notes.md.jinja", "r") as f:
release_notes = Template(f.read())
current_nightly_beta = "${{ needs.default_values.outputs.release_tag }}".split("nightly-")[-1]
with open("release-notes.md", "w") as f:
f.write(release_notes.render(
timestamp="${{ steps.filename.outputs.NIGHTLY_TIME }}",
branch="" if "${{ needs.default-values.outputs.release_tag == 'nightly' }}" == 'true' else current_nightly_beta,
branch_specific="" if os.getenv("GITHUB_REF") == "refs/heads/main" else f"?branch={current_nightly_beta}",
))
- name: Update nightly release description (with date)
if: always()
run: |
gh release edit ${{ needs.default_values.outputs.release_tag }} --title "${{ steps.filename.outputs.NIGHTLY_NAME }}" --notes-file release-notes.md
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

.github/workflows/linux.yml vendored Normal file
View file

@ -0,0 +1,52 @@
name: Linux Installer
run-name: ${{ inputs.cura_conan_version }} by @${{ github.actor }}
on:
workflow_dispatch:
inputs:
cura_conan_version:
description: 'Cura Conan Version'
default: 'cura/latest@ultimaker/testing'
required: true
type: string
conan_args:
description: 'Conan args, e.g.: --require-override'
default: ''
required: false
type: string
enterprise:
description: 'Build Cura as an Enterprise edition'
default: false
required: true
type: boolean
staging:
description: 'Use staging API'
default: false
required: true
type: boolean
architecture:
description: 'Architecture'
required: true
default: 'X64'
type: choice
options:
- X64
operating_system:
description: 'OS'
required: true
default: 'ubuntu-22.04'
type: choice
options:
- ubuntu-22.04
jobs:
linux-installer:
uses: ultimaker/cura-workflows/.github/workflows/cura-installer-linux.yml@main
with:
cura_conan_version: ${{ inputs.cura_conan_version }}
conan_args: ${{ inputs.conan_args }}
enterprise: ${{ inputs.enterprise }}
staging: ${{ inputs.staging }}
architecture: ${{ inputs.architecture }}
operating_system: ${{ inputs.operating_system }}
secrets: inherit

.github/workflows/macos.yml vendored Normal file
View file

@ -0,0 +1,56 @@
name: MacOS Installer
run-name: ${{ inputs.cura_conan_version }} by @${{ github.actor }}
on:
workflow_dispatch:
inputs:
cura_conan_version:
description: 'Cura Conan Version'
default: 'cura/latest@ultimaker/testing'
required: true
type: string
conan_args:
description: 'Conan args, e.g.: --require-override'
default: ''
required: false
type: string
enterprise:
description: 'Build Cura as an Enterprise edition'
default: false
required: true
type: boolean
staging:
description: 'Use staging API'
default: false
required: true
type: boolean
architecture:
description: 'Architecture'
required: true
default: 'ARM64'
type: choice
options:
- X64
- ARM64
operating_system:
description: 'OS'
required: true
default: 'self-hosted-ARM64'
type: choice
options:
- self-hosted-X64
- self-hosted-ARM64
- macos-11
- macos-12
jobs:
macos-installer:
uses: ultimaker/cura-workflows/.github/workflows/cura-installer-macos.yml@main
with:
cura_conan_version: ${{ inputs.cura_conan_version }}
conan_args: ${{ inputs.conan_args }}
enterprise: ${{ inputs.enterprise }}
staging: ${{ inputs.staging }}
architecture: ${{ inputs.architecture }}
operating_system: ${{ inputs.operating_system }}
secrets: inherit

View file

@ -1,54 +0,0 @@
name: Get Conan Recipe Version
on:
workflow_call:
inputs:
success:
required: true
type: boolean
success_title:
required: true
type: string
success_body:
required: true
type: string
failure_title:
required: true
type: string
failure_body:
required: true
type: string
jobs:
slackNotification:
name: Slack Notification
runs-on: ubuntu-latest
steps:
- name: Slack notify on-success
if: ${{ inputs.success }}
uses: rtCamp/action-slack-notify@v2
env:
SLACK_USERNAME: ${{ github.repository }}
SLACK_COLOR: green
SLACK_ICON: https://github.com/Ultimaker/Cura/blob/main/icons/cura-128.png?raw=true
SLACK_TITLE: ${{ inputs.success_title }}
SLACK_MESSAGE: ${{ inputs.success_body }}
SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
- name: Slack notify on-failure
if: ${{ !inputs.success }}
uses: rtCamp/action-slack-notify@v2
env:
SLACK_USERNAME: ${{ github.repository }}
SLACK_COLOR: red
SLACK_ICON: https://github.com/Ultimaker/Cura/blob/main/icons/cura-128.png?raw=true
SLACK_TITLE: ${{ inputs.failure_title }}
SLACK_MESSAGE: ${{ inputs.failure_body }}
SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}

View file

@ -18,6 +18,7 @@ jobs:
- uses: technote-space/get-diff-action@v6
with:
DIFF_FILTER: AMRCD
PATTERNS: |
resources/+(extruders|definitions)/*.def.json
resources/+(intent|quality|variants)/**/*.inst.cfg
@ -41,6 +42,10 @@ jobs:
if: env.GIT_DIFF && !env.MATCHED_FILES
run: python printer-linter/src/terminal.py --diagnose --report printer-linter-result/fixes.yml ${{ env.GIT_DIFF_FILTERED }}
- name: Check Deleted File(s)
if: env.GIT_DIFF
run: python printer-linter/src/terminal.py --deleted --report printer-linter-result/comment.md ${{ env.GIT_DIFF_FILTERED }}
- name: Save PR metadata
run: |
echo ${{ github.event.number }} > printer-linter-result/pr-id.txt

View file

@ -39,6 +39,11 @@ jobs:
echo "pr_id=$(cat printer-linter-result/pr-id.txt)" >> $GITHUB_ENV
echo "pr_head_repo=$(cat printer-linter-result/pr-head-repo.txt)" >> $GITHUB_ENV
echo "pr_head_ref=$(cat printer-linter-result/pr-head-ref.txt)" >> $GITHUB_ENV
if [[ -f "printer-linter-result/comment.md" ]]; then
echo "commentFileExists=true" >> $GITHUB_ENV
else
echo "commentFileExists=false" >> $GITHUB_ENV
fi
- uses: actions/checkout@v3
with:
@ -72,6 +77,13 @@ jobs:
mkdir printer-linter-result
unzip printer-linter-result.zip -d printer-linter-result
- name: Run PR Comments
if: env.commentFileExists == 'true'
uses: peter-evans/create-or-update-comment@v4
with:
issue-number: ${{ env.pr_id }}
body-path: 'printer-linter-result/comment.md'
- name: Run clang-tidy-pr-comments action
uses: platisd/clang-tidy-pr-comments@bc0bb7da034a8317d54e7fe1e819159002f4cc40
with:

View file

@ -1,15 +1,10 @@
name: process-pull-request
on:
pull_request_target:
types: [opened, reopened, edited, synchronize, review_requested, ready_for_review, assigned]
pull_request_target:
types: [ opened, reopened, edited, review_requested, ready_for_review, assigned ]
jobs:
add_label:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions-ecosystem/action-add-labels@v1
if: ${{ github.event.pull_request.head.repo.full_name != github.repository }}
with:
labels: 'PR: Community Contribution :crown:'
add_label:
uses: ultimaker/cura-workflows/.github/workflows/process-pull-request.yml@main
secrets: inherit

View file

@ -0,0 +1,39 @@
# Nightlies
> :clock12: Created at: {{ timestamp }}
| | |
|--------------:|--------------------------------------------------------------------------------------------|
| **Nightlies** | [![nightly {{ branch }}](https://github.com/Ultimaker/Cura/actions/workflows/installers.yml/badge.svg{{ branch_specific }}?event=schedule)](https://github.com/Ultimaker/Cura/actions/workflows/installers.yml) |
# Unit Test results
| | |
|-------------------------------:|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| **Cura {{ branch }}** | [![unit-test](https://github.com/Ultimaker/Cura/actions/workflows/unit-test.yml/badge.svg{{ branch_specific }})](https://github.com/Ultimaker/Cura/actions/workflows/unit-test.yml) |
| **CuraEngine {{ branch }}** | [![unit-test](https://github.com/Ultimaker/CuraEngine/actions/workflows/unit-test.yml/badge.svg{{ branch_specific }})](https://github.com/Ultimaker/CuraEngine/actions/workflows/unit-test.yml) |
| **Uranium {{ branch }}** | [![unit-test](https://github.com/Ultimaker/Uranium/actions/workflows/unit-test.yml/badge.svg{{ branch_specific }})](https://github.com/Ultimaker/Uranium/actions/workflows/unit-test.yml) |
| **CuraEngine GradualFlow 0.1** | [![unit-test](https://github.com/Ultimaker/CuraEngine_plugin_gradual_flow/actions/workflows/unit-test.yml/badge.svg?branch=0.1)](https://github.com/Ultimaker/CuraEngine_plugin_gradual_flow/actions/workflows/unit-test.yml) |
| **synsepalum-dulcificum 0.1** | [![unit-test](https://github.com/Ultimaker/synsepalum-dulcificum/actions/workflows/unit-test.yml/badge.svg?branch=0.1)](https://github.com/Ultimaker/synsepalum-dulcificum/actions/workflows/unit-test.yml) |
| **libSavitar** | [![unit-test](https://github.com/Ultimaker/libSavitar/actions/workflows/unit-test.yml/badge.svg)](https://github.com/Ultimaker/libSavitar/actions/workflows/unit-test.yml) |
| **libnest2d** | [![unit-test](https://github.com/Ultimaker/libnest2d/actions/workflows/unit-test.yml/badge.svg)](https://github.com/Ultimaker/libnest2d/actions/workflows/unit-test.yml) |
# Conan packages
| | |
|------------------------------------:|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| **Cura {{ branch }}** | [![conan-package](https://github.com/Ultimaker/Cura/actions/workflows/conan-package.yml/badge.svg{{ branch_specific }})](https://github.com/Ultimaker/Cura/actions/workflows/conan-package.yml) |
| **CuraEngine {{ branch }}** | [![conan-package](https://github.com/Ultimaker/CuraEngine/actions/workflows/conan-package.yml/badge.svg{{ branch_specific }})](https://github.com/Ultimaker/CuraEngine/actions/workflows/conan-package.yml) |
| **Uranium {{ branch }}** | [![conan-package](https://github.com/Ultimaker/Uranium/actions/workflows/conan-package.yml/badge.svg{{ branch_specific }})](https://github.com/Ultimaker/Uranium/actions/workflows/conan-package.yml) |
| **fdm_materials {{ branch }}** | [![conan-package](https://github.com/Ultimaker/fdm_materials/actions/workflows/conan-package.yml/badge.svg{{ branch_specific }})](https://github.com/Ultimaker/fdm_materials/actions/workflows/conan-package.yml) |
| **cura-binary-data {{ branch }}** | [![conan-package](https://github.com/Ultimaker/cura-binary-data/actions/workflows/conan-package.yml/badge.svg{{ branch_specific }})](https://github.com/Ultimaker/cura-binary-data/actions/workflows/conan-package.yml) |
| **CuraEngine GradualFlow 0.1** | [![conan-package](https://github.com/Ultimaker/CuraEngine_plugin_gradual_flow/actions/workflows/conan-package.yml/badge.svg?branch=0.1)](https://github.com/Ultimaker/CuraEngine_plugin_gradual_flow/actions/workflows/conan-package.yml) |
| **synsepalum-dulcificum 0.1** | [![conan-package](https://github.com/Ultimaker/synsepalum-dulcificum/actions/workflows/conan-package.yml/badge.svg?branch=0.1)](https://github.com/Ultimaker/synsepalum-dulcificum/actions/workflows/conan-package.yml) |
| **CuraEngine gRPC definitions 0.1** | [![conan-package](https://github.com/Ultimaker/CuraEngine_grpc_definitions/actions/workflows/conan-package.yml/badge.svg?branch=0.1)](https://github.com/Ultimaker/CuraEngine_grpc_definitions/actions/workflows/conan-package.yml) |
| **libArcus** | [![conan-package](https://github.com/Ultimaker/libArcus/actions/workflows/conan-package.yml/badge.svg)](https://github.com/Ultimaker/libArcus/actions/workflows/conan-package.yml) |
| **pyArcus** | [![conan-package](https://github.com/Ultimaker/pyArcus/actions/workflows/conan-package.yml/badge.svg)](https://github.com/Ultimaker/pyArcus/actions/workflows/conan-package.yml) |
| **libSavitar** | [![conan-package](https://github.com/Ultimaker/libSavitar/actions/workflows/conan-package.yml/badge.svg)](https://github.com/Ultimaker/libSavitar/actions/workflows/conan-package.yml) |
| **pySavitar** | [![conan-package](https://github.com/Ultimaker/pySavitar/actions/workflows/conan-package.yml/badge.svg)](https://github.com/Ultimaker/pySavitar/actions/workflows/conan-package.yml) |
| **libnest2d** | [![conan-package](https://github.com/Ultimaker/libnest2d/actions/workflows/conan-package.yml/badge.svg)](https://github.com/Ultimaker/libnest2d/actions/workflows/conan-package.yml) |
| **pynest2d** | [![conan-package](https://github.com/Ultimaker/pynest2d/actions/workflows/conan-package.yml/badge.svg)](https://github.com/Ultimaker/pynest2d/actions/workflows/conan-package.yml) |

View file

@ -1,2 +1,2 @@
conan==1.56.0
sip
conan>=1.60.2,<2.0.0
sip<=6.7.12

View file

.github/workflows/security_badge.yml vendored Normal file
View file

@ -0,0 +1,62 @@
# NOTE: Best to keep all of these remarks in, they might prove useful in the future.
# This is basically just the standard one that is suggested on 'new workflow'.
name: Scorecard supply-chain security
on:
# For Branch-Protection check. Only the default branch is supported. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection
branch_protection_rule:
# To guarantee Maintained check is occasionally updated. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#maintained
schedule:
- cron: '25 2 * * 5'
push:
branches: [ "main" ]
# Declare default permissions as read only.
permissions: read-all
jobs:
analysis:
name: Scorecard analysis
runs-on: ubuntu-latest
permissions:
# Needed for Code scanning upload
security-events: write
# Needed for GitHub OIDC token if publish_results is true
id-token: write
steps:
- name: "Checkout code"
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
with:
persist-credentials: false
- name: "Run analysis"
uses: ossf/scorecard-action@0864cf19026789058feabb7e87baa5f140aac736 # v2.3.1
with:
results_file: results.sarif
results_format: sarif
# Scorecard team runs a weekly scan of public GitHub repos,
# see https://github.com/ossf/scorecard#public-data.
# Setting `publish_results: true` helps us scale by leveraging your workflow to
# extract the results instead of relying on our own infrastructure to run scans.
# And it's free for you!
publish_results: true
# Upload the results as artifacts (optional). Commenting out will disable
# uploads of run results in SARIF format to the repository Actions tab.
# https://docs.github.com/en/actions/advanced-guides/storing-workflow-data-as-artifacts
- name: "Upload artifact"
uses: actions/upload-artifact@5d5d22a31266ced268874388b861e4b58bb5c2f3 # v4.3.1
with:
name: SARIF file
path: results.sarif
retention-days: 5
# Upload the results to GitHub's code scanning dashboard (optional).
# Commenting out will disable upload of results to your repo's Code Scanning dashboard
- name: "Upload to code-scanning"
uses: github/codeql-action/upload-sarif@83a02f7883b12e0e4e1a146174f5e2292a01e601 # v2.16.4
with:
sarif_file: results.sarif

.github/workflows/stale.yml vendored Normal file
View file

@ -0,0 +1,37 @@
name: 'Close stale issues and PRs'
on:
schedule:
- cron: '0 9-17/4 * * *'
jobs:
stale:
runs-on: ubuntu-latest
steps:
- uses: actions/stale@v8
with:
days-before-pr-close: -1
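# -1 disables closing pull requests; only stale issues are auto-closed by this workflow.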
days-before-stale: 365
days-before-close: 14
operations-per-run: 3000
ascending: true
exempt-issue-labels: 'Status: Triage,Developer Environment :computer:,Status: On Backlog,PR: Community Contribution :crown:,PR: Printer Definitions :factory:,PR: Translations :books:'
stale-issue-label: 'Status: Stale :hourglass:'
labels-to-add-when-unstale: 'Status: Triage'
only-labels: "Type: New Feature,Status: Deferred"
stale-issue-message: |
Hi 👋,
We are cleaning our list of issues to improve our focus.
This feature request seems to be older than a year, which is at least three major Cura releases ago.
It also received the label Deferred, indicating that we did not have time to work on it back then and have not found time to work on it since.
If this is still something that you think can improve how you and others use Cura, can you please leave a comment?
We will have a fresh set of eyes to look at it.
If it has been resolved or you no longer need it, you don't have to do anything, and this issue will be automatically closed in 14 days.
close-issue-message: |
This issue was closed because it has been inactive for 14 days since being marked as stale.
If you still have a need for this, you are welcome to open a new issue with an updated description.
permissions:
contents: write # only for delete-branch option
issues: write
pull-requests: write

View file

@ -1,82 +1,14 @@
name: unit-test-post
on:
workflow_run:
workflows: [ "unit-test" ]
types: [ completed ]
workflow_run:
workflows: [ "unit-test" ]
types: [ completed ]
jobs:
publish-test-results:
if: ${{ github.event.workflow_run.event == 'pull_request' && github.event.workflow_run.conclusion == 'success' }}
runs-on: ubuntu-latest
steps:
- name: Download analysis results
uses: actions/github-script@v3.1.0
with:
script: |
let artifacts = await github.actions.listWorkflowRunArtifacts({
owner: context.repo.owner,
repo: context.repo.repo,
run_id: ${{github.event.workflow_run.id }},
});
let matchArtifact = artifacts.data.artifacts.filter((artifact) => {
return artifact.name == "test-result"
})[0];
let download = await github.actions.downloadArtifact({
owner: context.repo.owner,
repo: context.repo.repo,
artifact_id: matchArtifact.id,
archive_format: "zip",
});
let fs = require("fs");
fs.writeFileSync("${{github.workspace}}/test-result.zip", Buffer.from(download.data));
- name: Set environment variables
run: |
mkdir pr_env
unzip test-result.zip -d pr_env
echo "pr_id=$(cat pr_env/pr-id.txt)" >> $GITHUB_ENV
echo "pr_head_repo=$(cat pr_env/pr-head-repo.txt)" >> $GITHUB_ENV
echo "pr_head_ref=$(cat pr_env/pr-head-ref.txt)" >> $GITHUB_ENV
- uses: actions/checkout@v3
with:
repository: ${{ env.pr_head_repo }}
ref: ${{ env.pr_head_ref }}
persist-credentials: false
- name: Redownload analysis results
uses: actions/github-script@v3.1.0
with:
script: |
let artifacts = await github.actions.listWorkflowRunArtifacts({
owner: context.repo.owner,
repo: context.repo.repo,
run_id: ${{github.event.workflow_run.id }},
});
let matchArtifact = artifacts.data.artifacts.filter((artifact) => {
return artifact.name == "test-result"
})[0];
let download = await github.actions.downloadArtifact({
owner: context.repo.owner,
repo: context.repo.repo,
artifact_id: matchArtifact.id,
archive_format: "zip",
});
let fs = require("fs");
fs.writeFileSync("${{github.workspace}}/test-result.zip", Buffer.from(download.data));
- name: Extract analysis results
run: |
mkdir -p tests
unzip test-result.zip -d tests
- name: Publish Unit Test Results
id: test-results
uses: EnricoMi/publish-unit-test-result-action@v1
with:
files: "tests/**/*.xml"
- name: Conclusion
run: echo "Conclusion is ${{ fromJSON( steps.test-results.outputs.json ).conclusion }}"
publish-test-results:
uses: ultimaker/cura-workflows/.github/workflows/unit-test-post.yml@main
with:
event: ${{ github.event.workflow_run.event }}
conclusion: ${{ github.event.workflow_run.conclusion }}
secrets: inherit

View file

@ -1,161 +1,62 @@
---
name: unit-test
on:
push:
paths:
- 'plugins/**'
- 'resources/**'
- 'cura/**'
- 'icons/**'
- 'tests/**'
- 'packaging/**'
- '.github/workflows/conan-*.yml'
- '.github/workflows/unit-test.yml'
- '.github/workflows/notify.yml'
- '.github/workflows/requirements-conan-package.txt'
- 'requirements*.txt'
- 'conanfile.py'
- 'conandata.yml'
- 'GitVersion.yml'
- '*.jinja'
branches:
- main
- 'CURA-*'
- '[1-9]+.[0-9]+'
tags:
- '[0-9]+.[0-9]+.[0-9]+'
- '[0-9]+.[0-9]+-beta'
pull_request:
paths:
- 'plugins/**'
- 'resources/**'
- 'cura/**'
- 'icons/**'
- 'tests/**'
- 'packaging/**'
- '.github/workflows/conan-*.yml'
- '.github/workflows/unit-test.yml'
- '.github/workflows/notify.yml'
- '.github/workflows/requirements-conan-package.txt'
- 'requirements*.txt'
- 'conanfile.py'
- 'conandata.yml'
- 'GitVersion.yml'
- '*.jinja'
branches:
- main
- '[1-9]+.[0-9]+'
tags:
- '[0-9]+.[0-9]+.[0-9]+'
- '[0-9]+.[0-9]+-beta'
push:
paths:
- 'plugins/**'
- 'resources/**'
- 'cura/**'
- 'icons/**'
- 'tests/**'
- '.github/workflows/unit-test.yml'
- '.github/workflows/requirements-runner.txt'
- 'requirements*.txt'
- 'conanfile.py'
- 'conandata.yml'
- '*.jinja'
branches:
- main
- 'CURA-*'
- 'PP-*'
- '[0-9]+.[0-9]+'
env:
CONAN_LOGIN_USERNAME_CURA: ${{ secrets.CONAN_USER }}
CONAN_PASSWORD_CURA: ${{ secrets.CONAN_PASS }}
CONAN_LOGIN_USERNAME_CURA_CE: ${{ secrets.CONAN_USER }}
CONAN_PASSWORD_CURA_CE: ${{ secrets.CONAN_PASS }}
CONAN_LOG_RUN_TO_OUTPUT: 1
CONAN_LOGGING_LEVEL: info
CONAN_NON_INTERACTIVE: 1
pull_request:
paths:
- 'plugins/**'
- 'resources/**'
- 'cura/**'
- 'icons/**'
- 'tests/**'
- '.github/workflows/unit-test.yml'
- '.github/workflows/requirements-runner.txt'
- 'requirements*.txt'
- 'conanfile.py'
- 'conandata.yml'
- '*.jinja'
branches:
- main
- '[0-9]+.[0-9]+'
permissions:
contents: read
env:
CONAN_LOGIN_USERNAME: ${{ secrets.CONAN_USER }}
CONAN_PASSWORD: ${{ secrets.CONAN_PASS }}
jobs:
conan-recipe-version:
uses: ultimaker/cura/.github/workflows/conan-recipe-version.yml@main
with:
project_name: cura
conan-recipe-version:
uses: ultimaker/cura-workflows/.github/workflows/conan-recipe-version.yml@main
with:
project_name: cura
testing:
runs-on: ubuntu-20.04
needs: [ conan-recipe-version ]
steps:
- name: Checkout
uses: actions/checkout@v3
with:
fetch-depth: 2
- name: Setup Python and pip
uses: actions/setup-python@v4
with:
python-version: '3.10.x'
architecture: 'x64'
cache: 'pip'
cache-dependency-path: .github/workflows/requirements-conan-package.txt
- name: Install Python requirements and Create default Conan profile
run: |
pip install -r requirements-conan-package.txt
conan profile new default --detect
working-directory: .github/workflows/
- name: Use Conan download cache (Bash)
if: ${{ runner.os != 'Windows' }}
run: conan config set storage.download_cache="$HOME/.conan/conan_download_cache"
- name: Cache Conan local repository packages (Bash)
uses: actions/cache@v3
if: ${{ runner.os != 'Windows' }}
with:
path: |
$HOME/.conan/data
$HOME/.conan/conan_download_cache
key: conan-${{ runner.os }}-${{ runner.arch }}-unit-cache
# NOTE: Due to what are probably github issues, we have to remove the cache and reconfigure before the rest.
# This is maybe because grub caches the disk it uses last time, which is recreated each time.
- name: Install Linux system requirements
if: ${{ runner.os == 'Linux' }}
run: |
sudo rm /var/cache/debconf/config.dat
sudo dpkg --configure -a
sudo apt update
sudo apt upgrade
sudo apt install build-essential checkinstall libegl-dev zlib1g-dev libssl-dev ninja-build autoconf libx11-dev libx11-xcb-dev libfontenc-dev libice-dev libsm-dev libxau-dev libxaw7-dev libxcomposite-dev libxcursor-dev libxdamage-dev libxdmcp-dev libxext-dev libxfixes-dev libxi-dev libxinerama-dev libxkbfile-dev libxmu-dev libxmuu-dev libxpm-dev libxrandr-dev libxrender-dev libxres-dev libxss-dev libxt-dev libxtst-dev libxv-dev libxvmc-dev libxxf86vm-dev xtrans-dev libxcb-render0-dev libxcb-render-util0-dev libxcb-xkb-dev libxcb-icccm4-dev libxcb-image0-dev libxcb-keysyms1-dev libxcb-randr0-dev libxcb-shape0-dev libxcb-sync-dev libxcb-xfixes0-dev libxcb-xinerama0-dev xkb-data libxcb-dri3-dev uuid-dev libxcb-util-dev libxkbcommon-x11-dev pkg-config -y
- name: Use GCC-10 on ubuntu-20.04
run: |
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-10 10
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-10 10
- name: Get Conan configuration
run: conan config install https://github.com/Ultimaker/conan-config.git
- name: Install dependencies
run: conan install . ${{ needs.conan-recipe-version.outputs.recipe_id_full }} --build=missing --update -o cura:devtools=True -g VirtualPythonEnv -if venv
- name: Upload the Dependency package(s)
run: conan upload "*" -r cura --all -c
- name: Set Environment variables for Cura (bash)
if: ${{ runner.os != 'Windows' }}
run: |
. ./venv/bin/activate_github_actions_env.sh
- name: Run Unit Test
id: run-test
run: |
pytest --junitxml=junit_cura.xml
working-directory: tests
- name: Save PR metadata
if: always()
run: |
echo ${{ github.event.number }} > pr-id.txt
echo ${{ github.event.pull_request.head.repo.full_name }} > pr-head-repo.txt
echo ${{ github.event.pull_request.head.ref }} > pr-head-ref.txt
working-directory: tests
- name: Upload Test Results
if: always()
uses: actions/upload-artifact@v3
with:
name: test-result
path: |
tests/**/*.xml
tests/pr-id.txt
tests/pr-head-repo.txt
tests/pr-head-ref.txt
testing:
uses: ultimaker/cura-workflows/.github/workflows/unit-test.yml@main
needs: [ conan-recipe-version ]
with:
recipe_id_full: ${{ needs.conan-recipe-version.outputs.recipe_id_full }}
conan_extra_args: '-g VirtualPythonEnv -o cura:devtools=True -c tools.build:skip_test=False --options "*:enable_sentry=False"'
unit_test_cmd: 'pytest --junitxml=junit_cura.xml'
unit_test_dir: 'tests'
conan_generator_dir: './venv/bin'
secrets: inherit

View file

@ -0,0 +1,87 @@
name: update-translations
on:
push:
paths:
- 'plugins/**'
- 'resources/**'
- 'cura/**'
- 'icons/**'
- 'tests/**'
- 'packaging/**'
- '.github/workflows/conan-*.yml'
- '.github/workflows/notify.yml'
- '.github/workflows/requirements-conan-package.txt'
- 'requirements*.txt'
- 'conanfile.py'
- 'conandata.yml'
- 'GitVersion.yml'
- '*.jinja'
branches:
- '[1-9].[0-9]'
- '[1-9].[0-9][0-9]'
tags:
- '[1-9].[0-9].[0-9]*'
- '[1-9].[0-9].[0-9]'
- '[1-9].[0-9][0-9].[0-9]*'
jobs:
update-translations:
name: Update translations
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Cache Conan data
id: cache-conan
uses: actions/cache@v3
with:
path: ~/.conan
key: ${{ runner.os }}-conan
- name: Setup Python and pip
uses: actions/setup-python@v4
with:
python-version: 3.11.x
cache: pip
cache-dependency-path: .github/workflows/requirements-conan-package.txt
- name: Install Python requirements for runner
run: pip install -r .github/workflows/requirements-conan-package.txt
# NOTE: Due to what are probably github issues, we have to remove the cache and reconfigure before the rest.
# This is maybe because grub caches the disk it uses last time, which is recreated each time.
- name: Install Linux system requirements
if: ${{ runner.os == 'Linux' }}
run: |
sudo rm /var/cache/debconf/config.dat
sudo dpkg --configure -a
sudo add-apt-repository ppa:ubuntu-toolchain-r/test -y
sudo apt update
sudo apt upgrade
sudo apt install efibootmgr build-essential checkinstall libegl-dev zlib1g-dev libssl-dev ninja-build autoconf libx11-dev libx11-xcb-dev libfontenc-dev libice-dev libsm-dev libxau-dev libxaw7-dev libxcomposite-dev libxcursor-dev libxdamage-dev libxdmcp-dev libxext-dev libxfixes-dev libxi-dev libxinerama-dev libxkbfile-dev libxmu-dev libxmuu-dev libxpm-dev libxrandr-dev libxrender-dev libxres-dev libxss-dev libxt-dev libxtst-dev libxv-dev libxvmc-dev libxxf86vm-dev xtrans-dev libxcb-render0-dev libxcb-render-util0-dev libxcb-xkb-dev libxcb-icccm4-dev libxcb-image0-dev libxcb-keysyms1-dev libxcb-randr0-dev libxcb-shape0-dev libxcb-sync-dev libxcb-xfixes0-dev libxcb-xinerama0-dev xkb-data libxcb-dri3-dev uuid-dev libxcb-util-dev libxkbcommon-x11-dev pkg-config flex bison g++-12 gcc-12 -y
- name: Install GCC-13
run: |
sudo apt install g++-13 gcc-13 -y
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-13 13
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-13 13
- name: Create the default Conan profile
run: conan profile new default --detect --force
- name: Get Conan configuration
run: |
conan config install https://github.com/Ultimaker/conan-config.git
conan config install https://github.com/Ultimaker/conan-config.git -a "-b runner/${{ runner.os }}/${{ runner.arch }}"
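# This conan install regenerates the i18n .po/.pot files, which the auto-commit step below then commits.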
- name: Generate the files using Conan install
run: conan install . --build=missing --update -o cura:devtools=True
- uses: stefanzweifel/git-auto-commit-action@v4
with:
file_pattern: resources/i18n/*.po resources/i18n/*.pot
status_options: --untracked-files=no
commit_message: update translations

.github/workflows/windows.yml vendored Normal file
View file

@ -0,0 +1,53 @@
name: Windows Installer
run-name: ${{ inputs.cura_conan_version }} by @${{ github.actor }}
on:
workflow_dispatch:
inputs:
cura_conan_version:
description: 'Cura Conan Version'
default: 'cura/latest@ultimaker/testing'
required: true
type: string
conan_args:
description: 'Conan args, e.g.: --require-override'
default: ''
required: false
type: string
enterprise:
description: 'Build Cura as an Enterprise edition'
default: false
required: true
type: boolean
staging:
description: 'Use staging API'
default: false
required: true
type: boolean
architecture:
description: 'Architecture'
required: true
default: 'X64'
type: choice
options:
- X64
operating_system:
description: 'OS'
required: true
default: 'self-hosted-Windows-X64'
type: choice
options:
- self-hosted-Windows-X64
- windows-2022
jobs:
windows-installer:
uses: ultimaker/cura-workflows/.github/workflows/cura-installer-windows.yml@main
with:
cura_conan_version: ${{ inputs.cura_conan_version }}
conan_args: ${{ inputs.conan_args }}
enterprise: ${{ inputs.enterprise }}
staging: ${{ inputs.staging }}
architecture: ${{ inputs.architecture }}
operating_system: ${{ inputs.operating_system }}
secrets: inherit

.gitignore vendored
View file

@ -31,6 +31,7 @@ LC_MESSAGES
.directory
.idea
cura.desktop
*.bak
# Eclipse+PyDev
.project
@ -100,3 +101,7 @@ graph_info.json
Ultimaker-Cura.spec
.run/
/printer-linter/src/printerlinter.egg-info/
/plugins/CuraEngineGradualFlow
/resources/bundled_packages/bundled_*.json
curaengine_plugin_gradual_flow
curaengine_plugin_gradual_flow.exe

View file

@ -2,6 +2,9 @@ checks:
diagnostic-mesh-file-extension: true
diagnostic-mesh-file-size: true
diagnostic-definition-redundant-override: true
diagnostic-resources-macos-app-directory-name: true
diagnostic-resource-file-deleted: true
diagnostic-material-temperature-defined: true
fixes:
diagnostic-definition-redundant-override: true
format:

View file

@ -18,8 +18,8 @@ url: "https://ultimaker.com/software/ultimaker-cura"
repository-code: "https://github.com/Ultimaker/Cura"
license: LGPL-3.0
license-url: "https://github.com/Ultimaker/Cura/blob/main/LICENSE"
version: 5.2.1
date-released: "2022-10-19"
version: 5.4.0
date-released: "2023-07-04"
keywords:
- Ultimaker
- Cura

View file

@ -1,6 +1,6 @@
Submitting bug reports
----------------------
Please submit bug reports for all of Cura and CuraEngine to the [Cura repository](https://github.com/Ultimaker/Cura/issues). There will be a template there to fill in. Depending on the type of issue, we will usually ask for the [Cura log](Logging Issues) or a project file.
Please submit bug reports for all of Cura and CuraEngine to the [Cura repository](https://github.com/Ultimaker/Cura/issues). There will be a template there to fill in. Depending on the type of issue, we will usually ask for the [Cura log](https://github.com/Ultimaker/Cura/wiki/Reporting#cura-log) or a project file.
If a bug report would contain private information, such as a proprietary 3D model, you may also e-mail us. Ask for contact information in the issue.
@ -8,14 +8,22 @@ Bugs related to supporting certain types of printers can usually not be solved b
Requesting features
-------------------
The issue template in the Cura repository does not apply to feature requests. You can ignore it.
When requesting a feature, please describe clearly what you need and why you think this is valuable to users or what problem it solves.
Making pull requests
--------------------
If you want to propose a change to Cura's source code, please create a pull request in the appropriate repository (being [Cura](https://github.com/Ultimaker/Cura), [Uranium](https://github.com/Ultimaker/Uranium), [CuraEngine](https://github.com/Ultimaker/CuraEngine), [fdm_materials](https://github.com/Ultimaker/fdm_materials), [libArcus](https://github.com/Ultimaker/libArcus), [cura-build](https://github.com/Ultimaker/cura-build), [cura-build-environment](https://github.com/Ultimaker/cura-build-environment), [libSavitar](https://github.com/Ultimaker/libSavitar), [libCharon](https://github.com/Ultimaker/libCharon) or [cura-binary-data](https://github.com/Ultimaker/cura-binary-data)) and if your change requires changes on multiple of these repositories, please link them together so that we know to merge them together.
If you want to propose a change to Cura's source code, please create a pull request in the appropriate repository. Since Cura has multiple repositories that influence it, we've listed the most important ones below:
* [Cura](https://github.com/Ultimaker/Cura)
* [Uranium](https://github.com/Ultimaker/Uranium)
* [CuraEngine](https://github.com/Ultimaker/CuraEngine)
* [fdm_materials](https://github.com/Ultimaker/fdm_materials)
* [libArcus](https://github.com/Ultimaker/libArcus)
* [libSavitar](https://github.com/Ultimaker/libSavitar)
* [libCharon](https://github.com/Ultimaker/libCharon)
* [cura-binary-data](https://github.com/Ultimaker/cura-binary-data)
If your change requires changes in several of these repositories, please link them together so that we know to merge & review them together.
The style guide for code contributions to Cura and other Ultimaker projects can be found [here](https://github.com/Ultimaker/Meta/blob/master/general/generic_code_conventions.md).
Some of these repositories will have automated tests running when you create a pull request, indicated by green check marks or red crosses in the Github web page. If you see a red cross, that means that a test has failed. If the test doesn't fail on the Master branch but does fail on your branch, that indicates that you've probably made a mistake and you need to do that. Click on the cross for more details, or run the test locally by running `cmake . && ctest --verbose`.
Some of these repositories will have automated tests running when you create a pull request, indicated by green check marks or red crosses on the GitHub web page. If you see a red cross, that means that a test has failed. If the test doesn't fail on the Main branch but does fail on your branch, that indicates that you've probably made a mistake and need to fix it. Click on the cross for more details, or run the test locally by running `cmake . && ctest --verbose`.

View file

@ -1,4 +1,4 @@
# Copyright (c) 2022 UltiMaker
# Copyright (c) 2023 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.
CuraAppName = "{{ cura_app_name }}"
@ -12,3 +12,6 @@ CuraCloudAccountAPIRoot = "{{ cura_cloud_account_api_root }}"
CuraMarketplaceRoot = "{{ cura_marketplace_root }}"
CuraDigitalFactoryURL = "{{ cura_digital_factory_url }}"
CuraLatestURL = "{{ cura_latest_url }}"
ConanInstalls = {{ conan_installs }}
PythonInstalls = {{ python_installs }}

FUNDING.yml Normal file
View file

@ -0,0 +1 @@
github: [ultimaker]

View file

@ -1,4 +1,3 @@
<br>
<div align = center>
@ -13,7 +12,7 @@
[![Badge Test]][Test]
[![Badge Conan]][Conan]
[![Badge Downloads]][Downloads]
<br>
<br>
@ -27,7 +26,9 @@
*With hundreds of settings & community-managed print profiles,* <br>
*Ultimaker Cura is sure to lead your next project to a success.*
<br>
**Contribute Printer Profiles?** -- Please [look here](https://github.com/Ultimaker/Cura/wiki/Adding-new-machine-profiles-to-Cura) first. <br>
**Contribute Translations?** -- Please [look here](https://github.com/Ultimaker/Cura/wiki/Translating-Cura) first.
<br>
[![Button Building]][Building]
@ -51,20 +52,24 @@
<br>
[![OpenSSF Scorecard](https://api.securityscorecards.dev/projects/github.com/Ultimaker/Cura/badge)](https://api.securityscorecards.dev/projects/github.com/Ultimaker/Cura)
<br>
<!----------------------------------------------------------------------------->
[Contributors]: https://github.com/Ultimaker/Cura/graphs/contributors
[PullRequests]: https://github.com/Ultimaker/Cura/pulls
[Machines]: https://github.com/Ultimaker/Cura/wiki/Adding-new-machine-profiles-to-Cura
[Building]: https://github.com/Ultimaker/Cura/wiki/Running-Cura-from-Source
[Building]: https://github.com/Ultimaker/Cura/wiki/Getting-Started
[Localize]: https://github.com/Ultimaker/Cura/wiki/Translating-Cura
[Settings]: https://github.com/Ultimaker/Cura/wiki/Cura-Settings
[Plugins]: https://github.com/Ultimaker/Cura/wiki/Plugin-Directory
[Settings]: https://github.com/Ultimaker/Cura/wiki/Profiles-&-Settings
[Plugins]: https://github.com/Ultimaker/Cura/wiki/Plugins-And-Packages
[Closed]: https://github.com/Ultimaker/Cura/issues?q=is%3Aissue+is%3Aclosed
[Issues]: https://github.com/Ultimaker/Cura/issues
[Conan]: https://github.com/Ultimaker/Cura/actions/workflows/conan-package.yml
[Test]: https://github.com/Ultimaker/Cura/actions/workflows/unit-test.yml
[Downloads]: https://github.com/Ultimaker/Cura/releases/latest
[License]: LICENSE
[Report]: docs/Report.md
@ -79,15 +84,16 @@
[Badge License]: https://img.shields.io/badge/License-LGPL3-336887.svg?style=for-the-badge&labelColor=458cb5&logoColor=white&logo=GNU
[Badge Closed]: https://img.shields.io/github/issues-closed/ultimaker/cura?style=for-the-badge&logoColor=white&labelColor=629944&color=446a30&logo=AddThis
[Badge Issues]: https://img.shields.io/github/issues/ultimaker/cura?style=for-the-badge&logoColor=white&labelColor=c34360&color=933349&logo=AdBlock
[Badge Conan]: https://img.shields.io/github/workflow/status/Ultimaker/Cura/conan-package?style=for-the-badge&logoColor=white&labelColor=6185aa&color=4c6987&logo=Conan&label=Conan%20Package
[Badge Test]: https://img.shields.io/github/workflow/status/Ultimaker/Cura/unit-test?style=for-the-badge&logoColor=white&labelColor=4a999d&color=346c6e&logo=Codacy&label=Unit%20Test
[Badge Conan]: https://img.shields.io/github/actions/workflow/status/Ultimaker/Cura/conan-package.yml?branch=main&style=for-the-badge&logoColor=white&labelColor=6185aa&color=4c6987&logo=Conan&label=Conan%20Package
[Badge Test]: https://img.shields.io/github/actions/workflow/status/Ultimaker/Cura/unit-test.yml?branch=main&style=for-the-badge&logoColor=white&labelColor=4a999d&color=346c6e&logo=Codacy&label=Unit%20Test
[Badge Size]: https://img.shields.io/github/repo-size/ultimaker/cura?style=for-the-badge&logoColor=white&labelColor=715a97&color=584674&logo=GoogleAnalytics
[Badge Downloads]: https://img.shields.io/github/downloads-pre/Ultimaker/Cura/latest/total?style=for-the-badge
<!---------------------------------[ Buttons ]--------------------------------->
[Button Localize]: https://img.shields.io/badge/Help_Localize-e2467d?style=for-the-badge&logoColor=white&logo=GoogleTranslate
[Button Machines]: https://img.shields.io/badge/Adding_Machines-yellow?style=for-the-badge&logoColor=white&logo=CloudFoundry
[Button Machines]: https://img.shields.io/badge/Adding_Printers-yellow?style=for-the-badge&logoColor=white&logo=CloudFoundry
[Button Settings]: https://img.shields.io/badge/Configuration-00979D?style=for-the-badge&logoColor=white&logo=CodeReview
[Button Building]: https://img.shields.io/badge/Building_Cura-blue?style=for-the-badge&logoColor=white&logo=GitBook
[Button Plugins]: https://img.shields.io/badge/Plugin_Usage-569A31?style=for-the-badge&logoColor=white&logo=ROS

View file

@ -55,7 +55,8 @@ exe = EXE(
target_arch={{ target_arch }},
codesign_identity=os.getenv('CODESIGN_IDENTITY', None),
entitlements_file={{ entitlements_file }},
icon={{ icon }}
icon={{ icon }},
contents_directory='.'
)
coll = COLLECT(
@ -70,188 +71,7 @@ coll = COLLECT(
)
{% if macos == true %}
# PyInstaller seems to copy everything in the resource folder for the MacOS, this causes issues with codesigning and notarizing
# The folder structure should adhere to the one specified in Table 2-5
# https://developer.apple.com/library/archive/documentation/CoreFoundation/Conceptual/CFBundles/BundleTypes/BundleTypes.html#//apple_ref/doc/uid/10000123i-CH101-SW1
# The class below is basically ducktyping the BUNDLE class of PyInstaller and using our own `assemble` method for more fine-grain and specific
# control. Some code of the method below is copied from:
# https://github.com/pyinstaller/pyinstaller/blob/22d1d2a5378228744cc95f14904dae1664df32c4/PyInstaller/building/osx.py#L115
#-----------------------------------------------------------------------------
# Copyright (c) 2005-2022, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------
import plistlib
import shutil
import PyInstaller.utils.osx as osxutils
from pathlib import Path
from PyInstaller.building.osx import BUNDLE
from PyInstaller.building.utils import (_check_path_overlap, _rmtree, add_suffix_to_extension, checkCache)
from PyInstaller.building.datastruct import logger
from PyInstaller.building.icon import normalize_icon_type
class UMBUNDLE(BUNDLE):
def assemble(self):
from PyInstaller.config import CONF
if _check_path_overlap(self.name) and os.path.isdir(self.name):
_rmtree(self.name)
logger.info("Building BUNDLE %s", self.tocbasename)
# Create a minimal Mac bundle structure.
macos_path = Path(self.name, "Contents", "MacOS")
resources_path = Path(self.name, "Contents", "Resources")
frameworks_path = Path(self.name, "Contents", "Frameworks")
os.makedirs(macos_path)
os.makedirs(resources_path)
os.makedirs(frameworks_path)
# Makes sure the icon exists and attempts to convert to the proper format if applicable
self.icon = normalize_icon_type(self.icon, ("icns",), "icns", CONF["workpath"])
# Ensure icon path is absolute
self.icon = os.path.abspath(self.icon)
# Copy icns icon to Resources directory.
shutil.copy(self.icon, os.path.join(self.name, 'Contents', 'Resources'))
# Key/values for a minimal Info.plist file
info_plist_dict = {
"CFBundleDisplayName": self.appname,
"CFBundleName": self.appname,
# Required by 'codesign' utility.
# The value for CFBundleIdentifier is used as the default unique name of your program for Code Signing
# purposes. It even identifies the APP for access to restricted OS X areas like Keychain.
#
# The identifier used for signing must be globally unique. The usual form for this identifier is a
# hierarchical name in reverse DNS notation, starting with the toplevel domain, followed by the company
# name, followed by the department within the company, and ending with the product name. Usually in the
# form: com.mycompany.department.appname
# CLI option --osx-bundle-identifier sets this value.
"CFBundleIdentifier": self.bundle_identifier,
"CFBundleExecutable": os.path.basename(self.exename),
"CFBundleIconFile": os.path.basename(self.icon),
"CFBundleInfoDictionaryVersion": "6.0",
"CFBundlePackageType": "APPL",
"CFBundleVersionString": self.version,
"CFBundleShortVersionString": self.version,
}
# Set some default values. But they still can be overwritten by the user.
if self.console:
# Setting EXE console=True implies LSBackgroundOnly=True.
info_plist_dict['LSBackgroundOnly'] = True
else:
# Let's use high resolution by default.
info_plist_dict['NSHighResolutionCapable'] = True
# Merge info_plist settings from spec file
if isinstance(self.info_plist, dict) and self.info_plist:
info_plist_dict.update(self.info_plist)
plist_filename = os.path.join(self.name, "Contents", "Info.plist")
with open(plist_filename, "wb") as plist_fh:
plistlib.dump(info_plist_dict, plist_fh)
links = []
_QT_BASE_PATH = {'PySide2', 'PySide6', 'PyQt5', 'PyQt6', 'PySide6'}
for inm, fnm, typ in self.toc:
# Adjust name for extensions, if applicable
inm, fnm, typ = add_suffix_to_extension(inm, fnm, typ)
inm = Path(inm)
fnm = Path(fnm)
# Copy files from cache. This ensures that are used files with relative paths to dynamic library
# dependencies (@executable_path)
if typ in ('EXTENSION', 'BINARY') or (typ == 'DATA' and inm.suffix == '.so'):
if any(['.' in p for p in inm.parent.parts]):
inm = Path(inm.name)
fnm = Path(checkCache(
str(fnm),
strip = self.strip,
upx = self.upx,
upx_exclude = self.upx_exclude,
dist_nm = str(inm),
target_arch = self.target_arch,
codesign_identity = self.codesign_identity,
entitlements_file = self.entitlements_file,
strict_arch_validation = (typ == 'EXTENSION'),
))
frame_dst = frameworks_path.joinpath(inm)
if not frame_dst.exists():
if frame_dst.is_dir():
os.makedirs(frame_dst, exist_ok = True)
else:
os.makedirs(frame_dst.parent, exist_ok = True)
shutil.copy(fnm, frame_dst, follow_symlinks = True)
macos_dst = macos_path.joinpath(inm)
if not macos_dst.exists():
if macos_dst.is_dir():
os.makedirs(macos_dst, exist_ok = True)
else:
os.makedirs(macos_dst.parent, exist_ok = True)
# Create relative symlink to the framework
symlink_to = Path(*[".." for p in macos_dst.relative_to(macos_path).parts], "Frameworks").joinpath(
frame_dst.relative_to(frameworks_path))
try:
macos_dst.symlink_to(symlink_to)
except FileExistsError:
pass
else:
if typ == 'DATA':
if any(['.' in p for p in inm.parent.parts]) or inm.suffix == '.so':
# Skip info dist egg and some not needed folders in tcl and tk, since they all contain dots in their files
logger.warning(f"Skipping DATA file {inm}")
continue
res_dst = resources_path.joinpath(inm)
if not res_dst.exists():
if res_dst.is_dir():
os.makedirs(res_dst, exist_ok = True)
else:
os.makedirs(res_dst.parent, exist_ok = True)
shutil.copy(fnm, res_dst, follow_symlinks = True)
macos_dst = macos_path.joinpath(inm)
if not macos_dst.exists():
if macos_dst.is_dir():
os.makedirs(macos_dst, exist_ok = True)
else:
os.makedirs(macos_dst.parent, exist_ok = True)
# Create relative symlink to the resource
symlink_to = Path(*[".." for p in macos_dst.relative_to(macos_path).parts], "Resources").joinpath(
res_dst.relative_to(resources_path))
try:
macos_dst.symlink_to(symlink_to)
except FileExistsError:
pass
else:
macos_dst = macos_path.joinpath(inm)
if not macos_dst.exists():
if macos_dst.is_dir():
os.makedirs(macos_dst, exist_ok = True)
else:
os.makedirs(macos_dst.parent, exist_ok = True)
shutil.copy(fnm, macos_dst, follow_symlinks = True)
# Sign the bundle
logger.info('Signing the BUNDLE...')
try:
osxutils.sign_binary(self.name, self.codesign_identity, self.entitlements_file, deep = True)
except Exception as e:
logger.warning(f"Error while signing the bundle: {e}")
logger.warning("You will need to sign the bundle manually!")
logger.info(f"Building BUNDLE {self.tocbasename} completed successfully.")
app = UMBUNDLE(
app = BUNDLE(
coll,
name='{{ display_name }}.app',
icon={{ icon }},
@ -266,10 +86,15 @@ app = UMBUNDLE(
'CFBundlePackageType': 'APPL',
'CFBundleVersionString': {{ version }},
'CFBundleShortVersionString': {{ short_version }},
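# Register the cura:// and slicer:// URL schemes so the OS can open deep links with Cura.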
'CFBundleURLTypes': [{
'CFBundleURLName': '{{ display_name }}',
'CFBundleURLSchemes': ['cura', 'slicer'],
}],
'CFBundleDocumentTypes': [{
'CFBundleTypeRole': 'Viewer',
'CFBundleTypeExtensions': ['*'],
'CFBundleTypeName': 'Model Files',
}]
},
){% endif %}
'CFBundleTypeRole': 'Viewer',
'CFBundleTypeExtensions': ['stl', 'obj', '3mf', 'gcode', 'ufp'],
'CFBundleTypeName': 'Model Files',
}]
},
)
{% endif %}

View file

@ -1,3 +1,31 @@
version: "5.8.0-alpha.0"
requirements:
- "uranium/(latest)@ultimaker/testing"
- "curaengine/(latest)@ultimaker/testing"
- "cura_binary_data/(latest)@ultimaker/testing"
- "fdm_materials/(latest)@ultimaker/testing"
- "curaengine_plugin_gradual_flow/0.1.0-beta.3"
- "dulcificum/latest@ultimaker/testing"
- "pysavitar/5.3.0"
- "pynest2d/5.3.0"
- "curaengine_grpc_definitions/0.2.0"
- "native_cad_plugin/2.0.0"
requirements_internal:
- "fdm_materials/(latest)@internal/testing"
- "cura_private_data/(latest)@internal/testing"
urls:
default:
cloud_api_root: "https://api.ultimaker.com"
cloud_account_api_root: "https://account.ultimaker.com"
marketplace_root: "https://marketplace.ultimaker.com"
digital_factory_url: "https://digitalfactory.ultimaker.com"
cura_latest_url: "https://software.ultimaker.com/latest.json"
staging:
cloud_api_root: "https://api-staging.ultimaker.com"
cloud_account_api_root: "https://account-staging.ultimaker.com"
marketplace_root: "https://marketplace-staging.ultimaker.com"
digital_factory_url: "https://digitalfactory-staging.ultimaker.com"
cura_latest_url: "https://software.ultimaker.com/latest.json"
pyinstaller:
runinfo:
entrypoint: "cura_app.py"
@ -6,14 +34,30 @@ pyinstaller:
package: "cura"
src: "plugins"
dst: "share/cura/plugins"
curaengine_gradual_flow_plugin:
package: "curaengine_plugin_gradual_flow"
src: "res/plugins/CuraEngineGradualFlow"
dst: "share/cura/plugins/CuraEngineGradualFlow"
curaengine_gradual_flow_plugin_bundled:
package: "curaengine_plugin_gradual_flow"
src: "res/bundled_packages"
dst: "share/cura/resources/bundled_packages"
native_cad_plugin:
package: "native_cad_plugin"
src: "res/plugins/NativeCADplugin"
dst: "share/cura/plugins/NativeCADplugin"
native_cad_plugin_bundled:
package: "native_cad_plugin"
src: "res/bundled_packages"
dst: "share/cura/resources/bundled_packages"
cura_resources:
package: "cura"
src: "resources"
dst: "share/cura/resources"
cura_private_data:
package: "cura_private_data"
src: "resources"
dst: "share/cura/resources"
src: "res"
dst: "share/cura"
internal: true
uranium_plugins:
package: "uranium"
@ -41,7 +85,7 @@ pyinstaller:
dst: "share/windows"
fdm_materials:
package: "fdm_materials"
src: "materials"
src: "res/resources/materials"
dst: "share/cura/resources/materials"
tcl:
package: "tcl"
@ -57,9 +101,15 @@ pyinstaller:
src: "bin"
dst: "."
binary: "CuraEngine"
curaengine_gradual_flow_plugin_service:
package: "curaengine_plugin_gradual_flow"
src: "bin"
dst: "."
binary: "curaengine_plugin_gradual_flow"
hiddenimports:
- "pySavitar"
- "pyArcus"
- "pyDulcificum"
- "pynest2d"
- "PyQt6"
- "PyQt6.QtNetwork"
@ -77,7 +127,6 @@ pyinstaller:
- "sqlite3"
- "trimesh"
- "win32ctypes"
- "PyQt6"
- "PyQt6.QtNetwork"
- "PyQt6.sip"
- "stl"
@ -119,6 +168,10 @@ pycharm_targets:
module_name: Cura
name: pytest in TestGCodeListDecorator.py
script_name: tests/TestGCodeListDecorator.py
- jinja_path: .run_templates/pycharm_cura_test.run.xml.jinja
module_name: Cura
name: pytest in TestHitChecker.py
script_name: tests/TestHitChecker.py
- jinja_path: .run_templates/pycharm_cura_test.run.xml.jinja
module_name: Cura
name: pytest in TestIntentManager.py
@ -147,6 +200,10 @@ pycharm_targets:
module_name: Cura
name: pytest in TestPrintInformation.py
script_name: tests/TestPrintInformation.py
- jinja_path: .run_templates/pycharm_cura_test.run.xml.jinja
module_name: Cura
name: pytest in TestPrintOrderManager.py
script_name: tests/TestPrintOrderManager.py
- jinja_path: .run_templates/pycharm_cura_test.run.xml.jinja
module_name: Cura
name: pytest in TestProfileRequirements.py

View file

@ -1,16 +1,17 @@
import os
from io import StringIO
from pathlib import Path
from jinja2 import Template
from conan import ConanFile
from conan.tools.files import copy, rmdir, save, mkdir
from conan.tools.files import copy, rmdir, save, mkdir, rm, update_conandata
from conan.tools.microsoft import unix_path
from conan.tools.env import VirtualRunEnv, Environment, VirtualBuildEnv
from conan.tools.scm import Version
from conan.errors import ConanInvalidConfiguration, ConanException
required_conan_version = "<=1.56.0"
required_conan_version = ">=1.58.0 <2.0.0"
class CuraConan(ConanFile):
@ -21,14 +22,11 @@ class CuraConan(ConanFile):
description = "3D printer / slicing GUI built on top of the Uranium framework"
topics = ("conan", "python", "pyqt6", "qt", "qml", "3d-printing", "slicer")
build_policy = "missing"
exports = "LICENSE*", "UltiMaker-Cura.spec.jinja", "CuraVersion.py.jinja"
exports = "LICENSE*", "*.jinja"
settings = "os", "compiler", "build_type", "arch"
no_copy_source = True # We won't build so no need to copy sources to the build folder
# FIXME: Remove specific branch once merged to main
# Extending the conanfile with the UMBaseConanfile https://github.com/Ultimaker/conan-ultimaker-index/tree/CURA-9177_Fix_CI_CD/recipes/umbase
python_requires = "umbase/[>=0.1.7]@ultimaker/stable", "translationextractor/[>=1.0.0]@ultimaker/stable"
python_requires_extend = "umbase.UMBaseConanfile"
python_requires = "translationextractor/[>=2.2.0]@ultimaker/stable"
options = {
"enterprise": ["True", "False", "true", "false"], # Workaround for GH Action passing boolean as lowercase string
@ -37,7 +35,8 @@ class CuraConan(ConanFile):
"cloud_api_version": "ANY",
"display_name": "ANY", # TODO: should this be an option??
"cura_debug_mode": [True, False], # FIXME: Use profiles
"internal": [True, False]
"internal": ["True", "False", "true", "false"], # Workaround for GH Action passing boolean as lowercase string
"enable_i18n": [True, False],
}
default_options = {
"enterprise": "False",
@ -46,9 +45,18 @@ class CuraConan(ConanFile):
"cloud_api_version": "1",
"display_name": "UltiMaker Cura",
"cura_debug_mode": False, # Not yet implemented
"internal": False,
"internal": "False",
"enable_i18n": False,
}
def set_version(self):
if not self.version:
self.version = self.conan_data["version"]
@property
def _i18n_options(self):
return self.conf.get("user.i18n:options", default = {"extract": True, "build": True}, check_type = dict)
@property
def _pycharm_targets(self):
return self.conan_data["pycharm_targets"]
@ -64,6 +72,8 @@ class CuraConan(ConanFile):
self._cura_env = Environment()
self._cura_env.define("QML2_IMPORT_PATH", str(self._site_packages.joinpath("PyQt6", "Qt6", "qml")))
self._cura_env.define("QT_PLUGIN_PATH", str(self._site_packages.joinpath("PyQt6", "Qt6", "plugins")))
if not self.in_local_cache:
self._cura_env.define("CURA_DATA_ROOT", str(self._share_dir.joinpath("cura")))
if self.settings.os == "Linux":
self._cura_env.define("QT_QPA_FONTDIR", "/usr/share/fonts")
@ -71,14 +81,14 @@ class CuraConan(ConanFile):
self._cura_env.define("QT_XKB_CONFIG_ROOT", "/usr/share/X11/xkb")
return self._cura_env
@property
def _staging(self):
return self.options.staging in ["True", 'true']
@property
def _enterprise(self):
return self.options.enterprise in ["True", 'true']
@property
def _internal(self):
return self.options.internal in ["True", 'true']
@property
def _app_name(self):
if self._enterprise:
@ -86,24 +96,10 @@ class CuraConan(ConanFile):
return str(self.options.display_name)
@property
def _cloud_api_root(self):
return "https://api-staging.ultimaker.com" if self._staging else "https://api.ultimaker.com"
@property
def _cloud_account_api_root(self):
return "https://account-staging.ultimaker.com" if self._staging else "https://account.ultimaker.com"
@property
def _marketplace_root(self):
return "https://marketplace-staging.ultimaker.com" if self._staging else "https://marketplace.ultimaker.com"
@property
def _digital_factory_url(self):
return "https://digitalfactory-staging.ultimaker.com" if self._staging else "https://digitalfactory.ultimaker.com"
@property
def _cura_latest_url(self):
return "https://software.ultimaker.com/latest.json"
def _urls(self):
if self.options.staging in ["True", 'true']:
return "staging"
return "default"
@property
def requirements_txts(self):
@ -154,6 +150,44 @@ class CuraConan(ConanFile):
return "'x86_64'"
return "None"
def _conan_installs(self):
self.output.info("Collecting conan installs")
conan_installs = {}
# list of conan installs
for dependency in self.dependencies.host.values():
conan_installs[dependency.ref.name] = {
"version": dependency.ref.version,
"revision": dependency.ref.revision
}
return conan_installs
def _python_installs(self):
self.output.info("Collecting python installs")
python_installs = {}
# list of python installs
run_env = VirtualRunEnv(self)
env = run_env.environment()
env.prepend_path("PYTHONPATH", str(self._site_packages.as_posix()))
venv_vars = env.vars(self, scope = "run")
outer = '"' if self.settings.os == "Windows" else "'"
inner = "'" if self.settings.os == "Windows" else '"'
buffer = StringIO()
with venv_vars.apply():
self.run(f"""python -c {outer}import pkg_resources; print({inner};{inner}.join([(s.key+{inner},{inner}+ s.version) for s in pkg_resources.working_set])){outer}""",
env = "conanrun",
output = buffer)
packages = str(buffer.getvalue()).split("-----------------\n")
packages = packages[1].strip('\r\n').split(";")
for package in packages:
name, version = package.split(",")
python_installs[name] = {"version": version}
return python_installs
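    # For illustration, the embedded one-liner above boils down to (runnable on its own
    # inside the virtual run environment; example output values are hypothetical):
    #   import pkg_resources
    #   print(";".join([s.key + "," + s.version for s in pkg_resources.working_set]))
    #   -> "numpy,1.26.4;shapely,2.0.1;..."  which is then split on ";" and "," as above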
def _generate_cura_version(self, location):
with open(os.path.join(self.recipe_folder, "CuraVersion.py.jinja"), "r") as f:
cura_version_py = Template(f.read())
@ -163,7 +197,7 @@ class CuraConan(ConanFile):
cura_version = Version(self.conf.get("user.cura:version", default = self.version, check_type = str))
pre_tag = f"-{cura_version.pre}" if cura_version.pre else ""
build_tag = f"+{cura_version.build}" if cura_version.build else ""
internal_tag = f"+internal" if self.options.internal else ""
internal_tag = f"+internal" if self._internal else ""
cura_version = f"{cura_version.major}.{cura_version.minor}.{cura_version.patch}{pre_tag}{build_tag}{internal_tag}"
with open(os.path.join(location, "CuraVersion.py"), "w") as f:
@ -173,18 +207,21 @@ class CuraConan(ConanFile):
cura_version = cura_version,
cura_build_type = "Enterprise" if self._enterprise else "",
cura_debug_mode = self.options.cura_debug_mode,
cura_cloud_api_root = self._cloud_api_root,
cura_cloud_api_root = self.conan_data["urls"][self._urls]["cloud_api_root"],
cura_cloud_api_version = self.options.cloud_api_version,
cura_cloud_account_api_root = self._cloud_account_api_root,
cura_marketplace_root = self._marketplace_root,
cura_digital_factory_url = self._digital_factory_url,
cura_latest_url = self._cura_latest_url))
cura_cloud_account_api_root = self.conan_data["urls"][self._urls]["cloud_account_api_root"],
cura_marketplace_root = self.conan_data["urls"][self._urls]["marketplace_root"],
cura_digital_factory_url = self.conan_data["urls"][self._urls]["digital_factory_url"],
cura_latest_url=self.conan_data["urls"][self._urls]["cura_latest_url"],
conan_installs=self._conan_installs(),
python_installs=self._python_installs(),
))
def _generate_pyinstaller_spec(self, location, entrypoint_location, icon_path, entitlements_file):
pyinstaller_metadata = self.conan_data["pyinstaller"]
datas = [(str(self._base_dir.joinpath("conan_install_info.json")), ".")]
datas = []
for data in pyinstaller_metadata["datas"].values():
if not self.options.internal and data.get("internal", False):
if not self._internal and data.get("internal", False):
continue
if "package" in data: # get the paths from conan package
@ -194,9 +231,11 @@ class CuraConan(ConanFile):
else:
src_path = os.path.join(self.source_folder, data["src"])
else:
if data["package"] not in self.deps_cpp_info.deps:
continue
src_path = os.path.join(self.deps_cpp_info[data["package"]].rootpath, data["src"])
elif "root" in data: # get the paths relative from the sourcefolder
src_path = os.path.join(self.source_folder, data["root"], data["src"])
elif "root" in data: # get the paths relative from the install folder
src_path = os.path.join(self.install_folder, data["root"], data["src"])
else:
continue
if Path(src_path).exists():
@ -207,7 +246,9 @@ class CuraConan(ConanFile):
if "package" in binary: # get the paths from conan package
src_path = os.path.join(self.deps_cpp_info[binary["package"]].rootpath, binary["src"])
elif "root" in binary: # get the paths relative from the sourcefolder
src_path = os.path.join(self.source_folder, binary["root"], binary["src"])
src_path = str(self.source_path.joinpath(binary["root"], binary["src"]))
if self.settings.os == "Windows":
src_path = src_path.replace("\\", "\\\\")
else:
continue
if not Path(src_path).exists():
@ -237,7 +278,7 @@ class CuraConan(ConanFile):
with open(os.path.join(self.recipe_folder, "UltiMaker-Cura.spec.jinja"), "r") as f:
pyinstaller = Template(f.read())
version = self.conf_info.get("user.cura:version", default = self.version, check_type = str)
version = self.conf.get("user.cura:version", default = self.version, check_type = str)
cura_version = Version(version)
with open(os.path.join(location, "UltiMaker-Cura.spec"), "w") as f:
@ -261,6 +302,9 @@ class CuraConan(ConanFile):
short_version = f"'{cura_version.major}.{cura_version.minor}.{cura_version.patch}'",
))
def export(self):
update_conandata(self, {"version": self.version})
def export_sources(self):
copy(self, "*", os.path.join(self.recipe_folder, "plugins"), os.path.join(self.export_sources_folder, "plugins"))
copy(self, "*", os.path.join(self.recipe_folder, "resources"), os.path.join(self.export_sources_folder, "resources"), excludes = "*.mo")
@ -271,42 +315,54 @@ class CuraConan(ConanFile):
copy(self, "requirements.txt", self.recipe_folder, self.export_sources_folder)
copy(self, "requirements-dev.txt", self.recipe_folder, self.export_sources_folder)
copy(self, "requirements-ultimaker.txt", self.recipe_folder, self.export_sources_folder)
copy(self, "UltiMaker-Cura.spec.jinja", self.recipe_folder, self.export_sources_folder)
copy(self, "CuraVersion.py.jinja", self.recipe_folder, self.export_sources_folder)
copy(self, "cura_app.py", self.recipe_folder, self.export_sources_folder)
def set_version(self):
if self.version is None:
self.version = self._umdefault_version()
def config_options(self):
if self.settings.os == "Windows" and not self.conf.get("tools.microsoft.bash:path", check_type=str):
del self.options.enable_i18n
def configure(self):
self.options["pyarcus"].shared = True
self.options["pysavitar"].shared = True
self.options["pynest2d"].shared = True
self.options["dulcificum"].shared = self.settings.os != "Windows"
self.options["cpython"].shared = True
self.options["boost"].header_only = True
if self.settings.os == "Linux":
self.options["curaengine_grpc_definitions"].shared = True
self.options["openssl"].shared = True
if self.conf.get("user.curaengine:sentry_url", "", check_type=str) != "":
self.options["curaengine"].enable_sentry = True
self.options["arcus"].enable_sentry = True
self.options["clipper"].enable_sentry = True
def validate(self):
version = self.conf_info.get("user.cura:version", default = self.version, check_type = str)
version = self.conf.get("user.cura:version", default = self.version, check_type = str)
if version and Version(version) <= Version("4"):
raise ConanInvalidConfiguration("Only versions 5+ are supported")
def requirements(self):
self.requires("pyarcus/5.2.2")
self.requires("curaengine/(latest)@ultimaker/testing")
self.requires("pysavitar/5.2.2")
self.requires("pynest2d/5.2.2")
self.requires("uranium/(latest)@ultimaker/testing")
self.requires("fdm_materials/(latest)@{}/testing".format("internal" if self.options.internal else "ultimaker"))
self.requires("cura_binary_data/(latest)@ultimaker/testing")
self.requires("cpython/3.10.4")
if self.options.internal:
self.requires("cura_private_data/(latest)@ultimaker/testing")
for req in self.conan_data["requirements"]:
if self._internal and "fdm_materials" in req:
continue
if not self._enterprise and "native_cad_plugin" in req:
continue
self.requires(req)
if self._internal:
for req in self.conan_data["requirements_internal"]:
self.requires(req)
self.requires("cpython/3.10.4@ultimaker/stable")
self.requires("clipper/6.4.2@ultimaker/stable")
self.requires("openssl/3.2.0")
self.requires("protobuf/3.21.12")
self.requires("boost/1.82.0")
self.requires("spdlog/1.12.0")
self.requires("fmt/10.1.1")
self.requires("zlib/1.2.13")
def build_requirements(self):
if self.options.devtools:
if self.settings.os != "Windows" or self.conf.get("tools.microsoft.bash:path", check_type = str):
# FIXME: once m4, autoconf, automake are Conan V2 ready use self.win_bash and add gettext as base tool_requirement
self.tool_requires("gettext/0.21", force_host_context=True)
if self.options.get_safe("enable_i18n", False):
self.tool_requires("gettext/0.21", force_host_context = True)
def layout(self):
self.folders.source = "."
@ -328,123 +384,98 @@ class CuraConan(ConanFile):
self._generate_cura_version(os.path.join(self.source_folder, "cura"))
if self.options.devtools:
entitlements_file = "'{}'".format(os.path.join(self.source_folder, "packaging", "MacOS", "cura.entitlements"))
self._generate_pyinstaller_spec(location = self.generators_folder,
entrypoint_location = "'{}'".format(os.path.join(self.source_folder, self.conan_data["pyinstaller"]["runinfo"]["entrypoint"])).replace("\\", "\\\\"),
icon_path = "'{}'".format(os.path.join(self.source_folder, "packaging", self.conan_data["pyinstaller"]["icon"][str(self.settings.os)])).replace("\\", "\\\\"),
entitlements_file = entitlements_file if self.settings.os == "Macos" else "None")
if not self.in_local_cache:
# Copy CuraEngine.exe to bindirs of Virtual Python Environment
curaengine = self.dependencies["curaengine"].cpp_info
copy(self, "CuraEngine.exe", curaengine.bindirs[0], self.source_folder, keep_path = False)
copy(self, "CuraEngine", curaengine.bindirs[0], self.source_folder, keep_path = False)
# Update the po files
if self.settings.os != "Windows" or self.conf.get("tools.microsoft.bash:path", check_type=str):
vb = VirtualBuildEnv(self)
vb.generate()
# Copy the external plugins that we want to bundle with Cura
rmdir(self,str(self.source_path.joinpath("plugins", "CuraEngineGradualFlow")))
curaengine_plugin_gradual_flow = self.dependencies["curaengine_plugin_gradual_flow"].cpp_info
copy(self, "*", curaengine_plugin_gradual_flow.resdirs[0], str(self.source_path.joinpath("plugins", "CuraEngineGradualFlow")), keep_path = True)
copy(self, "*", curaengine_plugin_gradual_flow.bindirs[0], self.source_folder, keep_path = False)
copy(self, "bundled_*.json", curaengine_plugin_gradual_flow.resdirs[1], str(self.source_path.joinpath("resources", "bundled_packages")), keep_path = False)
# FIXME: once m4, autoconf, automake are Conan V2 ready use self.win_bash and add gettext as base tool_requirement
cpp_info = self.dependencies["gettext"].cpp_info
for po_file in self.source_path.joinpath("resources", "i18n").glob("**/*.po"):
pot_file = self.source_path.joinpath("resources", "i18n", po_file.with_suffix('.pot').name)
mkdir(self, str(unix_path(self, pot_file.parent)))
self.run(
f"{cpp_info.bindirs[0]}/msgmerge --no-wrap --no-fuzzy-matching -width=140 -o {po_file} {po_file} {pot_file}",
env="conanbuild", ignore_errors=True)
def build(self):
if self.options.devtools:
if self.settings.os != "Windows" or self.conf.get("tools.microsoft.bash:path", check_type = str):
# FIXME: once m4, autoconf, automake are Conan V2 ready use self.win_bash and add gettext as base tool_requirement
for po_file in self.source_path.joinpath("resources", "i18n").glob("**/*.po"):
mo_file = Path(self.build_folder, po_file.with_suffix('.mo').relative_to(self.source_path))
mo_file = mo_file.parent.joinpath("LC_MESSAGES", mo_file.name)
mkdir(self, str(unix_path(self, Path(mo_file).parent)))
cpp_info = self.dependencies["gettext"].cpp_info
self.run(f"{cpp_info.bindirs[0]}/msgfmt {po_file} -o {mo_file} -f", env="conanbuild", ignore_errors=True)
def imports(self):
self.copy("CuraEngine.exe", root_package = "curaengine", src = "@bindirs", dst = "", keep_path = False)
self.copy("CuraEngine", root_package = "curaengine", src = "@bindirs", dst = "", keep_path = False)
rmdir(self, os.path.join(self.source_folder, "resources", "materials"))
self.copy("*.fdm_material", root_package = "fdm_materials", src = "@resdirs", dst = "resources/materials", keep_path = False)
self.copy("*.sig", root_package = "fdm_materials", src = "@resdirs", dst = "resources/materials", keep_path = False)
if self.options.internal:
self.copy("*", root_package = "cura_private_data", src = self.deps_cpp_info["cura_private_data"].resdirs[0],
dst = self._share_dir.joinpath("cura", "resources"), keep_path = True)
if self._enterprise:
rmdir(self, str(self.source_path.joinpath("plugins", "NativeCADplugin")))
curaengine_plugin_gradual_flow = self.dependencies["native_cad_plugin"].cpp_info
copy(self, "*", curaengine_plugin_gradual_flow.resdirs[0], str(self.source_path.joinpath("plugins", "NativeCADplugin")), keep_path = True)
copy(self, "bundled_*.json", curaengine_plugin_gradual_flow.resdirs[1], str(self.source_path.joinpath("resources", "bundled_packages")), keep_path = False)
# Copy resources of cura_binary_data
self.copy("*", root_package = "cura_binary_data", src = self.deps_cpp_info["cura_binary_data"].resdirs[0],
dst = self._share_dir.joinpath("cura", "resources"), keep_path = True)
self.copy("*", root_package = "cura_binary_data", src = self.deps_cpp_info["cura_binary_data"].resdirs[1],
dst =self._share_dir.joinpath("uranium", "resources"), keep_path = True)
cura_binary_data = self.dependencies["cura_binary_data"].cpp_info
copy(self, "*", cura_binary_data.resdirs[0], str(self._share_dir.joinpath("cura")), keep_path = True)
copy(self, "*", cura_binary_data.resdirs[1], str(self._share_dir.joinpath("uranium")), keep_path = True)
if self.settings.os == "Windows":
copy(self, "*", cura_binary_data.resdirs[2], str(self._share_dir.joinpath("windows")), keep_path = True)
self.copy("*.dll", src = "@bindirs", dst = self._site_packages)
self.copy("*.pyd", src = "@libdirs", dst = self._site_packages)
self.copy("*.pyi", src = "@libdirs", dst = self._site_packages)
self.copy("*.dylib", src = "@libdirs", dst = self._script_dir)
def deploy(self):
# Copy CuraEngine.exe to bindirs of Virtual Python Environment
# TODO: Fix source such that it will get the curaengine relative from the executable (Python bindir in this case)
self.copy_deps("CuraEngine.exe", root_package = "curaengine", src = self.deps_cpp_info["curaengine"].bindirs[0],
dst = self._base_dir,
keep_path = False)
self.copy_deps("CuraEngine", root_package = "curaengine", src = self.deps_cpp_info["curaengine"].bindirs[0], dst = self._base_dir,
keep_path = False)
# Copy resources of Cura (keep folder structure)
self.copy("*", src = self.cpp_info.bindirs[0], dst = self._base_dir, keep_path = False)
self.copy("*", src = self.cpp_info.libdirs[0], dst = self._site_packages.joinpath("cura"), keep_path = True)
self.copy("*", src = self.cpp_info.resdirs[0], dst = self._share_dir.joinpath("cura", "resources"), keep_path = True)
self.copy("*", src = self.cpp_info.resdirs[1], dst = self._share_dir.joinpath("cura", "plugins"), keep_path = True)
for dependency in self.dependencies.host.values():
for bindir in dependency.cpp_info.bindirs:
copy(self, "*.dll", bindir, str(self._site_packages), keep_path = False)
for libdir in dependency.cpp_info.libdirs:
copy(self, "*.pyd", libdir, str(self._site_packages), keep_path = False)
copy(self, "*.pyi", libdir, str(self._site_packages), keep_path = False)
copy(self, "*.dylib", libdir, str(self._base_dir.joinpath("lib")), keep_path = False)
# Copy materials (flat)
self.copy_deps("*.fdm_material", root_package = "fdm_materials", src = self.deps_cpp_info["fdm_materials"].resdirs[0],
dst = self._share_dir.joinpath("cura", "resources", "materials"), keep_path = False)
self.copy_deps("*.sig", root_package = "fdm_materials", src = self.deps_cpp_info["fdm_materials"].resdirs[0],
dst = self._share_dir.joinpath("cura", "resources", "materials"), keep_path = False)
rmdir(self, os.path.join(self.source_folder, "resources", "materials"))
fdm_materials = self.dependencies["fdm_materials"].cpp_info
copy(self, "*", fdm_materials.resdirs[0], self.source_folder)
# Copy internal resources
if self.options.internal:
self.copy_deps("*", root_package = "cura_private_data", src = self.deps_cpp_info["cura_private_data"].resdirs[0],
dst = self._share_dir.joinpath("cura", "resources"), keep_path = True)
self.copy_deps("*", root_package = "cura_private_data", src = self.deps_cpp_info["cura_private_data"].resdirs[1],
dst = self._share_dir.joinpath("cura", "plugins"), keep_path = True)
if self._internal:
cura_private_data = self.dependencies["cura_private_data"].cpp_info
copy(self, "*", cura_private_data.resdirs[0], str(self._share_dir.joinpath("cura")))
if self.options.devtools:
entitlements_file = "'{}'".format(os.path.join(self.source_folder, "packaging", "MacOS", "cura.entitlements"))
self._generate_pyinstaller_spec(
location=self.generators_folder,
entrypoint_location="'{}'".format(
os.path.join(self.source_folder, self.conan_data["pyinstaller"]["runinfo"]["entrypoint"])).replace(
"\\", "\\\\"),
icon_path="'{}'".format(os.path.join(self.source_folder, "packaging",
self.conan_data["pyinstaller"]["icon"][
str(self.settings.os)])).replace("\\", "\\\\"),
entitlements_file=entitlements_file if self.settings.os == "Macos" else "None"
)
if self.options.get_safe("enable_i18n", False) and self._i18n_options["extract"]:
vb = VirtualBuildEnv(self)
vb.generate()
# # FIXME: once m4, autoconf, automake are Conan V2 ready use self.win_bash and add gettext as base tool_requirement
cpp_info = self.dependencies["gettext"].cpp_info
pot = self.python_requires["translationextractor"].module.ExtractTranslations(self, cpp_info.bindirs[0])
pot.generate()
def build(self):
if self.options.get_safe("enable_i18n", False) and self._i18n_options["build"]:
for po_file in self.source_path.joinpath("resources", "i18n").glob("**/*.po"):
mo_file = Path(self.build_folder, po_file.with_suffix('.mo').relative_to(self.source_path))
mo_file = mo_file.parent.joinpath("LC_MESSAGES", mo_file.name)
mkdir(self, str(unix_path(self, Path(mo_file).parent)))
cpp_info = self.dependencies["gettext"].cpp_info
self.run(f"{cpp_info.bindirs[0]}/msgfmt {po_file} -o {mo_file} -f", env="conanbuild", ignore_errors=True)
def deploy(self):
copy(self, "*", os.path.join(self.package_folder, self.cpp.package.resdirs[2]), os.path.join(self.install_folder, "packaging"), keep_path = True)
# Copy resources of Cura (keep folder structure) needed by pyinstaller to determine the module structure
copy(self, "*", os.path.join(self.package_folder, self.cpp_info.bindirs[0]), str(self._base_dir), keep_path = False)
copy(self, "*", os.path.join(self.package_folder, self.cpp_info.libdirs[0]), str(self._site_packages.joinpath("cura")), keep_path = True)
copy(self, "*", os.path.join(self.package_folder, self.cpp_info.resdirs[0]), str(self._share_dir.joinpath("cura", "resources")), keep_path = True)
copy(self, "*", os.path.join(self.package_folder, self.cpp_info.resdirs[1]), str(self._share_dir.joinpath("cura", "plugins")), keep_path = True)
# Copy resources of Uranium (keep folder structure)
self.copy_deps("*", root_package = "uranium", src = self.deps_cpp_info["uranium"].resdirs[0],
dst = self._share_dir.joinpath("uranium", "resources"), keep_path = True)
self.copy_deps("*", root_package = "uranium", src = self.deps_cpp_info["uranium"].resdirs[1],
dst = self._share_dir.joinpath("uranium", "plugins"), keep_path = True)
self.copy_deps("*", root_package = "uranium", src = self.deps_cpp_info["uranium"].libdirs[0],
dst = self._site_packages.joinpath("UM"),
keep_path = True)
self.copy_deps("*", root_package = "uranium", src = str(os.path.join(self.deps_cpp_info["uranium"].libdirs[0], "Qt", "qml", "UM")),
dst = self._site_packages.joinpath("PyQt6", "Qt6", "qml", "UM"),
keep_path = True)
# Copy resources of cura_binary_data
self.copy_deps("*", root_package = "cura_binary_data", src = self.deps_cpp_info["cura_binary_data"].resdirs[0],
dst = self._share_dir.joinpath("cura"), keep_path = True)
self.copy_deps("*", root_package = "cura_binary_data", src = self.deps_cpp_info["cura_binary_data"].resdirs[1],
dst = self._share_dir.joinpath("uranium"), keep_path = True)
if self.settings.os == "Windows":
self.copy_deps("*", root_package = "cura_binary_data", src = self.deps_cpp_info["cura_binary_data"].resdirs[2],
dst = self._share_dir.joinpath("windows"), keep_path = True)
self.copy_deps("*.dll", src = "@bindirs", dst = self._site_packages)
self.copy_deps("*.pyd", src = "@libdirs", dst = self._site_packages)
self.copy_deps("*.pyi", src = "@libdirs", dst = self._site_packages)
self.copy_deps("*.dylib", src = "@libdirs", dst = self._base_dir.joinpath("lib"))
# Copy packaging scripts
self.copy("*", src = self.cpp_info.resdirs[2], dst = self._base_dir.joinpath("packaging"))
# Copy requirements.txt's
self.copy("*.txt", src = self.cpp_info.resdirs[-1], dst = self._base_dir.joinpath("pip_requirements"))
uranium = self.dependencies["uranium"].cpp_info
copy(self, "*", uranium.resdirs[0], str(self._share_dir.joinpath("uranium", "resources")), keep_path = True)
copy(self, "*", uranium.resdirs[1], str(self._share_dir.joinpath("uranium", "plugins")), keep_path = True)
copy(self, "*", uranium.libdirs[0], str(self._site_packages.joinpath("UM")), keep_path = True)
# Generate the GitHub Action version info Environment
version = self.conf_info.get("user.cura:version", default = self.version, check_type = str)
version = self.conf.get("user.cura:version", default = self.version, check_type = str)
cura_version = Version(version)
env_prefix = "Env:" if self.settings.os == "Windows" else ""
activate_github_actions_version_env = Template(r"""echo "CURA_VERSION_MAJOR={{ cura_version_major }}" >> ${{ env_prefix }}GITHUB_ENV
@ -481,6 +512,13 @@ echo "CURA_APP_NAME={{ cura_app_name }}" >> ${{ env_prefix }}GITHUB_ENV
copy(self, "requirement*.txt", src = self.source_folder, dst = os.path.join(self.package_folder, self.cpp.package.resdirs[-1]))
copy(self, "*", src = os.path.join(self.source_folder, "packaging"), dst = os.path.join(self.package_folder, self.cpp.package.resdirs[2]))
# Remove the CuraEngineGradualFlow plugin from the package
rmdir(self, os.path.join(self.package_folder, self.cpp.package.resdirs[1], "CuraEngineGradualFlow"))
rm(self, "bundled_*.json", os.path.join(self.package_folder, self.cpp.package.resdirs[0], "bundled_packages"), recursive = False)
# Remove the fdm_materials from the package
rmdir(self, os.path.join(self.package_folder, self.cpp.package.resdirs[0], "materials"))
def package_info(self):
self.user_info.pip_requirements = "requirements.txt"
self.user_info.pip_requirements_git = "requirements-ultimaker.txt"
@ -488,10 +526,14 @@ echo "CURA_APP_NAME={{ cura_app_name }}" >> ${{ env_prefix }}GITHUB_ENV
if self.in_local_cache:
self.runenv_info.append_path("PYTHONPATH", os.path.join(self.package_folder, "site-packages"))
self.env_info.PYTHONPATH.append(os.path.join(self.package_folder, "site-packages"))
self.runenv_info.append_path("PYTHONPATH", os.path.join(self.package_folder, "plugins"))
self.env_info.PYTHONPATH.append(os.path.join(self.package_folder, "plugins"))
else:
self.runenv_info.append_path("PYTHONPATH", self.source_folder)
self.env_info.PYTHONPATH.append(self.source_folder)
self.runenv_info.append_path("PYTHONPATH", os.path.join(self.source_folder, "plugins"))
self.env_info.PYTHONPATH.append(os.path.join(self.source_folder, "plugins"))
def package_id(self):
self.info.clear()
@ -504,6 +546,8 @@ echo "CURA_APP_NAME={{ cura_app_name }}" >> ${{ env_prefix }}GITHUB_ENV
del self.info.options.cloud_api_version
del self.info.options.display_name
del self.info.options.cura_debug_mode
if self.options.get_safe("enable_i18n", False):
del self.info.options.enable_i18n
# TODO: Use the hash of requirements.txt and requirements-ultimaker.txt, Because changing these will actually result in a different
# Cura. This is needed because the requirements.txt aren't managed by Conan and therefor not resolved in the package_id. This isn't


@ -190,6 +190,20 @@ class Account(QObject):
def isLoggedIn(self) -> bool:
return self._logged_in
@pyqtSlot()
def stopSyncing(self) -> None:
Logger.debug(f"Stopping sync of cloud printers")
self._setManualSyncEnabled(True)
if self._update_timer.isActive():
self._update_timer.stop()
@pyqtSlot()
def startSyncing(self) -> None:
Logger.debug(f"Starting sync of cloud printers")
self._setManualSyncEnabled(False)
if not self._update_timer.isActive():
self._update_timer.start()
def _onLoginStateChanged(self, logged_in: bool = False, error_message: Optional[str] = None) -> None:
if error_message:
if self._error_message:


@ -1,4 +1,4 @@
# Copyright (c) 2022 UltiMaker
# Copyright (c) 2023 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.
# ---------
@ -14,7 +14,7 @@ DEFAULT_CURA_LATEST_URL = "https://software.ultimaker.com/latest.json"
# Each release has a fixed SDK version coupled with it. It doesn't make sense to make it configurable because, for
# example, Cura 3.2 with SDK version 6.1 will not work. So the SDK version is hard-coded here and left out of the
# CuraVersion.py.in template.
CuraSDKVersion = "8.3.0"
CuraSDKVersion = "8.7.0"
try:
from cura.CuraVersion import CuraLatestURL
@ -69,13 +69,25 @@ try:
except ImportError:
CuraAppDisplayName = DEFAULT_CURA_DISPLAY_NAME
DEPENDENCY_INFO = {}
try:
from pathlib import Path
conan_install_info = Path(__file__).parent.parent.joinpath("conan_install_info.json")
if conan_install_info.exists():
import json
with open(conan_install_info, "r") as f:
DEPENDENCY_INFO = json.loads(f.read())
except:
pass
from cura.CuraVersion import ConanInstalls
if type(ConanInstalls) == dict:
CONAN_INSTALLS = ConanInstalls
else:
CONAN_INSTALLS = {}
except ImportError:
CONAN_INSTALLS = {}
try:
from cura.CuraVersion import PythonInstalls
if type(PythonInstalls) == dict:
PYTHON_INSTALLS = PythonInstalls
else:
PYTHON_INSTALLS = {}
except ImportError:
PYTHON_INSTALLS = {}
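A hedged sketch of how these two mappings might be consumed, for instance when assembling diagnostic output; the helper name and formatting are assumptions, not part of CuraVersion.py:

def format_dependency_report() -> str:
    lines = []
    for name, info in CONAN_INSTALLS.items():
        lines.append(f"conan: {name} {info['version']} (rev {info['revision']})")
    for name, info in PYTHON_INSTALLS.items():
        lines.append(f"python: {name} {info['version']}")
    return "\n".join(lines)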


@ -8,17 +8,20 @@ from UM.Logger import Logger
from UM.Message import Message
from UM.Scene.SceneNode import SceneNode
from UM.i18n import i18nCatalog
from cura.Arranging.Nest2DArrange import arrange
from cura.Arranging.GridArrange import GridArrange
from cura.Arranging.Nest2DArrange import Nest2DArrange
i18n_catalog = i18nCatalog("cura")
class ArrangeObjectsJob(Job):
def __init__(self, nodes: List[SceneNode], fixed_nodes: List[SceneNode], min_offset = 8) -> None:
def __init__(self, nodes: List[SceneNode], fixed_nodes: List[SceneNode], min_offset = 8,
*, grid_arrange: bool = False) -> None:
super().__init__()
self._nodes = nodes
self._fixed_nodes = fixed_nodes
self._min_offset = min_offset
self._grid_arrange = grid_arrange
def run(self):
found_solution_for_all = False
@ -29,10 +32,18 @@ class ArrangeObjectsJob(Job):
title = i18n_catalog.i18nc("@info:title", "Finding Location"))
status_message.show()
if self._grid_arrange:
arranger = GridArrange(self._nodes, Application.getInstance().getBuildVolume(), self._fixed_nodes)
else:
arranger = Nest2DArrange(self._nodes, Application.getInstance().getBuildVolume(), self._fixed_nodes,
factor=1000)
found_solution_for_all = False
try:
found_solution_for_all = arrange(self._nodes, Application.getInstance().getBuildVolume(), self._fixed_nodes)
found_solution_for_all = arranger.arrange()
except: # If the thread crashes, the message should still close
Logger.logException("e", "Unable to arrange the objects on the buildplate. The arrange algorithm has crashed.")
Logger.logException("e",
"Unable to arrange the objects on the buildplate. The arrange algorithm has crashed.")
status_message.hide()
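A hedged caller sketch (variable names are placeholders): the new keyword-only argument switches between grid placement and the nest2d solver.

job = ArrangeObjectsJob(selected_nodes, fixed_nodes = [], grid_arrange = use_grid_placement)
job.start()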


@ -0,0 +1,27 @@
from typing import List, TYPE_CHECKING, Optional, Tuple, Set
if TYPE_CHECKING:
from UM.Operations.GroupedOperation import GroupedOperation
class Arranger:
def createGroupOperationForArrange(self, add_new_nodes_in_scene: bool = False) -> Tuple["GroupedOperation", int]:
"""
Find placement for a set of scene nodes, but don't actually move them just yet.
:param add_new_nodes_in_scene: Whether to create new scene nodes before applying the transformations and rotations
:return: tuple (grouped_operation, not_fit_count)
WHERE
grouped_operation: A grouped operation that, once pushed, moves all nodes to their new positions
not_fit_count: The number of nodes for which no placement could be found on the buildplate
"""
raise NotImplementedError
def arrange(self, add_new_nodes_in_scene: bool = False) -> bool:
"""
Find placement for a set of scene nodes, and move them by using a single grouped operation.
:param add_new_nodes_in_scene: Whether to create new scene nodes before applying the transformations and rotations
:return: found_solution_for_all: Whether the algorithm found a place on the buildplate for all the objects
"""
grouped_operation, not_fit_count = self.createGroupOperationForArrange(add_new_nodes_in_scene)
grouped_operation.push()
return not_fit_count == 0
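A hedged sketch of a concrete arranger (class name, spacing and node handling are assumptions): a subclass only has to build one GroupedOperation and report how many nodes did not fit.

from UM.Math.Vector import Vector
from UM.Operations.GroupedOperation import GroupedOperation
from UM.Operations.TranslateOperation import TranslateOperation
from cura.Arranging.Arranger import Arranger

class LineArrange(Arranger):
    """Places nodes in a simple row along the x axis, 'spacing' millimetres apart."""

    def __init__(self, nodes, spacing: float = 20.0):
        self._nodes = nodes
        self._spacing = spacing

    def createGroupOperationForArrange(self, add_new_nodes_in_scene: bool = False):
        group = GroupedOperation()
        for index, node in enumerate(self._nodes):
            position = Vector(index * self._spacing, node.getWorldPosition().y, 0)
            group.addOperation(TranslateOperation(node, position, set_position = True))
        return group, 0  # naive sketch: every node is considered placed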


@ -0,0 +1,342 @@
import math
from typing import List, TYPE_CHECKING, Tuple, Set, Union
if TYPE_CHECKING:
from UM.Scene.SceneNode import SceneNode
from cura.BuildVolume import BuildVolume
from UM.Application import Application
from UM.Math.AxisAlignedBox import AxisAlignedBox
from UM.Math.Polygon import Polygon
from UM.Math.Vector import Vector
from UM.Operations.AddSceneNodeOperation import AddSceneNodeOperation
from UM.Operations.GroupedOperation import GroupedOperation
from UM.Operations.TranslateOperation import TranslateOperation
from cura.Arranging.Arranger import Arranger
class GridArrange(Arranger):
def __init__(self, nodes_to_arrange: List["SceneNode"], build_volume: "BuildVolume", fixed_nodes: List["SceneNode"] = None):
if fixed_nodes is None:
fixed_nodes = []
self._nodes_to_arrange = nodes_to_arrange
self._build_volume = build_volume
self._build_volume_bounding_box = build_volume.getBoundingBox()
self._fixed_nodes = fixed_nodes
self._margin_x: float = 1
self._margin_y: float = 1
self._grid_width = 0
self._grid_height = 0
for node in self._nodes_to_arrange:
bounding_box = node.getBoundingBox()
self._grid_width = max(self._grid_width, bounding_box.width)
self._grid_height = max(self._grid_height, bounding_box.depth)
self._grid_width += self._margin_x
self._grid_height += self._margin_y
# Round up the grid size to the nearest cm; this ensures that new objects will
# be placed at integer offsets from each other
grid_precision = 10 # 1cm
rounded_grid_width = math.ceil(self._grid_width / grid_precision) * grid_precision
rounded_grid_height = math.ceil(self._grid_height / grid_precision) * grid_precision
# The space added by the "grid precision rounding up" of the grid size
self._grid_round_margin_x = rounded_grid_width - self._grid_width
self._grid_round_margin_y = rounded_grid_height - self._grid_height
self._grid_width = rounded_grid_width
self._grid_height = rounded_grid_height
self._offset_x = 0
self._offset_y = 0
self._findOptimalGridOffset()
coord_initial_leftover_x = self._build_volume_bounding_box.right + 2 * self._grid_width
coord_initial_leftover_y = (self._build_volume_bounding_box.back + self._build_volume_bounding_box.front) * 0.5
self._initial_leftover_grid_x, self._initial_leftover_grid_y = self._coordSpaceToGridSpace(
coord_initial_leftover_x, coord_initial_leftover_y)
self._initial_leftover_grid_x = math.floor(self._initial_leftover_grid_x)
self._initial_leftover_grid_y = math.floor(self._initial_leftover_grid_y)
# Find grid indexes that intersect with fixed objects
self._fixed_nodes_grid_ids = set()
for node in self._fixed_nodes:
self._fixed_nodes_grid_ids = self._fixed_nodes_grid_ids.union(
self._intersectingGridIdxInclusive(node.getBoundingBox()))
# grid indexes that are in disallowed area
for polygon in self._build_volume.getDisallowedAreas():
self._fixed_nodes_grid_ids = self._fixed_nodes_grid_ids.union(self._intersectingGridIdxInclusive(polygon))
self._build_plate_grid_ids = self._intersectingGridIdxExclusive(self._build_volume_bounding_box)
# Filter out the corner grid squares if the build plate shape is elliptic
if self._build_volume.getShape() == "elliptic":
self._build_plate_grid_ids = set(
filter(lambda grid_id: self._checkGridUnderDiscSpace(grid_id[0], grid_id[1]),
self._build_plate_grid_ids))
self._allowed_grid_idx = self._build_plate_grid_ids.difference(self._fixed_nodes_grid_ids)
def createGroupOperationForArrange(self, add_new_nodes_in_scene: bool = False) -> Tuple[GroupedOperation, int]:
# Find the sequence in which items are placed
coord_build_plate_center_x = self._build_volume_bounding_box.width * 0.5 + self._build_volume_bounding_box.left
coord_build_plate_center_y = self._build_volume_bounding_box.depth * 0.5 + self._build_volume_bounding_box.back
grid_build_plate_center_x, grid_build_plate_center_y = self._coordSpaceToGridSpace(coord_build_plate_center_x,
coord_build_plate_center_y)
sequence: List[Tuple[int, int]] = list(self._allowed_grid_idx)
sequence.sort(key=lambda grid_id: (grid_build_plate_center_x - grid_id[0]) ** 2 + (
grid_build_plate_center_y - grid_id[1]) ** 2)
scene_root = Application.getInstance().getController().getScene().getRoot()
grouped_operation = GroupedOperation()
for grid_id, node in zip(sequence, self._nodes_to_arrange):
if add_new_nodes_in_scene:
grouped_operation.addOperation(AddSceneNodeOperation(node, scene_root))
grid_x, grid_y = grid_id
operation = self._moveNodeOnGrid(node, grid_x, grid_y)
grouped_operation.addOperation(operation)
leftover_nodes = self._nodes_to_arrange[len(sequence):]
left_over_grid_y = self._initial_leftover_grid_y
for node in leftover_nodes:
if add_new_nodes_in_scene:
grouped_operation.addOperation(AddSceneNodeOperation(node, scene_root))
# find the next grid position that isn't occupied by a fixed node
while (self._initial_leftover_grid_x, left_over_grid_y) in self._fixed_nodes_grid_ids:
left_over_grid_y = left_over_grid_y - 1
operation = self._moveNodeOnGrid(node, self._initial_leftover_grid_x, left_over_grid_y)
grouped_operation.addOperation(operation)
left_over_grid_y = left_over_grid_y - 1
return grouped_operation, len(leftover_nodes)
def _findOptimalGridOffset(self):
if len(self._fixed_nodes) == 0:
edge_disallowed_size = self._build_volume.getEdgeDisallowedSize()
self._offset_x = edge_disallowed_size
self._offset_y = edge_disallowed_size
return
if len(self._fixed_nodes) == 1:
center_grid_x = 0.5 * self._grid_width + self._build_volume_bounding_box.left
center_grid_y = 0.5 * self._grid_height + self._build_volume_bounding_box.back
bounding_box = self._fixed_nodes[0].getBoundingBox()
center_node_x = (bounding_box.left + bounding_box.right) * 0.5
center_node_y = (bounding_box.back + bounding_box.front) * 0.5
self._offset_x = center_node_x - center_grid_x
self._offset_y = center_node_y - center_grid_y
return
# If there are multiple fixed nodes, an optimal solution is not always possible
# We will try to find an offset that minimizes the number of grid intersections
# with fixed nodes. The algorithm below achieves this by utilizing a scanline
# algorithm. In this algorithm each axis is solved separately as offsetting
# is completely independent in each axis. The comments explaining the algorithm
# below are for the x-axis, but the same applies for the y-axis.
#
# Each node either occupies ceil((node.right - node.left) / grid_width) or
# ceil((node.right - node.left) / grid_width) + 1 grid squares. We will call
# these the node's "footprint".
#
# ┌────────────────┐
# minimum foot-print │ NODE │
# └────────────────┘
# │ grid 1 │ grid 2 │ grid 3 │ grid 4 | grid 5 |
# ┌────────────────┐
# maximum foot-print │ NODE │
# └────────────────┘
#
# The algorithm will find the grid offset such that the number of nodes with
# a _minimal_ footprint is _maximized_.
# The scanline algorithm works as follows: we create events for both end points
# of each node's footprint. Each event has two properties,
# - the coordinate: the amount the endpoint can move to the
# left before it crosses a grid line
# - the change: either +1 or -1, indicating whether crossing the grid line
# would result in a minimal footprint node becoming a maximal footprint node
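# For example (assumed numbers): with a grid_width of 30 mm, a fixed node spanning
# x = 12 mm .. 58 mm is 46 mm wide, so it covers ceil(46 / 30) = 2 grid squares when
# favourably aligned and 3 squares when it straddles an extra grid line; the offset
# chosen below maximizes how many nodes keep that minimal footprint.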
class Event:
def __init__(self, coord: float, change: float):
self.coord = coord
self.change = change
# create events for both the horizontal and vertical axis
events_horizontal: List[Event] = []
events_vertical: List[Event] = []
for node in self._fixed_nodes:
bounding_box = node.getBoundingBox()
left = bounding_box.left - self._build_volume_bounding_box.left
right = bounding_box.right - self._build_volume_bounding_box.left
back = bounding_box.back - self._build_volume_bounding_box.back
front = bounding_box.front - self._build_volume_bounding_box.back
value_left = math.ceil(left / self._grid_width) * self._grid_width - left
value_right = math.ceil(right / self._grid_width) * self._grid_width - right
value_back = math.ceil(back / self._grid_height) * self._grid_height - back
value_front = math.ceil(front / self._grid_height) * self._grid_height - front
# give nodes a weight according to their size. This
# weight is heuristically chosen to be proportional to
# the number of grid squares the node-boundary occupies
weight = bounding_box.width + bounding_box.depth
events_horizontal.append(Event(value_left, weight))
events_horizontal.append(Event(value_right, -weight))
events_vertical.append(Event(value_back, weight))
events_vertical.append(Event(value_front, -weight))
events_horizontal.sort(key=lambda event: event.coord)
events_vertical.sort(key=lambda event: event.coord)
def findOptimalShiftAxis(events: List[Event], interval: float) -> float:
# executing the actual scanline algorithm
# iteratively go through events (left to right) and keep track of the
# current footprint. The optimal location is the one with the minimal
# footprint. If there are multiple locations with the same minimal
# footprint, the optimal location is the one with the largest range
# between the left and right endpoint of the footprint.
prev_offset = events[-1].coord - interval
current_minimal_footprint_count = 0
best_minimal_footprint_count = float('inf')
best_offset_span = float('-inf')
best_offset = 0.0
for event in events:
offset_span = event.coord - prev_offset
if current_minimal_footprint_count < best_minimal_footprint_count or (
current_minimal_footprint_count == best_minimal_footprint_count and offset_span > best_offset_span):
best_minimal_footprint_count = current_minimal_footprint_count
best_offset_span = offset_span
best_offset = event.coord
current_minimal_footprint_count += event.change
prev_offset = event.coord
return best_offset - best_offset_span * 0.5
center_grid_x = 0.5 * self._grid_width
center_grid_y = 0.5 * self._grid_height
optimal_center_x = self._grid_width - findOptimalShiftAxis(events_horizontal, self._grid_width)
optimal_center_y = self._grid_height - findOptimalShiftAxis(events_vertical, self._grid_height)
self._offset_x = optimal_center_x - center_grid_x
self._offset_y = optimal_center_y - center_grid_y
def _moveNodeOnGrid(self, node: "SceneNode", grid_x: int, grid_y: int) -> "Operation.Operation":
coord_grid_x, coord_grid_y = self._gridSpaceToCoordSpace(grid_x, grid_y)
center_grid_x = coord_grid_x + (0.5 * self._grid_width)
center_grid_y = coord_grid_y + (0.5 * self._grid_height)
return TranslateOperation(node, Vector(center_grid_x, node.getWorldPosition().y, center_grid_y),
set_position=True)
def _getGridCornerPoints(
self,
bounds: Union[AxisAlignedBox, Polygon],
*,
margin_x: float = 0.0,
margin_y: float = 0.0
) -> Tuple[float, float, float, float]:
if isinstance(bounds, AxisAlignedBox):
coord_x1 = bounds.left - margin_x
coord_x2 = bounds.right + margin_x
coord_y1 = bounds.back - margin_y
coord_y2 = bounds.front + margin_y
elif isinstance(bounds, Polygon):
coord_x1 = float('inf')
coord_y1 = float('inf')
coord_x2 = float('-inf')
coord_y2 = float('-inf')
for x, y in bounds.getPoints():
coord_x1 = min(coord_x1, x)
coord_y1 = min(coord_y1, y)
coord_x2 = max(coord_x2, x)
coord_y2 = max(coord_y2, y)
else:
raise TypeError("bounds must be either an AxisAlignedBox or a Polygon")
coord_x1 -= margin_x
coord_x2 += margin_x
coord_y1 -= margin_y
coord_y2 += margin_y
grid_x1, grid_y1 = self._coordSpaceToGridSpace(coord_x1, coord_y1)
grid_x2, grid_y2 = self._coordSpaceToGridSpace(coord_x2, coord_y2)
return grid_x1, grid_y1, grid_x2, grid_y2
def _intersectingGridIdxInclusive(self, bounds: Union[AxisAlignedBox, Polygon]) -> Set[Tuple[int, int]]:
grid_x1, grid_y1, grid_x2, grid_y2 = self._getGridCornerPoints(
bounds,
margin_x=-(self._margin_x + self._grid_round_margin_x) * 0.5,
margin_y=-(self._margin_y + self._grid_round_margin_y) * 0.5,
)
grid_idx = set()
for grid_x in range(math.floor(grid_x1), math.ceil(grid_x2)):
for grid_y in range(math.floor(grid_y1), math.ceil(grid_y2)):
grid_idx.add((grid_x, grid_y))
return grid_idx
def _intersectingGridIdxExclusive(self, bounds: Union[AxisAlignedBox, Polygon]) -> Set[Tuple[int, int]]:
grid_x1, grid_y1, grid_x2, grid_y2 = self._getGridCornerPoints(
bounds,
margin_x=(self._margin_x + self._grid_round_margin_x) * 0.5,
margin_y=(self._margin_y + self._grid_round_margin_y) * 0.5,
)
grid_idx = set()
for grid_x in range(math.ceil(grid_x1), math.floor(grid_x2)):
for grid_y in range(math.ceil(grid_y1), math.floor(grid_y2)):
grid_idx.add((grid_x, grid_y))
return grid_idx
def _gridSpaceToCoordSpace(self, x: float, y: float) -> Tuple[float, float]:
grid_x = x * self._grid_width + self._build_volume_bounding_box.left + self._offset_x
grid_y = y * self._grid_height + self._build_volume_bounding_box.back + self._offset_y
return grid_x, grid_y
def _coordSpaceToGridSpace(self, grid_x: float, grid_y: float) -> Tuple[float, float]:
coord_x = (grid_x - self._build_volume_bounding_box.left - self._offset_x) / self._grid_width
coord_y = (grid_y - self._build_volume_bounding_box.back - self._offset_y) / self._grid_height
return coord_x, coord_y
def _checkGridUnderDiscSpace(self, grid_x: int, grid_y: int) -> bool:
left, back = self._gridSpaceToCoordSpace(grid_x, grid_y)
right, front = self._gridSpaceToCoordSpace(grid_x + 1, grid_y + 1)
corners = [(left, back), (right, back), (right, front), (left, front)]
return all([self._checkPointUnderDiscSpace(x, y) for x, y in corners])
def _checkPointUnderDiscSpace(self, x: float, y: float) -> bool:
disc_x, disc_y = self._coordSpaceToDiscSpace(x, y)
distance_to_center_squared = disc_x ** 2 + disc_y ** 2
return distance_to_center_squared <= 1.0
def _coordSpaceToDiscSpace(self, x: float, y: float) -> Tuple[float, float]:
# Transform coordinate system to
#
# coord_build_plate_left = -1
# | coord_build_plate_right = 1
# v (0,1) v
# ┌───────┬───────┐ < coord_build_plate_back = -1
# │ │ │
# │ │(0,0) │
# (-1,0)├───────o───────┤(1,0)
# │ │ │
# │ │ │
# └───────┴───────┘ < coord_build_plate_front = +1
# (0,-1)
disc_x = ((x - self._build_volume_bounding_box.left) / self._build_volume_bounding_box.width) * 2.0 - 1.0
disc_y = ((y - self._build_volume_bounding_box.back) / self._build_volume_bounding_box.depth) * 2.0 - 1.0
return disc_x, disc_y
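A hedged usage sketch (assumes a running Cura instance; the node lists are placeholders):

from UM.Application import Application
from cura.Arranging.GridArrange import GridArrange

build_volume = Application.getInstance().getBuildVolume()
arranger = GridArrange(selected_nodes, build_volume, fixed_nodes = locked_nodes)
all_placed = arranger.arrange()  # pushes a single undoable GroupedOperation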


@ -6,6 +6,7 @@ from pynest2d import Point, Box, Item, NfpConfig, nest
from typing import List, TYPE_CHECKING, Optional, Tuple
from UM.Application import Application
from UM.Decorators import deprecated
from UM.Logger import Logger
from UM.Math.Matrix import Matrix
from UM.Math.Polygon import Polygon
@ -15,148 +16,168 @@ from UM.Operations.AddSceneNodeOperation import AddSceneNodeOperation
from UM.Operations.GroupedOperation import GroupedOperation
from UM.Operations.RotateOperation import RotateOperation
from UM.Operations.TranslateOperation import TranslateOperation
from cura.Arranging.Arranger import Arranger
if TYPE_CHECKING:
from UM.Scene.SceneNode import SceneNode
from cura.BuildVolume import BuildVolume
def findNodePlacement(nodes_to_arrange: List["SceneNode"], build_volume: "BuildVolume", fixed_nodes: Optional[List["SceneNode"]] = None, factor = 10000) -> Tuple[bool, List[Item]]:
"""
Find placement for a set of scene nodes, but don't actually move them just yet.
:param nodes_to_arrange: The list of nodes that need to be moved.
:param build_volume: The build volume that we want to place the nodes in. It gets size & disallowed areas from this.
:param fixed_nodes: List of nodes that should not be moved, but should be used when deciding where the other nodes
are placed.
:param factor: The library that we use is int based. This factor defines how accurate we want it to be.
class Nest2DArrange(Arranger):
def __init__(self,
nodes_to_arrange: List["SceneNode"],
build_volume: "BuildVolume",
fixed_nodes: Optional[List["SceneNode"]] = None,
*,
factor: int = 10000,
lock_rotation: bool = False):
"""
:param nodes_to_arrange: The list of nodes that need to be moved.
:param build_volume: The build volume that we want to place the nodes in. It gets size & disallowed areas from this.
:param fixed_nodes: List of nodes that should not be moved, but should be used when deciding where the other nodes
are placed.
:param factor: The library that we use is int based. This factor defines how accurate we want it to be.
:param lock_rotation: If set to true the orientation of the object will remain the same
"""
super().__init__()
self._nodes_to_arrange = nodes_to_arrange
self._build_volume = build_volume
self._fixed_nodes = fixed_nodes
self._factor = factor
self._lock_rotation = lock_rotation
:return: tuple (found_solution_for_all, node_items)
WHERE
found_solution_for_all: Whether the algorithm found a place on the buildplate for all the objects
node_items: A list of the nodes returned by libnest2d, which contain the new positions on the buildplate
"""
spacing = int(1.5 * factor) # 1.5mm spacing.
def findNodePlacement(self) -> Tuple[bool, List[Item]]:
spacing = int(1.5 * self._factor) # 1.5mm spacing.
machine_width = build_volume.getWidth()
machine_depth = build_volume.getDepth()
build_plate_bounding_box = Box(int(machine_width * factor), int(machine_depth * factor))
edge_disallowed_size = self._build_volume.getEdgeDisallowedSize()
machine_width = self._build_volume.getWidth() - (edge_disallowed_size * 2)
machine_depth = self._build_volume.getDepth() - (edge_disallowed_size * 2)
build_plate_bounding_box = Box(int(machine_width * self._factor), int(machine_depth * self._factor))
if fixed_nodes is None:
fixed_nodes = []
if self._fixed_nodes is None:
self._fixed_nodes = []
# Add all the items we want to arrange
node_items = []
for node in nodes_to_arrange:
hull_polygon = node.callDecoration("getConvexHull")
if not hull_polygon or hull_polygon.getPoints is None:
Logger.log("w", "Object {} cannot be arranged because it has no convex hull.".format(node.getName()))
continue
converted_points = []
for point in hull_polygon.getPoints():
converted_points.append(Point(int(point[0] * factor), int(point[1] * factor)))
item = Item(converted_points)
node_items.append(item)
# Use a tiny margin for the build_plate_polygon (the nesting doesn't like overlapping disallowed areas)
half_machine_width = 0.5 * machine_width - 1
half_machine_depth = 0.5 * machine_depth - 1
build_plate_polygon = Polygon(numpy.array([
[half_machine_width, -half_machine_depth],
[-half_machine_width, -half_machine_depth],
[-half_machine_width, half_machine_depth],
[half_machine_width, half_machine_depth]
], numpy.float32))
disallowed_areas = build_volume.getDisallowedAreas()
num_disallowed_areas_added = 0
for area in disallowed_areas:
converted_points = []
# Clip the disallowed areas so that they don't overlap the bounding box (The arranger chokes otherwise)
clipped_area = area.intersectionConvexHulls(build_plate_polygon)
if clipped_area.getPoints() is not None and len(clipped_area.getPoints()) > 2: # numpy array has to be explicitly checked against None
for point in clipped_area.getPoints():
converted_points.append(Point(int(point[0] * factor), int(point[1] * factor)))
disallowed_area = Item(converted_points)
disallowed_area.markAsDisallowedAreaInBin(0)
node_items.append(disallowed_area)
num_disallowed_areas_added += 1
for node in fixed_nodes:
converted_points = []
hull_polygon = node.callDecoration("getConvexHull")
if hull_polygon is not None and hull_polygon.getPoints() is not None and len(hull_polygon.getPoints()) > 2: # numpy array has to be explicitly checked against None
# Add all the items we want to arrange
node_items = []
for node in self._nodes_to_arrange:
hull_polygon = node.callDecoration("getConvexHull")
if not hull_polygon or hull_polygon.getPoints is None:
Logger.log("w", "Object {} cannot be arranged because it has no convex hull.".format(node.getName()))
continue
converted_points = []
for point in hull_polygon.getPoints():
converted_points.append(Point(int(point[0] * factor), int(point[1] * factor)))
converted_points.append(Point(int(point[0] * self._factor), int(point[1] * self._factor)))
item = Item(converted_points)
item.markAsFixedInBin(0)
node_items.append(item)
num_disallowed_areas_added += 1
config = NfpConfig()
config.accuracy = 1.0
# Use a tiny margin for the build_plate_polygon (the nesting doesn't like overlapping disallowed areas)
half_machine_width = 0.5 * machine_width - 1
half_machine_depth = 0.5 * machine_depth - 1
build_plate_polygon = Polygon(numpy.array([
[half_machine_width, -half_machine_depth],
[-half_machine_width, -half_machine_depth],
[-half_machine_width, half_machine_depth],
[half_machine_width, half_machine_depth]
], numpy.float32))
num_bins = nest(node_items, build_plate_bounding_box, spacing, config)
disallowed_areas = self._build_volume.getDisallowedAreas()
for area in disallowed_areas:
converted_points = []
# Strip the fixed items (previously placed) and the disallowed areas from the results again.
node_items = list(filter(lambda item: not item.isFixed(), node_items))
# Clip the disallowed areas so that they don't overlap the bounding box (The arranger chokes otherwise)
clipped_area = area.intersectionConvexHulls(build_plate_polygon)
found_solution_for_all = num_bins == 1
if clipped_area.getPoints() is not None and len(
clipped_area.getPoints()) > 2: # numpy array has to be explicitly checked against None
for point in clipped_area.getPoints():
converted_points.append(Point(int(point[0] * self._factor), int(point[1] * self._factor)))
return found_solution_for_all, node_items
disallowed_area = Item(converted_points)
disallowed_area.markAsDisallowedAreaInBin(0)
node_items.append(disallowed_area)
for node in self._fixed_nodes:
converted_points = []
hull_polygon = node.callDecoration("getConvexHull")
if hull_polygon is not None and hull_polygon.getPoints() is not None and len(
hull_polygon.getPoints()) > 2: # numpy array has to be explicitly checked against None
for point in hull_polygon.getPoints():
converted_points.append(Point(int(point[0] * self._factor), int(point[1] * self._factor)))
item = Item(converted_points)
item.markAsFixedInBin(0)
node_items.append(item)
strategies = [NfpConfig.Alignment.CENTER] * 3 + [NfpConfig.Alignment.BOTTOM_LEFT] * 3
found_solution_for_all = False
while not found_solution_for_all and len(strategies) > 0:
config = NfpConfig()
config.accuracy = 1.0
config.alignment = NfpConfig.Alignment.CENTER
config.starting_point = strategies[0]
strategies = strategies[1:]
if self._lock_rotation:
config.rotations = [0.0]
num_bins = nest(node_items, build_plate_bounding_box, spacing, config)
# Strip the fixed items (previously placed) and the disallowed areas from the results again.
node_items = list(filter(lambda item: not item.isFixed(), node_items))
found_solution_for_all = num_bins == 1
return found_solution_for_all, node_items
def createGroupOperationForArrange(self, add_new_nodes_in_scene: bool = False) -> Tuple[GroupedOperation, int]:
scene_root = Application.getInstance().getController().getScene().getRoot()
found_solution_for_all, node_items = self.findNodePlacement()
not_fit_count = 0
grouped_operation = GroupedOperation()
for node, node_item in zip(self._nodes_to_arrange, node_items):
if add_new_nodes_in_scene:
grouped_operation.addOperation(AddSceneNodeOperation(node, scene_root))
if node_item.binId() == 0:
# We found a spot for it
rotation_matrix = Matrix()
rotation_matrix.setByRotationAxis(node_item.rotation(), Vector(0, -1, 0))
grouped_operation.addOperation(RotateOperation(node, Quaternion.fromMatrix(rotation_matrix)))
grouped_operation.addOperation(
TranslateOperation(node, Vector(node_item.translation().x() / self._factor, 0,
node_item.translation().y() / self._factor)))
else:
# We didn't find a spot
grouped_operation.addOperation(
TranslateOperation(node, Vector(200, node.getWorldPosition().y, -not_fit_count * 20), set_position = True))
not_fit_count += 1
return grouped_operation, not_fit_count
@deprecated("Use the Nest2DArrange class instead")
def findNodePlacement(nodes_to_arrange: List["SceneNode"], build_volume: "BuildVolume",
fixed_nodes: Optional[List["SceneNode"]] = None, factor=10000) -> Tuple[bool, List[Item]]:
arranger = Nest2DArrange(nodes_to_arrange, build_volume, fixed_nodes, factor=factor)
return arranger.findNodePlacement()
@deprecated("Use the Nest2DArrange class instead")
def createGroupOperationForArrange(nodes_to_arrange: List["SceneNode"],
build_volume: "BuildVolume",
fixed_nodes: Optional[List["SceneNode"]] = None,
factor = 10000,
add_new_nodes_in_scene: bool = False) -> Tuple[GroupedOperation, int]:
scene_root = Application.getInstance().getController().getScene().getRoot()
found_solution_for_all, node_items = findNodePlacement(nodes_to_arrange, build_volume, fixed_nodes, factor)
not_fit_count = 0
grouped_operation = GroupedOperation()
for node, node_item in zip(nodes_to_arrange, node_items):
if add_new_nodes_in_scene:
grouped_operation.addOperation(AddSceneNodeOperation(node, scene_root))
if node_item.binId() == 0:
# We found a spot for it
rotation_matrix = Matrix()
rotation_matrix.setByRotationAxis(node_item.rotation(), Vector(0, -1, 0))
grouped_operation.addOperation(RotateOperation(node, Quaternion.fromMatrix(rotation_matrix)))
grouped_operation.addOperation(TranslateOperation(node, Vector(node_item.translation().x() / factor, 0,
node_item.translation().y() / factor)))
else:
# We didn't find a spot
grouped_operation.addOperation(
TranslateOperation(node, Vector(200, node.getWorldPosition().y, -not_fit_count * 20), set_position = True))
not_fit_count += 1
return grouped_operation, not_fit_count
factor=10000,
add_new_nodes_in_scene: bool = False) -> Tuple[GroupedOperation, int]:
arranger = Nest2DArrange(nodes_to_arrange, build_volume, fixed_nodes, factor=factor)
return arranger.createGroupOperationForArrange(add_new_nodes_in_scene=add_new_nodes_in_scene)
@deprecated("Use the Nest2DArrange class instead")
def arrange(nodes_to_arrange: List["SceneNode"],
build_volume: "BuildVolume",
fixed_nodes: Optional[List["SceneNode"]] = None,
factor = 10000,
factor=10000,
add_new_nodes_in_scene: bool = False) -> bool:
"""
Find placement for a set of scene nodes, and move them by using a single grouped operation.
:param nodes_to_arrange: The list of nodes that need to be moved.
:param build_volume: The build volume that we want to place the nodes in. It gets size & disallowed areas from this.
:param fixed_nodes: List of nodes that should not be moved, but should be used when deciding where the other nodes
are placed.
:param factor: The library that we use is int based. This factor defines how accurate we want it to be.
:param add_new_nodes_in_scene: Whether to create new scene nodes before applying the transformations and rotations
:return: found_solution_for_all: Whether the algorithm found a place on the buildplate for all the objects
"""
grouped_operation, not_fit_count = createGroupOperationForArrange(nodes_to_arrange, build_volume, fixed_nodes, factor, add_new_nodes_in_scene)
grouped_operation.push()
return not_fit_count == 0
arranger = Nest2DArrange(nodes_to_arrange, build_volume, fixed_nodes, factor=factor)
return arranger.arrange(add_new_nodes_in_scene=add_new_nodes_in_scene)
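A hedged sketch of the new-style call that replaces the deprecated module-level helpers above (node lists are placeholders):

from UM.Application import Application
from UM.Logger import Logger
from cura.Arranging.Nest2DArrange import Nest2DArrange

arranger = Nest2DArrange(nodes_to_arrange, Application.getInstance().getBuildVolume(),
                         fixed_nodes, factor = 10000, lock_rotation = True)
if not arranger.arrange():
    Logger.log("w", "Not all objects could be placed within the build volume.")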

cura/BackendPlugin.py (new file, 141 lines)

@ -0,0 +1,141 @@
# Copyright (c) 2023 Ultimaker B.V.
# Cura is released under the terms of the LGPLv3 or higher.
import socket
import os
import subprocess
from typing import Optional, List
from UM.Logger import Logger
from UM.Message import Message
from UM.Settings.AdditionalSettingDefinitionsAppender import AdditionalSettingDefinitionsAppender
from UM.PluginObject import PluginObject
from UM.i18n import i18nCatalog
from UM.Platform import Platform
from UM.Resources import Resources
class BackendPlugin(AdditionalSettingDefinitionsAppender, PluginObject):
catalog = i18nCatalog("cura")
settings_catalog = i18nCatalog("fdmprinter.def.json")
def __init__(self, catalog_i18n = settings_catalog) -> None:
super().__init__(catalog_i18n)
self.__port: int = 0
self._plugin_address: str = "127.0.0.1"
self._plugin_command: Optional[List[str]] = None
self._process = None
self._is_running = False
self._supported_slots: List[int] = []
self._use_plugin = True
def usePlugin(self) -> bool:
return self._use_plugin
def getSupportedSlots(self) -> List[int]:
return self._supported_slots
def isRunning(self):
return self._is_running
def setPort(self, port: int) -> None:
self.__port = port
def getPort(self) -> int:
return self.__port
def getAddress(self) -> str:
return self._plugin_address
def setAvailablePort(self) -> None:
"""
Sets the port to a random available port.
"""
sock = socket.socket()
sock.bind((self.getAddress(), 0))
port = sock.getsockname()[1]
self.setPort(port)
def _validatePluginCommand(self) -> list[str]:
"""
Validate the plugin command and add the port parameter if it is missing.
:return: A list of strings containing the validated plugin command.
"""
if not self._plugin_command or "--port" in self._plugin_command:
return self._plugin_command or []
return self._plugin_command + ["--address", self.getAddress(), "--port", str(self.__port)]
def start(self) -> bool:
"""
Starts the backend_plugin process.
:return: True if the plugin process started successfully, False otherwise.
"""
if not self.usePlugin():
return False
Logger.info(f"Starting backend_plugin [{self._plugin_id}] with command: {self._validatePluginCommand()}")
plugin_log_path = os.path.join(Resources.getDataStoragePath(), f"{self.getPluginId()}.log")
if os.path.exists(plugin_log_path):
try:
os.remove(plugin_log_path)
except:
pass # removing is only done so the log doesn't grow out of proportion; if it fails once or twice that is okay
Logger.info(f"Logging plugin output to: {plugin_log_path}")
try:
# STDIN needs to be None because we provide no input, but communicate via a local socket instead.
# The NUL device doesn't exist on some computers.
with open(plugin_log_path, 'a') as f:
popen_kwargs = {
"stdin": None,
"stdout": f, # Redirect output to file
"stderr": subprocess.STDOUT, # Combine stderr and stdout
}
if Platform.isWindows():
popen_kwargs["creationflags"] = subprocess.CREATE_NO_WINDOW
self._process = subprocess.Popen(self._validatePluginCommand(), **popen_kwargs)
self._is_running = True
return True
except PermissionError:
Logger.log("e", f"Couldn't start EnginePlugin: {self._plugin_id} No permission to execute process.")
self._showMessage(self.catalog.i18nc("@info:plugin_failed",
f"Couldn't start EnginePlugin: {self._plugin_id}\nNo permission to execute process."),
message_type = Message.MessageType.ERROR)
except FileNotFoundError:
Logger.logException("e", f"Unable to find local EnginePlugin server executable for: {self._plugin_id}")
self._showMessage(self.catalog.i18nc("@info:plugin_failed",
f"Unable to find local EnginePlugin server executable for: {self._plugin_id}"),
message_type = Message.MessageType.ERROR)
except BlockingIOError:
Logger.logException("e", f"Couldn't start EnginePlugin: {self._plugin_id} Resource is temporarily unavailable")
self._showMessage(self.catalog.i18nc("@info:plugin_failed",
f"Couldn't start EnginePlugin: {self._plugin_id}\nResource is temporarily unavailable"),
message_type = Message.MessageType.ERROR)
except OSError as e:
Logger.logException("e", f"Couldn't start EnginePlugin {self._plugin_id} Operating system is blocking it (antivirus?)")
self._showMessage(self.catalog.i18nc("@info:plugin_failed",
f"Couldn't start EnginePlugin: {self._plugin_id}\nOperating system is blocking it (antivirus?)"),
message_type = Message.MessageType.ERROR)
return False
def stop(self) -> bool:
if not self._process:
self._is_running = False
return True # Nothing to stop
try:
self._process.terminate()
return_code = self._process.wait()
self._is_running = False
Logger.log("d", f"EnginePlugin: {self._plugin_id} was killed. Received return code {return_code}")
return True
except PermissionError:
Logger.log("e", f"Unable to kill running EnginePlugin: {self._plugin_id} Access is denied.")
self._showMessage(self.catalog.i18nc("@info:plugin_failed",
f"Unable to kill running EnginePlugin: {self._plugin_id}\nAccess is denied."),
message_type = Message.MessageType.ERROR)
return False
def _showMessage(self, message: str, message_type: Message.MessageType = Message.MessageType.ERROR) -> None:
Message(message, title=self.catalog.i18nc("@info:title", "EnginePlugin"), message_type = message_type).show()
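For orientation, a hedged sketch of how a concrete backend plugin would typically hook into this class; the subclass name, plugin id and executable name below are made up for illustration:

# Illustrative subclass; real plugins derive their id and executable from their own metadata.
class ExampleEnginePlugin(BackendPlugin):
    def __init__(self) -> None:
        super().__init__()
        self._plugin_id = "ExampleEnginePlugin"
        self.setAvailablePort()  # reserve a free local port before building the command
        # _validatePluginCommand() appends --address/--port to this command if they are missing.
        self._plugin_command = [os.path.join(os.path.dirname(__file__), "example_engine_plugin_server")]

start() then launches that executable with its output redirected to <data storage>/<plugin id>.log, and stop() terminates it again.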

View file

@ -120,6 +120,8 @@ class BuildVolume(SceneNode):
# Objects loaded at the moment. We are connected to the property changed events of these objects.
self._scene_objects = set() # type: Set[SceneNode]
# Number of top-level printable meshes. If there is more than one, the build volume needs to take the gantry height into account in One at a Time printing.
self._root_printable_object_count = 0
self._scene_change_timer = QTimer()
self._scene_change_timer.setInterval(200)
@ -151,6 +153,7 @@ class BuildVolume(SceneNode):
def _onSceneChangeTimerFinished(self):
root = self._application.getController().getScene().getRoot()
new_scene_objects = set(node for node in BreadthFirstIterator(root) if node.callDecoration("isSliceable"))
if new_scene_objects != self._scene_objects:
for node in new_scene_objects - self._scene_objects: #Nodes that were added to the scene.
self._updateNodeListeners(node)
@ -166,6 +169,26 @@ class BuildVolume(SceneNode):
self.rebuild()
self._scene_objects = new_scene_objects
# This also needs to be called when objects are grouped/ungrouped,
# which is not reflected in a change in self._scene_objects
self._updateRootPrintableObjectCount()
def _updateRootPrintableObjectCount(self):
# Get the number of models in the scene root, excluding modifier meshes and counting grouped models as 1
root = self._application.getController().getScene().getRoot()
scene_objects = set(node for node in BreadthFirstIterator(root) if node.callDecoration("isSliceable") or node.callDecoration("isGroup"))
new_root_printable_object_count = len(list(node for node in scene_objects if node.getParent() == root and not (
node_stack := node.callDecoration("getStack") and (
node.callDecoration("getStack").getProperty("anti_overhang_mesh", "value") or
node.callDecoration("getStack").getProperty("support_mesh", "value") or
node.callDecoration("getStack").getProperty("cutting_mesh", "value") or
node.callDecoration("getStack").getProperty("infill_mesh", "value")
))
))
if new_root_printable_object_count != self._root_printable_object_count:
self._root_printable_object_count = new_root_printable_object_count
self._onSettingPropertyChanged("print_sequence", "value") # Create fake event, so right settings are triggered.
def _updateNodeListeners(self, node: SceneNode):
@ -203,6 +226,9 @@ class BuildVolume(SceneNode):
if shape:
self._shape = shape
def getShape(self) -> str:
return self._shape
def getDiagonalSize(self) -> float:
"""Get the length of the 3D diagonal through the build volume.
@ -486,20 +512,20 @@ class BuildVolume(SceneNode):
if not self._disallowed_areas:
return None
bounding_box = Polygon(numpy.array([[min_w, min_d], [min_w, max_d], [max_w, max_d], [max_w, min_d]], numpy.float32))
mb = MeshBuilder()
color = self._disallowed_area_color
for polygon in self._disallowed_areas:
points = polygon.getPoints()
if len(points) == 0:
intersection = polygon.intersectionConvexHulls(bounding_box)
points = numpy.flipud(intersection.getPoints())
if len(points) < 3:
continue
first = Vector(self._clamp(points[0][0], min_w, max_w), disallowed_area_height,
self._clamp(points[0][1], min_d, max_d))
previous_point = Vector(self._clamp(points[0][0], min_w, max_w), disallowed_area_height,
self._clamp(points[0][1], min_d, max_d))
for point in points:
new_point = Vector(self._clamp(point[0], min_w, max_w), disallowed_area_height,
self._clamp(point[1], min_d, max_d))
first = Vector(points[0][0], disallowed_area_height, points[0][1])
previous_point = Vector(points[1][0], disallowed_area_height, points[1][1])
for point in points[2:]:
new_point = Vector(point[0], disallowed_area_height, point[1])
mb.addFace(first, previous_point, new_point, color=color)
previous_point = new_point
@ -647,12 +673,14 @@ class BuildVolume(SceneNode):
self._width = self._global_container_stack.getProperty("machine_width", "value")
machine_height = self._global_container_stack.getProperty("machine_height", "value")
if self._global_container_stack.getProperty("print_sequence", "value") == "one_at_a_time" and len(self._scene_objects) > 1:
self._height = min(self._global_container_stack.getProperty("gantry_height", "value") * self._scale_vector.z, machine_height)
if self._height < (machine_height * self._scale_vector.z):
if self._global_container_stack.getProperty("print_sequence", "value") == "one_at_a_time" and self._root_printable_object_count > 1:
new_height = min(self._global_container_stack.getProperty("gantry_height", "value") * self._scale_vector.z, machine_height)
if self._height > new_height:
self._build_volume_message.show()
else:
elif self._height < new_height:
self._build_volume_message.hide()
self._height = new_height
else:
self._height = self._global_container_stack.getProperty("machine_height", "value")
self._build_volume_message.hide()
@ -687,14 +715,21 @@ class BuildVolume(SceneNode):
update_extra_z_clearance = True
for setting_key in self._changed_settings_since_last_rebuild:
if setting_key in ["print_sequence", "support_mesh", "infill_mesh", "cutting_mesh", "anti_overhang_mesh"]:
self._updateRootPrintableObjectCount()
if setting_key == "print_sequence":
machine_height = self._global_container_stack.getProperty("machine_height", "value")
if self._application.getGlobalContainerStack().getProperty("print_sequence", "value") == "one_at_a_time" and len(self._scene_objects) > 1:
self._height = min(self._global_container_stack.getProperty("gantry_height", "value") * self._scale_vector.z, machine_height)
if self._height < (machine_height * self._scale_vector.z):
if self._application.getGlobalContainerStack().getProperty("print_sequence", "value") == "one_at_a_time" and self._root_printable_object_count > 1:
new_height = min(
self._global_container_stack.getProperty("gantry_height", "value") * self._scale_vector.z,
machine_height)
if self._height > new_height:
self._build_volume_message.show()
else:
elif self._height < new_height:
self._build_volume_message.hide()
self._height = new_height
else:
self._height = self._global_container_stack.getProperty("machine_height", "value") * self._scale_vector.z
self._build_volume_message.hide()
@ -804,7 +839,7 @@ class BuildVolume(SceneNode):
prime_tower_areas = self._computeDisallowedAreasPrinted(used_extruders)
for extruder_id in prime_tower_areas:
for area_index, prime_tower_area in enumerate(prime_tower_areas[extruder_id]):
for area in result_areas[extruder_id]:
for area in result_areas_no_brim[extruder_id]:
if prime_tower_area.intersectsPolygon(area) is not None:
prime_tower_collision = True
break
@ -851,13 +886,24 @@ class BuildVolume(SceneNode):
machine_depth = self._global_container_stack.getProperty("machine_depth", "value")
prime_tower_x = self._global_container_stack.getProperty("prime_tower_position_x", "value")
prime_tower_y = - self._global_container_stack.getProperty("prime_tower_position_y", "value")
prime_tower_brim_enable = self._global_container_stack.getProperty("prime_tower_brim_enable", "value")
prime_tower_base_size = self._global_container_stack.getProperty("prime_tower_base_size", "value")
prime_tower_base_height = self._global_container_stack.getProperty("prime_tower_base_height", "value")
adhesion_type = self._global_container_stack.getProperty("adhesion_type", "value")
if not self._global_container_stack.getProperty("machine_center_is_zero", "value"):
prime_tower_x = prime_tower_x - machine_width / 2 #Offset by half machine_width and _depth to put the origin in the front-left.
prime_tower_y = prime_tower_y + machine_depth / 2
radius = prime_tower_size / 2
prime_tower_area = Polygon.approximatedCircle(radius, num_segments = 24)
prime_tower_area = prime_tower_area.translate(prime_tower_x - radius, prime_tower_y - radius)
delta_x = -radius
delta_y = -radius
if prime_tower_base_size > 0 and ((prime_tower_brim_enable and prime_tower_base_height > 0) or adhesion_type == "raft"):
radius += prime_tower_base_size
prime_tower_area = Polygon.approximatedCircle(radius, num_segments = 32)
prime_tower_area = prime_tower_area.translate(prime_tower_x + delta_x, prime_tower_y + delta_y)
prime_tower_area = prime_tower_area.getMinkowskiHull(Polygon.approximatedCircle(0))
for extruder in used_extruders:
@ -1162,7 +1208,7 @@ class BuildVolume(SceneNode):
_raft_settings = ["adhesion_type", "raft_base_thickness", "raft_interface_layers", "raft_interface_thickness", "raft_surface_layers", "raft_surface_thickness", "raft_airgap", "layer_0_z_overlap"]
_extra_z_settings = ["retraction_hop_enabled", "retraction_hop"]
_prime_settings = ["extruder_prime_pos_x", "extruder_prime_pos_y", "prime_blob_enable"]
_tower_settings = ["prime_tower_enable", "prime_tower_size", "prime_tower_position_x", "prime_tower_position_y", "prime_tower_brim_enable"]
_tower_settings = ["prime_tower_enable", "prime_tower_size", "prime_tower_position_x", "prime_tower_position_y", "prime_tower_brim_enable", "prime_tower_base_size", "prime_tower_base_height"]
_ooze_shield_settings = ["ooze_shield_enabled", "ooze_shield_dist"]
_distance_settings = ["infill_wipe_dist", "travel_avoid_distance", "support_offset", "support_enable", "travel_avoid_other_parts", "travel_avoid_supports", "wall_line_count", "wall_line_width_0", "wall_line_width_x"]
_extruder_settings = ["support_enable", "support_bottom_enable", "support_roof_enable", "support_infill_extruder_nr", "support_extruder_nr_layer_0", "support_bottom_extruder_nr", "support_roof_extruder_nr", "brim_line_count", "skirt_brim_extruder_nr", "raft_base_extruder_nr", "raft_interface_extruder_nr", "raft_surface_extruder_nr", "adhesion_type"] #Settings that can affect which extruders are used.

View file

@ -22,7 +22,7 @@ except ImportError:
from PyQt6.QtCore import QT_VERSION_STR, PYQT_VERSION_STR, QUrl
from PyQt6.QtWidgets import QDialog, QDialogButtonBox, QVBoxLayout, QLabel, QTextEdit, QGroupBox, QCheckBox, QPushButton
from PyQt6.QtGui import QDesktopServices
from PyQt6.QtGui import QDesktopServices, QTextCursor
from UM.Application import Application
from UM.Logger import Logger
@ -309,7 +309,7 @@ class CrashHandler:
trace = "".join(trace_list)
text_area.setText(trace)
text_area.setReadOnly(True)
text_area.moveCursor(QTextCursor.MoveOperation.End) # Move cursor to end, so we see last bit of the exception
layout.addWidget(text_area)
group.setLayout(layout)
@ -400,7 +400,7 @@ class CrashHandler:
text_area.setText(logdata)
text_area.setReadOnly(True)
text_area.moveCursor(QTextCursor.MoveOperation.End) # Move cursor to end, so we see last bit of the log
layout.addWidget(text_area)
group.setLayout(layout)

View file

@ -1,15 +1,18 @@
# Copyright (c) 2018 Ultimaker B.V.
# Copyright (c) 2023 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.
from PyQt6.QtCore import QObject, QUrl
from PyQt6.QtGui import QDesktopServices
from typing import List, cast
from PyQt6.QtCore import QObject, QUrl, pyqtSignal, pyqtProperty
from PyQt6.QtGui import QDesktopServices
from PyQt6.QtWidgets import QApplication
from UM.Application import Application
from UM.Event import CallFunctionEvent
from UM.FlameProfiler import pyqtSlot
from UM.Math.Vector import Vector
from UM.Scene.Selection import Selection
from UM.Scene.Iterator.BreadthFirstIterator import BreadthFirstIterator
from UM.Scene.Iterator.DepthFirstIterator import DepthFirstIterator
from UM.Operations.GroupedOperation import GroupedOperation
from UM.Operations.RemoveSceneNodeOperation import RemoveSceneNodeOperation
from UM.Operations.TranslateOperation import TranslateOperation
@ -20,16 +23,23 @@ from cura.MultiplyObjectsJob import MultiplyObjectsJob
from cura.Settings.SetObjectExtruderOperation import SetObjectExtruderOperation
from cura.Settings.ExtruderManager import ExtruderManager
from cura.Arranging.GridArrange import GridArrange
from cura.Arranging.Nest2DArrange import Nest2DArrange
from cura.Operations.SetBuildPlateNumberOperation import SetBuildPlateNumberOperation
from UM.Logger import Logger
from UM.Scene.SceneNode import SceneNode
class CuraActions(QObject):
def __init__(self, parent: QObject = None) -> None:
super().__init__(parent)
self._operation_stack = Application.getInstance().getOperationStack()
self._operation_stack.changed.connect(self._onUndoStackChanged)
undoStackChanged = pyqtSignal()
@pyqtSlot()
def openDocumentation(self) -> None:
# Starting a web browser from a signal handler connected to a menu will crash on windows.
@ -38,6 +48,25 @@ class CuraActions(QObject):
event = CallFunctionEvent(self._openUrl, [QUrl("https://ultimaker.com/en/resources/manuals/software?utm_source=cura&utm_medium=software&utm_campaign=dropdown-documentation")], {})
cura.CuraApplication.CuraApplication.getInstance().functionEvent(event)
@pyqtProperty(bool, notify=undoStackChanged)
def canUndo(self):
return self._operation_stack.canUndo()
@pyqtProperty(bool, notify=undoStackChanged)
def canRedo(self):
return self._operation_stack.canRedo()
@pyqtSlot()
def undo(self):
self._operation_stack.undo()
@pyqtSlot()
def redo(self):
self._operation_stack.redo()
def _onUndoStackChanged(self):
self.undoStackChanged.emit()
@pyqtSlot()
def openBugReportPage(self) -> None:
event = CallFunctionEvent(self._openUrl, [QUrl("https://github.com/Ultimaker/Cura/issues/new/choose")], {})
@ -78,16 +107,25 @@ class CuraActions(QObject):
center_operation = TranslateOperation(current_node, Vector(0, center_y, 0), set_position = True)
operation.addOperation(center_operation)
operation.push()
@pyqtSlot(int)
def multiplySelection(self, count: int) -> None:
"""Multiply all objects in the selection
:param count: The number of times to multiply the selection.
"""
min_offset = cura.CuraApplication.CuraApplication.getInstance().getBuildVolume().getEdgeDisallowedSize() + 2 # Allow for some rounding errors
job = MultiplyObjectsJob(Selection.getAllSelectedObjects(), count, min_offset = max(min_offset, 8))
job.start()
@pyqtSlot(int)
def multiplySelectionToGrid(self, count: int) -> None:
"""Multiply all objects in the selection
:param count: The number of times to multiply the selection.
"""
min_offset = cura.CuraApplication.CuraApplication.getInstance().getBuildVolume().getEdgeDisallowedSize() + 2 # Allow for some rounding errors
job = MultiplyObjectsJob(Selection.getAllSelectedObjects(), count, min_offset = max(min_offset, 8))
job = MultiplyObjectsJob(Selection.getAllSelectedObjects(), count, min_offset=max(min_offset, 8),
grid_arrange=True)
job.start()
@pyqtSlot()
@ -181,5 +219,64 @@ class CuraActions(QObject):
Selection.clear()
@pyqtSlot()
def cut(self) -> None:
self.copy()
self.deleteSelection()
@pyqtSlot()
def copy(self) -> None:
mesh_writer = cura.CuraApplication.CuraApplication.getInstance().getMeshFileHandler().getWriter("3MFWriter")
if not mesh_writer:
Logger.log("e", "No 3MF writer found, unable to copy.")
return
# Get the selected nodes
selected_objects = Selection.getAllSelectedObjects()
# Serialize the nodes to a string
scene_string = mesh_writer.sceneNodesToString(selected_objects)
# Put the string on the clipboard
QApplication.clipboard().setText(scene_string)
@pyqtSlot()
def paste(self) -> None:
application = cura.CuraApplication.CuraApplication.getInstance()
mesh_reader = application.getMeshFileHandler().getReaderForFile(".3mf")
if not mesh_reader:
Logger.log("e", "No 3MF reader found, unable to paste.")
return
# Parse the scene from the clipboard
scene_string = QApplication.clipboard().text()
nodes = mesh_reader.stringToSceneNodes(scene_string)
if not nodes:
# Nothing to paste
return
# Find all fixed nodes, these are the nodes that should be avoided when arranging
fixed_nodes = []
root = application.getController().getScene().getRoot()
for node in DepthFirstIterator(root):
# Only count sliceable objects
if node.callDecoration("isSliceable"):
fixed_nodes.append(node)
# Add the new nodes to the scene, and arrange them
arranger = GridArrange(nodes, application.getBuildVolume(), fixed_nodes)
group_operation, not_fit_count = arranger.createGroupOperationForArrange(add_new_nodes_in_scene = True)
group_operation.push()
# deselect currently selected nodes, and select the new nodes
for node in Selection.getAllSelectedObjects():
Selection.remove(node)
numberOfFixedNodes = len(fixed_nodes)
for node in nodes:
numberOfFixedNodes += 1
node.printOrder = numberOfFixedNodes
Selection.add(node)
def _openUrl(self, url: QUrl) -> None:
QDesktopServices.openUrl(url)
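The copy/cut/paste slots above round-trip the current selection through the 3MF writer and reader via the system clipboard. A rough usage sketch, assuming cura_actions is a reference to this CuraActions instance (it is exposed to QML as the CuraActions context property):

# Illustrative sketch: duplicate the current selection via the clipboard-backed slots.
cura_actions.copy()   # selection -> 3MF string -> clipboard
cura_actions.paste()  # clipboard -> new scene nodes, grid-arranged around the existing (fixed) nodes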

View file

@ -1,22 +1,27 @@
# Copyright (c) 2022 Ultimaker B.V.
# Copyright (c) 2023 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.
import enum
import os
import re
import sys
import tempfile
import time
import platform
from pathlib import Path
from typing import cast, TYPE_CHECKING, Optional, Callable, List, Any, Dict
import requests
import numpy
from PyQt6.QtCore import QObject, QTimer, QUrl, pyqtSignal, pyqtProperty, QEvent, pyqtEnum, QCoreApplication
from PyQt6.QtCore import QObject, QTimer, QUrl, QUrlQuery, pyqtSignal, pyqtProperty, QEvent, pyqtEnum, QCoreApplication, \
QByteArray
from PyQt6.QtGui import QColor, QIcon
from PyQt6.QtQml import qmlRegisterUncreatableType, qmlRegisterUncreatableMetaObject, qmlRegisterSingletonType, qmlRegisterType
from PyQt6.QtQml import qmlRegisterUncreatableMetaObject, qmlRegisterSingletonType, qmlRegisterType
from PyQt6.QtWidgets import QMessageBox
import UM.Util
import cura.Settings.cura_empty_instance_containers
from UM.Application import Application
from UM.Decorators import override
from UM.Decorators import override, deprecated
from UM.FlameProfiler import pyqtSlot
from UM.Logger import Logger
from UM.Math.AxisAlignedBox import AxisAlignedBox
@ -28,6 +33,7 @@ from UM.Message import Message
from UM.Operations.AddSceneNodeOperation import AddSceneNodeOperation
from UM.Operations.GroupedOperation import GroupedOperation
from UM.Operations.SetTransformOperation import SetTransformOperation
from UM.OutputDevice.ProjectOutputDevice import ProjectOutputDevice
from UM.Platform import Platform
from UM.PluginError import PluginNotFoundError
from UM.Preferences import Preferences
@ -49,11 +55,11 @@ from UM.Settings.Validator import Validator
from UM.View.SelectionPass import SelectionPass # For typing.
from UM.Workspace.WorkspaceReader import WorkspaceReader
from UM.i18n import i18nCatalog
from UM.Version import Version
from cura import ApplicationMetadata
from cura.API import CuraAPI
from cura.API.Account import Account
from cura.Arranging.ArrangeObjectsJob import ArrangeObjectsJob
from cura.Arranging.Nest2DArrange import arrange
from cura.Machines.MachineErrorChecker import MachineErrorChecker
from cura.Machines.Models.BuildPlateModel import BuildPlateModel
from cura.Machines.Models.CustomQualityProfilesDropDownMenuModel import CustomQualityProfilesDropDownMenuModel
@ -99,7 +105,8 @@ from cura.Settings.SettingInheritanceManager import SettingInheritanceManager
from cura.Settings.SidebarCustomMenuItemsModel import SidebarCustomMenuItemsModel
from cura.Settings.SimpleModeSettingsManager import SimpleModeSettingsManager
from cura.TaskManagement.OnExitCallbackManager import OnExitCallbackManager
from cura.UI import CuraSplashScreen, MachineActionManager, PrintInformation
from cura.UI import CuraSplashScreen, PrintInformation
from cura.UI.MachineActionManager import MachineActionManager
from cura.UI.AddPrinterPagesModel import AddPrinterPagesModel
from cura.UI.MachineSettingsManager import MachineSettingsManager
from cura.UI.ObjectsModel import ObjectsModel
@ -114,11 +121,13 @@ from . import CameraAnimation
from . import CuraActions
from . import PlatformPhysics
from . import PrintJobPreviewImageProvider
from .Arranging.Nest2DArrange import Nest2DArrange
from .AutoSave import AutoSave
from .Machines.Models.CompatibleMachineModel import CompatibleMachineModel
from .Machines.Models.MachineListModel import MachineListModel
from .Machines.Models.ActiveIntentQualitiesModel import ActiveIntentQualitiesModel
from .Machines.Models.IntentSelectionModel import IntentSelectionModel
from .PrintOrderManager import PrintOrderManager
from .SingleInstance import SingleInstance
if TYPE_CHECKING:
@ -130,7 +139,7 @@ class CuraApplication(QtApplication):
# SettingVersion represents the set of settings available in the machine/extruder definitions.
# You need to make sure that this version number is increased if there are any non-backwards-compatible
# changes to the settings.
SettingVersion = 21
SettingVersion = 23
Created = False
@ -147,6 +156,7 @@ class CuraApplication(QtApplication):
DefinitionChangesContainer = Resources.UserType + 10
SettingVisibilityPreset = Resources.UserType + 11
IntentInstanceContainer = Resources.UserType + 12
ImageFiles = Resources.UserType + 13
pyqtEnum(ResourceTypes)
@ -172,18 +182,20 @@ class CuraApplication(QtApplication):
# Variables set from CLI
self._files_to_open = []
self._urls_to_open = []
self._use_single_instance = False
self._single_instance = None
self._open_project_mode: Optional[str] = None
self._cura_formula_functions = None # type: Optional[CuraFormulaFunctions]
self._machine_action_manager = None # type: Optional[MachineActionManager.MachineActionManager]
self._machine_action_manager: Optional[MachineActionManager] = None
self.empty_container = None # type: EmptyInstanceContainer
self.empty_definition_changes_container = None # type: EmptyInstanceContainer
self.empty_variant_container = None # type: EmptyInstanceContainer
self.empty_intent_container = None # type: EmptyInstanceContainer
self.empty_intent_container = None # type: EmptyInstanceContainer
self.empty_material_container = None # type: EmptyInstanceContainer
self.empty_quality_container = None # type: EmptyInstanceContainer
self.empty_quality_changes_container = None # type: EmptyInstanceContainer
@ -194,6 +206,7 @@ class CuraApplication(QtApplication):
self._container_manager = None
self._object_manager = None
self._print_order_manager = None
self._extruders_model = None
self._extruders_model_with_optional = None
self._build_plate_model = None
@ -204,6 +217,8 @@ class CuraApplication(QtApplication):
self._cura_scene_controller = None
self._machine_error_checker = None
self._backend_plugins: List[BackendPlugin] = []
self._machine_settings_manager = MachineSettingsManager(self, parent = self)
self._material_management_model = None
self._quality_management_model = None
@ -243,7 +258,7 @@ class CuraApplication(QtApplication):
self._additional_components = {} # Components to add to certain areas in the interface
self._open_file_queue = [] # A list of files to open (after the application has started)
self._open_url_queue = [] # A list of urls to open (after the application has started)
self._update_platform_activity_timer = None
self._sidebar_custom_menu_items = [] # type: list # Keeps list of custom menu items for the side bar
@ -264,6 +279,11 @@ class CuraApplication(QtApplication):
CentralFileStorage.setIsEnterprise(ApplicationMetadata.IsEnterpriseVersion)
Resources.setIsEnterprise(ApplicationMetadata.IsEnterpriseVersion)
self._conan_installs = ApplicationMetadata.CONAN_INSTALLS
self._python_installs = ApplicationMetadata.PYTHON_INSTALLS
self._supported_url_schemes: List[str] = ["cura", "slicer"]
@pyqtProperty(str, constant=True)
def ultimakerCloudApiRootUrl(self) -> str:
return UltimakerCloudConstants.CuraCloudAPIRoot
@ -316,7 +336,11 @@ class CuraApplication(QtApplication):
assert not "This crash is triggered by the trigger_early_crash command line argument."
for filename in self._cli_args.file:
self._files_to_open.append(os.path.abspath(filename))
url = QUrl(filename)
if url.scheme() in self._supported_url_schemes:
self._urls_to_open.append(url)
else:
self._files_to_open.append(os.path.abspath(filename))
def initialize(self) -> None:
self.__addExpectedResourceDirsAndSearchPaths() # Must be added before init of super
@ -333,11 +357,11 @@ class CuraApplication(QtApplication):
self.__addAllEmptyContainers()
self.__setLatestResouceVersionsForVersionUpgrade()
self._machine_action_manager = MachineActionManager.MachineActionManager(self)
self._machine_action_manager = MachineActionManager(self)
self._machine_action_manager.initialize()
def __sendCommandToSingleInstance(self):
self._single_instance = SingleInstance(self, self._files_to_open)
self._single_instance = SingleInstance(self, self._files_to_open, self._urls_to_open)
# If we use single instance, try to connect to the single instance server, send commands, and then exit.
# If we cannot find an existing single instance server, this is the only instance, so just keep going.
@ -354,10 +378,20 @@ class CuraApplication(QtApplication):
Resources.addExpectedDirNameInData(dir_name)
app_root = os.path.abspath(os.path.join(os.path.dirname(sys.executable)))
Resources.addSecureSearchPath(os.path.join(app_root, "share", "cura", "resources"))
Resources.addSecureSearchPath(os.path.join(self._app_install_dir, "share", "cura", "resources"))
if platform.system() == "Darwin":
Resources.addSecureSearchPath(os.path.join(app_root, "Resources", "share", "cura", "resources"))
Resources.addSecureSearchPath(
os.path.join(self._app_install_dir, "Resources", "share", "cura", "resources"))
else:
Resources.addSecureSearchPath(os.path.join(app_root, "share", "cura", "resources"))
Resources.addSecureSearchPath(os.path.join(self._app_install_dir, "share", "cura", "resources"))
if not hasattr(sys, "frozen"):
cura_data_root = os.environ.get('CURA_DATA_ROOT', None)
if cura_data_root:
Resources.addSearchPath(str(Path(cura_data_root).joinpath("resources")))
Resources.addSearchPath(os.path.join(os.path.abspath(os.path.dirname(__file__)), "..", "resources"))
# local Conan cache
@ -407,6 +441,9 @@ class CuraApplication(QtApplication):
SettingFunction.registerOperator("extruderValue", self._cura_formula_functions.getValueInExtruder)
SettingFunction.registerOperator("extruderValues", self._cura_formula_functions.getValuesInAllExtruders)
SettingFunction.registerOperator("anyExtruderWithMaterial", self._cura_formula_functions.getExtruderPositionWithMaterial)
SettingFunction.registerOperator("anyExtruderNrWithOrDefault",
self._cura_formula_functions.getAnyExtruderPositionWithOrDefault)
SettingFunction.registerOperator("resolveOrValue", self._cura_formula_functions.getResolveOrValue)
SettingFunction.registerOperator("defaultExtruderPosition", self._cura_formula_functions.getDefaultExtruderPosition)
SettingFunction.registerOperator("valueFromContainer", self._cura_formula_functions.getValueFromContainerAtIndex)
@ -425,6 +462,7 @@ class CuraApplication(QtApplication):
Resources.addStorageType(self.ResourceTypes.DefinitionChangesContainer, "definition_changes")
Resources.addStorageType(self.ResourceTypes.SettingVisibilityPreset, "setting_visibility")
Resources.addStorageType(self.ResourceTypes.IntentInstanceContainer, "intent")
Resources.addStorageType(self.ResourceTypes.ImageFiles, "images")
self._container_registry.addResourceType(self.ResourceTypes.QualityInstanceContainer, "quality")
self._container_registry.addResourceType(self.ResourceTypes.QualityChangesInstanceContainer, "quality_changes")
@ -435,6 +473,7 @@ class CuraApplication(QtApplication):
self._container_registry.addResourceType(self.ResourceTypes.MachineStack, "machine")
self._container_registry.addResourceType(self.ResourceTypes.DefinitionChangesContainer, "definition_changes")
self._container_registry.addResourceType(self.ResourceTypes.IntentInstanceContainer, "intent")
self._container_registry.addResourceType(self.ResourceTypes.ImageFiles, "images")
Resources.addType(self.ResourceTypes.QmlFiles, "qml")
Resources.addType(self.ResourceTypes.Firmware, "firmware")
@ -490,6 +529,36 @@ class CuraApplication(QtApplication):
def startSplashWindowPhase(self) -> None:
"""Runs preparations that needs to be done before the starting process."""
self.setRequiredPlugins([
# Misc.:
"ConsoleLogger", # You want to be able to read the log if something goes wrong.
"CuraEngineBackend", # Cura is useless without this one since you can't slice.
"FileLogger", # You want to be able to read the log if something goes wrong.
"XmlMaterialProfile", # Cura crashes without this one.
"Marketplace",
# This contains the interface to enable/disable plug-ins, so if you disable it you can't enable it back.
"PrepareStage", # Cura is useless without this one since you can't load models.
"PreviewStage", # This shows the list of the plugin views that are installed in Cura.
"MonitorStage", # Major part of Cura's functionality.
"LocalFileOutputDevice", # Major part of Cura's functionality.
"LocalContainerProvider", # Cura is useless without any profiles or setting definitions.
# Views:
"SimpleView", # Dependency of SolidView.
"SolidView", # Displays models. Cura is useless without it.
# Readers & Writers:
"GCodeWriter", # Cura is useless if it can't write its output.
"STLReader", # Most common model format, so disabling this makes Cura 90% useless.
"3MFWriter", # Required for writing project files.
# Tools:
"CameraTool", # Needed to see the scene. Cura is useless without it.
"SelectionTool", # Dependency of the rest of the tools.
"TranslateTool", # You'll need this for almost every print.
])
# Plugins need to be set here, since the super checks whether they are actually loaded.
super().startSplashWindowPhase()
if not self.getIsHeadLess():
@ -498,33 +567,7 @@ class CuraApplication(QtApplication):
except FileNotFoundError:
Logger.log("w", "Unable to find the window icon.")
self.setRequiredPlugins([
# Misc.:
"ConsoleLogger", #You want to be able to read the log if something goes wrong.
"CuraEngineBackend", #Cura is useless without this one since you can't slice.
"FileLogger", #You want to be able to read the log if something goes wrong.
"XmlMaterialProfile", #Cura crashes without this one.
"Marketplace", #This contains the interface to enable/disable plug-ins, so if you disable it you can't enable it back.
"PrepareStage", #Cura is useless without this one since you can't load models.
"PreviewStage", #This shows the list of the plugin views that are installed in Cura.
"MonitorStage", #Major part of Cura's functionality.
"LocalFileOutputDevice", #Major part of Cura's functionality.
"LocalContainerProvider", #Cura is useless without any profiles or setting definitions.
# Views:
"SimpleView", #Dependency of SolidView.
"SolidView", #Displays models. Cura is useless without it.
# Readers & Writers:
"GCodeWriter", #Cura is useless if it can't write its output.
"STLReader", #Most common model format, so disabling this makes Cura 90% useless.
"3MFWriter", #Required for writing project files.
# Tools:
"CameraTool", #Needed to see the scene. Cura is useless without it.
"SelectionTool", #Dependency of the rest of the tools.
"TranslateTool", #You'll need this for almost every print.
])
self._i18n_catalog = i18nCatalog("cura")
self._update_platform_activity_timer = QTimer()
@ -575,6 +618,7 @@ class CuraApplication(QtApplication):
preferences.addPreference("view/invert_zoom", False)
preferences.addPreference("view/filter_current_build_plate", False)
preferences.addPreference("view/navigation_style", "cura")
preferences.addPreference("cura/sidebar_collapsed", False)
preferences.addPreference("cura/favorite_materials", "")
@ -605,6 +649,16 @@ class CuraApplication(QtApplication):
def _onEngineCreated(self):
self._qml_engine.addImageProvider("print_job_preview", PrintJobPreviewImageProvider.PrintJobPreviewImageProvider())
version = Version(self.getVersion())
if hasattr(sys, "frozen") and version.hasPostFix() and "beta" not in version.getPostfixType():
self._qml_engine.rootObjects()[0].setTitle(f"{ApplicationMetadata.CuraAppDisplayName} {ApplicationMetadata.CuraVersion}")
message = Message(
self._i18n_catalog.i18nc("@info:warning",
f"This version is not intended for production use. If you encounter any issues, please report them on our GitHub page, mentioning the full version {self.getVersion()}"),
lifetime = 0,
title = self._i18n_catalog.i18nc("@info:title", "Nightly build"),
message_type = Message.MessageType.WARNING)
message.show()
@pyqtProperty(bool)
def needToShowUserAgreement(self) -> bool:
@ -788,6 +842,7 @@ class CuraApplication(QtApplication):
self._plugin_registry.addType("profile_reader", self._addProfileReader)
self._plugin_registry.addType("profile_writer", self._addProfileWriter)
self._plugin_registry.addType("backend_plugin", self._addBackendPlugin)
if Platform.isLinux():
lib_suffixes = {"", "64", "32", "x32"} # A few common ones on different distributions.
@ -824,11 +879,10 @@ class CuraApplication(QtApplication):
def run(self):
super().run()
if len(ApplicationMetadata.DEPENDENCY_INFO) > 0:
Logger.debug("Using Conan managed dependencies: " + ", ".join(
[dep["recipe"]["id"] for dep in ApplicationMetadata.DEPENDENCY_INFO["installed"] if dep["recipe"]["version"] != "latest"]))
else:
Logger.warning("Could not find conan_install_info.json")
self._log_hardware_info()
Logger.debug("Using conan dependencies: {}", str(self.conanInstalls))
Logger.debug("Using python dependencies: {}", str(self.pythonInstalls))
Logger.log("i", "Initializing machine error checker")
self._machine_error_checker = MachineErrorChecker(self)
@ -857,6 +911,7 @@ class CuraApplication(QtApplication):
# initialize info objects
self._print_information = PrintInformation.PrintInformation(self)
self._cura_actions = CuraActions.CuraActions(self)
self._print_order_manager = PrintOrderManager(self.getObjectsModel().getNodes)
self.processEvents()
# Initialize setting visibility presets model.
self._setting_visibility_presets_model = SettingVisibilityPresetsModel(self.getPreferences(), parent = self)
@ -897,6 +952,14 @@ class CuraApplication(QtApplication):
self.exec()
def _log_hardware_info(self):
hardware_info = platform.uname()
Logger.info(f"System: {hardware_info.system}")
Logger.info(f"Release: {hardware_info.release}")
Logger.info(f"Version: {hardware_info.version}")
Logger.info(f"Processor name: {hardware_info.processor}")
Logger.info(f"CPU Cores: {os.cpu_count()}")
def __setUpSingleInstanceServer(self):
if self._use_single_instance:
self._single_instance.startServer()
@ -906,6 +969,10 @@ class CuraApplication(QtApplication):
self.callLater(self._openFile, file_name)
for file_name in self._open_file_queue: # Open all the files that were queued up while plug-ins were loading.
self.callLater(self._openFile, file_name)
for url in self._urls_to_open:
self.callLater(self._openUrl, url)
for url in self._open_url_queue:
self.callLater(self._openUrl, url)
initializationFinished = pyqtSignal()
showAddPrintersUncancellableDialog = pyqtSignal() # Used to show the add printers dialog with a greyed background
@ -927,6 +994,7 @@ class CuraApplication(QtApplication):
t.setEnabledAxis([ToolHandle.XAxis, ToolHandle.YAxis, ToolHandle.ZAxis])
Selection.selectionChanged.connect(self.onSelectionChanged)
self._print_order_manager.printOrderChanged.connect(self._onPrintOrderChanged)
# Set default background color for scene
self.getRenderer().setBackgroundColor(QColor(245, 245, 245))
@ -1016,6 +1084,10 @@ class CuraApplication(QtApplication):
def getTextManager(self, *args) -> "TextManager":
return self._text_manager
@pyqtSlot()
def setWorkplaceDropToBuildplate(self):
return self._physics.setAppAllModelDropDown()
def getCuraFormulaFunctions(self, *args) -> "CuraFormulaFunctions":
if self._cura_formula_functions is None:
self._cura_formula_functions = CuraFormulaFunctions(self)
@ -1042,6 +1114,10 @@ class CuraApplication(QtApplication):
self._object_manager = ObjectsModel(self)
return self._object_manager
@pyqtSlot(str, result = "QVariantList")
def getSupportedActionMachineList(self, definition_id: str) -> List["MachineAction"]:
return self._machine_action_manager.getSupportedActions(self._machine_manager.getDefinitionByMachineId(definition_id))
@pyqtSlot(result = QObject)
def getExtrudersModel(self, *args) -> "ExtrudersModel":
if self._extruders_model is None:
@ -1067,6 +1143,16 @@ class CuraApplication(QtApplication):
self._build_plate_model = BuildPlateModel(self)
return self._build_plate_model
@pyqtSlot()
def exportUcp(self):
writer = self.getMeshFileHandler().getWriter("3MFWriter")
if writer is None:
Logger.warning("3mf writer is not enabled")
return
writer.exportUcp()
def getCuraSceneController(self, *args) -> CuraSceneController:
if self._cura_scene_controller is None:
self._cura_scene_controller = CuraSceneController.createCuraSceneController()
@ -1077,14 +1163,16 @@ class CuraApplication(QtApplication):
self._setting_inheritance_manager = SettingInheritanceManager.createSettingInheritanceManager()
return self._setting_inheritance_manager
def getMachineActionManager(self, *args: Any) -> MachineActionManager.MachineActionManager:
@pyqtSlot(result = QObject)
def getMachineActionManager(self, *args: Any) -> MachineActionManager:
"""Get the machine action manager
We ignore any *args given to this, as we also register the machine manager as qml singleton.
It wants to give this function an engine and script engine, but we don't care about that.
"""
return cast(MachineActionManager.MachineActionManager, self._machine_action_manager)
return self._machine_action_manager
@pyqtSlot(result = QObject)
def getMaterialManagementModel(self) -> MaterialManagementModel:
@ -1098,7 +1186,8 @@ class CuraApplication(QtApplication):
self._quality_management_model = QualityManagementModel(parent = self)
return self._quality_management_model
def getSimpleModeSettingsManager(self, *args):
@pyqtSlot(result=QObject)
def getSimpleModeSettingsManager(self)-> SimpleModeSettingsManager:
if self._simple_mode_settings_manager is None:
self._simple_mode_settings_manager = SimpleModeSettingsManager()
return self._simple_mode_settings_manager
@ -1115,9 +1204,15 @@ class CuraApplication(QtApplication):
if event.type() == QEvent.Type.FileOpen:
if self._plugins_loaded:
self._openFile(event.file())
if event.file():
self._openFile(event.file())
if event.url():
self._openUrl(event.url())
else:
self._open_file_queue.append(event.file())
if event.file():
self._open_file_queue.append(event.file())
if event.url():
self._open_url_queue.append(event.url())
if int(event.type()) == 20: # 'QEvent.Type.Quit' enum isn't there, even though it should be according to docs.
# Once we're at this point, everything should have been flushed already (past OnExitCallbackManager).
@ -1135,16 +1230,43 @@ class CuraApplication(QtApplication):
return self._print_information
def getQualityProfilesDropDownMenuModel(self, *args, **kwargs):
@pyqtSlot(result=QObject)
def getQualityProfilesDropDownMenuModel(self, *args, **kwargs)-> QualityProfilesDropDownMenuModel:
if self._quality_profile_drop_down_menu_model is None:
self._quality_profile_drop_down_menu_model = QualityProfilesDropDownMenuModel(self)
return self._quality_profile_drop_down_menu_model
def getCustomQualityProfilesDropDownMenuModel(self, *args, **kwargs):
@pyqtSlot(result=QObject)
def getCustomQualityProfilesDropDownMenuModel(self, *args, **kwargs)->CustomQualityProfilesDropDownMenuModel:
if self._custom_quality_profile_drop_down_menu_model is None:
self._custom_quality_profile_drop_down_menu_model = CustomQualityProfilesDropDownMenuModel(self)
return self._custom_quality_profile_drop_down_menu_model
@deprecated("SimpleModeSettingsManager is deprecated and will be removed in major SDK release, Use getSimpleModeSettingsManager() instead", since = "5.7.0")
def getSimpleModeSettingsManagerWrapper(self, *args, **kwargs):
return self.getSimpleModeSettingsManager()
@deprecated("MachineActionManager is deprecated and will be removed in major SDK release, Use getMachineActionManager() instead", since="5.7.0")
def getMachineActionManagerWrapper(self, *args, **kwargs):
return self.getMachineActionManager()
@deprecated("QualityManagementModel is deprecated and will be removed in major SDK release, Use getQualityManagementModel() instead", since="5.7.0")
def getQualityManagementModelWrapper(self, *args, **kwargs):
return self.getQualityManagementModel()
@deprecated("MaterialManagementModel is deprecated and will be removed in major SDK release, Use getMaterialManagementModel() instead", since = "5.7.0")
def getMaterialManagementModelWrapper(self, *args, **kwargs):
return self.getMaterialManagementModel()
@deprecated("QualityProfilesDropDownMenuModel is deprecated and will be removed in major SDK release, Use getQualityProfilesDropDownMenuModel() instead", since = "5.7.0")
def getQualityProfilesDropDownMenuModelWrapper(self, *args, **kwargs):
return self.getQualityProfilesDropDownMenuModel()
@deprecated("CustomQualityProfilesDropDownMenuModel is deprecated and will be removed in major SDK release, Use getCustomQualityProfilesDropDownMenuModel() instead", since = "5.7.0")
def getCustomQualityProfilesDropDownMenuModelWrapper(self, *args, **kwargs):
return self.getCustomQualityProfilesDropDownMenuModel()
def getCuraAPI(self, *args, **kwargs) -> "CuraAPI":
return self._cura_API
@ -1160,6 +1282,7 @@ class CuraApplication(QtApplication):
self.processEvents()
engine.rootContext().setContextProperty("Printer", self)
engine.rootContext().setContextProperty("CuraApplication", self)
engine.rootContext().setContextProperty("PrintOrderManager", self._print_order_manager)
engine.rootContext().setContextProperty("PrintInformation", self._print_information)
engine.rootContext().setContextProperty("CuraActions", self._cura_actions)
engine.rootContext().setContextProperty("CuraSDKVersion", ApplicationMetadata.CuraSDKVersion)
@ -1173,8 +1296,8 @@ class CuraApplication(QtApplication):
qmlRegisterSingletonType(MachineManager, "Cura", 1, 0, self.getMachineManager, "MachineManager")
qmlRegisterSingletonType(IntentManager, "Cura", 1, 6, self.getIntentManager, "IntentManager")
qmlRegisterSingletonType(SettingInheritanceManager, "Cura", 1, 0, self.getSettingInheritanceManager, "SettingInheritanceManager")
qmlRegisterSingletonType(SimpleModeSettingsManager, "Cura", 1, 0, self.getSimpleModeSettingsManager, "SimpleModeSettingsManager")
qmlRegisterSingletonType(MachineActionManager.MachineActionManager, "Cura", 1, 0, self.getMachineActionManager, "MachineActionManager")
qmlRegisterSingletonType(SimpleModeSettingsManager, "Cura", 1, 0, self.getSimpleModeSettingsManagerWrapper, "SimpleModeSettingsManager")
qmlRegisterSingletonType(MachineActionManager, "Cura", 1, 0, self.getMachineActionManagerWrapper, "MachineActionManager")
self.processEvents()
qmlRegisterType(NetworkingUtil, "Cura", 1, 5, "NetworkingUtil")
@ -1199,16 +1322,14 @@ class CuraApplication(QtApplication):
qmlRegisterType(FavoriteMaterialsModel, "Cura", 1, 0, "FavoriteMaterialsModel")
qmlRegisterType(GenericMaterialsModel, "Cura", 1, 0, "GenericMaterialsModel")
qmlRegisterType(MaterialBrandsModel, "Cura", 1, 0, "MaterialBrandsModel")
qmlRegisterSingletonType(QualityManagementModel, "Cura", 1, 0, self.getQualityManagementModel, "QualityManagementModel")
qmlRegisterSingletonType(MaterialManagementModel, "Cura", 1, 5, self.getMaterialManagementModel, "MaterialManagementModel")
qmlRegisterSingletonType(QualityManagementModel, "Cura", 1, 0, self.getQualityManagementModelWrapper,"QualityManagementModel")
qmlRegisterSingletonType(MaterialManagementModel, "Cura", 1, 5, self.getMaterialManagementModelWrapper,"MaterialManagementModel")
self.processEvents()
qmlRegisterType(DiscoveredPrintersModel, "Cura", 1, 0, "DiscoveredPrintersModel")
qmlRegisterType(DiscoveredCloudPrintersModel, "Cura", 1, 7, "DiscoveredCloudPrintersModel")
qmlRegisterSingletonType(QualityProfilesDropDownMenuModel, "Cura", 1, 0,
self.getQualityProfilesDropDownMenuModel, "QualityProfilesDropDownMenuModel")
qmlRegisterSingletonType(CustomQualityProfilesDropDownMenuModel, "Cura", 1, 0,
self.getCustomQualityProfilesDropDownMenuModel, "CustomQualityProfilesDropDownMenuModel")
qmlRegisterSingletonType(QualityProfilesDropDownMenuModel, "Cura", 1, 0, self.getQualityProfilesDropDownMenuModelWrapper, "QualityProfilesDropDownMenuModel")
qmlRegisterSingletonType(CustomQualityProfilesDropDownMenuModel, "Cura", 1, 0, self.getCustomQualityProfilesDropDownMenuModelWrapper, "CustomQualityProfilesDropDownMenuModel")
qmlRegisterType(NozzleModel, "Cura", 1, 0, "NozzleModel")
qmlRegisterType(IntentModel, "Cura", 1, 6, "IntentModel")
qmlRegisterType(IntentCategoryModel, "Cura", 1, 6, "IntentCategoryModel")
@ -1337,7 +1458,11 @@ class CuraApplication(QtApplication):
self._scene_bounding_box = scene_bounding_box
self.sceneBoundingBoxChanged.emit()
self._platform_activity = True if count > 0 else False
if count > 0:
self._platform_activity = True
else:
ProjectOutputDevice.setLastOutputName(None)
self._platform_activity = False
self.activityChanged.emit()
@pyqtSlot()
@ -1423,6 +1548,13 @@ class CuraApplication(QtApplication):
# Single build plate
@pyqtSlot()
def arrangeAll(self) -> None:
self._arrangeAll(grid_arrangement = False)
@pyqtSlot()
def arrangeAllInGrid(self) -> None:
self._arrangeAll(grid_arrangement = True)
def _arrangeAll(self, *, grid_arrangement: bool) -> None:
nodes_to_arrange = []
active_build_plate = self.getMultiBuildPlateModel().activeBuildPlate
locked_nodes = []
@ -1452,17 +1584,17 @@ class CuraApplication(QtApplication):
locked_nodes.append(node)
else:
nodes_to_arrange.append(node)
self.arrange(nodes_to_arrange, locked_nodes)
self.arrange(nodes_to_arrange, locked_nodes, grid_arrangement = grid_arrangement)
def arrange(self, nodes: List[SceneNode], fixed_nodes: List[SceneNode]) -> None:
def arrange(self, nodes: List[SceneNode], fixed_nodes: List[SceneNode], *, grid_arrangement: bool = False) -> None:
"""Arrange a set of nodes given a set of fixed nodes
:param nodes: nodes that we have to place
:param fixed_nodes: nodes that are placed in the arranger before finding spots for nodes
:param grid_arrangement: If set to True, the objects are placed in a grid
"""
min_offset = self.getBuildVolume().getEdgeDisallowedSize() + 2 # Allow for some rounding errors
job = ArrangeObjectsJob(nodes, fixed_nodes, min_offset = max(min_offset, 8))
job = ArrangeObjectsJob(nodes, fixed_nodes, min_offset = max(min_offset, 8), grid_arrange = grid_arrangement)
job.start()
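A short usage sketch of the new keyword-only flag, assuming app is the running CuraApplication instance and the node lists have been collected the same way _arrangeAll() does it:

# Illustrative sketch: nest-based packing (default) versus grid placement of the same nodes.
app.arrange(nodes_to_arrange, locked_nodes)                         # Nest2D packing
app.arrange(nodes_to_arrange, locked_nodes, grid_arrangement=True)  # grid placement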
@pyqtSlot()
@ -1494,7 +1626,7 @@ class CuraApplication(QtApplication):
if not nodes:
return
objects_in_filename = {} # type: Dict[str, List[CuraSceneNode]]
objects_in_filename: Dict[str, List[CuraSceneNode]] = {}
for node in nodes:
mesh_data = node.getMeshData()
if mesh_data:
@ -1508,15 +1640,14 @@ class CuraApplication(QtApplication):
Logger.log("w", "Unable to reload data because we don't have a filename.")
for file_name, nodes in objects_in_filename.items():
for node in nodes:
file_path = os.path.normpath(os.path.dirname(file_name))
job = ReadMeshJob(file_name, add_to_recent_files = file_path != tempfile.gettempdir()) # Don't add temp files to the recent files list
job._node = node # type: ignore
job.finished.connect(self._reloadMeshFinished)
if has_merged_nodes:
job.finished.connect(self.updateOriginOfMergedMeshes)
job.start()
file_path = os.path.normpath(os.path.dirname(file_name))
job = ReadMeshJob(file_name,
add_to_recent_files=file_path != tempfile.gettempdir()) # Don't add temp files to the recent files list
job._nodes = nodes # type: ignore
job.finished.connect(self._reloadMeshFinished)
if has_merged_nodes:
job.finished.connect(self.updateOriginOfMergedMeshes)
job.start()
@pyqtSlot("QStringList")
def setExpandedCategories(self, categories: List[str]) -> None:
@ -1651,8 +1782,12 @@ class CuraApplication(QtApplication):
Selection.remove(node)
Selection.add(group_node)
all_nodes = self.getObjectsModel().getNodes()
PrintOrderManager.updatePrintOrdersAfterGroupOperation(all_nodes, group_node, selected_nodes)
@pyqtSlot()
def ungroupSelected(self) -> None:
all_nodes = self.getObjectsModel().getNodes()
selected_objects = Selection.getAllSelectedObjects().copy()
for node in selected_objects:
if node.callDecoration("isGroup"):
@ -1660,21 +1795,30 @@ class CuraApplication(QtApplication):
group_parent = node.getParent()
children = node.getChildren().copy()
for child in children:
# Ungroup only 1 level deep
if child.getParent() != node:
continue
# Ungroup only 1 level deep
children_to_ungroup = list(filter(lambda child: child.getParent() == node, children))
for child in children_to_ungroup:
# Set the parent of the children to the parent of the group-node
op.addOperation(SetParentOperation(child, group_parent))
# Add all individual nodes to the selection
Selection.add(child)
PrintOrderManager.updatePrintOrdersAfterUngroupOperation(all_nodes, node, children_to_ungroup)
op.push()
# Note: The group removes itself from the scene once all its children have left it,
# see GroupDecorator._onChildrenChanged
def _onPrintOrderChanged(self) -> None:
# update object list
scene = self.getController().getScene()
scene.sceneChanged.emit(scene.getRoot())
# reset if already was sliced
Application.getInstance().getBackend().needsSlicing()
Application.getInstance().getBackend().tickle()
def _createSplashScreen(self) -> Optional[CuraSplashScreen.CuraSplashScreen]:
if self._is_headless:
return None
@ -1688,9 +1832,10 @@ class CuraApplication(QtApplication):
def _reloadMeshFinished(self, job) -> None:
"""
Function called whenever a ReadMeshJob finishes in the background. It reloads a specific node object in the
Function called when a ReadMeshJob finishes reloading a file in the background; it then updates the node objects in the
scene from its source file. The function gets all the nodes that exist in the file through the job result, and
then finds the scene node that it wants to refresh by its object id. Each job refreshes only one node.
then finds the scene nodes that need to be refreshed by their name. Each job refreshes all nodes of a file.
Nodes that are not present in the updated file are kept in the scene.
:param job: The :py:class:`Uranium.UM.ReadMeshJob.ReadMeshJob` running in the background that reads all the
meshes in a file
@ -1700,25 +1845,93 @@ class CuraApplication(QtApplication):
if len(job_result) == 0:
Logger.log("e", "Reloading the mesh failed.")
return
object_found = False
mesh_data = None
renamed_nodes = {} # type: Dict[str, int]
# Find the node to be refreshed based on its id
for job_result_node in job_result:
if job_result_node.getId() == job._node.getId():
mesh_data = job_result_node.getMeshData()
object_found = True
break
if not object_found:
Logger.warning("The object with id {} no longer exists! Keeping the old version in the scene.".format(job_result_node.getId()))
return
if not mesh_data:
Logger.log("w", "Could not find a mesh in reloaded node.")
return
job._node.setMeshData(mesh_data)
mesh_data = job_result_node.getMeshData()
if not mesh_data:
Logger.log("w", "Could not find a mesh in reloaded node.")
continue
# Solves issues with object naming
result_node_name = job_result_node.getName()
if not result_node_name:
result_node_name = os.path.basename(mesh_data.getFileName())
if result_node_name in renamed_nodes: # objects may get renamed by ObjectsModel._renameNodes() when loaded
renamed_nodes[result_node_name] += 1
result_node_name = "{0}({1})".format(result_node_name, renamed_nodes[result_node_name])
else:
renamed_nodes[job_result_node.getName()] = 0
# Find the matching scene node to replace
scene_node = None
for replaced_node in job._nodes:
if replaced_node.getName() == result_node_name:
scene_node = replaced_node
break
if scene_node:
scene_node.setMeshData(mesh_data)
else:
# Current node is a new one in the file, or its name has changed
# TODO: Load this mesh into the scene. Also alter the "_reloadJobFinished" action in UM.Scene
Logger.log("w", "Could not find matching node for object '{0}' in the scene.".format(result_node_name))
def _openFile(self, filename):
self.readLocalFile(QUrl.fromLocalFile(filename))
def _openUrl(self, url: QUrl) -> None:
if url.scheme() not in self._supported_url_schemes:
# only handle cura:// and slicer:// urls schemes
return
match url.host() + url.path():
case "open" | "open/":
query = QUrlQuery(url.query())
model_url = QUrl(query.queryItemValue("file", options=QUrl.ComponentFormattingOption.FullyDecoded))
def on_finish(response):
content_disposition_header_key = QByteArray("content-disposition".encode())
if not response.hasRawHeader(content_disposition_header_key):
Logger.log("w", "Could not find Content-Disposition header in response from {0}".format(
model_url.url()))
# Use the last part of the url as the filename, and assume it is an STL file
filename = model_url.path().split("/")[-1] + ".stl"
else:
# content_disposition is in the format
# ```
# attachment; filename="[FILENAME]"
# ```
# Use a regex to extract the filename
content_disposition = str(response.rawHeader(content_disposition_header_key).data(),
encoding='utf-8')
content_disposition_match = re.match(r'attachment; filename="(?P<filename>.*)"',
content_disposition)
assert content_disposition_match is not None
filename = content_disposition_match.group("filename")
tmp = tempfile.NamedTemporaryFile(suffix=filename, delete=False)
with open(tmp.name, "wb") as f:
f.write(response.readAll())
self.readLocalFile(QUrl.fromLocalFile(tmp.name), add_to_recent_files=False)
def on_error(*args, **kwargs):
Logger.log("w", "Could not download file from {0}".format(model_url.url()))
Message("Could not download file: " + str(model_url.url()),
title= "Loading Model failed",
message_type=Message.MessageType.ERROR).show()
return
self.getHttpRequestManager().get(
model_url.url(),
callback=on_finish,
error_callback=on_error,
)
case path:
Logger.log("w", "Unsupported url scheme path: {0}".format(path))
def _addProfileReader(self, profile_reader):
# TODO: Add the profile reader to the list of plug-ins that can be used when importing profiles.
pass
@ -1726,6 +1939,13 @@ class CuraApplication(QtApplication):
def _addProfileWriter(self, profile_writer):
pass
def _addBackendPlugin(self, backend_plugin: "BackendPlugin") -> None:
self._container_registry.addAdditionalSettingDefinitionsAppender(backend_plugin)
self._backend_plugins.append(backend_plugin)
def getBackendPlugins(self) -> List["BackendPlugin"]:
return self._backend_plugins
@pyqtSlot("QSize")
def setMinimumWindowSize(self, size):
main_window = self.getMainWindow()
@ -1762,6 +1982,17 @@ class CuraApplication(QtApplication):
openProjectFile = pyqtSignal(QUrl, bool, arguments = ["project_file", "add_to_recent_files"]) # Emitted when a project file is about to open.
@pyqtSlot(QUrl, bool)
def readLocalUcpFile(self, file: QUrl, add_to_recent_files: bool = True):
file_name = QUrl(file).toLocalFile()
workspace_reader = self.getWorkspaceFileHandler()
if workspace_reader is None:
Logger.warning(f"Workspace reader not found, cannot read file {file_name}.")
return
workspace_reader.readLocalFile(file, add_to_recent_files)
@pyqtSlot(QUrl, str, bool)
@pyqtSlot(QUrl, str)
@pyqtSlot(QUrl)
@ -1775,7 +2006,7 @@ class CuraApplication(QtApplication):
Logger.log("i", "Attempting to read file %s", file.toString())
if not file.isValid():
return
self._open_project_mode = project_mode
scene = self.getController().getScene()
for node in DepthFirstIterator(scene.getRoot()):
@ -1785,16 +2016,16 @@ class CuraApplication(QtApplication):
is_project_file = self.checkIsValidProjectFile(file)
if project_mode is None:
project_mode = self.getPreferences().getValue("cura/choice_on_open_project")
if self._open_project_mode is None:
self._open_project_mode = self.getPreferences().getValue("cura/choice_on_open_project")
if is_project_file and project_mode == "open_as_project":
if is_project_file and self._open_project_mode == "open_as_project":
# open as project immediately without presenting a dialog
workspace_handler = self.getWorkspaceFileHandler()
workspace_handler.readLocalFile(file, add_to_recent_files_hint = add_to_recent_files)
return
if is_project_file and project_mode == "always_ask":
if is_project_file and self._open_project_mode == "always_ask":
# present a dialog asking to open as project or import models
self.callLater(self.openProjectFile.emit, file, add_to_recent_files)
return
@ -1894,7 +2125,8 @@ class CuraApplication(QtApplication):
node.scale(original_node.getScale())
node.setSelectable(True)
node.setName(os.path.basename(file_name))
if not node.getName():
node.setName(os.path.basename(file_name))
self.getBuildVolume().checkBoundsAndUpdate(node)
is_non_sliceable = "." + file_extension in self._non_sliceable_extensions
@ -1928,8 +2160,11 @@ class CuraApplication(QtApplication):
center_y = 0
node.translate(Vector(0, center_y, 0))
nodes_to_arrange.append(node)
# If the file is a project, and models are to be loaded from that project,
# the models inside the file should be arranged on the buildplate.
elif self._open_project_mode == "open_as_model":
nodes_to_arrange.append(node)
# This node is deep copied from some other node which already has a BuildPlateDecorator, but the deepcopy
# of BuildPlateDecorator produces one that's associated with build plate -1. So, here we need to check if
@ -1949,7 +2184,8 @@ class CuraApplication(QtApplication):
if select_models_on_load:
Selection.add(node)
try:
arrange(nodes_to_arrange, self.getBuildVolume(), fixed_nodes)
arranger = Nest2DArrange(nodes_to_arrange, self.getBuildVolume(), fixed_nodes)
arranger.arrange()
except:
Logger.logException("e", "Failed to arrange the models")
@ -1962,6 +2198,12 @@ class CuraApplication(QtApplication):
def addNonSliceableExtension(self, extension):
self._non_sliceable_extensions.append(extension)
@pyqtSlot(str, result = bool)
def isProjectUcp(self, file_url) -> bool:
file_path = QUrl(file_url).toLocalFile()
workspace_reader = self.getWorkspaceFileHandler().getReaderForFile(file_path)
return workspace_reader.getIsProjectUcp()
@pyqtSlot(str, result=bool)
def checkIsValidProjectFile(self, file_url):
"""Checks if the given file URL is a valid project file. """
@ -1971,6 +2213,8 @@ class CuraApplication(QtApplication):
if workspace_reader is None:
return False # non-project files won't get a reader
try:
if workspace_reader.getPluginId() == "3MFReader":
workspace_reader.clearOpenAsUcp()
result = workspace_reader.preRead(file_path, show_dialog=False)
return result == WorkspaceReader.PreReadResult.accepted
except:
@ -2076,3 +2320,15 @@ class CuraApplication(QtApplication):
@classmethod
def getInstance(cls, *args, **kwargs) -> "CuraApplication":
return cast(CuraApplication, super().getInstance(**kwargs))
@pyqtProperty(bool, constant=True)
def isEnterprise(self) -> bool:
return ApplicationMetadata.IsEnterpriseVersion
@pyqtProperty("QVariant", constant=True)
def conanInstalls(self) -> Dict[str, Dict[str, str]]:
return self._conan_installs
@pyqtProperty("QVariant", constant=True)
def pythonInstalls(self) -> Dict[str, Dict[str, str]]:
return self._python_installs

View file

@ -1,4 +1,4 @@
# Copyright (c) 2018 Ultimaker B.V.
# Copyright (c) 2023 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.
import glob
import os
@ -55,7 +55,9 @@ class CuraPackageManager(PackageManager):
def initialize(self) -> None:
self._installation_dirs_dict["materials"] = Resources.getStoragePath(CuraApplication.ResourceTypes.MaterialInstanceContainer)
self._installation_dirs_dict["qualities"] = Resources.getStoragePath(CuraApplication.ResourceTypes.QualityInstanceContainer)
self._installation_dirs_dict["variants"] = Resources.getStoragePath(CuraApplication.ResourceTypes.VariantInstanceContainer)
self._installation_dirs_dict["variants"] = Resources.getStoragePath(
CuraApplication.ResourceTypes.VariantInstanceContainer)
self._installation_dirs_dict["images"] = Resources.getStoragePath(CuraApplication.ResourceTypes.ImageFiles)
# Due to a bug in Cura 5.1.0 we needed to change the directory structure of the curapackage on the server side (See SD-3871).
# Although the material intent profiles will be installed in the `intent` folder, the curapackage from the server side will

88
cura/HitChecker.py Normal file
View file

@ -0,0 +1,88 @@
from typing import List, Dict
from cura.Scene.CuraSceneNode import CuraSceneNode
class HitChecker:
"""Checks if nodes can be printed without causing any collisions and interference"""
def __init__(self, nodes: List[CuraSceneNode]) -> None:
self._hit_map = self._buildHitMap(nodes)
def anyTwoNodesBlockEachOther(self, nodes: List[CuraSceneNode]) -> bool:
"""Returns True if any 2 nodes block each other"""
for a in nodes:
for b in nodes:
if self._hit_map[a][b] and self._hit_map[b][a]:
return True
return False
def canPrintBefore(self, node: CuraSceneNode, other_nodes: List[CuraSceneNode]) -> bool:
"""Returns True if node doesn't block other_nodes and can be printed before them"""
no_hits = all(not self._hit_map[node][other_node] for other_node in other_nodes)
return no_hits
def canPrintAfter(self, node: CuraSceneNode, other_nodes: List[CuraSceneNode]) -> bool:
"""Returns True if node doesn't hit other nodes and can be printed after them"""
no_hits = all(not self._hit_map[other_node][node] for other_node in other_nodes)
return no_hits
def calculateScore(self, a: CuraSceneNode, b: CuraSceneNode) -> int:
"""Calculate score simply sums the number of other objects it 'blocks'
:param a: node
:param b: node
:return: sum of the number of other objects
"""
score_a = sum(self._hit_map[a].values())
score_b = sum(self._hit_map[b].values())
return score_a - score_b
def canPrintNodesInProvidedOrder(self, ordered_nodes: List[CuraSceneNode]) -> bool:
"""Returns True If nodes don't have any hits in provided order"""
for node_index, node in enumerate(ordered_nodes):
nodes_before = ordered_nodes[:node_index - 1] if node_index - 1 >= 0 else []
nodes_after = ordered_nodes[node_index + 1:] if node_index + 1 < len(ordered_nodes) else []
if not self.canPrintBefore(node, nodes_after) or not self.canPrintAfter(node, nodes_before):
return False
return True
@staticmethod
def _buildHitMap(nodes: List[CuraSceneNode]) -> Dict[CuraSceneNode, Dict[CuraSceneNode, bool]]:
"""Pre-computes all hits between all objects
:param nodes: nodes that need to be checked for collisions
:return: dictionary where hit_map[node1][node2] is False if node1 can be printed before node2
"""
hit_map = {j: {i: HitChecker._checkHit(j, i) for i in nodes} for j in nodes}
return hit_map
@staticmethod
def _checkHit(a: CuraSceneNode, b: CuraSceneNode) -> bool:
"""Checks if a can be printed before b
:param a: node
:param b: node
:return: False if a can be printed before b
"""
if a == b:
return False
a_hit_hull = a.callDecoration("getConvexHullBoundary")
b_hit_hull = b.callDecoration("getConvexHullHeadFull")
overlap = a_hit_hull.intersectsPolygon(b_hit_hull)
if overlap:
return True
# Adhesion areas must never overlap, regardless of printing order
# This would cause over-extrusion
a_hit_hull = a.callDecoration("getAdhesionArea")
b_hit_hull = b.callDecoration("getAdhesionArea")
overlap = a_hit_hull.intersectsPolygon(b_hit_hull)
if overlap:
return True
else:
return False
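A rough usage sketch of how the pre-computed hit map is queried; plain strings stand in for CuraSceneNode here, and the hit values are made up:

```python
from typing import Dict

# hit_map[a][b] is True when printing a before b would cause a hit,
# i.e. a cannot be printed before b.
hit_map: Dict[str, Dict[str, bool]] = {
    "A": {"A": False, "B": True,  "C": False},
    "B": {"A": False, "B": False, "C": False},
    "C": {"A": False, "B": False, "C": False},
}

def can_print_before(node: str, others) -> bool:
    # Mirrors HitChecker.canPrintBefore: node must not block any of the others.
    return all(not hit_map[node][other] for other in others)

def can_print_after(node: str, others) -> bool:
    # Mirrors HitChecker.canPrintAfter: none of the others may block node.
    return all(not hit_map[other][node] for other in others)

print(can_print_before("A", ["B", "C"]))  # False: A blocks B, so A cannot go first
print(can_print_before("B", ["A", "C"]))  # True:  B blocks nothing
```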

View file

@ -1,5 +1,6 @@
# Copyright (c) 2019 Ultimaker B.V.
# Cura is released under the terms of the LGPLv3 or higher.
import math
import numpy
from typing import Optional, cast
@ -66,7 +67,7 @@ class LayerPolygon:
# Buffering the colors shouldn't be necessary as it is not
# re-used and can save a lot of memory usage.
self._color_map = LayerPolygon.getColorMap()
self._colors = self._color_map[self._types] # type: numpy.ndarray
self._colors: numpy.ndarray = self._color_map[self._types]
# When type is used as index returns true if type == LayerPolygon.InfillType
# or type == LayerPolygon.SkinType
@ -74,8 +75,8 @@ class LayerPolygon:
# Should be generated in a better way, not hardcoded.
self._is_infill_or_skin_type_map = numpy.array([0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0], dtype=bool)
self._build_cache_line_mesh_mask = None # type: Optional[numpy.ndarray]
self._build_cache_needed_points = None # type: Optional[numpy.ndarray]
self._build_cache_line_mesh_mask: Optional[numpy.ndarray] = None
self._build_cache_needed_points: Optional[numpy.ndarray] = None
def buildCache(self) -> None:
# For the line mesh we do not draw Infill or Jumps. Therefore those lines are filtered out.
@ -186,6 +187,11 @@ class LayerPolygon:
def types(self):
return self._types
@property
def lineLengths(self):
data_array = numpy.array(self._data)
return numpy.linalg.norm(data_array[1:] - data_array[:-1], axis=1)
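To illustrate the new lineLengths property, a small self-contained numpy example on made-up 2D points (the real data holds layer vertices):

```python
import numpy

# Four points of a polyline; each row is a vertex (made-up coordinates).
data = numpy.array([[0.0, 0.0], [3.0, 4.0], [3.0, 4.0], [6.0, 8.0]])

# Length of every segment between consecutive points, as in LayerPolygon.lineLengths.
line_lengths = numpy.linalg.norm(data[1:] - data[:-1], axis=1)
print(line_lengths)  # [5. 0. 5.]
```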
@property
def data(self):
return self._data

View file

@ -49,7 +49,7 @@ class MachineErrorChecker(QObject):
self._keys_to_check = set() # type: Set[str]
self._num_keys_to_check_per_update = 10
self._num_keys_to_check_per_update = 1
def initialize(self) -> None:
self._error_check_timer.timeout.connect(self._rescheduleCheck)

View file

@ -14,6 +14,7 @@ from cura.Machines.QualityChangesGroup import QualityChangesGroup # To construc
from cura.Machines.QualityGroup import QualityGroup # To construct groups of quality profiles that belong together.
from cura.Machines.QualityNode import QualityNode
from cura.Machines.VariantNode import VariantNode
from cura.Machines.MaterialNode import MaterialNode
import UM.FlameProfiler
@ -167,13 +168,20 @@ class MachineNode(ContainerNode):
return self.global_qualities.get(self.preferred_quality_type, next(iter(self.global_qualities.values())))
def isExcludedMaterial(self, material: MaterialNode) -> bool:
"""Returns whether the material should be excluded from the list of materials."""
for exclude_material in self.exclude_materials:
if exclude_material in material["id"]:
return True
return False
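Note that the check above is a substring match against the material ID rather than an exact comparison; a quick sketch with hypothetical IDs:

```python
def is_excluded_material(material_id: str, exclude_materials) -> bool:
    # Mirrors MachineNode.isExcludedMaterial: any excluded entry that occurs
    # anywhere inside the material ID excludes the material.
    return any(exclude in material_id for exclude in exclude_materials)

print(is_excluded_material("generic_petg_175", ["generic_petg"]))  # True
print(is_excluded_material("brandx_petg_175", ["generic_petg"]))   # False
```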
@UM.FlameProfiler.profile
def _loadAll(self) -> None:
"""(Re)loads all variants under this printer."""
container_registry = ContainerRegistry.getInstance()
if not self.has_variants:
self.variants["empty"] = VariantNode("empty_variant", machine = self)
self.variants["empty"] = VariantNode("empty_variant", machine=self)
self.variants["empty"].materialsChanged.connect(self.materialsChanged)
else:
# Find all the variants for this definition ID.

View file

@ -21,17 +21,25 @@ class MaterialNode(ContainerNode):
Its subcontainers are quality profiles.
"""
def __init__(self, container_id: str, variant: "VariantNode") -> None:
def __init__(self, container_id: str, variant: "VariantNode", *, container: ContainerInterface = None) -> None:
super().__init__(container_id)
self.variant = variant
self.qualities = {} # type: Dict[str, QualityNode] # Mapping container IDs to quality profiles.
self.materialChanged = Signal() # Triggered when the material is removed or its metadata is updated.
container_registry = ContainerRegistry.getInstance()
my_metadata = container_registry.findContainersMetadata(id = container_id)[0]
self.base_file = my_metadata["base_file"]
self.material_type = my_metadata["material"]
self.guid = my_metadata["GUID"]
if container is not None:
self.base_file = container.getMetaDataEntry("base_file")
self.material_type = container.getMetaDataEntry("material")
self.brand = container.getMetaDataEntry("brand")
self.guid = container.getMetaDataEntry("GUID")
else:
my_metadata = container_registry.findContainersMetadata(id = container_id)[0]
self.base_file = my_metadata["base_file"]
self.material_type = my_metadata["material"]
self.brand = my_metadata["brand"]
self.guid = my_metadata["GUID"]
self._loadAll()
container_registry.containerRemoved.connect(self._onRemoved)
container_registry.containerMetaDataChanged.connect(self._onMetadataChanged)
@ -80,6 +88,7 @@ class MaterialNode(ContainerNode):
# such as "generic_pla_ultimaker_s5_AA_0.4". So we search with the "base_file" which is the material_root_id.
else:
qualities = container_registry.findInstanceContainersMetadata(type = "quality", definition = self.variant.machine.quality_definition, material = self.base_file)
if not qualities:
my_material_type = self.material_type
if self.variant.machine.has_variants:
@ -89,9 +98,22 @@ class MaterialNode(ContainerNode):
else:
qualities_any_material = container_registry.findInstanceContainersMetadata(type = "quality", definition = self.variant.machine.quality_definition)
all_material_base_files = {material_metadata["base_file"] for material_metadata in container_registry.findInstanceContainersMetadata(type = "material", material = my_material_type)}
# First we attempt to find materials that have the same brand but not the right color
all_material_base_files_right_brand = {material_metadata["base_file"] for material_metadata in container_registry.findInstanceContainersMetadata(type = "material", material = my_material_type, brand = self.brand)}
qualities.extend((quality for quality in qualities_any_material if quality.get("material") in all_material_base_files))
right_brand_no_color_qualities = [quality for quality in qualities_any_material if quality.get("material") in all_material_base_files_right_brand]
if right_brand_no_color_qualities:
# We found qualities for materials with the right brand but not with the right color. Use those.
qualities.extend(right_brand_no_color_qualities)
else:
# Fall back to generic
all_material_base_files = {material_metadata["base_file"] for material_metadata in
container_registry.findInstanceContainersMetadata(type="material",
material=my_material_type)}
no_brand_no_color_qualities = (quality for quality in qualities_any_material if
quality.get("material") in all_material_base_files)
qualities.extend(no_brand_no_color_qualities)
if not qualities: # No quality profiles found. Go by GUID then.
my_guid = self.guid

View file

@ -51,6 +51,9 @@ class CompatibleMachineModel(ListModel):
for output_device in machine_manager.printerOutputDevices:
for printer in output_device.printers:
extruder_configs = dict()
# If the printer name already exists in the queue, skip it
if printer.name in [item["name"] for item in self.items]:
continue
# initialize & add current active material:
for extruder in printer.extruders:

View file

@ -227,7 +227,7 @@ class ExtrudersModel(ListModel):
"material_brand": "",
"color_name": "",
"material_type": "",
"material_label": ""
"material_name": ""
}
items.append(item)
if self._items != items:

View file

@ -39,7 +39,9 @@ class IntentCategoryModel(ListModel):
"""
if len(cls._translations) == 0:
cls._translations["default"] = {
"name": catalog.i18nc("@label", "Default")
"name": catalog.i18nc("@label", "Balanced"),
"description": catalog.i18nc("@text",
"The balanced profile is designed to strike a balance between productivity, surface quality, mechanical properties and dimensional accuracy.")
}
cls._translations["visual"] = {
"name": catalog.i18nc("@label", "Visual"),
@ -53,6 +55,17 @@ class IntentCategoryModel(ListModel):
"name": catalog.i18nc("@label", "Draft"),
"description": catalog.i18nc("@text", "The draft profile is designed to print initial prototypes and concept validation with the intent of significant print time reduction.")
}
cls._translations["annealing"] = {
"name": catalog.i18nc("@label", "Annealing"),
"description": catalog.i18nc("@text",
"The annealing profile requires post-processing in an oven after the print is finished. This profile retains the dimensional accuracy of the printed part after annealing and improves strength, stiffness, and thermal resistance.")
}
cls._translations["solid"] = {
"name": catalog.i18nc("@label", "Solid"),
"description": catalog.i18nc("@text",
"A highly dense and strong part but at a slower print time. Great for functional parts.")
}
return cls._translations
def __init__(self, intent_category: str) -> None:

View file

@ -1,29 +1,32 @@
# Copyright (c) 2022 Ultimaker B.V.
# Copyright (c) 2023 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.
import collections
from typing import OrderedDict, Optional
from typing import Optional
from PyQt6.QtCore import Qt, QTimer, QObject
from PyQt6.QtCore import Qt, QTimer, QObject, QUrl
import cura
from UM import i18nCatalog
from UM.Logger import Logger
from UM.Qt.ListModel import ListModel
from UM.Resources import Resources
from UM.Settings.ContainerRegistry import ContainerRegistry
from UM.Settings.Interfaces import ContainerInterface
from cura.Machines.Models.IntentCategoryModel import IntentCategoryModel
from cura.Settings.IntentManager import IntentManager
catalog = i18nCatalog("cura")
class IntentSelectionModel(ListModel):
NameRole = Qt.ItemDataRole.UserRole + 1
IntentCategoryRole = Qt.ItemDataRole.UserRole + 2
WeightRole = Qt.ItemDataRole.UserRole + 3
DescriptionRole = Qt.ItemDataRole.UserRole + 4
IconRole = Qt.ItemDataRole.UserRole + 5
CustomIconRole = Qt.ItemDataRole.UserRole + 6
def __init__(self, parent: Optional[QObject] = None) -> None:
super().__init__(parent)
@ -33,6 +36,7 @@ class IntentSelectionModel(ListModel):
self.addRoleName(self.WeightRole, "weight")
self.addRoleName(self.DescriptionRole, "description")
self.addRoleName(self.IconRole, "icon")
self.addRoleName(self.CustomIconRole, "custom_icon")
application = cura.CuraApplication.CuraApplication.getInstance()
@ -53,30 +57,9 @@ class IntentSelectionModel(ListModel):
self._onChange()
@staticmethod
def _getDefaultProfileInformation() -> OrderedDict[str, dict]:
""" Default information user-visible string. Ordered by weight. """
default_profile_information = collections.OrderedDict()
default_profile_information["default"] = {
"name": catalog.i18nc("@label", "Default"),
"icon": "GearCheck"
}
default_profile_information["visual"] = {
"name": catalog.i18nc("@label", "Visual"),
"description": catalog.i18nc("@text", "The visual profile is designed to print visual prototypes and models with the intent of high visual and surface quality."),
"icon" : "Visual"
}
default_profile_information["engineering"] = {
"name": catalog.i18nc("@label", "Engineering"),
"description": catalog.i18nc("@text", "The engineering profile is designed to print functional prototypes and end-use parts with the intent of better accuracy and for closer tolerances."),
"icon": "Nut"
}
default_profile_information["quick"] = {
"name": catalog.i18nc("@label", "Draft"),
"description": catalog.i18nc("@text", "The draft profile is designed to print initial prototypes and concept validation with the intent of significant print time reduction."),
"icon": "SpeedOMeter"
}
return default_profile_information
_default_intent_categories = ["default", "visual", "engineering", "quick", "annealing", "solid"]
_icons = {"default": "GearCheck", "visual": "Visual", "engineering": "Nut", "quick": "SpeedOMeter",
"annealing": "Anneal", "solid": "Hammer"}
def _onContainerChange(self, container: ContainerInterface) -> None:
"""Updates the list of intents if an intent profile was added or removed."""
@ -89,38 +72,63 @@ class IntentSelectionModel(ListModel):
def _update(self) -> None:
Logger.log("d", "Updating {model_class_name}.".format(model_class_name = self.__class__.__name__))
global_stack = cura.CuraApplication.CuraApplication.getInstance().getGlobalContainerStack()
cura_application = cura.CuraApplication.CuraApplication.getInstance()
global_stack = cura_application.getGlobalContainerStack()
if global_stack is None:
self.setItems([])
Logger.log("d", "No active GlobalStack, set quality profile model as empty.")
return
# Check for material compatibility
if not cura.CuraApplication.CuraApplication.getInstance().getMachineManager().activeMaterialsCompatible():
if not cura_application.getMachineManager().activeMaterialsCompatible():
Logger.log("d", "No active material compatibility, set quality profile model as empty.")
self.setItems([])
return
default_profile_info = self._getDefaultProfileInformation()
available_categories = IntentManager.getInstance().currentAvailableIntentCategories()
result = []
for i, category in enumerate(available_categories):
profile_info = default_profile_info.get(category, {})
for category in available_categories:
try:
weight = list(default_profile_info.keys()).index(category)
except ValueError:
weight = len(available_categories) + i
if category in self._default_intent_categories:
result.append({
"name": IntentCategoryModel.translation(category, "name", category.title()),
"description": IntentCategoryModel.translation(category, "description", None),
"icon": self._icons[category],
"custom_icon": None,
"intent_category": category,
"weight": self._default_intent_categories.index(category),
})
else:
# There can be multiple intents with the same category; use one of these
# intent-metadata entries for the icon/description definitions of the intent
result.append({
"name": profile_info.get("name", category.title()),
"description": profile_info.get("description", None),
"icon" : profile_info.get("icon", ""),
"intent_category": category,
"weight": weight,
})
intent_metadata = cura_application.getContainerRegistry().findContainersMetadata(type="intent",
definition=global_stack.findInstanceContainerDefinitionId(global_stack.definition),
intent_category=category)[0]
intent_name = intent_metadata.get("name", category.title())
icon = intent_metadata.get("icon", None)
description = intent_metadata.get("description", None)
if icon is not None and icon != '':
try:
icon = QUrl.fromLocalFile(
Resources.getPath(cura.CuraApplication.CuraApplication.ResourceTypes.ImageFiles, icon))
except (FileNotFoundError, NotADirectoryError, PermissionError):
Logger.log("e", f"Icon file for intent {intent_name} not found.")
icon = None
result.append({
"name": intent_name,
"description": description,
"custom_icon": icon,
"icon": None,
"intent_category": category,
"weight": 5,
})
result.sort(key=lambda k: k["weight"])

View file

@ -8,7 +8,9 @@ catalog = i18nCatalog("cura")
intent_translations = collections.OrderedDict() # type: collections.OrderedDict[str, Dict[str, Optional[str]]]
intent_translations["default"] = {
"name": catalog.i18nc("@label", "Default")
"name": catalog.i18nc("@label", "Balanced"),
"description": catalog.i18nc("@text",
"The balanced profile is designed to strike a balance between productivity, surface quality, mechanical properties and dimensional accuracy.")
}
intent_translations["visual"] = {
"name": catalog.i18nc("@label", "Visual"),
@ -22,3 +24,8 @@ intent_translations["quick"] = {
"name": catalog.i18nc("@label", "Draft"),
"description": catalog.i18nc("@text", "The draft profile is designed to print initial prototypes and concept validation with the intent of significant print time reduction.")
}
intent_translations["solid"] = {
"name": catalog.i18nc("@label", "Solid"),
"description": catalog.i18nc("@text",
"A highly dense and strong part but at a slower print time. Great for functional parts.")
}

View file

@ -5,7 +5,7 @@
# online cloud connected printers are represented within this ListModel. Additional information such as the number of
# connected printers for each printer type is gathered.
from typing import Optional, List, cast
from typing import Optional, List, cast, Dict, Any
from PyQt6.QtCore import Qt, QTimer, QObject, pyqtSlot, pyqtProperty, pyqtSignal
@ -30,10 +30,10 @@ class MachineListModel(ListModel):
ComponentTypeRole = Qt.ItemDataRole.UserRole + 8
IsNetworkedMachineRole = Qt.ItemDataRole.UserRole + 9
def __init__(self, parent: Optional[QObject] = None, machines_filter: List[GlobalStack] = None, listenToChanges: bool = True) -> None:
def __init__(self, parent: Optional[QObject] = None, machines_filter: List[GlobalStack] = None, listenToChanges: bool = True, showCloudPrinters: bool = False) -> None:
super().__init__(parent)
self._show_cloud_printers = False
self._show_cloud_printers = showCloudPrinters
self._machines_filter = machines_filter
self._catalog = i18nCatalog("cura")
@ -110,22 +110,22 @@ class MachineListModel(ListModel):
for abstract_machine in abstract_machine_stacks:
definition_id = abstract_machine.definition.getId()
online_machine_stacks = machines_manager.getMachinesWithDefinition(definition_id, online_only = True)
connected_machine_stacks = machines_manager.getMachinesWithDefinition(definition_id, online_only = False)
online_machine_stacks = list(filter(lambda machine: machine.hasNetworkedConnection(), online_machine_stacks))
online_machine_stacks.sort(key=lambda machine: machine.getName().upper())
connected_machine_stacks = list(filter(lambda machine: machine.hasNetworkedConnection(), connected_machine_stacks))
connected_machine_stacks.sort(key=lambda machine: machine.getName().upper())
if abstract_machine in other_machine_stacks:
other_machine_stacks.remove(abstract_machine)
if abstract_machine in online_machine_stacks:
online_machine_stacks.remove(abstract_machine)
if abstract_machine in connected_machine_stacks:
connected_machine_stacks.remove(abstract_machine)
# Create a list item for abstract machine
self.addItem(abstract_machine, True, len(online_machine_stacks))
self.addItem(abstract_machine, True, len(connected_machine_stacks))
# Create list of machines that are children of the abstract machine
for stack in online_machine_stacks:
for stack in connected_machine_stacks:
if self._show_cloud_printers:
self.addItem(stack, True)
# Remove this machine from the other stack list
@ -159,3 +159,8 @@ class MachineListModel(ListModel):
"machineCount": machine_count,
"catergory": "connected" if is_online else "other",
})
def getItems(self) -> Dict[str, Any]:
if self.count > 0:
return self.items
return {}

View file

@ -44,6 +44,10 @@ class MaterialBrandsModel(BaseMaterialsModel):
if bool(container_node.getMetaDataEntry("removed", False)):
continue
# Ignore materials that are marked as not visible for whatever reason
if not bool(container_node.getMetaDataEntry("visible", True)):
continue
# Add brands we haven't seen yet to the dict, skipping generics
brand = container_node.getMetaDataEntry("brand", "")
if brand.lower() == "generic":

View file

@ -344,7 +344,7 @@ class QualityManagementModel(ListModel):
"quality_type": quality_group.quality_type,
"quality_changes_group": None,
"intent_category": "default",
"section_name": catalog.i18nc("@label", "Default"),
"section_name": catalog.i18nc("@label", "Balanced"),
"layer_height": layer_height, # layer_height is only used for sorting
}
item_list.append(item)

View file

@ -60,7 +60,7 @@ class VariantNode(ContainerNode):
materials = list(materials_per_base_file.values())
# Filter materials based on the exclude_materials property.
filtered_materials = [material for material in materials if material["id"] not in self.machine.exclude_materials]
filtered_materials = [material for material in materials if not self.machine.isExcludedMaterial(material)]
for material in filtered_materials:
base_file = material["base_file"]
@ -148,7 +148,7 @@ class VariantNode(ContainerNode):
if "empty_material" in self.materials:
del self.materials["empty_material"]
self.materials[base_file] = MaterialNode(container.getId(), variant = self)
self.materials[base_file] = MaterialNode(container.getId(), variant = self, container = container)
self.materials[base_file].materialChanged.connect(self.materialsChanged)
self.materialsChanged.emit(self.materials[base_file])

View file

@ -14,17 +14,19 @@ from UM.Operations.TranslateOperation import TranslateOperation
from UM.Scene.Iterator.DepthFirstIterator import DepthFirstIterator
from UM.Scene.SceneNode import SceneNode
from UM.i18n import i18nCatalog
from cura.Arranging.Nest2DArrange import arrange, createGroupOperationForArrange
from cura.Arranging.GridArrange import GridArrange
from cura.Arranging.Nest2DArrange import Nest2DArrange
i18n_catalog = i18nCatalog("cura")
class MultiplyObjectsJob(Job):
def __init__(self, objects, count, min_offset = 8):
def __init__(self, objects, count: int, min_offset: int = 8, *, grid_arrange: bool = False):
super().__init__()
self._objects = objects
self._count = count
self._min_offset = min_offset
self._count: int = count
self._min_offset: int = min_offset
self._grid_arrange: bool = grid_arrange
def run(self) -> None:
status_message = Message(i18n_catalog.i18nc("@info:status", "Multiplying and placing objects"), lifetime = 0,
@ -39,7 +41,7 @@ class MultiplyObjectsJob(Job):
root = scene.getRoot()
processed_nodes = [] # type: List[SceneNode]
processed_nodes: List[SceneNode] = []
nodes = []
fixed_nodes = []
@ -76,12 +78,12 @@ class MultiplyObjectsJob(Job):
found_solution_for_all = True
group_operation = GroupedOperation()
if nodes:
group_operation, not_fit_count = createGroupOperationForArrange(nodes,
Application.getInstance().getBuildVolume(),
fixed_nodes,
factor = 10000,
add_new_nodes_in_scene = True)
found_solution_for_all = not_fit_count == 0
if self._grid_arrange:
arranger = GridArrange(nodes, Application.getInstance().getBuildVolume(), fixed_nodes)
else:
arranger = Nest2DArrange(nodes, Application.getInstance().getBuildVolume(), fixed_nodes, factor=1000)
group_operation, not_fit_count = arranger.createGroupOperationForArrange(add_new_nodes_in_scene=True)
if nodes_to_add_without_arrange:
for nested_node in nodes_to_add_without_arrange:

View file

@ -16,6 +16,7 @@ from UM.TaskManagement.HttpRequestManager import HttpRequestManager # To downlo
catalog = i18nCatalog("cura")
TOKEN_TIMESTAMP_FORMAT = "%Y-%m-%d %H:%M:%S"
REQUEST_TIMEOUT = 5 # Seconds
class AuthorizationHelpers:
@ -40,6 +41,7 @@ class AuthorizationHelpers:
"""
data = {
"client_id": self._settings.CLIENT_ID if self._settings.CLIENT_ID is not None else "",
"client_secret": self._settings.CLIENT_SECRET if self._settings.CLIENT_SECRET is not None else "",
"redirect_uri": self._settings.CALLBACK_URL if self._settings.CALLBACK_URL is not None else "",
"grant_type": "authorization_code",
"code": authorization_code,
@ -52,7 +54,8 @@ class AuthorizationHelpers:
data = urllib.parse.urlencode(data).encode("UTF-8"),
headers_dict = headers,
callback = lambda response: self.parseTokenResponse(response, callback),
error_callback = lambda response, _: self.parseTokenResponse(response, callback)
error_callback = lambda response, _: self.parseTokenResponse(response, callback),
timeout = REQUEST_TIMEOUT
)
def getAccessTokenUsingRefreshToken(self, refresh_token: str, callback: Callable[[AuthenticationResponse], None]) -> None:
@ -64,6 +67,7 @@ class AuthorizationHelpers:
Logger.log("d", "Refreshing the access token for [%s]", self._settings.OAUTH_SERVER_URL)
data = {
"client_id": self._settings.CLIENT_ID if self._settings.CLIENT_ID is not None else "",
"client_secret": self._settings.CLIENT_SECRET if self._settings.CLIENT_SECRET is not None else "",
"redirect_uri": self._settings.CALLBACK_URL if self._settings.CALLBACK_URL is not None else "",
"grant_type": "refresh_token",
"refresh_token": refresh_token,
@ -75,7 +79,9 @@ class AuthorizationHelpers:
data = urllib.parse.urlencode(data).encode("UTF-8"),
headers_dict = headers,
callback = lambda response: self.parseTokenResponse(response, callback),
error_callback = lambda response, _: self.parseTokenResponse(response, callback)
error_callback = lambda response, _: self.parseTokenResponse(response, callback),
urgent = True,
timeout = REQUEST_TIMEOUT
)
def parseTokenResponse(self, token_response: QNetworkReply, callback: Callable[[AuthenticationResponse], None]) -> None:
@ -120,7 +126,8 @@ class AuthorizationHelpers:
check_token_url,
headers_dict = headers,
callback = lambda reply: self._parseUserProfile(reply, success_callback, failed_callback),
error_callback = lambda _, _2: failed_callback() if failed_callback is not None else None
error_callback = lambda _, _2: failed_callback() if failed_callback is not None else None,
timeout = REQUEST_TIMEOUT
)
def _parseUserProfile(self, reply: QNetworkReply, success_callback: Optional[Callable[[UserProfile], None]], failed_callback: Optional[Callable[[], None]] = None) -> None:

View file

@ -6,6 +6,7 @@ from threading import Lock # To turn an asynchronous call synchronous.
from typing import Optional, Callable, Tuple, Dict, Any, List, TYPE_CHECKING
from urllib.parse import parse_qs, urlparse
from UM.Logger import Logger
from cura.OAuth2.Models import AuthenticationResponse, ResponseData, HTTP_STATUS
from UM.i18n import i18nCatalog
@ -70,11 +71,13 @@ class AuthorizationRequestHandler(BaseHTTPRequestHandler):
code = self._queryGet(query, "code")
state = self._queryGet(query, "state")
if state != self.state:
Logger.log("w", f"The provided state was not correct. Got {state} and expected {self.state}")
token_response = AuthenticationResponse(
success = False,
err_message = catalog.i18nc("@message", "The provided state is not correct.")
)
elif code and self.authorization_helpers is not None and self.verification_code is not None:
Logger.log("d", "Timeout when authenticating with the account server.")
token_response = AuthenticationResponse(
success = False,
err_message = catalog.i18nc("@message", "Timeout when authenticating with the account server.")
@ -92,6 +95,7 @@ class AuthorizationRequestHandler(BaseHTTPRequestHandler):
elif self._queryGet(query, "error_code") == "user_denied":
# Otherwise we show an error message (probably the user clicked "Deny" in the auth dialog).
Logger.log("d", "User did not give the required permission when authorizing this application")
token_response = AuthenticationResponse(
success = False,
err_message = catalog.i18nc("@message", "Please give the required permissions when authorizing this application.")
@ -99,6 +103,7 @@ class AuthorizationRequestHandler(BaseHTTPRequestHandler):
else:
# We don't know what went wrong here, so instruct the user to check the logs.
Logger.log("w", f"Unexpected error when logging in. Error_code: {self._queryGet(query, 'error_code')}, State: {state}")
token_response = AuthenticationResponse(
success = False,
error_message = catalog.i18nc("@message", "Something unexpected happened when trying to log in, please try again.")

View file

@ -1,4 +1,4 @@
# Copyright (c) 2021 Ultimaker B.V.
# Copyright (c) 2024 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.
import json
@ -6,13 +6,14 @@ from datetime import datetime, timedelta
from typing import Callable, Dict, Optional, TYPE_CHECKING, Union
from urllib.parse import urlencode, quote_plus
from PyQt6.QtCore import QUrl
from PyQt6.QtCore import QUrl, QTimer
from PyQt6.QtGui import QDesktopServices
from UM.Logger import Logger
from UM.Message import Message
from UM.Signal import Signal
from UM.i18n import i18nCatalog
from UM.TaskManagement.HttpRequestManager import HttpRequestManager # To download log-in tokens.
from cura.OAuth2.AuthorizationHelpers import AuthorizationHelpers, TOKEN_TIMESTAMP_FORMAT
from cura.OAuth2.LocalAuthorizationServer import LocalAuthorizationServer
from cura.OAuth2.Models import AuthenticationResponse, BaseModel
@ -25,26 +26,32 @@ if TYPE_CHECKING:
MYCLOUD_LOGOFF_URL = "https://account.ultimaker.com/logoff?utm_source=cura&utm_medium=software&utm_campaign=change-account-before-adding-printers"
REFRESH_TOKEN_MAX_RETRIES = 15
REFRESH_TOKEN_RETRY_INTERVAL = 1000
class AuthorizationService:
"""The authorization service is responsible for handling the login flow, storing user credentials and providing
account information.
"""
# Emit signal when authentication is completed.
onAuthStateChanged = Signal()
def __init__(self,
settings: "OAuth2Settings",
preferences: Optional["Preferences"] = None,
get_user_profile: bool = True) -> None:
# Emit signal when authentication is completed.
self.onAuthStateChanged = Signal()
# Emit signal when authentication failed.
onAuthenticationError = Signal()
# Emit signal when authentication failed.
self.onAuthenticationError = Signal()
accessTokenChanged = Signal()
self.accessTokenChanged = Signal()
def __init__(self, settings: "OAuth2Settings", preferences: Optional["Preferences"] = None) -> None:
self._settings = settings
self._auth_helpers = AuthorizationHelpers(settings)
self._auth_url = "{}/authorize".format(self._settings.OAUTH_SERVER_URL)
self._auth_data: Optional[AuthenticationResponse] = None
self._user_profile: Optional["UserProfile"] = None
self._get_user_profile: bool = get_user_profile
self._preferences = preferences
self._server = LocalAuthorizationServer(self._auth_helpers, self._onAuthStateChanged, daemon=True)
self._currently_refreshing_token = False # Whether we are currently in the process of refreshing auth. Don't make new requests while busy.
@ -53,6 +60,12 @@ class AuthorizationService:
self.onAuthStateChanged.connect(self._authChanged)
self._refresh_token_retries = 0
self._refresh_token_retry_timer = QTimer()
self._refresh_token_retry_timer.setInterval(REFRESH_TOKEN_RETRY_INTERVAL)
self._refresh_token_retry_timer.setSingleShot(True)
self._refresh_token_retry_timer.timeout.connect(self.refreshAccessToken)
def _authChanged(self, logged_in):
if logged_in and self._unable_to_get_data_message is not None:
self._unable_to_get_data_message.hide()
@ -163,16 +176,29 @@ class AuthorizationService:
return
def process_auth_data(response: AuthenticationResponse) -> None:
self._currently_refreshing_token = False
if response.success:
self._refresh_token_retries = 0
self._storeAuthData(response)
HttpRequestManager.getInstance().setDelayRequests(False)
self.onAuthStateChanged.emit(logged_in = True)
else:
Logger.warning("Failed to get a new access token from the server.")
self.onAuthStateChanged.emit(logged_in = False)
if self._refresh_token_retries >= REFRESH_TOKEN_MAX_RETRIES:
self._refresh_token_retries = 0
Logger.warning("Failed to get a new access token from the server, giving up.")
HttpRequestManager.getInstance().setDelayRequests(False)
self.onAuthStateChanged.emit(logged_in = False)
else:
# Retry a bit later, network may be offline right now and will hopefully be back soon
Logger.warning("Failed to get a new access token from the server, retrying later.")
self._refresh_token_retries += 1
self._refresh_token_retry_timer.start()
if self._currently_refreshing_token:
Logger.debug("Was already busy refreshing token. Do not start a new request.")
return
HttpRequestManager.getInstance().setDelayRequests(True)
self._currently_refreshing_token = True
self._auth_helpers.getAccessTokenUsingRefreshToken(self._auth_data.refresh_token, process_auth_data)
@ -279,7 +305,8 @@ class AuthorizationService:
message_type = Message.MessageType.ERROR)
Logger.warning("Unable to get user profile using auth data from preferences.")
self._unable_to_get_data_message.show()
self.getUserProfile(callback)
if self._get_user_profile:
self.getUserProfile(callback)
except (ValueError, TypeError):
Logger.logException("w", "Could not load auth data from preferences")
@ -294,7 +321,8 @@ class AuthorizationService:
self._auth_data = auth_data
self._currently_refreshing_token = False
if auth_data:
self.getUserProfile()
if self._get_user_profile:
self.getUserProfile()
self._preferences.setValue(self._settings.AUTH_DATA_PREFERENCE_KEY, json.dumps(auth_data.dump()))
else:
Logger.log("d", "Clearing the user profile")

View file

@ -16,6 +16,7 @@ class OAuth2Settings(BaseModel):
CALLBACK_PORT = None # type: Optional[int]
OAUTH_SERVER_URL = None # type: Optional[str]
CLIENT_ID = None # type: Optional[str]
CLIENT_SECRET = None # type: Optional[str]
CLIENT_SCOPES = None # type: Optional[str]
CALLBACK_URL = None # type: Optional[str]
AUTH_DATA_PREFERENCE_KEY = "" # type: str

View file

@ -7,6 +7,11 @@ from UM.Scene.Iterator import Iterator
from UM.Scene.SceneNode import SceneNode
from functools import cmp_to_key
from cura.HitChecker import HitChecker
from cura.PrintOrderManager import PrintOrderManager
from cura.Scene.CuraSceneNode import CuraSceneNode
class OneAtATimeIterator(Iterator.Iterator):
"""Iterator that returns a list of nodes in the order that they need to be printed
@ -16,8 +21,6 @@ class OneAtATimeIterator(Iterator.Iterator):
def __init__(self, scene_node) -> None:
super().__init__(scene_node) # Call super to make multiple inheritance work.
self._hit_map = [[]] # type: List[List[bool]] # For each node, which other nodes this hits. A grid of booleans on which nodes hit which.
self._original_node_list = [] # type: List[SceneNode] # The nodes that need to be checked for collisions.
def _fillStack(self) -> None:
"""Fills the ``_node_stack`` with a list of scene nodes that need to be printed in order. """
@ -38,104 +41,50 @@ class OneAtATimeIterator(Iterator.Iterator):
self._node_stack = node_list[:]
return
# Copy the list
self._original_node_list = node_list[:]
hit_checker = HitChecker(node_list)
# Initialise the hit map (pre-compute all hits between all objects)
self._hit_map = [[self._checkHit(i, j) for i in node_list] for j in node_list]
if PrintOrderManager.isUserDefinedPrintOrderEnabled():
self._node_stack = self._getNodesOrderedByUser(hit_checker, node_list)
else:
self._node_stack = self._getNodesOrderedAutomatically(hit_checker, node_list)
# Check if we have two files that block each other. If this is the case, there is no solution!
for a in range(0, len(node_list)):
for b in range(0, len(node_list)):
if a != b and self._hit_map[a][b] and self._hit_map[b][a]:
return
# update print orders so that the user can first try to arrange the nodes automatically
# and, if the result is not satisfactory, switch to manual mode and change it
for index, node in enumerate(self._node_stack):
node.printOrder = index + 1
@staticmethod
def _getNodesOrderedByUser(hit_checker: HitChecker, node_list: List[CuraSceneNode]) -> List[CuraSceneNode]:
nodes_ordered_by_user = sorted(node_list, key=lambda n: n.printOrder)
if hit_checker.canPrintNodesInProvidedOrder(nodes_ordered_by_user):
return nodes_ordered_by_user
return [] # No solution
@staticmethod
def _getNodesOrderedAutomatically(hit_checker: HitChecker, node_list: List[CuraSceneNode]) -> List[CuraSceneNode]:
# Check if we have two files that block each other. If this is the case, there is no solution!
if hit_checker.anyTwoNodesBlockEachOther(node_list):
return [] # No solution
# Sort the original list so that items that block the most other objects are at the beginning.
# This does not decrease the worst case running time, but should improve it in most cases.
sorted(node_list, key = cmp_to_key(self._calculateScore))
node_list = sorted(node_list, key = cmp_to_key(hit_checker.calculateScore))
todo_node_list = [_ObjectOrder([], node_list)]
while len(todo_node_list) > 0:
current = todo_node_list.pop()
for node in current.todo:
# Check if the object can be placed with what we have and still allows for a solution in the future
if not self._checkHitMultiple(node, current.order) and not self._checkBlockMultiple(node, current.todo):
if hit_checker.canPrintAfter(node, current.order) and hit_checker.canPrintBefore(node, current.todo):
# We found a possible result. Create new todo & order list.
new_todo_list = current.todo[:]
new_todo_list.remove(node)
new_order = current.order[:] + [node]
if len(new_todo_list) == 0:
# We have no more nodes to check, so quit looking.
self._node_stack = new_order
return
return new_order # Solution found!
todo_node_list.append(_ObjectOrder(new_order, new_todo_list))
self._node_stack = [] #No result found!
# Check if first object can be printed before the provided list (using the hit map)
def _checkHitMultiple(self, node: SceneNode, other_nodes: List[SceneNode]) -> bool:
node_index = self._original_node_list.index(node)
for other_node in other_nodes:
other_node_index = self._original_node_list.index(other_node)
if self._hit_map[node_index][other_node_index]:
return True
return False
def _checkBlockMultiple(self, node: SceneNode, other_nodes: List[SceneNode]) -> bool:
"""Check for a node whether it hits any of the other nodes.
:param node: The node to check whether it collides with the other nodes.
:param other_nodes: The nodes to check for collisions.
:return: returns collision between nodes
"""
node_index = self._original_node_list.index(node)
for other_node in other_nodes:
other_node_index = self._original_node_list.index(other_node)
if self._hit_map[other_node_index][node_index] and node_index != other_node_index:
return True
return False
def _calculateScore(self, a: SceneNode, b: SceneNode) -> int:
"""Calculate score simply sums the number of other objects it 'blocks'
:param a: node
:param b: node
:return: sum of the number of other objects
"""
score_a = sum(self._hit_map[self._original_node_list.index(a)])
score_b = sum(self._hit_map[self._original_node_list.index(b)])
return score_a - score_b
def _checkHit(self, a: SceneNode, b: SceneNode) -> bool:
"""Checks if a can be printed before b
:param a: node
:param b: node
:return: true if a can be printed before b
"""
if a == b:
return False
a_hit_hull = a.callDecoration("getConvexHullBoundary")
b_hit_hull = b.callDecoration("getConvexHullHeadFull")
overlap = a_hit_hull.intersectsPolygon(b_hit_hull)
if overlap:
return True
# Adhesion areas must never overlap, regardless of printing order
# This would cause over-extrusion
a_hit_hull = a.callDecoration("getAdhesionArea")
b_hit_hull = b.callDecoration("getAdhesionArea")
overlap = a_hit_hull.intersectsPolygon(b_hit_hull)
if overlap:
return True
else:
return False
return [] # No result found!
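For clarity, a self-contained sketch of the ordering search implemented above, using plain strings and a made-up hit map (hit_map[a][b] is True when a cannot be printed before b):

```python
from typing import Dict, List, Optional

def order_for_printing(nodes: List[str], hit_map: Dict[str, Dict[str, bool]]) -> Optional[List[str]]:
    # Depth-first search over partial orders, mirroring _getNodesOrderedAutomatically:
    # a node may be placed next when nothing already placed blocks it (canPrintAfter)
    # and it does not block anything still to be placed (canPrintBefore).
    todo = [([], list(nodes))]
    while todo:
        order, remaining = todo.pop()
        for node in remaining:
            can_after = all(not hit_map[placed][node] for placed in order)
            can_before = all(not hit_map[node][other] for other in remaining if other != node)
            if can_after and can_before:
                new_order = order + [node]
                new_remaining = [n for n in remaining if n != node]
                if not new_remaining:
                    return new_order  # solution found
                todo.append((new_order, new_remaining))
    return None  # no valid one-at-a-time order exists

hit_map = {"A": {"A": False, "B": True,  "C": False},
           "B": {"A": False, "B": False, "C": False},
           "C": {"A": False, "B": False, "C": False}}
print(order_for_printing(["A", "B", "C"], hit_map))
# e.g. ['C', 'B', 'A']: A is printed last because it cannot precede B
```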
class _ObjectOrder:

View file

@ -39,6 +39,11 @@ class PlatformPhysics:
Application.getInstance().getPreferences().addPreference("physics/automatic_push_free", False)
Application.getInstance().getPreferences().addPreference("physics/automatic_drop_down", True)
self._app_all_model_drop = False
def setAppAllModelDropDown(self):
self._app_all_model_drop = True
self._onChangeTimerFinished()
def _onSceneChanged(self, source):
if not source.callDecoration("isSliceable"):
@ -80,12 +85,12 @@ class PlatformPhysics:
# Move it downwards if bottom is above platform
move_vector = Vector()
if node.getSetting(SceneNodeSettings.AutoDropDown, app_automatic_drop_down) and not (node.getParent() and node.getParent().callDecoration("isGroup") or node.getParent() != root) and node.isEnabled(): #If an object is grouped, don't move it down
if (node.getSetting(SceneNodeSettings.AutoDropDown, app_automatic_drop_down) or self._app_all_model_drop) and not (node.getParent() and node.getParent().callDecoration("isGroup") or node.getParent() != root) and node.isEnabled():
z_offset = node.callDecoration("getZOffset") if node.getDecorator(ZOffsetDecorator.ZOffsetDecorator) else 0
move_vector = move_vector.set(y = -bbox.bottom + z_offset)
move_vector = move_vector.set(y=-bbox.bottom + z_offset)
# If there is no convex hull for the node, start calculating it and continue.
if not node.getDecorator(ConvexHullDecorator) and not node.callDecoration("isNonPrintingMesh"):
if not node.getDecorator(ConvexHullDecorator) and not node.callDecoration("isNonPrintingMesh") and node.callDecoration("getLayerData") is None:
node.addDecorator(ConvexHullDecorator())
# only push away objects if this node is a printing mesh
@ -168,6 +173,8 @@ class PlatformPhysics:
op = PlatformPhysicsOperation.PlatformPhysicsOperation(node, move_vector)
op.push()
# Reset the one-shot "drop all models" flag so that dropping follows app_automatic_drop_down again
self._app_all_model_drop = False
# After moving, we have to evaluate the boundary checks for nodes
build_volume.updateNodeBoundaryCheck()

View file

@ -45,17 +45,17 @@ class PreviewPass(RenderPass):
This is useful to get a preview image of a scene taken from a different location than the active camera.
"""
def __init__(self, width: int, height: int) -> None:
def __init__(self, width: int, height: int, *, root: CuraSceneNode = None) -> None:
super().__init__("preview", width, height, 0)
self._camera = None # type: Optional[Camera]
self._camera: Optional[Camera] = None
self._renderer = Application.getInstance().getRenderer()
self._shader = None # type: Optional[ShaderProgram]
self._non_printing_shader = None # type: Optional[ShaderProgram]
self._support_mesh_shader = None # type: Optional[ShaderProgram]
self._scene = Application.getInstance().getController().getScene()
self._shader: Optional[ShaderProgram] = None
self._non_printing_shader: Optional[ShaderProgram] = None
self._support_mesh_shader: Optional[ShaderProgram] = None
self._root = Application.getInstance().getController().getScene().getRoot() if root is None else root
# Set the camera to be used by this render pass
# if it's None, the active camera is used
@ -96,7 +96,7 @@ class PreviewPass(RenderPass):
batch_support_mesh = RenderBatch(self._support_mesh_shader)
# Fill up the batch with objects that can be sliced.
for node in DepthFirstIterator(self._scene.getRoot()):
for node in DepthFirstIterator(self._root):
if hasattr(node, "_outside_buildarea") and not getattr(node, "_outside_buildarea"):
if node.callDecoration("isSliceable") and node.getMeshData() and node.isVisible():
per_mesh_stack = node.callDecoration("getStack")

174
cura/PrintOrderManager.py Normal file
View file

@ -0,0 +1,174 @@
from typing import List, Callable, Optional, Any, Tuple
from PyQt6.QtCore import pyqtProperty, pyqtSignal, QObject, pyqtSlot
from UM.Application import Application
from UM.Scene.Selection import Selection
from cura.Scene.CuraSceneNode import CuraSceneNode
class PrintOrderManager(QObject):
"""Allows to order the object list to set the print sequence manually"""
def __init__(self, get_nodes: Callable[[], List[CuraSceneNode]]) -> None:
super().__init__()
self._get_nodes = get_nodes
self._configureEvents()
_settingsChanged = pyqtSignal()
_uiActionsOutdated = pyqtSignal()
printOrderChanged = pyqtSignal()
@pyqtSlot()
def swapSelectedAndPreviousNodes(self) -> None:
selected_node, previous_node, next_node = self._getSelectedAndNeighborNodes()
self._swapPrintOrders(selected_node, previous_node)
@pyqtSlot()
def swapSelectedAndNextNodes(self) -> None:
selected_node, previous_node, next_node = self._getSelectedAndNeighborNodes()
self._swapPrintOrders(selected_node, next_node)
@pyqtProperty(str, notify=_uiActionsOutdated)
def previousNodeName(self) -> str:
selected_node, previous_node, next_node = self._getSelectedAndNeighborNodes()
return self._getNodeName(previous_node)
@pyqtProperty(str, notify=_uiActionsOutdated)
def nextNodeName(self) -> str:
selected_node, previous_node, next_node = self._getSelectedAndNeighborNodes()
return self._getNodeName(next_node)
@pyqtProperty(bool, notify=_uiActionsOutdated)
def shouldEnablePrintBeforeAction(self) -> bool:
selected_node, previous_node, next_node = self._getSelectedAndNeighborNodes()
can_swap_with_previous_node = selected_node is not None and previous_node is not None
return can_swap_with_previous_node
@pyqtProperty(bool, notify=_uiActionsOutdated)
def shouldEnablePrintAfterAction(self) -> bool:
selected_node, previous_node, next_node = self._getSelectedAndNeighborNodes()
can_swap_with_next_node = selected_node is not None and next_node is not None
return can_swap_with_next_node
@pyqtProperty(bool, notify=_settingsChanged)
def shouldShowEditPrintOrderActions(self) -> bool:
return PrintOrderManager.isUserDefinedPrintOrderEnabled()
@staticmethod
def isUserDefinedPrintOrderEnabled() -> bool:
stack = Application.getInstance().getGlobalContainerStack()
is_enabled = stack and \
stack.getProperty("print_sequence", "value") == "one_at_a_time" and \
stack.getProperty("user_defined_print_order_enabled", "value")
return bool(is_enabled)
@staticmethod
def initializePrintOrders(nodes: List[CuraSceneNode]) -> None:
"""Just created (loaded from file) nodes have print order 0.
This method initializes print orders with max value to put nodes at the end of object list"""
max_print_order = max(map(lambda n: n.printOrder, nodes), default=0)
for node in nodes:
if node.printOrder == 0:
max_print_order += 1
node.printOrder = max_print_order
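A minimal sketch of the initialization rule described in the docstring above, using a tiny stand-in node class and made-up names:

```python
class FakeNode:
    def __init__(self, name: str, print_order: int = 0) -> None:
        self.name = name
        self.printOrder = print_order

def initialize_print_orders(nodes) -> None:
    # Mirrors PrintOrderManager.initializePrintOrders: freshly loaded nodes
    # (printOrder == 0) are appended after the current maximum.
    max_print_order = max((n.printOrder for n in nodes), default=0)
    for node in nodes:
        if node.printOrder == 0:
            max_print_order += 1
            node.printOrder = max_print_order

nodes = [FakeNode("existing", 1), FakeNode("new_a"), FakeNode("new_b")]
initialize_print_orders(nodes)
print([(n.name, n.printOrder) for n in nodes])
# [('existing', 1), ('new_a', 2), ('new_b', 3)]
```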
@staticmethod
def updatePrintOrdersAfterGroupOperation(
all_nodes: List[CuraSceneNode],
group_node: CuraSceneNode,
grouped_nodes: List[CuraSceneNode]
) -> None:
group_node.printOrder = min(map(lambda n: n.printOrder, grouped_nodes))
all_nodes.append(group_node)
for node in grouped_nodes:
all_nodes.remove(node)
# reassign print orders so there won't be gaps like 1 2 5 6 7
sorted_nodes = sorted(all_nodes, key=lambda n: n.printOrder)
for i, node in enumerate(sorted_nodes):
node.printOrder = i + 1
@staticmethod
def updatePrintOrdersAfterUngroupOperation(
all_nodes: List[CuraSceneNode],
group_node: CuraSceneNode,
ungrouped_nodes: List[CuraSceneNode]
) -> None:
all_nodes.remove(group_node)
nodes_to_update_print_order = filter(lambda n: n.printOrder > group_node.printOrder, all_nodes)
for node in nodes_to_update_print_order:
node.printOrder += len(ungrouped_nodes) - 1
for i, child in enumerate(ungrouped_nodes):
child.printOrder = group_node.printOrder + i
all_nodes.append(child)
def _swapPrintOrders(self, node1: CuraSceneNode, node2: CuraSceneNode) -> None:
if node1 and node2:
node1.printOrder, node2.printOrder = node2.printOrder, node1.printOrder # swap print orders
self.printOrderChanged.emit() # update object list first
self._uiActionsOutdated.emit() # then update UI actions
def _getSelectedAndNeighborNodes(self) -> Tuple[Optional[CuraSceneNode], Optional[CuraSceneNode], Optional[CuraSceneNode]]:
nodes = self._get_nodes()
ordered_nodes = sorted(nodes, key=lambda n: n.printOrder)
for i, node in enumerate(ordered_nodes, 1):
node.printOrder = i
selected_node = PrintOrderManager._getSingleSelectedNode()
if selected_node and selected_node in ordered_nodes:
selected_node_index = ordered_nodes.index(selected_node)
else:
selected_node_index = None
if selected_node_index is not None and selected_node_index - 1 >= 0:
previous_node = ordered_nodes[selected_node_index - 1]
else:
previous_node = None
if selected_node_index is not None and selected_node_index + 1 < len(ordered_nodes):
next_node = ordered_nodes[selected_node_index + 1]
else:
next_node = None
return selected_node, previous_node, next_node
@staticmethod
def _getNodeName(node: CuraSceneNode, max_length: int = 30) -> str:
node_name = node.getName() if node else ""
truncated_node_name = node_name[:max_length]
return truncated_node_name
@staticmethod
def _getSingleSelectedNode() -> Optional[CuraSceneNode]:
if len(Selection.getAllSelectedObjects()) == 1:
selected_node = Selection.getSelectedObject(0)
return selected_node
return None
def _configureEvents(self) -> None:
Selection.selectionChanged.connect(self._onSelectionChanged)
self._global_stack = None
Application.getInstance().globalContainerStackChanged.connect(self._onGlobalStackChanged)
self._onGlobalStackChanged()
def _onGlobalStackChanged(self) -> None:
if self._global_stack:
self._global_stack.propertyChanged.disconnect(self._onSettingsChanged)
self._global_stack.containersChanged.disconnect(self._onSettingsChanged)
self._global_stack = Application.getInstance().getGlobalContainerStack()
if self._global_stack:
self._global_stack.propertyChanged.connect(self._onSettingsChanged)
self._global_stack.containersChanged.connect(self._onSettingsChanged)
def _onSettingsChanged(self, *args: Any) -> None:
self._settingsChanged.emit()
def _onSelectionChanged(self) -> None:
self._uiActionsOutdated.emit()

View file

@ -40,9 +40,22 @@ class ExtruderConfigurationModel(QObject):
def setHotendID(self, hotend_id: Optional[str]) -> None:
if self._hotend_id != hotend_id:
self._hotend_id = hotend_id
self._hotend_id = ExtruderConfigurationModel.applyNameMappingHotend(hotend_id)
self.extruderConfigurationChanged.emit()
@staticmethod
def applyNameMappingHotend(hotendId) -> str:
_EXTRUDER_NAME_MAP = {
"mk14_hot":"1XA",
"mk14_hot_s":"2XA",
"mk14_c":"1C",
"mk14":"1A",
"mk14_s":"2A"
}
if hotendId in _EXTRUDER_NAME_MAP:
return _EXTRUDER_NAME_MAP[hotendId]
return hotendId
@pyqtProperty(str, fset = setHotendID, notify = extruderConfigurationChanged)
def hotendID(self) -> Optional[str]:
return self._hotend_id

View file

@ -9,6 +9,8 @@ from PyQt6.QtCore import pyqtProperty, QObject
class MaterialOutputModel(QObject):
def __init__(self, guid: Optional[str], type: str, color: str, brand: str, name: str, parent = None) -> None:
super().__init__(parent)
name, guid = MaterialOutputModel.getMaterialFromDefinition(guid, type, brand, name)
self._guid = guid
self._type = type
self._color = color
@ -19,6 +21,34 @@ class MaterialOutputModel(QObject):
def guid(self) -> str:
return self._guid if self._guid else ""
@staticmethod
def getMaterialFromDefinition(guid, type, brand, name):
_MATERIAL_MAP = { "abs" :{"name" :"ABS" ,"guid": "2780b345-577b-4a24-a2c5-12e6aad3e690"},
"abs-cf10" :{"name": "ABS-CF" ,"guid": "495a0ce5-9daf-4a16-b7b2-06856d82394d"},
"abs-wss1" :{"name" :"ABS-R" ,"guid": "88c8919c-6a09-471a-b7b6-e801263d862d"},
"asa" :{"name" :"ASA" ,"guid": "f79bc612-21eb-482e-ad6c-87d75bdde066"},
"nylon12-cf":{"name": "Nylon 12 CF" ,"guid": "3c6f2877-71cc-4760-84e6-4b89ab243e3b"},
"nylon" :{"name" :"Nylon" ,"guid": "283d439a-3490-4481-920c-c51d8cdecf9c"},
"pc" :{"name" :"PC" ,"guid": "62414577-94d1-490d-b1e4-7ef3ec40db02"},
"petg" :{"name" :"PETG" ,"guid": "69386c85-5b6c-421a-bec5-aeb1fb33f060"},
"pla" :{"name" :"PLA" ,"guid": "0ff92885-617b-4144-a03c-9989872454bc"},
"pva" :{"name" :"PVA" ,"guid": "a4255da2-cb2a-4042-be49-4a83957a2f9a"},
"wss1" :{"name" :"RapidRinse" ,"guid": "a140ef8f-4f26-4e73-abe0-cfc29d6d1024"},
"sr30" :{"name" :"SR-30" ,"guid": "77873465-83a9-4283-bc44-4e542b8eb3eb"},
"bvoh" :{"name" :"BVOH" ,"guid": "923e604c-8432-4b09-96aa-9bbbd42207f4"},
"cpe" :{"name" :"CPE" ,"guid": "da1872c1-b991-4795-80ad-bdac0f131726"},
"hips" :{"name" :"HIPS" ,"guid": "a468d86a-220c-47eb-99a5-bbb47e514eb0"},
"tpu" :{"name" :"TPU 95A" ,"guid": "19baa6a9-94ff-478b-b4a1-8157b74358d2"}
}
if guid is None and brand != "empty" and type in _MATERIAL_MAP:
name = _MATERIAL_MAP[type]["name"]
guid = _MATERIAL_MAP[type]["guid"]
return name, guid
@pyqtProperty(str, constant = True)
def type(self) -> str:
return self._type

View file

@ -1,6 +1,8 @@
# Copyright (c) 2018 Aldo Hoeben / fieldOfView
# NetworkMJPGImage is released under the terms of the LGPLv3 or higher.
from typing import Optional
from PyQt6.QtCore import QUrl, pyqtProperty, pyqtSignal, pyqtSlot, QRect, QByteArray
from PyQt6.QtGui import QImage, QPainter
from PyQt6.QtQuick import QQuickPaintedItem
@ -19,9 +21,9 @@ class NetworkMJPGImage(QQuickPaintedItem):
self._stream_buffer = QByteArray()
self._stream_buffer_start_index = -1
self._network_manager = None # type: QNetworkAccessManager
self._image_request = None # type: QNetworkRequest
self._image_reply = None # type: QNetworkReply
self._network_manager: Optional[QNetworkAccessManager] = None
self._image_request: Optional[QNetworkRequest] = None
self._image_reply: Optional[QNetworkReply] = None
self._image = QImage()
self._image_rect = QRect()

View file

@ -415,7 +415,18 @@ class NetworkedPrinterOutputDevice(PrinterOutputDevice):
@pyqtProperty(str, constant = True)
def printerType(self) -> str:
return self._properties.get(b"printer_type", b"Unknown").decode("utf-8")
return NetworkedPrinterOutputDevice.applyPrinterTypeMapping(self._properties.get(b"printer_type", b"Unknown").decode("utf-8"))
@staticmethod
def applyPrinterTypeMapping(printer_type):
_PRINTER_TYPE_NAME = {
"fire_e": "ultimaker_method",
"lava_f": "ultimaker_methodx",
"magma_10": "ultimaker_methodxl"
}
if printer_type in _PRINTER_TYPE_NAME:
return _PRINTER_TYPE_NAME[printer_type]
return printer_type
@pyqtProperty(str, constant = True)
def ipAddress(self) -> str:

View file

@ -111,11 +111,7 @@ class ConvexHullDecorator(SceneNodeDecorator):
# Parent can be None if node is just loaded.
if self._isSingularOneAtATimeNode():
hull = self.getConvexHullHeadFull()
if hull is None:
return None
hull = self._add2DAdhesionMargin(hull)
return hull
return self.getConvexHullHeadFull()
return self._compute2DConvexHull()
@ -323,6 +319,7 @@ class ConvexHullDecorator(SceneNodeDecorator):
def _compute2DConvexHeadFull(self) -> Optional[Polygon]:
convex_hull = self._compute2DConvexHull()
convex_hull = self._add2DAdhesionMargin(convex_hull)
if convex_hull:
return convex_hull.getMinkowskiHull(self._getHeadAndFans())
return None

View file

@ -11,6 +11,7 @@ from UM.Scene.SceneNode import SceneNode
from UM.Scene.SceneNodeDecorator import SceneNodeDecorator # To cast the deepcopy of every decorator back to SceneNodeDecorator.
import cura.CuraApplication # To get the build plate.
from UM.Scene.SceneNodeSettings import SceneNodeSettings
from cura.Settings.ExtruderStack import ExtruderStack # For typing.
from cura.Settings.SettingOverrideDecorator import SettingOverrideDecorator # For per-object settings.
@ -25,13 +26,26 @@ class CuraSceneNode(SceneNode):
if not no_setting_override:
self.addDecorator(SettingOverrideDecorator()) # Now we always have a getActiveExtruderPosition, unless explicitly disabled
self._outside_buildarea = False
self._print_order = 0
def setOutsideBuildArea(self, new_value: bool) -> None:
self._outside_buildarea = new_value
@property
def printOrder(self):
return self._print_order
@printOrder.setter
def printOrder(self, new_value):
self._print_order = new_value
def isOutsideBuildArea(self) -> bool:
return self._outside_buildarea or self.callDecoration("getBuildPlateNumber") < 0
@property
def isDropDownEnabled(self) -> bool:
return self.getSetting(SceneNodeSettings.AutoDropDown, Application.getInstance().getPreferences().getValue("physics/automatic_drop_down"))
def isVisible(self) -> bool:
return super().isVisible() and self.callDecoration("getBuildPlateNumber") == cura.CuraApplication.CuraApplication.getInstance().getMultiBuildPlateModel().activeBuildPlate
@ -157,3 +171,6 @@ class CuraSceneNode(SceneNode):
def transformChanged(self) -> None:
self._transformChanged()
def __repr__(self) -> str:
return "{print_order}. {name}".format(print_order = self._print_order, name = self.getName())

View file

@ -359,7 +359,7 @@ class CuraContainerStack(ContainerStack):
return self.definition
@classmethod
def _findInstanceContainerDefinitionId(cls, machine_definition: DefinitionContainerInterface) -> str:
def findInstanceContainerDefinitionId(cls, machine_definition: DefinitionContainerInterface) -> str:
"""Find the ID that should be used when searching for instance containers for a specified definition.
This handles the situation where the definition specifies we should use a different definition when
@ -379,7 +379,7 @@ class CuraContainerStack(ContainerStack):
Logger.log("w", "Unable to find parent definition {parent} for machine {machine}", parent = quality_definition, machine = machine_definition.id) #type: ignore
return machine_definition.id #type: ignore
return cls._findInstanceContainerDefinitionId(definitions[0])
return cls.findInstanceContainerDefinitionId(definitions[0])
def getExtruderPositionValueWithDefault(self, key):
"""getProperty for extruder positions, with translation from -1 to default extruder number"""

View file

@ -56,11 +56,12 @@ class CuraFormulaFunctions:
if isinstance(value, SettingFunction):
value = value(extruder_stack, context = context)
if isinstance(value, str):
value = value.lower()
return value
# Gets all extruder values as a list for the given property.
def getValuesInAllExtruders(self, property_key: str,
context: Optional["PropertyEvaluationContext"] = None) -> List[Any]:
def _getActiveExtruders(self, context: Optional["PropertyEvaluationContext"] = None) -> List[str]:
machine_manager = self._application.getMachineManager()
extruder_manager = self._application.getExtruderManager()
@ -73,7 +74,17 @@ class CuraFormulaFunctions:
# only include values from extruders that are "active" for the current machine instance
if int(extruder.getMetaDataEntry("position")) >= global_stack.getProperty("machine_extruder_count", "value", context = context):
continue
result.append(extruder)
return result
# Gets all extruder values as a list for the given property.
def getValuesInAllExtruders(self, property_key: str,
context: Optional["PropertyEvaluationContext"] = None) -> List[Any]:
global_stack = self._application.getMachineManager().activeMachine
result = []
for extruder in self._getActiveExtruders(context):
value = extruder.getRawProperty(property_key, "value", context = context)
if value is None:
@ -89,6 +100,25 @@ class CuraFormulaFunctions:
return result
# Get the first extruder that adheres to a specific (boolean) property, like 'material_is_support_material'.
def getAnyExtruderPositionWithOrDefault(self, filter_key: str,
context: Optional["PropertyEvaluationContext"] = None) -> str:
for extruder in self._getActiveExtruders(context):
value = extruder.getRawProperty(filter_key, "value", context=context)
if value is None or not value:
continue
return str(extruder.position)
# Get the first extruder with material that adheres to a specific (boolean) property, like 'material_is_support_material'.
def getExtruderPositionWithMaterial(self, filter_key: str,
context: Optional["PropertyEvaluationContext"] = None) -> str:
for extruder in self._getActiveExtruders(context):
material_container = extruder.material
value = material_container.getProperty(filter_key, "value", context)
if value is not None:
return str(extruder.position)
return self.getDefaultExtruderPosition()
# Get the resolve value or value for a given key.
def getResolveOrValue(self, property_key: str, context: Optional["PropertyEvaluationContext"] = None) -> Any:
machine_manager = self._application.getMachineManager()

View file

@ -284,16 +284,20 @@ class CuraStackBuilder:
abstract_machines = registry.findContainerStacks(id = abstract_machine_id)
if abstract_machines:
return cast(GlobalStack, abstract_machines[0])
definitions = registry.findDefinitionContainers(id=definition_id)
name = ""
if definitions:
name = definitions[0].getName()
stack = cls.createMachine(abstract_machine_id, definition_id, show_warning_message=False)
if not stack:
return None
if not stack.getMetaDataEntry("visible", True):
return None
stack.setName(name)
stack.setMetaDataEntry("is_abstract_machine", True)

View file

@ -1,4 +1,4 @@
# Copyright (c) 2021 Ultimaker B.V.
# Copyright (c) 2023 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.
from UM.Settings.SQLQueryFactory import SQLQueryFactory
@ -10,8 +10,8 @@ class IntentDatabaseHandler(DatabaseMetadataContainerController):
"""The Database handler for Intent containers"""
def __init__(self) -> None:
super().__init__(SQLQueryFactory(table = "intent",
fields = {
super().__init__(SQLQueryFactory(table="intent",
fields={
"id": "text",
"name": "text",
"quality_type": "text",
@ -20,6 +20,8 @@ class IntentDatabaseHandler(DatabaseMetadataContainerController):
"definition": "text",
"material": "text",
"version": "text",
"setting_version": "text"
"setting_version": "text",
"icon": "text",
"description": "text",
}))
self._container_type = InstanceContainer

View file

@ -10,13 +10,16 @@ class VariantDatabaseHandler(DatabaseMetadataContainerController):
"""The Database handler for Variant containers"""
def __init__(self):
super().__init__(SQLQueryFactory(table = "variant",
fields = {
"id": "text",
"name": "text",
"hardware_type": "text",
"definition": "text",
"version": "text",
"setting_version": "text"
}))
super().__init__(SQLQueryFactory(
table="variant",
fields={
"id": "text",
"name": "text",
"hardware_type": "text",
"definition": "text",
"version": "text",
"setting_version": "text",
"reference_extruder_id": "text",
},
))
self._container_type = InstanceContainer

View file

@ -316,7 +316,13 @@ class ExtruderManager(QObject):
# Starts with the adhesion extruder.
adhesion_type = global_stack.getProperty("adhesion_type", "value")
if adhesion_type in {"skirt", "brim"}:
return max(0, int(global_stack.getProperty("skirt_brim_extruder_nr", "value"))) # optional skirt/brim extruder defaults to zero
skirt_brim_extruder_nr = global_stack.getProperty("skirt_brim_extruder_nr", "value")
# if the skirt_brim_extruder_nr is -1, then we use the first used extruder
if skirt_brim_extruder_nr == -1:
used_extruders = self.getUsedExtruderStacks()
return used_extruders[0].position
else:
return skirt_brim_extruder_nr
if adhesion_type == "raft":
return global_stack.getProperty("raft_base_extruder_nr", "value")

View file

@ -48,6 +48,8 @@ from UM.i18n import i18nCatalog
catalog = i18nCatalog("cura")
from cura.Settings.GlobalStack import GlobalStack
if TYPE_CHECKING:
from PyQt6.QtCore import QVariantList
from cura.CuraApplication import CuraApplication
from cura.Machines.MaterialNode import MaterialNode
from cura.Machines.QualityChangesGroup import QualityChangesGroup
@ -581,6 +583,10 @@ class MachineManager(QObject):
def activeMachine(self) -> Optional["GlobalStack"]:
return self._global_container_stack
@pyqtProperty("QVariantList", notify=activeVariantChanged)
def activeMachineExtruders(self) -> Optional["QVariantList"]:
return self._global_container_stack.extruderList if self._global_container_stack else None
@pyqtProperty(str, notify = activeStackChanged)
def activeStackId(self) -> str:
if self._active_container_stack:
@ -1700,6 +1706,16 @@ class MachineManager(QObject):
else: # No intent had the correct category.
extruder.intent = empty_intent_container
@pyqtSlot()
def resetIntents(self) -> None:
"""Reset the intent category of the current printer.
"""
global_stack = self._application.getGlobalContainerStack()
if global_stack is None:
return
for extruder in global_stack.extruderList:
extruder.intent = empty_intent_container
def activeQualityGroup(self) -> Optional["QualityGroup"]:
"""Get the currently activated quality group.

View file

@ -1,6 +1,6 @@
# Copyright (c) 2017 Ultimaker B.V.
# Cura is released under the terms of the LGPLv3 or higher.
from typing import List, Optional, TYPE_CHECKING
from typing import List, Optional, Set, TYPE_CHECKING
from PyQt6.QtCore import QObject, QTimer, pyqtProperty, pyqtSignal
from UM.FlameProfiler import pyqtSlot
@ -168,37 +168,26 @@ class SettingInheritanceManager(QObject):
def settingsWithInheritanceWarning(self) -> List[str]:
return self._settings_with_inheritance_warning
def _settingIsOverwritingInheritance(self, key: str, stack: ContainerStack = None) -> bool:
"""Check if a setting has an inheritance function that is overwritten"""
def _userSettingIsOverwritingInheritance(self, key: str, stack: ContainerStack, all_keys: Set[str] = set()) -> bool:
"""Check if a setting known as having a User state has an inheritance function that is overwritten"""
has_setting_function = False
if not stack:
stack = self._active_container_stack
if not stack: # No active container stack yet!
return False
if self._active_container_stack is None:
return False
all_keys = self._active_container_stack.getAllKeys()
containers = [] # type: List[ContainerInterface]
has_user_state = stack.getProperty(key, "state") == InstanceState.User
"""Check if the setting has a user state. If not, it is never overwritten."""
if not has_user_state:
return False
# If a setting is not enabled, don't label it as overwritten (It's never visible anyway).
if not stack.getProperty(key, "enabled"):
return False
user_container = stack.getTop()
"""Also check if the top container is not a setting function (this happens if the inheritance is restored)."""
# Also check if the top container is not a setting function (this happens if the inheritance is restored).
if user_container and isinstance(user_container.getProperty(key, "value"), SettingFunction):
return False
if not all_keys:
all_keys = self._active_container_stack.getAllKeys()
## Mash all containers for all the stacks together.
while stack:
containers.extend(stack.getContainers())
@ -229,17 +218,35 @@ class SettingInheritanceManager(QObject):
break # There is a setting function somewhere, stop looking deeper.
return has_setting_function and has_non_function_value
def _settingIsOverwritingInheritance(self, key: str, stack: ContainerStack = None) -> bool:
"""Check if a setting has an inheritance function that is overwritten"""
if not stack:
stack = self._active_container_stack
if not stack: # No active container stack yet!
return False
if self._active_container_stack is None:
return False
has_user_state = stack.getProperty(key, "state") == InstanceState.User
if not has_user_state:
return False
return self._userSettingIsOverwritingInheritance(key, stack)
def _update(self) -> None:
self._settings_with_inheritance_warning = [] # Reset previous data.
# Make sure that the GlobalStack is not None. Sometimes the globalContainerChanged signal gets here late.
if self._global_container_stack is None:
if self._global_container_stack is None or self._active_container_stack is None:
return
# Check all setting keys that we know of and see if they are overridden.
for setting_key in self._global_container_stack.getAllKeys():
override = self._settingIsOverwritingInheritance(setting_key)
if override:
# Check all user setting keys that we know of and see if they are overridden.
all_keys = self._active_container_stack.getAllKeys()
for setting_key in self._active_container_stack.getAllKeysWithUserState():
if self._userSettingIsOverwritingInheritance(setting_key, self._active_container_stack, all_keys):
self._settings_with_inheritance_warning.append(setting_key)
# Check all the categories if any of their children have their inheritance overwritten.

View file

@ -28,6 +28,7 @@ empty_material_container.setMetaDataEntry("type", "material")
empty_material_container.setMetaDataEntry("base_file", "empty_material")
empty_material_container.setMetaDataEntry("GUID", "FFFFFFFF-FFFF-FFFF-FFFF-FFFFFFFFFFFF")
empty_material_container.setMetaDataEntry("material", "empty")
empty_material_container.setMetaDataEntry("brand", "empty_brand")
# Empty quality
EMPTY_QUALITY_CONTAINER_ID = "empty_quality"

View file

@ -5,16 +5,18 @@ import json
import os
from typing import List, Optional
from PyQt6.QtCore import QUrl
from PyQt6.QtNetwork import QLocalServer, QLocalSocket
from UM.Qt.QtApplication import QtApplication #For typing.
from UM.Qt.QtApplication import QtApplication # For typing.
from UM.Logger import Logger
class SingleInstance:
def __init__(self, application: QtApplication, files_to_open: Optional[List[str]]) -> None:
def __init__(self, application: QtApplication, files_to_open: Optional[List[str]], url_to_open: Optional[List[str]]) -> None:
self._application = application
self._files_to_open = files_to_open
self._url_to_open = url_to_open
self._single_instance_server = None
@ -33,7 +35,7 @@ class SingleInstance:
return False
# We only send the files that need to be opened.
if not self._files_to_open:
if not self._files_to_open and not self._url_to_open:
Logger.log("i", "No file need to be opened, do nothing.")
return True
@ -55,8 +57,12 @@ class SingleInstance:
payload = {"command": "open", "filePath": os.path.abspath(filename)}
single_instance_socket.write(bytes(json.dumps(payload) + "\n", encoding = "ascii"))
for url in self._url_to_open:
payload = {"command": "open-url", "urlPath": url.toString()}
single_instance_socket.write(bytes(json.dumps(payload) + "\n", encoding="ascii"))
payload = {"command": "close-connection"}
single_instance_socket.write(bytes(json.dumps(payload) + "\n", encoding = "ascii"))
single_instance_socket.write(bytes(json.dumps(payload) + "\n", encoding="ascii"))
single_instance_socket.flush()
single_instance_socket.waitForDisconnected()
@ -72,7 +78,7 @@ class SingleInstance:
def _onClientConnected(self) -> None:
Logger.log("i", "New connection received on our single-instance server")
connection = None #type: Optional[QLocalSocket]
connection = None # type: Optional[QLocalSocket]
if self._single_instance_server:
connection = self._single_instance_server.nextPendingConnection()
@ -81,7 +87,7 @@ class SingleInstance:
def __readCommands(self, connection: QLocalSocket) -> None:
line = connection.readLine()
while len(line) != 0: # There is also a .canReadLine()
while len(line) != 0: # There is also a .canReadLine()
try:
payload = json.loads(str(line, encoding = "ascii").strip())
command = payload["command"]
@ -94,13 +100,19 @@ class SingleInstance:
elif command == "open":
self._application.callLater(lambda f = payload["filePath"]: self._application._openFile(f))
# Command: Load a URL in Cura
elif command == "open-url":
url = QUrl(payload["urlPath"])
self._application.callLater(lambda: self._application._openUrl(url))
# Command: Activate the window and bring it to the top.
elif command == "focus":
# Operating systems these days prevent windows from moving around by themselves.
# 'alert' or flashing the icon in the taskbar is the best thing we do now.
main_window = self._application.getMainWindow()
if main_window is not None:
self._application.callLater(lambda: main_window.alert(0)) # type: ignore # I don't know why MyPy complains here
self._application.callLater(lambda: main_window.alert(0)) # type: ignore # I don't know why MyPy complains here
# Command: Close the socket connection. We're done.
elif command == "close-connection":

View file

@ -1,7 +1,9 @@
# Copyright (c) 2021 Ultimaker B.V.
# Copyright (c) 2023 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.
import numpy
from typing import Optional
from PyQt6 import QtCore
from PyQt6.QtCore import QCoreApplication
from PyQt6.QtGui import QImage
@ -10,30 +12,133 @@ from UM.Logger import Logger
from cura.PreviewPass import PreviewPass
from UM.Application import Application
from UM.Math.AxisAlignedBox import AxisAlignedBox
from UM.Math.Matrix import Matrix
from UM.Math.Vector import Vector
from UM.Scene.Camera import Camera
from UM.Scene.Iterator.DepthFirstIterator import DepthFirstIterator
from UM.Scene.SceneNode import SceneNode
from UM.Qt.QtRenderer import QtRenderer
class Snapshot:
DEFAULT_WIDTH_HEIGHT = 300
MAX_RENDER_DISTANCE = 10000
BOUND_BOX_FACTOR = 1.75
CAMERA_FOVY = 30
ATTEMPTS_FOR_SNAPSHOT = 10
@staticmethod
def getImageBoundaries(image: QImage):
# Look at the resulting image to get a good crop.
# Get the pixels as byte array
def getNonZeroPixels(image: QImage):
pixel_array = image.bits().asarray(image.sizeInBytes())
width, height = image.width(), image.height()
# Convert to numpy array, assume it's 32 bit (it should always be)
pixels = numpy.frombuffer(pixel_array, dtype=numpy.uint8).reshape([height, width, 4])
# Find indices of non zero pixels
nonzero_pixels = numpy.nonzero(pixels)
return numpy.nonzero(pixels)
@staticmethod
def getImageBoundaries(image: QImage):
nonzero_pixels = Snapshot.getNonZeroPixels(image)
min_y, min_x, min_a_ = numpy.amin(nonzero_pixels, axis=1) # type: ignore
max_y, max_x, max_a_ = numpy.amax(nonzero_pixels, axis=1) # type: ignore
return min_x, max_x, min_y, max_y
@staticmethod
def snapshot(width = 300, height = 300):
def isometricSnapshot(width: int = DEFAULT_WIDTH_HEIGHT, height: int = DEFAULT_WIDTH_HEIGHT, *, node: Optional[SceneNode] = None) -> Optional[QImage]:
"""
Create an isometric snapshot of the scene.
:param width: width of the snapshot image, default 300
:param height: height of the snapshot image, default 300
:param node: node of the scene to render, defaults to the root of the scene
:return: None when there is no model on the build plate, otherwise the rendered image
"""
if node is None:
node = Application.getInstance().getController().getScene().getRoot()
# the direction the camera is looking at to create the isometric view
iso_view_dir = Vector(-1, -1, -1).normalized()
bounds = Snapshot.nodeBounds(node)
if bounds is None:
Logger.log("w", "There appears to be nothing to render")
return None
camera = Camera("snapshot")
# find local x and y directional vectors of the camera
tangent_space_x_direction = iso_view_dir.cross(Vector.Unit_Y).normalized()
tangent_space_y_direction = tangent_space_x_direction.cross(iso_view_dir).normalized()
# find extreme screen space coords of the scene
x_points = [p.dot(tangent_space_x_direction) for p in bounds.points]
y_points = [p.dot(tangent_space_y_direction) for p in bounds.points]
min_x = min(x_points)
max_x = max(x_points)
min_y = min(y_points)
max_y = max(y_points)
camera_width = max_x - min_x
camera_height = max_y - min_y
if camera_width == 0 or camera_height == 0:
Logger.log("w", "There appears to be nothing to render")
return None
# increase either width or height to match the aspect ratio of the image
if camera_width / camera_height > width / height:
camera_height = camera_width * height / width
else:
camera_width = camera_height * width / height
# Configure camera for isometric view
ortho_matrix = Matrix()
ortho_matrix.setOrtho(
-camera_width / 2,
camera_width / 2,
-camera_height / 2,
camera_height / 2,
-Snapshot.MAX_RENDER_DISTANCE,
Snapshot.MAX_RENDER_DISTANCE
)
camera.setPerspective(False)
camera.setProjectionMatrix(ortho_matrix)
camera.setPosition(bounds.center)
camera.lookAt(bounds.center + iso_view_dir)
# Render the scene
renderer = QtRenderer()
render_pass = PreviewPass(width, height, root=node)
renderer.setViewportSize(width, height)
renderer.setWindowSize(width, height)
render_pass.setCamera(camera)
renderer.addRenderPass(render_pass)
renderer.beginRendering()
renderer.render()
return render_pass.getOutput()
@staticmethod
def isNodeRenderable(node):
return not getattr(node, "_outside_buildarea", False) and node.callDecoration(
"isSliceable") and node.getMeshData() and node.isVisible() and not node.callDecoration(
"isNonThumbnailVisibleMesh")
@staticmethod
def nodeBounds(root_node: SceneNode) -> Optional[AxisAlignedBox]:
axis_aligned_box = None
for node in DepthFirstIterator(root_node):
if Snapshot.isNodeRenderable(node):
if axis_aligned_box is None:
axis_aligned_box = node.getBoundingBox()
else:
axis_aligned_box = axis_aligned_box + node.getBoundingBox()
return axis_aligned_box
@staticmethod
def snapshot(width = DEFAULT_WIDTH_HEIGHT, height = DEFAULT_WIDTH_HEIGHT, number_of_attempts = ATTEMPTS_FOR_SNAPSHOT):
"""Return a QImage of the scene
Uses PreviewPass, which leaves out some elements. The aspect ratio is assumed to be square
@ -55,14 +160,7 @@ class Snapshot:
camera = Camera("snapshot", root)
# determine zoom and look at
bbox = None
for node in DepthFirstIterator(root):
if not getattr(node, "_outside_buildarea", False):
if node.callDecoration("isSliceable") and node.getMeshData() and node.isVisible() and not node.callDecoration("isNonThumbnailVisibleMesh"):
if bbox is None:
bbox = node.getBoundingBox()
else:
bbox = bbox + node.getBoundingBox()
bbox = Snapshot.nodeBounds(root)
# If there is no bounding box, it means that there is no model in the buildplate
if bbox is None:
Logger.log("w", "Unable to create snapshot as we seem to have an empty buildplate")
@ -76,13 +174,13 @@ class Snapshot:
looking_from_offset = Vector(-1, 1, 2)
if size > 0:
# determine the watch distance depending on the size
looking_from_offset = looking_from_offset * size * 1.75
looking_from_offset = looking_from_offset * size * Snapshot.BOUND_BOX_FACTOR
camera.setPosition(look_at + looking_from_offset)
camera.lookAt(look_at)
satisfied = False
size = None
fovy = 30
fovy = Snapshot.CAMERA_FOVY
while not satisfied:
if size is not None:
@ -97,9 +195,14 @@ class Snapshot:
pixel_output = preview_pass.getOutput()
try:
min_x, max_x, min_y, max_y = Snapshot.getImageBoundaries(pixel_output)
except (ValueError, AttributeError):
Logger.logException("w", "Failed to crop the snapshot!")
return None
except (ValueError, AttributeError) as e:
if number_of_attempts == 0:
Logger.warning( f"Failed to crop the snapshot even after {Snapshot.ATTEMPTS_FOR_SNAPSHOT} attempts!")
return None
else:
number_of_attempts = number_of_attempts - 1
Logger.info("Trying to get the snapshot again.")
return Snapshot.snapshot(width, height, number_of_attempts)
size = max((max_x - min_x) / render_width, (max_y - min_y) / render_height)
if size > 0.5 or satisfied:

View file

@ -14,6 +14,9 @@ from UM.Scene.SceneNode import SceneNode
from UM.Scene.Selection import Selection
from UM.i18n import i18nCatalog
from cura.PrintOrderManager import PrintOrderManager
from cura.Scene.CuraSceneNode import CuraSceneNode
catalog = i18nCatalog("cura")
@ -69,13 +72,16 @@ class ObjectsModel(ListModel):
self._group_name_template = catalog.i18nc("@label", "Group #{group_nr}")
self._group_name_prefix = self._group_name_template.split("#")[0]
self._naming_regex = re.compile("^(.+)\(([0-9]+)\)$")
self._naming_regex = re.compile(r"^(.+)\(([0-9]+)\)$")
def setActiveBuildPlate(self, nr: int) -> None:
if self._build_plate_number != nr:
self._build_plate_number = nr
self._update()
def getNodes(self) -> List[CuraSceneNode]:
return list(map(lambda n: n["node"], self.items))
def _updateSceneDelayed(self, source) -> None:
if not isinstance(source, Camera):
self._update_timer.start()
@ -175,6 +181,10 @@ class ObjectsModel(ListModel):
all_nodes = self._renameNodes(name_to_node_info_dict)
user_defined_print_order_enabled = PrintOrderManager.isUserDefinedPrintOrderEnabled()
if user_defined_print_order_enabled:
PrintOrderManager.initializePrintOrders(all_nodes)
for node in all_nodes:
if hasattr(node, "isOutsideBuildArea"):
is_outside_build_area = node.isOutsideBuildArea() # type: ignore
@ -223,8 +233,13 @@ class ObjectsModel(ListModel):
# for anti overhang meshes and groups the extruder nr is irrelevant
extruder_number = -1
if not user_defined_print_order_enabled:
name = node.getName()
else:
name = "{print_order}. {name}".format(print_order = node.printOrder, name = node.getName())
nodes.append({
"name": node.getName(),
"name": name,
"selected": Selection.isSelected(node),
"outside_build_area": is_outside_build_area,
"buildplate_number": node_build_plate_number,
@ -234,5 +249,5 @@ class ObjectsModel(ListModel):
"node": node
})
nodes = sorted(nodes, key=lambda n: n["name"])
nodes = sorted(nodes, key=lambda n: n["name"] if not user_defined_print_order_enabled else n["node"].printOrder)
self.setItems(nodes)

View file

@ -148,6 +148,9 @@ class CloudMaterialSync(QObject):
continue
if metadata["id"] == "empty_material": # Don't export the empty material.
continue
# Ignore materials that are marked as not visible for whatever reason
if not bool(metadata.get("visible", True)):
continue
material = registry.findContainers(id = metadata["id"])[0]
suffix = registry.getMimeTypeForContainer(type(material)).preferredSuffix
filename = metadata["id"] + "." + suffix

View file

@ -15,6 +15,10 @@ if "" in sys.path:
import argparse
import faulthandler
import os
# Set the environment variable QT_QUICK_FLICKABLE_WHEEL_DECELERATION to 5000, as mentioned in the Qt 6.6 update log, to overcome scroll-related issues
os.environ["QT_QUICK_FLICKABLE_WHEEL_DECELERATION"] = str(int(os.environ.get("QT_QUICK_FLICKABLE_WHEEL_DECELERATION", "5000")))
if sys.platform != "linux": # Turns out the Linux build _does_ use this, but we're not making an Enterprise release for that system anyway.
os.environ["QT_PLUGIN_PATH"] = "" # Security workaround: Don't need it, and introduces an attack vector, so set to nul.
os.environ["QML2_IMPORT_PATH"] = "" # Security workaround: Don't need it, and introduces an attack vector, so set to nul.

View file

@ -1,37 +0,0 @@
How to Profile Cura and See What It is Doing
============================================
Cura has a simple flame graph profiler available as a plugin which can be used to see what Cura is doing as it runs and how much time it takes. A flame graph profile shows its output as a timeline and stacks of "blocks" which represent parts of the code and are stacked up to show call depth. These often form little peaks which look like flames. It is a simple yet powerful way to visualise the activity of a program.
Setting up and installing the profiler
--------------------------------------
The profiler plugin is kept outside of the Cura source code here: https://github.com/sedwards2009/cura-big-flame-graph
To install it do:
* Use `git clone https://github.com/sedwards2009/cura-big-flame-graph.git` to grab a copy of the code.
* Copy the `BigFlameGraph` directory into the `plugins` directory in your local Cura.
* Set the `URANIUM_FLAME_PROFILER` environment variable to any value before starting Cura. This signals the profiler code in Cura to activate and insert the needed hooks into the code.
Using the profiler
------------------
To open the profiler go to the Extensions menu and select "Start BFG" from the "Big Flame Graph" menu. A page will open up in your default browser. This is the profiler UI. Click on "Record" to start recording, go to Cura and perform an action and then back in the profiler click on "Stop". The results should now load in.
The time scale is at the top of the window. The blocks should be read as meaning the blocks at the bottom call the blocks which are stacked on top of them. Hover the mouse to get more detailed information about a block such as the name of the code involved and its duration. Use the zoom buttons or mouse wheel to zoom in. The display can be panned by dragging with the left mouse button.
Note: The profiler front-end itself is quite "heavy" (ok, not optimised). It runs much better in Google Chrome or Chromium than Firefox. It is also a good idea to keep recording sessions short for the same reason.
What the Profiler Sees
----------------------
The profiler doesn't capture every function call in Cura. It hooks into a number of important systems which give a good picture of activity without too much runtime overhead. The most important systems are Uranium's signal mechanism and PyQt5 slots. Functions which are called via the signal mechanism are recorded and their names appear in the results. PyQt5 slots appear in the results with the prefix `[SLOT]`.
Note that not all slots are captured: only those belonging to classes that use the `pyqtSlot` decorator from the `UM.FlameProfiler` module.
Manually adding profiling code for more detail
---------------------------------------------
It is also possible to manually add decorators to methods to make them appear in the profiler results. The `UM.FlameProfiler` module contains the `profile` decorator which can be applied to methods. There is also a `profileCall` context manager which can be used with Python's `with` statement to measure a block of code. `profileCall` takes one argument, a label to use in the results.
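As a rough illustration of these hooks, a method could be instrumented like this (a minimal sketch; the class, method and label names are made up, and the exact signatures of `profile` and `profileCall` are assumed from the description above):
```
from UM.FlameProfiler import profile, profileCall

class ObjectListModel:  # hypothetical class, only for illustration
    def __init__(self):
        self._items = []

    @profile  # this method will now show up as a block in the flame graph
    def rebuild(self):
        with profileCall("sort-items"):  # the label appears in the profiler results
            self._items = sorted(self._items, key = lambda item: item.get("name", ""))
```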

View file

@ -1 +0,0 @@
<mxfile host="www.draw.io" modified="2019-12-20T12:34:56.339Z" agent="Mozilla/5.0 (X11; Fedora; Linux x86_64; rv:66.0) Gecko/20100101 Firefox/66.0" etag="1NLsmsxIqXUmOJee4m9D" version="12.4.3" type="device" pages="1"><diagram id="K0t5C8WxT4tyKudoHXNk" name="Page-1">7VzbcqM4EP0aP+4WSFzsx8SZmd2tpCqTbO1kn1KKkW3VYOQBObHn61cyF0MLM9jhkmyo8gMSLXQ5R+rTDckIT1fbLyFZL2+4R/0RMrztCF+NEDIthEbqZ3i7uMZ13LhiETIvMTpU3LOfNKk0ktoN82hUMBSc+4Kti5UzHgR0Jgp1JAz5S9Fszv1ir2uyoFrF/Yz4eu035ollXDtG7qH+D8oWy7Rn05nEd1YkNU5mEi2Jx19yVfjTCE9DzkV8tdpOqa8WL12XuN3nI3ezgYU0EHUa/OXdPfz4exV+fRLWl92tG9388/m35CnPxN8kE/4zEOqB8ZDFLl2H6IWtfBLI0uWcB+I+uWPI8mzJfO+a7PhGjSMSZPY9LV0uech+Snviy1umrJC3Q5HAjFXrOfP9Kfd5KCsCvu/g0OhePSzpJqSRbHabztcEVTdkWzC8JpFIB8h9n6wj9rQfsmq4IuGCBZdcCL5KjJKFoKGg26MrbGa4ScJTvqIi3EmTtIGTQJ1w3URWXH45MMecJDbLPGvShiRh6yJ7dtbdnWQ3CRZyCll/2SZJ+7MMrT+npDvkFHsjvqBhQAS95JvAi/Iskhe5mR6q9tw6gWdY4xnb8+xxJrtdcDVTOSi8vciQyHFPIiL21An5dwq4UkIf4rNFIIs+natmClImN/RFUi34Wj1sTWYsWFzvba6sQ81dsk6qisu2c3+/aZfM82ig2MUFEeQpY/+ay5nsF9K+lD+53FPjd3tky4FPZdk8lOVPmYdiygM5F8L2rKKSpy9UcbUeBY9vY52XCS+wTotSGkJe5FlYIMSp6Fsa+vJ0pCGTp8KAdbNY207PWE80rD06ZwETjAcD2g2jPUY9o516oBzcz0RubKUghgO9LdhNY9w37rpw/LGROIndo9it6YB404jjmlKyNcRtDfCvMeCdhApyWv+vUMEtSndslmg0qyxUwBWhwqsAdgaR1txutit3MyoRaWVgt7aZ3eH07hJvu0SmdYr3eFBp3aPuloi0TlE39SM9H4sNyLeFvGmUqLVuoddPeA1lGngXKkMuSzOfRBGbKUElVqn+olsmHpJFV9f/qmu5snHpapu7dbVLC4Ec/UO+kGuliodm+1LaLh4c9bRkPBBUcgJ8E85o1dTT1wRSuNEqDN1yDHOY2SWQpXUh9Ylgz8XxVuRvbxVtcwJwXBSAaAIeEc8zaZVP64MHZe8XMl8DHhSvg/YgCT3Z5cySbXV8wBgM2DUrxwXtsWsDVscjaDTNbOqO7qiIHcKUX4cpFgDRrKtkshOw+ZNNTzA+kYg+ylUfhOsZaYdxpSfrPVJBeoJxyCe3h3fvkQrSE4tZqDKA3SzYvQcoqSh9lUo9U3Gm6jZVunXUbYMq1aopUmN315dKxSBNmanWU1WqBT5VQJZbS6U2JQyR/gr62LEy6MLTdSEa1wx40yOn+aNEfz8RkNWgCE93GvFWecOKsDqrNeDdLN79K0I9pv8oImFcUySgI+nIbkSCbYAMlHumSLBtmMoCD2oolQXFiGUYleOC9hhhwOoWUln4pMB3EC2/Fi0OyJRip+bRlh6BjR9tWA92pf3gwk51YdgoB/6tSBasRx+nu7AzXNFbeBvj1PRh+Mjm7caHWeDTffvcQNcGvsKG3+s05cMs4MOSv0g56sOgvVF4fdOSD9ODso/Ce/Q+eI9AIO+cy3voXG2rHd7bE+DErUn1fpz0wXs9RO2W92Z/vDfeBe8d+HYahhp1ee+CDKkFN1BDvHcBj5FZfd5D+2543+BXJ++N93W/Ouk3VncMSItzz3sQq2O7Hd7DASMLV4/LgBu7i1hd/ybhdN6fyeFzvtJqkPeTd3Hcm4AVDlTldWlvwhRVS8e9Cd+XmdUyB9pbdnVKS3MPp9p34U4sPT3SrTvpMWyuK6P6dSdaZnRy5r6C8TduKXzQ30Pb1ePqI/VroQ/L+7pRc7+fRcDX35ror0178BWw1VK2CI/h9qp2J9DenLzquJfFw/85ic0P/y0Gf/oP</diagram></mxfile>

Binary file not shown.

Before

Width:  |  Height:  |  Size: 22 KiB

View file

@ -1,81 +0,0 @@
# Reporting Issues
Please attach the following information in case <br>
you want to report crashing or similar issues.
<br>
## DxDiag
### ![Badge Windows]
The log as produced by **dxdiag**.
<kbd>start</kbd>  »  <kbd>run</kbd>  »  <kbd>dxdiag</kbd>  »  <kbd>save output</kbd>
<br>
<br>
## Cura GUI Log
If the Cura user interface still starts, you can also <br>
reach these directories from the application menu:
<kbd>Help</kbd>  »  <kbd>Show settings folder</kbd>
<br>
### ![Badge Windows]
```
%APPDATA%\cura\< >\cura.log
```
or
```
C:\Users\<your username>\AppData\Roaming\cura\< >\cura.log
```
<br>
### ![Badge Linux]
```
~/.local/share/cura/< >/cura.log
```
<br>
### ![Badge MacOS]
```
~/Library/Application Support/cura/< >/cura.log
```
<br>
<br>
## Alternative
An alternative is to install the **[ExtensiveSupportLogging]** <br>
plugin, which creates a zip folder of the relevant log files.
If you're experiencing performance issues, we might ask <br>
you to connect the CPU profiler in this plugin and attach <br>
the collected data to your support ticket.
<br>
<!----------------------------------------------------------------------------->
[ExtensiveSupportLogging]: https://marketplace.ultimaker.com/app/cura/plugins/UltimakerPackages/ExtensiveSupportLogging
<!---------------------------------[ Badges ]---------------------------------->
[Badge Windows]: https://img.shields.io/badge/Windows-0078D6?style=for-the-badge&logoColor=white&logo=Windows
[Badge Linux]: https://img.shields.io/badge/Linux-00A95C?style=for-the-badge&logoColor=white&logo=Linux
[Badge MacOS]: https://img.shields.io/badge/MacOS-403C3D?style=for-the-badge&logoColor=white&logo=MacOS

View file

@ -1,22 +0,0 @@
Cura Documentation
====
Welcome to the Cura documentation pages.
Objective
----
The goal of this documentation is to give an overview of the architecture of Cura's source code. The purpose of this overview is to make programmers familiar with Cura's source code so that they may contribute more easily, write plug-ins more easily or get started within the Cura team more quickly.
There are some caveats though. These are *not* within the scope of this documentation:
* There is no documentation on individual functions or classes of the code here. For that, refer to the Doxygen documentation and Python Docstrings in the source code itself, or generate the documentation locally using Doxygen.
* It's virtually impossible and indeed not worth the effort or money to keep this 100% up to date.
* There are no example plug-ins here. There are a number of example plug-ins in the Ultimaker organisation on Github.com to draw from.
* The slicing process is not documented here. Refer to CuraEngine for that.
This documentation will touch on the inner workings of Uranium as well though, due to the nature of the architecture.
Index
----
The following chapters are available in this documentation:
* [Repositories](repositories.md): An overview of the repositories that together make up the Cura application.
* [Profiles](profiles/profiles.md): About the setting and profile system of Cura.
* [Scene](scene/scene.md): How Cura's 3D scene looks.

View file

@ -1,33 +0,0 @@
Container Stacks
====
When the user selects the profiles and settings to print with, he can swap out a number of profiles. The profiles that are currently in use are stored in several container stacks. These container stacks always have a definition container at the bottom, which defines all available settings and all available properties for each setting. The profiles on top of that definition can then override the `value` property of some of those settings.
When deriving a setting value, a container stack starts looking at the top-most profile to see if it contains an override for that setting. If it does, it returns that override. Otherwise, it looks into the second profile. If that also doesn't have an override for this setting, it looks into the third profile, and so on. The last profile is always a definition container, which contains a value for every setting. This way, the profiles at the top always win over the profiles at the bottom. There is a clear precedence order for which profile wins over which other profile.
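As a rough sketch of that lookup order (plain dictionaries instead of Cura's actual container classes; the setting keys and values are only examples):
```
def resolve(stack, key):
    """stack[0] is the top-most profile, stack[-1] is the definition container."""
    for container in stack:
        if key in container:
            return container[key]          # the first (highest) override wins
    raise KeyError(key)                    # cannot happen if the definition defines every setting

example_stack = [
    {"layer_height": 0.06},                # user profile (top)
    {},                                    # custom profile, no overrides
    {"layer_height": 0.1},                 # quality profile
    {"layer_height": 0.2, "infill_sparse_density": 20},  # definition container (bottom)
]
resolve(example_stack, "layer_height")            # -> 0.06, from the user profile
resolve(example_stack, "infill_sparse_density")   # -> 20, falls through to the definition
```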
A Machine Instance
----
A machine instance is a printer that the user has added to his configuration. It consists of multiple container stacks: One for global settings and one for each of the available extruders. This way, different extruders can contain different materials and quality profiles, for instance. The global stack contains a different set of profiles than the extruder stacks.
While Uranium defines no specific roles for the entries in a container stack, Cura defines rigid roles for each slot in a container stack. These are the layouts for the container stacks of an example printer with 2 extruders.
![Three container stacks](../resources/machine_instance.svg)
To expand on this a bit further, each extruder stack contains the following profiles:
* A user profile, where extruder-specific setting changes are stored that are not (yet) saved to a custom profile. If the user changes a setting that can be adjusted per extruder (such as infill density) then it gets stored here. If the user adjusts a setting that is global it will immediately be stored in the user profile of the global stack.
* A custom profile. If the user saves his setting changes to a custom profile, it gets moved from the user profile to here. Actually a "custom profile" as the user sees it consists of multiple profiles: one for each extruder and one for the global settings.
* An intent profile. The user can select between several intents for his print, such as precision, strength, visual quality, etc. This may be empty as well, which indicates the "default" intent.
* A quality profile. The user can select between several quality levels.
* A material profile, where the user selects which material is loaded in this extruder.
* A nozzle profile, where the user selects which nozzle is installed in this extruder.
* Definition changes, which stores the changes that the user made for this extruder in the Printer Settings dialogue.
* Extruder. The user is not able to swap this out. This is a definition that lists the extruder number for this extruder and optionally things that are fixed in the printer, such as the nozzle offset.
The global container stack contains the following profiles:
* A user profile, where global setting changes are stored that are not (yet) saved to a custom profile. If the user changes for instance the layer height, the new value for the layer height gets stored here.
* A custom profile. If the user saves his setting changes to a custom profile, the global settings that were in the global user profile get moved here.
* An intent profile. Currently this must ALWAYS be empty. There are no global intent profiles. This is there for historical reasons.
* A quality profile. This contains global settings that match with the quality level that the user selected. This global quality profile cannot be specific to a material or nozzle.
* A material profile. Currently this must ALWAYS be empty. There are no global material profiles. This is there for historical reasons.
* A variant profile. Currently this must ALWAYS be empty. There are no global variant profiles. This is there for historical reasons.
* Definition changes, which stores the changes that the user made to the printer in the Printer Settings dialogue.
* Printer. This specifies the currently used printer model, such as Ultimaker 3, Ultimaker S5, etc.

View file

@ -1,66 +0,0 @@
Getting a Setting Value
====
How Cura gets a setting's value is a complex endeavour that requires some explanation. The `value` property gets special treatment for this because there are a few other properties that influence the value. In this page we explain the algorithm to getting a setting value.
This page explains all possible cases for a setting, but not all of them may apply. For instance, a global setting will not evaluate the per-object settings to get its value. Exceptions to the rules for other types of settings will be written down.
Per Object Settings
----
Per-object settings, which are added to an object using the per-object settings tool, will always prevail over other setting values. They are not evaluated with the rest of the settings system because Cura's front-end doesn't need to send all setting values for all objects to CuraEngine separately. It only sends over the per-object settings that get overridden. CuraEngine then evaluates settings that can be changed per object using the list of settings for that object, but if the object doesn't have the setting attached it falls back on the settings of the object's extruder. Refer to the [CuraEngine](#CuraEngine) chapter to see how this works.
Settings where the `settable_per_mesh` property is false will not be shown in Cura's interface in the list of available settings in the per-object settings panel. They cannot be adjusted per object then. CuraEngine will also not evaluate those settings for each object separately. There is (or should always be) a good reason why each of these settings are not evaluated per object: Simply because CuraEngine is not processing one particular mesh at that moment. For instance, when writing the move to change to the next layer, CuraEngine hasn't processed any of the meshes on that layer yet and so the layer change movement speed, or indeed the layer height, can't change for each object.
The per-object settings are stored in a separate container stack that is particular to the object. The container stack is added to the object via a scene decorator. It has just a single container in it, which contains all of the settings that the user changed.
Resolve
----
If the setting is not listed in the per-object settings, it needs to be evaluated from the main settings list. However before evaluating it from a particular extruder, Cura will check if the setting has the `resolve` property. If it does, it returns the output of the `resolve` property and that's everything.
The `resolve` property is intended for settings which are global in nature, but still need to be influenced by extruder-specific settings. A good example is the Build Plate Temperature, which is very dependent on the material(s) used by the printer, but there can only be a single bed temperature at a time.
Cura will simply evaluate the `resolve` setting if present, which is an arbitrary Python expression, and return its result as the setting's value. However typically the `resolve` property is a function that takes the values of this setting for all extruders in use and then computes a result based on those. There is a built-in function for that called `extruderValues()`, which returns a list of setting values, one for each extruder. The function can then for instance take the average of those. In the case of the build plate temperature it will take the highest of those. In the case of the adhesion type it will choose "raft" if any extruder uses a raft, or "brim" as second choice, "skirt" as third choice and "none" only if all extruders use "none". Each setting with a `resolve` property has its own way of resolving the setting. The `extruderValues()` function continues with the algorithm as written below, but repeats it for each extruder.
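A small sketch of how such a `resolve` expression could combine per-extruder values (plain Python functions standing in for the expressions in the setting definitions; the values are made up):
```
def resolve_build_plate_temperature(extruder_values):
    return max(extruder_values)            # the hottest requested bed temperature wins

def resolve_adhesion_type(extruder_values):
    for candidate in ("raft", "brim", "skirt"):
        if candidate in extruder_values:
            return candidate
    return "none"                          # only if every extruder uses "none"

resolve_build_plate_temperature([60, 85])          # -> 85
resolve_adhesion_type(["skirt", "none", "brim"])   # -> "brim"
```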
Limit To Extruder
----
If a setting is evaluated from a particular extruder stack, it normally gets evaluated from the extruder that the object is assigned to. However there are some exceptions. Some groups of settings belong to a particular "extruder setting", like the Infill Extruder setting, or the Support Extruder setting. Which extruder a setting belongs to is stored in the `limit_to_extruder` property. Settings which have their `limit_to_extruder` property set to `adhesion_extruder_nr`, for instance, belong to the build plate adhesion settings.
If the `limit_to_extruder` property evaluates to a positive number, instead of getting the setting from the object's extruder it will be obtained from the extruder written in the `limit_to_extruder` property. So even if an object is set to be printed with extruder 0, if the infill extruder is set to extruder 1 any infill setting will be obtained from extruder 1. If `limit_to_extruder` is negative (in particular -1, which is the default), then the setting will be obtained from the object's own extruder.
This property is communicated to CuraEngine separately. CuraEngine makes sure that the setting is evaluated from the correct extruder. Refer to the [CuraEngine](#CuraEngine) chapter to see how this works.
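A toy version of that rule could look like this (dictionaries instead of Cura's real stacks; a `limit_to_extruder` of -1 means "use the object's own extruder"):
```
def evaluate(setting_key, object_extruder_nr, extruders, limit_to_extruder = -1):
    extruder_nr = limit_to_extruder if limit_to_extruder >= 0 else object_extruder_nr
    return extruders[extruder_nr][setting_key]

extruders = [
    {"infill_line_width": 0.4},   # extruder 0
    {"infill_line_width": 0.8},   # extruder 1
]
# The object prints with extruder 0, but the infill extruder is set to extruder 1:
evaluate("infill_line_width", 0, extruders, limit_to_extruder = 1)   # -> 0.8
```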
Evaluating a Stack
----
After the resolve and limit to extruder properties have been checked, the setting value needs to be evaluated from an extruder stack.
This is explained in more detail in the [Container Stacks](container_stacks.md) documentation. In brief, Cura will check the highest container in the extruder stack first to see whether that container overrides the setting. If it does, it returns that as the setting value. Otherwise, it checks the second container on the stack to see if that one overrides it. If it does it returns that value, and otherwise it checks the third container, and so on. If a setting is not overridden by any container in the extruder stack, it continues downward in the global stack. If it is also not overridden there, it eventually arrives at the definition in the bottom of the global stack.
Evaluating a Definition
----
If the evaluation for a setting reaches the last entry of the global stack, its definition, a few more things can happen.
Definition containers have an inheritance structure. For instance, the `ultimaker3` definition container specifies in its metadata that it inherits from `ultimaker`, which in turn inherits from `fdmprinter`. So again here, when evaluating a property from the `ultimaker3` definition it will first look to see if the property is overridden by the `ultimaker3` definition itself, and otherwise refer on to the `ultimaker` definition or otherwise finally to the `fdmprinter` definition. `fdmprinter` is the last line of defence, and it contains *all* properties for *all* settings.
But even in `fdmprinter`, not all settings have a `value` property. It is not a required property. If the setting doesn't have a `value` property, the `default_value` property is returned, which is a required property. The distinction between `value` and `default_value` is made in order to allow CuraEngine to load a definition file as well when running from the command line (a debugging technique for CuraEngine). It then won't have all of the correct setting values but it at least doesn't need to evaluate all of the Python expressions and you'll be able to make some debugging slices.
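Sketched with plain dictionaries (not the real DefinitionContainer API), the lookup described above is roughly:
```
definitions = {
    "fdmprinter": {"parent": None, "settings": {"layer_height": {"default_value": 0.1}}},
    "ultimaker":  {"parent": "fdmprinter", "settings": {}},
    "ultimaker3": {"parent": "ultimaker", "settings": {"layer_height": {"value": 0.15}}},
}

def definition_property(definition_id, key, prop):
    """Walk up the inheritance chain until some definition defines this property."""
    current = definition_id
    while current is not None:
        setting = definitions[current]["settings"].get(key, {})
        if prop in setting:
            return setting[prop]
        current = definitions[current]["parent"]
    return None

def definition_value(definition_id, key):
    value = definition_property(definition_id, key, "value")
    if value is not None:
        return value
    return definition_property(definition_id, key, "default_value")   # required, so always present

definition_value("ultimaker3", "layer_height")   # -> 0.15, overridden by the ultimaker3 definition
definition_value("ultimaker", "layer_height")    # -> 0.1, falls back to fdmprinter's default_value
```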
Evaluating a Value Property
----
The `value` property may contain a formula, which is an arbitrary Python expression that will be executed by Cura to arrive at a setting value. All containers may set the `value` property. Instance containers can only set the `value`, while definitions can set all properties.
While the value could be any sort of formula, some functions of Python are restricted for security reasons. Since Cura 4.6, profiles are no longer a "trusted" resource and are therefore subject to heavy restrictions. A formula can use Python's built-in mathematical and list functions, as well as a few other basic ones, but things like writing to a file are prohibited.
There are also a few extra things that can be used in these expressions:
* Any setting key can be used as a variable that contains the setting's value.
* As explained before, `extruderValues(key)` is a function that returns a list of setting values for a particular setting for all used extruders.
* The function `extruderValue(extruder, key)` will evaluate a particular setting for a particular extruder.
* The function `resolveOrValue(key)` will perform the full setting evaluation as described in this document for the current context (so if this setting is being evaluated for the second extruder it would perform it as if coming from the second extruder).
* The function `defaultExtruderPosition()` will get the first extruder that is not disabled. For instance, if a printer has three extruders but the first is disabled, this would return `1` to indicate the second extruder (0-indexed).
* The function `valueFromContainer(key, index)` will get a setting value from the global stack, but skip the first few containers in that stack. It will skip until it reaches a particular index in the container stack.
* The function `valueFromExtruderContainer(key, index)` will get a setting value from the current extruder stack, but skip the first few containers in that stack. It will skip until it reaches a particular index in the container stack.
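To make this concrete, evaluating such an expression boils down to running it against a namespace that contains the setting values and helper functions listed above. In real Cura these expressions are evaluated through SettingFunction objects in a restricted environment; the snippet below is only a much-simplified illustration, and the helper and values in it are stand-ins:
```
formula = "max(extruderValues('material_bed_temperature')) if layer_height < 0.2 else 60"

namespace = {
    "max": max,                                    # allowed built-in
    "extruderValues": lambda key: [60, 85],        # stand-in for the real helper function
    "layer_height": 0.15,                          # setting keys act as variables
}
eval(formula, {"__builtins__": {}}, namespace)     # -> 85
```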
CuraEngine
----
When starting a slice, Cura will send the scene to CuraEngine and with each model send over the per-object settings that belong to it. It also sends all setting values over, as evaluated from each extruder and from the global stack, and sends the `limit_to_extruder` property along as well. CuraEngine stores this and then starts its slicing process. CuraEngine also has a hierarchical structure for its settings with fallbacks. This is explained in detail in [the documentation of CuraEngine](https://github.com/Ultimaker/CuraEngine/blob/master/docs/settings.md) and shortly again here.
Each model gets a setting container assigned. The per-object settings are stored in those. The fallback for this container is set to be the extruder with which the object is printed. The extruder uses the current *mesh group* as fallback (which is a concept that Cura's front-end doesn't have). Each mesh group uses the global settings container as fallback.
During the slicing process CuraEngine will evaluate the settings from its current context as it goes. For instance, when processing the walls for a particular mesh, it will request the Outer Wall Line Width setting from the settings container of that mesh. When it's not processing a particular mesh but for instance the travel moves between two meshes, it uses the currently applicable extruder. So this business logic defines actually how a setting can be configured per mesh, per extruder or only globally. The `settable_per_extruder`, and related properties of settings are only used in the front-end to determine how the settings are shown to the user.
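That fallback chain can be pictured roughly like this (plain Python, not CuraEngine's actual C++ classes; the setting values are made up):
```
class Settings:
    def __init__(self, values = None, fallback = None):
        self.values = values or {}
        self.fallback = fallback

    def get(self, key):
        if key in self.values:
            return self.values[key]
        if self.fallback is not None:
            return self.fallback.get(key)   # not defined here, so ask the fallback
        raise KeyError(key)

global_settings = Settings({"wall_line_width_0": 0.4})
mesh_group      = Settings(fallback = global_settings)
extruder_1      = Settings({"wall_line_width_0": 0.35}, fallback = mesh_group)
mesh            = Settings(fallback = extruder_1)        # no per-object override

mesh.get("wall_line_width_0")   # -> 0.35, taken from the extruder the mesh is printed with
```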

View file

@ -1,30 +0,0 @@
Profiles
====
Cura's profile system is very advanced and has gotten pretty complex. This chapter is an attempt to document how it is structured.
Index
----
The following pages describe the profile and setting system of Cura:
* [Container Stacks](container_stacks.md): Which profiles can be swapped out and how they are ordered when evaluating a setting.
* [Setting Properties](setting_properties.md): What properties can each setting have?
* [Getting a Setting Value](getting_a_setting_value.md): How Cura arrives at a value for a certain setting.
Glossary
----
The terminology for these profiles is not always obvious. Here is a glossary of the terms that we'll use in this chapter.
* **Profile:** Either an *instance container* or a *definition container*.
* **Definition container:** Profile that's stored as a .def.json file, defining new settings and all of their properties. In Cura these represent printer models and extruder trains.
* **Instance container:** Profile that's stored as an .inst.cfg file or an .xml.fdm_material file, which overrides some setting values. In Cura these represent the other profiles.
* **[Container] stack:** A list of profiles, with one definition container at the bottom and instance containers for the rest. All settings are defined in the definition container. The rest of the profiles each specify a set of value overrides. The profiles at the top always override the profiles at the bottom (see the sketch after this glossary).
* **Machine instance:** An instance of a printer that the user has added. The list of all machine instances is shown in a drop-down in Cura's interface.
* **Material:** A type of filament that's being sold by a vendor as a product.
* **Filament spool:** A single spool of material.
* **Quality profile:** A profile that is one of the options when the user selects which quality level they want to print with.
* **Intent profile:** A profile that is one of the options when the user selects what their intent is.
* **Custom profile:** A user-made profile that is stored when the user chooses to "create a profile from the current settings/overrides".
* **Quality-changes profile:** Alternative name for *custom profile*. This name is used in the code more often, but it's a bit misleading so this documentation prefers the term "custom profile".
* **User profile:** A profile containing the settings that the user has changed, but not yet saved to a profile.
* **Variant profile:** A profile containing some overrides that allow the user to select variants of the definition. As of this writing this is only used for the nozzles.
* **Quality level:** A measure of quality that the user can select from, for instance "normal", "fast", "high". When selecting a quality level, Cura will select a matching quality profile for each extruder.
* **Quality type:** Alternative name for *quality level*. This name is used in the code more often, but this documentation prefers the term "quality level".
* **Inheritance function:** A function through which the `value` of a setting is calculated. This may depend on other settings.
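To make the *container stack* entry above more concrete, here is a minimal sketch of evaluating a setting on such a stack. The layer names and values are purely illustrative; how the stacks are actually ordered is described in the container stacks page listed in the index.

```python
# Illustrative container stack: the definition sits at the bottom and provides
# the defaults; the instance containers above it override it, topmost match wins.
definition = {"layer_height": 0.2, "infill_sparse_density": 20}
stack = [                            # topmost container first
    {"infill_sparse_density": 35},   # user profile: unsaved changes
    {"layer_height": 0.15},          # quality profile
    {},                              # material profile: no overrides in this example
    definition,                      # definition container
]

def get_setting(stack, key):
    for container in stack:          # profiles at the top override those at the bottom
        if key in container:
            return container[key]
    raise KeyError(key)

print(get_setting(stack, "layer_height"))           # 0.15, from the quality profile
print(get_setting(stack, "infill_sparse_density"))  # 35, from the user profile
```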

View file

@ -1,78 +0,0 @@
Setting Properties
====
Each setting in Cura has a number of properties. It's not just a key and a value. This page lists the properties that a setting can define.
* `key` (string): __The identifier by which the setting is referenced.__
* This is not a human-readable name, but just a reference string, such as `layer_height_0`.
* This is not actually a real property but just an identifier; it can't be changed.
* Typically these are named with the most significant category first, in order to sort them better, such as `material_print_temperature`.
* `value` (optional): __The current value of the setting.__
* This can be a function (an arbitrary Python expression) that depends on the values of other settings.
* If it's not present, the `default_value` is used.
* `default_value`: __A default value for the setting if `value` is undefined.__
* This property is required.
* It can't be a Python expression, but it can be any JSON type.
* This is made separate so that CuraEngine can read it out for its debugging mode via the command line, without needing a complete Python interpreter.
* `label` (string): __The human-readable name for the setting.__
* This label is translated.
* `description` (string): __A longer description of what the setting does when you change it.__
* This description is translated.
* `type` (string): __The type of value that this setting contains.__
* Allowed types are: `bool`, `str`, `float`, `int`, `enum`, `category`, `[int]`, `vec3`, `polygon` and `polygons`.
* `unit` (optional string): __A unit that is displayed at the right-hand side of the text field where the user enters the setting value.__
* `resolve` (optional string): __A Python expression that resolves disagreements for global settings if multiple per-extruder profiles define different values for a setting.__
* Typically this takes the values for the setting from all stacks and computes one final value for it that will be used for the global setting. For instance, the `resolve` function for the build plate temperature is `max(extruderValues('material_bed_temperature'))`, meaning that it will use the hottest bed temperature of all materials of the extruders in use.
* `limit_to_extruder` (optional): __A Python expression that indicates which extruder a setting will be obtained from.__
* This is used for settings that may be extruder-specific but the extruder is not necessarily the current extruder. For instance, support settings need to be evaluated for the support extruder. Infill settings need to be evaluated for the infill extruder if the infill extruder is changed.
* `enabled` (optional string or boolean): __Whether the setting can currently be made visible for the user.__
* This can be a simple true/false, or a Python expression that depends on other settings.
* Typically used for settings that don't apply when another setting is disabled, such as to hide the support settings if support is disabled.
* `minimum_value` (optional): __The lowest acceptable value for this setting.__
* If it's any lower, Cura will not allow the user to slice.
* This property only applies to numerical settings.
* By convention this is used to prevent setting values that are technically or physically impossible, such as a layer height of 0mm.
* `maximum_value` (optional): __The highest acceptable value for this setting.__
* If it's any higher, Cura will not allow the user to slice.
* This property only applies to numerical settings.
* By convention this is used to prevent setting values that are technically or physically impossible, such as a support overhang angle of more than 90 degrees.
* `minimum_value_warning` (optional): __The threshold under which a warning is displayed to the user.__
* This property only applies to numerical settings.
* By convention this is used to indicate that it will probably not print very nicely with such a low setting value.
* `maximum_value_warning` (optional): __The threshold above which a warning is displayed to the user.__
* This property only applies to numerical settings.
* By convention this is used to indicate that it will probably not print very nicely with such a high setting value.
* `settable_globally` (optional boolean): __Whether the setting can be changed globally.__
* For some mesh-type settings such as `support_mesh` this doesn't make sense, so those can't be changed globally. They are then not displayed in the main settings list.
* `settable_per_meshgroup` (optional boolean): __Whether a setting can be changed per group of meshes.__
* *This is currently unused by Cura.*
* `settable_per_extruder` (optional boolean): __Whether a setting can be changed per extruder.__
* Some settings, like the build plate temperature, can't be adjusted separately for each extruder. An icon is shown in the interface to indicate this.
* If the user changes these settings they are stored in the global stack.
* `settable_per_mesh` (optional boolean): __Whether a setting can be changed per mesh.__
* The settings that can be changed per mesh are shown in the list of available settings in the per-object settings tool.
* `children` (optional list): __A list of child settings.__
* These are displayed with an indentation. If all child settings are overridden by the user, the parent setting gets greyed out to indicate that the parent setting has no effect any more. This is not strictly always the case though, because that would depend on the inheritance functions in the `value`.
* `icon` (optional string): __A path to an icon to be displayed.__
* Only applies to setting categories.
* `allow_empty` (optional boolean): __Whether the setting is allowed to be empty.__
* If it's not, this will be treated as a setting error and Cura will not allow the user to slice.
* Only applies to string-type settings.
* `warning_description` (optional string): __A warning message to display when the setting has a warning value.__
* *This is currently unused by Cura.*
* `error_description` (optional string): __An error message to display when the setting has an error value.__
* *This is currently unused by Cura.*
* `options` (dictionary): __A list of values that the user can choose from.__
* The keys of this dictionary are keys that CuraEngine identifies the option with.
* The values are human-readable strings and will be translated.
* Only applies to (and only required for) enum-type settings.
* `comments` (optional string): __Comments to other programmers about the setting.__
* *This is currently unused by Cura.*
* `is_uuid` (optional boolean): __Whether or not this setting indicates a UUID-4.__
* If it is, the setting will indicate an error if it's not in the correct format.
* Only applies to string-type settings.
* `regex_blacklist_pattern` (optional string): __A regular expression; if the setting value matches this regular expression, the setting gets an error state.__
* Only applies to string-type settings.
* `error_value` (optional): __If the setting value is equal to this value, it will show a setting error.__
* This is used to display errors for non-numerical settings such as checkboxes.
* `warning_value` (optional): __If the setting value is equal to this value, it will show a setting warning.__
* This is used to display warnings for non-numerical settings such as checkboxes.
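To tie these properties together, the sketch below shows a hypothetical setting definition that uses several of them. It is written as a Python dict mirroring the JSON structure of a single setting in a .def.json file; the setting key and all of its values are made up for illustration and are not taken from fdmprinter.def.json.

```python
import json

# Hypothetical setting definition illustrating several of the properties above.
example_setting = {
    "example_print_temperature": {
        "label": "Example Printing Temperature",
        "description": "The temperature used for printing in this made-up example.",
        "unit": "°C",
        "type": "float",
        "default_value": 210,
        "value": "material_print_temperature + 5",  # inheritance function: depends on another setting
        "resolve": "max(extruderValues('example_print_temperature'))",  # settles per-extruder disagreement
        "minimum_value": 0,
        "maximum_value_warning": 280,
        "enabled": "machine_nozzle_temp_enabled",  # only shown while that other setting is enabled
        "settable_per_mesh": False,
        "settable_per_extruder": True
    }
}

# In a real definition file this structure would be plain JSON:
print(json.dumps(example_setting, indent=4, ensure_ascii=False))
```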

Some files were not shown because too many files have changed in this diff.