Merge branch 'main' into patch-1
.github/ISSUE_TEMPLATE/SlicingCrash.yaml (63 changed lines, vendored)

@@ -5,38 +5,25 @@ body:
  - type: markdown
    attributes:
      value: |
        ### ✨Try our improved Cura 5.7✨
        Before filling out the report below, we want you to try the latest Cura 5.7.
        This version of Cura has become significantly more reliable and has an updated slicing engine that will automatically send a report to the Cura Team for analysis.
        #### [You can find the downloads here](https://github.com/Ultimaker/Cura/releases/latest) ####
        If you still encounter a crash, you are still welcome to report the issue so we can use your model as a test case; you can find instructions on how to do that below.
        ### ✨Are you stuck? Have you tried these two things?✨
        1- Are you on a Cura version lower than Cura 5.7? We really recommend updating, because it resolves a lot of slicing crashes!
        2- Have you tried fixing the model with software that repairs 3D files and makes them watertight?
        Are you seeing spots and dots on your model? That is Cura indicating that your model is not watertight.
        You can try doing a quick [Mesh Fix with the Meshtools Plugin](https://marketplace.ultimaker.com/app/cura/plugins/fieldofview/MeshTools) or other mesh-editing software.

        ### Project File
        **⚠️ Before you continue, we need your project file to troubleshoot a slicing crash.**
        It contains the printer and settings we need for troubleshooting.
        If you still encounter a crash, you are welcome to report the issue so we can use your model as a test case.
        You can find instructions on how to share your model in a Package for Technical Support below.

        To save a project file, go to File -> Save project.
        Please make sure to .zip your project file.
        For big files, you may need to use [WeTransfer](https://wetransfer.com/) or similar file-sharing sites.

        🤔 Before you share, please think to yourself: is this a model that can be shared?
        Unfortunately, we cannot help if this file is missing.
        Do you have the project file? Then let's continue ⬇️

        ### Questions
        🤔 Before you share, please think to yourself: is this a model that can be shared on the internet?
        **Unfortunately, we cannot help if this file is missing.**

        ### Questions
  - type: input
    attributes:
      label: Cura Version
      placeholder: 5.6.0
      description: We work hard on improving our slicing crashes. If you are not on the latest version of Cura, [you can download it here](https://github.com/Ultimaker/Cura/releases/latest)
    validations:
      required: true
  - type: markdown
    attributes:
      value: |
        We work hard on improving our slicing crashes. Our most recent release is 5.7.1.
        If you are not on the latest version of Cura, [you can download it here](https://github.com/Ultimaker/Cura/releases/latest)
  - type: input
    attributes:
      label: Operating System

@@ -50,27 +37,13 @@
      description: Which printer was selected in Cura?
    validations:
      required: true
  - type: input
    attributes:
      label: Name abnormal settings
      description: Are there any settings that you might have changed that caused the crash? Does your model slice when you select the default profiles?
      placeholder:
    validations:
  - type: input
    attributes:
      label: Describe model location
      description: Does your model slice if you rotate the model 90 degrees or if you move it away from the center of the buildplate?
      placeholder:
    validations:
  - type: input
    attributes:
      label: Describe your model
      description: Have you sliced your model successfully before? Is it watertight? Have you tried doing a quick [Mesh Fix with the Meshtools Plugin](https://marketplace.ultimaker.com/app/cura/plugins/fieldofview/MeshTools)?
    validations:
      required: true

  - type: textarea
    attributes:
      label: Add your .zip here ⬇️
      description: You can add the zip file and additional information that is relevant to the issue in the comments below.
      label: Describe your problem and add the package for technical support as a .zip here ⬇️
      description: |
        If you still have Cura open with your crash > Click on Help on top bar > Click on Export Package For Technical Support > Compress the file into a zip > Add the file here to your GitHub issue 🔗

        If you closed Cura, please open Cura to recreate the crash > Select your printer > Load your model > Select your print settings > Click on Help on top bar > Click on Export Package For Technical Support > Compress the file into a zip > Add the file here to your GitHub issue 🔗
    validations:
      required: true
.github/workflows/conan-package-resources.yml (32 changed lines, vendored)

@@ -12,6 +12,7 @@ on:
      - 'resources/quality/**'
      - 'resources/variants/**'
      - 'resources/conanfile.py'
      - 'resources/conandata.yml'
    branches:
      - 'main'
      - 'CURA-*'

@@ -20,21 +21,22 @@ on:
      - '[0-9].[0-9]*'
      - '[0-9].[0-9][0-9]*'

env:
  CONAN_LOGIN_USERNAME_CURA: ${{ secrets.CONAN_USER }}
  CONAN_PASSWORD_CURA: ${{ secrets.CONAN_PASS }}

jobs:
  conan-recipe-version:
    uses: ultimaker/cura-workflows/.github/workflows/conan-recipe-version.yml@main
  conan-package:
    uses: ultimaker/cura-workflows/.github/workflows/conan-package.yml@main
    with:
      project_name: cura_resources

  conan-package-export:
    needs: [ conan-recipe-version ]
    uses: ultimaker/cura-workflows/.github/workflows/conan-recipe-export.yml@main
    with:
      recipe_id_full: ${{ needs.conan-recipe-version.outputs.recipe_id_full }}
      recipe_id_latest: ${{ needs.conan-recipe-version.outputs.recipe_id_latest }}
      conan_recipe_root: "./resources/"
    secrets: inherit
      platform_windows: false
      platform_mac: false
      install_system_dependencies: false
    secrets: inherit

  signal-curator:
    needs: conan-package
    runs-on: ubuntu-latest
    steps:
      - name: Trigger Curator Workflow
        run: |
          gh workflow run --repo ultimaker/curator -r main package.yml
        env:
          GITHUB_TOKEN: ${{ secrets.CURATOR_TRIGGER_PAT_C3PO }}
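The `signal-curator` job above triggers `package.yml` in the `ultimaker/curator` repository through the `gh` CLI. For readers without the CLI at hand, the equivalent call against the standard GitHub Actions REST API is sketched below; the endpoint is documented GitHub API, while the use of the `requests` package is an assumption for illustration.

```python
# Sketch: trigger ultimaker/curator's package.yml on main via the REST API,
# equivalent to `gh workflow run --repo ultimaker/curator -r main package.yml`.
import os

import requests  # assumed helper dependency, not used by the workflow itself

resp = requests.post(
    "https://api.github.com/repos/ultimaker/curator/actions/workflows/package.yml/dispatches",
    headers={
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",  # PAT, as in the step above
        "Accept": "application/vnd.github+json",
    },
    json={"ref": "main"},  # the `-r main` part of the gh command
)
resp.raise_for_status()  # GitHub answers 204 No Content on success
```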
.github/workflows/conan-package.yml (25 changed lines, vendored)

@@ -32,28 +32,7 @@ on:
      - '[0-9].[0-9]*'
      - '[0-9].[0-9][0-9]*'

env:
  CONAN_LOGIN_USERNAME_CURA: ${{ secrets.CONAN_USER }}
  CONAN_PASSWORD_CURA: ${{ secrets.CONAN_PASS }}

jobs:
  conan-recipe-version:
    uses: ultimaker/cura-workflows/.github/workflows/conan-recipe-version.yml@main
    with:
      project_name: cura

  conan-package-export:
    needs: [ conan-recipe-version ]
    uses: ultimaker/cura-workflows/.github/workflows/conan-recipe-export.yml@main
    with:
      recipe_id_full: ${{ needs.conan-recipe-version.outputs.recipe_id_full }}
      recipe_id_latest: ${{ needs.conan-recipe-version.outputs.recipe_id_latest }}
  conan-package:
    uses: ultimaker/cura-workflows/.github/workflows/conan-package.yml@main
    secrets: inherit

  conan-package-create:
    needs: [ conan-recipe-version, conan-package-export ]
    uses: ultimaker/cura-workflows/.github/workflows/conan-package-create-linux.yml@main
    with:
      recipe_id_full: ${{ needs.conan-recipe-version.outputs.recipe_id_full }}
      conan_extra_args: "-o cura:enable_i18n=True"
    secrets: inherit
.github/workflows/installers.yml (282 changed lines, vendored)

@@ -6,290 +6,34 @@ on:
    inputs:
      cura_conan_version:
        description: 'Cura Conan Version'
        default: 'cura/latest@ultimaker/testing'
        required: true
        default: ''
        type: string
      package_overrides:
        description: 'List of specific packages to be used (space-separated)'
        default: ''
        type: string
      conan_args:
        description: 'Conan args: eq.: --require-override'
        description: 'Conan args'
        default: ''
        required: false
        type: string
      enterprise:
        description: 'Build Cura as an Enterprise edition'
        default: false
        required: true
        type: boolean
      staging:
        description: 'Use staging API'
        default: false
        required: true
        type: boolean
      nightly:
        description: 'Upload to nightly release'
        default: false
        required: true
        type: boolean

  workflow_call:
    inputs:
      cura_conan_version:
        default: 'cura/latest@ultimaker/testing'
        required: true
        type: string
      conan_args:
        default: ''
        required: false
        type: string
      enterprise:
        default: false
        required: true
        type: boolean
      staging:
        default: false
        required: true
        type: boolean
      nightly:
        default: false
        required: true
        type: boolean

  schedule:
    # Daily at 4:15 CET (main-branch) and 5:15 CET (release-branch)
    - cron: '15 3 * * *'
    - cron: '15 4 * * *'

env:
  CONAN_ARGS: ${{ inputs.conan_args || '' }}
  ENTERPRISE: ${{ inputs.enterprise || false }}
  STAGING: ${{ inputs.staging || false }}

jobs:
  default_values:
    uses: ultimaker/cura-workflows/.github/workflows/cura-installer-default-value.yml@main
  installers:
    name: Create installers
    uses: ultimaker/cura-workflows/.github/workflows/cura-installers.yml@main
    with:
      cura_conan_version: ${{ inputs.cura_conan_version }}
      latest_release: '5.6'
      latest_release_schedule_hour: 4
      latest_release_tag: 'nightly'

  windows-installer:
    uses: ultimaker/cura-workflows/.github/workflows/cura-installer-windows.yml@main
    needs: [ default_values ]
    with:
      cura_conan_version: ${{ needs.default_values.outputs.cura_conan_version }}
      conan_args: ${{ github.event.inputs.conan_args }}
      enterprise: ${{ github.event.inputs.enterprise == 'true' }}
      staging: ${{ github.event.inputs.staging == 'true' }}
      architecture: X64
      operating_system: self-hosted-Windows-X64
      package_overrides: ${{ inputs.package_overrides }}
      conan_args: ${{ inputs.conan_args }}
      enterprise: ${{ inputs.enterprise }}
      staging: ${{ inputs.staging }}
    secrets: inherit

  linux-installer:
    uses: ultimaker/cura-workflows/.github/workflows/cura-installer-linux.yml@main
    needs: [ default_values ]
    with:
      cura_conan_version: ${{ needs.default_values.outputs.cura_conan_version }}
      conan_args: ${{ github.event.inputs.conan_args }}
      enterprise: ${{ github.event.inputs.enterprise == 'true' }}
      staging: ${{ github.event.inputs.staging == 'true' }}
      architecture: X64
      operating_system: self-hosted-Ubuntu22-X64
    secrets: inherit

  macos-installer:
    uses: ultimaker/cura-workflows/.github/workflows/cura-installer-macos.yml@main
    needs: [ default_values ]
    with:
      cura_conan_version: ${{ needs.default_values.outputs.cura_conan_version }}
      conan_args: ${{ github.event.inputs.conan_args }}
      enterprise: ${{ github.event.inputs.enterprise == 'true' }}
      staging: ${{ github.event.inputs.staging == 'true' }}
      architecture: X64
      operating_system: self-hosted-X64
    secrets: inherit

  macos-arm-installer:
    uses: ultimaker/cura-workflows/.github/workflows/cura-installer-macos.yml@main
    needs: [ default_values ]
    with:
      cura_conan_version: ${{ needs.default_values.outputs.cura_conan_version }}
      conan_args: ${{ github.event.inputs.conan_args }}
      enterprise: ${{ github.event.inputs.enterprise == 'true' }}
      staging: ${{ github.event.inputs.staging == 'true' }}
      architecture: ARM64
      operating_system: self-hosted-ARM64
    secrets: inherit

  # Run and update nightly release when the nightly input is set to true or if the schedule is triggered
  update-nightly-release:
    if: ${{ inputs.nightly || github.event_name == 'schedule' }}
    runs-on: ubuntu-latest
    needs: [ default_values, windows-installer, linux-installer, macos-installer, macos-arm-installer ]
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1

      - name: Download the run info
        uses: actions/download-artifact@v4
        with:
          name: linux-run-info

      - name: Set the run info as environment variables
        run: |
          . run_info.sh

      - name: Output the name file name and extension
        id: filename
        shell: python
        run: |
          import os
          import datetime
          enterprise = "-Enterprise" if "${{ github.event.inputs.enterprise }}" == "true" else ""
          linux = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-linux-X64"
          mac_x64_dmg = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-macos-X64"
          mac_x64_pkg = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-macos-X64"
          mac_arm_dmg = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-macos-ARM64"
          mac_arm_pkg = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-macos-ARM64"
          win_msi = installer_filename = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-win64-X64"
          win_exe = installer_filename = f"UltiMaker-Cura-{os.getenv('CURA_VERSION_FULL')}{enterprise}-win64-X64"
          nightly_name = "UltiMaker-Cura-" + os.getenv('CURA_VERSION_FULL').split("+")[0]
          nightly_creation_time = str(datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S"))
          output_env = os.environ["GITHUB_OUTPUT"]
          content = ""
          if os.path.exists(output_env):
              with open(output_env, "r") as f:
                  content = f.read()
          with open(output_env, "w") as f:
              f.write(content)
              f.writelines(f"LINUX={linux}\n")
              f.writelines(f"MAC_X64_DMG={mac_x64_dmg}\n")
              f.writelines(f"MAC_X64_PKG={mac_x64_pkg}\n")
              f.writelines(f"MAC_ARM_DMG={mac_arm_dmg}\n")
              f.writelines(f"MAC_ARM_PKG={mac_arm_pkg}\n")
              f.writelines(f"WIN_MSI={win_msi}\n")
              f.writelines(f"WIN_EXE={win_exe}\n")
              f.writelines(f"NIGHTLY_NAME={nightly_name}\n")
              f.writelines(f"NIGHTLY_TIME={nightly_creation_time}\n")

      - name: Download linux installer jobs artifacts
        uses: actions/download-artifact@v4
        with:
          name: ${{ steps.filename.outputs.LINUX }}-AppImage
          path: installers

      - name: Download linux installer jobs asc artifacts
        uses: actions/download-artifact@v4
        with:
          name: ${{ steps.filename.outputs.LINUX }}-asc
          path: installers

      - name: Rename Linux installer to nightlies
        run: |
          mv installers/${{ steps.filename.outputs.LINUX }}.AppImage installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-linux-X64.AppImage
          mv installers/${{ steps.filename.outputs.LINUX }}.AppImage.asc installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-linux-X64.AppImage.asc

      - name: Update nightly release for Linux
        run: |
          gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-linux-X64.AppImage --clobber
          gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-linux-X64.AppImage.asc --clobber
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: Download win msi installer jobs artifacts
        uses: actions/download-artifact@v4
        with:
          name: ${{ steps.filename.outputs.WIN_MSI }}-msi
          path: installers

      - name: Download win exe installer jobs artifacts
        uses: actions/download-artifact@v4
        with:
          name: ${{ steps.filename.outputs.WIN_EXE }}-exe
          path: installers

      - name: Rename Windows installers to nightlies
        run: |
          mv installers/${{ steps.filename.outputs.WIN_MSI }}.msi installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-win64-X64.msi
          mv installers/${{ steps.filename.outputs.WIN_EXE }}.exe installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-win64-X64.exe

      - name: Update nightly release for Windows
        run: |
          gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-win64-X64.msi --clobber
          gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-win64-X64.exe --clobber
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: Download MacOS (X64) dmg installer jobs artifacts
        uses: actions/download-artifact@v4
        with:
          name: ${{ steps.filename.outputs.MAC_X64_DMG }}-dmg
          path: installers

      - name: Download MacOS (X64) pkg installer jobs artifacts
        uses: actions/download-artifact@v4
        with:
          name: ${{ steps.filename.outputs.MAC_X64_PKG }}-pkg
          path: installers

      - name: Rename MacOS (X64) installers to nightlies
        run: |
          mv installers/${{ steps.filename.outputs.MAC_X64_DMG }}.dmg installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-X64.dmg
          mv installers/${{ steps.filename.outputs.MAC_X64_PKG }}.pkg installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-X64.pkg

      - name: Update nightly release for MacOS (X64)
        run: |
          gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-X64.dmg --clobber
          gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-X64.pkg --clobber
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: Download MacOS (ARM-64) dmg installer jobs artifacts
        uses: actions/download-artifact@v4
        with:
          name: ${{ steps.filename.outputs.MAC_ARM_DMG }}-dmg
          path: installers

      - name: Download MacOS (ARM-64) pkg installer jobs artifacts
        uses: actions/download-artifact@v4
        with:
          name: ${{ steps.filename.outputs.MAC_ARM_PKG }}-pkg
          path: installers

      - name: Rename MacOS (ARM-64) installers to nightlies
        run: |
          mv installers/${{ steps.filename.outputs.MAC_ARM_DMG }}.dmg installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-ARM64.dmg
          mv installers/${{ steps.filename.outputs.MAC_ARM_PKG }}.pkg installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-ARM64.pkg

      - name: Update nightly release for MacOS (ARM-64)
        run: |
          gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-ARM64.dmg --clobber
          gh release upload ${{ needs.default_values.outputs.release_tag }} installers/${{ steps.filename.outputs.NIGHTLY_NAME }}-macos-ARM64.pkg --clobber
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: create the release notes
        shell: python
        run: |
          import os
          import datetime
          from jinja2 import Template

          with open(".github/workflows/release_notes.md.jinja", "r") as f:
              release_notes = Template(f.read())

          current_nightly_beta = "${{ needs.default_values.outputs.release_tag }}".split("nightly-")[-1]
          with open("release-notes.md", "w") as f:
              f.write(release_notes.render(
                  timestamp="${{ steps.filename.outputs.NIGHTLY_TIME }}",
                  branch="" if "${{ needs.default-values.outputs.release_tag == 'nightly' }}" == 'true' else current_nightly_beta,
                  branch_specific="" if os.getenv("GITHUB_REF") == "refs/heads/main" else f"?branch={current_nightly_beta}",
              ))

      - name: Update nightly release description (with date)
        if: always()
        run: |
          gh release edit ${{ needs.default_values.outputs.release_tag }} --title "${{ steps.filename.outputs.NIGHTLY_NAME }}" --notes-file release-notes.md
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
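One note on the filename step in the removed block above: it collects all installer base names into `$GITHUB_OUTPUT` by reading the file back and rewriting it. A minimal hedged sketch of the same idea using plain append mode (the values below are invented placeholders, not taken from the run):

```python
# Sketch: write step outputs as KEY=value lines to $GITHUB_OUTPUT.
# Append mode ("a") makes the read-then-rewrite in the step above unnecessary.
import os

outputs = {
    "LINUX": "UltiMaker-Cura-5.7.0-linux-X64",  # placeholder value
    "NIGHTLY_NAME": "UltiMaker-Cura-5.7.0",     # placeholder value
}
with open(os.environ["GITHUB_OUTPUT"], "a") as f:
    for key, value in outputs.items():
        f.write(f"{key}={value}\n")
```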
.github/workflows/linux.yml (18 changed lines, vendored)

@@ -6,11 +6,14 @@ on:
    inputs:
      cura_conan_version:
        description: 'Cura Conan Version'
        default: 'cura/latest@ultimaker/testing'
        required: true
        default: ''
        type: string
      package_overrides:
        description: 'List of specific packages to be used (space-separated)'
        default: ''
        type: string
      conan_args:
        description: 'Conan args: eq.: --require-override'
        description: 'Conan args'
        default: ''
        required: false
        type: string

@@ -24,13 +27,6 @@ on:
        default: false
        required: true
        type: boolean
      architecture:
        description: 'Architecture'
        required: true
        default: 'X64'
        type: choice
        options:
          - X64
      operating_system:
        description: 'OS'
        required: true

@@ -45,9 +41,9 @@ jobs:
    uses: ultimaker/cura-workflows/.github/workflows/cura-installer-linux.yml@main
    with:
      cura_conan_version: ${{ inputs.cura_conan_version }}
      package_overrides: ${{ inputs.package_overrides }}
      conan_args: ${{ inputs.conan_args }}
      enterprise: ${{ inputs.enterprise }}
      staging: ${{ inputs.staging }}
      architecture: ${{ inputs.architecture }}
      operating_system: ${{ inputs.operating_system }}
    secrets: inherit
.github/workflows/macos.yml (10 changed lines, vendored)

@@ -6,11 +6,14 @@ on:
    inputs:
      cura_conan_version:
        description: 'Cura Conan Version'
        default: 'cura/latest@ultimaker/testing'
        required: true
        default: ''
        type: string
      package_overrides:
        description: 'List of specific packages to be used (space-separated)'
        default: ''
        type: string
      conan_args:
        description: 'Conan args: eq.: --require-override'
        description: 'Conan args'
        default: ''
        required: false
        type: string

@@ -47,6 +50,7 @@ jobs:
    uses: ultimaker/cura-workflows/.github/workflows/cura-installer-macos.yml@main
    with:
      cura_conan_version: ${{ inputs.cura_conan_version }}
      package_overrides: ${{ inputs.package_overrides }}
      conan_args: ${{ inputs.conan_args }}
      enterprise: ${{ inputs.enterprise }}
      staging: ${{ inputs.staging }}
.github/workflows/nightly-stable.yml (16 added lines, vendored, new file)

@@ -0,0 +1,16 @@
name: Nightly build - stable release
run-name: Nightly build - stable release

# on:
#  schedule:
#    # Daily at 5:15 CET
#    - cron: '15 4 * * *'

jobs:
  build-nightly:
    uses: ./.github/workflows/nightly.yml
    with:
      cura_conan_version: "cura/[*]"
      release_tag: "nightly-stable"
      caller_workflow: "nightly-stable.yml"
    secrets: inherit
.github/workflows/nightly-testing.yml (16 added lines, vendored, new file)

@@ -0,0 +1,16 @@
name: Nightly build - dev release
run-name: Nightly build - dev release

# on:
#  schedule:
#    # Daily at 4:15 CET
#    - cron: '15 3 * * *'

jobs:
  build-nightly:
    uses: ./.github/workflows/nightly.yml
    with:
      cura_conan_version: "cura/[*]@ultimaker/testing"
      release_tag: "nightly-testing" # Fixed version, we reuse the same tag forever
      caller_workflow: "nightly-testing.yml"
    secrets: inherit
.github/workflows/nightly.yml (75 added lines, vendored, new file)

@@ -0,0 +1,75 @@
name: Nightly build
run-name: Nightly build

on:
  workflow_call:
    inputs:
      cura_conan_version:
        required: true
        type: string
      release_tag:
        required: true
        type: string
      caller_workflow:
        required: true
        type: string


jobs:
  create-installers:
    name: Create installers
    uses: ultimaker/cura-workflows/.github/workflows/cura-installers.yml@main
    with:
      cura_conan_version: ${{ inputs.cura_conan_version }}
    secrets: inherit

  update-nightly-release:
    name: Upload installers
    runs-on: ubuntu-latest
    needs: [ create-installers ]
    steps:
      - name: Setup the build environment
        uses: ultimaker/cura-workflows/.github/actions/setup-build-environment@main

      - name: Download installers jobs artifacts
        uses: actions/download-artifact@v4
        with:
          pattern: UltiMaker-Cura-*
          path: installers
          merge-multiple: true

      - name: Rename the installers
        id: rename-installers
        working-directory: installers
        run: python ../Cura-workflows/runner_scripts/rename_installers.py --tag "nightly" >> $GITHUB_OUTPUT

      - name: create the release notes
        shell: python
        run: |
          import os
          import datetime
          from jinja2 import Template

          with open(".github/workflows/nightly_release_notes.md.jinja", "r") as f:
              release_notes = Template(f.read())

          short_version = "${{ steps.rename-installers.outputs.short_version }}"
          caller_workflow = "${{ inputs.caller_workflow }}"
          is_main = os.getenv("GITHUB_REF") == "refs/heads/main"

          with open("release-notes.md", "w") as f:
              f.write(release_notes.render(
                  timestamp=str(datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")),
                  caller_workflow=caller_workflow,
                  branch_specific="" if is_main else f"?branch={short_version}",
              ))

      - name: Update nightly release description and binaries
        run: |
          gh release edit ${{ inputs.release_tag }} --title "${{ steps.rename-installers.outputs.cura_version }}" --notes-file release-notes.md

          for file in "installers/*"; do
            gh release upload ${{ inputs.release_tag }} $file --clobber
          done
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
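`rename_installers.py` lives in the shared `ultimaker/cura-workflows` repository and is not part of this diff, so its exact contents are unknown here. A hedged, hypothetical sketch of what such a renamer plausibly does, inferred only from how its `cura_version`/`short_version` outputs and the `--tag` flag are consumed above:

```python
# Hypothetical sketch only; not the real Cura-workflows script.
# Renames UltiMaker-Cura-<version>-<os>-<arch>.<ext> installers to a fixed
# tag-based name and prints KEY=value pairs captured via `>> $GITHUB_OUTPUT`.
import argparse
import pathlib

parser = argparse.ArgumentParser()
parser.add_argument("--tag", default="nightly")
args = parser.parse_args()

cura_version = ""
for path in pathlib.Path(".").glob("UltiMaker-Cura-*"):
    for os_part in ("-linux-", "-macos-", "-win64-"):  # suffixes seen in this PR
        if os_part in path.name:
            prefix, _, tail = path.name.partition(os_part)
            cura_version = prefix.removeprefix("UltiMaker-Cura-").split("+")[0]
            path.rename(f"UltiMaker-Cura-{args.tag}{os_part}{tail}")
            break

print(f"cura_version=UltiMaker-Cura-{cura_version}")
print(f"short_version={'.'.join(cura_version.split('.')[:2])}")
```

Incidentally, the upload loop in the last step quotes its glob (`"installers/*"`), so the `for` loop runs once with the literal pattern; the files are still matched only because `$file` is left unquoted at the `gh release upload` call site, where the shell expands it.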
@@ -4,18 +4,17 @@

| | |
|---:|---|
| **Nightlies** | [](https://github.com/Ultimaker/Cura/actions/workflows/installers.yml) |
| **Nightlies** | [](https://github.com/Ultimaker/Cura/actions/workflows/installers.yml) |

# Unit Test results

| | |
|---:|---|
| **Cura {{ branch }}** | [](https://github.com/Ultimaker/Cura/actions/workflows/unit-test.yml) |
| **CuraEngine {{ branch }}** | [](https://github.com/Ultimaker/CuraEngine/actions/workflows/unit-test.yml) |
| **Uranium {{ branch }}** | [](https://github.com/Ultimaker/Uranium/actions/workflows/unit-test.yml) |
| **CuraEngine GradualFlow 0.1** | [](https://github.com/Ultimaker/CuraEngine_plugin_gradual_flow/actions/workflows/unit-test.yml) |
| **synsepalum-dulcificum 0.1** | [](https://github.com/Ultimaker/synsepalum-dulcificum/actions/workflows/unit-test.yml) |
| **Cura** | [](https://github.com/Ultimaker/Cura/actions/workflows/unit-test.yml) |
| **CuraEngine** | [](https://github.com/Ultimaker/CuraEngine/actions/workflows/unit-test.yml) |
| **Uranium** | [](https://github.com/Ultimaker/Uranium/actions/workflows/unit-test.yml) |
| **CuraEngine GradualFlow** | [](https://github.com/Ultimaker/CuraEngine_plugin_gradual_flow/actions/workflows/unit-test.yml) |
| **synsepalum-dulcificum** | [](https://github.com/Ultimaker/synsepalum-dulcificum/actions/workflows/unit-test.yml) |
| **libSavitar** | [](https://github.com/Ultimaker/libSavitar/actions/workflows/unit-test.yml) |
| **libnest2d** | [](https://github.com/Ultimaker/libnest2d/actions/workflows/unit-test.yml) |

@@ -23,17 +22,17 @@

| | |
|---:|---|
| **Cura {{ branch }}** | [](https://github.com/Ultimaker/Cura/actions/workflows/conan-package.yml) |
| **CuraEngine {{ branch }}** | [](https://github.com/Ultimaker/CuraEngine/actions/workflows/conan-package.yml) |
| **Uranium {{ branch }}** | [](https://github.com/Ultimaker/Uranium/actions/workflows/conan-package.yml) |
| **fdm_materials {{ branch }}** | [](https://github.com/Ultimaker/fdm_materials/actions/workflows/conan-package.yml) |
| **cura-binary-data {{ branch }}** | [](https://github.com/Ultimaker/cura-binary-data/actions/workflows/conan-package.yml) |
| **CuraEngine GradualFlow 0.1** | [](https://github.com/Ultimaker/CuraEngine_plugin_gradual_flow/actions/workflows/conan-package.yml) |
| **synsepalum-dulcificum 0.1** | [](https://github.com/Ultimaker/synsepalum-dulcificum/actions/workflows/conan-package.yml) |
| **CuraEngine gRPC definitions 0.1** | [](https://github.com/Ultimaker/CuraEngine_grpc_definitions/actions/workflows/conan-package.yml) |
| **Cura** | [](https://github.com/Ultimaker/Cura/actions/workflows/conan-package.yml) |
| **CuraEngine** | [](https://github.com/Ultimaker/CuraEngine/actions/workflows/package.yml) |
| **Uranium** | [](https://github.com/Ultimaker/Uranium/actions/workflows/conan-package.yml) |
| **fdm_materials** | [](https://github.com/Ultimaker/fdm_materials/actions/workflows/conan-package.yml) |
| **cura-binary-data** | [](https://github.com/Ultimaker/cura-binary-data/actions/workflows/conan-package.yml) |
| **CuraEngine GradualFlow** | [](https://github.com/Ultimaker/CuraEngine_plugin_gradual_flow/actions/workflows/conan-package.yml) |
| **synsepalum-dulcificum** | [](https://github.com/Ultimaker/synsepalum-dulcificum/actions/workflows/conan-package.yml) |
| **CuraEngine gRPC definitions** | [](https://github.com/Ultimaker/CuraEngine_grpc_definitions/actions/workflows/conan-package.yml) |
| **libArcus** | [](https://github.com/Ultimaker/libArcus/actions/workflows/conan-package.yml) |
| **pyArcus** | [](https://github.com/Ultimaker/pyArcus/actions/workflows/conan-package.yml) |
| **libSavitar** | [](https://github.com/Ultimaker/libSavitar/actions/workflows/conan-package.yml) |
| **pySavitar** | [](https://github.com/Ultimaker/pySavitar/actions/workflows/conan-package.yml) |
| **libnest2d** | [](https://github.com/Ultimaker/libnest2d/actions/workflows/conan-package.yml) |
| **pynest2d** | [](https://github.com/Ultimaker/pynest2d/actions/workflows/conan-package.yml) |
| **pynest2d** | [](https://github.com/Ultimaker/pynest2d/actions/workflows/conan-package.yml) |
@@ -1,36 +0,0 @@
name: notify_on_print_profile_change

on:
  push:
    branches: [ "main" ]
    paths:
      - 'resources/definitions/fdmprinter.def.json'
      - 'resources/definitions/ultimaker**'
      - 'resources/extruders/ultimaker**'
      - 'resources/intent/ultimaker**'
      - 'resources/quality/ultimaker**'
      - 'resources/variants/ultimaker**'
  pull_request:
    branches: [ "main" ]
    paths:
      - 'resources/definitions/fdmprinter.def.json'
      - 'resources/definitions/ultimaker**'
      - 'resources/extruders/ultimaker**'
      - 'resources/intent/ultimaker**'
      - 'resources/quality/ultimaker**'
      - 'resources/variants/ultimaker**'
permissions: {}
jobs:
  slackNotification:
    name: Slack Notification
    runs-on: ubuntu-latest
    steps:
      - name: Ultimaker Print Profile Changed
        uses: rtCamp/action-slack-notify@v2
        env:
          SLACK_CHANNEL: profile-changes
          SLACK_USERNAME: ${{ github.repository }}
          SLACK_COLOR: '#00FF00'
          SLACK_TITLE: Print profiles changed
          MSG_MINIMAL: commit
          SLACK_WEBHOOK: ${{ secrets.SLACK_CURA_PPM_HOOK }}
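The deleted workflow delegated the actual message to `rtCamp/action-slack-notify`. Functionally this boils down to one POST to a Slack incoming webhook; a minimal hedged sketch, reusing the channel and title values from the deleted step (the `requests` dependency is assumed, and the exact payload fields the action sends may differ):

```python
# Sketch of the notification the deleted step effectively sent:
# a single POST to the Slack incoming webhook stored in SLACK_CURA_PPM_HOOK.
import os

import requests  # assumed for illustration

requests.post(
    os.environ["SLACK_WEBHOOK"],
    json={
        "channel": "profile-changes",      # SLACK_CHANNEL in the deleted step
        "username": "Ultimaker/Cura",      # SLACK_USERNAME was ${{ github.repository }}
        "text": "Print profiles changed",  # SLACK_TITLE
    },
).raise_for_status()
```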
.github/workflows/printer-linter-format.yml (57 changed lines, vendored)

@@ -1,46 +1,21 @@
name: printer-linter-format

on:
  push:
    paths:
      - 'resources/definitions/**'
      - 'resources/extruders/**'
      - 'resources/intent/**'
      - 'resources/quality/**'
      - 'resources/variants/**'
  push:
    paths:
      - 'resources/definitions/**'
      - 'resources/extruders/**'
      - 'resources/intent/**'
      - 'resources/quality/**'
      - 'resources/variants/**'

jobs:
  printer-linter-format:
    name: Printer linter auto format

    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - uses: technote-space/get-diff-action@v6
        with:
          PATTERNS: |
            resources/+(definitions|extruders)/*.def.json
            resources/+(intent|quality|variants)/**/*.inst.cfg

      - name: Setup Python and pip
        if: env.GIT_DIFF && !env.MATCHED_FILES # If nothing happens with python and/or pip after, the clean-up crashes.
        uses: actions/setup-python@v4
        with:
          python-version: 3.11.x
          cache: 'pip'
          cache-dependency-path: .github/workflows/requirements-printer-linter.txt

      - name: Install Python requirements for runner
        if: env.GIT_DIFF && !env.MATCHED_FILES
        run: pip install -r .github/workflows/requirements-printer-linter.txt

      - name: Format file
        if: env.GIT_DIFF && !env.MATCHED_FILES
        run: python printer-linter/src/terminal.py --format ${{ env.GIT_DIFF_FILTERED }}

      - uses: stefanzweifel/git-auto-commit-action@v4
        if: env.GIT_DIFF && !env.MATCHED_FILES
        with:
          commit_message: "Applied printer-linter format"
  printer-linter-format:
    name: Printer linter auto format
    uses: ultimaker/cura-workflows/.github/workflows/lint-formatter.yml@main
    with:
      file_patterns: |
        resources/+(definitions|extruders)/*.def.json
        resources/+(intent|quality|variants)/**/*.inst.cfg
      command: python printer-linter/src/terminal.py --format
      commit_message: "Apply printer-linter format"
.github/workflows/printer-linter-pr-diagnose.yml (20 changed lines, vendored)

@@ -14,30 +14,16 @@ jobs:

    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 2
      - name: Setup the build environment
        uses: ultimaker/cura-workflows/.github/actions/setup-build-environment@main

      - uses: technote-space/get-diff-action@v6
      - uses: greguintow/get-diff-action@v7
        with:
          DIFF_FILTER: AMRCD
          PATTERNS: |
            resources/+(extruders|definitions)/*.def.json
            resources/+(intent|quality|variants)/**/*.inst.cfg

      - name: Setup Python and pip
        if: env.GIT_DIFF && !env.MATCHED_FILES # If nothing happens with python and/or pip after, the clean-up crashes.
        uses: actions/setup-python@v4
        with:
          python-version: 3.11.x
          cache: "pip"
          cache-dependency-path: .github/workflows/requirements-printer-linter.txt

      - name: Install Python requirements for runner
        if: env.GIT_DIFF && !env.MATCHED_FILES
        run: pip install -r .github/workflows/requirements-printer-linter.txt

      - name: Create results directory
        run: mkdir printer-linter-result
@@ -15,6 +15,7 @@ jobs:
    runs-on: ubuntu-latest
    outputs:
      package_version: ${{ steps.version_parser.outputs.major }}.${{ steps.version_parser.outputs.minor }}.0-alpha.1
      branch: ${{ steps.version_parser.outputs.major }}.${{ steps.version_parser.outputs.minor }}
    steps:
      - name: Parse version string
        id: version_parser

@@ -28,5 +29,6 @@ jobs:
    needs: [parse-version]
    with:
      cura_version: ${{ needs.parse-version.outputs.package_version }}
      branch: ${{ needs.parse-version.outputs.branch }}
      create_feature_branch: true
    secrets: inherit

@@ -36,6 +36,7 @@ jobs:
    needs: [parse-version]
    with:
      cura_version: ${{ inputs.cura_version }}
      branch: ${{ needs.parse-version.outputs.branch_name }}
      create_feature_branch: false
    secrets: inherit

@@ -99,9 +100,8 @@ jobs:
      run: |
        echo "main_commit=`git rev-parse HEAD`" >> "$GITHUB_OUTPUT"

  create-dependencies-packages:
    name: Create conan packages for dependencies
  create-packages:
    name: Create conan packages
    uses: ultimaker/cura-workflows/.github/workflows/conan-package-release.yml@main
    needs: [parse-version, freeze-packages-versions]
    strategy:

@@ -113,38 +113,17 @@ jobs:
        conan_recipe_root: "resources"
    with:
      repository: ${{ matrix.repository }}
      ref_name: ${{ needs.parse-version.outputs.branch_name }}
      version: ${{ inputs.cura_version }}
      conan_release: true
      conan_user_channel: ultimaker/stable
      conan_internal: false
      conan_latest: true
      branch: ${{ needs.parse-version.outputs.branch_name }}
      conan_recipe_root: ${{ matrix.conan_recipe_root }}
    secrets: inherit

  create-cura-package:
    name: Create conan package for Cura
    uses: ultimaker/cura-workflows/.github/workflows/conan-package-release.yml@main
    needs: [parse-version, create-dependencies-packages]
    with:
      repository: Cura
      ref_name: ${{ needs.parse-version.outputs.branch_name }}
      version: ${{ inputs.cura_version }}
      conan_release: true
      conan_user_channel: ultimaker/stable
      conan_internal: false
      conan_latest: true
    secrets: inherit

  create-installers:
    name: Create installers
    uses: ./.github/workflows/installers.yml
    needs: [parse-version, create-cura-package]
    uses: ultimaker/cura-workflows/.github/workflows/cura-installers.yml@main
    needs: [parse-version, create-packages]
    with:
      cura_conan_version: cura/${{ inputs.cura_version }}@/
      enterprise: false
      staging: false
      nightly: false
      cura_conan_version: cura/${{ inputs.cura_version }}
      conan_args: "-c user.sentry:environment=production"
    secrets: inherit

  create-release-draft:

@@ -174,7 +153,7 @@ jobs:
      body: formatted_changelog.txt

      - name: Download artifacts
        uses: actions/download-artifact@v4.1.7
        uses: actions/download-artifact@v4
        with:
          path: artifacts
          merge-multiple: true
@@ -1,2 +0,0 @@
conan>=1.60.2,<2.0.0
sip<=6.7.12
@@ -1 +0,0 @@
pyyaml
.github/workflows/requirements-runner.txt (0 changed lines, vendored)
.github/workflows/unit-test-post.yml (3 changed lines, vendored)

@@ -9,6 +9,5 @@ jobs:
  publish-test-results:
    uses: ultimaker/cura-workflows/.github/workflows/unit-test-post.yml@main
    with:
      event: ${{ github.event.workflow_run.event }}
      conclusion: ${{ github.event.workflow_run.conclusion }}
      workflow_run_json: ${{ toJSON(github.event.workflow_run) }}
    secrets: inherit
.github/workflows/unit-test.yml (21 changed lines, vendored)

@@ -37,26 +37,9 @@ on:
      - main
      - '[0-9]+.[0-9]+'

permissions:
  contents: read

env:
  CONAN_LOGIN_USERNAME: ${{ secrets.CONAN_USER }}
  CONAN_PASSWORD: ${{ secrets.CONAN_PASS }}

jobs:
  conan-recipe-version:
    uses: ultimaker/cura-workflows/.github/workflows/conan-recipe-version.yml@main
    with:
      project_name: cura

  testing:
    name: Run unit tests
    uses: ultimaker/cura-workflows/.github/workflows/unit-test.yml@main
    needs: [ conan-recipe-version ]
    with:
      recipe_id_full: ${{ needs.conan-recipe-version.outputs.recipe_id_full }}
      conan_extra_args: '-g VirtualPythonEnv -o cura:devtools=True -c tools.build:skip_test=False --options "*:enable_sentry=False"'
      unit_test_cmd: 'pytest --junitxml=junit_cura.xml'
      unit_test_dir: 'tests'
      conan_generator_dir: './venv/bin'
    secrets: inherit
      test_use_pytest: true
.github/workflows/update-translation.yml (92 changed lines, vendored)

@@ -1,87 +1,15 @@
name: update-translations
name: Update translations

on:
  push:
    paths:
      - 'plugins/**'
      - 'resources/**'
      - 'cura/**'
      - 'icons/**'
      - 'tests/**'
      - 'packaging/**'
      - '.github/workflows/conan-*.yml'
      - '.github/workflows/notify.yml'
      - '.github/workflows/requirements-conan-package.txt'
      - 'requirements*.txt'
      - 'conanfile.py'
      - 'conandata.yml'
      - 'GitVersion.yml'
      - '*.jinja'
    branches:
      - '[1-9].[0-9]'
      - '[1-9].[0-9][0-9]'
    tags:
      - '[1-9].[0-9].[0-9]*'
      - '[1-9].[0-9].[0-9]'
      - '[1-9].[0-9][0-9].[0-9]*'
  workflow_dispatch:
    inputs:
      branch:
        description: 'Specific branch to update translations on'
        required: false
        type: string

jobs:
  update-translations:
    name: Update translations

    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Cache Conan data
        id: cache-conan
        uses: actions/cache@v3
        with:
          path: ~/.conan
          key: ${{ runner.os }}-conan

      - name: Setup Python and pip
        uses: actions/setup-python@v4
        with:
          python-version: 3.11.x
          cache: pip
          cache-dependency-path: .github/workflows/requirements-conan-package.txt

      - name: Install Python requirements for runner
        run: pip install -r .github/workflows/requirements-conan-package.txt

      # NOTE: Due to what are probably github issues, we have to remove the cache and reconfigure before the rest.
      # This is maybe because grub caches the disk it uses last time, which is recreated each time.
      - name: Install Linux system requirements
        if: ${{ runner.os == 'Linux' }}
        run: |
          sudo rm /var/cache/debconf/config.dat
          sudo dpkg --configure -a
          sudo add-apt-repository ppa:ubuntu-toolchain-r/test -y
          sudo apt update
          sudo apt upgrade
          sudo apt install efibootmgr build-essential checkinstall libegl-dev zlib1g-dev libssl-dev ninja-build autoconf libx11-dev libx11-xcb-dev libfontenc-dev libice-dev libsm-dev libxau-dev libxaw7-dev libxcomposite-dev libxcursor-dev libxdamage-dev libxdmcp-dev libxext-dev libxfixes-dev libxi-dev libxinerama-dev libxkbfile-dev libxmu-dev libxmuu-dev libxpm-dev libxrandr-dev libxrender-dev libxres-dev libxss-dev libxt-dev libxtst-dev libxv-dev libxvmc-dev libxxf86vm-dev xtrans-dev libxcb-render0-dev libxcb-render-util0-dev libxcb-xkb-dev libxcb-icccm4-dev libxcb-image0-dev libxcb-keysyms1-dev libxcb-randr0-dev libxcb-shape0-dev libxcb-sync-dev libxcb-xfixes0-dev libxcb-xinerama0-dev xkb-data libxcb-dri3-dev uuid-dev libxcb-util-dev libxkbcommon-x11-dev pkg-config flex bison g++-12 gcc-12 -y

      - name: Install GCC-13
        run: |
          sudo apt install g++-13 gcc-13 -y
          sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-13 13
          sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-13 13

      - name: Create the default Conan profile
        run: conan profile new default --detect --force

      - name: Get Conan configuration
        run: |
          conan config install https://github.com/Ultimaker/conan-config.git
          conan config install https://github.com/Ultimaker/conan-config.git -a "-b runner/${{ runner.os }}/${{ runner.arch }}"

      - name: generate the files using Conan install
        run: conan install . --build=missing --update -o cura:devtools=True

      - uses: stefanzweifel/git-auto-commit-action@v4
        with:
          file_pattern: resources/i18n/*.po resources/i18n/*.pot
          status_options: --untracked-files=no
          commit_message: update translations
    uses: ultimaker/cura-workflows/.github/workflows/update-translations.yml@main
    with:
      branch: ${{ inputs.branch }}
.github/workflows/windows.yml (18 changed lines, vendored)

@@ -6,11 +6,14 @@ on:
    inputs:
      cura_conan_version:
        description: 'Cura Conan Version'
        default: 'cura/latest@ultimaker/testing'
        required: true
        default: ''
        type: string
      package_overrides:
        description: 'List of specific packages to be used (space-separated)'
        default: ''
        type: string
      conan_args:
        description: 'Conan args: eq.: --require-override'
        description: 'Conan args'
        default: ''
        required: false
        type: string

@@ -24,13 +27,6 @@ on:
        default: false
        required: true
        type: boolean
      architecture:
        description: 'Architecture'
        required: true
        default: 'X64'
        type: choice
        options:
          - X64
      operating_system:
        description: 'OS'
        required: true

@@ -45,9 +41,9 @@ jobs:
    uses: ultimaker/cura-workflows/.github/workflows/cura-installer-windows.yml@main
    with:
      cura_conan_version: ${{ inputs.cura_conan_version }}
      package_overrides: ${{ inputs.package_overrides }}
      conan_args: ${{ inputs.conan_args }}
      enterprise: ${{ inputs.enterprise }}
      staging: ${{ inputs.staging }}
      architecture: ${{ inputs.architecture }}
      operating_system: ${{ inputs.operating_system }}
    secrets: inherit
@@ -4,6 +4,7 @@
CuraAppName = "{{ cura_app_name }}"
CuraAppDisplayName = "{{ cura_app_display_name }}"
CuraVersion = "{{ cura_version }}"
CuraVersionFull = "{{ cura_version_full }}"
CuraBuildType = "{{ cura_build_type }}"
CuraDebugMode = {{ cura_debug_mode }}
CuraCloudAPIRoot = "{{ cura_cloud_api_root }}"

@@ -14,4 +15,7 @@ CuraDigitalFactoryURL = "{{ cura_digital_factory_url }}"
CuraLatestURL = "{{ cura_latest_url }}"

ConanInstalls = {{ conan_installs }}

PythonInstalls = {{ python_installs }}

DependenciesDescriptions = {{ dependencies_description }}
@@ -39,6 +39,8 @@
[![Button Settings]][Settings]
[![Button Localize]][Localize]

[![Button Libraries]][Libraries]

<br>
<br>

@@ -75,6 +77,7 @@
[Report]: docs/Report.md
[Logo]: resources/images/cura-icon.png
[#]: #
[Libraries]: https://github.com/Ultimaker/Cura/blob/main/licenses_thirdparty


<!---------------------------------[ Badges ]---------------------------------->

@@ -90,6 +93,7 @@
[Badge Downloads]: https://img.shields.io/github/downloads-pre/Ultimaker/Cura/latest/total?style=for-the-badge



<!---------------------------------[ Buttons ]--------------------------------->

[Button Localize]: https://img.shields.io/badge/Help_Localize-e2467d?style=for-the-badge&logoColor=white&logo=GoogleTranslate

@@ -98,5 +102,6 @@
[Button Building]: https://img.shields.io/badge/Building_Cura-blue?style=for-the-badge&logoColor=white&logo=GitBook
[Button Plugins]: https://img.shields.io/badge/Plugin_Usage-569A31?style=for-the-badge&logoColor=white&logo=ROS
[Button Report]: https://img.shields.io/badge/Report_Issues-C9284D?style=for-the-badge&logoColor=white&logo=Cliqz
[Button Libraries]: https://img.shields.io/badge/third--party_libraries-b928c9?style=for-the-badge
@@ -28,7 +28,7 @@ a = Analysis(
    binaries=binaries,
    datas=datas,
    hiddenimports=hiddenimports,
    hookspath=[],
    hookspath=["Cura-workflows/runner_scripts/pyinstaller_hooks"],
    hooksconfig={},
    runtime_hooks=[],
    excludes=[],

@@ -75,7 +75,7 @@ app = BUNDLE(
    coll,
    name='{{ display_name }}.app',
    icon={{ icon }},
    bundle_identifier={{ osx_bundle_identifier }} + "_" + '{{ display_name }}'.replace(" ", "_") + "_" {{ short_version }},
    bundle_identifier={{ osx_bundle_identifier }} + "_" + '{{ display_name }}'.replace(" ", "_"),
    version={{ version }},
    info_plist={
        'CFBundleDisplayName': '{{ display_name }}',
conandata.yml (435 changed lines)

@@ -1,17 +1,18 @@
version: "5.10.0-alpha.0"
version: "5.11.0-alpha.0"
requirements:
  - "cura_resources/(latest)@ultimaker/testing"
  - "uranium/(latest)@ultimaker/testing"
  - "curaengine/(latest)@ultimaker/testing"
  - "cura_binary_data/(latest)@ultimaker/testing"
  - "fdm_materials/(latest)@ultimaker/testing"
  - "dulcificum/0.2.1"
  - "pysavitar/5.3.0"
  - "pynest2d/5.3.0"
  - "native_cad_plugin/2.0.0"
  - "cura_resources/5.11.0-alpha.0@ultimaker/testing"
  - "uranium/5.11.0-alpha.0@ultimaker/testing"
  - "curaengine/5.11.0-alpha.0@ultimaker/testing"
  - "cura_binary_data/5.11.0-alpha.0@ultimaker/testing"
  - "fdm_materials/5.11.0-alpha.0@ultimaker/testing"
  - "dulcificum/5.10.0"
  - "pysavitar/5.11.0-alpha.0"
  - "pynest2d/5.10.0"
requirements_internal:
  - "fdm_materials/(latest)@ultimaker/testing"
  - "cura_private_data/(latest)@internal/testing"
  - "fdm_materials/5.11.0-alpha.0@ultimaker/testing"
  - "cura_private_data/5.11.0-alpha.0@internal/testing"
requirements_enterprise:
  - "native_cad_plugin/2.0.0"
urls:
  default:
    cloud_api_root: "https://api.ultimaker.com"
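The first conandata.yml hunk pins every dependency that previously floated on a `(latest)` range to an explicit, fixed version. A Conan reference has the shape `name/version[@user/channel]`; a small illustrative parser, purely as a sketch of that format:

```python
# Illustrative only: split a Conan reference "name/version[@user/channel]"
# into its parts, matching the strings pinned in the hunk above.
def parse_conan_ref(ref: str) -> dict:
    pkg, _, user_channel = ref.partition("@")
    name, _, version = pkg.partition("/")
    user, _, channel = user_channel.partition("/")
    return {"name": name, "version": version,
            "user": user or None, "channel": channel or None}

assert parse_conan_ref("uranium/5.11.0-alpha.0@ultimaker/testing") == {
    "name": "uranium", "version": "5.11.0-alpha.0",
    "user": "ultimaker", "channel": "testing",
}
assert parse_conan_ref("dulcificum/5.10.0")["channel"] is None
```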
@ -25,6 +26,7 @@ urls:
|
|||
marketplace_root: "https://marketplace-staging.ultimaker.com"
|
||||
digital_factory_url: "https://digitalfactory-staging.ultimaker.com"
|
||||
cura_latest_url: "https://software.ultimaker.com/latest.json"
|
||||
|
||||
pyinstaller:
|
||||
runinfo:
|
||||
entrypoint: "cura_app.py"
|
||||
|
@ -34,13 +36,15 @@ pyinstaller:
|
|||
src: "plugins"
|
||||
dst: "share/cura/plugins"
|
||||
native_cad_plugin:
|
||||
package: "native_cad_plugin"
|
||||
src: "res/plugins/NativeCADplugin"
|
||||
dst: "share/cura/plugins/NativeCADplugin"
|
||||
package: "native_cad_plugin"
|
||||
src: "res/plugins/NativeCADplugin"
|
||||
dst: "share/cura/plugins/NativeCADplugin"
|
||||
enterprise_only: true
|
||||
native_cad_plugin_bundled:
|
||||
package: "native_cad_plugin"
|
||||
src: "res/bundled_packages"
|
||||
dst: "share/cura/resources/bundled_packages"
|
||||
package: "native_cad_plugin"
|
||||
src: "res/bundled_packages"
|
||||
dst: "share/cura/resources/bundled_packages"
|
||||
enterprise_only: true
|
||||
cura_resources:
|
||||
package: "cura"
|
||||
src: "resources"
|
||||
|
@ -78,18 +82,12 @@ pyinstaller:
|
|||
package: "cura_binary_data"
|
||||
src: "windows"
|
||||
dst: "share/windows"
|
||||
oses:
|
||||
- "Windows"
|
||||
fdm_materials:
|
||||
package: "fdm_materials"
|
||||
src: "res/resources/materials"
|
||||
dst: "share/cura/resources/materials"
|
||||
tcl:
|
||||
package: "tcl"
|
||||
src: "lib/tcl8.6"
|
||||
dst: "tcl"
|
||||
tk:
|
||||
package: "tk"
|
||||
src: "lib/tk8.6"
|
||||
dst: "tk"
|
||||
binaries:
|
||||
curaengine:
|
||||
package: "curaengine"
|
||||
|
@ -109,6 +107,12 @@ pyinstaller:
|
|||
- "fcntl"
|
||||
- "stl"
|
||||
- "serial"
|
||||
- "win32cred"
|
||||
- "win32timezone"
|
||||
- "pkgutil"
|
||||
hiddenimports_WINDOWS_ONLY:
|
||||
- "PyQt6.Qt"
|
||||
- "PyQt6.Qt6"
|
||||
collect_all:
|
||||
- "cura"
|
||||
- "UM"
|
||||
|
@ -120,6 +124,11 @@ pyinstaller:
|
|||
- "PyQt6.QtNetwork"
|
||||
- "PyQt6.sip"
|
||||
- "stl"
|
||||
- "keyrings.alt"
|
||||
- "pynavlib"
|
||||
collect_all_WINDOWS_ONLY:
|
||||
- "PyQt6.Qt"
|
||||
- "PyQt6.Qt6"
|
||||
icon:
|
||||
Windows: "./icons/Cura.ico"
|
||||
Macos: "./icons/cura.icns"
|
||||
|
@ -132,11 +141,12 @@ pyinstaller:
|
|||
- [ "qt", "lab", "animat" ]
|
||||
- [ "qt", "mqtt" ]
|
||||
- [ "qt", "net", "auth" ]
|
||||
- [ "qt", "quick3d" ]
|
||||
- [ "quick3d" ]
|
||||
- [ "qt", "timeline" ]
|
||||
- [ "qt", "virt", "key" ]
|
||||
- [ "qt", "wayland", "compos" ]
|
||||
- [ "qt", "5", "compat" ]
|
||||
|
||||
pycharm_targets:
|
||||
- jinja_path: .run_templates/pycharm_cura_run.run.xml.jinja
|
||||
module_name: Cura
|
||||
|
@ -255,3 +265,374 @@ pycharm_targets:
|
|||
module_name: Cura
|
||||
name: pytest in TestSettingVisibilityPresets.py
|
||||
script_name: tests/Settings/TestSettingVisibilityPresets.py
|
||||
- jinja_path: .run_templates/pycharm_cura_test.run.xml.jinja
|
||||
module_name: Cura
|
||||
name: pytest in TestStartEndGCode.py
|
||||
script_name: tests/Machines/TestStartEndGCode.py
|
||||
|
||||
pip_requirements_core:
|
||||
any_os:
|
||||
charon:
|
||||
url: "git+https://github.com/ultimaker/libcharon@master/s-line#egg=charon"
|
||||
certifi:
|
||||
version: "2023.5.7"
|
||||
hashes:
|
||||
- sha256:c6c2e98f5c7869efca1f8916fed228dd91539f9f1b444c314c06eef02980c716
|
||||
- sha256:0f0d56dc5a6ad56fd4ba36484d6cc34451e1c6548c61daad8c320169f91eddc7
|
||||
zeroconf:
|
||||
version: "0.31.0"
|
||||
hashes:
|
||||
- sha256:5a468da018bc3f04bbce77ae247924d802df7aeb4c291bbbb5a9616d128800b0
|
||||
- sha256:53a180248471c6f81bd1fffcbce03ed93d7d8eaf10905c9121ac1ea996d19844
|
||||
importlib-metadata:
|
||||
version: "4.10.0"
|
||||
hashes:
|
||||
- sha256:b7cf7d3fef75f1e4c80a96ca660efbd51473d7e8f39b5ab9210febc7809012a4
|
||||
- sha256:92a8b58ce734b2a4494878e0ecf7d79ccd7a128b5fc6014c401e0b61f006f0f6
|
||||
trimesh:
|
||||
version: "3.9.36"
|
||||
hashes:
|
||||
- sha256:8ac8bea693b3ee119f11b022fc9b9481c9f1af06cb38bc859bf5d16bbbe49b23
|
||||
- sha256:f01e8edab14d1999700c980c21a1546f37417216ad915a53be649d263130181e
|
||||
setuptools:
|
||||
version: "75.6.0"
|
||||
hashes:
|
||||
- sha256:ce74b49e8f7110f9bf04883b730f4765b774ef3ef28f722cce7c273d253aaf7d
|
||||
- sha256:8199222558df7c86216af4f84c30e9b34a61d8ba19366cc914424cdbd28252f6
|
||||
sentry-sdk:
|
||||
version: "0.13.5"
|
||||
hashes:
|
||||
- sha256:05285942901d38c7ce2498aba50d8e87b361fc603281a5902dda98f3f8c5e145
|
||||
- sha256:c6b919623e488134a728f16326c6f0bcdab7e3f59e7f4c472a90eea4d6d8fe82
|
||||
pyserial:
|
||||
version: "3.4"
|
||||
hashes:
|
||||
- sha256:e0770fadba80c31013896c7e6ef703f72e7834965954a78e71a3049488d4d7d8
|
||||
- sha256:6e2d401fdee0eab996cf734e67773a0143b932772ca8b42451440cfed942c627
|
||||
chardet:
|
||||
version: "3.0.4"
|
||||
hashes:
|
||||
- sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691
|
||||
- sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae
|
||||
idna:
|
||||
version: "2.8"
|
||||
hashes:
|
||||
- sha256:ea8b7f6188e6fa117537c3df7da9fc686d485087abf6ac197f9c46432f7e4a3c
|
||||
- sha256:c357b3f628cf53ae2c4c05627ecc484553142ca23264e593d327bcde5e9c3407
|
||||
attrs:
|
||||
version: "21.3.0"
|
||||
hashes:
|
||||
- sha256:8f7335278dedd26b58c38e006338242cc0977f06d51579b2b8b87b9b33bff66c
|
||||
- sha256:50f3c9b216dc9021042f71b392859a773b904ce1a029077f58f6598272432045
|
||||
    requests:
      version: "2.32.3"
      hashes:
        - sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6
        - sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760
    twisted:
      version: "21.2.0"
      hashes:
        - sha256:aab38085ea6cda5b378b519a0ec99986874921ee8881318626b0a3414bb2631e
        - sha256:77544a8945cf69b98d2946689bbe0c75de7d145cdf11f391dd487eae8fc95a12
    constantly:
      version: "15.1.0"
      hashes:
        - sha256:dd2fa9d6b1a51a83f0d7dd76293d734046aa176e384bf6e33b7e44880eb37c5d
        - sha256:586372eb92059873e29eba4f9dec8381541b4d3834660707faf8ba59146dfc35
    hyperlink:
      version: "21.0.0"
      hashes:
        - sha256:e6b14c37ecb73e89c77d78cdb4c2cc8f3fb59a885c5b3f819ff4ed80f25af1b4
        - sha256:427af957daa58bc909471c6c40f74c5450fa123dd093fc53efd2e91d2705a56b
    incremental:
      version: "22.10.0"
      hashes:
        - sha256:b864a1f30885ee72c5ac2835a761b8fe8aa9c28b9395cacf27286602688d3e51
        - sha256:912feeb5e0f7e0188e6f42241d2f450002e11bbc0937c65865045854c24c0bd0
    zope.interface:
      version: "5.4.0"
      hashes:
        - sha256:7df1e1c05304f26faa49fa752a8c690126cf98b40b91d54e6e9cc3b7d6ffe8b7
        - sha256:2c98384b254b37ce50eddd55db8d381a5c53b4c10ee66e1e7fe749824f894021
        - sha256:08f9636e99a9d5410181ba0729e0408d3d8748026ea938f3b970a0249daa8192
        - sha256:0ea1d73b7c9dcbc5080bb8aaffb776f1c68e807767069b9ccdd06f27a161914a
        - sha256:273f158fabc5ea33cbc936da0ab3d4ba80ede5351babc4f577d768e057651531
        - sha256:f7ee479e96f7ee350db1cf24afa5685a5899e2b34992fb99e1f7c1b0b758d263
        - sha256:b0297b1e05fd128d26cc2460c810d42e205d16d76799526dfa8c8ccd50e74959
        - sha256:af310ec8335016b5e52cae60cda4a4f2a60a788cbb949a4fbea13d441aa5a09e
        - sha256:9a9845c4c6bb56e508651f005c4aeb0404e518c6f000d5a1123ab077ab769f5c
        - sha256:a1e6e96217a0f72e2b8629e271e1b280c6fa3fe6e59fa8f6701bec14e3354325
        - sha256:877473e675fdcc113c138813a5dd440da0769a2d81f4d86614e5d62b69497155
        - sha256:0b465ae0962d49c68aa9733ba92a001b2a0933c317780435f00be7ecb959c702
        - sha256:5dd9ca406499444f4c8299f803d4a14edf7890ecc595c8b1c7115c2342cadc5f
        - sha256:469e2407e0fe9880ac690a3666f03eb4c3c444411a5a5fddfdabc5d184a79f05
        - sha256:52de7fc6c21b419078008f697fd4103dbc763288b1406b4562554bd47514c004
        - sha256:3dd4952748521205697bc2802e4afac5ed4b02909bb799ba1fe239f77fd4e117
        - sha256:dd93ea5c0c7f3e25335ab7d22a507b1dc43976e1345508f845efc573d3d779d8
        - sha256:3748fac0d0f6a304e674955ab1365d515993b3a0a865e16a11ec9d86fb307f63
        - sha256:66c0061c91b3b9cf542131148ef7ecbecb2690d48d1612ec386de9d36766058f
        - sha256:d0c1bc2fa9a7285719e5678584f6b92572a5b639d0e471bb8d4b650a1a910920
        - sha256:2876246527c91e101184f63ccd1d716ec9c46519cc5f3d5375a3351c46467c46
        - sha256:334701327f37c47fa628fc8b8d28c7d7730ce7daaf4bda1efb741679c2b087fc
        - sha256:71aace0c42d53abe6fc7f726c5d3b60d90f3c5c055a447950ad6ea9cec2e37d9
        - sha256:5bb3489b4558e49ad2c5118137cfeaf59434f9737fa9c5deefc72d22c23822e2
        - sha256:1c0e316c9add0db48a5b703833881351444398b04111188069a26a61cfb4df78
        - sha256:6f0c02cbb9691b7c91d5009108f975f8ffeab5dff8f26d62e21c493060eff2a1
        - sha256:7d97a4306898b05404a0dcdc32d9709b7d8832c0c542b861d9a826301719794e
        - sha256:867a5ad16892bf20e6c4ea2aab1971f45645ff3102ad29bd84c86027fa99997b
        - sha256:5f931a1c21dfa7a9c573ec1f50a31135ccce84e32507c54e1ea404894c5eb96f
        - sha256:194d0bcb1374ac3e1e023961610dc8f2c78a0f5f634d0c737691e215569e640d
        - sha256:8270252effc60b9642b423189a2fe90eb6b59e87cbee54549db3f5562ff8d1b8
        - sha256:15e7d1f7a6ee16572e21e3576d2012b2778cbacf75eb4b7400be37455f5ca8bf
        - sha256:8892f89999ffd992208754851e5a052f6b5db70a1e3f7d54b17c5211e37a98c7
        - sha256:2e5a26f16503be6c826abca904e45f1a44ff275fdb7e9d1b75c10671c26f8b94
        - sha256:0f91b5b948686659a8e28b728ff5e74b1be6bf40cb04704453617e5f1e945ef3
        - sha256:4de4bc9b6d35c5af65b454d3e9bc98c50eb3960d5a3762c9438df57427134b8e
        - sha256:bf68f4b2b6683e52bec69273562df15af352e5ed25d1b6641e7efddc5951d1a7
        - sha256:63b82bb63de7c821428d513607e84c6d97d58afd1fe2eb645030bdc185440120
        - sha256:db1fa631737dab9fa0b37f3979d8d2631e348c3b4e8325d6873c2541d0ae5a48
        - sha256:f44e517131a98f7a76696a7b21b164bcb85291cee106a23beccce454e1f433a4
        - sha256:a9506a7e80bcf6eacfff7f804c0ad5350c8c95b9010e4356a4b36f5322f09abb
        - sha256:3c02411a3b62668200910090a0dff17c0b25aaa36145082a5a6adf08fa281e54
        - sha256:0cee5187b60ed26d56eb2960136288ce91bcf61e2a9405660d271d1f122a69a4
        - sha256:a8156e6a7f5e2a0ff0c5b21d6bcb45145efece1909efcbbbf48c56f8da68221d
        - sha256:205e40ccde0f37496904572035deea747390a8b7dc65146d30b96e2dd1359a83
        - sha256:3f24df7124c323fceb53ff6168da70dbfbae1442b4f3da439cd441681f54fe25
        - sha256:5208ebd5152e040640518a77827bdfcc73773a15a33d6644015b763b9c9febc1
        - sha256:17776ecd3a1fdd2b2cd5373e5ef8b307162f581c693575ec62e7c5399d80794c
        - sha256:d4d9d6c1a455d4babd320203b918ccc7fcbefe308615c521062bc2ba1aa4d26e
        - sha256:0cba8477e300d64a11a9789ed40ee8932b59f9ee05f85276dbb4b59acee5dd09
        - sha256:5dba5f530fec3f0988d83b78cc591b58c0b6eb8431a85edd1569a0539a8a5a0e
    automat:
      version: "20.2.0"
      hashes:
        - sha256:b6feb6455337df834f6c9962d6ccf771515b7d939bca142b29c20c2376bc6111
        - sha256:7979803c74610e11ef0c0d68a2942b152df52da55336e0c9d58daf1831cbdf33
    shapely:
      version: "2.0.6"
      hashes:
        - sha256:29a34e068da2d321e926b5073539fd2a1d4429a2c656bd63f0bd4c8f5b236d0b
        - sha256:e1c84c3f53144febf6af909d6b581bc05e8785d57e27f35ebaa5c1ab9baba13b
        - sha256:2ad2fae12dca8d2b727fa12b007e46fbc522148a584f5d6546c539f3464dccde
        - sha256:b3304883bd82d44be1b27a9d17f1167fda8c7f5a02a897958d86c59ec69b705e
        - sha256:3ec3a0eab496b5e04633a39fa3d5eb5454628228201fb24903d38174ee34565e
        - sha256:28f87cdf5308a514763a5c38de295544cb27429cfa655d50ed8431a4796090c4
        - sha256:5aeb0f51a9db176da9a30cb2f4329b6fbd1e26d359012bb0ac3d3c7781667a9e
        - sha256:9a7a78b0d51257a367ee115f4d41ca4d46edbd0dd280f697a8092dd3989867b2
        - sha256:f32c23d2f43d54029f986479f7c1f6e09c6b3a19353a3833c2ffb226fb63a855
        - sha256:b3dc9fb0eb56498912025f5eb352b5126f04801ed0e8bdbd867d21bdbfd7cbd0
        - sha256:d93b7e0e71c9f095e09454bf18dad5ea716fb6ced5df3cb044564a00723f339d
        - sha256:c02eb6bf4cfb9fe6568502e85bb2647921ee49171bcd2d4116c7b3109724ef9b
        - sha256:cec9193519940e9d1b86a3b4f5af9eb6910197d24af02f247afbfb47bcb3fab0
        - sha256:83b94a44ab04a90e88be69e7ddcc6f332da7c0a0ebb1156e1c4f568bbec983c3
        - sha256:537c4b2716d22c92036d00b34aac9d3775e3691f80c7aa517c2c290351f42cd8
        - sha256:98fea108334be345c283ce74bf064fa00cfdd718048a8af7343c59eb40f59726
        - sha256:42fd4cd4834747e4990227e4cbafb02242c0cffe9ce7ef9971f53ac52d80d55f
        - sha256:665990c84aece05efb68a21b3523a6b2057e84a1afbef426ad287f0796ef8a48
        - sha256:42805ef90783ce689a4dde2b6b2f261e2c52609226a0438d882e3ced40bb3013
        - sha256:6d2cb146191a47bd0cee8ff5f90b47547b82b6345c0d02dd8b25b88b68af62d7
        - sha256:e3fdef0a1794a8fe70dc1f514440aa34426cc0ae98d9a1027fb299d45741c381
        - sha256:2c665a0301c645615a107ff7f52adafa2153beab51daf34587170d85e8ba6805
        - sha256:0334bd51828f68cd54b87d80b3e7cee93f249d82ae55a0faf3ea21c9be7b323a
        - sha256:d37d070da9e0e0f0a530a621e17c0b8c3c9d04105655132a87cfff8bd77cc4c2
        - sha256:fa7468e4f5b92049c0f36d63c3e309f85f2775752e076378e36c6387245c5462
        - sha256:ed5867e598a9e8ac3291da6cc9baa62ca25706eea186117034e8ec0ea4355653
        - sha256:81d9dfe155f371f78c8d895a7b7f323bb241fb148d848a2bf2244f79213123fe
        - sha256:fbb7bf02a7542dba55129062570211cfb0defa05386409b3e306c39612e7fbcc
        - sha256:837d395fac58aa01aa544495b97940995211e3e25f9aaf87bc3ba5b3a8cd1ac7
        - sha256:c6d88ade96bf02f6bfd667ddd3626913098e243e419a0325ebef2bbd481d1eb6
        - sha256:8b3b818c4407eaa0b4cb376fd2305e20ff6df757bf1356651589eadc14aab41b
        - sha256:1bbc783529a21f2bd50c79cef90761f72d41c45622b3e57acf78d984c50a5d13
        - sha256:2423f6c0903ebe5df6d32e0066b3d94029aab18425ad4b07bf98c3972a6e25a1
        - sha256:2de00c3bfa80d6750832bde1d9487e302a6dd21d90cb2f210515cefdb616e5f5
        - sha256:3a82d58a1134d5e975f19268710e53bddd9c473743356c90d97ce04b73e101ee
        - sha256:392f66f458a0a2c706254f473290418236e52aa4c9b476a072539d63a2460595
        - sha256:eba5bae271d523c938274c61658ebc34de6c4b33fdf43ef7e938b5776388c1be
        - sha256:7060566bc4888b0c8ed14b5d57df8a0ead5c28f9b69fb6bed4476df31c51b0af
        - sha256:b02154b3e9d076a29a8513dffcb80f047a5ea63c897c0cd3d3679f29363cf7e5
        - sha256:44246d30124a4f1a638a7d5419149959532b99dfa25b54393512e6acc9c211ac
        - sha256:2b542d7f1dbb89192d3512c52b679c822ba916f93479fa5d4fc2fe4fa0b3c9e8
        - sha256:997f6159b1484059ec239cacaa53467fd8b5564dabe186cd84ac2944663b0bf6
    cython:
      version: "0.29.26"
      hashes:
        - sha256:c4b003b6b7aa9e74552ef8d4e6009b3e3c3e8fa585710b3a6d062e088e460c1b
        - sha256:ce804a021c92fea66c8c100781a111706f21bade7a546895c5a9c57fe2df8b24
        - sha256:93840f2071c1f15e613509eadee1fbcd335e8666772437fe5038e24059edd48c
        - sha256:10402f0f1564ffc6ecb9c45e07f995771d05bb0b0e1d151e40574638292ee3a5
        - sha256:8e07121b34221458a2151d37e137b8f5b011a9c51dd38db2499a6198590aa319
        - sha256:233a87db76941626f1db08f4b25a4a5b425b13b64ed0e673c3641f7b650a48d8
        - sha256:6773cce9d4b3b6168d8feb2b6f06b658ef1e11cbfec075041745666d8e2a5e45
        - sha256:c813799d533194b7d85203d881d8b4f567a8c644a67f50d47f1ffbf316df412f
        - sha256:362fbb9cb4627c7786231429768b54aaba5459a2a0e46c25e59f202ca6155437
        - sha256:2b834ff6e4d10ba6d7a0d676dd71c1b427a181ddbbbbf79e91d1861557aab59f
        - sha256:0c3093bc99facfc97e5019f6c5bc39987663792265c1334d9fc9e37c3a3dcd6f
        - sha256:bbf0149680c1fca07200a3ed372b22e6bad7851d191b717a61f9a68af370e180
        - sha256:a1cc55db32cd39474081d478263b96e036552cdbbab8831c90ea43fb385a9b66
        - sha256:ebe32e002a9e6553de399033e259ece72aa17c77f740b265e66f122572a8a278
        - sha256:6b385f68789c3e8a75b4827e8a4970ff04605ad3cb1c0b41005cc69368dad65d
        - sha256:1519eb639de308f5763eb0666b4cc7bd3958268f3f6228cc610b7b4d6c94b68b
        - sha256:e118525defec3f67471d8ee5ce04340d43195410a87e5d7ec8a1a9e953c0066a
        - sha256:706ea55f58c2722206e51cd9a8754ed0995c4c4231d24b095875d2641d745222
        - sha256:77913fe27c5e22c995bac090d01e200ff91e5f58aa944e2d2e94cbf67ea2ae34
        - sha256:51923120f57a42c59f5ee6bba9e89a31a394ae8cd419c753f64d8a3aea1ee8b7
        - sha256:82881565d04027728d7762edd8c085927a840873af7ee049d703e0ca226bc08d
        - sha256:531303085503959338e6cdac630626280ef111aecbb22d48321673a8c3897c0a
        - sha256:0205b685802eb4c039b14f67b7ac3f00c55ff04b9e3871df2249576d3e59ba42
        - sha256:7df94e56872df8f396ca669466fee60256f69f678654239f706b1e643c2ac4a5
        - sha256:4b7d04b393d9a4b5fec0cbc4b0f29fe083a9d862d95231a6e7608978bd661d7e
        - sha256:af91dd63ac5f1f7fc70dc91ea063f727db42b5eb934d1f3832611be18e25171e
        - sha256:d83dad8dc6c63706cb3178dc79010b3865b1345090727189d2cd61758a825ee8
        - sha256:ca10e9fde0eaba1407ab353ff07a26daaa3e4dbe356108a149e482d441f070dd
        - sha256:fec66cd0a48697fab903854566235aecf1084f62e3163d6589ae7335a1b4d448
        - sha256:b3041e45aefaa4449fd671902132c0ac1f72eedaedac745c0e1a70a13bf990bb
        - sha256:ed76fb98979f02b5e89079906071983a36f3634d3028b86f935cf0196f24fcaa
        - sha256:4d868e1a41f5123f51a20c1b8e82f7cb6fa3370c104e98e707f7c910e8cadad1
        - sha256:868f309095e557f06dc58723ae865e8c65cfedb2646a562bd8080c92d69e4e4b
        - sha256:be550b566345bf53b95616334793ce42a128cf1d9dcde1e28cbf7ce52ea61d6d
        - sha256:be13be1e2b9b7395588f2a23bfa692f2f3e6f7936ccf7825c83431b8c8c3452b
        - sha256:41ee918480371ae5e5851ba9b1ead5a183e24aedb27bf807c7405d124e943f40
        - sha256:c91b1ba0d43f7f7ccde8121c672207c7891735ddcc83496af1e0ab617ddc4aba
        - sha256:5ecf5cf5b57086cc6c1cfc76d6353bbd7023e95da32e0883f1302ca50e481c33
        - sha256:0ffce25bf50fa926ec6bf8d6f29650e7cb33fae445938c9880e1ce9b776353ef
        - sha256:5041adfef502d67ecd5e291a7cf645a37fed7a9dac557f40d491053f35204d00
        - sha256:5fd5db458c9d3d2c2abd047f3190624d9cce8a80a8e0ca0baa69cfd133a523bc
        - sha256:75eaa22911d2ec37a3841f77b710b178c805cd378b5e1c4fb82dbc35620d2062
        - sha256:3aed8c642e8fb27024bca46830b7f62335a44a92354acf708d6b8d050f945a3a
        - sha256:b5ca05c2bfba0c2480b5fd390ecffe46b8da574d887d600388d6e3caf3f99a88
        - sha256:f5e15ff892c8afad64931ee3dd723c4755c2c516606f9aae7613bebfac62b0f6
        - sha256:af377d543a762867da11fcf6e558f7a4a535ff8693f30cce123fab10c00fa312
    pybind11:
      version: "2.6.2"
      hashes:
        - sha256:2d8aebe1709bc367e34e3b23d8eccbf3f387ee9d5640548c6260d33b59f02405
        - sha256:d0e0aed9279656f21501243b707eb6e3b951e89e10c3271dedf3ae41c365e5ed
    wheel:
      version: "0.37.1"
      hashes:
        - sha256:4bdcd7d840138086126cd09254dc6195fb4fc6f01c050a1d7236f2630db1d22a
        - sha256:e9a504e793efbca1b8e0e9cb979a249cf4a0a7b5b8c9e8b65a5e39d49529c1c4
    ifaddr:
      version: "0.1.7"
      hashes:
        - sha256:d1f603952f0a71c9ab4e705754511e4e03b02565bc4cec7188ad6415ff534cd3
        - sha256:1f9e8a6ca6f16db5a37d3356f07b6e52344f6f9f7e806d618537731669eb1a94
    pycparser:
      version: "2.20"
      hashes:
        - sha256:7582ad22678f0fcd81102833f60ef8d0e57288b6b5fb00323d101be910e35705
        - sha256:2d475327684562c3a96cc71adf7dc8c4f0565175cf86b6d7a404ff4c771f15f0
    zipp:
      version: "3.5.0"
      hashes:
        - sha256:957cfda87797e389580cb8b9e3870841ca991e2125350677b2ca83a0e99390a3
        - sha256:f5812b1e007e48cff63449a5e9f4e7ebea716b4111f9c4f9a645f91d579bf0c4
    urllib3:
      version: "2.2.3"
      hashes:
        - sha256:ca899ca043dcb1bafa3e262d73aa25c465bfb49e0bd9dd5d59f1d0acba2f8fac
        - sha256:e7d814a81dad81e6caf2ec9fdedb284ecc9c73076b62654547cc64ccdcae26e9
    jeepney:
      version: "0.8.0"
      hashes:
        - sha256:c0a454ad016ca575060802ee4d590dd912e35c122fa04e70306de3d076cce755
        - sha256:5efe48d255973902f6badc3ce55e2aa6c5c3b3bc642059ef3a91247bcfcc5806
    SecretStorage:
      version: "3.3.3"
      hashes:
        - sha256:f356e6628222568e3af06f2eba8df495efa13b3b63081dafd4f7d9a7b7bc9f99
        - sha256:2403533ef369eca6d2ba81718576c5e0f564d5cca1b58f73a8b23e7d4eeebd77
    keyring:
      version: "25.5.0"
      hashes:
        - sha256:e67f8ac32b04be4714b42fe84ce7dad9c40985b9ca827c592cc303e7c26d9741
        - sha256:4c753b3ec91717fe713c4edd522d625889d8973a349b0e582622f49766de58e6
    jaraco.classes:
      version: "3.4.0"
      hashes:
        - sha256:f662826b6bed8cace05e7ff873ce0f9283b5c924470fe664fff1c2f00f581790
        - sha256:47a024b51d0239c0dd8c8540c6c7f484be3b8fcf0b2d85c13825780d3b3f3acd
    jaraco.functools:
      version: "4.1.0"
      hashes:
        - sha256:ad159f13428bc4acbf5541ad6dec511f91573b90fba04df61dafa2a1231cf649
        - sha256:70f7e0e2ae076498e212562325e805204fc092d7b4c17e0e86c959e249701a9d
    jaraco.context:
      version: "6.0.1"
      hashes:
        - sha256:f797fc481b490edb305122c9181830a3a5b76d84ef6d1aef2fb9b47ab956f9e4
        - sha256:9bae4ea555cf0b14938dc0aee7c9f32ed303aa20a3b73e7dc80111628792d1b3
    more_itertools:
      version: "10.5.0"
      hashes:
        - sha256:037b0d3203ce90cca8ab1defbbdac29d5f993fc20131f3664dc8d6acfa872aef
        - sha256:5482bfef7849c25dc3c6dd53a6173ae4795da2a41a80faea6700d9f5846c5da6
    charset-normalizer:
      version: "2.1.0"
      hashes:
        - sha256:5189b6f22b01957427f35b6a08d9a0bc45b46d3788ef5a92e978433c7a35f8a5
        - sha256:575e708016ff3a5e3681541cb9d79312c416835686d054a23accb873b254f413

  Windows:
    twisted-iocpsupport:
      version: "1.0.2"
      hashes:
        - sha256:985c06a33f5c0dae92c71a036d1ea63872ee86a21dd9b01e1f287486f15524b4
        - sha256:81b3abe3527b367da0220482820cb12a16c661672b7bcfcde328902890d63323
        - sha256:9dbb8823b49f06d4de52721b47de4d3b3026064ef4788ce62b1a21c57c3fff6f
        - sha256:b9fed67cf0f951573f06d560ac2f10f2a4bbdc6697770113a2fc396ea2cb2565
        - sha256:b76b4eed9b27fd63ddb0877efdd2d15835fdcb6baa745cb85b66e5d016ac2878
        - sha256:851b3735ca7e8102e661872390e3bce88f8901bece95c25a0c8bb9ecb8a23d32
        - sha256:bf4133139d77fc706d8f572e6b7d82871d82ec7ef25d685c2351bdacfb701415
        - sha256:306becd6e22ab6e8e4f36b6bdafd9c92e867c98a5ce517b27fdd27760ee7ae41
        - sha256:3c61742cb0bc6c1ac117a7e5f422c129832f0c295af49e01d8a6066df8cfc04d
        - sha256:b435857b9efcbfc12f8c326ef0383f26416272260455bbca2cd8d8eca470c546
        - sha256:7d972cfa8439bdcb35a7be78b7ef86d73b34b808c74be56dfa785c8a93b851bf
        - sha256:72068b206ee809c9c596b57b5287259ea41ddb4774d86725b19f35bf56aa32a9
    pywin32-ctypes:
      version: "0.2.3"
      hashes:
        - sha256:8a1513379d709975552d202d942d9837758905c8d01eb82b8bcc30918929e7b8
        - sha256:d162dc04946d704503b2edc4d55f3dba5c1d539ead017afa00142c38b9885755
    pynavlib:
      version: "0.9.4"
      hashes:
        - sha256:fdd5ab5b6e0a2c9bbcebb154ac7303daf845865a1649be04e1bd8e8e5889401f
        - sha256:493c4b3cacc939b021a694d99723106dbd7ee5515ad4dfc1c7fc8219ef20cf3a
        - sha256:332831553a70be05fe58c43a08109b42970cfedc6086ffb4306859142a0e9210
        - sha256:9173f61ad83172c306b92bbe38f949889c158cd6dfdc924db01f257a437bf2a6

  Macos:
    pynavlib:
      version: "0.9.4"
      hashes:
        - sha256:567efd0af97f9014326898b209eea94d9f5cc58e9f589ccf8354584568fcb87d
        - sha256:f0d7ce426e816788aa96b419fd7da263eafb99aca46ce3b6e5dbaf2bbf6b614a
        - sha256:33962a322033a78db05a8c2cc3d59e057fbea5b04879c3c54e2fe3041d691a12
        - sha256:d06d94b1dee4ba024b4a121869e572f571673a3b8c15b4055f52236d43c19a02

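Each entry above pins an exact version plus the sha256 digest of every artifact pip may download, which is what pip's hash-checking mode consumes. A minimal sketch (hypothetical helper, not part of the recipe) of how such entries translate into a hash-pinned requirements file:

import yaml

def requirement_lines(conandata_path="conandata.yml"):
    with open(conandata_path) as f:
        data = yaml.safe_load(f)
    for name, spec in data["pip_requirements_core"]["any_os"].items():
        if "url" in spec:  # VCS requirement, e.g. charon
            yield spec["url"]
        else:
            hashes = " ".join(f"--hash={h}" for h in spec.get("hashes", []))
            yield f"{name}=={spec['version']} {hashes}".strip()

With every line carrying hashes, pip can be invoked as `pip install --require-hashes -r requirements.txt` and will refuse any downloaded artifact whose digest does not match.
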
pip_requirements_dev:
  any_os:
    pytest: {}
    pyyaml: {}
    sip: { version: "6.5.1" }
    jinja2: {}

pip_requirements_installer:
  any_os:
    pyinstaller: { version: "6.11.1" }
    pyinstaller-hooks-contrib: {}

python_translation_source_folders:
  - cura
  - plugins
qml_translation_source_folders:
  - resources/qml
  - plugins

extra_dependencies:
  conan:
    version: "2.7.1"
    sources_url: https://github.com/conan-io/conan
    license: MIT
    summary: Conan C/C++ package manager


581
conanfile.py

@@ -1,6 +1,14 @@
import json
import os
import requests
import yaml
import tempfile
import tarfile
from datetime import datetime
from io import StringIO
from pathlib import Path
from git import Repo
from git.exc import GitCommandError

from jinja2 import Template

@@ -11,7 +19,7 @@ from conan.tools.env import VirtualRunEnv, Environment, VirtualBuildEnv
from conan.tools.scm import Version
from conan.errors import ConanInvalidConfiguration, ConanException

required_conan_version = ">=1.58.0 <2.0.0"
required_conan_version = ">=2.7.0"  # When changing the version, also change the one in conandata.yml/extra_dependencies


class CuraConan(ConanFile):

@@ -24,99 +32,55 @@ class CuraConan(ConanFile):
build_policy = "missing"
|
||||
exports = "LICENSE*", "*.jinja"
|
||||
settings = "os", "compiler", "build_type", "arch"
|
||||
generators = "VirtualPythonEnv"
|
||||
tool_requires = "gettext/0.22.5"
|
||||
|
||||
# FIXME: Remove specific branch once merged to main
|
||||
python_requires = "translationextractor/[>=2.2.0]@ultimaker/stable"
|
||||
python_requires = "translationextractor/[>=2.2.0]"
|
||||
|
||||
options = {
|
||||
"enterprise": ["True", "False", "true", "false"], # Workaround for GH Action passing boolean as lowercase string
|
||||
"staging": ["True", "False", "true", "false"], # Workaround for GH Action passing boolean as lowercase string
|
||||
"devtools": [True, False], # FIXME: Split this up in testing and (development / build (pyinstaller) / system installer) tools
|
||||
"cloud_api_version": "ANY",
|
||||
"display_name": "ANY", # TODO: should this be an option??
|
||||
"enterprise": [True, False],
|
||||
"staging": [True, False],
|
||||
"cloud_api_version": ["ANY"],
|
||||
"display_name": ["ANY"], # TODO: should this be an option??
|
||||
"cura_debug_mode": [True, False], # FIXME: Use profiles
|
||||
"internal": ["True", "False", "true", "false"], # Workaround for GH Action passing boolean as lowercase string
|
||||
"enable_i18n": [True, False],
|
||||
"internal": [True, False],
|
||||
"i18n_extract": [True, False],
|
||||
"skip_licenses_download": [True, False],
|
||||
}
|
||||
default_options = {
|
||||
"enterprise": "False",
|
||||
"staging": "False",
|
||||
"devtools": False,
|
||||
"enterprise": False,
|
||||
"staging": False,
|
||||
"cloud_api_version": "1",
|
||||
"display_name": "UltiMaker Cura",
|
||||
"cura_debug_mode": False, # Not yet implemented
|
||||
"internal": "False",
|
||||
"enable_i18n": False,
|
||||
"internal": False,
|
||||
"i18n_extract": False,
|
||||
"skip_licenses_download": False,
|
||||
}
|
||||
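With real booleans replacing the quoted "True"/"False" workaround, option values coming from the command line or a CI matrix are coerced by Conan itself; for example (assuming standard Conan 2 CLI syntax, options illustrative):

conan install . -o cura/*:enterprise=True -o cura/*:internal=False
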

    def set_version(self):
        if not self.version:
            self.version = self.conan_data["version"]

    @property
    def _i18n_options(self):
        return self.conf.get("user.i18n:options", default = {"extract": True, "build": True}, check_type = dict)

    @property
    def _pycharm_targets(self):
        return self.conan_data["pycharm_targets"]

    # FIXME: These env vars should be defined in the runenv.
    _cura_env = None

    @property
    def _cura_run_env(self):
        if self._cura_env:
            return self._cura_env

        self._cura_env = Environment()
        self._cura_env.define("QML2_IMPORT_PATH", str(self._site_packages.joinpath("PyQt6", "Qt6", "qml")))
        self._cura_env.define("QT_PLUGIN_PATH", str(self._site_packages.joinpath("PyQt6", "Qt6", "plugins")))
        if not self.in_local_cache:
            self._cura_env.define("CURA_DATA_ROOT", str(self._share_dir.joinpath("cura")))

        if self.settings.os == "Linux":
            self._cura_env.define("QT_QPA_FONTDIR", "/usr/share/fonts")
            self._cura_env.define("QT_QPA_PLATFORMTHEME", "xdgdesktopportal")
            self._cura_env.define("QT_XKB_CONFIG_ROOT", "/usr/share/X11/xkb")
        return self._cura_env

    @property
    def _enterprise(self):
        return self.options.enterprise in ["True", 'true']

    @property
    def _internal(self):
        return self.options.internal in ["True", 'true']

    @property
    def _app_name(self):
        if self._enterprise:
        if self.options.enterprise:
            return str(self.options.display_name) + " Enterprise"
        return str(self.options.display_name)

    @property
    def _urls(self):
        if self.options.staging in ["True", 'true']:
        if self.options.staging:
            return "staging"
        return "default"

    @property
    def requirements_txts(self):
        if self.options.devtools:
            return ["requirements.txt", "requirements-ultimaker.txt", "requirements-dev.txt"]
        return ["requirements.txt", "requirements-ultimaker.txt"]
    def _root_dir(self):
        return Path(self.deploy_folder if hasattr(self, "deploy_folder") else self.source_folder)

    @property
    def _base_dir(self):
        if self.install_folder is None:
            if self.build_folder is not None:
                return Path(self.build_folder)
            return Path(os.getcwd(), "venv")
        if self.in_local_cache:
            return Path(self.install_folder)
        else:
            return Path(self.source_folder, "venv")
        return self._root_dir.joinpath("venv")

    @property
    def _share_dir(self):

@@ -132,7 +96,7 @@ class CuraConan(ConanFile):
    def _site_packages(self):
        if self.settings.os == "Windows":
            return self._base_dir.joinpath("Lib", "site-packages")
        py_version = Version(self.deps_cpp_info["cpython"].version)
        py_version = Version(self.dependencies["cpython"].ref.version)
        return self._base_dir.joinpath("lib", f"python{py_version.major}.{py_version.minor}", "site-packages")

    @property

@@ -157,7 +121,7 @@ class CuraConan(ConanFile):
        # list of conan installs
        for dependency in self.dependencies.host.values():
            conan_installs[dependency.ref.name] = {
                "version": dependency.ref.version,
                "version": str(dependency.ref.version),
                "revision": dependency.ref.revision
            }
        return conan_installs

@@ -166,28 +130,198 @@ class CuraConan(ConanFile):
        self.output.info("Collecting python installs")
        python_installs = {}

        # list of python installs
        run_env = VirtualRunEnv(self)
        env = run_env.environment()
        env.prepend_path("PYTHONPATH", str(self._site_packages.as_posix()))
        venv_vars = env.vars(self, scope = "run")
        collect_python_installs = "collect_python_installs.py"
        code = f"import importlib.metadata; print(';'.join([(package.metadata['Name']+','+ package.metadata['Version']) for package in importlib.metadata.distributions()]))"
        save(self, collect_python_installs, code)

        outer = '"' if self.settings.os == "Windows" else "'"
        inner = "'" if self.settings.os == "Windows" else '"'
        buffer = StringIO()
        with venv_vars.apply():
            self.run(f"""python -c {outer}import pkg_resources; print({inner};{inner}.join([(s.key+{inner},{inner}+ s.version) for s in pkg_resources.working_set])){outer}""",
                     env = "conanrun",
                     output = buffer)
        self.run(f"""python {collect_python_installs}""", env = "virtual_python_env", stdout = buffer)
        rm(self, collect_python_installs, ".")

        packages = str(buffer.getvalue()).split("-----------------\n")
        packages = packages[1].strip('\r\n').split(";")
        packages = str(buffer.getvalue()).strip('\r\n').split(";")
        for package in packages:
            name, version = package.split(",")
            python_installs[name] = {"version": version}

        return python_installs
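The generated collect_python_installs.py boils down to a one-liner over importlib.metadata; run standalone, the same collection step looks like this sketch:

import importlib.metadata

pairs = ";".join(f"{dist.metadata['Name']},{dist.metadata['Version']}"
                 for dist in importlib.metadata.distributions())
# e.g. "requests,2.32.3;certifi,2023.5.7;..."
installed = {name: {"version": version}
             for name, version in (pair.split(",") for pair in pairs.split(";"))}
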

    @staticmethod
    def _is_repository_url(url):
        # That will not work for ALL open-source projects, but should already get a large majority of them
        return (url.startswith("https://github.com/") or url.startswith("https://gitlab.com/")) and "conan-center-index" not in url
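Illustrative behaviour of the helper above (URLs chosen arbitrarily, not taken from the recipe):

assert CuraConan._is_repository_url("https://github.com/psf/requests")
assert CuraConan._is_repository_url("https://gitlab.com/inkscape/inkscape")
assert not CuraConan._is_repository_url("https://github.com/conan-io/conan-center-index")
assert not CuraConan._is_repository_url("https://requests.readthedocs.io")
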

    def _retrieve_pip_license(self, package, sources_url, dependency_description):
        # Download the sources to get the license file inside
        self.output.info(f"Retrieving license for {package}")
        response = requests.get(sources_url)
        response.raise_for_status()

        with tempfile.TemporaryDirectory() as temp_dir:
            sources_path = os.path.join(temp_dir, "sources.tar.gz")
            with open(sources_path, 'wb') as sources_file:
                sources_file.write(response.content)

            with tarfile.open(sources_path, 'r:gz') as sources_archive:
                license_file = "LICENSE"

                for source_file in sources_archive.getnames():
                    if Path(source_file).name == license_file:
                        sources_archive.extract(source_file, temp_dir)

                        license_file_path = os.path.join(temp_dir, source_file)
                        with open(license_file_path, 'r', encoding='utf8') as file:
                            dependency_description["license_full"] = file.read()

    def _make_pip_dependency_description(self, package, version, dependencies):
        url = ["https://pypi.org/pypi", package]
        if version is not None:
            url.append(version)
        url.append("json")

        data = requests.get("/".join(url)).json()

        dependency_description = {
            "summary": data["info"]["summary"],
            "version": data["info"]["version"],
            "license": data["info"]["license"]
        }

        for url_data in data["urls"]:
            if url_data["packagetype"] == "sdist":
                sources_url = url_data["url"]
                dependency_description["sources_url"] = sources_url

                if not self.options.skip_licenses_download:
                    self._retrieve_pip_license(package, sources_url, dependency_description)

        for source_url, check_source in [("source", False),
                                         ("Source", False),
                                         ("Source Code", False),
                                         ("Repository", False),
                                         ("Code", False),
                                         ("homepage", True),
                                         ("Homepage", True)]:
            try:
                url = data["info"]["project_urls"][source_url]
                if check_source and not self._is_repository_url(url):
                    # That will not work for ALL open-source projects, but should already get a large majority of them
                    self.output.warning(f"Source URL for {package} ({url}) doesn't seem to be a supported repository")
                    continue
                dependency_description["sources_url"] = url
                break
            except KeyError:
                pass

        if dependency_description["license"] is not None and len(dependency_description["license"]) > 32:
            # Some packages have their full license in this field
            dependency_description["license_full"] = dependency_description["license"]
            dependency_description["license"] = data["info"]["name"]

        dependencies[data["info"]["name"]] = dependency_description
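The lookup above uses PyPI's public JSON API; a pinned request resolves to a URL of this shape (package and version illustrative):

import requests

data = requests.get("https://pypi.org/pypi/requests/2.32.3/json").json()
summary = data["info"]["summary"]        # short description
license_field = data["info"]["license"]  # may contain a full license text
sdist_url = next(u["url"] for u in data["urls"] if u["packagetype"] == "sdist")
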

    @staticmethod
    def _get_license_from_repository(sources_url, version, license_file_name = None):
        if sources_url.startswith("https://github.com/Ultimaker/") and "private" in sources_url:
            return None

        git_url = sources_url
        if git_url.endswith('/'):
            git_url = git_url[:-1]
        if not git_url.endswith(".git"):
            git_url = f"{git_url}.git"
        git_url = git_url.replace("/cgit/", "/")

        tags = [f"v{version}", version]
        files = ["LICENSE", "LICENSE.txt", "LICENSE.md", "COPYRIGHT", "COPYING", "COPYING.LIB"] if license_file_name is None else [license_file_name]

        with tempfile.TemporaryDirectory() as clone_dir:
            repo = Repo.clone_from(git_url, clone_dir, depth=1, no_checkout=True)

            for tag in tags:
                try:
                    repo.git.fetch('--depth', '1', 'origin', 'tag', tag)
                except GitCommandError:
                    continue

                repo.git.sparse_checkout('init', '--cone')
                for file_name in files:
                    repo.git.sparse_checkout('add', file_name)

                try:
                    repo.git.checkout(tag)
                except GitCommandError:
                    pass

                for file_name in files:
                    license_file = os.path.join(clone_dir, file_name)
                    if os.path.exists(license_file):
                        with open(license_file, 'r', encoding='utf8') as file:
                            return file.read()

                break
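The method above fetches a single tag at depth 1 and sparse-checks-out only the candidate license files, so even large repositories transfer almost nothing. The equivalent steps for one tag, as a standalone sketch (repository and tag illustrative):

import tempfile
from git import Repo

with tempfile.TemporaryDirectory() as clone_dir:
    repo = Repo.clone_from("https://github.com/psf/requests.git", clone_dir,
                           depth=1, no_checkout=True)
    repo.git.fetch('--depth', '1', 'origin', 'tag', 'v2.32.3')
    repo.git.sparse_checkout('init', '--cone')
    repo.git.sparse_checkout('add', 'LICENSE')
    repo.git.checkout('v2.32.3')
    # clone_dir now contains only LICENSE (plus git metadata)
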

    def _make_conan_dependency_description(self, dependency, dependencies):
        dependency_description = {
            "summary": dependency.description,
            "version": str(dependency.ref.version),
            "license": ', '.join(dependency.license) if (isinstance(dependency.license, list) or isinstance(dependency.license, tuple)) else dependency.license,
        }

        for source_url, check_source in [(dependency.homepage, True),
                                         (dependency.url, True),
                                         (dependency.homepage, False),
                                         (dependency.url, False)]:
            if source_url is None:
                continue

            is_repository_source = self._is_repository_url(source_url)
            if not check_source or is_repository_source:
                dependency_description["sources_url"] = source_url

                if is_repository_source and not self.options.skip_licenses_download:
                    self.output.info(f"Retrieving license for {dependency.ref.name}")
                    dependency_description["license_full"] = self._get_license_from_repository(source_url, str(dependency.ref.version))

                break

        dependencies[dependency.ref.name] = dependency_description

    def _make_extra_dependency_description(self, dependency_name, dependency_data, dependencies):
        sources_url = dependency_data["sources_url"]
        version = dependency_data["version"]
        home_url = dependency_data["home_url"] if "home_url" in dependency_data else sources_url

        dependency_description = {
            "summary": dependency_data["summary"],
            "version": version,
            "license": dependency_data["license"],
            "sources_url": home_url,
        }

        if not self.options.skip_licenses_download:
            self.output.info(f"Retrieving license for {dependency_name}")
            license_file = dependency_data["license_file"] if "license_file" in dependency_data else None
            dependency_description["license_full"] = self._get_license_from_repository(sources_url, version, license_file)

        dependencies[dependency_name] = dependency_description

    def _dependencies_description(self):
        dependencies = {}

        for dependency in [self] + list(self.dependencies.values()):
            self._make_conan_dependency_description(dependency, dependencies)

            if "extra_dependencies" in dependency.conan_data:
                for dependency_name, dependency_data in dependency.conan_data["extra_dependencies"].items():
                    self._make_extra_dependency_description(dependency_name, dependency_data, dependencies)

        pip_requirements_summary = os.path.abspath(Path(self.generators_folder, "pip_requirements_summary.yml"))
        with open(pip_requirements_summary, 'r') as file:
            for package_name, package_version in yaml.safe_load(file).items():
                self._make_pip_dependency_description(package_name, package_version, dependencies)

        return dependencies
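The mapping returned above is what gets rendered into CuraVersion.py; its shape looks roughly like this (keys from the builders above, values illustrative):

dependencies = {
    "cura": {"summary": "...", "version": "5.11.0", "license": "LGPL-3.0",
             "sources_url": "https://github.com/Ultimaker/Cura",
             "license_full": "..."},
    "requests": {"summary": "Python HTTP for Humans.", "version": "2.32.3",
                 "license": "Apache-2.0",
                 "sources_url": "https://github.com/psf/requests"},
}
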

    def _generate_cura_version(self, location):
        with open(os.path.join(self.recipe_folder, "CuraVersion.py.jinja"), "r") as f:
            cura_version_py = Template(f.read())

@@ -197,15 +331,18 @@ class CuraConan(ConanFile):
        cura_version = Version(self.conf.get("user.cura:version", default = self.version, check_type = str))
        pre_tag = f"-{cura_version.pre}" if cura_version.pre else ""
        build_tag = f"+{cura_version.build}" if cura_version.build else ""
        internal_tag = f"+internal" if self._internal else ""
        internal_tag = f"+internal" if self.options.internal else ""
        cura_version = f"{cura_version.major}.{cura_version.minor}.{cura_version.patch}{pre_tag}{build_tag}{internal_tag}"

        with open(os.path.join(location, "CuraVersion.py"), "w") as f:
        self.output.info(f"Write CuraVersion.py to {self.recipe_folder}")

        with open(os.path.join(location, "CuraVersion.py"), "wb") as f:
            f.write(cura_version_py.render(
                cura_app_name = self.name,
                cura_app_display_name = self._app_name,
                cura_version = cura_version,
                cura_build_type = "Enterprise" if self._enterprise else "",
                cura_version_full = self.version,
                cura_build_type = "Enterprise" if self.options.enterprise else "",
                cura_debug_mode = self.options.cura_debug_mode,
                cura_cloud_api_root = self.conan_data["urls"][self._urls]["cloud_api_root"],
                cura_cloud_api_version = self.options.cloud_api_version,

@@ -215,7 +352,8 @@ class CuraConan(ConanFile):
                cura_latest_url=self.conan_data["urls"][self._urls]["cura_latest_url"],
                conan_installs=self._conan_installs(),
                python_installs=self._python_installs(),
            ))
                dependencies_description=self._dependencies_description(),
            ).encode("utf-8"))
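A worked example of the version composition above, assuming Conan 2's Version semantics:

from conan.tools.scm import Version

v = Version("5.11.0-alpha.1+build.42")
pre_tag = f"-{v.pre}" if v.pre else ""        # "-alpha.1"
build_tag = f"+{v.build}" if v.build else ""  # "+build.42"
internal_tag = "+internal"                    # when options.internal is set
print(f"{v.major}.{v.minor}.{v.patch}{pre_tag}{build_tag}{internal_tag}")
# -> 5.11.0-alpha.1+build.42+internal
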

    def _delete_unwanted_binaries(self, root):
        dynamic_binary_file_exts = [".so", ".dylib", ".dll", ".pyd", ".pyi"]

@@ -228,6 +366,7 @@ class CuraConan(ConanFile):
            "qtmqtt",
            "qtnetworkauth",
            "qtquick3d",
            "quick3d",
            "qtquick3dphysics",
            "qtquicktimeline",
            "qtvirtualkeyboard",

@@ -256,7 +395,7 @@ class CuraConan(ConanFile):
                to_remove_files.append(pathname)
            for dirname in dir_:
                for forbidden in prohibited:
                    if forbidden.lower() == str(dirname).lower():
                    if forbidden.lower() in str(dirname).lower():
                        pathname = os.path.join(root, dirname)
                        to_remove_dirs.append(pathname)
                        break

@@ -273,43 +412,52 @@ class CuraConan(ConanFile):
            except Exception as ex:
                print(f"WARNING: Attempt to delete folder {dir_} results in: {str(ex)}")

    def _generate_pyinstaller_spec(self, location, entrypoint_location, icon_path, entitlements_file):
    def _generate_pyinstaller_spec(self, location, entrypoint_location, icon_path, entitlements_file, cura_source_folder):
        pyinstaller_metadata = self.conan_data["pyinstaller"]
        datas = []
        for data in pyinstaller_metadata["datas"].values():
            if not self._internal and data.get("internal", False):
            if (not self.options.internal and data.get("internal", False)) or (
                    not self.options.enterprise and data.get("enterprise_only", False)):
                continue

            if "oses" in data and self.settings.os not in data["oses"]:
                continue

            if "package" in data:  # get the paths from conan package
                if data["package"] == self.name:
                    if self.in_local_cache:
                        src_path = os.path.join(self.package_folder, data["src"])
                    else:
                        src_path = os.path.join(self.source_folder, data["src"])
                    src_path = str(Path(cura_source_folder, data["src"]))
                else:
                    if data["package"] not in self.deps_cpp_info.deps:
                        continue
                    src_path = os.path.join(self.deps_cpp_info[data["package"]].rootpath, data["src"])
                    if data["package"] not in self.dependencies:
                        raise ConanException(f"Required package {data['package']} does not exist as a dependency")

                    package_folder = self.dependencies[data['package']].package_folder
                    if package_folder is None:
                        raise ConanException(f"Unable to find package_folder for {data['package']}, check that it has not been skipped")

                    src_path = os.path.join(self.dependencies[data["package"]].package_folder, data["src"])
            elif "root" in data:  # get the paths relative from the install folder
                src_path = os.path.join(self.install_folder, data["root"], data["src"])
            else:
                continue
            if Path(src_path).exists():
                datas.append((str(src_path), data["dst"]))
                raise ConanException("Misformatted conan data for pyinstaller datas, expected either package or root option")

            if not Path(src_path).exists():
                raise ConanException(f"Missing folder {src_path} for pyinstaller data {data}")

            datas.append((str(src_path), data["dst"]))

        binaries = []
        for binary in pyinstaller_metadata["binaries"].values():
            if "package" in binary:  # get the paths from conan package
                src_path = os.path.join(self.deps_cpp_info[binary["package"]].rootpath, binary["src"])
                src_path = os.path.join(self.dependencies[binary["package"]].package_folder, binary["src"])
            elif "root" in binary:  # get the paths relative from the sourcefolder
                src_path = str(self.source_path.joinpath(binary["root"], binary["src"]))
                src_path = str(Path(self.source_folder, binary["root"], binary["src"]))
                if self.settings.os == "Windows":
                    src_path = src_path.replace("\\", "\\\\")
            else:
                continue
                raise ConanException("Misformatted conan data for pyinstaller binaries, expected either package or root option")

            if not Path(src_path).exists():
                self.output.warning(f"Source path for binary {binary['binary']} does not exist")
                continue
                raise ConanException(f"Missing folder {src_path} for pyinstaller binary {binary}")

            for bin in Path(src_path).glob(binary["binary"] + "*[.exe|.dll|.so|.dylib|.so.]*"):
                binaries.append((str(bin), binary["dst"]))
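Each "datas" entry consumed by the loop above comes from conandata.yml and, expressed as a Python dict, has roughly this shape (keys taken from the loop, values hypothetical):

data = {
    "package": "cura",               # or instead: "root": "<install subdir>"
    "src": "resources",              # path inside the package or root
    "dst": "share/cura/resources",   # destination inside the bundle
    "oses": ["Windows", "Macos"],    # optional OS filter
    "internal": False,               # optional: internal-only payload
    "enterprise_only": False,        # optional: enterprise-only payload
}
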
@@ -350,6 +498,12 @@ class CuraConan(ConanFile):
            except Exception as ex:
                print(f"WARNING: Attempt to delete binary {unwanted_path} results in: {str(ex)}")

        hiddenimports = pyinstaller_metadata["hiddenimports"]
        collect_all = pyinstaller_metadata["collect_all"]
        if self.settings.os == "Windows":
            hiddenimports += pyinstaller_metadata["hiddenimports_WINDOWS_ONLY"]
            collect_all += pyinstaller_metadata["collect_all_WINDOWS_ONLY"]

        # Write the actual file:
        with open(os.path.join(location, "UltiMaker-Cura.spec"), "w") as f:
            f.write(pyinstaller.render(

@@ -359,8 +513,8 @@ class CuraConan(ConanFile):
                datas = datas,
                binaries = filtered_binaries,
                venv_script_path = str(self._script_dir),
                hiddenimports = pyinstaller_metadata["hiddenimports"],
                collect_all = pyinstaller_metadata["collect_all"],
                hiddenimports = hiddenimports,
                collect_all = collect_all,
                icon = icon_path,
                entitlements_file = entitlements_file,
                osx_bundle_identifier = "'nl.ultimaker.cura'" if self.settings.os == "Macos" else "None",

@@ -379,92 +533,82 @@ class CuraConan(ConanFile):
        copy(self, "*", os.path.join(self.recipe_folder, "plugins"), os.path.join(self.export_sources_folder, "plugins"))
        copy(self, "*", os.path.join(self.recipe_folder, "resources"), os.path.join(self.export_sources_folder, "resources"), excludes = "*.mo")
        copy(self, "*", os.path.join(self.recipe_folder, "tests"), os.path.join(self.export_sources_folder, "tests"))
        copy(self, "*", os.path.join(self.recipe_folder, "cura"), os.path.join(self.export_sources_folder, "cura"), excludes="CuraVersion.py")
        copy(self, "*", os.path.join(self.recipe_folder, "cura"), os.path.join(self.export_sources_folder, "cura"))
        copy(self, "*", os.path.join(self.recipe_folder, "packaging"), os.path.join(self.export_sources_folder, "packaging"))
        copy(self, "*", os.path.join(self.recipe_folder, ".run_templates"), os.path.join(self.export_sources_folder, ".run_templates"))
        copy(self, "requirements.txt", self.recipe_folder, self.export_sources_folder)
        copy(self, "requirements-dev.txt", self.recipe_folder, self.export_sources_folder)
        copy(self, "requirements-ultimaker.txt", self.recipe_folder, self.export_sources_folder)
        copy(self, "cura_app.py", self.recipe_folder, self.export_sources_folder)

    def config_options(self):
        if self.settings.os == "Windows" and not self.conf.get("tools.microsoft.bash:path", check_type=str):
            del self.options.enable_i18n

    def configure(self):
        self.options["pyarcus"].shared = True
        self.options["pysavitar"].shared = True
        self.options["pynest2d"].shared = True
        self.options["dulcificum"].shared = self.settings.os != "Windows"
        self.options["cpython"].shared = True
        self.options["boost"].header_only = True
        if self.settings.os == "Linux":
            self.options["openssl"].shared = True
        if self.conf.get("user.curaengine:sentry_url", "", check_type=str) != "":
            self.options["curaengine"].enable_sentry = True
            self.options["arcus"].enable_sentry = True
            self.options["clipper"].enable_sentry = True

    def validate(self):
        version = self.conf.get("user.cura:version", default = self.version, check_type = str)
        if version and Version(version) <= Version("4"):
            raise ConanInvalidConfiguration("Only versions 5+ are supported")
        if self.options.i18n_extract and self.settings.os == "Windows" and not self.conf.get("tools.microsoft.bash:path", check_type=str):
            raise ConanInvalidConfiguration("Unable to extract translations on Windows without Bash installed")

    def requirements(self):
        for req in self.conan_data["requirements"]:
            if self._internal and "fdm_materials" in req:
                continue
            if not self._enterprise and "native_cad_plugin" in req:
            if self.options.internal and "fdm_materials" in req:
                continue
            self.requires(req)
        if self._internal:
        if self.options.internal:
            for req in self.conan_data["requirements_internal"]:
                self.requires(req)
        self.requires("cpython/3.10.4@ultimaker/stable")
        self.requires("clipper/6.4.2@ultimaker/stable")
        self.requires("openssl/3.2.0")
        self.requires("protobuf/3.21.12")
        self.requires("boost/1.82.0")
        self.requires("spdlog/1.12.0")
        self.requires("fmt/10.1.1")
        self.requires("zlib/1.2.13")

    def build_requirements(self):
        if self.options.get_safe("enable_i18n", False):
            self.tool_requires("gettext/0.21", force_host_context = True)
        if self.options.enterprise:
            for req in self.conan_data["requirements_enterprise"]:
                self.requires(req)
        self.requires("cpython/3.12.2")

    def layout(self):
        self.folders.source = "."
        self.folders.build = "venv"
        self.folders.generators = os.path.join(self.folders.build, "conan")
        self.folders.build = "build"
        self.folders.generators = os.path.join(self.folders.build, "generators")

        self.cpp.package.libdirs = [os.path.join("site-packages", "cura")]
        self.cpp.package.bindirs = ["bin"]
        self.cpp.package.resdirs = ["resources", "plugins", "packaging", "pip_requirements"]  # pip_requirements should be the last item in the list
        self.cpp.package.resdirs = ["resources", "plugins", "packaging"]

    def _make_internal_distinct(self):
        test_colors_path = Path(self.source_folder, "resources", "themes", "daily_test_colors.json")
        if not test_colors_path.exists():
            print(f"Could not find '{str(test_colors_path)}'. Won't generate rotating colors for alpha builds.")
            return
        if "alpha" in self.version:
            with test_colors_path.open("r") as test_colors_file:
                test_colors = json.load(test_colors_file)
            biweekly_day = (datetime.now() - datetime(2025, 3, 14)).days % len(test_colors)
            for theme_dir in Path(self.source_folder, "resources", "themes").iterdir():
                if not theme_dir.is_dir():
                    continue
                theme_path = Path(theme_dir, "theme.json")
                if not theme_path.exists():
                    print(f"('Colorize-by-day' alpha builds): Skipping {str(theme_path)}, could not find file.")
                    continue
                with theme_path.open("r") as theme_file:
                    theme = json.load(theme_file)
                if theme["colors"]:
                    theme["colors"]["main_window_header_background"] = test_colors[biweekly_day]
                    with theme_path.open("w") as theme_file:
                        json.dump(theme, theme_file)
        test_colors_path.unlink()
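A worked example of the rotation index above: with, say, 14 colours listed in daily_test_colors.json, an alpha build on 2025-03-20 computes

from datetime import datetime
(datetime(2025, 3, 20) - datetime(2025, 3, 14)).days % 14  # -> 6

so every build made that day shares header colour number 6, and the next day advances one entry.
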
    def generate(self):
        copy(self, "cura_app.py", self.source_folder, str(self._script_dir))
        cura_run_envvars = self._cura_run_env.vars(self, scope = "run")
        ext = ".ps1" if self.settings.os == "Windows" else ".sh"
        cura_run_envvars.save_script(os.path.join(self.folders.generators, f"cura_run_environment{ext}"))

        vr = VirtualRunEnv(self)
        vr.generate()
        self._generate_cura_version(str(Path(self.source_folder, "cura")))

        self._generate_cura_version(os.path.join(self.source_folder, "cura"))
        # Copy CuraEngine.exe to bindirs of Virtual Python Environment
        curaengine = self.dependencies["curaengine"].cpp_info
        copy(self, "CuraEngine.exe", curaengine.bindirs[0], self.source_folder, keep_path = False)
        copy(self, "CuraEngine", curaengine.bindirs[0], self.source_folder, keep_path = False)

        if not self.in_local_cache:
            # Copy CuraEngine.exe to bindirs of Virtual Python Environment
            curaengine = self.dependencies["curaengine"].cpp_info
            copy(self, "CuraEngine.exe", curaengine.bindirs[0], self.source_folder, keep_path = False)
            copy(self, "CuraEngine", curaengine.bindirs[0], self.source_folder, keep_path = False)
        # Copy the external plugins that we want to bundle with Cura
        if self.options.enterprise:
            rmdir(self, str(Path(self.source_folder, "plugins", "NativeCADplugin")))
            native_cad_plugin = self.dependencies["native_cad_plugin"].cpp_info
            copy(self, "*", native_cad_plugin.resdirs[0], str(Path(self.source_folder, "plugins", "NativeCADplugin")),
                 keep_path = True)
            copy(self, "bundled_*.json", native_cad_plugin.resdirs[1],
                 str(Path(self.source_folder, "resources", "bundled_packages")), keep_path = False)

        # Copy the external plugins that we want to bundle with Cura
        if self._enterprise:
            rmdir(self, str(self.source_path.joinpath("plugins", "NativeCADplugin")))
            native_cad_plugin = self.dependencies["native_cad_plugin"].cpp_info
            copy(self, "*", native_cad_plugin.resdirs[0], str(self.source_path.joinpath("plugins", "NativeCADplugin")), keep_path = True)
            copy(self, "bundled_*.json", native_cad_plugin.resdirs[1], str(self.source_path.joinpath("resources", "bundled_packages")), keep_path = False)
        # Make internal versions built on different days distinct, so people don't get confused while testing.
        self._make_internal_distinct()

        # Copy resources of cura_binary_data
        cura_binary_data = self.dependencies["cura_binary_data"].cpp_info

@@ -484,48 +628,35 @@ class CuraConan(ConanFile):
        copy(self, "*.dylib", libdir, str(self._base_dir.joinpath("lib")), keep_path = False)

        # Copy materials (flat)
        rmdir(self, os.path.join(self.source_folder, "resources", "materials"))
        rmdir(self, str(Path(self.source_folder, "resources", "materials")))
        fdm_materials = self.dependencies["fdm_materials"].cpp_info
        copy(self, "*", fdm_materials.resdirs[0], self.source_folder)

        # Copy internal resources
        if self._internal:
        if self.options.internal:
            cura_private_data = self.dependencies["cura_private_data"].cpp_info
            copy(self, "*", cura_private_data.resdirs[0], str(self._share_dir.joinpath("cura")))

        if self.options.devtools:
            entitlements_file = "'{}'".format(os.path.join(self.source_folder, "packaging", "MacOS", "cura.entitlements"))
            self._generate_pyinstaller_spec(
                location=self.generators_folder,
                entrypoint_location="'{}'".format(
                    os.path.join(self.source_folder, self.conan_data["pyinstaller"]["runinfo"]["entrypoint"])).replace(
                    "\\", "\\\\"),
                icon_path="'{}'".format(os.path.join(self.source_folder, "packaging",
                                                     self.conan_data["pyinstaller"]["icon"][
                                                         str(self.settings.os)])).replace("\\", "\\\\"),
                entitlements_file=entitlements_file if self.settings.os == "Macos" else "None"
            )

        if self.options.get_safe("enable_i18n", False) and self._i18n_options["extract"]:
        if self.options.i18n_extract:
            vb = VirtualBuildEnv(self)
            vb.generate()

            # FIXME: once m4, autoconf, automake are Conan V2 ready use self.win_bash and add gettext as base tool_requirement
            cpp_info = self.dependencies["gettext"].cpp_info
            pot = self.python_requires["translationextractor"].module.ExtractTranslations(self, cpp_info.bindirs[0])
            pot = self.python_requires["translationextractor"].module.ExtractTranslations(self)
            pot.generate()

    def build(self):
        if self.options.get_safe("enable_i18n", False) and self._i18n_options["build"]:
            for po_file in self.source_path.joinpath("resources", "i18n").glob("**/*.po"):
                mo_file = Path(self.build_folder, po_file.with_suffix('.mo').relative_to(self.source_path))
                mo_file = mo_file.parent.joinpath("LC_MESSAGES", mo_file.name)
                mkdir(self, str(unix_path(self, Path(mo_file).parent)))
                cpp_info = self.dependencies["gettext"].cpp_info
                self.run(f"{cpp_info.bindirs[0]}/msgfmt {po_file} -o {mo_file} -f", env="conanbuild", ignore_errors=True)
        for po_file in Path(self.source_folder, "resources", "i18n").glob("**/*.po"):
            mo_file = Path(self.build_folder, po_file.with_suffix('.mo').relative_to(self.source_folder))
            mo_file = mo_file.parent.joinpath("LC_MESSAGES", mo_file.name)
            mkdir(self, str(unix_path(self, Path(mo_file).parent)))
            self.run(f"msgfmt {po_file} -o {mo_file} -f", env="conanbuild")
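The loop above mirrors gettext's expected catalogue layout; for one file the mapping works out like this sketch (paths illustrative):

from pathlib import Path

po_file = Path("resources/i18n/de_DE/cura.po")
mo_file = Path("build", po_file.with_suffix(".mo"))
mo_file = mo_file.parent / "LC_MESSAGES" / mo_file.name
print(f"msgfmt {po_file} -o {mo_file} -f")
# -> msgfmt resources/i18n/de_DE/cura.po -o build/resources/i18n/de_DE/LC_MESSAGES/cura.mo -f

msgfmt compiles the .po catalogue to binary .mo; -o names the output and -f includes fuzzy translations.
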

    def deploy(self):
        copy(self, "*", os.path.join(self.package_folder, self.cpp.package.resdirs[2]), os.path.join(self.install_folder, "packaging"), keep_path = True)
        ''' Note: this deploy step is actually used to prepare for building a Cura distribution with pyinstaller, which is not
            the original purpose in the Conan philosophy '''

        copy(self, "*", os.path.join(self.package_folder, self.cpp.package.resdirs[2]),
             os.path.join(self.deploy_folder, "packaging"), keep_path=True)

        # Copy resources of Cura (keep folder structure) needed by pyinstaller to determine the module structure
        copy(self, "*", os.path.join(self.package_folder, self.cpp_info.bindirs[0]), str(self._base_dir), keep_path = False)

@@ -545,39 +676,17 @@ class CuraConan(ConanFile):
        copy(self, "*", uranium.resdirs[1], str(self._share_dir.joinpath("uranium", "plugins")), keep_path = True)
        copy(self, "*", uranium.libdirs[0], str(self._site_packages.joinpath("UM")), keep_path = True)

        # Generate the GitHub Action version info Environment
        version = self.conf.get("user.cura:version", default = self.version, check_type = str)
        cura_version = Version(version)
        env_prefix = "Env:" if self.settings.os == "Windows" else ""
        activate_github_actions_version_env = Template(r"""echo "CURA_VERSION_MAJOR={{ cura_version_major }}" >> ${{ env_prefix }}GITHUB_ENV
echo "CURA_VERSION_MINOR={{ cura_version_minor }}" >> ${{ env_prefix }}GITHUB_ENV
echo "CURA_VERSION_PATCH={{ cura_version_patch }}" >> ${{ env_prefix }}GITHUB_ENV
echo "CURA_VERSION_BUILD={{ cura_version_build }}" >> ${{ env_prefix }}GITHUB_ENV
echo "CURA_VERSION_FULL={{ cura_version_full }}" >> ${{ env_prefix }}GITHUB_ENV
echo "CURA_APP_NAME={{ cura_app_name }}" >> ${{ env_prefix }}GITHUB_ENV
""").render(cura_version_major = cura_version.major,
            cura_version_minor = cura_version.minor,
            cura_version_patch = cura_version.patch,
            cura_version_build = cura_version.build if cura_version.build != "" else "0",
            cura_version_full = self.version,
            cura_app_name = self._app_name,
            env_prefix = env_prefix)

        ext = ".sh" if self.settings.os != "Windows" else ".ps1"
        save(self, os.path.join(self._script_dir, f"activate_github_actions_version_env{ext}"), activate_github_actions_version_env)

        self._generate_cura_version(os.path.join(self._site_packages, "cura"))

        self._delete_unwanted_binaries(self._site_packages)
        self._delete_unwanted_binaries(self.package_folder)
        self._delete_unwanted_binaries(self._base_dir)
        self._delete_unwanted_binaries(self._share_dir)

        entitlements_file = "'{}'".format(Path(self.cpp_info.res_paths[2], "MacOS", "cura.entitlements"))
        self._generate_pyinstaller_spec(location = self._base_dir,
        entitlements_file = "'{}'".format(Path(self.deploy_folder, "packaging", "MacOS", "cura.entitlements"))
        self._generate_pyinstaller_spec(location = self.deploy_folder,
                                        entrypoint_location = "'{}'".format(os.path.join(self.package_folder, self.cpp_info.bindirs[0], self.conan_data["pyinstaller"]["runinfo"]["entrypoint"])).replace("\\", "\\\\"),
                                        icon_path = "'{}'".format(os.path.join(self.package_folder, self.cpp_info.resdirs[2], self.conan_data["pyinstaller"]["icon"][str(self.settings.os)])).replace("\\", "\\\\"),
                                        entitlements_file = entitlements_file if self.settings.os == "Macos" else "None")
                                        entitlements_file = entitlements_file if self.settings.os == "Macos" else "None",
                                        cura_source_folder = self.package_folder)
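On Linux and macOS (empty env_prefix), the rendered activate_github_actions_version_env.sh comes out roughly like this for a 5.10.0 build (values illustrative):

echo "CURA_VERSION_MAJOR=5" >> $GITHUB_ENV
echo "CURA_VERSION_MINOR=10" >> $GITHUB_ENV
echo "CURA_VERSION_PATCH=0" >> $GITHUB_ENV
echo "CURA_VERSION_BUILD=0" >> $GITHUB_ENV
echo "CURA_VERSION_FULL=5.10.0" >> $GITHUB_ENV
echo "CURA_APP_NAME=UltiMaker Cura" >> $GITHUB_ENV

On Windows the prefix renders as $Env:GITHUB_ENV instead.
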
||||
def package(self):
|
||||
copy(self, "cura_app.py", src = self.source_folder, dst = os.path.join(self.package_folder, self.cpp.package.bindirs[0]))
|
||||
|
@@ -585,8 +694,9 @@ echo "CURA_APP_NAME={{ cura_app_name }}" >> ${{ env_prefix }}GITHUB_ENV
    copy(self, "*", src = os.path.join(self.source_folder, "resources"), dst = os.path.join(self.package_folder, self.cpp.package.resdirs[0]))
    copy(self, "*.mo", os.path.join(self.build_folder, "resources"), os.path.join(self.package_folder, "resources"))
    copy(self, "*", src = os.path.join(self.source_folder, "plugins"), dst = os.path.join(self.package_folder, self.cpp.package.resdirs[1]))
    copy(self, "requirement*.txt", src = self.source_folder, dst = os.path.join(self.package_folder, self.cpp.package.resdirs[-1]))
    copy(self, "*", src = os.path.join(self.source_folder, "packaging"), dst = os.path.join(self.package_folder, self.cpp.package.resdirs[2]))
    copy(self, "pip_requirements_*.txt", src = self.generators_folder, dst = os.path.join(self.package_folder, self.cpp.package.resdirs[-1]))
    copy(self, "pip_requirements_summary.yml", src = self.generators_folder, dst = os.path.join(self.package_folder, self.cpp.package.resdirs[-1]))

    # Remove the fdm_materials from the package
    rmdir(self, os.path.join(self.package_folder, self.cpp.package.resdirs[0], "materials"))
@@ -598,35 +708,8 @@ echo "CURA_APP_NAME={{ cura_app_name }}" >> ${{ env_prefix }}GITHUB_ENV
            rmdir(self, os.path.join(self.package_folder, self.cpp.package.resdirs[0], Path(res_dir).name))

    def package_info(self):
        self.user_info.pip_requirements = "requirements.txt"
        self.user_info.pip_requirements_git = "requirements-ultimaker.txt"
        self.user_info.pip_requirements_build = "requirements-dev.txt"

        if self.in_local_cache:
            self.runenv_info.append_path("PYTHONPATH", os.path.join(self.package_folder, "site-packages"))
            self.env_info.PYTHONPATH.append(os.path.join(self.package_folder, "site-packages"))
            self.runenv_info.append_path("PYTHONPATH", os.path.join(self.package_folder, "plugins"))
            self.env_info.PYTHONPATH.append(os.path.join(self.package_folder, "plugins"))
        else:
            self.runenv_info.append_path("PYTHONPATH", self.source_folder)
            self.env_info.PYTHONPATH.append(self.source_folder)
            self.runenv_info.append_path("PYTHONPATH", os.path.join(self.source_folder, "plugins"))
            self.env_info.PYTHONPATH.append(os.path.join(self.source_folder, "plugins"))
        self.runenv_info.append_path("PYTHONPATH", os.path.join(self.package_folder, "site-packages"))
        self.runenv_info.append_path("PYTHONPATH", os.path.join(self.package_folder, "plugins"))

    def package_id(self):
        self.info.clear()

        # The following options shouldn't be used to determine the hash, since these are only used to set the CuraVersion.py
        # which will also be generated by the deploy method during the `conan install cura/5.1.0@_/_`
        del self.info.options.enterprise
        del self.info.options.staging
        del self.info.options.devtools
        del self.info.options.cloud_api_version
        del self.info.options.display_name
        del self.info.options.cura_debug_mode
        if self.options.get_safe("enable_i18n", False):
            del self.info.options.enable_i18n

        # TODO: Use the hash of requirements.txt and requirements-ultimaker.txt, because changing these will actually result in a different
        # Cura. This is needed because the requirements.txt aren't managed by Conan and therefore not resolved in the package_id. This isn't
        # ideal, but an acceptable solution for now.
        self.info.options.rm_safe("i18n_extract")
@@ -1,4 +1,4 @@
# Copyright (c) 2018 Ultimaker B.V.
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

from typing import Tuple, Optional, TYPE_CHECKING, Dict, Any

@@ -9,14 +9,10 @@ if TYPE_CHECKING:

class Backups:
    """The back-ups API provides a version-proof bridge between Cura's

    BackupManager and plug-ins that hook into it.
    """The back-ups API provides a version-proof bridge between Cura's BackupManager and plug-ins that hook into it.

    Usage:

    .. code-block:: python

        from cura.API import CuraAPI
        api = CuraAPI()
        api.backups.createBackup()
@@ -26,19 +22,22 @@ class Backups:
    def __init__(self, application: "CuraApplication") -> None:
        self.manager = BackupsManager(application)

    def createBackup(self) -> Tuple[Optional[bytes], Optional[Dict[str, Any]]]:
    def createBackup(self, available_remote_plugins: frozenset[str] = frozenset()) -> Tuple[Optional[bytes], Optional[Dict[str, Any]]]:
        """Create a new back-up using the BackupsManager.

        :param available_remote_plugins: Plugins that can be re-downloaded from the Marketplace on restore, so they can be left out of the back-up.
        :return: Tuple containing a ZIP file with the back-up data and a dict with metadata about the back-up.
        """

        return self.manager.createBackup()
        return self.manager.createBackup(available_remote_plugins)

    def restoreBackup(self, zip_file: bytes, meta_data: Dict[str, Any]) -> None:
    def restoreBackup(self, zip_file: bytes, meta_data: Dict[str, Any], auto_close: bool = True) -> None:
        """Restore a back-up using the BackupsManager.

        :param zip_file: A ZIP file containing the actual back-up data.
        :param meta_data: Some metadata needed for restoring a back-up, like the Cura version number.
        :param auto_close: Whether Cura should close immediately after the back-up has been restored.
        """

        return self.manager.restoreBackup(zip_file, meta_data)
        return self.manager.restoreBackup(zip_file, meta_data, auto_close=auto_close)

    def shouldReinstallDownloadablePlugins(self) -> bool:
        return self.manager.shouldReinstallDownloadablePlugins()
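A minimal usage sketch of the updated API, assuming a running CuraApplication; the plugin id "OctoPrintPlugin" is only an illustrative placeholder:

```python
from cura.API import CuraAPI

api = CuraAPI()

# Plugins in this set may be re-downloaded from the Marketplace on restore,
# so they can be left out of the back-up archive itself.
zip_file, meta_data = api.backups.createBackup(available_remote_plugins = frozenset({"OctoPrintPlugin"}))

# Restore without closing Cura immediately afterwards.
if zip_file is not None and meta_data is not None:
    api.backups.restoreBackup(zip_file, meta_data, auto_close = False)
```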
@@ -3,7 +3,7 @@

from dataclasses import asdict

from typing import cast, Dict, TYPE_CHECKING
from typing import cast, Dict, TYPE_CHECKING, Any

from UM.Settings.InstanceContainer import InstanceContainer
from UM.Settings.SettingFunction import SettingFunction

@@ -54,6 +54,15 @@ class Settings:

        return self.application.getSidebarCustomMenuItems()

    def getAllGlobalSettings(self) -> Dict[str, Any]:
        global_stack = cast(GlobalStack, self.application.getGlobalContainerStack())

        all_settings = {}
        for setting in global_stack.getAllKeys():
            all_settings[setting] = self._retrieveValue(global_stack, setting)

        return all_settings
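A short sketch of how a plug-in might call the new method, assuming the Settings API is reachable as `CuraAPI().interface.settings` and that a global stack is active:

```python
from cura.API import CuraAPI

api = CuraAPI()
all_settings = api.interface.settings.getAllGlobalSettings()
# Values come back fully evaluated: SettingFunction entries are resolved
# by _retrieveValue, so these are plain Python values.
print(all_settings.get("layer_height"))
```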
    def getSliceMetadata(self) -> Dict[str, Dict[str, Dict[str, str]]]:
        """Get all changed settings and all settings, for each extruder and the global stack."""
        print_information = self.application.getPrintInformation()

@@ -71,24 +80,16 @@ class Settings:
            "quality": asdict(machine_manager.activeQualityDisplayNameMap()),
        }

        def _retrieveValue(container: InstanceContainer, setting_: str):
            value_ = container.getProperty(setting_, "value")
            for _ in range(0, 1024):  # Prevent a possibly endless loop by using a limit.
                if not isinstance(value_, SettingFunction):
                    return value_  # Success!
                value_ = value_(container)
            return 0  # Fallback value after breaking the possibly endless loop.

        global_stack = cast(GlobalStack, self.application.getGlobalContainerStack())

        # Add global user or quality changes
        global_flattened_changes = InstanceContainer.createMergedInstanceContainer(global_stack.userChanges, global_stack.qualityChanges)
        for setting in global_flattened_changes.getAllKeys():
            settings["global"]["changes"][setting] = _retrieveValue(global_flattened_changes, setting)
            settings["global"]["changes"][setting] = self._retrieveValue(global_flattened_changes, setting)

        # Get global all settings values without user or quality changes
        for setting in global_stack.getAllKeys():
            settings["global"]["all_settings"][setting] = _retrieveValue(global_stack, setting)
            settings["global"]["all_settings"][setting] = self._retrieveValue(global_stack, setting)

        for i, extruder in enumerate(global_stack.extruderList):
            # Add extruder fields to settings dictionary

@@ -100,10 +101,19 @@ class Settings:
            # Add extruder user or quality changes
            extruder_flattened_changes = InstanceContainer.createMergedInstanceContainer(extruder.userChanges, extruder.qualityChanges)
            for setting in extruder_flattened_changes.getAllKeys():
                settings[f"extruder_{i}"]["changes"][setting] = _retrieveValue(extruder_flattened_changes, setting)
                settings[f"extruder_{i}"]["changes"][setting] = self._retrieveValue(extruder_flattened_changes, setting)

            # Get extruder all settings values without user or quality changes
            for setting in extruder.getAllKeys():
                settings[f"extruder_{i}"]["all_settings"][setting] = _retrieveValue(extruder, setting)
                settings[f"extruder_{i}"]["all_settings"][setting] = self._retrieveValue(extruder, setting)

        return settings

    @staticmethod
    def _retrieveValue(container: InstanceContainer, setting_: str):
        value_ = container.getProperty(setting_, "value")
        for _ in range(0, 1024):  # Prevent a possibly endless loop by using a limit.
            if not isinstance(value_, SettingFunction):
                return value_  # Success!
            value_ = value_(container)
        return 0  # Fallback value after breaking the possibly endless loop.
@@ -14,7 +14,7 @@ DEFAULT_CURA_LATEST_URL = "https://software.ultimaker.com/latest.json"
# Each release has a fixed SDK version coupled with it. It doesn't make sense to make it configurable because, for
# example, Cura 3.2 with SDK version 6.1 will not work. So the SDK version is hard-coded here and left out of the
# CuraVersion.py.in template.
CuraSDKVersion = "8.9.0"
CuraSDKVersion = "8.10.0"

try:
    from cura.CuraVersion import CuraLatestURL
@@ -40,7 +40,7 @@ class ArrangeObjectsJob(Job):

        found_solution_for_all = False
        try:
            found_solution_for_all = arranger.arrange()
            found_solution_for_all = arranger.arrange(only_if_full_success = True)
        except:  # If the thread crashes, the message should still close
            Logger.logException("e",
                                "Unable to arrange the objects on the buildplate. The arrange algorithm has crashed.")
@@ -16,12 +16,16 @@ class Arranger:
        """
        raise NotImplementedError

    def arrange(self, add_new_nodes_in_scene: bool = False) -> bool:
    def arrange(self, add_new_nodes_in_scene: bool = False, only_if_full_success: bool = False) -> bool:
        """
        Find placement for a set of scene nodes, and move them by using a single grouped operation.
        :param add_new_nodes_in_scene: Whether to create new scene nodes before applying the transformations and rotations
        :param only_if_full_success: Whether to push the grouped operation only if all the objects fit on the build plate
        :return: found_solution_for_all: Whether the algorithm found a place on the buildplate for all the objects
        """
        grouped_operation, not_fit_count = self.createGroupOperationForArrange(add_new_nodes_in_scene)
        grouped_operation.push()
        return not_fit_count == 0
        full_success = not_fit_count == 0

        if full_success or not only_if_full_success:
            grouped_operation.push()

        return full_success
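A sketch of the new all-or-nothing mode, reusing the `Nest2DArrange` call shape from elsewhere in this change set; `nodes`, `build_volume` and `fixed_nodes` are assumed to be prepared by the caller:

```python
arranger = Nest2DArrange(nodes, build_volume, fixed_nodes, factor = 1000)
# With only_if_full_success = True the grouped operation is only pushed when
# every node fits, so a failed arrange leaves the scene untouched.
if not arranger.arrange(only_if_full_success = True):
    print("Not all objects fit on the build plate; nothing was moved.")
```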
@@ -54,22 +54,6 @@ class Nest2DArrange(Arranger):
        machine_depth = self._build_volume.getDepth() - (edge_disallowed_size * 2)
        build_plate_bounding_box = Box(int(machine_width * self._factor), int(machine_depth * self._factor))

        if self._fixed_nodes is None:
            self._fixed_nodes = []

        # Add all the items we want to arrange
        node_items = []
        for node in self._nodes_to_arrange:
            hull_polygon = node.callDecoration("getConvexHull")
            if not hull_polygon or hull_polygon.getPoints is None:
                Logger.log("w", "Object {} cannot be arranged because it has no convex hull.".format(node.getName()))
                continue
            converted_points = []
            for point in hull_polygon.getPoints():
                converted_points.append(Point(int(point[0] * self._factor), int(point[1] * self._factor)))
            item = Item(converted_points)
            node_items.append(item)

        # Use a tiny margin for the build_plate_polygon (the nesting doesn't like overlapping disallowed areas)
        half_machine_width = 0.5 * machine_width - 1
        half_machine_depth = 0.5 * machine_depth - 1
@@ -80,40 +64,66 @@ class Nest2DArrange(Arranger):
            [half_machine_width, half_machine_depth]
        ], numpy.float32))

        disallowed_areas = self._build_volume.getDisallowedAreas()
        for area in disallowed_areas:
            converted_points = []
        def _convert_points(points):
            if points is not None and len(points) > 2:  # numpy array has to be explicitly checked against None
                converted_points = []
                for point in points:
                    converted_points.append(Point(int(point[0] * self._factor), int(point[1] * self._factor)))
                return [converted_points]
            else:
                return []

        polygons_nodes_to_arrange = []
        for node in self._nodes_to_arrange:
            hull_polygon = node.callDecoration("getConvexHull")
            if not hull_polygon or hull_polygon.getPoints is None:
                Logger.log("w", "Object {} cannot be arranged because it has no convex hull.".format(node.getName()))
                continue

            polygons_nodes_to_arrange += _convert_points(hull_polygon.getPoints())

        polygons_disallowed_areas = []
        for area in self._build_volume.getDisallowedAreas():
            # Clip the disallowed areas so that they don't overlap the bounding box (The arranger chokes otherwise)
            clipped_area = area.intersectionConvexHulls(build_plate_polygon)

            if clipped_area.getPoints() is not None and len(
                    clipped_area.getPoints()) > 2:  # numpy array has to be explicitly checked against None
                for point in clipped_area.getPoints():
                    converted_points.append(Point(int(point[0] * self._factor), int(point[1] * self._factor)))
            polygons_disallowed_areas += _convert_points(clipped_area.getPoints())

            disallowed_area = Item(converted_points)
        polygons_fixed_nodes = []
        if self._fixed_nodes is None:
            self._fixed_nodes = []
        for node in self._fixed_nodes:
            hull_polygon = node.callDecoration("getConvexHull")

            if hull_polygon is not None:
                polygons_fixed_nodes += _convert_points(hull_polygon.getPoints())

        strategies = [NfpConfig.Alignment.CENTER,
                      NfpConfig.Alignment.BOTTOM_LEFT,
                      NfpConfig.Alignment.BOTTOM_RIGHT,
                      NfpConfig.Alignment.TOP_LEFT,
                      NfpConfig.Alignment.TOP_RIGHT]
        found_solution_for_all = False
        while not found_solution_for_all and len(strategies) > 0:

            # Add all the items we want to arrange
            node_items = []
            for polygon in polygons_nodes_to_arrange:
                node_items.append(Item(polygon))

            for polygon in polygons_disallowed_areas:
                disallowed_area = Item(polygon)
                disallowed_area.markAsDisallowedAreaInBin(0)
                node_items.append(disallowed_area)

            for node in self._fixed_nodes:
                converted_points = []
                hull_polygon = node.callDecoration("getConvexHull")

                if hull_polygon is not None and hull_polygon.getPoints() is not None and len(
                        hull_polygon.getPoints()) > 2:  # numpy array has to be explicitly checked against None
                    for point in hull_polygon.getPoints():
                        converted_points.append(Point(int(point[0] * self._factor), int(point[1] * self._factor)))
                    item = Item(converted_points)
            for polygon in polygons_fixed_nodes:
                item = Item(polygon)
                item.markAsFixedInBin(0)
                node_items.append(item)

        strategies = [NfpConfig.Alignment.CENTER] * 3 + [NfpConfig.Alignment.BOTTOM_LEFT] * 3
        found_solution_for_all = False
        while not found_solution_for_all and len(strategies) > 0:
            config = NfpConfig()
            config.accuracy = 1.0
            config.alignment = NfpConfig.Alignment.CENTER
            config.alignment = NfpConfig.Alignment.DONT_ALIGN
            config.starting_point = strategies[0]
            strategies = strategies[1:]
@@ -1,5 +1,8 @@
# Copyright (c) 2021 Ultimaker B.V.
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.
import tempfile

import json

import io
import os

@@ -7,12 +10,13 @@ import re
import shutil
from copy import deepcopy
from zipfile import ZipFile, ZIP_DEFLATED, BadZipfile
from typing import Dict, Optional, TYPE_CHECKING, List
from typing import Callable, Dict, Optional, TYPE_CHECKING, List

from UM import i18nCatalog
from UM.Logger import Logger
from UM.Message import Message
from UM.Platform import Platform
from UM.PluginRegistry import PluginRegistry
from UM.Resources import Resources
from UM.Version import Version
@@ -30,6 +34,7 @@ class Backup:
    """These files should be ignored when making a backup."""

    IGNORED_FOLDERS = []  # type: List[str]
    """These folders should be ignored when making a backup."""

    SECRETS_SETTINGS = ["general/ultimaker_auth_data"]
    """Secret preferences that need to be obfuscated when making a backup of Cura"""

@@ -42,7 +47,7 @@ class Backup:
        self.zip_file = zip_file  # type: Optional[bytes]
        self.meta_data = meta_data  # type: Optional[Dict[str, str]]

    def makeFromCurrent(self) -> None:
    def makeFromCurrent(self, available_remote_plugins: frozenset[str] = frozenset()) -> None:
        """Create a back-up from the current user config folder."""

        cura_release = self._application.getVersion()

@@ -68,7 +73,7 @@ class Backup:

        # Create an empty buffer and write the archive to it.
        buffer = io.BytesIO()
        archive = self._makeArchive(buffer, version_data_dir)
        archive = self._makeArchive(buffer, version_data_dir, available_remote_plugins)
        if archive is None:
            return
        files = archive.namelist()

@@ -77,9 +82,7 @@ class Backup:
        machine_count = max(len([s for s in files if "machine_instances/" in s]) - 1, 0)  # If people delete their profiles but not their preferences, it can still make a backup, and report -1 profiles. Server crashes on this.
        material_count = max(len([s for s in files if "materials/" in s]) - 1, 0)
        profile_count = max(len([s for s in files if "quality_changes/" in s]) - 1, 0)
        # We don't store plugins anymore, since if you can make backups, you have an account (and the plugins are
        # on the marketplace anyway)
        plugin_count = 0
        plugin_count = len([s for s in files if "plugin.json" in s])
        # Store the archive and metadata so the BackupManager can fetch them when needed.
        self.zip_file = buffer.getvalue()
        self.meta_data = {
@@ -92,22 +95,72 @@ class Backup:
        # Restore the obfuscated settings
        self._illuminate(**secrets)

    def _makeArchive(self, buffer: "io.BytesIO", root_path: str) -> Optional[ZipFile]:
    def _fillToInstallsJson(self, file_path: str, reinstall_on_restore: frozenset[str], add_to_archive: Callable[[str, str], None]) -> Optional[str]:
        """ Moves all plugin-data (in a config-file) for plugins that could be (re)installed from the Marketplace from
        'installed' to 'to_install' before adding that file to the archive.

        Note that the 'filename'-entry in the package-data (of the plugins) might not be valid anymore on restore.
        We'll replace it on restore instead, as that's the time when the new package is downloaded.

        :param file_path: Absolute path to the packages-file.
        :param reinstall_on_restore: A set of plugins that _can_ be reinstalled from the Marketplace.
        :param add_to_archive: A function/lambda that takes a filename and adds it to the archive (as the 2nd name).
        """
        with open(file_path, "r") as file:
            data = json.load(file)
            reinstall, keep_in = {}, {}
            for install_id, install_info in data["installed"].items():
                (reinstall if install_id in reinstall_on_restore else keep_in)[install_id] = install_info
            data["installed"] = keep_in
            data["to_install"].update(reinstall)
            if data is not None:
                tmpfile = tempfile.NamedTemporaryFile(delete_on_close=False)
                with open(tmpfile.name, "w") as outfile:
                    json.dump(data, outfile)
                add_to_archive(tmpfile.name, file_path)
                return tmpfile.name
        return None
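To illustrate what `_fillToInstallsJson` does to a packages.json, here is the same move performed on hypothetical data (the plugin ids are made up):

```python
data = {
    "installed": {
        "SomeBundledPlugin": {"package_info": "..."},      # stays in the back-up
        "SomeMarketplacePlugin": {"package_info": "..."},  # re-downloadable
    },
    "to_install": {},
}
reinstall_on_restore = frozenset({"SomeMarketplacePlugin"})

reinstall, keep_in = {}, {}
for install_id, install_info in data["installed"].items():
    (reinstall if install_id in reinstall_on_restore else keep_in)[install_id] = install_info
data["installed"] = keep_in
data["to_install"].update(reinstall)

# "installed" now only holds SomeBundledPlugin; SomeMarketplacePlugin has moved
# to "to_install" and will be fetched again from the Marketplace on restore.
```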
    def _findRedownloadablePlugins(self, available_remote_plugins: frozenset) -> (frozenset[str], frozenset[str]):
        """ Find all plugins that should be able to be reinstalled from the Marketplace.

        :param available_remote_plugins: Plugin ids that are available to this user on the Marketplace.
        :return: Tuple of a set of plugin-ids and a set of plugin-paths.
        """
        plugin_reg = PluginRegistry.getInstance()
        id = "id"
        plugins = [v for v in plugin_reg.getAllMetaData()
                   if v[id] in available_remote_plugins and not plugin_reg.isBundledPlugin(v[id])]
        return frozenset([v[id] for v in plugins]), frozenset([v["location"] for v in plugins])

    def _makeArchive(self, buffer: "io.BytesIO", root_path: str, available_remote_plugins: frozenset) -> Optional[ZipFile]:
        """Make a full archive from the given root path with the given name.

        :param root_path: The root directory to archive recursively.
        :return: The ZipFile archive, or None if creating it failed.
        """
        ignore_string = re.compile("|".join(self.IGNORED_FILES + self.IGNORED_FOLDERS))
        reinstall_instead_ids, reinstall_instead_paths = self._findRedownloadablePlugins(available_remote_plugins)
        tmpfiles = []
        try:
            archive = ZipFile(buffer, "w", ZIP_DEFLATED)
            for root, folders, files in os.walk(root_path):
            add_path_to_archive = lambda path, alt_path: archive.write(path, alt_path[len(root_path) + len(os.sep):])
            for root, folders, files in os.walk(root_path, topdown=True):
                for item_name in folders + files:
                    absolute_path = os.path.join(root, item_name)
                    if ignore_string.search(absolute_path):
                    if ignore_string.search(absolute_path) or any([absolute_path.startswith(x) for x in reinstall_instead_paths]):
                        continue
                    archive.write(absolute_path, absolute_path[len(root_path) + len(os.sep):])
                    if item_name == "packages.json":
                        tmpfiles.append(
                            self._fillToInstallsJson(absolute_path, reinstall_instead_ids, add_path_to_archive))
                    else:
                        add_path_to_archive(absolute_path, absolute_path)
            archive.close()
            for tmpfile_path in tmpfiles:
                try:
                    os.remove(tmpfile_path)
                except IOError as ex:
                    Logger.warning(f"Couldn't remove temporary file '{tmpfile_path}' because '{ex}'.")
            return archive
        except (IOError, OSError, BadZipfile) as error:
            Logger.log("e", "Could not create archive from user data directory: %s", error)
@@ -1,4 +1,4 @@
# Copyright (c) 2018 Ultimaker B.V.
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

from typing import Dict, Optional, Tuple, TYPE_CHECKING

@@ -22,7 +22,10 @@ class BackupsManager:
    def __init__(self, application: "CuraApplication") -> None:
        self._application = application

    def createBackup(self) -> Tuple[Optional[bytes], Optional[Dict[str, str]]]:
    def shouldReinstallDownloadablePlugins(self) -> bool:
        return True

    def createBackup(self, available_remote_plugins: frozenset[str] = frozenset()) -> Tuple[Optional[bytes], Optional[Dict[str, str]]]:
        """
        Get a back-up of the current configuration.

@@ -31,17 +34,18 @@ class BackupsManager:

        self._disableAutoSave()
        backup = Backup(self._application)
        backup.makeFromCurrent()
        backup.makeFromCurrent(available_remote_plugins if self.shouldReinstallDownloadablePlugins() else frozenset())
        self._enableAutoSave()
        # We don't return a Backup here because we want plugins only to interact with our API and not full objects.
        return backup.zip_file, backup.meta_data

    def restoreBackup(self, zip_file: bytes, meta_data: Dict[str, str]) -> None:
    def restoreBackup(self, zip_file: bytes, meta_data: Dict[str, str], auto_close: bool = True) -> None:
        """
        Restore a back-up from a given ZipFile.

        :param zip_file: A bytes object containing the actual back-up.
        :param meta_data: A dict containing some metadata that is needed to restore the back-up correctly.
        :param auto_close: Whether to close Cura after restoring; normally, Cura will need to close immediately after restoring the back-up.
        """

        if not meta_data.get("cura_release", None):

@@ -54,7 +58,7 @@ class BackupsManager:
        backup = Backup(self._application, zip_file = zip_file, meta_data = meta_data)
        restored = backup.restore()

        if restored:
        if restored and auto_close:
            # At this point, Cura will need to restart for the changes to take effect.
            # We don't want to store the data at this point, as that would overwrite the just-restored backup.
            self._application.windowClosed(save_data = False)
@@ -252,19 +252,23 @@ class BuildVolume(SceneNode):
        if not self.getMeshData() or not self.isVisible():
            return True

        theme = self._application.getTheme()
        if not self._shader:
            self._shader = OpenGL.getInstance().createShaderProgram(Resources.getPath(Resources.Shaders, "default.shader"))
            self._grid_shader = OpenGL.getInstance().createShaderProgram(Resources.getPath(Resources.Shaders, "grid.shader"))
            theme = self._application.getTheme()
            self._grid_shader.setUniformValue("u_plateColor", Color(*theme.getColor("buildplate").getRgb()))
            self._grid_shader.setUniformValue("u_gridColor0", Color(*theme.getColor("buildplate_grid").getRgb()))
            self._grid_shader.setUniformValue("u_gridColor1", Color(*theme.getColor("buildplate_grid_minor").getRgb()))

        plate_color = Color(*theme.getColor("buildplate").getRgb())
        if self._global_container_stack.getMetaDataEntry("has_textured_buildplate", False):
            plate_color.setA(0.5)
        self._grid_shader.setUniformValue("u_plateColor", plate_color)

        renderer.queueNode(self, mode = RenderBatch.RenderMode.Lines)
        renderer.queueNode(self, mesh = self._origin_mesh, backface_cull = True)
        renderer.queueNode(self, mesh = self._grid_mesh, shader = self._grid_shader, backface_cull = True)
        renderer.queueNode(self, mesh = self._grid_mesh, shader = self._grid_shader, backface_cull = True, transparent = True, sort = -10)
        if self._disallowed_area_mesh:
            renderer.queueNode(self, mesh = self._disallowed_area_mesh, shader = self._shader, transparent = True, backface_cull = True, sort = -9)
            renderer.queueNode(self, mesh = self._disallowed_area_mesh, shader = self._shader, transparent = True, backface_cull = True, sort = -5)

        if self._error_mesh:
            renderer.queueNode(self, mesh=self._error_mesh, shader=self._shader, transparent=True,
@@ -110,6 +110,7 @@ from cura.UI.MachineActionManager import MachineActionManager
from cura.UI.AddPrinterPagesModel import AddPrinterPagesModel
from cura.UI.MachineSettingsManager import MachineSettingsManager
from cura.UI.ObjectsModel import ObjectsModel
from cura.UI.OpenSourceDependenciesModel import OpenSourceDependenciesModel
from cura.UI.RecommendedMode import RecommendedMode
from cura.UI.TextManager import TextManager
from cura.UI.WelcomePagesModel import WelcomePagesModel

@@ -139,7 +140,7 @@ class CuraApplication(QtApplication):
    # SettingVersion represents the set of settings available in the machine/extruder definitions.
    # Make sure that this version number is increased whenever there is a non-backwards-compatible
    # change to the settings.
    SettingVersion = 24
    SettingVersion = 25

    Created = False

@@ -187,6 +188,7 @@ class CuraApplication(QtApplication):

        self._single_instance = None
        self._open_project_mode: Optional[str] = None
        self._read_operation_is_project_file: Optional[bool] = None

        self._cura_formula_functions = None  # type: Optional[CuraFormulaFunctions]
@@ -1308,6 +1310,7 @@ class CuraApplication(QtApplication):
        qmlRegisterType(AddPrinterPagesModel, "Cura", 1, 0, "AddPrinterPagesModel")
        qmlRegisterType(TextManager, "Cura", 1, 0, "TextManager")
        qmlRegisterType(RecommendedMode, "Cura", 1, 0, "RecommendedMode")
        qmlRegisterType(OpenSourceDependenciesModel, "Cura", 1, 0, "OpenSourceDependenciesModel")

        self.processEvents()
        qmlRegisterType(NetworkMJPGImage, "Cura", 1, 0, "NetworkMJPGImage")
@@ -1895,23 +1898,20 @@ class CuraApplication(QtApplication):
        def on_finish(response):
            content_disposition_header_key = QByteArray("content-disposition".encode())

            if not response.hasRawHeader(content_disposition_header_key):
                Logger.log("w", "Could not find Content-Disposition header in response from {0}".format(
                    model_url.url()))
                # Use the last part of the url as the filename, and assume it is an STL file
                filename = model_url.path().split("/")[-1] + ".stl"
            else:
            filename = model_url.path().split("/")[-1] + ".stl"

            if response.hasRawHeader(content_disposition_header_key):
                # content_disposition is in the format
                # ```
                # content_disposition attachment; "filename=[FILENAME]"
                # content_disposition attachment; filename="[FILENAME]"
                # ```
                # Use a regex to extract the filename
                content_disposition = str(response.rawHeader(content_disposition_header_key).data(),
                                          encoding='utf-8')
                content_disposition_match = re.match(r'attachment; filename="(?P<filename>.*)"',
                content_disposition_match = re.match(r'attachment; filename=(?P<filename>.*)',
                                                     content_disposition)
                assert content_disposition_match is not None
                filename = content_disposition_match.group("filename")
                if content_disposition_match is not None:
                    filename = content_disposition_match.group("filename").strip("\"")

            tmp = tempfile.NamedTemporaryFile(suffix=filename, delete=False)
            with open(tmp.name, "wb") as f:
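The relaxed pattern accepts both quoted and unquoted filenames; a quick self-contained check of the behaviour:

```python
import re

for header in ('attachment; filename="model.stl"', "attachment; filename=model.stl"):
    match = re.match(r'attachment; filename=(?P<filename>.*)', header)
    if match is not None:
        # The surrounding quotes, when present, are stripped afterwards.
        print(match.group("filename").strip("\""))  # -> model.stl in both cases
```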
@@ -2016,18 +2016,18 @@ class CuraApplication(QtApplication):
                    self.deleteAll()
                break

        is_project_file = self.checkIsValidProjectFile(file)
        self._read_operation_is_project_file = self.checkIsValidProjectFile(file)

        if self._open_project_mode is None:
            self._open_project_mode = self.getPreferences().getValue("cura/choice_on_open_project")

        if is_project_file and self._open_project_mode == "open_as_project":
        if self._read_operation_is_project_file and self._open_project_mode == "open_as_project":
            # open as project immediately without presenting a dialog
            workspace_handler = self.getWorkspaceFileHandler()
            workspace_handler.readLocalFile(file, add_to_recent_files_hint = add_to_recent_files)
            return

        if is_project_file and self._open_project_mode == "always_ask":
        if self._read_operation_is_project_file and self._open_project_mode == "always_ask":
            # present a dialog asking to open as project or import models
            self.callLater(self.openProjectFile.emit, file, add_to_recent_files)
            return

@@ -2165,7 +2165,7 @@ class CuraApplication(QtApplication):
                nodes_to_arrange.append(node)
            # If the file is a project, and models are to be loaded from that project,
            # the models inside the file should be arranged on the build plate.
            elif self._open_project_mode == "open_as_model":
            elif self._read_operation_is_project_file and self._open_project_mode == "open_as_model":
                nodes_to_arrange.append(node)

            # This node is deep copied from some other node which already has a BuildPlateDecorator, but the deepcopy
@@ -72,6 +72,13 @@ class ActiveIntentQualitiesModel(ListModel):
                new_items.append(intent)
                added_quality_type_set.add(intent["quality_type"])

        # If there aren't any possibilities when the Intent is kept the same, attempt to set it 'back' to default.
        current_quality_type = global_stack.quality.getMetaDataEntry("quality_type")
        if len(new_items) == 0 and self._intent_category != "default" and current_quality_type != "not_supported":
            IntentManager.getInstance().selectIntent("default", current_quality_type)
            self._update()
            return

        new_items = sorted(new_items, key=lambda x: x["layer_height"])
        self.setItems(new_items)
@@ -84,6 +84,7 @@ class MultiplyObjectsJob(Job):
        arranger = Nest2DArrange(nodes, Application.getInstance().getBuildVolume(), fixed_nodes, factor=1000)

        group_operation, not_fit_count = arranger.createGroupOperationForArrange(add_new_nodes_in_scene=True)
        found_solution_for_all = not_fit_count == 0

        if nodes_to_add_without_arrange:
            for nested_node in nodes_to_add_without_arrange:
@@ -127,6 +127,7 @@ class AuthorizationRequestHandler(BaseHTTPRequestHandler):
    def _sendHeaders(self, status: "ResponseStatus", content_type: str, redirect_uri: str = None) -> None:
        self.send_response(status.code, status.message)
        self.send_header("Content-type", content_type)
        self.send_header("Strict-Transport-Security", "max-age=900")
        if redirect_uri:
            self.send_header("Location", redirect_uri)
        self.end_headers()
@@ -16,8 +16,6 @@ if TYPE_CHECKING:
import sys
from UM.Platform import Platform
if Platform.isWindows():
    if hasattr(sys, "frozen"):
        import win32timezone
    from keyring.backends.Windows import WinVaultKeyring
    keyring.set_keyring(WinVaultKeyring())
if Platform.isOSX():
@@ -13,7 +13,9 @@ class FormatMaps:
        "fire_e": "ultimaker_method",
        "lava_f": "ultimaker_methodx",
        "magma_10": "ultimaker_methodxl",
        "sketch": "ultimaker_sketch"
        "sketch": "ultimaker_sketch",
        "sketch_large": "ultimaker_sketch_large",
        "sketch_sprint": "ultimaker_sketch_sprint"
    }

    # A map from the extruder-name in their native file-formats to the internal name we use.

@@ -23,7 +25,10 @@ class FormatMaps:
        "mk14_c": "1C",
        "mk14": "1A",
        "mk14_s": "2A",
        "mk14_e": "LABS"
        "mk14_e": "LABS",
        "sketch_extruder": "0.4mm",
        "sketch_l_extruder": "0.4mm",
        "sketch_sprint_extruder": "0.4mm",
    }

    # A map from the material-name in their native file-formats to some info, including the internal name we use.
@@ -1,9 +1,12 @@
# Copyright (c) 2018 Ultimaker B.V.
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

from PyQt6.QtCore import pyqtProperty, QObject, pyqtSignal
from typing import List

from UM.Settings.ContainerRegistry import ContainerRegistry
from UM.Settings.DefinitionContainer import DefinitionContainer

MYPY = False
if MYPY:
    from cura.PrinterOutput.Models.ExtruderConfigurationModel import ExtruderConfigurationModel

@@ -68,6 +71,15 @@ class PrinterConfigurationModel(QObject):
                return True
        return False

    @pyqtProperty("QStringList", constant=True)
    def validCoresForPrinterType(self) -> List[str]:
        printers = ContainerRegistry.getInstance().findContainersMetadata(
            ignore_case=True, type="machine", name=self._printer_type, container_type=DefinitionContainer)
        id = printers[0]["id"] if len(printers) > 0 and "id" in printers[0] else ""
        definitions = ContainerRegistry.getInstance().findContainersMetadata(
            ignore_case=True, type="variant", definition=id+"*")
        return [x["name"] for x in definitions]

    def __str__(self):
        message_chunks = []
        message_chunks.append("Printer type: " + self._printer_type)
@@ -402,6 +402,9 @@ class CuraContainerStack(ContainerStack):

        return super().getProperty(key, property_name, context)

    def getValue(self, key: str, context = None) -> Any:
        return self.getProperty(key, "value", context)


class _ContainerIndexes:
    """Private helper class to keep track of container positions and their types."""
@@ -15,6 +15,7 @@ from UM.Scene.Selection import Selection
from UM.Scene.Iterator.BreadthFirstIterator import BreadthFirstIterator
from UM.Settings.ContainerRegistry import ContainerRegistry  # Finding containers by ID.
from cura.Machines.ContainerTree import ContainerTree
from cura.Settings.ExtruderStack import ExtruderStack

from typing import Any, cast, Dict, List, Optional, TYPE_CHECKING, Union

@@ -257,6 +258,7 @@ class ExtruderManager(QObject):
        limit_to_extruder_feature_list = ["wall_0_extruder_nr",
                                          "wall_x_extruder_nr",
                                          "roofing_extruder_nr",
                                          "flooring_extruder_nr",
                                          "top_bottom_extruder_nr",
                                          "infill_extruder_nr",
                                          ]

@@ -303,6 +305,11 @@ class ExtruderManager(QObject):
            Logger.log("e", "Unable to find one or more of the extruders in %s", used_extruder_stack_ids)
            return []

    def getFirstUsedExtruderStack(self) -> ExtruderStack:
        used_extruders = self.getUsedExtruderStacks()
        sorted_extruders = sorted(used_extruders, key=lambda extruder: extruder.getValue("extruder_nr"))
        return sorted_extruders[0]

    def getInitialExtruderNr(self) -> int:
        """Get the extruder that the print will start with.

@@ -319,8 +326,7 @@ class ExtruderManager(QObject):
            skirt_brim_extruder_nr = global_stack.getProperty("skirt_brim_extruder_nr", "value")
            # if the skirt_brim_extruder_nr is -1, then we use the first used extruder
            if skirt_brim_extruder_nr == -1:
                used_extruders = self.getUsedExtruderStacks()
                return used_extruders[0].position
                return self.getFirstUsedExtruderStack().getValue("extruder_nr")
            else:
                return skirt_brim_extruder_nr
        if adhesion_type == "raft":

@@ -331,7 +337,7 @@ class ExtruderManager(QObject):
            return global_stack.getProperty("support_infill_extruder_nr", "value")

        # REALLY no adhesion? Use the first used extruder.
        return self.getUsedExtruderStacks()[0].getProperty("extruder_nr", "value")
        return self.getFirstUsedExtruderStack().getValue("extruder_nr")

    def removeMachineExtruders(self, machine_id: str) -> None:
        """Removes the container stack and user profile for the extruders for a specific machine.
@@ -1,4 +1,4 @@
# Copyright (c) 2018 Ultimaker B.V.
# Copyright (c) 2024 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

from typing import Any, Dict, TYPE_CHECKING, Optional

@@ -12,11 +12,8 @@ from UM.Settings.ContainerRegistry import ContainerRegistry
from UM.Settings.Interfaces import ContainerInterface, PropertyEvaluationContext
from UM.Util import parseBool

import cura.CuraApplication

from . import Exceptions
from .CuraContainerStack import CuraContainerStack, _ContainerIndexes
from .ExtruderManager import ExtruderManager

if TYPE_CHECKING:
    from cura.Settings.GlobalStack import GlobalStack

@@ -141,7 +138,11 @@ class ExtruderStack(CuraContainerStack):
            context.popContainer()
            return result

        limit_to_extruder = super().getProperty(key, "limit_to_extruder", context)
        if not context:
            context = PropertyEvaluationContext(self)
        if "extruder_position" not in context.context:
            context.context["extruder_position"] = super().getProperty(key, "limit_to_extruder", context)
        limit_to_extruder = context.context["extruder_position"]
        if limit_to_extruder is not None:
            limit_to_extruder = str(limit_to_extruder)
@@ -398,7 +398,8 @@ class MachineManager(QObject):
                self.setVariantByName(extruder.getMetaDataEntry("position"), machine_node.preferred_variant_name)
                variant_node = machine_node.variants.get(machine_node.preferred_variant_name)

            material_node = variant_node.materials.get(extruder.material.getMetaDataEntry("base_file"))
            material_node = variant_node.materials.get(
                extruder.material.getMetaDataEntry("base_file")) if variant_node else None
            if material_node is None:
                Logger.log("w", "An extruder has an unknown material, switching it to the preferred material")
                if not self.setMaterialById(extruder.getMetaDataEntry("position"), machine_node.preferred_material):

@@ -1678,7 +1679,7 @@ class MachineManager(QObject):
            intent_category = self.activeIntentCategory,
            intent_name = IntentCategoryModel.translation(self.activeIntentCategory, "name", self.activeIntentCategory.title()),
            custom_profile = self.activeQualityOrQualityChangesName if global_stack.qualityChanges is not empty_quality_changes_container else None,
            layer_height = self.activeQualityLayerHeight if self.isActiveQualitySupported else None,
            layer_height = float("{:.2f}".format(self.activeQualityLayerHeight)) if self.isActiveQualitySupported else None,
            is_experimental = self.isActiveQualityExperimental and self.isActiveQualitySupported
        )
@@ -1,5 +1,6 @@
# Copyright (c) 2020 Ultimaker B.V.
# Uranium is released under the terms of the LGPLv3 or higher.
import math

from PyQt6.QtCore import Qt, QCoreApplication, QTimer
from PyQt6.QtGui import QPixmap, QColor, QFont, QPen, QPainter

@@ -51,6 +52,7 @@ class CuraSplashScreen(QSplashScreen):
        self._last_update_time = time.time()
        # Since we don't know how much time actually passed, check how many 50 ms intervals have elapsed.
        self._loading_image_rotation_angle -= 10 * (time_since_last_update * 1000 / 50)
        self._loading_image_rotation_angle = math.fmod(self._loading_image_rotation_angle, 360)
        self.repaint()

        # Override the mousePressEvent so the splashscreen doesn't disappear when clicked
23  cura/UI/OpenSourceDependenciesModel.py  (Normal file)
@@ -0,0 +1,23 @@
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

from typing import List

from PyQt6.QtCore import QObject, pyqtProperty

from cura import CuraVersion
from .OpenSourceDependency import OpenSourceDependency


class OpenSourceDependenciesModel(QObject):

    def __init__(self, parent=None):
        super().__init__(parent)
        self._dependencies = []

        for name, data in CuraVersion.DependenciesDescriptions.items():
            self._dependencies.append(OpenSourceDependency(name, data))

    @pyqtProperty(list, constant=True)
    def dependencies(self) -> List[OpenSourceDependency]:
        return self._dependencies
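Besides the QML registration shown earlier in CuraApplication, the model can also be inspected from Python. A sketch, assuming `CuraVersion.DependenciesDescriptions` has been filled in by the generated CuraVersion.py; `OpenSourceDependency` is the helper class defined in the next file:

```python
from cura.UI.OpenSourceDependenciesModel import OpenSourceDependenciesModel

model = OpenSourceDependenciesModel()
for dep in model.dependencies:
    # Each entry is an OpenSourceDependency exposing constant pyqtProperties.
    print(dep.name, dep.version, dep.license)
```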
40  cura/UI/OpenSourceDependency.py  (Normal file)
@@ -0,0 +1,40 @@
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

from PyQt6.QtCore import QObject, pyqtProperty, pyqtEnum


class OpenSourceDependency(QObject):

    def __init__(self, name, data):
        super().__init__()
        self._name = name
        self._version = data['version'] if data['version'] is not None else ''
        self._summary = data['summary'] if data['summary'] is not None else ''
        self._license = data['license'] if data['license'] is not None and len(data['license']) > 0 else name
        self._license_full = data['license_full'] if 'license_full' in data else ''
        self._sources_url = data['sources_url'] if 'sources_url' in data else ''

    @pyqtProperty(str, constant=True)
    def name(self):
        return self._name

    @pyqtProperty(str, constant=True)
    def version(self):
        return self._version

    @pyqtProperty(str, constant=True)
    def summary(self):
        return self._summary

    @pyqtProperty(str, constant=True)
    def license(self):
        return self._license

    @pyqtProperty(str, constant=True)
    def license_full(self):
        return self._license_full

    @pyqtProperty(str, constant=True)
    def sources_url(self):
        return self._sources_url
@@ -449,6 +449,6 @@ class PrintInformation(QObject):
        """If this is a sort of output 'device' (like local or online file storage, rather than a printer),
        the user could have altered the file-name, and thus the project name should be altered as well."""
        if isinstance(output_device, ProjectOutputDevice):
            new_name = output_device.getLastOutputName()
            new_name = output_device.popLastOutputName()
            if new_name is not None:
                self.setJobName(os.path.splitext(os.path.basename(new_name))[0])
@@ -217,8 +217,8 @@ class WelcomePagesModel(ListModel):
    def _getBuiltinWelcomePagePath(page_filename: str) -> QUrl:
        """Convenience function to get the QUrl path to pages located in "resources/qml/WelcomePages"."""
        from cura.CuraApplication import CuraApplication
        return QUrl.fromLocalFile(Resources.getPath(CuraApplication.ResourceTypes.QmlFiles,
                                                    os.path.join("WelcomePages", page_filename)))
        return QUrl.fromLocalFile(
            Resources.getPath(CuraApplication.ResourceTypes.QmlFiles, "WelcomePages", page_filename))

    # FIXME: HACKs for optimization that we don't update the model every time the active machine gets changed.
    def _onActiveMachineChanged(self) -> None:
24  licenses_thirdparty/qt.md  (vendored, Normal file)
@@ -0,0 +1,24 @@
Sources for the Qt modules that Cura uses (links within UltiMaker repositories):

- qtbase: https://github.com/Ultimaker/qtbase
- qtdeclarative: https://github.com/Ultimaker/qtdeclarative
- qtsvg: https://github.com/Ultimaker/qtsvg
- qtshadertools: https://github.com/Ultimaker/qtshadertools
- qtimageformats: https://github.com/Ultimaker/qtimageformats

Sources for Qt modules that Cura doesn't use (and thus aren't necessary for building or running any part of Cura), but that may be shipped along with the product anyway due to our build process (links to off-site repositories):

- qtsensors: https://github.com/qt/qtsensors
- qtmultimedia: https://github.com/qt/qtmultimedia
- qtpositioning: https://github.com/qt/qtpositioning
- qtremoteobjects: https://github.com/qt/qtremoteobjects
- qttexttospeech: https://github.com/qt/qtspeech
- qtwebchannel: https://github.com/qt/qtwebchannel
- qtwebsockets: https://github.com/qt/qtwebsockets
- qtserialport: https://github.com/qt/qtserialport
- qt (5) (linux): https://github.com/qt/qt5

Versions of Qt used in Cura (from Cura 5.9.1 onwards):

- 5.9.x: Qt 6.6.0
- 5.10.x: Qt 6.6.0
@@ -5,11 +5,11 @@ GenericName=3D Printing Software
GenericName[de]=3D-Druck-Software
GenericName[nl]=3D-Print Software
Comment=Cura converts 3D models into paths for a 3D printer. It prepares your print for maximum accuracy, minimum printing time and good reliability with many extra features that make your print come out great.
Exec=UltiMaker-Cura %F
Exec=UltiMaker-Cura %U
Icon=cura-icon
Terminal=false
Type=Application
MimeType=model/stl;application/vnd.ms-3mfdocument;application/prs.wavefront-obj;image/bmp;image/gif;image/jpeg;image/png;text/x-gcode;application/x-amf;application/x-ply;application/x-ctm;model/vnd.collada+xml;model/gltf-binary;model/gltf+json;model/vnd.collada+xml+zip;
MimeType=model/stl;application/vnd.ms-3mfdocument;model/3mf;application/prs.wavefront-obj;image/bmp;image/gif;image/jpeg;image/png;text/x-gcode;application/x-amf;application/x-ply;application/x-ctm;model/vnd.collada+xml;model/gltf-binary;model/gltf+json;model/vnd.collada+xml+zip;x-scheme-handler/cura;
Categories=Graphics;
Keywords=3D;Printing;
X-AppImage-Version={{ cura_version }}
@@ -25,7 +25,8 @@ def build_dmg(source_path: str, dist_path: str, filename: str, app_name: str) ->
                 f"{dist_path}/{filename}",
                 f"{dist_path}/{app_name}"]

    subprocess.run(arguments)
    print(f"Run create dmg command [{" ".join([str(arg) for arg in arguments])}]")
    subprocess.run(arguments, check=True)


def build_pkg(dist_path: str, app_filename: str, component_filename: str, cura_version: str, installer_filename: str) -> None:

@@ -56,7 +57,8 @@ def build_pkg(dist_path: str, app_filename: str, component_filename: str, cura_v
    else:
        print("CODESIGN_IDENTITY missing. The installer is not being signed")

    subprocess.run(pkg_build_arguments)
    print(f"Run package build command [{" ".join([str(arg) for arg in pkg_build_arguments])}]")
    subprocess.run(pkg_build_arguments, check=True)

    # This automatically generates a distribution.xml file that is used to build the installer.
    # If you want to make any changes to how the installer functions, this file should be changed to do that.

@@ -67,7 +69,8 @@ def build_pkg(dist_path: str, app_filename: str, component_filename: str, cura_v
        "--package", Path(dist_path, component_filename),  # Package that will be inside installer
        Path(dist_path, "distribution.xml"),  # Output location for synthesized distribution file
    ]
    subprocess.run(distribution_creation_arguments)
    print(f"Run distribution creation command [{" ".join([str(arg) for arg in distribution_creation_arguments])}]")
    subprocess.run(distribution_creation_arguments, check=True)

    # This creates the distributable package (Installer)
    installer_creation_arguments = [

@@ -80,7 +83,8 @@ def build_pkg(dist_path: str, app_filename: str, component_filename: str, cura_v
    if codesign_identity:
        installer_creation_arguments.extend(["--sign", codesign_identity])

    subprocess.run(installer_creation_arguments)
    print(f"Run installer creation command [{" ".join([str(arg) for arg in installer_creation_arguments])}]")
    subprocess.run(installer_creation_arguments, check=True)


def notarize_file(dist_path: str, filename: str) -> None:

@@ -99,7 +103,8 @@ def notarize_file(dist_path: str, filename: str) -> None:
        Path(dist_path, filename)
    ]

    subprocess.run(notarize_arguments)
    print(f"Run notarize command [{" ".join([str(arg) for arg in notarize_arguments])}]")
    subprocess.run(notarize_arguments, check=True)


def create_pkg_installer(filename: str, dist_path: str, cura_version: str, app_name: str) -> None:

@@ -149,7 +154,7 @@ if __name__ == "__main__":
    parser.add_argument("--app_name", required = True, type = str, help = "Filename of the .app that will be contained within the dmg/pkg")
    args = parser.parse_args()

    cura_version = args.cura_conan_version.split("/")[-1]
    cura_version = args.cura_conan_version.replace("+", "-")  # + is not allowed for bundle identifier

    app_name = f"{args.app_name}.app"
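The net effect of the new version handling, as a one-line example ("+" is not allowed in a macOS bundle identifier, so the build-metadata separator is rewritten):

```python
cura_version = "5.9.0-beta.1+24132".replace("+", "-")  # -> "5.9.0-beta.1-24132"
```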
@@ -1,9 +1,8 @@
# Copyright (c) 2022 UltiMaker B.V.
# Copyright (c) 2025 UltiMaker
# Cura's build system is released under the terms of the AGPLv3 or higher.

!define APP_NAME "{{ app_name }}"
!define COMP_NAME "{{ company }}"
!define WEB_SITE "{{ web_site }}"
!define VERSION "{{ version }}"
!define VIVERSION "{{ version_major }}.{{ version_minor }}.{{ version_patch }}.0"
!define COPYRIGHT "Copyright (c) {{ year }} {{ company }}"

@@ -16,13 +15,11 @@
!define REG_APP_PATH "Software\Microsoft\Windows\CurrentVersion\App Paths\${APP_NAME}-${VERSION}"
!define UNINSTALL_PATH "Software\Microsoft\Windows\CurrentVersion\Uninstall\${APP_NAME}-${VERSION}"

!define REG_START_MENU "Start Menu Folder"
!define REG_START_MENU "Start Menu Shortcut"

;Require administrator access
RequestExecutionLevel admin

var SM_Folder

######################################################################

VIProductVersion "${VIVERSION}"

@@ -64,11 +61,9 @@ InstallDir "$PROGRAMFILES64\${APP_NAME}"

!ifdef REG_START_MENU
!define MUI_STARTMENUPAGE_NODISABLE
!define MUI_STARTMENUPAGE_DEFAULTFOLDER "UltiMaker Cura"
!define MUI_STARTMENUPAGE_REGISTRY_ROOT "${REG_ROOT}"
!define MUI_STARTMENUPAGE_REGISTRY_KEY "${UNINSTALL_PATH}"
!define MUI_STARTMENUPAGE_REGISTRY_VALUENAME "${REG_START_MENU}"
!insertmacro MUI_PAGE_STARTMENU Application $SM_Folder
!endif

!insertmacro MUI_PAGE_INSTFILES
|
|||
WriteUninstaller "$INSTDIR\uninstall.exe"
|
||||
|
||||
!ifdef REG_START_MENU
|
||||
!insertmacro MUI_STARTMENU_WRITE_BEGIN Application
|
||||
CreateDirectory "$SMPROGRAMS\$SM_Folder"
|
||||
CreateShortCut "$SMPROGRAMS\$SM_Folder\${APP_NAME}.lnk" "$INSTDIR\${MAIN_APP_EXE}"
|
||||
CreateShortCut "$SMPROGRAMS\$SM_Folder\Uninstall ${APP_NAME}.lnk" "$INSTDIR\uninstall.exe"
|
||||
|
||||
!ifdef WEB_SITE
|
||||
WriteIniStr "$INSTDIR\UltiMaker Cura website.url" "InternetShortcut" "URL" "${WEB_SITE}"
|
||||
CreateShortCut "$SMPROGRAMS\$SM_Folder\UltiMaker Cura website.lnk" "$INSTDIR\UltiMaker Cura website.url"
|
||||
!endif
|
||||
!insertmacro MUI_STARTMENU_WRITE_END
|
||||
CreateShortCut "$SMPROGRAMS\${APP_NAME}.lnk" "$INSTDIR\${MAIN_APP_EXE}"
|
||||
!endif
|
||||
|
||||
!ifndef REG_START_MENU
|
||||
CreateDirectory "$SMPROGRAMS\{{ app_name }}"
|
||||
CreateShortCut "$SMPROGRAMS\{{ app_name }}\${APP_NAME}.lnk" "$INSTDIR\${MAIN_APP_EXE}"
|
||||
CreateShortCut "$SMPROGRAMS\{{ app_name }}\Uninstall ${APP_NAME}.lnk" "$INSTDIR\uninstall.exe"
|
||||
|
||||
!ifdef WEB_SITE
|
||||
WriteIniStr "$INSTDIR\UltiMaker Cura website.url" "InternetShortcut" "URL" "${WEB_SITE}"
|
||||
CreateShortCut "$SMPROGRAMS\{{ app_name }}\UltiMaker Cura website.lnk" "$INSTDIR\UltiMaker Cura website.url"
|
||||
!endif
|
||||
CreateShortCut "$SMPROGRAMS\${APP_NAME}.lnk" "$INSTDIR\${MAIN_APP_EXE}"
|
||||
!endif
|
||||
|
||||
WriteRegStr ${REG_ROOT} "${REG_APP_PATH}" "" "$INSTDIR\${MAIN_APP_EXE}"
|
||||
|
@ -138,9 +117,6 @@ WriteRegStr ${REG_ROOT} "${UNINSTALL_PATH}" "DisplayIcon" "$INSTDIR\${MAIN_APP_
|
|||
WriteRegStr ${REG_ROOT} "${UNINSTALL_PATH}" "DisplayVersion" "${VERSION}"
|
||||
WriteRegStr ${REG_ROOT} "${UNINSTALL_PATH}" "Publisher" "${COMP_NAME}"
|
||||
|
||||
!ifdef WEB_SITE
|
||||
WriteRegStr ${REG_ROOT} "${UNINSTALL_PATH}" "URLInfoAbout" "${WEB_SITE}"
|
||||
!endif
|
||||
SectionEnd
|
||||
|
||||
######################################################################
|
||||
|
@ -177,29 +153,17 @@ RmDir "$INSTDIR\share\uranium"
|
|||
RmDir "$INSTDIR\share"
|
||||
|
||||
Delete "$INSTDIR\uninstall.exe"
|
||||
!ifdef WEB_SITE
|
||||
Delete "$INSTDIR\${APP_NAME} website.url"
|
||||
!endif
|
||||
|
||||
RmDir /r /REBOOTOK "$INSTDIR"
|
||||
|
||||
!ifdef REG_START_MENU
|
||||
!insertmacro MUI_STARTMENU_GETFOLDER "Application" $SM_Folder
|
||||
Delete "$SMPROGRAMS\$SM_Folder\${APP_NAME}.lnk"
|
||||
Delete "$SMPROGRAMS\$SM_Folder\Uninstall ${APP_NAME}.lnk"
|
||||
!ifdef WEB_SITE
|
||||
Delete "$SMPROGRAMS\$SM_Folder\UltiMaker Cura website.lnk"
|
||||
!endif
|
||||
RmDir "$SMPROGRAMS\$SM_Folder"
|
||||
Delete "$SMPROGRAMS\${APP_NAME}.lnk"
|
||||
Delete "$SMPROGRAMS\Uninstall ${APP_NAME}.lnk"
|
||||
!endif
|
||||
|
||||
!ifndef REG_START_MENU
|
||||
Delete "$SMPROGRAMS\{{ app_name }}\${APP_NAME}.lnk"
|
||||
Delete "$SMPROGRAMS\{{ app_name }}\Uninstall ${APP_NAME}.lnk"
|
||||
!ifdef WEB_SITE
|
||||
Delete "$SMPROGRAMS\{{ app_name }}\UltiMaker Cura website.lnk"
|
||||
!endif
|
||||
RmDir "$SMPROGRAMS\{{ app_name }}"
|
||||
Delete "$SMPROGRAMS\${APP_NAME}.lnk"
|
||||
Delete "$SMPROGRAMS\Uninstall ${APP_NAME}.lnk"
|
||||
!endif
|
||||
|
||||
!insertmacro APP_UNASSOCIATE "stl" "Cura.model"
|
||||
|
|
|
@@ -1,10 +1,11 @@
# Copyright (c) 2022 UltiMaker
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.


import os
import argparse  # Command line arguments parsing and help.
import subprocess
import semver

import shutil
from datetime import datetime

@@ -14,11 +15,12 @@ from pathlib import Path
from jinja2 import Template


def generate_nsi(source_path: str, dist_path: str, filename: str):
def generate_nsi(source_path: str, dist_path: str, filename: str, version: str):
    dist_loc = Path(os.getcwd(), dist_path)
    source_loc = Path(os.getcwd(), source_path)
    instdir = Path("$INSTDIR")
    dist_paths = [p.relative_to(dist_loc.joinpath("UltiMaker-Cura")) for p in sorted(dist_loc.joinpath("UltiMaker-Cura").rglob("*")) if p.is_file()]
    parsed_version = semver.Version.parse(version)
    mapped_out_paths = {}
    for dist_path in dist_paths:
        if "__pycache__" not in dist_path.parts:

@@ -42,14 +44,13 @@ def generate_nsi(source_path: str, dist_path: str, filename: str):

    nsis_content = template.render(
        app_name = f"UltiMaker Cura {os.getenv('CURA_VERSION_FULL')}",
        app_name = f"UltiMaker Cura {version}",
        main_app = "UltiMaker-Cura.exe",
        version = os.getenv('CURA_VERSION_FULL'),
        version_major = os.environ.get("CURA_VERSION_MAJOR"),
        version_minor = os.environ.get("CURA_VERSION_MINOR"),
        version_patch = os.environ.get("CURA_VERSION_PATCH"),
        version = version,
        version_major = str(parsed_version.major),
        version_minor = str(parsed_version.minor),
        version_patch = str(parsed_version.patch),
        company = "UltiMaker",
        web_site = "https://ultimaker.com",
        year = datetime.now().year,
        cura_license_file = str(source_loc.joinpath("packaging", "cura_license.txt")),
        compression_method = "LZMA",  # ZLIB, BZIP2 or LZMA

@@ -74,9 +75,10 @@ def build(dist_path: str):

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description = "Create Windows exe installer of Cura.")
    parser.add_argument("source_path", type=str, help="Path to Conan install Cura folder.")
    parser.add_argument("dist_path", type=str, help="Path to Pyinstaller dist folder")
    parser.add_argument("filename", type = str, help = "Filename of the exe (e.g. 'UltiMaker-Cura-5.1.0-beta-Windows-X64.exe')")
    parser.add_argument("--source_path", type=str, help="Path to Conan install Cura folder.")
    parser.add_argument("--dist_path", type=str, help="Path to Pyinstaller dist folder")
    parser.add_argument("--filename", type=str, help="Filename of the exe (e.g. 'UltiMaker-Cura-5.1.0-beta-Windows-X64.exe')")
    parser.add_argument("--version", type=str, help="The full cura version, e.g. 5.9.0-beta.1+24132")
    args = parser.parse_args()
    generate_nsi(args.source_path, args.dist_path, args.filename)
    generate_nsi(args.source_path, args.dist_path, args.filename, args.version)
    build(args.dist_path)
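This script (and the MSI script that follows) now derives the major/minor/patch components from a single `--version` string instead of three environment variables. A minimal sketch of what `semver.Version.parse` yields for the example value taken from the `--version` help text above, assuming the `semver` package that the diff imports:

```python
# Sketch only: the example version string comes from the --version help text.
import semver

parsed = semver.Version.parse("5.9.0-beta.1+24132")
print(parsed.major, parsed.minor, parsed.patch)  # 5 9 0
print(parsed.prerelease, parsed.build)           # beta.1 24132
```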
@@ -1,4 +1,4 @@
# Copyright (c) 2022 UltiMaker
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.


@@ -7,6 +7,7 @@ import os
import shutil
import subprocess
import uuid
import semver
from datetime import datetime
from pathlib import Path

@@ -20,11 +21,12 @@ def work_path(filename: Path) -> Path:
    return filename.parent


def generate_wxs(source_path: Path, dist_path: Path, filename: Path, app_name: str):
def generate_wxs(source_path: Path, dist_path: Path, filename: Path, app_name: str, version: str):
    source_loc = Path(os.getcwd(), source_path)
    dist_loc = Path(os.getcwd(), dist_path)
    work_loc = work_path(filename)
    work_loc.mkdir(parents=True, exist_ok=True)
    parsed_version = semver.Version.parse(version)

    jinja_template_path = Path(source_loc.joinpath("packaging", "msi", "UltiMaker-Cura.wxs.jinja"))
    with open(jinja_template_path, "r") as f:

@@ -33,12 +35,11 @@ def generate_wxs(source_path: Path, dist_path: Path, filename: Path, app_name: s
    wxs_content = template.render(
        app_name=f"{app_name}",
        main_app="UltiMaker-Cura.exe",
        version=os.getenv('CURA_VERSION_FULL'),
        version_major=os.environ.get("CURA_VERSION_MAJOR"),
        version_minor=os.environ.get("CURA_VERSION_MINOR"),
        version_patch=os.environ.get("CURA_VERSION_PATCH"),
        version=version,
        version_major=str(parsed_version.major),
        version_minor=str(parsed_version.minor),
        version_patch=str(parsed_version.patch),
        company="UltiMaker",
        web_site="https://ultimaker.com",
        year=datetime.now().year,
        upgrade_code=str(uuid.uuid5(uuid.NAMESPACE_DNS, app_name)),
        cura_license_file=str(source_loc.joinpath("packaging", "msi", "cura_license.rtf")),

@@ -111,12 +112,13 @@ def build(dist_path: Path, filename: Path):

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Create Windows msi installer of Cura.")
    parser.add_argument("source_path", type=Path, help="Path to Conan install Cura folder.")
    parser.add_argument("dist_path", type=Path, help="Path to Pyinstaller dist folder")
    parser.add_argument("filename", type=Path,
    parser.add_argument("--source_path", type=Path, help="Path to Conan install Cura folder.")
    parser.add_argument("--dist_path", type=Path, help="Path to Pyinstaller dist folder")
    parser.add_argument("--filename", type=Path,
                        help="Filename of the exe (e.g. 'UltiMaker-Cura-5.1.0-beta-Windows-X64.msi')")
    parser.add_argument("name", type=str, help="App name (e.g. 'UltiMaker Cura')")
    parser.add_argument("--name", type=str, help="App name (e.g. 'UltiMaker Cura')")
    parser.add_argument("--version", type=str, help="The full cura version, e.g. 5.9.0-beta.1+24132")
    args = parser.parse_args()
    generate_wxs(args.source_path.resolve(), args.dist_path.resolve(), args.filename.resolve(), args.name)
    generate_wxs(args.source_path.resolve(), args.dist_path.resolve(), args.filename.resolve(), args.name, args.version)
    cleanup_artifacts(args.dist_path.resolve())
    build(args.dist_path.resolve(), args.filename)
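Because the positional arguments were replaced by named flags, any CI invocation of these scripts must be updated. A hypothetical pair of invocations (the script file names are not visible in this diff, so both names are assumptions; the flags match the argparse definitions above):

```python
# Hypothetical invocations; script names are assumed, flags are from the diff.
#
#   python create_windows_installer.py --source_path . --dist_path dist \
#       --filename UltiMaker-Cura-5.9.0-beta.1-win64-X64.exe --version 5.9.0-beta.1+24132
#
#   python create_windows_msi.py --source_path . --dist_path dist \
#       --filename UltiMaker-Cura-5.9.0-beta.1-win64-X64.msi --name "UltiMaker Cura" \
#       --version 5.9.0-beta.1+24132
```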
plugins/3DConnexion/NavlibClient.py (new file)

@@ -0,0 +1,273 @@
# Copyright (c) 2025 3Dconnexion, UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

from typing import Optional
from UM.Math.Matrix import Matrix
from UM.Math.Vector import Vector
from UM.Math.AxisAlignedBox import AxisAlignedBox
from cura.PickingPass import PickingPass
from UM.Scene.Iterator.DepthFirstIterator import DepthFirstIterator
from UM.Scene.SceneNode import SceneNode
from UM.Scene.Scene import Scene
from UM.Resources import Resources
from UM.Tool import Tool
from UM.View.Renderer import Renderer
from .OverlayNode import OverlayNode
import pynavlib.pynavlib_interface as pynav


class NavlibClient(pynav.NavlibNavigationModel, Tool):

    def __init__(self, scene: Scene, renderer: Renderer) -> None:
        pynav.NavlibNavigationModel.__init__(self, False, pynav.NavlibOptions.RowMajorOrder)
        Tool.__init__(self)
        self._scene = scene
        self._renderer = renderer
        self._pointer_pick = None
        self._was_pick = False
        self._hit_selection_only = False
        self._picking_pass = None
        self._pivot_node = OverlayNode(node=SceneNode(), image_path=Resources.getPath(Resources.Images, "cor.png"), size=2.5)
        self.put_profile_hint("UltiMaker Cura")
        self.enable_navigation(True)

    def pick(self, x: float, y: float, check_selection: bool = False, radius: float = 0.) -> Optional[Vector]:

        if self._picking_pass is None or radius < 0.:
            return None

        step = 0.
        if radius == 0.:
            grid_resolution = 0
        else:
            grid_resolution = 5
            step = (2. * radius) / float(grid_resolution)

        min_depth = 99999.
        result_position = None

        for i in range(grid_resolution + 1):
            for j in range(grid_resolution + 1):

                coord_x = (x - radius) + i * step
                coord_y = (y - radius) + j * step

                picked_depth = self._picking_pass.getPickedDepth(coord_x, coord_y)
                max_depth = 16777.215

                if 0. < picked_depth < max_depth:

                    valid_hit = True
                    if check_selection:
                        selection_pass = self._renderer.getRenderPass("selection")
                        picked_object_id = selection_pass.getIdAtPosition(coord_x, coord_y)
                        picked_object = self._scene.findObject(picked_object_id)

                        from UM.Scene.Selection import Selection
                        valid_hit = Selection.isSelected(picked_object)

                    if not valid_hit and grid_resolution > 0.:
                        continue
                    elif not valid_hit and grid_resolution == 0.:
                        return None

                    if picked_depth < min_depth:
                        min_depth = picked_depth
                        result_position = self._picking_pass.getPickedPosition(coord_x, coord_y)

        return result_position

    def get_pointer_position(self) -> "pynav.NavlibVector":

        from UM.Qt.QtApplication import QtApplication
        main_window = QtApplication.getInstance().getMainWindow()

        x_n = 2. * main_window._mouse_x / main_window.width() - 1.
        y_n = 2. * main_window._mouse_y / main_window.height() - 1.

        if self.get_is_view_perspective():
            self._was_pick = True
            from cura.Utils.Threading import call_on_qt_thread
            wrapped_pick = call_on_qt_thread(self.pick)

            self._pointer_pick = wrapped_pick(x_n, y_n)

            return pynav.NavlibVector(0., 0., 0.)
        else:
            ray = self._scene.getActiveCamera().getRay(x_n, y_n)
            pointer_position = ray.origin + ray.direction

            return pynav.NavlibVector(pointer_position.x, pointer_position.y, pointer_position.z)

    def get_view_extents(self) -> "pynav.NavlibBox":

        view_width = self._scene.getActiveCamera().getViewportWidth()
        view_height = self._scene.getActiveCamera().getViewportHeight()
        horizontal_zoom = view_width * self._scene.getActiveCamera().getZoomFactor()
        vertical_zoom = view_height * self._scene.getActiveCamera().getZoomFactor()

        pt_min = pynav.NavlibVector(-view_width / 2 - horizontal_zoom, -view_height / 2 - vertical_zoom, -9001)
        pt_max = pynav.NavlibVector(view_width / 2 + horizontal_zoom, view_height / 2 + vertical_zoom, 9001)

        return pynav.NavlibBox(pt_min, pt_max)

    def get_view_frustum(self) -> "pynav.NavlibFrustum":

        projection_matrix = self._scene.getActiveCamera().getProjectionMatrix()
        half_height = 2. / projection_matrix.getData()[1,1]
        half_width = half_height * (projection_matrix.getData()[1,1] / projection_matrix.getData()[0,0])

        return pynav.NavlibFrustum(-half_width, half_width, -half_height, half_height, 1., 5000.)

    def get_is_view_perspective(self) -> bool:
        return self._scene.getActiveCamera().isPerspective()

    def get_selection_extents(self) -> "pynav.NavlibBox":

        from UM.Scene.Selection import Selection
        bounding_box = Selection.getBoundingBox()

        if bounding_box is not None:
            pt_min = pynav.NavlibVector(bounding_box.minimum.x, bounding_box.minimum.y, bounding_box.minimum.z)
            pt_max = pynav.NavlibVector(bounding_box.maximum.x, bounding_box.maximum.y, bounding_box.maximum.z)
            return pynav.NavlibBox(pt_min, pt_max)

    def get_selection_transform(self) -> "pynav.NavlibMatrix":
        return pynav.NavlibMatrix()

    def get_is_selection_empty(self) -> bool:
        from UM.Scene.Selection import Selection
        return not Selection.hasSelection()

    def get_pivot_visible(self) -> bool:
        return False

    def get_camera_matrix(self) -> "pynav.NavlibMatrix":

        transformation = self._scene.getActiveCamera().getLocalTransformation()

        return pynav.NavlibMatrix([[transformation.at(0, 0), transformation.at(0, 1), transformation.at(0, 2), transformation.at(0, 3)],
                                   [transformation.at(1, 0), transformation.at(1, 1), transformation.at(1, 2), transformation.at(1, 3)],
                                   [transformation.at(2, 0), transformation.at(2, 1), transformation.at(2, 2), transformation.at(2, 3)],
                                   [transformation.at(3, 0), transformation.at(3, 1), transformation.at(3, 2), transformation.at(3, 3)]])

    def get_coordinate_system(self) -> "pynav.NavlibMatrix":
        return pynav.NavlibMatrix()

    def get_front_view(self) -> "pynav.NavlibMatrix":
        return pynav.NavlibMatrix()

    def get_model_extents(self) -> "pynav.NavlibBox":

        result_bbox = AxisAlignedBox()
        build_volume_bbox = None

        for node in DepthFirstIterator(self._scene.getRoot()):
            node.setCalculateBoundingBox(True)
            if node.__class__.__qualname__ == "CuraSceneNode":
                result_bbox = result_bbox + node.getBoundingBox()
            elif node.__class__.__qualname__ == "BuildVolume":
                build_volume_bbox = node.getBoundingBox()

        if not result_bbox.isValid():
            result_bbox = build_volume_bbox

        if result_bbox is not None:
            pt_min = pynav.NavlibVector(result_bbox.minimum.x, result_bbox.minimum.y, result_bbox.minimum.z)
            pt_max = pynav.NavlibVector(result_bbox.maximum.x, result_bbox.maximum.y, result_bbox.maximum.z)
            self._scene_center = result_bbox.center
            self._scene_radius = (result_bbox.maximum - self._scene_center).length()
            return pynav.NavlibBox(pt_min, pt_max)

    def get_pivot_position(self) -> "pynav.NavlibVector":
        return pynav.NavlibVector()

    def get_hit_look_at(self) -> "pynav.NavlibVector":

        if self._was_pick and self._pointer_pick is not None:
            return pynav.NavlibVector(self._pointer_pick.x, self._pointer_pick.y, self._pointer_pick.z)
        elif self._was_pick and self._pointer_pick is None:
            return None

        from cura.Utils.Threading import call_on_qt_thread
        wrapped_pick = call_on_qt_thread(self.pick)
        picked_position = wrapped_pick(0, 0, self._hit_selection_only, 0.5)

        if picked_position is not None:
            return pynav.NavlibVector(picked_position.x, picked_position.y, picked_position.z)

    def get_units_to_meters(self) -> float:
        return 0.05

    def is_user_pivot(self) -> bool:
        return False

    def set_camera_matrix(self, matrix: "pynav.NavlibMatrix") -> None:

        # !!!!!!
        # Hit testing in Orthographic view is not reliable
        # Picking starts in camera position, not on near plane
        # which results in wrong depth values (visible geometry
        # cannot be picked properly) - Workaround needed (camera position offset)
        # !!!!!!
        if not self.get_is_view_perspective():
            affine = matrix._matrix
            direction = Vector(-affine[0][2], -affine[1][2], -affine[2][2])
            distance = self._scene_center - Vector(affine[0][3], affine[1][3], affine[2][3])

            cos_value = direction.dot(distance.normalized())

            offset = 0.

            if (distance.length() < self._scene_radius) and (cos_value > 0.):
                offset = self._scene_radius
            elif (distance.length() < self._scene_radius) and (cos_value < 0.):
                offset = 2. * self._scene_radius
            elif (distance.length() > self._scene_radius) and (cos_value < 0.):
                offset = 2. * distance.length()

            matrix._matrix[0][3] = matrix._matrix[0][3] - offset * direction.x
            matrix._matrix[1][3] = matrix._matrix[1][3] - offset * direction.y
            matrix._matrix[2][3] = matrix._matrix[2][3] - offset * direction.z

        transformation = Matrix(data = matrix._matrix)
        self._scene.getActiveCamera().setTransformation(transformation)

        active_camera = self._scene.getActiveCamera()
        if active_camera.isPerspective():
            camera_position = active_camera.getWorldPosition()
            dist = (camera_position - self._pivot_node.getWorldPosition()).length()
            scale = dist / 400.
        else:
            view_width = active_camera.getViewportWidth()
            current_size = view_width + (2. * active_camera.getZoomFactor() * view_width)
            scale = current_size / view_width * 5.

        self._pivot_node.scale(scale)

    def set_view_extents(self, extents: "pynav.NavlibBox") -> None:
        view_width = self._scene.getActiveCamera().getViewportWidth()
        new_zoom = (extents._min._x + view_width / 2.) / - view_width
        self._scene.getActiveCamera().setZoomFactor(new_zoom)

    def set_hit_selection_only(self, onlySelection: bool) -> None:
        self._hit_selection_only = onlySelection

    def set_motion_flag(self, motion: bool) -> None:
        if motion:
            width = self._scene.getActiveCamera().getViewportWidth()
            height = self._scene.getActiveCamera().getViewportHeight()
            self._picking_pass = PickingPass(width, height)
            self._renderer.addRenderPass(self._picking_pass)
        else:
            self._was_pick = False
            self._renderer.removeRenderPass(self._picking_pass)

    def set_pivot_position(self, position) -> None:
        self._pivot_node._target_node.setPosition(position=Vector(position._x, position._y, position._z), transform_space = SceneNode.TransformSpace.World)

    def set_pivot_visible(self, visible) -> None:
        if visible:
            self._scene.getRoot().addChild(self._pivot_node)
        else:
            self._scene.getRoot().removeChild(self._pivot_node)
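`get_pointer_position()` first maps window pixel coordinates to normalized device coordinates before handing them to the picking pass. A standalone sketch of that mapping (the helper and assertions are illustrative, not part of the plugin):

```python
# Pixel coordinates -> [-1, 1] range, origin at the window center (a sketch of
# the x_n / y_n computation in get_pointer_position above).
def to_ndc(mouse_x: float, mouse_y: float, width: float, height: float):
    return 2. * mouse_x / width - 1., 2. * mouse_y / height - 1.

assert to_ndc(0, 0, 800, 600) == (-1.0, -1.0)    # top-left corner
assert to_ndc(400, 300, 800, 600) == (0.0, 0.0)  # window center
assert to_ndc(800, 600, 800, 600) == (1.0, 1.0)  # bottom-right corner
```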
plugins/3DConnexion/OverlayNode.py (new file)

@@ -0,0 +1,68 @@
# Copyright (c) 2025 3Dconnexion, UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

from UM.Scene.SceneNode import SceneNode
from UM.View.GL.OpenGL import OpenGL
from UM.Mesh.MeshBuilder import MeshBuilder  # To create the overlay quad
from UM.Resources import Resources  # To find shader locations
from UM.Math.Matrix import Matrix
from UM.Application import Application

try:
    from PyQt6.QtGui import QImage
except ImportError:
    from PyQt5.QtGui import QImage

class OverlayNode(SceneNode):
    def __init__(self, node, image_path, size, parent=None):
        super().__init__(parent)
        self._target_node = node
        self.setCalculateBoundingBox(False)

        self._overlay_mesh = self._createOverlayQuad(size)
        self._drawed_mesh = self._overlay_mesh
        self._shader = None
        self._scene = Application.getInstance().getController().getScene()
        self._scale = 1.
        self._image_path = image_path

    def scale(self, factor):
        scale_matrix = Matrix()
        scale_matrix.setByScaleFactor(factor)
        self._drawed_mesh = self._overlay_mesh.getTransformed(scale_matrix)

    def _createOverlayQuad(self, size):
        mb = MeshBuilder()
        mb.addFaceByPoints(-size / 2, -size / 2, 0, -size / 2, size / 2, 0, size / 2, -size / 2, 0)
        mb.addFaceByPoints(size / 2, size / 2, 0, -size / 2, size / 2, 0, size / 2, -size / 2, 0)

        # Set UV coordinates so a texture can be created
        mb.setVertexUVCoordinates(0, 0, 1)
        mb.setVertexUVCoordinates(1, 0, 0)
        mb.setVertexUVCoordinates(4, 0, 0)
        mb.setVertexUVCoordinates(2, 1, 1)
        mb.setVertexUVCoordinates(5, 1, 1)
        mb.setVertexUVCoordinates(3, 1, 0)

        return mb.build()

    def render(self, renderer):

        if not self._shader:
            # We now misuse the platform shader, as it actually supports textures
            self._shader = OpenGL.getInstance().createShaderProgram(Resources.getPath(Resources.Shaders, "platform.shader"))
            # Set the opacity to 0, so that the template is in full control.
            self._shader.setUniformValue("u_opacity", 0)
            self._texture = OpenGL.getInstance().createTexture()
            texture_image = QImage(self._image_path)
            self._texture.setImage(texture_image)
            self._shader.setTexture(0, self._texture)

        node_position = self._target_node.getWorldPosition()
        position_matrix = Matrix()
        position_matrix.setByTranslation(node_position)
        camera_orientation = self._scene.getActiveCamera().getOrientation().toMatrix()

        renderer.queueNode(self._scene.getRoot(), shader=self._shader, mesh=self._drawed_mesh.getTransformed(position_matrix.multiply(camera_orientation)), overlay=True)

        return True  # This node does its own rendering.
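The UV indices in `_createOverlayQuad` follow from how the quad is built: each `addFaceByPoints` call appears to append three new vertices, so the first triangle owns indices 0 to 2 and the second owns 3 to 5 (this numbering is an inference from the calls above, not documented behaviour):

```python
# Vertex layout assumed by the setVertexUVCoordinates calls above (s = size):
#   first addFaceByPoints  -> vertices 0, 1, 2 = (-s/2,-s/2), (-s/2, s/2), (s/2,-s/2)
#   second addFaceByPoints -> vertices 3, 4, 5 = ( s/2, s/2), (-s/2, s/2), (s/2,-s/2)
# Vertices 1/4 and 2/5 coincide spatially and receive identical UVs, so the
# two triangles tile the texture seamlessly across the quad.
```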
plugins/3DConnexion/__init__.py (new file)

@@ -0,0 +1,26 @@
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

from UM.Logger import Logger

from typing import TYPE_CHECKING, Dict, Any

if TYPE_CHECKING:
    from UM.Application import Application


def getMetaData() -> Dict[str, Any]:
    return {
        "tool": {
            "visible": False
        }
    }


def register(app: "Application") -> Dict[str, Any]:
    try:
        from .NavlibClient import NavlibClient
        return { "tool": NavlibClient(app.getController().getScene(), app.getRenderer()) }
    except BaseException as exception:
        Logger.warning(f"Unable to load 3Dconnexion library: {exception}")
        return { }
plugins/3DConnexion/plugin.json (new file)

@@ -0,0 +1,8 @@
{
    "name": "3DConnexion mouses",
    "author": "3DConnexion",
    "version": "1.0.0",
    "description": "Allows working with 3D mouses inside Cura.",
    "api": 8,
    "i18n-catalog": "cura"
}
@@ -94,7 +94,7 @@ class ThreeMFReader(MeshReader):
        return temp_mat

    @staticmethod
    def _convertSavitarNodeToUMNode(savitar_node: Savitar.SceneNode, file_name: str = "") -> Optional[SceneNode]:
    def _convertSavitarNodeToUMNode(savitar_node: Savitar.SceneNode, file_name: str = "", archive: zipfile.ZipFile = None) -> Optional[SceneNode]:
        """Convenience function that converts a SceneNode object (as obtained from libSavitar) to a scene node.

        :returns: Scene node.

@@ -115,6 +115,10 @@ class ThreeMFReader(MeshReader):

        active_build_plate = CuraApplication.getInstance().getMultiBuildPlateModel().activeBuildPlate

        component_path = savitar_node.getComponentPath()
        if component_path != "" and archive is not None:
            savitar_node.parseComponentData(archive.open(component_path.lstrip("/")).read())

        um_node = CuraSceneNode()  # This adds a SettingOverrideDecorator
        um_node.addDecorator(BuildPlateDecorator(active_build_plate))
        try:

@@ -143,7 +147,7 @@ class ThreeMFReader(MeshReader):
            um_node.setMeshData(mesh_data)

        for child in savitar_node.getChildren():
            child_node = ThreeMFReader._convertSavitarNodeToUMNode(child)
            child_node = ThreeMFReader._convertSavitarNodeToUMNode(child, archive=archive)
            if child_node:
                um_node.addChild(child_node)

@@ -232,7 +236,7 @@ class ThreeMFReader(MeshReader):
                CuraApplication.getInstance().getController().getScene().setMetaDataEntry(key, value)

            for node in scene_3mf.getSceneNodes():
                um_node = ThreeMFReader._convertSavitarNodeToUMNode(node, file_name)
                um_node = ThreeMFReader._convertSavitarNodeToUMNode(node, file_name, archive)
                if um_node is None:
                    continue
@@ -23,7 +23,7 @@ def getMetaData() -> Dict:
    if "3MFReader.ThreeMFReader" in sys.modules:
        metaData["mesh_reader"] = [
            {
                "extension": "3mf",
                "extension": workspace_extension,
                "description": catalog.i18nc("@item:inlistbox", "3MF File")
            }
        ]
plugins/3MFWriter/BambuLabVariant.py (new file)

@@ -0,0 +1,176 @@
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

import hashlib
import json
from io import StringIO
import xml.etree.ElementTree as ET
import zipfile

from PyQt6.QtCore import Qt, QBuffer
from PyQt6.QtGui import QImage

from UM.Application import Application
from UM.Logger import Logger
from UM.Mesh.MeshWriter import MeshWriter
from UM.PluginRegistry import PluginRegistry
from typing import cast

from cura.CuraApplication import CuraApplication

from .ThreeMFVariant import ThreeMFVariant
from UM.i18n import i18nCatalog
catalog = i18nCatalog("cura")

# Path constants
METADATA_PATH = "Metadata"
THUMBNAIL_PATH_MULTIPLATE = f"{METADATA_PATH}/plate_1.png"
THUMBNAIL_PATH_MULTIPLATE_SMALL = f"{METADATA_PATH}/plate_1_small.png"
GCODE_PATH = f"{METADATA_PATH}/plate_1.gcode"
GCODE_MD5_PATH = f"{GCODE_PATH}.md5"
MODEL_SETTINGS_PATH = f"{METADATA_PATH}/model_settings.config"
PLATE_DESC_PATH = f"{METADATA_PATH}/plate_1.json"
SLICE_INFO_PATH = f"{METADATA_PATH}/slice_info.config"
PROJECT_SETTINGS_PATH = f"{METADATA_PATH}/project_settings.config"

class BambuLabVariant(ThreeMFVariant):
    """BambuLab specific implementation of the 3MF format."""

    @property
    def mime_type(self) -> str:
        return "application/vnd.bambulab-package.3dmanufacturing-3dmodel+xml"

    def process_thumbnail(self, snapshot: QImage, thumbnail_buffer: QBuffer,
                          archive: zipfile.ZipFile, relations_element: ET.Element) -> None:
        """Process the thumbnail for BambuLab variant."""
        # Write thumbnail
        archive.writestr(zipfile.ZipInfo(THUMBNAIL_PATH_MULTIPLATE), thumbnail_buffer.data())

        # Add relations elements for thumbnails
        ET.SubElement(relations_element, "Relationship",
                      Target="/" + THUMBNAIL_PATH_MULTIPLATE, Id="rel-2",
                      Type="http://schemas.openxmlformats.org/package/2006/relationships/metadata/thumbnail")

        ET.SubElement(relations_element, "Relationship",
                      Target="/" + THUMBNAIL_PATH_MULTIPLATE, Id="rel-4",
                      Type="http://schemas.bambulab.com/package/2021/cover-thumbnail-middle")

        # Create and save small thumbnail
        small_snapshot = snapshot.scaled(128, 128, transformMode=Qt.TransformationMode.SmoothTransformation)
        small_thumbnail_buffer = QBuffer()
        small_thumbnail_buffer.open(QBuffer.OpenModeFlag.ReadWrite)
        small_snapshot.save(small_thumbnail_buffer, "PNG")

        # Write small thumbnail
        archive.writestr(zipfile.ZipInfo(THUMBNAIL_PATH_MULTIPLATE_SMALL), small_thumbnail_buffer.data())

        # Add relation for small thumbnail
        ET.SubElement(relations_element, "Relationship",
                      Target="/" + THUMBNAIL_PATH_MULTIPLATE_SMALL, Id="rel-5",
                      Type="http://schemas.bambulab.com/package/2021/cover-thumbnail-small")

    def add_extra_files(self, archive: zipfile.ZipFile, metadata_relations_element: ET.Element) -> None:
        """Add BambuLab specific files to the archive."""
        self._storeGCode(archive, metadata_relations_element)
        self._storeModelSettings(archive)
        self._storePlateDesc(archive)
        self._storeSliceInfo(archive)
        self._storeProjectSettings(archive)

    def _storeGCode(self, archive: zipfile.ZipFile, metadata_relations_element: ET.Element):
        """Store GCode data in the archive."""
        gcode_textio = StringIO()
        gcode_writer = cast(MeshWriter, PluginRegistry.getInstance().getPluginObject("GCodeWriter"))
        success = gcode_writer.write(gcode_textio, None)

        if not success:
            error_msg = catalog.i18nc("@info:error", "Can't write GCode to 3MF file")
            self._writer.setInformation(error_msg)
            Logger.error(error_msg)
            raise Exception(error_msg)

        gcode_data = gcode_textio.getvalue().encode("UTF-8")
        archive.writestr(zipfile.ZipInfo(GCODE_PATH), gcode_data)

        gcode_relation_element = ET.SubElement(metadata_relations_element, "Relationship",
                                               Target=f"/{GCODE_PATH}", Id="rel-1",
                                               Type="http://schemas.bambulab.com/package/2021/gcode")

        # Calculate and store the MD5 sum of the gcode data
        md5_hash = hashlib.md5(gcode_data).hexdigest()
        archive.writestr(zipfile.ZipInfo(GCODE_MD5_PATH), md5_hash.encode("UTF-8"))

    def _storeModelSettings(self, archive: zipfile.ZipFile):
        """Store model settings in the archive."""
        config = ET.Element("config")
        plate = ET.SubElement(config, "plate")
        ET.SubElement(plate, "metadata", key="plater_id", value="1")
        ET.SubElement(plate, "metadata", key="plater_name", value="")
        ET.SubElement(plate, "metadata", key="locked", value="false")
        ET.SubElement(plate, "metadata", key="filament_map_mode", value="Auto For Flush")
        extruders_count = len(CuraApplication.getInstance().getExtruderManager().extruderIds)
        ET.SubElement(plate, "metadata", key="filament_maps", value=" ".join("1" for _ in range(extruders_count)))
        ET.SubElement(plate, "metadata", key="gcode_file", value=GCODE_PATH)
        ET.SubElement(plate, "metadata", key="thumbnail_file", value=THUMBNAIL_PATH_MULTIPLATE)
        ET.SubElement(plate, "metadata", key="pattern_bbox_file", value=PLATE_DESC_PATH)

        self._writer._storeElementTree(archive, MODEL_SETTINGS_PATH, config)

    def _storePlateDesc(self, archive: zipfile.ZipFile):
        """Store plate description in the archive."""
        plate_desc = {}

        filament_ids = []
        filament_colors = []

        for extruder in CuraApplication.getInstance().getExtruderManager().getUsedExtruderStacks():
            filament_ids.append(extruder.getValue("extruder_nr"))
            filament_colors.append(self._writer._getMaterialColor(extruder))

        plate_desc["filament_ids"] = filament_ids
        plate_desc["filament_colors"] = filament_colors
        plate_desc["first_extruder"] = CuraApplication.getInstance().getExtruderManager().getInitialExtruderNr()
        plate_desc["is_seq_print"] = Application.getInstance().getGlobalContainerStack().getValue("print_sequence") == "one_at_a_time"
        plate_desc["nozzle_diameter"] = CuraApplication.getInstance().getExtruderManager().getActiveExtruderStack().getValue("machine_nozzle_size")
        plate_desc["version"] = 2

        file = zipfile.ZipInfo(PLATE_DESC_PATH)
        file.compress_type = zipfile.ZIP_DEFLATED
        archive.writestr(file, json.dumps(plate_desc).encode("UTF-8"))

    def _storeSliceInfo(self, archive: zipfile.ZipFile):
        """Store slice information in the archive."""
        config = ET.Element("config")

        header = ET.SubElement(config, "header")
        ET.SubElement(header, "header_item", key="X-BBL-Client-Type", value="slicer")
        ET.SubElement(header, "header_item", key="X-BBL-Client-Version", value="02.00.01.50")

        plate = ET.SubElement(config, "plate")
        ET.SubElement(plate, "metadata", key="index", value="1")
        ET.SubElement(plate,
                      "metadata",
                      key="nozzle_diameters",
                      value=str(CuraApplication.getInstance().getExtruderManager().getActiveExtruderStack().getValue("machine_nozzle_size")))

        print_information = CuraApplication.getInstance().getPrintInformation()
        for index, extruder in enumerate(Application.getInstance().getGlobalContainerStack().extruderList):
            used_m = print_information.materialLengths[index]
            used_g = print_information.materialWeights[index]
            if used_m > 0.0 and used_g > 0.0:
                ET.SubElement(plate,
                              "filament",
                              id=str(extruder.getValue("extruder_nr") + 1),
                              tray_info_idx="GFA00",
                              type=extruder.material.getMetaDataEntry("material", ""),
                              color=self._writer._getMaterialColor(extruder),
                              used_m=str(used_m),
                              used_g=str(used_g))

        self._writer._storeElementTree(archive, SLICE_INFO_PATH, config)

    def _storeProjectSettings(self, archive: zipfile.ZipFile):
        api = CuraApplication.getInstance().getCuraAPI()
        file = zipfile.ZipInfo(PROJECT_SETTINGS_PATH)
        json_string = json.dumps(api.interface.settings.getAllGlobalSettings(), separators=(", ", ": "), indent=4)
        archive.writestr(file, json_string.encode("UTF-8"))
plugins/3MFWriter/Cura3mfVariant.py (new file)

@@ -0,0 +1,33 @@
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

import xml.etree.ElementTree as ET
import zipfile

from PyQt6.QtCore import QBuffer
from PyQt6.QtGui import QImage

from .ThreeMFVariant import ThreeMFVariant

# Standard 3MF paths
METADATA_PATH = "Metadata"
THUMBNAIL_PATH = f"{METADATA_PATH}/thumbnail.png"

class Cura3mfVariant(ThreeMFVariant):
    """Default implementation of the 3MF format."""

    @property
    def mime_type(self) -> str:
        return "application/vnd.ms-package.3dmanufacturing-3dmodel+xml"

    def process_thumbnail(self, snapshot: QImage, thumbnail_buffer: QBuffer,
                          archive: zipfile.ZipFile, relations_element: ET.Element) -> None:
        """Process the thumbnail for default 3MF variant."""
        thumbnail_file = zipfile.ZipInfo(THUMBNAIL_PATH)
        # Don't try to compress snapshot file, because the PNG is pretty much as compact as it will get
        archive.writestr(thumbnail_file, thumbnail_buffer.data())

        # Add thumbnail relation to _rels/.rels file
        ET.SubElement(relations_element, "Relationship",
                      Target="/" + THUMBNAIL_PATH, Id="rel1",
                      Type="http://schemas.openxmlformats.org/package/2006/relationships/metadata/thumbnail")
plugins/3MFWriter/ThreeMFVariant.py (new file)

@@ -0,0 +1,74 @@
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

from abc import ABC, abstractmethod
from typing import TYPE_CHECKING
import xml.etree.ElementTree as ET
import zipfile

from PyQt6.QtGui import QImage
from PyQt6.QtCore import QBuffer

if TYPE_CHECKING:
    from .ThreeMFWriter import ThreeMFWriter

class ThreeMFVariant(ABC):
    """Base class for 3MF format variants.

    Different vendors may have their own extensions to the 3MF format,
    such as BambuLab's 3MF variant. This class provides an interface
    for implementing these variants.
    """

    def __init__(self, writer: 'ThreeMFWriter'):
        """
        :param writer: The ThreeMFWriter instance that will use this variant
        """
        self._writer = writer

    @property
    @abstractmethod
    def mime_type(self) -> str:
        """The MIME type for this 3MF variant."""
        pass

    def handles_mime_type(self, mime_type: str) -> bool:
        """Check if this variant handles the given MIME type.

        :param mime_type: The MIME type to check
        :return: True if this variant handles the MIME type, False otherwise
        """
        return mime_type == self.mime_type

    def prepare_content_types(self, content_types: ET.Element) -> None:
        """Prepare the content types XML element for this variant.

        :param content_types: The content types XML element
        """
        pass

    def prepare_relations(self, relations_element: ET.Element) -> None:
        """Prepare the relations XML element for this variant.

        :param relations_element: The relations XML element
        """
        pass

    def process_thumbnail(self, snapshot: QImage, thumbnail_buffer: QBuffer,
                          archive: zipfile.ZipFile, relations_element: ET.Element) -> None:
        """Process the thumbnail for this variant.

        :param snapshot: The snapshot image
        :param thumbnail_buffer: Buffer containing the thumbnail data
        :param archive: The zip archive to write to
        :param relations_element: The relations XML element
        """
        pass

    def add_extra_files(self, archive: zipfile.ZipFile, metadata_relations_element: ET.Element) -> None:
        """Add any extra files required by this variant to the archive.

        :param archive: The zip archive to write to
        :param metadata_relations_element: The metadata relations XML element
        """
        pass
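Since every hook above defaults to a no-op, a vendor variant only overrides what it needs. A minimal sketch of a hypothetical third variant (the class name, MIME type, and file path here are invented for illustration):

```python
import xml.etree.ElementTree as ET
import zipfile

from .ThreeMFVariant import ThreeMFVariant

class ExampleVariant(ThreeMFVariant):
    @property
    def mime_type(self) -> str:
        # Invented MIME type, for illustration only.
        return "application/vnd.example-package.3dmanufacturing-3dmodel+xml"

    def add_extra_files(self, archive: zipfile.ZipFile,
                        metadata_relations_element: ET.Element) -> None:
        # A vendor-specific payload; all other hooks keep their no-op defaults.
        archive.writestr(zipfile.ZipInfo("Metadata/example.config"), b"")
```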
@@ -1,11 +1,13 @@
# Copyright (c) 2015-2022 Ultimaker B.V.
# Copyright (c) 2015-2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.

import json
import re
import threading

from typing import Optional, cast, List, Dict, Pattern, Set
from typing import Optional, cast, List, Dict, Set

from UM.PluginRegistry import PluginRegistry
from UM.Mesh.MeshWriter import MeshWriter
from UM.Math.Vector import Vector
from UM.Logger import Logger

@@ -19,7 +21,9 @@ from UM.Settings.ContainerRegistry import ContainerRegistry

from cura.CuraApplication import CuraApplication
from cura.CuraPackageManager import CuraPackageManager
from cura.Machines.Models.ExtrudersModel import ExtrudersModel
from cura.Settings import CuraContainerStack
from cura.Settings.ExtruderStack import ExtruderStack
from cura.Utils.Threading import call_on_qt_thread
from cura.Scene.CuraSceneNode import CuraSceneNode
from cura.Snapshot import Snapshot

@@ -45,11 +49,13 @@ import UM.Application

from .SettingsExportModel import SettingsExportModel
from .SettingsExportGroup import SettingsExportGroup
from .ThreeMFVariant import ThreeMFVariant
from .Cura3mfVariant import Cura3mfVariant
from .BambuLabVariant import BambuLabVariant

from UM.i18n import i18nCatalog
catalog = i18nCatalog("cura")

THUMBNAIL_PATH = "Metadata/thumbnail.png"
MODEL_PATH = "3D/3dmodel.model"
PACKAGE_METADATA_PATH = "Cura/packages.json"

@@ -68,6 +74,12 @@ class ThreeMFWriter(MeshWriter):
        self._store_archive = False
        self._lock = threading.Lock()

        # Register available variants
        self._variants = {
            Cura3mfVariant(self).mime_type: Cura3mfVariant,
            BambuLabVariant(self).mime_type: BambuLabVariant
        }

    @staticmethod
    def _convertMatrixToString(matrix):
        result = ""

@@ -201,26 +213,48 @@ class ThreeMFWriter(MeshWriter):

        painter.end()

    def write(self, stream, nodes, mode = MeshWriter.OutputMode.BinaryMode, export_settings_model = None) -> bool:
    def _getVariant(self, mime_type: str) -> ThreeMFVariant:
        """Get the appropriate variant for the given MIME type.

        :param mime_type: The MIME type to get the variant for
        :return: An instance of the variant for the given MIME type
        """
        variant_class = self._variants.get(mime_type, Cura3mfVariant)
        return variant_class(self)

    def write(self, stream, nodes, mode = MeshWriter.OutputMode.BinaryMode, export_settings_model = None, **kwargs) -> bool:
        self._archive = None  # Reset archive
        archive = zipfile.ZipFile(stream, "w", compression = zipfile.ZIP_DEFLATED)

        # Determine which variant to use based on mime type in kwargs
        mime_type = kwargs.get("mime_type", Cura3mfVariant(self).mime_type)
        variant = self._getVariant(mime_type)

        try:
            model_file = zipfile.ZipInfo(MODEL_PATH)
            # Because zipfile is stupid and ignores archive-level compression settings when writing with ZipInfo.
            model_file.compress_type = zipfile.ZIP_DEFLATED

            # Create content types file
            content_types_file = zipfile.ZipInfo("[Content_Types].xml")
            content_types_file.compress_type = zipfile.ZIP_DEFLATED
            content_types = ET.Element("Types", xmlns = self._namespaces["content-types"])
            rels_type = ET.SubElement(content_types, "Default", Extension = "rels", ContentType = "application/vnd.openxmlformats-package.relationships+xml")
            model_type = ET.SubElement(content_types, "Default", Extension = "model", ContentType = "application/vnd.ms-package.3dmanufacturing-3dmodel+xml")

            # Create _rels/.rels file
            relations_file = zipfile.ZipInfo("_rels/.rels")
            relations_file.compress_type = zipfile.ZIP_DEFLATED
            relations_element = ET.Element("Relationships", xmlns = self._namespaces["relationships"])
            model_relation_element = ET.SubElement(relations_element, "Relationship", Target = "/" + MODEL_PATH, Id = "rel0", Type = "http://schemas.microsoft.com/3dmanufacturing/2013/01/3dmodel")
            relations_element = self._makeRelationsTree()
            model_relation_element = ET.SubElement(relations_element, "Relationship", Target="/" + MODEL_PATH,
                                                   Id="rel0",
                                                   Type="http://schemas.microsoft.com/3dmanufacturing/2013/01/3dmodel")

            # Create Metadata/_rels/model_settings.config.rels
            metadata_relations_element = self._makeRelationsTree()

            # Let the variant add its specific files
            variant.add_extra_files(archive, metadata_relations_element)

            # Let the variant prepare content types and relations
            variant.prepare_content_types(content_types)
            variant.prepare_relations(relations_element)

            # Attempt to add a thumbnail
            snapshot = self._createSnapshot()

@@ -233,16 +267,11 @@ class ThreeMFWriter(MeshWriter):
                thumbnail_buffer.open(QBuffer.OpenModeFlag.ReadWrite)
                snapshot.save(thumbnail_buffer, "PNG")

                thumbnail_file = zipfile.ZipInfo(THUMBNAIL_PATH)
                # Don't try to compress snapshot file, because the PNG is pretty much as compact as it will get
                archive.writestr(thumbnail_file, thumbnail_buffer.data())

                # Add PNG to content types file
                thumbnail_type = ET.SubElement(content_types, "Default", Extension="png", ContentType="image/png")
                # Add thumbnail relation to _rels/.rels file
                thumbnail_relation_element = ET.SubElement(relations_element, "Relationship",
                                                           Target="/" + THUMBNAIL_PATH, Id="rel1",
                                                           Type="http://schemas.openxmlformats.org/package/2006/relationships/metadata/thumbnail")

                # Let the variant process the thumbnail
                variant.process_thumbnail(snapshot, thumbnail_buffer, archive, relations_element)

            # Write material metadata
            packages_metadata = self._getMaterialPackageMetadata() + self._getPluginPackageMetadata()

@@ -305,8 +334,10 @@ class ThreeMFWriter(MeshWriter):
            scene_string = parser.sceneToString(savitar_scene)

            archive.writestr(model_file, scene_string)
            archive.writestr(content_types_file, b'<?xml version="1.0" encoding="UTF-8"?> \n' + ET.tostring(content_types))
            archive.writestr(relations_file, b'<?xml version="1.0" encoding="UTF-8"?> \n' + ET.tostring(relations_element))
            self._storeElementTree(archive, "[Content_Types].xml", content_types)
            self._storeElementTree(archive, "_rels/.rels", relations_element)
            if len(metadata_relations_element) > 0:
                self._storeElementTree(archive, "Metadata/_rels/model_settings.config.rels", metadata_relations_element)
        except Exception as error:
            Logger.logException("e", "Error writing zip file")
            self.setInformation(str(error))

@@ -319,6 +350,25 @@ class ThreeMFWriter(MeshWriter):

        return True

    @staticmethod
    def _storeElementTree(archive: zipfile.ZipFile, file_path: str, root_element: ET.Element):
        file = zipfile.ZipInfo(file_path)
        file.compress_type = zipfile.ZIP_DEFLATED
        archive.writestr(file, b'<?xml version="1.0" encoding="UTF-8"?> \n' + ET.tostring(root_element))

    def _makeRelationsTree(self):
        return ET.Element("Relationships", xmlns=self._namespaces["relationships"])

    @staticmethod
    def _getMaterialColor(extruder: "ExtruderStack") -> str:
        position = int(extruder.getMetaDataEntry("position", default="0"))
        try:
            default_color = ExtrudersModel.defaultColors[position]
        except IndexError:
            default_color = "#e0e000"
        color_code = extruder.material.getMetaDataEntry("color_code", default=default_color)
        return color_code.upper()

    @staticmethod
    def _storeMetadataJson(metadata: Dict[str, List[Dict[str, str]]], archive: zipfile.ZipFile, path: str) -> None:
        """Stores metadata inside archive path as json file"""
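Note that `_getVariant` falls back to `Cura3mfVariant` for unknown MIME types, so existing callers of `write()` that pass no `mime_type` keep their old behaviour. A sketch of the dispatch (here `writer` stands in for a constructed `ThreeMFWriter` instance):

```python
# Sketch of the mime_type dispatch added to write(); 'writer' is a stand-in
# for a constructed ThreeMFWriter instance.
bambu = writer._getVariant("application/vnd.bambulab-package.3dmanufacturing-3dmodel+xml")
default = writer._getVariant("application/unknown")  # falls back to Cura3mfVariant
```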
@@ -28,11 +28,17 @@ def getMetaData():
    metaData["mesh_writer"] = {
        "output": [
            {
                "extension": "3mf",
                "extension": workspace_extension,
                "description": i18n_catalog.i18nc("@item:inlistbox", "3MF file"),
                "mime_type": "application/vnd.ms-package.3dmanufacturing-3dmodel+xml",
                "mode": ThreeMFWriter.ThreeMFWriter.OutputMode.BinaryMode
            },
            {
                "extension": f"gcode.{workspace_extension}",
                "description": i18n_catalog.i18nc("@item:inlistbox", "BambuLab 3MF file"),
                "mime_type": "application/vnd.bambulab-package.3dmanufacturing-3dmodel+xml",
                "mode": ThreeMFWorkspaceWriter.ThreeMFWorkspaceWriter.OutputMode.BinaryMode
            }
        ]
    }
    metaData["workspace_writer"] = {

@@ -44,7 +50,7 @@ def getMetaData():
            "mode": ThreeMFWorkspaceWriter.ThreeMFWorkspaceWriter.OutputMode.BinaryMode
        },
        {
            "extension": "3mf",
            "extension": workspace_extension,
            "description": i18n_catalog.i18nc("@item:inlistbox", "Universal Cura Project"),
            "mime_type": "application/x-ucp",
            "mode": ThreeMFWorkspaceWriter.ThreeMFWorkspaceWriter.OutputMode.BinaryMode
@@ -1,4 +1,4 @@
# Copyright (c) 2020 Ultimaker B.V.
# Copyright (c) 2025 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.
import json
import threading

@@ -13,11 +13,14 @@ from UM.Message import Message
from UM.TaskManagement.HttpRequestManager import HttpRequestManager
from UM.TaskManagement.HttpRequestScope import JsonDecoratorScope
from UM.i18n import i18nCatalog
from cura.ApplicationMetadata import CuraSDKVersion
from cura.CuraApplication import CuraApplication
from cura.UltimakerCloud.UltimakerCloudScope import UltimakerCloudScope
import cura.UltimakerCloud.UltimakerCloudConstants as UltimakerCloudConstants

catalog = i18nCatalog("cura")

PACKAGES_URL = f"{UltimakerCloudConstants.CuraCloudAPIRoot}/cura-packages/v{UltimakerCloudConstants.CuraCloudAPIVersion}/cura/v{CuraSDKVersion}/packages"

class CreateBackupJob(Job):
    """Creates backup zip, requests upload url and uploads the backup file to cloud storage."""

@@ -40,23 +43,54 @@ class CreateBackupJob(Job):
        self._job_done = threading.Event()
        """Set when the job completes. Does not indicate success."""
        self.backup_upload_error_message = ""
        """After the job completes, an empty string indicates success. Othrerwise, the value is a translated message."""
        """After the job completes, an empty string indicates success. Otherwise, the value is a translated message."""

    def _setPluginFetchErrorMessage(self, error_msg: str) -> None:
        Logger.error(f"Fetching plugins for backup resulted in error: {error_msg}")
        self.backup_upload_error_message = "Couldn't update currently available plugins, backup stopped."
        self._upload_message.hide()
        self._job_done.set()

    def run(self) -> None:
        upload_message = Message(catalog.i18nc("@info:backup_status", "Creating your backup..."),
        self._upload_message = Message(catalog.i18nc("@info:backup_status", "Fetch re-downloadable package-ids..."),
                                       title = self.MESSAGE_TITLE,
                                       progress = -1)
        upload_message.show()
        self._upload_message.show()
        CuraApplication.getInstance().processEvents()

        if CuraApplication.getInstance().getCuraAPI().backups.shouldReinstallDownloadablePlugins():
            request_url = f"{PACKAGES_URL}?package_type=plugin"
            scope = JsonDecoratorScope(UltimakerCloudScope(CuraApplication.getInstance()))
            HttpRequestManager.getInstance().get(
                request_url,
                scope=scope,
                callback=self._continueRun,
                error_callback=lambda reply, error: self._setPluginFetchErrorMessage(str(error)),
            )
        else:
            self._continueRun()

    def _continueRun(self, reply: "QNetworkReply" = None) -> None:
        if reply is not None:
            response_data = HttpRequestManager.readJSON(reply)
            if "data" not in response_data:
                self._setPluginFetchErrorMessage(f"Missing 'data' from response. Keys in response: {response_data.keys()}")
                return
            available_remote_plugins = frozenset({v["package_id"] for v in response_data["data"]})
        else:
            available_remote_plugins = frozenset()

        self._upload_message.setText(catalog.i18nc("@info:backup_status", "Creating your backup..."))
        CuraApplication.getInstance().processEvents()
        cura_api = CuraApplication.getInstance().getCuraAPI()
        self._backup_zip, backup_meta_data = cura_api.backups.createBackup()
        self._backup_zip, backup_meta_data = cura_api.backups.createBackup(available_remote_plugins)

        if not self._backup_zip or not backup_meta_data:
            self.backup_upload_error_message = catalog.i18nc("@info:backup_status", "There was an error while creating your backup.")
            upload_message.hide()
            self._upload_message.hide()
            return

        upload_message.setText(catalog.i18nc("@info:backup_status", "Uploading your backup..."))
        self._upload_message.setText(catalog.i18nc("@info:backup_status", "Uploading your backup..."))
        CuraApplication.getInstance().processEvents()

        # Create an upload entry for the backup.

@@ -64,13 +98,18 @@ class CreateBackupJob(Job):
        backup_meta_data["description"] = "{}.backup.{}.cura.zip".format(timestamp, backup_meta_data["cura_release"])
        self._requestUploadSlot(backup_meta_data, len(self._backup_zip))

        self._job_done.wait()
        # Note: One 'process events' call wasn't enough with the changed situation somehow.
        for _ in range(5000):
            CuraApplication.getInstance().processEvents()
            if self._job_done.wait(0.02):
                break

        if self.backup_upload_error_message == "":
            upload_message.setText(catalog.i18nc("@info:backup_status", "Your backup has finished uploading."))
            upload_message.setProgress(None)  # Hide progress bar
            self._upload_message.setText(catalog.i18nc("@info:backup_status", "Your backup has finished uploading."))
            self._upload_message.setProgress(None)  # Hide progress bar
        else:
            # some error occurred. This error is presented to the user by DrivePluginExtension
            upload_message.hide()
            self._upload_message.hide()

    def _requestUploadSlot(self, backup_metadata: Dict[str, Any], backup_size: int) -> None:
        """Request a backup upload slot from the API.

@@ -83,7 +122,6 @@ class CreateBackupJob(Job):
                "metadata": backup_metadata
            }
        }).encode()

        HttpRequestManager.getInstance().put(
            self._api_backup_url,
            data = payload,
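The loop that replaces the blocking `wait()` bounds the total wait while keeping the UI responsive; the arithmetic behind the constants used above:

```python
# Bounded event-pumping wait: at most 5000 iterations x 0.02 s per wait()
# = 100 s before the loop gives up, with the Qt event loop pumped between
# polls instead of blocking the thread indefinitely.
```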
@ -1,8 +1,9 @@
|
|||
# Copyright (c) 2021 Ultimaker B.V.
|
||||
# Copyright (c) 2025 UltiMaker
|
||||
# Cura is released under the terms of the LGPLv3 or higher.
|
||||
|
||||
import base64
|
||||
import hashlib
|
||||
import json
|
||||
import os
|
||||
import threading
|
||||
from tempfile import NamedTemporaryFile
|
||||
from typing import Optional, Any, Dict
|
||||
|
@ -12,9 +13,16 @@ from PyQt6.QtNetwork import QNetworkReply, QNetworkRequest
|
|||
from UM.Job import Job
|
||||
from UM.Logger import Logger
|
||||
from UM.PackageManager import catalog
|
||||
from UM.Resources import Resources
|
||||
from UM.TaskManagement.HttpRequestManager import HttpRequestManager
|
||||
from cura.CuraApplication import CuraApplication
|
||||
from UM.Version import Version
|
||||
|
||||
from cura.ApplicationMetadata import CuraSDKVersion
|
||||
from cura.CuraApplication import CuraApplication
|
||||
from cura.UltimakerCloud.UltimakerCloudScope import UltimakerCloudScope
|
||||
import cura.UltimakerCloud.UltimakerCloudConstants as UltimakerCloudConstants
|
||||
|
||||
PACKAGES_URL_TEMPLATE = f"{UltimakerCloudConstants.CuraCloudAPIRoot}/cura-packages/v{UltimakerCloudConstants.CuraCloudAPIVersion}/cura/v{{0}}/packages/{{1}}/download"
|
||||
|
||||
class RestoreBackupJob(Job):
|
||||
"""Downloads a backup and overwrites local configuration with the backup.
|
||||
|
@ -38,7 +46,6 @@ class RestoreBackupJob(Job):
        self.restore_backup_error_message = ""

    def run(self) -> None:

        url = self._backup.get("download_url")
        assert url is not None

@ -48,7 +55,11 @@ class RestoreBackupJob(Job):
            error_callback = self._onRestoreRequestCompleted
        )

        self._job_done.wait()  # A job is considered finished when the run function completes
        # Note: Just to be sure, use the same structure here as in CreateBackupJob.
        for _ in range(5000):
            CuraApplication.getInstance().processEvents()
            if self._job_done.wait(0.02):
                break

    def _onRestoreRequestCompleted(self, reply: QNetworkReply, error: Optional["QNetworkReply.NetworkError"] = None) -> None:
        if not HttpRequestManager.replyIndicatesSuccess(reply, error):

@ -60,8 +71,8 @@ class RestoreBackupJob(Job):

        # We store the file in a temporary path first to ensure integrity.
        try:
            temporary_backup_file = NamedTemporaryFile(delete = False)
            with open(temporary_backup_file.name, "wb") as write_backup:
            self._temporary_backup_file = NamedTemporaryFile(delete_on_close = False)
            with open(self._temporary_backup_file.name, "wb") as write_backup:
                app = CuraApplication.getInstance()
                bytes_read = reply.read(self.DISK_WRITE_BUFFER_SIZE)
                while bytes_read:

@ -69,23 +80,98 @@ class RestoreBackupJob(Job):
                    bytes_read = reply.read(self.DISK_WRITE_BUFFER_SIZE)
                    app.processEvents()
        except EnvironmentError as e:
            Logger.log("e", f"Unable to save backed up files due to computer limitations: {str(e)}")
            Logger.error(f"Unable to save backed up files due to computer limitations: {str(e)}")
            self.restore_backup_error_message = self.DEFAULT_ERROR_MESSAGE
            self._job_done.set()
            return

        if not self._verifyMd5Hash(temporary_backup_file.name, self._backup.get("md5_hash", "")):
        if not self._verifyMd5Hash(self._temporary_backup_file.name, self._backup.get("md5_hash", "")):
            # Don't restore the backup if the MD5 hashes do not match.
            # This can happen if the download was interrupted.
            Logger.log("w", "Remote and local MD5 hashes do not match, not restoring backup.")
            Logger.error("Remote and local MD5 hashes do not match, not restoring backup.")
            self.restore_backup_error_message = self.DEFAULT_ERROR_MESSAGE
            self._job_done.set()
            return

        # Tell Cura to place the backup back in the user data folder.
        with open(temporary_backup_file.name, "rb") as read_backup:
        metadata = self._backup.get("metadata", {})
        with open(self._temporary_backup_file.name, "rb") as read_backup:
            cura_api = CuraApplication.getInstance().getCuraAPI()
            cura_api.backups.restoreBackup(read_backup.read(), self._backup.get("metadata", {}))
            cura_api.backups.restoreBackup(read_backup.read(), metadata, auto_close=False)

        self._job_done.set()
        # Read packages data-file, to get the 'to_install' plugin-ids.
        version_to_restore = Version(metadata.get("cura_release", "dev"))
        version_str = f"{version_to_restore.getMajor()}.{version_to_restore.getMinor()}"
        packages_path = os.path.abspath(os.path.join(os.path.abspath(
            Resources.getConfigStoragePath()), "..", version_str, "packages.json"))
        if not os.path.exists(packages_path):
            Logger.error(f"Can't find path '{packages_path}' to tell what packages should be redownloaded.")
            self.restore_backup_error_message = self.DEFAULT_ERROR_MESSAGE
            self._job_done.set()
            return

        to_install = {}
        try:
            with open(packages_path, "r") as packages_file:
                packages_json = json.load(packages_file)
                if "to_install" in packages_json:
                    for package_data in packages_json["to_install"].values():
                        if "package_info" not in package_data:
                            continue
                        package_info = package_data["package_info"]
                        if "package_id" in package_info and "sdk_version_semver" in package_info:
                            to_install[package_info["package_id"]] = package_info["sdk_version_semver"]
        except IOError as ex:
            Logger.error(f"Couldn't open '{packages_path}' because '{str(ex)}' to get packages to re-install.")
            self.restore_backup_error_message = self.DEFAULT_ERROR_MESSAGE
            self._job_done.set()
            return

        if len(to_install) < 1:
            Logger.info("No packages to reinstall, early out.")
            self._job_done.set()
            return

        # Download all re-installable plugin packages, so they can be put back on start-up.
        redownload_errors = []
        def packageDownloadCallback(package_id: str, msg: "QNetworkReply", err: "QNetworkReply.NetworkError" = None) -> None:
            if err is not None or HttpRequestManager.safeHttpStatus(msg) != 200:
                redownload_errors.append(err)
            del to_install[package_id]

            try:
                with NamedTemporaryFile(mode="wb", suffix=".curapackage", delete=False) as temp_file:
                    bytes_read = msg.read(self.DISK_WRITE_BUFFER_SIZE)
                    while bytes_read:
                        temp_file.write(bytes_read)
                        bytes_read = msg.read(self.DISK_WRITE_BUFFER_SIZE)
                        CuraApplication.getInstance().processEvents()
                    temp_file.close()
                    if not CuraApplication.getInstance().getPackageManager().installPackage(temp_file.name):
                        redownload_errors.append(f"Couldn't install package '{package_id}'.")
            except IOError as ex:
                redownload_errors.append(f"Couldn't process package '{package_id}' because '{ex}'.")

            if len(to_install) < 1:
                if len(redownload_errors) == 0:
                    Logger.info("All packages redownloaded!")
                    self._job_done.set()
                else:
                    msgs = "\n - ".join(redownload_errors)
                    Logger.error(f"Couldn't re-install at least one package because: {msgs}")
                    self.restore_backup_error_message = self.DEFAULT_ERROR_MESSAGE
                    self._job_done.set()

        self._package_download_scope = UltimakerCloudScope(CuraApplication.getInstance())
        for package_id, package_api_version in to_install.items():
            def handlePackageId(package_id: str = package_id):
                HttpRequestManager.getInstance().get(
                    PACKAGES_URL_TEMPLATE.format(package_api_version, package_id),
                    scope=self._package_download_scope,
                    callback=lambda msg: packageDownloadCallback(package_id, msg),
                    error_callback=lambda msg, err: packageDownloadCallback(package_id, msg, err)
                )
            handlePackageId(package_id)

    @staticmethod
    def _verifyMd5Hash(file_path: str, known_hash: str) -> bool:
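The restore only proceeds when the temporary file's MD5 digest matches the hash recorded for the backup, which is what catches truncated or corrupted downloads. A minimal standalone sketch of that kind of streaming check (hashlib only; the function name and chunk size are illustrative, and Cura's own _verifyMd5Hash may encode the digest differently):

import hashlib

def verify_md5(file_path: str, known_hash: str, chunk_size: int = 8192) -> bool:
    # Stream the file through the digest so large backups never need to fit in memory.
    digest = hashlib.md5()
    with open(file_path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest() == known_hash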
@ -76,7 +76,7 @@ class GcodeStartEndFormatter:
    # will be used. Alternatively, if the expression is formatted as "{[expression], [extruder_nr]}",
    # then the expression will be evaluated with the extruder stack of the specified extruder_nr.

    _instruction_regex = re.compile(r"{(?P<condition>if|else|elif|endif)?\s*(?P<expression>.*?)\s*(?:,\s*(?P<extruder_nr_expr>.*))?\s*}(?P<end_of_line>\n?)")
    _instruction_regex = re.compile(r"{(?P<condition>if|else|elif|endif)?\s*(?P<expression>[^{}]*?)\s*(?:,\s*(?P<extruder_nr_expr>[^{}]*))?\s*}(?P<end_of_line>\n?)")

    def __init__(self, all_extruder_settings: Dict[str, Dict[str, Any]], default_extruder_nr: int = -1) -> None:
        super().__init__()
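The regex change swaps the greedy ".*" groups for "[^{}]", so one instruction can no longer swallow a neighbouring one on the same line. A quick demonstration of the difference (runnable as-is; the sample line is invented):

import re

OLD = re.compile(r"{(?P<condition>if|else|elif|endif)?\s*(?P<expression>.*?)\s*(?:,\s*(?P<extruder_nr_expr>.*))?\s*}(?P<end_of_line>\n?)")
NEW = re.compile(r"{(?P<condition>if|else|elif|endif)?\s*(?P<expression>[^{}]*?)\s*(?:,\s*(?P<extruder_nr_expr>[^{}]*))?\s*}(?P<end_of_line>\n?)")

line = "{material_print_temperature, 0} {material_print_temperature, 1}"
print(OLD.match(line).group("extruder_nr_expr"))  # '0} {material_print_temperature, 1' -- the greedy .* runs past the first }
print(NEW.match(line).group("extruder_nr_expr"))  # '0' -- [^{}] cannot cross a brace, so each instruction matches separately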
@ -24,7 +24,7 @@ class GCodeGzWriter(MeshWriter):
    def __init__(self) -> None:
        super().__init__(add_to_recent_files = False)

    def write(self, stream: BufferedIOBase, nodes: List[SceneNode], mode = MeshWriter.OutputMode.BinaryMode) -> bool:
    def write(self, stream: BufferedIOBase, nodes: List[SceneNode], mode = MeshWriter.OutputMode.BinaryMode, **kwargs) -> bool:
        """Writes the gzipped g-code to a stream.

        Note that even though the function accepts a collection of nodes, the
@ -298,8 +298,14 @@ class FlavorParser:
            position.e.extend([0] * (self._extruder_number - len(position.e) + 1))
            return position

    def processMCode(self, M: int, line: str, position: Position, path: List[List[Union[float, int]]]) -> Position:
        pass
    def processMCode(self, M: int, line: str, position: Position, path: List[List[Union[float, int]]]) -> None:
        # Set extrusion mode
        if M == 82:
            # Set absolute extrusion mode
            self._is_absolute_extrusion = True
        elif M == 83:
            # Set relative extrusion mode
            self._is_absolute_extrusion = False

    _type_keyword = ";TYPE:"
    _layer_keyword = ";LAYER:"
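processMCode is now handled in the base FlavorParser (and dropped from RepRapFlavorParser below), so every flavor gets the M82/M83 extrusion-mode switch. What that mode actually changes is how an E word is interpreted on later moves; a standalone sketch (not Cura's parser; the helper is invented):

def filament_used(lines):
    # M82: E values are absolute positions; M83: E values are relative deltas.
    absolute, last_e, used = True, 0.0, 0.0
    for line in lines:
        if line.startswith("M82"):
            absolute = True
        elif line.startswith("M83"):
            absolute = False
        elif line.startswith("G1") and " E" in line:
            e = float(line.split(" E")[1].split()[0])
            used += (e - last_e) if absolute else e
            last_e = e if absolute else last_e + e
    return used

print(filament_used(["M83", "G1 X0 E2.5", "G1 X10 E2.5"]))  # 5.0 in relative mode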
@ -11,14 +11,6 @@ class RepRapFlavorParser(FlavorParser.FlavorParser):
    def __init__(self):
        super().__init__()

    def processMCode(self, M, line, position, path):
        if M == 82:
            # Set absolute extrusion mode
            self._is_absolute_extrusion = True
        elif M == 83:
            # Set relative extrusion mode
            self._is_absolute_extrusion = False

    def _gCode90(self, position, params, path):
        """Set the absolute positioning
@ -56,7 +56,7 @@ class GCodeWriter(MeshWriter):

        self._application = Application.getInstance()

    def write(self, stream, nodes, mode = MeshWriter.OutputMode.TextMode):
    def write(self, stream, nodes, mode = MeshWriter.OutputMode.TextMode, **kwargs):
        """Writes the g-code for the entire scene to a stream.

        Note that even though the function accepts a collection of nodes, the
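GCodeWriter, GCodeGzWriter and MakerbotWriter all gain the same **kwargs widening: callers can pass writer-specific options through the common write() interface, and writers that don't recognize an option simply ignore it. A toy illustration of the pattern (class and option names invented):

import io

class BaseWriter:
    def write(self, stream, nodes, mode="binary", **kwargs) -> bool:
        raise NotImplementedError

class FancyWriter(BaseWriter):
    def write(self, stream, nodes, mode="binary", **kwargs) -> bool:
        nb_extruders = kwargs.get("nb_extruders", 2)  # only this writer cares about the option
        stream.write(f"extruders={nb_extruders}".encode())
        return True

buf = io.BytesIO()
FancyWriter().write(buf, nodes=[], nb_extruders=1)
print(buf.getvalue())  # b'extruders=1'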
@ -214,7 +214,7 @@ Item

    settingStoreIndex: propertyStoreIndex

    labelText: catalog.i18nc("@label", "Y min")
    labelText: catalog.i18nc("@label", "Y min ( '-' towards back)")
    labelFont: base.labelFont
    labelWidth: base.labelWidth
    controlWidth: base.controlWidth

@ -254,7 +254,7 @@ Item
    settingKey: "machine_head_with_fans_polygon"
    settingStoreIndex: propertyStoreIndex

    labelText: catalog.i18nc("@label", "Y max")
    labelText: catalog.i18nc("@label", "Y max ( '+' towards front)")
    labelFont: base.labelFont
    labelWidth: base.labelWidth
    controlWidth: base.controlWidth
@ -46,6 +46,13 @@ class MakerbotWriter(MeshWriter):
                suffixes=["makerbot"]
            )
        )
        MimeTypeDatabase.addMimeType(
            MimeType(
                name="application/x-makerbot-replicator_plus",
                comment="Makerbot Toolpath Package",
                suffixes=["makerbot"]
            )
        )

    _PNG_FORMAT = [
        {"prefix": "isometric_thumbnail", "width": 120, "height": 120},
@ -84,7 +91,7 @@ class MakerbotWriter(MeshWriter):

        return None

    def write(self, stream: BufferedIOBase, nodes: List[SceneNode], mode=MeshWriter.OutputMode.BinaryMode) -> bool:
    def write(self, stream: BufferedIOBase, nodes: List[SceneNode], mode=MeshWriter.OutputMode.BinaryMode, **kwargs) -> bool:
        metadata, file_format = self._getMeta(nodes)
        if mode != MeshWriter.OutputMode.BinaryMode:
            Logger.log("e", "MakerbotWriter does not support text mode.")
@ -114,6 +121,8 @@ class MakerbotWriter(MeshWriter):
                filename, filedata = "print.gcode", gcode_text_io.getvalue()
            case "application/x-makerbot":
                filename, filedata = "print.jsontoolpath", du.gcode_2_miracle_jtp(gcode_text_io.getvalue())
            case "application/x-makerbot-replicator_plus":
                filename, filedata = "print.jsontoolpath", du.gcode_2_miracle_jtp(gcode_text_io.getvalue(), nb_extruders=1)
            case _:
                raise Exception("Unsupported Mime type")
@ -249,11 +258,87 @@ class MakerbotWriter(MeshWriter):

        meta["preferences"] = dict()
        bounds = application.getBuildVolume().getBoundingBox()
        intent = CuraApplication.getInstance().getIntentManager().currentIntentCategory
        meta["preferences"]["instance0"] = {
            "machineBounds": [bounds.right, bounds.front, bounds.left, bounds.back] if bounds is not None else None,
            "printMode": CuraApplication.getInstance().getIntentManager().currentIntentCategory,
            "printMode": intent
        }

        if file_format == "application/x-makerbot":
            accel_overrides = meta["accel_overrides"] = {}
            if intent in ['highspeed', 'highspeedsolid']:
                accel_overrides['do_input_shaping'] = True
                accel_overrides['do_corner_rounding'] = True
            bead_mode_overrides = accel_overrides["bead_mode"] = {}

            accel_enabled = global_stack.getProperty('acceleration_enabled', 'value')

            if accel_enabled:
                global_accel_setting = global_stack.getProperty('acceleration_print', 'value')
                accel_overrides["rate_mm_per_s_sq"] = {
                    "x": global_accel_setting,
                    "y": global_accel_setting
                }

                if global_stack.getProperty('acceleration_travel_enabled', 'value'):
                    travel_accel_setting = global_stack.getProperty('acceleration_travel', 'value')
                    bead_mode_overrides['Travel Move'] = {
                        "rate_mm_per_s_sq": {
                            "x": travel_accel_setting,
                            "y": travel_accel_setting
                        }
                    }

            jerk_enabled = global_stack.getProperty('jerk_enabled', 'value')
            if jerk_enabled:
                global_jerk_setting = global_stack.getProperty('jerk_print', 'value')
                accel_overrides["max_speed_change_mm_per_s"] = {
                    "x": global_jerk_setting,
                    "y": global_jerk_setting
                }

                if global_stack.getProperty('jerk_travel_enabled', 'value'):
                    travel_jerk_setting = global_stack.getProperty('jerk_travel', 'value')
                    if 'Travel Move' not in bead_mode_overrides:
                        bead_mode_overrides['Travel Move'] = {}
                    bead_mode_overrides['Travel Move'].update({
                        "max_speed_change_mm_per_s": {
                            "x": travel_jerk_setting,
                            "y": travel_jerk_setting
                        }
                    })

            # Get bead mode settings per extruder
            available_bead_modes = {
                "infill": "FILL",
                "prime_tower": "PRIME_TOWER",
                "roofing": "TOP_SURFACE",
                "support_infill": "SUPPORT",
                "support_interface": "SUPPORT_INTERFACE",
                "wall_0": "WALL_OUTER",
                "wall_x": "WALL_INNER",
                "skirt_brim": "SKIRT"
            }
            for idx, extruder in enumerate(extruders):
                for bead_mode_setting, bead_mode_tag in available_bead_modes.items():
                    ext_specific_tag = "%s_%s" % (bead_mode_tag, idx)
                    if accel_enabled or jerk_enabled:
                        bead_mode_overrides[ext_specific_tag] = {}

                    if accel_enabled:
                        accel_val = extruder.getProperty('acceleration_%s' % bead_mode_setting, 'value')
                        bead_mode_overrides[ext_specific_tag]["rate_mm_per_s_sq"] = {
                            "x": accel_val,
                            "y": accel_val
                        }
                    if jerk_enabled:
                        jerk_val = extruder.getProperty('jerk_%s' % bead_mode_setting, 'value')
                        bead_mode_overrides[ext_specific_tag]["max_speed_change_mm_per_s"] = {
                            "x": jerk_val,
                            "y": jerk_val
                        }

        meta["miracle_config"] = {"gaggles": {"instance0": {}}}

        version_info = dict()
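For reference, the shape of the accel_overrides block the code above builds, shown for an invented single-extruder setup ('highspeed' intent, acceleration_print = 2000 mm/s², acceleration_travel = 3000 mm/s², all per-feature accelerations inheriting 2000, jerk disabled):

accel_overrides = {
    "do_input_shaping": True,
    "do_corner_rounding": True,
    "bead_mode": {
        "Travel Move": {"rate_mm_per_s_sq": {"x": 3000, "y": 3000}},
        "FILL_0": {"rate_mm_per_s_sq": {"x": 2000, "y": 2000}},
        # ...one entry per bead-mode tag and extruder index
    },
    "rate_mm_per_s_sq": {"x": 2000, "y": 2000},
}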
@ -25,6 +25,12 @@ def getMetaData():
                "description": catalog.i18nc("@item:inlistbox", "Makerbot Sketch Printfile"),
                "mime_type": "application/x-makerbot-sketch",
                "mode": MakerbotWriter.MakerbotWriter.OutputMode.BinaryMode,
            },
            {
                "extension": file_extension,
                "description": catalog.i18nc("@item:inlistbox", "Makerbot Replicator+ Printfile"),
                "mime_type": "application/x-makerbot-replicator_plus",
                "mode": MakerbotWriter.MakerbotWriter.OutputMode.BinaryMode,
            }
        ]
    },
@ -122,7 +122,7 @@ class Script:
        if not key in line or (';' in line and line.find(key) > line.find(';')):
            return default
        sub_part = line[line.find(key) + 1:]
        m = re.search('^-?[0-9]+\.?[0-9]*', sub_part)
        m = re.search(r'^-?[0-9]+\.?[0-9]*', sub_part)
        if m is None:
            return default
        try:
@ -338,7 +338,7 @@ class PauseAtHeight(Script):
                    nbr_negative_layers += 1

                # Track the latest printing temperature in order to resume at the correct temperature.
                if re.match("T(\d*)", line):
                if re.match(r"T(\d*)", line):
                    current_t = self.getValue(line, "T")
                m = self.getValue(line, "M")
                if m is not None and (m == 104 or m == 109) and self.getValue(line, "S") is not None:
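This raw-string fix and the one in Script.getValue above address the same warning: "\d" and "\." are not valid string-literal escapes, and CPython flags them (a SyntaxWarning since Python 3.12). A raw string hands the backslash to the regex engine untouched:

import re
print(re.match(r"T(\d*)", "T0 S210").group(1))  # '0'
# Without the r-prefix the pattern still happens to work, because unknown
# escapes are currently left as-is, but each one warns and may break later.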
988
plugins/PostProcessingPlugin/scripts/PurgeLinesAndUnload.py
Normal file
@ -0,0 +1,988 @@
# August 2024 - Designed by: GregValiant (Greg Foresi). Straightened out by: Hellaholic
#
# NOTE: You may have purge lines in your startup, or you may use this script, but you should not do both. The script will attempt to comment out existing StartUp purge lines.
# 'Add Purge Lines to StartUp' allows the user to determine where the purge lines are on the build plate, or to not use purge lines if a print extends to the limits of the build surface.
# This script will attempt to recognize and comment out purge lines in the StartUp Gcode, but they should be removed if using this script.
# The setting 'Purge Line Length' is only available for rectangular beds because I was too lazy to calculate the 45° arcs.
# 'Move to Start' takes an orthogonal path around the periphery before moving in to the print start location. It eliminates strings across the print area.
# 'Adjust Starting E' is a correction in the E location before the skirt/brim starts. The user can make an adjustment so that the skirt / brim / raft starts where it should.
# 'Unload' adds code to the Ending Gcode that will unload the filament from the machine. The unload distance is broken into chunks to avoid overly long E distances.
# Added extra moves to account for Cura adding a "Travel to Prime Tower" move that can cross the middle of the build surface.
# Added ability to take 'disallowed areas' into account.

import math
from ..Script import Script
from UM.Application import Application
from UM.Message import Message
import re
from UM.Logger import Logger
from enum import Enum


class Location(str, Enum):
    LEFT = "left"
    RIGHT = "right"
    REAR = "rear"
    FRONT = "front"


class Position(tuple, Enum):
    LEFT_FRONT = ("left", "front")
    RIGHT_FRONT = ("right", "front")
    LEFT_REAR = ("left", "rear")
    RIGHT_REAR = ("right", "rear")


class PurgeLinesAndUnload(Script):

    def __init__(self):
        super().__init__()
        self.global_stack = Application.getInstance().getGlobalContainerStack()
        self.extruder = self.global_stack.extruderList
        self.end_purge_location = None
        self.speed_travel = None
        # This will be True when there are more than 4 'machine_disallowed_areas'
        self.show_warning = False
        self.disallowed_areas = self.global_stack.getProperty("machine_disallowed_areas", "value")
        self.extruder = self.global_stack.extruderList
        self.extruder_count = self.global_stack.getProperty("machine_extruder_count", "value")
        self.bed_shape = self.global_stack.getProperty("machine_shape", "value")
        self.origin_at_center = self.global_stack.getProperty("machine_center_is_zero", "value")
        self.machine_width = self.global_stack.getProperty("machine_width", "value")
        self.machine_depth = self.global_stack.getProperty("machine_depth", "value")
        self.machine_left = 1.0
        self.machine_right = self.machine_width - 1.0
        self.machine_front = 1.0
        self.machine_back = self.machine_depth - 1.0
        self.start_x = None
        self.start_y = None

    def initialize(self) -> None:
        super().initialize()
        # Get the StartUp Gcode from Cura and attempt to catch if it contains purge lines. Message the user if an extrusion is in the startup.
        startup_gcode = self.global_stack.getProperty("machine_start_gcode", "value")
        start_lines = startup_gcode.splitlines()
        for line in start_lines:
            if "G1" in line and " E" in line and (" X" in line or " Y" in line):
                Message(title="[Purge Lines and Unload]",
                        text="It appears that there are 'purge lines' in the StartUp Gcode. Using the 'Add Purge Lines' function of this script will comment them out.").show()
                break
        # 'is_rectangular' is used to disable half-length purge lines for elliptic beds.
        self._instance.setProperty("is_rectangular", "value", True if self.global_stack.getProperty("machine_shape", "value") == "rectangular" else False)
        self._instance.setProperty("move_to_prime_tower", "value", True if self.global_stack.getProperty("machine_extruder_count", "value") > 1 else False)
        # Set the default E adjustment
        self._instance.setProperty("adjust_e_loc_to", "value", -abs(round(float(self.extruder[0].getProperty("retraction_amount", "value")), 1)))
    def getSettingDataString(self):
        return """{
            "name": "Purge Lines and Unload Filament",
            "key": "PurgeLinesAndUnload",
            "metadata": {},
            "version": 2,
            "settings":
            {
                "add_purge_lines":
                {
                    "label": "Add Purge Lines to StartUp",
                    "description": "The purge lines can be left, right, front or back. If there are purge lines present in the StartUp Gcode remove them or comment them out before using this script. You don't want to double dip.",
                    "type": "bool",
                    "default_value": false,
                    "value": false,
                    "enabled": true
                },
                "purge_line_location":
                {
                    "label": " Purge Line Location",
                    "description": "Which edge of the build plate should have the purge lines. If the printer is 'Elliptical' then it is assumed to be an 'Origin At Center' printer and the purge lines are 90° arcs.",
                    "type": "enum",
                    "options": {
                        "left": "On left edge (Xmin)",
                        "right": "On right edge (Xmax)",
                        "front": "On front edge (Ymin)",
                        "rear": "On back edge (Ymax)"},
                    "default_value": "left",
                    "enabled": "add_purge_lines"
                },
                "purge_line_length":
                {
                    "label": " Purge Line Length",
                    "description": "Select 'Full' for the entire Height or Width of the build plate. Select 'Half' for shorter purge lines. NOTE: This has no effect on elliptic beds.",
                    "type": "enum",
                    "options": {
                        "purge_full": "Full",
                        "purge_half": "Half"},
                    "default_value": "purge_full",
                    "enabled": "add_purge_lines and is_rectangular"
                },
                "border_distance":
                {
                    "label": " Border Distance",
                    "description": "This is the distance from the build plate edge to the first purge line. '0' works for most printers but you might want the lines further inboard. The allowable range is -12 to 12. ⚠️ Negative numbers are allowed for printers that have 'Disallowed Areas'. You must use due caution when using a negative value.",
                    "type": "int",
                    "unit": "mm ",
                    "default_value": 0,
                    "minimum_value": -12,
                    "maximum_value": 12,
                    "enabled": "add_purge_lines"
                },
                "prime_blob_enable":
                {
                    "label": " Start with Prime Blob",
                    "description": "Enable a stationary purge before starting the purge lines. Available only when the purge line location is 'left' or 'front'.",
                    "type": "bool",
                    "default_value": false,
                    "enabled": "add_purge_lines and purge_line_location in ['front', 'left']"
                },
                "prime_blob_distance":
                {
                    "label": " Blob Distance",
                    "description": "How many mm of filament should be extruded for the blob.",
                    "type": "int",
                    "default_value": 0,
                    "unit": "mm ",
                    "enabled": "add_purge_lines and prime_blob_enable and purge_line_location in ['front', 'left']"
                },
                "prime_blob_loc_x":
                {
                    "label": " Blob Location X",
                    "description": "The 'X' position to put the prime blob. 'Origin at Center' printers might require a negative value here. Keep in mind that purge lines always start in the left front, or the right rear. Pay attention or the nozzle can sit down into the prime blob.",
                    "type": "int",
                    "default_value": 0,
                    "unit": "mm ",
                    "enabled": "add_purge_lines and prime_blob_enable and purge_line_location in ['front', 'left']"
                },
                "prime_blob_loc_y":
                {
                    "label": " Blob Location Y",
                    "description": "The 'Y' position to put the prime blob. 'Origin at Center' printers might require a negative value here. Keep in mind that purge lines always start in the left front, or the right rear. Pay attention or the nozzle can sit down into the prime blob.",
                    "type": "int",
                    "default_value": 0,
                    "unit": "mm ",
                    "enabled": "add_purge_lines and prime_blob_enable and purge_line_location in ['front', 'left']"
                },
                "move_to_start":
                {
                    "label": "Circle around to layer start ⚠️",
                    "description": "Depending on where the 'Layer Start X' and 'Layer Start Y' are for the print, the opening travel move can pass across the print area and leave a string there. This option will generate an orthogonal path that moves the nozzle around the edges of the build plate and then comes in to the Start Point. || ⚠️ || The nozzle can drop to Z0.0 and touch the build plate at each stop in order to 'nail down the string'. The nozzle always raises after the touch-down. It will not drag on the bed.",
                    "type": "bool",
                    "default_value": false,
                    "enabled": true
                },
                "move_to_start_min_z":
                {
                    "label": " Minimum Z height ⚠️",
                    "description": "When moving to the start position, the nozzle can touch down on the build plate at each stop (Z = 0.0). That will stick the string to the build plate at each direction change so it doesn't pull across the print area. Some printers may not respond well to Z=0.0. You may set a minimum Z height here (min is 0.0 and max is 0.50). The string must stick or it defeats the purpose of moving around the periphery.",
                    "type": "float",
                    "default_value": 0.0,
                    "minimum_value": 0.0,
                    "maximum_value": 0.5,
                    "enabled": "move_to_start"
                },
                "adjust_starting_e":
                {
                    "label": "Adjust Starting E location",
                    "description": "If there is a retraction after the purge lines in the StartUp Gcode (like the 'Add Purge Lines' script here does) then often the skirt does not start where the nozzle starts. It is because Cura always adds a retraction prior to the print starting, which results in a double retraction. Enabling this will allow you to adjust the starting E location and tune it so the skirt/brim/model starts right where it should. To fix a blob enter a positive number. To fix a 'dry start' enter a negative number.",
                    "type": "bool",
                    "default_value": false,
                    "value": false,
                    "enabled": true
                },
                "adjust_e_loc_to":
                {
                    "label": " Starting E location",
                    "description": "This is usually a negative amount and often equal to the '-Retraction Distance'. This 'G92 E' adjustment changes where the printer 'thinks' the end of the filament is in relation to the nozzle. It replaces the retraction that Cura adds prior to the start of 'LAYER:0'. If retraction is not enabled then this setting has no effect.",
                    "type": "float",
                    "unit": "mm ",
                    "default_value": -6.5,
                    "enabled": "adjust_starting_e"
                },
                "enable_unload":
                {
                    "label": "Unload filament at print end",
                    "description": "Adds an unload script to the Ending Gcode section. It goes in just ahead of the M104 S0. This script always unloads the active extruder. If the unload distance is greater than 150mm it will be broken into chunks to avoid tripping the excessive extrusion warning in some firmware.",
                    "type": "bool",
                    "default_value": false,
                    "enabled": true
                },
                "unload_distance":
                {
                    "label": " Unload Distance",
                    "description": "The amount of filament to unload. Bowden printers usually require a significant amount and direct drives not as much.",
                    "type": "int",
                    "default_value": 440,
                    "unit": "mm ",
                    "enabled": "enable_unload"
                },
                "unload_quick_purge":
                {
                    "label": " Quick purge before unload",
                    "description": "When printing something fine that has a lot of retractions in a short space (like lettering or spires) right before the unload, the filament can get hung up in the hot end and the unload can fail. A quick purge will soften the end of the filament so it will retract correctly. This 'quick purge' will take place at the last position of the nozzle.",
                    "type": "bool",
                    "default_value": false,
                    "enabled": "enable_unload"
                },
                "move_to_prime_tower":
                {
                    "label": "Hidden setting",
                    "description": "Hidden setting that enables 'move_to_prime_tower' for multi extruder machines.",
                    "type": "bool",
                    "default_value": false,
                    "enabled": false
                },
                "is_rectangular":
                {
                    "label": "Bed is rectangular",
                    "description": "Hidden setting that disables 'purge line length' for elliptical beds.",
                    "type": "bool",
                    "default_value": false,
                    "enabled": false
                }
            }
        }"""
    def execute(self, data):
        # Exit if the Gcode has already been processed.
        for num in range(0, len(data)):
            layer = data[num].split("\n")
            for line in layer:
                if ";LAYER:" in line:
                    break
                elif "PurgeLinesAndUnload" in line:
                    Logger.log("i", "[Add Purge Lines and Unload Filament] has already run on this gcode.")
                    return data
        # '_get_initial_tool' also retrieves extruder settings used later in the script.
        # 't0_has_offsets' is used to exit 'Add Purge Lines' and 'Circle around...' because the script is not compatible with machines with the right nozzle as the primary nozzle.
        self.t0_has_offsets = False
        self.init_ext_nr = self._get_initial_tool()
        # Adjust the usable size of the bed per any 'disallowed areas'
        self._get_build_plate_extents()
        # The start location changes according to which quadrant the nozzle is in at the beginning
        self.end_purge_location = self._get_real_start_point(data[1])
        self.border_distance = self.getSettingValueByKey("border_distance")
        self.prime_blob_enable = self.getSettingValueByKey("prime_blob_enable")
        if self.prime_blob_enable:
            self.prime_blob_distance = self.getSettingValueByKey("prime_blob_distance")
        else:
            self.prime_blob_distance = 0
        # Set the minimum Z to stick the string to the build plate when 'Move to Start' is selected.
        self.touchdown_z = self.getSettingValueByKey("move_to_start_min_z")

        # Mapping of settings to their corresponding methods
        procedures = {
            "add_purge_lines": self._add_purge_lines,
            "move_to_prime_tower": self._move_to_prime_tower,
            "move_to_start": self._move_to_start,
            "adjust_starting_e": self._adjust_starting_e,
            "enable_unload": self._unload_filament
        }
        # Run the selected procedures
        for setting, method in procedures.items():
            if self.getSettingValueByKey(setting):
                method(data)
        # Format the startup and ending gcodes
        data[1] = self._format_string(data[1])
        data[-1] = self._format_string(data[-1])
        if self.getSettingValueByKey("add_purge_lines"):
            if self.show_warning:
                msg_text = ("The printer has ( " + str(len(self.disallowed_areas))
                            + " ) 'disallowed areas'. That can cause the area available for the purge lines to be small.\nOpen the Gcode file for preview in Cura and check the purge line location to ensure it is acceptable.")
            else:
                msg_text = "Open the Gcode file for preview in Cura. Make sure the 'Purge Lines' don't run underneath something else and are acceptable."
            Message(title="[Purge Lines and Unload]", text=msg_text).show()
        return data
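    # The 'procedures' table above maps each checkbox key to its bound method,
    # which keeps execute() flat: every enabled option simply runs its handler
    # in a fixed, predictable order. The same pattern in miniature (standalone,
    # with invented names):
    #     handlers = {"step_a": do_a, "step_b": do_b}
    #     for key, handler in handlers.items():
    #         if settings.get(key):
    #             handler(data)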
    def _get_real_start_point(self, first_section: str) -> tuple:
        last_x, last_y = 0.0, 0.0
        start_quadrant = Position.LEFT_FRONT

        for line in first_section.split("\n"):
            if line.startswith(";") and not line.startswith(";LAYER_COUNT") or not line:
                continue

            if line.startswith("G28"):
                last_x, last_y = 0.0, 0.0
            elif line[:3] in {"G0 ", "G1 "}:
                last_x = self.getValue(line, "X") if " X" in line else last_x
                last_y = self.getValue(line, "Y") if " Y" in line else last_y
            elif "LAYER_COUNT" in line:
                break

        midpoint_x, midpoint_y = (0.0, 0.0) if self.origin_at_center else (
            self.machine_width / 2, self.machine_depth / 2)

        if last_x <= midpoint_x and last_y <= midpoint_y:
            start_quadrant = Position.LEFT_FRONT
        elif last_x > midpoint_x and last_y <= midpoint_y:
            start_quadrant = Position.RIGHT_FRONT
        elif last_x > midpoint_x and last_y > midpoint_y:
            start_quadrant = Position.RIGHT_REAR
        elif last_x <= midpoint_x and last_y > midpoint_y:
            start_quadrant = Position.LEFT_REAR

        return start_quadrant
"""
|
||||
For some multi-extruder printers.
|
||||
Takes into account a 'Move to Prime Tower' if there is one and adds orthogonal travel moves to get there.
|
||||
'Move to Prime Tower' does not require that the prime tower is enabled,
|
||||
only that 'machine_extruder_start_position_?' is in the definition file.
|
||||
"""
|
||||
|
||||
def _move_to_prime_tower(self, first_section: str) -> str:
|
||||
if self.extruder_count == 1:
|
||||
return first_section
|
||||
adjustment_lines = ""
|
||||
move_to_prime_present = False
|
||||
prime_tower_x = self.global_stack.getProperty("prime_tower_position_x", "value")
|
||||
prime_tower_y = self.global_stack.getProperty("prime_tower_position_y", "value")
|
||||
prime_tower_loc = self._prime_tower_quadrant(prime_tower_x, prime_tower_y)
|
||||
# Shortstop an error if Start Location comes through as None
|
||||
if self.end_purge_location is None:
|
||||
self.end_purge_location = Position.LEFT_FRONT
|
||||
if prime_tower_loc != self.end_purge_location:
|
||||
startup = first_section[1].split("\n")
|
||||
for index, line in enumerate(startup):
|
||||
if ";LAYER_COUNT:" in line:
|
||||
try:
|
||||
if startup[index + 1].startswith("G0"):
|
||||
prime_move = startup[index + 1] + " ; Move to Prime Tower"
|
||||
adjustment_lines = self._move_to_location("Prime Tower", prime_tower_loc)
|
||||
startup[index + 1] = adjustment_lines + prime_move + "\n;---------------------[End of Prime Tower moves]\n" + startup[index]
|
||||
startup.pop(index)
|
||||
first_section[1] = "\n".join(startup)
|
||||
move_to_prime_present = True
|
||||
except IndexError:
|
||||
pass
|
||||
# The start_location changes to the prime tower location in case 'Move to Start' is enabled.
|
||||
if move_to_prime_present:
|
||||
self.end_purge_location = prime_tower_loc
|
||||
return first_section
|
||||
|
||||
# Determine the quadrant that the prime tower rests in so the orthogonal moves can be calculated
|
||||
def _prime_tower_quadrant(self, prime_tower_x: float, prime_tower_y: float) -> tuple:
|
||||
midpoint_x, midpoint_y = (0.0, 0.0) if self.origin_at_center else (
|
||||
self.machine_width / 2, self.machine_depth / 2)
|
||||
|
||||
if prime_tower_x < midpoint_x and prime_tower_y < midpoint_y:
|
||||
return Position.LEFT_FRONT
|
||||
elif prime_tower_x > midpoint_x and prime_tower_y < midpoint_y:
|
||||
return Position.RIGHT_FRONT
|
||||
elif prime_tower_x > midpoint_x and prime_tower_y > midpoint_y:
|
||||
return Position.RIGHT_REAR
|
||||
elif prime_tower_x < midpoint_x and prime_tower_y > midpoint_y:
|
||||
return Position.LEFT_REAR
|
||||
else:
|
||||
return Position.LEFT_FRONT # return Default in case of no match
|
||||
|
||||
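    # Both quadrant helpers above split the bed at its midpoint. On a 220 x 220
    # origin-at-front-left bed the midpoint is (110, 110), so for example
    # (illustrative coordinates):
    #     (50, 60)   -> Position.LEFT_FRONT
    #     (180, 40)  -> Position.RIGHT_FRONT
    #     (180, 200) -> Position.RIGHT_REAR
    #     (50, 200)  -> Position.LEFT_REAR
    # Origin-at-center machines compare against (0, 0) instead.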
    def _move_to_location(self, location_name: str, location: tuple) -> str:
        """
        Compare the input tuple (B) with the end purge location (A) and describe the move from A to B.
        Parameters:
            location_name (str): A descriptive name for the target location.
            location (tuple): The target tuple (e.g., ("right", "front")).
        Returns:
            str: G-code for the move from A to B, or an empty string if no move is required.
        """
        # Validate input
        if len(self.end_purge_location) != 2 or len(location) != 2:
            raise ValueError("Both locations must be tuples of length 2.")

        # Extract components
        start_side, start_depth = self.end_purge_location
        target_side, target_depth = location
        # Start of the moves and a comment to highlight the move
        moves = [f";MESH:NONMESH---------[Circle around to {location_name}] Start from: {str(start_side)} {str(start_depth)} Go to: {target_side} {target_depth}\nG0 F600 Z2 ; Move up\n"]

        # Helper function to add G-code for moves
        def add_move(axis: str, position: float) -> None:
            moves.append(
                f"G0 F{self.speed_travel} {axis}{position} ; Start move\n"
                f"G0 F600 Z{self.touchdown_z} ; Nail down the string\n"
                f"G0 F600 Z2 ; Move up\n"
            )

        # Move to a corner
        if start_side == Location.LEFT:
            moves.append(f"G0 F{self.speed_travel} X{self.machine_left + 6} ; Init move\n")
        elif start_side == Location.RIGHT:
            moves.append(f"G0 F{self.speed_travel} X{self.machine_right - 6} ; Init move\n")
        if start_depth == Location.FRONT:
            add_move("Y", self.machine_front + 6)
        elif start_depth == Location.REAR:
            add_move("Y", self.machine_back - 6)
        # Compare the sides
        if start_side != target_side:
            if target_side == Location.RIGHT:
                add_move("X", self.machine_right)
            else:
                add_move("X", self.machine_left)
        # Compare the depths
        if start_depth != target_depth:
            if target_depth == Location.REAR:
                add_move("Y", self.machine_back)
            else:
                add_move("Y", self.machine_front)
        if len(moves) == 1:
            moves.append(f"G0 F{self.speed_travel} Y{self.start_y} ; Move to start Y\n")
        # Combine the moves into a single G-code string and return
        return "".join(moves)
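    # As an illustration (values invented): moving from ("right", "rear") to
    # ("left", "front") on a bed with machine_right = 219 and machine_back = 219
    # produces an init move to X213, a pinned stop at Y213, then pinned stops at
    # X1 and Y1 -- each "pinned stop" being add_move()'s triple of travel,
    # drop to the touchdown Z, and lift back to Z2.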
    def _get_build_plate_extents(self):
        """
        Machine disallowed areas can be ordered at the whim of the definition author and cannot be counted on when parsed.
        This determines a simple rectangle that will be available for the purge lines. For some machines (Ex: UM3) it can be a small rectangle.
        If there are "extruder offsets" then use them to adjust the 'machine_right' and 'machine_back' independent of any disallowed areas.
        """
        if self.bed_shape == "rectangular":
            if self.disallowed_areas:
                if len(self.disallowed_areas) > 4:
                    self.show_warning = True
                mid_x = 0
                mid_y = 0
                left_x = -(self.machine_width / 2)
                right_x = (self.machine_width / 2)
                front_y = (self.machine_depth / 2)
                back_y = -(self.machine_depth / 2)
                for rect in self.disallowed_areas:
                    for corner in rect:
                        x = corner[0]
                        if mid_x > x > left_x:
                            left_x = x
                        if mid_x < x < right_x:
                            right_x = x
                        y = corner[1]
                        if mid_y < y < front_y:
                            front_y = y
                        if mid_y > y > back_y:
                            back_y = y
                if self.origin_at_center:
                    self.machine_left = round(left_x, 2)
                    self.machine_right = round(right_x, 2)
                    self.machine_front = round(front_y, 2)
                    self.machine_back = round(back_y, 2)
                else:
                    self.machine_left = round(left_x + self.machine_width / 2, 2)
                    self.machine_right = round(right_x + self.machine_width / 2, 2)
                    self.machine_front = round((self.machine_depth / 2) - front_y, 2)
                    self.machine_back = round((self.machine_depth / 2) - back_y, 2)
            else:
                if self.origin_at_center:
                    self.machine_left = round(-(self.machine_width / 2), 2)
                    self.machine_right = round((self.machine_width / 2) - self.nozzle_offset_x, 2)
                    self.machine_front = round(-(self.machine_depth / 2) + self.nozzle_offset_y, 2)
                    self.machine_back = round((self.machine_depth / 2) - self.nozzle_offset_y, 2)
                else:
                    self.machine_left = 0
                    self.machine_right = self.machine_width - self.nozzle_offset_x
                    if self.nozzle_offset_y >= 0:
                        self.machine_front = 0
                        self.machine_back = self.machine_depth - self.nozzle_offset_y
                    elif self.nozzle_offset_y < 0:
                        self.machine_front = abs(self.nozzle_offset_y)
                        self.machine_back = self.machine_depth
        return
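    # Example of the shrink performed above (numbers invented): on a 220 x 220
    # bed, a disallowed strip whose corners span x = -110..-90 in center-origin
    # coordinates pulls left_x in from -110 to -90, so an origin-at-front-left
    # machine ends up with machine_left = -90 + 220 / 2 = 20 mm for the purge area.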
    # Add the purge lines to the user-defined position on the build plate
    def _add_purge_lines(self, data: str):
        if self.t0_has_offsets:
            data[0] += "; [Purge Lines and Unload] 'Add Purge Lines' did not run because the assumed primary nozzle (T0) has tool offsets.\n"
            Message(title = "[Purge Lines and Unload]", text = "'Add Purge Lines' did not run because the assumed primary nozzle (T0) has tool offsets").show()
            return data

        def calculate_purge_volume(line_width, purge_length, volume_per_mm):
            return round((line_width * 0.3 * purge_length) * 1.25 / volume_per_mm, 5)

        def adjust_for_prime_blob_gcode(retract_speed, retract_distance):
            """Generates the G-code lines for the prime blob adjustment."""
            gcode_lines = [
                f"G1 F{retract_speed} E{retract_distance} ; Unretract",
                "G92 E0 ; Reset extruder\n"
            ]
            return "\n".join(gcode_lines)
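        # A worked example of calculate_purge_volume() (numbers invented): with
        # line_width = 0.4 mm, purge_length = 200 mm and volume_per_mm = 2.4 mm3
        # of plastic per mm of filament, E = (0.4 * 0.3 * 200) * 1.25 / 2.4 = 12.5.
        # The 0.3 matches the Z0.3 purge height used below, and the 1.25 adds 25%
        # over-extrusion so the nozzle is fully primed.
        #     >>> round((0.4 * 0.3 * 200) * 1.25 / 2.4, 5)
        #     12.5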
        purge_location = self.getSettingValueByKey("purge_line_location")
        purge_extrusion_full = True if self.getSettingValueByKey("purge_line_length") == "purge_full" else False
        purge_str = ";TYPE:CUSTOM----------[Purge Lines]\nG0 F600 Z2 ; Move up\nG92 E0 ; Reset extruder\n"
        purge_str += self._get_blob_code()
        # Normal cartesian printer with the origin at the left front corner
        if self.bed_shape == "rectangular" and not self.origin_at_center:
            if purge_location == Location.LEFT:
                purge_len = int(self.machine_back - 20) if purge_extrusion_full else int((self.machine_back - self.machine_front) / 2)
                y_stop = int(self.machine_back - 10) if purge_extrusion_full else int(self.machine_depth / 2)
                purge_volume = calculate_purge_volume(self.init_line_width, purge_len, self.mm3_per_mm)
                purge_str = purge_str.replace("Lines", "Lines at MinX")
                # Travel to the purge start
                purge_str += f"G0 F{self.speed_travel} X{self.machine_left + self.border_distance} Y{self.machine_front + 10} ; Move to start\n"
                purge_str += f"G0 F600 Z0.3 ; Move down\n"
                if self.prime_blob_enable:
                    purge_str += adjust_for_prime_blob_gcode(self.retract_speed, self.retract_dist)
                # Purge two lines
                purge_str += f"G1 F{self.print_speed} X{self.machine_left + self.border_distance} Y{y_stop} E{purge_volume} ; First line\n"
                purge_str += f"G0 X{self.machine_left + 3 + self.border_distance} Y{y_stop} ; Move over\n"
                purge_str += f"G1 F{self.print_speed} X{self.machine_left + 3 + self.border_distance} Y{self.machine_front + 10} E{round(purge_volume * 2, 5)} ; Second line\n"
                # Retract if enabled
                purge_str += f"G1 F{int(self.retract_speed)} E{round(purge_volume * 2 - self.retract_dist, 5)} ; Retract\n" if self.retraction_enable else ""
                purge_str += "G0 F600 Z8 ; Move Up\nG4 S1 ; Wait for 1 second\n"
                # Wipe
                purge_str += f"G0 F{self.print_speed} X{self.machine_left + 3 + self.border_distance} Y{self.machine_front + 20} Z0.3 ; Slide over and down\n"
                purge_str += f"G0 X{self.machine_left + 3 + self.border_distance} Y{self.machine_front + 35} ; Wipe\n"
                self.end_purge_location = Position.LEFT_FRONT
            elif purge_location == Location.RIGHT:
                purge_len = int(self.machine_depth - 20) if purge_extrusion_full else int((self.machine_back - self.machine_front) / 2)
                y_stop = int(self.machine_front + 10) if purge_extrusion_full else int(self.machine_depth / 2)
                purge_volume = calculate_purge_volume(self.init_line_width, purge_len, self.mm3_per_mm)
                purge_str = purge_str.replace("Lines", "Lines at MaxX")
                # Travel to the purge start
                purge_str += f"G0 F{self.speed_travel} X{self.machine_right - self.border_distance} ; Move\nG0 Y{self.machine_back - 10} ; Move\n"
                purge_str += f"G0 F600 Z0.3 ; Move down\n"
                if self.prime_blob_enable:
                    purge_str += adjust_for_prime_blob_gcode(self.retract_speed, self.retract_dist)
                # Purge two lines
                purge_str += f"G1 F{self.print_speed} X{self.machine_right - self.border_distance} Y{y_stop} E{purge_volume} ; First line\n"
                purge_str += f"G0 X{self.machine_right - 3 - self.border_distance} Y{y_stop} ; Move over\n"
                purge_str += f"G1 F{self.print_speed} X{self.machine_right - 3 - self.border_distance} Y{self.machine_back - 10} E{purge_volume * 2} ; Second line\n"
                # Retract if enabled
                purge_str += f"G1 F{int(self.retract_speed)} E{round(purge_volume * 2 - self.retract_dist, 5)} ; Retract\n" if self.retraction_enable else ""
                purge_str += "G0 F600 Z8 ; Move Up\nG4 S1 ; Wait for 1 second\n"
                # Wipe
                purge_str += f"G0 F{self.print_speed} X{self.machine_right - 3 - self.border_distance} Y{self.machine_back - 20} Z0.3 ; Slide over and down\n"
                purge_str += f"G0 X{self.machine_right - 3 - self.border_distance} Y{self.machine_back - 35} ; Wipe\n"
                self.end_purge_location = Position.RIGHT_REAR
            elif purge_location == Location.FRONT:
                purge_len = int(self.machine_width) - self.nozzle_offset_x - 20 if purge_extrusion_full else int(
                    (self.machine_right - self.machine_left) / 2)
                x_stop = int(self.machine_right - 10) if purge_extrusion_full else int(self.machine_width / 2)
                purge_volume = calculate_purge_volume(self.init_line_width, purge_len, self.mm3_per_mm)
                purge_str = purge_str.replace("Lines", "Lines at MinY")
                # Travel to the purge start
                purge_str += f"G0 F{self.speed_travel} X{self.machine_left + 10} Y{self.machine_front + self.border_distance} ; Move to start\n"
                purge_str += f"G0 F600 Z0.3 ; Move down\n"
                if self.prime_blob_enable:
                    purge_str += adjust_for_prime_blob_gcode(self.retract_speed, self.retract_dist)
                # Purge two lines
                purge_str += f"G1 F{self.print_speed} X{x_stop} Y{self.machine_front + self.border_distance} E{purge_volume} ; First line\n"
                purge_str += f"G0 X{x_stop} Y{self.machine_front + 3 + self.border_distance} ; Move over\n"
                purge_str += f"G1 F{self.print_speed} X{self.machine_left + 10} Y{self.machine_front + 3 + self.border_distance} E{purge_volume * 2} ; Second line\n"
                # Retract if enabled
                purge_str += f"G1 F{int(self.retract_speed)} E{round(purge_volume * 2 - self.retract_dist, 5)} ; Retract\n" if self.retraction_enable else ""
                purge_str += "G0 F600 Z8 ; Move Up\nG4 S1 ; Wait for 1 second\n"
                # Wipe
                purge_str += f"G0 F{self.print_speed} X{self.machine_left + 20} Y{self.machine_front + 3 + self.border_distance} Z0.3 ; Slide over and down\n"
                purge_str += f"G0 X{self.machine_left + 35} Y{self.machine_front + 3 + self.border_distance} ; Wipe\n"
                self.end_purge_location = Position.LEFT_FRONT
            elif purge_location == Location.REAR:
                purge_len = int(self.machine_width - 20) if purge_extrusion_full else int(
                    (self.machine_right - self.machine_left) / 2)
                x_stop = int(self.machine_left + 10) if purge_extrusion_full else int(self.machine_width / 2)
                purge_volume = calculate_purge_volume(self.init_line_width, purge_len, self.mm3_per_mm)
                purge_str = purge_str.replace("Lines", "Lines at MaxY")
                # Travel to the purge start
                purge_str += f"G0 F{self.speed_travel} Y{self.machine_back - self.border_distance} ; Ortho move to back\n"
                purge_str += f"G0 X{self.machine_right - 10} ; Ortho move to start\n"
                purge_str += f"G0 F600 Z0.3 ; Move down\n"
                if self.prime_blob_enable:
                    purge_str += adjust_for_prime_blob_gcode(self.retract_speed, self.retract_dist)
                # Purge two lines
                purge_str += f"G1 F{self.print_speed} X{x_stop} Y{self.machine_back - self.border_distance} E{purge_volume} ; First line\n"
                purge_str += f"G0 X{x_stop} Y{self.machine_back - 3 - self.border_distance} ; Move over\n"
                purge_str += f"G1 F{self.print_speed} X{self.machine_right - 10} Y{self.machine_back - 3 - self.border_distance} E{purge_volume * 2} ; Second line\n"
                # Retract if enabled
                purge_str += f"G1 F{int(self.retract_speed)} E{round(purge_volume * 2 - self.retract_dist, 5)} ; Retract\n" if self.retraction_enable else ""
                purge_str += "G0 F600 Z8 ; Move Up\nG4 S1 ; Wait 1 second\n"
                # Wipe
                purge_str += f"G0 F{self.print_speed} X{self.machine_right - 20} Y{self.machine_back - 3 - self.border_distance} Z0.3 ; Slide over and down\n"
                purge_str += f"G0 X{self.machine_right - 35} Y{self.machine_back - 3 - self.border_distance} ; Wipe\n"
                self.end_purge_location = Position.RIGHT_REAR
        # Some cartesian printers (BIBO, Weedo, MethodX, etc.) are Origin at Center
        elif self.bed_shape == "rectangular" and self.origin_at_center:
            if purge_location == Location.LEFT:
                purge_len = int(self.machine_back - self.machine_front - 20) if purge_extrusion_full else abs(
                    int(self.machine_front - 10))
                y_stop = int(self.machine_back - 10) if purge_extrusion_full else 0
                purge_volume = calculate_purge_volume(self.init_line_width, purge_len, self.mm3_per_mm)
                # Travel to the purge start
                purge_str += f"G0 F{self.speed_travel} X{self.machine_left + self.border_distance} Y{self.machine_front + 10} ; Move to start\n"
                purge_str += f"G0 F600 Z0.3 ; Move down\n"
                if self.prime_blob_enable:
                    purge_str += adjust_for_prime_blob_gcode(self.retract_speed, self.retract_dist)
                # Purge two lines
                purge_str += f"G1 F{self.print_speed} X{self.machine_left + self.border_distance} Y{y_stop} E{purge_volume} ; First line\n"
                purge_str += f"G0 X{self.machine_left + 3 + self.border_distance} Y{y_stop} ; Move over\n"
                purge_str += f"G1 F{self.print_speed} X{self.machine_left + 3 + self.border_distance} Y{self.machine_front + 10} E{round(purge_volume * 2, 5)} ; Second line\n"
                # Retract if enabled
                purge_str += f"G1 F{int(self.retract_speed)} E{round(purge_volume * 2 - self.retract_dist, 5)} ; Retract\n" if self.retraction_enable else ""
                purge_str += "G0 F600 Z8 ; Move Up\nG4 S1 ; Wait for 1 second\n"
                # Wipe
                purge_str += f"G0 F{self.print_speed} X{self.machine_left + 3 + self.border_distance} Y{self.machine_front + 20} Z0.3 ; Slide over and down\n"
                purge_str += f"G0 X{self.machine_left + 3 + self.border_distance} Y{self.machine_front + 35} ; Wipe\n"
                self.end_purge_location = Position.LEFT_FRONT
            elif purge_location == Location.RIGHT:
                purge_len = int(self.machine_back - 20) if purge_extrusion_full else int(
                    (self.machine_back - self.machine_front) / 2)
                y_stop = int(self.machine_front + 10) if purge_extrusion_full else 0
                purge_volume = calculate_purge_volume(self.init_line_width, purge_len, self.mm3_per_mm)
                # Travel to the purge start
                purge_str += f"G0 F{self.speed_travel} X{self.machine_right - self.border_distance} Z2 ; Move\nG0 Y{self.machine_back - 10} Z2 ; Move to start\n"
                purge_str += f"G0 F600 Z0.3 ; Move down\n"
                if self.prime_blob_enable:
                    purge_str += adjust_for_prime_blob_gcode(self.retract_speed, self.retract_dist)
                # Purge two lines
                purge_str += f"G1 F{self.print_speed} X{self.machine_right - self.border_distance} Y{y_stop} E{purge_volume} ; First line\n"
                purge_str += f"G0 X{self.machine_right - 3 - self.border_distance} Y{y_stop} ; Move over\n"
                purge_str += f"G1 F{self.print_speed} X{self.machine_right - 3 - self.border_distance} Y{self.machine_back - 10} E{purge_volume * 2} ; Second line\n"
                # Retract if enabled
                purge_str += f"G1 F{int(self.retract_speed)} E{round(purge_volume * 2 - self.retract_dist, 5)} ; Retract\n" if self.retraction_enable else ""
                purge_str += "G0 F600 Z8 ; Move Up\nG4 S1 ; Wait for 1 second\n"
                # Wipe
                purge_str += f"G0 F{self.print_speed} X{self.machine_right - 3 - self.border_distance} Y{self.machine_back - 20} Z0.3 ; Slide over and down\n"
                purge_str += f"G0 X{self.machine_right - 3 - self.border_distance} Y{self.machine_back - 35} ; Wipe\n"
                self.end_purge_location = Position.RIGHT_REAR
            elif purge_location == Location.FRONT:
                purge_len = int(self.machine_right - self.machine_left - 20) if purge_extrusion_full else int(
                    (self.machine_right - self.machine_left) / 2)
                x_stop = int(self.machine_right - 10) if purge_extrusion_full else 0
                purge_volume = calculate_purge_volume(self.init_line_width, purge_len, self.mm3_per_mm)
                # Travel to the purge start
                purge_str += f"G0 F{self.speed_travel} X{self.machine_left + 10} Z2 ; Move\nG0 Y{self.machine_front + self.border_distance} Z2 ; Move to start\n"
                purge_str += f"G0 F600 Z0.3 ; Move down\n"
                if self.prime_blob_enable:
                    purge_str += adjust_for_prime_blob_gcode(self.retract_speed, self.retract_dist)
                # Purge two lines
                purge_str += f"G1 F{self.print_speed} X{x_stop} Y{self.machine_front + self.border_distance} E{purge_volume} ; First line\n"
                purge_str += f"G0 X{x_stop} Y{self.machine_front + 3 + self.border_distance} ; Move over\n"
                purge_str += f"G1 F{self.print_speed} X{self.machine_left + 10} Y{self.machine_front + 3 + self.border_distance} E{purge_volume * 2} ; Second line\n"
                # Retract if enabled
                purge_str += f"G1 F{int(self.retract_speed)} E{round(purge_volume * 2 - self.retract_dist, 5)} ; Retract\n" if self.retraction_enable else ""
                purge_str += "G0 F600 Z8 ; Move Up\nG4 S1 ; Wait for 1 second\n"
                # Wipe
                purge_str += f"G0 F{self.print_speed} X{self.machine_left + 20} Y{self.machine_front + 3 + self.border_distance} Z0.3 ; Slide over and down\n"
                purge_str += f"G0 X{self.machine_left + 35} Y{self.machine_front + 3 + self.border_distance} ; Wipe\n"
                self.end_purge_location = Position.LEFT_FRONT
            elif purge_location == Location.REAR:
                purge_len = int(self.machine_right - self.machine_left - 20) if purge_extrusion_full else abs(
                    int(self.machine_right - 10))
                x_stop = int(self.machine_left + 10) if purge_extrusion_full else 0
                purge_volume = calculate_purge_volume(self.init_line_width, purge_len, self.mm3_per_mm)
                # Travel to the purge start
                purge_str += f"G0 F{self.speed_travel} Y{self.machine_back - self.border_distance} Z2 ; Ortho move to back\n"
                purge_str += f"G0 X{self.machine_right - 10} Z2 ; Ortho move to start\n"
                purge_str += f"G0 F600 Z0.3 ; Move down\n"
                if self.prime_blob_enable:
                    purge_str += adjust_for_prime_blob_gcode(self.retract_speed, self.retract_dist)
                # Purge two lines
                purge_str += f"G1 F{self.print_speed} X{x_stop} Y{self.machine_back - self.border_distance} E{purge_volume} ; First line\n"
                purge_str += f"G0 X{x_stop} Y{self.machine_back - 3 - self.border_distance} ; Move over\n"
                purge_str += f"G1 F{self.print_speed} X{self.machine_right - 10} Y{self.machine_back - 3 - self.border_distance} E{purge_volume * 2} ; Second line\n"
                # Retract if enabled
                purge_str += f"G1 F{int(self.retract_speed)} E{round(purge_volume * 2 - self.retract_dist, 5)} ; Retract\n" if self.retraction_enable else ""
                purge_str += "G0 F600 Z8 ; Move Up\nG4 S1 ; Wait for 1 second\n"
                # Wipe
                purge_str += f"G0 F{self.print_speed} X{self.machine_right - 20} Y{self.machine_back - 3 - self.border_distance} Z0.3 ; Slide over and down\n"
                purge_str += f"G0 X{self.machine_right - 35} Y{self.machine_back - 3 - self.border_distance} ; Wipe\n"
                self.end_purge_location = Position.RIGHT_REAR
        # Elliptic printers with Origin at Center
        elif self.bed_shape == "elliptic":
            if purge_location in [Location.LEFT, Location.RIGHT]:
                radius_1 = round((self.machine_width / 2) - 1, 2)
            else:  # For purge_location in [Location.FRONT, Location.REAR]
                radius_1 = round((self.machine_depth / 2) - 1, 2)
            purge_len = int(radius_1) * math.pi / 4
            purge_volume = calculate_purge_volume(self.init_line_width, purge_len, self.mm3_per_mm)
            if purge_location == Location.LEFT:
                # Travel to the purge start
                purge_str += f"G0 F{self.speed_travel} X-{round(radius_1 * .707, 2)} Y-{round(radius_1 * .707, 2)} ; Travel\n"
                purge_str += f"G0 F600 Z0.3 ; Move down\n"
                # Purge two arcs
                purge_str += f"G2 F{self.print_speed} X-{round(radius_1 * .707, 2)} Y{round(radius_1 * .707, 2)} I{round(radius_1 * .707, 2)} J{round(radius_1 * .707, 2)} E{purge_volume} ; First Arc\n"
                purge_str += f"G0 X-{round((radius_1 - 3) * .707, 2)} Y{round((radius_1 - 3) * .707, 2)} ; Move Over\n"
                purge_str += f"G3 F{self.print_speed} X-{round((radius_1 - 3) * .707, 2)} Y-{round((radius_1 - 3) * .707, 2)} I{round((radius_1 - 3) * .707, 2)} J-{round((radius_1 - 3) * .707, 2)} E{purge_volume * 2} ; Second Arc\n"
                purge_str += f"G1 X-{round((radius_1 - 3) * .707 - 25, 2)} E{round(purge_volume * 2 + 1, 5)} ; Move Over\n"
                # Retract if enabled
                purge_str += f"G1 F{int(self.retract_speed)} E{round((purge_volume * 2 + 1) - self.retract_dist, 5)} ; Retract\n" if self.retraction_enable else ""
                purge_str += "G0 F600 Z5 ; Move Up\nG4 S1 ; Wait 1 Second\n"
                # Wipe
                purge_str += f"G0 F{self.print_speed} X-{round((radius_1 - 3) * .707 - 15, 2)} Z0.3 ; Slide Over\n"
                purge_str += f"G0 F{self.print_speed} X-{round((radius_1 - 3) * .707, 2)} ; Wipe\n"
                self.end_purge_location = Position.LEFT_FRONT
            elif purge_location == Location.RIGHT:
                # Travel to the purge start
                purge_str += f"G0 F{self.speed_travel} X{round(radius_1 * .707, 2)} Y-{round(radius_1 * .707, 2)} ; Travel\n"
                purge_str += f"G0 F600 Z0.3 ; Move down\n"
                # Purge two arcs
                purge_str += f"G3 F{self.print_speed} X{round(radius_1 * .707, 2)} Y{round(radius_1 * .707, 2)} I-{round(radius_1 * .707, 2)} J{round(radius_1 * .707, 2)} E{purge_volume} ; First Arc\n"
                purge_str += f"G0 X{round((radius_1 - 3) * .707, 2)} Y{round((radius_1 - 3) * .707, 2)} ; Move Over\n"
                purge_str += f"G2 F{self.print_speed} X{round((radius_1 - 3) * .707, 2)} Y-{round((radius_1 - 3) * .707, 2)} I-{round((radius_1 - 3) * .707, 2)} J-{round((radius_1 - 3) * .707, 2)} E{purge_volume * 2} ; Second Arc\n"
                purge_str += f"G1 X{round((radius_1 - 3) * .707 - 25, 2)} E{round(purge_volume * 2 + 1, 5)} ; Move Over\n"
                # Retract if enabled
                purge_str += f"G1 F{int(self.retract_speed)} E{round((purge_volume * 2 + 1) - self.retract_dist, 5)} ; Retract\n" if self.retraction_enable else ""
                purge_str += "G0 F600 Z5 ; Move Up\nG4 S1 ; Wait 1 Second\n"
                # Wipe
                purge_str += f"G0 F{self.print_speed} X{round((radius_1 - 3) * .707 - 15, 2)} Z0.3 ; Slide Over\n"
                purge_str += f"G0 F{self.print_speed} X{round((radius_1 - 3) * .707, 2)} ; Wipe\n"
                self.end_purge_location = Position.RIGHT_REAR
            elif purge_location == Location.FRONT:
                # Travel to the purge start
                purge_str += f"G0 F{self.speed_travel} X-{round(radius_1 * .707, 2)} Y-{round(radius_1 * .707, 2)} ; Travel\n"
                purge_str += f"G0 F600 Z0.3 ; Move down\n"
                # Purge two arcs
                purge_str += f"G3 F{self.print_speed} X{round(radius_1 * .707, 2)} Y-{round(radius_1 * .707, 2)} I{round(radius_1 * .707, 2)} J{round(radius_1 * .707, 2)} E{purge_volume} ; First Arc\n"
                purge_str += f"G0 X{round((radius_1 - 3) * .707, 2)} Y-{round((radius_1 - 3) * .707, 2)} ; Move Over\n"
                purge_str += f"G2 F{self.print_speed} X-{round((radius_1 - 3) * .707, 2)} Y-{round((radius_1 - 3) * .707, 2)} I-{round((radius_1 - 3) * .707, 2)} J{round((radius_1 - 3) * .707, 2)} E{purge_volume * 2} ; Second Arc\n"
                purge_str += f"G1 Y-{round((radius_1 - 3) * .707 - 25, 2)} E{round(purge_volume * 2 + 1, 5)} ; Move Over\n"
                # Retract if enabled
                purge_str += f"G1 F{int(self.retract_speed)} E{round((purge_volume * 2 + 1) - self.retract_dist, 5)} ; Retract\n" if self.retraction_enable else ""
                purge_str += "G0 F600 Z5 ; Move Up\nG4 S1 ; Wait 1 Second\n"
                # Wipe
                purge_str += f"G0 F{self.print_speed} Y-{round((radius_1 - 3) * .707 - 15, 2)} Z0.3 ; Slide Over\n"
                purge_str += f"G0 F{self.print_speed} Y-{round((radius_1 - 3) * .707, 2)} ; Wipe\n"
                self.end_purge_location = Position.LEFT_FRONT
            elif purge_location == Location.REAR:
                # Travel to the purge start
                purge_str += f"G0 F{self.speed_travel} X{round(radius_1 * .707, 2)} Y{round(radius_1 * .707, 2)} ; Travel\n"
                purge_str += f"G0 F600 Z0.3 ; Move down\n"
                # Purge two arcs
                purge_str += f"G3 F{self.print_speed} X-{round(radius_1 * .707, 2)} Y{round(radius_1 * .707, 2)} I-{round(radius_1 * .707, 2)} J-{round(radius_1 * .707, 2)} E{purge_volume} ; First Arc\n"
                purge_str += f"G0 X-{round((radius_1 - 3) * .707, 2)} Y{round((radius_1 - 3) * .707, 2)} ; Move Over\n"
                purge_str += f"G2 F{self.print_speed} X{round((radius_1 - 3) * .707, 2)} Y{round((radius_1 - 3) * .707, 2)} I{round((radius_1 - 3) * .707, 2)} J-{round((radius_1 - 3) * .707, 2)} E{purge_volume * 2} ; Second Arc\n"
                purge_str += f"G1 Y{round((radius_1 - 3) * .707 - 25, 2)} E{round(purge_volume * 2 + 1, 5)} ; Move Over\n"
                # Retract if enabled
                purge_str += f"G1 F{int(self.retract_speed)} E{round((purge_volume * 2 + 1) - self.retract_dist, 5)} ; Retract\n" if self.retraction_enable else ""
                purge_str += "G0 F600 Z5\nG4 S1 ; Wait 1 Second\n"
                # Wipe
                purge_str += f"G0 F{self.print_speed} Y{round((radius_1 - 3) * .707 - 15, 2)} Z0.3 ; Slide Over\n"
                purge_str += f"G0 F{self.print_speed} Y{round((radius_1 - 3) * .707, 2)} ; Wipe\n"
                self.end_purge_location = Position.RIGHT_REAR
|
||||
# Common ending for purge_str
|
||||
purge_str += "G0 F600 Z2 ; Move Z\n;---------------------[End of Purge]"
|
||||
|
||||
# Comment out any existing purge lines in data
|
||||
startup = data[1].split("\n")
|
||||
for index, line in enumerate(startup):
|
||||
if "G1" in line and " E" in line and (" X" in line or " Y" in line):
|
||||
next_line = index
|
||||
try:
|
||||
while not startup[next_line].startswith("G92 E0"):
|
||||
startup[next_line] = ";" + startup[next_line]
|
||||
next_line += 1
|
||||
except IndexError:
|
||||
break
|
||||
data[1] = "\n".join(startup)
|
||||
|
||||
# Find the insertion location in data
|
||||
purge_str = self._format_string(purge_str)
|
||||
startup_section = data[1].split("\n")
|
||||
insert_index = len(startup_section) - 1
|
||||
for num in range(len(startup_section) - 1, 0, -1):
|
||||
# In Absolute Extrusion mode - insert above the last G92 E0 line
|
||||
if "G92 E0" in startup_section[num]:
|
||||
insert_index = num
|
||||
break
|
||||
# In Relative Extrusion mode - insert above the M83 line
|
||||
elif "M83" in startup_section[num]:
|
||||
insert_index = num
|
||||
break
|
||||
startup_section.insert(insert_index, purge_str)
|
||||
data[1] = "\n".join(startup_section)
|
||||
return data
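
# A minimal sketch of the calculate_purge_volume() helper used above, assuming
# the usual volume model: extruded-line volume (width x height x length) divided
# by the filament cross-section (mm3_per_mm) gives the E distance in mm of
# filament. The 0.3 purge height is assumed to match the 'G0 F600 Z0.3' moves
# above; the script's exact formula may differ.
def calculate_purge_volume(line_width: float, purge_len: float, mm3_per_mm: float) -> float:
    purge_height = 0.3  # Assumed to match the Z0.3 purge height
    return round((line_width * purge_height * purge_len) / mm3_per_mm, 5)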

    # Travel moves around the bed periphery to keep strings from crossing the footprint of the model.
    def _move_to_start(self, data: str) -> str:
        if self.t0_has_offsets:
            data[0] += "; [Purge Lines and Unload] 'Circle Around to Layer Start' did not run because the assumed primary nozzle (T0) has tool offsets.\n"
            Message(title = "[Purge Lines and Unload]", text = "'Circle Around to Layer Start' did not run because the assumed primary nozzle (T0) has tool offsets.").show()
            return data
        self.start_x = None
        self.start_y = None
        move_str = None
        layer = data[2].split("\n")
        for line in layer:
            if line.startswith("G0") and " X" in line and " Y" in line:
                self.start_x = self.getValue(line, "X")
                self.start_y = self.getValue(line, "Y")
                break
        self.start_x = self.start_x or 0
        self.start_y = self.start_y or 0
        if self.end_purge_location is None:
            self.end_purge_location = Position.LEFT_FRONT
        midpoint_x = self.machine_width / 2
        midpoint_y = self.machine_depth / 2
        if not self.origin_at_center:
            if float(self.start_x) <= float(midpoint_x):
                x_target = Location.LEFT
            else:
                x_target = Location.RIGHT
            if float(self.start_y) <= float(midpoint_y):
                y_target = Location.FRONT
            else:
                y_target = Location.REAR
        else:
            if float(self.start_x) <= 0:
                x_target = Location.LEFT
            else:
                x_target = Location.RIGHT
            if float(self.start_y) <= 0:
                y_target = Location.FRONT
            else:
                y_target = Location.REAR
        target_location = (x_target, y_target)
        if self.bed_shape == "rectangular":
            move_str = self._move_to_location("Layer Start", target_location)
        elif self.bed_shape == "elliptic" and self.origin_at_center:
            move_str = ";MESH:NONMESH---------[Travel to Layer Start]\nG0 F600 Z2 ; Move up\n"
            radius = self.machine_width / 2
            offset_sin = round(2 ** .5 / 2 * radius, 2)
            if target_location == Position.LEFT_FRONT:
                move_str += f"G0 F{self.speed_travel} X-{offset_sin} Z2 ; Move\nG0 Y-{offset_sin} Z2 ; Move to start\n"
            elif target_location == Position.LEFT_REAR:
                if self.end_purge_location == Position.LEFT_REAR:
                    move_str += f"G2 X0 Y{offset_sin} I{offset_sin} J{offset_sin} ; Move around to start\n"
                else:
                    move_str += f"G0 F{self.speed_travel} X-{offset_sin} Z2 ; Ortho move\nG0 Y{offset_sin} Z2 ; Ortho move\n"
            elif target_location == Position.RIGHT_FRONT:
                move_str += f"G0 F{self.speed_travel} X{offset_sin} Z2 ; Ortho move\nG0 Y-{offset_sin} Z2 ; Ortho move\n"
            elif target_location == Position.RIGHT_REAR:
                move_str += f"G0 F{self.speed_travel} X{offset_sin} Z2 ; Ortho move\nG0 Y{offset_sin} Z2 ; Ortho move\n"
        move_str += ";---------------------[End of layer start travels]"
        # Add the move_str to the end of the StartUp section and move 'LAYER_COUNT' to the end.
        startup = data[1].split("\n")
        move_str = self._format_string(move_str)
        if move_str.startswith("\n"):
            move_str = move_str[1:]
        startup.append(move_str)
        # Move the 'LAYER_COUNT' line so it's at the end of data[1]
        for index, line in enumerate(startup):
            if "LAYER_COUNT" in line:
                lay_count = startup.pop(index) + "\n"
                startup.append(lay_count)
                break

        data[1] = "\n".join(startup)
        # Remove any double-spaced lines
        data[1] = data[1].replace("\n\n", "\n")
        return data
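
# Worked example of the elliptic-bed offset above, assuming a 220 mm wide round
# bed with the origin at center: radius = 220 / 2 = 110 and
# offset_sin = round(2 ** .5 / 2 * 110, 2) = 77.78, so the 'Ortho move' targets
# are (+/-77.78, +/-77.78): the bed-edge points at 45 degrees in each quadrant.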

    # Unloading a large amount of filament in a single command can trip the 'Overlong Extrusion' warning in some firmware. Unloads longer than 150mm are split into individual 150mm segments.
    def _unload_filament(self, data: str) -> str:
        extrude_speed = 3000
        quick_purge_speed = round(float(self.nozzle_size) * 500)
        if self.material_diameter > 2: quick_purge_speed *= .38  # Adjustment for 2.85 filament
        retract_amount = self.extruder[0].getProperty("retraction_amount", "value")
        quick_purge_amount = retract_amount + 5 if retract_amount < 2.0 else retract_amount * 2
        unload_distance = self.getSettingValueByKey("unload_distance")
        quick_purge = self.getSettingValueByKey("unload_quick_purge")
        lines = data[-1].split("\n")
        for index, line in enumerate(lines):
            # Unload the filament just before the hot end turns off.
            if line.startswith("M104") and "S0" in line:
                filament_str = (
                    "M83 ; [Unload] Relative extrusion\n"
                    "M400 ; Complete all moves\n"
                )
                if quick_purge:
                    filament_str += f"G1 F{quick_purge_speed} E{quick_purge_amount} ; Quick Purge before unload\n"
                if unload_distance > 150:
                    filament_str += "".join(
                        f"G1 F{extrude_speed} E-150 ; Unload some\n"
                        for _ in range(unload_distance // 150)
                    )
                    remaining_unload = unload_distance % 150
                    if remaining_unload > 0:
                        filament_str += f"G1 F{extrude_speed} E-{remaining_unload} ; Unload the remainder\n"
                else:
                    filament_str += f"G1 F{extrude_speed} E-{unload_distance} ; Unload\n"
                filament_str += (
                    "M82 ; Absolute Extrusion\n"
                    "G92 E0 ; Reset Extruder\n"
                )
                lines[index] = filament_str + line
                break
        data[-1] = "\n".join(lines)
        return data
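
# Sketch of the segmentation rule described above, as a standalone helper
# (hypothetical, for illustration): no single E move exceeds 150 mm.
def split_unload(total_mm: int, segment: int = 150) -> list:
    moves = [segment] * (total_mm // segment)   # Full 150 mm segments
    if total_mm % segment:
        moves.append(total_mm % segment)        # The remainder, if any
    return moves

# split_unload(430) -> [150, 150, 130]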

    # Adjust the starting E location so the skirt/brim/raft begins extruding as soon as the nozzle starts moving.
    def _adjust_starting_e(self, data: str) -> str:
        if not self.extruder[0].getProperty("retraction_enable", "value"):
            return data
        adjust_amount = self.getSettingValueByKey("adjust_e_loc_to")
        lines = data[1].split("\n")
        lines.reverse()
        if self.global_stack.getProperty("machine_firmware_retract", "value"):
            search_pattern = r"G10"
        else:
            search_pattern = r"G1 F(\d*) E-(\d.*)"
        for index, line in enumerate(lines):
            if re.search(search_pattern, line):
                lines[index] = re.sub(search_pattern, f"G92 E{adjust_amount}", line)
                lines.reverse()
                data[1] = "\n".join(lines)
                break
        return data
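
# Example of the substitution above, assuming standard (non-firmware) retraction
# and an assumed adjust_e_loc_to of -6.5: the last retract in the StartUp
# G-code, e.g. 'G1 F2400 E-5', becomes 'G92 E-6.5', telling the firmware the
# filament already sits 6.5 mm retracted before the first print move.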

    # Format the purge or travel-to-start strings. No reason they shouldn't look nice.
    def _format_string(self, any_gcode_str: str):
        temp_lines = any_gcode_str.split("\n")
        gap_len = 0
        for temp_line in temp_lines:
            if ";" in temp_line and not temp_line.startswith(";"):
                if gap_len - len(temp_line.split(";")[0]) + 1 < 0:
                    gap_len = len(temp_line.split(";")[0]) + 1
        if gap_len < 30: gap_len = 30
        for temp_index, temp_line in enumerate(temp_lines):
            if ";" in temp_line and not temp_line.startswith(";"):
                temp_lines[temp_index] = temp_line.replace(temp_line.split(";")[0], temp_line.split(";")[0] + " " * (gap_len - len(temp_line.split(";")[0])), 1)
            # This formats lines that are commented out but contain additional comments Ex: ;M420 ; leveling mesh
            elif temp_line.startswith(";") and ";" in temp_line[1:]:
                temp_lines[temp_index] = temp_line[1:].replace(temp_line[1:].split(";")[0], ";" + temp_line[1:].split(";")[0] + " " * (gap_len - 1 - len(temp_line[1:].split(";")[0])), 1)
        any_gcode_str = "\n".join(temp_lines)
        return any_gcode_str
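
# Illustration of the alignment performed above: inline comments are pushed out
# to a shared column (roughly the longest G-code prefix plus one, with a floor
# of 30 characters), so, e.g., 'G0 F600 Z0.3 ; Move down' becomes
# 'G0 F600 Z0.3                  ; Move down', with the ';' aligned across lines.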

    def _get_initial_tool(self) -> int:
        # Get the Initial Extruder
        num = Application.getInstance().getExtruderManager().getInitialExtruderNr()
        if num is None or num == -1:
            num = 0
        # If there is an extruder offset, the X offset is used to adjust 'machine_right' and the Y offset to adjust 'machine_back'
        if self.extruder_count > 1 and bool(self.global_stack.getProperty("machine_use_extruder_offset_to_offset_coords", "value")):
            self.nozzle_offset_x = self.extruder[1].getProperty("machine_nozzle_offset_x", "value")
            self.nozzle_offset_y = self.extruder[1].getProperty("machine_nozzle_offset_y", "value")
        else:
            self.nozzle_offset_x = 0.0
            self.nozzle_offset_y = 0.0
        self.material_diameter = self.extruder[num].getProperty("material_diameter", "value")
        self.nozzle_size = self.extruder[num].getProperty("machine_nozzle_size", "value")
        self.init_line_width = self.extruder[num].getProperty("skirt_brim_line_width", "value")
        self.print_speed = round(self.extruder[num].getProperty("speed_print", "value") * 60 * .75)
        self.speed_travel = round(self.extruder[num].getProperty("speed_travel", "value") * 60)
        self.retract_dist = self.extruder[num].getProperty("retraction_amount", "value")
        self.retraction_enable = self.extruder[num].getProperty("retraction_enable", "value")
        self.retract_speed = self.extruder[num].getProperty("retraction_retract_speed", "value") * 60
        self.mm3_per_mm = (self.material_diameter / 2) ** 2 * math.pi
        # Don't add purge lines if 'T0' has offsets.
        t0_x_offset = self.extruder[0].getProperty("machine_nozzle_offset_x", "value")
        t0_y_offset = self.extruder[0].getProperty("machine_nozzle_offset_y", "value")
        if t0_x_offset or t0_y_offset:
            self.t0_has_offsets = True
        return num
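
# Worked example of the cross-section formula above, for 1.75 mm filament:
# mm3_per_mm = (1.75 / 2) ** 2 * math.pi ~= 2.405 mm^3 of material per mm of
# filament, the factor that converts a purge volume into an E distance.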

    def _get_blob_code(self) -> str:
        if not self.prime_blob_enable or self.prime_blob_distance == 0 or self.getSettingValueByKey("purge_line_location") not in ["front", "left"]:
            return ""
        # Set extruder speed for 1.75 filament
        speed_blob = round(float(self.nozzle_size) * 500)
        # Adjust speed if 2.85 filament
        if self.material_diameter > 2: speed_blob *= .4
        blob_x = self.getSettingValueByKey("prime_blob_loc_x")
        blob_y = self.getSettingValueByKey("prime_blob_loc_y")
        blob_string = "G0 F1200 Z20 ; Move up\n"
        blob_string += f"G0 F{self.speed_travel} X{blob_x} Y{blob_y} ; Move to blob location\n"
        blob_string += f"G1 F{speed_blob} E{self.prime_blob_distance} ; Blob\n"
        blob_string += f"G1 F{self.retract_speed} E{self.prime_blob_distance - self.retract_dist} ; Retract\n"
        blob_string += "G92 E0 ; Reset extruder\n"
        blob_string += "M300 P500 S600 ; Beep\n"
        blob_string += "G4 S2 ; Wait\n"
        return blob_string

@@ -1,20 +1,20 @@
# Copyright (c) 2017 Ghostkeeper
# The PostProcessingPlugin is released under the terms of the LGPLv3 or higher.
# Altered by GregValiant (Greg Foresi) February, 2025.
#   Added option for "first instance only"
#   Added option for a layer search with a Start Layer and an End layer.
#   Added 'Ignore StartUp G-code' and 'Ignore Ending G-code' options

import re  # To perform the search and replace.

import re
from ..Script import Script

from UM.Application import Application

class SearchAndReplace(Script):
    """Performs a search-and-replace on all g-code.

    Due to technical limitations, the search can't cross the border between
    layers.
    """Performs a search-and-replace on the g-code.
    """

    def getSettingDataString(self):
        return """{
        return r"""{
            "name": "Search and Replace",
            "key": "SearchAndReplace",
            "metadata": {},

@@ -23,37 +23,145 @@ class SearchAndReplace(Script):
            {
                "search":
                {
                    "label": "Search",
                    "description": "All occurrences of this text will get replaced by the replacement text.",
                    "label": "Search for:",
                    "description": "All occurrences of this text (within the search range) will be replaced by the 'Replace with' string. The search string is 'Case Sensitive' and 'Layer' is not the same as 'layer'.",
                    "type": "str",
                    "default_value": ""
                },
                "replace":
                {
                    "label": "Replace",
                    "description": "The search text will get replaced by this text.",
                    "label": "Replace with:",
                    "description": "The 'Search for' text will get replaced by this text. For multi-line insertions use the newline character '\\n' as the delimiter. If your Search term ends with a '\\n', remember to add '\\n' to the end of this Replace term.",
                    "type": "str",
                    "default_value": ""
                },
                "is_regex":
                {
                    "label": "Use Regular Expressions",
                    "description": "When enabled, the search text will be interpreted as a regular expression.",
                    "description": "When disabled, the search string is treated as a simple text string. When enabled, the search text will be interpreted as a Python regular expression.",
                    "type": "bool",
                    "default_value": false
                },
                "enable_layer_search":
                {
                    "label": "Enable search within a Layer Range:",
                    "description": "When enabled, you can choose a Start and End layer for the search. When 'Layer Search' is enabled, the StartUp and Ending G-codes are always ignored.",
                    "type": "bool",
                    "default_value": false,
                    "enabled": true
                },
                "search_start":
                {
                    "label": "Start S&R at Layer:",
                    "description": "Use the Cura Preview layer numbering.",
                    "type": "int",
                    "default_value": 1,
                    "minimum_value": 1,
                    "enabled": "enable_layer_search"
                },
                "search_end":
                {
                    "label": "Stop S&R at end of Layer:",
                    "description": "Use the Cura Preview layer numbering. The replacements will conclude at the end of this layer. If the End Layer is equal to the Start Layer, then only that single layer is searched.",
                    "type": "int",
                    "default_value": 2,
                    "minimum_value": 1,
                    "enabled": "enable_layer_search"
                },
                "first_instance_only":
                {
                    "label": "Replace first instance only:",
                    "description": "When enabled, only the first instance is replaced.",
                    "type": "bool",
                    "default_value": false,
                    "enabled": true
                },
                "ignore_start":
                {
                    "label": "Ignore StartUp G-code:",
                    "description": "When enabled, the StartUp G-code is unaffected. The StartUp G-code is everything from ';generated with Cura...' to ';LAYER_COUNT:' inclusive.",
                    "type": "bool",
                    "default_value": true,
                    "enabled": "not enable_layer_search"
                },
                "ignore_end":
                {
                    "label": "Ignore Ending G-code:",
                    "description": "When enabled, the Ending G-code is unaffected.",
                    "type": "bool",
                    "default_value": true,
                    "enabled": "not enable_layer_search"
                }
            }
        }"""

    def execute(self, data):
        global_stack = Application.getInstance().getGlobalContainerStack()
        extruder = global_stack.extruderList
        retract_enabled = bool(extruder[0].getProperty("retraction_enable", "value"))
        search_string = self.getSettingValueByKey("search")
        if not self.getSettingValueByKey("is_regex"):
            search_string = re.escape(search_string)  # Need to search for the actual string, not as a regex.
        search_regex = re.compile(search_string)

        replace_string = self.getSettingValueByKey("replace")
        is_regex = self.getSettingValueByKey("is_regex")
        enable_layer_search = self.getSettingValueByKey("enable_layer_search")
        start_layer = self.getSettingValueByKey("search_start")
        end_layer = self.getSettingValueByKey("search_end")
        ignore_start = self.getSettingValueByKey("ignore_start")
        ignore_end = self.getSettingValueByKey("ignore_end")
        if enable_layer_search:
            ignore_start = True
            ignore_end = True
        first_instance_only = bool(self.getSettingValueByKey("first_instance_only"))

        for layer_number, layer in enumerate(data):
            data[layer_number] = re.sub(search_regex, replace_string, layer)  # Replace all.
        # Account for missing layer numbers when a raft is used
        start_index = 1
        end_index = len(data) - 1
        data_list = [0, 1]
        layer_list = [-1, 0]
        lay_num = 1
        for index, layer in enumerate(data):
            if re.search(r";LAYER:(-?\d+)", layer):
                data_list.append(index)
                layer_list.append(lay_num)
                lay_num += 1

        return data
        # Get the start and end indexes within the data
        if not enable_layer_search:
            if ignore_start:
                start_index = 2
            else:
                start_index = 1

            if ignore_end:
                end_index = data_list[len(data_list) - 1]
            else:
                # Account for the extra data item when retraction is enabled
                end_index = data_list[len(data_list) - 1] + (2 if retract_enabled else 1)

        elif enable_layer_search:
            for index, num in enumerate(layer_list):
                if num == start_layer:
                    start_index = data_list[index]
                if num == end_layer:
                    end_index = data_list[index]

        # Make replacements
        replace_one = False
        if not is_regex:
            search_string = re.escape(search_string)
        search_regex = re.compile(search_string)
        for num in range(start_index, end_index + 1, 1):
            layer = data[num]
            # First instance only
            if first_instance_only:
                if re.search(search_regex, layer) and replace_one == False:
                    data[num] = re.sub(search_regex, replace_string, data[num], 1)
                    replace_one = True
                    break
            # All instances
            else:
                if end_index > start_index:
                    data[num] = re.sub(search_regex, replace_string, layer)
                elif end_index == start_index:
                    layer = data[start_index]
                    data[start_index] = re.sub(search_regex, replace_string, layer)
        return data
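
# Minimal standalone demonstration of the 'Replace first instance only' branch
# above (illustrative data; in the script, 'data' is the list of G-code sections):
import re
data = [";LAYER:0\nM117 hello\n", ";LAYER:1\nM117 hello\n"]
search_regex = re.compile(re.escape("M117 hello"))
for num in range(len(data)):
    if re.search(search_regex, data[num]):
        data[num] = re.sub(search_regex, "M117 replaced", data[num], 1)  # Replace once
        break  # Stop after the first replacement, as first_instance_only does
# data[0] is now ';LAYER:0\nM117 replaced\n'; data[1] is untouched.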

@@ -101,7 +101,8 @@ class RemovableDriveOutputDevice(OutputDevice):
            self._stream = open(file_name, "wt", buffering = 1, encoding = "utf-8")
        else:  # Binary mode.
            self._stream = open(file_name, "wb", buffering = 1)
        job = WriteFileJob(writer, self._stream, nodes, preferred_format["mode"])
        writer_args = {"mime_type": preferred_format["mime_type"]}
        job = WriteFileJob(writer, self._stream, nodes, preferred_format["mode"], writer_args)
        job.setFileName(file_name)
        job.progress.connect(self._onProgress)
        job.finished.connect(self._onFinished)

@@ -67,39 +67,40 @@ class SimulationPass(RenderPass):
        if not self._compatibility_mode:
            self._layer_shader.setUniformValue("u_starts_color", Color(*Application.getInstance().getTheme().getColor("layerview_starts").getRgb()))

        if self._layer_view:
            self._layer_shader.setUniformValue("u_max_feedrate", self._layer_view.getMaxFeedrate())
            self._layer_shader.setUniformValue("u_min_feedrate", self._layer_view.getMinFeedrate())
            self._layer_shader.setUniformValue("u_max_thickness", self._layer_view.getMaxThickness())
            self._layer_shader.setUniformValue("u_min_thickness", self._layer_view.getMinThickness())
            self._layer_shader.setUniformValue("u_max_line_width", self._layer_view.getMaxLineWidth())
            self._layer_shader.setUniformValue("u_min_line_width", self._layer_view.getMinLineWidth())
            self._layer_shader.setUniformValue("u_max_flow_rate", self._layer_view.getMaxFlowRate())
            self._layer_shader.setUniformValue("u_min_flow_rate", self._layer_view.getMinFlowRate())
            self._layer_shader.setUniformValue("u_layer_view_type", self._layer_view.getSimulationViewType())
            self._layer_shader.setUniformValue("u_extruder_opacity", self._layer_view.getExtruderOpacities())
            self._layer_shader.setUniformValue("u_show_travel_moves", self._layer_view.getShowTravelMoves())
            self._layer_shader.setUniformValue("u_show_helpers", self._layer_view.getShowHelpers())
            self._layer_shader.setUniformValue("u_show_skin", self._layer_view.getShowSkin())
            self._layer_shader.setUniformValue("u_show_infill", self._layer_view.getShowInfill())
            self._layer_shader.setUniformValue("u_show_starts", self._layer_view.getShowStarts())
        else:
            #defaults
            self._layer_shader.setUniformValue("u_max_feedrate", 1)
            self._layer_shader.setUniformValue("u_min_feedrate", 0)
            self._layer_shader.setUniformValue("u_max_thickness", 1)
            self._layer_shader.setUniformValue("u_min_thickness", 0)
            self._layer_shader.setUniformValue("u_max_flow_rate", 1)
            self._layer_shader.setUniformValue("u_min_flow_rate", 0)
            self._layer_shader.setUniformValue("u_max_line_width", 1)
            self._layer_shader.setUniformValue("u_min_line_width", 0)
            self._layer_shader.setUniformValue("u_layer_view_type", 1)
            self._layer_shader.setUniformValue("u_extruder_opacity", [[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1]])
            self._layer_shader.setUniformValue("u_show_travel_moves", 0)
            self._layer_shader.setUniformValue("u_show_helpers", 1)
            self._layer_shader.setUniformValue("u_show_skin", 1)
            self._layer_shader.setUniformValue("u_show_infill", 1)
            self._layer_shader.setUniformValue("u_show_starts", 1)
        for shader in [self._layer_shader, self._layer_shadow_shader]:
            if self._layer_view:
                shader.setUniformValue("u_max_feedrate", self._layer_view.getMaxFeedrate())
                shader.setUniformValue("u_min_feedrate", self._layer_view.getMinFeedrate())
                shader.setUniformValue("u_max_thickness", self._layer_view.getMaxThickness())
                shader.setUniformValue("u_min_thickness", self._layer_view.getMinThickness())
                shader.setUniformValue("u_max_line_width", self._layer_view.getMaxLineWidth())
                shader.setUniformValue("u_min_line_width", self._layer_view.getMinLineWidth())
                shader.setUniformValue("u_max_flow_rate", self._layer_view.getMaxFlowRate())
                shader.setUniformValue("u_min_flow_rate", self._layer_view.getMinFlowRate())
                shader.setUniformValue("u_layer_view_type", self._layer_view.getSimulationViewType())
                shader.setUniformValue("u_extruder_opacity", self._layer_view.getExtruderOpacities())
                shader.setUniformValue("u_show_travel_moves", self._layer_view.getShowTravelMoves())
                shader.setUniformValue("u_show_helpers", self._layer_view.getShowHelpers())
                shader.setUniformValue("u_show_skin", self._layer_view.getShowSkin())
                shader.setUniformValue("u_show_infill", self._layer_view.getShowInfill())
                shader.setUniformValue("u_show_starts", self._layer_view.getShowStarts())
            else:
                #defaults
                shader.setUniformValue("u_max_feedrate", 1)
                shader.setUniformValue("u_min_feedrate", 0)
                shader.setUniformValue("u_max_thickness", 1)
                shader.setUniformValue("u_min_thickness", 0)
                shader.setUniformValue("u_max_flow_rate", 1)
                shader.setUniformValue("u_min_flow_rate", 0)
                shader.setUniformValue("u_max_line_width", 1)
                shader.setUniformValue("u_min_line_width", 0)
                shader.setUniformValue("u_layer_view_type", 1)
                shader.setUniformValue("u_extruder_opacity", [[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1]])
                shader.setUniformValue("u_show_travel_moves", 0)
                shader.setUniformValue("u_show_helpers", 1)
                shader.setUniformValue("u_show_skin", 1)
                shader.setUniformValue("u_show_infill", 1)
                shader.setUniformValue("u_show_starts", 1)

        if not self._tool_handle_shader:
            self._tool_handle_shader = OpenGL.getInstance().createShaderProgram(Resources.getPath(Resources.Shaders, "toolhandle.shader"))

@@ -222,12 +222,11 @@ class SimulationView(CuraView):

        self.setPath(i + fractional_value)

    def advanceTime(self, time_increase: float) -> bool:
    def advanceTime(self, time_increase: float) -> None:
        """
        Advance the time by the given amount.

        :param time_increase: The amount of time to advance (in seconds).
        :return: True if the time was advanced, False if the end of the simulation was reached.
        """
        total_duration = 0.0
        if len(self.cumulativeLineDuration()) > 0:

@@ -237,15 +236,13 @@ class SimulationView(CuraView):
            # If we have reached the end of the simulation, go to the next layer.
            if self.getCurrentLayer() == self.getMaxLayers():
                # If we are already at the last layer, go to the first layer.
                self.setTime(total_duration)
                return False

                # advance to the next layer, and reset the time
                self.setLayer(self.getCurrentLayer() + 1)
                self.setLayer(0)
            else:
                # advance to the next layer, and reset the time
                self.setLayer(self.getCurrentLayer() + 1)
            self.setTime(0.0)
        else:
            self.setTime(self._current_time + time_increase)
        return True

    def cumulativeLineDuration(self) -> List[float]:
        # Make sure _cumulative_line_duration is initialized properly

@@ -144,9 +144,7 @@ Item
        {
            // divide by 1000 to account for ms to s conversion
            const advance_time = simulationTimer.interval / 1000.0;
            if (!UM.SimulationView.advanceTime(advance_time)) {
                playButton.pauseSimulation();
            }
            UM.SimulationView.advanceTime(advance_time);
            // The status must be set here instead of in the resumeSimulation function otherwise it won't work
            // correctly, because part of the logic is in this trigger function.
            isSimulationPlaying = true;

@@ -54,9 +54,9 @@ class SimulationViewProxy(QObject):
    def currentPath(self):
        return self._simulation_view.getCurrentPath()

    @pyqtSlot(float, result=bool)
    def advanceTime(self, duration: float) -> bool:
        return self._simulation_view.advanceTime(duration)
    @pyqtSlot(float)
    def advanceTime(self, duration: float) -> None:
        self._simulation_view.advanceTime(duration)

    @pyqtProperty(int, notify=currentPathChanged)
    def minimumPath(self):

@@ -360,8 +360,8 @@ geometry41core =
        ((v_prev_line_type[0] != 1) && (v_line_type[0] == 1)) ||
        ((v_prev_line_type[0] != 4) && (v_line_type[0] == 4))
    )) {
        float w = size_x;
        float h = size_y;
        float w = max(0.05, size_x);
        float h = max(0.05, size_y);

        myEmitVertex(v_vertex[0] + vec3( w, h, w), u_starts_color, normalize(vec3( 1.0, 1.0, 1.0)), viewProjectionMatrix * (gl_in[0].gl_Position + vec4( w, h, w, 0.0))); // Front-top-left
        myEmitVertex(v_vertex[0] + vec3(-w, h, w), u_starts_color, normalize(vec3(-1.0, 1.0, 1.0)), viewProjectionMatrix * (gl_in[0].gl_Position + vec4(-w, h, w, 0.0))); // Front-top-right

@@ -1,11 +1,12 @@
# Copyright (c) 2023 UltiMaker
# Cura is released under the terms of the LGPLv3 or higher.
import datetime

import json
import os
import platform
import time
from typing import Optional, Set, TYPE_CHECKING
from typing import Any, Optional, Set, TYPE_CHECKING

from PyQt6.QtCore import pyqtSlot, QObject
from PyQt6.QtNetwork import QNetworkRequest

@@ -33,7 +34,18 @@ class SliceInfo(QObject, Extension):
    no model files are being sent (Just a SHA256 hash of the model).
    """

    info_url = "https://stats.ultimaker.com/api/cura"
    info_url = "https://statistics.ultimaker.com/api/v2/cura/slice"

    _adjust_flattened_names = {
        "extruders_extruder": "extruders",
        "extruders_settings": "extruders",
        "models_model": "models",
        "models_transformation_data": "models_transformation",
        "print_settings_": "",
        "print_times": "print_time",
        "active_machine_": "",
        "slice_uuid": "slice_id",
    }

    def __init__(self, parent = None):
        QObject.__init__(self, parent)

@@ -112,6 +124,26 @@ class SliceInfo(QObject, Extension):

        return list(sorted(user_modified_setting_keys))

    def _flattenData(self, data: Any, result: dict, current_flat_key: Optional[str] = None, lift_list: bool = False) -> None:
        if isinstance(data, dict):
            for key, value in data.items():
                total_flat_key = key if current_flat_key is None else f"{current_flat_key}_{key}"
                self._flattenData(value, result, total_flat_key, lift_list)
        elif isinstance(data, list):
            for item in data:
                self._flattenData(item, result, current_flat_key, True)
        else:
            actual_flat_key = current_flat_key.lower()
            for key, value in self._adjust_flattened_names.items():
                if actual_flat_key.startswith(key):
                    actual_flat_key = actual_flat_key.replace(key, value)
            if lift_list:
                if actual_flat_key not in result:
                    result[actual_flat_key] = []
                result[actual_flat_key].append(data)
            else:
                result[actual_flat_key] = data
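
# Traced example of the flattening above (hypothetical input): nested dict keys
# are joined with '_', list items are lifted into parallel lists, and prefixes
# are renamed via _adjust_flattened_names:
#   {"models": [{"model": {"hash": "a1"}}, {"model": {"hash": "b2"}}]}
#   -> {"models_hash": ["a1", "b2"]}   # "models_model..." is renamed to "models..."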

    def _onWriteStarted(self, output_device):
        try:
            if not self._application.getPreferences().getValue("info/send_slice_info"):

@@ -125,8 +157,7 @@ class SliceInfo(QObject, Extension):
            global_stack = machine_manager.activeMachine

            data = dict()  # The data that we're going to submit.
            data["time_stamp"] = time.time()
            data["schema_version"] = 0
            data["schema_version"] = 1000
            data["cura_version"] = self._application.getVersion()
            data["cura_build_type"] = ApplicationMetadata.CuraBuildType
            org_id = user_profile.get("organization_id", None) if user_profile else None

@@ -298,6 +329,11 @@ class SliceInfo(QObject, Extension):
                "time_backend": int(round(time_backend)),
            }

            # Massage data into the format used in the DB:
            flat_data = dict()
            self._flattenData(data, flat_data)
            data = flat_data

            # Convert data to bytes
            binary_data = json.dumps(data).encode("utf-8")

@@ -51,7 +51,7 @@ class UFPWriter(MeshWriter):
    # Qt thread. The File read/write operations right now are executed on separate threads because they are scheduled
    # by the Job class.
    @call_on_qt_thread
    def write(self, stream, nodes, mode = MeshWriter.OutputMode.BinaryMode):
    def write(self, stream, nodes, mode = MeshWriter.OutputMode.BinaryMode, **kwargs):
        archive = VirtualFile()
        archive.openStream(stream, "application/x-ufp", OpenMode.WriteOnly)
Image diffs (screenshots replaced; file sizes shown before → after):
  202 KiB → 90 KiB
  620 KiB → 64 KiB
  140 KiB → 90 KiB
  185 KiB → 75 KiB
  398 KiB → 119 KiB
  899 KiB → 138 KiB
  682 KiB → 110 KiB
  874 KiB → 50 KiB
  430 KiB → 123 KiB