I have a Lenovo ThinkPad P53s with an additional Nvidia Quadro P520 GPU.
While PreForm is loading structures, the laptop becomes impossible to work with.
I would like to set the laptop to use only the Nvidia GPU, but support told me the GPU is selected automatically according to the workload of the Intel GPU.
I have also assigned the Quadro GPU to PreForm in the Nvidia Control Panel, but it does not help.
While I’m not sure what may be causing PreForm to run slowly on your setup, one potential way to narrow this down further would be to share PreForm diagnostic logs and system information with our support team. They can provide instructions for getting the logs to them, and this may allow them to isolate any issues.
I’ve noticed that PreForm often doesn’t close properly. Make sure you kill any rogue instances, as in my experience they will definitely cause PreForm to run slowly.
Thank you for sharing your experience. This is worth reporting to support (the logs especially, since they’ll likely give the team insight into the rogue instances you’re observing), and/or via Give Feedback in PreForm, so the team can work on addressing these issues and improving the user experience. Steps to reproduce would be great.
I’ve already shared this with support…there don’t seem to be any useful recommendations from their end. Not sure if they shared it with the PreForm team.
There also doesn’t seem to be any obvious way to reproduce the issue. All I know is that over time I end up seeing multiple instances of PreForm still running in Task Manager, and this makes everything I do in PreForm laggy, especially when working with large parts and editing supports manually for 3L prints.
Once I kill those extra instances, PreForm goes back to normal.
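For anyone who wants to automate that workaround instead of hunting through Task Manager, here’s a rough Python sketch. It assumes Windows and that the process is named PreForm.exe (check the exact image name in Task Manager on your machine); it parses `tasklist` output and force-kills all but one instance. Use at your own risk - killing an instance that is mid-upload or mid-save will lose that work.

```python
import csv
import io
import subprocess

PROCESS_NAME = "PreForm.exe"  # assumption: verify the image name in Task Manager


def find_pids(tasklist_csv: str, name: str = PROCESS_NAME) -> list:
    """Parse `tasklist /FO CSV /NH` output and return PIDs matching `name`."""
    pids = []
    for row in csv.reader(io.StringIO(tasklist_csv)):
        # Each row: Image Name, PID, Session Name, Session#, Mem Usage
        if len(row) >= 2 and row[0].lower() == name.lower():
            pids.append(int(row[1]))
    return pids


def kill_rogue_instances() -> None:
    """Keep one PreForm instance (arbitrarily, the lowest PID) and kill the rest."""
    out = subprocess.run(
        ["tasklist", "/FO", "CSV", "/NH"], capture_output=True, text=True
    ).stdout
    for pid in sorted(find_pids(out))[1:]:
        subprocess.run(["taskkill", "/PID", str(pid), "/F"])


if __name__ == "__main__":
    kill_rogue_instances()
```

If you only want to check for rogue instances without killing anything, run `tasklist | findstr PreForm` in a command prompt first and see how many rows come back.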
Duly noted - those logs do get shared, and the team does its due diligence on that end to help improve the user experience - thanks for sending them in.
I’ll make sure to keep an eye out in task manager when I open PreForm to use it - thank you for flagging this to our attention here as well!