PreForm memory consumption

Fusion 360 isn’t especially optimised on macOS, so this raised my eyebrow…

Same part loaded in both programs (PreForm & F360) - how is it possible that PreForm consumes more memory when the only thing it has is the STL of the same model? :flushed:


Yours is using just a bit of memory :slight_smile:


Mac handles memory usage quite well and caches as much as your available memory allows.
I can easily “use” 512 GB of RAM just by doing the same things over a week or two. On the other hand, I can also use Fusion 360, PreForm and a few others on a low-memory Mac (such as one with 16 GB) without running out of memory.

How much memory a program consumes is more than just a function of the size of the data file the program is using. You can’t assume they’ll correlate even though they’re both working on the same file.
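To put rough numbers on that: a binary STL has a fixed, tiny on-disk footprint (80-byte header, 4-byte triangle count, 50 bytes per triangle), but once a program parses it, the in-memory representation can easily be several times larger. The sketch below is purely illustrative; the in-memory layout (doubles, per-vertex normals, a duplicated GPU staging copy) is a hypothetical example, not how PreForm or Fusion 360 actually store meshes.

```python
def stl_file_size(n_triangles: int) -> int:
    # Binary STL layout: 80-byte header + 4-byte count + 50 bytes/triangle
    return 84 + 50 * n_triangles

def naive_in_memory_size(n_triangles: int) -> int:
    # One *plausible* in-memory layout (assumption, for illustration):
    # - 3 vertices x 3 coords as 64-bit doubles  -> 72 bytes
    # - per-vertex normals, same again           -> 72 bytes
    # - 3 x 32-bit triangle indices              -> 12 bytes
    per_triangle = 3 * 3 * 8 + 3 * 3 * 8 + 3 * 4
    # ...plus a second copy kept for GPU upload / rendering
    return per_triangle * n_triangles * 2

n = 1_000_000
print(stl_file_size(n))         # ~50 MB on disk
print(naive_in_memory_size(n))  # ~312 MB in RAM for the same mesh
```

So even before either program does anything clever (supports, slicing, undo history, caches), the same 50 MB STL can legitimately occupy hundreds of megabytes once loaded.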

You can’t compare apples to oranges; each program is different.

My intention was not to point fingers, but as someone who has been writing code for a very long time, it’s a bit hard to understand how we have ended up with the insane memory consumption we have today.

I fully understand how today’s development works and what causes a big part of the consumption, but looking back to my days in x86 assembly, I can’t help but wonder how much faster and more capable today’s computers would be if today’s software were as optimised as it once was, back in the days of limited hardware and before all the bloated libraries.


When I started programming, a computer with 64K had a lot of memory (in fact, I think the first one I programmed had 4K). You could write a program in Assembly that consumed all of it. No virtual memory. No paging. You had to be clever. I remember the first time I wrote something significant in C, I spent about 2 weeks figuring out how the compiler optimized the source code so I could write my programs in a way that ensured the most compact code. Because memory was a limited resource.

Nowadays, apps are just too complicated for this type of development approach to work. To write complicated programs you need higher-level languages, and you have to make compromises in terms of executable code efficiency, so programs get bigger for that, and of course because they’re more complicated and do much more. And since memory is rarely a limitation, no one puts a huge effort into reducing their software footprint.

You can still write code for an Arduino using “old school” methods. But of course, the complexity of your program will be limited by the limited resources of the processor and memory. I do this all the time. Little microcontrollers like an Arduino Nano or an ATtiny85 make excellent “glue logic” for things.

I am not going to stake my life on this, but I believe a big reason that PreForm uses so much memory is that it is using Autodesk Netfabb in the background to create supports. Or at least that is what it appears to be doing, based on what I saw when I was evaluating Netfabb as an optimization program. I could very well be incorrect, as I am not a programmer or a computer expert. I am basing it on the fact that PreForm and Netfabb created the exact same supports for the same part when oriented the same way.

Because STLs are crappy mesh files with lots of points and triangles. A native CAD file, on the other hand, is clean analytical NURBS geometry (much less computationally “heavy”).
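The size gap between an analytic description and a tessellated mesh is easy to see with a sphere: the CAD kernel only needs a center and a radius, while an STL has to approximate the surface with thousands of triangles. A back-of-the-envelope sketch (the lat/long tessellation scheme and segment count are illustrative assumptions, not what any particular slicer uses):

```python
def sphere_mesh_triangles(segments: int) -> int:
    # Lat/long tessellation: roughly a segments x segments quad grid,
    # 2 triangles per quad (ignoring pole degeneracies for simplicity)
    return 2 * segments * segments

def stl_bytes(n_triangles: int) -> int:
    # Binary STL: 80-byte header + 4-byte count + 50 bytes per triangle
    return 84 + 50 * n_triangles

# Analytic sphere: center (x, y, z) + radius as four 64-bit doubles
analytic_bytes = 4 * 8

mesh_bytes = stl_bytes(sphere_mesh_triangles(128))
print(analytic_bytes)  # 32 bytes
print(mesh_bytes)      # ~1.6 MB for a reasonably smooth sphere
```

And the mesh is still only an approximation: to get a smoother sphere you need more triangles, while the 32-byte analytic description is exact at any zoom level.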

This topic was automatically closed 182 days after the last reply. New replies are no longer allowed.