Computer Model and Performance

#41
Did a Pig-in-the-wrong-environment test: the pig in Shade3D. Got a boost of 3.4 there, so it's not the complexity of the scene. Very strange. Perhaps it's the pink shader; when I replace it with a more complex one, Cheetah gets a lot slower.
 

#42
This pig is getting a true work-out!

Hope you can sort out what's going on. My hex doesn't arrive for a couple of weeks yet so I can't test anything here. My app list is a bit different from yours: Cheetah3D, Carrara 8.5, Daz3D.

But, like Martin, I am coming from an iMac i7 (2012) so I am only expecting a benefit in the number of cores (4 >> 6 = 1.5x). I need a machine that can sit and churn stuff out while I continue working on my iMac.

I also have recently started working with FCPX a lot, so the MP will make a huge difference there.

Now if Apple would release a 64-bit version of QuickTime...
 

podperson

Well-known member
#43
Interesting. I wonder if some apps can make better use of the architecture than others?
The new Mac Pros have dual GPUs, one of which isn't hooked up to displays, so um yes, some apps can make better use of the architecture than others. The obvious example is Final Cut Pro X vs. Premiere; the former is massively accelerated, the latter derives little benefit (for now). Adobe has been conspicuously failing to really leverage GPUs for years now.
 
#44
Sorry if I was unclear... I meant the architecture without the GPUs, since only a few apps have been upgraded so far to take advantage of the dual arrangement.

A developer of an app I use singled out the 2009 Mac Pro as having a design that made it difficult for apps to use multiple cores. He didn't have the new MP so he couldn't weigh in on how it performed.

There are also different ways developers have gone about making their apps that impact how they perform. I use two apps that have distributed rendering built-in. This was broken in one of the apps, but not the other, even before an update.

Me, I'm just a lowly user who waddles along as best I can with whatever I have.
 
#45
Mac Pro (4,1)

Mac Pro (Early 2009) 4,1
16 GB RAM
2.66 GHz Quad-Core Intel Xeon
10.9.2 Mavericks

Pig.jas file.
Still life render - 2000 x 2000 (height x width)
Camera max samples set to 16x16

Time 1: 47.63 seconds.

I guess not that bad.
 
#47
What are the system factors that drive Render Time

This is an interesting thread.

I've read it through and guessed that render time is more affected by CPU grunt power rather than GPU or buckets of memory.

Would the fathers of C3D suggest a Hackintosh with many big processors, or a huge GPU?

Regards



Andy
 

podperson

Well-known member
#48
I'd suggest going for a good CPU if C3D render times are your main concern. Having a big GPU is always nice (and improves C3D's performance when editing scenes, animating, etc.).

Cheetah 3D does take very good advantage of multiple cores, so expect every core to provide a pretty much linear performance improvement.
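To make that near-linear claim concrete, here is a quick sketch using Amdahl's law (my own illustration of the general principle, not anything about Cheetah 3D's actual scheduler; the 98% parallel fraction is an assumed figure):

```python
# Amdahl's law: if a render is almost perfectly parallel, speedup is
# close to linear in core count; the small serial fraction caps it.

def speedup(cores: int, parallel_fraction: float = 0.98) -> float:
    """Predicted speedup vs. 1 core for a given parallel fraction."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for n in (1, 4, 6, 12):
    print(f"{n:2d} cores -> {speedup(n):.2f}x")
```

With a 98% parallel workload, 4 cores give roughly 3.8x and 12 cores roughly 9.8x, which is why "pretty much linear" is a fair description at these core counts.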

I see that some enterprising chap has built a dual-Xeon Hackintosh, so you could in theory get a ridiculous number of cores…

https://www.youtube.com/watch?v=ix3TmjEfEiQ
 
#49
Big CPUs

Could be an idea!

Been looking at TonyMac86.

My other thought is to make the whole scene up, then change the render ranges for each of the 4 Macs I have around the place, and then leave them overnight, each rendering a section of the scene.

Then, when complete, pull all the PNGs into one place and renumber them all.

Andy
 

Swizl

Well-known member
#50
Could be an idea!

Been looking at TonyMac86.

My other thought is to make the whole scene up, then change the render ranges for each of the 4 Macs I have around the place, and then leave them overnight, each rendering a section of the scene.

Then, when complete, pull all the PNGs into one place and renumber them all.

Andy
If you have QuickTime Pro (or some other video editor) you could just save each movie clip from the renders out as PNG and then import each rendered take into its proper spot. That's usually what I do. Then I compress the file down to H.264 if I need to send it to someone. I leave them at full rez so my boss can import them into FCP with no resolution loss. I guess if you have something that can batch rename the PNG files, it wouldn't be as bad.

Cool build btw Pod! I have a Hackintosh, but it's just on an old HP EliteBook. Nothing special as far as rendering goes.
 

podperson

Well-known member
#51
Cool build btw Pod! I have a Hackintosh.
Life is too short for such things. The irony is that most of the folks doing extreme Hackintosh builds seem to be doing it for fun. After all, $2k is a lot to spend on an experiment (and that's for Core i7 builds), and if you don't want to go through hell every time there's an OS update Mac Pros are cheaper than equivalent Windows workstations.

http://www.extremetech.com/computin...s-2000-cheaper-than-the-equivalent-windows-pc

The one place where a Hackintosh really makes sense is for GPU-specific applications (NVidia GPUs in particular, since Apple has been skewing towards AMD a lot lately).
 

Swizl

Well-known member
#52
Life is too short for such things. The irony is that most of the folks doing extreme Hackintosh builds seem to be doing it for fun. After all, $2k is a lot to spend on an experiment (and that's for Core i7 builds), and if you don't want to go through hell every time there's an OS update Mac Pros are cheaper than equivalent Windows workstations.

http://www.extremetech.com/computin...s-2000-cheaper-than-the-equivalent-windows-pc

The one place where a Hackintosh really makes sense is for GPU-specific applications (NVidia GPUs in particular, since Apple has been skewing towards AMD a lot lately).
The laptop I have was $200 on Craigslist. It has a touch screen also (but no OS X drivers for that). So it definitely won't break the bank. I mainly put OS X on it to run C3D. This is after having several multi-thousand-dollar Apple laptops that fried out on me. They're now sitting in a bag in a closet.

If that's people's idea of fun though, then more power to them. I took my 1998 Bondi Blue iMac and put it in a PC tower because the monitor stopped working. It would only work with an external monitor plugged in. I had an external monitor sitting next to the iMac for a while, but got tired of it taking up so much space.
 
#53
Thought I'd check my Macs for comparison. ;)

Mac Pro 3.06 GHz 12-core (4.1>5.1), 24 GB, El Capitan.
Pig.jas
16 x 16 max samples
2000 x 2000 px
19.56 sec

Mac Mini 2.6 GHz quad-i7, 16 GB, El Capitan.
Pig.jas
16 x 16 max samples
2000 x 2000 px
30.05 sec
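Just for fun, a back-of-envelope comparison of those two results (my own arithmetic, not a rigorous benchmark; cores x clock is a deliberately crude capacity proxy):

```python
# Compare the two Pig.jas results above. "cores x GHz" predicts a much
# bigger gap than was measured; the Mini plausibly closes it with
# hyperthreading, Turbo Boost, and a newer microarchitecture.
mac_pro = {"cores": 12, "ghz": 3.06, "secs": 19.56}
mac_mini = {"cores": 4, "ghz": 2.60, "secs": 30.05}

capacity_ratio = (mac_pro["cores"] * mac_pro["ghz"]) / (
    mac_mini["cores"] * mac_mini["ghz"])
actual_ratio = mac_mini["secs"] / mac_pro["secs"]
print(f"core*GHz predicts {capacity_ratio:.2f}x, measured {actual_ratio:.2f}x")
```

The raw capacity ratio predicts about 3.5x, while the measured speedup is only about 1.5x, a reminder that core counts and clock speeds alone don't tell the whole story across CPU generations.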
 

podperson

Well-known member
#55
Sad that Apple doesn't sell a quad core mini at the moment as it was always the best bang for your buck.

Apple is seriously alienating 3D and video folk by failing to provide us with a grunt box at any price (it already alienated gamers by refusing to provide a grunt box for a reasonable price). If Apple isn't careful it will drive away photographers too, although if photographers mainly use laptops they probably don't care (Apple's laptops are fine, all whining about the new MBP to the contrary).

I've solved my gaming woes by buying multiple PS4s (I used to buy gaming PCs but keeping them working is a serious pain in the ass; the PS4 simply has periodic long updates). I'm against hackintoshes ethically (but only slightly) and practically (it's a pain in the ass) but at some point it will be a choice between hackintosh and windows and I've got bigger problems with windows.
 
#56
Sad that Apple doesn't sell a quad core mini at the moment as it was always the best bang for your buck.
Pod, your post on this comes at a very interesting time for me. Two days ago I had a serious system crash that not only hosed my Hackintosh install (it had been acting finicky for some months now, mostly I think because I don't fully understand Clover) but also my Windows 8 disk too.

I managed to get my Windows system back up and running and am nearly finished fixing my Hackintosh volume as well.

I have been thinking of moving full time to Windows for a while now as it would just be easier with my three card NVIDIA GPU box. Cheetah is literally the ONLY app that keeps me on the Mac (plus I really do enjoy working on OS X - have not updated to Sierra yet).

I’m not completely ready to leave, but Apple seems to be doing its best to push me out the door! The hardware and OS are becoming increasingly restrictive for the way I like to work; this also includes iOS.

I was on the Apple site last night (while fixing my main work computer) trying to find any hardware offerings that might allow me to come back into a pure Apple environment without reverting to a Hackintosh. And I didn’t really see anything that would match my current system at a reasonable price. Even the Mac Pro is now completely unsuitable with its fixed and proprietary components.

So I started thinking about a MacBook Pro or Mac Mini where I could do the bulk of my creative work (which involves graphics, video and 3D) and just fall back to a PC box for GPU rendering when necessary (Octane will be making this much easier soon with headless rendering). But paying top dollar for a MacBook Pro would still give me slightly weaker system specs than I now have, and at a price I cannot currently afford. Even though the latest Mac Minis are underpowered, I would at least consider one if it too were not locked against expansion and access.

By the way, during my system crash I started working temporarily on my old Mac Mini 2010. I love that machine! It has been one of the best and most versatile computers I ever purchased. Same with a used 2008 Mac Pro, which I often use to offload Cheetah rendering jobs to while continuing to work on my main system. The day that box stops working with Cheetah is most likely the day I finally leave the Mac.
 

podperson

Well-known member
#57
I totally get your quandary.

No one makes upgradeable laptops that are worth using, so Apple is hardly alone there, and its laptops are right up there in performance and price/performance.

The Mac Pro is an odd beast. It's surprisingly modular, but no one makes modules. That said, I find I use desktop computers virtually never, so I'm leaning towards some kind of home NAS and laptops, but it would be nice to be able to send rendering off to some faceless commodity box. I had high hopes Apple would use GCD to allow transparent use of cloud-based render farms.

Tim Cook just swore that he has something coming out for pro users. My needs aren't urgent so I'll wait.
 
#58
[Attachment: Cheetbench.png]

I think that without a solution for multiple graphics cards in a box (probably third-party Thunderbolt enclosures or the like), which is what Octane and Cycles users rely on, rendering on Macs will go nowhere. Especially not with CPU-only renderers; the hardware is too expensive in comparison.
 
#59
HP is aggressively marketing their workstation machines to Pro Customers:

http://www8.hp.com/us/en/campaigns/workstations/mac-to-z.html

Imagine up to 44 cores, 7 PCIe slots, a terabyte of memory, and more. Such a machine would be extremely expensive, but it would give pro shops the power to do work insanely fast.

I hope that Tim Cook does announce something more positive this year, as pros have found recent offerings to be limited in power and expandability.
 
#60
HP is aggressively marketing their workstation machines to Pro Customers:

http://www8.hp.com/us/en/campaigns/workstations/mac-to-z.html
Thanks for the link. That was really informative. Unfortunately, the information on that webpage is also very true.

Apple's offerings for power users (3D designers, scientists, engineers, etc.) are really shameful. I truly hope that Apple will release a high-end Mac again. But I'm also worried that it's already too late. The users who needed high-end hardware have already left the Mac. I at least have the impression that the huge majority of the 3D design community has already moved to Windows, and I have doubts that they will come back. It was really frustrating to watch Apple expel power users.:frown:

I personally wish Apple would create a Mac Pro based on AMD's upcoming Naples CPU:

https://www.youtube.com/watch?v=PN93G6Rg2ek

With that beast even a single-processor machine would be fine for my purposes.:smile: A dual-CPU solution would nevertheless be welcome for those who need it. And I'm sure Naples would also be cheaper than Intel's Xeons. But that's probably just wishful thinking.

I'm really curious what Apple did with its $10,000,000,000 R&D budget last year.:shock:

Bye
Martin
 