Geekbench Scores for 2.3 GHz i7 Mac Mini - Late 2012

Posted in Future Apple Hardware, edited January 2014


I just ordered one of these bad boys and 8GB of RAM

Model                       Processor             Frequency (MHz)  Cores  Platform         User             Score
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   11067
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   10175
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   10345
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   10308
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   10430
Macmini6,2                  Intel Core i7-3770K   3501             4      Mac OS X 32-bit                   12043
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   10201
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   10461
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 64-bit  dewbage          11970
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 64-bit                   11851
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 64-bit                   11953
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   10899
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   10769
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   10706
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   11009
Macmini Server (Late 2012)  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit  thinkbook        10772
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   10912
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 64-bit  dewbage          11583
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 64-bit  dewbage          11817
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 64-bit                   10484
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   10773
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 64-bit                   11697
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 32-bit                   10765
Macmini6,2                  Intel Core i7-3615QM  2300             4      Mac OS X 64-bit  CrouchingDonkey  11641
Macmini6,2                  Intel Core i5-3570K   3392             4      Mac OS X 32-bit                   9278
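For what it's worth, splitting the genuine i7-3615QM results above by platform shows the 64-bit runs averaging roughly 9% higher than the 32-bit ones (the i7-3770K and i5-3570K rows are left out, since those aren't real Minis). A quick sanity check, with the scores copied from the table:

```python
# Geekbench scores for the i7-3615QM Mac minis in the table above,
# grouped by platform; the two non-3615QM (hackintosh) rows are excluded.
scores_32bit = [11067, 10175, 10345, 10308, 10430, 10201, 10461, 10899,
                10769, 10706, 11009, 10772, 10912, 10773, 10765]
scores_64bit = [11970, 11851, 11953, 10484, 11583, 11817, 11697, 11641]

mean_32 = sum(scores_32bit) / len(scores_32bit)
mean_64 = sum(scores_64bit) / len(scores_64bit)

print(f"32-bit mean: {mean_32:.1f}")  # 10639.5
print(f"64-bit mean: {mean_64:.1f}")  # 11624.5
print(f"64-bit advantage: {(mean_64 / mean_32 - 1) * 100:.1f}%")  # 9.3%
```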

 

Comments

  • Reply 1 of 34
    Marvin Posts: 15,309, moderator
    It seems someone got an i7-3770K in theirs. I thought it had the i7-3610QM, but it appears Intel has made another model with a slightly higher-clocked IGP:

    http://ark.intel.com/products/64899/Intel-Core-i7-3610QM-Processor-6M-Cache-up-to-3_30-GHz
    http://ark.intel.com/products/64900/Intel-Core-i7-3615QM-Processor-6M-Cache-up-to-3_30-GHz

    1.2GHz instead of 1.1GHz. That extra 9% won't make a huge difference to anything but every little helps. You can see the difference here:

    http://www.notebookcheck.net/Intel-HD-Graphics-4000-Benchmarked.73567.0.html

    Battlefield 3 goes from 24.5fps on low to a whopping 26.9fps. At least the CPU is making up for it.

    The Cinebench score should be around 6.2, which puts it about even with the 8-core 2.8GHz 2008 Mac Pro ($2799) and the 4-core 3.2GHz 2010 model ($2899) for just $799. Put in 16GB of RAM and an SSD and you've got a neat little workstation.
  • Reply 2 of 34
    wizard69 Posts: 13,377, member
    I guess it depends upon what you call a workstation. The lack of strong 3D and OpenCL support makes seeing the Mini as a workstation a bit difficult, at least where those features are important. That said, the Mini should be a nice little Xcode workstation.

    As nice as the updated models are, I still see them as a regression simply due to that missing GPU.
    Marvin wrote: »
    It seems someone got an i7-3770k in their's. I thought it had the i7-3610QM but it appears that Intel has made another model with a slightly higher clocked IGP:
    http://ark.intel.com/products/64899/Intel-Core-i7-3610QM-Processor-6M-Cache-up-to-3_30-GHz
    http://ark.intel.com/products/64900/Intel-Core-i7-3615QM-Processor-6M-Cache-up-to-3_30-GHz
    1.2GHz instead of 1.1GHz. That extra 9% won't make a huge difference to anything but every little helps. You can see the difference here:
    http://www.notebookcheck.net/Intel-HD-Graphics-4000-Benchmarked.73567.0.html
    Battlefield 3 goes from 24.5fps on low to a whopping 26.9fps. At least the CPU is making up for it.
    The Cinebench score should be around 6.2, which puts it about even with the 8-core 2.8GHz 2008 Mac Pro ($2799) and the 4-core 2010 3.2GHz model ($2899) for just $799. Put in 16GB RAM, an SSD and you got a neat little workstation.
  • Reply 3 of 34
    Marvin Posts: 15,309, moderator
    wizard69 wrote: »
    I guess it depends upon what you call a workstation. The lack of strong 3D and OpenCL support makes seeing the Mini as a workstation a bit difficult.

    OpenCL on the GPU is never going to take off if Apple do stupid things with it like break support during an OS update, disable it based on video memory and don't keep it up to date:

    http://forum.netkas.org/index.php?topic=2250.0
    http://netkas.org/?p=1161
    http://forums.macrumors.com/showthread.php?t=1422871

    If they're going to do things like that, they need to put someone more responsible in control of the graphics drivers like NVidia/AMD. Then we might have a chance of also getting OpenGL 4 support within a reasonable timeframe compared to other platforms (2.5 years and counting).

    Typically the low-end GPUs match the CPU in OpenCL performance so doubling the CPU performance should leave OpenCL the same vs having a dedicated GPU. Still slow though.

    In terms of standard GPU usage, the HD4000 got level 2 support in CS6:

    http://help.adobe.com/en_US/aftereffects/cs/using/WS3878526689cb91655866c1103a4f2dff7-79f4a.html

    It's supported by Apple in FCPX/Motion so it should be ok for most things.
  • Reply 4 of 34
    winter Posts: 1,238, member
    Huh? I see the i7-3770K listed next to the Mac mini? Say what?
  • Reply 5 of 34
    Marvin Posts: 15,309, moderator
    winter wrote: »
    Huh? I see the i7-3770K listed next to the Mac mini? Say what?

    That sort of thing happens with hackintoshes - OS X gets modified to run on custom built PCs.
  • Reply 6 of 34
    winter Posts: 1,238, member
    Marvin wrote: »
    That sort of thing happens with hackintoshes - OS X gets modified to run on custom built PCs.

    Oh duh! I saw Mac mini 6,2 and then i7-3770K and had to do a double take.
  • Reply 7 of 34
    wizard69 Posts: 13,377, member
    Marvin wrote: »
    OpenCL on the GPU is never going to take off if Apple do stupid things with it like break support during an OS update, disable it based on video memory and don't keep it up to date:
    http://forum.netkas.org/index.php?topic=2250.0
    http://netkas.org/?p=1161
    http://forums.macrumors.com/showthread.php?t=1422871
    Apple isn't the first developer to have bugs in their apps, drivers or what have you. As to keeping up to date, I'm not too certain who is responsible for what at Apple. Beyond that there is the issue of hardware support.
    If they're going to do things like that, they need to put someone more responsible in control of the graphics drivers like NVidia/AMD. Then we might have a chance of also getting OpenGL 4 support within a reasonable timeframe compared to other platforms (2.5 years and counting).
    Apple's issues with drivers started well before OpenCL even existed. Yes, they need to do better, but one shouldn't dismiss OpenCL because Apple is a little slow here or there.
    Typically the low-end GPUs match the CPU in OpenCL performance so doubling the CPU performance should leave OpenCL the same vs having a dedicated GPU. Still slow though.
    The effectiveness of a GPU for computation is highly variable but even if it is at parity with the CPU it is still an alternative execution path that runs in parallel with the CPU.
    In terms of standard GPU usage, the HD4000 got level 2 support in CS6:
    http://help.adobe.com/en_US/aftereffects/cs/using/WS3878526689cb91655866c1103a4f2dff7-79f4a.html
    It's supported by Apple in FCPX/Motion so it should be ok for most things.
  • Reply 8 of 34
    hmm Posts: 3,405, member

    Quote: Originally Posted by Marvin View Post

    OpenCL on the GPU is never going to take off if Apple do stupid things with it like break support during an OS update, disable it based on video memory and don't keep it up to date:

    http://forum.netkas.org/index.php?topic=2250.0

    http://netkas.org/?p=1161

    http://forums.macrumors.com/showthread.php?t=1422871

    If they're going to do things like that, they need to put someone more responsible in control of the graphics drivers like NVidia/AMD. Then we might have a chance of also getting OpenGL 4 support within a reasonable timeframe compared to other platforms (2.5 years and counting).

    Typically the low-end GPUs match the CPU in OpenCL performance so doubling the CPU performance should leave OpenCL the same vs having a dedicated GPU. Still slow though.

    In terms of standard GPU usage, the HD4000 got level 2 support in CS6:

    http://help.adobe.com/en_US/aftereffects/cs/using/WS3878526689cb91655866c1103a4f2dff7-79f4a.html

    It's supported by Apple in FCPX/Motion so it should be ok for most things.




    I've seen a lot of software developers announcing OpenCL support. NVidia supports it on their cards, so available hardware shouldn't be a big deal. On OS updates, this is one of those reasons I tell anyone who uses a computer for work to be very conservative on applying updates or retain a backup.

  • Reply 9 of 34
    wizard69 Posts: 13,377, member
    Industry adoption of OpenCL has been very significant; it is probably as successful as LLVM/Clang. Maybe a little lower-key, as companies often don't tout their use of OpenCL nor involvement in its development the way they do with LLVM.

    As to updates lately I take the attitude of let the chips fall where they may. If no one updates then no bugs are ever found. Of course I can get away with that right now.

    I've been reading about the massive reorganization at Apple which has me wondering how OS development will be refocused. Frankly I think Apple has gone off the quality track with the yearly Mac OS updates. I'd much rather see them go back to longer release cycles and refocus on quality. Some of these OpenCL issues raised here just strike me as the result of rush jobs, as such I don't see evil intent.
    hmm wrote: »

    I've seen a lot of software developers announcing OpenCL support. NVidia supports it on their cards, so available hardware shouldn't be a big deal. On OS updates, this is one of those reasons I tell anyone who uses a computer for work to be very conservative on applying updates or retain a backup.
  • Reply 10 of 34
    hmm Posts: 3,405, member

    Quote: Originally Posted by wizard69 View Post

    Industry adoption of OpenCL has been very significant, it is probably as successful as LLVM/CLang. Maybe a little lower key as companies often don't tout thier use of OpenCL nor involvement in its devlopment the way they do with LLVM.

    As to updates lately I take the attitude of let the chips fall where they may. If no one updates then no bugs are ever found. Of course I can get away with that right now.

    I've been reading about the massive reorganization at Apple which has me wondering how OS development will be refocused. Frankly I think Apple has gone off the quality track with the yearly Mac OS updates. I'd much rather see them go back to longer release cycles and refocus on quality. Some of these OpenCL issues raised here just strike me as the result of rush jobs, as such I don't see evil intent.


    I get that. I tell people if they use it for work to ensure they have a backup prior to updating. This way if anything crucial breaks, they can revert. The alternative is just watching for any crippling bugs and bug fixes prior to applying updates. If you use something for work, the top priority is that it continues working, and either of those methods address that by minimizing the potential for significant downtime.


     


    Apple seems to have consolidated upper management. I'm still weirded out by the retail thing. I figured they knew who they were hiring. He would have presented plans at some point, as I doubt the role of a new executive is completely autonomous. I wouldn't be bothered by annual OS updates if they were entirely stable. That is always a hugely annoying issue. I'm running Lion due to a couple application issues under Mountain Lion (grumble) although I should check on that. Snow Leopard still seemed like the best to date overall once it stabilized. 

  • Reply 11 of 34
    Marvin Posts: 15,309, moderator
    hmm wrote:
    I get that. I tell people if they use it for work to ensure they have a backup prior to updating. This way if anything crucial breaks, they can revert.

    When it comes to important things like graphics drivers, I'd like to see them put multiple versions in the OS that can be switched dynamically via an interface. They can bundle OpenGL 4 drivers but only have OpenGL 3 enabled by default. If you want to use OpenGL 4, switch the driver. The drivers should be developed in a way that when they crash, it doesn't take out the OS too.
  • Reply 12 of 34
    hmm Posts: 3,405, member

    Quote: Originally Posted by Marvin View Post

    When it comes to important things like graphics drivers, I'd like to see them put multiple versions in the OS that can be switched dynamically via an interface. They can bundle OpenGL 4 drivers but only have OpenGL 3 enabled by default. If you want to use OpenGL 4, switch the driver. The drivers should be developed in a way that when they crash, it doesn't take out the OS too.


    I always wished they'd implement the potential for 10-bit drivers, but according to Adobe, Apple stated they have no interest in doing so. It's one of the areas where I'm somewhat jealous of Windows. I'm not sure how such graphics driver options would work out. I just don't know enough about the details of how they're loaded and accessed to give you an informed response there.

  • Reply 13 of 34
    Marvin Posts: 15,309, moderator
    hmm wrote:
    I always wished they'd implement the potential for 10 bit drivers, but according to Adobe Apple stated they have no interest in doing so. It's one of the areas where I'm somewhat jealous of Windows.

    There are other options for 10-bit like:

    http://www.blackmagicdesign.com/products/ultrastudio3d/software

    But it would only work in certain apps. It would be quite nice to see a shift like we have seen to 64-bit to future-proof things. 12 bits per channel would be better under the assumption that 4K is going to be the standard but only if it's not too resource-intensive. I don't like how we seem to have settled on 8bpc. It's not high enough.

    On the subject of the Mac Mini performance, Macworld has benchmarked the new ones:

    http://www.macworld.com/article/2013250/lab-tested-2012-mac-mini-gets-a-nice-speed-boost.html

    I don't know why they always measure Cinebench by time. It gives you a score so you can compare it with other machines and they don't write the score. Anyway, it shows that for raw rendering, the new middle Mini is double the speed of the old one, which was expected. It's not quite as much for video encoding but a good boost all round.

    The GPU scores are a bit higher for the quad i7 vs the entry model; Intel clocks the quad-i7 GPU 9% higher. It actually benchmarks close to the 6630M, and these tests would be for productive real-time graphics, but for gaming the 6630M was still faster. Casual-to-moderate gaming is fine though.

  • Reply 14 of 34
    hmm Posts: 3,405, member

    Quote: Originally Posted by Marvin View Post

    There are other options for 10-bit like:

    http://www.blackmagicdesign.com/products/ultrastudio3d/software

    But it would only work in certain apps. It would be quite nice to see a shift like we have seen to 64-bit to future-proof things. 12 bits per channel would be better under the assumption that 4K is going to be the standard but only if it's not too resource-intensive. I don't like how we seem to have settled on 8bpc. It's not high enough.

    On the subject of the Mac Mini performance, Macworld has benchmarked the new ones:

    http://www.macworld.com/article/2013250/lab-tested-2012-mac-mini-gets-a-nice-speed-boost.html

    I don't know why they always measure Cinebench by time. It gives you a score so you can compare it with other machines and they don't write the score. Anyway, it shows that for raw rendering, the new middle Mini is double the speed of the old one, which was expected. It's not quite as much for video encoding but a good boost all round.

    The GPU scores are a bit higher for the quad i7 vs the entry model. Intel clocks the quad-i7 GPU 9% higher. It actually benchmarks close to the 6630M and these tests would be for productive real-time graphics but for gaming, the 6630M was still faster. Casual-moderate gaming is fine though:





    10-bit DisplayPort has been around for some time; standards for it are contained within the DisplayPort 1.2 specification. This has nothing to do with resolution. It has to do with the number of values that can be fed to the GPU and display per channel for each pixel. 4K is something entirely different, although it would also be bandwidth-intensive. It's supported under Windows assuming the use of a compatible display (which I own). The tendency toward wider-gamut displays is what really makes such features relevant, as it reduces the need for dithering. Even then they all dither to some degree.


     


    Your link doesn't suggest gains quite as good as the ones you're suggesting, but they're still very significant. I felt the 6630M was really limiting outside of games. RAM available to the GPU isn't really a common thing to stress over with games, but it can be critical in other areas where applications simply will not run tasks on the GPU below a certain threshold. That's why I never suggested paying for the discrete option last year. Hopefully Intel's progress with GPUs will continue. Out of curiosity, does this make the cut to replace your current Mini? I recall you planned it that way a long time ago, but I have no idea how well your current one is holding up.

  • Reply 15 of 34
    Marvin Posts: 15,309, moderator
    hmm wrote:
    This has nothing to do with resolution. It has to do with the number of values that can be fed to the gpu and display per channel for each pixel. 4k is something entirely different, although it would also be bandwidth intensive.

    It helps avoid banding on higher resolution displays, which wouldn't be as noticeable on lower resolution displays:

    http://www.digitalhome.ca/forum/showthread.php?t=156591&page=4

    "After seeing native 4k presentation on my 96" 21:9 screen viewed from 10' away, I can truly say that I welcome 4K wholeheartedly, but the colour banding and motion artifacts remain extremely visible"

    The colour range advantages are important for HDR photography and other high colour range media as you were saying. That'll likely be the next marketing term: HDR.
    hmm wrote:
    Your link doesn't suggest gains quite as good as the ones you're suggesting

    In what way? Cinebench is double vs the old model, GPU in the quad i7 scores more than 9% vs the base model.
    hmm wrote:
    Out of curiosity, does this make the cut to replace your current mini? I recall you planned it that way a long time ago, but I have no idea how well your current one is holding up.

    If it had a 640M and a quad-i7, it would have been an instant buy. I played Battlefield 3 on my current one no problem and I reckon it would just be too slow with the HD4000.

    BF3 on the HD4000 looks like this (choppy on low):

    On the 6630M, it looks like this (steady 30FPS+):


    I'll probably just wait out Haswell. I have a feeling it will arrive around June, so not a long wait. I don't want the headache of an OS upgrade right now either. I'm getting concerned that Apple doesn't seem to want to bring feature parity to QuickTime X and seems to be happy to break things along the way, so I'm overly cautious about adopting the new OSes.
  • Reply 16 of 34
    hmm Posts: 3,405, member

    Quote: Originally Posted by Marvin View Post

    It helps avoid banding on higher resolution displays, which wouldn't be as noticable on lower resolution displays:

    http://www.digitalhome.ca/forum/showthread.php?t=156591&page=4

    "After seeing native 4k presentation on my 96" 21:9 screen viewed from 10' away, I can truly say that I welcome 4K wholeheartedly, but the colour banding and motion artifacts remain extremely visible"

    The colour range advantages are important for HDR photography and other high colour range media as you were saying. That'll likely be the next marketing term: HDR.

     


    I haven't viewed 4k displays, so I can't really comment there. HDR is interesting, although it's often used in really bad ways. Given that I have a reasonable background in that area, I've been shopping for a decent pano kit. The old one won't support a Canon 1 series body, although I could switch to something lighter. Spherical HDR panos are very useful for producing realistic reflections in renders, especially if you want to place something at a specific location. It's just annoying finding something with the locks/clamps appropriate to support a few pounds of camera.


     


    Anyway, back to color gamut. The thing is that a display has a certain matrix of values that can be addressed. It's not one solid shape as it's portrayed if you bring up a gamut preview in ColorSync; it's better to think of the device behavior as a point cloud. Given that a larger volumetric gamut means these points are spread thinner, it's natural to migrate toward the use of more points to at least partially alleviate the need for dithering. Buying a 10-bit display currently isn't an all-encompassing solution to the dithering thing; people have noticed it in DreamColor displays and other super-expensive ones in the past few years. It's just that it does need to be adopted if the trend is toward expanding gamuts. Right now they basically cap out at Adobe RGB. This has been possible for at least a decade, but such displays are much more common now. Apple has kind of ignored this and stuck with sRGB. It's their choice, and it may have been partially motivated by their focus on Thunderbolt, as it doesn't fully support DisplayPort 1.2.


     


     


    Quote:


    In what way? Cinebench is double vs the old model, GPU in the quad i7 scores more than 9% vs the base model.



    I just looked at the links again. I see what you were comparing now. The Cinebench CPU test on the mid-range model came up considerably, as it was bumped to a quad i7. The others aren't showing quite that advantage, although the Mathematica gains are quite impressive.


    Quote:

    2012 Mac mini: Individual application scores

    Model                                  iTunes Encode  Cinebench CPU  VMware PCMark  MathematicaMark 8
    Mac mini/2.3GHz Core i7 (Late 2012)    100.7          73             1347.7         1.82
    Mac mini/2.5GHz Core i5 (Late 2012)    113.7          148            1052.7         1.06
    Mac mini/2.5GHz Core i5 (Mid 2011)     115            154            1045.7         1.02
    Mac mini/2.3GHz Core i5 (Mid 2011)     123.7          186.3          855            1
    Mac mini/2.4GHz Core 2 Duo (Mid 2010)  161.7          309.7          707.7          0.7

    (iTunes Encode and Cinebench CPU are times in seconds, so lower is better; VMware PCMark and MathematicaMark 8 are scores, so higher is better.)
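The speed-ups Marvin mentions can be read straight off that table, since the iTunes Encode and Cinebench CPU columns are times (the ratio is old time over new time). A quick sketch with the values copied from above, comparing the new mid-range quad i7 against the old mid-range 2.5GHz i5:

```python
# Times in seconds from the Macworld table above (lower is better).
times = {
    "Cinebench CPU": {"i7 Late 2012": 73, "i5 Mid 2011": 154},
    "iTunes Encode": {"i7 Late 2012": 100.7, "i5 Mid 2011": 115},
}

speedup_cinebench = times["Cinebench CPU"]["i5 Mid 2011"] / times["Cinebench CPU"]["i7 Late 2012"]
speedup_itunes = times["iTunes Encode"]["i5 Mid 2011"] / times["iTunes Encode"]["i7 Late 2012"]

print(f"Cinebench: {speedup_cinebench:.2f}x faster")    # 2.11x
print(f"iTunes encode: {speedup_itunes:.2f}x faster")   # 1.14x
```

So the "double the speed" claim holds for raw rendering, while the encode test moves far less.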


     




     


     


    Quote:


     


    If it had a 640M and a quad-i7, it would have been an instant buy. I played Battlefield 3 on my current one no problem and I reckon it would just be too slow with the HD4000.

    BF3 on the HD4000 looks like this (choppy on low):



    On the 6630M, it looks like this (steady 30FPS+):



    I'll probably just wait out Haswell. I have a feeling it will arrive around June so not a long wait. I don't want the headache of an OS upgrade right now either. I'm getting concerned that Apple doesn't seem to want to bring feature parity to Quicktime X and seem to be happy to break things along the way so I'm overly cautious about adopting the new OSs.




    That makes sense. They did reallocate costs somewhat. I thought the CPU pricing was closer together, but I just looked it up. I checked Intel's site and Wikipedia, specifically because they list launch pricing as opposed to current pricing, and it's always been accurate on this.


     


     


    http://ark.intel.com/products/52229


    http://ark.intel.com/products/64900/Intel-Core-i7-3615QM-Processor-6M-Cache-up-to-3_30-GHz


     


    Intel's recommended customer pricing on the former, the 2011 chip, was $150 cheaper. I have no idea what the cost of a 640M looks like, but these listings are the same as they were at launch. It would have been nice. This is a downside to using mobile innards: components with equivalent performance come with a pricing premium. If 77W was doable, there are a couple of decent upper-range i5s around $225, which lines up with the CPU cost they were using last year. Unfortunately that wattage is still too high. In my opinion they made it too restrictive in favor of size. Given that Apple maintains a very lean lineup, it would have been nice to see a slightly better range of use cases.

  • Reply 17 of 34
    Marvin Posts: 15,309, moderator
    hmm wrote:
    Spherical HDR panos are very useful for producing realistic reflections in renders, especially if you want to place something at a specific location. It's just annoying finding something with the locks/clamps appropriate to support a few pounds of camera.

    Although not as high quality due to the lenses, the iPhone's panorama feature with HDR setting could be good for this if it was combined with saving the source data:

    http://www.popphoto.com/2012/04/not-qute-raw-shooting-iphone-app-645-pro-now-available
    hmm wrote:
    I have no idea what the cost of a 640m looks like, but these listings are the same as they were at launch. It would have been nice. This is a downside to using mobile innards.

    PC manufacturers can put a quad-i7 and 640M in Ultrabooks for $750:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16834215541

    It should have been possible in an $800 Mini. Like I say, the integrated GPU will make things simpler for them going forward and it should be ok from next year onwards but I don't like it when they downgrade some components in a new machine relative to the old one.
  • Reply 18 of 34
    hmm Posts: 3,405, member

    Quote: Originally Posted by Marvin View Post

    Although not as high quality due to the lenses, the iPhone's panorama feature with HDR setting could be good for this if it was combined with saving the source data:

    http://www.popphoto.com/2012/04/not-qute-raw-shooting-iphone-app-645-pro-now-available

     


    If that was all I had along at the time, I'd use it. You may be misinterpreting how I'm using these: I need a considerable amount of resolution and detail, otherwise things tend to look blah. When I said rig, the point is to line everything up as precisely as possible to make stitching easier, as you have to stitch it and seamlessly remove tripod legs and any undesirable elements from the comp on an HDR image. You want it to be seamless, and this tends to be easier if you have really clean data, a lens grid to correct distortion where possible, etc. I like as much control as possible, so I plan out time, check weather in the area, check Google Maps, and plan the exact time of day for lighting. If I need dawn light, I show up while it's still dark carrying a Maglite. It's difficult to get everything just right, and a really good axis of rotation with a sturdy head would make that a lot easier to line up. Just removing objects from HDR is a lot more annoying than on typical images if you're a perfectionist. I use a number of test curves and things to ensure it is absolutely seamless.


     


     


    Quote:


    PC manufacturers can put a quad-i7 and 640M in Ultrabooks for $750:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16834215541

    It should have been possible in an $800 Mini. Like I say, the integrated GPU will make things simpler for them going forward and it should be ok from next year onwards but I don't like it when they downgrade some components in a new machine relative to the old one.



     


    I didn't know that, but I previously suggested this had to do with Apple's desired margins. Note that the cpu price referenced against intel's recommended pricing was offset by $150. I don't blame you for being hesitant on it. I'm just not surprised they did this. Hopefully the Haswell version will meet your expectations. As of right now predictions on that are all over the place, so I'm not even going to comment. There are suggestions that Broadwell will be more impressive simply because of the list of changes being made, but I never know with Intel. When it's this far out, they always hype it.

  • Reply 19 of 34
    Marvin Posts: 15,309, moderator
    hmm wrote:
    I need a considerable amount of resolution and detail. Otherwise things tend to look blah. When I said rig, the point is to line everything up as precisely as possible to make stitching easier, as you have to stitch it, seamlessly remove tripod legs and any undesirable elements from the comp on an hdr image.

    Are you quick enough to capture the sea seamlessly?


    The resolution you get is about 10784 x 2376 pixels (up to 28MPixels or something). The quality will certainly be better from a proper camera but for lighting and reflections, it should be usable for some output. Maybe not print resolution though.
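    As a rough check, those dimensions come out a little under that 28MP ceiling:

```python
# Pixel count for the iPhone panorama resolution quoted above.
width, height = 10784, 2376
megapixels = width * height / 1_000_000
print(f"{width} x {height} = {megapixels:.1f} MP")  # 25.6 MP
```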
    hmm wrote:
    There are suggestions that Broadwell will be more impressive simply because of the list of changes being made, but I never know with Intel. When it's this far out, they always hype it.

    Intel will support OpenGL 4 and OpenCL 1.2 from the outset with Haswell. The GPU performance will hopefully jump at least 80%.

    I don't see Broadwell improving performance as much because they are integrating things into the CPU:

    http://news.softpedia.com/news/Intel-Broadwell-CPUs-to-Make-Appearance-in-2014-237178.shtml

    I think a lot of people will be content with a $799 Haswell Mini especially if the Fusion drive price is lower by then.
  • Reply 20 of 34
    hmm Posts: 3,405, member

    Quote: Originally Posted by Marvin View Post

    Are you quick enough to capture the sea seamlessly?



    The resolution you get is about 10784 x 2376 pixels (up to 28MPixels or something). The quality will certainly be better from a proper camera but for lighting and reflections, it should be usable for some output. Maybe not print resolution though.

     


     


    It's difficult to do so, but some of it can be corrected later. It's just tedious, and most people make a mess of stuff like that. As for resolution, that would still be pretty limiting. It's helpful to start with that extra resolution and then downsample at the end if necessary. Starting with something that is a little over 2K high is really limiting. You'd also need a bit more ground. It's still awesome that you can capture that on your phone.


     


    Edit: What I meant was that I'm still using "pro" level cameras (in terms of their target market), just not renting medium format digital, and not upgrading that often at this point. I've been testing out some things I'd like to implement with 10K+ renders, which means I need a lot of image data to keep it smooth. The other problem I've been having is that the renderer I've been using likes to sample reflections at low quality to speed up render times. This angers Morbo, so I may have to see if I can write something to modify this behavior. Combine that with being limited to an awkward method of irradiance caching to make use of IBL nodes, and it's difficult to really get around the CG feel of it, especially with the poor interaction of bump maps. I've been looking at something like V-Ray because of this. Their GI is really nice, but their shader documentation is pretty bad, meaning I'd have a lot of testing to determine certain situational behavior. It takes some tweaks in post regardless, but I like the best starting quality possible, as I like things to look as photographic as possible. This means extremely precise models without distorted faces, a lot of shader testing, bump maps for micro-texture, and good IBL interaction with the use of bump maps. I just wanted to add some context, and I already know you're familiar with all the terms used, as you've referenced this stuff before.


     


     


    Quote:


    Intel will support OpenGL 4 and OpenCL 1.2 from the outset with Haswell. The GPU performance will hopefully jump at least 80%.

    I don't see Broadwell improving performance as much because they are integrating things into the CPU:

    http://news.softpedia.com/news/Intel-Broadwell-CPUs-to-Make-Appearance-in-2014-237178.shtml

    I think a lot of people will be content with a $799 Haswell Mini especially if the Fusion drive price is lower by then.



    Anandtech seemed unsure on the GPU improvements there.


     


    http://www.anandtech.com/show/6355/intels-haswell-architecture/12


    Quote:


    Haswell's GPU


    Although Intel provided a good amount of detail on the CPU enhancements to Haswell, the graphics discussion at IDF was fairly limited. That being said, there's still some to talk about here.


    Haswell builds on the same fundamental GPU architecture we saw in Ivy Bridge. We won't see a dramatic redesign/re-plumbing of the graphics hardware until Broadwell in 2014 (that one is going to be a big one).




     


    I read the article a few weeks ago. Regardless of that, it should be a boost in functionality for the Mini. OpenCL 1.2 and OpenGL 4 are quite significant. What I find interesting is the pace of iPad improvements at the moment.
