Author Topic: Does AH only use one core of an AMD chip?  (Read 4152 times)

Offline skribetm

  • Nickel Member
  • ***
  • Posts: 781
Re: Does AH only use one core of an AMD chip?
« Reply #15 on: October 18, 2011, 11:27:11 PM »
Quote
/// AMD uses macros to offer that instruction set and they will always suffer performance losses to any comparable Intel CPU when the SSE instructions are used.  //// neither Sony, Adobe, nor Pinnacle use the Intel compiler.

a.) macros, by which you mean macro-ops? Intel uses macro-ops too (see: http://www.anandtech.com/show/1998/3), which are then fed into schedulers that emit the machine-level instructions. i don't understand comparing "SSE performance" at that level; one has to go down a bit further, because SSE is not what's being executed by a CPU, but its equivalent assembly-level instructions. that is when one looks at micro-op latency and throughput tables for those many assembly-level instructions. for a comprehensive list see: http://www.agner.org/optimize/instruction_tables.pdf. i suggest reading p.71 for Intel SB and p.161 for AMD K10 (Athlon/Phenom/Thuban). if that gets too tedious, a much more concise list of common instructions is available: http://gmplib.org/~tege/x86-timing.pdf, starting at p.3, for an Intel SB vs. AMD K10 comparison.

as you can glean from the tables, it is no easy task to compare "SSE performance" between two modern uarchs. even people who do this for a living don't make such claims, because it is far more complex than it looks. one has to actually disassemble the app/workload into assembly and count the latency and throughput of each and every instruction to arrive at a value. no easy task, but if you have time to disassemble AH and the related DLLs... that would be some exercise! =)
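to make the kind of accounting those tables require concrete, here is a toy sketch; the latency/throughput numbers below are invented for illustration, not Agner's real values, and real out-of-order cores are far messier:

```python
# Toy model of per-instruction cost accounting. The latency/throughput
# numbers are made up for illustration -- real values come from tables
# like Agner Fog's, and differ per microarchitecture.

# instruction -> (latency in cycles, reciprocal throughput in cycles/instr)
UARCH_TABLES = {
    "intel_sb": {"addps": (3, 1.0), "mulps": (5, 1.0), "movaps": (1, 0.5)},
    "amd_k10":  {"addps": (4, 1.0), "mulps": (4, 1.0), "movaps": (2, 1.0)},
}

def sequence_cost(uarch, instrs, dependent=True):
    """Crude cycle estimate for an instruction sequence.

    If the instructions form a dependency chain, latencies add up;
    if they are independent, throughput is the limit instead.
    Real CPUs (OoO execution, ports, caches) complicate all of this.
    """
    table = UARCH_TABLES[uarch]
    if dependent:
        return sum(table[i][0] for i in instrs)
    return sum(table[i][1] for i in instrs)

chain = ["movaps", "mulps", "addps"]
print(sequence_cost("intel_sb", chain))        # dependent chain: 9 cycles
print(sequence_cost("amd_k10", chain))         # same chain: 10 cycles
print(sequence_cost("intel_sb", chain, False)) # independent: 2.5 cycles
```

even this toy version shows why "SSE performance" has no single number: the answer flips depending on dependency structure, and that is before caches or scheduling enter the picture.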

the more likely explanation for this "better SSE performance" is not the hardware but the way the software is optimized.
there are many ways to optimize software for a specific uarch, summarized in this post i made with a lecture slide from UC Berkeley:
http://www.amdzone.com/phpbb3/viewtopic.php?f=532&t=138786&start=125#p210572
even cache-management strategies alone can yield vastly differing results between AMD's (exclusive) & Intel's (inclusive) cache systems.
but of course, the easiest way to optimize for Intel is to use ICC, since it automatically de-optimizes for VIA/AMD. =)
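for what it's worth, the ICC behavior at issue is usually described as a dispatch on the CPUID *vendor string* rather than on the feature flags the CPU actually reports. a sketch of that pattern (function names and flag sets here are illustrative, not Intel's real internals):

```python
# Sketch of vendor-string dispatch: choosing a code path from the CPUID
# vendor string instead of the feature flags. Names are illustrative.

def pick_code_path(vendor, features):
    """Vendor-string dispatch: only "GenuineIntel" gets the fast paths."""
    if vendor == "GenuineIntel":
        for level in ("sse4.2", "ssse3", "sse3", "sse2"):
            if level in features:
                return level
        return "x87"
    # everyone else falls through to the baseline path
    return "sse2" if "sse2" in features else "x87"

def pick_by_features(features):
    """Feature-flag dispatch: what a vendor-neutral check would do."""
    for level in ("sse4.2", "ssse3", "sse3", "sse2"):
        if level in features:
            return level
    return "x87"

amd_flags = {"sse2", "sse3", "sse4.2"}  # a CPU that reports SSE4.2 support
print(pick_code_path("AuthenticAMD", amd_flags))  # sse2 (baseline path)
print(pick_by_features(amd_flags))                # sse4.2 (full path)
```

the point of the sketch: the same silicon gets two different code paths depending only on the vendor string, which is exactly the de-optimization complaint.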

also, about sony not being optimized for intel..


i haven't found any optimization info for Adobe/Pinnacle, but intel pays a lot of $$$ to developers for intel-specific software optimization.

one last thing about performance, and it's about cache: did you know AMD's years-old K10 architecture has faster/lower-latency L1 and L2 than even Intel SB? http://images.hardwarecanucks.com/image/mac/reviews/AMD/Bulldozer/3.jpg

the bottom line for this long post really is: it's not SSE performance that Intel uses to gain a lead, but Intel-specific software optimization,
from using ICC (in some cases) to tailoring code to work best for one uarch over the other.

Quote
The Intel compiler is made by Intel.  Yes it does favor Intel CPU's.  Guess what?  THEY WROTE IT! ////

b.) my issue is that the use of ICC in many windows benchmarks misleads the ordinary consumer into believing a certain CPU is far better than the other, when in actual workloads they perform just the same, if not even marginally better. i don't even see disclaimers. i would call that fraud & misrepresentation.

Quote
Most of your support information is from AMD, or AMD support sites.  They are biased towards their own product. //// If you chose to partake of only one side of a story it will, inherently, cause you to make potentially poor decisions.

c.) unfortunately, these are the only places you can read non-mainstream opinions that are de-popularized by big corporation marketing. where would you want me to read about amd's good points? anandtech? xD

Quote
/////  Intel is just following thier lead.

d.) intel has more to worry about than cheating, like bringing its latest-gen GPU in SB up to 2011 standards. as it is, it performs only as well as a 2005 nV/ATI GPU: http://techreport.com/articles.x/21099/11. it gets good fps in reviews mainly because it is rendering less work / a worse picture. i'm not even going to mention drivers that are worse than ATI's and nV's combined.

Quote
//// Basically it is saying the Intel CPU's execute certain instructions better than AMD and for some reason we are supposed to think that is a bad thing.
 

e.) it is a bad thing if it misleads people. again, running SSE2 for benchmarks if it's an AMD CPU, and SSE3+/SSE4+ if Intel, is a fraudulent practice by benchmarketers.

Quote
//////// The only area AMD falls flat on its face is in the area of streaming video or anything that makes extensive use of the SSE family of instructions (most high end video editors).  A bit of a shame as AMD has a better FPU than Intel does.

actually, the majority of SSE instructions used for media/video editing/transcoding are integer, not float. see 3-operand AVX, XOP.

Quote
/////// Still waiting for AMD's Bulldozer to see what it brings to the application party.
great for multithreaded workloads, but for single threads, the narrow cores don't do very well. =)
the 8-cores are targeted at people who run heavily-threaded desktops with multiple apps open simultaneously.
people who use lightly-threaded apps, or only one app at a time, should be buying faster duals/quads anyway.

-mainconcept http://www.lostcircuits.com/mambo//index.php?option=com_content&task=view&id=102&Itemid=1&limit=1&limitstart=17
-mediashow http://www.guru3d.com/article/amd-fx-8150-processor-review/14
-h.264 http://www.guru3d.com/article/amd-fx-8150-processor-review/14
-vp8 http://www.guru3d.com/article/amd-fx-8150-processor-review/17
-sha1 http://www.guru3d.com/article/amd-fx-8150-processor-review/17
-photoshop cs5 http://www.lostcircuits.com/mambo//index.php?option=com_content&task=view&id=102&Itemid=1&limit=1&limitstart=14
-photoshop cs5 http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-15.html
-winrar, faster than 2600k http://www.techspot.com/review/452-amd-bulldozer-fx-cpus/page7.html
-winrar, improves over x6 http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-16.html
-7-zip better than 2600k here: http://images.anandtech.com/graphs/graph4955/41698.png http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested/7
-7-zip same perf as 2600k http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-16.html
-POV-ray, faster than 2600k http://www.legitreviews.com/article/1741/10/
-POV-ray http://www.nordichardware.se/test-lab-cpu-chipset/44360-amd-fx-8150-bulldozer-goer-entre-pa-marknaden-test.html?start=15#content
-x264(2nd pass AVX enabled) http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested/7
-x264 (2nd pass, better overall than 2600k) http://www.bjorn3d.com/read.php?cID=2125&pageID=11108
-x264 (2nd pass +.3 than SB2600k) http://www.legitreviews.com/article/1741/7/
-handbrake; http://www.legitreviews.com/article/1741/9/
-truecrypt; http://www.bjorn3d.com/read.php?cID=2125&pageID=11111
-solidworks; faster than 2600k http://www.techspot.com/review/452-amd-bulldozer-fx-cpus/page7.html
-abbyy filereader http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-16.html
-C-Ray, as fast as $1k i7-990X,  http://i664.photobucket.com/albums/vv4/wuttzi/c-rayir38.png

Offline skribetm

  • Nickel Member
  • ***
  • Posts: 781
Re: Does AH only use one core of an AMD chip?
« Reply #16 on: October 18, 2011, 11:39:38 PM »
Quote
I would love to go back to AMD someday, but I refuse if it is not working well in this game.

not sure what was wrong with your set-up. i've used the exact same CPU with a HD4890 a while back.
balls-to-the-walls settings at 2048x1152 resolution. no hiccups. no slowdowns.   :( :( :(

fwiw, im even gaming on an amd a6-3400m(llano laptop).
see its comparison w/ a much pricier other model: http://youtu.be/mdPi4GPEI74

here's some screenies; note that it only runs at 1.4GHz quad-core (easily clocks to 2.2GHz = more fps) at 1366x768 resolution.
i think it makes fair use of at least three cores..

default settings, 37fps w/ fairly heavy smoke in foreground: http://i664.photobucket.com/albums/vv4/wuttzi/ahss2.jpg
i've joined bish horde missions; it never stutters even at lower fps (upper 20s, remains smooth, no spiking like in the SB laptop video). also note the rook horde:
http://i664.photobucket.com/albums/vv4/wuttzi/ahss1.jpg  :D

~10fps drop at max settings, w/ hi-res pack, 120deg FOV: http://i664.photobucket.com/albums/vv4/wuttzi/ahss3.jpg

also, my desktop is badly cluttered: IE9 with multiple tabs, MS Office open, Skype messenger, etc..

Offline 1701E

  • Silver Member
  • ****
  • Posts: 1885
      • VBF-18 Bearcats
Re: Does AH only use one core of an AMD chip?
« Reply #17 on: October 19, 2011, 12:06:48 AM »
 As it was stated some time ago, yes AH only uses 1 AMD core (for now). :)
Also as mentioned, AH can run great on that one core, or it can go badly. I run AMD with no issues at all, but if I had the cash, I would likely have chosen Intel. AMD is cheaper; there are reasons for that, but for most any user it's not gonna be something we notice. Course the editors and programmers and whatnot will. I was running the same as a friend's Intel system (Q6600 vs Phenom II 555, same GPU) in the games we played, which, considering how much cheaper mine was (despite being way newer), was worth it. :D
But in short, yes, as mentioned several times, it only uses 1 core.
ID: Xcelsior
R.I.P. Fallen Friends & Family

"The only ones who should kill are those prepared to be killed"

Offline skribetm

  • Nickel Member
  • ***
  • Posts: 781
Re: Does AH only use one core of an AMD chip?
« Reply #18 on: October 19, 2011, 12:17:01 AM »
i dunno, it uses at least 3 cores on mine? http://db.tt/F7QTSAaX
can anyone run AH and show the resource monitor history right after alt-tabbing out of the game?
from the resource mon i see it has 21 threads when i'm playing, at ~35+% avg cpu load on a 1.4GHz quad.
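for reference, a thread count like 21 doesn't mean 21 cores' worth of work; most of a game's threads sit blocked on I/O, audio, or timers. a quick stdlib sketch (the 18/3 split is purely illustrative, chosen to mirror the 21-thread case):

```python
# A process can own many threads while only a few burn CPU.
import threading
import time

stop = threading.Event()

def idle_worker():
    # Blocks on an event -- costs essentially no CPU, like an I/O thread.
    stop.wait()

def busy_worker():
    # Spins until told to stop -- this is what saturates a core.
    while not stop.is_set():
        pass

threads = [threading.Thread(target=idle_worker) for _ in range(18)]
threads += [threading.Thread(target=busy_worker) for _ in range(3)]
for t in threads:
    t.start()

time.sleep(0.2)
# 21 worker threads exist, but only ~3 could keep cores busy
# (and in CPython even those share the GIL).
alive = sum(t.is_alive() for t in threads)
stop.set()
for t in threads:
    t.join()
print(alive)  # 21
```

so "21 threads" in resource monitor and "~3 cores in use" in the CPU graph are perfectly consistent with each other.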

Offline Debrody

  • Platinum Member
  • ******
  • Posts: 4486
Re: Does AH only use one core of an AMD chip?
« Reply #19 on: October 19, 2011, 12:42:57 AM »
To BigR
AH uses one AMD core, so adjusting the clock speed will have a great effect on the in-game FPS.
My system: Phenom II 955, 8 gigs of DDR3-1600, HD 5830.   The 5830 isn't a great video card; it performs a little worse than the 6850.
Running the CPU at 3.2GHz I get a very steady 60 fps with all the eye candy on except the shadows at 4096. Made some tests, just 'cause I was curious. Downclocking the CPU has no effect until I go under 2GHz... (!)
Idk how the 2600K can be such a great improvement when the bottleneck is the monitor.
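That squares with vsync: once the CPU can render above the monitor's refresh rate, extra clock speed can't show up on screen. A toy illustration (the clock/fps pairs are made up, not measurements):

```python
# With vsync, the displayed frame rate is capped at the monitor refresh,
# so any CPU fast enough to render above 60 fps looks identical on screen.
# The clock/fps pairs below are invented for illustration.

def displayed_fps(render_fps, refresh_hz=60):
    """Effective frame rate with vsync: capped by the monitor."""
    return min(render_fps, refresh_hz)

for clock_ghz, render_fps in [(3.2, 140), (2.4, 95), (2.0, 70), (1.6, 48)]:
    print(clock_ghz, "GHz ->", displayed_fps(render_fps), "fps shown")
# Only once the render rate drops below the 60 fps line does a lower
# clock actually show up on the monitor.
```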
AoM
City of ice

Offline zack1234

  • Plutonium Member
  • *******
  • Posts: 13182
Re: Does AH only use one core of an AMD chip?
« Reply #20 on: October 19, 2011, 01:48:15 AM »
AMD Phenom, ATI 6970, 8 gigs of memory.

Everything on, hi-res etc: 60fps.

Best bit of advice I have heard here is to use real programs to  benchmark your pc.

I rarely look at frame rates anymore, you can get a bit obsessive about it.

Only thing I do is increase fan speed, now it's getting colder I will stop doing that.

There are no pies stored in this plane overnight

                          
The GFC
Pipz lived in the Wilderness near Ontario

Offline BigR

  • Silver Member
  • ****
  • Posts: 926
Re: Does AH only use one core of an AMD chip?
« Reply #21 on: October 19, 2011, 03:54:25 AM »
Well, I figured something was wrong with my setup somehow. It's just that every other game was flawless and AH always sputtered out on my AMD. I did clean installs of everything, did all Skuzzy's recommendations, and it still gave me problems. I have always been a fan of AMD, and I will continue to be, but right now I am really happy with the i7, so I will stick with that for a while. I built up a system for my dad using my AMD gear and he loves it, so it is still being put to good use.

I will probably never find out what was happening in that system with AH, but honestly I was sick of dealing with it. The funny thing is I had an original 3-core Phenom chip before the 955, and it never gave me issues.   :headscratch:

Offline zack1234

  • Plutonium Member
  • *******
  • Posts: 13182
Re: Does AH only use one core of an AMD chip?
« Reply #22 on: October 19, 2011, 04:21:29 AM »
I had a Nvidia 550 and it caused me lots of problems with boot up etc :)

My Ati card is fine  :)

Just one of those things when hardware is moody, anyone want a 550 brand new? In UK  :)

Offline Skuzzy

  • Support Member
  • Administrator
  • *****
  • Posts: 31462
      • HiTech Creations Home Page
Re: Does AH only use one core of an AMD chip?
« Reply #23 on: October 19, 2011, 06:31:54 AM »
<snip>

There are so many things wrong in this post I should just delete it.  You take marketing information and put a spin on it as if it were technical.  Your conclusions are very flawed and seem more based on paranoia than actual fact.  Sony's optimizations are that they are fully threaded and will take advantage of all available cores.  Ta-da!  Sony is the smallest of the three I mentioned and there is no love between Sony and Intel.

Again, you are basically claiming AMD cannot run the same code as Intel and be as fast, so it must not be fair.  This does not make AMD look any better to anyone.

I have run my own tests using my own code and AMD has never been able to match Intel's SSE performance.  SSE (all forms) is used extensively by video editors to present the editing streams, which also run during transcoding.  You are aware that AMD is typically a couple of generations behind on the SSE levels available from Intel?  This is due to licensing issues and AMD also seems to treat SSE as an afterthought.  

If AMD has a hard time keeping up with the same code as Intel, then who is at fault?  Oh, I know you are going to claim they are not using the same code paths, which is pure paranoia.  Take a debugger and show me.  Good luck with that as I have already done it and there is only one code path in Sony's code.

If Intel wants to show any company a better way to code to take advantage of their processors, then there is nothing wrong with that.  AMD could do the same thing, or AMD could find a way to improve the processor performance and make use of the same code optimizations Intel does.  Go look at Intel's site sometime.  They offer free code to help optimize performance.  It is source code and not one line of it is CPU specific.

Like I said, I run my own code, my own tests.  I do not trust any benchmark.  As I have written many benchmarks over the years, I have a pretty good understanding of how to accomplish what I need to accomplish and how to test what I need to test.  Any applications I use, I have gone through them with a debugger and witnessed the code used.

As far as other sites to trust go, I have not found one which goes to the level I go to when I test a CPU, so I prefer my own results.  I feel for any consumer who only has sites which make their money from the advertisers who will pull the plug if said site says anything negative about the product(s).  Every site has an agenda, which is oft times hidden from the reader.  You need to be more paranoid and skeptical about the sites you get information from.


Getback, I will run Bulldozer when I can get my hands on one.


BigR, AMD has a compatibility issue in many of the CPU's (not all), which impacts not only Aces High, but other applications (not all).  Specifically, it impacts applications which make use of multiple high resolution timers AND natively multi-thread.  The code work-around, which basically forces the game to run on one CPU core, is provided by AMD and used without modification.  Again, not all AMD CPU's will be impacted by this.  It will also depend on what is running in the background.  If a background application is using a high resolution timer, then the game will be forced to one core.
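For illustration only (the actual AMD-supplied work-around is not public): forcing a process onto one core amounts to handing the OS an affinity bitmask with a single bit set. On Windows that is done via SetProcessAffinityMask(); the mask arithmetic itself looks like this:

```python
# Affinity masks are plain bitmasks: bit N set means core N is allowed.
# On Windows the mask would be passed to SetProcessAffinityMask()
# (e.g. via ctypes.windll.kernel32); here is just the arithmetic.

def single_core_mask(core):
    """Affinity mask that allows only the given core."""
    return 1 << core

def allowed_cores(mask):
    """List the core indices a mask permits."""
    return [i for i in range(mask.bit_length()) if mask >> i & 1]

print(single_core_mask(0))    # 1 -> core 0 only (the work-around case)
print(allowed_cores(0b1111))  # [0, 1, 2, 3] -> a quad-core default
```

So "forced to one core" simply means the scheduler is told, via a one-bit mask, that every thread of the process may only run on that core.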
« Last Edit: October 19, 2011, 06:52:39 AM by Skuzzy »
Roy "Skuzzy" Neese
support@hitechcreations.com

Offline CRYPTIC

  • Nickel Member
  • ***
  • Posts: 442
      • 365th-FBG  HELL HAWKS
Re: Does AH only use one core of an AMD chip?
« Reply #24 on: October 19, 2011, 10:46:14 AM »
Quote
Getback, I will run Bulldozer when I can get my hands on one.
Quote
I have a pretty good understanding of how to accomplish what I need to accomplish and how to test what I need to test.

Skuzzy, please keep us posted. I'd like to hear what you come up with in your benchmarks.

From what I'm reading and hearing, it does seem promising, but we all know how that goes.
365th-FBG Hell Hawks XO
365th-FBG Hell Hawks

Offline Masherbrum

  • Radioactive Member
  • *******
  • Posts: 22408
Re: Does AH only use one core of an AMD chip?
« Reply #25 on: October 19, 2011, 11:46:56 AM »
I have no doubt you believe that stuff too.

More of your "fanboiism's".   Intel > AMD
« Last Edit: October 19, 2011, 11:48:27 AM by Masherbrum »
-=Most Wanted=-

FSO Squad 412th FNVG
http://worldfamousfridaynighters.com/
Co-Founder of DFC

Offline Getback

  • Platinum Member
  • ******
  • Posts: 6364
Re: Does AH only use one core of an AMD chip?
« Reply #26 on: October 19, 2011, 01:28:05 PM »
Benchmarks actually mean nothing to me. What's important is will the CPU satisfy my standards for the application I intend to use it for.


Offline Tigger29

  • Gold Member
  • *****
  • Posts: 2568
Re: Does AH only use one core of an AMD chip?
« Reply #27 on: October 19, 2011, 01:52:45 PM »
Here's my take on things.

1> Generally speaking Intel is better than AMD.  For example a 3.2GHz Intel processor is going to be better than a 3.2GHz AMD processor for a number of reasons.  Now if you're matching a P3 Intel to, say, an AMD Phenom, well of course AMD is going to blow it away!  Let's make sure we're comparing apples to apples here.

2> However DOLLAR FOR DOLLAR AMD is probably going to be better than Intel.  In other words a $100 AMD processor is probably going to outperform a $100 Intel processor, because for that money (and I'm just making these numbers up to show my point, but it's generally correct) you can get a 3.2GHz quad-core AMD processor as opposed to a 2.5GHz dual-core Intel processor.

3> There are a lot of things that Intel processors can do that AMD processors can't... at least without taking a performance hit in the process.  But for the AVERAGE computer user this probably won't be evident.  It seems as if Intel "sets the standards" and AMD has to find a way to make it work reasonably well.

So in a nutshell - unless you use your computer for movie editing, 3D CAD design, or other "professional" processor intensive purposes then there's a good chance AMD is going to suit you just fine.  If you're on a budget and you want "more for your money" then AMD may be a good solution for you.  But if you're simply looking for the best performer and/or money isn't so much of an issue then Intel is the obvious choice for your needs.

Offline skribetm

  • Nickel Member
  • ***
  • Posts: 781
Re: Does AH only use one core of an AMD chip?
« Reply #28 on: October 19, 2011, 02:46:55 PM »
Quote
More of your "fanboiism's".   Intel > AMD

that argument is all bones & no meat at all! [or all hat & no cattle(?)]  :D :D :D

ya know, put in something like this: Guinness World CPU Record
http://www.overclocking-tv.com/content/news/11863/amd-continues-with-cold-bug-free-cpu-extreme-bulldozer/

even intel's much-praised 32nm parts don't come close.  :aok  :D  :D

Offline cattb

  • Silver Member
  • ****
  • Posts: 1163
Re: Does AH only use one core of an AMD chip?
« Reply #29 on: October 19, 2011, 03:15:16 PM »
Quote
Benchmarks actually mean nothing to me. What's important is will the CPU satisfy my standards for the application I intend to use it for.

You got it, now you need to do your research.
:Salute Easy8 EEK GUS Betty