Aces High Bulletin Board
General Forums => Hardware and Software => Topic started by: gnubee on April 19, 2005, 10:32:41 PM
-
Hey all...
I'm just wondering if you more tech-savvy folks out there would share your opinions on the 64-bit P4 chips that are out now... (P4 6** or whatever they are...)
I'm just curious about them... they seem to offer higher clock speeds than the AMD chips, and now that they're 64-bit that may actually mean something in the world of gaming... or does it? :confused:
I'm going to be taking the plunge (starting from scratch) and I'm wondering which direction to turn... (AMD vs. Intel)
Thanks for the help-
Scott
-
Do yourself a favor and steer clear of any Intel CPU. At least until they get a handle on the thermal problems.
And this is coming from someone who has used Intel CPU's exclusively in every computer I have ever built.
But do not ask me about AMD either. I cannot and will not make a recommendation in that area.
-
I use AMD exclusively. Never had a bit of trouble with them.
-
I have dual Noconas running right now with the 64-bit extension support. Cooling them is not a problem.
That said, the 64-bit option on any of the new processors is not a benefit in any way, shape, or form for the average user when all things are considered.
Now that AMD and Intel both have 64-bit extension processors, there might be more of a push for 64-bit support from hardware and software manufacturers. Unfortunately, that support is virtually non-existent right now.
I think the 3 GHz or "3000" solutions are the best bang for the buck in price, reliability, and compatibility right now. The laws of physics are going to move the race away from clock speed. Unfortunately, nobody knows just exactly what direction "away from clock speed" points.
-
Originally posted by Skuzzy
.....But do not ask me about AMD either. I cannot and will not make a recommendation in that area.
Dang and I took all that time to compile all dem there questions.
But I guess I could ask these two:
Did Northwood pretty much quit making cpu's and if so why?
also
If Prescotts are having problems with heat, why are they the mainstream chip now? Is it cost?
I'm probably missing a whole bunch in there, but I don't have a good enough knowledge base on the differences between the two chips to know where to begin to look for an answer myself. So I am kinda seeking the layman's-terms type of answer if at all possible.
-
gnubee - Clock speed isn't everything; it's what gets done each clock cycle that counts. That's why an AMD at a lower clock speed can equal, or in most cases beat, an Intel CPU at a higher clock speed.
As an example, if I can do two math problems in one step but it takes you two steps, you have to run twice as fast to keep up. AMD CPUs do more 'math problems' per step compared to Intel.
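To put rough numbers on that (a toy calculation; the IPC and clock figures below are made up purely for illustration, not measured from any real CPU):
[code]
/* Toy model: useful work per second = clock speed x instructions per clock (IPC).
   All numbers are invented for illustration only. */
#include <stdio.h>

int main(void)
{
    double amd_clock_ghz = 2.0, amd_ipc = 3.0;     /* hypothetical "slower clock, more per step" */
    double intel_clock_ghz = 3.0, intel_ipc = 2.0; /* hypothetical "faster clock, less per step" */

    printf("Hypothetical AMD:   %.1f GHz x %.1f IPC = %.1f billion instructions/sec\n",
           amd_clock_ghz, amd_ipc, amd_clock_ghz * amd_ipc);
    printf("Hypothetical Intel: %.1f GHz x %.1f IPC = %.1f billion instructions/sec\n",
           intel_clock_ghz, intel_ipc, intel_clock_ghz * intel_ipc);
    return 0;
}
[/code]
Both of those work out the same, which is the point: a lower clock that does more per cycle keeps up just fine.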
Actually, Intel's implementation of the 64-bit extensions is a poor attempt at copying AMD's.
Check the web for info on both; you'll be surprised at how much of a 'kludge' Intel's offering is.
Same goes for the 'dual cores': only AMD's will be a true dual core, Intel's is actually 2 separate CPUs shoehorned onto the one die.
MiniD - The way forward for a while is going to be multiple cores on one die, along with increasing the number of instructions performed per clock cycle.
-
Until the majority of applications find a compelling need for more than 4GB of memory, I don't see 64-bit processors being a big deal. Personally, I would rather see us skip 64 and go to 128 while there is still plenty of room to grow with 32-bit processors =)
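For reference, the 4GB ceiling is just arithmetic on the address width (a quick sketch, nothing vendor-specific):
[code]
/* Why 32-bit addressing tops out at 4 GB: 2^32 bytes of addressable space.
   Pure arithmetic; says nothing about any particular CPU. */
#include <stdio.h>

int main(void)
{
    unsigned long long addr32 = 1ULL << 32;  /* bytes reachable with a 32-bit pointer */
    unsigned long long one_gib = 1ULL << 30; /* bytes in one GiB */

    printf("32-bit address space: %llu bytes = %llu GiB\n", addr32, addr32 / one_gib);
    printf("64-bit address space: 2^64 bytes = %llu GiB\n", 1ULL << 34); /* 2^64 / 2^30 */
    return 0;
}
[/code]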
-
I can't wait for photonic processors... that'll be the day when Aces High runs everything on full.
-
No Paul. Not true. When video cards have over 1GB of ram, then you can run Aces High on full tilt (assuming no more skins are submitted).
Mini D, are those 90nm process? If so, you may be cooling the CPU ok, but there is no way the components around the CPU or on the backside of the motherboard are running cool.
This has been the biggest issue I have with Intel's 90nm process. Until Intel can get a low-k process in place (which is coming... at least it better be), I cannot in good faith even suggest using any Intel CPU based on the current 90nm process.
Then you're stuck using those terrible DDR2 memory sticks. Does Nocona use DDR2? Sorry, but that sticks in my craw as well. More heat, higher latencies, and just flat out slower than the previous generation of parts.
Wolf, Northwood is the code name for the previous generation of Pentium 4 CPU. Probably the best CPU Intel ever made. Definitely better than anything they currently make for the consumer.
-
I haven't seen a DDR2 Nocona motherboard. I'm using the PC3200 sticks on this system. I don't have extensive cooling on it, just two case fans and a fan on each processor. I don't think it's significantly warmer in that area than my P2-400s were running before. The video card, on the other hand, is significantly warmer.
Also, the 90/130nm chat sounds cool and all, but it's really just smoke and mirrors. Don't get too hung up on it. The design and transistor count are the biggies. Compression and densification are the major drivers of power consumption. 90nm just means smaller (and denser) transistors. Low-K won't be a cure for anything other than allowing us to put lines even closer together. Power requirements will continue to go up as long as the transistor count does... no matter what we do.
-
90nm paths in the same substrate are going to have higher resistance than 130nm paths (I think you know that). BTU/Watt will be higher (I think you know that as well). Prescott cores do not have a significantly higher transistor count than the Northwood core, yet power consumption is approximately 40% higher at the same clock rates.
You simply cannot say power consumption is directly related to transistor count, but I also figure you are using general statements as well. While transistor count is a factor, it is not the only gauge of power consumption.
Take a look at GPUs today. They are using 3 times as many transistors as they were 3 years ago, with higher clock rates, and yet using less power (ATI's X800 XL is a good example. It uses less power than an NVidia GF4 Ti4600, but has 3.2 times the transistor count and is running at a higher clock rate). Funny thing is, ATI has licensed a good deal of their process from Intel.
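For anyone following along, the usual back-of-the-envelope for switching power is P ≈ activity × capacitance × V² × frequency, summed over the transistors, which is why transistor count alone doesn't tell you much. The sketch below just plugs invented numbers into that formula (and ignores leakage, which is a big deal at 90nm):
[code]
/* Back-of-the-envelope CMOS dynamic power: P ~ alpha * C * V^2 * f per switching node,
   times the node count. Every number below is invented for illustration, and leakage
   power (a large factor at 90nm) is ignored entirely. */
#include <stdio.h>

static double dyn_power_watts(double nodes, double alpha, double c_farads,
                              double volts, double freq_hz)
{
    return nodes * alpha * c_farads * volts * volts * freq_hz;
}

int main(void)
{
    /* Hypothetical older part: fewer transistors, bigger capacitance per node, higher voltage. */
    double p_old = dyn_power_watts(60e6, 0.10, 1.0e-15, 1.6, 300e6);
    /* Hypothetical newer part: 3x the transistors and a higher clock, but lower V and C. */
    double p_new = dyn_power_watts(180e6, 0.10, 0.5e-15, 1.1, 400e6);

    printf("hypothetical old part: %.1f W\n", p_old);
    printf("hypothetical new part: %.1f W\n", p_new);
    return 0;
}
[/code]
With those made-up inputs the bigger, faster part still comes out slightly lower, because voltage enters squared and the capacitance per transistor shrinks with the process.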
If Intel sticks with that design, which they have not shown they are going to abandon, then they need a low-k process to reduce the power consumption to reasonable levels.
Given that Prescott cores are slower, clock for clock, than Northwood (by design), my hope is Intel will revamp and dump Prescott. Damn thing looks like a hyped up Willamette core for all the performance it delivers.
And, like you, I use terms most people are familiar with, MD. This is a topic which could easily overwhelm most people. Bottom line: the Prescott design is horrible and should never have been placed in the marketplace. Intel can do better.
I hope that FAB they are currently revamping will house some better processes.
Since Intel rolled out that processor, I have pretty much stopped keeping up with Intel until I see that design fade away. Which is why I asked about Nocona. Is it a 90nm chip? What family is it branched from or based on? What is the target system for Nocona? Does it actually run cooler than Prescott?
-
Skuzzy,
To my knowledge the Nocona core is based on the Prescott core.
SSE3 is now a feature on Xeon processors, the result of using the Prescott core in the Xeon range as the basis for Nocona.
taken from here (http://www.hexus.net/content/reviews/review.php?dXJsX3Jldmlld19JRD04MjImdXJsX3BhZ2U9MQ)
And it is a 90nm processor. You can view some more details here (http://www.intel.com/products/processor/xeon/index.htm)
Also a great read on Nocona vs. Northwood. To quote an Intel employee, the major architectural differences from Northwood to Nocona include:
most cache sizes doubled
more Write Combine buffers
increased pipeline depth (number of clocks start to finish)
A majority of single thread applications are expected to run slower on Nocona than Northwood at the same clock speed. The expectation has been that a Northwood would be upgraded to a higher clock speed Nocona, or that the customer wants to use the 64-bit extensions.
but you can read it here (http://softwareforums.intel.com/ids/board/message?board.id=HPC&message.id=168)
-
talk about a screw up .. shoulda bought intel stock when it was @ .42 the other day .. today its at 23.08 coulda made 4500. off of a 200. investment ...
-
There ya go Rosco. Was Intel that low?
Not sure what that has to do with the discussion though.
Are you attempting to allude to the idea that they are doing well in spite of themselves? Just proves how many suckers are available in the marketplace. Most people have no clue, and Intel is probably grateful for that.
-
Originally posted by Roscoroo
talk about a screw up .. shoulda bought intel stock when it was @ .42 the other day .. today its at 23.08 coulda made 4500. off of a 200. investment ...
I'm sure you meant down .42 as INTC has not been to .42 ever.
-
Skuzzy, you are thinking in "all else being equal" mode. That is seldom the case. All else being equal, we could make a 60MHz P1 that wouldn't require a heat sink at all with the current process. That is to say that 600k transistors aren't adding anything that needs to be worried about.
Now, you try and jam 40 million transistors into 1/4 of that area and tell me one more time how line size is what's making those things so hot.
Density and thermal transfer are the main issues. As the cross-sectional area of a line goes down, the length does too. The overall resistance doesn't change much. The thing that changes is that where there were previously 10 lines there are now 15.
Also consider that the initial release of a chip is usually on the order of 15x12mm. Subsequent releases get reduced to about 12x10mm with at least 30% more transistors.
Very little about the Low-K process will reduce the resistance of the lines; it only allows you to put them closer together without reducing the width of the line. Anything you gain will be lost by increasing density once again. All else being equal, it would help. But all else is never equal, and these things are seldom the savior they are sold as.
The only solution to the current thermal "problems" is a major design change. Personally, I don't see them ever being solved. It is actually easier to figure out new and improved ways to cool these things than it is to figure out how to keep them from getting hot.
-
I agree with most everything you said (pretty much fact so no argument from me). Bottomline is Prescott is a bad design, compared to what it could be. Performance is poor, in comparison to Northwood, for example.
The thermal control of the part is an issue as it applies to the surrounding parts. Sure, the CPU itself can run insanely hot (comparatively speaking), but when you put a motherboard on a bench, with an HSF mounted on the CPU, and the neoprene rubber block on the backside of the motherboard gets welded to the motherboard from the heat of the CPU, that is a problem.
Epoxy can withstand a great deal of heat, but the surrounding capacitors and resistors will suffer premature failure from heat exposure as it stands right now.
It's just not a good part. Even if the thermal issues were solved, the performance is just not there. Of course, some of the handicap has to be put off on the use of DDR2 memory. Makes for good marketing as the numbers are higher, but the latencies kill any performance increase you could have.
Then there is the heat problem with DDR2 memory.
All this heat comes from power usage. The amount of power required for a Prescott-based system is nuts, quite frankly.
I am just disappointed MD. Intel had a good base design in Northwood and it was faster than the AMD counterpart as well, near the top end. Prescott and the use of DDR2 ram is a giant step backwards in many areas.
I have heard rumblings about Intel working on a new part (code name starts with a 'C' if I recall) which should alleviate the thermal issues. I anxiously await it.
-
Originally posted by Wolf14
:
Did Northwood pretty much quit making cpu's and if so why?
also
If Prescotts are having problems with heat, why are they the mainstream chip now? Is it cost?
I buy about 60 -70 800MHz FSB P4 Processors a month. I'm always looking for Northwood Cores. Hard to Find now. We've tested the Prescott Core Processors and they do indeed run too hot for our application (enclosed in a housing and frame). I'm not sure what the deal with the Prescotts is....but we are actually re-engineering to use them because the Northwoods are goners. :confused:
Woof
-
Originally posted by Skuzzy
I am just disappointed MD. Intel had a good base design in Northwood and it was faster than the AMD counterpart as well, near the top end. Prescott and the use of DDR2 ram is a giant step backwards in many areas.
I have heard rumblings about Intel working on a new part (code name starts with a 'C' if I recall) which should alleviate the thermal issues. I anxiously await it.
Hehehe... I thought about this thread yesterday when we were discussing a new ILD that we developed that actually looks like it might work. If things go as planned, you'll read about it in 6 months or so.
I actually mentioned that someone was thinking this would cause the chips to be cooler and everyone started laughing.
I totally agree on the design, but I doubt it's for the same reasons. Right now Intel is able to manufacture a processor that is running 50% faster than its counterpart with similar heat emission. The truly sad part about it is that the processor running at 66% of the speed of the Pentium 4 is performing just as well. That can only point at design. No amount of process changes in the world is going to solve that problem.
I haven't heard of a new chip except maybe the "Nehalem". It's important to remember, though, that chip names are design-based, not process-based.
I will say, though, that some things are starting to look promising that will revolutionize the concept of "dual core". Now, if they'd just do something with the "core" itself, I'd be happy.
-
c = "cedar mill" (I believe). I have no idea what it is.
-
No offense taken. Being outside the design process only allows outsiders to guess and speculate.
Only two reasons they would laugh about it: 1) they had already done the research and found it would not help, or 2) they choose to be arrogant and aggressively reject ideas which did not start within their clique.
No offense to you in that. I have worked many years in, around, and on chip design and found many, if not most, chip designers to have a pretty arrogant attitude towards outsiders.
I do agree whole-heartedly that the design is the issue with the current generation. I just wonder how they went from a decent design to the train wreck which is Prescott. Different teams? No internal communication? Did the Prescott team have no access to the Northwood design (one of those brilliant, clean-sheet designs)?
It's none of my business, but curiosity is a driving thing.
-
It's not a team thing skuzzy. We know quite clearly why there is a need for Low-K films. After all, we are the low-k group. Reducing power consumption was never a target of the group. Being able to place lines closer together was. Capacitance is an issue because of the power necessary to run the chip. Reducing the capacitance would not allow you to run the chip at reduced power; it simply allows you to bump up the power with less impact on the line next door.
Now... reducing the power needed at a transistor is an effective control for thermal issues. The problem is, this doesn't occur inversely to the proximity of the transistors. We end up dropping transistor power requirements by 15% while increasing the density 35%.
Also, I'm pretty sure the Prescott and Northwood people had a little more exposure to each other than you've been led to believe. Even for a large company, we communicate just a bit better than that. Especially when it comes to things that are going to be manufactured in our virtual factory (all sites simultaneously). I do work with the design groups occasionally on our test vehicles. It really is a small circle once you hit the management in these teams. Everyone knows everyone and they all regularly talk/meet/plan.
-
So what is the new Pentium D I keep hearing people talk about? Is it an improvement? I've always pretty much stayed with AMD, but I try to stay open to new things.
-
Oh, I was not led to believe anything MD. I was just speculating. It's the outside-looking-in thing. Trying to make the best guess as to how Prescott got a bit fubared (from outward appearances).
I understand the issues involved with thermal control as density increases. It is an issue. But I will maintain there are those who have successfully done 90nm with higher transistor counts than Prescott (ATI GPUs, for example) which run significantly cooler. I am not sure of the exact densities ATI is using on their current parts though. The die sizes do not appear to be that large.
I attribute some of this to possibly better layout of the part. What would you attribute it to (I am asking out of curiosity, and you probably have a better idea than I do)?
-
Originally posted by Mini D
Hehehe... I thought about this thread yesterday when we were discussing a new ILD that we developed that actually looks like it might work. If things go as planned, you'll read about it in 6 months or so.
I actually mentioned that someone was thinking this would cause the chips to be cooler and everyone started laughing.
I totally agree on the design, but I doubt it's for the same reasons. Right now Intel is able to manufacture a processor that is running 50% faster than its counterpart with similar heat emission. The truly sad part about it is that the processor running at 66% of the speed of the Pentium 4 is performing just as well. That can only point at design. No amount of process changes in the world is going to solve that problem.
I haven't heard of a new chip except maybe the "Nehalem". It's important to remember, though, that chip names are design-based, not process-based.
I will say, though, that some things are starting to look promising that will revolutionize the concept of "dual core". Now, if they'd just do something with the "core" itself, I'd be happy.
Please don't confuse Intel's pseudo 'dual core' with AMD's true 'dual core' design.
Plenty of stuff on the web explaining why Intel's offering isn't true dual core, far too techy for me to understand. Something about the way the two 'cores' are linked or communicate.
-
What I understand of it is this, Kev. The Pentium "dual core" is basically two physical CPUs glued together that still share the same voltage, run in the same power state, and have to communicate across an external FSB. Any core-to-core communication is slower because of this. The AMD on-chip northbridge speeds things up a lot. I'm re-reading an excellent writeup on it from last month that answers my own question from up above as well.
http://anandtech.com/printarticle.aspx?i=2397
Talks about the current 64s, the upcoming stuff, and compares it to the Intel offering.
-
I thought Prescott's problem was leakage current, not resistance.
Anyway, the design had a lot of new technology in it that Intel wanted to performance-test. I think they hoped the heat would be easier to deal with and they could bump up the clock to 4+ GHz. Now that they have found out they can't, they will wait for the next design to do that.
Intel's dual core is just as much a dual core as AMD's. Both are just a cheap start into the multi-core future, slapping two single-core chips together. That is good for programs that are multithreaded in the way it's thought of today.
Real future technology is something like the Cell processor, where you have 10 processing cores on one chip instead of one. But to use it efficiently the programming has to be different, and the software producers already moan that it's hard to get even 2 processors to work at the same time.
-
Originally posted by Kev367th
Please don't confuse Intel's pseudo 'dual core' with AMD's true 'dual core' design.
Plenty of stuff on the web explaining why Intel's offering isn't true dual core, far too techy for me to understand. Something about the way the two 'cores' are linked or communicate.
LOL!
Dude... neither has anything to do with what I'm talking about. Like I said, it will be very interesting and I doubt very much that anyone in the microprocessor industry will be doing it. We still have a lot of bugs to work out before the announcement, but we've gotten the chips to yield. In about another month the next batch, which fixes some lithography alignment issues, is due out. It might yield well over 650 ISO. If it does, I think there will be some serious internal discussions about an announcement. The problem is, it's only going to double the density of transistors yet again.
Schutt, more leakage occurs in the transistors than anywhere. The irony is that this is the job of the "High-K" group to fix.
-
Did Northwood pretty much quit making cpu's and if so why?
Wolf, Northwood is the code name for the previous generation of Pentium 4 CPU. Probably the best CPU Intel ever made. Definitely better than anything they currently make for the consumer.
This thread is way too long to read and see if someone else said this already.
I just bought one of the 3.0 CPUs and had to get Northwood because my motherboard was Prescott-unfriendly. I'm not exactly sure how new it is, but it's not an old one.
Northwood P4 3.0 GHz (http://www2.newegg.com/Product/Product.asp?Item=N82E16819116163)
-
I missed this exchange:
Originally posted by Roscoroo
talk about a screw up .. shoulda bought intel stock when it was @ .42 the other day .. today its at 23.08 coulda made 4500. off of a 200. investment ...
Huh?
Originally posted by Skuzzy
There ya go Rosco. Was Intel that low?
Not sure what that has to do with the discussion though.
Are you attempting to allude to the idea that they are doing well in spite of themselves? Just proves how many suckers are available in the marketplace. Most people have no clue, and Intel is probably grateful for that.
Huh?
Intel Stock has not been anywhere near $0.42 ever. It's been bouncing around between $22-$24 for a while now.
The one thing that is abundantly clear is that the value of Intel stock has very little to do with the product that is available. The two are totally unrelated. Market trends have a much more significant impact.
-
Originally posted by Mini D
LOL!
Dude... neither has anything to do with what I'm talking about. Like I said, it will be very interesting and I doubt very much that anyone in the microprocessor industry will be doing it. We still have a lot of bugs to work out before the announcement, but we've gotten the chips to yield. In about another month the next batch, which fixes some lithography alignment issues, is due out. It might yield well over 650 ISO. If it does, I think there will be some serious internal discussions about an announcement. The problem is, it's only going to double the density of transistors yet again.
Schutt, more leakage occurs in the transistors than anywhere. The irony is that this is the job of the "High-K" group to fix.
You mentioned dual cores, just pointing out that there are differences between the two designs.
No-one will be doing it?
AMD selling Opteron Dual Cores now.
Intel paper launched theirs few weeks back.
So someone must be doing it.
Yup transistors double again, yet the 'dual core' Opterons manage to do this with a min power usage of 64W. Only slightly higher than their single core counterparts.
-
The main reason Intel's Prescott is so bad is marketing.
Ever since the P6 core (P3, and the current P-Ms) hit a wall at 933 MHz, they had to make a "quick fix". They did this by doubling the pipeline length. This made the processor less efficient, but made the clock speed great.
Clock speed sells. People see 3.2 GHz and buy it over a 2 GHz AMD.
They then hit a wall at 3.4 GHz with Northwood, and tried the same cheap pipeline increase on the Prescott, which has failed...
Also, I think the ALUs in the CPU are double-pumped, i.e. in a 3.2 GHz CPU they run at 6.4 GHz... that's HOT stuff.
However, in the background, developing behind the desktop team, the mobile team has "fixed" the P6 core and we now have a VERY VERY good CPU in the P-M: 2 GHz, 2 MB cache, much faster than Prescott, and it even beats A64s. However, Intel continues to push Prescott for some unknown reason.
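A rough way to see the pipeline trade-off in numbers (a toy model; the stage counts, branch rates and clocks below are picked for illustration, not real Northwood/Prescott data): every mispredicted branch costs roughly a pipeline's worth of cycles, so a deeper pipeline lets the clock climb but drags down the work done per clock.
[code]
/* Toy model of deep pipelines: effective CPI = base CPI
   + (branch fraction) * (mispredict rate) * (flush penalty ~ pipeline depth).
   All inputs are invented for illustration. */
#include <stdio.h>

static double ginstr_per_sec(double clock_ghz, double pipeline_depth)
{
    double base_cpi        = 1.0;  /* cycles per instruction with no mispredicts (hypothetical) */
    double branch_fraction = 0.20; /* 1 in 5 instructions is a branch (hypothetical) */
    double mispredict_rate = 0.05; /* hypothetical */
    double cpi = base_cpi + branch_fraction * mispredict_rate * pipeline_depth;
    return clock_ghz / cpi;        /* billions of useful instructions per second */
}

int main(void)
{
    printf("shorter pipeline, 20 stages at 3.4 GHz: %.2f G-instr/s\n", ginstr_per_sec(3.4, 20.0));
    printf("longer pipeline,  31 stages at 3.6 GHz: %.2f G-instr/s\n", ginstr_per_sec(3.6, 31.0));
    return 0;
}
[/code]
In this made-up example the deeper pipeline needs a much bigger clock jump than it actually got just to break even.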
-
Originally posted by Kev367th
You mentioned dual cores, just pointing out that there are differences between the two designs.
And I am pointing out that has nothing to do with what I am talking about.
No-one will be doing it?
AMD selling Opteron Dual Cores now.
Intel paper launched theirs few weeks back.
So someone must be doing it.
Why don't you just stop trying to make everything an Intel vs AMD debate? It gets old rather quickly.
Yup transistors double again, yet the 'dual core' Opterons manage to do this with a min power usage of 64W. Only slightly higher than their single core counterparts.
Hehehe... you sure you meant to use "minimum" there?
-
Originally posted by Mini D
And I am pointing out that has nothing to do with what I am talking about.
Why don't you just stop trying to make everything an Intel vs AMD debate? It gets old rather quickly.
Hehehe... you sure you meant to use "minimum" there?
As previously stated, YOU mentioned dual cores.
Wasn't an Intel V AMD, pointing out BOTH companies have launched their dual core CPUs.
If both companies have launched their dual cores then contrary to your quote "I doubt very much that anyone in the microprocessor industry will be doing it" - obviously SOMEONE must be doing it.
Yup in min power mode dual core Opterons run at about 64W at around 1.1v or 1.0v, can't remember exact figures..
-
You simply decided that the mere mention of the phrase "dual core" was an opening for you to bring AMD into the conversation. Hey... show off a bit more if it makes you feel more secure. Just don't waste it on me, I'm not impressed with people quoting internet babble as if it's a religion.
I'll say this once very slowly for you:
I am not talking about anything that AMD or Intel has announced. I am talking about something completely different. There is a world that exists outside of tom's hardware.
-
Toms Hardware - True, but I stopped reading his babble years ago.
Obviously when you said quote - "I doubt very much that anyone in the microprocessor industry will be doing it" in relation to dual cores, you weren't talking about AMD or Intel.
In which case I apologise, as they can't be classed as anyone.
-
Sigh... it was in regards to a new way to do dual core you twit. You just refuse to see that.
Let me set you straight:
I said:
I will say, though, that some things are starting to look promising that will revolutionize the concept of "dual core". Now, if they'd just do something with the "core" itself, I'd be happy.
you said
Internet babble about paper releases and AMD being awesome
I said
I'm not talking about what Intel and AMD have announced.
you said
You brought up dual core
Read that as many times as you need for it to sink in. THERE IS MORE TO DUAL CORE THAN WHAT YOU HAVE READ. When I said "revolutionize" and "will announce", that tends to say YOU HAVEN'T READ ANYTHING ABOUT IT YET.
You decided to jump into a thread and spew your AMD rhetoric like I've seen you do any other time I've talked about the goings-on at Intel. This was done (every time I've seen it) completely outside of the discussion at hand and for no reason other than spewing unwanted propaganda. I feel sorry for you and I'm done trying to explain how, really, you don't know what you're talking about.
-
I am not specifically AMD orientated.
I have gone from Cyrix years ago, to Intel, to AMD, to Intel, and back to AMD again. Even used Motorola.
Same goes for vid cards, used both nVidia and ATI.
I will put my money where I get the best bang for my buck.
There was no need to take things to a personal level, but then I should have really expected it.
End
-
Kev, try posting without bashing Intel and singing AMDs praises some day. Then get back to me on making it personal.
-
Well I'm sorry if saying anything about Intel that isn't praising them is personal.
Bit like someone criticizing Mitsubishi and me taking it personally because it's the make of car I own. GEEZ!
If they ever produce another CPU that doesn't have the power/thermal characteristics of your average toaster oven, isn't a kludge, and at a good price, I'll be MORE than happy to buy one.
-
Not praising them?
Dude... you rag on them whenever you get the chance. You sing the praises of AMD whenever you get the chance. Your "the Intel dual core isn't REALLY a dual core" statement pretty much sums you up. You're ignorant. If you bought a Cyrix, then you've most likely been that way for some time.
Not liking Intel's product offerings is one thing, but you hate the company. In a thread where I'm talking about working for that company, how can I not take it personally?
Like I said, learn to post without automatically dogging Intel and praising AMD and you just might get a bit of credibility. Till then, you're just an internet-fed lackey with a serious chip on his shoulder in regards to Intel.
-
Originally posted by Mini D
Kev, try posting without bashing Intel and singing AMDs praises some day. Then get back to me on making it personal.
Sorry, but 90% of the review sites do the same... bash Intel. Why? Well, read this thread.....
The P4 Prescott sucks (heat, very poor IPC).
DDR2 sucks (brought in by Intel).
Their dual cores are NOT designed in a good way.
The P-M rocks, but Intel doesn't want to use it on the desktop. So their only GOOD product, they are keeping from us.
That's the point of this thread: telling the original poster that he shouldn't touch a P4 with a barge pole. And it wasn't Kev that said that.....
-
Think Overlag just about sums it up, although he did forget to mention the Rambus fiasco.
But then what do us poor ignorant end users know.
Let's look at two occurrences -
AMD launches 64-bit CPUs - Intel paper launches the P4 EE the same day.
Intel paper launches "dual cores" on Monday April 18th, after AMD made it public they would be launching (as in available) Opteron dual cores on the 21st.
Now AMD is shipping (not paper launching) the X2, the desktop version of the dual core.
^^^^^^not praises - FACT^^^^^^^
Unfortunately Intel dropped the ball about two years ago and is now rushing to catch up, hence the EM64T and the kludged dual core offerings.
Kind of 'lucky' Microsoft 'just happened' to release XP64 within a week of Intel's 64s (and not a lot earlier), or Intel may have been even further behind.
I hope I'm wrong and Intel has some secret all-singing/dancing CPU up and coming; we need Intel and AMD competing side by side with comparable CPUs to keep the prices down.
If Intel's CPUs actually equalled all their hype we'd ALL be happy. Luckily they have managed to convince the average 'Joe Public' that a higher MHz/GHz CPU is better than a slower one. Something that non-average "Joe Public" is now realising is not true.
Notice - Not ONE personal attack.
I NEVER said I hate Intel, geez talk about having a chip on their shoulder.
Prove to me Intels dual core are true dual core then - you can't, they're a kludge.
Anyway a quick question - Will DDR3 use the same slot as DDR2?
-
Well said Kev, and I can't believe I forgot Rambus!
It's not our fault Intel aren't giving us anything to be "pro" about. But does that make me anti-Intel? Maybe anti-Prescott. I'd LOVE to see those Dothans (spelling?) reach the desktop, with DDR400, PCI Express etc. But Intel won't do that... because it would be admitting defeat.
AMD make the best chips around right now, but does that make me an AMD fanboy? Well, I guess yes by other people's logic... but until Intel gives me something worthwhile, and worth singing praise about, then what can I do?
I can, if you want, go on and on about my P3E 500 that did 800 MHz at default volts and stock cooling. I can, if you like, go on about those P-M Dothans (I do already)....
I nearly bought a Northwood, but just as my funds cleared for a £1.6k system, the A64 came out and I spent £500 on an A64 3200... man, that's shocking to say now, lol.....
Oh, and the one Intel rig I have access to is my sister's 3.06 Prescott laptop (err, bad mix?!?!) and the only thing it's good at is BURNING MY LEG, and making a lot of noise (I hear it OVER my desktop rig).
-
Overlag -
I actually have 4 other PC's, all Intel.
I use them for servers as Intel makes excellent chipsets.
OH MY GOD did I just praise Intel!!!!!
Anybody who doesn't think there was some back-scratching regarding XP64 and Intel's EM64T is just being naive.
-
Yes, you've just spewed every bit of rhetoric out on the internet. Way to back him up, overlag.
The truth is that 32/64-bit implementation is worthless despite MS's offering of a 64e XP. The drivers are not out there to adequately support it and the program compatibility will be abysmal for some time. But continue to sell that Opteron as if the 64-bit competition meant anything.
As far as whose implementation of 64-bit extensions is better... I'm really curious as to just where you got your "facts" on that.
Now, let's get to "dual core". I find the discussion ironic given that most AMD users have been trying to explain how hyperthreading was pretty worthless since most programs don't support multithreading anyway. Now, we get to AMD's response to hyperthreading and pretend that it's an original idea. OK... we move to Intel's response to observing that people are actually stupid enough to pay $3000 for a processor and introduce a dual core hyperthreading chip. Yessir... that's 4 for the price of 2. I'm curious to see just how rabid AMD fans start to claim that multithreading doesn't matter unless it's on an AMD chip.
Yes... hold on to that "true" dual-core mantra. It doesn't reek of zealot at all.
The power debate is also one that I find quite amusing. This is a case of "find a chink in the armor and poke at it" if I've ever seen one. If you don't believe me, read back and see what was being said when the tables were turned there.
Personally, I tell of my own experience with processors and, in this case, processes. I don't give advice on which processors to buy, though I do keep an eye out for obvious liars. I've seen at least one in this thread.
-
Originally posted by Kev367th
Overlag -
I actually have 4 other PC's, all Intel.
I use them for servers as Intel makes excellent chipsets.
OH MY GOD did I just praise Intel!!!!!
Aye, I had been using two PII-350s on BX boards for almost 10 years as file servers/game servers and SETI crunchers.
And my overclocking adventures with my P3E 500 killed something... I think the MB, so I've got that "super chip" sitting staring at me doing nothing :(
I've liked Intel for ages. As Skuzzy says, their chipsets are great... until you hit the 775 socket (I forgot that in that list earlier!), Prescott and DDR2; then it just goes downhill.....
I'd have to say MD is more of a "fanboy" because he's singing praises of something that ISN'T good..... and I'm sorry if you see that as an insult.
-
Originally posted by Mini D
Yes, you've just spewed every bit of rhetoric out on the internet. Way to back him up, overlag.
The truth is that 32/64-bit implementation is worthless despite MS's offering of a 64e XP. The drivers are not out there to adequately support it and the program compatibility will be abysmal for some time. But continue to sell that Opteron as if the 64-bit competition meant anything.
As far as whose implementation of 64-bit extensions is better... I'm really curious as to just where you got your "facts" on that.
Now, let's get to "dual core". I find the discussion ironic given that most AMD users have been trying to explain how hyperthreading was pretty worthless since most programs don't support multithreading anyway. Now, we get to AMD's response to hyperthreading and pretend that it's an original idea. OK... we move to Intel's response to observing that people are actually stupid enough to pay $3000 for a processor and introduce a dual core hyperthreading chip. Yessir... that's 4 for the price of 2. I'm curious to see just how rabid AMD fans start to claim that multithreading doesn't matter unless it's on an AMD chip.
Yes... hold on to that "true" dual-core mantra. It doesn't reek of zealot at all.
The power debate is also one that I find quite amusing. This is a case of "find a chink in the armor and poke at it" if I've ever seen one. If you don't believe me, read back and see what was being said when the tables were turned there.
Personally, I tell of my own experience with processors and, in this case, processes. I don't give advice on which processors to buy, though I do keep an eye out for obvious liars. I've seen at least one in this thread.
Oh well, see my post above :rolleyes:
The A64 isn't JUST about 64-bit; it's about much higher IPC, the on-die memory controller, and the built-in ability for dual core from the first design.
I think you've been living under a rock for the past 2 years if you think the A64 is just about 64-bit.
Edit again.... can I suggest you read some reviews on the pros and cons of BOTH multi-core designs, so that you may understand what we mean by "true dual core"?
-
Liar - You want to get personal, fine.
That's rich coming from someone who:
1) Works for a company that has hoodwinked the average "Joe Public" for years now with the faster-the-processor-the-faster-the-machine bullcrap. The original poster of this thread is an example; he thought higher clock speed = faster system (no offence gnubee).
2) Works for a company that has questionable business practice regarding threatening retaliation against PC makers who used AMD chips. Currently the subject of a lawsuit in Europe and Japan.
Obviously you've been well indoctrinated/brainwashed into the Intel rah, rah, rah club. I applaud your company loyalty, but get your head out of the sand and look around.
Yes the P4 dual core is a kludge; it's a piss-poor attempt to catch up on something that was always designed to be part of the AMD64s from day one. Hence the use of the FSB to communicate between the two cores.
$3000 for a CPU, maybe if you bought an Intel dual core toaster oven. Desktop AMD dual cores are supposed to start at around $500, big difference.
Hyperthreading - Well, it makes sense for AMD dual cores to have BACKWARDS compatibility with Intel's hyperthreading implementation. Will be interested to see how true native HTT stacks up against it.
Chink - Heat produced isn't a chink it's a gaping bleeding festering wound in Intels side.
You can spout on all you want about whats in development, for Mr average user it's whats on the street that matters.
FACT - AMD dual core are on the street
FACT - Intels aren't.
FACT - AMD runs cool
FACT - Once your P4 has died it's still useful for warming plates, making popcorn etc.
-
And to be fair, MD has turned it into this sort of thread.... and lumps all of us into the same group... i.e. he said "we" didn't think HT was worthwhile.
Well, wrong. HT is what nearly swung me. HT, for instance, on my sister's laptop makes it "smoother" in Windows. However, with SINGLE-THREADED tasks (games) it's next to useless, and the same can be said for the dual cores coming out. Although HT has helped the trend for software makers to start making multithreaded software, even GAMES (soon).
However, HT won't work on an AMD CPU due to the fact that AMD CPUs are highly efficient and have a high IPC.
HT helps Intel's long pipeline design greatly, but HT on an A64 or a P-M just isn't worth the effort as the CPUs won't really benefit.
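To make the single-threaded point concrete, here's a minimal sketch using POSIX threads (the workload is invented; on a single-core, non-HT CPU the two-thread half shows no speedup, while on HT or dual core it should):
[code]
/* Minimal sketch: the same fixed workload done in one thread, then split across two.
   A single-threaded game is stuck in the first case no matter how many logical CPUs exist.
   Build with: cc -O2 -pthread threads_demo.c */
#include <pthread.h>
#include <stdio.h>
#include <sys/time.h>

#define TOTAL 400000000UL

static double now_seconds(void)
{
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return tv.tv_sec + tv.tv_usec / 1e6;
}

static void *spin(void *arg)
{
    unsigned long n = *(unsigned long *)arg;
    volatile double x = 0.0;        /* volatile so the compiler keeps the loop */
    unsigned long i;
    for (i = 0; i < n; i++)
        x += 1.0;
    return NULL;
}

int main(void)
{
    unsigned long whole = TOTAL, half = TOTAL / 2;
    pthread_t a, b;
    double t;

    t = now_seconds();
    spin(&whole);                                   /* the "single-threaded game" case */
    printf("one thread:  %.2f s\n", now_seconds() - t);

    t = now_seconds();
    pthread_create(&a, NULL, spin, &half);          /* the same work split across two threads */
    pthread_create(&b, NULL, spin, &half);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("two threads: %.2f s\n", now_seconds() - t);
    return 0;
}
[/code]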
-
True, but I get a sneaking feeling that we will start to see more and more multithreaded apps/games, and also more 64-bit versions of them now.
I am sure MiniD will correct me if I'm wrong here -
The next major release/redesign will be the "Merom/Conroe/Woodcrest" series in LATE 2006.
They will be missing Intel's failed NetBurst.
All apart from Whitefield will be dual core initially.
At lower clockspeeds than current P4 offerings.
At first they will STILL use the FSB, until the first 4-way "Whitefield" core.
Wonder where AMD will be by then considering they have already demoed 4 way cores :)
Of course if Intel's offering proves to be superior to AMD's and at a comparable price I will go for an Intel, no brand loyalty for me.
You see I don't hate Intel I just go for the best available at a given cost, and at the moment it's AMD64.
Unfortunately the majority of people don't work for Intel (I assume you get 'deals' on CPUs) and cannot afford the high-priced stuff.
For you to say that because I bought a Cyrix processor years ago I am ignorant just shows how ignorant you are. Ever consider it was bought because it was cheap and that's all I could afford at the time?
Now go away and cook your dinner over your P4, like a good little Intel robot.
-
Overlag, show me where I've sung the praises of Intel processors in this thread. I dare you. And... if you don't want to get lumped in with Kev, stop blindly defending him. He's way out of line on virtually every post in this thread, and just keeps going with lies... erm, "facts". Yes... that AMD is a really cool chip alright. I'm surprised it even needs a heatsink at all.
The HT comment was simply brought up to highlight how things are going to swing as far as "opinions" and "facts" go. Multithreading was downplayed yesterday, yet it's the big thing today. Typical for the likes of Kev. If he had a bit of a memory, he'd remember when FPUs were downplayed prior to the K6-2 (when AMD finally bought a decent design team with an excellent FPU design).
The thread is about 64 bit pentiums. These chips, much like the Opteron (no matter what it's "optimized" for) are pretty much worthless right now and will not do anything to enhance gaming performance in regards to 64bit processing. Neither will multi-core processors. Not for a couple of years, at least.
Kev, you really need to pay attention to releases and exactly what they mean for each company. You're talking like you really don't understand the whole process. You also need to take a bit of a look at pricing.
To both of you: don't read "stop slamming on Intel and always praising AMD" to mean anything other than that. If you can't see the pros and cons of both, you are being a fanboy. I have not slammed AMD processors in this thread. I have not done that for 6 years now... since they actually became competitors. It just seems some people go out of their way to create disparities that don't really exist.
-
Already said that if Intel brings out a better chip at a comparable price I'd buy it. Just nothing on offer from Intel yet.
The majority are not in the privileged position of getting their stuff (probably at a reduced price) the way you do.
Just tell me where I have lied.
Yup for the moment 64-bit is not what it could be, but if you notice I also said now that Intel have released theirs we should see more and more 64 bit apps/games.
Same goes for dual cores.
Releases - Thats why I said you would point out any mistakes in my previous post.
Expected pricing of X2 desktop dual core
Athlon 64 X2 4800+ 2.4 GHz 1 MB $1001
Athlon 64 X2 4600+ 2.4 GHz 512 KB $803
Athlon 64 X2 4400+ 2.2 GHz 1 MB $581
Athlon 64 X2 4200+ 2.2 GHz 512KB $537
Personally I'll wait for the prices to drop a little before buying in, if I buy in. I would also expect the cost to be slightly higher 1st release because of the obvious hike the resellers will add to max profits.
Not quite the $3000+ you mentioned.
Only CPU that comes close is the top of range max 8 CPU one
$2655 - Opteron 875 Dual-Core 2.2GHz (hardly a gaming processor)
In fact I think AMD are making a mistake going to socket M2 for DDR2. They should skip it and just wait for mainstream DDR3.
I resent being called ignorant just because of the fact I had no choice but to buy a Cyrix processor because it was cheap.
In fact that comment did nothing but show how ignorant/arrogant/blinkered you are.
-
Kev,
You're doing it again. You can stop lying any time in some weak attempt to make yourself look unbiased. If you're doing nothing but slamming the hell out of Intel processors right now (hell... every time I've seen you post) and pushing AMD processors, you are biased. There is no other way to say it. Actions speak louder than words.
Also, you do realize the 800 series (the most expensive processors) are the only dual core processors that've "been released"... right? One place seems to be "carrying" them. The rest are due out later next month. And the AMD stuff is going to be at least twice as expensive as the Intel stuff. The Pentium D is going to be around $250.
I'm sure you'll spend the next month trying to convince everyone that the AMD processors are well worth spending twice as much on. If only Intel made a better processor...
-
Well, I need a shrug icon...
The guy is asking about the 64-bit P4, and we are telling the truth: an A64 would be much better value. It's faster AND cheaper..... period.
The A64's 32-bit performance is far ahead of a P4 Prescott, and a little ahead of the Northwood. The P-M and A64 are just about level; about 200 MHz extra is needed on the Intel chip to equal the A64.
I'm not sure what your argument is, or what we are lying about either. I'm just about done with your 100% Intel attitude. At least me and Kev are truthful... as soon as Intel passes AMD, I'd buy it. We buy the best at the time, and we recommend the best to others too, sorry about that.
Funny how you compare the top-end server AMD chip's price to the bottom-end Intel chip's price...... unbiased??? :confused:
-
Actually, he's asking about 64bit performance for gaming. The proper answer is not Intel, nor AMD. It is "no".
64-bit does not have a place in the home right now. It will not for some time. I'm sitting here with two 64e processors running a 32-bit OS because I need compatibility with hardware and software. That's what most people will be doing with a 64-bit processor too.
In some people's fervor to insist the AMD processor is the king, they've missed the point of the thread. Money can be better spent considering things other than 64-bit capability. It is worthless for gaming.
-
Sorry, but 64-bit chips run 32-bit just fine. These chips are built for the change... from 32 to 64... and perform well under both.
If the guy has a high-speed P4 Northwood and is ONLY upgrading because he's wondering about 64 bits, it's not worth it.
However, if he is upgrading no matter what, he'd be better off buying 64-bit for the "future". This is where you'd say Intel, even though they are slower and more expensive, but that's just the way you do things. Me, Kev and even Skuzzy tell it like it is.
-
Sigh...
What OS do they run in 64-bit mode on, overlag? Is that a good OS for gaming? In truth, 99% of all people with 64e-capable chips will be running a 32-bit OS. 64-bit is irrelevant in the world of gaming.
"In the future" is irrelevant in the world of processors. By the time 64-bit is mature, any processor built this year will be obsolete.
The real question is what chip gives the best performance. The high-end Opterons are definitely the choice here, but they're very expensive. The P-4 Extremes are also very expensive and don't really add that much in performance gain, but they do use the word "extreme" and that seems to help.
The real options occur in the <$300 chips. An Opteron 3500+ and an Intel P4 550 (3.4 GHz) are both very good options. The Opteron gives about a 15% performance benefit, which will be better for games, and the P4 offers hyperthreading, which is better for day-to-day use. Both offer the opportunity to buy the CPU and a high-end video card for the price of a high-end CPU without a significant performance drop.
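To put the value argument in rough numbers (using the ballpark prices and the ~15% figure from this thread purely as placeholders, not quotes):
[code]
/* Rough frames-per-dollar sketch. The fps numbers, percentages and prices are the
   ballpark figures tossed around in this thread, not benchmarks or price quotes. */
#include <stdio.h>

static void report(const char *name, double fps, double price)
{
    printf("%-24s %.0f fps / $%.0f = %.2f fps per dollar\n", name, fps, price, fps / price);
}

int main(void)
{
    double base_fps = 110.0;                               /* the rough HL2-type number used above */

    report("~$280 Intel part:", base_fps, 280.0);
    report("~$280 AMD part:", base_fps * 1.15, 280.0);     /* the ~15% edge mentioned above */
    report("~$1000 high-end part:", base_fps * 1.25, 1000.0); /* hypothetical further ~10% */
    return 0;
}
[/code]
Past the ~$300 point you are paying a lot per extra frame, which is the argument for putting the difference into the video card instead.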
-
Originally posted by Mini D
Sigh...
What OS do they run in 64-bit mode on, overlag? Is that a good OS for gaming? In truth, 99% of all people with 64e-capable chips will be running a 32-bit OS. 64-bit is irrelevant in the world of gaming.
"In the future" is irrelevant in the world of processors. By the time 64-bit is mature, any processor built this year will be obsolete.
The real question is what chip gives the best performance. The high-end Opterons are definitely the choice here, but they're very expensive. The P-4 Extremes are also very expensive and don't really add that much in performance gain, but they do use the word "extreme" and that seems to help.
The real options occur in the <$300 chips. An Opteron 3500+ and an Intel P4 550 (3.4 GHz) are both very good options. The Opteron gives about a 15% performance benefit, which will be better for games, and the P4 offers hyperthreading, which is better for day-to-day use. Both offer the opportunity to buy the CPU and a high-end video card for the price of a high-end CPU without a significant performance drop.
I really don't understand your problem.
So you are saying you WON'T buy an A64 because it can run 64-bit? Even though it's the fastest 32-bit CPU?
What is an Opteron 3500+?
-
No... I'm saying 64-bit is irrelevant to the discussion of gaming. No matter what. It's irrelevant to desktop CPUs on the whole right now. There is no consumer-market-level support for it. Saying otherwise, at all, is an outright lie.
The 3500+ is the 2.2ghz opteron. Very comparable to price and performance with the P4-550. Both Intel and AMD go up about $200 per 10% gain after that... so you can go from 110 fps with HL 2 to 126fps. Yippy.
-
Originally posted by Mini D
No... I'm saying 64-bit is irrelevant to the discussion of gaming. No matter what. It's irrelevant to desktop CPUs on the whole right now. There is no consumer-market-level support for it. Saying otherwise, at all, is an outright lie.
The 3500+ is the 2.2ghz opteron. Very comparable to price and performance with the P4-550. Both Intel and AMD go up about $200 per 10% gain after that... so you can go from 110 fps with HL 2 to 126fps. Yippy.
The 3500+ is an A64, NOT an Opteron. The Opteron is a socket 940 chip for servers and doesn't have ratings like that; it has 1xx, 2xx and 8xx...
So you have an A64 3500 Venice for £199, a P4 650 for £270, or a P4 550 for £190. What would you choose?
I'd choose the 2nd cheapest, and fastest: the A64 3500.... Of course, for £240 you can get a San Diego 3700 which would be miles faster than any of them, yet still cheaper than that P4 650.
You would, however, choose anything without 64 in its name/specs? Because you are almost saying in most of your posts that having 64-bit now disadvantages you. It doesn't disadvantage you at all. Though saying that, I guess it does, because "your" chip maker charges extra.......
-
Originally posted by Mini D
The 3500+ is the 2.2ghz opteron
And this is coming from the expert LOLOLOLOL.
-
Originally posted by Kev367th
And this is coming from the expert LOLOLOLOL.
That's what I thought :eek:
-
Well at least it explains one thing - why Intel have fallen behind - they don't even know what they are up against.
Seriously
In MiniD's opinion a 64-bit CPU is not worth it because of app/game/OS support. Hmmmmm.
Using AMD only as an example so don't get your panties in a twist MiniD
If I was to spend up to $1000 on a CPU I shouldn't buy an FX-55 then?
Even though it is the best gaming CPU out there?
Instead I should buy the fastest XP CPU, i.e. the 3200 XP?
MiniD, do you actually realise the insanity in what you're saying?
P.S. Opterons are NOT really a gaming chip; having to use ECC memory with them hurts their performance.
Oh, I get it, because I said I would buy Intel I'm lying. Like I said, I have 4 other machines in the house, and I checked to make sure.
3 Intel (1 P4, 1 P3, and 1 P2) 1 AMD (XP3000)
I need them because I am a network admin and use them to try out apps before going live with them on the company network.
In fact I have just put in 16k worth of new servers/network upgrades, all Intel. This is despite the boss thinking about using a blade Opteron. As I explained to him, for servers Intel is still the only tried and trusted solution.
-
The 64bit processor is not worth it simply because there is no support for it. It is irrelevant in the grand scheme of things. It's as irrelevant as a processor with "built in multi-core support" that DOESN'T HAVE MULTIPLE CORES.
I can't believe this is such a stretch of the imagination. Dammit guys, stop being obtuse.
-
So you want us to buy outdated XPs or crappy ultra-long-pipelined Prescotts? :lol Just so we don't get 64 bits?
To be fair, I don't really care that my A64 has 64-bit capability. It's the fastest gaming CPU. The bonus is it runs Linux 64 and XP 64.
-
64-bit is not a reason to purchase a processor right now. The 64-bit aspect of any processor being released is irrelevant. The 32-bit part is the only part that matters. Especially when it comes to gaming. If you don't believe me, try installing XP-64e and see just how much fun you have trying to load programs and hardware on it.
So... continue to read the rags and pretend you know what the hottest thing on the market is and how in tune you are because you read the product roadmaps on Anandtech. It might help hide the fact that you haven't applied a lick of common sense to anything you've said... especially if you're touting 64-bit extensions without warning people about Microsoft's 64e operating system.
AMD 64 3500+ (thought that was an opteron...)
Pentium 4 550
Both are good chips at a decent price. That still leaves you money to get a fancy new video card.
Or... go with the 3700 that provides a 3% increase in performance for another $120. Yes... that would be swell. There's some sage advice to be giving.
-
When did I advise that about the 3700? I ALWAYS buy the bottom end; I've bought 3 3200s, overclocking them past 3500/3700 ratings.
But right now things are different. Venice and San Diego are out, and both offer overclocks up to around 2.8 GHz. However, the San Diego has 1 MB vs. Venice's 512 KB. And the 3700 only costs £40 more than the 3500. And only 3% improvement? Where the hell did you find that info?
I myself HAVE loaded XP64 RC1 and I HAVE played games on it. RC1 sucked though, and ATI's drivers were even worse back then, but it still worked... not bad for RC1.
And what don't I believe you about?
Where have I recommended 64-bit chips because of their 64-bit usage? And anyway, having 64-bit support IS a good thing. Or are you against extra features?
And I also see you're now recommending an A64 yourself... so you're going against your own argument with me and Kev?
-
I think you're missing the whole point -
Quote -"The 64bit processor is not worth it simply because there is no support for it"
People are buying 64-bit CPUs not for 64-bit support but because they are the fastest gaming CPUs in 32-bit. To advise people that they are not worth getting because of lack of 64-bit support is crazy, when they run just as happily in 32-bit.
Value for money I agree, but a percentage of the population is willing to spend $800+ on a CPU, else Intel and AMD wouldn't have them.
No-one has actually said there is a lot of support for 64-bit OR dual core; it will come.
Quote -"It's as irrelevant as a processor with "built in multi-core support" that DOESN'T HAVE MULTIPLE CORES. "
True but the whole idea was that the support for it was already designed into the die and didn't require a redesign or kludge to make it happen.
I think they had to wait for the 90nm process until it could be realised (I may be wrong).
It's called providing an upgrade path, in this case an upgrade path that does not require a new motherboard, just a BIOS update for socket 939 boards.
About as irrelevant as a design that has a dual core CPU communicating through the FSB. Even you must admit it's a nasty, inefficient hack, but it's the best they could do with the current range.
Any time I have mentioned XP64 I have said to do it as a dual boot with a 32-bit OS also. The only app/game I had problems running under XP64 was Norton AntiVirus.
Found drivers for all my stuff apart from one SCSI RAID card which is very old.
-
Can anyone point me to the facts of "64-bit" for the Athlon 64 and P4?
I thought they still work on the x86 set of instructions and have an extension to the instructions that allows for 64-bit addresses and data?
So to say, the instructions are still 16 and 32 bits long, just a long address/data is possible?
I expect a real speedup as soon as we finally get a new, clean instruction set tailored to 64-bit and a new standard like x86 for that platform, which does away with the old way interrupts are done, using a "system" co-processor.
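One quick way to see the address/data-width side of that from C: the same source builds as plain 32-bit x86 or for the 64-bit extensions, and only the pointer/long widths change (a minimal sketch; the -m32/-m64 switches are gcc's, other compilers differ):
[code]
/* Same source, two targets. Built with "gcc -m32" it behaves like plain x86
   (4-byte pointers); built with "gcc -m64" for the x86-64 extensions it gets
   8-byte pointers and wider registers. The instruction set is an extension,
   not a replacement, which is why existing 32-bit binaries keep running. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    printf("int:      %zu bytes\n", sizeof(int));    /* stays 4 bytes on both targets */
    printf("long:     %zu bytes\n", sizeof(long));   /* 4 on x86, 8 on x86-64 (Linux/Unix ABI) */
    printf("pointer:  %zu bytes\n", sizeof(void *)); /* 4 on x86, 8 on x86-64 */
    printf("intptr_t: %zu bytes\n", sizeof(intptr_t));
    return 0;
}
[/code]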
-
http://techreport.com/reviews/2005q1/64-bits/index.x?pg=1
http://techreport.com/reviews/2005q2/athlon64-x2/index.x?pg=1
-
In my experience, keep with AMD and Nvidia, flee from ATI and Intel.
-
Originally posted by MANDO
In my experience, keep with AMD and Nvidia, flee from ATI and Intel.
Actually I'll go with whatever gives the best value for money, contrary to what MiniD thinks.
If I'm going to spend $xxx amount on a CPU I'll check out what's available and buy the best one. It just so happens that at the moment it's AMD; it will swing back to Intel at some point.
-
Ya know, this was a great read up until Kev & Overlag came in to pounce on MiniD for the sheer fact he's an Intel employee.
Sad to see this thread go down as a pigpile.
-
LePaul -
I didn't even know he was an Intel employee; read back a bit and you'll see where I mention it after he tells me.
It went into a 'pigpile' after the comments, rather than sticking to facts, turned into personal attacks, and that WAS NOT started by me.
-
You dove into a deep, structural discussion between Skuzzy & MiniD about microprocessor design. You felt a need to jump in and take it off-topic... which was too bad. Now it's yet another AMD vs Intel CPU debate.
For what it's worth, I use both... couldn't care less about which is cheaper/better... I was intrigued by the mindsets behind the current designs.
-
Great links whels, thank you, that's what I wanted.
Based on those I expect 64-bit games built for "x64" to run up to 10 percent faster than the 32-bit implementation.
For at least a few more years every game will come out with a 32-bit implementation, and soon a few games will also ship an optional 64-bit executable.
So, a future in gaming? Sure, in a few years all new apps and games will come in 64-bit. But the increase in speed is no big step; remember you get a 100 percent increase roughly every 2 years anyway. Applications tailored to multi-core processors will give a bigger boost than the jump to 64-bit, at least I believe that.
-
Originally posted by Schutt
Great links whels, thank you, that's what I wanted.
Based on those I expect 64-bit games built for "x64" to run up to 10 percent faster than the 32-bit implementation.
For at least a few more years every game will come out with a 32-bit implementation, and soon a few games will also ship an optional 64-bit executable.
So, a future in gaming? Sure, in a few years all new apps and games will come in 64-bit. But the increase in speed is no big step; remember you get a 100 percent increase roughly every 2 years anyway. Applications tailored to multi-core processors will give a bigger boost than the jump to 64-bit, at least I believe that.
I would hope to see a sizeable increase in both 64-bit and multithreaded games by the year's end.
-
Not going to happen, but I have elaborated on that before and you ignore it.
I really am disappointed this thread got so convoluted. And on a side note, I am not impressed with the AMD64 performance. I am still doing better with an old Northwood than those benchmarks whels pointed to.
MD, you know of a good forum where we can carry on a technical discussion? It is painfully obvious we will not be able to do so here. Drop me an email with a link if you know of one. I would like to pursue the discussion.
-
Skuzzy - Both chip manufacturers are 'expecting' to see an increase in multithreaded stuff starting at the end of this year. But as you said this remains to be seen.
As for 64-bit games, they have been trickling out, so I suppose the swap to 64-bit has started, even if slowly.
Would you happen to know if DDR2 slots are going to be compatible with DDR3 when it finally arrives?
-
Originally posted by Skuzzy
Not going to happen, but I have elaborated on that before and you ignore it.
I really am disappointed this thread got so convoluted. And on a side note, I am not impressed with the AMD64 performance. I am still doing better with an old Northwood than those benchmarks whels pointed to.
MD, you know of a good forum where we can carry on a technical discussion? It is painfully obvious we will not be able to do so here. Drop me an email with a link if you know of one. I would like to pursue the discussion.
How does your Northwood beat A64s if no one else sees that any more?
And anyway, MD brought it down.
-
Overlag - Check through other threads and you'll see why Skuzzy gets the performance he does.
-
Skuzzy... I know of a place (http://forums.checksix.net/forum_topics.asp?FID=36) ;)
-
I was not saying 64-bit games are not coming, but there will not be any measurable performance benefits for games, Kev.
Hehe LePaul.
-
CC - Skuzzy, I think we'll see more benefit from multithreaded games in conjunction with 64-bit though.
I seem to remember that the initial change from 16 to 32 bit yielded no great immediate benefits, and it took quite some time before any real benefits did appear.
I think we've kind of hit a plateau at the moment; it needs time for both 64-bit and dual/multi-core to mature before we can expect to see exactly what advantages can be gained.
On a positive note, it now seems XP64 can run 32-bit games with less of a drop in speed than during the betas/RC versions.
I tried AH2 under XP64; it took a slight frame rate hit, but the game itself appeared to run a lot smoother.
-
Try us, Skuzzy.
Even our political rants aren't so bad.
Come to the dark side! Face MiniD's moderating!
-
Well, just to throw Skuzzy a small curve ball :)
What CryTek and AMD have managed to do is to give us more content and world quality in the game without losing performance! They are taking the power of the AMD Athlon64 CPU and leveraging that to offset any performance differences and adding more detail in the game.
http://www.hardocp.com/article.html?art=NzY3
The AMD 64-bit patch for FarCry on WinXP 64.
My take is that if it's a 32-bit game running on XP64 it runs basically the same; however, if it's coded as a 64-bit game on a 64-bit OS it seems they can get better performance or add extra detail.
-
Not quite right, Cav. It is not that it is 64-bit compiled. So far, virtually every 32-bit game that has been recompiled (with no code changes) has run almost exactly the same. Sure, if you optimize the code before compiling it to 64 bits, it will probably run better, but compiled as 32-bit it would run about the same.
You guys are really eating up the marketing on this.
There really is no magic in 64bit code which will make it faster. If anything it has the potential to be slower due to the doubling of the data sizes. This digs into your CPU cache in a big way.
Things only really get faster when we no longer have to deal with stupid ways to use big numbers. However, a game's numbers mostly fall between -1 and 1 for the bulk of the code.
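To put a rough number on the cache point, here is a minimal C sketch. The struct is invented purely for illustration, and the sizes in the comments assume a typical 64-bit target where pointers and size_t double in width while floats stay 4 bytes:

#include <stdio.h>
#include <stddef.h>

struct node {
    struct node *next;   /* 4 bytes on x86-32, 8 on x86-64                  */
    void        *data;   /* ditto                                           */
    size_t       count;  /* 4 bytes on x86-32, 8 on x86-64                  */
    float        weight; /* 4 bytes either way - game data is mostly small
                            floats, as mentioned above                      */
};

int main(void)
{
    /* Roughly 16 bytes per node on a 32-bit build, roughly 32 (with padding)
       on a 64-bit build, so about half as many nodes fit in the same CPU cache. */
    printf("sizeof(struct node) = %zu bytes\n", sizeof(struct node));
    return 0;
}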
But sit tight, I am sure AMD has some more marketing heading your way.
-
Ok, I was a bit edgy in that last post. You know what will buy more performance than anything else? The additional registers.
BUT (there always seems to be one of those), the MS compiler through 6.0c only makes use of two scratchpad registers, even though there have been more available for quite some time.
IF (gotta love speculating) MS gets off their collective tushies and rewrites the compiler (yeah... that'll happen) so it can actually make use of those registers without having to resort to assembly language programming, then it has the potential of really boosting performance.
Those gains have nothing to do with 64bit CPU's though.
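A toy C example (invented, not taken from any real code) of the kind of register pressure involved - the comments, not the math, are the point:

#include <stdio.h>

static int blend(const int *a, const int *b, int n)
{
    /* Several values stay live across the loop.  Classic x86 has only eight
       general-purpose registers (fewer once the stack pointer and friends are
       accounted for), so a compiler ends up spilling some of these to the
       stack.  x86-64 adds r8-r15, so a compiler that actually targets them
       can keep the whole working set in registers - a gain that has nothing
       to do with 64-bit arithmetic itself. */
    int sum = 0, lo = 0, hi = 0, carry = 0;

    for (int i = 0; i < n; i++) {
        int x = a[i], y = b[i];
        int t = x * 3 + y + carry;
        if (t < lo) lo = t;
        if (t > hi) hi = t;
        carry = t >> 4;
        sum += t;
    }
    return sum + lo + hi;
}

int main(void)
{
    int a[4] = { 1, 2, 3, 4 }, b[4] = { 4, 3, 2, 1 };
    printf("%d\n", blend(a, b, 4));
    return 0;
}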
-
Well, with all these 64-bit OS tests out you can see that most games perform the same; however, some perform much better.
Are you against that? Were you against 16-bit to 32-bit? 8-bit to 16-bit?
We'd still be using 1-bit PCs with 2KB of RAM if it wasn't for progress... 64-bit is progress.
-
Skuzzy, I know it's not going to run 32-bit code faster "just" because it's a 64-bit processor.
I do know for a fact that in gaming my AMD 64 is considerably faster than my P4 was, but I put that down to it being a more efficient chip design, not because it says it's 64-bit.
However, in some desktop stuff it's painful without the hyperthreading.
Re that link about the FarCry patch: they got improvements because it was coded for a 64-bit OS and CPU, not because they just compiled a 32-bit game as 64-bit.
It would be silly to expect 32-bit code run on a 64-bit OS to run better. It's good enough that there is no performance penalty.
I also suspect it will be a while before any 64-bit optimised games are actually released, let alone any further game patches.
Personally I am sticking with my 32-bit Windows until I see many more programs optimised for 64-bit.
-
That FarCry stuff looks interesting: slightly better graphics, and on average 2fps faster than the 32-bit code.
However, there were issues with Nvidia drivers and FarCry... (or was it both Nvidia and ATI...).
-
Originally posted by Kev367th
CC - Skuzzy, I think we'll see more benefit from multithreaded games in conjunction with 64-bit though.
I seem to remember that the initial change from 16 to 32 bit yielded no great immediate benefits, and it took quite some time before any real benefits did appear.
I think we've kind of hit a plateau at the moment; it needs time for both 64-bit and dual/multi-core to mature before we can expect to see exactly what advantages can be gained.
On a positive note, it now seems XP64 can run 32-bit games with less of a drop in speed than during the betas/RC versions.
I tried AH2 under XP64; it took a slight frame rate hit, but the game itself appeared to run a lot smoother.
Kev, I think Skuzzy is referring to this
Originally posted by Skuzzy
No, recompiling to 64bits would slow us down appreciably. Microsoft's 64bit compiler puts all 64bit programs on top of the .NET architecture.
The MS compiler guru has said that in most cases native 64bit applications will run slower than the 32bit counterpart.
And indeed, he has posted it in many threads, some of which you were a participant in. Note the last sentence of his quote.
-
Overlag, I have never said it was a bad thing (you really are way too sensitive about this), and Cav, you misunderstood what I was saying, but ended up agreeing with me anyway. :)
The AMD64 could be an AMD32 and the improvements would be the same, as far as performance gains go. Like you said, Cav, it is more due to being a better design. There is nothing inherent in moving to 64-bit which would improve performance.
Before you can understand why the move to 64-bit is not a big deal, you have to understand why it was a big deal in the early days.
The moves from 8 to 16 to 32 bit were all good things which allowed programmers to use larger numbers without jumping through hoops. These migrations naturally brought better performance, as a programmer could now directly reference a number that used to require several register loads and reads and stores in an array.
However, in the jump from 32-bit to 64-bit, it is the rare application which needs a number larger than 32 bits. Games certainly do not need them. This is why the performance difference is not going to be that great. Indeed, Microsoft's own compiler wiz has already stated most applications will run worse in native 64-bit mode versus the 32-bit counterpart. It is pretty easy to understand why, when you know exactly how it all works.
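For anyone who wants to see that "hoop jumping" in code form, here is a made-up C sketch of how code has to fake arithmetic wider than its registers - the same kind of thing 16-bit-era programs did for 32-bit numbers:

#include <stdio.h>
#include <stdint.h>

/* A 64-bit value held as two 32-bit words, the way narrow-register code
   had to fake numbers wider than its registers. */
typedef struct { uint32_t lo, hi; } u64_pair;

static u64_pair add64_by_halves(u64_pair a, u64_pair b)
{
    u64_pair r;
    r.lo = a.lo + b.lo;
    r.hi = a.hi + b.hi + (r.lo < a.lo);  /* propagate the carry by hand */
    return r;
}

int main(void)
{
    u64_pair a = { 0xFFFFFFFFu, 0 }, b = { 1, 0 };
    u64_pair r = add64_by_halves(a, b);   /* several loads, adds and stores */

    /* Same sum done natively - one add on a register that is wide enough. */
    uint64_t n = 0xFFFFFFFFull + 1ull;

    printf("by halves: hi=%u lo=%u, native: %llu\n",
           (unsigned)r.hi, (unsigned)r.lo, (unsigned long long)n);
    return 0;
}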
Oh, and SSE2 is still slow on the AMD CPU as compared to even the Prescott. Just saying.
EDIT: The FarCry stuff is not an apples-to-apples comparison. I have no doubt it runs better, but it is not due to being compiled as 64-bit. Of that, I can assure you.
EDIT2: I just read what they did in FarCry. CryTek should be shot. This is a marketing patch if I ever saw one. AMD must have shoved some money their way, as the *patch* does things only in the 64-bit version which could be done in the 32-bit version, but CryTek is not going to do it.
In other words, the optimizations done for the 64-bit *patch* could be done for the 32-bit version; they are simply choosing not to.
-
Originally posted by Skuzzy
Overlag, I have never said it was a bad thing (you really are way too sensitive about this), and Cav, you misunderstood what I was saying, but ended up agreeing with me anyway. :)
Man, I'm sensitive about everything lately... which probably makes me come across as an AMD/64-bit fanboy. However, I'm having a s*** time in real life, and my only retreat, my PC, isn't fun any more either.
Sorry about the way I've been replying in this (and many other) threads.
Originally posted by Skuzzy
Oh, and SSE2 is still slow on the AMD CPU as compared to even the Prescott. Just saying.
EDIT: The FarCry stuff is not an apples-to-apples comparison. I have no doubt it runs better, but it is not due to being compiled as 64-bit. Of that, I can assure you.
EDIT2: I just read what they did in FarCry. CryTek should be shot. This is a marketing patch if I ever saw one. AMD must have shoved some money their way, as the *patch* does things only in the 64-bit version which could be done in the 32-bit version, but CryTek is not going to do it.
In other words, the optimizations done for the 64-bit *patch* could be done for the 32-bit version; they are simply choosing not to.
Some AMD people would say it's because the benchmarks are Intel-optimised for SSE2... however, I think it's more down to the fact that these instructions were designed by Intel for an Intel chip. SSE2 makes better use of the long pipelines or something, I can't remember exactly. Basically SSE2 removes the performance loss from the long pipeline...?? A little bit more marketing there by AMD then, because SSE2 isn't really worth it on their CPUs?
SSE3 is kind of the same really; there's big hype over it being on the AMD CPU now, but I don't really see many benefits, and it's also missing some instructions because they only apply to Intel's CPU design again.
And I didn't know that about FarCry - got a link with info? From the two sites I've looked at I didn't see it (but then I just flicked through). The screenshots look much more detailed in the 64-bit version, and there's a small performance improvement too.
Some of the old things like MMX and 3DNow! are apparently disabled in 64-bit mode too, which is what HURTS old programs BADLY.
You also mentioned the registers on the A64 earlier, and that that's what improves it, not the 64-bit itself. But I think there are 4x128-bit registers on these CPUs that are only enabled when in 64-bit mode. Not 100% certain on that though.
-
Never had a cooling problem with Intel, and I overclock every computer at least 15% minimum. Have had problems with flaky RAM though, even top brands, at random.
-
Originally posted by AmRaaM
Never had a cooling problem with Intel, and I overclock every computer at least 15% minimum. Have had problems with flaky RAM though, even top brands, at random.
You probably have a Northwood though? That's comparable to an A64.
It's the Prescotts that are the problem; they themselves run very hot. But due to their high power usage/draw they ALSO cause all the other voltage regulation parts (MOSFETs etc.) to get VERY hot, and as Skuzzy said earlier, hot enough to MELT the foam that most testers rest motherboards on......
-
The FarCry patch was funded by AMD, Overlag. It was designed to make the AMD64 look better. I'd wait for some games not funded by AMD to come out before making any assessments, as this is a pure marketing shot being fired.
This is how CryTek's business model has worked since day one. Kudos to them for getting other companies to pay for their development work.
The SSE2/SSE3 instruction sets are primarily used for streaming video/audio. In this area, Intel still has a more efficient design than AMD.
There are only a few applications which are really coded to use this mechanism efficiently. All high-end video processing software makes good use of it.
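For anyone curious what that mechanism looks like, here is a minimal SSE2 sketch (an invented example; it needs an SSE2-capable CPU and a compiler flag such as gcc's -msse2). One SSE2 instruction works on four 32-bit values at once, which is why streaming code leans on it so heavily; how quickly a given chip executes it is exactly the Intel-versus-AMD difference being discussed:

#include <emmintrin.h>   /* SSE2 intrinsics */
#include <stdio.h>

int main(void)
{
    int a[4]   = { 1, 2, 3, 4 };
    int b[4]   = { 10, 20, 30, 40 };
    int out[4];

    __m128i va = _mm_loadu_si128((const __m128i *)a); /* load 4 ints at once */
    __m128i vb = _mm_loadu_si128((const __m128i *)b);
    __m128i vc = _mm_add_epi32(va, vb);               /* 4 adds in one step  */
    _mm_storeu_si128((__m128i *)out, vc);

    printf("%d %d %d %d\n", out[0], out[1], out[2], out[3]);
    return 0;
}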
You guys are aware the extra registers do not get used unless the application has been programmed to use them? This is where the biggest gains are to be had.
In a native 64-bit application, integers will now occupy twice the CPU cache space they did before, which reduces cache hits and can cause a slowdown for an application (which is what Microsoft is talking about when they say most 64-bit applications will run slower than their 32-bit counterparts). Why use 8 bytes of CPU cache to represent the number "1"?
-
Skuzzy - RE: the FarCry patch.
Having just looked at the screenies for the first time -
No doubt it can be done in 32-bit.
At what cost to the framerate though?
I am assuming they are saying that the same detail with 32 bit would cause a big frame rate hit?
-
There would be no impact to the 32-bit version. Most of the things that are changed are being done by the video card, not the CPU.
They cannot release a 32-bit version of the patch or AMD would probably demand their money back.
-
Originally posted by LePaul
Try us, Skuzzy.
Even our political rants aren't so bad.
Come to the dark side! Face MiniD's moderating!
I would not do that to your BB LePaul. There are elements who would cause you no end of problems over there if I did, but thanks for the offer.
Oh, and Kev, about the DDR2 to DDR3 transition: anything can be designed to make that switch, but I doubt it will be done. The reason DDR2 is so hot is that the termination is external. With DDR3, the termination is internal (where it should have been in the first place).
It definitely would not be something automatic, but any motherboard manufacturer who did it would be pushing it as a big marketing item.
-
Originally posted by Skuzzy
There would be no impact to the 32-bit version. Most of the things that are changed are being done by the video card, not the CPU.
They cannot release a 32-bit version of the patch or AMD would probably demand their money back.
To be fair, couldn't you say the same about Intel offering SSE1/2/3?
Things need a "push" to start the change to 64-bit... but the FarCry makers had very little incentive to do so. Same with MS not "rushing" 64-bit out.
Also, this 64-bit patch does work on Intel CPUs.
-
Originally posted by Skuzzy
I would not do that to your BB LePaul. There are elements who would cause you no end of problems over there if I did, but thanks for the offer.
Elements? Bad ones? Oooh... Ripsnort... oh, don't worry about him! :p
Seriously... no worries. It's up to you whether you lurk/read or post... I'd double-dog-dare ya... but I don't think you'd take us up on it!
So... I'll have to resort to the ole standard. CHICKEN! :rofl
JK
-
Originally posted by Overlag
To be fair, couldn't you say the same about Intel offering SSE1/2/3?
Things need a "push" to start the change to 64-bit... but the FarCry makers had very little incentive to do so. Same with MS not "rushing" 64-bit out.
Also, this 64-bit patch does work on Intel CPUs.
Actually, the SSE family of instructions serves a specific purpose and serves it well. Not using them does degrade performance significantly for the operations they are designed for.
They do not work as well on AMD CPUs as they do on Intel CPUs, but they are still better than not being used at all.
I know the 'patch' for FarCry will work on Intel. The biggest issue I have with this so-called patch is how it is being marketed. It is highly misleading, to the point of being an outright lie.
-
Originally posted by LePaul
Elements? Bad ones? Oooh... Ripsnort... oh, don't worry about him! :p
Seriously... no worries. It's up to you whether you lurk/read or post... I'd double-dog-dare ya... but I don't think you'd take us up on it!
So... I'll have to resort to the ole standard. CHICKEN! :rofl
JK
Hehe... you'll have to do better than that. :D