iGary
Aug 25, 03:05 PM
Apple needs to address this situation appropriately. As their products gain a higher profile and their customer base and market share grow, it's only logical to expect a greater need for support.
You're missing a comma. :p :D
babyj
Sep 19, 07:43 AM
Actually, yes. I use my laptop as a portable desktop, and I do a lot of different things with my computer. My current PowerBook G4 is capable of some of them, but really not practical for many (scientific computing, ray-tracing molecular models, etc.). A current Yonah-based MBP would certainly be faster, but it would still be a 32-bit processor, and like many other pro-users, I don't want to have to buy a new machine every year.
Maybe I'm missing something here, but I'd have thought buying the latest and fastest computer every year would be the first thing a 'pro-user' would do with his money.
If speed really is that important to all you 'pro-users', why are so many of you using older computers that are far slower than the current MacBooks, which have been available for many months?
If I did something for a living that required heavy CPU processing, spending $1,000 updating it (cost price less the resale price of the old machine) would be the best $1,000 I could spend, as I'd get the money back through increased productivity very quickly.
technocoy
Nov 29, 01:00 PM
I can't get over the blind greed of these companies.
I'm waiting for Apple to get "threatened" by the bastards one time too many, say "OK," and then approach all the artists directly: open the store to them, pay part of their production costs, and give them 80 percent of the profit on every song sold. Let's see how long the record companies KEEP their artists after that.
They'd better wake up to the new century before their artists do.
With most music-savvy artists able to produce an album for less than a few thousand bucks now, Apple could turn on the industry and just blow it out at any moment. The industry could fragment into producers and mastering studios that get paid only for the service of producing; the finished work then goes up to iTunes, where it's subjected to reviews by peers and promoted by a Digg-type system.
Browsing and sampling does a lot to increase one's musical library. I found 80 percent of my new music just by searching and browsing on Napster back in the day... I would find a new artist by chance and then go and buy their CD. If Apple made its previews longer, you would have the same kind of environment.
I'm not against a company making billions, but those billions should be made from giving the people who put them there what they want.
ugh. sorry, rant over.
kdarling
Apr 19, 04:04 PM
You made up your mind and you argue accordingly.
No, that's why I used questions. I'd really like to know if anyone thinks a normal buyer would think the Galaxy is made by Apple.
Consider this: many people know the name "iPhone" and the way it looks; they may even know the name "Macintosh", but not the name "Apple".
They might have talked to someone who used an iPhone and was very happy with it, been convinced to buy one, and then gone to a shop and picked up the phone that looks exactly like the one they wanted to buy.
And ended up with a Samsung phone when they actually wanted an iPhone.
So your argument is that someone would be familiar with the iPhone UI but not know it's made by Apple?
And that therefore when they went to buy an iPhone, they'd totally ignore the words Samsung Galaxy on the box simply because... what? some of the icons look similar?
Well, who knows. It's certainly happened with Chinese knockoffs!
~Shard~
Jul 14, 04:55 PM
I wasn't being a smartass.

ccrandall77
Aug 11, 03:45 PM
Well, I don't know where to begin... I work in science, and you have to trust me when I say that you can't deduce anything from the "facts" you have. You are guessing.
The fact is that GSM has 81% of the world market... and that makes CDMA a small market.
It's called an estimate... a scientist should know what that is. Care to dispute it? Then provide your own "facts". I also have a science background... big whoop-de-do! And I stand by my assumption that the amount of internet usage is probably a good gauge of cell phone usage.
15+% of 1.5+ billion is hardly small. It may be the minority, but 150+ million people in affluent countries is a very profitable market.
skunk
Feb 28, 08:02 PM
Fornication doesn't matter if the person doesn't care about the religious connotations of marriage.
It matters that you describe it as fornication.
Greek culture also endorsed pederasty!
What has this dubious claim to do with anything? :confused:
macaddiict
Apr 25, 01:37 PM
I haven't read this lawsuit, so I don't know if they're claiming things that aren't true... but I really do not like the fact that the iPhone has a breadcrumb database of my travels for the last 3 years!
This type of thing should not happen without users' knowledge... and it was happening. Otherwise this file would not be news!
rovex
Mar 22, 01:40 PM
Yeah, a 50% smaller screen for the same price and less battery life is certainly going to crush the iPad 2.
The screen is not 50% smaller. Nice way of making yourself look stupid.
The PlayBook has that elusive Flash support out of the box, which every Apple fanboy wants to sweep under the rug.
The OS is more elegant than iOS.
madhatter61
Mar 23, 10:36 AM
Widescreen is great for movie watching, and the spec-lover in me is all over that... but it's not very flexible for portrait use. (Which is how you hold a tablet one-handed, and is how you see the most content on a web page or scrolling document.)
A 10.1" 1280x800 screen is actually almost exactly the same screen area as an iPad: the iPad is 45.2 sq. in., and the 10.1 is 45.8 sq. in.
Held in portrait mode, the 10.1 is 0.75" taller... but 0.5" narrower than an iPad. I don't think I'd care for that. (But with 1280x800 you do gain 32 pixels of width and 256 pixels of height. Still not great for portrait use.)
The 8.9 display, though (which seems to save a few bucks), is an interesting option for dropping the price floor on "real" tablets. (Not that I'd settle for Android's failings. As pointed out: specs alone don't make a good car, nor a good computer, nor a good tablet!)
Ha ha :D Good thinking!
Actually, if you look, the Xoom and the Samsung 10.1 are both 16:10 ratio... perfect for movies... the iPad is 4:3, the old TV ratio... which creates the need for filler top/side bars... I think that is called letterboxing... CRS?
The key advantage for the iPad is that in landscape there is more vertical space for the virtual keyboard... duh?
Also key here is PPI, which is at the heart of the display issue. Apple wants the same pixel density so software development has a common display requirement. Then all apps work across the board. That is why Apple has hundreds of thousands of apps that work.
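For anyone who wants to check the geometry these two posts are arguing over, here is a quick back-of-the-envelope sketch. The diagonals, aspect ratios, and resolutions (9.7" 1024x768 for the iPad, 10.1" 1280x800 for the widescreen tablets) are the commonly quoted figures assumed here, not measurements.

```python
import math

def screen_stats(diag_in, px_w, px_h):
    """Width, height, area (inches) and PPI from a diagonal and a resolution.
    The aspect ratio is taken from the pixel dimensions."""
    scale = diag_in / math.hypot(px_w, px_h)
    w, h = px_w * scale, px_h * scale
    return w, h, w * h, px_w / w  # px_w / w == pixels per inch

for name, diag, pw, ph in [("iPad (4:3)", 9.7, 1024, 768),
                           ("10.1\" tab (16:10)", 10.1, 1280, 800)]:
    w, h, area, ppi = screen_stats(diag, pw, ph)
    print(f"{name}: {w:.2f} x {h:.2f} in, {area:.1f} sq in, {ppi:.0f} ppi")

# Prints roughly 7.76 x 5.82 in / 45.2 sq in / 132 ppi for the iPad and
# 8.56 x 5.35 in / 45.8 sq in / 149 ppi for the 10.1, which matches the quoted
# areas and the "~0.8 in taller, ~0.5 in narrower in portrait" comparison.
```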
Dunepilot
Aug 8, 04:03 AM
I'm glad that Leopard will be completely (that's what they say, at least) 64-bit. I'm not sure why it's important to go on about the applications as if they were important to the operating system itself. Increased integration like what was displayed would cause the anti-trust machine to whip into action if it were Microsoft instead of Apple.
Time Machine is not exactly revolutionary, considering that there were a few 3rd party products available--Rewind comes to mind--that journaled changes and allowed them to be restored. Still, it should stop the various threads "I accidentally deleted..." :)
Hopefully, the features not mentioned will include a better kernel that actually performs well. It would be nice to see operating system benchmarks that don't make me cringe when I look at the Mac OS X results.
Xcode version 3.0 looks good but they still haven't provided many details.
Yeah, my first thought was - oh yeah, that's just like Rewind. However, the poweronsoftware.com website now forwards to http://www.nowsoftware.com/, so maybe Rewind has been bought out by Apple to use as Time Machine. Anyone know any more about this?
Dune
greenstork
Aug 17, 05:14 PM
So you have 4 HDDs in total, with 2 each in RAID 0, or what?
Do you have the OS on one pair and scratch on the other pair?
Just out of curiosity, is it even possible to configure a RAID 10 or 01 setup on OS X without a dedicated controller card? I was planning to configure a RAID 1 (two 500 GB drives) on my Mac Pro for the sake of redundancy, but with 4 drive bays to play with, a RAID 10 or 01 might be a little faster if I understand the technology correctly. Anyone?
raymondso
Sep 19, 09:26 AM
Come on APPLE! My pocket is full and ready for a New C2D MacBook! :D

amin
Aug 19, 09:42 AM
You make good points. I guess we'll learn more as more information becomes available.
Yes, in some specific results the quad was a bit faster than the dual, though with the combo of Rosetta + Photoshop it's unclear what is causing the difference. However, if you compare the vast majority of the benchmarks, there's a negligible difference.
Concerning Photoshop specifically, as can be experienced on a quad G5, the performance increase is 15-20%. A future jump to 8 cores would theoretically be in the 8% range. Photoshop (CS2) simply cannot scale adequately beyond 2 cores; maybe that'll change in spring 2007. Fingers crossed it does.
I beg to differ. If an app or game is memory intensive, faster memory access does matter. Barefeats (http://barefeats.com/quad09.html) has some benchmarks on dual channel vs. quad channel on the Mac Pro. I'd personally like to see that benchmark with an added Conroe system. If going from dual to quad channel gave a 16-25% improvement, imagine what a 75% increase in actual bandwidth will do. Besides, I was merely addressing your statement that Woodcrest is faster because of its higher-speed FSB and higher memory bus bandwidth.
Anandtech, at the moment, is the only place with a quad Xeon vs. dual Xeon benchmark. And yes, dual Woodcrest is fast enough, but is it cost-effective compared to a single Woodcrest/Conroe? It seems that, for the most part, Mac Pro users are paying for an extra chip but only really utilizing it when running several CPU-intensive apps at the same time.
You're absolutely right about that; it's only measuring the improvement from the increased FSB. If you take into account FB-DIMMs' appalling efficiency, there should be no increase at all (if not a decrease) for memory-intensive apps.
One question I'd like to put out there: if Apple has had a quad-core Mac shipping for the past 8 months, why would it wait until the Intel quads to optimize the code in FCP? Surely they must have known for some time beforehand that they would release a quad-core G5, so either optimizing FCP for quads is a real bastard or they've been sitting on it for no reason.
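The diminishing returns described above (15-20% from a quad, roughly 8% more from a hypothetical 8-core) are what Amdahl's law predicts once an app stops scaling. A minimal sketch, assuming purely for illustration that about half of a Photoshop CS2 job parallelizes (a made-up fraction chosen to roughly match the quoted numbers, not a measured one):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Ideal speedup when only `parallel_fraction` of the work can use extra cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

p = 0.5  # assumed parallel fraction
for cores in (4, 8):
    gain = amdahl_speedup(p, cores) / amdahl_speedup(p, cores // 2) - 1.0
    print(f"{cores // 2} -> {cores} cores: about {gain:.0%} faster")

# Prints ~20% for 2 -> 4 cores and ~11% for 4 -> 8 cores, in the same ballpark
# as the 15-20% and ~8% figures quoted above.
```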
MacsRgr8
Aug 5, 06:06 PM
I think the Merom will be introduced:
Thus a MacBook Pro will probably be announced, and made available right away, or otherwise very soon.
The Conroe and Woodcrest will probably take longer.
So, the Mac Pro and Xserve Pro (uuuuggghhh!!!... must. remain. Xserve) will be announced, but shipping in about 6 weeks.
Multimedia
Aug 21, 01:25 AM
Mac Pros will need 64-bit Leopard to achieve their full multi-core potential. Expect all Core 2-based Macs to hold their value well through the next release cycle of OS X Leopard.
Apple is still selling G5s on the website for $3,299! Until Adobe gets out (and optimizes) universal binaries, the Quad G5 will sell for more than quad-Xeon Mac Pros! :rolleyes:
The Quad G5 is only $2,799 on the SAVE refurb page. Refurbs are the same as new, with a new warranty. But I think that would be a poor choice compared to a Mac Pro. The Mac Pro is not cheaper, because you have to add more expensive RAM, but it is faster overall, and Rosetta Photoshop performance isn't bad. Don't forget the Quad G5 will also benefit from Leopard; it's not as if Leopard won't be written to take advantage of the 64-bit G5 as well.
But I would not recommend a G5 Quad to anyone at this point. I'm pondering a Mac Pro purchase myself, but I'm going to try to hold out for a refurb, or even see if I can wait for Clovertown. I'm likely to be one of the first to snag a Mac Pro refurb when they hit the SAVE page in November-December. By then I may even be thinking about waiting for the January 9th SteveNote. The Quad G5 is no slouch, but the Mac Pro is faster overall.
And I thought you were married to your quad last week......
While I may be married to my Quad G5, we're not exclusive, and she likes a threesome with the younger, faster models as much as I do. :p

Tomaz
Aug 7, 03:40 PM
The top-secret features had better be REALLY good; this was disappointing, and nothing was really new! Cupertino has started its photocopiers.... (The Vista banners are an actual joke after this keynote.) :(
LightSpeed1
Mar 31, 02:40 PM
I knew it would happen eventually.
iGary
Sep 12, 11:02 AM
The folks over at Anandtech have dropped engineering samples of the quad-core Clovertown into a Mac Pro - http://www.anandtech.com/mac/showdoc.aspx?i=2832&p=6
and it worked ... all eight cores were recognised.
The rest of the article was interesting too.
This will probably be the update I purchase next year - if it makes it into the Mac Pro - thanks for the link.
tripjammer
Apr 11, 01:04 PM
You guys really believe this? We all know the iPhone 5 will basically have the guts of the iPad 2... so all the components are ready... it will be out this summer. These rumors are just to keep Android and Microsoft guessing.
iPad in the spring
iPhone in the summer
iPod touch/Apple TV/iPods in the fall
It's like that and it always will be... it works for Apple.
greenstork
Aug 17, 05:26 PM
Calm down. The OP was directing his question towards gamers. I agree with him: why salivate over a Mac Pro and whine about games when it's clear that the Mac Pro isn't intended for that kind of user? If I were a games enthusiast, I'd build my own custom PC optimized for gaming performance. Apple is ignoring this segment of the market. For those of us who need to get real work done, the Mac Pro is a great machine. It will play games, but don't try hauling it to a LAN party. You'll probably get laughed at.
Do you see now?
With no intention of jumping into the argument in question here, I have a slight issue with your definition of a gamer. I'm an intermediate Photoshop user, web designer, and gamer. I don't just use my computer for games or work; there's this huge gray area in the middle. For me, the Mac Pro is the best of all worlds. I wouldn't dare rely on Windows for my workflow, design, and productivity software; OS X is a must for me. However, the ability to dual-boot into Windows and play games natively is a bonus, one that I'm willing to pay a premium for, and whether or not it's even a premium is up for debate. Sure, I could build a PC just for games, but if I can't ever run OS X, then that machine is useless to me.
I'd be surprised if there weren't many more people out there who welcome the power of the Mac Pros for work and play, recognizing of course that the majority of buyers will be professionals.
milo
Aug 17, 09:21 AM
You're right. I'm extremely unimpressed that the fastest Xeon, only days old, is actually slower MHz for MHz than a G5 that is pushing 4-year-old technology. Really sad.
But overall it's not. Whenever you change chips, you'll probably always find a benchmark that favors the old one. Just because one app isn't faster doesn't mean the new chip is slower.
But it's not faster. It's actually slower than the G5 in some apps. What's everyone looking at, anyway? I'm pretty unimpressed, other than Adobe's usage of cache (AE is a cache lover and will use all of it, hence the faster performance).
But the actual Xeon processors are only as fast as the G5 processors. Look at the average specs... the 2.66 machines are only a teeny bit faster than the G5s, except in a few apps like FileMaker. But not in the biggies like Final Cut Pro, where it actually appears that MHz for MHz the G5 is a faster machine hands down!
What are you talking about? The Xeon is faster in every native benchmark; the only exception is one render where the slower Xeon tied the G5. If you do indeed look at the average specs, the Xeons blow away the G5.
Looks like the Xeons got killed by the G5 in Word in their tests.
Because it's running under Rosetta; RAM has nothing to do with it.
It's odd, seeing as Macs are still the choice of many musicians, that specs of interest to musicians are never given. The released figures don't do much for me. I'd like to know the polyphony improvements, say for Kontakt, under both systems in Digital Performer 5.
There have been Logic benchmarks elsewhere, and they're pretty impressive. 1.4-1.5x improvements, pretty nice considering how fast the quad is already for audio plugins.
G5power
Jul 27, 09:48 AM
Assuming August 7 as the announcement date for new systems, the waiting is a killer.
Multimedia
Jul 21, 12:20 PM
It really depends on your application.
On the desktop, if you're a typical user that's just interested in web surfing, playing music files, organizing your photo collection, etc., more than two cores will probably not be too useful. For these kinds of users, even two cores may be overkill, but two are useful for keeping a responsive UI when an application starts hogging all the CPU time.
If you start using higher-power applications (like video work - iMovie/iDVD, for instance) then more cores will speed up that kind of work (assuming the app is properly multithreaded, of course.) 4-core systems will definitely benefit this kind of user.
With current applications, however, I don't think more than 4 cores will be useful. The kind of work that will make 8 cores useful is the kind that requires expensive professional software - which most people don't use...
Cluster computing has similar benefits. With 8 cores in each processor, it is almost as good as having 8 times as many computers in the cluster, and a lot less expensive. This concept will scale up as the number of cores increases, assuming motherboards can be designed with enough memory and FSB bandwidth to keep them all busy.
I think we might see a single quad-core chip in consumer systems, like the iMac. I think it is likely that we'll see them in Pro systems, like the Mac Pro (including a high-end model with two quad-core chips.)
I think processors with more than 4 cores will never be seen outside of servers - Xserves and maybe some configurations of Mac Pro. Mostly because that's where there is a need for this kind of power.
I strongly disagree. I could use 16 cores right now for nothing more than simple consumer-electronics video compression routines. There will be a Mac Pro with 8 cores this winter (2007).
You are completely blind to the need for many cores right now for very simple, stupid work. All I want to do is run 4 copies of Toast while running 4 copies of HandBrake simultaneously. Each wants 2 cores or more. So you are not thinking of the current need for 16 cores already.
This is not even beginning to discuss how many Final Cut Studio editors need 16 cores. Man, I can't believe you wrote that. I think you are overlooking the obvious - the need to run multiple copies of today's applications simultaneously.
So as long as the heat issue can be overcome, I don't see why 8 cores can't belong inside an iMac by the end of 2008.
I apologize if I read a little hot, but I find the line of thought that 4 or 8 cores are enough, or more than enough, really annoying. They are not nearly enough for those of us who see the problem of not enough cores EVERY DAY. The rest of you either have no imagination or are only using your Macs for word processing, browsing, and email.
I am sincerely frustrated by not having enough cores to do simple, stupid work efficiently. Just look at how crippled this G5 Quad is already, only running three things. They can't even run at full speed due to lack of cores.
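As a sanity check on the arithmetic in that rant, here is a tiny sketch of the core demand being described. The per-copy core counts and the workload mix are the poster's own figures, not measurements:

```python
# (copies, cores each copy would like) -- figures taken from the post above
workloads = {"Toast": (4, 2), "HandBrake": (4, 2)}

demand = sum(copies * cores for copies, cores in workloads.values())
available = 4  # e.g. a Quad G5

print(f"cores wanted: {demand}, cores available: {available}")
print(f"so each copy gets roughly {available / demand:.0%} of the CPU time it wants")
```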