Moyank24
Apr 30, 10:14 PM
Do words have no meaning? There can be no party without me there!
Whatever gets you through the night.
chrmjenkins
Apr 22, 02:03 PM
That's typical Apple. Intel chipset does not support USB 3.0? No USB 3.0 for Apple fans!
It's not built into the current Intel platform standards. That doesn't mean it doesn't support it. Most of Intel's reference boards even include it.
NVIDIA GPUs do not work with SandyBridge? Stick with outdated C2D CPUs for years.
Nvidia GPUs work fine with the Sandy Bridge platform. The problem was that they were not licensed to make chipsets for Intel processors past the Montevina platform.
What's more important - CPU/chip or case? In the case of Apple, the case always wins. Apple is all about image. Once designed, the case should stay unchanged for many years. Apple will wait until somebody designs a "suitable" chip. Isn't that kind of backward?
Apple is using the same CPUs as everyone else, for which their enclosures are extremely competitive in terms of dimensions.
Then we hear excuses from Apple fans about why Apple could not use a separate USB 3.0 controller. That would require a redesign of the motherboard - wow! Think of it - redesigning a motherboard! Some companies redesign tens of motherboards every year, but Apple? No way. Now iPhone users will be stuck with outdated technology for a year or two, and they will be feeding us all kinds of excuses for why LTE cannot be used in the iPhone. Just ridiculous.
There's no question that two radio chips would have caused the tiny logic board inside the iPhone 4 to grow. That means the battery gets smaller, or they make some other sort of sacrifice that potentially changes the housing. Too much work to release the same iPhone on a different network, especially since Apple wouldn't want to sacrifice battery life.
Since Apple has to design to the greatest common denominator, I doubt they'd increase the size of the phone given the number of outspoken size critics on this forum.
AndroidfoLife
Apr 24, 04:38 PM
Thanks for the anecdote.
The iPhone sets the bar. Google has to flood the market with a lot of junk to achieve higher share. That's hardly impressive. Google is the MS of mobile. Hardly a compliment. License out your beta OS to anyone that can slam together a box, give it away, and away you go.
The iPhone is still the #1 selling handset. Where are the iPhone killers? There aren't any. Because the competition doesn't know how to make one. Because Apple approaches tech from a totally different place.
The iOS platform still dominates, and given the iPad's success, it'll be that way for the foreseeable future.
Android enjoys the highest smartphone market share. Yet the OS is pretty brutal and their ecosystem is a mess. So why do they have greater share? Not because they make a superior product, but because the only alternative to an iPhone was an Android-based device, and Eric T. Mole got to work licensing it out to everyone with no regard for design or user experience. If you flood the market with what, 70+ (probably a lot more) devices, and let everyone and their dog make the devices, you'll eventually enjoy force of numbers.
Android is given away free to anyone to manufacture, to make as many POS devices as they wish, to sell for peanuts, in massive volume.
That's all it is. Market flooding at every price point and you get some sort of touchscreen and some sort of app store. And given Google's Microsoftian horizontal business model, that's all it'll ever be.
For instance, THIS is the kind of total junk that Google puts their name to:
http://www.gsmarena.com/zte_racer-reviews-3423.php
And guess what: Dell went ahead and copied it. The DELL XCD28. Same junk. But Android market share just went up!
Here's another amazing Android device:
http://arstechnica.com/gadgets/reviews/2010/11/worst-gadget-ever-ars-reviews-a-99-android-tablet.ars
Anything to be proud of? But hey, they're dirt cheap. And uh . . . "open" or whatever.
If Google actually *cared* about what they put the Android name to, if they actually gave a damn about the USER, would they allow this? Ask yourself that. That's the difference. There are some things Apple *will not* allow to exist - namely: garbage.
Yes, highest market share. Until you go hunting for the REASON.
Which OS is better is only a matter of opinion. I know Windows is better for me because it allows me to build my own hardware. Not all Android phones are dirt cheap. The top-selling ones are not the dirt-cheap ones; they are the ones that compete directly with the iPhone at the high end. There are a lot of phones out there that top the iPhone. The Atrix, the G2x, the Evo, and multiple others are all better phones than the iPhone 4. The iPhone is a good phone, but it is far from the best. It will never turn into a one-horse game. There will never be an "iPhone killer," and Android is here to stay.
The iPhone cannot meet everyone's needs. Some people need a physical keyboard, a larger screen, an SD slot, a smaller screen, HDMI, or a high-end camera - various things that mean, in the end, the single model of the iPhone will not work for them.
AlanAudio
Jul 28, 08:02 AM
When Microsoft claim that their investment might not pay off for five years, they're paving the way for failure. For the next two or three years, when pressed about the lack of profits, they can claim that the payoff will be in a couple of years from then. They won't have to actually admit that they've failed until after 2010. It's not dissimilar to Bill Gates claiming that there's an 80% chance of Vista shipping on time; it sounds positive, but few people believe it actually will ship in January. It's just paving the way for the next excuse.
It's very important that Microsoft try very hard with Zune. They keep claiming that the iPod succeeded simply because of slick marketing, whereas everybody else knows that it succeeded by being an attractive proposition, combining style with ease of use. It was word-of-mouth publicity that really worked for the iPod. You can't buy that; it added massive value to the money that was spent on advertising.
So here's Microsoft's opportunity to look at the last five years of the iPod, together with three years of iTMS, take it all in and apply their 'innovation', show us the ultimate product and then spend a fortune marketing it. There must be no doubt that Microsoft must be seen to throw everything into this project. Then Steve Jobs will be delighted to rise to the challenge and delight in humiliating Bill Gates.
benhollberg
May 1, 10:32 PM
CNN says the Pakistan government had a part in the killing of Bin Laden.
The Maestro
Oct 24, 07:47 AM
wahoooooooo
i better get my card out
sporadicMotion
Nov 24, 03:26 PM
A pair of these
http://t1.gstatic.com/images?q=tbn:ANd9GcQt4n6Q9MU4-IkNnwAJ_lWLpNuOGWKJvLOTzCw3pH3ByqeyG1hb
and one of these
http://www.podcastalley.com/forum/geek/gars/images/1/5/0/0/4/3/vRNC.jpg
awmazz
Mar 9, 07:57 AM
For movies it's different because each one is a narrative of its own. You can't compare Sean Connery with Pierce Brosnan, just as you can't compare Never Say Never Again with Tomorrow Never Dies, because both movies are done in their individual way.
On a television series, you have a continuous narrative that can change its direction, but as soon as you change major plot points or dare switch the main actors with new ones, that's a plain insult to the audience who watched from the start.
I'm the opposite. I had no problem with Catwoman changing from Julie Newmar to Eartha Kitt in the Batman TV series because the style and tenor of both the show and character didn't change.
I do have a problem with the modern Batman movie franchise, where each movie is a re-visioning depending on which side of the bed the director got up on, so every movie has a different feel, and you have the Joker played completely differently by Jack Nicholson and Heath Ledger, Catwoman by Michelle Pfeiffer and Halle Berry, and every Batman as well by Michael Keaton, Val Kilmer, George Clooney and Christian Bale. It's like multiple cover versions of the same song by different artists, and you're expected to like and buy them all, which is ridiculous.
rnelan7
Nov 28, 09:12 PM
Finally narrowed it down to a Shady McCoy jersey, Apple TV and a WD Elements 2tb. Merry Christmas everyone!
blackburn
Apr 14, 08:36 AM
iHackintosh:D
stroked
May 1, 11:45 PM
Osama is dead, so what? The U.S. needed him alive.
for what?
thirumalkumaran
May 3, 07:52 AM
The IPS tech screens are removed from the specs...
Have they moved to TN panels instead...?
dsensi
Apr 27, 06:14 PM
Seriously, why not an iMac with a touch screen right now? Will we need to wait for the next iMac update to see this technology implemented?
Apple is surely working on it:
http://www.telegraph.co.uk/technology/apple/7961480/Apple-files-iMac-touch-patent.html
And, besides that, OS X Lion will be 100% focused on touch technology... and we're not talking about an iPad OS...
eekcat
Apr 28, 04:21 PM
I held one earlier today and it felt .0001 oz. heavier. Perhaps it is just my super human ability to weigh things instantly that told me this....but still....:cool:
Don't worry, Safari is .0078 nanoseconds snappier on it! ;)
savage1881
Jul 27, 10:51 PM
And you guys accuse PC users of sticking to old stereotypes. If you want to see ugly, take a gaze at the army of external devices that my iMac is going to need. I prefer my cables be inside the case instead of covering my desk.
2+ full-size optical drives as opposed to a single slow notebook drive
2+ hard drive bays
Card reader
Easy CPU upgrading
Easy RAM upgrading
Upgradable x16 PCI-Express slot compared to underclocked fixed notebook GPU
3+ PCI/ PCI-E x1 slots for upgrading to new devices
Choice of display
being able to choose what you want to do instead of having everything dictated to you by Steve Jobs.
An iMac is NOT suitable for the sort of computer use you are intending! As an experienced computer technician who works mostly on PCs, I can assure you that any new Dell, HP or Gateway tower is even less suited to handle the upgrades you are suggesting, with the exclusion of RAM upgrades.
New PCs are products of out-of-control cost cutting and nothing more. If you want upgradability, you must spend at least $2000 and get one from ABS or another semi-custom shop.
Finally, the Mac Pro tower is coming out soon. Then many of your complaints about the Mac's faults will be dealt with. While I am a fan of the Mac platform, I run a custom dual-Xeon PC that I built myself, and I can say that, from my perspective, I would take any computer over a sub-$1500 PC.
I've got a fried Dell P4 motherboard sitting at home because Dell decided to use proprietary pin configs with a standard ATX power connector (not my mistake :) ). Mass-manufactured PCs are made to be fortresses, preventing user upgrades. The Mac is a nice, good-looking alternative among only a few alternatives. At least you know each of those external devices is going to work right as soon as you plug it in. With PCs today, especially from Dell, you have no such guarantee on any of the upgrades you suggested. People are making a mistake when they buy a cheap PC, whether you believe they ought to be buying a Mac or not!
Snowy_River
Oct 23, 10:19 AM
Setting aside the question of no VM at all, has it occurred to anyone that having a restriction on running in a VM even on the licensed machine could put a damper on the idea of having Parallels (or VMWare) be able to start up off of the BootCamp partition? As that's an ability that I've been wanting, that's something that bothers me about this....
AppleScruff1
Apr 14, 01:23 AM
You won't be able to watch anything but paid content from Apple. But it will be magical. And at least it will be big enough so you can't hold it wrong.
kingtj
Mar 31, 01:45 PM
Personally? I find it humorous that so many people on here refuse to use the app, or have big issues with it, all because of the faux leather look to the top bar, or other attempts to make the app look like its physical counterpart. If the app has the FUNCTIONALITY you need, that's what makes it good! I've come to expect that Apple will regularly revise the LOOK of these applications. Even if they had a look that 99.9% of users agreed was "perfect"? They'd revise it with the next major release of the app or OS, simply because they know people don't feel like they really "got enough for their money" if it doesn't look different at a quick glance....
The way it defaults to entering new appointments with that "unnamed appointment" heading drives me nuts too. Accidental taps on the iPhone or iPad can lead to those things being added to your schedule, and if you don't notice it until later? You're left wondering if it's supposed to be a real appointment for something, or if it was just a screw-up. They should make it so if you don't actually fill something in, it cancels adding it.
Another feature I'd like to see? It needs a way to easily open up a list of your contacts from the Address Book inside a pane in iCal itself, and drag one over to the calendar to add an appointment with their address inserted as the "location", and name plus maybe phone number(s) in the title. Like many people, I use iCal to track appointments I have with clients, so this info usually needs to go into them.
I saw where someone wrote a fancy Applescript to accomplish this, but IMHO, that's still a "hack" for functionality Apple could/should include!
What I want to know is have they made iCal more usable? I'm not sure how I feel about looks but there are quite a few pet peeves I wish they'd address.
1. When I say enter a new appointment, I should be put straight into the edit screen, not have it put in an unnamed appointment that I have to click at least two more times to actually get into a full edit screen. When I put in a new appointment, of course I want it to say more than "new appointment!!!" I want to be able to name it, set a time, maybe even a reminder, and tell it what calendar! What's worse is that iCal used to work like this, and for some reason some dipsh*t decided that when I put in a new appointment I just wanted a new appointment at a random time... what sense does that make? (Yes, this is a huge pet peeve of mine.)
2. Reminders. First when I set a reminder for 2 days before, display on the appointment/task 2 days before, not how many minutes 2 days before equals.
Secondly, when it pops up the reminder and I want to tell it to remind me again, give me an option to set reminders. Or at least have more sensible ones (like give me a half a day later option, not just 1 hour or a full day. I want to be reminded later today, but not have to keep hitting one hour if I don't want a full day reminder).
Those are just the ones I can think of off the top of my head, but they both annoy me a lot about iCal (I really am not that picky. I'm sure people who want more out of their calendar/task app have a lot more things to nitpick about, because iCal is pretty damned basic and really could use more functionality).
Rowbear
Apr 4, 05:56 AM
My 1 year-old a couple of days before his first birthday (click for larger).
http://gallery.me.com/crebelein/100053/IMG_5637/web.jpg
Give him the "high five" from all us here. I wish you all the best. :)
junker
Jul 28, 09:26 AM
What aren't you understanding?
LOL!
He's Canadian!! just kidding...
Psilocybin
Apr 19, 07:47 PM
Wirelessly posted (Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_3_1 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8G4 Safari/6533.18.5)
I had to finally register to comment on the hypocrisy in this and many other threads like it. Because some people want frame rates for gaming on an MBA, then your needs for GPU performance are valid, and others who don't game but could use CPU performance have invalid needs? Rubbish.
A perfect example is the above. So the C2D rates as a 100/100 for CPU performance and thus any improvement is useless? Really?! Nice to see that you framed the argument such that any improvement you don't see as needed is useless.
On Sunday I combined 6 or 8 short 720p video clips into a 7 minute video for YouTube with a simple title screen and transitions. It took the C2D ~40 minutes to process the video and save in a new format. So you're really going to argue that there is nothing to be gained from a significant bump in processor speed?
For me and many other potential MBA purchasers, a CPU bump from the media processing abilities of the Core i processors would be welcome, and GPU performance over and above the ability to play real-time HD video is useless. We shouldn't be saddled with an out-of-date processor or forced to subsidize "unnecessary" frame rate performance just to appease game-players. And that perspective is as valid as yours.
Welcome!
CPU and GPU are both important. There is one critical difference between CPU and GPU though and thats this:
A user can usually wait on the CPU with no impact other than the fact that they had to wait. Using your example: you waited 40 minutes. A CPU that was twice as fast would have reduced your wait to about 20 minutes; a CPU that was half as fast would have increased it to about 80 minutes. The only consequence of CPU speed is time in general. There is rarely a difference in the final product.
GPU is different; the GPU is often used to perform realtime calculations (game or movie frames). Because the frames are tied to a specific point in time, a difference in GPU performance can make the difference between usable and unusable. For that reason, people who like, want or need GPU performance tend to be vocal.
In my experience, poor GPU performance bugs me more than poor CPU performance. You can't just wait for the GPU to get done, like you can with a CPU. There does have to be a balance though.
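The distinction above can be sketched in a few lines: CPU speed only rescales a wait time, while GPU speed decides whether a per-frame deadline is met at all. This is a minimal illustration, not anything from the thread; the function names and the ~33 ms (30 fps) budget are my own assumptions.

```python
def cpu_wait_minutes(baseline_minutes: float, speedup: float) -> float:
    """Batch work: wait time scales inversely with CPU speed; the output is identical."""
    return baseline_minutes / speedup

def gpu_frame_ok(frame_render_ms: float, speedup: float, budget_ms: float = 33.3) -> bool:
    """Realtime work: a frame is only usable if it renders inside the frame budget."""
    return frame_render_ms / speedup <= budget_ms

# The 40-minute encode from the post: a 2x CPU just halves the wait.
print(cpu_wait_minutes(40, 2.0))   # 20.0
print(cpu_wait_minutes(40, 0.5))   # 80.0

# A 50 ms frame misses a 30 fps budget at 1x GPU speed but makes it at 2x,
# which is the usable/unusable cliff the reply describes.
print(gpu_frame_ok(50, 1.0))       # False
print(gpu_frame_ok(50, 2.0))       # True
```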
Well said
dongmin
Jul 24, 10:02 PM
sounds interesting, though i have a feeling many people will just ignore the feature and end up touching the screen anyway, lol.
I think some of you have the wrong understanding of this 'non-touch' concept. You'll still be touching the screen. The purpose of the non-touch technology is to hide the scroll wheel (or any other controller) whenever it's not needed. But I think you'll still be touching the screen to actually activate the virtual buttons. That's my reading of it, anyway.
Kinda takes away from that whole "Simplicity is everything" slogan Apple is known for, doesn't it? While I'll reserve my judgments on the design until it's worked into a final product, it does look like the user needs to take unnecessary steps to actually use the click wheel. Then again, pictures (drawings) probably can't do the interface justice.
Still, not everything has to be digital over analog...
I actually think this will be even more intuitive than other interfaces because the controls will be contextual. The buttons will automatically appear and disappear as you move your fingers over the iPod screen. The buttons themselves, I'm imagining, will maintain the look and feel of the trademark iPod scroll wheel. If you are smart enough to operate the current iPods, you'll be smart enough to use the touch-sensitive controls.
BTW, wasn't this story already posted elsewhere a couple of days back? Shouldn't Macrumors be crediting the original publisher?
edit: Appleinsider (http://www.appleinsider.com/article.php?id=1902) had this article last week. It goes into more detail too.
appleguy123
Apr 24, 11:17 PM
Just a few more hours now till I get to chomp on some villagers, see some wolves, protect useless forms of life, or become a useless form of life.
I'm biting my (werewolf?) nails here.
Edit: Neko girl contacted me a few minutes ago. She said that she didn't have much Internet where she was, and might be able to play in the next game.
Doylem
Apr 4, 03:55 PM
Crummackdale...
http://img816.imageshack.us/img816/5480/crummackdale.jpg