
Original Link: https://www.anandtech.com/show/5688/apple-ipad-2012-review
The Apple iPad Review (2012)
by Vivek Gowri & Anand Lal Shimpi on March 28, 2012 3:14 PM EST

Since Apple launched the first iPad two years ago, the tablet market has evolved rapidly. While slate tablets were nothing new, the original iPad was the first serious tablet to be built around smartphone components and a user interface designed specifically for touchscreen input. The hardware was enough to run the OS smoothly while maintaining good battery life, the thin and light form factor lent itself to easy portability, and the touch-based user experience was miles better than earlier devices based on desktop operating systems.
We take it for granted now, but this was all news back in 2010, and the iPad was practically in a category of its own, with no real competitors to speak of. After Apple started shipping the iPad, the segment basically exploded—we had Google jump in with Honeycomb, HP got into it (and then out of it) with webOS, RIM had a go with the PlayBook, Amazon pushed the Kindle line into the tablet space, and Microsoft created its next release of Windows with tablets in mind. Along the way, Apple updated the iPad, both on the software side with multitasking, a new notifications system, and a myriad of UI updates, as well as launching second generation iPad hardware. The iPad 2 was a comprehensive update, bringing a dual core processor, unrivaled graphics performance, cameras fore and aft, and a ground up redesign that brought a thinner and lighter form factor.
The iPad 2 was a significant improvement over the original—faster, more portable, and generally a far more polished device. Not that it was perfect: iOS 4 still had issues with smooth multitasking and an archaic notifications system, the cameras were mediocre, and the XGA display, while a great quality panel, didn’t have the kind of pixel density expected of a premium mobile device. The iPad 2 hit the market around the same time as Honeycomb (in Motorola’s Xoom) early last year, and at first Apple still held a major edge in terms of hardware. As more impressive Honeycomb devices like Samsung’s Galaxy Tab 10.1 and the ASUS Transformer Prime launched, and with Ice Cream Sandwich looming on the horizon, Android became a much more viable tablet alternative to iOS. And with Microsoft planning a major push later this year for ARM-based Windows 8 tablets centered around the Metro UI, Apple has never faced such stiff competition in the tablet space. Which brings us to the third generation of iPad hardware.
It has a display resolution that dwarfs most high-end desktop displays. The panel also puts a real emphasis on quality, not just resolution. For a computing device targeted squarely at the consumer market, both of these things are rarities.
Its SoC is the absolute largest ever squeezed into an ARM based tablet. The chip itself is even bigger than what you find in most mainstream notebooks. It’s expensive, it puts out a ton of heat and it offers a tremendous GPU performance advantage over anything else in its class.
And it has a battery that’s larger than what ships in the current crop of similarly sized ultraportables and Ultrabooks.
The new iPad doesn’t significantly change the tablet usage paradigm, but it does put all previous attempts at building hardware in this space to shame. It’s the sort of no holds barred, performance at any expense design that we’re used to seeing from enthusiast PC component vendors—but in a tablet...from Apple.
Welcome to the new iPad.
The new iPad
First things first, the name. It's the new iPad, or simply the iPad, not the iPad 3, iPad 2S, iPad HD or any other variation on the theme—no more alphanumerics after the product name. Like the car industry, Apple has started peddling its wares by model years. It's a marketing coup for Apple, and their ability to pull it off in the computer industry shows how far they've come in the last five years.
Like the iPhone 4S, the iPad update mostly focuses on a component level, with only minor external changes being made to the hardware. Obviously, the Retina Display is the headline feature here, being the basis of Apple’s marketing campaign for the new iPad. It brings a resolution of 2048x1536 to the iPad’s 9.7” display, boosting pixel density to an impressive 264 pixels per inch. This isn’t quite as dense as the iPhone 4/4S Retina Display, which comes in at 326 ppi, but because the viewing distance for a tablet is expected to be greater than for a phone, effective pixel density is similar. The highest pixel density we’ve seen in a tablet prior to this is 224 ppi, with a handful of Android tablets from ASUS, Acer, Huawei and others boasting 10.1” WUXGA panels.
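As a sanity check, those density figures follow directly from panel geometry; here's a quick back-of-the-envelope sketch (our own arithmetic, using the resolutions and diagonal sizes quoted above):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Linear pixel density: pixels along the diagonal divided by the diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2048, 1536, 9.7)))   # new iPad: 264
print(round(ppi(960, 640, 3.5)))     # iPhone 4/4S: ~330 (Apple quotes 326)
print(round(ppi(1920, 1200, 10.1)))  # 10.1-inch WUXGA tablets: 224
```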
To keep up with the QXGA display, Apple chose to update their A5 SoC to suit the needs of the new device. Called the A5X, the new chip retains the pair of 1GHz ARM Cortex A9 cores from the A5 but features a quad-core replacement for the already potent dual-core PowerVR SGX 543MP2 graphics processor. With the quad-core version, known as the SGX 543MP4, Apple claims that graphics performance has doubled with respect to the SGX 543MP2. Given that SGX 543MP4 is basically the same as SGX 543MP2 except with twice as many execution cores, the claim indicates that Apple kept the GPU clock at 250MHz. SGX 543MP2 was overkill for the iPad 2’s XGA display, but the iPad’s QXGA screen is so vast in terms of pixel count that the more powerful GPU was likely necessary just to drive the display smoothly. In addition to the SoC update, the new iPad got a much-needed bump to 1GB of system RAM.
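The pixel math also makes it clear why the GPU had to grow; a rough sketch of the numbers (our own arithmetic, taking Apple's 2x throughput claim at face value):

```python
xga  = 1024 * 768    # iPad 2 display:     786,432 pixels
qxga = 2048 * 1536   # new iPad display: 3,145,728 pixels

# Four times as many pixels to drive, against a claimed 2x increase in GPU
# throughput, so the per-pixel rendering budget at native resolution shrinks.
print(qxga / xga)    # 4.0
```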
The increase in the number of pixels (and the transistors driving the display—one for each RGB subpixel) comes with a corresponding increase in the percentage of light blocked by the transistors and interconnect wiring. The transparent aperture of each pixel shrinks, necessitating a significantly stronger backlight as pixel density increases. Between the more power-hungry backlight and the faster SoC, the power consumption of the iPad is significantly greater than it was previously. And so, the 25Wh battery was swapped out for a downright huge 42.5Wh lithium-polymer pack. That’s well into ultrabook territory as far as battery capacity goes—it’s bigger, actually, than the 35Wh battery in the 11” MacBook Air. Apple claims 10 hours of battery life, identical to the previous WiFi model, which is relatively impressive given the increases in power consumption.
After the Retina Display, the most important new addition to the new iPad is the availability of LTE as an option, replacing the 3G models from before. Available in both AT&T and Verizon flavours, the LTE models both make use of Qualcomm’s 45nm MDM9600 LTE baseband. MDM9600 has support for UE Category 3 LTE, CDMA2000 1x/EVDO Rev.A (and B), GSM/EDGE, and WCDMA/HSPA+ all the way through DC-HSPA+ 42 Mbps, so it can roam internationally on 3G as well as connect to 3G in areas without LTE coverage. Unfortunately, the new iPad’s release schedule meant that it missed the 28nm shipping window by a couple of months, so we will probably have to wait until the 2013 iPad to see the more efficient MDM9615 modem.
The third major internal upgrade in the new iPad is the camera: in place of the iPad 2’s 720p rear facing camera raided from the iPod touch parts bin, the new iPad gets a huge upgrade to a 5MP camera with a backside-illuminated sensor and an f/2.4 lens. Basically, it uses the iPhone 4 sensor and iPhone 4S optics, so it’s still a parts-bin special, just with significantly better parts.
As before, the iPad is available with either a black or a white bezel, and pricing and storage options have stayed the same. Each step up in storage size adds $100 (so the 32GB model is $100 more than the 16GB, and the 64GB is $200 more), while adding mobile broadband costs $130. If you do the math and carry the ones properly, you’ll find MSRPs that range from $499 for the basic 16GB WiFi model all the way up to $829 for the 64GB LTE model. I picked up two 16GB WiFi iPads, a white one for me and a black one for a friend, while Anand picked up a black 16GB unit with Verizon LTE. We rounded out the collection with a black 64GB model on AT&T's LTE network.
Tablet Specification Comparison
| | ASUS Transformer Pad Infinity | Apple's new iPad (2012) | Apple iPad 2 | Apple iPad |
| Dimensions | 263 x 180.8 x 8.5mm | 241.2 x 185.7 x 9.4mm | 241.2 x 185.7 x 8.8mm | 243.0 x 190.0 x 13.4mm |
| Display | 10.1-inch 1920 x 1200 Super IPS+ | 9.7-inch 2048 x 1536 IPS | 9.7-inch 1024 x 768 IPS | 9.7-inch 1024 x 768 IPS |
| Weight | 586g | 652g (WiFi) | 601g (WiFi) | 680g (WiFi) |
| Processor | 3G/4G LTE: 1.5GHz Qualcomm Snapdragon S4 MSM8960 (2 x Krait); WiFi: 1.6GHz NVIDIA Tegra 3 T33 (4 x Cortex A9) | Apple A5X (2 x Cortex A9, PowerVR SGX 543MP4) | 1GHz Apple A5 (2 x Cortex A9, PowerVR SGX 543MP2) | 1GHz Apple A4 (1 x Cortex A8, PowerVR SGX 535) |
| Connectivity | WiFi, Optional 4G LTE | WiFi, Optional 4G LTE | WiFi, Optional 3G | WiFi, Optional 3G |
| Memory | 1GB | 1GB | 512MB | 256MB |
| Storage | 16GB–64GB | 16GB–64GB | 16GB–64GB | 16GB–64GB |
| Battery | 25Whr | 42.5Whr | 25Whr | 25Whr |
| Pricing | $599–$799 est | $499–$829 | $399, $529 | - |
So internally, the new iPad is definitely a step or two up from the iPad 2, but what about the rest of the hardware? The short version of the story is that the iPad hasn’t changed much at all. And by hasn’t changed much at all, I mean that it looks exactly the same as the iPad 2 except that it’s slightly thicker.
New chassis designs involve a lot of effort—between engineering, development, prototyping, tooling, and manufacturing, the entire process requires a lot of time and money. Apple’s design lifespan directly correlates to the maturity of the product line as well as the competitiveness of the market the product is in. With its mobile device lines, the tendency has been to keep a two year design life for established products. As such, the iPad design stays relatively unchanged, with the same design language as the iPad 2. That device had a ground-up design refresh based on the ergonomically-curved design language that debuted with the fourth-generation iPod touch in September 2010. Overall, we were pretty pleased with the form and aesthetics of the iPad 2, so we’re not sorry to see the design stay much the same.
iPad 2 (left) vs. new iPad (right)
But due to that massive battery, thickness has gone up (by 0.6mm to 9.4mm), and so has weight. At 1.44lbs, the iPad is 0.11lbs heavier than its predecessor (for the WiFi models; the LTE iPad adds 0.12lbs to the outgoing 3G iPad 2’s 1.34lbs). The Retina Display necessitated a faster GPU and a much larger battery to keep performance and battery life similar to the previous level, and you pay for it with a step backwards in terms of form factor. Six-tenths of a millimeter isn’t a lot, but when you consider that the iPad 2 was only 8.8mm thick to begin with, that 0.6mm represents a relatively significant 6.8% increase in thickness. Same goes for the 8.3% weight increase. It’s not a huge deal (after all, we’re talking about fractions of a millimeter and less than 2 ounces of weight), but if you’re familiar with the iPad 2, the additional heft is definitely noticeable.
Left to Right: iPad 2, new iPad, Transformer Prime
The new iPad (left) vs. the ASUS Transformer Prime (right)
iPad 2 (left) vs. new iPad (right)
It’s a little bit of an unfortunate development, because the thinner and lighter form factor was what differentiated the iPad 2 from the original iPad and made it so much more comfortable to hold. The 2012 iPad is now closer to the original iPad in weight, though it’s still significantly thinner—and still nearly as thin as the 9.3mm thick iPhone 4 and 4S, which is worth mentioning while we’re at it. It’s still definitely a thin device, just not as much as before.
And I’m not necessarily convinced that it’s a bad thing. The iPad 2, like the fourth generation iPod touch, was about as thin as I’d want a device to be. At some point, a device becomes so thin that it simply is no longer comfortable to hold, and the iPad 2 and fourth-gen iPod touch were close to that line for me. The new iPad just backs away from that line slightly, and on a personal level, I think it’s slightly more comfortable to hold.
From Left to Right: iPad (2010), iPad 2 (2011), iPad (2012)
From Bottom to Top: iPad (2010), iPad 2 (2011), iPad (2012)
However, the weight gain means that the iPad is once again tiring to hold for extended periods—carrying it with one hand while reading results in more arm fatigue than with the iPad 2. The iPad 2 was a significant improvement over the original in this regard particularly, so it’s a disappointment to see that become an issue once more.
But taken as a whole, the 2012 iPad hardware is a big step forward. It improves on the two major component issues with the iPad 2—the screen and the camera—without making any major concessions with regards to performance, portability or battery life, as we’ll see. The new iPad is just as usable as its predecessor, it's just better.
Anand and I tag teamed this review. I'm responsible for this section, in addition to the intro, iPhoto, camera and Vivek's impressions sections. The rest is told from Anand's perspective. Hopefully that clears up any confusion as you make it through the review.
The Display
The most visible improvement of the new iPad is naturally its Retina Display. Originally introduced with the iPhone 4, Apple's Retina Display branding refers to a display whose pixel density is high enough that the human eye, at a typical viewing distance, cannot resolve or identify individual pixels.
Unlike traditional OSes, iOS doesn't support a laundry list of display resolutions. The iPhone was introduced at 480 x 320 (3:2 aspect ratio), while the iPad came to be at 1024 x 768 (4:3 aspect ratio). Rather than require iPhone applications be redesigned for a higher resolution iPhone, Apple simply doubled both the vertical and horizontal resolution for the iPhone 4—maintaining the same aspect ratio as the previous models, and only requiring higher quality assets, not a redesigned UI, to take advantage of the new display.
The iPad, on the other hand, has always required a redesigned UI to make the most of its larger display and higher resolution. With a different aspect ratio, simply scaling up an iPhone app wouldn't work (although to enable backwards compatibility Apple did allow you to do just that). Arguably Apple didn't want to allow such easy portability between iPhone and iPad apps anyway, as it wanted developers to put in the extra effort to improve the quality of tablet apps.
The new iPad does what the iPhone 4 did and doubles both horizontal and vertical resolution: from 1024 x 768 to 2048 x 1536. All existing iPad applications work by default, as developers don't directly address pixels but rather coordinates (points) on the screen. Existing apps take up the full screen, and if higher resolution image assets are present they are used, avoiding the interpolation associated with scaling up artwork designed for the original iPad resolution. For example, below we have a makeshift iOS icon in three different forms—1x native (72x72), upscaled to a 2x version using bicubic interpolation (144x144), and a 2x resolution version (144x144):
Icon comparison, left to right: 72x72 native, 144x144 upscaled, 144x144 native 2x
The upscaled form looks good, but the 2x resolution version looks better.
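To make the mechanism concrete, here's a rough conceptual sketch of point-based layout and 2x asset selection (our own illustration of the idea, not Apple's actual implementation):

```python
def backing_pixels(points_w, points_h, scale):
    """Apps lay out in points; the system multiplies by the screen's scale factor."""
    return points_w * scale, points_h * scale

def pick_asset_scale(available_scales, screen_scale):
    """Use an asset matching the screen scale if one exists; otherwise fall back
    to the best available and let the system upscale it (with interpolation)."""
    return screen_scale if screen_scale in available_scales else max(available_scales)

print(backing_pixels(1024, 768, 2))   # (2048, 1536) on the new iPad
print(pick_asset_scale({1}, 2))       # 1 -> old 1x artwork, upscaled and soft
print(pick_asset_scale({1, 2}, 2))    # 2 -> Retina-ready 2x artwork, crisp
```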
More traditional OSes have always given you additional desktop real estate with increased resolutions. iOS simply gives you a better looking desktop. This distinction is arguably one of the reasons why the new iPad's display can be so polarizing. As consumers of high-end displays we're used to higher resolution going hand in hand with a larger panel size. Alternatively, we're used to a higher resolution enabling us to see more on a screen at one time. In the case of the new iPad, the higher resolution just makes things look sharper. It's a ton of work for an admittedly more understated impact, but it's the type of thing that simply had to be done.
Retina Display Enabled Apps
Apple has created such a healthy marketplace with the app store that developers are eager to quickly deliver apps with updated graphics for the new iPad. Sure enough, by the day of launch we saw several high profile applications with higher resolution assets for the new iPad. The motivation to have Retina Display support is huge as Apple is actively promoting those apps that have been updated for the new iPad via the app store:
These updated apps now come with larger image assets, which can increase the total app size. Not all apps will grow in size (e.g. Infinity Blade 2 simply renders at a higher resolution vs. using tons of new content, not to mention that textures are already heavily compressed) but some have/will. The retina burden unfortunately impacts all iPads as there's only a single app package delivered upon download. Even if you don't use them, the higher resolution retina graphics are there.
Note that iPhone apps will now load their Retina assets (designed for 640 x 960) rather than their normal assets (designed for 320 x 480) on the new iPad, resulting in a significant improvement in image quality there as well:
Games are a special exception to the 2x asset scaling of the new iPad. Applications that simply have their UI accelerated by the A5X's GPU do fairly well at the iPad's native resolution. 3D games are another story however.
If all you're doing is determining the color of a single pixel on the screen, unaffected by lights in 3D space or transparent surfaces layered above it, the process is relatively simple and painless. For the majority of what you're looking at in iOS, this is all that happens: the app instructs the drawing APIs to place a red pixel at a set of coordinates and that's what appears. In a 3D game however, arriving at the color value of that pixel can require quite a bit of math, and quite a bit of memory bandwidth.
Game developers have a few options on the new iPad. One option is to not update a game at all, running it at 1024 x 768 and relying on the iPad's scaler to upscale the image to 2048 x 1536. The game will take up the full screen and run faster than on the iPad 2, but it won't necessarily look any better. Low resolution content upscaled to a higher resolution display still retains much of the aliasing you'd see at the lower resolution.
Another option is to render all scenes at the new iPad's resolution: 2048 x 1536. With four times the number of pixels to fill and only 2x the compute and memory bandwidth compared to the iPad 2, this will only work for fairly lightweight content. Not to say that it's impossible—even GLBenchmark's Egypt test, in its current form, actually runs very well at the new iPad's native resolution. Many stressful 3D games won't fall into this category however.
The third, and most popular, option is for a game developer to render all frames offscreen at an intermediate resolution between 1024 x 768 and 2048 x 1536, then scale up to the panel's native resolution. So long as the developer maintains the aspect ratio, this approach can strike a good balance between image quality and performance.
Infinity Blade 2 for the new iPad renders at roughly 1.4x the iPad 2's resolution, then upscales to fill the screen
Infinity Blade 2, for example, renders offscreen at roughly 1.4x the resolution of the iPad 2 before scaling up to 2048 x 1536 for final display. The result is a sharper image than what you'd get on an iPad 2, without sacrificing performance.
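A rough illustration of why the intermediate-resolution approach works out (our own arithmetic, using the roughly 1.4x figure quoted above for Infinity Blade 2):

```python
ipad2_target = (1024, 768)    # old render resolution
native       = (2048, 1536)   # new iPad panel
scale        = 1.4            # Infinity Blade 2's reported render scale

offscreen = (int(ipad2_target[0] * scale), int(ipad2_target[1] * scale))  # ~1433 x 1075

def pixels(res):
    return res[0] * res[1]

print(round(pixels(offscreen) / pixels(ipad2_target), 2))  # ~1.96x the iPad 2's pixel load
print(round(pixels(native) / pixels(offscreen), 2))        # ~2.04x fewer pixels than native
# The GPU shades roughly twice the iPad 2's pixel count (in line with the A5X's
# claimed 2x GPU throughput), then a cheap upscale fills the 3.1MP panel.
```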
Game developers may choose to increase the level of anti-aliasing instead of or in combination with an increase in resolution. As we'll discuss shortly, Apple's A5X does come equipped with more GPU execution resources and dedicated memory bandwidth for graphics that would allow for an increase in quality without a corresponding decrease in frame rate.
The Display: In Numbers
Apple is very big on maintaining a consistent experience between its products. We see this a lot in our Mac reviews, where it's not unusual to see similar white points across virtually all Apple products. It's no surprise that with the move to the Retina Display, Apple wanted to retain as much of the original iPad's display characteristics as possible. We'll start with an analysis of brightness and contrast, both of which remain relatively unchanged from the iPad 2:
Apple is believed to have triple-sourced panels for the new iPad, so there will likely be some variation in these results, but for the most part you can expect the new iPad's display to perform similarly to the previous model.
Despite similar brightness and contrast to the previous model, the new iPad offers remarkably better color gamut and color reproduction than its predecessor. Relative to other tablets, the iPad's display is spectacular.
As we mentioned in our Retina Display analysis, Apple delivered on its claims of a 44% increase in color gamut. The new iPad offers nearly full coverage of the sRGB color space and over 60% of the Adobe RGB gamut:
Below is the CIE diagram for the new panel with an sRGB reference plotted on the same chart so you can visualize the data another way:
Color accuracy has improved tremendously if we look at delta E values for the primary and secondary colors:
Remember from our display reviews, lower delta E values indicate greater color accuracy. Values below 4 are typically considered good and you can see that the iPad 2 as well as the Transformer Prime both fell short in this department. With the new iPad Apple has clearly focused on color accuracy, which makes sense given it was used as the vehicle to introduce iPhoto for iOS.
Apple still has a lot of work ahead of it to really put forth a professional quality display in a tablet, but for now the Retina Display is easily the best we've seen in a tablet and a tremendous step forward.
What's most absurd about the iPad's Retina Display is that you're able to get this resolution and panel quality in a $499 device. While we must be careful not to give Apple too much credit here as Samsung, Sharp and its other display partners clearly make the Retina Display, it's obvious that Apple has really been pushing its partners to develop solutions like this.
The biggest problem in the production of any commoditized component is that the primary motivation for innovation is lowering cost. For years I argued with notebook PC makers to use higher quality LCD panels, but no one was willing to commit to the quantities that would lower costs enough. I was also told that as soon as you put these notebooks on shelves at Best Buy, users wouldn't really care whether they were getting a high quality IPS display or not—all that mattered was the final price.
Apple, under the leadership of Steve Jobs, had a different mentality. Steve's pursuit was quality and experience, cost was a secondary concern. Through slow and steady iteration of this approach, Apple was able to build up a large enough customer base and revenue to be a significant force in the industry when it came to driving costs down. Apple can easily fill your fabs and eat all that you can produce, but you'll have to do whatever it wants to get the order.
Apple's behavior since it got rich has been to drive down the cost of higher quality components, LCDs being a perfect example. Unfortunately other companies don't benefit as much here as Apple tends to buy up all of the production of what it has pushed to create. That's one reason why, although ASUS was first to introduce a 1080p Transformer Pad, it won't launch until well after the new iPad. From what I've heard, the panel makers are all busy servicing Apple's needs—everyone else comes second.
Eventually the entire industry will benefit, and all indications point to Apple doing something special for "pro" users in the notebook space next. As I've said previously, Apple has raised the bar with the iPad's Retina Display; the time for average display quality in a $500 tablet is over. It remains to be seen whether or not Apple will be able to maintain this quality across all suppliers of its Retina Display. On the iPhone Apple has been entirely too lax about maintaining consistency between suppliers. If it wants to be taken seriously in this space Apple needs to ensure a consistent experience across all of its component vendors.
The Display: In Practice
I remember a time when 3D games stopped looking good to me. It didn't matter what id, Epic or Crytek would do, I just came away unimpressed while all of my friends oohed and ahhed. It wasn't because these developers and artists were doing a bad job, it was because I was spoiled by the 3D tech demos ATI and NVIDIA would show me on a regular basis. Not burdened by the realities of running a game company, including not having to build something that would run on every PC rather than just the top 1%, ATI and NVIDIA would regularly share with me the absolute best of what could be done on the world's fastest GPUs. They were truly pushing the limits with these tech demos, they always looked amazing, but they ruined all actual games for me. Nothing ever looked as good as these demos and thus I was never really impressed with the visuals of any game I saw. I always came to expect better looking visuals, and the best a game developer could deliver was on par with my expectations. Never could a developer exceed them.
I feel like Apple and other companies pushing the display industry forward have done something similar to the visual impact of a good display. I'm rarely blown away by a display, I've just come to expect good ones and I'm sorely disappointed when I encounter bad ones. The Retina Display on the new iPad is good, spectacular, amazing, sharp, great even, but it's what I've come to expect.
There's been a lot of debate online among reviewers and users alike about just how good, subjectively, the new Retina Display is. I feel like the cause for the debate boils down to just where your expectations are. If you're used to using the bargain basement TN panels that this industry is littered with, then you will be blown away by the iPad's Retina Display. If, however, you're surrounded by good displays—perhaps even those used in other Apple products—you'll like the new display, but you'll be grounded in your reaction to it.
Assuming you don't have absolutely perfect eyesight, you'll have trouble resolving individual pixels on the new Retina Display. If you're used to this because of your iPhone 4/4S, HTC Rezound or other amazingly high-density display, the experience on the new iPad will be similar, just on a larger scale.
To the left we have the original 1024 x 768 panel, and to the right we have the new Retina Display. At this distance you can still identify individual pixels, an ability that quickly vanishes at normal viewing distances. The Music app icon is an even better example of what you gain from the newer display as it has more high contrast edges that appear more aliased on the 1024 x 768 panel:
If we take a few (or an order of magnitude) more steps closer to the display and put it under the microscope we can get an even better appreciation for exactly what Samsung (and Apple's other display vendors) have done with the creation of this panel. Below are shots at 50x magnification of the display from the iPad 2, new iPad, ASUS TF Prime and iPhone 4S, organized from lowest to highest DPI:
Apple iPad 2, 1024 x 768, 9.7-inches
ASUS Eee Pad Transformer Prime, 1280 x 800, 10.1-inches
Apple iPad Retina Display (2012), 2048 x 1536, 9.7-inches
Apple iPhone 4S, 960 x 640, 3.5-inches
What you're looking at here are shots of the three subpixels for each pixel. Subpixel shapes will vary by panel type/manufacturer (hence the iPhone 4S vs. iPad subpixel structure), but the increase in density is tremendous.
I hate the term "painted on" because it gives the impression, at least to me, of limited separation between the viewer and the object as you would have with a piece of paper. Despite the very shallow gap between the outermost glass and the display stack itself, you can still tell that you're looking at a screen with the new iPad. Perhaps it's because of the reflections in the glass or the lack of tactile feedback convincing you that you are still looking at a virtualized interface, but the Retina Display does not break down the barrier of reality. It's always dangerous when using hyperbole to speak about a display as good as the iPad's. Get too excited in your description and you're bound to disappoint those whose expectations are simply too high. The iPad's Retina Display is stunning but the best way I'd describe it is this: the Retina Display is the type of display a $500+ tablet should come with.
iOS is often applauded for maintaining a smooth UI frame rate during animations like screen swipes. The screen on the new iPad delivers a similarly seamless experience, but with regards to stationary content. Icons on your home screen look permanent, in place and, I hate to say it because nothing ever truly is, perfect.
Text is always sharp and extremely legible. You won't get the minimal eye fatigue of an e-ink display, but you do get the versatility of a full blown tablet with the iPad.
Remote desktop apps stand to benefit tremendously from the new iPad's Retina Display, particularly if you're remotely accessing a very high resolution desktop. The image below is what my 2560 x 1440 desktop looks like, remotely accessed from an iPad 2:
Text is extremely aliased, basically illegible. I can see everything on a single screen but I can't really make out much of what I'm looking at.
Here's the same remote desktop app for Android running on an ASUS Transformer Prime at 1280 x 800:
Here we have a tangible improvement, but still not tremendously better than the iPad 2. Now let's look at the desktop on the new iPad:
I can actually read the contents of IMs I'm receiving. I'm still going to want to zoom in to actually do anything, but even without zooming in, the crispness of everything is just so much better. This is a functional, dare I say productive, use of the new Retina Display.
Mouse over the links below to see a crop of me remotely accessing my desktop from the three tablets. There's a huge improvement in how legible the remotely accessed text is on the Retina Display.
Apple iPad 2 | Apple iPad (3rd gen) | ASUS TF Prime |
Keep in mind that you are limited to the resolution of the content you're viewing, which is particularly a problem on the web. The majority of displays fall somewhere in the 1366 x 768 to 1680 x 1050 range, and the majority of websites are designed to be fixed width at no more than 1200 pixels wide. As a result, images embedded on these websites are relatively low resolution compared to the iPad's display. We'll have to see a fairly large shift in display resolution across the board to really push the web towards embedding higher resolution photos and images; until then, it'll be a point of frustration.
Safari's 2MP JPEG Limitation
There's another present limitation in mobile Safari for iOS: JPEG images larger than 2.097MP are automatically downscaled for display, even on the new iPad. To test behavior I created a few images at varying resolutions and listed the displayed resolution on the iPad:
Mobile Safari (iOS) JPEG Limits
| Image | Native Height | Native Width | iPad Height | iPad Width | Ratio | Native MP | iPad MP |
| Full | 4256px | 2832px | 1064px | 708px | 1:4 | 12MP | 0.75MP |
| 3000px | 3000px | 1996px | 1500px | 998px | 1:2 | 6MP | 1.5MP |
| 2000px | 2000px | 1331px | 1000px | 666px | 1:2 | 2.66MP | 0.67MP |
| 1600px | 1600px | 1065px | 1600px | 1065px | 1:1 | 1.70MP | 1.70MP |
As soon as you pass the 2.097MP threshold, mobile Safari will reduce the horizontal or vertical resolution (or both) of the image being rendered by an integer factor until the resulting image is below the threshold. The limit is purely resolution based. Apple documents the limit as 2MP, however I was able to create images that were slightly bigger without triggering the downsample. The maximum decoded JPEG resolution according to Apple is 32MP.
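One plausible model of the behavior in the table above (a sketch based on our measurements, not Apple's documented algorithm) is that Safari simply halves both dimensions until the decoded image fits under the threshold:

```python
LIMIT_PIXELS = 2.097e6  # ~2.097MP threshold observed above

def displayed_size(w, h, limit=LIMIT_PIXELS):
    """Halve both dimensions until the pixel count fits under the limit."""
    while w * h > limit:
        w //= 2
        h //= 2
    return w, h

print(displayed_size(4256, 2832))  # (1064, 708)  -> 1:4, matches the table
print(displayed_size(3000, 1996))  # (1500, 998)  -> 1:2
print(displayed_size(2000, 1331))  # (1000, 665)  -> 1:2 (table rounds to 666)
print(displayed_size(1600, 1065))  # (1600, 1065) -> untouched, under the limit
```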
The downscaling made sense when the iPad had a 1024 x 768 display, but it's now time for a Safari update lifting the restriction to properly support the new iPad.
Note that this limit doesn't apply to GIFs, PNGs or TIFFs, only JPEGs. Apple claims that for these other file formats you're limited to a max decoded size of 3MP on iDevices with less than 256MB, or 5MP on 512MB iDevices, although you'll notice a 64MP PNG in the image above being rendered at what appears to be full resolution.
The Most Tangible Feature: LTE Support
As the iPad straddles the divide between the smartphone and the notebook, wireless connectivity is a must to maximize its usefulness. In cities where WiFi is plentiful, opting for cellular connectivity isn't absolutely necessary, but in most of my travels I find that having some form of data plan associated with your tablet makes it a far more useful device.
If you already have a wireless hotspot or can create one through your smartphone, the appeal of a cellular modem in your iPad is diminished. There's still the convenience aspect of simply unlocking your tablet and having internet access regardless of where you are, and without having to turn on another device or configure a software based personal hotspot.
If you don't have either of these things however, and you plan on using your iPad regularly outside of your home/office, buying one of the cellular enabled versions is a costly but sensible decision.
In the past the iPad was limited to 3G operation, however arguably one of the most tangible improvements with the new iPad is its support for LTE. Through Qualcomm's MDM9600, a 45nm LTE modem with support for EVDO and HSPA+ (but no voice), Apple brings the most complete set of cellular connectivity options we've seen on the iPad.
Qualcomm's MDM9600 in the LTE iPad, courtesy iFixit
Before we get to the discussion of service plans, performance and the personal hotspot, I must stress just how big of an improvement LTE is on the iPad compared to last year's 3G models. Although LTE on a smartphone is pretty amazing, it's even more shocking on a tablet. Assuming your usage model on an iPad is a closer approximation of a notebook usage model, the inclusion of LTE is akin to always being on an extremely fast cable internet connection. Web pages load up just as quickly over LTE as they do over WiFi at my home (since the iPad's WiFi is limited to around 30Mbps in most cases, which ends up being peak downstream for me on AT&T's LTE network here in Raleigh). At my parents' house, where the fastest internet available is 6Mbps DSL, it's actually even faster for me to browse the web on the LTE iPad than it is using their WiFi. Obviously their home internet offers unlimited data transfers, while the LTE iPad does not, but for non-primary use the performance is absolutely worth the entry fee.
I mentioned the LTE connectivity on the new iPad is the most tangible feature of the tablet because the improvement in web page loading times alone makes the tablet feel much faster than its predecessor. While you can argue about how significant the Retina Display is, there's no debating about how much faster LTE is over the 3G iPad 2 models when out of range of WiFi. It's just awesome.
The LTE Breakdown, Carrier/Frequency Support
In the US, Apple makes three versions of the new iPad available: a WiFi-only device that lacks the MDM9600 and its associated components, a Verizon LTE version and an AT&T LTE version. The pricing between the three options is outlined below:
The new iPad Lineup
| | 16GB | 32GB | 64GB |
| WiFi | $499 | $599 | $699 |
| AT&T WiFi + 4G | $629 | $729 | $829 |
| Verizon WiFi + 4G | $629 | $729 | $829 |
As has always been the case, there's a $130 adder to enable cellular connectivity on the iPad. Apple is making up for slimmer than usual margins on the 16GB WiFi iPads by charging quite a bit for NAND and cellular upgrades. Unfortunately there's no way around the cost (outside of relying on an external hotspot via smartphone/MiFi), and the added functionality is definitely worth it.
Just as before, AT&T and Verizon offer no-contract data plans for use with the new iPad. These plans don't require any activation fee and can be managed on the iPad itself. You can cancel and re-activate at any time:
iPad Data Plans
| | $14.99 | $20 | $30 | $50 |
| AT&T | 250MB | - | 3GB | 5GB |
| Verizon | - | 1GB | 2GB | 5GB |
AT&T offers the better "deal" at $30 per month although both carriers offer the same 5GB limit for $50 per month. Currently only Verizon enables iOS' personal hotspot option on all of its plans for no additional charge. AT&T claims it is working on enabling personal hotspot, however it is currently not available.
Both the AT&T and Verizon versions support the same GSM/UMTS/HSPA/HSPA+ frequencies and ship carrier unlocked so you can swap in any microSIM and use your iPad on a supported network. The table of bands supported by both models is below:
Cellular Network Support
| | AT&T WiFi + 4G | Verizon WiFi + 4G |
| 4G LTE | 700 MHz, AWS | 700 MHz |
| EV-DO Rev. A | - | 800, 1900 MHz |
| UMTS/HSPA/HSPA+/DC-HSDPA | 850, 900, 1900, 2100 MHz | 850, 900, 1900, 2100 MHz |
| GSM/EDGE | 850, 900, 1800, 1900 MHz | 850, 900, 1800, 1900 MHz |
The premise behind shipping the iPad unlocked is to allow users to purchase and use SIM cards from around the world when traveling. As long as the network you're on is supported by the iPad, your microSIM will work.
LTE support is unfortunately confined to North America only. International support is limited to 3G. And although DC-HSPA+ is supported by the new iPad, T-Mobile customers in the US are mostly out of luck. A T-Mobile microSIM will work but unless you're in a market where T-Mobile has enabled W-CDMA on 1900MHz, you'll be limited to EDGE speeds. In theory, if T-Mobile had two available W-CDMA carriers on 1900MHz in your area you could get DC-HSPA+ but that seems highly unlikely given the limited 1900MHz spectrum T-Mobile has available.
Although both AT&T and Verizon have LTE-FDD deployed on 10MHz wide carriers in the US, many AT&T markets use 5MHz carriers. In a 5MHz AT&T LTE-FDD market, assuming all else is equal in terms of deployment and loading, Verizon's network should be significantly faster. The reality of the matter is far more complex. Verizon's LTE network is (presumably) far more utilized as it's been in operation for longer than AT&T's. Verizon's carrier bandwidth advantages can easily be eaten up by an increase in active LTE subscribers. On the flip side, there's also the question of deployment strategies. Take Las Vegas for example. As we found at CES, AT&T had great coverage in key areas (e.g. the Las Vegas Convention Center), however at other hotels around the Las Vegas Strip we typically had better luck on Verizon. It's been my personal experience that AT&T's network is either great or horrible, with very little in between. Verizon on the other hand tends to deploy much more evenly from what I've seen.
Raleigh, NC, my home town, happens to be a 5MHz market for AT&T. With both AT&T and Verizon LTE deployed here, I ran through a combination of nearly 200 speedtests across two LTE iPads around the North Raleigh area:
AT&T LTE vs. Verizon LTE in Raleigh, NC—Downstream
| | Average | Max | Min |
| AT&T | 11.46 Mbps | 25.85 Mbps | 1.12 Mbps |
| Verizon | 13.33 Mbps | 29.52 Mbps | 0.33 Mbps |

AT&T LTE vs. Verizon LTE in Raleigh, NC—Upstream
| | Average | Max | Min |
| AT&T | 4.44 Mbps | 12.35 Mbps | 0.07 Mbps |
| Verizon | 4.52 Mbps | 19.67 Mbps | 0.01 Mbps |

AT&T LTE vs. Verizon LTE in Raleigh, NC—Latency
| | Average | Max | Min |
| AT&T | 72.9 ms | 120.0 ms | 58.0 ms |
| Verizon | 84.1 ms | 217.0 ms | 60.0 ms |
On average, Verizon was faster than AT&T, with roughly a 15% advantage in average downstream speed, although AT&T delivered slightly lower average latency. The two were roughly equivalent in average upload speeds, with Verizon managing a small 1.8% advantage. The numbers were closer than expected, given that Raleigh is a 5MHz market for AT&T, but I suspect some of the mitigating factors I mentioned above are at work here.
Subjectively, Verizon did seem to be faster more often although I didn't really have any complaints about the performance of the AT&T LTE iPad. Both iPads indicated they remained on LTE although, as you can see from the data above, performance can get very low before officially falling back to 3G.
In the case of the AT&T iPad, if you don't have LTE coverage you first fall back to HSPA+ which can still deliver respectable performance. Verizon iPad owners will unfortunately fall back to EVDO, which can be significantly slower. If Verizon LTE coverage is good in the places you plan on using your iPad then this difference isn't really a big deal. As with any smartphone carrier decision, you need to factor in where you plan on using the device into your decision.
The iPad as a Personal Hotspot: Over 25 Hours of Continuous Use
Verizon makes the decision of which iPad to buy even more difficult by being the only one of the two US carriers to enable the personal hotspot option on the new iPad. For no additional monthly fee on top of your data plan, your Verizon LTE iPad can act as a wireless hotspot, allowing up to five other devices to use its cellular connection over WiFi (2.4GHz only, unfortunately) or Bluetooth. One device can use the hotspot via the iPad's USB dock cable.
If you don't already have the personal hotspot option in the initial settings page, you'll need to go to general settings, then network, and activate personal hotspot there. Once you've done so you'll see a new item for personal hotspot in the default settings page.
You must remain on the personal hotspot settings page for the iPad's SSID to be visible to nearby devices. Once you leave the settings page, the iPad stops broadcasting its personal hotspot SSID.
In general the iPad's personal hotspot seems to be better behaved than similar options under Android. I've noticed all too often that Android hotspots will stop routing traffic after an extended period of use, requiring either cycling the radio states on the hotspot device itself or, in some cases, a full reset of the hardware. The iPad wasn't immune to this sort of behavior, it just seemed to happen less than on the Android tablets and smartphones that I've tested. In one test it took only a few hours before I had to reset the iPad to make its hotspot work properly again, while in another case it was only after 24 hours of continuous use that the feature began misbehaving. Overall I am very pleased with the Verizon iPad as a personal hotspot; the bigger issue is the cost of the data that you're sharing with all of those devices.
As I mentioned in our Galaxy Tab 10.1 LTE review, these LTE tablets make great hotspots simply because you are pairing smartphone modems with gigantic (for a smartphone) batteries. The end result is if you have to treat your LTE tablet as a true hotspot (screen off and all), you get great battery life. The new iPad takes this idea to a completely new level since its battery is now squarely in the laptop-sized category, but its LTE modem is still designed to run on a < 6Wh smartphone battery.
Our standard hotspot battery life test involves running four copies of our web browsing battery life test and playing a 128Kbps internet radio stream on a laptop tethered via WiFi to the hotspot being tested. While peak download speeds during this test can reach as high as 1MB/s, remember that these web browsing battery life scripts include significant idle time to simulate reading a web page. The average data transferred over the duration of the test amounts to around 25KB/s if you take into account idle periods.
With the Galaxy Tab 10.1 LTE I tried something different—letting the tethered notebook download at full speed using the Tab's LTE connection. On the new iPad, after nearly an hour of downloads at well over 1MB/s I saw no drop in the battery percentage indicator—it was stuck at 100%. Not wanting to upset Verizon too much, I needed to find a good balance between a realistic workload and something that wasn't going to make me rack up over a hundred GB in overages.
If our standard hotspot test averages around 25KB/s of transfers, I figured doubling it couldn't hurt. I downloaded a sufficiently large file at a constant 50KB/s on a laptop tethered over WiFi to the new iPad to see how long it would last. The result was astounding: 25.3 hours on a single charge.
I used up over 4.5GB during this period—almost the entire amount that my $50/month plan gave me, all without having to plug the iPad in to recharge it. That's the beauty of using a 42.5Wh battery to drive a cellular modem that can last a couple of hours on a tenth of that capacity. If you want to use the new iPad as a personal hotspot, you'll likely run out of data before you run out of battery life.
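The math checks out; a quick back-of-the-envelope sketch using the figures above (decimal units, our own arithmetic):

```python
rate_kBps  = 50      # tethered download rate in KB/s
hours      = 25.3    # measured runtime on a single charge
battery_wh = 42.5    # new iPad battery capacity

data_gb    = rate_kBps * 1000 * 3600 * hours / 1e9
avg_draw_w = battery_wh / hours

print(round(data_gb, 2))     # ~4.55 GB transferred, in line with "over 4.5GB"
print(round(avg_draw_w, 2))  # ~1.68 W average system draw with the screen off
```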
It's a real shame that AT&T decided against enabling personal hotspot on its version of the LTE iPad. It's for this reason alone that I'd recommend the Verizon version, assuming that you're planning on using your iPad in an area where Verizon has LTE coverage of course.
The Camera, It's Much Improved
by Vivek Gowri
iOS 5.1 brought with it a number of bugfixes along with a few minor changes to the core entertainment applications (Music, Photos, Videos), but the only real UI change it brought was the redesigned camera application for the iPad. It fixes our biggest complaint with the original—the shutter button’s location in the middle of the settings bar at the bottom of the screen—and ends up being a big improvement from a usability standpoint. The shutter now resides in a floating circular button on the right side of the display, right where your right thumb falls when holding the iPad with two hands. It’s a more intuitive location for the shutter, so taking a picture is a far more natural feeling exercise than it was before. Other than that, the app looks pretty similar—the settings bar now has the still/video slider, front/rear camera switch, an options button, and the link to the photo gallery.
In terms of camera options, there's only one. You can either have the rule of thirds grid overlay visible or hidden... and that's it. There are no other settings for you to change. No exposure, white balance, ISO, shutter speed, or anything else that isn't the shutter button. Unfortunately, even the HDR mode from the 4 and 4S is nowhere to be found on the iPad. You literally just point and shoot. That's all there is for you to do.
In our review of the iPad 2, we summed up the cameras with just one word: mediocre. Looking back, I realize now that mediocre is a pretty charitable way to describe the iPad 2's camera situation. Both sensors were borrowed from the iPod touch, and while the VGA front facing camera was acceptable, the rear facing 720p camera was legitimately bad by the standards of a $499 device.
The new iPad fixes that rear camera problem in a big way, with the five element f/2.4 lens and optics borrowed from the iPhone 4S paired with the Omnivision OV5650 CMOS image sensor from the iPhone 4. A quick refresher on specs: 5 megapixels, backside illuminated, 1080p video at 30fps. If you ignore megapixel count, it's a pretty competitive camera on paper. There's a lot of recycled parts here, with bits and pieces from other iDevices frankensteined together to come up with a new imaging system for the iPad, but parts-bin raids aren't bad when the bins being raided contain top-tier components. The result ends up being pretty good—as a camera, the new iPad is light years ahead of its predecessor in basically every way.
In practice, it’s nothing short of stellar. Image quality is comparable to most high end smartphones, though not quite good enough to be on par with the bleeding edge cameraphones (4S, Nokia N8/N9, HTC Amaze 4G, Galaxy S 2, etc). Interestingly enough, the preview image looks to be running below 30 fps, appearing a little bit choppy at times. This is likely due to the high resolution of the preview and upscaling it to a very high display resolution, but it doesn’t particularly affect image capture. I measured shot to shot time at exactly one second (I had a range between 0.98 and 1.04 seconds, averaged out to 1.0 when factoring in reaction time). That’s about double what Apple claimed for the 4S, and a bit longer than the iPad 2. Granted, the iPad 2’s camera was very quick in part because the amount of processing it takes to capture a 960x720 image is almost zero, with about 13.8% as many pixels as each 2592x1936 image captured by the new iPad.
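For context on that figure, a quick check of the pixel counts involved (our own arithmetic from the stated capture resolutions):

```python
ipad2_still    = 960 * 720     # iPad 2 still capture:     691,200 pixels
new_ipad_still = 2592 * 1936   # new iPad still capture: 5,018,112 pixels

print(round(ipad2_still / new_ipad_still, 3))  # ~0.138 -> about 13.8%
```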
The focal length is 4.28mm, a bit longer than the iPad 2’s 3.85mm. The difference is actually noticeable; when taking pictures of nearby subjects, you’re sometimes surprised by how magnified the subject appears. However, the camera is good for landscapes, as you can see from the sample gallery. I took the iPad with me on a weekend trip to Victoria, B.C. and used it as my primary camera on the trip. Now, while I wouldn’t trade my SLR for an iPad anytime soon, I can’t deny that the results turned out pretty well. Colours were vibrant, white balance was accurate, and the clouds were nicely highlighted. It’s a quantum leap from the noisy, 0.7MP mess that was the iPad 2 camera. Mouse over the links below to see some comparisons between the cameras on the iPad 2, 3rd gen iPad and TF Prime.
Apple iPad 2 | Apple iPad (3rd gen) | ASUS TF Prime |
The new sensor can record 1080p video, up from 720p. Video quality was probably the best aspect of the iPad 2 camera, and it's even better here. Output is recorded at 29.970 fps and encoded in h.264 Baseline with a bitrate of 21Mbps and single channel audio at 64kbps. The recorded video impresses, with crisp detailing and adequate audio quality from the single mic.
The front facing camera keeps the Omnivision OV297AA sensor from the iPad 2, and as such, image and video quality remain unchanged. It’s not necessarily a bad thing, since it remains adequate for FaceTime and Skype, but it would have been nice to see an update to an HD-quality webcam up front.
With augmented reality apps, I'm starting to see the benefit of rear cameras on tablets. The Yelp app, for example, uses location and compass data to show which restaurants lie in the direction the iPad is pointing, overlaid on a real-time view of the street from the camera. It's not necessarily the most useful application of a rear facing camera in an AR context, but overall it's an idea that has potential. Apple also tells us that its business and education customers see usefulness in the iPad's rear facing camera, as they can use it to quickly document something while using the iPad as a productivity tool. As a consumer though, you're going to get weird looks if you're using the iPad to take pictures; it's a relatively comical sight.
And that’s really the problem: from an ergonomic standpoint, smartphones are just so much easier and more comfortable to use as cameras. And because the imaging hardware is so similar, I’m not sure I see the real benefit of having a rear facing camera on a tablet except in very specific use cases.
Handheld Image Editing: iPhoto for iOS
by Vivek Gowri
Alongside the iPad 2, Apple launched iOS versions of iMovie and GarageBand. Now, Apple has announced iPhoto for iOS, completing the iLife collection for iOS. Like iMovie and GarageBand, iPhoto goes for $4.99 on the App Store and makes an ideal companion for the iPad Camera Connection Kit.
iPhoto can take images from multiple sources, including iTunes, Camera Roll, iCloud, as well as pictures imported through the Camera Connection Kit’s SD card. When you open iPhoto, you’re greeted by thumbnails of photo albums corresponding to the albums synced from iTunes, the desktop iPhoto, and iCloud Photo Stream, as well as the device’s Camera Roll, images imported from the Camera Connection Kit, and a set of albums created within iPhoto for edited photos, flagged images, favorites, or pictures beamed to the iPad from other iOS devices with iPhoto. The album view is similar to iBooks or Newsstand in that the thumbnails are displayed on shelves, though instead of a virtual wooden bookshelf, iPhoto has a more modern aesthetic with glass shelves floating on a light gray background.
The other tabs are photos, events, and journals. Photos is simply all the photos taken on, imported to, or beamed to the device. Events are collections of images synced to your device from iTunes or imported using the Camera Connection Kit. iPhoto journals are a digital scrapbook of a selected set of images, arranged as a flow of differently sized elements in a digital mosaic.
From an album, event or the photo box, tapping an image will take you to the main image page, with a few buttons on the top bar. The most prominent and most important is the edit button in the top right corner, along with options for sharing, image information, and a “show original” button on that side of the toolbar, while the left side of the toolbar has an option to show/hide the thumbnail grid on the left edge, a help button, and an undo button (that only functions in image editing mode). Touching and holding the image with two fingers brings up a magnifying loupe to zoom in on a specific spot.
Entering the editing mode brings up a toolbar on the bottom, with editing tools, tagging options, and a gear that brings up secondary options. As far as editing tools go, iPhoto has most of the major ones—crop and rotate, exposure, color saturation, brushes, and various effects, all of which take up residence in 5 buttons at the bottom left corner. More general options are in the middle: auto-enhance, 90-degree rotation, flagging, favoriting, and hiding, then on the right side a settings menu that allows for selecting multiple photos, copy/pasting edits to multiple photos, and reverting to original.
Cropping is pretty straightforward, with pinch to zoom and a composition grid, as well as a few preselected crop aspect ratios accessible via the options gear. Rotation comes courtesy of a dial at the bottom of the screen, which allows you to accurately straighten your images.
Exposure controls brightness and contrast, which are combined into a slider that allows for adjustment of the dynamic range. You can control all three separately using that slider, or by pressing and holding the image, bringing up a four directional arrow that you can drag. The two different axes represent control over two different options, depending on where on the image you press. The options gear has three options: copy, paste, and, like in all of the editing modes, a reset for the individual editing mode (as opposed to the entire image). The entire editing process is very intuitive and the tactility of the program makes post-processing easy to control even for imaging novices.
The color options are pretty basic; there are sliders for color saturation, skies, grass and plants, and skin tone, along with a circle with WB for the different white balance options—as shot, sun, cloudy, flash, shade, incandescent, fluorescent, face balance, and custom, which brings up a magnifying ring to select a point of neutral color. The gear brings up the standard copy, paste and reset, but also has a setting to preserve skin tones, for keeping skin tones as shot while saturation is increased or decreased.
The brushes are the most interesting tool here, basically letting you paint on the image to edit in very specific regions. There are eight different brush tools—repair, red eye, saturate, desaturate, lighten, darken, sharpen and soften. Repair patches areas of a photo using pixels from the surrounding areas, while the rest are pretty self explanatory.
The settings and options with the brush tools are pretty endless. The most useful one is probably the edge detection setting, which lets strokes apply only to areas similar to the initially painted region—ie, if you were softening a body of water or the sky. Other options include strength and intensity of the brushes, the ability to erase individual brush strokes, having brush strokes shown as they’re drawn, and to apply the effect to the entire image. The other nice touch here is that, in addition to being able to reset all brush strokes for an image, you can reset the strokes made with any specific brush. Thus, you can reset the softening brush while not changing any edits made with the other brush tools.
The last editing mode is effects, which lets you apply a number of different effects and filters. There are six different preset effects that are displayed in a swatch book—artistic, vintage, aura, black and white, duotone, and warm and cool. Each effect has options, with artistic and vintage having different filters and the others having sliders to adjust the color or level of the filter. Some of the effects have vignetting (which can be adjusted with a pinch motion), while others have color and texture options like adding grain or a sepia tone. Effects is a fun one for the Instagram crowd, my thirteen year old brother particularly enjoyed playing with them.
The tools themselves are pretty decent in mobile use; all of the main features you would want in an editing program are there, and they're very simple to use. But iPhoto was unexpectedly slow on the new iPad—simple operations like filters and color editing feel a bit sluggish, with changes taking a beat to show up, while more complex operations with brushes feel like they take forever. Just entering brushing mode takes over 10 seconds, and the editing once you get there is far from smooth. If you've applied a brush tool and then want to add an effect, expect things to move at an agonizing pace.
Using iPhoto, it's easily possible to peg both cores at near-100% CPU utilization, particularly when applying a brush. This is one of the very few times I've felt the iPad is CPU-limited; a quad-core SoC would likely have made the iPhoto experience noticeably smoother and faster. iPhoto is available for the A5-based iPad 2 and iPhone 4S, as well as the A4-based iPhone 4. The original iPad is excluded from the list of supported devices, as is the 4th gen iPod touch, presumably due to concerns about system RAM (the iPad and iPod touch 4 shipped with 256MB of RAM versus the iPhone 4's 512MB), but even so, I can imagine iPhoto being terribly slow on the single-core iPhone 4.
But smoothness aside, iPhoto is a nice tool to have at your disposal. For basic edits it's definitely adequate, and it makes image post-processing much more approachable for beginners, both in terms of ease of use and cost—compared to what Lightroom or the various versions of Photoshop cost, $4.99 is almost a pittance. For serious photographers it's neither powerful enough nor fast enough for regular use, but it's an interesting tool for quickly creating previews in mobile situations. And for casual users it excels, delivering a lot of flexibility and a decent amount of editing power literally at one's fingertips.
Apple gave us a number of high-res photos to try out iPhoto with. We gave the originals along with a new iPad to a photographer and had her try her hand at editing on the iPad. The results of her work are below; hover over the links to see what type of editing you can do with iPhoto for iOS.
Sample 1 | Sample 2 | Sample 3
Before (original) | Before (original) | Before (original)
After (original) | After (original) | After (original)
Battery Life
For a company that has been so laser focused on reducing weight and device thickness, letting the new iPad grow in both of these dimensions was an unexpected move for Apple. From a technical standpoint, the tradeoff makes sense. The new Retina Display consumes significantly more power than its predecessor, as do the A5X SoC and MDM9600 baseband. Both of those ASICs are still built on 4x-nm LP processes and will surely increase power consumption over the iPad 2.
With more transistors switching on the same process node and a display (and backlight) driving more pixels at the same brightness, the battery either had to be larger or battery life would suffer. Apple understandably chose the former and the new iPad ships with a 42.5Wh battery—the largest we've ever seen used in an ARM tablet. The new iPad's battery is so large it's even bigger than what Apple uses in the 11-inch MacBook Air, and it's within striking distance of the 50Wh unit you'll find in the 13-inch model. I do believe this move says a lot about how Apple sees the iPad moving up in the world, but I'll get to a discussion about that later.
With a 70% larger battery than the iPad 2 but with more power hungry components inside, how does the new iPad fare in real world usage? Subjectively: it doesn't last as long as its predecessor. Objectively, our numbers seem to agree.
Our web browsing battery life test browses through dozens of web pages, pausing on each to simulate reading time, until the battery is depleted. All of our tests are run at the same brightness settings (200 nits) to ensure we are comparing apples to apples.
On WiFi we measured an 8% decrease in battery life compared to the iPad 2—nothing huge but not insignificant either. Fall off of WiFi and depend on LTE and you'll see around a 9% decrease in battery life, again—noticeable but not unusable.
I also threw in numbers from the Motorola Xyboard 10.1, an LTE enabled Android tablet running 3.2. Equipped with a much smaller battery (~26Wh), the Xyboard 10.1 delivered 7.31 hours in our LTE web browsing test. The new iPad managed to last 16% longer on a single charge—a smaller advantage than you'd expect given the 70% increase in battery capacity, showing just how much power the new Retina Display and its backlight consume.
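A rough back-of-the-envelope using those runtimes illustrates the point. Assuming the full rated capacity is usable and ignoring conversion losses (so treat these strictly as approximations):

$$P_{Xyboard} \approx \frac{26\ \mathrm{Wh}}{7.31\ \mathrm{h}} \approx 3.6\ \mathrm{W} \qquad P_{new\ iPad} \approx \frac{42.5\ \mathrm{Wh}}{7.31\ \mathrm{h} \times 1.16} \approx 5.0\ \mathrm{W}$$

Roughly a watt and a half of additional average draw during light web browsing over LTE, most of which is presumably going to the larger, denser panel and its backlight.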
The iPad is more than usable on long flights or throughout the day without being tethered to a wall outlet, but in practice you can expect a decrease in battery life compared to last year's model.
Keep in mind that these values are all at 200 nits (roughly 70% brightness on the iPad). If you use the iPad at max brightness (~400 nits) you'll see considerably lower numbers:
In our web browsing test, at max brightness, we saw 5 hours and 34 minutes of continuous use before the battery died. The iPad 2 incurs a similar penalty, lasting under 7 hours in the same test. Do keep this in mind if you need to get a lot of untethered use out of the new iPad. In order to come close to Apple's battery life estimates you'll have to be below 70% brightness.
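The same sort of estimate shows why max brightness hurts so much. Again assuming the full 42.5Wh is usable:

$$P_{avg,\ max\ brightness} \approx \frac{42.5\ \mathrm{Wh}}{5.57\ \mathrm{h}} \approx 7.6\ \mathrm{W}$$

That's a few watts above the roughly 5W average we estimated at 200 nits, with the backlight presumably responsible for most of the difference.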
Charging
Despite the significant increase in battery capacity, Apple continues to ship the new iPad with the same 10W USB power adapter as the previous two models. You can charge the iPad via a Mac/PC USB port that implements the USB charging spec; however, doing so will take a minor eternity to fully charge the tablet. Just as before, the new iPad will not charge off of a USB port while the tablet is awake; it will only charge when locked/asleep. The convenience of having a USB based charger is evident, but you'll want to stick with the 10W adapter to actually charge the iPad.
Charging the larger battery does take longer. If we measure from a completely dead state to when the iPad indicates that it's fully charged the increase in time is approximately 50%, from 4 hours with the iPad 2 to 6 hours with the new iPad. ASUS' Transformer Prime, by comparison, requires only 2.5 hours as it ships with an 18W charger. And no, you can't use ASUS' charger to speed up charge times on the new iPad—when connected, the TF Prime charger will only supply 9W to the iPad.
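The charge time itself is easy to sanity check. Even if the 10W adapter could deliver its full rating to the battery continuously, a 42.5Wh pack would need:

$$t_{min} \approx \frac{42.5\ \mathrm{Wh}}{10\ \mathrm{W}} \approx 4.3\ \mathrm{hours}$$

The observed 6 hours simply reflects conversion losses and the reduced charge rate as the battery approaches full.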
The story doesn't end there however. While the iPad 2 will draw 0W after its 4 hour charge cycle is complete, the new iPad will continue to draw around 3W after it claims to be fully charged. This continues for roughly another hour, at which point the power adapter will draw anywhere from 0.1W to 0.6W.
Note that when running at full brightness and under a heavy GPU load (e.g. Infinity Blade 2), the power adapter can't supply enough power to both keep the iPad charged and drive the display/internal components.
There's no good solution here other than for Apple to start shipping the iPad with a higher wattage power adapter. I do believe faster charge times are going to be necessary if Apple is keen on sticking with this larger battery, not to mention the usage issues of not being able to maintain charge equilibrium under load.
Thermals
The increase in power consumption of the new iPad also manifests in the form of increased heat production. A 163mm^2 SoC built on a 45nm LP process is a serious chip. Although it doesn't run hot enough to require active cooling, the SoC alone is responsible for a couple of watts of the iPad's TDP under heavy load. Combine that with a 45nm LTE modem and the heat put off by the more powerful backlight and you've got a recipe for a noticeably warmer device.
Does the new iPad get warmer than the previous one? Absolutely. I would even go as far as to say that it can get uncomfortably warm, but it never gets too hot to hold. If you've used any of the modern Mac notebooks, the new iPad is nowhere near as bad under load.
When holding the new iPad in portrait mode, with the home button at the bottom, the lower left corner of the device ends up being the warmest. Along the left edge of the iPad is where the logic board resides, and the lower half is home to the A5X SoC. Under load, particularly a heavy GPU load (e.g. playing a 3D game), this area is going to heat up quickly.
I took several measurements using a contactless IR thermometer in the same ambient conditions on a new iPad vs. the iPad 2. The results are below:
Thermal Comparison (Max Temperature)
Workload | iPad 2 | iPad (3rd generation)
Web Browsing (2 hours) | 32.7˚C | 37.6˚C
Infinity Blade 2 (1 hour) | 34.2˚C | 41.9˚C
Again, I don't believe this is a deal breaker but it's the obvious result of remaining on Samsung's 45nm LP process combined with a more power hungry display/backlight. I suspect there will be improvements in efficiency on the display side over time, but I can't see the Retina Display being any lower power than the iPad 2's 1024 x 768 screen. The real avenue for improvement will be when Apple shifts to 28/32nm silicon for the SoC and LTE modem. If you want a cooler running iPad, you'll have to wait until next year for that.
The A5X SoC
The ridiculousness of the new iPad begins at its heart: the A5X SoC.
The A5X breaks Apple's longstanding tradition of debuting its next smartphone SoC in the iPad first; this chip simply isn't headed for a phone. I say that with such certainty because the A5X is an absolute beast of an SoC. As implemented in the new iPad, the A5X under load consumes more power than an entire iPhone 4S.
In many ways the A5X is a very conservative design, while in others it absolutely pushes the limits of what had previously been done in a tablet. Like the A5 and A4 before it, the A5X is still built on Samsung's 45nm LP process. Speculation about a shift to 32nm, or even a move to TSMC, was rampant this time around. I'll admit I expected to see a move to 32nm for this chip, but Apple decided that 45nm was the way to go.
Why choose 45nm over smaller, cooler running options that are on the table today? Process maturity could be one reason. Samsung has yet to ship even its own SoC at 32nm, much less one for Apple. It's quite possible that Samsung's 32nm LP simply wasn't ready/mature enough for the sort of volumes Apple needed for an early 2012 iPad launch. The fact that there was no perceivable slip in the launch timeframe of the new iPad (roughly 12 months after its predecessor) does say something about how early 32nm readiness was communicated to Apple. Although speculation is quite rampant about Apple being upset enough with Samsung to want to leave for TSMC, the relationship on the foundry side appears to be good from a product delivery standpoint.
Another possibility is that 32nm was ready but Apple simply opted against using it. Companies arrive at different conclusions as to how aggressive they need to be on the process technology side. For example, ATI/AMD was typically more aggressive in adopting new process technologies, while NVIDIA preferred to make the transition once all of the kinks were worked out. It could be that Apple is taking a similar approach. Wafer costs generally go up at the start of a new process node; combine that with lower yields and strict design rules, and it's not a guarantee that you'd actually save any money by moving to a new process technology—at least not easily or initially. The associated risk of something going wrong might have been one Apple wasn't willing to accept.
CPU Specification Comparison
CPU | Manufacturing Process | Cores | Transistor Count | Die Size
Apple A5X | 45nm | 2 | ? | 163mm2
Apple A5 | 45nm | 2 | ? | 122mm2
Intel Sandy Bridge 4C | 32nm | 4 | 995M | 216mm2
Intel Sandy Bridge 2C (GT1) | 32nm | 2 | 504M | 131mm2
Intel Sandy Bridge 2C (GT2) | 32nm | 2 | 624M | 149mm2
NVIDIA Tegra 3 | 40nm | 4+1 | ? | ~80mm2
NVIDIA Tegra 2 | 40nm | 2 | ? | 49mm2
Whatever the reasoning, the outcome is significant: the A5X is approximately 2x the size of NVIDIA's Tegra 3, and even larger than a dual-core Sandy Bridge desktop CPU. Its floorplan is below:
Courtesy: Chipworks
From the perspective of the CPU, not much has changed with the A5X. Apple continues to use a pair of ARM Cortex A9 cores running at up to 1.0GHz, each with MPE/NEON support and a shared 1MB L2 cache. While it's technically possible for Apple to have ramped up CPU clocks in pursuit of higher performance (A9 designs have scaled up to 1.6GHz on 4x-nm processes), Apple has traditionally been very conservative on CPU clock frequency. Higher clocks on the same process node generally require higher voltages, and since dynamic power scales with the square of voltage, power consumption climbs much faster than performance does.
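To a first order, dynamic power scales linearly with frequency but with the square of voltage:

$$P_{dynamic} \approx \alpha \cdot C \cdot V^{2} \cdot f$$

where α is the activity factor, C the switched capacitance, V the supply voltage and f the clock frequency. Because hitting higher clocks on the same process node usually requires raising V as well, power climbs considerably faster than performance.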
ARM Cortex A9 Based SoC Comparison
 | Apple A5X | Apple A5 | TI OMAP 4 | NVIDIA Tegra 3
Manufacturing Process | 45nm LP | 45nm LP | 45nm LP | 40nm LPG
Clock Speed | Up to 1GHz | Up to 1GHz | Up to 1GHz | Up to 1.5GHz
Core Count | 2 | 2 | 2 | 4+1
L1 Cache Size | 32KB/32KB | 32KB/32KB | 32KB/32KB | 32KB/32KB
L2 Cache Size | 1MB | 1MB | 1MB | 1MB
Memory Interface to the CPU | Dual Channel LP-DDR2 | Dual Channel LP-DDR2 | Dual Channel LP-DDR2 | Single Channel LP-DDR2
NEON Support | Yes | Yes | Yes | Yes
With no change on the CPU side, CPU performance remains identical to the iPad 2. This means everything from web page loading to non-gaming app interactions are no faster than they were last year:
JavaScript performance remains unchanged, as you can see from both the BrowserMark and SunSpider results above. Despite the CPU clock disadvantage compared to Tegra 3, Apple does have the advantage of an extremely efficient and optimized software stack in iOS. Safari's JavaScript engine also recently went through an update, which is why we see competitive performance here.
Geekbench has been updated with Android support, so we're able to do some cross platform comparisons here. Geekbench is a suite composed of completely synthetic, low-level tests—many of which can execute entirely out of the CPU's L1/L2 caches.
Geekbench 2
Test | Apple iPad (3rd gen) | ASUS TF Prime | Apple iPad 2 | Motorola Xyboard 10.1
Integer Score | 688 | 1231 | 684 | 883
Blowfish ST | 13.2 MB/s | 23.3 MB/s | 13.2 MB/s | 17.6 MB/s
Blowfish MT | 26.3 MB/s | 60.4 MB/s | 26.0 MB/s | -
Text Compress ST | 1.52 MB/s | 1.58 MB/s | 1.51 MB/s | 1.63 MB/s
Text Compress MT | 2.85 MB/s | 3.30 MB/s | 2.83 MB/s | 2.93 MB/s
Text Decompress ST | 2.08 MB/s | 2.00 MB/s | 2.09 MB/s | 2.11 MB/s
Text Decompress MT | 3.20 MB/s | 3.09 MB/s | 3.27 MB/s | 2.78 MB/s
Image Compress ST | 4.09 Mpixels/s | 5.56 Mpixels/s | 4.08 Mpixels/s | 5.42 Mpixels/s
Image Compress MT | 8.12 Mpixels/s | 21.4 Mpixels/s | 7.98 Mpixels/s | 10.5 Mpixels/s
Image Decompress ST | 6.70 Mpixels/s | 9.37 Mpixels/s | 6.67 Mpixels/s | 9.18 Mpixels/s
Image Decompress MT | 13.2 Mpixels/s | 20.3 Mpixels/s | 13.0 Mpixels/s | 17.9 Mpixels/s
Lua ST | 257.2 Knodes/s | 417.9 Knodes/s | 257.0 Knodes/s | 406.9 Knodes/s
Lua MT | 512.3 Knodes/s | 1500 Knodes/s | 505.6 Knodes/s | 810.0 Knodes/s
FP Score | 920 | 2223 | 915 | 1514
Mandelbrot ST | 279.5 MFLOPS | 334.8 MFLOPS | 279.0 MFLOPS | 328.9 MFLOPS
Mandelbrot MT | 557.0 MFLOPS | 1290 MFLOPS | 550.3 MFLOPS | 648.0 MFLOPS
Dot Product ST | 221.9 MFLOPS | 477.5 MFLOPS | 221.5 MFLOPS | 455.2 MFLOPS
Dot Product MT | 438.9 MFLOPS | 1850 MFLOPS | 439.4 MFLOPS | 907.4 MFLOPS
LU Decomposition ST | 217.5 MFLOPS | 171.4 MFLOPS | 214.6 MFLOPS | 177.9 MFLOPS
LU Decomposition MT | 434.2 MFLOPS | 333.9 MFLOPS | 437.4 MFLOPS | 354.1 MFLOPS
Primality ST | 177.3 MFLOPS | 175.6 MFLOPS | 178.0 MFLOPS | 172.9 MFLOPS
Primality MT | 321.5 MFLOPS | 273.2 MFLOPS | 316.9 MFLOPS | 220.7 MFLOPS
Sharpen Image ST | 1.68 Mpixels/s | 3.87 Mpixels/s | 1.68 Mpixels/s | 3.86 Mpixels/s
Sharpen Image MT | 3.35 Mpixels/s | 9.85 Mpixels/s | 3.32 Mpixels/s | 7.52 Mpixels/s
Blur Image ST | 666.0 Kpixels/s | 1.62 Mpixels/s | 664.8 Kpixels/s | 1.58 Mpixels/s
Blur Image MT | 1.32 Mpixels/s | 6.25 Mpixels/s | 1.31 Mpixels/s | 3.06 Mpixels/s
Memory Score | 821 | 1079 | 829 | 1122
Read Sequential ST | 312.0 MB/s | 249.0 MB/s | 347.1 MB/s | 364.1 MB/s
Write Sequential ST | 988.6 MB/s | 1.33 GB/s | 989.6 MB/s | 1.32 GB/s
Stdlib Allocate ST | 1.95 Mallocs/sec | 2.25 Mallocs/sec | 1.95 Mallocs/sec | 2.2 Mallocs/sec
Stdlib Write | 2.90 GB/s | 1.82 GB/s | 2.90 GB/s | 1.97 GB/s
Stdlib Copy | 554.6 MB/s | 1.82 GB/s | 564.5 MB/s | 1.91 GB/s
Stream Score | 331 | 288 | 335 | 318
Stream Copy | 456.4 MB/s | 386.1 MB/s | 466.6 MB/s | 504 MB/s
Stream Scale | 380.2 MB/s | 351.9 MB/s | 371.1 MB/s | 478.5 MB/s
Stream Add | 608.8 MB/s | 446.8 MB/s | 654.0 MB/s | 420.1 MB/s
Stream Triad | 457.7 MB/s | 463.7 MB/s | 437.1 MB/s | 402.8 MB/s
Almost across the board, NVIDIA delivers better CPU performance, either as a result of having more cores, having higher clocked cores, or due to an inherent low-level Android advantage. Prioritizing GPU performance over a CPU upgrade is nothing new for Apple, and in the case of the A5X Apple could really only have one or the other: the new iPad gets hot enough and draws enough power as it is, and Apple didn't need an even more power hungry set of CPU cores to make matters worse.
Despite the stagnation on the CPU side, most users would be hard pressed to call the iPad slow. Apple does a great job of prioritizing the responsiveness of the UI thread, and the entire iOS UI is GPU accelerated, resulting in a very smooth overall experience. There's definitely a need for faster CPUs to enable more interesting applications and usage models; I suspect Apple will fulfill that need with the A6 in the 4th generation iPad next year. That being said, in most applications I don't believe the iPad feels slow today.
I mention most applications because there are some iOS apps that are already pushing the limits of what's possible today.
iPhoto: A Case Study in Why More CPU Performance is Important
In our section on iPhoto we mentioned just how frustratingly slow the app can be when attempting to use many of its editing tools. In profiling the app it becomes abundantly clear why it's slow. Despite iPhoto being largely visual, it's extremely CPU bound. For whatever reason, simply having iPhoto open is enough to eat up an entire CPU core.
Use virtually any of the editing tools and you'll see 50—95% utilization of the remaining, unused core. The screenshot below is what I saw during use of the saturation brush:
The problem is that not only are the two A9s not fast enough to deal with the needs of iPhoto, but anything that needs to get done in the background while you're using iPhoto is going to suffer as well. This is most obvious in how long it takes for UI elements within iPhoto to respond while you're editing. It's very rare that we see an application behave like this on iOS (even Infinity Blade only uses a single core most of the time), but iPhoto is a real exception.
I have to admit, I owe NVIDIA an apology here. While I still believe that quad-cores are mostly unnecessary for current smartphone/tablet workloads, iPhoto is a very tangible example of where Apple could have benefitted from having four CPU cores on A5X. Even an increase in CPU frequency would have helped. In this case, Apple had much bigger fish to fry: figuring out how to drive all 3.1M pixels on the Retina Display.
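To see why more cores (or higher clocks) map so directly onto this kind of workload, remember that a brush effect is essentially a per-pixel transform over a few million pixels, and that work splits cleanly across CPU cores. The sketch below is purely illustrative, not a claim about how iPhoto is implemented: it applies a simple saturation boost to row bands of an image in parallel worker processes, and its throughput scales roughly with the number of cores you give it.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def saturate_band(band, amount=1.3):
    """Boost saturation for a horizontal strip of the image (HxWx3, float64)."""
    gray = band.mean(axis=-1, keepdims=True)            # crude per-pixel luminance
    return np.clip(gray + (band - gray) * amount, 0, 255)

def saturate_parallel(img, workers=4):
    """Split the image into row bands and process each band on a separate core."""
    bands = np.array_split(img.astype(np.float64), workers, axis=0)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        done = list(pool.map(saturate_band, bands))
    return np.vstack(done).astype(np.uint8)

if __name__ == "__main__":
    test = (np.random.rand(2048, 1536, 3) * 255).astype(np.uint8)  # Retina-sized frame
    out = saturate_parallel(test, workers=4)
    print(out.shape)
```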
The GPU
3D rendering is a massively parallel problem. Your GPU ultimately has to determine the color value of every pixel on screen, dozens of times per second, and those values rarely remain constant from frame to frame. The iPad 2 had 786,432 pixels in its display, and by all available measures its GPU was more than sufficient to drive that resolution. The new iPad has 3.14 million pixels to drive. The iPad 2's GPU would not be sufficient.
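The raw numbers frame the problem nicely:

$$2048 \times 1536 = 3{,}145{,}728 \ \text{pixels} \qquad 1024 \times 768 = 786{,}432 \ \text{pixels}$$

$$3{,}145{,}728 \times 60\ \mathrm{fps} \approx 189\ \mathrm{Mpixels/s}$$

Four times the pixels, and on the order of 189 million pixel shades per second just to sustain 60 fps at native resolution, before accounting for any overdraw.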
When we first heard Apple use the term A5X to refer to the new iPad's SoC, I assumed we were looking at a die shrunk, higher clocked version of the A5. As soon as it became evident that Apple remained on Samsung's 45nm LP process, higher clocks were out of the question. The only room for improving performance was to go wider. Thankfully, as 3D rendering is a massively parallel problem, simply adding more GPU execution resources tends to be a great way of dealing with a more complex workload. The iPad 2 shocked the world with its dual-core PowerVR SGX 543MP2 GPU, and the 3rd generation iPad doubled the amount of execution hardware with its quad-core PowerVR SGX 543MP4.
Mobile SoC GPU Comparison
 | Adreno 225 | PowerVR SGX 540 | PowerVR SGX 543MP2 | PowerVR SGX 543MP4 | Mali-400 MP4 | Tegra 2 | Tegra 3
SIMD Name | - | USSE | USSE2 | USSE2 | Core | Core | Core
# of SIMDs | 8 | 4 | 8 | 16 | 4 + 1 | 8 | 12
MADs per SIMD | 4 | 2 | 4 | 4 | 4 / 2 | 1 | 1
Total MADs | 32 | 8 | 32 | 64 | 18 | 8 | 12
GFLOPS @ 200MHz | 12.8 GFLOPS | 3.2 GFLOPS | 12.8 GFLOPS | 25.6 GFLOPS | 7.2 GFLOPS | 3.2 GFLOPS | 4.8 GFLOPS
GFLOPS @ 300MHz | 19.2 GFLOPS | 4.8 GFLOPS | 19.2 GFLOPS | 38.4 GFLOPS | 10.8 GFLOPS | 4.8 GFLOPS | 7.2 GFLOPS
GFLOPS As Shipped by Apple/ASUS | - | - | 16 GFLOPS | 32 GFLOPS | - | - | 12 GFLOPS
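The GFLOPS rows in the table follow directly from the MAD counts: each MAD (multiply-add) counts as two floating point operations per clock, so peak throughput is simply total MADs × 2 × clock. If the as-shipped figure for the 543MP4 is accurate, it also implies the GPU clock Apple is running in the new iPad:

$$64 \times 2 \times 0.2\ \mathrm{GHz} = 25.6\ \mathrm{GFLOPS} \qquad \frac{32\ \mathrm{GFLOPS}}{64 \times 2} = 0.25\ \mathrm{GHz}$$

In other words, a roughly 250MHz GPU clock, the same clock implied by the iPad 2's 16 GFLOPS figure for the 543MP2.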
We see this approach all of the time in desktop and notebook GPUs. To allow games to run at higher resolutions, companies like AMD and NVIDIA simply build bigger GPUs. These bigger GPUs have more execution resources and typically more memory bandwidth, which allows them to handle rendering to higher resolution displays.
Apple acted no differently than a GPU company would in this case. When faced with the challenge of rendering to a 3.14MP display, Apple increased compute horsepower and memory bandwidth. What's surprising about Apple's move is that the A5X isn't a $600 desktop GPU; it's a sub-4W mobile SoC. And did I mention that Apple isn't a GPU company?
That's quite possibly the most impressive part of all of this. Apple isn't a GPU company. It's a customer of GPU companies like AMD and NVIDIA, yet Apple has done what even NVIDIA would not do: commit to building an SoC with an insanely powerful GPU.
I whipped up an image to help illustrate. Below is a to-scale representation of Apple and NVIDIA SoCs, their die sizes, and the time of first product introduction:
If we look back to NVIDIA's Tegra 2, it wasn't a bad SoC—it was basically identical in size to Apple's A4. The problem was that the Tegra 2 made its debut a full year after Apple's A4 did. The more appropriate comparison would be between the Tegra 2 and the A5, both of which were in products in the first half of 2011. Apple's A5 was nearly 2.5x the size of NVIDIA's Tegra 2. A good hunk of that added die area came from the A5's GPU. Tegra 3 took a step in the right direction but once again, at 80mm^2 the A5 was still over 50% larger.
The A5X obviously dwarfs everything, at around twice the size of NVIDIA's Tegra 3 and 33.6% larger than Apple's A5. With silicon, size isn't everything, but when we're talking about similar architectures on similar manufacturing processes, size does matter. Apple has been consistently outspending NVIDIA when it comes to silicon area, resulting in a raw horsepower advantage, which in turn results in better peak GPU performance.
Apple Builds a Quad-Channel (128-bit) Memory Controller
There's another side effect that you get by having a huge die: room for wide memory interfaces. Silicon layout is a balancing act. You want density to lower costs, but you don't want hotspots so you need heavy compute logic to be spread out. You want wide IO interfaces but you don't want them to be too wide because then you'll cause your die area to balloon as a result. There's only so much room on the perimeter of your SoC to get data out of the chip, hence the close relationship between die size and interface width.
Most mobile SoCs are equipped with either a single or dual-channel LP-DDR2 memory controller. Unlike in the desktop/notebook space where a single DDR2/DDR3 channel refers to a 64-bit wide interface, in the mobile SoC world a single channel is 32-bits wide. Both Qualcomm and NVIDIA use single-channel interfaces, with Snapdragon S4 finally making the jump to dual-channel this year. Apple, Samsung, and TI have used dual-channel LP-DDR2 interfaces instead.
With the A5X Apple did the unthinkable and outfitted the chip with four 32-bit wide LP-DDR2 memory controllers. The confirmation comes from two separate sources. First we have the annotated A5X floorplan courtesy of UBM TechInsights:
You can see the four DDR interfaces around the lower edge of the SoC. Secondly, we have the part numbers of the discrete DRAM devices on the opposite side of the motherboard. Chipworks and iFixit played the DRAM lottery and won samples with Samsung and Elpida LP-DDR2 devices on board, respectively. While both Samsung and Elpida do a bad job of keeping their public part number decoders up to date, both strings match up very closely to 216-ball, 2x32-bit PoP DRAM devices. The part numbers don't match up exactly, but they are close enough that I believe we're simply looking at a discrete flavor of those PoP DRAM devices.
K3PE4E400M-XG is the Samsung part number for a 2x32-bit LPDDR2 device, while K3PE4E400E-XG is the part used in the iPad; the only difference between the two strings is a single character (M vs. E).
A cross reference with JEDEC's LP-DDR2 spec tells us that there is an official spec for a single package, 216-ball dual-channel (2x32-bit) LP-DDR2 device, likely what's used here on the new iPad.
The ball out for a 216-ball, single-package, dual-channel (64-bit) LPDDR2 DRAM
This gives the A5X a 128-bit wide memory interface, double what the closest competition can muster and putting it on par with what we've come to expect from modern x86 CPUs and mainstream GPUs.
The Geekbench memory tests show no improvement in bandwidth, which simply tells us that the interface from the CPU cores to the memory controller hasn't seen a similar increase in width.
Memory Bandwidth Comparison—Geekbench 2
Test | Apple iPad (3rd gen) | ASUS TF Prime | Apple iPad 2 | Motorola Xyboard 10.1
Overall Memory Score | 821 | 1079 | 829 | 1122
Read Sequential | 312.0 MB/s | 249.0 MB/s | 347.1 MB/s | 364.1 MB/s
Write Sequential | 988.6 MB/s | 1.33 GB/s | 989.6 MB/s | 1.32 GB/s
Stdlib Allocate | 1.95 Mallocs/sec | 2.25 Mallocs/sec | 1.95 Mallocs/sec | 2.2 Mallocs/sec
Stdlib Write | 2.90 GB/s | 1.82 GB/s | 2.90 GB/s | 1.97 GB/s
Stdlib Copy | 554.6 MB/s | 1.82 GB/s | 564.5 MB/s | 1.91 GB/s
Overall Stream Score | 331 | 288 | 335 | 318
Stream Copy | 456.4 MB/s | 386.1 MB/s | 466.6 MB/s | 504 MB/s
Stream Scale | 380.2 MB/s | 351.9 MB/s | 371.1 MB/s | 478.5 MB/s
Stream Add | 608.8 MB/s | 446.8 MB/s | 654.0 MB/s | 420.1 MB/s
Stream Triad | 457.7 MB/s | 463.7 MB/s | 437.1 MB/s | 402.8 MB/s
Although Apple designed its own memory controller in the A5X, you can see that all of these A9 based SoCs deliver roughly similar memory performance. The numbers we're showing here aren't very good at all. Even though Geekbench has never been good at demonstrating peak memory controller efficiency to begin with, the Stream numbers are very bad. ARM's L2 cache controller is very limiting in the A9, something that should be addressed by the time the A15 rolls around.
Firing up the memory interface is a very costly action from a power standpoint, so it makes sense that Apple would only want to do so when absolutely necessary. Furthermore, notice how the memory interface moved from being closer to the CPU in A4/A5 to being adjacent to the GPU in the A5X. It would appear that only the GPU has access to all four channels.
A Word on Packaging
Unlike the first two iPads, the 3rd generation iPad abandons the high density flip-chip PoP SoC/DRAM stack and uses a discrete, flip-chip BGA package for the SoC and two discrete BGA packages for the DRAMs.
If you think of SoC silicon as a stack, the lowest layer is where you'll find the actual transistor logic, while the layers of metal above it connect everything together. In the old days, the silicon stack would sit just as I've described it—logic at the bottom, metal layers on top. Pads around the perimeter of the top of the silicon would connect to very thin wires, which would then route to the package substrate and eventually out to balls or pins on the underside of the package. These wire bonded packages, as they were called, had lower limits on how many pins could connect to the chip.
There are also cooling concerns. In a traditional wire bonded package, your cooling solution ultimately rests on a piece of your packaging substrate. The actual silicon itself isn't exposed.
As its name implies, a flip-chip package is literally the inverse of this. Instead of the metal layers being at the top of the stack, before packaging the silicon is inverted and the metal layers are at the bottom of the stack. Solder bumps at the top of the silicon stack (now flipped and at the bottom) connect the topmost metal layer to the package itself. Since we're dealing with solder bumps on the silicon itself rather than wires routed to the edge of the silicon, there's much more surface area for signals to get in/out of the silicon.
Since the chip is flipped, the active logic is now exposed in a flip-chip package and the hottest part of the silicon can be directly attached to a cooling solution.
An example of a PoP stack
To save on PCB real estate however, many SoC vendors would take a flip-chip SoC and stack DRAM on top of it in a package-on-package (PoP) configuration. Ultimately this re-introduces many of the problems from older packaging techniques—mainly it becomes difficult to have super wide memory interfaces as your ball-out for the PoP stack is limited to the area around your die, and cooling is a concern once more. For low power, low bandwidth mobile SoCs this hasn't really been a problem, which is why we see PoP stacks deployed all over the place.
Take a look at the A5, a traditional FC-BGA SoC with PoP DRAM vs. the A5X (this isn't to scale):
Images courtesy iFixit
The A5X in this case is an FC-BGA SoC but without any DRAM stacked on top of it. Instead, the A5X is covered in a thermally conductive paste and then a metallic heatspreader to conduct heat away from the SoC and protect the silicon.
Given the size and complexity of the A5X SoC, it's no surprise that Apple didn't want to insulate the silicon with a stack of DRAM on top of it. In typical package-on-package stacks, you'd see solder bumps around the silicon, on the package itself, that a separate DRAM package would adhere to. Instead of building up a PoP stack here, Apple simply located its two 64-bit DRAM devices on the opposite side of the iPad's logic board and routed the four 32-bit LP-DDR2 memory channels through the PCB layers.
iPad (3rd gen) logic board back (top) and front (bottom), courtesy iFixit
If I'm seeing this correctly, it looks like the DRAM devices are shifted lower than the center point of the A5X. Routing high speed parallel interfaces isn't easy and getting the DRAM as close to the memory controller as possible makes a lot of sense. For years motherboard manufacturers and chipset vendors alike complained about the difficulties of routing a high-speed, 128-bit parallel DRAM interface on a (huge, by comparison) ATX motherboard. What Apple and its partners have achieved here is impressive when you consider that this type of interface only made it to PCs within the past decade.
Looking Forward: 12.8GB/s, the Magical Number
The DRAM speeds in the new iPad haven't changed. The -8D in the Elpida DRAM string tells us this memory is rated at the same 800MHz datarate as what's used in the iPhone 4S and iPad 2. With twice the number of channels to transfer data over however, the total available bandwidth (at least to the GPU) doubles. I brought back the graph I made for our iPhone 4S review to show just how things have improved:
The A5X's memory interface is capable of sending/receiving data at up to 12.8GB/s. While this is still nowhere near the 100GB/s+ we'd need for desktop quality graphics at Retina Display resolutions, it's absolutely insane for a mobile SoC. Bandwidth utilization is another story entirely—we have no idea how efficient Apple's memory controller is (it is designed in-house), but there's 4x the theoretical bandwidth available to the A5X as there is to NVIDIA's Tegra 3.
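The peak figure is just interface width multiplied by data rate:

$$\frac{4 \times 32\ \mathrm{bits} \times 800\ \mathrm{MT/s}}{8\ \mathrm{bits/byte}} = 12.8\ \mathrm{GB/s}$$

A single 32-bit LP-DDR2 channel at the same data rate tops out at 3.2GB/s, which is where the 4x gap in theoretical bandwidth versus Tegra 3 comes from.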
There's a ton of memory bandwidth here, but Apple got to this point by building a huge, very power hungry SoC. Too power hungry for use in a smartphone. As I mentioned at the start of this article, the SoC alone in the new iPad can consume more power than the entire iPhone 4S (e.g. A5X running Infinity Blade 2 vs. iPhone 4S loading a web page):
Power Consumption Comparison
 | Apple A5X (SoC + mem interface) | Apple iPhone 4S (entire device)
Estimated Power Consumption | 2.6W (Infinity Blade 2) | 1.6W (web page loading)
There's no question that we need this much (and more) memory bandwidth, but the A5X's route to delivering it is too costly from a standpoint of power. There is a solution to this problem however: Wide IO DRAM.
Instead of using wires to connect DRAM to solder balls on a package that's then stacked on top of your SoC package, Wide IO DRAM uses through-silicon-vias (TSVs) to connect a DRAM die directly to the SoC die. It's an even more costly packaging technique, but the benefits are huge.
Just as we saw in our discussion of flip-chip vs. wire bonded packages, conventional PoP solutions have limits to how many IO pins you can have in the stack. If you can use the entire silicon surface for direct IO however, you can build some very wide interfaces. It also turns out that these through silicon interfaces are extremely power efficient.
The first Wide IO DRAM spec calls for a 512-bit, 200MHz SDR (single data rate) interface delivering an aggregate of 12.8GB/s of bandwidth. The bandwidth comes at much lower power consumption, while delivering all of the integration benefits of a traditional PoP stack. There are still cooling concerns, but for lower wattage chips they are less worrisome.
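The spec works out to exactly the same peak bandwidth the A5X gets from its four LP-DDR2 channels:

$$\frac{512\ \mathrm{bits} \times 200\ \mathrm{MT/s}}{8\ \mathrm{bits/byte}} = 12.8\ \mathrm{GB/s}$$

Same 12.8GB/s, but delivered over a short, very wide, slow-clocked interface rather than four comparatively high speed channels routed through the package and PCB, which is where the power savings come from.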
Intel originally predicted that by 2015 we'd see 3D die stacking using through-silicon-vias. Qualcomm's roadmaps project usage of TSVs by 2015 as well. The iPhone won't need this much bandwidth in its next generation thanks to a lower resolution display, but when the time comes, there will be a much lower power solution available thanks to Wide IO DRAM.
Oh and 2015 appears to be a very conservative estimate. I'm expecting to see the first Wide IO memory controllers implemented long before then...
The Impact of Larger Memory
Apple doubled memory capacity on the new iPad to 1GB, marking the first time in recent history that Apple's flagship product offers a similar amount of memory to the current crop of high-end Android devices. Apple's iOS can do a relatively good job with limited system memory as it will conservatively unload applications from memory in the event that it needs to free up more space. iOS does not support paging to flash, making DRAM size a hard limit for developers looking to really push the platform.
Apple has always been conservative on DRAM sizing because it's a great way of reducing the BOM (Bill of Materials) cost. If Apple can make up for having less DRAM by being more aggressive in software (read: kicking apps out of memory), it's a tradeoff that makes sense. It's really Apple's foray into gaming that has added pressure to increase memory sizes.
With the move to the Retina Display, the amount of memory needed to store a single frame increases by 4x—from 3MB to 12MB. Assuming two buffered frames you're looking at 24MB of RAM just to smoothly display what you're seeing on the screen.
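Assuming 32 bits (4 bytes) per pixel, the math checks out:

$$1024 \times 768 \times 4\ \mathrm{B} = 3\ \mathrm{MB} \qquad 2048 \times 1536 \times 4\ \mathrm{B} = 12\ \mathrm{MB}$$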
The bigger problem isn't the frame buffer, but rather all of the other data you need (e.g. level data, textures, etc...). The higher the screen resolution, the more important it is to have higher quality assets in your game. Texture compression can go a long way, but at some point there's simply more data to deal with as game complexity increases. It's not just about the increase in resolution either. As GPU horsepower increases, so will the complexity of what game developers try to build.
While the frame buffer size increased thanks to the Retina Display, total system memory increased by a much larger amount. With 1GB of memory, game developers are now less constrained.
A more immediate benefit is that apps and web pages will remain resident in memory longer as you open and switch to other apps. For example, on the iPad 2 if I open four tabs in Safari (AT, Engadget, Reddit, and Tech Report), open iPhoto, then run Infinity Blade 2 and GTA 3, switching between the latter two will always require a full game reload (as in you see the intro and everything before you pick up where you left off). On the new iPad, with the same setup, I can switch between Infinity Blade 2 and GTA 3 and automatically resume where I last left off thanks to the extra DRAM. You can still create a scenario where even 1GB isn't enough; it's just that the limit is now higher than it was on the iPad 2.
GPU Performance
All of our discussions around the new iPad and its silicon thus far have been in the theoretical space. Unfortunately the state of Android/iOS benchmarking is abysmal at best today. Convincing game developers to include useful benchmarks and timedemo modes in their games is seemingly impossible without a suitably large check. I have no doubt this will happen eventually, but today we're left with some great games and no way to benchmark them.
Without suitable game benchmarks, we rely on GLBenchmark quite a bit to help us in evaluating mobile GPU performance. Although even the current most stressful GLBenchmark test (Egypt) is a far cry from what modern Android/iOS games look like, it's the best we've got today.
We'll start out with the synthetic tests, which should show us roughly a 2x increase in performance compared to the iPad 2. Remember the PowerVR SGX 543MP4 simply bundles four SGX 543 cores instead of two. Since we're still on a 45nm LP process, GPU clocks haven't increased so we're looking at a pure doubling of virtually all GPU resources.
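A rough first-order estimate also explains the pattern in the native resolution results below: the GPU doubled, but the pixel count quadrupled.

$$\frac{2\times\ \text{GPU resources}}{4\times\ \text{pixels}} = 0.5\times\ \text{per-pixel shading budget relative to the iPad 2}$$

That's why fragment-heavy tests at 2048 x 1536 can land below the iPad 2's numbers at 1024 x 768.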
Indeed we see a roughly 2x increase in triangle and fill rates. Below we have the output from GLBenchmark's low level tests. Pay particular attention to how, at 1024 x 768, performance doubles compared to the iPad 2 but at 2048 x 1536 performance can drop to well below what the iPad 2 was able to deliver at 10 x 7. It's because of this drop in performance at the iPad's native resolution that we won't see many (if any at all), visually taxing games run at anywhere near 2048 x 1536.
GLBenchmark 2.1.3 Low Level Comparison
Test | iPad 2 (10x7) | iPad 3 (10x7) | iPad 3 (20x15) | ASUS TF Prime
Trigonometric test—vertex weighted | 35 fps | 60 fps | 57 fps | 47 fps
Trigonometric test—fragment weighted | 7 fps | 14 fps | 4 fps | 20 fps
Trigonometric test—balanced | 5 fps | 10 fps | 2 fps | 9 fps
Exponential test—vertex weighted | 59 fps | 60 fps | 60 fps | 41 fps
Exponential test—fragment weighted | 25 fps | 49 fps | 13 fps | 18 fps
Exponential test—balanced | 19 fps | 37 fps | 8 fps | 7 fps
Common test—vertex weighted | 49 fps | 60 fps | 60 fps | 35 fps
Common test—fragment weighted | 8 fps | 16 fps | 4 fps | 28 fps
Common test—balanced | 6 fps | 13 fps | 2 fps | 12 fps
Geometric test—vertex weighted | 57 fps | 60 fps | 60 fps | 27 fps
Geometric test—fragment weighted | 12 fps | 24 fps | 6 fps | 20 fps
Geometric test—balanced | 9 fps | 18 fps | 4 fps | 9 fps
For loop test—vertex weighted | 59 fps | 60 fps | 60 fps | 28 fps
For loop test—fragment weighted | 30 fps | 57 fps | 16 fps | 42 fps
For loop test—balanced | 22 fps | 43 fps | 11 fps | 15 fps
Branching test—vertex weighted | 58 fps | 60 fps | 60 fps | 45 fps
Branching test—fragment weighted | 58 fps | 60 fps | 30 fps | 46 fps
Branching test—balanced | 22 fps | 43 fps | 16 fps | 16 fps
Array test—uniform array access | 59 fps | 60 fps | 60 fps | 60 fps
Fill test—Texture Fetch | 1,001,483,136 texels/s | 1,977,874,688 texels/s | 1,904,501,632 texels/s | 415,164,192 texels/s
Triangle test—white | 65,039,568 triangles/s | 133,523,176 triangles/s | 85,110,008 triangles/s | 55,729,532 triangles/s
Triangle test—textured | 56,129,984 triangles/s | 116,735,856 triangles/s | 71,362,616 triangles/s | 54,023,840 triangles/s
Triangle test—textured, vertex lit | 45,314,484 triangles/s | 93,638,456 triangles/s | 46,841,924 triangles/s | 28,916,834 triangles/s
Triangle test—textured, fragment lit | 43,527,292 triangles/s | 92,831,152 triangles/s | 39,277,916 triangles/s | 26,935,792 triangles/s
GLBenchmark also includes two tests designed to be representative of a workload you could see in an actual 3D game. The older Pro test uses OpenGL ES 1.0 while Egypt is an ES 2.0 test. These tests can either run at the device's native resolution with vsync enabled, or rendered offscreen at 1280 x 720 with vsync disabled. The latter offers us a way to compare GPUs without device screen resolution creating unfair advantages.
Unfortunately there was a bug in the iOS version of GLBenchmark 2.1.2 that resulted in all on-screen benchmarks running at 1024 x 768 rather than the new iPad's native 2048 x 1536 resolution. This is why all of the native GLBenchmark scores from the new iPad are capped at 60 fps. It's not because the new GPU is fast enough to render at speeds above 60 fps at 2048 x 1536, it's because the benchmark is actually showing performance at 1024 x 768. Luckily, GLBenchmark 2.1.3 fixes this problem and delivers results at the new iPad's native screen resolution:
Surprisingly enough, the A5X is actually fast enough to complete these tests at over 50 fps. Perhaps this is more of an indication of how light the Egypt workload has become, as the current crop of Retina Display enhanced 3D titles for the iPad all render offscreen to a non-native resolution due to performance constraints. The bigger takeaway is that with the 543MP4 and a quad-channel LP-DDR2 interface, it is possible to run a 3D game at 2048 x 1536 and deliver playable frame rates. It won't be the prettiest game around, but it's definitely possible.
The offscreen results give us the competitive analysis that we've been looking for. With a ~2x die size advantage, the fact that we're seeing a 2-3x gap in performance here vs. NVIDIA's Tegra 3 isn't surprising:
The bigger worry is what happens when the first 1920 x 1200 enabled Tegra 3 tablets start shipping. With (presumably) no additional GPU horsepower or memory bandwidth under the hood, we'll see this gap widen.
A5X vs. Tegra 3 in the Real World
Even with the inclusion of GLBenchmark data, we're still arguing over theoretical advantages. Any 3D game developed for Android or iOS is going to target 30 or 60 fps and try its best to stay there. Similar to a console, on an Android or iOS tablet there's no disabling vsync and there's (almost) no tinkering with image quality settings to impact performance. The trick on Android is really ensuring that experience across all available SoCs, but if we cull the list down to the best of the best, chances are you'll have a good experience on both sides of the fence.
NVIDIA's Tegra 3 is at the heart of our current favorite in the Android tablet space: ASUS' Transformer Prime. There also happen to be a handful of games available on both Android and iOS, so do we see the same ~2x performance advantage from Egypt when we compare those titles on the new iPad and ASUS' TF Prime?
NVIDIA makes two counter arguments against Apple's claim that the A5X delivers superior gaming performance. The first is that despite any theoretical performance advantages the A5X may hold, they don't manifest in games today. The second NVIDIA argument is that via Tegra Zone, Android titles can look better than their iOS counterpart. Both of these are potentially valid claims, but let's test them.
Shadowgun is an NVIDIA favorite. It's a first person shooter that's available via NVIDIA's Tegra Zone app. The Tegra specific version offers support for gamepads and stereoscopic 3D with enhanced graphics specifically for Tegra 2 and Tegra 3. It works for our little experiment here because it's also available on iOS.
Shadowgun also ends up being a great example of what the Android/iOS divide looks like for many Tegra Zone games. This particular title appears to still render at 1024 x 768 on the new iPad; we simply get an upscaled image rather than a higher resolution one. In turn that means more aliasing and a less sharp image compared to what we get on ASUS' Transformer Prime, where the game runs at 1280 x 800.
Apple's A5X delivers an extremely smooth frame rate in Shadowgun. Although there's no built in timedemo, frame counter or benchmark functionality, the game runs subjectively smoother on iOS compared to on the TF Prime running Ice Cream Sandwich. Although the frame rate is higher on the iPad, I wouldn't consider it unacceptably low on the TF Prime—both are definitely playable.
Where the Tegra Zone version of the game has an advantage is in its visuals. The NVIDIA enhanced version uses higher resolution textures and features what appears to be CPU accelerated cloth physics in objects that simply don't exist in the iOS version. Some scenes also include other additional details (e.g. water on the floor) that aren't present in the iOS version. None of these additions fundamentally change the gameplay at all, but they do make for a better looking game.
Is it physically possible to have the same experience on iOS? Quite possibly. What we're seeing here is a mobile version of what NVIDIA has done in the PC industry for years. NVIDIA lends its support to smaller Android developers, their games are made prettier on NVIDIA hardware, and in turn NVIDIA helps promote those games via Tegra Zone and other channels. Obviously Apple could do the same, but thus far it hasn't needed to. Apple instead prefers giving its partners what someone very smart once referred to as most favored nation status. As an MFN, these game developers get additional exposure in the App Store and elsewhere in Apple's promotions. It's very similar to what NVIDIA is doing, except on a much larger scale and without the iOS specific visual enhancements.
The comparison becomes even more complicated when you take into account that the iPad's Retina Display looks better than the panel on the TF Prime. It's not really the higher resolution but rather the improved color reproduction on the iPad.
Riptide GP is another example of a Tegra Zone title available on both iOS and Android, although here the Tegra optimizations are less impressive while the Retina Display's advantages are more pronounced. There's no perceivable difference in frame rate here either, making it a much closer call:
Grand Theft Auto 3 was recently ported to both Android and iOS, and with this title we find ourselves in a unique position: the Android version offers customizable visual quality. The iPad version of the title renders at 1024 x 768 regardless of hardware generation, and performance is understandably smooth. Visual quality is configurable only on the Android version, with draw distance, screen resolution, and effects quality controls:
These sliders/options have different defaults depending on what SoC they are running on. The Transformer Prime is capable of running GTA3 at the highest quality settings with a tangible but livable drop in frame rate. The end result is a significantly better looking game on Android, although to be honest it's something that you really only notice if you are doing a side by side comparison. If all you have is an iPad or TF Prime, you'd likely just grow used to whatever platform you had.
The TF Prime experience doesn't map as well to other tablets, unfortunately. Play GTA 3 on a Xyboard 10.1 (OMAP 4430), even at its default settings of 50% draw distance, 60% resolution, and low effects quality, and you'll see unusual stuttering during gameplay. The frame rate is otherwise smooth, but the periods of stutter significantly impact the overall experience.
Unfortunately the Android vs. iOS gaming comparison isn't always this easy. While some apps won't run on older Apple hardware, there are only three generations of iPad to worry about. Furthermore, within a single generation there aren't multiple performance levels to worry about, as Apple only offers a single SoC. By comparison, there's a far larger selection of Android devices, and simply having the latest and greatest hardware isn't a guarantee that you'll be able to play every game in the Google Play store. Take Modern Combat 3, a Call of Duty clone available on both iOS and Android. The game won't install on an ASUS Transformer Prime or a Motorola Xyboard 10.1:
The Play store keeps track of all of the devices I've used with my Google account and the compatibility list doesn't look good. Obviously this isn't Google's fault directly as the responsibility falls upon the game developer to ensure broad platform compatibility, but it is a problem for anyone who purchases a flagship Android device like the Transformer Prime. Either Google has to enforce compatibility across all of its devices or it needs to at least force developers to support a single, flagship platform. Perhaps one reason I'm seeing this today is because there is no Nexus tablet yet. If that's the route Google is going to count on, the first iteration of any major platform needs to be a Nexus device. In other words, the Transformer Prime should have been a Nexus to begin with.
Gaming Conclusion
In situations where a game is available in both the iOS App Store and NVIDIA's Tegra Zone, NVIDIA generally delivers a gaming experience comparable to what you get on the iPad, and in some cases you even get improved visual quality. The iPad's GPU performance advantage just isn't evident in those cases, likely because cross-platform titles still target the far weaker GPUs in the bulk of iOS devices out there. That's effectively a software answer to a hardware challenge, but it's true.
NVIDIA isn't completely vindicated however. In Apple's corner you have Infinity Blade 2 and the upcoming Infinity Blade Dungeons, both of which appear to offer a significant visual advantage over the best of the best that's available on Android today. There are obvious business complexities that are the cause of this today, but if you want to play those games you need to buy an iPad.
The final point is this: Tegra 3 can deliver a good gaming experience on Android, we've already demonstrated that. But as a GPU company NVIDIA should know that it isn't about delivering the minimum acceptable experience, but rather pushing the industry forward. Just last week NVIDIA launched a $500 GPU that is overkill for the vast majority of users. But NVIDIA built the GeForce GTX 680 to move the industry forward, and it's a shame that it hasn't done so in the mobile SoC space thus far.
Controller Support: An Android Advantage
With Honeycomb and subsequent versions of Android, Google baked wired and wireless controller support into the OS. NVIDIA worked with game developers to ensure proper support for these controllers made it into their games, and as a result there are a number of titles available through Tegra Zone that offer support for external gamepads. Logitech's Wireless Gamepad F710 comes with a USB nano receiver that can be plugged into the Transformer Prime's dock. It's using this controller that I played Shadowgun, GTA 3, and Riptide. Of the three, the ability to use a gamepad made GTA 3 much more enjoyable (and it made me much better at the game as well).
Although many casual Android/iOS games do just fine with touch, some are certainly better suited to some sort of controller. While controller support in Android is in its infancy at best, it's more than iOS currently offers. I know of an internal Apple project to bring a physical controller to market, but whether or not it will ever see the light of day remains to be seen. As smartphones and tablets come close to equaling the performance of current game consoles, I feel the controller problem must be addressed.
There's also the chance that physical controls will lose out entirely with these devices. A friend of mine in the game industry once said that we are too quick to forget how superior input devices don't always win. The keyboard + mouse is a much more precise setup for a first person shooter, but much FPS development these days is targeted at gamepads instead. The same could eventually be true for touch based devices, but it's too early to tell. Until then I'm hoping we see continued controller support in Android and hopefully that'll put some pressure on Apple to do the same. It is an important consideration for the future of gaming on these platforms.
WiFi & GPS
The WiFi stack gets an update with the new iPad courtesy of Broadcom's 65nm BCM4330, compared to the BCM4329 used in the previous two iPads. Both 2.4GHz and 5GHz operation are supported, although as I mentioned earlier the carrier-dependent personal hotspot is only available over 2.4GHz.
As with most smartphone/tablet designs the BCM4330 only supports a single spatial stream, for a maximum link speed of 72Mbps. Similar to the iPad 2, Apple hides the WiFi antenna behind the speaker grille at the bottom of the tablet. The cellular antennas (there are now two) are at the top of the tablet, behind the plastic RF window.
WiFi Performance Comparison
Device / Distance from AP | 3 feet | 20 feet (Different Room) | 50 feet (Different Room/Floor) | 100 feet (Different Room)
ASUS TF Prime (2.4GHz) | 26.9 Mbps | 9.85 Mbps | 13.5 Mbps | 2.20 Mbps
Apple iPad 2 (2.4GHz) | 35.1 Mbps | 29.9 Mbps | 26.9 Mbps | 10.6 Mbps
Apple iPad 3 (2.4GHz) | 35.1 Mbps | 29.9 Mbps | 27.9 Mbps | 9.98 Mbps
Apple iPad 2 (5GHz) | 36.7 Mbps | 36.7 Mbps | 36.7 Mbps | 11.9 Mbps
Apple iPad 3 (5GHz) | 36.7 Mbps | 36.7 Mbps | 36.7 Mbps | 11.7 Mbps
With a similar WiFi stack and similar antenna placement, it's no surprise that I noticed very similar WiFi performance to the iPad 2.
The same goes for GPS performance between the new iPad and the iPad 2. Both devices were able to lock and track me driving around in a car with comparable accuracy from what I could tell.
Airplay Support with the new Apple TV
When paired with a second or third generation Apple TV, the new iPad supports wireless display mirroring or content streaming to the Apple TV via AirPlay. In other words, if you have an Apple TV hooked up to your HDTV, you can use your HDTV as a large, mirrored, secondary display for your iPad—wirelessly. The only requirement is that you have a 2nd or 3rd generation Apple TV and that it's on the same network as your iPad. With those requirements met, enabling AirPlay mirroring is simple—just bring up the iOS task switcher, swipe left to right until you see the brightness/playback controls, and tap the AirPlay icon.
Mirroring gives you exactly what you'd expect—a complete mirror of everything you see on the local iPad screen. All sounds are also sent over and come out via your TV's speakers—the local speaker remains silent.
The frame rate isn't as high on the remote display, but there's virtually no impact to the performance of the iPad itself. There's noticeable latency of course since the display output is transcoded as a video, sent over WiFi to the Apple TV, decoded and displayed on your TV via HDMI. I measured the AirPlay latency at ms, which is reasonable for browsing the web but too high for any real-time games. If you want to use the iPad to drive your HDTV for gaming you'll need to buy the optional HDMI output dongle.
While AirPlay mirroring on the iPad works at 720p, if you're playing a 1080p movie on the new iPad and you have a 3rd generation Apple TV, the video is also displayed in 1080p rather than downscaled to 720p.
Video playback is an interesting use case for AirPlay and the iPad. If you don't have mirroring enabled, you can actually start playing a movie on the iPad, have it stream to your TV via the Apple TV, and go about using your iPad as if nothing else was happening. Most apps will let you stream video in the background without interruption; however, some games (e.g. GTA 3, Infinity Blade 2) and some apps (e.g. iMovie) will insist on streaming their UI to your Apple TV instead.
Although iOS and the iPad don't do a great job of promoting multi-user experiences, using AirPlay to push video to a TV wirelessly is an exception. If you frequently load your iPad up with movies you can use it to keep others entertained while you either get work done or just goof around on your iPad at the same time. It's a great fit for families where people want to do two different things. If you do put a lot of movies on your iPhone/iPad, the 3rd generation Apple TV is probably a must buy for this reason alone.
The Next iPhone
Historically the iPad has been the launch vehicle for Apple's next-generation iPhone SoC. It's safe to say that the 45nm A5X we've seen here today won't be finding its way into a smartphone. Instead, what we're likely to see in the next iPhone this year is a 28/32nm shrink of the A5, coupled with Qualcomm's 28nm MDM9615 (in place of the 45nm MDM9600) to enable LTE support.
It'll be next year before we see the introduction of the A6 in the fourth generation iPad, which will likely bring ARM's Cortex A15 to the table as well as Imagination Technologies' PowerVR Series 6 (codename Rogue) GPU. Apple isn't done driving GPU performance. There's still a chance we'll see a Cortex A15 based SoC late this year in the next iPhone, but I believe the timing is too aggressive for that to happen.
Haswell
In working on this review, Vivek IMed me and told me the best part of using an iPad instead of a notebook is the battery life. When the battery indicator reads only 20% left, chances are you've still got a good couple of hours of battery life left on the new iPad. On a MacBook Pro? You're lucky if you get half of that.
The question is, must this gap always exist? The MacBook Pro has much more power hungry silicon, and it's running a much more power hungry OS and application set. I won't go too far into this, but one of the promises Intel is making with Haswell, its 2013 microprocessor architecture, is a > 20x decrease in connected standby power. Intel's goal is to deliver an Ultrabook in 2013 that can remain in connected standby (still receiving emails, Twitter updates, push notifications, etc.) for up to 10 days on a single charge.
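To put that 10 day figure in perspective, here's a rough back-of-the-envelope calculation. The 50Wh battery capacity is my assumption (typical for a 13-inch Ultrabook class machine), not a number quoted by Intel.

```python
# Rough sanity check of the "10 days of connected standby" target.
battery_wh = 50.0          # assumed Ultrabook-class battery capacity
standby_days = 10          # Intel's stated goal
hours = standby_days * 24

avg_power_w = battery_wh / hours
print(f"Implied average platform draw: {avg_power_w:.2f} W")  # ~0.21 W
```

Averaging roughly a fifth of a watt across the entire platform is firmly in smartphone/tablet standby territory, which is exactly the point.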
What about for a lighter, more tablet like usage model? Will Haswell be able to deliver more iPad-like battery life for most tasks, but offer the horsepower and flexibility to run a traditional OS? I'm hearing very exciting things about next year...
Windows 8
A while ago I made a list of the top 10 things I did with my computer. It looked something like this:
Web Browsing
IM
Photo/Video Editing
Excel
Editing Reviews (HTML)
Publishing Reviews (FTP, CMS access)
3D Gaming
Writing
Email
Twitter
Of that list of 10, most could be done on a tablet, but only a couple delivered a better experience on a tablet than on a desktop/notebook (web browsing and email). You could argue that interacting with Twitter is better on a tablet as well. Regardless of where you draw the line, however, the fact of the matter is that a user like me can't replace a notebook with a tablet or vice versa. I need both. I don't like the idea of needing both; I'd rather just have one device that could always deliver the best experience possible.
It's this problem I believe Microsoft is trying to address with Windows 8. Put Windows 8 on a convertible or dockable tablet (a la ASUS' Transformer Prime), with x86 hardware, and you've got a very real solution to this problem. When you want a touchscreen tablet, you've got one. When you want a more traditional workhorse notebook, you've got that as well. I make the x86 reference because it means you don't lose compatibility with the older desktop apps you may rely on.
For years Microsoft has failed to deliver a consumer friendly tablet by forcing a desktop UI on it. Its experience with Media Center taught us all that vastly different usage models need different user interfaces. It took Microsoft a long time to realize this, but with Windows 8 I believe it has one solution to the tablet problem. It is ironic/funny/depressing that with Windows 8 Microsoft is simply making the same mistake it made for years with tablets, in reverse. This time around the desktop experience suffers (or at best, just isn't moved forward) in order to focus more on the tablet experience. Sigh, one of these days they'll figure it out.
The point of this sidebar on Windows 8 is to talk about the iOS equivalent. Apple advocated so strongly for the consolidation of devices with the iPhone that I can't help but assume we'll see a similar move in the MacBook Air/iPad space. iOS is far more multitasking friendly today than it was a couple of years ago. The support for multitasking gestures alone on the iPad is huge. But there clearly has to be more. I don't even know if iOS 6 is really when we'll see this intersection between tablet and ultraportable happen. Like Haswell, this may also be a 2013 thing...
Vivek's Impressions
Over the last two-plus years, I’ve had an interesting relationship with the iPad. I never intended to buy the original iPad, but I ended up getting one simply because the "oooh shiny" factor was too much to resist. It was a little buggy, a little slow, and mostly useless. In a footnote that may or may not be related, I returned it 12 days later.
After my experience with the original iPad, I was keen on revisiting the experience a year later with the iPad 2. I appreciated the industrial design and performance boost, along with the thriving iPad-specific application ecosystem, though I noted that the XGA display wasn't aging well. I said I wanted to give it a shot at being a real productivity device, and bet that I wouldn't end up returning it. Thankfully, I'm not a betting man, because if I were, I would have lost my money. I used it a lot the month I got it, as well as the month leading up to my iOS 5 review, but other than that, it ended up sitting around my house until I sold it in December. It just didn't work within my usage model; nothing about a tablet fit into my workflow.
And it wasn't just the iPad; I had more than a dozen other tablets go through my hands over the last 12 months. iOS, Honeycomb, webOS (R.I.P.)...it didn’t really seem to matter, I just couldn’t get a tablet to feel like anything other than an accessory that made my computing setup that much less streamlined. I've heard Anand and Brian convey similar thoughts multiple times over the last couple of years. We're writers; as devices without keyboards, tablets work for us as laptop replacements roughly as well as wheel-less bicycles would do as car replacements.
Regardless of that minor concern, I ended up at an Apple Store on the launch day of the new iPad for the third year in a row (at 6AM, no less). And for the third year in a row, I ended up purchasing the latest and greatest in Apple slate computing. It's relatively rare to see Apple compromise form factor in favor of more screen, more GPU, and more battery, but breaking from its tradition (philosophy?) of sacrificing anything and everything at the altar of thinness has resulted in a device that's actually very interesting.
I liked the iPad 2 hardware. It was a better tablet experience than the original, and the new iPad builds on that. Adding the Retina Display and LTE gives the form factor a breath of fresh air, but there’s another 16,000 words describing how and why. The main points: it’s new and it’s great to use, but the question is (also asked by Anand), will I be using this in six months? The answer for the original iPad was a resounding no; for the iPad 2, the answer was still no, but getting there. The new iPad? We’ll see.
The new iPad comes into my life at an interesting point—I got rid of my MacBook Pro because I felt like changing things up, and since then I’ve been bouncing from notebook to notebook (mostly review units) for the last eight weeks. With my mobile computing situation in flux until the next MacBook Pro launch, what better time to see if the iPad can really fit into my life?
To find out, I picked up a Logitech keyboard case for it, one that turns the iPad into something approximating the world's greatest netbook. Early returns are promising: I've gotten more written on the iPad in the last two days than I did in the entire nine months I owned the iPad 2. Shocking, that having a keyboard would make it easier to write, but in all seriousness, it allows me to be as productive on the iPad as I might be on a netbook. Probably more so, in fact. Also helping the case: dumping Google Docs Mobile (mostly terrible) for Evernote (less terrible). Multitouch gestures make switching between tasks less of a pain, and the screen is finally crisp enough for the iPad to be a viable ebook reader. The usability enhancements and the keyboard have significantly changed the usage model for me, to the point where the iPad now has a daily role as a primary mobile computing device.
I don’t know how long it’ll last, but finally, the iPad is actually playing a meaningful part in my life.
Final Words
The new iPad represented Apple’s largest tablet launch yet; according to Apple's sales figures, three million units were moved over the opening weekend. That’s nearly $2 billion in tablets...in three days. Hotcakes are selling like iPads these days.
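As a quick sanity check on that revenue figure, the exact model mix is unknown, so the sketch below just brackets the range using Apple's published list prices for the new iPad ($499 for the base 16GB WiFi model up to $829 for the 64GB model with 4G).

```python
# Bracket the launch-weekend revenue using Apple's list prices.
# The actual mix of storage/connectivity options sold is not public.
units = 3_000_000
cheapest = 499    # 16GB, WiFi only
priciest = 829    # 64GB, WiFi + 4G

print(f"Low end:  ${units * cheapest / 1e9:.2f}B")   # ~$1.50B
print(f"High end: ${units * priciest / 1e9:.2f}B")   # ~$2.49B
```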
The new iPad is externally very similar to the iPad 2, but my feeling is that there's a much larger step in usability from the iPad 2 to the new iPad than there was from the original to the iPad 2. It's a difference that has nothing to do with form factor and everything to do with the Retina Display. The iPad 2 took the original iPad and made it better or more refined in every way—thinner, lighter, faster—but the experience didn't change radically. The Retina Display represents a fundamental change in how you visually interact with the device. The display is really the center of a tablet's experience, and with a display that drastically improved, the experience is correspondingly better.
It really is something you notice in every single way you use the tablet. Text, whether you're reading it or writing it, is rendered far more accurately. High resolution graphics look fantastic, and UI elements look sharp in a way that the iPad 2 simply cannot match. Compared to the original iPad, the difference is stark, and it’s hard to overstate how huge a step up from the original 9.7" XGA display the Retina Display really is. It's a bit like the jump from SD to HD television, or from DVD to Blu-ray. Functionally, it's not terribly different, but it's a fundamental leap in technology. And once you take that leap, it's difficult to go back.
If you pay for and frequently use a cellular data plan on your iPad, the new iPad is worth the upgrade for LTE alone. LTE is very impressive on a smartphone but you're limited by how much downloading/browsing/multitasking you're willing to do on a very small screen. On a tablet, you're much more likely to treat the device like an ultraportable notebook, in which case an LTE iPad has a huge advantage over most WiFi-only ultraportables. LTE on the iPad is just like having awesome WiFi wherever you go. It's great.
I prefaced all of this with a question about your willingness to pay for the data plan, because even though you're not bound by any sort of a contract, the cost per GB transferred over LTE on both AT&T and Verizon is just unreasonable. If these carriers don't raise their data limits soon, they'll be directly responsible for stifling the growth of the mobile market. Can you imagine what the Internet revolution would've been like had we remained on hourly billing for cable/DSL?
Apple continues to push the envelope on the SoC side as well. Shipping a 163mm2 SoC on a 45nm LP process is something I never expected Apple to do, but it's here and will hopefully encourage other, actual SoC vendors to start behaving like good chip design companies and not like commodity peddlers. We need faster CPUs and GPUs in a major way; Apple can't be the only company aggressively pursuing these needs if others want to be successful. No one ever won by being the slowest on the block.
With all of this said—should you buy the new iPad?
If you are an existing iPad owner, the question is whether or not you should upgrade. If you don't use your iPad all that much, the upgrade obviously isn't worth it. Even if you do use your iPad a lot, unless you're going to use LTE there isn't a functional or performance advantage to the new iPad. As is always the case, if you can hold off there's always something better around the corner; in this case, next year's model should bring better performance and an increase in power efficiency thanks to 28/32nm silicon. The decision really boils down to how much you'd appreciate the Retina Display—and as we already mentioned, there's a lot to appreciate.
If you have an iPad 2, you end up making a bit of a battery life and portability tradeoff if you choose the new iPad. It's still not as bulky as a MacBook Air (which already isn't bulky), but it's noticeably heavier than the iPad 2. The new iPad is nicer to use, but it's not as nice to carry. If you're still on the original iPad and use it frequently, the upgrade is a no brainer—you get a faster platform, a lighter chassis, a better display, and better (optional) cellular connectivity.
If you're not a tablet owner, are in desperate need of one, and are looking to buy one now—the new iPad is as good as it gets today. This is Apple's halo iDevice. It has the fastest and best of nearly every component inside and out. It's got everything but the kitchen sink. As long as you're ok with iOS, there's no reason not to get the new iPad.