
  • ArchAngel777 - Tuesday, December 21, 2010 - link

    It will be really interesting to see if this performance increase comes at the cost of battery life.
  • knedle - Tuesday, December 21, 2010 - link

    On my G1, 8 hrs of listening to music = ~32 hrs of battery life, while no music = ~72 hrs of battery life.
    So it looks like playing games or running 3D eye candy could drain your battery even faster.
  • vol7ron - Tuesday, December 21, 2010 - link

    I don't understand your assessment. 8 hrs of music = 32 hrs of battery? Does that mean that with music you had ~24 hrs of standby left, or ~32 hrs afterwards? Did you talk on the phone, was it hooked up to WiFi (any dead zones)? Did you use the internet?

    I know you're trying to give an estimate, but I'm not clear on your figures or any of the background behind it.

    "So it looks like playing games or running 3D eye candy could ruin your battery even faster"
    I don't think the question was if it would use more power to play the games, I think he wanted to know if the same games would require more/less juice w/ the new hardware. - I'm curious if just the standby graphics would require more/less energy.

    Just because it can do more things doesn't necessarily make it less efficient.
  • amitp - Tuesday, December 21, 2010 - link

    Okay, I'm no hardware expert, so please don't kill me if I'm wrong.

    As far as I know, Tegra 2 is a dual-core SoC and Samsung's Hummingbird is a single-core SoC, yet there's only a small performance gain over Hummingbird?

    I also read somewhere that Samsung will use Tegra 2 in its future devices. If they can make a dual-core Hummingbird SoC, wouldn't they get better performance than Tegra 2?

    Regards
    amitp
  • aegisofrime - Tuesday, December 21, 2010 - link

    Vivek Gowri, perhaps the Galaxy was unplugged when you tested it the first time?

    @amitp, Samsung does indeed have a dual-core SoC, called Orion, although some news sites are reporting that Samsung bought a whole bunch of Tegra 2 chips. Either they're giving up on Orion, or they're just using Tegra to tide them over until Orion is ready. Hummingbird is supposed to be a beast of a chip though, so Samsung certainly has the capability to design a very good SoC.
  • leomax999 - Tuesday, December 21, 2010 - link

    Samsung's mobile division follows a different strategy; they don't always go with in-house components.
    It looks like Orion would best Tegra 2, but, crucially, it would take more time to get to market.
  • Goffers - Tuesday, December 21, 2010 - link

    Remember that Samsung is using a Qualcomm Snapdragon for its first Windows Phone 7 devices, as required by Microsoft. Presumably Samsung is not too proud to use someone else's SoC as a stopgap.
  • Alexvrb - Tuesday, December 21, 2010 - link

    Well that is certainly smart of them, for multiple reasons. For one, they don't risk being late to market because they are waiting on a new chip, or worse, being forced to launch with a previous gen chip because their new chip wasn't ready yet. Also, they are drawing from multiple sources so they don't have to worry quite as much about demand outstripping their own in-house chip production. They don't have all their eggs in one basket, in other words.

    Another side benefit is that it pushes Samsung's chip designers to put out a competitive product.
  • TareX - Wednesday, December 22, 2010 - link

    Orion is not power friendly at all. Its GPU was super-duper-uber powerful, but not battery friendly, unlike, of course, the Tegra 2 SoC.
  • blueboy_10 - Thursday, December 23, 2010 - link

    Hey, I thought I heard somewhere that the Tegra 2 chips were supposed to improve battery life quite a bit. Has anybody heard anything on this? I'm just curious, because that would do a lot of good on a capable smartphone. Just think: a smartphone that acts like your computer/laptop, with a good camcorder (1080p of course) and a very good camera, and this time a superior UI on it, all while taking a minimal bite out of your battery life. Sounds like 2012/2013, or out of our reach for sure, who knows? -BLUEBOY
  • synaesthetic - Tuesday, December 21, 2010 - link

    These tests are much more GPU-bound than CPU-bound, so they're not really going to show the benefit of Tegra's dual-core Cortex A9 over the single-core Cortex A8 in the Hummingbird SoC.
  • Alexvrb - Tuesday, December 21, 2010 - link

    We don't know that for sure. I'd like to see some results of the Hummingbird with the CPU overclocked a good bit.
  • VivekGowri - Tuesday, December 21, 2010 - link

    As the others said, these are purely graphics benchmarks, so it's looking more at NVIDIA's GeForce ULP versus the PowerVR SGX 540 that Samsung uses in its Hummingbird chips. For the CPU benchmarks, read the original article; it has a more in-depth comparison of the dual-core Cortex A9 versus the Cortex A8 and Snapdragon.

    The difference in the Galaxy Tab's performance wasn't due to being on AC versus on battery - I tried that and didn't see a difference in performance either way during my retest. It's really perplexing; I literally tried everything and couldn't get anything approaching my old result.

    The Samsung + Tegra 2 thing sounds weird to me. I'm not too sure what the point is - maybe they hit some kind of development snag with Orion, so they're going to release the 2nd-gen Galaxy Tab with Tegra 2 just as a stopgap? I can't honestly see Samsung going with another chipmaker's SoC long term when they have a parallel, potentially superior chip already in development.
  • sarge78 - Tuesday, December 21, 2010 - link

    Could be an Apple/Intrinsity thing?
  • mczak - Tuesday, December 21, 2010 - link

    While it's true these benchmarks shouldn't really be CPU dependent, they are most likely affected by memory bandwidth, and the Tegra 2 chip could have an advantage here. I believe Tegra 2 uses (32-bit) LPDDR2 whereas Hummingbird uses (32-bit) LPDDR1 (not sure, it's hard to get definitive answers on that!). That's not to say it's an unfair advantage, since next year's Cortex A9 designs featuring the SGX540 will also use LPDDR2 (and the TI OMAP 4430 actually supports 2x32-bit LPDDR2). Now, SGX chips are not really heavily dependent on bandwidth (thanks to tile-based deferred rendering), but I think this could easily make a big difference, given that we're talking very low bandwidth with 32-bit LPDDR1 - see the rough numbers sketched below.
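    A minimal back-of-the-envelope sketch of that gap, in C. The clock figures are assumptions for illustration (roughly 200 MHz LPDDR1 and 300 MHz LPDDR2), not confirmed specs for either SoC:

        #include <stdio.h>

        /* Peak theoretical bandwidth in MB/s for a double-data-rate memory:
         * clock (MHz) x 2 transfers per clock x bus width in bytes. */
        static double peak_bw_mb_s(double clock_mhz, int bus_bits)
        {
            return clock_mhz * 2.0 * (bus_bits / 8.0);
        }

        int main(void)
        {
            /* Assumed ballpark clocks -- not confirmed for these SoCs. */
            printf("32-bit LPDDR1 @ 200 MHz: ~%.0f MB/s\n", peak_bw_mb_s(200, 32)); /* ~1600 */
            printf("32-bit LPDDR2 @ 300 MHz: ~%.0f MB/s\n", peak_bw_mb_s(300, 32)); /* ~2400 */
            return 0;
        }

    Even with generous assumptions, a 32-bit LPDDR1 interface leaves noticeably less headroom for a GPU that has to share it with the CPU.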
  • dagamer34 - Wednesday, December 22, 2010 - link

    Simple reason: most people care more about product branding than about the hardware inside. It doesn't matter that a Galaxy Tab 2 will be built with a Tegra 2 SoC, because the Galaxy Tab 3 is likely to have Orion in it. But because Tegra 2 is the official reference platform for Honeycomb, I doubt they'd have had enough time to finish both the Orion SoC and Honeycomb support for it without a few months passing with nothing new on the market.

    Favoring your own chips is not always a wise decision if it gets in the way of marketing.
  • ltcommanderdata - Tuesday, December 21, 2010 - link

    In terms of how well existing games optimized for PowerVR GPUs work on Tegra, does Tegra support the GL_IMG_texture_compression_pvrtc extension? I believe that's the primary Imagination-developed OpenGL ES extension, and it's very commonly used in iOS games - which, iOS being the tier-1 mobile platform, means most mobile games - and it makes sense to use given its bandwidth-saving benefits. For example, even for porting old games like Doom to iOS, John Carmack converts textures to PVRTC as much as possible. Since Imagination published it as an open extension, it'll be interesting to see if NVIDIA supports it. (A quick way an app can check for it at runtime is sketched below.)
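    A minimal sketch, in C, of how an application could probe for the extension on whatever GL ES device it's running on; it just inspects the driver's extension string and assumes a valid OpenGL ES 2.0 context is already current:

        #include <string.h>
        #include <GLES2/gl2.h>

        /* Returns 1 if the current OpenGL ES context advertises PVRTC support. */
        int has_pvrtc(void)
        {
            const char *ext = (const char *)glGetString(GL_EXTENSIONS);
            return ext != NULL &&
                   strstr(ext, "GL_IMG_texture_compression_pvrtc") != NULL;
        }

    If the extension is absent, a port would typically fall back to ETC1 or uncompressed textures, at some cost in memory and bandwidth.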
  • R3MF - Tuesday, December 21, 2010 - link

    The real test will arrive along with the new TI SoC sporting the PowerVR SGX540 GPU!
  • argosreality - Tuesday, December 21, 2010 - link

    Interesting results. It's just sad that you can't get the ViewSonic tablet through Staples anymore; they were pulled for hardware issues.
  • Screenr - Tuesday, December 21, 2010 - link

    There's actually nothing wrong with the hardware (other than a screen with bad viewing angles from the edge, which doesn't really bug me). Staples cited a "manufacturing defect" because the default software interface sucks big time. With about 15 minutes of work (basically putting a ROM on a memory card, or onto the internal drive via USB), you can have this thing running smoothly with a more stock Android 2.2 interface, and it runs beautifully, whomping my iPad in overall performance.

    The iPad is a smoother customer experience out of the box.

    ViewSonic and Staples are squabbling over this, and it will clearly hurt ViewSonic's sales, but quite a few people are using this device quite happily via the xda and androidforums communities.
  • xodius80 - Tuesday, December 21, 2010 - link

    I thought I'd mention this (I would post it in the forums, but somehow I can comment here yet can't log into the forums). I wouldn't normally post this, but I think this has gone too far in terms of abuse, in part from that major website, so I'd like to share it with you all and let it be an example of what is NOT to be done on a major tech site, or any website:

    http://forums.guru3d.com/showthread.php?t=334905&a...

    I find it sad how a person, just because he posts something that goes against the website or doesn't agree with the way the site presents its reviews, gets harassed and bashed by the community forum, and worse than that, the staff and owner of the site encourage this kind of behaviour. It's a shame.
  • Shadowmaster625 - Tuesday, December 21, 2010 - link

    "so I really have no idea why I got a framerate that low, much less why it was repeatable"

    I've seen this sort of thing before. It is called "SOICANHASFREESTUFFFROMNVIDIA"
  • Demon-Xanth - Tuesday, December 21, 2010 - link

    http://www.anandtech.com/show/391/16

    They're on par with the K6-2 450 system in the GeForce 256 review. And these are phones/tablets. Ain't technology grand?
  • tipoo - Wednesday, December 22, 2010 - link

    Hah, that is interesting, thanks for pointing it out.
  • GnillGnoll - Wednesday, December 22, 2010 - link

    Are these benchmarks running at 16-bit or 32-bit color depth? I'm also wondering what the Z-buffer depth is, as I've seen the Egypt benchmark running on a Toshiba AC-100 (a Tegra 2 device) with a 16-bit Z buffer, and there was pretty bad Z fighting visible. (How an app requests those depths is sketched below.)
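    For reference, a minimal sketch of the EGL attribute list a benchmark could pass to eglChooseConfig to request 32-bit color and a 24-bit depth buffer instead of 16/16. Whether the driver actually returns such a config is implementation-dependent, and these particular values are assumptions, not what GLBenchmark actually requests:

        #include <EGL/egl.h>

        /* Desired framebuffer: 8 bits per color channel plus a 24-bit depth buffer. */
        static const EGLint config_attribs[] = {
            EGL_RED_SIZE,        8,
            EGL_GREEN_SIZE,      8,
            EGL_BLUE_SIZE,       8,
            EGL_ALPHA_SIZE,      8,
            EGL_DEPTH_SIZE,      24,
            EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
            EGL_NONE
        };

        /* Usage (with an initialized EGLDisplay):
         *   EGLConfig config; EGLint n;
         *   eglChooseConfig(display, config_attribs, &config, 1, &n);
         * If no 24-bit depth config exists, the call may return zero matching
         * configs and the app has to fall back to EGL_DEPTH_SIZE 16. */

    A 16-bit depth buffer combined with a long view distance is a classic recipe for the Z fighting described above.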
  • TareX - Wednesday, December 22, 2010 - link

    The main appeal of Tegra 2 is not its raw power, but the fact that it's ultra low power. NVIDIA has pulled off quite a feat in making its GPU draw a whole lot less power than other GPUs. That's why Samsung dropped its very own, much-hyped Orion SoC.
  • A4i - Monday, December 27, 2010 - link

    Well, Tegra 2 is on TSMC's plain bulk silicon process. Nothing magical there. It's not HKMG or SOI, so no inherent low-power benefit at all. Maybe the next Tegra on 28nm? Who knows. But a dual-core CPU versus a single-core, older-generation one? Dual core always wins.
