Not a bad card, but it is a bad price.
yep, but if you look at the die size, you can see that they're kinda stuck - huge generational die size increase vs GP107, and even RX570/580 are only 232mm2 compared to 200mm2.
I can see how AMD can happily sell 570s for the same price, since that design has long been paid for (unlike Turing) and the manufacturing costs shouldn't be much higher.
Check the prices of the RX 570: they cost $120 on Newegg, and you can get one under $150.
And the RX 570s come with The Division 2 and World War Z right now. You can get the ASRock version with 8GB VRAM for only $139!
Problem is on an OEM box you'll have to upgrade the PSU as well.
Dealing with normies as customers, the good ones will understand, but those people wouldn't have bought a crappy OEM box in the first place. Most normies will just buy the 1650 on its own.
AMD needs 570ish performance without the need for auxiliary power.
Depending on the amount of gaming done, it probably saves over 50 dollars in electricity costs over a 2 year period compared to the RX 570. Of course the 570 is a bit faster on average.
Nobody in their right mind that's specifically on the market for an aftermarket GPU (a buying decision that comes about BECAUSE they're dissatisfied with the framerate or performance of their existing GPU, or the lack of one) is making their primary purchasing decision on power savings alone. In other words, people aren't saying "Man, my ForkNight performance is good, but my power bills are too high! In order to remedy the exorbitant cost of my power bill, I'm going to go out and purchase a $150 GPU (which is more than 1 month of my power bill alone), even if it offers the same performance as my current GPU, just to save money on my power bill!"
Someone might make that their primary purchasing decision for a power supply, because outside of being able to supply a given wattage for the system, the only thing that matters is its efficiency, and yes, over the long term higher efficiency PSUs are better built, last longer, and provide a justifiable hidden cost savings.
Lower power for the same performance at a similar enough price can be a tie-breaker between two competing options, but that's not the case here for the 1650. It has essentially outpriced itself from competing viably in the lower budget GPU market.
I don't know what you consider being in one's right mind to be, but anyone making a cost-sensitive buying decision without considering total cost of ownership is not making the decision correctly. Electricity is not free unless one has some special arrangement. It will be paid for, and it will reduce one's wealth and ability to make other purchases.
So I assume you measure the efficiency of the AC unit in your car and how it relates to your gas mileage over the duration of ownership as well, since you're so worried about every calculation in making that buying decision?
It doesn't really change the argument if he does or does not take into account his AC unit in his car. Electricity is not free. You can ignore the price of electricity if you want, but your decision to ignore it or not does not change the total cost of ownership. (I'm not defending the electricity calculations above, I haven't verified them)
Over here, it's quite routine for people to consider the efficiency cost of using AC in a car and whether it's more sensible to open the window... If you had a choice between a GTX1080 and a Vega64, which perform nearly the same, and assume they cost nearly the same, then you'd take into account that one requires a small nuclear reactor to run whilst the other probably sips less energy than your current card. Also, some of us are on this thing called a budget. A $50 saving is a week's food shopping.
Except your comment is exactly in line with what I said: "Lower power for the same performance at a similar enough price can be a tie-breaker between two competing options, but that's not the case here for the 1650"
I'm not saying power use of the GPU is irrelevant, I'm saying performance/price is ultimately more important. The RX 570 GPU is not only significantly cheaper, but it outperforms the GTX 1650 in most scenarios. Yes, the RX 570 does so by consuming more power, but it'd take 2 or so years of power bills (at least according to avg American power bill per month) to split an even cost with the GTX 1650, and even at that mark where the cost of ownership is equivalent, the RX 570 still has provided 2 years of consistently better performance, and will continue to offer better performance.
Absolutely, a GTX1080 is a smarter buy compared to the Vega64 given the power consumption, but that's because power consumption was the tie-breaker. The comparison wouldn't be as ideal for the GTX1080 if it cost 30% more than the Vega64, offered similar performance, but came with the long-term promise of ~eventually~ paying for the upfront difference in cost with a reduction in power cost.
Again, the sheer majority of users on the market are looking for the best performance/price, and the GTX1650 has priced itself out of the market it should be competing in.
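For a rough sense of that break-even argument, here's a back-of-envelope sketch in Python; the prices, the ~75W power delta, and 2h/day of gaming at the ~$0.13/kWh US average are all assumed figures pulled from elsewhere in this thread, not measurements:

```python
# Payback period for the 1650's lower power draw vs a cheaper, faster RX 570.
price_570, price_1650 = 130.0, 150.0            # USD, as quoted in the thread
extra_kw, hours_per_day, usd_per_kwh = 0.075, 2.0, 0.13

yearly_extra_cost = extra_kw * hours_per_day * 365 * usd_per_kwh  # ~$7.12/year
print((price_1650 - price_570) / yearly_extra_cost)               # ~2.8 years to break even
```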
"it'd take 2 or so years of power bills (at least according to avg American power bill per month) to split an even cost with the GTX 1650, and even at that mark where the cost of ownership is equivalent, the RX 570 still has provided 2 years of consistently better performance, and will continue to offer better performance."
This.
Plus, if people are so worried about power consumption maybe they should get some solar panels.
Why in the world would you get solar panels? That would only increase the cost even more!
So, you multiplied it once, why not multiply that value again and make it $100?
Kids living with their parents generally don't care about the power bill.
wrong on so many levels. If you find the highest-cost electricity city in the US, plug in the most die-hard gamer who plays only new games on max settings running the GPU at 100% load at all times, and assume he plays more hours than most people work, you might get close to those numbers. The sad kid who fits the above scenario games hard enough that he would never choose such a bad card, one significantly slower than last gen's budget performers (RX 570 and GTX 1060 3GB). Kids in this scenario would not be calculating the nickels and dimes they're saving here and there - they'd be getting the best card in their NOW budget without subtracting the quarter or so they might get back a week. You're trying to create a scenario that just doesn't exist. Super energy-conscious people logging every penny of juice they spend don't game dozens of hours a week, and would be nit-picky enough that they'd probably find settings to save that extra 2 cents a week, so they wouldn't even be running their GPU at 100% load.
Total cost of ownership is a significant factor in any buying decision. Not only should one consider the electrical costs of a GPU, but indirect additional expenses such as air conditioning needs or reductions in heating costs offset by heat output along with the cost to upgrade at a later date based on the potential for dissatisfaction with future performance. Failing to consider those and other factors ignores important recurring expenses.
Then people need to buy the Ryzen 7 2700X rather than the i9-9900K. The 9900K uses more power and runs hot, so it needs a more powerful cooler, and a more powerful cooler draws more current than a 2700X's does.
Not everyone puts as much value on cost as others. When discussing a budget product, it absolutely makes sense to consider, since you possibly wouldn't buy such a GPU if money was no object.
But if someone buys a high-end CPU, the interests shift drastically, and as such, your logic no longer makes sense. Plenty of people buy the fastest not because it's cheap, but because it's the absolute fastest.
Agreed with nevc on this one. When you start discussing higher-end and higher-cost components, consideration for power consumption comes off the proverbial table to a great extent, because priority is naturally assigned more to performance than to purchase price, electrical consumption, or TCO.
Disclaimer, not done reading the article yet, but I saw your comment.
Some people look for low-wattage cards that don't require a power connector. These types of cards are particularly suited for Mini-ITX systems that may sit under the TV. The 750 Ti was super popular because of this. Having Turing's HEVC video encode/decode is really handy. You can put together a nice small Mini-ITX build with something like the Node 202 and it will handle media duties much better than other solutions.
That would be great if it actually had the Turing HEVC encoder - it does not; it retains the Volta encoder for cost savings or some other Nvidia-Alone-Knows reason (source: Hardware Unboxed and Gamer's Nexus).
Anyone buying a 1650 and expecting to get the Turing video encoding hardware is in for a nasty surprise.
Yeah, lack of B support stinks.
Or if you're going with a miniITX low wattage system, you can cut out the 75w GPU and just go with a 65w AMD Ryzen 2400G since the integrated Vega GPU is perfectly suitable for an HTPC type system. It'll save you way more money with that logic.
What they are going to do though is look at the fast GPU + PSU vs the slower GPU alone. People with OEM boxes are going to buy one part at a time. Trust me on this, it's frustrating, but it's consistent.
$25 a year? So 7 cents a day? 7 cents is more than 1 kWh where I live.
The US average is a bit over 13 cents per kilowatt-hour. But I made an error in the calculation and was way off. It's more like $15 over 2 years, not $50. Sorry.
That's for an average of 2h/day gaming. Bump it up to a hard core 6h/day and you get around $50/2 years. Or 2h/day but somewhere with obnoxiously expensive electricity like Hawaii or Germany.
I'd just like to point out that if you've gamed for an average of 6h per day over 2 years with a 570 instead of a 1650, then you've also been enjoying 10% or so extra performance. That's more than 4000 hours of higher detail settings and/or frame rates. If people are trying to calculate the true "value" of a card, then I would argue that this extra performance over time counts too; let's not forget the performance benefits!
That's true, and I noted that in my original post. But the important thing is that the price/performance comparison should consider the total cost of ownership of the card. Ultimately, the value of any particular increment in performance is a matter of personal preference, though it is possible for someone to make a poor choice because he doesn't understand the situation well.
This power consumption electricity savings debate has gone on too long. The math is not hard - the annual electricity cost is equal to (Watts / 1,000) x (hours used per day) x (365 days / year) x (cost per kWh)
In my area, electricity costs $0.115/kWh so a rather excessive (for me) 3 hours of gaming every day of the year means that an extra 100W power consumption equals only $12.50 higher electricity cost every year.
So for me, the electricity cost of the higher power consumption isn't even remotely important. I think most people are in the same boat, but run the numbers yourself and make your own decision. The only people who should care either live somewhere with expensive electricity and/or game way too much, in which case they should probably be using a better GPU.
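That formula drops straight into code. A minimal sketch in Python, using the same assumed numbers as the comment above:

```python
# Annual electricity cost of extra GPU power draw:
# (watts / 1000) x (hours used per day) x (365 days/year) x (cost per kWh)
def annual_cost_usd(extra_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    return (extra_watts / 1000) * hours_per_day * 365 * usd_per_kwh

# 100 W extra, 3 hours/day, $0.115/kWh -> ~$12.59/year, the "$12.50" quoted above
print(round(annual_cost_usd(100, 3, 0.115), 2))
```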
How is $12.50 a year not remotely important? Would you say a card costing $25 less is a big deal? If one costs $150 and the other is $175 you would not consider that to be at all a consideration to your purchase?
How IS $12.50/year even worth thinking about? That's less than an hour of work for most people, it's like 3 cents a day, you could pay for it by finding pennies on the sidewalk! PLUS you get much better performance! It's a faster card for a completely meaningless power increase. If your PSU doesn't have a six pin, get the 1650 I guess, otherwise the price is kinda silly.
I like the way you think. Whatever you buy, just buy it from me for $12.50 more than you could otherwise get it, because it's just not worth thinking about. What you say would be entirely reasonable if it didn't apply to every single purchase you make. I mean, if a company comes along and says "Come on, buy this pen for $20. You're only going to buy one pen this year." would you do it? Do you ask the people who are saying NVIDIA's new cards are too expensive because they are $20 more expensive than the previous generation equivalents "How is $10 a year even worth thinking about?"
Hey, if you are willing to throw money out the window when it's for electricity but not for anything else, that's up to you, but you are making unreasonable decisions that harm yourself.
Using your logic, why don't we all just save bunches of money by using Intel integrated graphics? Since the money we save on power usage is all that matters, we might as well make sure we are only using mobile CPUs as well. What you're paying for here is the improved gaming experience provided by the extra performance of the RX570. For many people, the real-world improvement in the gaming experience is worth the relatively low cost of energy usage. Realistically, the only reason to get one of these over the 570 is if your power supply cannot handle the RX570.
Holy crap man! The amount of electricity I spent to read this comment thread, and the amount of keyboard clicks that've been consumed from my mechanical keyboard's 70 million-click total cost of ownership, was totally worth reading and replying to this.
If you're pinching pennies that hard, you're probably better off not spending 4 hours a day gaming. Those games cost money, and you know what they say about time! Maybe even set the card mining when you're away, there are profits to be had even now.
Anyone calculating the total ownership cost of a video card in cents per day should also consider that the slightly higher performance of the 570 may allow it to last a few more months before justifying replacement, allowing the purchase price to be spread over a longer period.
"Anyone calculating the total ownership cost of a video card in cents per day should also consider that the slightly higher performance of the 570 may allow it to last a few more months before justifying replacement, allowing the purchase price to be spread over a longer period."
Sure. Not that likely, though, because the difference isn't that great so what is more likely to affect the timing of upgrade is the card that becomes available. But at the moment, NVIDIA has a big gap between the 1650 and the 1660 so there aren't two more-efficient cards that bracket the 570 well from a price standpoint.
Of course, some people apparently don't care about $25 at all so I don't understand why they should care about $25 more than that (for a total of 50) such that it would prevent them from getting a 1660, which has a performance that blows the 570 out of the water and would be a lot more likely to play a factor in the timing of a future upgrade.
At least you went through and acknowledged how horribly wrong the math was, so the entire initial premise is flawed. The $12.50 per year is also a very high-end scenario that would rarely fit a hardcore gamer who cares about TINY amounts of power savings. This assumes 3 hours per day, 7 days a week, never missing a day of gaming, and that every single minute of this computer time is running the GPU at 100%. Even if you twist every number to match your claims it just doesn't pan out - period. The video cards being compared are not $25 apart. Energy-conservative adults who care that much about every penny they spend on electricity don't game hardcore 21 hours a week. If you use realistic numbers of 2-3h game time 5 times a week, and the fact that the GPUs are not constantly at 100% load - say a more realistic 75% of max power usage on average - this results in a value well below $25 (which again is only half the price difference of the GPUs you're comparing). Using these more realistic numbers it's closer to $8 per year in energy cost difference to own a superior card that delivers better gaming quality for over a thousand hours. If saving $8 is that big a deal to you at the cost of a worse gaming experience, then you're not really a gamer and probably don't care what card you're running. Just run a 2400G at 720p and low settings and call it a day. Playing the math game with blatantly wrong numbers doesn't validate the value of this card.
Right. My calculation comes out a bit higher, with $0.12 per kWh but playing 8 hours a day, 365 days. I will take the RX 570 and undervolt it to reduce the consumption.
No, it doesn't. It's about 25 dollars over a 2-year period, if you play 8 hours/day, every day for 2 years. If you're gaming less, or just browsing, the difference is way smaller.
Per my last bill, I pay $0.0769USD per kWh. So, spending $50USD means I've used 650.195 kWh, or 650,195 Wh. Comparing the power usage at full load, it looks like on average you save maybe 80W using the GTX 1650 vs. the RX 570 (75W at full power, 86W at idle, so call it 80W average). That means it takes me (650,195 Wh / 80W) = 8,127.44 hours of gaming to have "saved" that much power. In a 2-year period, assuming the average 365.25 days per year & 24 hours per day, there's a maximum available of 17,532 hours. The ratio, then, of the time needed to spend gaming vs. total elapsed time in order to "save" that much power is (8,127.44 / 17,532) = 46.36%... which equates to an average of 11.13 hours (call it 11 hours 8 minutes) of gaming ***per day***.

Now, ***MAYBE*** if I a) didn't have to work (or the equivalent, i.e. school) Monday through Friday, b) didn't have some minimum time to be social (i.e. spending time with my spouse), c) didn't have to also take care of chores & errands (mowing the lawn, cleaning the house, grocery shopping, etc.), d) didn't take time for other things that also interest me besides PC gaming (reading books, watching movies & TV shows, taking vacations, going to Origins & comic book conventions, etc.), & e) had someone providing me a roof to live under/food to eat/money to spend on said games & PC, I ****MIGHT**** be able to handle that kind of gaming schedule... but I not only doubt that would happen, I would probably get very bored & sick of gaming (PC or otherwise) in short order.
Even someone who's more of an avid gamer & averages 4 hours of gaming per day, assuming their cost for electricity is the same as mine, will need to wait ***five to six years*** before they can say they saved $50USD on their electrical bill (or the cost of a single AAA game). But let's be honest; even avid gamers of that level are probably not going to be satisfied with a GTX 1650's performance (or even an RX 570's); they're going to want a 1070/1080/1080TI/2060/2070/2080 or similar GPU (depending on their other system specs). Or, the machine rocking the GTX 1650 is their ***secondary*** gaming PC...& since even that is going to set them back a few hundred dollars to build, I seriously doubt they're going to quibble about saving maybe $1 a month on their electrical bill.
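The arithmetic in the two comments above, as a quick sketch in Python (the $0.0769/kWh rate and 80W saving are the commenter's own assumed figures):

```python
# Hours of gaming needed before an 80 W saving is worth $50 at $0.0769/kWh.
target_usd, usd_per_kwh, watts_saved = 50.0, 0.0769, 80.0

kwh_needed = target_usd / usd_per_kwh            # ~650.2 kWh
hours_needed = kwh_needed * 1000 / watts_saved   # ~8,127 hours of gaming
two_years = 2 * 365.25 * 24                      # 17,532 elapsed hours

print(hours_needed / two_years * 24)             # ~11.1 hours of gaming per day
print(hours_needed / (4 * 365.25))               # ~5.6 years at 4 hours per day
```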
You need to game on average 4 hours per day to reach the €50 in two years. If gaming is that important to you, you might want to look at another video card.
I think performance per watt is an important metric to consider, not because of money saved on electricity but because of less heat dumped into my case.
Yeah, sure seems like it. RX570s have been pretty regularly $120 (4GB) to $150 (8GB) for the last five months. I'm guessing we'll see a 1650SE with 3GB for $109 soon enough (but it won't be labeled as such)...
Pricing is even better right now for the RX570. The 4GB starts at $130 and the 8GB starts at $140, whereas the cheapest GTX 1650 is $150. Unless you need a sub 75W GPU, there is no reason at all to buy the 1650, not when you can get 10-20% better performance for $10-20 less cost.
Seems like it. Although I do know some people that run Dell/HP refurbs from years ago (Core i5-750 or i7-860, maybe a Sandy Bridge if they are lucky) and need the 75W graphics card. They all still have a GTX 750. This may be the card to replace that, since the rest still serves them fine. Otherwise, this is really kinda disappointing. I still rock a GTX 960 2GB (from my HTPC, it has to be small), since I sold my 1080 when I saw that I played only a few hours each month. But I won't be upgrading to this. I'd rather get a 580 8GB, or save more and get a 2060 that can last me for several years. Oh well, guess someone will buy it. And it'll end up in tons of off-the-shelf PCs.
They don't need a 75W graphics card on an old refurb PC. What they desperately need is to replace the PSU with a modern 80+ certified one. The PSUs in those old OEM PCs are typically 220W-280W units with 75% maximum efficiency, and probably not over 70% with a 75W graphics card added; AnandTech has tests of old OEM PSUs that show this. Replacing the PSU with a reasonably low-cost modern 80+ unit gets you at least 50% more power capacity, and such units are generally at or near 90% efficiency in the 40-50% load sweet spot, which is where they will sit in gaming with an RX570, for instance. So they can get a new PSU and an RX570 for the same price, have at least 15% better performance, and have a quieter, more power-efficient system for the same price as if they bought a 1650. At $150 literally no one should even consider buying this. If the price were in the $100-$110 range it would be another matter. Maybe even OK at $120. But at $150 it makes no sense for anyone to buy.
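To make the PSU-efficiency argument concrete, a rough sketch in Python. The DC loads are illustrative assumptions (roughly 170W for a gaming load with a GTX 1650 and 245W with an RX 570), not measurements; with these numbers, the better PSU claws back roughly half of the 570's extra draw at the wall:

```python
# Power drawn at the wall = DC load / PSU efficiency.
def wall_draw(dc_load_watts: float, efficiency: float) -> float:
    return dc_load_watts / efficiency

print(wall_draw(170, 0.75))  # GTX 1650 on an old ~75% OEM PSU: ~227 W at the wall
print(wall_draw(245, 0.75))  # RX 570 on that same old PSU:     ~327 W at the wall
print(wall_draw(245, 0.90))  # RX 570 on a modern ~90% PSU:     ~272 W at the wall
```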
The "with compromises" bit could also mean setting the resolution to 1600x900. Power and temps are okay for the performance offered. The typical Nvidia ego-induced, absent-competition Turing price premium isn't as terrible at the low end. However a ~30W replacement for the 1030 would be nice as it would likely fit on a half-height, single slot card.
The name of this card is pretty confusing. A GTX 1650 being noticeably slower than a GTX 1060 despite being 590 numbers higher doesn't make much sense. Why didn't Nvidia keep their naming to one scheme (2000 series) instead of having the GTX 16XX cards with confusing names?
The last two digits are the performance category; the more significant digits are the generation. It is strange that right now they basically have two generation numbers, 1600 and 2000. But that 50 is slower than 60 is not too confusing (for me anyway). Different performance category.
That makes no sense. The 2060 is slower than the 1080 Ti, but it is 980 "numbers higher". A Core i3-8100 is slower than an i5 or i7 of an earlier generation (being some 500 to thousands of "numbers" higher). Don't get me wrong, Nvidia's naming scheme sucks. But not because of the reason you stated.
@DeathAngel. Not sure what your problem is. 80>70>60>50>30 etc...
But that obviously only applies within a current generation. When you compare to an older generation then New x80 will be faster than old x80 and so on.
Of these low-mid cards, looks like the 1660 is where it's at. ~70% more cores and ~70% more performance for ~40% more money. I know, they need to have tiers, but as far as value goes it's the better bang for the buck if you can scrape together a bit more cash.
Because no one has been able to benchmark said graphics cards so no one knows if something is going to mop floors or just draw polygons. (Personally, I'm in for a GPU that will mop my floors for me. I'd also like one that will mow the yard, wash the dishes, and take care of the laundry.)
Good point, but I seriously believe the next Radeon architecture built on 7nm could perform almost twice as fast as an RX 560 with 1024 CUs. Am I the only one hyped for 7nm graphics cards?
More recent benchmarking actually shows the RVII with the performance edge vs the RTX 2080 (AMD just completely botched the launch drivers-wise, as isn't particularly uncommon for them), as many recent videos have shown. But you're totally passing over the fact that it uses the exact same Vega architecture as 14nm Vega 10, yet manages to outperform it by around 30% while pulling LESS power than a V64. That's nearly a 40-50% boost in power efficiency per fps, with absolutely no arch changes beyond 2x additional memory controllers. Even if Navi only matches that kind of efficiency bump vs Polaris, it'll still be looking really good, just as long as they maintain their performance advantage as well.
7nm TSMC isn't nearly as impressive as 5nm TSMC. 80% increase in density with 5nm. 7nm is a little bit sad, really. But, it saves companies money because it doesn't require nearly as much design rules modification, so porting existing 14nm stuff is much easier.
I'm really looking forward to seeing what 7nm GPUs do once they hit the market, but I want to hold back on making judgements before we see what sorts of performance and power numbers emerge. I'm also more interested in mobile than desktop components because I have not put together or purchased a desktop PC in the past 5 years since I find laptops and phones a better fit in my living space and lifestyle.
Personally, the only reason I would ever care about a 75W card is for video duties - and AMD's video decoding/encoding is significantly worse than Intel's or NVIDIA's. So there is that.
I would be excited if they were trying to make a high-end 7nm card that doesn't suck, but apparently it's once again just low-power cards. Same old, same old. I'm bored already.
It's the current person mopping the floor who designed AMD's last generation of gfx cards. Another reason to buy this card is that you may not want the heat produced. I for one have started to use a 10W NUC in preference to a 75W HTPC just because the heating effect is less. (UK, not Jamaica or Saudi Arabia.)
Quite frankly, at $150 no one, and I do mean no one, should buy this card. Even if you're refurbishing an old OEM system, the price difference up to an RX570 lets you buy a decent 80+ certified power supply and have a system that is more powerful and probably more power-efficient at the same time. A standard OEM PSU in an old computer is so inefficient that just replacing it makes up for more than the power consumption difference between a 1650 and an RX570. And it gives you at least 15% more performance for the same amount of money spent.
You can't compare the 1650 to the 950, they're priced completely differently at launch. Stop going directly with the product number. The 1650 is between 960 and 970.
"Notably, B-frames incorporate information about both the frame before them and the frame after them, allowing for greater space savings versus simpler uni-directional P-frames."
No. H.264 and H.265 (AVC/HEVC) have (optional) bi-directional P-Frames. That increases the complexity of the search required to create a B-Frame which would use significantly less data than a P-Frame. A lower-capability GPU may not be able to perform that search in real time, and in that case there's no point implementing it, even if it would increase compression efficiency, because the selling point of hardware HEVC compression is that it can be done in real time. B-Frames are simpler than P-Frames. Not the other way around.

To be clear: I-Frames are effectively a still shot of the scene, like a JPEG. P-Frames hold motion data with references to I-Frames and P-Frames - they encode linear motion for blocks in the image, and they encode replacement blocks for new data needed to replace changes, i.e. when something moves over a background and reveals what was behind it. If B-Frames are used, then intermediate frames are calculated between the P-Frames and their references based on their encoded block motion data. These result in what are called "tweens" in animation - images that are partway between a start and an end. The B-Frames encode small fixes for errors in the guessed (by linear interpolation) intermediate frames.

The less motion there is, and the more linear the motion is, the more accurate the interpolated frames are, and the more B-Frames you can have between P-Frames before the B-Frames become necessarily larger than a new P-Frame would have been. Generating those B-Frames and estimating / discarding them based on whether they can be as efficient as the P-Frames is a lot of work even when the P-Frames don't have bidirectional references.

HEVC allows for more than just bidirectional (2 frame) motion prediction references. It allows a P-Frame to inherit any other P-Frame's motion references, and it allows P-Frames to target a B-Frame for motion estimation. That introduces an order of magnitude more search possibilities than H.264/AVC. HEVC with B-Frames disabled basically performs at a similar efficiency to AVC because all those options are off the table.
A P-Frame (Predictive Frame) by definition predicts in only one direction - backwards. B-Frames (Bidirectional Predictive Frames) are allowed to predict in both directions. This is an important distinction because it matters in which order those frames are put into the encoded video: "future" frames of course need to be sent first, or you can't use them for prediction.
That's where patterns like "IPBBB" come from. You start with a single I-frame, a single P-frame referencing that I-frame (the P might be shown after some B-frames), and then an array of B-frames that reference both the I and P frames - and possibly each other.
P and B frames are otherwise identical in how they work. Both contain motion vectors and entropy data to correct the interpolation.
Also note that H.264 already supported up to 16 reference frames for interpolation. It's called bidirectional not because it's two frames, but two directions - past and future.
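A toy sketch in Python of the frame reordering this sub-thread describes, using an assumed "IPBBB"-style GOP as in the example above:

```python
# Display order: the viewer sees B1-B3 between their two references, I0 and P4.
display_order = ["I0", "B1", "B2", "B3", "P4"]

# Bitstream/decode order: the future reference P4 must arrive before the B-frames
# that use it - which is exactly the "IPBBB" pattern from the comment above.
decode_order = ["I0", "P4", "B1", "B2", "B3"]

for frame in decode_order:
    refs = {"I": "nothing", "P": "an earlier I/P frame", "B": "both I0 and P4"}[frame[0]]
    print(f"{frame} decoded, referencing {refs}")
```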
Wow. This card makes no sense. Go watch Hardware Unboxed's video where he conveniently shoots down the "power efficiency" argument. It's a load of rubbish; there is absolutely no reason to buy this card over even the 4GB 570 for any new gaming build. This review tried so hard to paint this turd in a positive light, continually underscoring AMD's "technological disadvantages" and "thin profit margin". P20 isn't even that much bigger than TU117, either.
I'm sorry, I just feel it is too friendly to Nvidia and doesn't criticize this terrible product pricing enough. The RX570 8GB Pulse from Sapphire is cooler running, quieter, of vastly higher build quality, >10% faster, with twice the VRAM, at 135W board power, which is perfectly fine even for potato OEM builds anyway.
Seriously, drop the efficiency argument. This card is DOA at $149 because the 570 killed it.
A 1024 CUDA core card at $130 would've been passable, not this joke.
It makes perfect sense for Nvidia. The corporation's goal is to sell the lowest-quality product for the most money possible. Nvidia has, can, and does, rely on its brand mindshare to overcome deficits in its products at various price points, especially around the lower end. In other words, people buy it because it's painted green.
I don't believe this trend is going to keep going. Everyone is now checking reviews online before making their choice. No way this will pass like butter in a pan.
"RX570 8GB pulse, fro sapphire is cooler running, quieter, vastly higher build quality, >10% faster, twice the vram and 135W board power, which is perfectly fine even for potato OEM builds anyway."
Not if the OEM build doesn't have a 6-pin PCIE cable. If you're building you own computer, then I agree that the 570 is a much better choice. However, if you want to do a quick upgrade to an older OEM system running a 750TI without a 6-pin, then the 1650 makes sense.
Many of my son's friends have lease-return desktop PCs their parents bought at a good price (i5/i7 2xxx to 4xxx) along with a 720p or 1080p LCD screen (usually less than a $300 investment), many with SSDs. Most of them use the iGPU (a few have a low-end NVIDIA Quadro or AMD FirePro PCIe GPU). That being said, they want to be able to game at 720p/1080p with their friends, and it usually doesn't cut it because of the iGPU or poor PCIe GPU.
When it comes to upgrading the GPU, one of the drawbacks of these systems is the lack of a 6-pin PCIe connector from the power supply, and lackluster power supplies in general which can't be easily upgraded. In the past, I've recommended they get a 1050 and they've been very happy with their purchase, along with a quick 10-minute upgrade. I can see the 1650 being what I'd recommend to them in the future if it fits their budget.
I'm with most of you, though: if you have a system that can handle a 570, that is a much better choice.
It would be interesting to see how big the market is for 75W GPUs going into desktop PCs that can't handle anything more than that (which has nothing to do with saving power on someone's electric bill).
If one has so little money that one has to do a significant amount of PC gaming on a machine that can't handle more than a 75W GPU perhaps it's time to reconsider spending that time on PC gaming.
It seems like it would be a better idea to buy a used GPU and a cheap, but decent, PSU.
This is by far the most 1650 friendly review I have seen online. I mean, the choice of words, it's almost like someone is trying not to spoil his resume. Also, it is the only review where AMD looks desperate, with a huge question mark over how long it will be willing to keep defending its position with the RX 570 in the market. If only there were new cards coming from them in a couple of months, but I guess they aren't preparing anything.
Polaris is such an old architecture and it was very cheap to buy years ago, prior to the crypto surge. For it to be so competitive against the wealthiest GPU company's latest design is a sad statement about the quality of competition in the PC gaming space. Duopoly competition isn't good enough.
If there were high-quality competition happening no company would be able to get away with putting overpriced turkeys into the market.
Hey, Turing is a joke. The only thing Turing brought is a different price bracket. Nvidia took two and a half years to release Turing... so I don't see the age of Polaris as an issue when new cards are coming in a couple of months.
"This is by far the most 1650 friendly review I have seen online."
Having finally read the other GTX 1650 reviews (I don't read them beforehand, to avoid coloring my own video card reviews), I agree with you on that. Still, I stand by my article.
AMD is by no means desperate here. But they are willing to take thinner profit margins than NVIDIA does. And that creates all kinds of glorious havoc in the sub-$200 video card market.
No one card wins in all categories here; one has better performance, another has better power efficiency. So it makes things a little more interesting for buyers as they now need to consider what they are using a card for - and what attributes they value the most.
Next to the GTX 1650, the RX 570 is really kind of a lumbering beast. The power consumption difference for the 11% performance advantage is quite high. But at the end of the day it's still 11% faster for the same price, so if you're buying on a pure price/performance basis, then it's an easy call to make.
As for Navi, AMD will eventually have a successor of some sort for Polaris 11. However I'm not expecting it in Q3; AMD normally launches the low-end stuff later.
You can stand by your article, but that doesn't mean you are right. You are living in LALA land, Ryan, for even believing that the 75W difference is important. It would be important if the cards were of the same performance at a similar price... but they aren't.
At this point, you can probably undervolt the RX 570 pretty close to the 1650 if that was sooooo important...
I made the calculation that it is going to cost you $15-20 of power per year for playing 4 hours per day. You cannot defend this. It is insanity.
AMD has been defending its position with smaller profit margins for years. This isn't something new, and it isn't something it does only with the RX 570, so there's no reason to question its ability to maintain this price.
One other thing: while in the review the GTX 1650 is tested against the 4GB RX 570, when there is something to be said about pricing and profit margins and questions about AMD's ability to keep selling the RX 570 under $150, the 8GB model of the RX 580 is used. No mention of the much cheaper 4GB version that is used in the review.
At the end of the day, the RX 570 is not 11% faster for the same price. It's 11% faster for $30 less, and the only question is whether the GTX 1650's power efficiency and a couple of other features are enough to justify the loss of 11% performance (or more, if the RX 570 model were not overclocked) and a significantly (for this price range) higher price tag.
And no, we can't assume that in the near future AMD's prices will just jump 20% to make the GTX 1650 less of an expensive card. Especially when Navi is not far away, meaning that older hardware will have to be sold to make room for the new models, or just stay at those low prices to not interfere with newer Navi models that could come at $200 and up.
In the 1660 and 1660Ti reviews, the RX 570 wasn't included; however, the RX 590 and RX 580 are shown taking 201 and 222 seconds respectively to complete the V-Ray benchmark 1.0.8, where this chart shows the RX 570 only taking 153 seconds. The GTX 1660 is shown taking 109 seconds in both that chart and this one. Since the 570 typically falls short of its 580/590 siblings, how did it manage to stomp them in this benchmark?
I think this is a reasonable review. Using twice the power at maximum load is not an insignificant factor over the life of the card. But it depends on if additional heat means a cost, or just means you can run your heating less, how often you game, who is paying for your power, etc. Then there are factors such as Linux source driver support, which may or may not matter for a particular person.
If pressed, I'd get the RX 570 in a heartbeat, but maybe not if I wanted to put it in my microserver (admittedly, I'd also need a low-profile card for that). But I'd rather wait for Navi in an APU. :-)
The article tries too hard to make Nvidia look good despite the GTX 1650 being inferior in performance compared to the RX 570 and overpriced for what it is offering.
The last time I remember any major tech news site give Nvidia any grief was in the Fermi days, with the 480 and especially the 465. As bad as the 480 was, too, people still bragged about their triple 480 SLI systems and 480 dual SLI was routinely featured in benchmarks.
The 4GB RX 570 is $130, not $150. And it beats the 4GB 1650 out of sight. It also only draws ~120W, which is not a lot seeing as the majority of 1650s (i.e. those with a 6-pin) draw ~90W anyway.
The actual 75W 1650s should be $99, and the rest shouldn't even exist. Because at $150-160 they are a complete and utter joke.
The RX570 won't even draw that much power if you know how to undervolt it. Both cards I've used in builds undervolted without a decrease in performance. Same for my RX580, undervolted it without a loss of performance.
Why does this shit still have a fucking DVI connector. Great card and I would totally buy it but I've got only DisplayPort monitors and guess what, this dumb piece of shit wastes half the bracket for a connector that has been dead for the last DECADE. Seriously who the fuck has a monitor with DVI? Last I saw this was on some Lenovo biz monitor that was still 4:3!
WTB: GTX1650 with 2x DP, 1x USB-C and 0x or 1x HDMI
I for one still have a high-resolution DVI monitor. I've got no good reason to replace my Dell U2711; it's 2560x1440 and can only be driven at that resolution by DisplayPort or dual-link DVI. Since I have multiple systems, and there are two DVI inputs and only one DisplayPort, it's useful if I can still connect to it via DVI. DisplayPort to dual-link DVI adapters are still ludicrously expensive, so they aren't an option. Since DVI is easily passively adapted to HDMI, it's not useless even if you don't want DVI, but you can't passively adapt HDMI or DisplayPort to dual-link DVI. Around the time I got this monitor there were also a few quite popular 2560x1440 options which were DVI-only, with no DisplayPort, so it's good that this card would still support them.
I do agree that DVI is a dead animal. DisplayPort is following suit though as the industry settles into HDMI so I think that's where we'll end up in a few years.
timecop1818 and PeachNCream... you two are complaining about DVI being on a video card in 2019, and saying it's a dead animal?? What about putting VGA ports on a NEW monitor in 2019? When was the last time a VGA port was on a new video card??? IMO, DVI is a lot more useful than VGA; instead of VGA on a monitor, add another HDMI or DisplayPort... along with a single DVI port. timecop1818, VGA has been dead for longer... and FYI, all my monitors have DVI, and are in use :-) If the last monitor you saw was a 4:3 Lenovo, then you haven't bought a new one, or haven't looked at a new monitor for a while...
DVI is far more common than DP. Far more common than HDMI on monitors, too. Besides, if you look at the offerings from Zotac and others, they often have very similar cards with different output options. So you're free to pick the one you like.
There have been premium GTX 1650's announced with prices higher than stock RX580's!
Sure the AMD card uses a lot of power, but performance wise it trounces the nVidia card.
System builders will like this card simply for being fully slot-powered; every consumer building their own system is better off going for AMD. Better performance, better price, and if you get the 8GB version, more future-proof even for low-end gaming.
Every time there's a review of an Nvidia GPU, AMD fans come crawling out of the woodwork. Just to set the record straight: I'm not an Nvidia fan nor an AMD fan. While I agree with all of the commenters here about Nvidia pricing, that this card should be around $120, I do not agree with most people's perceptions of performance. You can talk about performance per dollar all day long, but people like me see efficiency as better on the whole.

I don't have this card or the RX 560 to compare, but I do have a Zotac GTX 1060 GDDR5X 6GB (5-star ratings) bought for $200 from Newegg last month, and I have access to an RX 590 8GB which is currently priced at $220 (the cheapest). I was going for the 590 but there were several reasons that held me back. First, all the cheap 590s (less than $250) had terrible reviews (mostly 3 out of 5 stars). Second, I don't want a loud foot heater. Last but not least, I define performance based on efficiency. The RX 590 uses almost twice the power of the GTX 1060 but is only 17% faster. How can you call that better performance?

Obviously, AMD haven't got their act together. They need to come up with a better chip architecture and revamp everything. I had hopes for them but they fail me every time. Words from Lisa's mouth are cheap. Until AMD can actually deliver, Nvidia will keep selling their GPUs at whatever price they want.
I'm running The Division 2 on a marginally under-volted Nitro+ 590, and with a mixture of Ultra and some High settings at 1080p with a manually imposed frame cap of 75, I'm getting 60-75fps for a power consumption figure of 130-160W. The card is audible, but barely.
It's just the one title, but I really doubt that the 1060 can deliver anywhere close to that level of performance, and certainly not at about half the power.
No they can't. The higher tier RTX cards are not selling well because they're too expensive, and so is the 1650. You're some kind of delusional if you think Nvidia can charge whatever they want.
Spelling and grammar corrections (Only 2, good work):
"This is where a lot of NVIDIA's previously touted "25% bitrate savings" for Turing come from." Should be "comes": "This is where a lot of NVIDIA's previously touted "25% bitrate savings" for Turing comes from."
"Though the greater cooling requirements for a higher power card does means forgoing the small form factor." Extra s: "Though the greater cooling requirements for a higher power card does mean forgoing the small form factor."
I purchased the Zotac 1650 OC for Rs. 12,920 (USD 175.39) and later found out the 1650 Super is 30% faster than the 1650 and a measly 3-4% slower than the 1660! Returned it and got the 1650 Super Zotac for USD 192.75 (Rs. 14,199).
Marlin1975 - Friday, May 3, 2019 - link
Not a bad card, but it is a bad price.drexnx - Friday, May 3, 2019 - link
yep, but if you look at the die size, you can see that they're kinda stuck - huge generational die size increase vs GP107, and even RX570/580 are only 232mm2 compared to 200mm2.I can see how AMD can happily sell 570s for the same price since that design has been long paid for vs. Turing and the MFG costs shouldn't be much higher
Karmena - Tuesday, May 7, 2019 - link
Check the prices of RX570, they cost 120$ on newegg. And you can get one under 150$tarqsharq - Tuesday, May 7, 2019 - link
And the RX570's come with The Division 2 and World War Z right now.You can get the ASrock version with 8GB VRAM for only $139!
0ldman79 - Sunday, May 19, 2019 - link
Problem is on an OEM box you'll have to upgrade the PSU as well.Dealing with normies for customers, the good ones will understand, but most of them wouldn't have bought a crappy OEM box in the first place. Most normies will buy the 1650 alone.
AMD needs 570ish performance without the need for auxiliary power.
Yojimbo - Friday, May 3, 2019 - link
Depending on the amount of gaming done, it probably saves over 50 dollars in electricity costs over a 2 year period compared to the RX 570. Of course the 570 is a bit faster on average.JoeyJoJo123 - Friday, May 3, 2019 - link
Nobody in their right mind that's specifically on the market for an aftermarket GPU (a buying decision that comes about BECAUSE they're dissatisfied with the current framerate or performance of their existing, or lack of, a GPU) is making their primary purchasing decision on power savings alone. In other words, people aren't saying "Man, my ForkNight performance is good, but my power bills are too high! In order to remedy the exorbitant cost of my power bill, I'm going to go out and purchase a $150 GPU (which is more than 1 month of my power bill alone), even if it offers the same performance of my current GPU, just to save money on my power bill!"Someone might make that their primary purchasing decision for a power supply, because outside of being able to supply a given wattage for the system, the only thing that matters is its efficiency, and yes, over the long term higher efficiency PSUs are better built, last longer, and provide a justifiable hidden cost savings.
Lower power for the same performance at a similar enough price can be a tie-breaker between two competing options, but that's not the case here for the 1650. It has essentially outpriced itself from competing viably in the lower budget GPU market.
Yojimbo - Friday, May 3, 2019 - link
I don't know what you consider being in a right mind is, but anyone making a cost sensitive buying decision that is not considering total cost of ownership is not making his decision correctly. The electricity is not free unless one has some special arrangement. It will be paid for and it will reduce one's wealth and ability to make other purchases.logamaniac - Friday, May 3, 2019 - link
So I assume you measure the efficiency of the AC unit in your car and how it relates to your gas mileage over duration of ownership as well? since you're so worried about every calculation in making that buying decision?serpretetsky - Friday, May 3, 2019 - link
It doesn't really change the argument if he does or does not take into account his AC unit in his car. Electricity is not free. You can ignore the price of electricity if you want, but your decision to ignore it or not does not change the total cost of ownership. (I'm not defending the electricity calculations above, I haven't verified them)philehidiot - Friday, May 3, 2019 - link
Over here, it's quite routine for people to consider the efficiency cost of using AC in a car and whether it's more sensible to open the window... If you had a choice over a GTX1080 and Vega64 which perform nearly the same, assume they cost nearly the same, then you'd take into account one requires a small nuclear reactor to run whilst the other is probably more energy sipping than your current card. Also, some of us are on this thing called a budget. $50 saving is a weeks food shopping.JoeyJoJo123 - Friday, May 3, 2019 - link
Except your comment is exactly in line with what I said:"Lower power for the same performance at a similar enough price can be a tie-breaker between two competing options, but that's not the case here for the 1650"
I'm not saying power use of the GPU is irrelevant, I'm saying performance/price is ultimately more important. The RX 570 GPU is not only significantly cheaper, but it outperforms the GTX 1650 is most scenarios. Yes, the RX 570 does so by consuming more power, but it'd take 2 or so years of power bills (at least according to avg American power bill per month) to split an even cost with the GTX 1650, and even at that mark where the cost of ownership is equivalent, the RX 570 still has provided 2 years of consistently better performance, and will continue to offer better performance.
Absolutely, a GTX1080 is a smarter buy compared the to the Vega64 given the power consumption, but that's because power consumption was the tie breaker. The comparison wouldn't be as ideal for the GTX1080 if it costed 30% more than the Vega64, offered similar performance, but came with the long term promise of ~eventually~ paying for the upfront difference in cost with a reduction in power cost.
Again, the sheer majority of users on the market are looking for best performance/price, and the GTX1650 outpriced itself out of the market it should be competing with.
Oxford Guy - Saturday, May 4, 2019 - link
"it'd take 2 or so years of power bills (at least according to avg American power bill per month) to split an even cost with the GTX 1650, and even at that mark where the cost of ownership is equivalent, the RX 570 still has provided 2 years of consistently better performance, and will continue to offer better performance."This.
Plus, if people are so worried about power consumption maybe they should get some solar panels.
Yojimbo - Sunday, May 5, 2019 - link
Why in the world would you get solar panels? That would only increase the cost even more!Karmena - Tuesday, May 7, 2019 - link
So, you multiplied it once, why not multiply that value again. and make it 100$?Gigaplex - Sunday, May 5, 2019 - link
Kids living with their parents generally don't care about the power bill.gglaw - Sunday, May 5, 2019 - link
wrong on so many levels. If you find the highest cost electricity city in the US, plug in the most die hard gamer who plays only new games on max settings that runs GPU at 100% load at all times, and assume he plays more hours than most people work you might get close to those numbers. The sad kid who fits the above scenario games hard enough he would never choose to get such a bad card that is significantly slower than last gen's budget performers (RX 570 and GTX 1060 3GB). Kids in this scenario would not be calculating the nickels and dimes he's saving here and there - they'd would be getting the best card in their NOW budget without subtracting the quarter or so they might get back a week. You're trying to create a scenario that just doesn't exist. Super energy conscious people logging every penny of juice they spend don't game dozens of hours a week and would be nit-picky enough they would probably find settings to save that extra 2 cents a week so wouldn't even be running their GPU at 100% load.PeachNCream - Friday, May 3, 2019 - link
Total cost of ownership is a significant factor in any buying decision. Not only should one consider the electrical costs of a GPU, but indirect additional expenses such as air conditioning needs or reductions in heating costs offset by heat output along with the cost to upgrade at a later date based on the potential for dissatisfaction with future performance. Failing to consider those and other factors ignores important recurring expenses.Geranium - Saturday, May 4, 2019 - link
Then people need to buy Ryzen R7 2700X than i9 9900K. As 9900K use more power, runs hot so need more powerful cooler and powerful cooler use more current compared to a 2700X.nevcairiel - Saturday, May 4, 2019 - link
Not everyone puts as much value on cost as others. When discussing a budget product, it absolutely makes sense to consider, since you possibly wouldn't buy such a GPU if money was no object.But if someone buys a high-end CPU, the interests shift drastically, and as such, your logic makes no sense anymore. Plenty people buy the fastest not because its cheap, but because its the absolutely fastest.
PeachNCream - Tuesday, May 7, 2019 - link
Agreed with nevc on this one. When you start discussing higher end and higher cost components, consideration for power consumption comes off the proverbial table to a great extent because priority is naturally assigned moreso to performance than purchase price or electrical consumption and TCO.eek2121 - Friday, May 3, 2019 - link
Disclaimer, not done reading the article yet, but I saw your comment.Some people look for low wattage cards that don't require a power connector. These types of cards are particularly suited for MiniITX systems that may sit under the TV. The 750ti was super popular because of this. Having Turings HEVC video encode/decode is really handy. You can put together a nice small MiniITX with something like the Node 202 and it will handle media duties much better than other solutions.
CptnPenguin - Friday, May 3, 2019 - link
That would be great if it actually had the Turing HVEC encoder - it does not; it retains the Volta encoder for cost saving or some other Nvidia-Alone-Knows reason. (source: Hardware Unboxed and Gamer's Nexus).Anyone buying a 1650 and expecting to get the Turing video encoding hardware is in for a nasty surprise.
Oxford Guy - Saturday, May 4, 2019 - link
"That would be great if it actually had the Turing HVEC encoder - it does not; it retains the Volta encoder"Yeah, lack of B support stinks.
JoeyJoJo123 - Friday, May 3, 2019 - link
Or if you're going with a miniITX low wattage system, you can cut out the 75w GPU and just go with a 65w AMD Ryzen 2400G since the integrated Vega GPU is perfectly suitable for an HTPC type system. It'll save you way more money with that logic.0ldman79 - Sunday, May 19, 2019 - link
What they are going to do though is look at the fast GPU + PSU vs the slower GPU alone.People with OEM boxes are going to buy one part at a time. Trust me on this, it's frustrating, but it's consistent.
Gich - Friday, May 3, 2019 - link
25$ a year? So 7cents a day?7cents is more then 1kWh where I live.
Yojimbo - Friday, May 3, 2019 - link
The us average is a bit over 13 cents per kilowatt hour. But I made an error in the calculation and was way off. It's more like $15 over 2 years and not $50. Sorry.DanNeely - Friday, May 3, 2019 - link
That's for an average of 2h/day gaming. Bump it up to a hard core 6h/day and you get around $50/2 years. Or 2h/day but somewhere with obnoxiously expensive electricity like Hawaii or Germany.rhysiam - Saturday, May 4, 2019 - link
I'd just like to point out that if you've gamed for an average of 6h per day over 2 years with a 570 instead of a 1650, then you've also been enjoying 10% or so extra performance. That's more than 4000 hours of higher detail settings and/or frame rates. If people are trying to calculate the true "value" of a card, then I would argue that this extra performance over time, let's not forget the performance benefits!Yojimbo - Saturday, May 4, 2019 - link
That's true, and I noted that in my original post. But the important thing is that the price/performance comparison should consider the total cost of ownership of the card. Ultimately, the value of any particular increment in performance is a matter of personal preference, though it is possible for someone to make a poor choice because he doesn't understand the situation well.dmammar - Friday, May 3, 2019 - link
This power consumption electricity savings debate has gone on too long. The math is not hard - the annual electricity cost is equal to (Watts / 1,000) x (hours used per day) x (365 days / year) x (cost per kWh)In my area, electricity costs $0.115/kWh so a rather excessive (for me) 3 hours of gaming every day of the year means that an extra 100W power consumption equals only $12.50 higher electricity cost every year.
So for me, the electricity cost of the higher power consumption isn't even remotely important. I think most people are in the same boat, but run the numbers yourself and make your own decision. The only people who should care either live somewhere with expensive electricity and/or game way too much, in which case they should probably be using a better GPU.
Yojimbo - Saturday, May 4, 2019 - link
How is $12.50 a year not remotely important? Would you say a card costing $25 less is a big deal? If one costs $150 and the other is $175 you would not consider that to be at all a consideration to your purchase?OTG - Saturday, May 4, 2019 - link
How IS $12.50/year even worth thinking about? That's less than an hour of work for most people; it's like 3 cents a day. You could pay for it by finding pennies on the sidewalk!
PLUS you get much better performance! It's a faster card for a completely meaningless power increase.
If your PSU doesn't have a six pin, get the 1650 I guess, otherwise the price is kinda silly.
Yojimbo - Saturday, May 4, 2019 - link
I like the way you think. Whatever you buy, just buy it from me for $12.50 more than you could otherwise get it for, because it's just not worth thinking about. What you say would be entirely reasonable if it didn't apply to every single purchase you make. I mean, if a company comes along and says "Come on, buy this pen for $20. You're only going to buy one pen this year," would you do it? Do you ask the people who are saying NVIDIA's new cards are too expensive because they are $20 more expensive than the previous generation equivalents, "How is $10 a year even worth thinking about?" Hey, if you are willing to throw money out the window when it's for electricity but not for anything else, that's up to you, but you are making unreasonable decisions that harm yourself.
jardows2 - Monday, May 6, 2019 - link
Using your logic, why don't we all just save bunches of money by using Intel integrated graphics? Since the money we save on power usage is all that matters, we might as well make sure we are only using mobile CPUs as well. What you're paying for here is the improved gaming experience provided by the extra performance of the RX 570. For many people, the real-world improvement in the gaming experience is worth the relatively low cost of the energy usage. Realistically, the only reason to get one of these over the 570 is if your power supply cannot handle the RX 570.
Sushisamurai - Tuesday, May 7, 2019 - link
Holy crap man! The amount of electricity I spent reading this comment thread, and the number of keystrokes it consumed out of the 70 million clicks my mechanical keyboard is rated for, were totally worth the total cost of ownership of reading and replying to this.
OTG - Tuesday, May 7, 2019 - link
If you're pinching pennies that hard, you're probably better off not spending 4 hours a day gaming. Those games cost money, and you know what they say about time!
Maybe even set the card to mine when you're away; there are profits to be had even now.
WarlockOfOz - Saturday, May 4, 2019 - link
Anyone calculating the total ownership cost of a video card in cents per day should also consider that the slightly higher performance of the 570 may allow it to last a few more months before justifying replacement, allowing the purchase price to be spread over a longer period.
Yojimbo - Sunday, May 5, 2019 - link
"Anyone calculating the total ownership cost of a video card in cents per day should also consider that the slightly higher performance of the 570 may allow it to last a few more months before justifying replacement, allowing the purchase price to be spread over a longer period."Sure. Not that likely, though, because the difference isn't that great so what is more likely to affect the timing of upgrade is the card that becomes available. But at the moment, NVIDIA has a big gap between the 1650 and the 1660 so there aren't two more-efficient cards that bracket the 570 well from a price standpoint.
Of course, some people apparently don't care about $25 at all, so I don't understand why they would care about another $25 on top of that (for a total of $50) such that it would prevent them from getting a 1660, which has performance that blows the 570 out of the water and would be a lot more likely to play a factor in the timing of a future upgrade.
Gigaplex - Sunday, May 5, 2019 - link
I spend more than that on lunch most days.
Yojimbo - Sunday, May 5, 2019 - link
"I spend more than that on lunch most days."Economics is hard.
gglaw - Sunday, May 5, 2019 - link
At least you went through and acknowledged how horribly wrong the math was, so the entire initial premise is flawed. The $12.50 per year is also a very-high-case scenario that would rarely fit a hardcore gamer who cares about tiny amounts of power savings: it assumes 3 hours per day, 7 days a week, never missing a day of gaming, and that every single minute of this computer time is running the GPU at 100%. Even if you twist every number to match your claims it just doesn't pan out - period. The video cards being compared are not $25 apart. Energy-conscious adults who care that much about every penny they spend on electricity don't game hardcore 21 hours a week. If you use realistic numbers of 2-3h of game time 5 times a week, account for the fact that the GPUs are not constantly at 100% load, and assume a more realistic figure like 75% of max power usage on average, this results in a value much below $25 (which, again, is only half the price difference of the GPUs you're comparing). Using these more realistic numbers, it's closer to $8 per year in energy cost difference to own a superior card that delivers better gaming quality for over a thousand hours. If saving $8 is that big a deal to you at the cost of a worse gaming experience, then you're not really a gamer and probably don't care what card you're running. Just run a 2400G at 720p and low settings and call it a day. Playing the math game with blatantly wrong numbers doesn't validate the value of this card.
zodiacfml - Saturday, May 4, 2019 - link
Right. My calculation is a bit higher, with $0.12 per kWh but playing 8 hours a day, 365 days. I will take the RX 570 and undervolt it to reduce the consumption.
Yojimbo - Saturday, May 4, 2019 - link
Yes, good idea. Then you can get the performance of the 1650 for just a few more watts than the 1650.
eddieobscurant - Sunday, May 5, 2019 - link
No, it doesn't. It's about 25 dollars over a 2-year period, if you play for 8 hours a day, every day, for 2 years. If you're gaming less, or just browsing, the difference is way smaller.
spdragoo - Monday, May 6, 2019 - link
Per my last bill, I pay $0.0769 USD per kWh. So spending $50 USD means I've used 650.195 kWh, or 650,195 Wh. Comparing the power usage at full load, it looks like on average you save maybe 80W using the GTX 1650 vs. the RX 570 (75W at full power, 86W at idle, so call it 80W average). That means it takes me (650,195 Wh / 80W) = 8,127.44 hours of gaming to have "saved" that much power. In a 2-year period, assuming the average 365.25 days per year and 24 hours per day, there's a maximum of 17,532 hours available. The ratio of time spent gaming to total elapsed time needed to "save" that much power is therefore (8,127.44 / 17,532) = 46.36%... which equates to an average of 11.13 hours (call it 11 hours 8 minutes) of gaming ***per day***.

Now, ***MAYBE*** if I a) didn't have to work (or the equivalent, i.e. school) Monday through Friday, b) didn't have some minimum time to be social (i.e. spending time with my spouse), c) didn't have to take care of chores and errands (mowing the lawn, cleaning the house, grocery shopping, etc.), d) didn't take time for the other things that interest me besides PC gaming (reading books, watching movies and TV shows, taking vacations, going to Origins and comic book conventions, etc.), and e) had someone providing me a roof to live under, food to eat, and money to spend on said games and PC, I ****MIGHT**** be able to handle that kind of gaming schedule... but not only do I doubt that would happen, I would probably get very bored and sick of gaming (PC or otherwise) in short order.

Even someone who's more of an avid gamer and averages 4 hours of gaming per day, assuming their cost for electricity is the same as mine, will need to wait ***five to six years*** before they can say they saved $50 USD on their electric bill (or the cost of a single AAA game). But let's be honest; even avid gamers of that level are probably not going to be satisfied with a GTX 1650's performance (or even an RX 570's); they're going to want a 1070/1080/1080 Ti/2060/2070/2080 or similar GPU (depending on their other system specs). Or the machine rocking the GTX 1650 is their ***secondary*** gaming PC... and since even that is going to set them back a few hundred dollars to build, I seriously doubt they're going to quibble about saving maybe $1 a month on their electric bill.
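That break-even arithmetic is easy to replay. Here is a minimal Python sketch using the same inputs as the comment above; the 80W delta and the $0.0769/kWh rate are that commenter's assumptions, not verified measurements:

```python
rate_usd_per_kwh = 0.0769   # commenter's billed rate
delta_watts = 80            # assumed average RX 570 vs. GTX 1650 gap
target_usd = 50.0           # savings goal

kwh_needed = target_usd / rate_usd_per_kwh      # ~650.2 kWh
hours_needed = kwh_needed * 1000 / delta_watts  # ~8,127 hours of gaming
elapsed_hours = 2 * 365.25 * 24                 # hours in two years

print(hours_needed / elapsed_hours)  # ~0.464 -> ~46.4% of all elapsed time
print(hours_needed / (2 * 365.25))   # ~11.1 hours of gaming per day
```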
Foeketijn - Tuesday, May 7, 2019 - link
You need to game on average 4 hours per day to reach the 50 euros in two years. If gaming is that important to you, you might want to look at another video card.
Hixbot - Tuesday, May 7, 2019 - link
I think performance per watt is an important metric to consider, not because of money saved on electricity but because of less heat dumped into my case.
nathanddrews - Friday, May 3, 2019 - link
Yeah, sure seems like it. RX 570s have been pretty regularly $120 (4GB) to $150 (8GB) for the last five months. I'm guessing we'll see a 1650SE with 3GB for $109 soon enough (but it won't be labeled as such)...
schujj07 - Friday, May 3, 2019 - link
Pricing is even better right now for the RX 570. The 4GB starts at $130 and the 8GB starts at $140, whereas the cheapest GTX 1650 is $150. Unless you need a sub-75W GPU, there is no reason at all to buy the 1650, not when you can get 10-20% better performance for $10-20 less.
Death666Angel - Friday, May 3, 2019 - link
Seems like it. Although I do know some people who run Dell/HP refurbs from years ago (Core i5-750 or i7-860, maybe a Sandy Bridge if they are lucky) and need the 75W graphics card. They all still have GTX 750s. This may be a card to replace those, since the rest of the system still serves them fine. Otherwise, this is really kinda disappointing.
I still rock a GTX 960 2GB (from my HTPC, it has to be small), since I sold my 1080 when I saw that I played only a few hours each month. But I won't be upgrading to this. I'd rather get a 580 8GB or save more and get a 2060 that can last me for several years. Oh well, guess someone will buy it. And it'll end up in tons of off-the-shelf PCs.
SaturnusDK - Friday, May 3, 2019 - link
They don't need a 75W graphics card in an old refurb PC. What they desperately need is to replace the PSU with a modern 80+ certified one. The PSUs in those old OEM PCs are typically 220W-280W units with 75% maximum efficiency, and probably not over 70% with a 75W graphics card; AnandTech has tests of old OEM PSUs that show this. Replacing the PSU with a reasonably low-cost modern 80+ unit gets you at least 50% more power capacity, and they will generally be at or near 90% efficiency in the 40-50% load sweet spot, which is where they will sit in gaming with an RX 570, for instance.
So they can get a new PSU and an RX 570 for the same price, and have at least 15% better performance plus a quieter, more power-efficient system for the same money as if they had bought a 1650.
At $150, literally no one should even consider buying this. If the price were in the $100-$110 range it would be another matter; maybe even OK at $120. But at $150 it makes no sense for anyone to buy.
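To put rough numbers on the PSU-efficiency argument above, here is a small sketch; the 70% and 90% efficiency figures are this comment's estimates, and the ~180W and ~105W DC loads are hypothetical round numbers for an RX 570 system and a GTX 1650 system, not measurements:

```python
def wall_draw(dc_load_watts, efficiency):
    """Power pulled from the wall for a given DC load."""
    return dc_load_watts / efficiency

print(wall_draw(180, 0.70))  # old OEM PSU, RX 570 system:   ~257 W at the wall
print(wall_draw(180, 0.90))  # modern 80+ PSU, same system:  ~200 W at the wall
print(wall_draw(105, 0.70))  # old OEM PSU, GTX 1650 system: ~150 W at the wall

# Under these assumptions, the ~57 W saved by the PSU swap alone is in the
# same ballpark as the power gap between the two cards being argued about.
```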
PeachNCream - Friday, May 3, 2019 - link
The "with compromises" bit could also mean setting the resolution to 1600x900. Power and temps are okay for the performance offered. The typical Nvidia ego-induced, absent-competition Turing price premium isn't as terrible at the low end. However a ~30W replacement for the 1030 would be nice as it would likely fit on a half-height, single slot card.Flunk - Friday, May 3, 2019 - link
The name of this card is pretty confusing. A GTX 1650 being noticeably slower than a GTX 1060 despite being 590 numbers higher doesn't make much sense. Why didn't NVIDIA keep their naming to one scheme (the 2000 series) instead of having the GTX 16XX cards with confusing names?
serpretetsky - Friday, May 3, 2019 - link
The last two digits are the performance category; the more significant digits are the generation. It is strange that right now they basically have two generation numbers, 1600 and 2000. But a 50 being slower than a 60 is not too confusing (for me, anyway); they're different performance categories.
Death666Angel - Friday, May 3, 2019 - link
That makes no sense. The 2060 is slower than the 1080 Ti, but it is 980 "numbers" higher. A Core i3-8100 is slower than an i5 or i7 of an earlier generation (while being some 500 to thousands of "numbers" higher). Don't get me wrong, NVIDIA's naming scheme sucks. But not for the reason you stated.
guidryp - Friday, May 3, 2019 - link
@DeathAngel: Not sure what your problem is. 80 > 70 > 60 > 50 > 30, etc... But that obviously only applies within the current generation. When you compare to an older generation, a new x80 will be faster than an old x80, and so on.
It's about as logical as you can make it.
serpretetsky - Friday, May 3, 2019 - link
DeathAngel was replying to Flunk.
sor - Friday, May 3, 2019 - link
Of these low-to-mid cards, it looks like the 1660 is where it's at. ~70% more cores and ~70% more performance for ~40% more money. I know, they need to have tiers, but as far as value goes it's the better bang for the buck if you can scrape together a bit more cash.
onbquo - Friday, May 3, 2019 - link
Why is nobody talking about the coming 7nm Radeons mopping the floor in the 75W segment?
PeachNCream - Friday, May 3, 2019 - link
Because no one has been able to benchmark said graphics cards, so no one knows if something is going to mop floors or just draw polygons. (Personally, I'm in for a GPU that will mop my floors for me. I'd also like one that will mow the yard, wash the dishes, and take care of the laundry.)
onbquo - Friday, May 3, 2019 - link
Good point, but I seriously believe the next-architecture Radeon built on 7nm could perform almost twice as fast as an RX 560 with 1024 CUs. Am I the only one hyped for 7nm graphics cards?
guidryp - Friday, May 3, 2019 - link
You are making a pile of assumptions with no evidence. Process bumps aren't the big win that they once were. The Radeon VII is 7nm and it didn't get twice as fast; the RTX 2080 outperforms it while using less power.
7nm is NOT a magic bullet. We need to wait and see what actually happens.
Cooe - Friday, May 3, 2019 - link
More recent benchmarking actually shows the Radeon VII with the performance edge vs. the RTX 2080 (AMD just completely botched the launch drivers-wise, as isn't particularly uncommon for them), as many recent videos have shown. But you're totally passing over the fact that it uses the exact same Vega architecture as 14nm Vega 10, yet manages to outperform it by around 30% while pulling LESS power than a Vega 64. That's nearly a 40-50% boost in power efficiency per fps, with absolutely no architecture changes beyond 2x additional memory controllers. Even if Navi only matches that kind of efficiency bump vs. Polaris, it'll still be looking really good, as long as they maintain their performance advantage as well.
guidryp - Saturday, May 4, 2019 - link
Better in one or two AMD-favorable games, but not overall. Beating the power draw of a V64 is necessary, but it still doesn't come close to NVIDIA's power usage.
Oxford Guy - Saturday, May 4, 2019 - link
7nm TSMC isn't nearly as impressive as 5nm TSMC, which brings an 80% increase in density. 7nm is a little bit sad, really. But it saves companies money because it doesn't require nearly as much design rule modification, so porting existing 14nm designs is much easier.
PeachNCream - Tuesday, May 7, 2019 - link
I'm really looking forward to seeing what 7nm GPUs do once they hit the market, but I want to hold back on making judgements before we see what sorts of performance and power numbers emerge. I'm also more interested in mobile than desktop components, because I have not put together or purchased a desktop PC in the past 5 years; I find laptops and phones a better fit for my living space and lifestyle.
nevcairiel - Saturday, May 4, 2019 - link
Personally, the only reason I would ever care about a 75W card is for video duties - and AMD's video decoding/encoding is significantly worse than Intel's or NVIDIA's. So there is that. I would be excited if they were trying to make a high-end 7nm card that doesn't suck, but apparently it's once again just low-power cards. Same old, same old. I'm bored already.
Oxford Guy - Saturday, May 4, 2019 - link
"Personally, the only reason I would ever care about a 75W card is for video duties "Then the lack of B frame support in the encoder is a deal-breaker.
dromoxen - Monday, May 20, 2019 - link
It's the person currently mopping the floor who designed AMD's last generation of graphics cards. Another reason to buy this card is that you may not want the heat produced. I, for one, have started using a 10W NUC in preference to a 75W HTPC just because the heating effect is less. UK, not Jamaica or Saudi Arabia.
plonk420 - Friday, May 3, 2019 - link
Thanks for all the compute benches! Yuuuugely appreciated!
ads295 - Friday, May 3, 2019 - link
Can I use this to play ten-year-old games in full glory at 1440p?
Ryan Smith - Friday, May 3, 2019 - link
Easily. Heck, depending on the game, you could probably get away with doing that on an iGPU.
Ashinjuka - Saturday, May 4, 2019 - link
Probably not full-glory S.T.A.L.K.E.R. Definitely not full-glory S.T.A.L.K.E.R. with graphics mods.
SaturnusDK - Friday, May 3, 2019 - link
Quite frankly, at $150 no one, and I do mean no one, should buy this card. Even if you're refurbishing an old OEM system, the price difference up to an RX 570 lets you buy a decent 80+ certified power supply and have a system that is more powerful and probably more power efficient at the same time. A standard OEM PSU in an old computer is so inefficient that just replacing it makes up for more than the power consumption difference between a 1650 and an RX 570, and it gives you at least 15% more performance for the same amount of money spent.
Oxford Guy - Saturday, May 4, 2019 - link
I doubt anyone should have purchased the 960, and yet it's the 5th most popular Steam card. This place didn't even bother to review it.
RSAUser - Friday, May 3, 2019 - link
A 1060 costs the same as this 1650 here; I see no reason to buy it. Terrible value for money.
RSAUser - Friday, May 3, 2019 - link
You can't compare the 1650 to the 950; they were priced completely differently at launch. Stop going directly by the product number. The 1650 is priced between the 960 and 970.
linuxgeex - Friday, May 3, 2019 - link
"Notably, B-frames incorporate information about both the frame before them and the frame after them, allowing for greater space savings versus simpler uni-directional P-frames."No. H.264 and H.265 (AVC/HEVC) have (optional) bi-directional P-Frames. That increases the complexity of the search required to create a B-Frame which would use significantly less data than a P-Frame. A lower-capability GPU may not be able to perform that search in real time, and in that case there's no point implementing it, even if it would increase compression efficiency, because the selling point of hardware HEVC compression is that it can be done in real time.
B-Frames are simpler than P-Frames. Not the other way around.
To be clear: I-Frames are effectively a still shot of the scene, like a JPEG.
P-Frames hold motion data with references to I-Frames and P-Frames: they encode linear motion for blocks in the image, and they encode replacement blocks for new data needed to replace changes, i.e. when something moves over a background and reveals what was behind it.
If B-Frames are used, then intermediate frames are calculated between the P-Frames and their references based on their encoded block motion data. These result in what are called "tweens" in animation - images that are partway between a start and an end. The B-Frames encode small fixes for errors in the guessed (by linear interpolation) intermediate frames. The less motion there is, and the more linear the motion is, the more accurate the interpolated frames are, and the more B-Frames you can have between P-Frames before the B-Frames become necessarily larger than a new P-Frame would have been. Generating those B-Frames and estimating / discarding them based on whether they can be as efficient as the P-Frames is a lot of work, even when the P-Frames don't have bidirectional references. HEVC allows for more than just bidirectional (2-frame) motion prediction references: it allows a P-Frame to inherit any other P-Frame's motion references, and it allows P-Frames to target a B-Frame for motion estimation. That introduces an order of magnitude more search possibilities than H.264/AVC. HEVC with B-Frames disabled basically performs at a similar efficiency to AVC, because all those options are off the table.
nevcairiel - Saturday, May 4, 2019 - link
A P-frame (Predictive frame) by definition predicts in only one direction - backwards. B-frames (Bi-directional predictive frames) are allowed to predict in both directions. This is an important distinction because it matters in which order those frames are put into the encoded video: "future" frames of course need to be sent first, or you can't use them for prediction. That's where patterns like "IPBBB" come from. You start with a single I-frame, a single P-frame referencing that I-frame (the P might be shown after some B-frames), and then an array of B-frames that reference both the I- and P-frames - and possibly each other.
P and B frames are otherwise identical in how they work. Both contain motion vectors and entropy data to correct the interpolation.
Also note that H.264 already supported up to 16 reference frames for interpolation. It's called bi-directional not because it's two frames, but because it's two directions: past and future.
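To make the reordering concrete, here is a tiny illustrative sketch; the frame labels and the IPBB pattern are made up for the example and simplify what real encoders do:

```python
display_order = ["I0", "B1", "B2", "P3", "B4", "B5", "P6"]

def decode_order(frames):
    """Reorder so each B frame's forward reference is sent before it."""
    out, pending_b = [], []
    for f in frames:
        if f.startswith("B"):
            pending_b.append(f)    # hold B frames until the next I/P frame
        else:
            out.append(f)          # send the reference frame first...
            out.extend(pending_b)  # ...then the B frames that lean on it
            pending_b = []
    return out + pending_b

print(decode_order(display_order))
# ['I0', 'P3', 'B1', 'B2', 'P6', 'B4', 'B5']
```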
Opencg - Friday, May 3, 2019 - link
Please include fortnight average fps over 10-hour playtime. For all cards, all on the same patch. Thx.
Bulat Ziganshin - Friday, May 3, 2019 - link
The "NVIDIA is holding back a bit" part is duplicated on pages 1 and 2Ryan Smith - Friday, May 3, 2019 - link
Whoops. That was meant to get excised when I rearranged the article. Thanks!
eva02langley - Friday, May 3, 2019 - link
This card shouldn't exist. The Radeon VII made sense because it was cheaper than a 2080; this, however, is more expensive than an RX 570... AND WEAKER!
Oxford Guy - Saturday, May 4, 2019 - link
It apparently exists for the GTX 960 buyers (the people who don't do their homework).
eek2121 - Friday, May 3, 2019 - link
In before the 1650 Ti. ;)
AshlayW - Friday, May 3, 2019 - link
Wow. This card makes no sense. Go watch Hardware Unboxed's video where he conveniently shoots down the "power efficiency" argument. It's a load of rubbish; there is absolutely no reason to buy this card over even the 4GB 570 for any new gaming build. This review tried so hard to paint this turd in a positive light, continually underscoring AMD's "technological disadvantages" and "thin profit margin". P20 isn't even that much bigger than TU117, either. I'm sorry, I just feel it is too friendly to NVIDIA and doesn't criticize this terrible product pricing enough. The RX 570 8GB Pulse from Sapphire is cooler running, quieter, vastly higher build quality, >10% faster, has twice the VRAM, and has a 135W board power, which is perfectly fine even for potato OEM builds anyway.
Seriously, drop Ty efficiency arguy. This card is DOA at 149 because 570 killed it.
A 1024-CC card at 130 bucks would've been passable, not this joke.
AshlayW - Friday, May 3, 2019 - link
The 570 8GB Pulse is also the same price or cheaper than the 1650, at least here in the UK. Forgot to mention that important point.
AshlayW - Friday, May 3, 2019 - link
Typos, as I'm on my phone and I have fat fingers. Should read: "drop the efficiency argument"
dr.denton - Saturday, May 4, 2019 - link
I honestly thought you were doing a weird "ye olde" kind of thing there. Thanks for clearing that up :D
Oxford Guy - Saturday, May 4, 2019 - link
"Wow. This card makes no sense."It makes perfect sense for Nvidia. The corporation's goal is to sell the lowest-quality product for the most money possible. Nvidia has, can, and does, rely on its brand mindshare to overcome deficits in its products at various price points, especially around the lower end. In other words, people buy it because it's painted green.
eva02langley - Sunday, May 5, 2019 - link
I don't believe this trend is going to keep going. Everyone is now checking reviews online before making their choice. No way this will pass like butter in a pan.
cfenton - Sunday, May 5, 2019 - link
"RX570 8GB pulse, fro sapphire is cooler running, quieter, vastly higher build quality, >10% faster, twice the vram and 135W board power, which is perfectly fine even for potato OEM builds anyway."Not if the OEM build doesn't have a 6-pin PCIE cable. If you're building you own computer, then I agree that the 570 is a much better choice. However, if you want to do a quick upgrade to an older OEM system running a 750TI without a 6-pin, then the 1650 makes sense.
nunya112 - Friday, May 3, 2019 - link
Wow, what a terrible product. A 570 beats it, and the price highlights that problem.
SolarAxix - Saturday, May 4, 2019 - link
Many of my son's friends have lease-return desktop PCs their parents bought at a good price (i5/i7 2xxx to 4xxx) along with a 720p or 1080p LCD screen (usually less than a $300 investment), many with SSDs. Most of them use the iGPU (a few have a low-end NVIDIA Quadro or AMD FirePro PCIe GPU). They want to be able to game at 720p/1080p with their friends, and it usually doesn't cut it because of the iGPU or the weak PCIe GPU. When it comes to upgrading the GPU, the drawbacks of these systems are the lack of a 6-pin PCIe connector from the power supply and lackluster power supplies in general, which can't be easily upgraded. In the past, I've recommended they get a 1050, and they've been very happy with their purchase along with a quick 10-minute upgrade. I can see the 1650 being what I'd recommend to them in the future if it fits their budget.
I'm with most of you, though: if you have a system that can handle a 570, that is a much better choice.
It would be interesting to see how big the market is for 75W GPUs, driven by desktop PCs which can't handle anything more than that (which has nothing to do with saving power on someone's electric bill).
Oxford Guy - Saturday, May 4, 2019 - link
If one has so little money that one has to do a significant amount of PC gaming on a machine that can't handle more than a 75W GPU, perhaps it's time to reconsider spending that time on PC gaming. It seems like it would be a better idea to buy a used GPU and a cheap, but decent, PSU.
cfenton - Sunday, May 5, 2019 - link
Swapping out a GPU is relatively simple. Swapping out a power supply, especially in an OEM system with a custom power supply, is much more involved.
yannigr2 - Saturday, May 4, 2019 - link
This is by far the most 1650-friendly review I have seen online. I mean, the choice of words... it's almost like someone is trying not to spoil his resume. Also, it is the only review where AMD looks desperate, while it is a huge question mark how long it will be willing to defend its position with the RX 570 in the market. If only there were new cards coming from them in a couple of months, but I guess they don't have anything in preparation.
Oxford Guy - Saturday, May 4, 2019 - link
Polaris is such an old architecture, and it was very cheap to buy years ago, prior to the crypto surge. For it to be so competitive against the wealthiest GPU company's latest design is a sad statement about the quality of competition in the PC gaming space. Duopoly competition isn't good enough. If there were high-quality competition happening, no company would be able to get away with putting overpriced turkeys into the market.
eva02langley - Sunday, May 5, 2019 - link
Hey, Turing is a joke. The only thing Turing brought is a different price bracket. Nvidia took two and a half years before releasing Turing... so I don't see the age of Polaris as an issue when new cards are coming in a couple of months.
Ryan Smith - Saturday, May 4, 2019 - link
"This is by far the most 1650 friendly review I have seen online."Having finally read the other GTX 1650 reviews (I don't read them beforehand, to avoid coloring my own video card reviews), I agree with you on that. Still, I stand by my article.
AMD is by no means desperate here. But they are willing to take thinner profit margins than NVIDIA does. And that creates all kinds of glorious havoc in the sub-$200 video card market.
No one card wins in all categories here; one has better performance, another has better power efficiency. So it makes things a little more interesting for buyers as they now need to consider what they are using a card for - and what attributes they value the most.
Next to the GTX 1650, the RX 570 is really kind of a lumbering beast. The power consumption difference for the 11% performance advantage is quite high. But at the end of the day it's still 11% faster for the same price, so if you're buying on a pure price/performance basis, then it's an easy call to make.
As for Navi, AMD will eventually have a successor of some sort for Polaris 11. However I'm not expecting it in Q3; AMD normally launches the low-end stuff later.
eva02langley - Sunday, May 5, 2019 - link
You can stand by your article, but that doesn't mean you are right because of it. You are living in LALA land, Ryan, for even believing that a 75W difference is important. It would be important if the cards were of the same performance at a similar price... but they aren't. At this point, you could probably undervolt the RX 570 pretty close to the 1650 if that were soooo important...
I made the calculation that it is going to cost you $15-20 of power per year for playing 4 hours per day. You cannot defend this. It is insanity.
https://www.youtube.com/watch?v=um63-_YPNcA
https://youtu.be/WTaSIG5Z-HM
yannigr2 - Thursday, May 9, 2019 - link
AMD, in all these last years, has been trying to defend its position with smaller profit margins. It's not something it is doing only now, and not something it is doing only with the RX 570, such that we should question its ability to maintain this price. One other thing: while in the review the GTX 1650 is tested against the 4GB RX 570, when there is something to be said about pricing and profit margins and AMD's ability to keep selling the RX 570 under $150, the 8GB model of the RX 580 is used. No mention of the much cheaper 4GB version that is used in the review.
At the end of the day, the RX 570 is not 11% faster for the same price. It's 11% faster for $30 less, and the only question is whether the GTX 1650's power efficiency and a couple of other features are enough to justify the loss of 11% performance (or more, if the RX 570 model tested was not overclocked) and a significantly (for this price range) higher price tag.
And no, we can't assume that in the near future AMD's prices will just jump 20% to make the GTX 1650 less of an expensive card, especially with Navi not far away, meaning that older hardware will have to be sold off to make room for the new models, or stay at those low prices so as not to interfere with newer Navi models that could come in at $200 and up.
yannigr2 - Thursday, May 9, 2019 - link
EDIT - clarification: In many tests, there are scores for the RX 570 4GB and not for the 8GB model.
catavalon21 - Saturday, May 4, 2019 - link
In the 1660 and 1660 Ti reviews, the RX 570 wasn't included; however, the RX 590 and RX 580 are shown taking 201 and 222 seconds respectively to complete the V-Ray benchmark 1.0.8, whereas this chart shows the RX 570 taking only 153 seconds. The GTX 1660 is shown taking 109 seconds in both that chart and this one. Since the 570 typically falls short of its 580/590 siblings, how did it manage to stomp them in this benchmark?
https://www.anandtech.com/show/14071/nvidia-gtx-16...
GreenReaper - Saturday, May 4, 2019 - link
I think this is a reasonable review. Using twice the power at maximum load is not an insignificant factor over the life of the card. But it depends on whether the additional heat is a cost or just means you can run your heating less, how often you game, who is paying for your power, etc. Then there are factors such as Linux source driver support, which may or may not matter for a particular person. If pressed, I'd get the RX 570 in a heartbeat, but maybe not if I wanted to put it in my microserver (admittedly, I'd also need a low-profile card for that). But I'd rather wait for Navi in an APU. :-)
Koenig168 - Saturday, May 4, 2019 - link
The article tries too hard to make NVIDIA look good despite the GTX 1650 being inferior in performance compared to the RX 570 and overpriced for what it is offering.
Oxford Guy - Saturday, May 4, 2019 - link
The last time I remember any major tech news site giving NVIDIA any grief was in the Fermi days, with the 480 and especially the 465. As bad as the 480 was, people still bragged about their triple-480 SLI systems, and 480 dual SLI was routinely featured in benchmarks.
Haawser - Sunday, May 5, 2019 - link
The 4GB RX 570 is $130, not $150, and it beats the 4GB 1650 out of sight. It also only draws ~120W, which is not a lot, seeing as the majority of 1650s (i.e. those with a 6-pin) draw ~90W anyway. The actual 75W 1650s should be $99, and the rest shouldn't even exist, because at $150-160 they are a complete and utter joke.
Znaak - Monday, May 6, 2019 - link
The RX 570 won't even draw that much power if you know how to undervolt it. Both cards I've used in builds undervolted without a decrease in performance. Same for my RX 580: undervolted without a loss of performance.
timecop1818 - Sunday, May 5, 2019 - link
Why does this shit still have a fucking DVI connector? Great card and I would totally buy it, but I've got only DisplayPort monitors, and guess what, this dumb piece of shit wastes half the bracket on a connector that has been dead for the last DECADE. Seriously, who the fuck has a monitor with DVI? Last I saw one was on some Lenovo biz monitor that was still 4:3!
WTB: GTX 1650 with 2x DP, 1x USB-C, and 0x or 1x HDMI
mm0zct - Monday, May 6, 2019 - link
I for one still have a high-resolution DVI monitor. I've got no good reason to replace my Dell U2711; it's 2560x1440 and can only be driven at that resolution by DisplayPort or dual-link DVI. Since I have multiple systems, and there are two DVI inputs and only one DisplayPort, it's useful if I can still connect to it via DVI. DisplayPort to dual-link DVI adapters are still ludicrously expensive, so they aren't an option. Since DVI is easily passively adapted to HDMI, it's not useless even if you don't want DVI, but you can't passively adapt HDMI or DisplayPort to dual-link DVI. Around the time I got this monitor there were also a few quite popular 2560x1440 options which were DVI-only, with no DisplayPort, so it's good that this card would still support them.
PeachNCream - Monday, May 6, 2019 - link
I do agree that DVI is a dead animal. DisplayPort is following suit, though, as the industry settles on HDMI, so I think that's where we'll end up in a few years.
Korguz - Monday, May 6, 2019 - link
timecop1818 and PeachNCream: you two are complaining about DVI being on a video card in 2019 and calling it a dead animal? What about putting VGA ports on a NEW monitor in 2019? When was the last time a VGA port was on a new video card? IMO, DVI is a lot more useful than VGA; instead of VGA on a monitor, add another HDMI or DisplayPort along with a single DVI port. timecop1818: VGA has been dead far longer... and FYI, all my monitors have DVI and are in use :-) If the last monitor you saw was a 4:3 Lenovo, then you haven't bought a new one, or haven't looked at new monitors for a while...
Xyler94 - Thursday, May 9, 2019 - link
To be fair, on the VGA front you'd be very hard-pressed to find a server with anything but a VGA port. Only consumer/gaming products aren't using VGA.
Calista - Monday, May 13, 2019 - link
DVI is far more common than DP, and far more common than HDMI on monitors. Besides, if you look at the offerings from Zotac and others, they often have very similar cards with different output options, so you're free to pick the one you like.
Znaak - Monday, May 6, 2019 - link
There have been premium GTX 1650s announced with prices higher than stock RX 580s! Sure, the AMD card uses a lot of power, but performance-wise it trounces the NVIDIA card.
System builders will like this card simply for being fully slot-powered; every consumer building a system is better off going with AMD. Better performance, better price, and if you get the 8GB version, more future-proof even for low-end gaming.
sonny73n - Monday, May 6, 2019 - link
Every time there's a review of an Nvidia GPU, AMD fans come crawling out of the woodwork. Just to set the record straight: I'm not an Nvidia fan nor an AMD fan. While I agree with the commenters here about Nvidia's pricing, that this card should be around $120, I do not agree with most people's perceptions of performance. You can talk about performance per dollar all day long, but people like me see efficiency as better on the whole. I don't have this card or the RX 560 to compare, but I do have a Zotac GTX 1060 GDDR5X 6GB (5-star ratings) bought for $200 from Newegg last month, and I have access to an RX 590 8GB, which is currently priced at $220 (the cheapest). I was going for the 590, but several reasons held me back. First, all the cheap 590s (less than $250) had terrible reviews (mostly 3 out of 5 stars). Second, I don't want a loud foot heater. Last but not least, I define performance based on efficiency. The RX 590 uses almost twice the power of the GTX 1060 but is only 17% faster. How can you call that better performance? Obviously, AMD haven't got their act together. They need to come up with a better chip architecture and revamp everything. I had hopes for them, but they fail me every time. Words from Lisa's mouth are cheap. Until AMD can actually deliver, Nvidia will keep selling their GPUs at whatever price they want.
silverblue - Tuesday, May 7, 2019 - link
I'm running The Division 2 on a marginally undervolted Nitro+ 590, and with a mixture of Ultra and some High settings at 1080p with a manually imposed frame cap of 75, I'm getting 60-75 fps at a power consumption figure of 130-160W. The card is audible, but barely. It's just the one title, but I really doubt that the 1060 can deliver anywhere close to that level of performance, and certainly not at about half the power.
Haawser - Thursday, May 9, 2019 - link
Haawser - Thursday, May 9, 2019 - link
No they can't. The higher-tier RTX cards are not selling well because they're too expensive, and neither is the 1650. You're some kind of delusional if you think Nvidia can charge whatever they want.
ballsystemlord - Thursday, May 9, 2019 - link
Spelling and grammar corrections (only 2, good work):
"This is where a lot of NVIDIA's previously touted "25% bitrate savings" for Turing come from."
Should be "comes":
"This is where a lot of NVIDIA's previously touted "25% bitrate savings" for Turing comes from."
"Though the greater cooling requirements for a higher power card does means forgoing the small form factor."
Extra s:
"Though the greater cooling requirements for a higher power card does mean forgoing the small form factor."
pcgpus - Saturday, October 5, 2019 - link
Interesting review, but the GTX 1650 is too expensive compared to the RX 570 (and the RX has better performance). If you want to see more results, check this link (results from a few sites at 3 resolutions and in 21 games):
https://warmbit.blogspot.com/2019/10/gtx1650-vs-gt...
To translate, just use Google Translate from the right side of the site.
Rockfella.Killswitch - Tuesday, October 27, 2020 - link
I purchased the Zotac 1650 OC for Rs. 12,920 (USD 175.39) and later found out the 1650 Super is 30% faster than the 1650 and a measly 3-4% slower than the 1660! Returned it and got the Zotac 1650 Super for USD 192.75 (Rs. 14,199).