31 Comments
Vitor - Tuesday, April 23, 2019 - link
Too many compromises; this misses the sweet spot. 1440p 120hz OLED would be the sweet spot for me.
jeremyshaw - Tuesday, April 23, 2019 - link
Does such a panel exist?
Vitor - Tuesday, April 23, 2019 - link
Good question, but it's totally feasible.
skavi - Tuesday, April 23, 2019 - link
Razer isn't going to make a custom panel. They don't have the market share or capital for that.
princepwnage - Wednesday, April 24, 2019 - link
Yes, the LG C9 (well, all of LG's 2019 TV lineup, including the NanoCell IPS TVs) has HDMI 2.1. You can run them at 1080p @ 120Hz in PC mode and at 1440p @ 120Hz.
KateH - Wednesday, April 24, 2019 - link
that's not the same as a bare 15" panel that goes into a laptop though. Comparing Apples to Apple Pie.
PeachNCream - Tuesday, April 23, 2019 - link
They don't have Killer NICs, so they're doing something right at least.
ZoZo - Tuesday, April 23, 2019 - link
"1440p 120hz OLED would be the sweet spot for me."
That. 1000 times that.
princepwnage - Wednesday, April 24, 2019 - link
Yes, the LG C9 OLED TV (well, all of LG's 2019 TV lineup, including the NanoCell IPS TVs) has HDMI 2.1. You can run them at 1080p @ 120Hz in PC mode and 1440p @ 120Hz, so we are only waiting on HDMI 2.1 video cards to arrive before we can run the TVs at 4K 120Hz with 4:4:4 chroma. The response time is 12ms on the 2019 sets.
FullmetalTitan - Wednesday, April 24, 2019 - link
The driver ICs for a 65" TV and the hardware needed to run a 15" laptop screen are COMPLETELY DIFFERENT. Stop comparing TV panels to laptops or monitors, and especially stop price comparing them.
Kevin G - Tuesday, April 23, 2019 - link
It would be nice to have an option for a second battery in that 2.5" bay. Every little bit helps.
skavi - Tuesday, April 23, 2019 - link
That's basically what the Advanced version does. Swaps out the bay for more battery space.
willis936 - Tuesday, April 23, 2019 - link
Yeah. Unfortunately it's still only an 80 Wh battery instead of going right up to the 100 Wh limit like the XPS 15 does. I get it. Weight is a big deal and 5 pounds is a hard cutoff for me.

Still, this laptop makes a lot of changes in the right direction. I'd like to see GbE brought to the higher end model so there's one less thing to dongle. The CPU update is healthy. A 400 MHz bump in boost clock is nothing to sneeze at with tools like ThrottleStop around. The monitors are also great to see. I personally got an XPS 15 recently. The base i7-8750H model on sale was $1100, and another $400 got 32 GB of RAM and a 1 TB SSD. Thermal mods were also necessary, but it's hard to find a better price/performance deal in a premium package.
BikeDude - Wednesday, April 24, 2019 - link
Yes. There is a sweet spot between the base and advanced Razer models here: GbE, 32GB memory, and a bigger battery. The 2.5" drive bay is redundant, and the lack of GbE makes me a sad panda.
My Dell Precision 5510 is two years old. It came with a 4-core CPU (I'd like 6 cores), 16GB memory (I need 32) and a 500GB NVMe drive (I could use 1TB). And it has that ridiculous GbE dongle (in constant use and always in the way). A better GPU would be most welcome as well.
Prestissimo - Thursday, May 2, 2019 - link
The 100 Wh airline regulation is per single battery, so you could have more than one battery in a laptop and exceed 100 Wh in total. This has been done several times before in certain business-grade laptops with dual-battery systems.
ratbert1 - Tuesday, April 23, 2019 - link
Proofread much?
GreenReaper - Tuesday, April 23, 2019 - link
If the 200-230W power adapter is truly "compact", it's likely to get quite hot as well. Not quite sure what they're thinking with removing GbE, since you'd think latency would be a concern for gamers, and Wi-Fi is a contended medium. Perhaps the expectation is that they'll use Thunderbolt docks?
jeremyshaw - Tuesday, April 23, 2019 - link
The launch RB15 never had GbE. The “Base” model (with GbE) came after the retroactively renamed “Advanced” model.
Valantar - Tuesday, April 23, 2019 - link
If they've found a GaN adapter it might indeed be compact. If not, it's likely just slightly smaller than the average huge brick of an adapter. GaN will save us all from massive power bricks, and it will be glorious.

Also, they should really have included a USB Ethernet adapter here. For that price, it's a no-brainer.
DanNeely - Tuesday, April 23, 2019 - link
"Since the Blade 15 Advanced SKUs are designed for power users that tend to be less conservative yet more demanding than mainstream gamers, the PCs deprecate GbE,"

Put this way, it sounds like an exceptionally stupid decision; higher-end gamers are more likely than casual users to want a wired connection for better QoS than Wi-Fi offers.
Valantar - Tuesday, April 23, 2019 - link
240Hz, but no G-Sync? That sounds ... juddery. Unless all you're doing is playing esports games, that is, or you _love_ adjusting your refresh rate every time you start a new game.
willis936 - Tuesday, April 23, 2019 - link
Juddery? areYouSureAboutThatJohnCena.avi

Also, lower your settings. G-Sync is an implementation; it doesn't change the rules and limitations. It is maybe 15% better for smoother frame rates when the frame rate is already way, way too low, while adding 15% more latency.
KateH - Wednesday, April 24, 2019 - link
"Juddery"
I caught that too...
I think I'd take *stuttery* gameplay over my computer suddenly violently shaking midgame ;-)
TheWereCat - Wednesday, April 24, 2019 - link
GSync/FreeSync don't add latency; the higher latency is a result of the lower frame rate.

And Valantar is right. There's no way that a laptop CPU can handle 240Hz games without massive swings in FPS. GSync/FreeSync is a must.
willis936 - Wednesday, April 24, 2019 - link
Adaptive refresh rate technologies add about as much latency as proper triple buffering (0-1 frames). Vsync off will always be the lowest latency option.

If you can’t drive 240 fps you should still be driving 100+ fps. Frame rate fluctuations above 100 fps are hardly an issue compared to frame rate fluctuations below 60 fps. Adaptive refresh rate technologies were invented to polish turds, because that’s where they shine. At the end of the day, if you have infinite fps all of these solutions are equivalent, so why not just turn down your settings and get higher frame rates and lower input latency if you care about it?
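As a rough sanity check on the "0-1 frames" figure being argued here, a worst-case one frame of added sync latency can be converted into milliseconds at a few refresh rates. This is back-of-the-envelope arithmetic, not a measurement, and the refresh rates are just illustrative:

```python
# Sketch: express a worst-case "1 frame" of added sync latency in
# milliseconds at various refresh rates. Illustrative numbers only.

def frame_time_ms(refresh_hz: float) -> float:
    """Duration of a single frame at the given refresh rate, in ms."""
    return 1000.0 / refresh_hz

for hz in (60, 100, 144, 240):
    # Up to one extra frame of latency in the worst case.
    print(f"{hz:>3} Hz: frame time {frame_time_ms(hz):5.2f} ms, "
          f"worst-case sync penalty ~{frame_time_ms(hz):5.2f} ms")
```

At 60 Hz that worst case is about 16.7 ms; at 240 Hz it shrinks to about 4.2 ms, which is why the penalty matters less on high-refresh panels.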
TheWereCat - Wednesday, April 24, 2019 - link
How do you measure latency in frames by saying "0-1 frame"? One frame at 60Hz is 16.66ms, which means something very different from one frame at 240Hz, which is 4.16ms.

That's why I said the latency is a result of the frame rate, since the latency will change based on whatever refresh rate GSync or FreeSync switches to at a moment's notice.

You would have to have a perfect frame rate capped at X with no fluctuations at all, and test with and without GSync or FreeSync, to even be able to measure it. That is very difficult at high frame rates, so the tests usually become very flawed above 60FPS.
And you can't just lower settings to get better performance if you aim for 240Hz gaming. You need CPU power to push the frames, not just GPU, and you are 100% capped by the CPU in every laptop that exists, since they can't boost high enough due to clock and thermal constraints or power limits. Even assuming you have infinite GPU power, your FPS will still swing like hell because of the CPU.

Sure, you may be able to play a few of the eSports titles just fine, but not all of them.
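The fluctuation argument in this exchange is easy to put in numbers: the same proportional fps drop costs far more frame time at low frame rates than at high ones. A minimal sketch, where the frame-rate pairs are illustrative assumptions rather than measurements:

```python
# Sketch: compare the frame-time jump caused by the same 25% fps drop
# at high vs low frame rates. Frame-rate pairs are illustrative only.

def frame_time_ms(fps: float) -> float:
    """Duration of a single frame at the given frame rate, in ms."""
    return 1000.0 / fps

def swing_ms(high_fps: float, low_fps: float) -> float:
    """Frame-time increase when fps swings from high_fps down to low_fps."""
    return frame_time_ms(low_fps) - frame_time_ms(high_fps)

print(f"240 -> 180 fps: +{swing_ms(240, 180):.2f} ms per frame")
print(f" 60 ->  45 fps: +{swing_ms(60, 45):.2f} ms per frame")
```

A swing from 240 to 180 fps adds roughly 1.4 ms per frame, while the same 25% drop from 60 to 45 fps adds roughly 5.6 ms, which is the quantitative core of the disagreement about where adaptive sync helps most.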
willis936 - Wednesday, April 24, 2019 - link
I said the latency was 0-1 frames because that is what it is. Frame times are shorter with higher refresh rates. What is the part that you are contesting? On average you will get lower input latency with vsync off. Input latency fluctuations become less important with shorter frame times. If you can't consistently drive 60+ fps then you should cap the frame rate. Adaptive refresh rates won't solve that problem.

The main point I'm making is that adaptive refresh rate technologies aren't the be-all end-all solution, and tbh a higher refresh rate panel is a better solution to the same set of problems that a panel with an adaptive refresh rate would solve.
Brett Howse - Wednesday, April 24, 2019 - link
Brett Howse - Wednesday, April 24, 2019 - link
The Blade uses Optimus, which you can't have with G-SYNC, unfortunately.
KateH - Wednesday, April 24, 2019 - link
speaking as someone who basically has to do this (HDMI FreeSync monitor + GeForce... yes, I knew what I was getting into when I purchased), setting refresh rates for each game isn't so bad. It's just another setting to change when dialing in the optimal config for a newly-installed game. And then the game remembers, just like any other setting, and that's it.

At least we get to adjust refresh rates again! I grew up gaming on CRTs getting to be choosy about refresh rates, only to spend the past decade-and-a-half stuck on 60Hz LCDs with few options other than overclocking or $$$ panels.
oRAirwolf - Wednesday, April 24, 2019 - link
My XPS 15 was stolen recently and I have been waiting for the new OLED version to come out before buying a new one. This Razer might be a suitable replacement. I'm just not sure about Razer's reliability, based on my anecdotal personal experience with Razer mice and LTT's RMA track record with their laptops.
s.yu - Tuesday, April 30, 2019 - link
IMO, with a full-sized SD reader and a 32GB option, the advanced model in white would be perfect. Literal perfection.