20 Comments


  • DanNeely - Tuesday, May 22, 2018 - link

    Looking at Razer's website, I think this is replacing the 14" model, not slotting in above it. Their laptop page only lists 13", 15", and 17" models, and the 13" model is just a Razer-styled ultrabook, not a light gaming model with an MX150 or GT 1050 GPU.

    Sadly, the 4K display appears to only be available in the most expensive config, which starts at $2899 with a 1070, 32GB of RAM, and a 512GB SSD.
  • Valantar - Tuesday, May 22, 2018 - link

    I really wish they'd follow this up with a smaller "intro" model - ~13" with a U-series CPU (preferably @25W) and something like a GTX1050. Or, heck, why not Kaby Lake-G? Don't care at all if it would be a bit thick by today's 13" ultrabook standards, as long as the design tapered a bit at the edges. If Dell can successfully cool 25W worth of CPU power in the XPS 13, it really shouldn't take a much thicker design to double that cooling capacity - or just make the device ever so slightly larger to accommodate bigger heatsinks.
  • jeremyshaw - Tuesday, May 22, 2018 - link

    Except Dell isn't successfully cooling 25W of TDP in the XPS 13. Compare it to another laptop with the same CPU, like a larger T480, and suddenly the same CPU performs much better. Dell did set 25W as the TDP limit, but their poor thermal sensor placement means the CPU will thermally throttle long before that ever happens, and the fans are deliberately slow to ramp up. The thermal insulation pad plus the slow fans give some end users the impression that the XPS 13 is actually running cool and quiet, but the reality is downclock, downclock, downclock.
  • id4andrei - Tuesday, May 22, 2018 - link

    Isn't Max-Q a compromise from a performance perspective? You're basically paying for thinness. Once thermal throttling kicks in on top of the already-throttled Max-Q GPU, that $1900 investment (pre-tax) doesn't sound so good, does it?
  • ImSpartacus - Tuesday, May 22, 2018 - link

    Yes, Max-Q parts are effectively traditional mobile parts moved down to a more efficient point on the voltage curve.
  • limitedaccess - Tuesday, May 22, 2018 - link

    1080p 144Hz IPS panels are available for laptops, but none for desktops.
  • peterfares - Wednesday, May 23, 2018 - link

    Really? I have a 1440p 144Hz IPS monitor, so someone must make a 1080p one.
  • eiskafee - Tuesday, May 22, 2018 - link

    I have been using a Razer Blade 14 as my primary machine for the last 3 years and 2 months. While I am drooling over this new machine, I have reservations.

    My Blade 14 has become SUPER loud. I have tried everything from undervolting to cleaning the fans regularly to keeping it on a cooler surface, but once I open a few browser windows and Photoshop, it takes off like a jet engine. This is very embarrassing in a meeting.

    The battery died within 2 years and I had to get it replaced, which was expensive and took time. The charger also burned out once, but Razer was nice enough to ship me a new one for free. Both the left and right mouse buttons on the trackpad broke within 2 years, and 3 months ago the backlight stopped working; it's one of the most expensive things to fix.

    I would happily spend $3000+ on this beauty if Razer promised at least 3 years of full warranty.
  • BMNify - Tuesday, May 22, 2018 - link

    Three years of warranty is for plebs; this is a luxury product for the rich, and as such, buying a new one every year or two should not be a problem.
  • Retycint - Tuesday, May 22, 2018 - link

    >It is super loud
    Well of course it is; that's the trade-off of cramming a full-sized 1060 and a 45W quad-core into such a thin and light form factor. Its major competitors, i.e. the Aero 14/15 and the new MSI GS65, are equally loud.

    Anyway, no other major manufacturer that I know of offers a 3-year manufacturer (not retailer) warranty as standard (excluding boutique brands like Origin, Xoticpc, etc.). In fact, most brands only offer 1 year as standard.
  • wintermute000 - Wednesday, May 23, 2018 - link

    Exactly. The lack of decent support and of Dell-like NBD onsite pro support means this is a no-go. I'm not paying $3k+ for a daily driver that I have to mail overseas for service.
  • Gunbuster - Tuesday, May 22, 2018 - link

    Razer, who will ban you from their social media accounts on a whim if they interpret anything you say as not being 100% rah-rah suck-up posting... give them $2400? Nope.
  • boeush - Tuesday, May 22, 2018 - link

    Not a knock on Razer specifically, but it's a shame that there is no middle-ground option for laptop displays. Either it's 1080p, or it's full 4K. Wouldn't it be nice to also have the option of 3K (i.e. 2560x1440) - as a compromise that significantly increases sharpness relative to full-HD while having a much less severe impact on framerates/battery?
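    The sharpness trade-off being argued here can be quantified as pixel density. A minimal sketch, assuming the Blade 15's 15.6" diagonal (the resolutions are the standard FHD/QHD/UHD panel dimensions):

    ```python
    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        """Pixels per inch for a panel of the given resolution and diagonal size."""
        # Diagonal length in pixels divided by diagonal length in inches.
        return math.hypot(width_px, height_px) / diagonal_in

    # Candidate resolutions on a 15.6-inch panel.
    for name, (w, h) in {"1080p": (1920, 1080),
                         "1440p": (2560, 1440),
                         "4K":    (3840, 2160)}.items():
        print(f"{name}: {ppi(w, h, 15.6):.0f} PPI")
    ```

    On a 15.6" panel this works out to roughly 141 PPI for 1080p, 188 PPI for 1440p, and 282 PPI for 4K, which is why 1440p reads as a middle ground rather than a marginal bump.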
  • nimi - Wednesday, May 23, 2018 - link

    Such panels are uncommon and thus much more expensive. It would probably end up costing the same as or more than the 4K model, killing the option in the process. So I think not offering it is the reasonable outcome.
  • boeush - Wednesday, May 23, 2018 - link

    Chicken vs. egg: if the panels are not even an option, then they will indeed remain 'uncommon'. Strangely, such 3K resolutions are much more readily available in the 27" desktop monitor form factor, where they are quite effective and popular as far as I can tell. So the conundrum is: why are they good enough at 27" but not good enough at 15.6"?
  • moozooh - Thursday, May 24, 2018 - link

    This might be a long shot, but I think high-density laptop screens mostly reuse the technological processes and solutions originally developed for smartphone screens, so there is a lot less expense involved in adapting those to larger panels.

    Another idea is that 1080p is a very strong compatibility standard that every application, piece of media content, and so on created in the last 10–12 years is either released in or tested against. Naturally, it more often than not works best at that resolution. On 15.6–18" laptops, a 1080p screen provides the extra benefit of not (always) having to use OS-level scaling, which is almost always horrible. So those who deem it enough will not upgrade to 1440p, as it won't really help much in terms of screen estate (screens themselves don't grow with resolution, after all) yet might introduce more problems than it solves.

    Those who do upgrade to a higher pixel density, however, might just want to double down on it to solve the problems associated with the increase and with non-1:1 graphics scaling, because the interpolation that might be visible at ~190 PPI might not bother one as much at ~280. So you can drop the resolution to exactly one quarter of the pixels, i.e. 1080p, and enjoy 1080p content looking almost as good as on an actual 1080p screen.
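    The exact quarter-resolution relationship described above can be sanity-checked with quick arithmetic (a minimal sketch using the standard UHD and FHD panel dimensions):

    ```python
    # Standard panel resolutions: UHD "4K" and FHD 1080p.
    uhd = (3840, 2160)
    fhd = (1920, 1080)

    # Each dimension halves exactly, so every 1080p pixel maps to a clean
    # 2x2 block of physical pixels -- no fractional interpolation needed.
    scale_x = uhd[0] / fhd[0]
    scale_y = uhd[1] / fhd[1]
    pixel_ratio = (uhd[0] * uhd[1]) / (fhd[0] * fhd[1])

    print(scale_x, scale_y, pixel_ratio)  # 2.0 2.0 4.0
    ```

    An intermediate 1440p panel has no such integer relationship to 1080p content (3840/2560 = 1.5), which is one more argument for jumping straight to 4K.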
  • Sarchasm - Tuesday, May 22, 2018 - link

    Lots of problems here. 144Hz, but no G-Sync (EVGA's SC series has it). Max-Q takes a hit on performance, so you're paying extra for thinness... ironic given all the shade Razer has thrown at Apple for doing the same. And don't even get me started on the wonkiness of the keyboard layout, which is going to feel cramped and awkward because they insisted on accommodating full-size arrow keys. The reduced screen bezels are a needed improvement over the old generation, but there are simply way too many design compromises to warrant the $2k+ sticker.
  • hfm - Tuesday, May 22, 2018 - link

    G-Sync is only truly useful at lower frame rates. I don't think a 1070 is ill-paired with a 144Hz panel; it's not going to blast past that refresh rate and start to cause tearing.
  • MamiyaOtaru - Tuesday, May 22, 2018 - link

    What? It's arguably even more important on a high-refresh-rate screen. Your game is even less likely to hit the refresh rate and be able to safely vsync, and without vsync (or G-Sync) it'll tear at framerates above 60 just as well as below. Tearing doesn't just happen when a game "blast(s) past (the) refresh".
  • highfly117 - Thursday, May 24, 2018 - link

    It's for battery life: Optimus and G-Sync are incompatible, so it's one or the other. Without Optimus you could cut the battery life in half.
