No Boilerplate Has A Laptop Problem

UPDATE: No Boilerplate read the post, updated the 'errata' section in the video's pinned comment, and also wrote a good clarifying thread in reply to this post! It practically entirely corrects what I've said here, so please give it a read!

I will also add that I should have clarified further - I am not a developer, and am instead more aligned with typical content-creation workloads, which absolutely impacts how I view machine performance. This information should've been in the post from the start; I apologize!


Alternate title: A Response to No Boilerplate's "Developers Have A Laptop Problem"

This isn't intended as an attack on NBP. I love No Boilerplate a lot, despite not actually being a programmer, but something about this video specifically didn't sit right with me, so I wanted to write about it. This was initially formatted as a Discord message, so sorry for the unusual format.


I was rewatching the Developers Have A Laptop Problem video by No Boilerplate recently. I generally love NBP's content, but this video specifically felt... a bit off.

I'm not an expert in this field so if I'm grossly off the mark, I would love to hear about it, but this video didn't sit right with me for a few reasons. If you'd like to correct me, ping me @oracle@phantomthieves.net on the Fediverse.

There are a few claims that irk me because they don't feel 'true'. I think the most egregious one is "Cheap desktop machines from five years ago are much faster compared to even the latest high-powered laptops" (6:13 in). I did a bit of digging, because this claim sounded kind of absurd.

The first thing I tried to do was figure out what a "cheap desktop machine from five years ago" would look like. I thought the most influential aspect was going to be CPU performance, since even most recent laptop GPUs match or outperform the GTX 10XX series, so CPU is mainly what I focused on. Assuming that said five-year-old desktop had a mid/high-end CPU, and that 'five years ago' is measured from the video's release (December 2023), I figured the best representation was the Intel Core i7-8700K, which has a GeekBench score of 1582 in single-core and 6444 in multi-core. I'm not sure how much I'd rely on GeekBench for objective performance measurements, but it was used as a basis of comparison in the video's markdown document, so I think it's somewhat applicable here.

I then went and checked the 8700K's score against newer laptop CPUs, choosing especially to focus on the mid-to-lower end. My first instinct was the Intel Core i5-1240P, a slightly older CPU that was mid-range for its time when it released in early 2022, and was prominently used in a lot of ultrabooks. Although GeekBench's CPU page is weirdly broken for this CPU, the search page indicates that it generally outperforms the i7-8700K, at least in burst workloads. Going a step down to the Intel Core i5-1235U, we can see it does have a GeekBench page, where it achieves a single-core score of 1854 and a multi-core score of 5722. Considering this is a lower-end CPU from two years ago that was optimized for low power usage, performing only a little worse than the 8700K - and mainly in multi-core, sustained workloads - is respectable.

Jumping up to a mid-range, performance-focused CPU from the next generation, however, I feel like the claim that "an old machine will perform better" starts to fall apart. I looked at the Core i5-13500H, which again doesn't have a dedicated CPU page, but does have pretty strong results in the search results. In pretty much every configuration (minus the ASUS Vivobook line and some QEMU results), it has a single-core score of around 2200 and a multi-core score of around 9000, even in sub-optimal conditions. By this metric, any jump further up to an even higher-end CPU, like a 13th-gen i7 or i9, will almost certainly see higher performance than a five-year-old machine. This is especially noticeable if you compare against an older MacBook: the base-model M1 in a 2020 MacBook Pro still outperforms the i7-8700K, with a single-core score of 2330 and a multi-core score of 8231.
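To put the numbers above side by side, here's a quick Python sketch expressing each chip's lead (or deficit) over the 8700K baseline. Note that the i5-13500H figures are my rough ~2200/~9000 readings from the search results, not official averages:

```python
# GeekBench scores cited above, as (single-core, multi-core).
# The i5-13500H numbers are my rough reading of the search results.
scores = {
    "i7-8700K (desktop, 2018)":     (1582, 6444),
    "i5-1235U (laptop, 2022)":      (1854, 5722),
    "i5-13500H (laptop, 2023)":     (2200, 9000),
    "Apple M1 (MacBook Pro, 2020)": (2330, 8231),
}

# Express everything relative to the five-year-old desktop baseline.
base_single, base_multi = scores["i7-8700K (desktop, 2018)"]

for cpu, (single, multi) in scores.items():
    print(f"{cpu}: single {single / base_single - 1:+.0%}, "
          f"multi {multi / base_multi - 1:+.0%} vs the 8700K")
```

Run it and the only place the old desktop chip comes out ahead is the i5-1235U's multi-core score; every other figure cited here beats it.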

Obviously this does not account for sustained workloads - in any case, the 8700K will see much less of a dip in performance over time, given sufficient cooling. I would assume most machines bought used would need a bit of maintenance before reaching that point (i.e. a dusting or a thermal paste replacement), but yes, this is true and fair. However, I still think that modern laptop CPUs (and modern laptops in general) are much more performant than the video lets on, and that the performance difference is grossly misrepresented in it.

The other thing that came into my head was the price-to-performance assessment. I do think it is, generally, true that laptops are often priced higher than their desktop counterparts, but again, the way it's presented feels like a misrepresentation of this fact. Notably, I think that laptops have gotten to a point that the price is starting to look pretty justified given the options.

I bought my gaming laptop on December 25, 2023, two days after the No Boilerplate video released. It's a Lenovo Legion Pro 5 16IRX8. It's got an Intel Core i9-13900HX, a laptop RTX 4070, 32GB of DDR5-5600 RAM, and a 1TB Samsung NVMe SSD. I paid, in total, $1,609 USD for this machine after tax. It has great build quality, is very easy to carry around, and has a great 2560x1600 IPS panel with 240Hz refresh rate and NVIDIA G-Sync. I travel a lot, and having a small computer with a lot of power that I can bring with me is great.

I wanted to try and see what an equivalently-powered, new PC would cost if I were to build it. In keeping it equivalent, I made sure to build it in a small-form-factor case for portability, using a mid-range motherboard with WiFi built-in, and, most notably, with a nearly-equal display (1440p, 240Hz, G-Sync-capable). I picked the Intel Core i9-12900K, as it performs very similarly to my Core i9-13900HX, according to the results of a GeekBench run that I personally performed. I also picked a desktop RTX 4060 as an equivalent to the laptop 4070. I picked a Samsung SSD to best match the Samsung OEM SSD in my machine, picked a "cheap but good enough" CPU cooler to match most laptop cooling solutions, and then got the cheapest RAM, reputable PSU, and display I could. The PCPartPicker list is here, and even still, it costs $20 more than my gaming laptop, and that's before tax.

It's worth noting that if a mid/high-range laptop Core i9 can keep pace with a modern desktop Core i9, it's very hard to make the case that a "desktop machine from five years ago" could hope to keep up.

My conclusion is this - I agree that laptop pricing is notably worse than desktop pricing, and NVIDIA's naming scheme for laptop GPUs is stupid. Older desktop chips are still good value. But the way the information in the video is presented is incredibly disingenuous and fairly misinformed: it's vague, doesn't go into specifics, and honestly feels rather under-researched. It reads like outdated knowledge that hasn't been re-checked in a hot minute - a lot of the claims would make more sense if they were made a few years ago, around 2017 or 2018, but not so much in 2023 or 2024. I love No Boilerplate's content a lot, but this video especially feels rather misleading.

Until next time.

Lost Time System out.