A few weeks ago I went shopping for a new laptop to run my business a little easier. I am on the go quite a bit, and wanted a powerful machine with few limitations, but one that was still slim, portable, and had decent battery life. I very briefly mulled over the Apple option, but found their recent line of machines severely lacking in several areas, and way too expensive for what you get. For a brief moment I even considered a gaming machine for the extra horsepower. But instead, I saved a few hundred bucks and settled on a nice little ultrabook, sans dedicated video card. I was a bit hesitant about that, but a discrete GPU would have added significantly to the price without being justifiable as far as balancing the books was concerned.
Once I got all my tools set up and spent some time configuring my development environment, I hesitantly searched the download page for Steam. No, that isn't why I bought the machine. No, I shouldn't have been loading distractions on it before I had even used it for "real work". But I did. So sue me.
The best part about buying a computer in 2019, regardless of its function, is that, depending on what kind of gamer you are, every PC is a gaming PC. And while I am not going to be loading up PUBG on this thing any time soon (or on any machine, for that matter, since I am super burned out on Battle Royale crap), before the night was out I had played FTL, Demon's Tilt, ToeJam & Earl: Back in the Groove, and a few others at solid framerates with no hiccups - not to mention the original Diablo from GOG, which is as fun today as the first time I played it.
A Lower Barrier To Entry
PCs in the early 2000s were not a trivial matter. The investment required to build a decent gaming rig was matched only by the confusion surrounding what hardware to buy, and the looming fear that it would be obsolete seconds after you swiped your credit card. PC game development was about pushing hardware limits - console ports were an afterthought and usually paled in comparison to their PC counterparts, leaving a huge library of games inaccessible unless you were a PC owner.
This exclusivity has mostly gone the way of the dodo: a "console first" mentality has led to a much larger overlap in platform availability, and console games are now a far better approximation of the PC version - often practically identical, sans mods and graphical options.
So while this definitely makes the lives of console gamers easier, there is still a disparity. Path of Exile has been out for years, and we are only now seeing a PS4 port. Torchlight 2 just had a similar treatment announced, and that game is well out of the memory of people who have long since moved on to other experiences. Games are still developed on PCs, and as such, the PC version is generally done first, with console versions to follow.
Personally, I think the best thing to happen to PC gaming is the huge push for the legitimization of indie games, which really got traction during the last generation. It's also the reason I can go through my library of 300+ Steam games and know that the majority of them are playable on my laptop. Yeah, I may be missing out on Overwatch running at 120 frames, but I can play Undertale, Stardew Valley, and Into the Breach - three of the finest games made in the last decade - with no issues whatsoever. Most people still own a dedicated PC despite the push for mobile devices to replace them, and most of those PCs can handle games such as these. So while you might not be able to enter the realm of the dual-wielding GeForce RTX "PC master race" with fans that could double as propulsion for a hovercraft, you can still enjoy a limitless library of amazing games.
Greater Hardware Longevity
System requirements used to be an absolute death sentence. If you didn't have the exact right processor, memory, and video card, you were screwed - we are talking a difference of megabytes here. I remember many a time being 2 megs of video RAM short and simply not being able to load up a game. Now, I read the system requirements, and if my video card is up to snuff, that is generally enough to experience the game at high settings, even if my other hardware is somewhat lacking.
Simply put, we have reached a point in time where most people have far more machine than they actually need. A first-generation i7 is still perfectly capable of dealing with most games as long as you are sporting a decent card. This means you can stretch out your investment far longer if all you are interested in is enjoying games. While I do know a couple of people who upgrade their machines every time new hardware comes along, this isn't the norm for most of my friends. I am a relative Luddite in comparison to them, but we can still play the same games and have a generally similar experience. Sure, my setup might not be as pretty, my screens aren't as vibrant, and my framerate might be lower, but it sure isn't the dealbreaker it used to be.
Imagine playing the original Unreal without a video card and getting around 2 FPS. I remember suffering all kinds of technical limitations just to be able to play a game on a PC that wasn't remotely capable of handling it. Now that problem generally only exists if software is horribly optimized (I'm looking at you, ARK), and of course, you have plenty of resources at your disposal to figure that out beforehand, from benchmark programs to forums, if you are really unsure about how your system might deal with a new game.
Something I won't ever apologize for at this point in my life is the fact that I generally stick to the more "grassroots" game developers and projects, and stay away from the AAA scene. I passed on Anthem, Battlefield, and just about every other big EA and Activision title you can think of for about the past five years, and there is no love lost. I still had an endless conveyor belt of amazing and interesting games to play, and a lot of those originated on the PC.
With large publishers trying to dominate the market by turning games into an endless subscription service chock full of microtransactions, it's wonderful to see the "rock stars" of the game industry going back to their roots to try and capture lightning in a bottle a second time - and sometimes succeeding spectacularly. Pillars of Eternity is the new Baldur's Gate we thought we would never get. Wasteland 2 was a total joy. And we even have folks like John Romero coming back, not to create a brand new game as redemption for Daikatana, but to produce an entirely new set of levels for the original DOOM with his new project Sigil. While Pillars and Wasteland 2 would eventually go on to enjoy a console release, you'll never see something like Sigil hit a console, and the aforementioned FTL still hasn't seen a coveted console release. Games are developed on PC, and because of that there will always be more games available there than anywhere else.
To wrap up, I think it's ignorant to assume that PC gaming is reserved for some privileged class of gamer - as long as you keep your expectations in check, nearly any modern machine will do. And if you do have the money to throw around, and are interested in building a giant eyesore of a system with more glowing bulbs than a Lite-Brite, you can do so knowing that it's going to be solid for a long time to come. For these reasons, PC gaming is easier to get into, and better, today than it ever has been.