Tag Archives: Gaming

AMD Will Support Smart Access Memory on Ryzen 3000 CPUs for Gaming

When AMD announced its Ryzen 5000 CPUs, it introduced a feature it dubbed Smart Access Memory, known more generally across the industry as Resizable BAR. Resizable BAR allows the CPU to access more than 256MB of GPU memory at any given time, rather than working through a 256MB window. In our tests at 1080p, 1440p, and 4K, the feature boosted game performance on Ryzen 5000 CPUs by 3-7 percent on average. Nvidia claims improvements of up to 10 percent for Ampere GPUs.
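If you want to check whether Resizable BAR is actually active on a given machine, the GPU's BAR sizes are exposed through PCI configuration space. The snippet below is a minimal sketch, not an official AMD or Nvidia tool, and assumes a Linux system with pciutils installed; it simply flags any display adapter whose largest memory region exceeds the legacy 256MB aperture.

```python
import re
import subprocess

# Minimal sketch, assuming a Linux system with pciutils (lspci) installed.
# With Resizable BAR disabled, a GPU's prefetchable memory region is typically
# capped at 256MB; with it enabled, the region can span the card's full VRAM
# (e.g. [size=8G] or [size=16G]).
def check_rebar():
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    for block in out.split("\n\n"):
        if "VGA compatible controller" not in block and "3D controller" not in block:
            continue
        name = block.splitlines()[0]
        # Match lines like: "Region 1: Memory at e0000000 (64-bit, prefetchable) [size=256M]"
        sizes = re.findall(r"Region \d+: Memory at .*\[size=(\d+)([MG])\]", block)
        largest_mb = max((int(n) * (1024 if u == "G" else 1) for n, u in sizes), default=0)
        status = "likely enabled" if largest_mb > 256 else "not enabled (256MB aperture)"
        print(f"{name}\n  largest BAR: {largest_mb}MB -> Resizable BAR {status}")

if __name__ == "__main__":
    check_rebar()
```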

Initially, SAM was a Ryzen 5000-exclusive feature that required a 500-series motherboard from the B550 or X570 families. Now, AMD has announced it’ll bring the feature to its Ryzen 3000 CPUs as well, though a 500-series motherboard will still be required. Note that the Ryzen 3000 APUs, which technically use the older Zen+ architecture, are not included.

Formal support will still require a GPU from AMD’s RDNA2 family or an Nvidia Ampere card. In Nvidia’s case, a VBIOS update is also required (all RDNA2 GPUs support the feature out of the box). Presumably, motherboard vendors will also need to ship UEFI updates to enable the feature on Ryzen 3000 CPUs. On the Intel side, support will be available on Z490 motherboards and upcoming 500-series products for 10th and 11th Gen CPUs. Some manufacturers have apparently enabled it on Z390 as well, but that support will vary from OEM to OEM.

This slide shows how SAM/ReBAR support breaks down across Intel and AMD platforms with both AMD and Nvidia GPUs.

This is a canny move on AMD’s part. Ryzen 5000 chips have been in very short supply these last months, making it harder than usual for the company’s fans to actually buy its hardware. Extending this small boost downward into the Ryzen 3000 family won’t change anybody’s life, but it’s a nice gesture to people who were looking for upgrades this year and may not have gotten the hardware they wanted.

The reason we won’t see Resizable BAR/SAM support added across the spectrum of current PC hardware is that UEFI/BIOS updates and GPU BIOS updates are apparently both required. Motherboard vendors and GPU manufacturers aren’t going to revisit the idea of adding these features to older cards and card families.

Hunting for Performance in the Proverbial Couch Cushions

Companies are getting increasingly creative in the places they look for additional performance. Nvidia’s DLSS feature leverages the cloud and AI/ML training to provide superior visual quality at a lower base resolution. DLSS 2.0 is a substantial improvement on 1.0, and while the feature isn’t perfect, it’s evolving nicely.

In AMD’s case, a rumored response to DLSS is coming soon, and we’ve recently seen the effectiveness of slapping a large L3 cache on top of a GPU, as well as the introduction of features like ReBAR/SAM. Intel has plans to integrate hybrid low-power CPU cores into its products, starting with Alder Lake later this year. Features like Variable Rate Shading have been introduced (if not yet popularized) as another way of diverting more GPU horsepower to the areas that need it most.

I suspect the next few years will see a lot of mud tossed at these proverbial walls as the industry continues to move away from the idea that lithography will provide additional performance improvements, and towards a model that prizes a multi-disciplinary approach to semiconductor performance improvement. Tightening the linkages between hardware and software and squeezing out inefficiencies is how companies are pushing performance forward these days. Clock jumps still count — witness the clock increase on AMD’s Radeon RX 6700 XT — but they’re increasingly just one tool in the toolbox.

Mobile RTX 3070 Reviews Show Performance Variations in Gaming

If you’re shopping for a gaming laptop, it’s important to check reviews before you buy. We’ve discussed this issue before in a CPU context, but it’s just as important for GPUs. A recent set of reviews from our sister site PCMag offers an illustrative example of why.

PCMag just reviewed the Alienware m15 R4 and the Gigabyte Aero 15 OLED XC. Both of these systems are built around the Intel Core i7-10870H. Both feature an RTX 3070 — the Gigabyte system has the Max-Q version of the card, while the Alienware has a standard RTX 3070. We would normally expect the Alienware system to outperform the Gigabyte thanks to the latter’s use of a Max-Q card. Max-Q cards are binned for high efficiency in a given power envelope, not maximum performance, so a Max-Q GPU is typically slightly slower than a full-sized equivalent.

Spec comparison of the two systems, via PCMag.

The CPU-centric benchmarks PCMag ran show the Alienware m15 R4 narrowly losing to the Gigabyte Aero in most tests, though it wins in Handbrake. There’s only a little variation between the two. The GPU tests are more interesting.

These results collectively show some interesting trends. The ROG Zephyrus 15 is only equipped with an RTX 2080 Super (Max-Q), but it clearly punches well above its weight class: the RTX 3070 is faster in Fire Strike, but loses in Sky Diver. In Far Cry 5, the gap between the RTX 2080 Super and the RTX 3070 Max-Q is 3.2 to 5.6 percent in favor of the RTX 3070 in the Gigabyte system. The fact that the Alienware m15 R4 is only a little faster in Rise of the Tomb Raider (RotR) suggests the game is CPU-bound at this point in any case.
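For readers who want to run the same comparison on their own numbers, the percentage gaps quoted above are simple relative deltas between average frame rates. Here's a quick sketch of the arithmetic, using hypothetical FPS figures rather than PCMag's actual results:

```python
# Quick sketch with hypothetical FPS numbers (not PCMag's actual results),
# showing how the percentage gaps quoted above are computed.
def percent_gap(fps_a: float, fps_b: float) -> float:
    """Return how much faster fps_a is than fps_b, as a percentage."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical example: laptop A averages 92 FPS, laptop B averages 87 FPS.
gap = percent_gap(92.0, 87.0)
print(f"Laptop A is {gap:.1f} percent faster than laptop B")  # ~5.7 percent
```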

PCMag gives high marks to both of these gaming laptops, so I want to say up front that I’m not trying to trash the RTX 3070 or imply that the Max-Q version is a bad card. Even the fact that the RTX 2080 Super outperforms the RTX 3070 Max-Q isn’t an automatic problem in and of itself, especially when we haven’t factored price into the equation.

What’s interesting about the RTX 2080 Super versus the RTX 3070 Max-Q versus the regular RTX 3070 is the impact different thermal solutions can have on a laptop’s performance. PCMag didn’t run any ray tracing benchmarks, unfortunately, so it’s not clear how the Turing-equipped laptop would have fared against Ampere in that test. Our guess is that Ampere’s architectural improvements would still deliver a boost, even if the RTX 2080 Super seems to have more room to stretch its metaphorical legs overall.

Each of the laptops in PCMag’s review offers a different balance between weight, performance, display resolution, refresh rates, and thermals. In this case, both the Asus ROG and the Alienware are able to target higher performance levels due to the specifics of their respective implementations. The Gigabyte system offers excellent gaming performance, but it doesn’t always match either of its competitors.

One of the best ways to maximize long-term performance is to check the performance of the specific system model you want to buy, with an eye towards how it lands in various 3D benchmarks. While there’s always test-to-test variation, you can typically pick out a pattern across a series of tests.

If I had to choose between the RTX 2080 Super-equipped system and an Ampere system at the same price, and performance slightly favored Turing, I’d still choose Ampere. Over time, newer games are more likely to favor Nvidia’s newer architecture over the older one, especially given how popular Ampere has been. Ampere also should offer efficiency gains in ray tracing, even if Turing is more competitive in other tests.

If you want to maximize your long-term performance when you buy a gaming laptop, there’s no substitute for reading reviews of the specific systems you’re interested in.

Intel, Nvidia Deny Blocking AMD From High-End Mobile Gaming

There’s a rumor going around that Intel and Nvidia have conspired to block AMD’s Ryzen Mobile 4000 series from high-end gaming laptops. The claim supposedly comes from an unnamed OEM and holds that a secret agreement between Intel and Nvidia specifies that top-end RTX GPUs can only be paired with Intel 10th Generation CPUs. Intel and Nvidia have both denied the allegations.

The conspiracy theory argues that Intel and Nvidia formed an alliance to keep AMD out of the mobile gaming market by denying it access to top-end GPUs. This, in turn, would keep AMD out of the highest-priced and most lucrative mobile market.

There are some specific reasons to think this isn’t happening, but before we dig into them, let’s address the elephant in the room. The reason conspiracy theories about blocking AMD from accessing the market find a home online is that there has been a lot of bad blood between the two companies over the decades. Intel went all the way to the Supreme Court in an attempt to revoke AMD’s right to manufacture x86 CPUs. Over a decade later, AMD filed an antitrust lawsuit against Intel, alleging that the company had abused its monopoly in the x86 market by creating a rebate system that effectively locked AMD out of certain market segments. While these claims were never evaluated in a court of law, the court of public opinion had a lot to say about Intel’s behavior, and not much of it was good. Intel paid AMD $1.25B and renegotiated its x86 license to settle the case, and paid a $1.45B fine to the EU.

I covered the antitrust lawsuit when it happened and I’ve conducted my own investigations into the related compiler optimization differences that also formed part of the lawsuit. While ET obviously cannot comprehensively declare that no Intel-Nvidia agreement exists, there are some objective reasons to think it doesn’t.

Why AMD Is Still Ramping Up in Mobile

To date, AMD hasn’t put the same emphasis on mobile that it’s put on desktop. We talk about desktop Ryzen and mobile Ryzen as two halves of the same product, but the two have faced very different competitive environments.

On desktop, Ryzen’s story is straightforward: At launch, Ryzen punched Kaby Lake in the throat. Intel’s Core i7-8700K landed its own headshot later on in 2017, but from 2018-2020, Intel’s position in desktop steadily weakened. The Ryzen 5000 desktop launch in the fall of 2020 gave AMD a real claim to all-around fastest CPU, including gaming. While Rocket Lake may change this calculus in about eight weeks, AMD currently holds a leadership position in desktop.

Mobile isn’t that simple. Ryzen Mobile didn’t launch until nearly a full year after Ryzen desktop. The first Ryzen desktop chips had featured up to 2x the core count of a top-end Kaby Lake CPU. On mobile, the Ryzen 2000 family topped out at four cores, while Intel pushed up to six-core mobile chips. The Ryzen Mobile 2000 was a breath of fresh air, but it wasn’t a Coffee Lake killer.

In 2019, AMD launched 7nm desktop CPUs, but kept mobile chips on refreshed 12nm silicon. The Ryzen 3000 version of the Surface Laptop was very well-reviewed, but a head-to-head comparison versus Intel’s Ice Lake showed that Intel retained an advantage in CPU performance and battery life. It was only in 2020 that the 7nm Ryzen 4000 series pulled ahead of Ice Lake. Even with this win, Intel took the CPU and GPU performance crown back with Tiger Lake later that same year.


One of the advantages of partnering with Microsoft on the Surface Laptop 3 was the degree of optimization Microsoft did for the platform. AMD told the press some of this work would help other OEMs improve their AMD offerings.

Part of the reason AMD faces a more competitive mobile environment boils down to timing. Since 2017, AMD has launched new microarchitectures first for desktop, then for mobile. Intel, in contrast, has led with mobile. The Ryzen 7 1800X debuted against Intel’s 7th Gen processors. The mobile Ryzen chips, which launched nearly a full year later, faced 8th Gen mobile CPUs with higher core counts than their 7th Gen counterparts had offered.

If AMD had led with 7nm mobile chips in July of 2019 it would have launched against Coffee Lake, not Ice Lake. Ice Lake might have debuted as the new Intel mobile architecture that couldn’t catch AMD’s already in-market 7nm CPUs. Instead, Ice Lake was praised for demonstrating better power efficiency and markedly higher graphics performance.

This timing tradeoff has consequences for how well AMD has compared against Intel at any given point in time. When AMD’s Frank Azor appeared on The Full Nerd in May 2020, he specifically noted that OEMs weren’t confident that Ryzen 4000 would offer a real challenge to Ice Lake, and were cautious about adopting the design.

“I think Ryzen 4000 has exceeded everybody’s expectations, but for the most part, everyone tip-toed with us. Because of that, it was hard to imagine a world where we were the fastest mobile processor,” Azor said.

OEMs plan out their refresh cycles well in advance, and while the Ryzen Mobile 2000 and 3000 were good mobile CPUs, they weren’t cleanly better than what Intel was shipping at the time. Ryzen 4000 was the first AMD mobile CPU to challenge Intel in gaming, and OEMs don’t commit to shipping new system designs if they think all they’ll get is a single viable product generation. There are also some platform-level reasons why OEMs might prefer Intel, like the latter’s support for x16 PCIe connections on mobile, but that’s secondary to the question of absolute performance.

Another reason to doubt this theory is that we’re already seeing evidence of more Ryzen 5000-powered laptops with high-end GPUs this year than last. AMD’s consistent roadmap execution, and its demonstrated ability to navigate multiple microarchitectural shifts and a full node transition, has built faith with both OEMs and customers. If you look back at AMD’s claims against Intel in 2005, one of the arguments AMD made concerned its suspicious inability to claim more than 15-20 percent of the worldwide CPU market.

From AMD’s 2005 lawsuit against Intel.

There’s no equivalent glass ceiling visible in the data today. AMD’s market share in mobile, desktop, and servers has been growing steadily since Ryzen was introduced into each product family. Last summer, AMD hit the highest market share it’s held since 2012. At no point has the company indicated to ExtremeTech that it believed the same shenanigans might be in play today.

From an OEM’s perspective, Ryzen 4000 proved Ryzen Mobile had the chops to compete in gaming notebooks. Now that Ryzen 4000 and (presumably) 5000 are offering much-improved competition against Intel, we can expect the number of top-end gaming systems featuring an AMD CPU to rise. The delays we’ve seen thus far make sense, given how recently AMD started competing in high-end mobile gaming.

Lenovo’s New Legion Gaming Laptops Sport Latest AMD, Nvidia Hardware

There’s no in-person CES this year, but companies are still rolling out new products for the virtual event. Among them is Lenovo, which has unveiled a complete revamp of its Legion gaming laptops. The new machines combine AMD’s latest CPUs with Nvidia’s new mobile GPUs. The price tags won’t be in the budget range, but they’re much lower than some competing gaming laptops. 

The Lenovo Legion 7 (above) is at the top of Lenovo’s new lineup. This computer has a 16-inch display with a less-common 16:10 ratio. That gives you a little more vertical space compared with 16:9 displays. The IPS panel is 2560 x 1600 with a 165Hz refresh rate, 3ms response time, 500 nits of brightness, HDR 400 with Dolby Vision, and Nvidia G-Sync. 
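To put a number on that extra vertical space, here's a quick sketch comparing a 16:10 panel with a 16:9 panel of the same 2560-pixel width:

```python
# Quick sketch: vertical pixels at a 2560-pixel width for 16:10 vs 16:9 panels.
width = 2560
rows_16_10 = width * 10 // 16   # 1600 rows (the Legion 7's native resolution)
rows_16_9 = width * 9 // 16     # 1440 rows (a typical 16:9 1440p panel)
extra = rows_16_10 / rows_16_9 - 1
print(f"16:10 gives {rows_16_10 - rows_16_9} extra rows ({extra:.0%} more vertical pixels)")
```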

The display alone puts it in the upper echelon of gaming laptops, but the Legion 7 doesn’t stop there. It will also have the latest 5000-series AMD Ryzen mobile processors. On the GPU side, the laptop will have RTX 3000 cards, but Lenovo hasn’t specified which models. The Legion 7 comes with up to 32GB of RAM and 2TB of NVMe storage. Lenovo expects to launch the Legion 7 in June with a starting price of $1,699.99. That’s an expensive laptop, but we regularly see gaming laptops that cost much more.

If you’re looking to keep your mobile gaming machine a little more mobile, there’s the Legion Slim 7. This laptop will weigh just 4.2 pounds, making it the thinnest and lightest Legion laptop ever. This laptop will have both 4K and 1080p display options, but the 165Hz refresh rate is only available on the 1080p model. Again, this laptop will have the latest AMD and Nvidia parts. However, we don’t have a price or release date yet. 

The Legion 5 Pro.

The next step down is the Legion 5, which comes in three variants: a 16-inch Legion 5 Pro, a 17-inch Legion 5, and a 16-inch Legion 5. The Legion 5 Pro will start at just $1,000 with a 16-inch 165Hz LCD at 2560 x 1600 (another 16:10 ratio). This computer will max out with a Ryzen 7 CPU (instead of Ryzen 9 in the Legion 7) and 16GB of RAM, but it’ll still have an RTX 3000 GPU.

The non-pro versions of the Legion 5 will start at just $770, but the cost will depend on which of numerous screen and CPU configs you choose. The displays are stuck at 1080p, but you can get a super-fast refresh rate. All these devices will have next-gen Ryzen CPUs and Nvidia 3000-series GPUs as well. That could make even the base model an appealing gaming machine.

Gaming Laptop With Ryzen 7 5800H, Mobile RTX 3080 Leaked

A new gaming laptop has leaked via a German electronics retailer. That in and of itself is not unusual, but the specs of the leaked computer are interesting. The new variant of Acer’s Nitro 5 has an AMD Ryzen 7 5800H CPU and Nvidia GeForce RTX 3080 mobile GPU. However, neither of those parts officially exists yet. Oops.

The computer appeared on the Electronic Partner website for a short time before removal, but you can still see a cached copy. The laptop configuration (model AN517-41-R9S5) looks like it will slot into the high-end of Acer’s product portfolio. There’s a 17.3-inch 1080p LCD with a 144Hz refresh, 32GB of RAM, and a 1TB SSD. Thanks to the substantial footprint, the Nitro 5 has a full number pad in the backlit keyboard. 

What really makes the unreleased computer a beast is the CPU and GPU configuration. While these parts don’t officially exist, we can surmise what they’ll offer based on leaks and the model numbers. The CPU, for instance, is listed as a Ryzen 7 5800H. On the desktop side, the 5800X is one of the most powerful Zen 3 chips available, although “available” might be a little misleading; these CPUs have been in very short supply. AMD reportedly plans to mix and match Zen 2 and Zen 3 in the mobile Ryzen 5000-series, but we believe the 5800H will be a 45W Zen 3 CPU with eight cores, 16 threads, and a 3.2GHz base clock (4.5GHz max).

The mobile-optimized RTX 3080 is more mysterious, but it’s possible the hardware will be more akin to the RTX 3070 with a boosted clock speed. Regardless, that would still make it one of the most powerful GPUs in any Windows computer, and you might actually be able to buy one. Desktop versions of the 3000-series RTX GPUs have been almost impossible to find ever since launch, and the shortage is expected to continue well into 2021. 

The combination of two hard-to-find components will no doubt make this a popular computer, whenever it’s official. It won’t come cheap, though. The retailer had it listed at €1,948.61, which converts to about $2,375. That would put it solidly at the high end for a Windows gaming laptop. Then again, you can’t find a desktop RTX 3080 for anything south of $1,200 right now, so you might actually save money buying this fancy laptop.

Xbox Cloud Gaming Comes to iOS, PC in 2021

Microsoft’s xCloud cloud gaming service is expanding in 2021 and its Xbox Game Pass service is driving new engagement with the Xbox ecosystem. That’s the word from the company in a recent blog post, which highlighted recent developments for both platforms.

Xbox Game Pass is Microsoft’s monthly subscription service that grants players access to a wide range of titles, with some games rotating on and off the service over time. The service is generally regarded as a good deal if you’re interested in subscribing to a games-on-demand product in the first place — there’s a wide range of titles available and Microsoft has pledged to launch AAA games on the service as new titles arrive for Xbox Series X. With XGP, you still download games to your local system to play them.

Project xCloud is the company’s new stream-anywhere service that’s intended to allow you to game on anything from an iPhone or Android device to a lightweight PC, without needing to be tethered to a console.

According to Microsoft, over 40 percent of new Xbox gamers are using the Xbox Series S, which seems like a sly way of confirming that the sales split between the two consoles is somewhere between 41/59 and 49/51. Microsoft didn’t mention it here, but the company claimed 15 million subscribers earlier this fall when it bought Bethesda.

Microsoft is pushing Xbox Game Pass as a major component of its entire gaming strategy this generation. It recently partnered with EA to offer EA Play games on XGP, and Aaron Greenberg, marketing chief for Xbox, has confirmed that these games will be available for “quite a while.”

If you want to take advantage of the capability, you’ll need a subscription to Xbox Game Pass Ultimate, which costs $15 per month. Microsoft offers Xbox Game Pass in several flavors, shown below:

Xbox Game Pass tier comparison.

Bringing xCloud to PC isn’t surprising, but the fact that the service will be rolling out on iOS is a major change to previous Apple policies. Apple has previously argued that it would not allow such services at all because it couldn’t gate-keep the content they ran. After a back-and-forth in the media, Apple instead declared it would allow them — but only if every game listed in the service also passed App Store guidelines. This minor policy change is basically no policy change at all, given the practical impossibility of making that happen.

Instead, both Google and Microsoft will reportedly focus on delivering an experience entirely in-browser, bypassing the App Store lockout altogether. Amazon also plans to debut its own streaming service on iOS in similar fashion. It’s not clear what kind of impact this will have on gameplay, or how Microsoft might have to change its approach in order to effectively support the platform. If these companies are able to offer acceptable performance via Safari, it may send a signal to other firms that have chafed under Apple’s 30 percent fee — or its demand that companies include in-app purchases in their products solely to enrich Apple.

Nvidia, Google to Support Cloud Gaming on iPhone Via Web Apps

So you want to stream some video games from the cloud? Apple hasn’t made that very easy on its devices thanks to some heavy-handed App Store policies, but the open internet is coming to the rescue. Both Nvidia and Google have announced iOS support for their respective cloud gaming platforms via progressive web applications. Apple can’t block that. 

This controversy dates back about a year when Google Stadia and Nvidia GeForce Now became available on select mobile devices. Google’s Play Store allowed the streaming apps, but Apple blocked them for dubious reasons. Apple later updated its policies to say that cloud gaming was a-okay as long as providers adhered to the App Store’s draconian rules like creating separate store pages for each game and having all titles approved by Apple for purchase inside the App Store. The company claimed this was about ensuring a level playing field for developers, but it also would have gotten Apple its customary 30 percent cut of sales. 

Microsoft already announced that it would bring xCloud to iOS via a web app, but Nvidia is the first to get there. Anyone with an iDevice can get started by heading to the GeForce Now website — you’ll also need a $4.99 monthly subscription. Once you’re logged in, you can import your existing library from Steam, Epic, and other game distribution platforms. Yes, that means Fortnite is back on iOS. The WebRTC-based client can stream the video of your gameplay session and relay your control inputs to the cloud just like a local app. You can pair an Xbox, PS4, or mobile Bluetooth controller with the device. The web app also has touch controls, but they won’t work in all games. And even if they do, you probably don’t want to use them.

Google says its web app version of Stadia for iOS will launch in the coming weeks. Like Nvidia and Microsoft, Google was prevented from launching an iOS Stadia app, and the company seemed caught off-guard. When we reviewed Stadia at launch last year, a beta iOS client was available for testing, but Google has been unable to release it on the App Store in the year since.

When the web version of Stadia launches in the next few weeks, you’ll be able to point the Safari browser at the Stadia site to stream your games. Unlike Nvidia, Google sells games specifically for Stadia, but it does let everyone play the base version of Destiny 2 for free. The $10 monthly Stadia Pro subscription adds features like 4K streaming and surround sound.

Apple’s stubbornness has slowed the growth of cloud gaming on its platform, but it’s not stopping it. By early 2021, there should be three cloud gaming services live on iOS via the web. 

Microsoft’s Xbox Series X Review: The Living Room Gaming PC I’ve (Mostly) Always Wanted

Last year, not long after Microsoft announced the Xbox Series X, I declared that the upcoming console would “end” — I specifically did not say “win” — the PC/console war, not by beating the PC, but by effectively becoming a PC. At the hardware level, that’s more-or-less what has happened, and it’s particularly true in Microsoft’s case because the Xbox runs an OS based on Windows 10. Does it do what an HTPC/gaming PC does in a living room? I thought it would.

I’ve recently had the opportunity to put my theory to the test by evaluating the $499 Xbox Series X as an HTPC and downstairs gaming system replacement for the hardware I currently use for that task. Because I’ve never reviewed a console before and don’t have a handy PlayStation 5 to compare against, I’m going to evaluate the XSX explicitly from the viewpoint of a lifetime PC gamer considering the value and utility of the system. I’ll also have more to say about the system and some more direct comparisons at a later date when I am not responsible for two completely different reviews simultaneously.

This review does not focus on absolute image quality between Xbox and PC versions of a game. This is partly because virtually all of the truly next-generation games for Xbox Series X are still locked away, and partly because I just bought a 4K OLED and have only had a week with the Xbox Series X, which isn’t enough time for comparative analysis. Rendering a verdict without proper comparison risks mistaking improvements to the display for improvements to image quality.

Specifically, I bought this OLED: an LG CX 55-inch. It’s only been a week, but we’re very happy together.

Defining ‘PC’ in This Context

Conceptually, the Xbox Series X challenges the utility of a Home Theater PC, or HTPC, as well as a living room gaming PC (these are sometimes the same thing). HTPCs are pretty common in the enthusiast community, going all the way back to ATI and the days of their All-in-Wonder video capture card. An HTPC is typically (but not always) a secondary system attached to a TV rather than someone’s primary rig. They can be optimized for low power consumption and high storage capacity or kitted out more like gaming systems for simultaneous HTPC and high-end big-screen gaming capabilities. Content playback and gaming are the two markets where an HTPC would typically compete with a console and I’m comparing them on that basis.

What I Thought of Consoles Going In

Before starting this review, I thought of game consoles as a perfectly valid method of gaming, especially if you already had a lot of cash invested in the Microsoft, Sony, or Nintendo ecosystems, but certainly not a preferable one. Console developers, in my opinion, were far too willing to tolerate low frame rates. The few times I picked up an Xbox One or PlayStation 4 controller, I felt like I was gaming on a mid-to-low-end PC.

Unlike some PC gamers, I don’t hate consoles and never have, but I’ve rarely been impressed by them.

The Hardware

My first thought, when I saw the Xbox Series X, was “Awww. It’s cute.”

The Xbox Series X is an unusually shaped small form factor PC. It uses a single 130mm ventilation fan to cool the system and it’s very quiet. I never heard the machine while gaming or watching content, even with the TV volume low. The PlayStation 5 may yet prove to be a truly chonky boy, but the XSX is smaller than I expected it to be. If you’ve spent a few decades with an ATX tower of one sort or another cluttering up the living room, the Xbox Series X is a delightful step towards smaller solutions, not larger ones.

The Xbox Series X’s ventilation diagram. The invasive pool noodles shove their way through the console until they are transformed into a cooling mint tornado. Or something. Seriously though, this thing is whisper-quiet.

As far as backward compatibility goes, the Xbox Series X had no problem identifying and enabling an Xbox One controller. The two controllers feel identical, at least to my hand, but I’m not exactly a connoisseur of the art form. My significant other, who is also a PC gamer, commented that the rumble didn’t make her rings vibrate, which she appreciated.

It’s not directly germane since I’m not comparing against a PS5, but the 3,328 RDNA2 GPU cores are worthy of a desktop PC card — and will be mounted in them soon enough.

As far as technical specs, we’ve discussed both the Xbox Series X and PlayStation 5 on more than one occasion. Microsoft went for an AMD Zen 2-based CPU, an RDNA2 GPU, and fixed clock speeds for both, in direct opposition to Sony’s emphasis on variable clocking. There’ve also been some interesting remarks recently that confirm something we’d heard privately a few months ago: The Xbox Series X supports the full RDNA2 feature set, while the PlayStation 5 is supposedly based on RDNA (but with ray tracing still enabled). We don’t know enough yet to suss out the differences here, but it’s something to keep an eye on.

Services and Gaming: Microsoft Makes a Hell of a Case

The Xbox Series X cold boots from an unplugged state in 20.58 seconds on average when measured from the moment the button was pressed, not when the screen activated. The total time to load a saved game and begin playing Fallout New Vegas was 47.48 seconds when completely unplugged. When I merely turned the console off at the switch (depressing the button until the light turned off completely), the resume time was 4.5 seconds. We can’t compare the Xbox initialization process exactly to the boot time of a PC, but those figures are solidly within the range of high-end desktops, depending on how many applications you load at boot.

Setting the console up with a Microsoft account is arguably less annoying than installing Windows 10 (this is not a high bar), and once you’ve got it configured, things happen fast. I saw Fallout New Vegas available via Xbox Game Pass and was jaunting through the Mojave within 15 minutes of creating my account. I’m not going to say a high-end PC couldn’t match the same time from OS installation to gameplay, but you’d need to be using the latest version of Windows 10 with pre-loaded GPU drivers or willing to run unpatched to score equivalently.

When it comes to outfitting the console with a suite of common apps like YouTube, Netflix, and such, Microsoft lands firmly in “just works” territory. Netflix image quality is much higher on the Xbox Series X, even though my HTPC streams using Microsoft Edge. An apples-to-apples comparison of the exact same stream always favors the Xbox Series X. Given a choice between streaming a service over Xbox Series X or my own HTPC, I’d take the XSX, ten times out of ten.

On the whole, the Xbox Series X is a very effective advertisement for Microsoft’s entire gaming ecosystem. Xbox Game Pass gives a new player an instant library of titles to choose from, with multiple entries in popular genres. Setting up apps like Netflix to run on the console is trivial. Game load times seem equal to or better than what we’d expect from an equivalent PC.

This is the sort of feature Microsoft promised to deliver when it began marketing the Xbox Series X. It wasn’t a feature I was certain we’d get. As I said earlier, I don’t — or at least I didn’t — associate consoles with high-end performance.

How does it feel to play the Xbox Series X? It feels like playing a game on a high-end PC, with a heavy-duty CPU core backing it up. The caveat here is that the titles we had available to play for Nov. 5 reflect current-generation titles and don’t feature capabilities like ray tracing, but then again, you can’t run DXR on any other AMD GPU currently in-market, either. As next-generation games unlock we’ll be able to compare more effectively on that front.

Every common title that I’ve played on both console and PC felt every bit as good to play on this console as on any PC, at least as far as the underlying hardware’s performance is concerned. Microsoft is still working out the kinks in its Quick Resume feature, but it’s incredibly quick in action: tap, tap, and boom — you’re in a different game. Alt-tabbing between different games on PC is a risky proposition at best unless you already know both applications behave nicely when loaded simultaneously. The fact that you can even try alt-tabbing between games without instantly crashing the system is itself an achievement — GPUs didn’t use to tolerate being used for multiple workloads simultaneously under any circumstances.

From where I sit, this is no small thing. Unless you consider the PS5 — and I don’t have one to consider — there’s no way to get this kind of performance at the $500 price point in the PC universe. If you have an otherwise high-end system you could certainly upgrade your GPU to equal or better performance for less than $500, but the Xbox Series X is quite aggressively priced for its hardware specs.

What I Didn’t Like

There are some distinct things I do not like about the XSX. First, there’s the controller. While I have absolutely no complaint about the Xbox Series X controller as a controller, I would like to point out to whatever god or gods might be listening that using analog sticks to control a first-person shooter is like taking away a person’s hands and giving them a pair of stupid meat flippers instead. Nothing makes a sniper kill more satisfying than trying to simultaneously maneuver the world’s least-precise instrument over a head that’s four pixels wide without standing up / opening your Pip Boy / accidentally shooting Sunny Smiles in the back of the head.

Controllers vex me, is what I’m saying. They vex me enough that the learning curve, at least in some games, feels more like a learning cliff. If you’re a lifelong PC gamer like myself, you should expect some transition pains. After a week, I’m still not comfortable in a lot of titles, and full mouse and keyboard support would go a long way to making the Xbox Series S / X feel like a welcoming home for PC gamers.

Another negative? No modding support on the XSX, at least not yet. Modding on consoles is still in its infancy, so a big support boost from Microsoft would probably help the idea take off. Mods are a very important part of gaming to me and I’d always keep a foot in the PC gaming ecosystem for this reason alone, even if I switched primarily to console gaming.

The last thing about the Xbox Series X that I didn’t like is its overall network usage. While this could be the result of a disagreeable interaction between the XSX and my router, it’s a terrible bandwidth pig. Some applications “share” bandwidth more easily than others, which is to say that some of them will tank your entire internet connection as they hoover data out of the internet, while some are better behaved.

The Xbox Series X is not well-behaved. I actually had to shut the console down at multiple points during simultaneous Zen 3 / Xbox Series X testing, in order to download benchmarks at any kind of speed. Eighteen minutes on a 12MB download doesn’t cut it. I’m open to the idea that this is a conflict with my router, but the situation is untenable regardless.

There currently seems to be no method of controlling the Xbox Series X’s bandwidth usage while downloading without doing it externally at the router.

Is the Xbox Series X a Better Living Room PC Than a Typical PC?

The question of whether the Xbox Series X is a better living room PC than a regular HTPC depends, I think, on what your needs are. If you’re into video editing, content remastering, or upscaling, you know there are a lot of players and plugins you can use to improve baseline image quality in various ways. If you have content in unusual or esoteric video formats, there’s almost certainly a codec available on PC to play it. Consoles are dicier in that regard, though both Microsoft and Sony support the most common video and audio codecs.

If Microsoft supported keyboard and mouse configurations out of the box across the entire Xbox product line, I’d be 100 percent sold on the idea of the XSX as a media playback and gaming machine. Seeing as I’m still on Team Meat Flipper, I’m a little more circumspect in my evaluation. Is the Xbox Series X better than the [Insert $1,000+ gaming PC] you can buy at [insert OEM / boutique builder]? Very possibly not. Is it better than any $500 gaming PC you’re going to find in-market any time soon? I’m comfortable saying yes.

I’m not going to try to predict how the Xbox Series X will perform against the PS5 or which console players will prefer, but as far as comparisons to an equivalently-priced PC are concerned, the Xbox Series X more than holds its own. I’m downright impressed by the overall value proposition of the console and its capabilities. Obviously, you won’t be running DaVinci Resolve Studio on an Xbox any time soon, but when evaluated in terms of streaming fidelity, the Xbox Series X wins. Evaluated against the gaming capabilities of a $500 PC build, the Xbox Series X wins.

Gaming on the Xbox Series X may not feel much like gaming on the PC, thanks to the difference in interfaces, but it offers all of the PC’s greatest strengths in terms of load times and frame rates. The platform overperforms its price point, and it’s impressed me as far as the overall ecosystem value. There are no weak points here, and no Kinect-style screwups to muddy the value of the system. It’s a much stronger offering than Microsoft launched in 2013, and I’m really curious to see if the company will manage to convert PlayStation 4 owners to its own ecosystem, or if it’ll mostly appeal to existing Xbox, Switch, and PC gamers.

I’ll have more to say in upcoming articles. As a newcomer to the Xbox Series X ecosystem, I’m impressed by what I’ve seen thus far.

MSI’s Nvidia RTX 3070 Gaming X Trio Review: 2080 Ti Performance, Pascal Pricing

There are two things you should know about the RTX 3070. First, this is a fabulous GPU by any measure. Nvidia has been claiming that this $500 GPU could match its $1,200 RTX 2080 Ti Founders Edition. That claim has been demonstrated as broadly true with the release of the Founders Edition earlier this week — the RTX 2080 Ti can still pull ahead by a whisker in some 4K games, but in the vast majority of titles, the two are neck-and-neck.

Second: As great a GPU as the RTX 3070 is, you may still want to hold off a bit before purchasing a new card. You may not be able to buy one in any case — Nvidia has warned would-be customers to expect limited supply through the end of this year and into 2021. Separately, AMD will launch its own Big Navi cards in a matter of weeks. It may be prudent to see how the two compare before pulling the trigger.

GA104 and the MSI RTX 3070 Gaming X Trio

The MSI RTX 3070 Gaming X Trio is built around Nvidia’s GA104 GPU, while the RTX 3080 and RTX 3090 use the larger GA102 processor. Nvidia has been building multiple die at the high end for its recent launches as opposed to fusing off parts of the same design to create lower-end parts. When Turing launched, the RTX 2080 and RTX 2070 each used a different GPU design (TU104 and TU106, respectively). This time around, the RTX 3090 and RTX 3080 share a common GPU core (GA102), while RTX 3070 is built around its own unique chip (GA104).

Like its Turing and Pascal predecessors, the RTX 3070 retains the same 8GB RAM capacity Nvidia has used in this price segment since 2016. The company had plans to launch high VRAM variants of the RTX 3070 and RTX 3080, but those plans have been canceled or at least put on hold until overall availability improves.

The RTX 3070 packs 5,888 cores, 96 ROPs, 14Gbps GDDR6, and a 1.725GHz boost clock. The GPU’s massive number of cores — more than 2.5x as many as the RTX 2070 — is why Nvidia is so confident in its ability to match the RTX 2080 Ti with a much cheaper card. The chip is a 17.4-billion-transistor design built on Samsung’s 8nm process. Computationally, the RTX 3070 has ~30 percent less compute power than the RTX 3080, and 41 percent less memory bandwidth. In addition to using slower RAM (14Gbps GDDR6, down from 19Gbps GDDR6X on the RTX 3080), the RTX 3070 also has a smaller, 256-bit memory bus.
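The memory bandwidth deficit falls straight out of bus width and data rate. Here's a quick sketch of the arithmetic; the RTX 3080's 320-bit bus isn't listed above, but both cards' figures are Nvidia's published specs.

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_3070 = bandwidth_gbs(256, 14)   # 448 GB/s (256-bit bus, 14Gbps GDDR6)
rtx_3080 = bandwidth_gbs(320, 19)   # 760 GB/s (320-bit bus, 19Gbps GDDR6X)

deficit = (1 - rtx_3070 / rtx_3080) * 100
print(f"RTX 3070: {rtx_3070:.0f} GB/s, RTX 3080: {rtx_3080:.0f} GB/s")
print(f"RTX 3070 has {deficit:.0f} percent less memory bandwidth")  # ~41 percent
```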

That’s the GA104 GPU itself. What has MSI brought to the table with the Gaming X Trio?

The Gaming X Trio uses a tri-axial cooler design, with additional bracing included to strengthen the GPU and prevent bending during transport.

I’ve seen more than one GPU shipped with inadequate packaging around it in the last few years, and I suspect the underlying problem is that people don’t appreciate just how heavy these cards have gotten. MSI’s strengthening bracket is a nice touch.

RGB support is provided via MSI’s “Mystic Light” system, and the default RGB is quite pretty if you like that sort of thing. The RGB lighting is designed to be controlled from MSI’s Dragon Center application, which functions as an all-in-one stop for controlling RGB, overclocking, and managing various system functions.

For those of you concerned about the power cabling situation, the MSI RTX 3070 Gaming X Trio uses twin 8-pin connectors, not the specialized Nvidia cable.

The Competition

Our competitive selection against the RTX 3070 is a bit more limited than I’d like — I don’t have an RTX 2080 Ti to compare against. We’ll be comparing the RTX 3070 against the just-launched RTX 3080, the older RTX 2080 (non-Super), and the Radeon VII from AMD. The RTX 2070 Super and the RTX 2080 perform quite similarly, so the RTX 2080 does double-duty representing both SKUs.

I chose the Radeon VII over the 5700 XT because AMD’s highest-end consumer GCN product often outperformed the newer RDNA chip last year, even if it was only by a few percent. The fact is, AMD doesn’t have a great GPU to compare in this bracket right now. The 5700 XT is currently selling for ~$390, which is quite a bit less than the RTX 3070’s $500 baseline MSRP, and the Radeon VII isn’t on the market at the moment.

For this review, I’ve decided to put AMD’s overall best foot forward. In a matter of weeks, we’ll have competitive figures from the Radeon 6800 and 6800 XT, and will be able to give you a much better estimation of what Big Navi does and doesn’t bring to the table. Regardless, the AMD figures are here for reference. Until Big Navi debuts, Nvidia is competing against itself. If you want to estimate RX 5700 XT performance, assume it’s close enough to the Radeon VII that you’d never actually notice the difference in reality, but factually a bit slower if you ran the numbers.

All testing performed on an Asus Maximus XII Hero Wi-Fi with 32GB of DDR4-3600 installed. Intel’s Core i9-10900K was used for all testing, with Windows 10 2004 and the latest set of updates and patches installed.

Benchmark Results:

Ashes of the Singularity: Escalation shows the RTX 3070 pulling ahead of the RTX 2080 by 1.2x at 1080p, though the gap actually shrinks a bit as we go up. Ashes doesn’t bring GPUs to their knees quite the way it used to.

Deus Ex: Mankind Divided has the dubious distinction of having the worst MSAA mode I’ve ever seen, as far as its impact on performance. I’ve kept it around as a benchmark mostly for this reason. The RTX 3070 holds its 1.2x improvement over the RTX 2080 at 1080p but extends the lead to 1.26x at 4K.

At 4K, the RTX 3080 is 1.31x faster than the RTX 3070 and 1.4x more expensive. We expect price-performance curves to begin to bend out of a 1:1 curve at these price points, and the ratio isn’t bad — provided either of these GPUs are available for MSRP. The Radeon VII fully matches the RTX 2080 here, but it’s clearly a last-gen card.

The gaps between the GPUs are a little smaller here, but we’re using somewhat lower detail levels, which means the CPU is in play a little more than typical. The RTX 3070 continues to impress. While $500 is a lot of money to spend on a GPU, the RTX 3070’s performance at that price implies good things about the GPUs that will follow farther down the stack.

AMD’s Radeon VII would have benefited somewhat from DX12 here, but it wouldn’t be enough to change the big picture. Nvidia is nothing if not consistent here, with regular bands between each GPU.

I threw Strange Brigade into the mix to see what a Vulkan title might look like. Nothing much to see here, except an interesting performance by the Radeon VII, which lags the RTX 2080 by about 17 percent at 1080p, roughly the same amount at 1440p, and just 3.4 percent at 4K. Every Nvidia GPU loses much more performance from 1440p to 4K than from 1080p to 1440p, implying this may be a quirk of the engine.

At the risk of sounding boring, you’ve seen this graph already. Remember that the RTX 2080 is also standing in for the RTX 2070 Super — the gap with the standard RTX 2070 would be larger, and the performance per dollar gains are considerable.

No surprises here.

Final Fantasy XV doesn’t run well on the Radeon VII — this is a test where we suspect RDNA would turn in better results. With Big Navi launching soon, it’s not a big deal either way.

Three different Metro Exodus graphs here, to highlight three different takeaways. First, we have Ultra ray tracing enabled at Extreme Detail on the RTX 3070 versus the RTX 2080. This is a worst-case scenario, with 200 percent supersampling active — and the RTX 3070 still manages to turn in a 1.31x higher framerate at 1080p. Interestingly, the gap is smallest without ray tracing enabled — and here, the RTX 3070 isn’t all that much faster than the RTX 2080.

Big Picture Takeaways

At $500, the RTX 3070 is an objectively great GPU. As I expected, it solves all of the problems I initially had with Turing. Ray-tracing support is beginning to show up in games, and the higher-end cards in the family are now powerful enough to enable it at ultra-quality without needing to bother with tricks like DLSS 1.0. (DLSS 2.0 is much nicer). With Turing, I wasn’t comfortable recommending the family as a long-term investment into ray tracing given that the performance hit for enabling it could be 60-80 percent.

But most of all, with Ampere, Nvidia has returned to a Pascal-ish GPU pricing model. When it launched Turing, the RTX 2080 was 1.28x faster than the GTX 1080 on average, and cost about 1.28x more. The RTX 3070, meanwhile, is about 1.25x faster than the RTX 2080, while costing about $200 less than that GPU did at launch.
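To put rough numbers on that claim, here's a back-of-the-envelope sketch using the performance multipliers quoted above and the cards' widely reported launch MSRPs ($699 for the RTX 2080, $499 for the RTX 3070); treat it as illustrative rather than a precise accounting.

```python
# Back-of-the-envelope sketch using the multipliers quoted above and
# widely reported launch MSRPs ($699 RTX 2080, $499 RTX 3070).
rtx_2080 = {"relative_perf": 1.00, "price": 699}
rtx_3070 = {"relative_perf": 1.25, "price": 499}  # ~1.25x the RTX 2080

perf_per_dollar_2080 = rtx_2080["relative_perf"] / rtx_2080["price"]
perf_per_dollar_3070 = rtx_3070["relative_perf"] / rtx_3070["price"]

gain = perf_per_dollar_3070 / perf_per_dollar_2080
print(f"RTX 3070 delivers roughly {gain:.2f}x the performance per dollar")  # ~1.75x
```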

If you know you’re Team Green forever, and you’ve got $500 to throw at a GPU upgrade, this is a great card to choose. Nvidia powers the majority of gaming PCs, which means game developers will build their titles to target whatever amount of VRAM Nvidia GPUs offer. It’s going to be interesting to see if AMD’s 16GB cards can offer performance advantages, but 8GB cards aren’t going to be outdated in a year or two. MSI’s version of the GPU doesn’t put a lot of english on the ball, but there’s no need to fix what isn’t broken, and this GPU most emphatically isn’t.

The RTX 3070 is the true successor to the value proposition Nvidia debuted with the GTX 1080, assuming one can snag a GPU at MSRP. Whether that’s going to be possible is anyone’s guess, and the impact of bots has been ugly enough this year that I feel obligated to leave a bit of a question mark on this claim.

If you are willing to wait and see what AMD will bring to the fight, I recommend doing so — it’s always a good idea to see what the competition has in store — but only if $579 is still within your price range. Either way, the RTX 3070 is a huge leap forward for gaming, and a great value for gamers that can afford it.

Konami Entering Gaming PC Market With Spendy Arespear Lineup

Konami has been making games for decades, but now it’s making gaming PCs. Konami Amusement, a subsidiary of Konami Holdings, is now accepting orders for its new line of Arespear gaming PCs. The company expects to begin shipments for the Japanese market in September, and they are decidedly not cheap. 

The Arespear computers come in three different versions, all of which have the same custom “wiffleball” cases, measuring a compact 575.3 x 501.5 x 230mm. They also have a dedicated Asus Xonar XE sound card. The C300 is the base model, while the C700 and C700+ offer better specs. The C700+ is also the only one of the three with a transparent side panel. 

The C300 will have respectable internals with an Intel Core i5-9400F CPU (air-cooled), 8GB of DDR4 memory, a 512GB M.2 SSD, and an Nvidia GeForce GTX 1650. With that GPU, you’re limited to a single DisplayPort 1.4, one HDMI 2.0b, and a DVI-D port. You’re probably thinking that sounds alright for a modest gaming PC, but the price is anything but modest. The C300 will cost 184,800 yen, which works out to $1,760.

The C300, without a window or RGB, but still priced at nearly $2,000.

If you step up to the C700 Arespear, you get a water-cooled i7-9700 CPU, 16GB of DDR4 RAM, a 512GB M.2 SSD, a 1TB hard drive, and an Nvidia RTX 2070 Super. You’ve got many more display options with these computers in the form of three DisplayPort 1.4 ports and an HDMI 2.0b on the video card. There is also an additional DisplayPort and HDMI on the motherboard. The C700 costs 316,800 yen ($3,016), and the C700+ is 338,800 yen ($3,226).

The only difference between the two 700-series models is the window and RGB lighting on the C700+. That distinction really drives home the wild pricing. Are people going to pay $200 more just for some RGB? Probably, but that doesn’t make it a good idea. We don’t know when or if Konami Amusement, which also makes arcade games, will launch the computers in other markets.
