Intel put out another line of high-end chips, its 10th-gen “Comet Lake” series, which added a few cores but is still based on the company's aging 14nm transistor design, and AMD countered with Zen 3, an improved version of its desktop architecture that now goes up to 16 cores. While Zen 3 didn’t increase core counts or clock speeds dramatically, it did deliver a big boost in instructions per clock without increasing power consumption.
With the Zen 3 chips, AMD has reclaimed the desktop performance crown, and the flagship 16-core 5950X is one of the fastest processors we've ever seen. But how much speed do you really need?
RISC-V is a new, fully open-source processor design and instruction set. Developed at UC Berkeley, RISC-V aims to do away with license fees and proprietary tech to make processor design accessible to companies all over the world.
QLC flash is enabling bigger and bigger SSDs, including the first 8TB consumer flash drives, but it comes with serious drawbacks. How exactly does flash memory work, what is QLC, and what is it good for?
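As background on what "QLC" means: each extra bit stored per flash cell doubles the number of voltage levels the controller has to distinguish, which is why capacity goes up while endurance and write speed suffer. A quick sketch:

```python
# Each extra bit stored per flash cell doubles the number of voltage
# levels the controller must distinguish: levels = 2 ** bits_per_cell.
# More levels means more capacity per cell, but smaller margins between
# levels -- hence QLC's lower endurance and slower writes.
CELL_TYPES = {"SLC": 1, "MLC": 2, "TLC": 3, "QLC": 4}

def voltage_levels(bits_per_cell: int) -> int:
    return 2 ** bits_per_cell

for name, bits in CELL_TYPES.items():
    print(f"{name}: {bits} bit(s) per cell -> {voltage_levels(bits)} voltage levels")
```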
The semiconductor GaN has already changed the world once: it's at the heart of blue and white LEDs. Now researchers are looking at how this material could revolutionize power systems, space travel, telecommunications, and even processors.
For the latest episode of our explainer show Upscaled, we dive into the history of the RISC architecture. With Apple dropping Intel for its own RISC-based CPUs, this will be the first time in decades we've seen a company try to produce high-end RISC processors.
For the latest episode of our explainer show Upscaled, we compared Intel’s new flagship 10-core chip, the i9-10900K, with AMD’s budget-level Ryzen 3 3300X. We were trying to answer one question: How much does your CPU matter in gaming performance?
In this mini-episode of our explainer show, Upscaled, we break down NVIDIA's latest GPU, the A100, and its new graphics architecture, Ampere. Announced at the company's long-delayed GTC conference, the A100 isn't intended for gamers, or even for workstation users; it's a data-center chip, the successor to the Volta architecture. Volta never directly came to consumers — aside from the Titan V and a Quadro workstation card — but the improvements and tensor cores it introduced were a key part of Turing, the architecture which underpins almost all of NVIDIA's current GeForce and Quadro cards.
Now, with its 10th-gen “Comet Lake” desktop chips, Intel is offering up to 10 cores on top-line chips. These Intel 10th-gen chips are also still based on Intel’s aging 14nm manufacturing process and Skylake architecture, so aside from more cores, don’t expect huge performance gains here.
The Xbox One and PS4 were the start of an unusual console generation. Both systems adopted a very "PC-like" architecture, and instead of a new generation we got a refresh, with Microsoft and Sony both releasing faster versions of their existing consoles, the PS4 Pro and Xbox One X. We may still have a few months left to wait, but we've finally gotten details of what the next real generation will bring. Microsoft and Sony have both stuck with the PC-like design of their predecessors, and are again using AMD as their CPU and GPU supplier, but the Xbox Series X and PS5 will be very different from the current generation. The biggest changes come from the storage systems. Considering the Xbox One and PS4 both still use slow mechanical hard drives, we figured a move to smaller, faster, more efficient flash-based SSDs was inevitable, but Microsoft and Sony have gone all out. Both systems feature custom storage interfaces with PCI Express 4.0 SSDs and custom hardware to handle real-time decompression, which means these drives will move serious amounts of data very quickly. The Series X is claiming transfer speeds of around 3-4GB/s, while the PS5 may be capable of data rates as high as 9GB/s. With data rates that high (the current consoles manage maybe 150MB/s in ideal conditions), load times should be cut down to seconds, and in-game load screens may become a thing of the past. Faster data rates could also enable higher-resolution textures for more photo-realistic graphics, and let you switch between games with the click of a button. Add in a significantly upgraded CPU and a long-awaited AMD ray tracing solution for hyper-realistic lighting, and these consoles represent a huge leap forward. Hopefully, their benefits will trickle down into the PC space as well, and games on every platform will be able to leverage these new possibilities to be faster and better looking.
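To put those throughput claims in perspective, here's a rough back-of-the-envelope comparison in Python. The 40GB asset payload is an illustrative assumption, not a figure from either spec sheet:

```python
# Rough load-time comparison at each platform's claimed storage
# throughput. The 40GB payload is an illustrative assumption,
# not a number from either console's specs.
SPEEDS_MB_PER_S = {
    "PS4/Xbox One HDD": 150,       # ~real-world mechanical drive
    "Xbox Series X SSD": 3500,     # ~3-4GB/s claimed
    "PS5 SSD (compressed)": 9000,  # up to ~9GB/s claimed
}

payload_mb = 40 * 1024  # 40GB of game assets

for name, speed in SPEEDS_MB_PER_S.items():
    print(f"{name}: {payload_mb / speed:.0f} s to load")
```

Even with generous assumptions for the hard drive, the gap works out to minutes versus seconds, which is why in-game load screens could largely disappear.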
We don't know exactly when they'll arrive yet, or how much they'll cost, but we'll have more details on the Series X and PS5 as soon as they're announced.
Last time on our explainer show Upscaled, we took a look at 5G, the new high-speed mobile technology starting to roll out in 2020. But 5G isn't entirely new technology; it builds on the mobile networks already in place around the world. This has been the pattern for mobile networks: incremental upgrades that add more capacity and speed bit by bit. This approach has yielded incredible results; as much as we might gripe about coverage or speeds today, the first real cell networks could only support about a dozen calls per tower, had no data capacity, and used unencrypted analog signals that were easy to intercept.
Game genres can wax and wane in popularity. It's a sad truth that two of my favorites, real-time strategy and space simulation games, have been on the decline since the glory days of TIE Fighter and Warcraft. We may never see the likes of Warcraft 3 again, but in recent years there have been some valiant attempts to revive the space sim, with releases like Everspace, Rebel Galaxy, and Elite breathing new life into the genre. But I haven't been playing any of those, because all my free time has been completely consumed by Freespace 2.
We've been hearing about 5G for ages, and 2020 is the year it'll finally become a reality for some people. Until this point there have been a few sparse 5G networks available in cities, but with only a handful of phones supporting 5G, even if you lived in an area with coverage, odds are you couldn't connect. That's all set to change, with a host of new 5G phones expected to be announced through 2020 and providers all around the world starting to switch on additional 5G towers. Even so, it's hard to know what to expect from 5G. Depending on your provider and your network, you may get blazing fast speeds but only in certain places, a bump in reliability without much speed, or anything in between. It turns out 5G isn't really one thing; it's a collection of technologies and new frequency bands, and different carriers are focusing on different aspects of the network.
Hades is the game that finally got me to try an early-access release. I should explain: I generally feel you should buy a game only once it's been released and reviewed. In an era where physical stock is rarely a concern, pre-ordering games basically means you're giving publishers free money, but early-access titles go a step further -- it's like paying to be a beta tester.
Aside from a few laptop launches, recently CES hasn't brought much computer news. AMD seems intent on changing that, launching the Radeon VII last year, and this year announcing a slate of new mobile processors. While AMD's chips may be changing the desktop landscape, bringing eight-, twelve-, and now sixteen-core processors within reach of the average desktop user, the company has struggled in the mobile space. Put simply, AMD's current mobile chips are too slow, use too much power, and were a generation behind the minute they launched. But now at CES, AMD CEO Lisa Su announced the company's 4000 line of mobile processors, which promise double the performance per watt of its current lineup (critical for mobile chips). AMD's own benchmarks, which admittedly must represent a best-case scenario, show its 15-watt low-power chips trading blows with Intel's 1065G7 laptop chip -- which currently powers the Dell XPS 13. AMD says its new 4800U should be 4% faster in single-core performance, but a staggering 90% faster in multicore benchmarks.
All the current camera hype may be around mirrorless cameras, but Canon is determined to prove DSLRs aren't dead. With the new 1DX Mark III, Canon has created what might be not only the best photo camera for sports, news and wildlife, but also an exceptional video camera. First things first: this is a giant DSLR. If you're used to nice compact mirrorless cameras, a pro camera like the 1DX or Nikon D5 feels enormous. For one, with its second grip (complete with control wheels) and a second shutter button so you can swap to portrait orientation, the camera's body is almost square.
Microchips are so ubiquitous, it's easy to lose sight of how remarkable they actually are. Something as mundane as a thermostat or singing greeting card contains millions of microscopic structures created in one of the most remarkable manufacturing processes ever developed. The current process has been evolving since around 1977, and works sort of like a projector. Lasers shine light through a mask, which is like the blueprint for the chip, and project the mask onto light-sensitive chemicals painted onto a slab of silicon. The result is almost like exposing a photograph: The light transmits the image of the chip onto the silicon, where it can be etched directly into the material below. This process is called photolithography, and as it's become more advanced, transistors have gotten smaller, faster and more energy efficient.
Phone cameras have undergone huge improvements in recent years, but they've done so without the hardware changing all that much. Sure, lenses and sensors continue to improve, but the big developments have all been in software. So-called computational photography is using algorithms and even machine learning to stitch together multiple photos to yield better results than were previously possible from a tiny lens and sensor. Smartphones are limited by physics. With a small sensor, narrow lens aperture and shallow depth, there are serious challenges in designing an improved phone camera. In particular, these mini cameras suffer from noise -- digital static in the images -- particularly in low light. Combine this with limited dynamic range, and you've got a camera that can perform pretty well in bright daylight, but where image quality starts to suffer as the light dims.
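One of the simplest computational-photography tricks is frame stacking: averaging several noisy exposures of the same scene so that random sensor noise cancels out. A toy NumPy sketch, with made-up scene and noise values purely for illustration:

```python
import numpy as np

# Toy frame-stacking demo: averaging N noisy exposures of the same
# scene shrinks random sensor noise by roughly sqrt(N).
# Scene brightness and noise level here are illustrative, not real data.
rng = np.random.default_rng(0)
scene = np.full((64, 64), 100.0)  # "true" brightness of a flat scene

def noisy_frame():
    # Simulate one exposure with additive Gaussian sensor noise
    return scene + rng.normal(0.0, 20.0, scene.shape)

single = noisy_frame()
stacked = np.mean([noisy_frame() for _ in range(16)], axis=0)

print(f"single-frame noise:   {np.std(single - scene):.1f}")
print(f"16-frame stack noise: {np.std(stacked - scene):.1f}")  # ~4x lower
```

This is only one ingredient; real phone pipelines also align frames, reject motion-blurred pixels, and blend exposures of different lengths for dynamic range.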
With 4K now firmly in place as the standard for most new TVs, high dynamic range, or HDR, video is starting to move from being an enthusiast curiosity to the next big thing in home media. HDR content looks vibrant and crisp, and can be a bigger upgrade than 4K, but what goes into making those great images? Part of the confusion is that HDR isn't one thing; it's at least four different technology standards being unevenly applied by about the same number of competing video formats. These video standards, with opaque names like Rec2020 and SMPTE 2084, build on dozens of previous standards, going back to black-and-white CRT televisions and the dawn of broadcast. In short, it's all kind of a mess.
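To make one of those opaque names concrete: SMPTE 2084 defines the "PQ" transfer curve used by HDR10 and Dolby Vision, which maps absolute luminance (0 to 10,000 nits) to a 0-1 signal value. A minimal sketch of the encoding side; the constants are my transcription of the published spec, so verify them against the standard before relying on them:

```python
# Sketch of the SMPTE ST 2084 "PQ" encoding curve (HDR10, Dolby Vision).
# Maps absolute luminance in nits to a normalized signal value.
# Constants transcribed from the spec -- verify before reuse.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Luminance in cd/m^2 (nits) -> PQ-encoded signal in [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2
```

Note how non-linear the curve is: SDR reference white (about 100 nits) already lands around half the signal range, leaving the rest for highlights.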
Welcome to the latest episode of Upscaled, our explainer show where we look at the components that make our tech faster. In this episode, we're taking a close look at AMD's RDNA graphics architecture. AMD's previous architecture, GCN, powered the Xbox One and PlayStation 4, and for the past few years has also provided the graphics power for Apple's higher-end laptops and desktops. But in the high-performance gaming space, AMD has been struggling in recent years, and many people were looking to RDNA as the product that would put AMD back on top. So does it live up to the hype? Well, not yet. The RDNA-based cards that AMD has released so far, the 5700 and 5700 XT, are perfectly fine performers, but they didn't exactly set the high-end graphics market on fire the way many people had hoped they would. So what actually makes RDNA different from the GCN-based graphics cards that came before it? Despite some claims that it's too similar to GCN to really be called a new architecture, RDNA does redesign how the GPU processes graphical data at a pretty fundamental level. These changes should dramatically increase the efficiency of the GPU, and coupled with a more flexible design, could lead to graphics processors that scale all the way from super-high-end PC cards down to smartphone chips.
Welcome to the latest episode of Upscaled, our explainer show where we look at the parts that make our favorite tech faster. In this episode we're testing Intel's 10th-gen, 10nm CPUs. Smaller transistors are one of the ways companies can speed up a chip, and Intel has been trying to get its transition from 14nm to 10nm processors going for years. This proved to be a bigger technological challenge than anyone expected, but finally Intel is gearing up to release a new line of chips based on 10nm technology. Code-named "Ice Lake", these CPUs promise to be faster and way more power efficient, and could help bring about a new generation of speedy, ultra-long-lasting laptops. But only laptops, because for now at least, Intel hasn't announced any desktop parts based on its new Ice Lake architecture. When it does, if they're anything like the mobile parts we tested, expect them to be seriously speedy. The CPU we benchmarked was the i7-1065G7, and despite being a pretty low-powered chip, it delivered some impressive performance.
Welcome to the latest episode of Upscaled, our explainer show where we look at the components and parts that make our favorite tech better. In this episode we're checking out AMD's new Zen 2 processors. These chips are crazy fast, and pack up to 16 cores into a consumer design, all while being remarkably power efficient. So how did AMD do it? Processor clock speeds have barely increased in years, but chips keep getting faster. What tricks are engineers using to keep the improvements coming? One of AMD's areas of focus was instructions per clock, or IPC, a measure of how quickly the CPU can execute basic operations, and an area where AMD has historically lagged behind the competition.
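As a rough mental model, sustained CPU throughput scales with cores x clock x IPC, so an IPC gain alone translates directly into performance even when core counts and clocks stay flat. A toy illustration, with made-up numbers rather than AMD's published figures:

```python
# Rough model: sustained CPU throughput ~ cores x clock x IPC.
# Holding cores and clock fixed, an IPC gain is a direct speedup.
# The numbers below are illustrative, not AMD's figures.
def throughput(cores: int, ghz: float, ipc: float) -> float:
    return cores * ghz * ipc

old = throughput(cores=8, ghz=4.0, ipc=1.00)
new = throughput(cores=8, ghz=4.0, ipc=1.15)  # +15% IPC, same clock

print(f"speedup from IPC alone: {new / old:.2f}x")
```

This is why architects chase IPC when clock speeds stall: it's the one factor that improves performance without burning more power on higher frequencies.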
Welcome to the latest episode of Upscaled, our explainer show where we look at the components and parts that make our favorite tech better. This week, we're talking about Apple's new Mac Pro, an insanely powerful new computer that's also a major change from the previous model. The last Mac Pro was a sleek black cylinder, a radical new design for a desktop computer, but one that ultimately limited the ability to upgrade it with new parts. The new design is a return to a typical desktop tower, or so it appears. Under the hood, there are a number of unusual design decisions and parts that potentially give the new Mac Pro unique capabilities, but may saddle it with some of the same flaws as its predecessor.
Welcome to the latest episode of Upscaled, our explainer show where we look at the components and parts that make our favorite tech better. This week, we're actually taking a step away from components to talk about 4K movies and TV. As companies push higher-resolution screens and cameras, 4K has become the standard for high-quality content. Distributors have also embraced high resolution, with 4K streaming becoming more common on platforms like Netflix and Amazon, and Ultra HD Blu-rays generally considered the best video quality you can get at home.