Hades is the game that finally got me to try an early-access release. I should explain: I generally feel you should buy a game only once it's been released and reviewed. In an era where physical stock is rarely a concern, pre-ordering games basically means you're giving publishers free money, but early-access titles go a step further -- it's like paying to be a beta tester.
Aside from a few laptop launches, CES hasn't brought much computer news in recent years. AMD seems intent on changing that, launching the Radeon VII last year, and this year announcing a slate of new mobile processors. While AMD's chips may be changing the desktop landscape, bringing eight-, twelve-, and now sixteen-core processors within reach of the average desktop user, the company has struggled in the mobile space. Put simply, AMD's current mobile chips are too slow, use too much power, and were a generation behind the minute they launched. But now at CES, AMD CEO Lisa Su has announced the company's 4000 line of mobile processors, which promises double the performance per watt of the current lineup (critical for mobile chips). AMD's own benchmarks, which admittedly must represent a best-case scenario, show its 15-watt low-power chips trading blows with Intel's Core i7-1065G7 laptop chip -- which currently powers the Dell XPS 13. AMD says its new 4800U should be 4% faster in single-core performance, and a staggering 90% faster in multicore benchmarks.
All the current camera hype may be around mirrorless cameras, but Canon is determined to prove DSLRs aren't dead. With the new 1DX Mark III, Canon has created what might be not only the best photo camera for sports, news and wildlife, but also an exceptional video camera. First things first: this is a giant DSLR. If you're used to nice compact mirrorless cameras, a pro camera like the 1DX or Nikon D5 feels enormous. With its second grip (complete with control wheels) and a second shutter button so you can swap to portrait orientation, the camera's body is almost square.
Microchips are so ubiquitous that it's easy to lose sight of how remarkable they actually are. Something as mundane as a thermostat or a singing greeting card contains millions of microscopic structures created in one of the most remarkable manufacturing processes ever developed. The current process has been evolving since around 1977, and works a bit like a projector. Lasers shine light through a mask, which is effectively the blueprint for the chip, and project that pattern onto light-sensitive chemicals coated on a slab of silicon. The result is almost like exposing a photograph: the light transfers the image of the chip onto the silicon, where it can be etched directly into the material. This process is called photolithography, and as it's become more advanced, transistors have gotten smaller, faster and more energy efficient.
Phone cameras have undergone huge improvements in recent years, but they've done so without the hardware changing all that much. Sure, lenses and sensors continue to improve, but the big developments have all been in software. So-called computational photography uses algorithms and even machine learning to stitch together multiple photos, yielding better results than were previously possible from a tiny lens and sensor. Smartphones are limited by physics: with a small sensor, a narrow lens aperture and a shallow camera module, designing an improved phone camera is a serious challenge. In particular, these mini cameras suffer from noise -- digital static in the images -- particularly in low light. Combine this with limited dynamic range, and you've got a camera that performs pretty well in bright daylight, but whose image quality starts to suffer as the light dims.
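The core trick behind most computational photography -- stacking multiple exposures -- can be illustrated in a few lines of Python. This is a toy sketch with simulated pixel values, not a real camera pipeline: because sensor noise is random, averaging several frames of the same scene cancels most of it out, which is why phone night modes fire off a burst of shots.

```python
import random

def average_frames(frames):
    """Average per-pixel values across a stack of frames (lists of floats)."""
    n = len(frames)
    return [sum(pixel_values) / n for pixel_values in zip(*frames)]

# Simulate a flat gray scene: every pixel's true value is 100, and each
# captured frame adds independent random sensor noise on top.
random.seed(0)
true_value = 100.0
frames = [[true_value + random.gauss(0, 10) for _ in range(1000)]
          for _ in range(16)]

single = frames[0]              # one noisy exposure
stacked = average_frames(frames)  # 16 exposures combined

def noise_std(pixels):
    mean = sum(pixels) / len(pixels)
    return (sum((p - mean) ** 2 for p in pixels) / len(pixels)) ** 0.5

# Averaging 16 frames should cut noise by roughly sqrt(16) = 4x.
print(round(noise_std(single) / noise_std(stacked), 1))  # prints a ratio close to 4
```

Real pipelines like night modes add alignment (the phone moves between shots) and clever merging to avoid ghosting, but the statistics are the same: noise shrinks with the square root of the number of frames.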
With 4K now firmly in place as the standard for most new TVs, high dynamic range, or HDR, video is starting to move from enthusiast curiosity to the next big thing in home media. HDR content looks vibrant and crisp, and it can be a bigger upgrade than 4K -- but what goes into making those great images? Part of the confusion is that HDR isn't one thing: it's at least four different technology standards, unevenly applied by about as many competing video formats. These video standards, with opaque names like Rec. 2020 and SMPTE 2084, build on dozens of previous standards going back to black-and-white CRT televisions and the dawn of broadcast. In short, it's all kind of a mess.
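To make one of those opaque names concrete: SMPTE 2084 defines the PQ (perceptual quantizer) transfer function, the curve most HDR formats use to map a 0-to-1 video signal to an absolute brightness in nits. A minimal Python sketch of that curve, using the constants published in the standard:

```python
# Constants from SMPTE ST 2084 (the PQ transfer function).
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """Map a normalized PQ code value (0..1) to display luminance in nits."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(0.0))  # 0.0 nits: pure black
print(pq_eotf(1.0))  # 10000.0 nits: the theoretical peak of the PQ curve
print(pq_eotf(0.5))  # well under 100 nits -- most code values describe shadows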
Welcome to the latest episode of Upscaled, our explainer show where we look at the components that make our tech faster. In this episode, we're taking a close look at AMD's RDNA graphics architecture. AMD's previous architecture, GCN, powered the Xbox One and PlayStation 4, and for the past few years has also provided the graphics power for Apple's higher-end laptops and desktops. But in the high-performance gaming space, AMD has been struggling in recent years, and many people were looking to RDNA as the product that would put AMD back on top. So does it live up to the hype? Well, not yet. The RDNA-based cards that AMD has released so far, the 5700 and 5700 XT, are perfectly fine performers, but they didn't exactly set the high-end graphics market on fire the way many people had hoped they would. So what actually makes RDNA different from the GCN-based graphics cards that came before it? Despite some claims that the two are too similar for RDNA to really be called a new architecture, RDNA does redesign how the GPU processes graphical data at a pretty fundamental level. These changes should dramatically increase the efficiency of the GPU, and coupled with a more flexible design, could lead to graphics processors that scale all the way from super-high-end PC cards down to smartphone chips.
Welcome to the latest episode of Upscaled, our explainer show where we look at the parts that make our favorite tech faster. In this episode we're testing Intel's 10th-gen, 10nm CPUs. Smaller transistors are one of the ways companies can speed up a chip, and Intel has been trying to get its transition from 14nm to 10nm processors going for years. This proved to be a bigger technological challenge than anyone expected, but Intel is finally gearing up to release a new line of chips based on 10nm technology. Code-named "Ice Lake," these CPUs promise to be faster and far more power efficient, and could help usher in a new generation of speedy, ultra-long-lasting laptops. But only laptops, because for now at least, Intel hasn't announced any desktop parts based on its new Ice Lake architecture. When it does, if they're anything like the mobile parts we tested, expect them to be seriously speedy. The CPU we benchmarked was the Core i7-1065G7, and despite being a pretty low-powered chip, it delivered some impressive performance.
Welcome to the latest episode of Upscaled, our explainer show where we look at the components and parts that make our favorite tech better. In this episode we're checking out AMD's new Zen 2 processors. These chips are seriously fast, packing up to 16 cores into a consumer design, all while being remarkably power efficient. So how did AMD do it? Processor clock speeds have barely increased in years, but chips keep getting faster. What tricks are engineers using to keep the improvements coming? One of AMD's areas of focus was instructions per clock, or IPC, a measure of how much work the CPU completes in each clock cycle, and an area where AMD has historically lagged behind the competition.
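The reason IPC matters is simple arithmetic: single-thread performance is roughly IPC multiplied by clock speed, so raising IPC speeds a chip up even when the clock doesn't budge. A quick sketch with illustrative numbers (the 15% uplift here is hypothetical, chosen only to show the math):

```python
def relative_performance(ipc, clock_ghz):
    # Single-thread throughput is roughly instructions-per-clock times clock speed.
    return ipc * clock_ghz

old = relative_performance(ipc=1.00, clock_ghz=4.0)  # baseline design
new = relative_performance(ipc=1.15, clock_ghz=4.0)  # +15% IPC, identical clock

print(f"{(new / old - 1) * 100:.0f}% faster")  # prints "15% faster"
```

This is why architects chase IPC through wider execution units, bigger caches and better branch prediction: it's performance that doesn't cost extra gigahertz, and extra gigahertz cost a lot of power.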
Welcome to the latest episode of Upscaled, our explainer show where we look at the components and parts that make our favorite tech better. This week, we're talking about Apple's new Mac Pro, an insanely powerful computer that's also a major change from the previous model. The last Mac Pro was a sleek black cylinder, a radical new design for a desktop computer, but one that ultimately limited the ability to upgrade it with new parts. The new design is a return to a typical desktop tower, or so it appears. Under the hood, there are a number of unique design decisions and parts that potentially give the new Mac Pro unique capabilities, but may saddle it with some of the same flaws as its predecessor.
Welcome to the latest episode of Upscaled, our explainer show where we look at the components and parts that make our favorite tech better. This week, we're actually taking a step away from components to talk about 4K movies and TV. As companies push higher-resolution screens and cameras, 4K has become the standard for high-quality content. Distributors have also embraced higher resolutions, with 4K streaming becoming more common on platforms like Netflix and Amazon, and Ultra HD Blu-rays generally considered the best video quality you can get at home.
In our show Upscaled we try to dig into the science and engineering behind our favorite bits of tech. In this episode, we're taking a close look at foldable OLEDs. Despite Samsung's Galaxy Fold being delayed, the Huawei Mate X is expected within the next few months, and Xiaomi and Lenovo have both shown off foldable prototypes.
In the second episode of Upscaled, our new deep-dive explainer series focused on the components that make tech better, we're looking at what's going on with CPUs. Chips are still getting faster, but at a much slower rate than most predicted. 2019 might be the year that finally changes, though, and we're excited about a few developments that should be coming later this year.
While most photographers have left film far behind, many of us are still reliant on another piece of camera tech that's over 70 years old: a mirror. Mirrorless cameras ditch that mirror to let lenses project light directly onto the sensor, and that leads to a host of other differences in how they capture images compared to their DSLR forebears.
Whether you're interested in taking better photos, video or both, the best thing you can do is to practice and take the time to get to know your equipment. But, if you're finding situations where you're struggling to get the perfect shot, or looking for ways to stretch creatively, there's a world of gadgets out there you can use to mix things up.
Welcome to the first episode of our new explainer series, Upscaled. We're going to be examining the components and gadgets that are helping move technology forward, and in this first episode, we're looking at graphics cards.
When AMD announced it was developing new GPUs for data centers in mid-2018, it was clear they weren't intended for gaming. AMD was in a tough spot: NVIDIA was gearing up to release its RTX cards with ray-tracing and AI-powered tech that AMD couldn't compete with. The feeling was that AMD had decided to cede the high end to NVIDIA and focus on the mid-range (where most sales are). A new high-end gaming card wasn't expected for another year at least. These data-center cards, the Instinct MI60 and MI50, took AMD's previous flagship gaming chip (named Vega 10) and shrunk the transistors from a 14nm to a 7nm process. A smaller manufacturing process yields smaller transistors that can run faster, or use less power at the same speed. When the Instinct cards were announced in November, they were a refined version of last year's gaming cards, with enterprise features like error correction and support for super-high-precision math. Take those features away from an Instinct MI50 and you have something that looks very similar to the Radeon VII.
This has been a big year for the camera industry, with Nikon and Canon releasing their first professional full-frame mirrorless cameras. As the name implies, these cameras ditch the traditional mirror and moving parts of a DSLR for a more compact body. But Nikon and Canon's models are a little late to the party, and Sony, Fuji and Panasonic have carved out a bigger share of the market. Just as important, though, Nikon and Canon's mirrorless cameras also served to launch new lens mounts (and accompanying sets of lenses) from both companies. Lens mounts are at the heart of any camera and tend to remain standard for 30 years or more -- an eternity in our current era of disposable tech. These new mounts were a tricky proposition: they had to enable future technologies without alienating pros who've spent thousands of dollars on existing lenses. We sat down with Steve Heiner of Nikon and Drew Maccallum of Canon -- both veterans of the photo world -- to talk about these new mirrorless cameras and what the future of photography has in store.
When Razer released its Nari gaming headset this past fall, it brought haptics, or vibration feedback, to a new realm: your head. I'm not sure anyone was asking for that, but if the idea of adding some haptic feedback to your favorite games sounds appealing in any way, Razer may soon have the product(s) for you.
With mouse and keyboard support coming to Xbox One last November, the walls between PC and console are coming down. Nearly any modern wired or wireless mouse should work with the Xbox One, but while lots of living room gamers may be ready to jump into the fray with the added precision of a mouse, most keyboard/mouse setups are not made to be easily used from the couch.
After debuting last fall, NVIDIA's RTX line of graphics cards is making its way to notebooks. And that means pretty much every gaming laptop under the sun -- good, bad and everything in between -- is due for a refresh. Thankfully, this one's definitely in the "good" category. We loved Razer's 2018 Blade reboot, calling it "almost perfect" in our review. Today the company is announcing that its popular 15-inch laptop will be configurable with up to RTX 2080 Max-Q graphics, with RTX 2070 Max-Q and RTX 2060 filling out the graphics card options.
The Mac Mini has had a rough few years. Its last update, in 2014, was disappointing. After offering quad-core CPUs on the 2011 and 2012 editions, the 2014 model was stuck with a dual-core CPU. This meant it was actually slower at some tasks than the computer it was supposed to replace. Add in the fact that aside from storage it was not upgradable, and you had a computer that left a lot of users unhappy. Amazingly, until last month the 2014 Mini was still available on Apple's web store for $500. The lack of updates over the past four years left a lot of us wondering if we'd ever see a new model. Fortunately, Apple has rectified the situation with the 2018 Mini. This new model retains the unibody design that we loved on the 2014 edition but sports a sleek space-gray color -- a first for the Mini line. (It's also now made entirely from recycled aluminum, as is the new MacBook Air.) With vastly improved components, the Mini is now a viable competitor in the compact-desktop market. And it does have competition. In the past four years, micro PCs have vastly improved, and most of the major manufacturers now offer a tiny Windows machine. Still, I was impressed with the Mini's performance, and it's the cheapest way to get a macOS machine. Despite this, the 2018 Mini has a few flaws that will probably keep it from being the best choice for most people.
Intel has released its new line of desktop processors, including the i9-9900K, an eight-core CPU that can boost up to 5GHz. These chips are certainly fast, but they also showcase some of the challenges Intel and the entire chip industry have faced in crafting speedier processors. In the early 2000s, most people would have predicted we'd have 5GHz chips by around 2008. Though the first 5GHz chips did finally appear in 2013, they were outperformed by most other high-end chips on the market. So how can a processor that runs at a higher clock speed perform worse than a slower chip, and how fast will the i9-9900K really be?
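The answer to that question is arithmetic: what matters is clock speed multiplied by instructions per clock (IPC), not clock speed alone. A chip that sacrifices too much IPC to hit a headline frequency can lose to a "slower" rival. A rough sketch with hypothetical numbers (not measurements of any real chip):

```python
def throughput(clock_ghz, ipc):
    # Billions of instructions completed per second: clock speed times IPC.
    return clock_ghz * ipc

# A long-pipeline design hits a high clock but finishes fewer instructions
# each cycle; a wider design clocks lower but gets more done per tick.
speed_demon = throughput(clock_ghz=5.0, ipc=0.8)  # 4.0 billion instr/sec
wide_design = throughput(clock_ghz=4.0, ipc=1.2)  # 4.8 billion instr/sec

print(wide_design > speed_demon)  # prints True: the 4GHz chip wins
```

This is exactly what sank the early 5GHz chasers: the pipeline tricks needed to reach those clocks (plus the heat they generated) cost more in per-cycle work than the extra frequency gave back.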
Ray tracing has long been gaming's holy grail. A method of creating hyper-realistic lighting and graphics, ray tracing has for years been promised as the technology that will take games the next step closer to total realism. It has perennially been just on the horizon, but at GDC 2018, both NVIDIA and Microsoft showed off technology that could make real-time ray tracing a reality. Typical graphics technologies struggle with how light works. Most games use rasterization, which draws a frame almost the same way someone paints a picture: one bit at a time, with a lot of approximation. Ray tracing hews closer to how light behaves in the real world, modeling millions of beams of light and calculating how they'd bounce around a scene.
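At its heart, each of those beams is a small geometry problem: does this ray hit anything, and where? A minimal sketch of the classic ray-sphere intersection test, the building block most ray-tracing tutorials start from (this is illustrative, not how any particular engine implements it):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    `direction` is assumed to be a unit vector, so the quadratic's leading
    coefficient is 1.
    """
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4 * c
    if discriminant < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(discriminant)) / 2  # nearer of the two roots
    return t if t > 0 else None

# Fire a ray from the "camera" straight down the z-axis at a unit sphere
# centered 5 units away.
hit = ray_sphere_hit(origin=(0, 0, 0), direction=(0, 0, 1),
                     center=(0, 0, 5), radius=1)
print(hit)  # 4.0 -- the ray strikes the near surface of the sphere
```

A real renderer runs tests like this for millions of rays per frame, then spawns new rays at each hit point to model reflections and shadows, which is why the technique has needed dedicated hardware to run in real time.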
Intel's 8th-generation "Coffee Lake" CPUs are now on the market. These chips come with a modest bump in CPU frequency, but the big news is that Intel is finally adding six-core processors to its mainstream i7 and i5 lines. More cores mean these chips will perform better at tasks that benefit from multithreading, such as content creation and data processing, and the increases in frequency and core count will give a boost to gaming frame rates. Intel used to release chips on a "tick-tock" cycle, in which every release alternated between a new design and a new manufacturing process -- called a "node." A new process node, like moving from 45nm to 22nm, means smaller transistors and a faster or more power-efficient chip. But manufacturing challenges caused tick-tock to falter a few years ago, and new releases are now much harder to predict. "Coffee Lake" is the fourth chip Intel has released at 14nm, and the third based on the "Skylake" design from 2015. New designs and nodes are coming, but we'll probably have to wait until they arrive in 2018 to see a big jump in performance.