The 5GHz CPU wars are back. Should you care?
The 5GHz war between AMD and Intel is back in full force if this year's CES is any indication.
As part of their “one more thing” teases, both companies demonstrated desktop CPUs running at 5GHz or greater. AMD started the rap battle by demoing its next-gen Ryzen 7000 processor running Halo Infinite with all of the CPU cores reportedly at 5GHz or above. Which CPU model and how many cores weren't disclosed, but we're assuming at least eight cores for the feat to be impressive.
Two hours later, Intel fired back with its own game demo of Hitman 3 being played on an upcoming 12th-gen Core i9 “KS” chip, with every performance core running at 5.2GHz. While impressive, Intel doesn't technically qualify for the “all-core boost” prize since the remaining efficiency cores buzzed along at “only” 4GHz. But those performance cores are really what matters when it comes to gaming.
Why is 5GHz such a big deal? You
We know, you're making the whatever face at all this chest puffing because it's no big deal. After all, the original 5GHz line on a desktop PC was crossed by AMD's FX-9590 chip almost nine years ago, and no one cared back then either. So why does it matter this time?
Intel / YouTube
We'd mostly agree that breaking the 5GHz barrier isn't quite the massive deal that AMD and Intel are making it out to be on a practical level when it comes to gaming, but increases in all-core clocks will also generally mean real performance gains for applications and tasks that use more cores. So if you run 3D modeling, lean into Adobe Premiere and Lightroom, or run advanced analysis in Microsoft Excel, the higher all-core boosts should net you decent gains of anywhere from 8 to 11 percent.
Still, the 5GHz breakthrough isn't a game-changer until you consider its biggest advantage: Marketing. Slapping “5GHz” on a CPU box or PC works magic on consumers like nothing else. Yes, logically your brain tells you that 4.9GHz is basically the same as 5GHz, but emotionally that round number tickles all kinds of spots. Don't believe us? Then why are things 99 cents instead of 1 dollar? Or new laptops listed for $2,499 and cars at $27,995? The obvious answer is that silly humans really respond to how we perceive numbers. And it works in every culture across the planet and likely through history. We're sure the first barter ended up going for 19 chickens.
Robert Hallock, the director of technical marketing at AMD, broke down the concept during a recent CES 2022 interview on our Full Nerd podcast. (Jump to the 14:26 mark to hear his “5GHz” vs. “World's best” thoughts, but really, watch the whole thing—Robert and AMD gaming architect Frank Azor dropped all sorts of interesting knowledge bombs on the show.)
“We've done all sorts of market research on what sticks with people,” Hallock said. “When they see a letter, or a number, or a spec on the box, what moves the needle and what doesn't? Big round whole numbers—like 4.0, 4.5, 5.0—that moves the needle quite a lot in consumer preference. But something like 5.1 or 5.2 barely registers on the Richter scale.”
So yes, the push for all-cores at 5GHz and higher is indeed significant, but mostly because big round numbers still work on humans. Fortunately, CPU makers have other ways to push the pedal to the metal with performance.
“Above [the stickiness of round numbers] is use-case relevance,” Hallock continued. “You've moved from specs to ‘is this good for me, and what I want to do?' And so if you're looking for the best CAD CPU, the best gaming, the best software development, the best compiling, that carries even more weight than a spec… And I think that over the last two or three years in particular, we've seen a decline on general market focus on frequency. I think people are realizing that (for example) Ryzen can come to the table at 4.6 or 4.7GHz and credibly beat a CPU that might be running at 5.1 to 5.2, and that's a 500 to 600 megahertz spread, how do you reconcile that? And maybe the answer is that sometimes frequency doesn't always matter.”
It's in AMD's interest to say that of course, as the company is pitching its upcoming Ryzen 7 5800X3D with radical new V-Cache technology as the “world's best gaming CPU,” claiming that it topples Intel's Core i9-12900K and even AMD's own Ryzen 9 5900X despite a noticeable decrease in clock frequencies thanks to all that extra cache stacked on top of the chip. Squeezing ever-more performance out of these increasingly complex pieces of silicon isn't as simple as just cranking up the clocks anymore, as Hallock explains in other portions of the interview.
But make no mistake: Faster chips are nothing but a good thing, no matter how that speed is achieved, and we're looking forward to the 5GHz war brewing in 2022. Intel's 12th-gen KS chip is scheduled to launch sometime this quarter, with Ryzen 7000 CPUs expected in the second half of the year. The Ryzen 7 5800X3D that AMD says will be the “world's fastest gaming CPU” despite topping out at 4.5GHz boost clocks will be available this spring.
Nvidia’s 12GB GeForce RTX 3080 packs a bigger GPU and a much bigger price tag
Nvidia only just revealed the GeForce RTX 3050 and RTX 3090 Ti during its CES 2022 keynote last week, but that wasn't all the company had up its sleeve. On Tuesday morning, Nvidia quietly and surprisingly launched a new 12GB version of GeForce RTX 3080 that fixes our biggest (albeit minor) gripe with an otherwise stellar graphics card.
The 10GB of VRAM on the original RTX 3080 felt somewhat lacking right out of the gate, as games like Doom already chew up immense amounts of memory with every graphics knob cranked at 4K resolution. It probably wasn't something you'd have to worry about in most games anytime soon, but the new version's 12GB of GDDR6X alleviates any potential concerns. Nvidia also expanded the memory bus width in this model from 320-bit to 384-bit, which boosts overall memory bandwidth to 912GB/s, up from 760GB/s in the original.
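The arithmetic behind those bandwidth figures is straightforward: peak bandwidth is the bus width in bits, divided by 8 bits per byte, times the per-pin data rate. A quick sketch, assuming the 19Gbps GDDR6X rate consistent with both cards' published numbers:

```python
def gddr_bandwidth_gbs(bus_width_bits: int, per_pin_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * per_pin_gbps

print(gddr_bandwidth_gbs(320, 19.0))  # 760.0 GB/s -- original 10GB RTX 3080
print(gddr_bandwidth_gbs(384, 19.0))  # 912.0 GB/s -- new 12GB model
```

Widening the bus alone accounts for the entire bandwidth jump; the memory chips themselves run at the same speed.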
Nvidia also tweaked the GPU configuration of the new 12GB RTX 3080, bumping the CUDA graphics core count from 8704 in the original to 8960 now. That, combined with the upgraded memory, should make the 12GB RTX 3080 just a wee bit faster than the original, though we would've preferred that Nvidia gave this card a slightly different name (like RTX 3080 Super) given the small-but-key GPU tweaks.
Don't bother trying to hunt down a Founders Edition version of this graphics card. Nvidia told PCWorld to contact board vendors (like EVGA, Asus, MSI, et cetera) for more product information—including pricing. Yes, Nvidia didn't even slap an official MSRP on this bad boy. Gulp. The original 10GB version ostensibly retailed for $700 but is currently impossible to find in traditional stores, and is going for $1,500 to $2,000-plus on eBay depending on the model. Thanks to the ongoing GPU shortage, don't expect this upgrade to come cheap.
Update: Yep, these aren't cheap. EVGA's standard 10GB RTX 3080 XC Ultra Gaming model costs $840 on the company's website. The 12GB RTX 3080 XC Ultra Gaming costs $1,250—a $410 markup, and even more than the non-Ultra version of the step-up RTX 3080 Ti XC Gaming, which costs $1,210. Well, if any of them were in stock, that is. Considering that mining desirability is largely driven by memory bandwidth, that isn't really surprising, even if it's a bit disheartening.
Still, more GPUs is better than fewer GPUs. You might not have noticed the 12GB RTX 3080's launch if you weren't paying close attention—Nvidia didn't release any announcements or blog posts, it only quietly updated the RTX 3080 family's spec page and slipped mention of the GPU into its new driver release notes—but it's here. We'll try to get our hands on one for testing.
Intel and Nvidia just dunked on Apple’s M1 Max. Should you believe the hype?
When Apple finally released its M1 Max processor in October, the Internet predictably saw dark days for PC laptops. Some even felt bad for PC laptop makers being uncompetitive with the MacBook Pro for perhaps “years.” Those predictions may have to be retuned a bit now that Intel and Nvidia have both come out swinging at Apple, however.
Nvidia was the first to step on Apple's sneakers when it announced its new GeForce RTX 3070 Ti and GeForce RTX 3080 Ti Laptop GPUs. Nvidia casually compared not just its newest GeForce RTX 3080 Ti Laptop GPU against Apple's fastest M1 Max, but also the far more pedestrian GeForce RTX 3060 Laptop GPU using Autodesk Arnold, Blender, Chaos V-Ray, OctaneRender and Redshift.
As you can see from the comparison with the MacBook Pro 16's M1 Max, both the new GeForce flagship and the far blander RTX 3060 Laptop GPU simply crush the M1 Max. And by crush, we mean crush, because when a GeForce RTX 3080 Ti Laptop GPU takes 10 minutes to perform a render and a GeForce RTX 3060 Laptop GPU takes 22 minutes using Autodesk Arnold, versus 78 minutes for the M1 Max, it's a beat-down. That's an 87 percent decrease in rendering time for the RTX 3080 Ti vs. the M1 Max, and a 72 percent decrease for the RTX 3060. That's a shellacking no matter how you count it for working creators, but it should be pointed out that many of these apps have long been optimized for Nvidia's GPUs, giving GeForce a home-field advantage.
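Those percentages fall straight out of a relative-reduction calculation on the quoted render times (the function name here is ours, for illustration):

```python
def render_time_cut_pct(baseline_min: float, contender_min: float) -> float:
    """How much shorter the contender's render is, as a percent of the baseline."""
    return (baseline_min - contender_min) / baseline_min * 100

m1_max = 78  # minutes, per Nvidia's Autodesk Arnold comparison
print(round(render_time_cut_pct(m1_max, 10)))  # 87 -- RTX 3080 Ti Laptop GPU
print(round(render_time_cut_pct(m1_max, 22)))  # 72 -- RTX 3060 Laptop GPU
```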
Nvidia
For example, it's not clear whether Nvidia's Blender Cycles testing used the version currently being ported to Apple's M1 and the Metal API. We'd guess not, since the presentation was likely based on numbers prepared well before CES 2022 to clear approval for public release. With Metal support in Blender still at the pre-beta stage, it's highly doubtful Nvidia's Blender score came from that alpha build.
So is it fair if Nvidia shows off a stack of benchmarks arguably optimized for its GPU versus the unknown quantity of M1 Max support? It depends.
If your idea of a good time is to get into a yelling match on Twitter while wearing an Apple team jersey over the “unfairness” of Nvidia's results, then it's definitely not fair. If you're a working professional who gets paid to shovel pixels in Autodesk Arnold, Blender, V-Ray, OctaneRender, or Redshift, then it's most certainly a fair test, since the only thing you probably care about is how fast your hardware can make you money.
Intel steps up too
After Nvidia poked Apple in the nose a few times, Intel jumped into the ring later that morning by saying that its new 12th-gen “Alder Lake” laptop CPUs are faster than not only its older 11th-gen Tiger Lake H CPUs and AMD's Ryzen 9 5900HX, but also Apple's M1 Max. So yes, as the slide below says, the fastest mobile processor. Ever.
Intel
What's that based on? Fortunately, responsible companies show their homework, as Intel did. In fact, Intel shows far more of its work than Nvidia, whose video publishes the results but no information about how it tested the laptops.
Intel says its performance results for the Apple M1 Max is estimated based on: “public statements made by Apple on 10/18/2021 and measurements on Apple M1 Max 16″ 64GB RAM Model A2485. The metric used is the geometric mean of an n-copy SPECrate run of the C/C++ integer benchmarks in SPEC CPU 2017.”
For the uninitiated, SPEC is published by the Standard Performance Evaluation Corporation, an industry group that has come together to create various agreed-upon tests and proclaims itself a “beacon of truth for 30 years.” Members include a who's who of tech companies, including AMD, Apple, Intel, and Nvidia. You're typically required to publish much of the fine print, including what was used to compile the executables for the test.
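The “geometric mean” in Intel's methodology just means multiplying the per-benchmark ratios together and taking the nth root, which keeps one runaway subtest from skewing the summary score. A minimal illustration, with made-up ratios rather than real SPEC CPU 2017 results:

```python
import math

def spec_geomean(ratios):
    """Geometric mean: the nth root of the product of n per-benchmark ratios."""
    return math.prod(ratios) ** (1 / len(ratios))

# Hypothetical ratios for illustration only.
print(spec_geomean([4.0, 9.0]))  # 6.0, versus an arithmetic mean of 6.5
```

The one outlier-heavy score (9.0) pulls the geometric mean up far less than it would an arithmetic average, which is why SPEC mandates it for summarizing rate runs.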
In Intel's case, it said it used ICC for the Windows laptops and Apple's Xcode 13.1 for the M1 Max. To its credit (although some would say it's just to avoid further Imperial entanglements), Intel discloses far more details on how it achieved its claim here.
Intel
Still, the upshot of Intel's tests is that even at around 28 watts of power consumption, the chip easily outperforms the M1 Max in a test even Apple has signed onto. As you push the wattage envelope of the Core i9-12900HK, you're looking at perhaps nearly 45 percent more performance than the M1 Max.
So, what should you believe? One problem with SPEC benchmarks, though sometimes based on actual application code, is the lack of relatability for consumers. They can be useful for computer science students arguing in the quad, but for most people they're pretty esoteric. We'd probably want to see something we can relate to before determining if and how much of a beat down the new 12th-gen Alder Lake gives the M1 Max. It's very hard to argue against a test published by a benchmarking group even Apple is a member of though.
MSI GE76 Raider
And it's good Intel published its homework. We wish Nvidia would have said a little more about how it tested the laptops against the Apple M1 Max—did the workloads include ray tracing features that GeForce GPUs pack dedicated hardware for? But it's hard to complain about what Nvidia did when Apple has been publishing results like the one below. This is a benchmark result Apple showed off for the M1 Max's launch and, frankly, we still have no idea what Apple based it on. The dark gray line represents MSI's killer GE76 Raider gaming laptop that's actually—to Apple's credit—displayed as being faster than the M1 Max. The lighter gray line is a Razer Blade Advanced, which is slightly slower than the M1 Max. Both are outfitted with GeForce RTX 3080 Laptop GPUs.
Apple
The M1 Max may lose or come close to Nvidia's GPUs in raw performance, but Apple's real victory is power consumption. Apple's M1 Max, built on TSMC's 5nm process, is indeed impressive for the power it consumes. At the same time, just what the hell was Apple testing? We have no doubt the M1 Max is indeed efficient, but of the three “our bars are longer” presentations, Apple's is the thinnest on actual details and mostly leaves you wondering just how it determined what it did. If you're concerned over Nvidia or Intel being fair, you should be even more concerned about Apple's claims.
In the end, consumers should always take any vendor's claims with a grain of salt. Wait for independent reviews using tests that relate to what you actually do with your laptop before deciding what to buy.
Best of CES 2022: The most intriguing and innovative PC hardware
CES returned to Las Vegas in 2022, but most of the major PC players decided to stick to the safety of remote events. The parade of PC hardware went on regardless. Big names AMD, Intel, and Nvidia all made splashy announcements, and PC manufacturers followed in their wake with new products, many coming soon. Peripherals and displays proved just as exciting. We also witnessed some deeply interesting reveals that had nothing to do with computers whatsoever, like a chameleon-like color-shifting BMW.
No time to sift through all our CES coverage? No problem! Read on for our Best of CES picks—the most intriguing and innovative products we saw. (If you're looking for higher-level impressions from the show, be sure to check out our roundups of 5 laptops trends and 5 monitor trends that PC enthusiasts can't ignore.)
CyberPowerPC Kinetic case
CyberPower
We've seen a lot of PC cases in our time, but none that has the wild take on practicality of CyberPowerPC's new Kinetic series. This futuristic case features a set of small, angular panels on its front that automatically open and close to manage airflow. Between the geometric design, sleek white and copper aesthetic, and mechanical wizardry, it looks like something straight out of a sci-fi movie. The company demurely refers to its design as “intelligent airflow,” but it's a hell of a way to stand out among the field of mid-tower ATX cases. —Alaina Yee
Dell XPS 13 Plus
Dell's XPS 13 is arguably one of the most high-profile laptops around thanks to its history of setting the stage for what all laptops strive for. For example: Before the first XPS 13 with InfinityEdge bezels appeared, all laptops rolled with bezels about as ridiculous-looking as bell-bottom corduroy pants.
Dell's vaunted line once again reaches for the brass ring with the new 14-core XPS 13 Plus, which gives off a gorgeous, minimalistic look with its integrated haptic trackpad. The trackpad is still there in the central part of the palm rest, but rather than using a conventional piano-hinge design, it “clicks” with haptic feedback, much the way your phone vibrates to simulate a button press.
Dell also isn't shy about taking the Internet's slings and arrows with the integration of, well, not a touch bar, but a bar you can touch for the function keys striped across the top. Sure, the tech press that previously lauded Apple's Touch Bar as “futuristic” and “smart” are now out in force saying the XPS 13 Plus' capacitive-touch function row is a bad idea—but that's the fickle press for you.
The Dell XPS 13 Plus grabs onto another controversial design decision with both hands by ditching the beloved 3.5mm analog headset jack. Dell said it dumped the jack to save space (also the reason it ditched physical function keys), which lets it stuff way more hardware inside.
Obviously, everyone will wonder if it's worth losing that headset jack and physical F4 key, but we'd guess there are plenty of folks that would call those worthy sacrifices to get a 14-core CPU in a laptop weighing less than three pounds.
What we can tell you is to win big, you have to take risks, and the XPS 13 Plus swings for the fences like Dell did years ago, when it basically forced all other laptop makers to catch up. — Gordon Mah Ung
Alienware 34 QD-OLED
Dell
We've seen a few OLED screens aimed at PC gamers before, but they've generally been repurposed TV panels with a few extra bells and whistles (and of course, that lucrative “gamer” branding). Alienware seems to be the first out of the gate with a screen built from the ground up with PC gaming in mind. This massive monitor is still tiny compared to previous OLED designs, fitting into the popular 34-inch ultrawide category.
But it's not wanting for bells or whistles. In addition to the usual RGB lighting, G-Sync support, and plethora of inputs expected from a premium PC gaming monitor, the Alienware 34 QD-OLED combines quantum dot technology with the perfect blacks and vivid colors of OLED for a brighter overall panel, overcoming some of the technology's inherent weaknesses. Quantum dot-enhanced OLED is a common theme among the new TVs at CES, so it's great to see the technology (finally) going right into a PC monitor. On top of that you get a speedy 175Hz native refresh rate (impressive for this size), a USB hub, and a host of gamer goodies as well.
In short, while there are bigger and more impressive OLED “monitors” on the show floor, this is the one I'm going to be extremely tempted to spend (lots and lots) of money on in 2022. —Michael Crider
12th-gen Intel ‘Alder Lake' laptop CPUs
Intel
We wouldn't be lying if we said CES 2022's bushel of laptop CPUs was an embarrassment of riches, with AMD first announcing Ryzen 6000 mobile CPUs using its impressive Zen 3+ cores, and then Intel firing back with its 12th-gen Alder Lake mobile CPUs. Which is better? Honestly, we don't know yet. Both lines offer pretty spectacular feature sets, such as the new RDNA 2-based GPU cores in AMD's Ryzen 6000 processors, which offer twice the performance of the older chips. Intel's CPUs, meanwhile, bring core counts to unheard-of levels, with three-pound laptops wielding 14 cores.
So you can see our predicament in trying to pick the best laptop CPU announcement at CES—everyone's a winner! “Best of” awards never get shared, however, so that means we have to pick a winner and with phaser to our head, we declare Intel's 12th-gen Alder Lake laptop lineup as the best laptop CPU of the show.
Offering a hybrid architecture with up to six performance cores alongside another eight efficiency cores, a combination expected to deliver more than 40 percent better performance than Intel's already pretty awesome 11th-gen Tiger Lake CPUs, Intel's 12th-gen chips will be very difficult for even AMD's new Ryzen 6000 series to unseat in terms of raw performance. —Gordon Mah Ung
Samsung Freestyle portable projector
Samsung
Projectors may seem out of fashion these days, given the ubiquity of fancy monitors and televisions. But given world circumstances right now, the idea of setting up a big-screen viewing experience almost anywhere hits just right. Whether you want to have outdoor movie nights or make the most of ultra-tight living quarters, the Samsung Freestyle can adapt to the situation. It takes up about as much room as a thermos, can run off a USB PD power bank, and even screws into standard E26 light bulb sockets. No one needs this projector, but I definitely want one. (You can get the full lowdown on the Samsung Freestyle over at TechHive, our sister site.) —Alaina Yee
Lenovo ThinkBook Plus Gen 3
Lenovo
You know what's better than one display? Two displays. The Lenovo ThinkBook Plus Gen 3 certainly raised some eyebrows at CES this year due to this rather unusual feature. That said, it's a cool bit of hardware that makes multitasking a whole lot easier. The secondary display, which lives on the right side of the keyboard, measures 8 inches with a resolution of 800-by-1280. The screen is made of glass and touch-enabled, and it allows you to quickly access apps like Microsoft Outlook or Edge. It's a great productivity machine for business professionals looking to get some serious work done, that's for sure. —Ashley Biancuzzo
AMD Radeon RX 6500 XT
AMD
Over three long years after Nvidia's GeForce RTX 20-series kicked off the real-time ray tracing revolution, the first sub-$200 graphics card capable of handling those beautifully strenuous lighting effects is finally here—and it came from AMD, not Nvidia. The Radeon RX 6500 XT is a humble $199 desktop GPU pitched more as a budget successor to the popular Radeon RX 570 than a true ray tracing powerhouse, but flipping on AMD's new Radeon Super Resolution feature (which will speed up performance in all your games) should help pick up the slack.
The sheer fact that a $199 graphics card exists in the midst of a severe GPU drought is worth cheering, and since AMD outfitted the 6500 XT with just 4GB of RAM, it can't be used to mine Ethereum—the primary cryptocurrency helping to drive up graphics card costs. (Ethereum's mining data set no longer fits in 4GB of memory.) Time will tell if you'll actually be able to get your hands on one of these for around its sticker price, but I'd wager you'll probably have a lot better odds finding a Radeon RX 6500 XT in stores than Nvidia's newly announced $249 RTX 3050, whose juicy 8GB memory buffer puts it squarely in the sights of miners. The Radeon RX 6500 XT could be just what desperate PC gamers need in 2022. Fingers crossed. —Brad Chacos
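Why 4GB rules out Ethereum mining comes down to the Ethash DAG file, which every mining GPU must hold entirely in VRAM and which grows over time. A rough back-of-envelope sketch, ignoring the spec's small prime-number adjustment:

```python
def approx_dag_gib(block_number: int) -> float:
    """Approximate Ethash DAG size in GiB: ~1 GiB at genesis, growing
    ~8 MiB per 30,000-block epoch (small prime adjustment ignored)."""
    epoch = block_number // 30_000
    return (2**30 + epoch * 2**23) / 2**30

# By early 2022 (around block 14 million), the DAG had outgrown 4GB cards.
print(approx_dag_gib(14_000_000) > 4)  # True
```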
Nvidia GeForce RTX 3090 Ti
Nvidia
In a world where gamers are grateful to buy four-year-old used GPUs at their original list price, Nvidia's mic drop reveal of the GeForce RTX 3090 Ti (it's pronounced “tie”) felt like a slap to everyone's face. It's also, frankly, the most gangster move ever, akin to casually drinking from a diamond-encrusted Evian bottle while surrounded by masses of people dying of thirst.
If you were offended by it though, you're missing the whole point of the GeForce RTX 3090 Ti's existence in the first place.
Coming as the replacement for a card so luxurious that most reviewers said to skip it (ourselves included), the GeForce RTX 3090 Ti does two things: It gives those trying to scrimp by on $200 million NBA Super Max contracts something to buy to replace their old GeForce RTX 3090 with, and it flexes muscles in AMD's and—more importantly to Nvidia—Intel's face in a year we're likely to see a GPU cat fight the likes of which we've never seen before. So while the 3090 Ti may not be the best GPU at CES (that honor goes to the Radeon 6500 XT), it's definitely the best BFGPU at CES. —Gordon Mah Ung
L'oreal Colorsonic
L'oreal
Back when CES used to be something we visited in person, I got pitched my fair share of beauty-tech products. But unlike the “magic mirror” products of yore, which showcased cool tech but didn't actually solve a widespread problem, the L'oreal Colorsonic piqued my curiosity in the same way as innovations in PC cases and smart home products—it's tech that aims to eliminate common hassles.
L'oreal says this hair-dyeing wand simplifies the process of at-home coloring by making it far less messy and ensuring even application of dye (the two biggest issues for DIYers). You simply choose one of 40 shades, load in the cartridge from the haircolor kit, and then brush the device through your hair for dye application. We'll see how effective this product will be for dark hair (which typically needs to be bleached first), but color me intrigued. —Alaina Yee
Alienware x14
We never owned Alienware's original netbook-sized gaming laptop, the M14x, and to this day we still pine for “the one that got away.” But maybe—just maybe—we'll have our chance again to own an ultra-portable gaming notebook, as the new Alienware x14 claims the title of “thinnest gaming laptop in the world.”
Yeah, we know, you wrinkle your nose at “thin laptops,” but nothing motivates sales like a thin laptop, while also challenging engineers to make faster laptops with increasingly limited amounts of space.
Alienware's x14 measures a mere 12.66×10.4 inches and is just 0.57 inches (14.5mm) thick. How thin is that? Most gaming laptops chest-pound over being 19.5mm, so the x14 is roughly 25 percent thinner. In fact, the Alienware x14 out-thins Apple's new MacBook Pro 14 by a millimeter.
Despite its slim stature, the x14 packs in plenty of hardware: up to a 14-core 12th-gen Core i7 cooled with Alienware's Element 31 thermal interface, a GeForce RTX 3060 6GB GPU, a 144Hz FHD panel, and G-Sync plus Advanced Optimus.
While other thin gaming laptops have tended to tone down the gamer aesthetic to go “legit,” the x14 embraces its heritage. Sure, to get there you have to accept soldered-down LPDDR5/5200 RAM, but that also helps with battery life on a laptop you likely won't mind actually carrying in your really thin bag. —Gordon Mah Ung
AMD Ryzen 7 5800X3D
AMD
Like laptop CPUs, the desktop CPU news at CES was beefy AF, with Intel filling out its entire desktop lineup of 12th-gen CPUs, from $42 budget chips to the non-K CPUs most people buy. Intel closed off with news and a demo of a special-edition “KS” chip that can run all of its performance cores at 5GHz while gaming. AMD, likewise, talked up its next-gen Ryzen 7000 processors due in the second half of the year—which can also run games with all cores at 5GHz. The CPU that might have the most impact for gamers, however, was AMD's Ryzen 7 5800X3D, which stacks 64MB of high-performance L3 cache on top of an 8-core Ryzen 7 5800 die (hence the “3D” in the name).
That cache, AMD says, is just what games love, and will boost the Ryzen 7 5800X3D to gaming performance at and above that of Intel's top-end Core i9-12900K chip. It's so fast that AMD laid claim to it being the “fastest gaming CPU.” Sure, Ryzen 7000 is sexy and Alder Lake's full roster is exciting, but the value of a “fastest gaming CPU” sticker on the box is going to make most consumers gooey inside and easily qualifies as the best desktop CPU announcement at CES. —Gordon Mah Ung
BMW iX Flow
BMW's iX Flow is a concept, but one that should be familiar to PC users. What would happen if you took an ordinary BMW SUV and replaced the “paint” with a bunch of E-Ink panels? The result is the BMW iX Flow, which can change colors on the fly. Since the panels are E-Ink, you can only switch between black, white, and shades of gray, or various patterns combining them, but it's simply way cool. —Mark Hachman
Dell UltraSharp 32 4K
This seems so obvious I'm surprised it hasn't been done before: The Dell UltraSharp 32 4K is a desktop monitor that doubles as a Thunderbolt I/O dock. Maybe it just strikes me as particularly genius now that I'm working from home more often than not and finding myself regularly moving between my personal desktop and my work laptop, while having just one large monitor and one desk between them.
With the UltraSharp 32 4K, I could connect both machines to my 3840×2160 IPS display, as well as my mouse and keyboard, and use the KVM switching feature to seamlessly move between the two PCs as needed, using all the same peripherals, without adding any extra clutter—in the form of a standalone dock or switcher—to my already-cluttered desk. Shoot, you could possibly even daisy-chain a second monitor. At the same time, the UltraSharp's dock would keep my laptop charged without the need for its own power cable. How great is that? Ports include DisplayPort 1.4, USB-A ports, 10Gbps USB-C connections, an RJ45 port for Ethernet, and audio. It even has a built-in webcam. —Katherine Stevenson
Acer Predator Triton 500 SE
Acer
Acer's Predator Triton 500 SE strikes a balance between work and play, something that will hopefully become a trend in the future. Why have a gaming laptop and a productivity laptop when one notebook can do both? Sure, there's a powerful 12th-gen Core CPU and an Nvidia RTX 3080 Ti GPU inside, but the real story is the workmanlike exterior and lack (yes, lack) of RGB bling. Long battery life, a high-res display, and gaming chops are a combination we like to see. —Mark Hachman
TP-Link AXE11000 Tri-Band Archer AXE200 Omni
TP-Link
Routers packing Wi-Fi 6E—the cutting-edge networking standard worth investing in—showed up in force at CES 2022 after sticking mostly to rare, high-end models in 2021. We saw Wi-Fi 6E routers rolled out by Netgear, TP-Link, Asus ROG, and even Comcast at the show, every one featuring blazing-fast speeds and killer next-gen features. My favorite though? The TP-Link AXE11000 Tri-Band Archer AXE200 Omni, and for a very, very dumb reason. It features mechanically self-adjusting arms that have a practical purpose—maximizing throughput from this ultra-fast router—but just as importantly, look absolutely badass.
As Wes Davis put it in his coverage, the whirling robo-arm setup “delights me—as someone who grew up with an '80s family-friendly sci fi-fueled vision of the future wherein all houses are full of whirring gizmos with superfluously-rotating, undulating, oscillating whatsits and doodads—on a purely superficial level.” And as a simple man who likes simple pleasures, all I can say is amen, brother. —Brad Chacos
LG C2 42-inch OLED TV
LG
Prior to LG's announcement this week, using an OLED television as a monitor meant going big—displays started at 48 inches and only went up from there. Having one sitting just a couple of feet from your face was borderline unwieldy. The upcoming 42-inch version of the LG C2 doesn't sound all that much smaller, but the six-inch difference should make desk usage feel far more comfortable. And at that close distance, you'll be able to properly enjoy OLED's deep black levels in all of your favorite dark, moody games and movies. The only bummer about LG's announcement is that the 42-inch and 48-inch version of the C2 lack the brighter EVO panels of their larger siblings. (For more info on the LG C2 lineup and other OLED TV announcements from CES, head over to our sister site TechHive.) —Alaina Yee
Lenovo Legion 5i Pro
Lenovo
With its glacial white chassis and barely-there bezels, the Lenovo Legion 5i Pro is sure to turn heads. The minimalistic, sophisticated aesthetic is a most welcome departure from the traditional gamer look. But it's more than just a pretty face. The 2560-by-1600 resolution IPS display features a lightning-fast adaptive refresh rate of 240Hz. The sky-high refresh rate is a major selling point, as it means smoother gameplay. The laptop is also driven by a 12th-gen Intel Core processor and RTX graphics. If it's power you're looking for, the Legion 5i Pro should have everything you need. —Ashley Biancuzzo
Editor's note: Originally published on January 7, updated on January 11 to add the Dell XPS 13 Plus and Alienware x14.
Enterprise Security at CES 2022 Marked by IoT, Biometrics, and PC Chips
CES 2022: Wireless power for all
We don’t need no stinkin’ wall power as CES shows off the power and promise of usable long-range wireless charging
Surprise! Nvidia’s 12GB GeForce RTX 3080 packs more memory and a bigger GPU
Nvidia only just revealed the GeForce RTX 3050 and RTX 3090 Ti during its CES 2022 keynote last week, but that wasn't all the company had up its sleeve. On Tuesday morning, Nvidia quietly and surprisingly launched a new 12GB version of GeForce RTX 3080 that fixes our biggest (albeit minor) gripe with an otherwise stellar graphics card.
The 10GB of VRAM on the original RTX 3080 felt somewhat lacking right out of the gate, as games like Doom already chew up immense amounts of memory with every graphics knob cranked at 4K resolution. It probably wasn't something you'd have needed to worry about in most games anytime soon, but the new version's 12GB of GDDR6X alleviates any potential concerns. Nvidia also expanded the memory bus in this model from 320-bit to 384-bit wide, which boosts overall memory bandwidth to 912GB/s, up from 760GB/s in the original.
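Those bandwidth figures fall straight out of the bus width: peak memory bandwidth is simply the bus width in bytes multiplied by the memory's per-pin data rate, which is 19Gbps for the GDDR6X both cards use (you can confirm that rate from Nvidia's own numbers: 760GB/s divided by 40 bytes). A quick back-of-the-envelope sketch (the helper function is ours, not Nvidia's):

```python
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

# Original RTX 3080: 320-bit bus of 19Gbps GDDR6X
print(memory_bandwidth_gbps(320, 19))  # 760.0 GB/s
# New 12GB RTX 3080: wider 384-bit bus, same 19Gbps GDDR6X
print(memory_bandwidth_gbps(384, 19))  # 912.0 GB/s
```

In other words, the entire 20 percent bandwidth bump comes from the wider bus, not faster memory chips.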
Nvidia also tweaked the GPU configuration of the new 12GB RTX 3080, bumping the CUDA graphics core count from 8704 in the original to 8960 now. That, combined with the upgraded memory, should make the 12GB RTX 3080 just a wee bit faster than the original, though we would've preferred that Nvidia gave this card a slightly different name (like RTX 3080 Super) given the small-but-key GPU tweaks.
Don't bother trying to hunt down a Founders Edition version of this graphics card. Nvidia told PCWorld to contact board vendors (like EVGA, Asus, MSI, et cetera) for more product information—including pricing. Yes, Nvidia didn't even slap an official MSRP on this bad boy. Gulp. The original 10GB version ostensibly retailed for $700 but is currently impossible to find in traditional stores, going for $1,500 to $2,000-plus on eBay depending on the model. Thanks to the ongoing GPU shortage, don't expect this upgrade to come cheap.
Still, more GPUs are better than fewer. You might not have noticed the 12GB RTX 3080's launch if you weren't paying close attention—Nvidia didn't release any announcements or blog posts, instead quietly updating the RTX 3080 family's spec page and slipping mention of the GPU into its new driver release notes—but it's here. We'll try to get our hands on one for testing.
NFTY, Inc., at CES 2022
NFTY, Inc. Unveils NFT Validation Protocol at CES
Undeniable Need for NFT Reputation Validation Drives Strong Interest
Las Vegas, Nevada, January 11, 2022. The Consumer Electronics Show ("CES") welcomed well over 40,000 attendees in person, with 30% of attendees traveling from outside the US, representing 119 countries. Topics discussed included auto technologies, TVs and gaming, health care technologies on smartphones and smartwatches, 5G, and more.
NFTY, Inc. ("NFTY") was honored to participate, leading the conversations around blockchain, cryptocurrency, and NFTs. As one of the few NFT companies highlighted at CES, NFTY unveiled a first look at its DeFi protocol's new NFT advocacy functionality during CES Unveiled, a media-only event held from 5 p.m. to 8:30 p.m. Pacific Time on January 3, 2022.
The event was a huge success. Many attendees visited the NFTY booth to learn about NFT technologies and trends, and many saw opportunities in NFTY to shape and create opportunities for NFT reputation and discoverability. “Our team was thrilled to announce this game-changer for the NFT space at CES,” CEO Chris Mills commented. “What we are rolling out is the core of the NFTY project. Incentivized, crowdsourced reputation validation will completely change how NFT auctions are run and start rewarding excellence over hype.”
Join NFTY to learn how it is empowering a more artist-centered future that's connected and rewarding.
3-minute video explaining how NFTY advocacy works: https://youtu.be/hBxOaSu-9ck
HP’s new flock of Dragonfly laptops offer svelte power in Chrome, Windows flavors
In 2020, PCWorld gave the original HP Elite Dragonfly our Editor's Choice award. Executive Editor Gordon Ung said the svelte laptop is “thin, light, and beautiful, with a battery that won't quit.” At CES 2022, HP is poised to make the Dragonfly label a new house brand, a la Dell's XPS or Lenovo's X1. The company is doing that with a revamped Windows laptop and a brand new convertible Chromebook design.
First up is the new Windows machine, rechristened the Elite Dragonfly G3. The biggest obvious change is a swap from a 16:9 screen to a productive 3:2 aspect ratio at 13.5 inches, starting at 1920×1280 resolution and 400 nits. Users can upgrade that panel to a 1000-nit version, or go all-out with a “3K2K” OLED screen. The Dragonfly's other signature feature, its weight (or lack thereof), is still impressive. Despite upgrades to the latest 12th-gen Intel Core processors, the laptop still starts at just 2.2 pounds. (The optional 6-cell battery may bump that up a bit.) Like a lot of premium Windows laptops, it's packing a haptic trackpad.
HP
The Dragonfly G3 comes with two USB-C/Thunderbolt 4 ports, a USB-A port for older hardware, a full-sized HDMI port (harder to find at this size), and an optional nano-SIM card slot if you upgrade to 5G mobile service. Like the original model, it works with the Tile Bluetooth tracking system for finding your gadgets, and it also includes NFC, a 5MP camera (plus infrared for Windows Hello), and a fingerprint reader. It's still built to the MIL-STD 810 standard, so it can take a punch or two.
HP
When the Elite Dragonfly G3 goes on sale it'll be available with up to 2TB of SSD storage and an impressive 32GB of DDR5 RAM. It'll come in silver or blue color options when it lands in March.
HP Elite Dragonfly Chromebook
For something figuratively (but not literally) lighter, check out the Elite Dragonfly Chromebook. This machine swaps up the form factor with a convertible fold-back hinge but offers a similar 13.5-inch 3:2 screen, though sadly there's no OLED option. “Next Gen” Intel processors and DDR4 RAM suggest performance will be a bit of a step down, as does storage that maxes out at 512GB, but all of that is still kind of overkill for Chrome.
Despite the more modest hardware, the Dragonfly Chromebook gets the MIL-STD body treatment, fingerprint sensor, and optional 5G connection, along with all the same ports. It's also boasting Google's proprietary Titan H1 chip for extra security, along with a haptic trackpad, a first for a Chrome device. 51 watt-hours of battery should make it a real road warrior running Chrome. Why HP chose the 360-degree hinge for the Chromebook and not the Windows laptop, we couldn't guess, but it bumps the weight up to 2.83 pounds.
HP
The Elite Dragonfly Chromebook will land in April.