Samsung’s updated DeX desktop experience uses the S9 as a trackpad

When it launched last year, Samsung’s first DeX dock was the latest in a long line of attempts to blur the line between phones and traditional PCs. And you know what? It worked surprisingly well, even though the value for most people wasn’t particularly clear. That didn’t stop Samsung from trying again, though: the company unveiled a new version of its DeX hardware alongside the Galaxy S9 and S9 Plus, and it’s much different from the dock we got last year.

In its closed form, the original DeX dock sort of resembled an extra-chunky hockey puck, and the sliding mechanism that allowed a phone to fit was quite clever. That’s gone. Instead, this year’s DeX Pad looks like a shrunken Sega Genesis that the phone is meant to rest on top of. It seems like a step backward in design until you realize that Samsung wanted the phone’s touchscreen as close as possible to the surface it’s resting on so you can use it as a trackpad.

This isn’t the first time we’ve seen an approach like this — Razer did the same thing with its lapdock concept at CES — but it works quite well for navigating between running apps on a bigger display. You’ll eventually be able to use that connected phone as a keyboard too, if you’ve got a masochistic streak. I was hoping to take this for a spin just to see if the road warriors and traveling salespeople Samsung is targeting with the DeX Pad could squeeze some use out of a tiny touch keyboard, but it wasn’t ready for me to play with just yet.

More importantly, you’ll be able to drive higher-resolution displays than you could before — in the old days, the maximum resolution for a connected screen topped out at 1080p, which is less than ideal for anyone who has invested in high-end hardware. Things are a little better now, though, since you can crank up the resolution to 2,560 x 1,440 on that big display (as long as you’re using an S9, at least). When you take the boost in screen resolution along with more accessible touch controls and the Galaxy S9’s power, some novel experiences begin to emerge. I definitely didn’t think I’d spend a chunk of my day playing Final Fantasy XV: Pocket Edition on a desktop monitor while fiddling with a phone-touchpad combo, but the whole thing felt more elegant than expected.

Some changes, however, might rub some people the wrong way. While the DeX Pad still packs two full-sized USB ports, a USB-C port and an HDMI-out, the Ethernet jack present in the original is nowhere to be found. Whether anyone will actually miss that thing is another question, but we shouldn’t have to wait long to find out.

Intel’s PC concept ‘hides’ a 5G antenna in a plump kickstand

MWC 2018 has been gripped with 5G fever, even more intensely than in years past. Intel has gone all out for the show this year, with display upon display of 5G demos dominating its booth. One of the new proofs of concept from the chip maker is a detachable PC streaming a 4K video over a test 5G connection using an early modem. Intel also announced a partnership with Dell, Lenovo, HP and Microsoft to bring 5G-connected PCs with XMM 8000 series modems to market in the second half of 2019.

The unnamed concept 2-in-1 is a preview of what those PCs might be capable of in terms of connection speeds. Intel is not going to sell hardware like this directly to consumers. At the booth, the convertible streamed Wonder Woman on loop, and company reps told Engadget that the demo was hitting 4 to 5 Gbps rates.

That’s impressive if true (we had no real way of measuring), but remember that the conditions were ideal. The 28GHz spectrum that the PC was tapping for the demo is dedicated to the stream, and thus not congested. Intel also placed the base station for the network about four or five feet directly above the device, leaving little room for interference to affect the stream.

One of the biggest challenges in implementing the millimeter-wave communication that’s crucial to 5G is signal interference: a clear line of sight between the transmitter and receiver is necessary to transport data. One of the ways Intel is addressing this is by embedding its early 5G modem in the concept PC’s kickstand, which flaps out of the device and, the company believes, should minimize obstruction of incoming signals. To be clear, the kickstand on this concept device is much larger than any we’ve seen on commercially available convertibles, and it might shrink in eventual products.

Since this isn’t a device that you’re ever going to be able to (or want to) buy, there’s no real point in telling you how the display looked or the keyboard felt. What is interesting: Intel’s rep told me that the device can last between three and four hours while streaming that 4K video over 5G. We don’t have any other benchmark to compare that with just yet, and actual PCs may ship with differently sized batteries, but that’s long enough to watch an entire blockbuster, and even a couple of post-credits scenes.

We’re just a year and a half away from these devices launching, and hopefully by then, 5G networks will have become a widespread reality. Until then, we can only wait for a world without buffer delays.

You can finally stream Xbox One games to your Mac

Before now, if you fancied playing console games on your Mac, you’d need to use PlayStation 4’s Remote Play to do so. Windows 10 users have been able to stream Xbox One titles to their PCs since 2015, but macOS users have been out of luck. A new $10 app called OneCast, however, has apparently figured out how to get your Xbox One games streaming to your Mac.

Unlike Sony’s solution, however, the app isn’t an official Xbox release. We’ve reached out to Microsoft for comment and will update this post when we hear back. OneCast promises support for 1080p video, low lag, easy setup and the ability to stream Xbox One titles over the internet (with some manual configuration). OneCast says that you can use an Xbox One wireless controller connected via USB or Bluetooth, use any number of different consoles or gamertags and run in either full-screen or windowed mode. There’s a free 14-day trial, and the $10 price is an introductory discount on the app’s regular $20 price.

Lenovo’s Yoga 730 is a cheaper 2-in-1 with Alexa support

Mobile World Congress, for obvious reasons, isn’t really known for its laptop announcements. Lenovo is bucking that trend today, however, with three mid-range convertibles sporting the Yoga name. The most exciting is the Yoga 730, a sleek 2-in-1 that sits below the company’s flagship ThinkPad X1 line and the popular Yoga 920. It will be available in 13- and 15-inch variations, sporting similar designs but slightly different upgrade options. They’ll be joined by the Flex 14 (which will be marketed as the Yoga 530 outside of North America), a cheaper alternative with a near-identical form factor.

First, the Yoga 730. We’ve yet to be told the base specs, but we know they’ll go up to an eighth-generation Intel Core i7 processor and 16GB of RAM. The smaller version will offer up to 512GB of SSD storage, while its larger sibling ramps up to a beefy 1TB. On the 13-inch laptop you’ll be stuck with integrated graphics (boo), but the 15-inch model can be specced up to a gaming-friendly Nvidia GTX 1050. Both come with built-in JBL speakers and the option of a 1080p or 4K touchscreen — perfect for mindless finger doodles and artsy brushstrokes with Lenovo’s optional Active Pen 2 stylus.

It’s a Yoga, so of course you can flip the screen over and use it like a tablet. I still think it’s weird to hold a laptop with the keyboard pressed into your palm, but hey, the option is there if you really dig the ‘clipboard’ form factor. Both are relatively light — the 13-inch model weighs about two and a half pounds, while the 15-inch version comes in at 1.89kg (roughly 4.2 pounds). The display bezels are also fairly slim, and I like that the webcam is placed up top, rather than in the lower left-hand corner (where you’re stuck with an up-the-nose shot).

Like the ThinkPad X1 range, the Yoga 730 will support both Cortana and Amazon’s Alexa assistant. AI-hopping isn’t ideal, but until the smart speaker wars shake out, it makes sense for Lenovo to offer multiple options (the company is also backing the Google Assistant with its Smart Display). Both the 13-inch and 15-inch models will come with far-field microphones that can pick up your voice from across the room. So in theory, you could ask Cortana for your schedule and then book a time-appropriate Uber through Alexa. Or search for an article and order a pizza while you read.

The 13-inch and 15-inch Yoga 730 will start at $880 and $900 respectively. They cost considerably more than the $600 Flex 14, which will ship with an Intel Core i3 processor and a 1080p touchscreen. You can upgrade to an eighth-generation Intel Core i7 processor, a 512GB SSD, 16GB of RAM and an Nvidia GeForce MX130 graphics card, but these will cost you extra. At first blush, Lenovo’s new laptops seem like decent if predictable performers for people with a sub-$1,000 budget. They all look the part, and the Alexa support is nice for people who own a bunch of Echo speakers and compatible smart home appliances. Part of me misses the watchband-inspired hinge found on the Yoga 920, but hey — you can’t have it all.

Huawei’s MateBook X Pro crams a pop-up webcam into the keyboard

Huawei moved the webcam from above the screen and hid it in the keyboard, primarily to tackle privacy concerns. Instead of integrating shutters like Lenovo and HP have done, Huawei’s solution to the sticky-note-over-the-camera situation is to embed it inside a key.

Push down the spring-loaded middle button of the top function row, and up pops a 1-megapixel webcam with a light next to it indicating it’s on. Press it down when you’re done and the camera not only deactivates and gets out of the way; even if it were somehow hacked to spy on you, it would see only the darkness inside your laptop.

Kudos for creativity, Huawei — this is unique. But as we’ve already seen with Dell’s XPS line of laptops, webcams placed below the display make for seriously unflattering, unclear angles in photos and during conference calls. Unlike the XPS laptops, you can’t move the screen to try and find a less-awkward angle, either. When I tried out the MateBook X Pro’s camera, I was annoyed by how difficult it was to avoid having an on-screen double-chin — I wouldn’t be happy using this device to make any video calls.

The one other benefit of moving the webcam down to the keyboard is the MateBook X Pro’s noticeably skinnier bezels compared with the original. It doesn’t seem like a good enough tradeoff, though.

Still, the Pro is a gorgeous piece of hardware. Sure, it looks very much like a MacBook, but I’m not complaining — nice is nice. The new MateBook feels just as sturdy and premium as its predecessor, once again combining a lightweight build with a sleek profile.

If you can look past the awkward camera placement, or if you don’t intend to use your laptop’s webcam much, there are a few more things to like about the MateBook X Pro. In addition to getting bumped up to the latest eighth-generation Intel Core i5 (or i7) CPUs, the notebook can also be configured with a discrete NVIDIA MX150 graphics chip. This makes it the thinnest laptop of its size to sport discrete graphics, according to Huawei.

The MateBook X Pro has a larger screen than the original, at 14 inches to its predecessor’s 13. It also goes up to 450 nits of brightness, versus the smaller laptop’s 350 nits. Since I only saw the Pro in a meeting room, I couldn’t tell whether the increased luminance would make it easier to see in direct sunlight, but it was certainly clear during our demo. The new model’s screen is now touch-friendly and gets a slight resolution bump to 3,000 x 2,000, maintaining a similar pixel density to the smaller version.

To complement the display during your movies or games, the MateBook X Pro comes with Dolby Atmos 2.0 for crisper and more immersive surround sound. During our demo, Dolby’s sample of nature sounds rang out loud and clear from the laptop’s quad speakers, even when I was facing the back of the device. Huawei also moved the speaker grilles from above the keyboard to either side of the keys. The Pro also packs a quad-mic array, two more microphones than before, which should allow for clearer voice quality. A much larger touchpad now sits below the keyboard, making mouse navigation more convenient, too.

The company has apparently managed to squeeze more battery life out of the Pro: it can clock about 12 hours on a charge while looping 1080p video, two hours longer than the regular model. While the smaller MateBook X had two USB-C ports (one more than the MacBook), the new laptop adds a USB Type-A socket to the mix, at USB 3.0 speeds. The two USB-C slots not only support power and data transfer; one of them is also compatible with Thunderbolt 3.

Questionable webcam placement aside, the MateBook X Pro appears to be an attractive, powerful 14-inch laptop. If the camera is not a dealbreaker, you might want to consider this PC as your next workhorse. Unfortunately, we still don’t know how much it will cost and when it will be available, but we do know that it will come in two colors (silver and dark gray). Because the original MateBook X started at $1,099, you can expect the Pro model to cost slightly more than that at launch. That’s not a small investment, so you might want to wait until we get a review unit in for more testing before splurging on it.

Update: At its press conference in Barcelona, Huawei shared some European pricing information, which should be a good reference for when the device arrives stateside. The base configuration, which packs the MX150 discrete graphics, a Core i5 CPU and 256GB of storage, will cost €1,499.

The GIGABYTE Aorus AX370-Gaming 5 Review: Dual Audio Codecs

We are having a look at an LED-laden, gaming-focused ATX motherboard from GIGABYTE: the Aorus AX370-Gaming 5. If a user wants LEDs for Ryzen at under $200, here is one of the primary contenders. Being part of GIGABYTE’s gaming product line means we get SLI support, and GIGABYTE is using a gaming-focused network controller (one of two) and some overclocking options for the processor. The interesting part of this board, however, is the use of dual audio codecs: one for the rear panel and one for the front panel. Physically doing this requires a couple of compromises, so we have put the board through its paces to see if it is worth buying.

The GIGABYTE Aorus AX370-Gaming 5 Overview

The GIGABYTE AX370-Gaming 5 shows that not every motherboard has to conform to the regular gaming-themed combination of a black PCB with red or silver aluminium heatsinks: the Gaming 5 features a wave of contrasting black and white heatsinks across the board. GIGABYTE has opted for a fairly standard X370 PCIe layout: two full-length PCIe 3.0 slots powered by the CPU, which support dual graphics card configurations in either SLI or CrossFire and have additional rigidity support. These are joined by a single full-length slot at the bottom operating at PCIe 2.0 x4, also with rigidity support, and three PCIe 2.0 x1 slots.

It gets a little interesting when we start discussing the controllers. Powering the onboard audio is a pair of Realtek ALC1220 codecs, one dedicated to the rear panel and one specifically for the front. Very few boards (if any) have had this arrangement before, making the Gaming 5 special in that regard. The audio comes bundled with Creative’s Sound Blaster X-Fi MB5 software utility. For networking, the primary port is provided by the gaming-focused Killer E2500 gigabit controller, and a second by an Intel I211-AT controller.

Featured is a 10-phase power design, which GIGABYTE says delivers solid and consistent power, and which it claims is useful for overclocking. It is worth noting that the VRM is actually a 4+2 design, with doublers on the CPU phases giving an 8+2 phase arrangement overall.

Storage-wise, the Gaming 5 has eight SATA 6 Gb/s ports, accompanied by two SATA Express ports. PCIe storage comes via a single U.2 port, which shares bandwidth with a single M.2-2280 slot found between the first two full-length PCIe slots.

Performance on the Gaming 5 essentially matches what we see on other X370 boards. With the dual audio codecs, each codec only gets half of the PCB space a single solution would, and our audio results show that this is one of the weaker ALC1220 implementations we have tested (although still better than the ALC892 units). Power consumption at idle was within a couple of watts of our other boards despite the LEDs, and at load the system actually drew 15W less than in our other tests, for which we are still looking into an explanation. Overclocking, as explained below, was relatively easy.

The Gaming 5 sits near the top of the pile of GIGABYTE’s current X370 offerings, with the only model above it being the AX370-Gaming K7. It is also worth noting that GIGABYTE’s X370 range stops just short of $200 even at the top with the AX370-Gaming K7, while the very similar AX370-Gaming 5 reviewed here comes in at $184 (at the time of review).

Overclocking

Most, if not all, mid-range motherboards are very capable of overclocking the processor, and most include a one-click OC option (either a physical button or a BIOS setting) which hands the task to the motherboard, pushing as far as the firmware believes it can safely go. The only caveat is that virtually every motherboard I have used this with is very cautious about supplying too little voltage, and so over-volts the CPU. This gives the overclock a better chance of remaining stable, but it plays havoc with efficiency: little extra performance at the price of extra energy lost as heat, and thermal sensors can start to kick in even if the auto option is otherwise safe. With the Gaming 5, both automatic and manual overclocking are available.

Methodology

Our standard overclocking methodology is as follows. We select the automatic overclock options and test for stability with POV-Ray and OCCT to simulate high-end workloads. These stability tests aim to catch any immediate causes for memory or CPU errors.

Manual overclocks, informed by data gathered from previous testing, start off at a nominal voltage and CPU multiplier, and the multiplier is increased until the stability tests fail. The CPU voltage is then increased gradually until the stability tests pass, and the process is repeated until the motherboard reduces the multiplier automatically (due to safety protocols) or the CPU temperature reaches a stupidly high level (100ºC+). Our test bed is not in a case, which should push overclocks higher with fresher (cooler) air.
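
For clarity, here is that search loop as a minimal Python sketch. It is purely illustrative: is_stable() stands in for a full POV-Ray plus OCCT stability pass, read_temp() and set_clocks() represent monitoring and BIOS actions, and the starting values, step sizes and limits are assumptions rather than recommendations.

    # Sketch of the manual overclocking procedure described above.
    # All numeric values are illustrative assumptions.
    TEMP_LIMIT_C = 100       # the "stupidly high" cutoff
    VCORE_MAX = 1.45         # assumed voltage safety ceiling

    def find_max_overclock(is_stable, read_temp, set_clocks,
                           multiplier=36, vcore=1.20):
        best = None
        while True:
            set_clocks(multiplier, vcore)
            # Nudge the voltage up until this multiplier passes the tests
            while not is_stable():
                vcore += 0.025
                if vcore > VCORE_MAX or read_temp() >= TEMP_LIMIT_C:
                    return best      # fall back to the last stable point
                set_clocks(multiplier, vcore)
            best = (multiplier, vcore)
            multiplier += 1          # stable, so try the next step up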

Overclocking Results

Referencing back to the Biostar X370GTN review, our Ryzen 7 1700 CPU has a limit between 3.9GHz and 4.0GHz, at least on the boards we have tested thus far. This is down to the silicon lottery, combined with a sharp ramp of voltage (and therefore temperature) at each step up the frequency ladder, causing the chip to cut out or throttle on thermal limits when pushed too far on ambient cooling.

[Graph: POV-Ray @ 3.9GHz]

[Graph: Power OCCT (w/ GTX 980) - Overclocking]

The Ryzen 7 1700 processor we are using has a 3.0 GHz base core clock speed and a 3.7 GHz turbo, and is rated at 65W. When overclocked to 3.9 GHz with 1.375v, the overall power consumption taken at the wall was pushing just under 187W at peak.

AMD Announces Wider EPYC Availability and ROCm 1.7 with TensorFlow Support

Earlier this year AMD announced its return to the high-end server market with a series of new EPYC processors. Inside is AMD’s new Zen core, up to 32 of them, with a focus on the major cloud providers. We were the first media outlet to publish a review of EPYC, which showed AMD to be highly competitive in an Intel-dominated x86 landscape. One of the concerns over the launch period was the wider availability of EPYC: it was clear that AMD was announcing the product very early in its distribution cycle.

At Supercomputing 17 (SC17) this week, the enterprise computing conference, AMD is announcing that it has ramped production of the processors, with several OEMs and distributors ready and system integrators expanding their portfolios.

OEMs with EPYC enabled systems at Supercomputing this week include ASUS, BOXX, GIGABYTE, HPE (Hewlett Packard Enterprise), Penguin Computing, Supermicro and Tyan. Each company is targeting certain niches: ASUS for HPC and Virtualization with its RS720A-E9 and RS700A-E9 1U/2U servers, BOXX combining EPYC with Radeon Instinct for multi-GPU compute solutions and deep learning, GIGABYTE with rackmount servers, HPE for complex workloads and Supermicro moving from tower form factors to 1U, 2U and 4U for HPC and storage.

“The industry’s leading system providers are here at SC17 with a full breadth of AMD-based solutions that deliver outstanding compute capability across HPC workloads,” said Forrest Norrod, SVP and GM of Enterprise, Embedded and Semi-Custom, AMD.

We had a meeting with AMD for this launch. Normally OEM systems coming to market might not set the news on fire, but there was an interesting line worth mentioning. AMD stated that there has been a steep production ramp for EPYC processors after the initial phase of ensuring that the cloud providers’ systems are ready to go, and it is now prepared to meet OEM requirements. We were told that all the SKUs announced at launch are in production as well, all the way down to the 8-core and 1P parts, so OEMs interested in the full stack can now flex their product-range muscles.

AMD also wheeled out the Inventec P47 system that it announced at launch, with a single EPYC processor and four Radeon Instinct MI25 GPUs. In partnership with AMAX, 47 of these systems were put together into a single rack, capable of one petaFLOP of single-precision compute in a turnkey solution. From today, AMAX is taking pre-orders for this rack, for delivery in Q1.

ROCm 1.7 Gets Multi-GPU Support, Support for TensorFlow and Caffe

AMD also announced that its open-compute platform for graphics, ROCm, is being updated to 1.7. With this revision, ROCm adds support for multiple GPUs for the latest hardware, as well as support for TensorFlow and Caffe machine learning frameworks in the MIOpen libraries. This should be bigger news, given the prevalence of TensorFlow when it comes to machine learning. AMD also stated that ROCm 1.7 delivers additional math libraries and software development support, to provide a ‘foundation for a heterogeneous computing strategy’.

Google Assistant will get support for Routines ‘in the coming weeks’

Today’s Google Assistant is much, much more capable than the version that first debuted on the original Pixel and Pixel XL. Don’t expect that progress to slow anytime soon, either: Google laid out some new plans to improve the Assistant just in time for Mobile World Congress, and they extend far beyond just teaching it more languages.

Most importantly, Google confirmed it has been working with smartphone makers on ways to weave Assistant more elegantly into our smartphones. That work is being formalized in the new Assistant Mobile OEM program, and Google’s list of accomplishments with its partners is nothing to sneeze at: it helped make Assistant compatible with certain kinds of mobile AI coprocessors and worked to make sure devices can listen for the right wake words even when their screens are off. It won’t be long before you start to see device-specific Google Assistant commands, either — LG touted a list of 23 new commands for its updated V30, and Google also cited close working relationships with companies like Sony and Xiaomi.

Google Assistant is also finally getting support for Routines, a feature first announced last year. Long story short, you’ll be able to string together multiple actions with a single command; saying “OK Google, goodnight,” for instance, could dim your Philips lights, dial down the temperature on your Nest thermostat and lower the volume on your Google Home Max. Routine support is expected to go live within the next few weeks, as will location-based reminders through Assistant-powered speakers. (Yes, you could do this through a phone already, but parity between different flavors of Google Assistant is always a good thing.)
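
Conceptually, a Routine is just a trigger phrase mapped to an ordered list of device actions. The toy Python sketch below is our own illustration of that idea: Google configures Routines through the Assistant settings UI rather than any public API, and the device names and commands here are entirely hypothetical.

    # Hypothetical model of a Routine: one phrase fans out to many actions.
    routines = {
        "goodnight": [
            ("philips_hue", "dim", 10),            # dim the lights to 10%
            ("nest_thermostat", "set_temp_c", 18), # lower the temperature
            ("google_home_max", "set_volume", 20), # quiet the speaker
        ],
    }

    def run_routine(phrase, send_command):
        for device, command, value in routines.get(phrase, []):
            send_command(device, command, value)

    # Example: run_routine("goodnight", lambda d, c, v: print(d, c, v))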

Qualcomm Launches 48-core Centriq for $1995: Arm Servers for Cloud Native Applications

Following on from the SoC disclosure at Hot Chips, Qualcomm has this week announced the formal launch of its new Centriq 2400 family of Arm-based SoCs for cloud applications. The top processor is a 48-core, Arm v8-compliant design made using Samsung’s 10LPE FinFET process, with 18 billion transistors in a 398mm2 design. The cores are 64-bit only, and are grouped into duplexes – pairs of cores with a shared 512KB of L2 cache – and the top-end design also has 60 MB of L3 cache. The full design has six channels of DDR4 (supporting up to 768 GB), 32 PCIe Gen 3.0 lanes and support for Arm TrustZone, all within a TDP of 120W and for $1995.

We covered the design of Centriq extensively in our Hot Chips overview, including the microarchitecture, security and new power features. What we didn’t know were the exact configurations, L3 cache sizes, and a few other minor details. One key metric that semiconductor professionals are interested in is the confirmation of using Samsung’s 10LPE process, which Qualcomm states gave them 18 billion transistors in a 398mm2 die (45.2MTr/mm2). This was compared to Intel’s Skylake XCC chip on 14nm (37.5MTr/mm2, from an Intel talk), but we should also add in Huawei’s Kirin 970 on TSMC 10nm (55MTr/mm2). Today Qualcomm is releasing all this information, along with a more detailed block diagram of the chip.
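
That density figure is easy to sanity-check from the numbers quoted above:

    # Transistor density from the figures Qualcomm quoted above.
    transistors = 18e9       # 18 billion transistors
    die_area_mm2 = 398       # die size in mm^2
    print(transistors / die_area_mm2 / 1e6)   # ~45.2 MTr/mm^2
    # For comparison (quoted above): Skylake XCC ~37.5, Kirin 970 ~55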

The chip has 24 duplexes, essentially grouped into sets of four. Connecting them all is a bi-directional segmented ring bus, with a mid-silicon bypass to speed up cross-core transfers; the ring bus offers 250 GB/s of aggregate bandwidth. Shown in the diagram are 12 segments of L3 cache, which means these ship with 5 MB each (although there may be more than 5 MB in a block for yield redundancy). This gives a metric of 1.25 MB of L3 cache per core, and for the SKUs below 48 cores the cache is scaled accordingly. Qualcomm also integrates its inline memory bandwidth compression to enhance the workflow, and provides a cache quality-of-service model (as explained in our initial coverage). Each of the six memory controllers supports a channel of DDR4-2667, for up to 768GB of memory and a peak aggregate bandwidth of 128 GB/s.
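
Both the per-core cache figure and the peak bandwidth figure fall straight out of that topology; a quick check, assuming the standard 8-byte (64-bit) DDR4 channel width:

    # L3 per core: twelve 5 MB segments shared across 48 cores
    print(12 * 5 / 48)                 # 1.25 MB/core

    # Peak memory bandwidth: 6 channels x 2667 MT/s x 8 bytes/transfer
    print(6 * 2667e6 * 8 / 1e9)        # ~128 GB/s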

Qualcomm Centriq 2400 Series
 | Centriq 2460 | Centriq 2452 | Centriq 2434
Cores | 48 | 46 | 40
Base Frequency | 2.2 GHz | 2.2 GHz | 2.3 GHz
Turbo Frequency | 2.6 GHz | 2.6 GHz | 2.5 GHz
L3 Cache | 60.0 MB | 57.5 MB | 50.0 MB
DDR4 | 6-channel DDR4-2667 (all)
PCIe | 32 lanes of PCIe 3.0 (all)
TDP | 120 W | 120 W | 110 W
Price | $1995 | $1373 | $888

Starting with the chips on offer, Qualcomm will initially provide three different configurations, starting with 40 cores at 2.3 GHz (2.5 GHz turbo), up to 46 and 48 cores both at 2.2 GHz (2.6 GHz turbo). All three chips are somewhat equal, binned depending on active duplexes and cache, with $1995 set for the top SKU. Qualcomm is aiming to attack current x86 cloud server markets on three metrics: performance per watt, overall performance, and cost. In that regard it offered three distinct comparisons, one for each chip:

  • Centriq 2460 (48-core, 2.2-2.6 GHz, 120W) vs Xeon Platinum 8180 (28-core, 2.5-3.8 GHz, 205W)
  • Centriq 2452 (46-core, 2.2-2.6 GHz, 120W) vs Xeon Gold 6152 (22-core, 2.1-3.7 GHz, 140W)
  • Centriq 2434 (40-core, 2.3-2.5 GHz, 110W) vs Xeon Silver 4116 (12-core, 2.1-3.0 GHz, 85W)

Qualcomm provided some SPECint_rate2006 comparisons between the chips, showing Centriq either matching or winning in performance per thread, beating in performance per watt, and up to 4x in performance per dollar. It should be noted that the data for the Intel chips were interpolated from other Xeon chips, except the 8180. Those numbers can be found in our gallery below.

One interesting bit of data from the launch was the power consumption results provided. As a server or cloud CPU scales to more cores, there will undoubtedly be situations where not all the cores are drawing power, either because of how the algorithm works or because the system is waiting on data. TDP values are normally quoted as a measure of power consumption, despite actually defining thermal dissipation requirements – a 120W chip does not always draw 120W, in other words. To this end, Qualcomm provided the average power consumption of the 120W Centriq 2460 while running SPECint_rate2006.

It shows a median power consumption of 65W, peaking just below 100W for hmmer and h264ref. The other interesting point is the 8W idle power, which applies when only the C1 idle state is enabled. With all idle states enabled, Qualcomm claims under 4W for the full SoC. Qualcomm was keen to point out that this includes the IO on the SoC, which requires a separate chipset on an Intel platform.

Any time an Arm chip comes into the enterprise space, thoughts immediately turn to raw performance, and Qualcomm is keen to point out that while the chip is performant, its main target is cloud services and hyperscale: scale-out situations, micro-services, containers, and instance-based implementations. At the launch in San Diego, Qualcomm rolled out quotes from Alibaba, Google, HPE, and Microsoft, all of whom are working closely with the company on deployment. Demonstrations at the launch event included NoSQL, cloud automation, data analytics with Apache Spark, deep learning, network virtualization, video and image processing, compute-based bioinformatics, OpenStack, and neural networks.

On the software side, Qualcomm is working with a variety of partners to enable and optimize their software stacks for the Falkor design. At Hot Chips, Qualcomm also stated that there are plans in the works to support Windows Server, based on work done with their Snapdragon on Arm initiative, although this seemed to be missing from the presentation.

Also as a teaser, Qualcomm gave the name of its next-generation enterprise processor. The next design will be called the Qualcomm Firetail, using Saphira cores. (Qualcomm has already trademarked both of those names).

Qualcomm Centriq is now shipping (for revenue) to key customers. We should be on the list for review samples when they become available.

The MSI X299 Tomahawk Arctic Motherboard Review: White as Snow

Our first look at MSI’s X299 offerings arrived in the form of a mid-range board from the Arsenal Gaming series, the Tomahawk Arctic. The Tomahawk line of motherboards, MSI says, has heavy plated heatsinks and “combative looks ready for anything”. It uses MSI’s branded ‘Military Class VI’ components and has features aimed at improving the gaming experience.

MSI X299 Tomahawk Arctic Overview

MSI released its Arsenal Gaming line back in 2015 with the Z170 chipset. Within the line were board names such as Grenade, Mortar, Bazooka, and Tomahawk, each supporting a different level of features within the Arsenal range. The Arsenal Gaming segment is suited to the beginner or casual gamer using integrated graphics or mainstream-level video cards. Features for the casual user have more subtle touches, such as LED lighting on the back of the motherboard for an ambient-light approach, Mousemaster software that lets users tweak a high-end gaming mouse, and an EZ Debug LED showing where in the boot process the system may be getting hung up. All of these features are claimed to simplify and streamline activities for the casual user and gamer.

Overall, the CPU performance on the X299 Tomahawk Arctic was above average, managing to beat comparable boards in some tests due to the use of Multi-Core Enhancement, although some tests were more affected by MCE than others, and some results differed from other MSI boards. We are not entirely sure why this happened, as the testing used the exact same settings and drivers, and even turbo clock speeds were the same in both tests; it might be related to how the board ramps the frequency up and down in the local environment at the time. Boot times were in the middle of the pack, while power use was slightly higher than on the boards previously tested. Overclocking results were the same as in our other high-end motherboard tests, with our i9-7900X hitting 4.5 GHz at our temperature limit. The voltage needed to reach those clocks was slightly lower, though not enough to push past our current temperature-limited clock speeds. The small variance could also be down to software, since the board does not have voltage read points to confirm with a digital multimeter.

The Tomahawk gives users almost the full gamut of storage connectivity: six SATA ports, two M.2 slots, and one U.2 port. The two M.2 slots go through the chipset, sending their data through the same path as the SATA ports; as a result, when using M.2 SATA devices, some SATA ports will be disabled. The U.2 port and the third PCIe slot are also switched, meaning only one can be used at a time. There are many options and many possible outcomes when mixing these devices; this is explained in more detail later in the review, and all the details can be found in the manual (p32).

There are six PCIe slots in total: the slots in positions 1, 4, and 6 all use CPU-connected lanes (intended for video cards), while slots 2, 3, and 5 are sourced from the chipset. For connectivity, the MSI X299 Tomahawk has a USB 3.1 (10 Gbps) Type-C port and a Type-A port on the back panel via an ASMedia 3142 controller, with three more 5 Gbps ports on the back panel managed by an ASMedia 1074 hub. The chipset delivers the remainder of the 5 Gbps and USB 2.0 ports. Audio comes through a Realtek ALC1220 codec, and networking from an Intel I219-V controller.

The X299 Tomahawk Arctic is currently priced at $280 at both Newegg and Amazon. The price point places the Tomahawk Arctic in the middle of MSI’s entire product stack, and between its twin siblings within the Arsenal line, the Tomahawk (non-Arctic) and the Tomahawk AC. They share the exact same DNA but are both non-white, while the Tomahawk AC adds Wi-Fi and Bluetooth using the Intel Dual Band Wireless AC 8265 module.

MSI’s X299 Strategy

MSI brings a current total of 11 X299 boards to choose from: the MSI X299 XPower Gaming AC holds the flagship title and makes its home in the Enthusiast Gaming segment along with the X299 Gaming M7 ACK. There are a total of three boards in the Performance Gaming hierarchy in the X299 Gaming Pro Carbon AC, Gaming Pro Carbon, and X299M Gaming Pro Carbon AC (mATX). The Arsenal line carries the three Tomahawk boards, the X299 Tomahawk, the X299 Tomahawk Arctic (this review) and the X299 Tomahawk AC, while the Pro lineup for professionals has three motherboards; X299 SLI Plus (review soon!), X299 Raider, and X299M-A Pro.

MSI’s X299 Motherboard Lineup (11/20)
Motherboard | Review | Amazon | Newegg
X299 XPower Gaming AC | - | $450 | $450
X299 Gaming M7 ACK | - | $378 | $380
X299 Gaming Pro Carbon AC | Review 9/21 | $310 | $310
X299 Gaming Pro Carbon | - | $320 | $320
X299M Gaming Pro Carbon AC | - | - | -
X299 Tomahawk AC | - | $290 | $290
X299 Tomahawk Arctic | this review | $280 | $280
X299 Tomahawk | - | $269 | $270
X299 SLI PLUS | [upcoming link] | $220 | $220
X299 Raider | - | $215 | $220
X299M-A Pro | - | $237 | -

The two micro-ATX versions are seemingly hard to find at the moment, in the US at least.

Information on Intel’s X299 and our other Reviews

With Intel’s release of the Basin Falls platform, encompassing the new X299 chipset and LGA2066 socket, new generations of CPUs called Skylake-X and Kaby Lake-X were also released. The Skylake-X CPUs range from the 7800X, a hex-core part, all the way up to the 18-core 7980XE multitasking behemoth. Between those bookend CPUs are five others with increasing core counts, as in the table below. The latter HCC (high core count) models are set to launch over the second half of 2017.

Skylake-X Processors
 | 7800X | 7820X | 7900X | 7920X | 7940X | 7960X | 7980XE
Silicon | LCC | LCC | LCC | HCC | HCC | HCC | HCC
Cores / Threads | 6/12 | 8/16 | 10/20 | 12/24 | 14/28 | 16/32 | 18/36
Base Clock (GHz) | 3.5 | 3.6 | 3.3 | 2.9 | 3.1 | 2.8 | 2.6
Turbo Clock (GHz) | 4.0 | 4.3 | 4.3 | 4.3 | 4.3 | 4.3 | 4.2
Turbo Max Clock (GHz) | N/A | 4.5 | 4.5 | 4.4 | 4.4 | 4.4 | 4.4
L3 Cache | 1.375 MB/core (all)
PCIe 3.0 Lanes | 28 | 28 | 44 | 44 | 44 | 44 | 44
Memory Channels | 4 (all)
Memory Frequency | DDR4-2400 | DDR4-2666 | DDR4-2666 | DDR4-2666 | DDR4-2666 | DDR4-2666 | DDR4-2666
TDP | 140 W | 140 W | 140 W | 140 W | 165 W | 165 W | 165 W
Price | $389 | $599 | $999 | $1199 | $1399 | $1699 | $1999

Board partners have launched dozens of motherboards on this platform already, several of which we will have an opportunity to look over in the coming weeks and months.

To read specifically about the X299 chip/platform and the specifications therein, our deep dive into what it is can be found at this link.

X299 Motherboard Review Notice

If you’ve been following the minutiae of the saga of X299 motherboards, you might have heard about some issues regarding power delivery, overclocking, and the ability to keep these processors cool given their power consumption. In a nutshell, it comes down to this:

  • Skylake-X consumes a lot of power at peak (150W+),
  • The thermal interface inside the CPU does not transfer heat well, requiring a powerful CPU cooler,
  • Some motherboard vendors apply Multi-Core Turbo by default, which raises power consumption and voltage, exacerbating the issue,
  • The VRMs have to deal with more power and, due to conversion losses, rise in temperature (see the rough worked example after this list),
  • Some motherboards do not have sufficient VRM cooling without an active cooler,
  • This can cause the CPU to downclock or hit thermal power states so as not to degrade components,
  • This causes a performance drop, and overclocked systems are affected even more than usual.
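
To put a rough number on the VRM point above: conversion losses scale with delivered power, so the heat the VRM heatsink must shed grows quickly with an overclock. A back-of-the-envelope sketch, where both the 250W package draw and the 88% conversion efficiency are illustrative assumptions rather than measured values:

    # Rough VRM dissipation estimate; both inputs are assumptions.
    cpu_power_w = 250         # assumed overclocked Skylake-X package draw
    vrm_efficiency = 0.88     # assumed buck-converter efficiency
    vrm_loss_w = cpu_power_w / vrm_efficiency - cpu_power_w
    print(round(vrm_loss_w))  # ~34 W dissipated in the VRM as heat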

There has been some excellent work done by Igor Wallossek over at Tom’s Hardware, with thermal probes, thermal cameras, and performance analysis. The bottom line is that motherboard vendors need to be careful when it comes to default settings (if MCT is enabled by default) and provide sufficient VRM cooling in all scenarios – either larger and heavier heatsinks or moving back to active cooling. This means there are going to be some X299 boards that perform normally, and some that underperform based on BIOS versions or design decisions.