Samsung’s updated DeX desktop experience uses the S9 as a trackpad

When it launched last year, Samsung’s first DeX dock was the latest in a long line of attempts to blur the line between phones and traditional PCs. And you know what? It worked surprisingly well, even though the value for most people wasn’t particularly clear. That didn’t stop Samsung from trying again, though: the company unveiled a new version of its DeX hardware alongside the Galaxy S9 and S9 Plus, and it’s quite different from the dock we got last year.

In its closed form, the original DeX dock sort of resembled an extra-chunky hockey puck, and the sliding mechanism that allowed a phone to fit was quite clever. That’s gone. Instead, this year’s DeX Pad looks like a shrunken Sega Genesis that the phone is meant to rest on top of. It seems like a step backward in design until you realize that Samsung wanted the phone’s touchscreen as close as possible to the surface it’s resting on so you can use it as a trackpad.

This isn’t the first time we’ve seen an approach like this — Razer did the same thing with its lapdock concept at CES — but it works quite well for navigating between running apps on a bigger display. You’ll eventually be able to use that connected phone as a keyboard too, if you’ve got a masochistic streak. I was hoping to take this for a spin just to see if the road warriors and traveling salespeople Samsung is targeting with the DeX Pad could squeeze some use out of a tiny touch keyboard, but it wasn’t ready for me to play with just yet.

More importantly, you’ll be able to drive higher-resolution displays than you could before — in the old days, the maximum resolution for a connected screen topped out at 1080p, which is less than ideal for anyone who has invested in high-end hardware. Things are a little better now, though, since you can crank up the resolution to 2,560 x 1,440 on that big display (as long as you’re using an S9, at least). When you take the boost in screen resolution along with more accessible touch controls and the Galaxy S9’s power, some novel experiences begin to emerge. I definitely didn’t think I’d spend a chunk of my day playing Final Fantasy XV: Pocket Edition on a desktop monitor while fiddling with a phone-touchpad combo, but the whole thing felt more elegant than expected.
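
For a sense of scale, that bump works out to nearly 80 percent more pixels than the old cap; the arithmetic is simple enough to check:

```python
# Pixel count: old 1080p DeX cap vs. the new 1440p output on the S9
old = 1920 * 1080   # 2,073,600 pixels
new = 2560 * 1440   # 3,686,400 pixels
print(f"{new / old:.2f}x the pixels")   # ~1.78x
```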

Some changes, however, might rub some people the wrong way. The DeX Pad still packs two full-sized USB ports, a USB-C port and an HDMI out, but the Ethernet jack present in the original is nowhere to be found. Whether anyone will actually miss that thing is another question, but we shouldn’t have to wait long to find out.

Earthbound antimatter mystery deepens after scientists rule out pulsar source

The HAWC gamma-ray observatory detects cosmic rays from its altitude of 13,500 feet in Mexico's Pico de Orizaba National Park. The Sierra Negra volcano looms large in the background.

More antimatter particles stream toward Earth than scientists can explain — and new research from a mountaintop observatory in central Mexico deepens the mystery by crossing off one possible source.

The Earth is constantly showered by high-energy particles from a variety of cosmic sources. Physicist Victor Hess used a balloon to provide the first evidence of the extraterrestrial nature of cosmic rays in 1912. Since then, scientists have identified and accounted for a variety of different types, but the origin of some of these particles continues to elude experts.

The recent finding, detailed in the journal Science today (Nov. 17), concerns positrons, the antimatter counterparts of electrons. High-energy particles, usually protons, traveling across the galaxy can create pairs of positrons and electrons when they interact with dust and gas in space, study co-author Hao Zhou, a physicist at Los Alamos National Laboratory, told Space.com. In 2008, the space-based PAMELA detector measured unexpectedly high numbers of earthbound positrons — about 10 times what scientists were expecting to see, according to Zhou. [Supernova Face-Off May Solve 40-Year-Old Antimatter Mystery]

After years of work, camps coalesced around two distinct explanations, according to a statement by Michigan Technological University, which was involved in the new study. One hypothesis suggests the particles come from nearby pulsars, the rapidly spinning cores of burnt-out stars, which can whip particles like electrons and positrons to incredible speeds. The other group posits a more exotic origin for the excess positrons, perhaps involving dark matter, an unknown yet pervasive entity that accounts for 80 percent of the universe’s mass.

Particles like positrons that carry an electric charge are difficult to detect on Earth, since they can be deflected by the planet’s magnetic field. But scientists have a workaround. The particles also interact with the cosmic microwave background — an ever-present stream of low-energy photons left over from the birth of the universe. “The high-energy electron, or positron, [will] kick the low-energy photon … so the photon becomes a high-energy gamma-ray,” Zhou said. These gamma-rays, which have no electric charge, can pass right through the magnetic field and make it all the way to Earth’s surface.
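
To get a feel for the energies involved, here is a back-of-envelope sketch of that “kick” (inverse Compton scattering) in the simple Thomson limit. The electron energies and the mean CMB photon energy below are illustrative textbook values, not figures from the study:

```python
# Inverse Compton estimate in the Thomson limit: an electron (or positron)
# with Lorentz factor gamma boosts a photon to ~(4/3) * gamma^2 * its energy.
ELECTRON_REST_EV = 511e3   # electron rest energy, eV
CMB_MEAN_EV = 6.3e-4       # mean CMB photon energy, eV (~2.7 K blackbody)

def upscattered_ev(electron_ev):
    gamma = electron_ev / ELECTRON_REST_EV
    return (4 / 3) * gamma**2 * CMB_MEAN_EV

for e in (1e12, 20e12):    # illustrative 1 TeV and 20 TeV electrons
    print(f"{e/1e12:>4.0f} TeV electron -> ~{upscattered_ev(e)/1e9:,.0f} GeV gamma-ray")
# ~3 GeV and ~1,300 GeV: multi-TeV electrons produce the TeV gamma-rays HAWC sees
```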

Zhou’s team made detailed measurements of the gamma-rays coming from the direction of two nearby pulsars — Geminga and its companion PSR B0656+14 — that are the right age and distance from Earth to account for the excess positrons. To do this, the scientists used the High-Altitude Water Cherenkov (HAWC) Gamma-Ray Observatory, located about four hours east of Mexico City. HAWC comprises more than 300 tanks of extra-pure water. When gamma-rays plow into the atmosphere, they create a cascade of high-energy particles. As this shower of particles passes through HAWC’s tanks, it emits flashes of blue light, which scientists can use to determine the energy and origin of the original gamma-ray.
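
Those blue flashes are Cherenkov radiation, emitted whenever a charged particle moves through the water faster than light does in that medium (c/n). A quick sketch of the threshold for electrons, using the standard refractive index of water:

```python
import math

N_WATER = 1.33      # refractive index of water
E_REST_MEV = 0.511  # electron rest energy, MeV

beta_th = 1 / N_WATER                      # minimum speed as a fraction of c
gamma_th = 1 / math.sqrt(1 - beta_th**2)   # corresponding Lorentz factor
ke_th = (gamma_th - 1) * E_REST_MEV        # kinetic energy threshold
print(f"electron Cherenkov threshold in water: ~{ke_th:.2f} MeV")
# ~0.26 MeV -- shower particles carry far more, hence the light in HAWC's tanks
```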

The data from HAWC revealed that particles are streaming away from the pulsars too slowly to account for the excess positrons, according to a statement by the University of Maryland, whose researchers also contributed to the work. In order to have arrived here by now, the particles would have needed to leave before the pulsars had formed, Zhou said.
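
The reasoning behind that statement is worth sketching. Positrons do not travel in straight lines; they random-walk through the galaxy’s tangled magnetic fields, so the trip takes far longer than the light-travel time. A rough sketch, using an assumed distance of ~250 parsecs for Geminga and a diffusion coefficient of the order HAWC inferred (~4.5 x 10^27 cm^2/s; both numbers are illustrative, not taken from the paper):

```python
# Straight-line vs. diffusive travel time from a Geminga-like pulsar.
PC_CM = 3.086e18    # one parsec, cm
C_CM_S = 3.0e10     # speed of light, cm/s
YEAR_S = 3.156e7    # one year, seconds

r = 250 * PC_CM     # assumed distance to the pulsar (~250 pc)
D = 4.5e27          # assumed diffusion coefficient, cm^2/s

straight_yr = r / C_CM_S / YEAR_S        # light-travel time
diffusive_yr = r**2 / (4 * D) / YEAR_S   # characteristic diffusion time ~ r^2/4D

print(f"straight line: ~{straight_yr:,.0f} years")    # ~800 years
print(f"diffusive:     ~{diffusive_yr:,.0f} years")   # ~1,000,000 years
# With Geminga only a few hundred thousand years old, the positrons
# simply haven't had time to make the trip.
```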

Zhou’s colleagues are quick to point out an important caveat. “Our measurement doesn’t decide the question in favor of dark matter, but any new theory that attempts to explain the excess using pulsars will need to match the new data,” University of Maryland physicist Jordan Goodman, the lead investigator and U.S. spokesman for the HAWC collaboration, said in the statement from Maryland.

By observing the rotations of galaxies, scientists determined that the universe contains more mass than the objects we can observe; they call this mysterious extra mass dark matter. Aside from its gravitational influence, observed from afar, dark matter has never been directly detected. However, a popular model of the substance involves weakly interacting massive particles, or WIMPs, which interact with regular matter only through gravity and the weak nuclear force. If these proposed particles were to decay, or be annihilated somehow, they could conceivably generate pairs of electrons and positrons, Zhou said.

There are other astrophysical processes to consider as well. Supernova remnants and microquasars — extremely bright objects formed as matter spirals toward a black hole — can produce positrons, Zhou said. And there’s the possibility that the initial model of particle interactions with the cosmic microwave background is inaccurate. “In order to confirm a detection of dark matter, I guess, there’s still a long way to go,” Zhou said. “We have to rule out all these astrophysical processes.”

Zhou’s team plans to take advantage of HAWC’s incredibly wide field of view to narrow down these alternatives in future studies.

Intel’s PC concept ‘hides’ a 5G antenna in a plump kickstand

MWC 2018 has been gripped with 5G fever, even more intensely than in years past. Intel has gone all out for the show this year, with display upon display of 5G demos dominating its booth. One of the new proofs of concept from the chip maker is a detachable PC streaming a 4K video over a test 5G connection using an early modem. Intel also announced a partnership with Dell, Lenovo, HP and Microsoft to bring 5G-connected PCs with XMM 8000 series modems to market in the second half of 2019.

The unnamed concept 2-in-1 is a preview of what those PCs might be capable of, in terms of connection speeds. Intel is not going to sell hardware like this directly to consumers. At the booth, the convertible streamed Wonder Woman on loop, and company reps told Engadget that the demo was hitting 4 to 5 Gbps rates.

That’s impressive if true (we had no real way of measuring), but remember that the conditions were ideal. The 28GHz spectrum the PC was tapping for the demo was dedicated to the stream, and thus not congested. Intel also placed the base station for the network about four or five feet directly above the device, leaving little room for interference to affect the stream.

One of the biggest challenges in implementing the millimeter wave communication that’s crucial to 5G is signal interference. Line of sight, meaning a clear path between the transmitter and receiver, is all but required to move data. One of the ways Intel is addressing this is by embedding its early 5G modem in the concept PC’s kickstand, which flaps out of the device and should, the company believes, minimize obstruction of incoming signals. To be clear, the kickstand on this concept device is much larger than we’ve seen on commercially available convertibles, and might shrink in eventual products.
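
Distance matters enormously at these frequencies, which is part of why the base station was parked so close. As a rough illustration, free-space path loss grows with the square of both frequency and distance (the Friis formula); a quick sketch, assuming idealized free space with no obstructions:

```python
import math

def fspl_db(distance_m, freq_ghz):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_GHz) + 92.45."""
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_ghz) + 92.45

for d in (1.5, 10, 100):   # Intel's ~5 ft demo distance vs. more realistic ranges
    print(f"28 GHz over {d:>5} m: {fspl_db(d, 28):.1f} dB of path loss")
# ~65 dB at 1.5 m vs. ~101 dB at 100 m: every 10x in distance costs another 20 dB
```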

Since this isn’t a device that you’re ever going to be able to (or want to) buy, there’s no real point in telling you how the display looked or the keyboard felt. What’s interesting is that Intel’s rep told me the device can last between three and four hours when streaming 4K video over 5G. We don’t have any other benchmark to compare that with just yet, and actual PCs may ship with differently sized batteries, but that’s long enough to watch an entire blockbuster, and even a credits scene or two.

We’re just a year and a half away from these devices launching, and hopefully by then, 5G networks will have become a widespread reality. Until then, we can only wait for a world without buffer delays.

You can finally stream Xbox One games to your Mac

Until now, if you fancied playing console games on your Mac, your only option was the PlayStation 4’s Remote Play. Windows 10 users have been able to stream Xbox One titles to their PCs since 2015, but macOS users have been out of luck. A new $10 app called OneCast, however, has apparently figured out how to get your Xbox One games streaming to your Mac.

Unlike Sony’s solution, however, the app isn’t an official release. We’ve reached out to Microsoft for comment and will update this post when we hear back. OneCast promises support for 1080p video, low lag, easy setup and the ability to stream Xbox One titles over the internet (with some manual configuration). OneCast says that you can use an Xbox One wireless controller connected via USB or Bluetooth, use any number of different consoles or gamertags, and run in either full screen or windowed mode. There’s a free 14-day trial, and the current $10 price is half the app’s regular $20.

Scientists have a new theory on how the Chernobyl disaster unfolded

The number four reactor at the Chernobyl nuclear plant is seen in this December 2, 1986 file photo, after completion of work to entomb it in concrete following the explosion at the plant. (Reuters)

A new theory on the Chernobyl disaster could shed fresh light on the world’s worst nuclear accident.

In an article published in the journal Nuclear Technology, scientists argue that the first of two explosions reported by eyewitnesses was nuclear rather than a steam explosion, as is widely thought. The researchers believe that the first explosive event noted by eyewitnesses was a jet of debris ejected to an altitude of almost 2 miles by a series of nuclear explosions within the Chernobyl reactor. Some 2.7 seconds later, they say, a steam explosion ruptured the reactor and sent yet more debris into the atmosphere at lower altitudes.

“We realized that we, based on real measurements and observations, could explain details in the Chernobyl accident scenario and the nature of the two major explosions that occurred during a few seconds that unfortunate night more than 31 years ago,” explained the report’s lead author Lars-Erik De Geer, in an email to Fox News.

The 1986 explosion at the Chernobyl nuclear plant in Ukraine sparked a widespread environmental disaster. Thirty workers died either from the explosion at the number four reactor or from acute radiation sickness within several months. The accident exposed millions in the region to dangerous levels of radiation and forced a wide-scale, permanent evacuation of hundreds of towns and villages in Ukraine and Belarus.

A helicopter dropping concrete onto the fourth reactor of the Chernobyl nuclear power plant after its explosion is seen in this 1986 file picture. (Reuters)

A cloud of radioactive particles from the disaster reached other parts of Europe, such as Sweden.

The report cites xenon isotopes detected by the V.G. Khlopin Radium Institute in Leningrad four days after the accident. Leningrad, now known as Saint Petersburg, is about 599 miles north of Chernobyl. Xenon isotopes were also reported in Cherepovets, about 622 miles north of Chernobyl.

Those isotopes were the product of recent nuclear fission, indicating that they likely came from a nuclear explosion, according to the experts. This is in contrast to the main Chernobyl debris, which contained equilibrium xenon isotopes from the reactor’s rupture and drifted toward Scandinavia.

This new theory, presented by experts from the Swedish Defence Research Agency, the Swedish Meteorological and Hydrological Institute and Stockholm University, could offer fresh insight into the disaster. The new analysis could help prevent similar incidents from occurring, experts say.

A panel with a portrait of Soviet state founder Vladimir Lenin and an abandoned building are seen at the 30 km (19 miles) exclusion zone around the Chernobyl nuclear reactor in the abandoned village of Orevichi, Belarus, March 12, 2016. (REUTERS/Vasily Fedosenko)

The condition of the destroyed reactor tank suggests that the first explosion produced temperatures high enough to melt a 6.6-foot-thick bottom plate in part of the core, the researchers said, damage consistent with a nuclear explosion. In the rest of the core, the bottom plate was relatively intact, though it had dropped by nearly 13 feet. This, they say, is consistent with a steam explosion, which would not produce temperatures sufficient to melt the plate but could generate enough pressure to force it down.

Additionally, seismic measurements and eyewitness reports of a blue flash above the reactor a few seconds after the first explosion could also support the new theory of a nuclear explosion followed by a steam explosion.

De Geer told Fox News that the Chernobyl disaster could only happen in Soviet-era reactors built using a design known as Reaktor Bolshoy Moshchnosti Kanalnyy (RBMK), or ‘High Power Channel-Type Reactor.’ There are 11 RBMK reactors operating in Russia, according to the World Nuclear Association.

“Our new theory deepens the understanding of the severe effects that can be the result of some original design faults in such reactors,” he said. “Much has been corrected in remaining RBMK reactors, but a better understanding of what really happened in 1986 must of course be of great value for overseeing and possibly improving the design also in the future.”

The disaster shone a spotlight on lax safety standards and government secrecy in the former Soviet Union. The explosion on April 26, 1986, was not reported by Soviet authorities for two days, and then only after winds had carried the fallout across Europe and Swedish experts had gone public with their concerns.

The final death toll from Chernobyl is subject to speculation, due to the long-term effects of radiation. Estimates range from the World Health Organization’s figure of 9,000 to the environmental group Greenpeace’s estimate of as many as 90,000.

The terrible environmental fallout of Chernobyl is still being felt. A wild boar with more than 10 times the safe limit of radiation, for example, was recently killed by hunters hundreds of miles away in Sweden.

Lenovo’s Yoga 730 is a cheaper 2-in-1 with Alexa support

Mobile World Congress, for obvious reasons, isn’t really known for its laptop announcements. Lenovo is bucking that trend today, however, with three mid-range convertibles sporting the Yoga name. The most exciting is the Yoga 730, a sleek 2-in-1 that sits below the company’s flagship ThinkPad X1 line and the popular Yoga 920. It will be available in 13- and 15-inch variations, sporting similar designs but slightly different upgrade options. They’ll be joined by the Flex 14 (which will be marketed as the Yoga 530 outside of North America), a cheaper alternative with a near-identical form factor.

First, the Yoga 730. We’ve yet to be told the base specs, but we know they’ll go up to an eighth-generation Intel Core i7 processor and 16GB of RAM. The smaller version will offer up to 512GB of SSD storage, while its larger sibling ramps up to a beefy 1TB. On the 13-inch laptop, you’ll be stuck with integrated graphics (boo), but the 15-inch model can be specced up to a gaming-friendly Nvidia GTX 1050. Both come with built-in JBL speakers and the option of a 1080p or 4K touchscreen — perfect for mindless finger doodles and artsy brushstrokes with Lenovo’s optional Active Pen 2 stylus.

It’s a Yoga, so of course you can flip the screen over and use it like a tablet. I still think it’s weird to hold a laptop with the keyboard pressed into your palm, but hey, the option is there if you really dig the ‘clipboard’ form factor. Both are relatively light — the 13-inch model weighs two and a half pounds, while the 15-inch version comes in at 1.89kg (roughly 4.2 pounds). The display bezels are also fairly slim, and I like that the webcam is placed up top, rather than in the lower left-hand corner (where you’re stuck with an up-the-nose shot).

Like the ThinkPad X1 range, the Yoga 730 will support both Cortana and Amazon’s Alexa assistant. AI-hopping isn’t ideal, but until the smart speaker wars shake out, it makes sense for Lenovo to offer multiple options (the company is also backing the Google Assistant with its Smart Display). Both the 13-inch and 15-inch models will come with far-field microphones that can pick up your voice from across the room. So in theory, you could ask Cortana for your schedule and then book a time-appropriate Uber through Alexa. Or search for an article and order a pizza while you read.

The 13-inch and 15-inch Yoga 730 will start at $880 and $900 respectively. They cost considerably more than the $600 Flex 14, which will ship with an Intel Core i3 processor and a 1080p touchscreen. You can upgrade to an eighth-generation Intel Core i7 processor, a 512GB SSD, 16GB of RAM and an Nvidia GeForce MX130 graphics card, but these will cost you extra. At first blush, Lenovo’s new laptops seem like decent if predictable performers for people with a sub-$1,000 budget. They all look the part, and the Alexa support is nice for people who own a bunch of Echo speakers and compatible smart home appliances. Part of me misses the watchband-inspired hinge found on the Yoga 920, but hey — you can’t have it all.

Huawei’s MateBook X Pro crams a pop-up webcam into the keyboard

Huawei moved the webcam from above the screen and hid it in the keyboard, primarily to tackle privacy concerns. Instead of integrating shutters like Lenovo and HP have done, Huawei’s solution to the sticky-note-over-the-camera situation is to embed it inside a key.

Push down the spring-loaded middle button of the top function row, and up pops a 1-megapixel webcam with a light next to it indicating it’s on. Press it down when you’re done, and it not only deactivates and gets out of the way; even if it were hacked to spy on you, it would see only the darkness inside your laptop.

Kudos for creativity, Huawei — this is unique. But as we’ve already seen with Dell’s XPS line of laptops, webcams placed below the display make for seriously unflattering, unclear angles in photos and during conference calls. Unlike the XPS laptops, you can’t move the screen to try to find a less awkward angle, either. When I tried out the MateBook X Pro’s camera, I was annoyed by how difficult it was to avoid an on-screen double chin — I wouldn’t be happy using this device to make video calls.

The one other benefit of moving the webcam down to the keyboard is the MateBook X Pro’s noticeably skinnier bezels compared with the original. It doesn’t seem like a good enough tradeoff, though.

Still, the Pro is a gorgeous piece of hardware. Sure, it looks very much like a MacBook, but I’m not complaining — nice is nice. The new MateBook feels just as sturdy and premium as its predecessor, once again combining a lightweight build with a sleek profile.

If you can look past the awkward camera placement, or if you don’t intend to use your laptop’s webcam much, there are a few more things to like about the MateBook X Pro. In addition to getting bumped up to the latest eighth-generation Intel Core i5 (or i7) CPUs, the notebook can also be configured with a discrete NVIDIA MX150 graphics chip. This makes it the thinnest laptop of its size to sport discrete graphics, according to Huawei.

The MateBook X Pro has a larger screen than the original: 14 inches, up from the smaller model’s 13-inch display. It also goes up to 450 nits of brightness, over the smaller laptop’s 350 nits. Since I only saw the Pro in a meeting room, I couldn’t tell if the increased luminance would make it easier to see in direct sunlight, but it was certainly clear during our demo. The new model’s screen is now touch-friendly and got a slight resolution bump to 3,000 x 2,000, maintaining a similar pixel density to the smaller version.
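
The density figure is easy to sanity-check, since pixel density follows directly from resolution and diagonal size (simple Pythagoras):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from panel resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f"MateBook X Pro: ~{ppi(3000, 2000, 14):.0f} ppi")   # ~258 ppi
```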

To complement the display during your movies or games, the MateBook X Pro comes with Dolby Atmos 2.0 for crisper and more immersive surround sound. During our demo, Dolby’s sample of nature sounds rang out loud and clear from the laptop’s quad speakers, even when I was facing the back of the device. Huawei also moved the speaker grilles from above the keyboard to either side of the keys. The Pro also packs a quad-mic array, two more microphones than before, which should allow for clearer voice quality. A much larger touchpad now sits below the keyboard, making mouse navigation more convenient, too.

The company has apparently managed to squeeze more battery life out of the Pro: it can clock about 12 hours on a charge while looping 1080p video, two hours longer than the regular model. While the smaller MateBook X had two USB-C ports (one more than the MacBook), the new laptop adds a USB Type-A socket to the mix, at USB 3.0 speeds. The two USB-C slots handle power and data transfer, and one of them is also compatible with Thunderbolt 3.

Questionable webcam placement aside, the MateBook X Pro appears to be an attractive, powerful 14-inch laptop. If the camera is not a dealbreaker, you might want to consider this PC as your next workhorse. Unfortunately, we still don’t know how much it will cost and when it will be available, but we do know that it will come in two colors (silver and dark gray). Because the original MateBook X started at $1,099, you can expect the Pro model to cost slightly more than that at launch. That’s not a small investment, so you might want to wait until we get a review unit in for more testing before splurging on it.

Update: At its press conference in Barcelona, Huawei shared some European pricing information, which should be a good reference for when the device arrives stateside. The base configuration, which packs the MX150 discrete graphics, a Core i5 CPU and 256GB of storage, will cost €1,499.

The GIGABYTE Aorus AX370-Gaming 5 Review: Dual Audio Codecs

We are having a look at an LED-laden, gaming-focused ATX motherboard from GIGABYTE: the Aorus AX370-Gaming 5. If a user wants LEDs for Ryzen at under $200, here is one of the primary contenders. Being part of GIGABYTE’s gaming product line means we get SLI support, and GIGABYTE is using a gaming-focused network controller (one of two) and some overclocking options for the processor. The interesting part of this board, however, is the use of dual audio codecs: one for the rear panel and one for the front panel. Physically doing this requires a couple of compromises, so we have put the board through its paces to see if it is worth buying.

The GIGABYTE Aorus AX370-Gaming 5 Overview

The GIGABYTE AX370-Gaming 5 shows that not every motherboard has to conform to the regular gaming-themed combination of black PCB with red or silver aluminium heatsinks. The Gaming 5 instead sports a wave of contrasting black and white heatsinks across the board. GIGABYTE has opted to implement a fairly standard X370 PCIe layout consisting of two full-length PCIe 3.0 slots powered by the CPU, which support dual graphics card configurations in either SLI or CrossFire and have additional rigidity support. This is in addition to a single full-length slot at the bottom, which operates at PCIe 2.0 x4 and also has rigidity support, and three PCIe 2.0 x1 slots.

It gets a little interesting when we start discussing the controllers. Powering the onboard audio are a pair of Realtek ALC1220 codecs, with one dedicated to the back panel and one specifically for the front. Very few boards, if any, have had this arrangement, making the Gaming 5 special in that regard. The audio comes bundled with Sound Blaster’s X-Fi MB5 software utility. For networking, the primary port is derived from the gaming-focused Killer E2500 gigabit controller, and a second comes from an Intel I211-AT controller.

Featured is a 10-phase power design, which GIGABYTE says is aimed at solid and consistent power delivery, and which it claims is useful for overclocking. It is worth noting that the VRM is actually split as a 4+2 design, with the phases dedicated to the CPU run through doublers to give an 8+2 phase design overall.

Storage-wise, the Gaming 5 has eight SATA 6 Gbps ports, which are accompanied by two SATA Express ports. PCIe storage comes via a single U.2 port, which shares bandwidth with a single M.2-2280 slot found between the first two full-length PCIe slots.

Performance on the Gaming 5 essentially matches what we see on the other X370 boards. Because of the dual audio codecs, each codec only gets half of the usual board space, and our audio results show that it is one of the weaker ALC1220 solutions (although better than the ALC892 units we have tested). Power consumption at idle was within a couple of watts of our other boards despite the LEDs, and at load the system actually drew 15W less than in our other tests, for which we are still looking into an explanation. Overclocking, as explained below, was relatively easy.

The Gaming 5 sits near the top of the pile of GIGABYTE’s current X370 offerings, with the only model above it being the AX370-Gaming K7. It is worth noting that GIGABYTE’s X370 range stops just short of $200 even with that top model; the very similar AX370-Gaming 5 reviewed here comes in at $184 (at the time of review).

Overclocking

Most, if not all, mid-range motherboards are very capable of overclocking the processor, and most include a one-click OC button (either physical or in the BIOS) which hands the task to the motherboard, based on how far it believes the chip can safely be pushed. The only caveat is that virtually every motherboard I have used is so wary of supplying too little voltage that it over-volts the CPU instead. This gives the overclock a better chance of remaining stable, but it plays havoc with efficiency: little extra performance at the price of extra energy lost as heat, and thermal sensors can start to kick in even when the auto option is otherwise safe. With the Gaming 5, both automatic and manual overclocking are available.

Methodology

Our standard overclocking methodology is as follows. We select the automatic overclock options and test for stability with POV-Ray and OCCT to simulate high-end workloads. These stability tests aim to catch any immediate causes for memory or CPU errors.

For manual overclocks, we start at a nominal voltage and CPU multiplier based on information gathered from previous testing, and the multiplier is increased until the stability tests fail. The CPU voltage is then increased gradually until the stability tests are passed, and the process is repeated until the motherboard reduces the multiplier automatically (due to safety protocol) or the CPU temperature reaches a stupidly high level (100ºC+). Our test bed is not in a case, which should push overclocks higher with fresher (cooler) air.
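
In pseudocode, the manual search described above looks something like the following sketch. The step sizes, limits and the stand-in stability function are all illustrative, not recommended values:

```python
# Schematic of the manual overclocking loop; stability_test() and cpu_temp()
# are stand-ins for real POV-Ray/OCCT runs and sensor readouts.

def stability_test(mult: int, volt: float) -> bool:
    # Stand-in: pretend anything up to 39x is stable given enough voltage
    return mult <= 39 and volt >= 1.30 + 0.0125 * (mult - 36)

def cpu_temp() -> float:
    return 70.0   # stand-in sensor reading, degrees C

V_MAX, T_MAX = 1.45, 100.0   # illustrative voltage and thermal limits

def find_stable_overclock(mult: int = 36, volt: float = 1.30):
    best = (mult, volt)
    while True:
        mult += 1                               # raise the multiplier...
        while not stability_test(mult, volt):   # ...and feed in voltage on failure
            volt += 0.025
            if volt > V_MAX or cpu_temp() > T_MAX:
                return best                     # hit a limit: keep the last good OC
        best = (mult, volt)

print(find_stable_overclock())   # (39, 1.35) with the stand-ins above
```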

Overclocking Results

Referencing back to the Biostar X370GTN review, our Ryzen 7 1700 CPU does have a limit between 3.9GHz and 4.0GHz, at least on the boards we have tested thus far. This comes down to the silicon lottery, combined with a sharp ramp of voltage (and therefore temperature) with each step up, causing the chip to cut out or throttle on thermal limits when pushed too far on ambient cooling.

POV-Ray @ 3.9GHz

Power OCCT (w/GTX 980) - Overclocking

The Ryzen 7 1700 processor we are using has a 3.0 GHz base core clock speed and a 3.7 GHz turbo, and is rated at 65W. When overclocked to 3.9 GHz with 1.375v, the overall power consumption taken at the wall was pushing just under 187W at peak.
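
That jump is roughly what the usual dynamic power relation predicts, with CPU power scaling linearly with frequency and with the square of voltage (P ∝ f·V²). A rough sketch, assuming an illustrative all-core stock operating point of ~3.2 GHz at ~1.19 V (an assumption for this chip, not a measured figure):

```python
# Dynamic power scaling estimate: P ~ f * V^2
stock_f, stock_v, stock_w = 3.2, 1.19, 65   # assumed stock point and rated TDP
oc_f, oc_v = 3.9, 1.375                     # the overclock used in this review

scale = (oc_f / stock_f) * (oc_v / stock_v) ** 2
print(f"~{scale:.2f}x CPU power, ~{stock_w * scale:.0f} W from the CPU alone")
# ~1.63x, ~106 W; the rest of the ~187 W wall figure is GPU, VRM losses, etc.
```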

AMD Announces Wider EPYC Availability and ROCm 1.7 with TensorFlow Support

Earlier this year AMD announced its return to the high-end server market with a series of new EPYC processors. Inside is AMD’s new Zen core, up to 32 of them, with a focus on the major cloud providers. We were the first media outlet to publish our review of EPYC, which showed AMD to be highly competitive in an Intel-dominated x86 landscape. One of the concerns over the launch period was the wider availability of EPYC: it was clear that AMD was announcing the product very early in its distribution cycle.

At Supercomputing 17 this week, the high-performance computing conference, AMD is announcing that it has ramped production of the processors, and that it has OEMs ready, distributors ready, and system integrators expanding their portfolios.

OEMs with EPYC enabled systems at Supercomputing this week include ASUS, BOXX, GIGABYTE, HPE (Hewlett Packard Enterprise), Penguin Computing, Supermicro and Tyan. Each company is targeting certain niches: ASUS for HPC and Virtualization with its RS720A-E9 and RS700A-E9 1U/2U servers, BOXX combining EPYC with Radeon Instinct for multi-GPU compute solutions and deep learning, GIGABYTE with rackmount servers, HPE for complex workloads and Supermicro moving from tower form factors to 1U, 2U and 4U for HPC and storage.

“The industry’s leading system providers are here at SC17 with a full breadth of AMD-based solutions that deliver outstanding compute capability across HPC workloads,” said Forrest Norrod, SVP and GM of Enterprise, Embedded and Semi-Custom, AMD.

We had a meeting with AMD ahead of this launch. Normally OEM systems coming to market might not set the news on fire, but there was an interesting line worth mentioning. AMD stated that there has been a steep production ramp for EPYC processors after the initial phase with the cloud providers, ensuring that systems are ready to go and able to meet OEM requirements. We were told that all the SKUs announced at launch are in production as well, all the way down to the 8-core and the 1P parts, so OEMs that are interested in the full stack can now flex their product range muscles.

AMD also wheeled out the Inventec P47 system that it announced at launch, with a single EPYC processor and four Radeon Instinct MI25 GPUs. In partnership with AMAX, 20 of these systems were put together into a single rack, capable of one PetaFLOP of single-precision compute in a turnkey solution. From today, AMAX is taking pre-orders for this rack, for delivery in Q1.
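
That petaflop figure lines up with the MI25’s published single-precision rating of 12.3 TFLOPS per GPU:

```python
# Sanity check on the one-PetaFLOP rack: 20 nodes x 4 Radeon Instinct MI25 each
MI25_FP32_TFLOPS = 12.3    # AMD's published peak single-precision figure
nodes, gpus_per_node = 20, 4

total_tflops = nodes * gpus_per_node * MI25_FP32_TFLOPS
print(f"{total_tflops:.0f} TFLOPS ~= {total_tflops / 1000:.2f} PFLOPS")   # ~0.98 PF
```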

ROCm 1.7 Gets Multi-GPU Support, Support for TensorFlow and Caffe

AMD also announced that its open-compute platform for GPUs, ROCm, is being updated to version 1.7. With this revision, ROCm adds support for multiple GPUs on the latest hardware, as well as support for the TensorFlow and Caffe machine learning frameworks in the MIOpen libraries. This should be bigger news, given the prevalence of TensorFlow when it comes to machine learning. AMD also stated that ROCm 1.7 delivers additional math libraries and software development support, to provide a ‘foundation for a heterogeneous computing strategy’.
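
For developers, the practical upshot is that existing TensorFlow code should be able to target Radeon GPUs without changes. A minimal sketch, assuming a ROCm-enabled TensorFlow 1.x build is installed; the device-placement logging is just there to confirm ops actually land on the GPU:

```python
import tensorflow as tf

# Tiny matmul to verify ops are placed on the (ROCm-backed) GPU
with tf.device('/gpu:0'):
    a = tf.random_normal([1024, 1024])
    b = tf.random_normal([1024, 1024])
    c = tf.matmul(a, b)

config = tf.ConfigProto(log_device_placement=True)   # log where each op runs
with tf.Session(config=config) as sess:
    print(sess.run(tf.reduce_sum(c)))
```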

Google Assistant will get support for Routines ‘in the coming weeks’

Today’s Google Assistant is much, much more capable than the version that first debuted on the original Pixel and Pixel XL. Don’t expect that progress to slow anytime soon, either: Google laid out some new plans to improve the Assistant just in time for Mobile World Congress, and they extend far beyond just teaching it more languages.

Most importantly, Google confirmed it has been working with smartphone makers on ways to weave Assistant more elegantly into our smartphones. That work is being formalized in the new Assistant Mobile OEM program, and Google’s list of accomplishments with its partners is nothing to sneeze at: it helped make Assistant compatible with certain kinds of mobile AI coprocessors and worked to make sure devices can listen for the right wake words even when their screens are off. It won’t be long before you start to see device-specific Google Assistant commands, either — LG touted a list of 23 new commands for its updated V30, and Google also cited close working relationships with companies like Sony and Xiaomi.

Google Assistant is also finally getting support for Routines, a feature first announced last year. Long story short, you’ll be able to string together multiple actions with a single command; saying “OK Google, goodnight,” for instance, could dim your Philips Hue lights, dial down the temperature on your Nest thermostat and lower the volume on your Google Home Max. Routine support is expected to go live within the next few weeks, as will location-based reminders through Assistant-powered speakers. (Yes, you could do this through a phone already, but parity between different flavors of Google Assistant is always a good thing.)