Gelsinger Leads Emboldened Intel With Ice Lake Launch

The past several years haven’t been easy on Intel. The world’s top processor maker stumbled in its transition from 14 nanometer to 10 nanometer manufacturing and still finds itself behind rivals like AMD and Arm, which have made the move to 7 nanometer processes and have line of sight on 5 nanometer. Manufacturing glitches, product delays, and changes at the executive level have had some observers questioning Intel’s ability to keep up amid the changes roiling datacenters, from clouds and the edge to new workloads like artificial intelligence, advanced analytics, and 5G.

The company isn’t in danger of losing its dominant position at the top of the server chip market, where it still holds a share of more than 90 percent. However, the server chip field has become more crowded, with a reinvigorated AMD that, after more than a decade of wandering in the wilderness with its Opteron processors, roared back onto the scene in 2017 on the back of its Epyc chips, based on the company’s new Zen architecture. The Epyc chips put AMD solidly back into the mix, a position put on display last month when it introduced its “Milan” Epyc 7003 chip family.

Arm is still pushing into the datacenter, and if Nvidia is successful in its $40 billion bid to buy the processor designer, the deal will make the GPU maker an even more formidable foe for Intel. Added to all this is a growing list of smaller chip makers and startups – most notably Ampere Computing and its Arm-based Altra offerings – looking to meet demand for silicon optimized for AI and similar workloads.

Intel on Tuesday came out swinging with its “Ice Lake” 3rd Gen Xeon Scalable processors, a sprawling family of chips that comes with integrated AI acceleration through its DL Boost technology, security capabilities via Software Guard Extensions (SGX) and Crypto Acceleration, and a broad range of support from server OEMs (Dell, Hewlett Packard Enterprise, Lenovo, and Cisco, among others) and top public cloud providers like Microsoft Azure, Amazon Web Services, Google Cloud, and Oracle Cloud.

The company also touted new Ice Lake SP Xeon chips as foundational to platform offerings – in conjunction with such technologies as Intel’s Optane persistent memory, Ethernet, software offerings and accelerators like GPUs and FPGAs – for core datacenters, cloud infrastructures, and the rapidly expanding edge space.

We at The Next Platform have laid out many of the grittier details about the new chips. However, the message from Intel – particularly new CEO Pat Gelsinger – was that Intel silicon is the starting point for the platforms and solutions it can bring together for organizations, whether they’re enterprises or HPC firms running general-purpose applications or advanced workloads that need optimized offerings.

“It’s a new day at Intel,” Gelsinger said during his part of a recorded presentation for the Ice Lake Xeon SP launch. “We are no longer just the CPU company. Only Intel can bring together software, silicon and platforms, packaging and process with at-scale manufacturing for our customers so they can build their next-generation innovations. This is our unique advantage as an integrated device manufacturer, or IDM.”

Gelsinger spent three decades at Intel before leaving in 2012 and taking over the CEO slot at VMware, where he oversaw that company’s evolution from the dominant datacenter virtualization vendor to a significant player in the hybrid cloud software market. He returned to Intel earlier this year, replacing Bob Swan, and quickly began making moves to restore Intel’s footing and swagger in the market. That effort was evident last month, when the CEO pushed back against suggestions by some industry observers that Intel should pare back its manufacturing efforts – which for decades had been touted by Intel executives as a key differentiator from other vendors, including AMD and IBM – and instead announced a $20 billion plan to expand its fabs so it can make not only more Intel processors but also those of other companies (it also launched its Intel Foundry Services unit, formalizing work it has been doing for years).

The plan calls for building two new fabs in Arizona and making manufacturing facilities in the United States and Europe more widely available to other companies. Intel will still leverage third-party fabs from companies like Taiwan Semiconductor Manufacturing Co. (TSMC) for such products as chipsets and graphics chips, but it will continue to make the bulk of its core silicon in-house. In a statement at the time, Gelsinger said that “Intel is and will remain a leading developer of process technology, a major manufacturer of semiconductors, and the leading provider of silicon globally.”

What didn’t get as much play during the announcement of what he dubbed IDM 2.0 was that Intel, which killed off its popular Intel Developer Forum (IDF) conference in 2016, will launch a similar event this fall in San Francisco called Intel On. IDF had been a place for Intel to make product announcements, but the company was challenged by the increasing difficulty of having products in place to announce every year and of presenting offerings that covered a wide range of areas – from datacenters to mobile devices – to a single audience.

However, losing IDF meant losing a massive annual opportunity for Intel to show off its capabilities. Bringing back a similar event will enable Intel to reclaim that center of gravity every year.

Development of the Ice Lake Xeon SP processors was well underway before Gelsinger took over the CEO spot – according to Intel, more than 200,000 units shipped for revenue in the first quarter – but they fit in well with the forceful message he is putting out there about Intel’s future.

During the virtual event, Navin Shenoy, executive vice president and general manager of Intel’s Data Platforms Group, said Intel knew that this family of chips needed to be flexible across a wide range of workloads that datacenters are dealing with.

“The datacenters of the future will look very different from how they do today,” Shenoy said while showing off a chip. “They’ll be more distributed in size and location, built on both public and private cloud computing. Storage and memory will be increasingly disaggregated, leveraging pools of connected infrastructure. Security will be architected in at the chip level. Flexibility will extend across hardware and software, and applications and services will deploy in smaller units called microservices for faster development time. … CPUs and XPUs will work closely together to solve increasingly complex problems across these distributed environments.”

Over the last several years, Intel has responded by developing not only new CPUs but also other components, from GPUs, FPGAs, and AI processors to silicon photonics and software. By the numbers, the new Ice Lake Xeon SPs can run in systems with one to eight sockets, offer up to 40 cores per processor, and deliver up to 2.65 times the average performance of a five-year-old system and a 15 percent generation-to-generation performance gain. Each socket supports up to 6 TB of mixed DDR and Optane memory, up to eight channels of memory, and as many as 64 lanes of PCI-Express 4.0 for peripherals.
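As a rough way to see that topology from software, the socket and logical CPU counts show up directly in Linux’s standard sysfs files. The short C sketch below is purely illustrative – it is not Intel tooling, and the paths it reads are generic Linux interfaces rather than anything specific to Ice Lake – and it simply counts online logical CPUs and distinct physical packages (sockets) on a running host.

/* Illustrative sketch: count online logical CPUs and distinct physical
 * packages (sockets) using standard Linux sysfs topology files. */
#include <stdio.h>
#include <unistd.h>

int main(void) {
    long ncpus = sysconf(_SC_NPROCESSORS_ONLN);   /* online logical CPUs */
    int seen[64] = {0};                           /* package IDs already counted */
    int sockets = 0;

    for (long cpu = 0; cpu < ncpus; cpu++) {
        char path[128];
        snprintf(path, sizeof(path),
                 "/sys/devices/system/cpu/cpu%ld/topology/physical_package_id", cpu);
        FILE *f = fopen(path, "r");
        if (!f)
            continue;
        int pkg = -1;
        if (fscanf(f, "%d", &pkg) == 1 && pkg >= 0 && pkg < 64 && !seen[pkg]) {
            seen[pkg] = 1;
            sockets++;
        }
        fclose(f);
    }

    printf("online logical CPUs: %ld, sockets: %d\n", ncpus, sockets);
    return 0;
}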

There is built-in AI acceleration and software optimizations that provide a 74 percent AI performance improvement over “Cascade Lake” Xeon SPs, along with Intel’s SGX, a set of instructions that improves the security of application code and data. SGX has been used in Intel’s Xeon E processors, but it now is making the jump to the Xeon SP line. Intel’s Crypto Acceleration drives the performance of compute-intensive cryptographic algorithms for organizations like online retailers that process high volumes of transactions a day or healthcare firms that need to protect patient data.
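For readers who want to verify that a given host exposes these features, both SGX and the AVX-512 VNNI instructions that underpin DL Boost are reported through the x86 CPUID feature leaves. The C sketch below is an illustrative check, not Intel sample code; it assumes GCC or Clang on an x86 machine and reads the documented feature bits (leaf 7, sub-leaf 0: SGX in EBX bit 2, AVX-512 VNNI in ECX bit 11).

/* Illustrative sketch: probe CPUID for SGX and AVX-512 VNNI (the
 * instruction set behind DL Boost's int8 acceleration). */
#include <cpuid.h>
#include <stdio.h>

int main(void) {
    unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;

    /* Leaf 7, sub-leaf 0 holds the structured extended feature flags. */
    if (!__get_cpuid_count(7, 0, &eax, &ebx, &ecx, &edx)) {
        fprintf(stderr, "CPUID leaf 7 not supported on this CPU\n");
        return 1;
    }

    printf("SGX:          %s\n", (ebx & (1u << 2))  ? "yes" : "no");
    printf("AVX-512 VNNI: %s\n", (ecx & (1u << 11)) ? "yes" : "no");
    return 0;
}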

The datacenter platform that the Ice Lake Xeon SPs anchor includes the Optane persistent memory 200 series, the Optane P5800X solid-state drive (SSD), and D5-P5316 NAND SSDs, along with Intel’s Ethernet 800 Series network adapters and Agilex FPGAs.

Intel also put a focus on networking, given the need to move data and workloads easily and quickly between datacenters, multiple public clouds, and the edge. Intel’s N-SKUs are network-optimized offerings designed to support a range of network environments and enable the newest SP chips to deliver 62 percent more performance over the prior generation for such workloads as vRAN, network functions virtualization infrastructure (NFVI), and virtual content delivery networks (CDNs).

“Having the networking capabilities to quickly and efficiently move the data at the edge and back to the datacenter is critical,” said Lisa Spelman, vice president and general manager of Intel’s Xeon products, noting telecommunications service providers’ need to increase network capacity and throughput to manage the growing demand on their networks. “The ability to achieve this is built upon a virtualization foundation, an area of Intel expertise. Our network-optimized Xeon N-SKUs offer lower latency, higher throughput, and deterministic performance with a variety of core counts and power envelopes on a range of broadly deployed network and 5G workloads.”
