Intel’s Mix and Match Innovation (Infographic)

Today, at the annual Hot Chips Conference in Cupertino, California, Intel presented details about the company’s EMIB (Embedded Multi-die Interconnect Bridge) packaging technology. Developed by Intel, EMIB facilitates high-speed communication between multiple die in-package, and is a key component of Intel’s mix-and-match heterogeneous computing strategy. EMIB is used in Intel® Stratix® 10 FPGAs and 8th Gen Intel® Core™ processors with Radeon Graphics.

An infographic compares monolithic and heterogeneous chip design approaches.

The post Intel’s Mix and Match Innovation (Infographic) appeared first on Intel Newsroom.

Intel Hosts NASA Frontier Development Lab Demo Day for 2018 Research Presentations

A NASA Frontier Development Lab photo shows a crater on the moon. (Credit: NASA Frontier Development Lab)

What’s New: Today Intel hosts the NASA Frontier Development Lab (FDL)* Event Horizon 2018: AI+Space Challenge Review Event (view live webcast) in Santa Clara, California. It concludes an eight-week research program that applies artificial intelligence (AI) technologies to challenges faced in space exploration. For the third year, NASA Ames Research Center*, the SETI Institute* and participating program partners have supported ongoing research into interdisciplinary AI approaches that pair the latest hardware technology with advanced machine learning tools.

“Artificial intelligence is expected to significantly influence the future of the space industry and power the solutions that create, use and analyze the massive amounts of data generated. The NASA FDL summer program represents an incredible opportunity to take AI applications and implement them across different challenges facing space science and exploration today.”
– Amir Khosrowshahi, vice president and chief technology officer, Artificial Intelligence Products Group, Intel

Why It’s Important: Through its work with FDL, Intel is addressing critical knowledge gaps by using AI to further space exploration and solve problems that can affect life on Earth.

New tools in artificial intelligence are already enabling a new paradigm in robotics, data acquisition and analysis, while also lowering the barriers to entry for scientific discovery. FDL research program participants have applied AI to predict solar activity, map lunar poles, build 3D shape models of potentially hazardous asteroids, discover uncategorized meteor showers and determine the efficacy of asteroid mitigation strategies.

“This is an exciting time for space science. We have this wonderful new toolbox of AI technologies that allow us to not only optimize and automate, but better predict space phenomena – and ultimately derive a better understanding,” said James Parr, FDL director.

The Challenge: Since 2017, Intel has been a key partner to FDL, contributing computing resources as well as AI and data science mentorship. Intel sponsored two Space Resources teams, which used the Intel® Xeon® platform for training and inference and drew on the knowledge of Intel principal engineers:

  • Space Resources Team 1: Autonomous route planning and cooperative platforms – coordinating routes between a group of lunar rovers and a base station so the rovers can cooperate autonomously to complete a mission.
  • Space Resources Team 2: Localization – merging orbital maps with surface-perspective imagery so NASA engineers can locate a rover on the lunar surface using imagery alone, which is necessary since there is no GPS in space. A rover using the team’s algorithm will be able to locate itself precisely by uploading a 360-degree view of its surroundings as four images; a toy sketch of the idea follows this list.
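
The localization concept lends itself to a compact illustration. The sketch below is not the FDL team’s algorithm – the grid search, the correlation score and the `render_view` helper are all stand-ins for a real feature-matching or learned pipeline – but it shows the core idea of scoring a four-image panorama against views synthesized from an orbital map:

```python
# Toy map-matching localization, illustrating the concept only -- not the
# FDL team's algorithm. A rover's four images (N/E/S/W views) are compared
# against views synthesized from an orbital map at each candidate position;
# the best-scoring cell is the estimated location. `render_view` is a
# hypothetical helper standing in for a real orbital-map renderer.
import numpy as np

GRID_SIZE = 64  # candidate positions per map axis


def normalized_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Zero-mean normalized cross-correlation between two images."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())


def localize(panorama, render_view):
    """Return the (x, y) grid cell whose synthesized views best match
    the rover's 360-degree panorama (four images, 90 degrees apart)."""
    best_score, best_cell = -np.inf, (0, 0)
    for x in range(GRID_SIZE):
        for y in range(GRID_SIZE):
            score = sum(
                normalized_correlation(img, render_view(x, y, 90 * i))
                for i, img in enumerate(panorama))
            if score > best_score:
                best_score, best_cell = score, (x, y)
    return best_cell
```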

More Challenges: Additional challenges presented during the event include:

  • Astrobiology Challenge 1: Understanding What is Universally Possible for Life
  • Astrobiology Challenge 2: From Biohints to Evidence of Life: Possible Metabolisms within Extraterrestrial Environmental Substrates
  • Exoplanet Challenge: Increase the Efficacy and Yield of Exoplanet Detection from TESS and Codify the Process of AI-Derived Discovery
  • Space Weather Challenge 1: Improve Ionospheric Models Using Global Navigation Satellite System Signal Data
  • Space Weather Challenge 2: Predicting Solar Spectral Irradiance from SDO/AIA Observations

What’s Next: At the conclusion of the projects, FDL will open-source research and algorithms, allowing the AI and space communities to leverage work from the eight teams in future space missions.

The post Intel Hosts NASA Frontier Development Lab Demo Day for 2018 Research Presentations appeared first on Intel Newsroom.

Sensors – the Eyes and Ears of Autonomous Vehicles

One of the most basic – and challenging – building blocks of an autonomous vehicle system is the ability to detect and classify objects. A vehicle must be able to accurately assess its surroundings before safely adjusting to traffic, roadway regulations or obstacles.

The high-accuracy sensors inside advanced driver assistance systems (ADAS) are saving lives on the road today. These systems combine a suite of camera, lidar, radar, computing and mapping technologies, explained in the accompanying graphic.

More: Autonomous Driving at Intel (Press Kit) | Experience Counts, Particularly in Safety-Critical Areas (Amnon Shashua Editorial) | Intel and Mobileye Autonomous Test Car Drives in Jerusalem (360-degree view video) | Download a static version of this graphic (PDF)

The post Sensors – the Eyes and Ears of Autonomous Vehicles appeared first on Intel Newsroom.

Intel Introduces New NUC Kits and NUC Mini PCs to the Intel NUC Family


» Download all images (ZIP, 219 KB)

What’s New: Today, Intel adds new NUC kits and NUC mini PCs featuring 8th Gen Intel® Core™ processors to the Intel NUC family, providing greater choice for a wide range of mainstream computing needs.

“Intel NUCs are mini PCs that offer high-performance capabilities in a space-saving design and are perfectly suited for home theater, home office, entry-level gaming or as a replacement for desktops when space is a concern. These new NUCs offer a number of new options that will fit a wide range of computing needs.”
–John Deatherage, marketing director for Intel NUCs

What It Is: The new Intel NUCs bring desktop performance in a compact form factor for enthusiasts of all levels to use anywhere.

The new Intel NUC kits (NUC8i7BEH, NUC8i5BEH, NUC8i5BEK, NUC8i3BEH, NUC8i3BEK, formerly code-named Bean Canyon) are based on the 8th Gen Intel Core i7, i5 and i3 processors (formerly code-named Coffee Lake-U) featuring Intel® Iris® graphics with eDRAM, which can power home theater systems, drive content creator boxes and serve as a personal voice assistant. The new Intel NUC kits allow integrators and DIYers to customize with their choice of storage, memory and operating system. With this flexibility, the Intel NUC kits offer a range of price/performance options to meet most mainstream users’ needs in an ultra-small form factor.

Designed with the right balance of performance and affordability, the new Intel NUC mini PCs (NUC8i3CYSM, NUC8i3CYSN, formerly code-named Crimson Canyon) are an affordable mainstream gaming option for playing some of today’s most popular games at 1080p, including “League of Legends”*, “TF2”* and “CS:GO”*. These NUCs are powered by the 8th Gen Intel Core i3-8121U processor (formerly code-named Cannon Lake) and are the first mainstream NUCs to feature discrete graphics. They come fully configured with 1TB of storage, either 8GB or 4GB of memory, and Windows® 10 Home. Each unit includes Intel’s Wireless-AC 9560 CNVi 802.11ac Wi-Fi + Bluetooth 5 solution, two HDMI 2.0a outputs and four USB 3.0 ports – all in a form factor that fits in the palm of your hand and can easily be hidden behind a monitor or mounted under a desk to save space.

Why It’s Important: These new models give home theater builders, home office users, entry-level gamers and anyone replacing a space-constrained desktop more options to match a mini PC to their computing needs.

How You Get It: The new Intel NUC kits and Intel NUC mini PCs will be available worldwide through Intel distributors and through online retailers beginning in September.

More Context: For information on all Intel NUC kits, visit the NUC Kits page on Intel.com. For information on all fully configured Intel NUC mini PCs, visit the NUC Mini PCs page on Intel.com.

The post Intel Introduces New NUC Kits and NUC Mini PCs to the Intel NUC Family appeared first on Intel Newsroom.

Intel Advances the Safe Integration of Drones into US Airspace


» Download all images (ZIP, 3 MB)

What’s New: Intel is working with the Federal Aviation Administration (FAA) and other industry participants to foster innovation and to shape the global standards and practices for unmanned aircraft systems (UAS), with safety as the first priority. This assistance builds on the company’s foundational work last year with the Unmanned Aircraft System Traffic Management (UTM) trials conducted by NASA and the FAA to develop and test guidelines for collaborative communication and navigation among unmanned aircraft in the sky.

“I’m honored that Intel’s Drone Group is participating in such critical programs to pave the way for new and expanded commercial UAS operations. By working with the U.S. government, as well as various other industry partners, we can demonstrate the magnitude of a drone’s potential when integrated into our nation’s airspace in a responsible way.”
–Anil Nanduri, vice president and general manager, Intel drone team

Why It Matters: The White House tasked the U.S. Department of Transportation and the FAA with maintaining and building on U.S. leadership in the drone space. New programs would facilitate advanced commercial drone operations and applications of the technology, and would allow testing of UAS traffic management systems as well as detection and tracking capabilities. This work is necessary to fully integrate drone operations into the national airspace system.

Recently, the FAA initiated the Unmanned Aircraft System Integration Pilot Program (IPP), designed to explore ways to safely expand cutting-edge drone operations into the national airspace by pairing state, local and tribal governments with unmanned aircraft operators. The program consists of 10 teams in locations throughout the country, whose work will test advanced drone operations and related technology over several years. Intel is a participant at four of the 10 sites and may take part in operations in the Choctaw Nation of Oklahoma, Durant, Oklahoma; the city of San Diego; the Innovation and Entrepreneurship Investment Authority, Herndon, Virginia; and the Memphis-Shelby County Airport Authority in Tennessee.

Today’s Activities: Today in Oklahoma, the Choctaw Nation hosted a media event to show progress and share results from some of the first missions. Intel flew night missions using a thermal sensor on the Intel® Falcon™ 8+ drone – an application that could be used to look for lost cattle, as well as to learn more about the habits and tendencies of local wildlife. In addition, Intel performed the first public demonstration of Open Drone ID, an open standard for the remote identification and tracking of UAS. Future missions at Choctaw may include drones for agricultural applications, public safety and infrastructure inspections, with planned beyond-visual-line-of-sight operations over people and more nighttime operations. Plans also include investment in mobile ground-based detect-and-avoid radar and advanced weather infrastructure.

What’s Come Before: The FAA chartered the Unmanned Aircraft Systems Identification and Tracking Aviation Rulemaking Committee (ARC) to identify, categorize and recommend available and emerging technology for the remote identification and tracking of UAS. Open Drone ID is designed as an open standard that offers such a solution. It is a beacon-based (wireless drone identification) approach that allows drones to be identified whenever they are within range of a receiver, such as a smartphone. The current draft specification is based on Bluetooth 4.2 broadcast packets and Bluetooth 5 (long-range) advertising extensions. With this technology, each aircraft can broadcast its unique ID, location, direction, altitude, speed, make/model, base location and other related data.
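
To make the beacon idea concrete, here is a minimal sketch of packing the kind of data listed above into a fixed-size broadcast frame. It mimics the shape of a Bluetooth advertising payload but does not reproduce the draft specification’s actual message formats or field encodings; the message type, field widths and scaling factors are illustrative assumptions.

```python
# Illustrative packing of the kind of data an Open Drone ID beacon
# broadcasts (ID, location, heading, altitude, speed). This mimics the
# shape of a Bluetooth advertising payload but does NOT reproduce the
# draft specification's actual message formats or field encodings.
import struct

def pack_location_frame(drone_id: bytes, lat_deg: float, lon_deg: float,
                        alt_m: float, heading_deg: float,
                        speed_mps: float) -> bytes:
    """Encode one hypothetical 'location' broadcast frame.

    Lat/lon are stored as signed 1e-7-degree integers (a common compact
    encoding for GNSS fixes); the remaining fields are scaled 16-bit values.
    """
    assert len(drone_id) == 20, "fixed-width ID field"
    return struct.pack(
        "<B20sii3H",
        0x01,                      # hypothetical message type: location
        drone_id,                  # unique ID, padded to 20 bytes
        int(lat_deg * 1e7),        # latitude,  1e-7 degree
        int(lon_deg * 1e7),        # longitude, 1e-7 degree
        int(alt_m * 10),           # altitude,  0.1 m (non-negative here)
        int(heading_deg * 100),    # heading,   0.01 degree
        int(speed_mps * 100),      # speed,     0.01 m/s
    )

# 35 bytes: too big for a legacy Bluetooth 4.2 advertisement (~31 bytes of
# advertising data) -- one reason the draft also targets Bluetooth 5
# extended advertising for richer broadcasts.
frame = pack_location_frame(b"INTEL-FALCON-8PLUS__", 34.0007, -96.3711,
                            120.0, 270.0, 12.5)
print(len(frame), "bytes:", frame.hex())
```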

The Open Drone ID project is managed through a workgroup within ASTM, an international standards body; Intel is leading the ASTM F38 Remote ID Standard and Tracking Workgroup. It is important that Open Drone ID become a global standard, like Wi-Fi or Bluetooth, to provide broad scalability to many end users and use cases. More information can be found at the Open Drone ID website.

Intel’s Role: Intel has a history of participating in standards bodies and industry groups worldwide and has played a significant industry leadership role in bringing about globally adopted standards such as Ethernet, USB and Wi-Fi. Standards developed by standards-setting organizations and industry consortia are common tools to bring new innovations to global mass markets.

More Context: Drones at Intel

The post Intel Advances the Safe Integration of Drones into US Airspace appeared first on Intel Newsroom.

Intel and Philips Accelerate Deep Learning Inference on CPUs in Key Medical Imaging Uses

What’s New: Using Intel® Xeon® Scalable processors and the OpenVINO™ toolkit, Intel and Philips* tested two healthcare use cases for deep learning inference models: one on X-rays of bones for bone-age-prediction modeling, the other on CT scans of lungs for lung segmentation. In these tests, Intel and Philips achieved a speed improvement of 188 times for the bone-age-prediction model, and a 38 times speed improvement for the lung-segmentation model over the baseline measurements.

“Intel Xeon Scalable processors appear to be the right solution for this type of AI workload. Our customers can use their existing hardware to its maximum potential, while still aiming to achieve quality output resolution at exceptional speeds.”
–Vijayananda J., chief architect and fellow, Data Science and AI at Philips HealthSuite Insights

Why It’s Important: Until recently, there was one prominent hardware solution for accelerating deep learning: graphics processing units (GPUs). By design, GPUs work well with images, but they also have inherent memory constraints that data scientists have had to work around when building some models.

Central processing units (CPUs) – in this case Intel Xeon Scalable processors – don’t have those same memory constraints and can accelerate complex, hybrid workloads, including larger, memory-intensive models typically found in medical imaging. For a large subset of artificial intelligence (AI) workloads, Intel Xeon Scalable processors can better meet data scientists’ needs than GPU-based systems. As Philips found in the two recent tests, this enables the company to offer AI solutions at lower cost to its customers.

Why It Matters: AI techniques such as object detection and segmentation can help radiologists identify issues faster and more accurately, which can translate to better prioritization of cases, better outcomes for more patients and reduced costs for hospitals.

Deep learning inference applications typically process workloads in small batches or in a streaming manner, so they do not benefit from large batch sizes. CPUs are a great fit for such low-batch or streaming applications. In particular, Intel Xeon Scalable processors offer an affordable, flexible platform for AI models – particularly in conjunction with tools like the OpenVINO toolkit, which can help deploy pre-trained models efficiently without sacrificing accuracy.
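
As a rough illustration of that deployment flow, the sketch below runs a pre-trained model on the CPU through OpenVINO’s Python API. The model file name and input shape are placeholders, and API details vary across toolkit releases; this follows the current `openvino` package rather than the 2018-era Inference Engine API.

```python
# Minimal sketch of CPU inference with the OpenVINO toolkit's Python API.
# The model file and input shape are placeholders; exact calls vary across
# toolkit releases (this follows the current `openvino` package).
import numpy as np
import openvino as ov

core = ov.Core()
# Load a pre-trained network converted to OpenVINO IR format (.xml + .bin).
model = core.read_model("lung_segmentation.xml")   # hypothetical model file
compiled = core.compile_model(model, device_name="CPU")

# A dummy single-channel CT slice; real code would feed preprocessed scans.
batch = np.random.rand(1, 1, 512, 512).astype(np.float32)
result = compiled([batch])[compiled.output(0)]
print("output shape:", result.shape)
```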

These tests show that healthcare organizations can implement AI workloads without expensive hardware investments.

What the Results Show: The results for both use cases surpassed expectations. The bone-age-prediction model went from an initial baseline test result of 1.42 images per second to a final tested rate of 267.1 images per second after optimizations – an increase of 188 times. The lung-segmentation model far surpassed the target of 15 images per second by improving from a baseline of 1.9 images per second to 71.7 images per second after optimizations.

What’s Next: Running healthcare deep learning workloads on CPU-based devices offers direct benefits to companies like Philips, because it allows them to offer AI-based services that don’t drive up costs for their end customers. As shown in this test, companies like Philips can offer AI algorithms for download through an online store as a way to increase revenue and differentiate themselves from growing competition.

More Context: Multiple trends are contributing to this shift:

  • As medical image resolution improves, medical image file sizes are growing – many images are 1GB or greater.
  • More healthcare organizations are using deep learning inference to more quickly and accurately review patient images.
  • Organizations are looking for ways to do this without buying expensive new infrastructure.

The Philips tests are just one example of these trends in action. Novartis* is another. And many other Intel customers – not yet publicly announced – are achieving similar results. Learn more about Intel AI technology in healthcare at “Advancing Data-Driven Healthcare Solutions.”

The post Intel and Philips Accelerate Deep Learning Inference on CPUs in Key Medical Imaging Uses appeared first on Intel Newsroom.

Protecting Our Customers through the Lifecycle of Security Threats

By Leslie Culbertson

Intel’s Product Assurance and Security (IPAS) team is focused on the cybersecurity landscape and constantly working to protect our customers. Recent initiatives include the expansion of our Bug Bounty program and increased partnerships with the research community, together with ongoing internal security testing and review of our products. We are diligent in these efforts because we recognize bad actors continuously pursue increasingly sophisticated attacks, and it will take all of us working together to deliver solutions.

Today, Intel and our industry partners are sharing more details and mitigation information about a recently identified speculative execution side-channel method called L1 Terminal Fault (L1TF). This method affects select microprocessor products supporting Intel® Software Guard Extensions (Intel® SGX) and was first reported to us by researchers at KU Leuven University*, Technion – Israel Institute of Technology*, University of Michigan*, University of Adelaide* and Data61*1. Further research by our security team identified two related applications of L1TF with the potential to impact other microprocessors, operating systems and virtualization software.

More: Security Exploits and Intel Products (Press Kit) | Security Research Findings (Intel.com)

I will address the mitigation question right up front: Microcode updates (MCUs) we released earlier this year are an important component of the mitigation strategy for all three applications of L1TF. When coupled with corresponding updates to operating system and hypervisor software released starting today by our industry partners and the open source community, these updates help ensure that consumers, IT professionals and cloud service providers have access to the protections they need.

L1TF is also addressed by changes we are already making at the hardware level. As we announced in March, these changes begin with our next-generation Intel® Xeon® Scalable processors (code-named Cascade Lake), as well as new client processors expected to launch later this year.

We are not aware of reports that any of these methods have been used in real-world exploits, but this further underscores the need for everyone to adhere to security best practices. This includes keeping systems up-to-date and taking steps to prevent malware. More information on security best practices is available on the Homeland Security website.

About L1 Terminal Fault

All three applications of L1TF are speculative execution side channel cache timing vulnerabilities. In this regard, they are similar to previously reported variants. These particular methods target access to the L1 data cache, a small pool of memory within each processor core that holds copies of the data the core is most likely to need next.

The microcode updates we released earlier this year provide a way for system software to clear this shared cache. Given the complexity, we created a short video to help explain L1TF.

Once systems are updated, we expect the risk to consumer and enterprise users running non-virtualized operating systems will be low. This includes most of the data center installed base and the vast majority of PC clients. In these cases, we haven’t seen any meaningful performance impact from the above mitigations based on the benchmarks we’ve run on our test systems.

There is a portion of the market – specifically a subset of those running traditional virtualization technology, and primarily in the data center – where it may be advisable that customers or partners take additional steps to protect their systems. This is principally to safeguard against situations where the IT administrator or cloud provider cannot guarantee that all virtualized operating systems have been updated. These actions may include enabling specific hypervisor core scheduling features or choosing not to use hyper-threading in some specific scenarios. While these additional steps might be applicable to a relatively small portion of the market, we think it’s important to provide solutions for all our customers.
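
For Linux systems, one quick way to see where a given machine stands is the kernel’s vulnerability reporting in sysfs, which was extended for L1TF alongside these mitigations. A small check (Linux-only; the file is absent on kernels that predate L1TF reporting):

```python
# Query the Linux kernel's L1TF mitigation report. Kernels released with
# the coordinated disclosure (4.19, plus distribution backports) expose
# this sysfs file; on older kernels it simply does not exist.
from pathlib import Path

def l1tf_status() -> str:
    path = Path("/sys/devices/system/cpu/vulnerabilities/l1tf")
    try:
        return path.read_text().strip()
    except FileNotFoundError:
        return "kernel predates L1TF reporting"

print("L1TF:", l1tf_status())
# A mitigated host typically reports something like:
#   Mitigation: PTE Inversion; VMX: conditional cache flushes, SMT vulnerable
```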

For these specific cases, performance or resource utilization on some specific workloads may be affected and varies accordingly. We and our industry partners are working on several solutions to address this impact so that customers can choose the best option for their needs. As part of this, we have developed a method to detect L1TF-based exploits during system operation, applying mitigation only when necessary. We have provided pre-release microcode with this capability to some of our partners for evaluation, and hope to expand this offering over time.

For more information on L1TF, including detailed guidance for IT professionals, please visit the advisory on the security center. We’ve also provided a white paper and updated the FAQs on our security first website.

I’d like to again thank our industry partners and the researchers who first reported these issues for their collaboration and collective commitment to coordinated disclosure. Intel is committed to the security assurance of our products, and will continue to provide regular updates on issues as we identify and mitigate them.

As always, we continue to encourage everyone to take advantage of the latest security protections by keeping your systems up-to-date.

Leslie Culbertson is executive vice president and general manager of Product Assurance and Security at Intel Corporation.

1Raoul Strackx, Jo Van Bulck, Marina Minkin, Ofir Weisse, Daniel Genkin, Baris Kasikci, Frank Piessens, Mark Silberstein, Thomas F. Wenisch, and Yuval Yarom

The post Protecting Our Customers through the Lifecycle of Security Threats appeared first on Intel Newsroom.

Intel Artificial Intelligence Helps Bring ‘The Meg’ Mega Shark to the Big Screen


What’s New: Last week, Warner Bros. Pictures* and Gravity Pictures* released “The Meg*,” a science fiction action thriller starring a prehistoric, 75-foot-long shark known as the Megalodon. Powered by Intel artificial intelligence (AI) hardware and built by Scanline VFX* using the Ziva VFX* software, the Megalodon was brought to the screen by VFX animators in record time and with lifelike accuracy – from the way the shark moves in the water to its muscles and skin – delivering a jaw-dropping experience to movie audiences around the world.

“At Intel, we strive every day to make the amazing possible, and it’s exciting to see our Intel® Xeon® Scalable processors used to bring the film’s Megalodon shark to life on the big screen.”
– Julie Choi, head of AI Marketing, Intel

Why It’s Important: AI technology allows filmmakers to create incredibly detailed and lifelike graphics while saving time throughout creative iterations, which together elevate the art of movie creation and enhance the audience experience.

Re-creating a prehistoric, 75-foot-long shark in the water for the big screen is not an easy task. In addition to bringing the Megalodon itself to life, Scanline and Ziva needed to make its movements through the ocean – a fluid, constantly moving background – look realistic. They achieved this by processing a number of physical simulations and then running the simulated shark through all of the movements and poses needed for the film’s shots.

“At Ziva, we help creators make amazing creatures with the power of Intel AI. One of the great advantages to using Intel Xeon Scalable processors is that they allow us to generate amazing training data. When you want to train a machine learning process, it needs to know how something is going to behave in order to anticipate, or extrapolate, how it expects something to behave – in this case, the movement of the shark itself. Intel Xeon technology helped the film’s creators do that quickly and efficiently and in the most realistic way possible,” said James Jacobs, CEO, Ziva VFX.

What Powers the Technology: Intel Xeon Scalable processors power Ziva’s character-generating software and help accelerate Ziva’s physics engine – an AI algorithm that automates movement for generated creatures, including the Megalodon from “The Meg.” Additionally, Scanline used powerful Intel Xeon processors to render the shots for the film, saving them valuable time while allowing them to create more shots and options.

“To create ‘The Meg,’ we needed a massive amount of performance in our computer system,” said Stephan Trojansky, president and VFX supervisor, Scanline. “Years ago, you would have needed a huge render farm and a large crew for a very small amount of footage – today, we can use 2,500 Intel Xeon processors with almost 100,000 cores that are used to compute all of the needs of the movie. This enables fast iterations and the ability to present multiple options to the director, which is critical in making the best possible visual effects.”

Where to See It: Warner Bros. Pictures and Gravity Pictures present a di Bonaventura/Apelles Entertainment Inc.*/Maeday Productions Inc.*/Flagship Entertainment Group* production, a film by Jon Turteltaub, “The Meg.” The film was released Aug. 10 in 2D and 3D in select theatres and IMAX. It will be distributed in China by Gravity Pictures, and throughout the rest of the world by Warner Bros. Pictures, a Warner Bros. Entertainment Company. “The Meg” has been rated PG-13.

More Context: Artificial Intelligence at Intel

The post Intel Artificial Intelligence Helps Bring ‘The Meg’ Mega Shark to the Big Screen appeared first on Intel Newsroom.

Redesigning the Solid State Drive – and the Data Center along with It

Wayne Allen leads data center storage pathfinding at Intel Corporation. Allen and his team brought the ruler, or EDSFF, solid-state drive to life, delivering massive improvements in density, cooling, and space and power efficiency. (Credit: Walden Kirsch/Intel Corporation)

How he’d describe his work to a 10-year-old: “I make it so you can save endless selfies.”

Flash to the future: While your smartphone and laptop have gotten substantially sleeker and slimmer with more compact electronics inside, data center equipment evolves more slowly. Occasionally, however, big jumps happen. One began a couple of years ago, when Wayne and his team — which he says “dreams up what’s needed in the data center of the future” — set out to “go figure out the best way to deploy flash in a server.” Flash is a kind of chip that stores data, replacing the last-century spinning hard disk drive. Flash is faster and more reliable, and most important, it’s smaller. But the size and shape of most drives are based on those old spinning disks, which have been “around for 30 years,” Wayne says.

More: Read about all Intel Innovators | World’s Densest, Totally Silent Solid State Drive (Intel Images)

Hello, ruler! Er, EDSFF: The team aimed to create something more compact, more efficient and easier to service and replace. They narrowed dozens of ideas down to three, and showed them to select customers for a vote. The winner? The aptly nicknamed “ruler.” The long version (now an industry specification called EDSFF, which includes two other form factors) is about 12 inches long, 1½ inches wide, and a measly one-third of an inch thick.

A seriously cool design: While it wasn’t easy, simply changing the size and shape of the drive had a surprising set of positive side effects. “We didn’t just improve density — we improved thermals,” Wayne notes. Thermals are a major factor in the operations and expense of a data center (cooling alone can be the biggest cost). And though these drives will spend their lives hidden from view, the design has earned awards from the Industrial Designers Society of America and the Core77 Design Awards.

Reshaping the drive, and the data center: Ruler-based servers can be designed to let fresh air pass directly to the processors in the back of the machine, improving the cooling efficiency and opening the door to even higher-performing processors. “A new form factor itself isn’t all that exciting, typically,” Wayne says. “But because [the ruler] impacts everything about server design and helps increase performance and reach new levels of density, it’s a big deal. We’re redesigning the data center with this — that’s the most fun part of it for me.”

Huge storage, tiny slot: The team wanted to achieve a petabyte of storage space — that’d be 4,000 256 GB smartphones — in a “1U” server slot that’s 1.75 inches high and 19 inches wide. Thanks to the ruler and new 3D NAND technology, 32 of Intel’s forthcoming 32-terabyte rulers will fit in that 1U space, and there’s your very slim, extremely efficient petabyte. Compared to a petabyte of hard drives, Wayne says, “we’ve delivered a 10x power reduction and a 20x space improvement. It’s pretty remarkable.”
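
The density arithmetic is easy to verify in a few lines:

```python
# The density arithmetic behind the petabyte-in-1U claim.
ruler_tb = 32        # capacity of each forthcoming "ruler" drive, in TB
rulers_per_1u = 32   # drives that fit side by side in one 1U slot
total_tb = ruler_tb * rulers_per_1u
print(total_tb, "TB ~=", total_tb / 1000, "PB")      # 1024 TB, ~1 PB

phone_gb = 256
print("equivalent 256GB smartphones:", total_tb * 1000 // phone_gb)  # 4000
```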


The post Redesigning the Solid State Drive – and the Data Center along with It appeared first on Intel Newsroom.

World’s Densest, Totally Silent Solid State Drive

Fast disappearing from data centers are power-hungry spinning hard disk drives that hum, buzz, run warm (or even hot), require fans and expensive cooling systems, and can crash unexpectedly.

Intel’s newest solid state drive, the Intel® SSD DC P4500, is about the size of an old-fashioned 12-inch ruler, and can store 32 terabytes. That’s equivalent to triple the entire printed collection of the U.S. Library of Congress.

The new SSD is Intel’s densest drive ever, and is built on Intel® 3D NAND technology, which stacks memory cells atop each other in multiple extremely thin layers, instead of just one. Memory cells in the P4500 are stacked 64 layers deep.

Older disk drives produce a great deal of heat. In most data centers today, the single biggest cost is air conditioning to keep them cool. This is one of the reasons some of the world’s biggest data companies — IBM, Microsoft, Tencent — are using the new “ruler” SSD to support their cloud and data center operations.

In data centers, the no-moving-parts ruler-shaped SSDs can be lined up 32 side-by-side, to hold up to a petabyte in a single server slot. Compared with a traditional SSD, the “ruler” requires half the airflow to keep cool. And compared with hard disk storage, the new 3D NAND SSD sips one-tenth the power and requires just one-twentieth the space.

More: Redesigning the Solid State Drive – and the Data Center along with It (Intel Innovators) | All Intel Images

The ruler-shaped Intel SSD DC P4500 can hold up to 32 terabytes. It draws just one-tenth the power of a traditional spinning hard drive. (Credit: Walden Kirsch/Intel Corporation)

The post World’s Densest, Totally Silent Solid State Drive appeared first on Intel Newsroom.