Intel Shows Collaborative Mini-Bots, 5G Innovations and Brain-Inspired Computers at ISSCC 2019


» Download all images (ZIP, 16 MB)

What’s New: This week, Intel is presenting a series of innovations that have the potential to enable real-time, low-energy computation for an increasingly connected and data-driven world – from 5G networks to intelligent edge systems and robotic systems. These innovations in integrated circuits and systems-on-chip will be presented during the International Solid-State Circuits Conference (ISSCC), the leading forum on advanced circuit research, in San Francisco from Feb. 17-21. Intel will present 17 scientific papers and accompanying demonstrations that could have a transformative impact on a wide range of applications for the future of technology – including developments in 5G and memory.

“The research underway at Intel is varied in its focus but unified in a vision for the future of technology – one where anyone and everything can communicate with data. To achieve this vision, we recognize the need for computational systems capable of tackling problems conventional computers simply cannot handle, and – as we are showcasing this year at ISSCC – Intel is committed to furthering research and development of the technologies with the potential to carry us to that future.”
–Dr. Rich Uhlig, managing director, Intel Labs

RESEARCH PRESENTED THIS WEEK INCLUDES:

Distributed Autonomous and Collaborative Multi-Robot System Featuring a Low-Power Robot SoC in 22nm CMOS for Integrated Battery-Powered Minibots

Abstract: In this paper, Intel demonstrates a distributed, autonomous and collaborative multi-robot system featuring integrated, battery-powered, crawling and jumping minibots. For example, in a search and rescue application, four minibots collaboratively navigate and map an unknown area without a central server or human intervention, detecting obstacles and finding paths around them, avoiding collisions, communicating among themselves, and delivering messages to a base station when a human is detected.

Each minibot platform integrates: (i) a camera, LIDAR and audio sensors for real-time perception and navigation; (ii) a low-power custom robot SoC for sensor data fusion, localization and mapping, multi-robot collaborative intelligent decision-making, object detection and recognition, collision avoidance, path planning, and motion control; (iii) low-power ultra-wideband (UWB) radio for anchorless dynamic ranging and inter-robot information exchange; (iv) long-range radio (LoRa) for robot-to-base-station critical message delivery; (v) battery and PMIC for platform power delivery and management; (vi) 64MB pseudo-SRAM (PSRAM) and 1GB flash memory; and (vii) actuators for crawling and jumping motions.
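To make the collaborative behavior concrete, the following is a minimal, hypothetical sketch of the kind of perceive-share-decide loop each minibot might run; every class, field and message name here is an illustrative assumption, not Intel’s robot SoC firmware.

```python
# Hypothetical sketch of one minibot's collaborative loop (not Intel's firmware).
from dataclasses import dataclass, field

@dataclass
class Peer:
    robot_id: int
    position: tuple                               # last known (x, y) from UWB ranging

@dataclass
class Minibot:
    robot_id: int
    position: tuple = (0.0, 0.0)
    local_map: set = field(default_factory=set)   # explored grid cells
    peers: dict = field(default_factory=dict)     # robot_id -> Peer

    def step(self, sensors):
        """One control-loop iteration: perceive, share, decide, act."""
        # (i) Perception: fuse camera/LIDAR/audio into obstacle and human detections.
        obstacles = sensors.get("obstacles", [])
        human_seen = sensors.get("human", False)

        # (ii) Mapping: mark the current cell as explored.
        self.local_map.add(self.cell(self.position))

        # (iii) Collaboration: merge peer positions shared over UWB (simulated here).
        for peer in self.peers.values():
            self.local_map.add(self.cell(peer.position))

        # (iv) Critical messaging: relay a human detection to the base station over LoRa.
        if human_seen:
            return ("lora_report", {"robot": self.robot_id, "pos": self.position})

        # (v) Planning: move toward an unexplored, unobstructed neighboring cell.
        for target in self.neighbors(self.position):
            if target in obstacles or self.cell(target) in self.local_map:
                continue
            return ("move", target)
        return ("idle", None)

    @staticmethod
    def cell(pos, size=0.5):
        return (round(pos[0] / size), round(pos[1] / size))

    @staticmethod
    def neighbors(pos, step=0.5):
        x, y = pos
        return [(x + step, y), (x - step, y), (x, y + step), (x, y - step)]

if __name__ == "__main__":
    bot = Minibot(robot_id=1, peers={2: Peer(2, (1.0, 0.0))})
    print(bot.step({"obstacles": [], "human": False}))   # -> ('move', ...)
    print(bot.step({"human": True}))                      # -> ('lora_report', ...)
```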

Why It Matters: Multi-robot systems, working collectively to accomplish complex missions beyond the capability of a single robot, have the potential to disrupt a wide range of applications, from search and rescue missions to precision agriculture and farming. Multi-bot systems can dramatically shorten the time needed to complete a single task, for example by reducing response time and latency for first responders during an emergency. However, advanced robotics and artificial intelligence have, to date, required large investments and intensive computational power. The development of these distributed, autonomous and collaborative minibots, operated by a system-on-chip that delivers efficiencies orders of magnitude beyond what was previously possible, represents a first step toward energy- and cost-efficient multi-robot systems.

5G Wireless Communication: An Inflection Point

Abstract: The 5G era is upon us, ushering in new opportunities for technology innovation across the computing and connectivity landscape. 5G presents an inflection point where wireless communication technology is driven by application and expected use cases, and where the network will set the stage for data-rich services and sophisticated cloud apps, delivered faster and with lower latency. This paper will highlight the disruptive architectures and technology innovations required to make 5G and beyond a reality.

Why It Matters: Whereas 4G was about moving data faster, 5G will bring more powerful wireless networks that connect “things” to each other, to people and to the cloud. The 5G network will set the stage for data-rich services and sophisticated cloud apps, delivered faster and with lower latency than ever. It will transform our lives by helping deliver a smart and connected society with smart cities, self-driving cars and new industrial efficiencies. For this to happen, networks must become faster, smarter and more agile to handle the unprecedented increase in volume and complexity of data traffic as more devices become connected and new digital services are offered.


» Download video: “Intel Demonstrates Multi-Robot System (B-Roll)”

Applying Principles of Neural Computation for Efficient Learning in Silicon

Abstract: Intel’s novel Loihi processor implements a microcode-programmable learning architecture supporting a wide range of neuroplasticity mechanisms under study at the forefront of computational neuroscience. By applying many of the fundamental principles of neural computation found in nature, Loihi promises to provide highly efficient and scalable learning performance for supervised, unsupervised, reinforcement-based and one-shot paradigms. This talk describes these principles as applied to the Loihi architecture and shares our preliminary results toward the vision of low-power, real-time on-chip learning.

Why It Matters: Deep learning algorithms mainly used today in machine learning (ML) applications are very costly in terms of energy consumption, due to their large amount of required computations and large model sizes. Many issues, such as connectivity to the cloud, latency, privacy and public safety, could be resolved by establishing intelligent computing at the edge. By applying principles of neural computation to architecture, circuit and integrated design solutions, we could minimize the energy consumption and computational demand of edge learning systems.
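As a rough, generic illustration of the neural-computation principles described above (and not of Loihi’s actual microcode-programmable learning engine), the sketch below simulates a leaky integrate-and-fire neuron whose synaptic weight is updated by a simple, local, spike-timing-based rule.

```python
# Toy leaky integrate-and-fire (LIF) neuron with a simple spike-trace weight update.
# Generic textbook model of event-driven, local learning; not Loihi's implementation.
import random

def simulate(steps=200, dt=1.0, tau=20.0, threshold=1.0, lr=0.01):
    weight = 0.5          # synaptic weight being learned
    v = 0.0               # membrane potential
    pre_trace = 0.0       # decaying trace of recent presynaptic spikes
    post_spikes = 0

    for _ in range(steps):
        pre_spike = random.random() < 0.1          # sparse, Poisson-like input spikes
        pre_trace = pre_trace * (1 - dt / tau) + (1.0 if pre_spike else 0.0)

        # Leaky integration of the weighted input current.
        v += dt * (-v / tau + weight * (1.0 if pre_spike else 0.0))

        if v >= threshold:                         # postsynaptic spike
            v = 0.0
            post_spikes += 1
            # Local, event-driven update: potentiate in proportion to the presynaptic
            # trace (a crude stand-in for spike-timing-dependent plasticity).
            weight = min(1.0, weight + lr * pre_trace)

    return weight, post_spikes

if __name__ == "__main__":
    random.seed(0)
    w, n = simulate()
    print(f"learned weight={w:.3f}, postsynaptic spikes={n}")
```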

Novel Memory/Storage Solutions for Memory-Centric Computing

Abstract: The exponential growth in connected devices and systems is generating a staggering amount of digital records. These records not only need to be stored but also need to be mined for useful information. This era of big data is driving fundamental changes in both memory and storage hierarchy. Data and compute need to be brought closer together to avoid networking and storage protocol inefficiencies. This drives the demand for larger memory capacity, which is currently hindered by memory subsystem cost. In addition, the need for memory persistency will not only streamline storage protocols but will also significantly reduce bring-up time after system failure. In this presentation, novel solutions for memory-centric architecture will be discussed, with a focus on their value, performance and power efficiency.

Why It Matters: Memory-centric computing has the potential to enable energy-efficient, high-performance AI/ML applications. With the explosive growth of memory-intensive workloads like machine learning, video capture/playback and language translation, there is tremendous interest in performing some compute near memory, by placing logic inside the DRAM/NVM main-memory die (aka near-memory compute), or even performing the compute within the memory array embedded within the compute die (aka in-memory compute). In either case, the motivation is to reduce the significant data movement between main/embedded memory and compute units, as well as to reduce latency by performing many operations in parallel, inside the array.
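The following toy model illustrates that data-movement argument by counting the bytes that cross the memory interface for a simple sum, first when all operands are shipped to the CPU and then when a hypothetical memory-side unit performs the reduction in place. The command size and the scheme itself are assumptions for illustration, not a description of any Intel product.

```python
# Toy accounting of data movement for summing N 4-byte elements.
# Illustrative only; the parameters below are assumptions, not product figures.

def conventional_sum_traffic(n_elements, element_bytes=4):
    """All operands travel from DRAM to the CPU; the scalar result travels back."""
    return n_elements * element_bytes + element_bytes

def near_memory_sum_traffic(n_elements, element_bytes=4):
    """A hypothetical memory-side unit reduces in place; only the offload command
    and the final scalar cross the memory interface."""
    command_bytes = 64                 # assumed size of one offloaded command
    return command_bytes + element_bytes

if __name__ == "__main__":
    n = 1_000_000
    host = conventional_sum_traffic(n)
    near = near_memory_sum_traffic(n)
    print(f"host-side sum moves   {host:>10,} bytes")
    print(f"near-memory sum moves {near:>10,} bytes ({host / near:,.0f}x less traffic)")
```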

More: Intel presentations at ISSCC | Intel Labs (Press Kit)


Arsenal FC, Liverpool FC and Manchester City Bring Immersive Experiences to Fans with Intel True View


LONDON, Feb. 7, 2019 – Imagine watching a season-defining moment, then reliving it from any angle including from the perspective of your favourite player as they run up to take a penalty or make an improbable goal-line clearance.

In partnership with Arsenal FC*, Liverpool FC* and Manchester City*, Intel will deliver immersive experiences via Intel® True View at Emirates Stadium, Anfield and the Etihad Stadium. Now their fans worldwide can enjoy the biggest moments of the match from every angle, whether they’re watching the Rights Holders’ live broadcast and highlights or reliving the action post-match from their favourite clubs’ official website, mobile app or social media.

As three of the most recognisable and innovative football clubs in the world, Arsenal FC, Liverpool FC and Premier League* champions Manchester City will leverage Intel True View to capture every match element from every angle. Intel True View re-creates the action on the pitch and presents it from an ideal vantage point or a player’s perspective, using Intel’s unmatched data-processing capability to deliver the experience to fans.

Beginning March 10, Intel’s leading volumetric technology will bring fans as close to the action on the pitch as the starting XI from the world’s most iconic football clubs. With dedicated supporters across the globe, football fans are as passionate about the tactics as they are about the goals scored, and Intel True View will deliver immersive experiences unique to the style and skill of football.

The partnerships will introduce features that include:

  • Multi-angle views of a play: Intel Sports’ industry-leading volumetric video process creates thrilling 360-degree replays and highlight reels from every conceivable angle, using 38 5K ultra-high-definition cameras.
  • Laser wall: A virtual plane that gives viewers a clear picture of where players are positioned on the pitch.
  • Be the player capabilities: Intel True View freezes a moment in the match to let fans see the pitch from the eyes of a player. This also enables presenters and pundits to share a new level of insight into the tactics and decisions made by players to provide an entirely new perspective to fans.

Intersection of Sports and Technology: The sports industry is undergoing a period of significant change as consumer behaviour shifts, driving technology companies, leagues and brands to address the expectations of fans. In 2018, technology investments in sports reached nearly $1 billion, continuing to drive the intersection of sports and technology. With smart and connected tools, Intel is uniquely positioned to enable the sports industry to capture, analyse and respond to new levels of insight in real time and create amazing new experiences for fans.

How Intel True View Works: The process begins with volumetric video, the capture and rendering technique behind Intel True View. Using the volumetric capture method, footage that includes height, width and depth data is recorded from 38 5K ultra-high-definition cameras to produce voxels (pixels with volume). After content is captured, a substantial amount of data is processed on servers powered by Intel® Core™ i7 and Intel® Xeon® processors. The software then re-creates all the viewpoints of a fully volumetric 3D person or object. That information is used to render a virtual environment in spectacular, multi-perspective 3D, enabling users to experience a captured scene from any angle and perspective with true six degrees of freedom.
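As a rough sketch of how multi-camera footage can be turned into voxels, the code below back-projects synthetic depth maps from a few calibrated cameras into a shared occupancy grid. It assumes each camera provides depth and a known pose; Intel’s production True View pipeline is far more sophisticated and is not described here.

```python
# Minimal voxel-occupancy sketch: back-project depth samples from several
# calibrated cameras into a shared 3D grid. Illustrative assumptions only.
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy, cam_to_world):
    """Convert a depth map (meters) into world-space 3D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    x = (u.ravel() - cx) * z / fx
    y = (v.ravel() - cy) * z / fy
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=0)     # homogeneous coords
    return (cam_to_world @ pts_cam)[:3].T                      # N x 3 world points

def voxelize(points, voxel_size=0.05):
    """Quantize points into voxel indices and count hits per voxel."""
    idx = np.floor(points / voxel_size).astype(np.int64)
    uniq, counts = np.unique(idx, axis=0, return_counts=True)
    return dict(zip(map(tuple, uniq), counts))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grid = {}
    for _ in range(3):                                 # pretend: 3 of the 38 cameras
        depth = 2.0 + 0.1 * rng.random((4, 4))         # tiny synthetic depth map
        pts = depth_to_points(depth, fx=500, fy=500, cx=2, cy=2, cam_to_world=np.eye(4))
        for voxel, n in voxelize(pts).items():
            grid[voxel] = grid.get(voxel, 0) + n
    print(f"{len(grid)} occupied voxels from 3 synthetic views")
```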

» Download team-specific demo videos:

James Carwana, vice president and general manager of Intel Sports, said: “Immersive media experiences continue to create more opportunities for sports teams and leagues to put the fan experience first. With the expansion of Intel True View into more stadiums with Arsenal, Liverpool and Manchester City, we have the chance to transform the experience for fans of one of the world’s top sports leagues with our leading and differentiated volumetric technology.”

Peter Silverstone, commercial director of Arsenal FC, said: “We are always looking to find new ways to bring our 780 million fans and followers around the world closer to the action and this partnership will give our fans a whole new view of the game. The technology effectively allows a supporter to step into the boots of players and see the game from their perspective.

“We have seen the impact this Intel technology has had in other sports leagues across the world and are excited that it will be installed at Emirates Stadium.

“At Arsenal we are committed to innovating and keeping at the forefront of developments on and off the pitch so it’s fitting that Emirates Stadium will be the first stadium to bring Intel’s immersive and transformational True View technology to the Premier League.”

Billy Hogan, managing director and chief commercial officer of Liverpool FC, said: “We’re delighted to be working with Intel to bring this advanced Intel True View technology to our supporters.

“Intel True View enables fans to immerse themselves even further into the game and has the power to add a new depth to match highlights, which can significantly improve the supporter experience. This technology has the potential to add a new dynamic to how people interact with the game and create different conversations with our fans around the world.

“We strive to utilise the latest technology to be at the forefront of the experience our supporters have, whether that’s on-screen or online, and with the help of our newest partner Intel, that is certainly set to continue.”

Damian Willoughby, senior vice president of Partnerships at City Football Group, said: “We’re very excited to integrate Intel True View at the Etihad Stadium and to announce our new partnership with a world-class brand like Intel. We love to be first, both on and off the pitch, so we are delighted to pioneer this game-changing technology at the Etihad Stadium. We are sure City fans, and football fans around the world, will love watching beautiful football from every angle.”


» Download all images (ZIP, 12 MB)


Intel Drone Light Show and Intel True View Technology Enhance Pepsi Super Bowl LIII Halftime Show and Super Bowl LIII Viewing Experience


» Download drone images (ZIP, 18 MB)
» Download true view images (ZIP, 11 MB)

News Highlights:

  • One hundred fifty enhanced Intel® Shooting Star™ drones took a live flight during the Pepsi* Super Bowl LIII Halftime Show to amplify Maroon 5’s performance.
  • Intel® True View™ captures volumetric content to let fans see key plays from multiple angles and players’ points of view.

ATLANTA, Feb. 3, 2019 – Today during the Super Bowl, Intel Corporation partnered with the NFL* to create the first-ever live drone light show during a Super Bowl Halftime Show. Intel and the NFL will also make advanced Intel® True View™ highlights available for fans to relive the most exciting moments of the biggest game of the year.

As Maroon 5 began the song “She Will Be Loved,” 150 enhanced Intel Shooting Star drones floated up and over the field in a choreographed performance set to the music, forming the words “ONE” and “LOVE.” Intel enhanced the Intel Shooting Star drones specifically for the Pepsi Super Bowl Halftime Show to emulate the experience of floating lanterns. The drones also successfully flew a pre-programmed path inside a closed stadium environment without GPS. Additionally, the 150 drones flown indoors exceeded the world record Intel set by flying 110 drones indoors at CES in 2018.

“Our team constantly looks for opportunities to push the boundaries of innovation and deliver stunning entertainment experiences with our drone technology,” said Anil Nanduri, Intel vice president and general manager of the Intel Drone Group. “When we received the opportunity to bring our drone light show technology back to the Super Bowl, we were excited by the challenge to execute it live and within a closed stadium environment. We collaborated with the show producers, both creatively and technically, to bring a special and unique show experience to the viewers. It was an honor to have performed with Maroon 5 to create a memorable experience for those watching live from their seats in the stadium and for viewers watching at home.”


» Download video: “Intel Drone Light Show and Intel True View at Super Bowl LIII (B-Roll)”

As an Official Technology Provider for the NFL, Intel installed Intel True View in 13 NFL stadiums, including Super Bowl LIII host Mercedes-Benz Stadium. Volumetric capture is enabling immersive experiences that bring the game to life from every angle, allowing fans to analyze key plays with multi-angle views, including through the eyes of the players. Using high-performance computing, Intel True View transforms massive amounts of volumetric video data captured from 38 5K ultra-high-definition cameras into immersive 3D replays of the game’s biggest moments. Intel True View content is accessible via NFL.com/trueview, the NFL app, the NFL channel on YouTube* and other endpoints across the NFL and participating teams.

“There is tremendous potential in what True View can do to bring our fans closer to the game than ever before,” said William Deng, vice president, Media Strategy and Business Development at the NFL. “We are thrilled to be partnered with Intel to advance this groundbreaking technology. It truly brings the fan into the action and gives them a perspective of the game that was never before possible.”

Drone light shows and Intel True View are part of Intel’s larger effort to deliver powerful, innovative technologies that enable the rich viewing and entertainment experiences of the future. For more information on Intel’s drone light show, visit Intel’s drone page. For more information on Intel True View, visit Intel’s True View page.

The Pepsi Super Bowl LIII Halftime Show was consistent with the temporary flight restrictions in effect during the game. The Intel Shooting Star drones appearing during the show were specially preprogrammed to fly and remain within the stadium, and therefore did not enter the controlled airspace over Mercedes-Benz Stadium.

The Intel Shooting Star drones appearing during the Pepsi Super Bowl LIII Halftime Show received authorization to operate under an experimental license issued by the Federal Communications Commission in compliance with federal regulations. This model of the Intel Shooting Star drone has not received final certification from the FCC and may not be offered for sale or lease, or sold or leased, until final certification is obtained.


Intel Manufacturing Images (B-roll, Photos)

B-roll video of Intel manufacturing facilities.

Manufacturing at D1D/D1X

From April 2017:

» Download video: “Manufacturing at Intel D1D/D1X (B-roll)”


» Download Set 1 images (ZIP, 325 MB)
» Download Set 2 images (ZIP, 387 MB)

More Intel Manufacturing

» Download “Intel Clean Room (B-roll)” from Vimeo

» Download “Intel Headquarters (B-roll)” from Vimeo

Media assets are free for editorial broadcast, print, online and radio use; they are restricted for use for other purposes. Any use of these videos must credit “Intel Corporation” as the copyright holder.


Intel Announces New Class of RealSense Stand-Alone Inside-Out Tracking Camera


» Download all images (ZIP, 298 KB)

What’s New: Intel today introduced the Intel® RealSense™ Tracking Camera T265, a new class of stand-alone inside-out tracking device that will provide developers with a powerful building block for autonomous devices, delivering high-performance guidance and navigation. The T265 uses proprietary visual inertial odometry simultaneous localization and mapping (V-SLAM) technology with computing at the edge and is key for applications that require a highly accurate and low-latency tracking solution, including robotics, drones, augmented reality (AR) and virtual reality.

“Understanding your environment is a critical component for many devices. The T265 was designed to complement our existing Intel RealSense Depth Cameras and provide a quick path to product development with our next-generation integrated V-SLAM technology.”
–Sagi Ben Moshe, vice president and general manager, Intel RealSense Group

What’s In It: The Intel RealSense Tracking Camera T265 is powered by the Intel® Movidius™ Myriad™ 2 vision processing unit (VPU), which directly handles all the data processing necessary for tracking on the device. This makes the T265 a small-footprint, low-power solution that is simple for developers to integrate into existing designs or to build into their own intellectual property that requires rich visual intelligence.

Why It’s Important: The Intel RealSense Tracking Camera T265 is well suited to applications where tracking the location of a device is important, especially in places without GPS service, such as warehouses or remote outdoor areas; there, the camera uses a combination of known and unknown data to accurately navigate to its destination. The T265 is also designed for flexible implementation and can be easily added to small-footprint mobile devices like lightweight robots and drones, as well as connected to mobile phones or AR headsets.

For example, integrating the T265 into a robot designed for agriculture allows the device to navigate fields in a precise lawn-mower-style pattern and intelligently adapt to avoid obstacles in its environment, including structures or people. The T265 can also be used for drone or robotic deliveries, whether bringing medical supplies to remote, off-the-grid areas or to a lab inside a hospital ward, thanks to its wide field of view and optimization for tracking use cases.

How It’s Different: The Intel RealSense Tracking Camera T265 uses inside-out tracking, which means the device does not rely on any external sensors to understand the environment. Unlike other inside-out tracking solutions, the T265 delivers 6-degrees-of-freedom (6DoF) inside-out tracking by gathering inputs from two onboard fish-eye cameras, each with an approximately 170-degree field of view. The V-SLAM system constructs and continually updates a map of the unknown environment and the location of the device within that environment. Since all position calculations are performed directly on the device, tracking with the T265 is platform independent, allowing the T265 to run on very low-compute devices.

The T265 complements Intel’s RealSense D400 series cameras, and the data from both devices can be combined for advanced applications like occupancy mapping, improved 3D scanning, and advanced navigation and collision avoidance in GPS-restricted environments. The only hardware requirements are sufficient non-volatile memory to boot the device and a USB 2.0 or 3.0 connection that provides 1.5 watts of power.
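For developers, reading the T265’s 6DoF pose stream might look like the sketch below, which assumes the open-source librealsense SDK and its Python bindings (pyrealsense2); treat the exact calls as an assumption and check them against the SDK documentation.

```python
# Minimal pose-streaming sketch for the T265, assuming the open-source
# librealsense SDK's Python bindings (pip install pyrealsense2).
import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)      # 6DoF pose stream from the onboard V-SLAM
pipe.start(cfg)

try:
    for _ in range(100):
        frames = pipe.wait_for_frames()
        pose = frames.get_pose_frame()
        if not pose:
            continue
        data = pose.get_pose_data()
        # Translation is in meters, relative to where tracking started;
        # tracker_confidence reports how reliable the current estimate is.
        print(f"pos=({data.translation.x:.3f}, {data.translation.y:.3f}, "
              f"{data.translation.z:.3f}) confidence={data.tracker_confidence}")
finally:
    pipe.stop()
```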

When You Can Get It: The Intel RealSense Tracking Camera T265 is available for pre-order now. It will begin shipping Feb. 28 at $199.

More Context: Intel RealSense T265 Product Page | Intel RealSense Technology


Intel Studios’ Volumetric Video Gives ‘Grease’ New Life 40 Years Later

At CES 2019, Randal Kleiser, director of the iconic 1978 movie “Grease,” and Diego Prilusky, Intel Studios’ general manager, revealed a teaser and first look at an immersive experience by Intel and Paramount Pictures celebrating the film’s 40th anniversary.

In December, Kleiser and more than 20 dancers filmed a re-creation of one of the movie’s memorable musical numbers, “You’re the One That I Want,” at Intel Studios in Los Angeles. Captured by the studio’s 96 high-definition 5K cameras, the dance scene will come alive in “volumetric” video, a format that enables the viewer to experience the content from any given point of view.

Intel Studios’ 10,000-square-foot geodesic dome is the world’s largest immersive media hub. Moviemakers film scenes inside the dome from all directions at once, a technique called “volumetric capture.” Data captured by each camera is shaped into voxels (think 3D pixels), which render the virtual environment in multi-perspective 3D. It allows audiences to view a scene from any angle – even the middle of the action.

“It’s exciting to be able to do something with this new cutting-edge technology,” Kleiser said.

Intel and Paramount plan to release an immersive experience based on the full song “You’re the One That I Want” this year for PCs and virtual reality headsets.

More on Volumetric Video: Huge Geodesic Dome is World’s Largest 360-Degree Movie Set | Intel Studios: A Home for Volumetric Video Capture and Creation (Video) | Intel at 2019 CES


2019 CES: Intel Advances PC Experience with New Platforms, Technologies and Industry Collaboration

At CES 2019, Intel Corporation previews a new client platform code-named “Lakefield.” It features a hybrid CPU architecture with Intel’s Foveros 3D packaging technology. Lakefield has five cores, combining a 10nm high-performance Sunny Cove core with four Intel Atom processor-based cores into a tiny motherboard for thin and light devices packed with performance, long battery life and connectivity. Intel displays how its technology is the foundation for the world’s most important innovations and advances at CES 2019 from Jan. 8-11 in Las Vegas. (Credit: Intel Corporation)
» Click for full image

Today at CES 2019 in Las Vegas, Intel unveiled the next wave of PC innovation that will advance the PC experience to help power every person’s greatest contribution, including:

  • Details on our highly integrated platform that will feature our upcoming first volume 10nm PC processor, code-named “Ice Lake”
  • Preview of a hybrid 10nm CPU architecture with Foveros 3D packaging, code-named “Lakefield”
  • Project Athena, an innovation program to deliver a new class of advanced laptops
  • Expansion of the 9th Gen Intel® Core™ desktop processor family

New 10nm Platforms for Mobile PCs

In the coming months, Intel will launch its new mobile PC platform with Intel’s upcoming first volume 10nm processor, code-named “Ice Lake.”

Built on Intel’s new Sunny Cove CPU microarchitecture detailed last month at Architecture Day, Ice Lake is expected to deliver a new level of technology integration on a client platform.

More: Intel at 2019 CES (All Intel News)

Ice Lake is the first platform to feature the all-new Gen11 integrated graphics architecture, which supports Intel Adaptive Sync technology for smooth frame rates and delivers more than 1 TFLOP of performance for a richer gaming and creation experience.

The new mobile PC platform is also the first to integrate Thunderbolt™ 3 and the new high-speed Wi-Fi 6 wireless standard as built-in technologies, as well as feature Intel® DL Boost instruction sets to accelerate artificial intelligence (AI) workloads. Ice Lake brings this all together with incredible battery life to enable super-thin, ultra-mobile designs with world-class performance and responsiveness, so people can enjoy an amazing computing experience. As shown by Dell*, look for new devices from Intel OEM partners on shelves by holiday 2019.

Intel also provided a sneak peek of a new client platform, code-named “Lakefield,” that features a hybrid CPU architecture with Intel’s innovative new Foveros 3D packaging technology. Lakefield has five cores, combining a 10nm high-performance Sunny Cove core with four Intel Atom® processor-based cores into a tiny package that delivers low-power efficiency with graphics and other IPs, I/O and memory. The result is a smaller board that gives OEMs more flexibility for thin and light form factor designs and is packed with all the technology people have come to expect from Intel, including long battery life, performance and connectivity. Lakefield is expected to be in production this year.

Intel is uniquely positioned to deliver this type of innovation because the broad set of technologies – the packaging, the architecture, the performance and connectivity – are all under one roof to help bring incredible new designs and experiences to life.

Project Athena Advances Laptop Innovation

Intel also announced Project Athena, an innovation program that defines and aims to help bring to market a new class of advanced laptops. Combining world-class performance, battery life and connectivity in sleek, beautiful designs, the first Project Athena laptops are expected to be available in the second half of this year across both Windows* and Chrome* operating systems.

intel project athena innovation vectors
» Click for full infographic

Designed to enable new experiences and capitalize on next-generation technologies, including 5G and artificial intelligence, Project Athena creates a path forward to accelerate laptop innovation through:

intel project athena innovation partners
» Click for full infographic
  • An annual spec outlining platform requirements
  • New user experience and benchmarking targets defined by real-world usage models
  • Extensive co-engineering support and innovation pathfinding
  • Ecosystem collaboration to accelerate key laptop component development and availability
  • Verification of Project Athena devices through a comprehensive certification process

Based on extensive research to understand how people use their devices and the challenges they face, the annual spec combines key areas of innovation to deliver laptops that are purpose-built to help people focus, adapt to life’s roles and always be ready. Intel’s Project Athena innovation partners include Acer*, Asus*, Dell, Google*, HP*, Innolux*, Lenovo*, Microsoft*, Samsung* and Sharp*, among others.

From delivering the first connected PC with integrated Wi-Fi in the Intel® Centrino® platform to driving mainstream adoption of super thin and light designs, touchscreens and 2 in 1 form factors with Ultrabook™, Intel is uniquely positioned to be the catalyst in delivering the next-gen PC experience.

Expanding 9th Gen Intel Core Processor Family

In October, Intel launched the first set of the 9th Gen Intel® Core™ desktop processors, including the Intel Core i9-9900K processor, the world’s best gaming processor1. Today, Intel introduced new additions to the 9th Gen Intel Core desktop processor family that expand the options to meet a broad range of consumer needs from casual users to professionals to gamers and serious content creators. The first of the new 9th Gen Intel Core desktop processors is expected to be available starting this month with more rolling out through the second quarter of this year.

1As measured by in-game benchmark mode performance where available, or highest median frames per second (FPS) where benchmark mode is unavailable. PC Gaming Processors Compared:  9th Gen Intel® Core™ i9-9900K, Intel® Core™ i9-9980XE Extreme Edition, and Intel® Core™ i9-9900X X-series; 8th Gen Intel® Core™ i7-8700K and i7-8086K; and AMD Ryzen™ 7 2700X, AMD Ryzen™ Threadripper 2990WX, and AMD Ryzen™ Threadripper 2950X. Prices of compared products may differ.  Configurations: Graphics: NVIDIA GeForce GTX 1080 TI, Memory: 4x16GB DDR4 (2666 or 2933 per highest speed of the corresponding processor), Storage: 1TB, OS: Windows* 10 RS4 Build 1803, Samsung 970 Pro SSD. Results: Intel® Core™ i9-9900K scored better on the majority of the 19 game titles tested. The Intel® Core™ i9-9900K scored the same as the Intel® Core™ i7-8700K and the Intel® Core™ i7-8086K on “Middle Earth: Shadow of War,” and scored less than the Intel® Core™ i9-9980XE Extreme Edition on “Rise of the Tomb Raider.” More detail on workloads, test methodology, and configurations available at http://facts.pt/11u9e2.

Performance results are based on testing by Principled Technologies as of October 4, 2018, and may not reflect all publicly available security updates. See configuration disclosure for details. No product can be absolutely secure.

Intel will be marketing the Intel® Core™ i9-9900K with the tag line “Performance Unleashed” in certain jurisdictions, including PRC and Vietnam. Intel will be marketing the Intel® Core™ i9-9900K with the tag line “Intel’s Best Gaming Desktop Processor” in certain jurisdictions, including Argentina, Belarus, Belize, Egypt, El Salvador, Guatemala, Honduras, Italy, Japan, Panama, Peru, Saudi Arabia, and Turkey. If you are media or an influencer from these countries, or otherwise communicating directly to residents in these countries (e.g., on local-language social media), please only refer to the tag line Intel will be using in that country in lieu of the claim on this slide/document.

Forward-Looking Statements

Statements in this news summary that refer to future plans and expectations, including with respect to Intel’s future products and the expected availability and benefits of such products, are forward-looking statements that involve a number of risks and uncertainties. Words such as “anticipates,” “expects,” “intends,” “goals,” “plans,” “believes,” “seeks,” “estimates,” “continues,” “may,” “will,” “would,” “should,” “could,” and variations of such words and similar expressions are intended to identify such forward-looking statements. Statements that refer to or are based on estimates, forecasts, projections, uncertain events or assumptions, including statements relating to total addressable market (TAM) or market opportunity and anticipated trends in our businesses or the markets relevant to them, also identify forward-looking statements. Such statements are based on the company’s current expectations and involve many risks and uncertainties that could cause actual results to differ materially from those expressed or implied in these forward-looking statements. Important factors that could cause actual results to differ materially from the company’s expectations are set forth in Intel’s earnings release dated October 25, 2018, which is included as an exhibit to Intel’s Form 8-K furnished to the SEC on such date. Additional information regarding these and other factors that could affect Intel’s results is included in Intel’s SEC filings, including the company’s most recent reports on Forms 10-K and 10-Q. Copies of Intel’s Form 10-K, 10-Q and 8-K reports may be obtained by visiting our Investor Relations website at www.intc.com or the SEC’s website at www.sec.gov.


2018 Yearbook: Top Moments of Intel’s 50th Year


In 2018, Intel reached the 50-year milestone, an accomplishment few companies ever attain.

As the year ends, Intel’s 100,000 employees look back at 2018 with pride. But our focus is firmly fixed on building a smarter, more connected future for our communities and the world.



Intel Announces Dr. Rich Uhlig as New Managing Director of Intel Labs

Dr. Rich Uhlig is the managing director of Intel Labs. (Credit: Intel Corporation)
» Click for full image

What’s New: Dr. Richard (Rich) Uhlig is the new managing director of Intel Labs. In this role, he will lead Intel Labs in its mission to look beyond today’s product portfolio to discover and research new forms of technology and computing. Intel Labs operates one of the most advanced networks of collaborative university-based research partnerships and seeks to speed the development and scale the impact of emerging technologies.

“The work we are doing at Intel Labs is pushing the boundaries of technology every day whether that’s our research in quantum and neuromorphic computing or how we’re extending and evolving Moore’s Law. We have some of the brightest minds working together across industry and academia to solve some of the biggest challenges in technology. I am very excited to lead Intel Labs in this data-centric era.”
– Dr. Rich Uhlig, managing director, Intel Labs

Who is Rich Uhlig: Uhlig is the new managing director of Intel Labs and an Intel senior fellow. Prior to this role, Rich was the director of Systems and Software Research in Intel Labs, where he led research efforts in virtualization, cloud-computing systems, software-defined networking, big-data analytics, machine learning and artificial intelligence. He joined Intel in 1996 and led the definition of multiple generations of virtualization architecture for Intel processors and platforms, known collectively as Intel Virtualization Technology (Intel® VT). Rich earned his Ph.D. in computer science and engineering from the University of Michigan.

More Context: Intel Labs


Huge Geodesic Dome is World’s Largest 360-Degree Movie Set

On a studio lot just a mile down the road from Los Angeles International Airport sits an unassuming building that may signal the future of high-tech moviemaking. Inside the new Intel Studios structure is the world’s largest immersive media hub. It’s a 10,000-square-foot geodesic dome outfitted with 96 high-resolution 5K cameras. The dome is more than 44 feet high, about four stories.

Moviemakers can film scenes inside the canvas dome from all directions at once, a technique called “volumetric capture.” It allows musical performers, Hollywood directors, even top athletes to tell stories in magical new ways. Intel’s technology creates voxels (think 3D pixels), which render the virtual environment in spectacular, multi-perspective 3D. It allows audiences to view a scene from any angle – even the middle of the action.

“Volumetric video opens the door to entirely new kinds of visual storytelling,” said Diego Prilusky, general manager of Intel Studios, which is part of Intel Sports. “It lets moviemakers create lifelike immersive and interactive media experiences that simply haven’t been possible before.”

Intel Studios sends captured volumetric content over 5 miles of fiber-optic cable to more than 90 Intel-powered servers that can crunch over 1 terabyte of data every 10 seconds. (It takes almost 1,500 CDs to hold 1 TB of information.) The studio’s servers will eventually store up to 10 petabytes of data — the equivalent of 133 years of high-definition video.
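A quick back-of-the-envelope check of those figures, assuming decimal units and a 700 MB CD (assumptions for illustration, not Intel-supplied constants), shows the quoted numbers are self-consistent:

```python
# Sanity-check the throughput and capacity figures quoted above.
# Assumes 1 TB = 1e12 bytes, 1 PB = 1e15 bytes, and a 700 MB CD.
TB = 1e12
PB = 1e15
CD_BYTES = 700e6
SECONDS_PER_YEAR = 365.25 * 24 * 3600

cds_per_tb = TB / CD_BYTES                          # ~1,429 CDs, i.e. "almost 1,500"
ingest_rate = TB / 10                               # 1 TB every 10 seconds, in bytes/s
implied_hd_bitrate = 10 * PB / (133 * SECONDS_PER_YEAR) * 8 / 1e6   # Mbps

print(f"CDs needed for 1 TB:      {cds_per_tb:,.0f}")
print(f"Capture ingest rate:      {ingest_rate / 1e9:.0f} GB/s")
print(f"Implied HD video bitrate: {implied_hd_bitrate:.0f} Mbps over 133 years for 10 PB")
```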

Intel Studios is currently working on two Hollywood virtual reality productions to be released in early 2019.

More: All Intel Images | Intel Studios Fact Sheet | More on Intel Studios

Intel Studios’ 10,000-square-foot geodesic dome in Los Angeles is the world’s largest immersive media hub. Each of the studio’s 96 high-resolution 5K cameras captures action in two dimensions and algorithms convert those trillions of pixels into a 360-degree, 3D virtual environment. (Credit: Tim Herman/Intel Corporation)
» Click for full image
