Tel Aviv, Israel, Sept. 14, 2020 – Intel announced today the launch of two additional Intel Ignite sites in Austin, Texas, and Munich, Germany, following its success in Tel Aviv, Israel. Intel Ignite is Intel’s startup growth program launched in 2019. This is the first stage of expansion that will establish the program in multiple cities throughout Europe, North America and Asia.
Intel Ignite is a 12-week program for 10 early-stage startups that will receive hands-on mentorship from Intel and world-leading experts. Participating companies will gain access to technology and business leaders, as well as the knowledge, network and association that come with being accepted into one of the world’s most challenging and competitive startup programs.
“Intel’s purpose is to create world-changing technologies that enrich the lives of every person on Earth,” said Intel CEO Bob Swan. “We launched Ignite in 2019 both to support early-stage companies on their journey to success and to provide Intel employees with an opportunity to advance our purpose. In its first year, the Ignite program’s achievements have far surpassed our expectations and because of its proven, strategic impact, we are expanding its reach.”
The Intel Ignite team chose to establish programs in Austin and Munich because both cities are home to Intel sites with top-tier startup ecosystems featuring a high concentration of innovation, technology and talent. Intel Ignite has completed two successful cohorts in Israel representing startups from a range of industries, including artificial intelligence, cybersecurity and the internet of things, all areas where Intel can provide deep expertise and guidance — from engineering to manufacturing to marketing — as well as industry connections.
The program is currently accepting applications for the third cohort in Tel Aviv. The Ignite program in Munich and Austin will start in the first half of 2021. Munich’s applications will open in 2020’s fourth quarter to applicants from across Europe; Austin’s will open in 2021’s first quarter to applicants from across the U.S. All three programs will have a physical base at the center of their local startup ecosystem. The selection process and complete program will be delivered virtually while operating under pandemic guidelines.
Criteria for interested companies include a minimum of $1 million in funding (seed, series A), an experienced founding team, significant IP and a large market opportunity. Startups must be in the technology sector. Intel Ignite will not take equity from startups as part of this program. To apply, visit www.intel.com/ignite.
“Ignite was created to help startups realize their potential and to connect them with leading Intel experts. We are expanding the program to multiple countries so that we can work with a diverse array of startups tackling some of the world’s biggest challenges,” said Intel Ignite General Manager Tzahi (Zack) Weisfeld. “The startups we look for are fearless, pave new paths and are not afraid of taking bold steps. Intel employees engaging with the startups in Ignite are challenged to work more nimbly and creatively, and they gain perspective on how other companies and entrepreneurs operate, producing fast results and significant business outcomes.” For more information, visit www.intel.com/ignite.
What’s New: Samsung Medison and Intel are collaborating on new smart workflow solutions to improve obstetric measurements that contribute to maternal and fetal safety and can help save lives. Using an Intel® Core™ i3 processor, the Intel® Distribution of OpenVINO™ toolkit and OpenCV library, Samsung Medison’s BiometryAssist™ automates and simplifies fetal measurements, while LaborAssist™ automatically estimates the fetal angle of progression (AoP) during labor for a complete understanding of a patient’s birthing progress, without the need for invasive digital vaginal exams.
“Samsung Medison’s BiometryAssist is a semi-automated fetal biometry measurement system that automatically locates the region of interest and places a caliper for fetal biometry, demonstrating a success rate of 97% to 99% for each parameter¹. Such high efficacy enables its use in the current clinical practice with high precision.”
–Professor Jayoung Kwon, MD PhD, Division of Maternal Fetal Medicine, Department of Obstetrics and Gynecology, Yonsei University College of Medicine, Yonsei University Health System in Seoul, Korea
Why It’s Needed: According to the World Health Organization, about 295,000 women died during and following pregnancy and childbirth in 2017, even as maternal mortality rates decreased. While every pregnancy and birth is unique, most maternal deaths are preventable. Research from the Perinatal Institute found that tracking fetal growth is essential for good prenatal care and can help prevent stillbirths when physicians are able to recognize growth restrictions.
“At Intel, we are focused on creating and enabling world-changing technology that enriches the lives of every person on Earth,” said Claire Celeste Carnes, strategic marketing director for Health and Life Sciences at Intel. “We are working with companies like Samsung Medison to adopt the latest technologies in ways that enhance patient safety and improve clinical workflows, in this case for the important and time-sensitive care provided during pregnancy and delivery.”
How It Works: BiometryAssist automates and standardizes fetal measurements in approximately 85 milliseconds with a single click, providing over 97% accuracy¹. This allows doctors to spend more time talking with their patients while also standardizing fetal measurements, which have historically been difficult to provide accurately. With BiometryAssist, physicians can quickly verify consistent measurements for high volumes of patients.
“Samsung is working to improve the efficiency of new diagnostic features, as well as healthcare services, and the Intel Distribution of OpenVINO toolkit and OpenCV library have been a great ally in reaching these goals,” said Won-Chul Bang, corporate vice president and head of Product Strategy, Samsung Medison.
During labor, LaborAssist helps physicians estimate fetal AoP and head direction. This enables both the physician and patient to understand the fetal descent and labor process and determine the best method for delivery. There is always risk with delivery, and slowed progress could result in issues for the baby. Obtaining a more accurate, real-time picture of labor progression can help physicians determine the best mode of delivery and potentially help reduce the number of unnecessary cesarean sections.
“LaborAssist provides automatic measurement of the angle of progression as well as information pertaining to fetal head direction and estimated head station. So it is useful for explaining to the patient and her family how the labor is progressing, using ultrasound images which show the change of head station during labor. It is expected to be of great assistance in the assessment of labor progression and decision-making for delivery,” said Professor Min Jeong Oh, MD, PhD, Department of Obstetrics and Gynecology, Korea University Guro Hospital in Seoul, Korea.
BiometryAssist and LaborAssist are already in use in 80 countries, including the United States, Korea, Italy, France, Brazil and Russia. The solutions received Class II clearance from the FDA in 2020.
What’s Next: Intel and Samsung Medison will continue to collaborate to advance the state of the art in ultrasounds by accelerating AI and leveraging advanced technology in Samsung Medison’s next-generation ultrasound solutions, including Nerve Tracking, SW Beamforming and AI Module.
¹ Source: Internal Samsung testing. System configuration: Intel® Core™ i3-4100Q CPU @ 2.4 GHz, 8 GB memory; OS: 64-bit Windows 10. Inference time without OpenVINO enhancements was 480 milliseconds. Inference time with OpenVINO enhancements was 85 milliseconds.
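The footnoted benchmark implies a sizable speedup from the OpenVINO optimizations. A quick back-of-the-envelope calculation (using only the published latency figures, not a re-run of Samsung's test) shows what those numbers mean in practice:

```python
# Illustrative arithmetic from the published benchmark figures only:
# inference latency without and with OpenVINO enhancements, in milliseconds.
baseline_ms = 480
optimized_ms = 85

speedup = baseline_ms / optimized_ms   # ~5.6x faster per inference
per_second = 1000 / optimized_ms       # ~11.8 inferences per second

print(f"Speedup: {speedup:.1f}x, throughput: {per_second:.1f} inferences/s")
```

At roughly 85 milliseconds per measurement, a single click returns a result faster than a clinician can reposition the probe, which is what makes the "single click" workflow practical.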
“I will continue to evolve, dying as a human, living as a cyborg.”
That’s British roboticist Dr. Peter Scott-Morgan. In 2017, he was diagnosed with motor neurone disease (MND), also known as ALS or Lou Gehrig’s disease. MND attacks the brain and nerves and eventually paralyzes all muscles, even those that enable breathing and swallowing.
Doctors told the 62-year-old scientist he’d probably die by the end of 2019, but Scott-Morgan had other plans: He wants to replace all his organs with machinery to become the “world’s first full cyborg.” Scott-Morgan began his transformation late last year when he underwent a series of operations to extend his life using technology.
He now relies on synthetic speech and has developed a lifelike avatar of his face for more effective communication with others. “Peter 2.0 is now online,” Scott-Morgan announced after his surgeries late last year. “This is MND with attitude.”
Among the team of technologists working with Scott-Morgan is Lama Nachman, Intel fellow and director of Intel’s Anticipatory Computing Lab.
Nachman helped famed physicist Stephen Hawking speak; now she and her team are helping Scott-Morgan.
For almost eight years, Nachman helped Hawking communicate his almost mythical intellectual achievements through an open-source platform she and her team helped develop, called the Assistive Context-Aware Toolkit (ACAT). The software helps people with severe disabilities communicate through keyboard simulation, word prediction and speech synthesis. For Hawking, it was a tiny muscle in his cheek that he twitched to trigger a sensor on his glasses that would interface with his computer to type sentences. For Scott-Morgan, Nachman’s team added gaze tracking, which allows him to stare at letters on his computer screen to form sentences, as well as word prediction capabilities.
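ACAT is open source, so the word-prediction idea the article describes can be illustrated concretely. The toy sketch below is not Intel's actual model; it only shows the core mechanic of ranking candidate completions for a typed prefix, with a hypothetical frequency table standing in for a trained language model:

```python
# Toy sketch of prefix-based word prediction, the kind of assistance ACAT
# provides -- NOT Intel's actual implementation. The frequency table below
# is invented for illustration; real systems learn it from usage history.
from collections import Counter

history = Counter({
    "universe": 42, "understand": 30, "under": 25,
    "quantum": 18, "question": 12,
})

def predict(prefix, k=3):
    """Return up to k known words starting with prefix, most frequent first."""
    matches = [(word, count) for word, count in history.items()
               if word.startswith(prefix)]
    return [word for word, _ in sorted(matches, key=lambda wc: -wc[1])[:k]]

print(predict("un"))  # frequency-ranked completions of the prefix "un"
```

In a gaze-tracking interface, surfacing the right completion early matters enormously: every word selected whole instead of spelled letter by letter saves several dwell selections.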
“How can technology empower people? That’s been a thread in my life all along.”
A Palestinian who grew up in Kuwait, Nachman recalls neighbors calling her to fix their broken electronics and appliances. “I’ve always had this interest in figuring out the latest and greatest technologies and playing with them and breaking them and fixing them,” Nachman says.
Nachman’s team works on context-aware computing and human artificial intelligence (AI) collaboration technologies that can help the elderly in their homes, students who might not thrive in standard classrooms and technicians in manufacturing facilities. “I’ve always felt that technology can empower people who are most marginalized,” Nachman says. “It can level the playing field and bring more equity into society, and that is most obvious for people with disabilities.”
While Hawking wanted more control over his conversations, Nachman says, “Peter is open to greater experimentation and the idea of him and the machine learning together. As a result, we have been researching how to build a response-generation capability that can listen to the conversation and suggest answers that he can quickly choose from or nudge in a different direction.”
While this isn’t as accurate as Hawking’s preferred approach, Nachman says Scott-Morgan is willing to forgo some control in exchange for intuitive collaboration with his AI-powered communication interface because of the speed it affords him.
“My ventilator is a lot quieter than Darth Vader’s.”
Scott-Morgan is known for his wit and self-effacing humor, and he wants to be able to show that with his artificial voice. In addition to decreasing the latency, or “silence gaps,” between Scott-Morgan and his conversation partner, Nachman’s team is looking into how Scott-Morgan can express emotion. When we converse normally with another person, we pick up on multiple cues, like expressions and tone, not just the words. For Scott-Morgan, the team is researching an AI system that listens to what’s going on and then prompts alternative suggestions and tones according to different criteria.
Someday, Scott-Morgan and others might use brainwaves to control their voices.
Nachman said some of her team’s research focuses on people who cannot move any part of their body, not even a twitch of their cheeks or eyes. For them, Nachman says, brain-computer interfaces (BCIs) include skullcaps equipped with electrodes that monitor brainwaves, like an electroencephalogram test. Nachman says she and her team are looking to add BCIs to ACAT to ensure no one is left behind.
As AI gets smarter, Nachman is particularly interested in exploring ways to preserve human control while giving the AI system greater agency so “the two diverse actors are working in concert to achieve better outcomes together.”
KIGALI, Rwanda – Standing beside a rutted red dirt road at about 5,000 feet up in the jungled mountains of northern Rwanda, Intel’s Adam Schafer explains in four words why he and a teammate had traveled from Oregon to this especially remote part of Central Africa.
“We’re here to learn.”
With banana trees swaying behind him, Schafer continues: “Our goal is to protect the people and the planet, both of which help us produce our products. We want to meet the responsible sourcing expectations of our customers, shareholders and employees.”
Schafer is Intel’s director of Supply Chain Sustainability. Late last year, he and Erin Mitchell, manager of Intel’s Responsible Minerals Program, spent a week crisscrossing Rwanda’s mineral-rich mountains — fording creeks in a four-wheel drive, scrambling down narrow mountain trails to mine entrances and asking questions at every turn.
Why Rwanda? The minerals — tin, tantalum, tungsten and gold (known as 3TG) — that lie both deep underground and right on the surface in this part of Africa are essential to the worldwide silicon manufacturing industry.
In chip manufacturing, for example, tantalum is a metal uniquely well-suited as a diffusion barrier on advanced copper interconnects. In the assembly/test process, tin offers a low melting point and is a key component of the solder that attaches silicon chips to their packaging. Gold is corrosion-resistant and an excellent electrical conductor for the tiny pins that connect chips to other components.
On behalf of Intel, Schafer and Mitchell made the trip to fully understand the first part of a complex process. It begins with a chunk of mineral ore in Africa and — after passing through many hands, including miners, refiners, smelters and sellers, scattered across the globe — eventually winds up in chip factories. Ultimately, it turns up in your computer, your tablet, your smartphone, as well as in the millions of servers that run the internet and likely are delivering this story to you.
The Intel team’s fact-finding trip was completed before the coronavirus pandemic halted most air travel — but helped guide Intel’s recently announced 2030 Corporate Responsibility Goals as they relate to responsible sourcing.
In its continued pursuit of Moore’s Law, and in working toward these new 2030 goals, Intel recognizes this: Ethical mineral sourcing throughout the supply chain is no less important than process and technology innovation.
More than 10 years ago, Intel recognized that some of its mineral purchases — through a complex web of supply chain intermediaries — were unintentionally contributing to human rights abuses in the Democratic Republic of Congo. Armed guerilla factions in that country were exploiting forced labor, often abusing children and women, and engaging in multiple human rights violations — all in pursuit of illicit profits from the global mineral trade.
This is how the 3TG minerals — when extracted from the Earth under these abusive conditions — came to be called “conflict minerals.” The minerals themselves are not an issue.
At the time, Intel analyzed its supply chain and began a multiyear, industrywide effort to root out human rights abuses from the mineral components in its own products and those of other tech companies.
The 2010 U.S. Dodd-Frank Act, which Intel supported, required companies to disclose if any of their 3TG minerals are sourced from the Congo or neighboring countries. That was the same year the U.N. reported a grim statistic: In the Congo’s mineral-rich Kivu provinces, “almost every mining deposit was controlled by a military group.”
The International Peace Information Service reported that by 2016, 79% of miners in eastern Congo said they were working in mines where no armed groups were involved.
From conflict-free to responsible sourcing
Schafer and Mitchell’s mine scouting work last December in Rwanda — Schafer also visited mines in the Congo, and Mitchell visited smelters in India — was a sign that Intel, along with key tech industry leaders, is further raising the bar.
Intel and its partners are moving beyond the issue of conflict minerals to the broader and loftier goal of achieving “responsible sourcing.”
“Intel got off to a very strong start in the conflict minerals space, and we were a very early leader,” explains Mitchell during one mine visit. “We want to take that leadership we had early on and expand on it.”
So now, Schafer and Mitchell, along with others across Intel’s global supply chain organization, are asking questions such as: Are mining conditions sustainable and ethical? Are miners’ human rights respected? Is the raw mineral ore that enters the supply chain carefully traced so that buyers can be assured it was mined and sold legally, free of human rights abuses?
Schafer and Mitchell visited six mines and refining facilities over five days, heading out each morning from the capital city of Kigali with a local driver familiar with the bone-rattling unpaved mountain roads that snake up to most of the country’s underground mines.
At Rutongo Mines, the two stood at the entrance to a horizontal mine shaft as workers in black rubber boots and bright yellow hard hats emerged from the darkness, grinning at their visitors while muscling a narrow-gauge rail cart full of tin ore into the bright, nearly equatorial sun. There, the Intel team learned that the mine operator faces an odd kind of competition to his business.
He says townspeople with picks and shovels sneak onto his company’s sprawling multi-thousand-acre mountain mining claim. In broad daylight — until they’re chased away — they dig into the mountainsides for chunks of tin ore, called cassiterite. Then they sell the ore to street buyers in the capital city of Kigali, undercutting the legal business. And who knows if that ore-on-the-street was responsibly sourced? Or if it came from rogue players?
To ensure responsible sourcing, label and log everything
The industry answer is a “bag-and-tag” system. Intel and other tech companies have been successfully pushing for a process that tracks bags of mineral ore with crimped-on, tamper-resistant tags. Bag-and-tag ensures that minerals come from responsible sources — in much the same way you trust that your supermarket blueberries labeled as organic are, in fact, organic. Their route from farm to shipper to market is documented and traced.
Schafer says that while mineral bag-and-tag is not a perfect system, he calls it “an important first step in diligence and transparency.”
“We keep the tags locked up. And there are two locks and two keys,” Lionel Sematuro explains to Schafer and Mitchell at a mine run by Piran Rwanda Ltd.
They’re standing inside an old rust-red shipping container that serves as a depot and safe box for 220-pound plastic bags filled with Piran’s tin ore. To ensure traceability — that mineral ore has been dug legally by legit workers on the company payroll, not by unknown freelancers — each heavy bag is tagged then kept under lock and key in that shipping container.
A carefully kept logbook, filled out by hand and also locked in that same shipping container, documents all mineral movements.
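The bag-and-tag process described above is, in data terms, an append-only chain-of-custody log keyed by tag ID. This hypothetical sketch (the tag IDs, locations and weights are invented for illustration, and the real system uses physical tags and a handwritten logbook) shows the auditing idea: every movement of a tagged bag is recorded, so any bag's route can be reconstructed later:

```python
# Hypothetical digital analogue of the "bag-and-tag" logbook. Each 220-pound
# bag of ore carries a crimped-on, tamper-resistant tag; every movement is
# appended to a log so the chain of custody can be audited end to end.
# All identifiers and locations below are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class Movement:
    tag_id: str       # tamper-resistant tag identifier on the bag
    location: str     # where the bag was recorded
    weight_kg: float  # recorded weight; a change between stops flags tampering

log: list[Movement] = []

def record(tag_id: str, location: str, weight_kg: float) -> None:
    """Append one custody event; entries are never edited or removed."""
    log.append(Movement(tag_id, location, weight_kg))

def trace(tag_id: str) -> list[str]:
    """Reconstruct one bag's route, in order, from the shared log."""
    return [m.location for m in log if m.tag_id == tag_id]

record("RW-0001", "Piran mine depot", 100.0)
record("RW-0001", "Kigali export warehouse", 100.0)
print(trace("RW-0001"))
```

The value of the scheme is exactly this traceability: a buyer downstream can ask where a specific bag has been, and an unexplained gap or weight change in the record is a red flag.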
Traceability “helps ensure investors that they’re not investing in unsafe or unfair practices, that they’re investing in responsibly sourced minerals, that we’re doing things correctly and by the book,” explains Ashley Dace, with Piran Rwanda Ltd.
The importance of mineral traceability extends beyond the mining process to refining and smelting, and ultimately the ready-to-market minerals that Intel and other tech firms worldwide need to buy.
Ensuring all minerals entering the supply chain can be traced to responsible sources is a major element of Intel’s responsible sourcing strategy — and is shared across the tech industry.
Intel joins Apple, Facebook, Google, others
In Rwanda, the Intel team talked not just with mine operators, refiners and local government leaders, but also with fellow corporate responsibility representatives in the technology industry. Accompanying the Intel team on several of the mine and government visits were reps from Apple, Facebook, Google, Nokia and other companies with whom Intel has been partnering on conflict minerals and responsible sourcing issues for nearly a decade.
Says Schafer: “It’s important that we continue to work with our peers and customers to help make our industry even better.”
For the past six years, of the more than 200 companies whose mineral sourcing programs are analyzed by the Responsible Sourcing Network, Intel has ranked No. 1.
Why does this matter? Increasingly, as Intel Corporate Responsibility director Suzanne Fallender points out, investors want to know that firms in which they stake a claim are behaving as responsible corporate citizens. Fallender told Greenbiz that investors “demand more accountability than ever, and companies have an obligation to be transparent with them.”
Schafer and Mitchell say their Rwandan mine inspection made clear the importance of continued mineral ore tracing. The visit — the first of its kind by Intel in seven years — signaled to local players the importance of responsible sourcing to Intel and other firms.
They also say they now recognize the industry needs a better way to account for so-called “artisanal miners” to legitimately join in the mining process. These are local citizens whose livelihood may depend on digging for minerals to sell on the open market. The challenge, says Mitchell, is “how to reduce the risk” that unfettered artisanal mining could open the door to human rights abuses – while still helping local residents to support themselves and their families.
‘Entire periodic table’
What next? Intel is not planning to stop at the 3TG minerals. Each of these minerals occupies one of those 118 little squares on the periodic table of elements that many of us remember from high school chemistry classrooms.
Standing on that jungled mountainside in Rwanda, Schafer explains that Intel’s ambitions are much greater — they are set out in the company’s 2020 Corporate Responsibility Report and its 2030 Corporate Strategy and Goals.
“As we expand from conflict minerals to responsible sourcing, we and the industry are responsible for the entire periodic table,” he says. “Our goal is to respect all aspects of human rights, environmental impact and the people in the communities who are sourcing the materials that are critical to our industry.”
A Canadian museum is safely reopening from its pandemic closure with the help of a virtual assistant powered by artificial intelligence (AI). Originally designed by CloudConstable to welcome visitors to the Ontario Regiment Museum, virtual assistant Master Cpl. Lana interacts with visitors over a large screen, and was reconfigured with Intel® RealSense™ and AI technology to enable the safe return of the many volunteers who keep the museum and vehicles operating.
The Ontario Regiment Museum houses North America’s largest collection of operational military vehicles, many dating back to the 1940s. The collection allows the public to experience a piece of history, both at the museum and through the historical films in which the vehicles often appear.
At the start of the pandemic in early 2020, CloudConstable began working with the museum to design Master Cpl. Lana as an AI virtual assistant who would greet visitors, check them in and provide museum details.
Before Lana’s deployment, COVID-19 closed the museum to the public. But with over 120 military vehicles that need constant servicing and driving, the museum needed its volunteers to continue their essential maintenance and operations work at the site.
“The Ontario Regiment Museum is one of the few museums in the world with such a large and diverse collection of operating military vehicles, which help people experience history in a very real way. Regular maintenance is crucial, even during the worst of the pandemic, which is why we turned to CloudConstable and Intel to help build an autonomous solution,” said Jeremy Blowers, executive director of the Ontario Regiment Museum.
CloudConstable relied on the Intel RealSense team’s insight that Lana’s existing and unique capabilities — already built on the Intel RealSense Depth Camera and using the Intel® Distribution of OpenVINO™ toolkit for accelerated machine vision inferences — could be extended for a more comprehensive and safer COVID-19 screening solution. Adding an Intel® NUC 9 Pro with Intel Active Management Technology, as part of the Intel vPro® platform, the team reworked Lana to take temperatures via thermal scans and ask a series of questions to assess COVID-19 risk and exposure. Since June, Lana has provided an enhanced, fully automated and touchless screening process so volunteers can continue to do their important work with the vehicles.
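The screening Lana automates boils down to a simple decision: combine a thermal-scan reading with questionnaire answers into a pass/fail result. This sketch is purely illustrative; the threshold and questions are assumptions for the example, not CloudConstable's actual protocol:

```python
# Hypothetical sketch of a touchless COVID-19 screening decision like the one
# Lana performs. The fever threshold and questions below are illustrative
# assumptions, not CloudConstable's or the museum's actual screening criteria.
FEVER_THRESHOLD_C = 38.0

SCREENING_QUESTIONS = [
    "Have you had close contact with a confirmed COVID-19 case?",
    "Do you have a new cough or shortness of breath?",
]

def screen(temperature_c: float, answers: list[bool]) -> bool:
    """Return True if the visitor may enter: no fever, all answers 'no'."""
    if temperature_c >= FEVER_THRESHOLD_C:
        return False          # thermal scan fails the visitor outright
    return not any(answers)   # any "yes" (True) answer fails the screen

print(screen(36.6, [False, False]))  # healthy visitor passes
print(screen(38.2, [False, False]))  # fever fails the thermal scan
```

The interesting part of the deployed system is not this decision rule but the touchless input to it: the Intel RealSense camera and OpenVINO-accelerated vision supply the temperature and presence detection without anyone handling a shared thermometer or clipboard.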
“Intel RealSense technology is used to develop products that enrich people’s lives by enabling machines and devices to perceive the world in 3D. CloudConstable leverages Intel’s technology to help create a state-of-the-art natural voice and vision interface with touchless, self-service COVID-19 screening,” said Joel Hagberg, head of product management and marketing for Intel’s RealSense Group.
With the Ontario Regiment Museum now preparing to reopen to the public, CloudConstable, along with Intel, is working to bring the new COVID-19 protection capabilities into the original concept for Lana as a greeter for visitors. Lana will greet visitors, provide contactless check-in, scan temperatures and ensure the museum adheres to visitor limits and other COVID-19 health protection protocols. Eventually, she’ll even thank them for coming and help visitors keep in touch with all the latest activities at the museum.
Humphrey Hanley, who goes by the name @NoHandsNZ, is a gamer from New Zealand with a disability whose motto is “No Hands, No Excuses.” In recognition of Global Accessibility Awareness Day, Intel’s Accessibility Office and Client Computing Group sent Humphrey the parts he needed to build a PC for the first time. He livestreamed his 11-hour build on Twitch, the popular livestreaming platform for gamers.
After finishing the build, Hanley, who has epidermolysis bullosa, a condition that makes his skin susceptible to damage from any kind of friction, collected his thoughts and talked through the challenges he faces. Intel will use his feedback to build more accessibility into gaming PCs. Many people with disabilities turn to gaming as a place where technology can level the playing field and allow them to compete like other players.
Hanley first began making videos in late 2016 after recovering from a major surgery. “I started going back to the gym, and part of my whole process was to record my progress. In those days, it was just using my cellphone, and then I got a GoPro and gradually just slowly expanded my range of ability and ways to make content,” says Hanley.
He first began posting on his YouTube channel, No Hands, No Excuses. He later discovered Twitch: “Twitch was suddenly an amazing platform for being able to spread the message about accessibility and the joy that gaming can bring to people with disabilities worldwide, live, instantly, to anyone that was able to tune in.”
His relationship with Intel began at TwitchCon 2019 when he saw a tweet from @IntelGaming inviting the first 10 people who replied to lunch. “I thought I might be a little bit late,” says Hanley, “but actually how about I come to lunch and I bring my own chair, because when I travel, I use a wheelchair.”
At the lunch, he talked about the issues people with disabilities face in gaming. “One of the big difficulties for people that want to be involved, especially in PC gaming, is the price tag that comes with prebuilt PCs and anything that companies advertise as being a gaming machine,” says Hanley. “But you haven’t picked the parts yourself and put it together yourself. There’s almost always a catch somewhere.”
In spring 2020, Intel sent Humphrey all the parts to build an Intel® gaming PC without modifications:
10th Gen Intel® Core™ i9-10900K Desktop Processor
ASUS Turbo GeForce RTX 2080 Ti GPU
Intel® Solid State Drive, 660P Series, 1 TB
ASUS ROG Maximus XII Extreme Motherboard
“Watching Humphrey build out his machine, we started having some really good discussions about how we might be able to make builds easier for people with disabilities,” says Kahlief Adams, technical marketing engineer in Intel’s Client Computing Group. “With his help and guidance, there is a real possibility that we can make important and impactful strides to get everyone in the game, in ways that enable PC gamers with disabilities to build the rigs of their dreams.”
Hanley says the industry could do a lot of things to make building a PC more accessible. “There are probably some things in there that are necessary barriers,” says Hanley. But he says things like plugs and the way that connection points go inside PCs are unnecessary barriers rather than necessary ones. “So just think about those things that you are designing and making or programming, and what are the unnecessary barriers that you are putting in there inadvertently.”
For example, Hanley says he struggled lowering the i9-10900K processor into the motherboard’s CPU socket. “I think I dropped it four times because it was so tricky to get it into the right place,” he says. “And I was just thinking at the time, why is there not just a little extra bit on here that I could attach something to, to allow me to lower this chip in? Or why isn’t another tool designed to do that in a really safe and easy way? So we’re not putting fingers on it, if we’ve got them or not — dealing with something so ridiculously sharp and fiddly.”
Hanley’s advice to gaming industry leaders is to involve people who can help identify barriers early in the process.
“I think again, that comes down to having people on your team or working for your company that can see those barriers,” says Hanley. “Because if it’s not a problem for you, you are not necessarily going to see things like that as a barrier. And then I think the other thing is to foster inclusive spaces. It’s the culture. It’s the way we treat people. It’s the way we open our communities up.”
What’s New: Mobileye, an Intel company, received an automated vehicle (AV) testing permit recommendation from the independent technical service provider TÜV SÜD. As one of the leading experts in the field of safe and secure automated driving, TÜV SÜD enabled Mobileye to obtain approval from German authorities by validating the vehicle and functional safety concepts of Mobileye’s AV test vehicle. This allows Mobileye to perform AV testing anywhere in Germany, including urban and rural areas as well as the Autobahn, at regular driving speeds of up to 130 kilometers per hour. AV testing in real-world traffic in Germany is starting now in and around Munich.
“Mobileye is eager to show the world our best-in-class self-driving vehicle technology and safety solutions as we get closer to making safe, affordable self-driving mobility solutions and consumer vehicles a reality. The new AV Permit provides us an opportunity to instill even more confidence in autonomous driving with future riders, global automakers and international transportation agencies. We thank TÜV SÜD for their trusted collaboration as we expand our AV testing to public roads in Germany.”
– Johann Jungwirth, vice president, Mobility-as-a-Service (MaaS), Mobileye
Why It Matters: Mobileye is one of the first non-OEM companies to receive a permit to test AVs on open roads in Germany. Until now, AV test drives in Germany have primarily taken place in closed and simulated environments. The basis for the independent vehicle assessment by TÜV SÜD in Germany builds on Mobileye’s existing program in place in Israel, where it has tested AVs for several years.
“With the TÜV SÜD AV permit we bring in our broad expertise as a neutral and independent third party on the way to safe and secure automated mobility of the future,” said Patrick Fruth, CEO Division Mobility, TÜV SÜD. “Our demanding assessment framework and test procedure consider state-of-the-art approaches to safety and combine physical real-world tests and scenario-based simulations.”
With the ability to test automated vehicles with a safety operator on public roads in Germany, Mobileye is taking another significant step toward the goal of a driverless future. On the heels of Mobileye’s acquisition of Moovit, a leading MaaS solutions company, as well as recent collaborations to test and deploy self-driving vehicles in France, Japan, Korea and Israel, the new testing permit strengthens Mobileye’s growing global leadership position as an AV technology as well as complete mobility solutions provider.
How It Works: The new permit will allow Mobileye to demonstrate to the global automotive industry and partners the safety, functionality and scalability of its unique self-driving system (SDS) for MaaS and consumer autonomous vehicles. The Mobileye SDS comprises the industry’s most advanced vision sensing technology, True Redundancy with two independent perception sub-systems, crowd-sourced mapping in the form of Road Experience Management™ (REM™) and its pioneering Responsibility-Sensitive Safety (RSS) driving policy.
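RSS formalizes common-sense driving rules as mathematical guarantees; its best-known rule is the minimum safe longitudinal following distance. The sketch below follows the published RSS formulation, but the parameter values are illustrative assumptions for demonstration only, not Mobileye’s calibrated settings:

```python
def rss_safe_longitudinal_distance(
    v_rear,                # rear (following) vehicle speed, m/s
    v_front,               # front (lead) vehicle speed, m/s
    response_time=0.5,     # rear vehicle's response time rho, s (assumed value)
    a_max_accel=3.5,       # worst-case acceleration during response time, m/s^2 (assumed)
    b_min_brake=4.0,       # rear vehicle's guaranteed minimum braking, m/s^2 (assumed)
    b_max_brake=8.0,       # front vehicle's worst-case maximum braking, m/s^2 (assumed)
):
    """Minimum safe following distance per the RSS longitudinal rule:
    assume the rear car accelerates for the full response time, then brakes
    gently, while the front car brakes as hard as physically possible."""
    v_after_response = v_rear + response_time * a_max_accel
    d_min = (
        v_rear * response_time
        + 0.5 * a_max_accel * response_time ** 2
        + v_after_response ** 2 / (2 * b_min_brake)
        - v_front ** 2 / (2 * b_max_brake)
    )
    return max(d_min, 0.0)  # the rule is clipped at zero: never negative
```

A regulator-facing assessment like TÜV SÜD’s can check that the vehicle never lets its actual gap fall below this bound; the guarantee holds regardless of what the front vehicle does, which is what makes the policy auditable.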
Although the first tests of AVs using Mobileye’s SDS will be completed in Munich, the company plans to also perform AV testing in other parts of Germany. In addition, Mobileye expects to scale open-road testing in other countries before the end of 2020.
To obtain the authorization, Mobileye-powered AV test vehicles underwent a series of rigorous safety tests and provided comprehensive technical documentation. The application also included a detailed hazard analysis, vehicle safety and functional safety concepts, and proof that the cars can be safely integrated into public road traffic – an assessment that was made possible using Mobileye’s RSS.
More Context: As Mobileye begins self-driving vehicle testing in Germany, Mobileye and Moovit will start demonstrating full end-to-end ride hailing mobility services based on Moovit’s mobility platform and apps using Mobileye’s AVs. Intel is pursuing the goal of continuing to develop pioneering technologies together with Mobileye and Moovit that will make roads safer for all road users while also improving mobility access for all.
In addition to the development of market-ready technologies, an important prerequisite is the worldwide mapping of roads. Mobileye has already successfully laid the foundations with REM. In cooperation with various automobile manufacturers, data from 25 million vehicles is expected to be collected by 2025. Mobileye is creating high-definition maps of the worldwide road infrastructure as the basis for safe autonomous driving. Millions of kilometers of roads across the globe are mapped every day with the REM technology.
Together, Intel, Mobileye and Moovit are driving forward the implementation of their mobility-as-a-service strategy. This strategy offers society and individuals solutions to today’s major social costs of transportation. The goal is to make mobility safe, accessible, clean, affordable and convenient, so that people can travel efficiently, flexibly and smartly from Point A to Point B. All means of transport — from public transport to car and bike sharing services to ride hailing and ride sharing with self-driving vehicles — will be bundled within one service offering of Moovit and Mobileye, smartly managed by Moovit’s mobility intelligence platform. The advantages are manifold: traffic congestion is minimized, emissions are reduced, and people are given equal and affordable access to mobility — an approach that is a top priority at Intel.
What’s New: Today, two researchers from the National University of Singapore (NUS), who are members of the Intel Neuromorphic Research Community (INRC), presented new findings demonstrating the promise of event-based vision and touch sensing in combination with Intel’s neuromorphic processing for robotics. The work highlights how bringing a sense of touch to robotics can significantly improve capabilities and functionality compared to today’s visual-only systems and how neuromorphic processors can outperform traditional architectures in processing such sensory data.
“This research from National University of Singapore provides a compelling glimpse into the future of robotics, where information is both sensed and processed in an event-driven manner combining multiple modalities. The work adds to a growing body of results showing that neuromorphic computing can deliver significant gains in latency and power consumption once the entire system is re-engineered in an event-based paradigm spanning sensors, data formats, algorithms, and hardware architecture.”
— Mike Davies, director of Intel’s Neuromorphic Computing Lab
Why It Matters: The human sense of touch is sensitive enough to feel the difference between surfaces that differ by just a single layer of molecules, yet most of today’s robots operate solely on visual processing. Researchers at NUS hope to change this using their recently developed artificial skin, which according to their research can detect touch more than 1,000 times faster than the human sensory nervous system and identify the shape, texture and hardness of objects 10 times faster than the blink of an eye.
Enabling a human-like sense of touch in robotics could significantly improve current functionality and even lead to new use cases. For example, robotic arms fitted with artificial skin could easily adapt to changes in goods manufactured in a factory, using tactile sensing to identify and grip unfamiliar objects with the right amount of pressure to prevent slipping. The ability to feel and better perceive surroundings could also allow for closer and safer human-robotic interaction, such as in caregiving professions, or bring us closer to automating surgical tasks by giving surgical robots the sense of touch that they lack today.
While the creation of artificial skin is one step in bringing this vision to life, it also requires a chip that can draw accurate conclusions based on the skin’s sensory data in real time, while operating at a power level efficient enough to be deployed directly inside the robot. “Making an ultra-fast artificial skin sensor solves about half the puzzle of making robots smarter,” said assistant professor Benjamin Tee from the NUS Department of Materials Science and Engineering and NUS Institute for Health Innovation & Technology. “They also need an artificial brain that can ultimately achieve perception and learning as another critical piece in the puzzle. Our unique demonstration of an AI skin system with neuromorphic chips such as the Intel Loihi provides a major step forward towards power-efficiency and scalability.”
About the Research: To break new ground in robotic perception, the NUS team began exploring the potential of neuromorphic technology to process sensory data from the artificial skin using Intel’s Loihi neuromorphic research chip. In their initial experiment, the researchers used a robotic hand fitted with the artificial skin to read Braille, passing the tactile data to Loihi through the cloud to convert the micro bumps felt by the hand into a semantic meaning. Loihi achieved over 92 percent accuracy in classifying the Braille letters, while using 20 times less power than a standard von Neumann processor.
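Neuromorphic chips like Loihi compute with spiking neurons that only do work when input events arrive, which is where the power savings come from. As a purely illustrative sketch (not the NUS researchers’ actual model or the Loihi API), a minimal leaky integrate-and-fire neuron consuming a stream of tactile event magnitudes looks like this:

```python
def lif_spikes(events, threshold=1.0, decay=0.9):
    """Leaky integrate-and-fire neuron: the membrane potential leaks a
    little each timestep, accumulates incoming event magnitudes, and
    emits a spike (then resets) when it crosses the firing threshold.
    `events` is a list of per-timestep input magnitudes (0.0 = no event)."""
    potential = 0.0
    spike_times = []
    for t, magnitude in enumerate(events):
        potential = decay * potential + magnitude  # leak, then integrate
        if potential >= threshold:
            spike_times.append(t)  # fire: record the spike's timestep
            potential = 0.0        # reset membrane potential after firing
    return spike_times
```

In an event-based system, classification is carried by the timing and count of such spikes rather than dense frame-by-frame tensors, so quiet sensor regions cost essentially nothing to process.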
Building on this work, the NUS team further improved robotic perception capabilities by combining both vision and touch data in a spiking neural network. To do so, they tasked a robot to classify various opaque containers holding differing amounts of liquid using sensory inputs from the artificial skin and an event-based camera. Researchers used the same tactile and vision sensors to test the ability of the perception system to identify rotational slip, which is important for stable grasping.
Once this sensory data was captured, the team sent it to both a GPU and Intel’s Loihi neuromorphic research chip to compare processing capabilities. The results, which were presented at Robotics: Science and Systems this week, show that combining event-based vision and touch using a spiking neural network enabled 10 percent greater accuracy in object classification compared to a vision-only system. Moreover, they demonstrated the promise for neuromorphic technology to power such robotic devices, with Loihi processing the sensory data 21 percent faster than a top-performing GPU, while using 45 times less power.
“We’re excited by these results. They show that a neuromorphic system is a promising piece of the puzzle for combining multiple sensors to improve robot perception. It’s a step toward building power-efficient and trustworthy robots that can respond quickly and appropriately in unexpected situations,” said assistant professor Harold Soh from the Department of Computer Science at the NUS School of Computing.
About the Intel Neuromorphic Research Community: The Intel Neuromorphic Research Community is an ecosystem of academic groups, government labs, research institutions, and companies around the world working with Intel to further neuromorphic computing and develop innovative AI applications. Researchers interested in participating in the INRC and developing for Loihi can visit the Intel Neuromorphic Research Community website. A list of current members can also be found at the site.
JERUSALEM, Israel, and OSAKA, Japan, July 8, 2020 – Mobileye, an Intel Company, and WILLER, one of the largest transportation operators in Japan, Taiwan and the Southeast Asian region, today announced a strategic collaboration to launch an autonomous robotaxi service in Japan and markets across Southeast Asia, including Taiwan. Beginning in Japan, the companies will collaborate on the testing and deployment of autonomous transportation solutions based on Mobileye’s automated vehicle (AV) technology.
“Our new collaboration with WILLER brings a meaningful addition to Mobileye’s growing global network of transit and mobility ecosystem partners,” said Prof. Amnon Shashua, Intel senior vice president and president and CEO of Mobileye. “We look forward to collaborating with WILLER as we work together for new mobility in the region by bringing self-driving mobility services to Japan, Taiwan and ASEAN markets.”
“Collaboration with Mobileye is highly valuable for WILLER and a big step moving forward to realize our vision of innovating transportation services: travel anytime and anywhere by anybody,” said Shigetaka Murase, founder and CEO of WILLER. “Innovation of transportation will lead to a smarter, safer and more sustainable society where people enjoy higher quality of life.”
Together, Mobileye and WILLER are seeking to commercialize self-driving taxis and autonomous on-demand shared shuttles in Japan, while leveraging each other’s strengths. Mobileye will supply autonomous vehicles integrating its self-driving system, while WILLER will offer services tailored to each region and its users’ tastes, work to secure the regulatory framework, and provide mobility services and solutions for fleet operation companies.
The two companies aim to begin testing robotaxis on public roads in Japan in 2021, with plans to launch fully self-driving ride-hailing and ride-sharing mobility services in 2023, while exploring opportunities for similar services in Taiwan and other Southeast Asian markets.
For Mobileye, the collaboration with WILLER advances the company’s global mobility-as-a-service (MaaS) ambitions. Since announcing its intention to become a complete mobility provider, Mobileye has begun a series of collaborations with cities, transportation agencies and mobility technology companies to develop and deploy self-driving mobility solutions in key markets. The agreement with WILLER builds on Mobileye’s existing MaaS partnerships, including the agreement with Daegu Metropolitan City, South Korea, to deploy robotaxis based on Mobileye’s self-driving system, and the joint venture with Volkswagen and Champion Motors to operate an autonomous ride-hailing fleet in Israel.
WILLER aims to unify user experiences across countries in the region; it released a MaaS app in 2019 and enabled a QR-code-based payment system this year. WILLER has partnered with Kuo-Kuang Motor Transportation, the largest bus operator in Taiwan, and Mai Linh, the largest taxi company in Vietnam, as well as invested in Car Club, a car-sharing service provider in Singapore. WILLER also partners with 150 local transportation providers in Japan. On top of these partnerships, WILLER will provide self-driving ride-hailing and ride-sharing services in the region and provide the best customer-ride experiences together with Mobileye.
The collaboration between WILLER and Mobileye will add a new transportation mode to the existing range of transportation services, including highway buses, railways and car-sharing. Adding self-driving vehicles, on-demand features and sharing services will improve customer ride experiences and address social challenges such as traffic accidents, congestion and, especially, the shortage of drivers and the challenges resulting from Japan’s aging society. Together Mobileye and WILLER will accelerate the social benefits of self-driving transportation solutions that contribute to higher quality of daily lives, making society smarter, safer and more sustainable.
About Mobileye: Mobileye is the global leader in the development of computer vision and machine learning, data analysis, localization and mapping for advanced driver-assistance systems and automated driving. Mobileye’s technology helps keep people safer on the road, reduces the risks of traffic accidents, saves lives and aims to revolutionize the driving experience by enabling autonomous driving. Mobileye’s proprietary software algorithms and EyeQ® chips perform detailed interpretations of the visual field in order to anticipate possible collisions with other vehicles, pedestrians, cyclists, animals, debris and other obstacles. Mobileye’s products are also able to detect roadway markings such as lanes, road boundaries, barriers and similar items; identify and read traffic signs, directional signs and traffic lights; create a RoadBook™ of localized drivable paths and visual landmarks using REM™; and provide mapping for autonomous driving. More information is available in Mobileye’s press kit.
About WILLER: WILLER was established in 1994 to provide society- and community-centric transportation services. WILLER pursues cutting-edge technology and marketing strategies to improve customers’ ride experiences and create innovative value for society and local communities. In Japan, WILLER operates the largest intercity bus network, a railway in Kyoto, and unique restaurant buses that offer local cuisine, area by area. Beyond Japan, WILLER operates car-sharing services in Singapore and ride-hailing taxis in Vietnam.
Intel, working with the International Olympic Committee (IOC), will extend life-coaching, mentoring, and learning and development services to more than 50,000 athletes who are part of the Olympic community through the postponed Olympic Games in Tokyo next year. Intel will provide these services as part of Athlete365, the IOC’s official athlete support program. This new initiative is a direct outcome of Intel’s commitment to support Olympians and Olympic hopefuls who are managing the impact of the COVID-19 pandemic.
Intel was able to provide these services through existing benefits the company offers its employees. Athlete benefits will include access to tools that will help address the challenges created by this worldwide pandemic. These tools and services include Headspace and EXOS, as well as additional learning and development services from Intel and LinkedIn. Further, Intel will design and deliver mentoring and networking services that are crafted specifically to support the needs of elite athletes within the Olympic community.
“Athletes work tirelessly to achieve their goals. In the process, they bring the world closer together. As a worldwide Olympic partner, we see the athletes as an extension to our Intel family and want to help in any way we can, especially during these challenging times. We have some great services for our Intel employees and want to extend them to the athlete community.”
– Intel CEO Bob Swan
“The IOC always has an athlete-first approach because athletes are at the heart of the Olympic movement. We are excited to be working with the Intel team to support athletes around the world, but also to drive the future of the Olympic Games through Intel’s cutting-edge technology. This collaboration is another demonstration of the support the IOC provides to athletes’ well-being at every stage of an athlete’s career.”
– IOC President Thomas Bach
“There are many important corporate benefits that can help athletes navigate future career opportunities. This program is crucial in helping athletes achieve their professional and personal goals.”
– Ashton Eaton, Olympic champion and Intel employee
Intel and the IOC are committed to supporting the Olympic ecosystem, athletes and partners during the global pandemic. These benefits are available for Intel’s 100,000 employees, and now will extend to a greater community of 50,000 athletes spanning 200 countries. Additionally, Intel has donated Intel-technology-powered products, including virtual reality headsets, to various sporting committees to help athletes continue and enhance their training as well as stay connected. These benefits will provide athletes with resources beyond the Games and align with Intel’s mission of enriching lives and promoting inclusion. These service offerings build on Intel’s greater commitment to combating issues related to COVID-19 through its Pandemic Response Technology Initiative.
Services Provided to Athletes:
Intel employee mentoring services: Intel offers exclusive mentoring services from experienced Intel employees across a range of technical and non-technical backgrounds to help athletes develop meaningful growth opportunities as they retire from competition and transition into the next phases of their lives. Intel is widening the breadth of mentors by collaborating with the IOC to include IOC staff and experienced Olympic athletes. Athletes, including members of Team Intel, will be able to develop new post-competition pathways with their mentors and learn useful personal and professional skills.
Athlete Webinar Series: Today, Olympic champion and Intel employee Ashton Eaton and Intel’s vice president and general manager of the Olympic Program Office, Rick Echevarria, will participate in the IOC’s Athlete Webinar Series. Eaton will discuss his transition from Olympic athlete to Intel employee and how being an Olympian has prepared him for a career in technology.
Knowledge development: Intel is extending its employee courses, taught by experienced Intel employees, to the Olympic community. Courses are curated and adapted specifically for the athlete audience, building competency in some of the most in-demand and essential skills in the evolving professional landscape. Athletes can also hone skills important today and in their futures by accessing topics ranging from business and technology to public speaking and branding through LinkedIn.
Performance mindset: Athletes will have access to hundreds of resources for focus, sleep, movement and more, including content designed for recovery, competition, training and motivation through six months of Headspace Plus. Additionally, athletes will have access to content developed by EXOS around mindset and recovery.