Stop the Bleeding: AI Startup Deep01 Helps Physicians Evaluate Brain Hemorrhage

During a stroke, a patient loses an estimated 1.9 million brain cells every minute, so interpreting a CT scan even one second faster is vital to protecting the patient's health.

To save precious time, Taiwan-based medical imaging startup Deep01 has created AI-based medical imaging software, called DeepCT, to evaluate acute intracerebral hemorrhage (ICH), a type of stroke. The system works with 95 percent accuracy in just 30 seconds per case — about 10 times faster than competing methods.

Founded in 2016, Deep01 is the first AI company in Asia to have FDA clearances in both the U.S. and Taiwan. It’s a member of NVIDIA Inception, a program that helps startups develop, prototype and deploy their AI or data science technology and get to market faster.

The startup recently raised around $3 million for DeepCT, which detects suspected areas of bleeding around the brain and annotates where they’re located on CT scans, notifying physicians of the results.

The software was trained using 60,000 medical images that displayed all types of acute ICH. Deep01 uses a self-developed deep learning framework that runs images and trains the model on NVIDIA GPUs.

“Working with NVIDIA’s robust AI computing hardware, in addition to software frameworks like TensorFlow and PyTorch, allows us to deliver excellent AI inference performance,” said David Chou, founder and CEO of the company.

Making Quick Diagnosis Accessible and Affordable

Strokes are the world’s second-most common cause of death. When stroke patients are ushered into the emergency room, doctors must quickly determine whether the brain is bleeding and what the next steps for treatment should be.

However, many hospitals lack the staff to make such timely diagnoses, since only some emergency room doctors specialize in reading CT scans. That’s why Deep01 was founded, according to Chou: to offer affordable AI-based solutions to medical institutions.

The 30 seconds DeepCT takes to complete an interpretation can help medical practitioners prioritize the patients in most urgent need of treatment.

Helpful for Facilities of All Types and Sizes

DeepCT has helped doctors evaluate more than 5,000 brain scans and is being used in nine medical institutions in Taiwan, ranging from small hospitals to large-scale medical centers.

“The lack of radiologists is a big issue even in large-scale medical centers like the one I work at, especially during late-night shifts when fewer staff are on duty,” said Tseng-Lung Yang, senior radiologist at Kaohsiung Veterans General Hospital in Taiwan.

Geng-Wang Liaw, an emergency physician at Yeezen General Hospital — a smaller facility in Taiwan — agreed that Deep01’s technology helps relieve physical and mental burdens for doctors.

“Doctors in the emergency room may misdiagnose a CT scan at times,” he said. “Deep01’s solution stands by as an assistant 24/7, to give doctors confidence and reduce the possibility for medical error.”

Beyond ICH, Deep01 is working to expand its technology to identify midline shift, a pathological finding in which increased pressure pushes the brain off-center, raising mortality.

The post Stop the Bleeding: AI Startup Deep01 Helps Physicians Evaluate Brain Hemorrhage appeared first on The Official NVIDIA Blog.

AI Explains AI: Fiddler Develops Model Explainability for Transparency

Your online loan application just got declined without explanation. Welcome to the AI black box.

Businesses of all stripes turn to AI for computerized decisions driven by data. Yet consumers using applications with AI get left in the dark on how automated decisions work. And many people working within companies have no idea how to explain the inner workings of AI to customers.

Fiddler Labs wants to change that.

The San Francisco-based startup offers an explainable AI platform that enables companies to explain, monitor and analyze their AI products.

Explainable AI is a growing area of interest for enterprises because those outside of engineering often need to understand how their AI models work.

Using explainable AI, banks can provide reasons to customers for a loan’s rejection, based on data points fed to models, such as maxed credit cards or high debt-to-income ratios. Internally, marketers can strategize about customers and products by knowing more about the data points that drive them.

“This is bridging the gap between hardcore data scientists who are building the models and the business teams using these models to make decisions,” said Anusha Sethuraman, head of product marketing at Fiddler Labs.

Fiddler Labs is a member of NVIDIA Inception, a program that supports companies working in AI and data science with fundamental tools, expertise and marketing support, and helps them get to market faster.

What Is Explainable AI?

Explainable AI is a set of tools and techniques that help explore the math inside an AI model. It can map out the data inputs and their weighted values that were used to arrive at the data output of the model.

All of this essentially lets a layperson study the sausage factory at work inside an otherwise opaque process. The result is that explainable AI can help deliver insights into how and why a model made a particular decision.
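The idea of mapping weighted inputs to an output can be made concrete with a toy perturbation-based attribution, sketched below. This is an illustration of the general technique, not Fiddler's product or API; the model, feature names and baseline values are all hypothetical.

```python
# Minimal sketch of perturbation-based feature attribution -- an illustration
# of the idea behind explainable AI, not Fiddler's actual software.

def loan_score(features):
    """Toy 'model': higher income helps; high utilization and debt hurt."""
    return (0.5 * features["income"]
            - 0.9 * features["credit_utilization"]
            - 0.3 * features["debt_to_income"])

def attribute(model, features, baseline):
    """Measure the score change caused by restoring each feature from a baseline."""
    contributions = {}
    base = model(baseline)
    for name in features:
        probe = dict(baseline)
        probe[name] = features[name]          # restore one feature at a time
        contributions[name] = model(probe) - base
    return contributions

applicant = {"income": 0.2, "credit_utilization": 0.95, "debt_to_income": 0.8}
baseline  = {"income": 0.5, "credit_utilization": 0.3,  "debt_to_income": 0.3}

contrib = attribute(loan_score, applicant, baseline)
# The most negative contribution is the main reason for a low score.
main_reason = min(contrib, key=contrib.get)
```

A customer service representative could then report, for instance, that a maxed-out credit card contributed most to a rejection.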

“There’s often a hurdle to get AI into production. Explainability is one of the things that we think can address this hurdle,” Sethuraman said.

With an ensemble of models often in use, creating these explanations is no easy job.

But Fiddler Labs CEO and co-founder Krishna Gade is up to the task. He previously led the team at Facebook that built the “Why am I seeing this post?” feature to help consumers and internal teams understand how its AI works in the Facebook news feed.

He and Amit Paka — a University of Minnesota classmate — joined forces and quit their jobs to start Fiddler Labs. Paka, the company’s chief product officer, was motivated by his experience at Samsung with shopping recommendation apps and the lack of understanding into how these AI recommendation models work.

Explainability for Transparency

Founded in 2018, Fiddler Labs offers explainability for greater transparency in businesses. It helps companies make better informed business decisions through a combination of data, explainable AI and human oversight, according to Sethuraman.

Fiddler’s tech is used by Hired, a talent and job matchmaking site driven by AI. Fiddler provides real-time reporting on how Hired’s AI models are working. It can generate explanations on candidate assessments and provide bias monitoring feedback, allowing Hired to assess its AI.

Explainable AI needs to be quickly available for consumer fintech applications. That enables customer service representatives to explain automated financial decisions — like loan rejections and robo rates — and build trust with transparency about the process.

The algorithms used for explanations require hefty processing. Sethuraman said Fiddler Labs taps NVIDIA cloud GPUs to make this possible; CPUs aren’t up to the task.

“You can’t wait 30 seconds for the explanations — you want explanations within milliseconds on a lot of different things depending on the use cases,” Sethuraman said.

Visit NVIDIA’s financial services industry page to learn more.

Image credit: Emily Morter, via the Unsplash Photo Community. 

The post AI Explains AI: Fiddler Develops Model Explainability for Transparency appeared first on The Official NVIDIA Blog.

Non-Stop Shopping: Startup’s AI Lets Supermarkets Skip the Line

Eli Gorovici loves to take friends sailing on the Mediterranean. As the new pilot of Trigo, a Tel Aviv-based startup, he’s inviting the whole retail industry on a cruise to a future with AI.

“We aim to bring the e-commerce experience into the brick-and-mortar supermarket,” said Gorovici, who joined the company as its chief business officer in May.

The journey starts with the sort of shopping anyone who’s waited in a long checkout line has longed for.

You fill up your bags at the market and just walk out. Magically, the store knows what you bought, bills your account and sends you a digital receipt, all while preserving your privacy.

Trigo is building that experience and more. Its magic is an AI engine linked to cameras and a few weighted shelves for small items a shopper’s hand might completely cover.

With these sensors, Trigo builds a 3D model of the store. Neural networks recognize products customers put in their bags.

When shoppers leave, the system sends the grocer the tally and a number it randomly associated with them when they chose to swipe their smartphone as they entered the store. The grocer matches the number with a shopper’s account, charges it and sends off a digital bill.
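The token-matching flow described above can be sketched in a few lines. This is a hypothetical illustration of the privacy scheme, not Trigo's actual protocol; the only thing the vision system ever sees is the random token.

```python
import secrets

# Hypothetical sketch of the anonymized checkout flow -- not Trigo's code.
# The grocer, not the AI engine, holds the token-to-account mapping.

class Store:
    def __init__(self):
        self.token_to_account = {}

    def enter(self, account_id):
        """Shopper swipes their phone; a random token is issued."""
        token = secrets.token_hex(8)
        self.token_to_account[token] = account_id
        return token

    def checkout(self, token, tally):
        """AI engine reports (token, tally); grocer matches and bills."""
        account = self.token_to_account.pop(token)
        total = sum(price for _, price in tally)
        return {"account": account, "total": total, "receipt": tally}

store = Store()
t = store.enter("alice@example.com")
bill = store.checkout(t, [("pasta", 2.50), ("sauce", 3.25)])
```

Because the token is random and discarded at checkout, the vision system never learns which account a shopper belongs to.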

And that’s just the start.

An Online Experience in the Aisles

Shoppers get the same personalized recommendation systems they’re used to seeing online.

“If I’m standing in front of pasta, I may see on my handset a related coupon or a nice Italian recipe tailored for me,” said Gorovici. “There’s so much you can do with data, it’s mind blowing.”

The system lets stores fine-tune their inventory management systems in real time. Typical shrinkage rates from shoplifting or human error could sink to nearly zero.

AI Turns Images into Insights

Making magic is hard work. Trigo’s system gathers a petabyte of video data a day for an average-size supermarket.

It uses as many as four neural networks to process that data at mind-melting rates of up to a few hundred frames per second. (By contrast, your TV displays high-definition movies at 60 fps.)

Trigo used a dataset of up to 500,000 2D product images to train its neural networks. In daily operation, the system uses those models to run millions of inference tasks with help from NVIDIA TensorRT software.

The AI work requires plenty of processing muscle. A supermarket outside London testing the Trigo system uses servers in its back room with 40-50 NVIDIA RTX GPUs. To boost efficiency, Trigo plans to deliver edge servers using NVIDIA T4 Tensor Core GPUs and join the NVIDIA Metropolis ecosystem starting next year.

Trigo got early access to the T4 GPUs thanks to its participation in NVIDIA Inception, a program that gives AI startups traction with tools, expertise and go-to-market support. The program also aims to introduce Trigo to NVIDIA’s retail partners in Europe.

In 2021, Trigo aims to move some of the GPU processing to Google, Microsoft and other cloud services, keeping some latency- or privacy-sensitive uses inside the store. It’s the kind of distributed architecture businesses are just starting to adopt, thanks in part to edge computing systems such as NVIDIA’s EGX platform.

Big Supermarkets Plug into AI

Tesco, the largest grocer in the U.K., has plans to open its first market using Trigo’s system. “We’ve vetted the main players in the industry and Trigo is the best by a mile,” said Tesco CEO Dave Lewis.

Israel’s largest grocer, Shufersal, also is piloting Trigo’s system, as are other retailers around the world.

Trigo was founded in 2018 by brothers Michael and Daniel Gabay, leveraging tech and operational experience from their time in elite units of the Israeli military.

Seeking his next big opportunity in his field of video technology, Gorovici asked friends who were venture capitalists for advice. “They said Trigo was the future of retail,” Gorovici said.

Like sailing in the aqua-blue Mediterranean, AI in retail is a compelling opportunity.

“It’s a trillion-dollar market — grocery stores are among the biggest employers in the world. They are all being digitized, and selling more online now given the pandemic, so maybe this next stage of digital innovation for retail will now move even faster,” he said.

Taking the Heat Off: AI Temperature Screening Aids Businesses Amid Pandemic

As businesses and schools consider reopening around the world, they’re taking safety precautions to mitigate the lingering threat of COVID-19 — often taking the temperature of each individual entering their facilities.

Fever is a common warning sign for the virus (and the seasonal flu), but manual temperature-taking with infrared thermometers takes time and requires workers stationed at a building’s entrances to collect temperature readings. AI solutions can speed the process and make it contactless, sending real-time alerts to facilities management teams when visitors with elevated temperatures are detected.

Central California-based IntelliSite Corp. and its recently acquired startup, Deep Vision AI, have developed a temperature screening application that can scan over 100 people a minute. Temperature readings are accurate within a tenth of a degree Celsius. And customers can get up and running with the app within a few hours, with an AI platform running on NVIDIA GPUs on premises or in the cloud for inference.

“Our software platform has multiple AI modules, including foot traffic counting and occupancy monitoring, as well as vehicle recognition,” said Agustin Caverzasi, co-founder of Deep Vision AI, and now president of IntelliSite’s AI business unit. “Adding temperature detection was a natural, easy step for us.”

The temperature screening tool has been deployed in several healthcare facilities and is being tested at U.S. airports, amusement parks and education facilities. Deep Vision is part of NVIDIA Inception, a program that helps startups working in AI and data science get to market faster.

“Deep Vision AI joined Inception at the very beginning, and our engineering and research teams received support with resources like GPUs for training,” Caverzasi said. “It was really helpful for our company’s initial development.”

COVID Risk or Coffee Cup? Building AI for Temperature Tracking

As the pandemic took hold, and social distancing became essential, Caverzasi’s team saw that the technology they’d spent years developing was more relevant than ever.

“The need to protect people from harmful viruses has never been greater,” he said. “With our preexisting AI modules, we can monitor in real time the occupancy levels in a store or a hospital’s waiting room, and trigger alerts before the maximum occupancy is reached in a given area.”

With governments and health organizations advising temperature checking, the startup applied its existing AI capabilities to thermal cameras for the first time. In doing so, the team had to fine-tune the model so it wouldn’t be fooled by false positives — for example, when a person shows up red on a thermal camera because of a cup of hot coffee.

This AI model is paired with one of IntelliSite’s IoT solutions called human-based monitoring, or hBM. The hBM platform includes a hardware component: a mobile cart mounted with a thermal camera, monitor and Dell Precision tower workstation for inference. The temperature detection algorithms can now scan five people at the same time.

Double Quick: Faster, Easier Screening

The workstation uses the NVIDIA Quadro RTX 4000 GPU for real-time inference on thermal data from the live camera view. This reduces manual scanning time for healthcare customers by 80 percent, and drops the total cost of conducting temperature scans by 70 percent.

Facilities using hBM can also choose to access data remotely and monitor multiple sites, using either an on-premises Dell PowerEdge R740 server with NVIDIA T4 Tensor Core GPUs, or GPU resources through the IntelliSite Cloud Engine.

If businesses and hospitals are also taking a second temperature measurement with a thermometer, these readings can be logged in the hBM system, which can maintain records for over a million screenings. Facilities managers can configure alerts via text message or email when high temperatures are detected.
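The log-and-alert flow described above boils down to a simple pattern: record every reading and notify when one crosses a fever threshold. The sketch below is a hypothetical illustration, not IntelliSite's hBM software; the threshold and the notification callback are assumptions.

```python
# Hypothetical sketch of a screening log with threshold alerts --
# not IntelliSite's hBM implementation.

FEVER_THRESHOLD_C = 38.0   # assumed cutoff for an "elevated" reading

class ScreeningLog:
    def __init__(self, notify):
        self.records = []
        self.notify = notify      # e.g. a text/email gateway callback

    def log_reading(self, person_id, temp_c):
        self.records.append((person_id, temp_c))
        if temp_c >= FEVER_THRESHOLD_C:
            self.notify(f"Elevated temperature {temp_c:.1f} C for {person_id}")

alerts = []
log = ScreeningLog(notify=alerts.append)
log.log_reading("visitor_001", 36.6)   # normal: logged, no alert
log.log_reading("visitor_002", 38.4)   # elevated: logged and alerted
```

In a real deployment the callback would hand off to an SMS or email gateway, and records would persist beyond memory, as hBM's million-screening capacity implies.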

The Deep Vision developer team, based in Córdoba, Argentina, also had to adapt their AI models that use regular camera data to detect people wearing face masks. They use the NVIDIA Metropolis application framework for smart cities, including the NVIDIA DeepStream SDK for intelligent video analytics and NVIDIA TensorRT to accelerate inference.

Deep Vision and IntelliSite next plan to integrate the temperature screening AI with facial recognition models, so customers can use the application for employee registration once their temperature has been checked.

IntelliSite is a member of the NVIDIA Clara Guardian ecosystem, bringing edge AI to healthcare facilities. Visit our COVID page to explore how other startups are using AI and accelerated computing to fight the pandemic.

FDA disclaimer: Thermal measurements are designed as a triage tool and should not be the sole means of diagnosing high-risk individuals for any viral threat. Elevated thermal readings should be confirmed with a secondary, clinical-grade evaluation tool. FDA recommends screening individuals one at a time, not in groups.

Smart Hospitals: DARVIS Automates PPE Checks, Hospital Inventories Amid COVID Crisis

After an exhausting 12-hour shift caring for patients, it’s hard to blame frontline workers for forgetting to sing “Happy Birthday” twice to guarantee a full 30 seconds of proper hand-washing.

Though at times tedious, confirming such detailed protective measures as the amount of time hospital employees spend sanitizing their hands, the cleaning status of a room, or the number of beds available is crucial to preventing the spread of infectious diseases such as COVID-19.

DARVIS, an AI company founded in San Francisco in 2015, automates tasks like these to make hospitals “smarter” and give hospital employees more time for patient care, as well as peace of mind for their own protection.

The company developed a COVID-19 infection-control compliance model within a month of the pandemic breaking out. It provides a structure to ensure that workers are wearing personal protective equipment and complying with hygiene protocols amidst the hectic pace of hospital operations, compounded by the pandemic. The system can also provide information on the availability of beds and other equipment.

Short for “Data Analytics Real-World Visual Information System,” DARVIS uses the NVIDIA Clara Guardian application framework, employing machine learning and advanced computer vision.

The system analyzes information from optical sensors, which act as the “eyes and ears” of the machine, and alerts users if a bed is not clean or a worker is missing a glove, among other contextual insights. All records are fully anonymized once feedback is provided.
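The compliance check behind such alerts can be sketched as a comparison between detected items and a required set, with identifiers anonymized before storage. This is an illustrative toy, not DARVIS's system; the PPE list and record format are assumptions.

```python
import hashlib

# Illustrative sketch of PPE-compliance alerts and anonymized records --
# hypothetical, not DARVIS's implementation. Detections would come from
# the computer-vision models described above.

REQUIRED_PPE = {"gloves", "mask", "gown"}

def check_worker(detected_ppe):
    """Return the required PPE items the vision system did not detect."""
    return sorted(REQUIRED_PPE - set(detected_ppe))

def anonymized_record(worker_id, missing):
    # Store a one-way digest instead of the raw identifier.
    digest = hashlib.sha256(worker_id.encode()).hexdigest()[:8]
    return {"worker": digest, "missing": missing}

missing = check_worker({"mask", "gown"})
record = anonymized_record("nurse_42", missing)   # e.g. flags missing gloves
```

The record carries enough context for feedback ("a glove is missing") without identifying the worker, matching the anonymization described above.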

“It’s all about compliance,” said Jan-Philipp Mohr, co-founder and CEO of the company. “It’s not about surveilling workers, but giving them feedback where they could harm themselves. It’s for both worker protection and patient security.”

DARVIS is a member of NVIDIA Inception, a program that helps startups working in AI and data science accelerate their product development, prototyping and deployment.

The Smarter the Hospital, the Better

Automation in hospitals has always been critical to saving lives and increasing efficiency, said Paul Warren, vice president of Product and team lead for AI at DARVIS. However, the need for smart hospitals is all the more urgent in the midst of the COVID-19 crisis, he said.

“We talk to the frontline caregivers, the doctors, the nurses, the transport staff and figure out what part of their jobs is particularly repetitive, frustrating or complicated,” said Warren. “And if we can help automate that in real time, they’re able to do their job a lot more efficiently, which is ultimately good for improving patient outcomes.”

DARVIS can help save money as well as lives. Even before the COVID crisis, the U.S. Centers for Disease Control and Prevention estimated the annual direct medical costs of infectious diseases in U.S. hospitals to be around $45 billion, a cost bound to rise due to the global pandemic. By optimizing infection control practices and minimizing the spread of infectious disease, smart hospitals can decrease this burden, Mohr said.

To save costs and time needed to train and deploy their own devices, DARVIS uses PyTorch and TensorFlow optimized on NGC, NVIDIA’s registry of GPU-accelerated software containers.

“NVIDIA engineering efforts to optimize deep learning solutions is a game-changer for us,” said Warren. “NGC makes structuring and maintaining the infrastructure environment very easy for us.”

DARVIS’s current centralized approach involves deep learning techniques optimized on NVIDIA GPU-powered servers running on large workstations within the hospital’s data center.

As they onboard more users, the company plans to also use NVIDIA DeepStream SDK on edge AI embedded systems like NVIDIA Jetson Xavier NX to scale out and deploy at hospitals in a more decentralized manner, according to Mohr.

Same Technology, Numerous Possibilities

While DARVIS was initially focused on tracking beds and inventory, user feedback led to the expansion of its platform to different areas of need.

The same technology was developed to evaluate proper usage of PPE, to analyze worker compliance with infection control practices and to account for needed equipment in an operating room.

The team at DARVIS continues to research what’s possible with their device, as well as in the field of AI more generally, as they expand and deploy their product at hospitals around the world.

Watch DARVIS in action:

Learn more about NVIDIA’s healthcare-application framework on the NVIDIA Clara developers page.

Images courtesy of DARVIS, Inc.

The post Smart Hospitals: DARVIS Automates PPE Checks, Hospital Inventories Amid COVID Crisis appeared first on The Official NVIDIA Blog.

Screening for COVID-19: Japanese Startup Uses AI for Drug Discovery

Researchers are racing to discover the right drug molecule to treat COVID-19 — but the number of potential drug-like molecules out there is estimated to be an inconceivable 10^60.

“Even if you hypothetically checked one molecule per second, it would take longer than the age of the universe to explore the entire chemical space,” said Shinya Yuki, co-founder and CEO of Tokyo-based startup Elix, Inc. “AI can efficiently explore huge search spaces to solve difficult problems, whether in drug discovery, materials development or a game like Go.”

Yuki’s company is using deep learning to accelerate drug discovery, building neural networks that predict the properties of molecules much faster than computer simulations can. To support COVID-19 research, the team is using AI to find drugs that are FDA-approved or in clinical trials that could be repurposed to treat the coronavirus.

“Developing a new drug from scratch is a years-long process, which is unwanted especially in this pandemic situation,” Yuki said. “Speed is critical, and drug-repurposing can help identify candidates with an existing clinical safety record, significantly reducing the time and cost of drug development.”

Elix recently published a paper on approved and clinical trial-stage drugs that its AI model flagged for potential COVID-19 treatments. Among the candidates selected by Elix’s AI tool was remdesivir, an antiviral drug that recently received emergency use authorization from the FDA for coronavirus cases.

A member of NVIDIA Inception, a program that helps startups get to market faster, Elix uses the NVIDIA DGX Station for training and inference of its deep learning algorithms. Yuki spoke about the company’s work in AI for drug discovery in the Inception Startup Showcase at GTC Digital, NVIDIA’s digital conference for developers and AI researchers.

Elix’s AI Fix for Drug Discovery

At the molecular level, a successful drug must have the perfect combination of shape, flexibility and interaction energies to bind to a target protein — like the spike proteins that cover the viral envelope of SARS-CoV-2, the virus that causes COVID-19.

SARS-CoV-2, the virus that causes COVID-19, has a surface covered in protein spikes. Image credit: CDC/ Alissa Eckert, MSMI; Dan Higgins, MAMS. Licensed under public domain.

A person gets infected with COVID-19 when these spike proteins attach to cells in the body, bringing the virus into the cells. An effective antiviral drug might interfere with this attachment process. For example, a promising drug molecule would bind to the spike protein itself, preventing the virus from attaching to human cells.

To help researchers find the best drug for the job, Elix uses a variety of neural networks to rapidly narrow down the field of potential molecules. This allows researchers to reserve physical tests in the lab for a smaller subset of molecules that have a higher likelihood of solving the problem.

With predictive AI models, Yuki’s team can analyze a database of drug candidates to infer which have the right physical and chemical properties to treat a given disease. They also use generative models, which start from scratch to come up with promising molecular structures — some of which may not be found in nature.

That’s where a third neural network comes in, a retrosynthesis model that helps researchers figure out if the generated molecules can be synthesized in the lab.
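The overall funnel — score a candidate library with a property predictor and keep only the most promising molecules for physical testing — can be sketched as below. This is a toy illustration of virtual screening, not Elix's models; the scoring function and candidate names are stand-ins for trained neural networks and real molecules.

```python
# Toy sketch of a virtual-screening funnel -- not Elix's implementation.
# A trained property-prediction network would replace the stand-in scorer.

def predicted_binding_affinity(mol):
    """Stand-in for a trained property-prediction model."""
    # Hypothetical precomputed score; real inputs would be learned
    # molecular representations, not a stored number.
    return mol["descriptor_score"]

def screen(library, top_k=2):
    """Rank candidates by predicted affinity and keep the top few."""
    ranked = sorted(library, key=predicted_binding_affinity, reverse=True)
    return [mol["name"] for mol in ranked[:top_k]]

library = [
    {"name": "candidate_a", "descriptor_score": 0.41},
    {"name": "candidate_b", "descriptor_score": 0.87},
    {"name": "candidate_c", "descriptor_score": 0.63},
]
shortlist = screen(library)   # molecules worth physical lab tests
```

Only the shortlist goes on to lab experiments, which is where the time and cost savings of AI-driven screening come from.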

Elix uses multiple NVIDIA DGX Station systems — GPU-powered AI workstations for data science development teams — to accelerate training and inference of these neural networks, achieving up to a 6x speedup using a single GPU for training versus a CPU.

Yuki says the acceleration is essential for the generative models, which would otherwise take a week or more to train until convergence, when the neural network reaches the lowest error rate possible. Each DGX Station has four NVIDIA V100 Tensor Core GPUs, enabling the Elix team to tackle bigger AI models and run multiple experiments at once.

“DGX Stations are basically supercomputers. We usually have several users working on the same machine at the same time,” he said. “We can not only train models faster, we can also run up to 15 experiments in parallel.”

The startup’s customers include pharmaceutical companies, research institutes and universities. Since molecular data is sensitive intellectual property for the pharma industry, most choose to run the AI models on their own on-prem servers.

Beyond drug discovery, Elix also uses AI for molecular design for material informatics, working with companies like tire- and rubber-manufacturer Bridgestone and RIKEN, Japan’s largest research institution. The company also develops computer vision models for autonomous driving and AI at the edge.

In one project, Yuki’s team worked with global chemical company Nippon Shokubai to generate a molecule that can be used as a blending material for ink, while posing a low risk of skin irritation.

Learn more about Elix in Yuki’s GTC Digital lightning talk. Visit our COVID page to explore how other startups are using AI and accelerated computing to fight the pandemic.

Main image by Chaos, licensed from Wikimedia Commons under CC BY-SA 3.0

The post Screening for COVID-19: Japanese Startup Uses AI for Drug Discovery appeared first on The Official NVIDIA Blog.

Hardhats and AI: Startup Navigates 3D Aerial Images for Inspections

Childhood buddies from back in South Africa, Nicholas Pilkington, Jono Millin and Mike Winn went off together to a nearby college, teamed up on a handful of startups and kept a pact: work on drones once a week.

That dedication is paying off. Their drone startup, based in San Francisco, is picking up interest worldwide and has landed $35 million in Series D funding.

It all catalyzed in 2014, when the friends were accepted into the AngelPad accelerator program in Silicon Valley. They founded DroneDeploy there, enabling contractors to capture photos, maps, videos and high-fidelity panoramic images for remote inspections of job sites.

“We had this a-ha moment: Almost any industry can benefit from aerial imagery, so we set out to build the best drone software out there and make it easy for everyone,” said Pilkington, co-founder and CTO at DroneDeploy.

DroneDeploy’s AI software platform — it’s the navigational brains and eyes — is operating in more than 200 countries and handling more than 1 million flights a year.

Nailing Down Applications

DroneDeploy’s software has been adopted in construction, agriculture, forestry, search and rescue, inspection, conservation and mining.

In construction, DroneDeploy is used by one-quarter of the world’s 400 largest building contractors and six of the top 10 oil and gas companies, according to the company.

DroneDeploy was one of three startups that recently presented at an NVIDIA Inception Connect event held by Japanese insurer Sompo Holdings. For good reason: Startups are helping insurance and reinsurance firms become more competitive by analyzing portfolio risks with AI.

The NVIDIA Inception program nurtures startups with access to GPU guidance, Deep Learning Institute courses, networking and marketing opportunities.

Navigating Drone Software

DroneDeploy offers features like fast setup of autonomous flights, photogrammetry to take physical measurements and APIs for drone data.

In addition to supporting industry-leading drones and hardware, DroneDeploy operates an app ecosystem for partners to build apps using its drone data platform. John Deere, for example, offers an app for customers to upload aerial drone maps of their fields to their John Deere account so that they can plan flights based on the field data.

Split-second photogrammetry and 360-degree images provided by DroneDeploy’s algorithms running on NVIDIA GPUs in the cloud help provide pioneering mapping and visibility.

AI on Safety, Cost and Time

Using drones instead of people in high places can improve safety. The U.S. Occupational Safety and Health Administration last year reported that 22 people were killed in roofing-related accidents in the U.S.

Inspecting roofs and solar panels with drone technology can improve that safety record. It can also save on cost: The traditional alternative to having people on rooftops to perform these inspections is using helicopters.

Customers of the DroneDeploy platform can follow a quickly created map to carry out a sequence of inspections with guidance from cameras fed into image recognition algorithms.

Using drones, customers can speed up inspections by 80 percent, according to the company.  

“In areas like oil, gas and energy, it’s about zero-downtime inspections of facilities for operations and safety, which is a huge value driver for these customers,” said Pilkington.

The post Hardhats and AI: Startup Navigates 3D Aerial Images for Inspections appeared first on The Official NVIDIA Blog.

Sand Safety: Startup’s Lifeguard AI Hits the Beach to Save Lives

A team in Israel is making a splash with AI.

It started when business school buddies Netanel Eliav and Adam Bismut went looking for a problem to solve that could change the world. The problem found them: Bismut visited the Dead Sea after a drowning and noticed a lack of tech for lifeguards, who scanned the area with age-old binoculars.

The two aspiring entrepreneurs — recent MBA graduates of Ben-Gurion University, in the country’s south — decided this was their problem to solve with AI.

“I have two little girls, and as a father, I know the feeling that parents have when their children are near the water,” said Eliav, the company’s CEO.

They founded Sightbit in 2018 with BGU classmates Gadi Kovler and Minna Shezaf to help lifeguards see dangerous conditions and prevent drownings.

The startup is seed funded by Cactus Capital, the venture arm of their alma mater.

Sightbit is now in pilot testing at Palmachim Beach, a popular escape for sunbathers and surfers in the Palmachim Kibbutz area along the Mediterranean Sea, south of Tel Aviv. The sand dune-lined destination, with its inviting, warm aquamarine waters, gets packed with thousands of daily summer visitors.

But it’s also a place known for deadly rip currents.

Danger Detectors

Sightbit has developed image detection to help spot dangers and aid lifeguards in their work. In collaboration with the Israel Nature and Parks Authority, the Beersheba-based startup has installed three cameras that feed data into a single NVIDIA Jetson AGX Xavier at the lifeguard towers at Palmachim Beach. NVIDIA Metropolis is deployed for video analytics.

The system of danger detectors enables lifeguards to keep tabs on a computer monitor that flags potential safety concerns while they scan the beach.

Sightbit has developed models based on convolutional neural networks and image detection to provide lifeguards views of potential dangers. Kovler, the company’s CTO, has trained the company’s danger detectors on tens of thousands of images, processed with NVIDIA GPUs in the cloud.

Training on the images wasn’t easy with sun glare on the ocean, weather conditions, crowds of people, and people partially submerged in the ocean, said Shezaf, the company’s CMO.

But Sightbit’s deep learning and proprietary algorithms have enabled it to identify children alone as well as clusters of people. This allows its system to flag children who have strayed from the pack.
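Sightbit's models and algorithms are proprietary, but the "child alone" logic described above can be sketched as a post-processing step over detector output: given labeled centroids from an upstream person detector (a hypothetical data format, with made-up threshold values), flag any child whose nearest adult is farther away than a threshold. A minimal sketch:

```python
import math

def flag_children_alone(detections, max_dist=50.0):
    """Flag children with no adult within max_dist pixels.

    detections: list of (label, x, y) centroids from an upstream
    person detector -- a hypothetical format for illustration.
    """
    adults = [(x, y) for label, x, y in detections if label == "adult"]
    flagged = []
    for label, x, y in detections:
        if label != "child":
            continue
        # Distance to the nearest adult; infinity if no adults in frame.
        nearest = min(
            (math.hypot(x - ax, y - ay) for ax, ay in adults),
            default=math.inf,
        )
        if nearest > max_dist:
            flagged.append((x, y))
    return flagged

# Example: one child near an adult, one child far from everyone.
dets = [("adult", 100, 100), ("child", 110, 105), ("child", 400, 300)]
print(flag_children_alone(dets))  # -> [(400, 300)]
```

A real system would work in ground coordinates rather than pixels and track identities across frames, but the flagging step reduces to a nearest-neighbor check like this one.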

Rip Current Recognition

The system also harnesses optical flow algorithms to detect dangerous rip currents, helping lifeguards keep people out of those zones. These algorithms make it possible to estimate the speed of every object in an image, using partial differential equations to calculate motion vectors for every pixel.
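Sightbit's implementation isn't disclosed, but the core idea behind optical flow, estimating how the scene moves between consecutive frames, can be illustrated with a toy block-matching estimator (production systems typically use dense variational methods such as Horn-Schunck or Farneback; the synthetic frames below are made up for the sketch):

```python
import numpy as np

def estimate_shift(frame_a, frame_b, max_shift=5):
    """Estimate the dominant (dy, dx) motion between two frames by
    exhaustively testing integer shifts and scoring pixel agreement.
    A toy stand-in for dense per-pixel optical flow."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(frame_a, dy, axis=0), dx, axis=1)
            err = np.mean((shifted - frame_b) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# Synthetic test: a texture that moves 2 px down and 1 px right.
rng = np.random.default_rng(0)
a = rng.random((32, 32))
b = np.roll(np.roll(a, 2, axis=0), 1, axis=1)
print(estimate_shift(a, b))  # -> (2, 1)
```

A rip-current detector would apply an estimate like this per region of the water, then flag areas where the recovered flow field points persistently seaward.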

Lifeguards can get updates on ocean conditions so when they start work they have a sense of hazards present that day.

“We spoke with many lifeguards. The lifeguard is trying to avoid the next accident. Many people go too deep and get caught in the rip currents,” said Eliav.

Video from the cameras at the lifeguard towers, processed on the single compact Jetson AGX Xavier running Metropolis, offers split-second inference for alerts, tracking, statistics and risk analysis in real time.

The Israel Nature and Parks Authority is planning to have a structure built on the beach to house more cameras for automated safety, according to Sightbit.

COVID-19 Calls 

Palmachim Beach lifeguards have a lot to watch, especially now that people are getting out of their homes for fresh air as the region reopens from COVID-19-related closures.

As part of Sightbit’s beach safety developments, the company had been training its network to spot how far apart people were to help gauge child safety.

This work also directly applies to monitoring social distancing and has attracted the attention of potential customers seeking ways to slow the spread of COVID-19. The Sightbit platform can provide them crowding alerts when a public area is overcrowded and proximity alerts for when individuals are too close to each other, said Shezaf.
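Shezaf doesn't detail how the alerts are computed, but crowding and proximity alerts of this kind reduce to simple checks over detected person positions. A hedged sketch, assuming positions have already been mapped to ground coordinates and using made-up threshold values:

```python
from itertools import combinations
import math

def crowding_alert(positions, max_people=4):
    """Alert when a monitored area holds more people than allowed."""
    return len(positions) > max_people

def proximity_alerts(positions, min_dist=2.0):
    """Return index pairs of people closer together than min_dist
    (e.g. meters, assuming ground-plane coordinates)."""
    return [
        (i, j)
        for (i, p), (j, q) in combinations(enumerate(positions), 2)
        if math.hypot(p[0] - q[0], p[1] - q[1]) < min_dist
    ]

people = [(0.0, 0.0), (1.0, 0.5), (10.0, 10.0)]
print(crowding_alert(people))    # -> False (3 people, limit 4)
print(proximity_alerts(people))  # -> [(0, 1)]
```

The pairwise check is quadratic in the number of detections, which is fine for a beach scene; denser crowds would call for a spatial index.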

The startup has put in extra hours to work with those interested in its tech to help monitor areas for ways to reduce the spread of the pathogen.

“If you want to change the world, you need to do something that is going to affect people immediately without any focus on profit,” said Eliav.

 

Sightbit is a member of NVIDIA Inception, a virtual accelerator program that helps startups in AI and data science get to market faster.

The post Sand Safety: Startup’s Lifeguard AI Hits the Beach to Save Lives appeared first on The Official NVIDIA Blog.

Heart of the Matter: AI Helps Doctors Navigate Pandemic

A month after it got FDA approval, a startup’s first product was saving lives on the front lines of the battle against COVID-19.

Caption Health develops software for ultrasound systems, called Caption AI. It uses deep learning to empower medical professionals, including those without prior ultrasound experience, to perform echocardiograms quickly and accurately.

The results are images of the heart often worthy of an expert sonographer that help doctors diagnose and treat critically ill patients.

The coronavirus pandemic provided plenty of opportunities to try out the first dozen systems. Two doctors who used the new tool shared their stories on the condition that they and their patients remain anonymous.

A 53-year-old diabetic woman with COVID-19 went into cardiac shock in a New York hospital. Without the images from Caption AI, it would have been difficult to clinch the diagnosis, said a doctor on the scene.

The system helped the physician identify heart problems in an 86-year-old man with the virus in the same hospital, helping doctors bring him back to health. It was another case among more than 200 in the facility that was effectively turned into a COVID-19 hospital in mid-March.

The Caption Health system made a tremendous impact for a staff spread thin, said the doctor. It would have been hard for a trained sonographer to keep up with the demand for heart exams, he added.

Heart Test Becomes Standard Procedure

Caption AI helped doctors in North Carolina determine that a 62-year-old man had COVID-19-related heart damage. Thanks, in part, to the ease of using the system, the hospital now performs echocardiograms for most patients with the virus.

At the height of the pandemic’s first wave, the hospital stationed ultrasound systems with Caption AI in COVID-19 wards. Rather than sending sonographers from unit to unit, the usual practice, staff stationed at the wards used the systems. The change reduced staff exposure to the virus and conserved precious protective gear.

Beyond the pandemic, the system will help hospitals provide urgent services while keeping a lid on rising costs, said a doctor at that hospital.

“AI-enabled machines will be the next big wave in taking care of patients wherever they are,” said Randy Martin, chief medical officer of Caption Health and emeritus professor of cardiology at Emory University.

Martin joined the startup about four years ago after meeting its founders, who shared expertise and passion for medicine and AI. Today their software “takes a user through 10 standard views of the heart, coaching them through some 90 fine movements experts make,” he said.

“We don’t intend to replace sonographers; we’re just expanding the use of portable ultrasound systems to the periphery for more early detection,” he added.

Coping with an Unexpected Demand Spike

In the early days of the pandemic, that expansion couldn’t come fast enough.

In late March, the startup exhausted supplies that included NVIDIA Quadro P3000 GPUs that ran its AI software. In the early days of the global shutdown, the startup reached out to its supply chain.

“We are experiencing overwhelming demand for our product,” the company’s CEO wrote, after placing orders for 100 GPUs with a distributor.

Caption Health has systems currently in use at 11 hospitals. It expects to deploy Caption AI at a number of additional sites in the coming weeks.

GPUs at the Heart of Automated Heart Tests

The startup currently integrates its software in a portable ultrasound from Terason. It intends to partner with more ultrasound makers in the future. And it advises partners to embed GPUs in their future ultrasound equipment.

The Quadro P3000 in Caption AI runs real-time inference tasks using deep convolutional neural networks. These networks guide operators in positioning the probe that captures images, then automatically choose the highest-quality heart images and interpret them to help doctors make informed decisions.

The NVIDIA GPU also freed up four CPU cores, making space to process other tasks on the system, such as providing a smooth user experience.

The startup trained its AI models on a database of 1 million echocardiograms from clinical partners. An early study in partnership with Northwestern Medicine and the Minneapolis Heart Institute showed Caption AI helped eight registered nurses with no prior ultrasound experience capture highly accurate images on a wide variety of patients.

Inception Program Gives Startup Traction

Caption Health, formerly called Bay Labs, was founded in 2015 in Brisbane, Calif. It received a $125,000 prize at a 2017 GTC competition for members of NVIDIA’s Inception program, which gives startups access to technology, expertise and markets.

“Being part of the Inception program has provided us with increased recognition in the field of deep learning, a platform to share our AI innovations with healthcare and deep learning communities, and phenomenal support getting NVIDIA GPUs into our supply chain so we could deliver Caption AI,” said Charles Cadieu, co-founder and president of Caption Health.

Now that its tool has been tested in a pandemic, Caption Health looks forward to opportunities to help save lives across many ailments. The company aims to ride a trend toward more portable systems that extend availability and lower costs of diagnostic imaging.

“We hope to see our technology used everywhere from big hospitals to rural villages to examine people for a wide range of medical conditions,” said Cadieu.

To learn more about Caption Health and other companies like it, watch the webinar on healthcare startups against COVID-19.

The post Heart of the Matter: AI Helps Doctors Navigate Pandemic appeared first on The Official NVIDIA Blog.

Taking AI to Market: NVIDIA and Arterys Bridge Gap Between Medical Researchers and Clinicians

Around the world, researchers in startups, academic institutions and online communities are developing AI models for healthcare. Getting these models from their hard drives and into clinical settings can be challenging, however.

Developers need feedback from healthcare practitioners on how their models can be optimized for the real world. So, San Francisco-based AI startup Arterys built a forum for these essential conversations between clinicians and researchers.

Called the Arterys Marketplace, and now integrated with the NVIDIA Clara Deploy SDK, the platform makes it easy for researchers to share medical imaging AI models with clinicians, who can try them on their own data.

“By integrating the NVIDIA Clara Deploy technology into our platform, anyone building an imaging AI workflow with the Clara SDK can take their pipeline online with a simple handoff to the Arterys team,” said Christian Ulstrup, product manager for Arterys Marketplace. “We’ve streamlined the process and are excited to make it easy for Clara developers to share their models.”

Researchers can submit medical imaging models in any stage of development — from AI tools for research use to apps with regulatory clearance. Once the model is posted on the public Marketplace site, anyone with an internet connection can test it by uploading a medical image through a web browser.

Models on Arterys Marketplace run on NVIDIA GPUs through Amazon Web Services for inference.

A member of both the NVIDIA Inception and AWS Activate programs, which collaborate to help startups get to market faster, Arterys was founded in 2011. The company builds clinical AI applications for medical imaging and launched the Arterys Marketplace at the RSNA 2019 medical conference.

It recently raised $28 million in funding to further develop the ecosystem of partners and clinical-grade AI solutions on its platform.

Several of the models now on the Arterys Marketplace are focused on COVID-19 screening from chest X-rays and CT images. Among them is a model jointly developed by NVIDIA’s medical imaging applied research team and clinicians and data scientists at the National Institutes of Health. Built in under three weeks using the NVIDIA Clara Train framework, the model can help researchers study the detection of COVID-19 from chest CT scans.

Building AI Pillar of the Community

While there’s been significant investment in developing AI models for healthcare in the last decade, the Arterys team found that it can still take years to get these tools into radiologists’ hands.

“There’s been a huge gap between the smart, passionate researchers building AI models for healthcare and the end users — radiologists and clinicians who can use these models in their workflow,” Ulstrup said. “We realized that no research institution, no startup was going to be able to do this alone.”

The Arterys Marketplace was created with simplicity in mind. Developers need only fill out a short form to submit an AI model for inclusion, and then can send the model to users as a URL — all for free.

For clinicians around the world, there’s no need to download and install an AI model. All that’s needed is an internet connection and a couple of medical images to upload for testing with the AI models. Users can choose whether or not their imaging data is shared with the researchers.

The images are analyzed with NVIDIA GPUs in the cloud, and results are emailed to the user within minutes. A Slack channel provides a forum for clinicians to provide feedback to researchers, so they can work together to improve the AI model.

“In healthcare, it can take years to get from an idea to seeing it implemented in clinical settings. We’re reducing that to weeks, if not days,” said Ulstrup. “It’s absurdly easy compared to what the process has been in the past.”

With a focus on open innovation and rapid iteration, Ulstrup says, the Arterys Marketplace aims to bring doctors into the product development cycle, helping researchers build better AI tools. By interacting with clinicians in different geographies, developers can improve their models’ ability to generalize across different medical equipment and imaging datasets.

Over a dozen AI models are on the Arterys Marketplace so far, with more than 300 developers, researchers, and startups joining the community discussion on Slack.

“Once models are hosted on the Arterys Marketplace, developers can send them to researchers anywhere in the world, who in turn can start dragging and dropping data in and getting results,” Ulstrup said. “We’re seeing discussion threads between researchers and clinicians on every continent, sharing screenshots and feedback — and then using that feedback to make the models even better.”

Check out the research-targeted AI COVID-19 Classification Pipeline developed by NVIDIA and NIH researchers on the Arterys Marketplace. To hear more from the Arterys team, register for the Startups4COVID webinar, taking place July 28.

The post Taking AI to Market: NVIDIA and Arterys Bridge Gap Between Medical Researchers and Clinicians appeared first on The Official NVIDIA Blog.