DarwinAI reaches strategic collaboration with Lockheed Martin
EP&T Magazine
‘Explainability’ attempts to illuminate how neural networks reach their decisions
DarwinAI, the “explainable AI company” based in Waterloo, Ont., has entered into a strategic collaboration with global aerospace leader Lockheed Martin, aimed at improving Lockheed Martin customers’ understanding of AI solutions.
Explainable AI or “explainability” attempts to illuminate how neural networks – complex constructions that mimic the human brain – reach their decisions. The lack of understanding around AI’s decision-making process has hampered the widespread adoption of AI.
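To make the idea concrete, a minimal sketch of one common explainability technique is gradient-based attribution (saliency): for a given prediction, the gradient of the output with respect to each input feature indicates how strongly that feature influenced the decision. This toy example is illustrative only; the weights are assumed, and it is not DarwinAI's GenSynth Explain technology.

```python
import numpy as np

# Toy "network": a single logistic neuron over 3 input features.
# Weights are assumed for illustration.
w = np.array([2.0, -1.0, 0.5])
b = 0.1

def predict(x):
    # Sigmoid output of the neuron
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

def saliency(x):
    # Gradient of the output w.r.t. the input:
    # for a sigmoid unit, d(y)/d(x) = y * (1 - y) * w
    y = predict(x)
    return y * (1.0 - y) * w

x = np.array([1.0, 0.5, -0.2])
scores = saliency(x)
# The feature with the largest |gradient| influenced this prediction most.
top_feature = int(np.argmax(np.abs(scores)))
print(top_feature)  # feature 0, which carries the largest weight magnitude
```

Real explainability platforms apply far more sophisticated methods to deep networks, but the principle is the same: attribute a model's decision back to the inputs (or internal factors) that drove it.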
Improving neural network efficiencies
In response to this industry-wide impasse, DarwinAI created an explainability platform for deep learning development powered by its proprietary technology, GenSynth Explain. In addition to improving neural network efficiency, the insights the platform generates can dramatically reduce the time it takes to produce robust, accurate models.
“Explainability is a critical challenge in our industry. Understanding how a neural network makes its decisions is important in constructing robust AI solutions that our customers can trust,” said Lee Ritholtz, director and chief architect of applied artificial intelligence at Lockheed Martin. “We look forward to working with the DarwinAI team to identify projects across our enterprise where their explainability technology can be applied.”
“Negotiating AI’s black box problem in a practical, actionable manner is a key focus for us this year,” said Sheldon Fernandez, CEO, DarwinAI. “Our collaboration with a leader in the aerospace industry such as Lockheed Martin underscores the importance of trustworthy AI solutions. We look forward to this important collaboration.”