Electronic Products & Technology

EU’s Tulipp project blooms for embedded image processing and vision applications

EP&T Magazine   

Electronics | Embedded Systems | Optoelectronics | Engineering | embedded image processing | vision applications

Delivers comprehensive reference platform for vision-based system designers comprising full development kit and ‘real-world’ use cases

The Tulipp (Towards Ubiquitous Low-power Image Processing Platforms) Consortium, Palaiseau, France, has announced a highly successful conclusion to the EU’s three-year project. Launched in January 2016, the Tulipp project set out to improve the development of high-performance, energy-efficient systems for the growing range of complex, vision-based image processing applications. The project was funded with nearly €4 million from Horizon 2020, the European Union’s biggest research and innovation program to date.

Medical X-ray image before (left) and after (right) processing to remove sensor noise.

The conclusion of the Tulipp project sees the release of a comprehensive reference platform for vision-based embedded system designers, enabling computer vision product designers to readily address the combined design constraints of low power, low latency, high performance and real-time image processing. The Tulipp reference platform includes a full development kit, comprising an FPGA-based embedded multicore computing board, a parallel real-time operating system and a development tool chain with guidelines, coupled with ‘real-world’ Use Cases covering diverse applications such as medical X-ray imaging, driver assistance and autonomous drones with obstacle avoidance. The complete Tulipp ecosystem was demonstrated earlier in the year to vision-based system designers in a series of hands-on tutorials.

“The Tulipp project has achieved all of its objectives,” said Philippe Millet of Thales and Tulipp’s project coordinator. “By taking a diverse range of application domains as the basis for defining a common reference processing platform that captures the commonality of real-time, high-performance image processing and vision applications, it has successfully addressed the fundamental challenges facing today’s embedded vision-based system designers.”

Compliant with the PC/104 embedded processor board standard

Developed by Sundance Multiprocessor Technology, each instance of the Tulipp processing platform measures 40mm x 50mm and is compliant with the PC/104 embedded processor board standard. The hardware platform utilizes the powerful multicore Xilinx Zynq UltraScale+ MPSoC, which combines Xilinx FinFET+ FPGA fabric with a quad-core ARM Cortex-A53 CPU, an ARM Mali-400 MP2 Graphics Processing Unit (GPU), and a real-time processing unit (RPU) containing a dual-core ARM Cortex-R5 32-bit real-time processor based on the ARMv7-R architecture. A separate expansion module (VITA 57.1 FMC) allows application-specific boards with different flavours of input and output interfaces to be created while keeping the interfaces with the processing module consistent.

The Tulipp project team at the HIPEAC 2019 event.

Coupled with the Tulipp hardware platform is a parallel, low-latency embedded real-time operating system developed by Hipperos specifically to manage complex multi-threaded embedded applications in a predictable manner. Precise real-time co-ordination sustains a high frame rate without missing deadlines or dropping data. Additionally, to facilitate the efficient development of image processing applications on the Tulipp hardware, and to help vision-based system designers understand the impact of their functional mapping and scheduling choices on the available resources, the Tulipp reference platform has been extended with performance analysis and power measurement features developed by Norges Teknisk-Naturvitenskapelige Universitet (NTNU) and Technische Universität Dresden (TUD) and implemented in the Tulipp STHEM toolchain.
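The article does not describe the STHEM interfaces themselves, so the following is only a minimal, generic C++ sketch of the kind of per-frame measurement such tooling supports: time a processing stage over a run of frames and compare the average latency against the 33 ms budget of a 30Hz camera. The processFrame() routine is a hypothetical stand-in, not Tulipp code.

```cpp
// Generic per-frame latency measurement sketch (not the STHEM API).
#include <chrono>
#include <cstdio>
#include <vector>

// Hypothetical image-processing stage; stands in for a real kernel.
static void processFrame(std::vector<unsigned char>& frame) {
    for (auto& px : frame) px = static_cast<unsigned char>(255 - px); // dummy work
}

int main() {
    std::vector<unsigned char> frame(640 * 480, 128); // one VGA greyscale frame
    const int numFrames = 100;
    double totalMs = 0.0;

    for (int i = 0; i < numFrames; ++i) {
        auto t0 = std::chrono::steady_clock::now();
        processFrame(frame);
        auto t1 = std::chrono::steady_clock::now();
        totalMs += std::chrono::duration<double, std::milli>(t1 - t0).count();
    }

    // Compare the measured average against the 33 ms budget of a 30Hz camera.
    std::printf("average latency: %.2f ms per frame\n", totalMs / numFrames);
    return 0;
}
```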

Also, the insights of the Tulipp Consortium’s experts have been captured in a set of guidelines, consisting of practical advice, best-practice approaches and recommended implementation methods, to help vision-based system designers select the optimal implementation strategy for their own applications. The guidelines will be collected in a Tulipp book, to be published by Springer by the end of 2019 and supported by endorsements from the growing ecosystem of developers currently testing the concept.

Defining a common reference processing platform

To further demonstrate the applicability of defining a common reference processing platform, comprising the hardware, operating system and a programming environment that captures the commonality of real-time, high-performance image processing and vision applications, Tulipp has also developed three ‘real-world’ Use Cases in distinctly diverse application domains: medical X-ray imaging, automotive Advanced Driver Assistance Systems (ADAS) and Unmanned Aerial Vehicles (UAVs).

The Tulipp Starter Kit with Lynsyn PDM on a PC/104 carrier board.

Tulipp’s medical X-ray imaging Use Case demonstrates advanced image enhancement algorithms for X-ray images running at high frame rates. It focuses on improving the performance of X-ray imaging Mobile C-Arms, which provide an internal view of a patient’s body in real time during an operation. Better image quality delivers greater surgeon efficiency and accuracy, smaller incisions, faster patient recovery, lower nosocomial disease risk and a 75% reduction in the radiation doses to which patients and staff are exposed.
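The specific enhancement algorithms used in the Use Case are not named in the article; as an illustration only, the sketch below applies OpenCV’s bilateral filter, one common edge-preserving way to suppress sensor noise in a greyscale X-ray frame, matching the before/after comparison pictured above. File names and filter parameters are assumptions.

```cpp
// Illustrative X-ray noise suppression sketch (not the Tulipp algorithm).
#include <opencv2/imgcodecs.hpp>
#include <opencv2/imgproc.hpp>
#include <cstdio>

int main(int argc, char** argv) {
    if (argc < 3) { std::printf("usage: denoise <in.png> <out.png>\n"); return 1; }

    // Load the raw X-ray frame as a single-channel greyscale image.
    cv::Mat raw = cv::imread(argv[1], cv::IMREAD_GRAYSCALE);
    if (raw.empty()) { std::printf("could not read %s\n", argv[1]); return 1; }

    // Edge-preserving smoothing: the small spatial window keeps fine anatomical
    // detail while the range term suppresses pixel-level sensor noise.
    cv::Mat denoised;
    cv::bilateralFilter(raw, denoised, /*d=*/5, /*sigmaColor=*/35, /*sigmaSpace=*/5);

    cv::imwrite(argv[2], denoised);
    return 0;
}
```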

Small, energy-efficient Electronic Control Unit

ADAS adoption depends on the implementation of vision systems, or on combinations of vision and radar, and the algorithms must be capable of integration into a small, energy-efficient Electronic Control Unit (ECU). An ADAS algorithm should be able to process a video stream with a frame size of 640×480 at a full 30Hz, or at least at half that rate. The Tulipp ADAS Use Case demonstrates pedestrian recognition in real time based on the Viola-Jones algorithm. Using the Tulipp reference platform, the ADAS Use Case achieves a processing time of 66ms per frame, which means the algorithm meets the target of running on every second image when the camera runs at 30Hz.
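The Viola-Jones approach named here is available in OpenCV as a cascade classifier, so a minimal sketch of the same idea, including the every-second-frame scheduling implied by the 66ms processing time, might look as follows. The cascade file, camera index and detection parameters are assumptions, not the Tulipp implementation.

```cpp
// Minimal Viola-Jones-style pedestrian detection with frame skipping.
#include <opencv2/objdetect.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/videoio.hpp>
#include <cstdio>
#include <vector>

int main() {
    cv::CascadeClassifier detector;
    // Path is an assumption; point it at any Viola-Jones cascade available.
    if (!detector.load("haarcascade_fullbody.xml")) {
        std::printf("could not load cascade\n");
        return 1;
    }

    cv::VideoCapture cam(0);                        // 30Hz camera assumed
    cam.set(cv::CAP_PROP_FRAME_WIDTH, 640);
    cam.set(cv::CAP_PROP_FRAME_HEIGHT, 480);

    cv::Mat frame, grey;
    for (long n = 0; cam.read(frame); ++n) {
        if (n % 2 != 0) continue;                   // ~66 ms per detection: run on every second frame

        cv::cvtColor(frame, grey, cv::COLOR_BGR2GRAY);
        std::vector<cv::Rect> people;
        detector.detectMultiScale(grey, people, 1.1, 3, 0, cv::Size(48, 96));

        std::printf("frame %ld: %zu pedestrian(s)\n", n, people.size());
    }
    return 0;
}
```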

Tulipp’s UAV Use Case demonstrates a real-time obstacle avoidance system for UAVs based on a stereo camera setup, with the cameras oriented in the direction of flight. Although such drones are often described as autonomous, most current systems are still remotely piloted by humans. The Use Case uses disparity maps, computed from the camera images, to locate obstacles in the flight path and to automatically steer the UAV around them, a key step towards fully autonomous drones.
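The article describes the disparity-map principle but not the implementation, so the following is a hedged sketch of that idea using OpenCV’s block-matching stereo correspondence: compute a disparity map from a rectified stereo pair and flag anything close in the central corridor ahead of the UAV. Image names, the region of interest and the threshold are illustrative assumptions.

```cpp
// Disparity-based obstacle check sketch (not the Tulipp implementation).
#include <opencv2/core.hpp>
#include <opencv2/calib3d.hpp>
#include <opencv2/imgcodecs.hpp>
#include <cstdio>

int main() {
    // Rectified left/right frames from forward-facing stereo cameras (assumed files).
    cv::Mat left  = cv::imread("left.png",  cv::IMREAD_GRAYSCALE);
    cv::Mat right = cv::imread("right.png", cv::IMREAD_GRAYSCALE);
    if (left.empty() || right.empty()) { std::printf("missing input images\n"); return 1; }

    // Block matching: larger disparity means a closer object.
    cv::Ptr<cv::StereoBM> matcher = cv::StereoBM::create(/*numDisparities=*/64, /*blockSize=*/15);
    cv::Mat disparity16;
    matcher->compute(left, right, disparity16);     // fixed-point disparity (scaled by 16)

    // Look only at the central region of interest: the corridor ahead of the UAV.
    cv::Rect corridor(left.cols / 4, left.rows / 4, left.cols / 2, left.rows / 2);
    double minD, maxD;
    cv::minMaxLoc(disparity16(corridor), &minD, &maxD);

    // Illustrative threshold: a large maximum disparity means something is close
    // enough that the avoidance controller should steer around it.
    const double obstacleThreshold = 40 * 16;       // raw units (disparity * 16)
    std::puts(maxD > obstacleThreshold ? "obstacle ahead: steer" : "path clear");
    return 0;
}
```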

“As image processing and vision applications grow in complexity and diversity, and become increasingly embedded by their very nature, vision-based system designers need to know that they can simply and easily solve the design constraint challenges of low power, low latency, high performance and reliable real-time image processing that face them,” concluded Philippe Millet. “The EU’s Tulipp project has delivered just that. Moreover, the ecosystem of stakeholders that we have created along the way will ensure that it will continue to deliver in the future. Tulipp will truly leave a legacy.”
