
Seeing LiDAR technology in a new light

Robert Bielby | September 2022

At Micron Ventures, we are always on the hunt for companies that are truly innovative in their application of Artificial Intelligence (AI) and Machine Learning (ML) across various problem domains and applications. The sensor stack in the industrial and automotive sectors is one of the areas we constantly monitor, and we have been specifically on the lookout for a low-cost, software-defined 3D sensing solution.

Spatial Awareness & 3D Sensing

Spatial and environmental awareness is the critical bottleneck preventing the widespread adoption of autonomy in areas such as driving and factories. Traditional sensing systems use various technologies: cameras, ultrasound, radar, and LiDAR, but all have inherent limitations. Camera-based sensors do well during the day but perform poorly at night because cameras need sufficient light for object detection. Conversely, radar sensors perceive objects' shapes better than cameras in dark environments but are typically limited in resolution. Light detection and ranging (LiDAR) technology has thus far shown promise in addressing these limitations, with better performance in dark environments for object detection and 3D sensing, but it is still limited in image quality and resolution in rainy or foggy weather.

For this reason, several autonomous vehicle (AV) manufacturers point to sensor fusion as the solution to the inherent limitations of each sensor: taking different technologies and building a composite view around the vehicle through the combination of ultrasound, radar, LiDAR, and cameras. This fusion of multiple sensor technologies helps eliminate the weaknesses of any single sensor type and is proving to be the best way forward. There has been a flurry of investment in LiDAR sensors, with around $6 billion in venture funding (source: PitchBook) over the last decade, which even resulted in eight publicly traded LiDAR companies (Luminar, Velodyne, Aeva, Ouster, AEye, Quanergy Systems, Cepton Technologies, and Innoviz Technologies) in the 2020/21 SPAC boom. However, LiDAR remains challenged to become a practical part of the in-vehicle sensor fusion portfolio due to limited reliability, high cost, and constrained accuracy and resolution.

The time is right for a new approach. While incumbent LiDAR players are focused on fine-tuning hardware, from the light source to the conversion and post-processing of the data, to achieve scale, Red Leader is taking a novel software-defined approach that offers increased flexibility:

  • The ability to add complex and proprietary software algorithms in a dedicated System on a Chip (SoC) – driving higher resolution without compromising on power requirements
  • A hardware-agnostic solution that can integrate across LiDAR sensors (mechanical vs. solid state, 905 nm vs. 1550 nm source laser wavelength, and now pulsed vs. coherent)

A Leap Forward with Red Leader

When you go to the dentist and they show you the x-ray films of your teeth, they might point to a certain area and tell you, “There’s a cavity here.” But all that is visible to you is a blurry stain.

That’s similar to what untrained eyes see when analyzing the data from a traditional LiDAR. Just as it would be more helpful to see a dental x-ray clearly, it would also be more helpful to see the image produced by LiDAR clearly.

That is what happens with Red Leader technology.

Red Leader’s unique software-based approach to 3D sensing in a predominantly hardware-centric LiDAR industry enhances performance and unlocks interoperability across manufacturers via an open licensing model. The company has developed a proprietary AI algorithm and computational digital signal processing (DSP) software solution for light-wave processing that unlocks scalability in 3D imaging. Unlike traditional LiDARs, which are unresponsive to scenes and use a rudimentary single beam for signal generation, Red Leader transmits a data-rich, customized waveform with multiple encoded beams without waiting for return signals and calculates distances by decoding the reflected beams from the incoming signal, much like code-division multiple access (CDMA) in wireless communications. By leveraging signal processing techniques to modulate laser signals across direct pulsed time-of-flight (ToF), indirect continuous ToF, and coherent frequency-modulated continuous wave (FMCW) schemes, Red Leader enables transmission of multiple laser beams into a single receiver system and achieves 100x resolution enhancement compared to current LiDARs.
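
To make the decoding idea concrete, here is a minimal NumPy sketch of code-division ranging. It is purely illustrative: the codes, sample rate, delays, and noise levels are invented assumptions, not Red Leader’s actual waveform design or algorithm. Each channel fires a unique pseudo-random code, and the receiver recovers each channel’s round-trip delay by correlating the combined return signal against that channel’s code.

```python
# Illustrative sketch of code-division (CDMA-style) ranging.
# All parameters are assumptions for demonstration, not Red Leader's design.
import numpy as np

C = 3.0e8            # speed of light, m/s
SAMPLE_RATE = 1.0e9  # 1 GS/s receiver sampling (assumed)
CODE_LEN = 512       # chips per channel code
N_CHANNELS = 4

rng = np.random.default_rng(0)
codes = rng.choice([-1.0, 1.0], size=(N_CHANNELS, CODE_LEN))  # unique per channel

# Simulate one received signal containing every channel's echo at its own delay.
true_delays = np.array([120, 345, 610, 900])          # in samples (assumed targets)
rx = np.zeros(2048)
for code, d in zip(codes, true_delays):
    rx[d:d + CODE_LEN] += 0.5 * code                  # attenuated echo
rx += rng.normal(scale=0.3, size=rx.size)             # receiver noise

# Decode: matched-filter (cross-correlate) the return against each channel's code.
for ch, code in enumerate(codes):
    corr = np.correlate(rx, code, mode="valid")       # lag k = delay of k samples
    delay = int(np.argmax(corr))
    distance_m = C * (delay / SAMPLE_RATE) / 2.0      # round trip -> one-way range
    print(f"channel {ch}: delay {delay} samples, range {distance_m:.2f} m")
```

Because each echo is identified by its code rather than by when it was fired, many beams can share a single receiver, which is the property described above.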

Red Leader achieves spatial awareness by combining high-resolution sensing (20M points/sec compared to 1M points/sec) with edge compute (<5 ms latency for automation loops), and it is environmentally robust: its 3D sensor extends LiDAR capabilities to all weather and lighting environments by unlocking radar-like ranges and camera-like resolutions not feasible today. In the image below, an industry-standard 16-channel LiDAR system is unable to generate a clear point cloud (the image created through LiDAR), but when equipped with Red Leader’s 3D sensing software, the same LiDAR system can clearly see objects and humans without compromising on range or cost. Traditional LiDARs are also prone to reduced range and resolution at scale from crosstalk (noise created by myriad disparate signals) as more and more machines are equipped with LiDAR sensors, but Red Leader’s 3D sensing solution resolves this problem as well, since every channel in its transmit beam uses a unique signal.
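
The same correlation logic illustrates the crosstalk point. In the hypothetical sketch below (again with invented codes and amplitudes, not Red Leader’s implementation), a nearby sensor transmitting a different code adds energy to the receiver, but because that code is nearly orthogonal to ours, its energy spreads out in the matched filter instead of forming a false range peak.

```python
# Sketch of crosstalk rejection via unique per-channel codes (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
CODE_LEN = 512
our_code = rng.choice([-1.0, 1.0], size=CODE_LEN)
foreign_code = rng.choice([-1.0, 1.0], size=CODE_LEN)   # another sensor nearby

rx = np.zeros(2048)
rx[300:300 + CODE_LEN] += 0.5 * our_code                # our true echo at lag 300
rx[100:100 + CODE_LEN] += 1.0 * foreign_code            # strong interfering signal

corr = np.correlate(rx, our_code, mode="valid")         # matched filter for our code
print("estimated delay:", int(np.argmax(corr)))         # still ~300 samples
print("peak-to-interference ratio:",
      corr[300] / np.abs(np.delete(corr, 300)).max())   # true peak dominates
```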

Comparison of a 16-channel LiDAR with Red Leader software versus the same 16-channel LiDAR without it. Image from Red Leader | Red Leader Demo Video

Taking LiDAR to Multiple Market Segments

Red Leader is currently targeting autonomous mobile robots (AMRs) and automated guided vehicles (AGVs) as its beachhead within the robotics market. From there, it plans to drive adoption and realize economies of scale, leveraging learnings and network effects to expand into the broader high-performance autonomous vehicle and mass-scale consumer electronics (AR/VR headsets, drones, and smartphones) markets. By incorporating an activation-based business model in which joint development manufacturers (JDMs) or contract manufacturers (CMs) license the open hardware reference designs for mass production, the company is unlocking a scale motion currently infeasible for hardware-focused LiDAR manufacturers.

Since their last financing round co-led by Micron Ventures, Jake Hillard and Rebecca Wong, the co-founders of Red Leader, have done a great job of attracting top talent from industry leaders and garnering notable investors and industry advisors to help scale their novel full-stack 3D sensing hardware-software solution. By investing in Red Leader at the Series A stage, Micron is not only supporting Red Leader in optimizing its edge computations but also positioning itself as a leading participant in the nascent but large and strategic 3D sensing market within the broader industrial, automotive, and consumer electronics industries.

To learn more about Red Leader, check out what their CEO had to say at Computex 2022 and visit the Red Leader website.

To learn more about Micron Ventures and our portfolio companies, visit here.

Robert Bielby

Sr. Director, Automotive Systems Architecture, EBU

Robert Bielby, senior director of Automotive Systems Architecture and Segment Marketing, is responsible for the strategy, marketing and product definition for Micron’s Automotive Division business group. Before joining Micron, Robert spent more than 30 years in systems, semiconductor and solutions businesses holding various engineering and executive roles at Kodak, Altera, LSI Logic, and Xilinx. Robert brings a wealth of experience at the system level in architecture, strategy, vertical marketing and product planning. Robert has authored multiple articles on broad industry topics and holds more than 40 patents in the areas of channel coding, digital signal processing, and programmable logic devices.