Transcript
Page 1: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

MIPI CSI-2℠

CSI-2℠ Application for Vision and Sensor Fusion Systems

Richard Sproul – Cadence Design Systems, IP Architect

Page 2: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

Overview

•  The expanding demand for imaging- and vision-based systems in mobile, IoT and automotive products is pushing multi-camera and sensor fusion systems to find novel ways to gather and process multiple camera/sensor data streams while still fitting within the mobile interface.

•  The presentation highlights some of the key details and requirements of a multi-camera/sensor system with image processing.

2

Page 3: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

CSI-2 Application for Multi-Sensor Systems Multi-Camera Applications

•  Imaging applications are everywhere
•  Mobile Phone
   –  Selfie Picture in Picture
   –  Gesture Recognition
•  Video Games
   –  Gesture Recognition
•  Autonomous Driving
   –  Pedestrian Detection
   –  Signage Recognition
   –  Night Vision
   –  Parallel Parking!
•  In-Car Control
   –  Gesture Recognition

Page 4: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

4

More than the naked eye…

CSI-2 Application for Multi-Sensor Systems Machine Vision

Page 5: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

5

CSI-2 Application for Multi-Sensor Systems Camera Applications

•  Optimal pathway for multiple forward-looking advancements in imaging
   –  Key drivers: Health, Convenience, Security, Lifestyle, Efficiency
   –  High-performance pixel conduit needs met with C/D-PHY advancements
   –  Broad definitions and fuzzy range (i.e. Wearable: Near Body, On Body, In Body)
•  Explore possibilities of overlap between imaging and low-speed sensor requirements and solutions

•  Camera Control Interface (CCI/CCS) advancement considerations:
   –  Point-to-point and multi-drop configurations
   –  Energy consumed per Gb transferred
   –  Limit latency for VB & HB
   –  Precision timing & sync
   –  Independent transport: pixel data & control
   –  Channel integrity (error detection)
   –  FW upload (ISP, neural)

Page 6: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

•  Options for physical interface
   –  Pins, legacy, bandwidth

6

CSI-2 Application for Multi-Sensor Systems MIPI CSI-2 Interfaces

Page 7: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

•  Evolution of the CSI-2 interface

7

CSI-2 Application for Multi-Sensor Systems CSI-2 Generations

Page 8: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

8

CSI-2 Application for Multi-Sensor Systems CSI-2 Performance

Page 9: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

•  CSI-2 packets V1.x to V2.0

9

[Figure: CSI-2 packet structure – V1.x: transition between packets to the LP state on the PHY data lane (100 ns); V2.x: transition between packets using filler patterns]

CSI-2 Application for Multi-Sensor Systems CSI-2 Packet Structure

Page 10: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

•  Improve the effective bandwidth (see the sketch below)

10

[Chart: CSI-2 frame rate improvement V1.x to V2.x, 1920x1080 RAW12 – frame rate (fps) vs. bit rate (Mbps, 1000–2500), comparing FPS(V1.x) and FPS(V2.x)]

CSI-2 Application for Multi-Sensor Systems CSI-2 Packet Transmission
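As a rough illustration of where the gain comes from (not a figure from the presentation), the sketch below estimates the achievable frame rate for a 1920x1080 RAW12 stream with and without a per-packet LP transition. The lane count, header/checksum sizes and the 100 ns LP figure are illustrative assumptions, and frame blanking is ignored, so only the relative improvement is meaningful.

```python
# Rough frame-rate estimate for a 1920x1080 RAW12 stream over CSI-2,
# comparing per-packet LP transitions (V1.x style) with filler patterns
# that keep the lane busy between packets (V2.x style).
# Overheads are illustrative; frame blanking is ignored, so absolute
# numbers are optimistic and only the relative gain is meaningful.

def frame_rate(bit_rate_mbps, lanes=4, lp_gap_ns=100.0, use_lp_gaps=True):
    width, height, bits_per_pixel = 1920, 1080, 12
    header_bits, footer_bits = 32, 16            # long-packet header + checksum
    line_bits = width * bits_per_pixel + header_bits + footer_bits

    link_bps = bit_rate_mbps * 1e6 * lanes       # aggregate HS bit rate
    line_time = line_bits / link_bps             # seconds to send one line packet
    if use_lp_gaps:
        line_time += lp_gap_ns * 1e-9            # LP transition after every packet
    return 1.0 / (line_time * height)

for rate in (1000, 1500, 2000, 2500):
    with_gaps = frame_rate(rate, use_lp_gaps=True)
    with_fill = frame_rate(rate, use_lp_gaps=False)
    print(f"{rate} Mbps/lane: {with_gaps:6.0f} fps with LP gaps, "
          f"{with_fill:6.0f} fps with fillers")
```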

Page 11: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

•  Automotive applications for driver assistance – external systems and in-car control
   –  Objects: high resolution
   –  Night image and IR
   –  In-car gesture
   –  People detection: medium resolution
   –  Road signage: medium resolution
   –  Parking assistance

CSI-2 Application for Multi-Sensor Systems Advanced Driver-Assistance System

Page 12: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

CSI-2 Application for Multi-Sensor Systems • Application in a Multi-Camera Platform

12

Automotive AV Reference Subsystem

[Block diagram: Automotive AV reference subsystem – three MIPI CSI-2 RX controllers, each behind a MIPI D-PHY, feed Image/Vision DSPs and a Sensor DSP (each with DMA, I-RAM, D-RAM and I/D caches) over the system interconnect; an Audio DSP with SoundWire and I2S audio; a display path with Pixel2AXI, colour conversion, video scaler and HDMI PHY; on-chip system SRAM and a 1300 MT/s DDR3 controller with DDR PHY to a 64b DDR3 SODIMM; peripherals including USB 2/3 device (USB PHY), Ethernet MAC (BR PHY), SD/SDIO/eMMC, QSPI, UART, I2C, I2S, GPIO and timers on 32b AHB/APB buses via AXI2AHB and AHB2APB bridges]

Page 13: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

13

CSI-2 Application for Multi-Sensor Systems Sensors Everywhere

Page 14: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

•  The data does not have to be images…
•  LiDAR
   –  Resolution is low (IR RAW data, typically 64 pixels high, though much more horizontally)
   –  Range is limited; typical LiDARs see well to about 70 metres
   –  Refresh rates tend to be slower, at around 10 fps
•  RADAR
   –  Long range – cruise control, brake assist
•  Ultrasonic
   –  Short-range parking assist
   –  Self parking ☺
•  Protocol support with user-defined data types to transfer the bytes (see the sketch below)

14

CSI-2 Application for Multi-Sensor Systems CSI-2 for Sensors
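As a hedged illustration of that last bullet (not the presenter's code), the snippet below packs a block of non-image sensor bytes into a CSI-2 long-packet payload using one of the user-defined byte-based data types (0x30–0x37) on a chosen virtual channel; the header ECC and payload checksum are stubbed out here.

```python
# Minimal sketch: wrap non-image sensor bytes (e.g. one LiDAR scan line)
# in a CSI-2 long packet using a user-defined byte-based data type
# (0x30-0x37) on a chosen virtual channel. ECC and payload CRC are
# stubbed to zero; a checksum sketch appears later in this deck.

USER_DEFINED_DT_BASE = 0x30                       # user-defined byte-based data 1..8

def build_long_packet(vc: int, user_dt_index: int, payload: bytes) -> bytes:
    dt = USER_DEFINED_DT_BASE + user_dt_index
    di = ((vc & 0x3) << 6) | (dt & 0x3F)          # data identifier: VC[7:6] | DT[5:0]
    wc = len(payload)                             # word count = payload length in bytes
    header = bytes([di, wc & 0xFF, (wc >> 8) & 0xFF, 0x00])  # ECC stubbed to 0
    footer = bytes(2)                             # payload CRC stubbed to 0
    return header + payload + footer

# Example: one 64-sample LiDAR line on virtual channel 3, user-defined type 1
lidar_line = bytes(range(64))
packet = build_long_packet(vc=3, user_dt_index=0, payload=lidar_line)
print(len(packet), hex(packet[0]))                # 70 bytes, DI = 0xf0
```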

Page 15: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

CSI-2 Application for Multi-Sensor Systems CSI-2 Example Video Frame

•  Bandwidth on CSI-2 V1.1 – 4 lanes at 6 Gbps
   –  At 30 fps that gives 200 Mbit to use per frame period
   –  3 HD cameras, RGB888: 3 × 1920 × 1200 × 24 = 165.888 Mbit
   –  Adding 100 ns gaps (150 bit clocks) between packets: 3 × (1920 × 24) + 3 × 150 = 138,690 bits per line time
   –  Headroom remains for an embedded data line with image-processed data (clusters, edges)
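A minimal back-of-the-envelope check of that budget, assuming only the slide's own figures (4 lanes at 1.5 Gbps, 30 fps, three 1920x1200 RGB888 cameras, 100 ns gaps of 150 bit clocks) and ignoring packet headers and blanking:

```python
# Rough CSI-2 v1.1 bandwidth budget for three HD RGB888 cameras on one
# 4-lane, 6 Gbps link at 30 fps. Packet headers/footers and frame
# blanking are ignored to keep the arithmetic close to the slide.

LINK_BPS = 6e9                        # 4 lanes x 1.5 Gbps
FPS = 30
BUDGET = LINK_BPS / FPS               # ~200 Mbit available per frame period

WIDTH, HEIGHT, BPP = 1920, 1200, 24   # RGB888 HD camera
CAMERAS = 3

frame_bits = WIDTH * HEIGHT * BPP * CAMERAS      # 165.888 Mbit
gap_bits = 150 * CAMERAS                         # 100 ns gap = 150 bit clocks per packet
line_bits = WIDTH * BPP * CAMERAS + gap_bits     # bits per line time, all cameras

print(f"budget per frame period: {BUDGET / 1e6:.1f} Mbit")
print(f"pixel data per frame   : {frame_bits / 1e6:.3f} Mbit")
print(f"headroom               : {(BUDGET - frame_bits) / 1e6:.1f} Mbit "
      f"(embedded data lines, protocol overhead)")
print(f"bits per line time     : {line_bits}")
```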

Page 16: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

16

[Block diagram: CSI-2 sensor fusion example – image sensors and sensor DSPs with CSI-2 TX controllers and video/data buffers drive pixel and data streams over D-PHY (PPI) links into a centralized ECU for infotainment or ADAS; inside the ECU's CSI-2 domain, CSI-2 RX controllers feed video and data buffers serviced by Vision DSPs and a Sensor DSP; the ECU also integrates application processors, PCIe, USB, DSI display, DDR, eMMC and I2S/SoundWire audio with their respective PHYs]

CSI-2 Application for Multi-Sensor Systems CSI-2 Sensor Fusion Example

•  Sensor fusion ADAS system topology
   –  Merge the data from image and other sensors
   –  Pre-process the data inline for the application

Page 17: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

•  Image processing and the application
   –  Application processing will need to perform the ADAS system and sensor analysis

17

CSI-2 Application for Multi-Sensor Systems CSI-2 Sensor Fusion Example

Page 18: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

18

CSI-2 Application for Multi-Sensor Systems Filling the Channels

[Block diagram: filling the channels – pixel streams 0–7 from a pixel processor/application and a sensor processor/application pass through CSI-2 host and slave controllers connected over PPI to D-PHY master/slave pairs (MCNN/MFEN and SCNN/SFEN) on the DP/DN lines; a separate CSI-2 host/slave controller pair carries sensor control, with multiple sensors supplying DATA]

Page 19: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

19

CSI-2 Application for Multi-Sensor Systems Physical Interface

Page 20: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

CSI-2 Application for Multi-Sensor Systems CSI-2 Example Video Frame

•  Bandwidth on CSI-2 V1.1 – 4 lanes at 6 Gbps
   –  Use LS/LE to keep synchronisation and sequence
   –  Use the virtual channel to identify the sensor
   –  Use the data types (RAW, RGB, YUV and user-defined)
   –  Use the short packet sync events
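As a rough illustration of how a receiver uses those fields (an assumption-laden sketch, not the presentation's code), the snippet below splits a CSI-2 v1.x packet header into its virtual channel and data type so each packet can be routed to the right sensor stream:

```python
# Minimal sketch: split a CSI-2 v1.x packet header into its fields.
# Header layout (D-PHY): Data Identifier (VC[7:6] | DT[5:0]),
# 16-bit Word Count (little-endian), 8-bit ECC (not checked here).

SYNC_SHORT_PACKETS = {
    0x00: "Frame Start", 0x01: "Frame End",
    0x02: "Line Start",  0x03: "Line End",
}

def decode_header(di: int, wc_lsb: int, wc_msb: int, ecc: int):
    vc = (di >> 6) & 0x3           # virtual channel: which sensor
    dt = di & 0x3F                 # data type: RAW/RGB/YUV/user-defined/sync
    wc = wc_lsb | (wc_msb << 8)    # payload bytes (for short packets this
                                   # field carries the frame/line number)
    if dt <= 0x0F:
        kind = SYNC_SHORT_PACKETS.get(dt, "generic short packet")
    elif 0x30 <= dt <= 0x37:
        kind = "user-defined byte-based data (e.g. LiDAR/RADAR bytes)"
    else:
        kind = "pixel data (RAW/RGB/YUV family)"
    return vc, dt, wc, kind

# Example: Frame Start short packet on virtual channel 2
print(decode_header(0x80, 0x01, 0x00, 0x00))
```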

Page 21: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

•  Functional safety considerations
   –  D-PHY BER, RX error detection
   –  Packet header ECC
   –  Payload CRC
   –  Short packet (SP) sync sequences, counting values
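For the payload CRC item, the sketch below shows an illustrative bit-serial CRC-16 of the kind used for the CSI-2 long-packet checksum (generator polynomial x^16 + x^12 + x^5 + 1, seed 0xFFFF); the bit ordering is assumed LSB-first here and should be confirmed against the CSI-2 specification before use.

```python
# Illustrative bit-serial CRC-16 (poly x^16 + x^12 + x^5 + 1, seed 0xFFFF)
# of the kind used for the CSI-2 long-packet payload checksum.
# LSB-first bit ordering (reflected polynomial 0x8408) is assumed here;
# confirm against the CSI-2 specification before relying on it.

def payload_crc16(payload: bytes) -> int:
    crc = 0xFFFF
    for byte in payload:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0x8408     # reflected form of 0x1021
            else:
                crc >>= 1
    return crc

# Receiver-side check: recompute over the payload and compare with the
# 16-bit checksum carried in the packet footer.
def payload_ok(payload: bytes, received_crc: int) -> bool:
    return payload_crc16(payload) == received_crc
```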

21

CSI-2 Application for Multi-Sensor Systems Functional Safety in CSI-2 ADAS

Page 22: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

•  Pitfalls of interleaved streams (see the sketch below)

22

CSI-2 Application for Multi-Sensor Systems CSI-2 Interleaving Data
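To make the pitfalls concrete, the sketch below (an illustrative assumption, not material from the presentation) tracks per-virtual-channel frame state while packets from several sensors arrive interleaved, and flags the classic failures: a Frame End with no matching Frame Start, or payload arriving outside an open frame.

```python
# Illustrative demultiplexer for interleaved CSI-2 virtual channels.
# Assumes packets are already parsed into (vc, dt, payload) tuples;
# short-packet data types: 0x00 Frame Start, 0x01 Frame End.

FRAME_START, FRAME_END = 0x00, 0x01

def demux(packets):
    frames = {vc: [] for vc in range(4)}   # completed frames per channel
    open_frames = {}                       # vc -> payloads of the frame in flight
    for vc, dt, payload in packets:
        if dt == FRAME_START:
            open_frames[vc] = []
        elif dt == FRAME_END:
            if vc not in open_frames:
                raise ValueError(f"VC{vc}: Frame End without Frame Start")
            frames[vc].append(open_frames.pop(vc))
        elif dt > 0x0F:                    # long packet carrying payload
            if vc not in open_frames:
                raise ValueError(f"VC{vc}: payload outside an open frame")
            open_frames[vc].append(payload)
    return frames

# Two interleaved channels: lines from VC0 and VC1 arrive mixed together,
# but each channel's frame reassembles cleanly (0x2A = RAW8).
pkts = [(0, 0x00, None), (1, 0x00, None),
        (0, 0x2A, b"line0-vc0"), (1, 0x2A, b"line0-vc1"),
        (0, 0x01, None), (1, 0x01, None)]
print(demux(pkts))
```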

Page 23: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

23

CSI-2 Application for Multi-Sensor Systems CSI-2 Interleaving

Page 24: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

CSI-2 Beyond Mobile

24

Page 25: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

CSI-2 Beyond Mobile

•  System architecture considerations for CNN applications:
   –  Assist
   –  Co-pilot
   –  Automated
•  Optimal platform architecture for the CNN engines
   –  Central processing (+SW development; lacks scalability/modularity; cost may not align with entry-level cars)
   –  Distributed processing (plug-and-play, scalable, each camera unit enhances capability; more complex system)
   –  Algo engine: CPU / GPU / DSP / FPGA
•  Overall risks and uncertainty:
   –  Market, product, execution, timing, regulators, infrastructure

25

Page 26: MIPI DevCon 2016: MIPI CSI-2 Application for Vision and Sensor Fusion Systems

CSI-2 Beyond Mobile

•  What can technology do for us?
•  Imaging: digital photography vs. vision
   –  Scene capture, object capture & track, modeling & measurement
•  Perception and decision-making using real-time streaming image data
   –  Camera, RADAR, LiDAR, sonar (varying detection capabilities vs. cost)
•  Performance vs. robustness – consequence of error
   –  Camera position, lighting, environmental factors, required accuracy for object detection

26