Linley Autonomous Hardware Conference 2017
Focusing on hardware design for autonomous vehicles and deep learning
Held on April 6, 2017
Hyatt Regency Hotel, Santa Clara, CA
Agenda: April 6, 2017
Technology Requirements for Autonomous Vehicles
This presentation gives an overview of the development of advanced driver assistance systems (ADAS) and autonomous vehicles, including the levels of autonomy and our forecast for each level. It will then provide background on key technologies, such as deep learning and vision processing, that are required to create autonomous vehicles. The presentation will conclude with a look at the hardware configurations needed to achieve these functions, including an overview of the solutions announced by Intel, Mobileye, Nvidia, and others. There will be Q&A and a panel discussion following this keynote.
|9:50am-10:10am||Break - Sponsored by Synopsys - Premier Sponsor|
|10:10am-12:10pm||Session 1: Automotive SoC Design|
ADAS SoCs must meet high standards for safety, security, and reliability that depend on reliable interaction between a chip's internal blocks and its interfaces to a vehicle's electronic controls. This session, moderated by Linley Group senior analyst Mike Demler, will discuss technologies for manufacturing automotive-grade SoCs, as well as the critical IP that forms the backbone between processor cores and the connections to external sensors.
Automotive Rising – Semiconductors Merge into the Fast Lane
Automotive semiconductors are defining and designing the electric and autonomous vehicle landscape. The vehicle experience is driven by emerging SoC trends across various subsystems that must meet automotive-grade requirements for compute, power, connectivity, content, security, and safety. Overlaying these are the regulatory and extreme reliability needs of ADAS, including functional safety and quality flow. This presentation will share insight into key automotive applications and discuss the GLOBALFOUNDRIES products and solutions that differentiate them for customers and the industry.
Autonomous Driving with the MIPI Camera and Sensor Interfaces
SoCs for ADAS and self-driving cars incorporate numerous interfaces for functions such as camera, radar, lidar, and sensors. It is vital for such interfaces to meet the new stringent automotive standards as well as the low-power and high-performance requirements that designers seek. The MIPI interfaces for cameras (CSI-2) and other sensors (I3C) play an essential role in enabling ADAS SoCs for autonomous driving. This presentation explains how the MIPI specifications are implemented in automotive SoCs and why.
Intelligent Interconnect for Autonomous Vehicle SoCs
The proliferation of autonomous features in today's cars is pushing the boundaries of complexity and scalability in automotive SoCs. Interactions among heterogeneous CPU cores, clusters, vision processors, and storage are extremely complex, especially when coherency and functional safety become part of the equation. This complexity makes it almost impossible to build interconnects by hand and thus creates a need for automated tools that leverage advanced machine learning to build correct-by-construction interconnect designs that are not prone to human error.
Implementing Coherent Machine Learning SoCs That Meet ISO 26262 Requirements
Autonomous vehicle systems typically combine massive computational throughput with hardware-based functional safety features to meet ISO 26262 requirements. But starting today, developers of autonomous hardware SoCs can use a highly flexible and scalable cache-coherent interconnect with integrated functional safety features to implement heterogeneous multicore machine-learning systems. This new version of Arteris cache-coherent interconnect IP allows efficient integration of custom processing elements for machine learning while optionally providing integrated interconnect hardware duplication and data-protection features for functional safety. There will be Q&A and a panel discussion featuring the above speakers.
|12:10pm-1:30pm||Lunch - Sponsored by Synopsys - Premier Sponsor|
|1:30pm-1:45pm||Session 2: Autonomous Benchmarks|
The industry needs new benchmarks to better gauge the performance of deep-learning and vision-processing accelerators. This session discusses a standards-driven effort to develop such benchmarks.
Evaluating Processor Architectures for ADAS
Semiconductor and IP vendors are driving to get their products adopted into the rapidly growing ADAS market. But how are these compute technologies being evaluated? Identifying the potential compute performance of an ADAS architecture on paper is impractical: running real-world scenarios on these architectures requires optimal utilization of the available compute resources, and optimal use of ADAS processors requires intimate knowledge of their architectures. This presentation will describe details of a portable benchmarking methodology used to evaluate an ADAS architecture's real-world behavior. There will be a brief Q&A following this presentation.
|1:45pm-2:55pm||Session 3: Deep Learning and Vision Processing Accelerators|
Deep learning and vision processing are two critical technologies for autonomous systems. These systems can be divided into two logical portions: the "eyes" (vision processing) and the "brain" (deep learning). This session, moderated by Linley Group principal analyst Linley Gwennap, discusses and compares accelerators for these functions from leading IP vendors.
OpenVX: An Industry-Standard Computer Vision API for Autonomous Hardware
Future autonomous hardware platforms will use diverse vision-processing hardware such as DSPs, GPUs, CPUs, and proprietary application-specific hardware. OpenVX is an industry-standard computer vision API designed for efficient implementation on a variety of embedded platforms. Cadence has developed an implementation of the OpenVX API that runs on the Tensilica Vision DSP products. This talk will provide an overview of the API and its benefits, describe the Cadence implementation, and explain the performance and time-to-market advantages it provides for autonomous hardware platforms. The talk will also summarize planned OpenVX extensions for neural networks and safety-critical applications.
Utilization of a Vision Platform Optimized For Deep Learning
The automotive industry is seeing tremendous growth in the vision applications that will lead the way to autonomous vehicles. Given the complexity of these systems, Tier-1 suppliers, OEMs, and the entire ecosystem are working with artificial intelligence and deep learning to identify objects, determine free space for vehicles, and plan vehicle movements. CEVA's vision platform includes hardware IP, a software ecosystem, and development tools to help bridge the gap from R&D into production. This presentation will discuss how this platform can help implement low-power ADAS solutions.
Deep Learning Requirements for Autonomous Vehicles
Deep-learning techniques for embedded vision are enabling cars to 'see' their surroundings and have become a critical component in the push toward fully autonomous vehicles. The early use of deep learning for object detection (e.g., pedestrian detection and collision avoidance) is evolving toward scene segmentation, where every pixel of a high-resolution video stream must be identified. This presentation will discuss the current and next-generation requirements for ADAS vision applications, including the need for deep-learning accelerators.
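The scale of the per-pixel scene-segmentation workload mentioned above can be sketched with a quick calculation. The 1080p/30fps figures below are illustrative assumptions, not numbers from the talk; production ADAS systems may use higher resolutions and multiple cameras.

```python
# Back-of-the-envelope throughput for per-pixel scene segmentation,
# assuming a hypothetical single 1080p camera at 30 frames per second.
width, height, fps = 1920, 1080, 30

pixels_per_frame = width * height              # 2,073,600 pixels
classifications_per_second = pixels_per_frame * fps

print(f"{classifications_per_second:,} pixel classifications per second")
# → 62,208,000 pixel classifications per second
```

Even this modest configuration demands tens of millions of pixel classifications per second, which is why the abstract points to dedicated deep-learning accelerators rather than general-purpose CPUs.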
|2:55pm-3:15pm||Break - Sponsored by Synopsys - Premier Sponsor|
|3:15pm-4:05pm||Session 3: Deep Learning and Vision Processing Accelerators (cont.)|
Essential Elements for an Automotive Image Signal Processor (ISP)
Today's ADAS and in-vehicle infotainment (IVI) systems are putting increased demands on image signal processor (ISP) technology, including requirements for multi-camera support, advanced high-dynamic-range (HDR) capability, and hardware-based functional safety (FuSa) features. This presentation will highlight the essential components required in an advanced automotive ISP and their applicability to computer vision and human display applications. There will be Q&A and a panel discussion featuring the above speakers.
|4:05pm-5:00pm||Special Panel: Self-Driving Cars — Clear Sailing or a Bumpy Road Ahead?|
This special session will feature a diverse group of automotive-industry experts who will share their insights on the challenges of developing and deploying autonomous vehicles. Linley Group senior analyst Mike Demler will moderate this panel discussion, which will include representatives from an automobile manufacturer, a sensor supplier, and a software developer.
There will be Q&A featuring the above panelists.
|5:00pm-6:30pm||Reception and Exhibits - Sponsored by Synopsys - Premier Sponsor|