Intel® FPGAs

Reduce costs and grow revenue with integrated circuits that retrieve and classify data in real time. Use these accelerators as a low-latency AI inference solution for safer, more interactive experiences in autonomous vehicles, robotics, IoT, and data centers.

Common uses:

  • Real-time, low-latency inference

  • FPGA abstraction using the Deep Learning Acceleration Suite (see the sketch below)
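
Within the toolkit, FPGA inference is typically reached through the Inference Engine's device plugins, with the HETERO device providing CPU fallback for layers the FPGA does not support. The following is a minimal Python sketch under those assumptions: it presumes an OpenVINO release that still ships the FPGA plugin (for example, a 2020.x release) and uses placeholder model files model.xml / model.bin, so exact API details may differ in your version.

    # Minimal sketch: low-latency inference on an Intel FPGA with CPU fallback.
    # Assumes an OpenVINO release that still includes the FPGA plugin and
    # placeholder IR files model.xml / model.bin from the Model Optimizer.
    import numpy as np
    from openvino.inference_engine import IECore

    ie = IECore()

    # Read the Intermediate Representation (IR) of the model.
    net = ie.read_network(model="model.xml", weights="model.bin")

    # HETERO splits the graph: supported layers run on the FPGA, the rest on CPU.
    exec_net = ie.load_network(network=net, device_name="HETERO:FPGA,CPU")

    # Run a synchronous request with a dummy input shaped like the first input.
    input_name = next(iter(net.input_info))
    shape = net.input_info[input_name].input_data.shape
    result = exec_net.infer(inputs={input_name: np.zeros(shape, dtype=np.float32)})
    print({name: blob.shape for name, blob in result.items()})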

Works best for:

  • Smart city

  • Smart retail

  • Smart factory

Resources:

Supported hardware:

Supported operating systems:

  • Ubuntu 16.04.3 LTS (64-bit)

  • CentOS 7.4 (64-bit)

Supported Intel Distribution of OpenVINO toolkit component:
