MathWorks Announces MATLAB Integration with NVIDIA TensorRT to Accelerate Artificial Intelligence Applications

By Electronics Maker | March 27, 2018 | Electronics News
Speeds deep learning inference by 5x compared to TensorFlow on NVIDIA GPUs

Bangalore, 27 March 2018 – MathWorks today announced that MATLAB now offers NVIDIA TensorRT integration through GPU Coder. This helps engineers and scientists develop new AI and deep learning models in MATLAB with the performance and efficiency needed to meet the growing demands of data centers, embedded, and automotive applications.

MATLAB provides a complete workflow to rapidly train, validate, and deploy deep learning models. Engineers can use GPU resources without additional programming, so they can focus on their applications rather than performance tuning. The new integration of NVIDIA TensorRT with GPU Coder enables deep learning models developed in MATLAB to run on NVIDIA GPUs with high throughput and low latency. Internal benchmarks show that MATLAB-generated CUDA code combined with TensorRT can run deep learning inference on AlexNet 5x faster, and on VGG-16 1.25x faster, than TensorFlow.*
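As a rough sketch of the workflow described above, the following example wraps a pretrained network in an entry-point function and uses GPU Coder to generate CUDA code that targets TensorRT. It is based on the publicly documented GPU Coder API rather than on the announcement itself; the network choice (AlexNet), the function name alexnet_predict, and the input size are illustrative assumptions, and exact option names may vary by release.

  % alexnet_predict.m - entry-point function wrapping a pretrained network (illustrative)
  function out = alexnet_predict(in) %#codegen
      persistent net;
      if isempty(net)
          % Load the pretrained network once and reuse it across calls
          net = coder.loadDeepLearningNetwork('alexnet');
      end
      out = predict(net, in);
  end

  % Generate CUDA code that calls TensorRT for inference (run from the MATLAB command line)
  cfg = coder.gpuConfig('mex');                                  % build a MEX target for testing inside MATLAB
  cfg.TargetLang = 'C++';
  cfg.DeepLearningConfig = coder.DeepLearningConfig('tensorrt'); % target TensorRT instead of cuDNN
  % AlexNet expects 227x227x3 single-precision input images
  codegen -config cfg alexnet_predict -args {ones(227,227,3,'single')} -report

The generated MEX function can then be called from MATLAB just like the original function, which makes it straightforward to compare its output and timing against the pure-MATLAB version before deploying the same code to a data center or embedded target.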

“Rapidly evolving image, speech, sensor, and IoT technologies are driving teams to explore AI solutions with better performance and efficiency. In addition, deep learning models are becoming more complex. All of this puts immense pressure on engineers,” said David Rich, director, MathWorks. “Now, teams training deep learning models using MATLAB and NVIDIA GPUs can deploy real-time inference in any environment from the cloud to the data center to embedded edge devices.”

To learn more about MATLAB for deep learning, visit: mathworks.com/solutions/deep-learning.html

* All benchmarks were run with MATLAB R2018a and GPU Coder, TensorRT 3.0.1, TensorFlow 1.6.0, CUDA 9.0, and cuDNN 7 on an NVIDIA TITAN Xp GPU in a Linux PC with a 12-core Intel® Xeon® E5-1650 v3 CPU and 64 GB RAM.

Tags: Modeling, Simulation