Google adds new features and capabilities to TensorFlow for AI developers

At the TensorFlow Developer Summit, Google rolled out several improvements to its machine learning development framework, TensorFlow.

TensorFlow now offers an 'eager execution' programming model for Python developers, which evaluates operations immediately rather than building a graph first. Developers will be able to use the same code they write for eager execution to generate and execute computational graphs with the high-level Estimator API.
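
As a rough illustration, here is a minimal sketch of eager execution in a TF 1.x release of that era; the tensors are illustrative, and in slightly older releases the enable call still lived under tf.contrib.eager:

```python
import tensorflow as tf

# Turn on eager execution before building any ops (TF 1.x-era call;
# earlier releases exposed it as tf.contrib.eager.enable_eager_execution).
tf.enable_eager_execution()

x = tf.constant([[2.0, 3.0]])
w = tf.constant([[1.0], [4.0]])

# The result is computed immediately; no tf.Session or explicit graph is needed.
print(tf.matmul(x, w))  # tf.Tensor([[14.]], shape=(1, 1), dtype=float32)
```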

Estimator models can now be run on multiple GPUs on a single machine, letting developers scale training with minimal code changes.
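
A minimal sketch of what multi-GPU Estimator training might look like, assuming the distribution strategy API from the TF 1.x releases of that era (tf.contrib.distribute.MirroredStrategy); the toy data, feature column, and model are illustrative:

```python
import numpy as np
import tensorflow as tf

# Toy data for illustration only.
features = {"x": np.random.rand(100, 4).astype(np.float32)}
labels = np.random.randint(0, 3, size=100)

train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x=features, y=labels, batch_size=16, num_epochs=None, shuffle=True)

# MirroredStrategy replicates the model across the GPUs on one machine;
# in releases of that era it lived under tf.contrib.distribute.
strategy = tf.contrib.distribute.MirroredStrategy()
config = tf.estimator.RunConfig(train_distribute=strategy)

estimator = tf.estimator.DNNClassifier(
    feature_columns=[tf.feature_column.numeric_column("x", shape=[4])],
    hidden_units=[32, 16],
    n_classes=3,
    config=config)

estimator.train(input_fn=train_input_fn, steps=100)
```

The only multi-GPU-specific code is the strategy and the RunConfig; the input function and model definition stay the same as in the single-device case.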

Google has also unveiled a new library called TensorFlow Hub, which helps developers share and reuse machine learning models. The library hosts modules that are pre-trained on large datasets and ready to use in applications, and a model published to TensorFlow Hub can be reused and fine-tuned for similar tasks.
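
A minimal sketch of reusing a pre-trained module from TensorFlow Hub, assuming the TF 1.x hub.Module API; the text-embedding module handle is one of the publicly hosted examples on tfhub.dev:

```python
import tensorflow as tf
import tensorflow_hub as hub

# Download a pre-trained sentence-embedding module from tfhub.dev.
embed = hub.Module("https://tfhub.dev/google/nnlm-en-dim128/1")

sentences = tf.constant(["TensorFlow Hub makes model reuse easy."])
embeddings = embed(sentences)

with tf.Session() as sess:
    # Hub text modules rely on lookup tables, so initialize those too.
    sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
    print(sess.run(embeddings).shape)  # (1, 128)
```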

Another addition is an interactive graphical debugger, which makes debugging models easier by letting developers inspect the internal nodes of a computation graph in real time.
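
A minimal sketch of attaching a session to the TensorBoard debugger plugin, assuming the tf_debug wrapper from TF 1.x; the tiny graph and the port number are illustrative:

```python
import tensorflow as tf
from tensorflow.python import debug as tf_debug

a = tf.constant([1.0, 2.0])
b = tf.constant([3.0, 4.0])
c = a * b

sess = tf.Session()
# Wrap the session so every run is streamed to a TensorBoard instance
# started with something like:
#   tensorboard --logdir /tmp/logs --debugger_port 6064
sess = tf_debug.TensorBoardDebugWrapperSession(sess, "localhost:6064")
print(sess.run(c))  # node values can now be inspected in the Debugger tab
```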

For an end-to-end solution for building machine learning systems, Google added TensorFlow Model Analysis to TensorFlow Extended (TFX). TFX already includes TensorFlow Transform, Estimators, and TensorFlow Serving. With Model Analysis added, TFX now helps developers prepare data, then train, validate, and deploy TensorFlow models in production.

TensorFlow now includes support for new third-party technologies. On the hardware side, it works with the Raspberry Pi miniature computer and Nvidia's TensorRT. TensorRT is a library for optimizing trained AI models so that they run faster on Nvidia GPUs.
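
A minimal sketch of the TensorRT integration, assuming the tf.contrib.tensorrt module shipped with the TF 1.x releases of that era; the file paths, output node name, and precision setting are illustrative placeholders:

```python
import tensorflow as tf
import tensorflow.contrib.tensorrt as trt

# Load a previously frozen GraphDef (path is a placeholder).
with tf.gfile.GFile("frozen_model.pb", "rb") as f:
    frozen_graph_def = tf.GraphDef()
    frozen_graph_def.ParseFromString(f.read())

# Replace supported subgraphs with TensorRT-optimized ops.
trt_graph_def = trt.create_inference_graph(
    input_graph_def=frozen_graph_def,
    outputs=["logits"],                  # names of the model's output nodes
    max_batch_size=8,
    max_workspace_size_bytes=1 << 30,
    precision_mode="FP16")               # lower precision runs faster on recent GPUs

with tf.gfile.GFile("frozen_model_trt.pb", "wb") as f:
    f.write(trt_graph_def.SerializeToString())
```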

On the software side, TensorFlow now supports two new programming languages: JavaScript and Swift. JavaScript support is significant because it opens a path to AI models that run directly in the browser. Swift support, expected to be available later this month, will help developers who build Apple applications.

TensorFlow Lite, the lightweight solution for running machine learning models on mobile and edge devices, has also been updated. Its core interpreter has been reduced to only 75 KB in size, and on-device inference is now up to three times faster.
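
A minimal sketch of converting a graph to the TensorFlow Lite flat-buffer format, assuming the tf.contrib.lite.toco_convert helper from the TF 1.x releases of that era; the tiny variable-free graph and output file name are illustrative:

```python
import tensorflow as tf

# A tiny, variable-free graph so it can be converted directly (illustrative).
img = tf.placeholder(tf.float32, shape=(1, 28, 28, 1), name="img")
out = tf.identity(img * 2.0 + 1.0, name="out")

with tf.Session() as sess:
    # Convert the graph to a .tflite flat buffer for the on-device interpreter.
    tflite_model = tf.contrib.lite.toco_convert(sess.graph_def, [img], [out])

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```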

Also read: Top 7 open-source project trends to look out for in 2018  

For further details, visit the TensorFlow blog.
