New Technique Speeds up Deep-Learning Inference on TensorFlow by 2x
Researchers at North Carolina State University recently presented a paper at the International Conference on Supercomputing (ICS) on their new technique, "deep reuse" (DR), that can speed up inference time for deep-learning neural networks running on TensorFlow by up to 2x, with almost no loss of accuracy.
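The core idea of deep reuse is to detect similar activation vectors at inference time and evaluate a layer once per group of similar inputs rather than once per input. The sketch below illustrates that reuse pattern in plain Python, using coarse quantization as a stand-in for the clustering the paper performs; the function names and bucketing scheme are illustrative, not taken from the paper.

```python
def quantize_key(vec, step=0.5):
    """Map a vector to a coarse bucket so near-identical inputs collide."""
    return tuple(round(v / step) for v in vec)

def dense_layer(vec, weights):
    """Plain dense layer: one dot product per output neuron (column)."""
    return [sum(v * w for v, w in zip(vec, col)) for col in weights]

def dense_layer_with_reuse(batch, weights, cache):
    """Evaluate a batch, reusing the result whenever two inputs share a bucket."""
    outputs = []
    for vec in batch:
        key = quantize_key(vec)
        if key not in cache:            # compute only for unseen buckets
            cache[key] = dense_layer(vec, weights)
        outputs.append(cache[key])
    return outputs

# Two columns of length 3: a layer mapping 3 inputs to 2 outputs.
weights = [[0.2, -0.1, 0.4], [0.5, 0.3, -0.2]]
# The first two inputs are nearly identical and fall into the same bucket.
batch = [[1.0, 2.0, 3.0], [1.01, 2.0, 3.0], [4.0, 5.0, 6.0]]

cache = {}
outs = dense_layer_with_reuse(batch, weights, cache)
```

Here three inputs cost only two layer evaluations; the savings grow with the redundancy in the activations, which is what the researchers exploit.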
-
Predicting the Future, Amazon Forecast Reaches General Availability
In a recent blog post, Amazon announced the general availability (GA) of Amazon Forecast, a fully managed time-series forecasting service. Amazon Forecast applies deep learning across multiple datasets and algorithms to make predictions in areas such as product demand, travel demand, financial planning, SAP and Oracle supply-chain planning, and cloud-computing usage.
-
University Research Teams Open-Source Natural Adversarial Image Dataset for Computer-Vision AI
Research teams from three universities recently released ImageNet-A, a dataset of natural adversarial images: real-world images that are misclassified by image-recognition AI. When the dataset is used as a test set, several state-of-the-art pre-trained models achieve an accuracy of less than 3%.
-
Microsoft Open-Sources TensorWatch AI Debugging Tool
Microsoft Research open-sourced TensorWatch, their debugging tool for AI and deep learning. TensorWatch supports PyTorch as well as TensorFlow eager tensors, and allows developers to interactively debug training jobs in real time via Jupyter notebooks, or to build their own custom UIs in Python.
-
Baidu Open-Sources ERNIE 2.0, Beats BERT in Natural Language Processing Tasks
In a recent blog post, Baidu, the Chinese search engine and e-commerce giant, announced their latest open-source natural language understanding framework, ERNIE 2.0. They also shared recent test results in which ERNIE 2.0 achieved state-of-the-art (SOTA) performance, outperforming existing frameworks, including Google’s BERT and XLNet, on 16 NLP tasks in both Chinese and English.
-
The First AI to Beat Pros in 6-Player Poker, Developed by Facebook and Carnegie Mellon
Facebook AI Research’s Noam Brown and Carnegie Mellon professor Tuomas Sandholm recently announced Pluribus, the first artificial-intelligence program able to beat human professionals in six-player no-limit Texas hold’em poker. In recent years, computers have progressively beaten humans at checkers, chess, Go, and the TV quiz show Jeopardy!, but poker poses additional challenges around information asymmetry and bluffing.
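Pluribus builds on the counterfactual-regret-minimization (CFR) family of algorithms. The sketch below shows regret matching, the update rule at CFR's core, on rock-paper-scissors; the real system applies this over an enormous game tree with sampling, abstraction, and real-time search, so treat this as an illustration of the rule only. All names are illustrative.

```python
ACTIONS = 3  # 0 = rock, 1 = paper, 2 = scissors
PAYOFF = [[0, -1, 1], [1, 0, -1], [-1, 1, 0]]  # row player's payoff matrix

def strategy_from_regrets(regrets):
    """Play each action in proportion to its positive cumulative regret."""
    positives = [max(r, 0.0) for r in regrets]
    total = sum(positives)
    if total == 0:
        return [1.0 / ACTIONS] * ACTIONS  # no regret yet: play uniformly
    return [p / total for p in positives]

def train(iterations, opponent_strategy):
    """Run regret matching against a fixed mixed strategy; return the
    average strategy, which is what converges in CFR-style methods."""
    regrets = [0.0] * ACTIONS
    strategy_sum = [0.0] * ACTIONS
    for _ in range(iterations):
        strat = strategy_from_regrets(regrets)
        for a in range(ACTIONS):
            strategy_sum[a] += strat[a]
        # expected payoff of each action against the opponent's mixed strategy
        values = [sum(PAYOFF[a][b] * opponent_strategy[b] for b in range(ACTIONS))
                  for a in range(ACTIONS)]
        played_value = sum(strat[a] * values[a] for a in range(ACTIONS))
        for a in range(ACTIONS):
            regrets[a] += values[a] - played_value  # regret for not playing a
    total = sum(strategy_sum)
    return [s / total for s in strategy_sum]

# Against an opponent who over-plays rock, the learner converges toward paper.
avg = train(10000, opponent_strategy=[0.6, 0.2, 0.2])
```

Against a rational (also-learning) opponent the same rule drives both players toward an equilibrium, which is the property poker bots rely on.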
-
Researchers Develop Technique for Reducing Deep-Learning Model Sizes for Internet of Things
Researchers from Arm Limited and Princeton University have developed a technique that produces deep-learning computer-vision models for internet-of-things (IoT) hardware systems with as little as 2KB of RAM. By using Bayesian optimization and network pruning, the team is able to reduce the size of image recognition models while still achieving state-of-the-art accuracy.
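The paper pairs Bayesian optimization (to choose network configurations) with pruning to shrink models. The sketch below shows magnitude-based pruning, a common pruning scheme; the researchers' exact criterion may differ, and the function name and threshold logic here are illustrative. Zeroed weights can then be stored sparsely, which is where the memory savings for IoT devices come from.

```python
def prune_by_magnitude(weights, sparsity):
    """Zero out (roughly) the smallest-magnitude fraction `sparsity`
    of the weights; ties at the threshold are also zeroed."""
    flat = sorted(abs(w) for w in weights)
    cutoff_index = int(len(flat) * sparsity)
    threshold = flat[cutoff_index - 1] if cutoff_index > 0 else -1.0
    return [0.0 if abs(w) <= threshold else w for w in weights]

weights = [0.01, -0.6, 0.03, 0.9, -0.02, 0.5, 0.04, -0.8]
pruned = prune_by_magnitude(weights, sparsity=0.5)  # drop half the weights
```

In practice pruning is interleaved with fine-tuning so the remaining weights can compensate, which is how near state-of-the-art accuracy is retained.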
-
Google Adds New Integrations for the What-If Tool on Their Cloud AI Platform
In a recent blog post, Google announced a new integration of the What-If Tool, allowing data scientists to analyze models on their AI Platform, a code-based data-science development environment. Customers can now use the What-If Tool with their XGBoost and scikit-learn models deployed on the AI Platform.
-
Google Releases Post-Training Integer Quantization for TensorFlow Lite
Google announced new tooling for their TensorFlow Lite deep-learning framework that reduces model size and inference latency. The tool converts a trained model's weights from floating-point representation to 8-bit signed integers, which reduces the model's memory requirements and allows it to run on hardware without floating-point accelerators, without sacrificing model quality.
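The float-to-int8 mapping can be sketched as an affine quantization with a per-tensor scale and zero point, as below. TensorFlow Lite's actual converter is more involved (per-axis weight scales, calibrated activation ranges), so this is a simplified illustration with made-up helper names.

```python
def quantize_int8(values):
    """Map floats to int8 with an affine (scale, zero-point) transform."""
    lo, hi = min(values), max(values)
    scale = (hi - lo) / 255.0 or 1.0          # int8 spans 256 levels
    zero_point = round(-128 - lo / scale)     # integer that represents 0.0
    q = [max(-128, min(127, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats from the int8 representation."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.0, -0.25, 0.0, 0.5, 1.0]
q, scale, zp = quantize_int8(weights)
restored = dequantize(q, scale, zp)
```

Each weight now occupies one byte instead of four, and the round-trip error is bounded by the quantization step, which is why accuracy loss is typically negligible.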
-
Google Releases TensorFlow.Text Library for Natural Language Processing
Google released TensorFlow.Text, a new text-processing library for their TensorFlow deep-learning platform. The library allows several common text pre-processing activities, such as tokenization, to be handled within the TensorFlow graph computation system, improving the consistency and portability of deep-learning models for natural-language processing.
-
Facebook Open-Sources Deep-Learning Recommendation Model DLRM
Facebook AI Research announced the open-source release of a deep-learning recommendation model, DLRM, that achieves state-of-the-art accuracy in generating personalized recommendations. The code is available on GitHub, and includes versions for the PyTorch and Caffe2 frameworks.
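DLRM's published architecture embeds each sparse (categorical) feature, processes dense features with a bottom MLP, and then takes pairwise dot products between all of the resulting vectors before a top MLP produces the prediction. The sketch below shows that dot-product interaction step in plain Python; the embedding tables, feature names, and sizes are toy values, not from the released code.

```python
import itertools

# Toy embedding tables: each categorical value maps to a small learned vector.
EMBEDDINGS = {
    "user_country": {"us": [0.1, 0.3], "de": [0.4, 0.2]},
    "item_category": {"books": [0.5, 0.1], "games": [0.2, 0.6]},
}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def interact(dense_vector, sparse_features):
    """Embed sparse features, then take all pairwise dot products
    among the dense vector and the embedding vectors."""
    vectors = [dense_vector] + [EMBEDDINGS[f][v] for f, v in sparse_features.items()]
    return [dot(vectors[i], vectors[j])
            for i, j in itertools.combinations(range(len(vectors)), 2)]

# One dense vector plus two embedded features -> 3 pairwise interactions.
features = interact([0.7, 0.9], {"user_country": "us", "item_category": "books"})
```

These interaction terms capture second-order feature crossings (e.g. country × category), which is central to recommendation quality in DLRM-style models.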
-
Moving Embodied AI forward, Facebook Open-Sources AI Habitat
In a recent blog post, Facebook announced that it has open-sourced AI Habitat, an artificial-intelligence (AI) simulation platform designed to train embodied agents, such as virtual robots. Using this technology, robots can learn how to grab an object from an adjacent room or assist a visually impaired person in navigating an unfamiliar transit system.
-
MIT Debuts Gen, a Julia-Based Language for Artificial Intelligence
In a recent paper, MIT researchers introduced Gen, a general-purpose probabilistic programming language built on Julia that aims to let users express models and create inference algorithms using high-level programming constructs.
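The workflow such a language supports — write a generative model, then apply a generic inference algorithm to recover latent variables from observations — can be sketched in a few lines. The example below uses Python rather than Gen's Julia, and importance sampling with the prior as proposal; all names are illustrative and this is not Gen's API.

```python
import random

def model_prior():
    """Latent variable: the unknown bias of a coin, uniform on [0, 1]."""
    return random.random()

def likelihood(bias, flips):
    """Probability of the observed flips (1 = heads) given the bias."""
    p = 1.0
    for f in flips:
        p *= bias if f == 1 else (1.0 - bias)
    return p

def infer_bias(flips, num_particles=20000):
    """Posterior mean of the bias via importance sampling from the prior:
    sample candidates, weight each by how well it explains the data."""
    samples = [model_prior() for _ in range(num_particles)]
    weights = [likelihood(s, flips) for s in samples]
    total = sum(weights)
    return sum(s * w for s, w in zip(samples, weights)) / total

random.seed(0)
estimate = infer_bias([1, 1, 1, 0, 1, 1, 1, 1, 0, 1])  # 8 heads in 10 flips
```

The point of a language like Gen is that the model and the inference algorithm are decoupled: the same model could be paired with MCMC or variational inference without being rewritten.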
-
AWS Enhances Deep Learning AMI, AI Services SageMaker Ground Truth, and Rekognition
Amazon Web Services (AWS) announced updates to their Deep Learning virtual machine image, as well as improvements to their AI services SageMaker Ground Truth and Rekognition.
-
Amazon Personalize Is Now Generally Available, Bringing ML to Customers
First announced during AWS re:Invent last November, Amazon Personalize is now generally available to all AWS customers. With this service, developers can add custom machine-learning models to their applications, including ones for personalized product recommendations, search results, and direct marketing, even without much machine-learning experience.