AI, ML & Data Engineering Content on InfoQ
-
Microsoft Open-Sources Project Petridish for Deep-Learning Optimization
A team from Microsoft Research and Carnegie Mellon University has open-sourced Project Petridish, a neural architecture search algorithm that automatically builds deep-learning models that are optimized to satisfy a variety of constraints. Using Petridish, the team achieved state-of-the-art results on the CIFAR-10 benchmark with only 2.2M parameters and five GPU-days of search time.
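Petridish grows networks incrementally rather than searching from scratch. The toy sketch below (PyTorch; data, sizes, and candidates are all illustrative, and none of Petridish's actual search machinery is reproduced) shows the general flavor of such forward search: trial candidate layers and keep one only if it improves validation loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy data standing in for a real task (illustrative only).
torch.manual_seed(0)
X, y = torch.randn(512, 16), torch.randn(512, 1)
X_val, y_val = torch.randn(128, 16), torch.randn(128, 1)

def train_and_score(model, steps=200):
    """Train briefly, then return validation loss."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(steps):
        opt.zero_grad()
        F.mse_loss(model(X), y).backward()
        opt.step()
    with torch.no_grad():
        return F.mse_loss(model(X_val), y_val).item()

# Start from a minimal parent model, then trial candidate layers,
# keeping a candidate only if it lowers validation loss.
best_layers = [nn.Linear(16, 1)]
best_score = train_and_score(nn.Sequential(*best_layers))
for width in (8, 16, 32):
    candidate = [nn.Linear(16, width), nn.ReLU(), nn.Linear(width, 1)]
    score = train_and_score(nn.Sequential(*candidate))
    if score < best_score:
        best_score, best_layers = score, candidate
print(f"best validation MSE: {best_score:.4f}")
```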
-
Jenkins Creator Launches ML Startup in Continuous Risk-Based Testing
Jenkins creator Kohsuke Kawaguchi has started Launchable, a startup that uses machine learning to identify risk-based tests. Testing thought leader Wayne Ariola also writes about the need for a continuous testing approach, in which targeted risk-based tests help provide confidence for continuous delivery.
-
Compliance and the California Consumer Privacy Act - the Empire Strikes Back
On January 1, 2020, the California Consumer Privacy Act (CCPA) came into effect. Many companies have not yet complied with the law, and the long-term effects of the legislation remain unclear.
-
Google Open-Sources Reformer Efficient Deep-Learning Model
Researchers from Google AI recently open-sourced the Reformer, a more efficient version of the Transformer deep-learning model. Using locality-sensitive hashing (LSH) to approximate attention and reversible residual layers to reduce memory, the Reformer can handle text sequences of up to 1 million words while consuming only 16GB of memory on a single GPU accelerator.
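The LSH step can be illustrated in a few lines of NumPy. This is a minimal sketch of the angular hashing scheme Reformer uses, with all shapes illustrative; the real model adds multi-round hashing, bucket sorting and chunking, and shared query/key projections.

```python
import numpy as np

def lsh_buckets(x, n_buckets, rng):
    """Hash vectors into buckets via a random projection: each vector
    is assigned to the bucket of its nearest rotated axis (taking both
    signs, hence n_buckets // 2 projection columns)."""
    d = x.shape[-1]
    proj = rng.normal(size=(d, n_buckets // 2))
    rotated = x @ proj                                   # (seq, n_buckets/2)
    return np.argmax(np.concatenate([rotated, -rotated], axis=-1), axis=-1)

rng = np.random.default_rng(0)
qk = rng.normal(size=(1024, 64))          # shared query/key vectors
buckets = lsh_buckets(qk, n_buckets=16, rng=rng)
# Attention is then computed only among positions that share a bucket,
# cutting the cost from O(L^2) toward O(L log L).
print(np.bincount(buckets, minlength=16))
```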
-
The Distributed Data Mesh as a Solution to Centralized Data Monoliths
Instead of building large, centralized data platforms, corporations and data architects should create distributed data meshes.
-
Microsoft Open-Sources ONNX Acceleration for BERT AI Model
Microsoft's Azure Machine Learning team recently open-sourced their contribution to the ONNX Runtime library for improving the performance of the natural language processing (NLP) model BERT. With the optimizations, inference on the SQuAD benchmark ran 17x faster.
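The optimized model is consumed through the standard ONNX Runtime Python API. The sketch below assumes a BERT model already exported to ONNX; the file name and input names are illustrative and depend on how the export was done.

```python
import numpy as np
import onnxruntime as ort

# Load an exported BERT model (file name is a placeholder).
session = ort.InferenceSession("bert_squad.onnx",
                               providers=["CPUExecutionProvider"])

# BERT-style inputs; the exact names depend on the export.
batch = {
    "input_ids":      np.ones((1, 128), dtype=np.int64),
    "attention_mask": np.ones((1, 128), dtype=np.int64),
    "token_type_ids": np.zeros((1, 128), dtype=np.int64),
}
outputs = session.run(None, batch)  # e.g. start/end logits for SQuAD
print([o.shape for o in outputs])
```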
-
Apple Acquires Edge-Focused AI Startup Xnor.ai
Apple has acquired Xnor.ai, a Seattle-based startup that builds AI models that run on edge devices, for approximately $200 million.
-
QCon London - Keynotes & Workshops on Kubernetes, Apache Kafka, Microservices, Docker
QCon London is fast approaching. Join over 1,600 global software leaders this March 2-4. At the event, you will experience talks that describe how industry leaders drive innovation and change within their organizations; a focus on real-world experiences, patterns, and practices (not product pitches); and implementable ideas for your projects and your teams.
-
Uber's Synthetic Training Data Speeds Up Deep Learning by 9x
Uber AI Labs has developed an algorithm called Generative Teaching Networks (GTN) that produces synthetic training data on which neural networks train faster than on real data. Using this synthetic data, Uber sped up its neural architecture search (NAS) deep-learning optimization process by 9x.
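A toy sketch of the idea, in PyTorch: a generator emits a synthetic batch, a fresh learner takes one gradient step on it, and the generator is updated to minimize the learner's loss on real data, backpropagating through the learner's update. Everything here is illustrative (sizes, models, the single inner step); the actual GTN unrolls many inner steps and trains full networks.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
real_X = torch.randn(256, 8)
real_y = (real_X.sum(dim=1, keepdim=True) > 0).float()

# Generator maps noise to a synthetic (input, soft label) pair.
generator = torch.nn.Sequential(
    torch.nn.Linear(4, 32), torch.nn.ReLU(), torch.nn.Linear(32, 9))
gen_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)

for step in range(500):
    synth = generator(torch.randn(64, 4))
    syn_X, syn_y = synth[:, :8], torch.sigmoid(synth[:, 8:])

    # Fresh linear learner, handled functionally so the inner update
    # stays differentiable with respect to the synthetic data.
    W = torch.zeros(8, 1, requires_grad=True)
    inner_loss = F.binary_cross_entropy_with_logits(syn_X @ W, syn_y)
    (grad_W,) = torch.autograd.grad(inner_loss, W, create_graph=True)
    W_updated = W - 0.5 * grad_W  # one inner SGD step

    # Outer objective: how well the updated learner fits real data.
    outer_loss = F.binary_cross_entropy_with_logits(real_X @ W_updated, real_y)
    gen_opt.zero_grad()
    outer_loss.backward()
    gen_opt.step()

print(f"final outer loss: {outer_loss.item():.3f}")
```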
-
Stanford Researchers Publish AI Index 2019 Report
The Stanford Human-Centered Artificial Intelligence Institute published its AI Index 2019 Report. The 2019 report tracks three times the number of datasets as the previous year's report, and contains nearly 300 pages of data and graphs related to several aspects of AI, including research, technical performance, education, and societal considerations.
-
Deep Java Library: New Deep Learning Toolkit for Java Developers
Amazon has released Deep Java Library (DJL), an open-source library with Java APIs that simplifies training, testing, deploying, and making predictions with deep-learning models. DJL is framework-agnostic: it abstracts away commonly used deep-learning functions, using Java Native Access (JNA) on top of existing deep-learning frameworks, and currently provides implementations for Apache MXNet and TensorFlow.
-
Google Open-Sources ALBERT Natural Language Model
Google AI has open-sourced A Lite BERT (ALBERT), a deep-learning natural language processing (NLP) model that uses 89% fewer parameters than the state-of-the-art BERT model, with little loss of accuracy. The model can also be scaled up to achieve new state-of-the-art performance on NLP benchmarks.
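The parameter reduction is easy to verify with the Hugging Face transformers library (not part of the announcement; the checkpoint names are the published ones): the base ALBERT checkpoint has roughly 12M parameters versus roughly 110M for BERT-base.

```python
from transformers import AlbertModel, BertModel

albert = AlbertModel.from_pretrained("albert-base-v2")
bert = BertModel.from_pretrained("bert-base-uncased")

def n_params(model):
    return sum(p.numel() for p in model.parameters())

# ALBERT reaches its reduction mainly via cross-layer parameter
# sharing and a factorized embedding parameterization.
print(f"ALBERT base: {n_params(albert) / 1e6:.1f}M parameters")
print(f"BERT base:   {n_params(bert) / 1e6:.1f}M parameters")
```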
-
Uber Open-Sources Plug-and-Play Language Model for Controlling AI-Generated Text
Uber AI has open-sourced the Plug and Play Language Model (PPLM), which can control the topic and sentiment of AI-generated text. Human judges rated the model's output as achieving 36% better topic accuracy than that of the baseline GPT-2 model.
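PPLM steers generation by nudging the language model with gradients from a simple attribute model, such as a bag of topic words. The toy below (PyTorch; everything is illustrative) captures that core idea by perturbing next-token logits toward a topic word list; the actual PPLM perturbs the transformer's hidden activations (its key/value history) rather than the logits.

```python
import torch

def steer_logits(logits, topic_token_ids, step_size=0.5, n_steps=3):
    """Push probability mass toward a bag of topic words by gradient
    ascent on their total log-probability (a toy stand-in for PPLM's
    attribute-model gradients)."""
    delta = torch.zeros_like(logits, requires_grad=True)
    for _ in range(n_steps):
        probs = torch.softmax(logits + delta, dim=-1)
        topic_mass = probs[..., topic_token_ids].sum(dim=-1)
        loss = -torch.log(topic_mass + 1e-10).sum()
        grad, = torch.autograd.grad(loss, delta)
        delta = (delta - step_size * grad).detach().requires_grad_(True)
    return (logits + delta).detach()

# Toy vocabulary of 100 tokens; pretend tokens 3, 7, and 42 are on-topic.
logits = torch.randn(1, 100)
topic = torch.tensor([3, 7, 42])
steered = steer_logits(logits, topic)
before = torch.softmax(logits, -1)[0, topic].sum()
after = torch.softmax(steered, -1)[0, topic].sum()
print(f"topic mass: {before:.3f} -> {after:.3f}")
```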
-
Microsoft Introduces Power Virtual Agents, a No-Code Solution to Building AI Bots
In a recent blog post, Microsoft announced the general availability (GA) of Power Virtual Agents, a service designed to democratize the building of conversational, AI-powered chatbots through a no-code graphical user interface. The service is part of the Microsoft Power Platform, alongside Power Apps, Power BI, and Power Automate.
-
Google Cloud Team Releases AutoML Natural Language
The Google Cloud team recently announced the general availability (GA) of the AutoML Natural Language framework. AutoML Natural Language supports data processing features and common machine-learning tasks such as classification, sentiment analysis, and entity extraction.
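Trained models are called through the google-cloud-automl client library. The sketch below shows a text classification prediction; the project, region, and model IDs are placeholders you would replace with your own.

```python
from google.cloud import automl

# Placeholder identifiers; substitute your own project and model.
project_id, model_id = "my-project", "TCN0000000000000000000"

client = automl.PredictionServiceClient()
model_path = automl.AutoMlClient.model_path(project_id, "us-central1", model_id)

# Wrap the input text in the expected payload message.
text = automl.TextSnippet(content="The battery life is superb.",
                          mime_type="text/plain")
payload = automl.ExamplePayload(text_snippet=text)

response = client.predict(name=model_path, payload=payload)
for result in response.payload:
    print(result.display_name, result.classification.score)
```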