AI, ML & Data Engineering Content on InfoQ
-
Amazon Announces Alexa Prize SocialBot Grand Challenge 4 Winners
Amazon recently announced the winners of the 4th Alexa Prize SocialBot Grand Challenge, a competition for university students to develop conversational AI. A team from Czech Technical University (CTU) won first prize, while teams from Stanford University and the University at Buffalo took second and third place.
-
Erlang-Inspired Language Gleam Now Compiles to JavaScript
Gleam, which describes itself as a language for building type-safe, scalable systems for the Erlang virtual machine, now also compiles to JavaScript.
-
How Quantifying Information Leakage Helps to Protect Systems
Information leakage happens when observable information can be correlated with a secret. Secrets such as passwords, medical diagnoses, locations, and financial data underpin much of our world, and many kinds of observable information, such as error messages or electrical consumption patterns, can hint at those secrets.
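One common way to quantify leakage is to measure the mutual information between the secret and what an attacker can observe. The sketch below is illustrative only (the function and the example distribution are not from the article) and assumes the joint probability distribution of secret and observation is known.

```python
# Minimal sketch (not from the article): quantifying leakage as the mutual
# information I(S; O) between a secret S and an observable O, given their
# joint probability distribution.
import math

def mutual_information(joint):
    """joint[(s, o)] -> P(S=s, O=o); returns I(S; O) in bits."""
    p_s, p_o = {}, {}
    for (s, o), p in joint.items():
        p_s[s] = p_s.get(s, 0.0) + p
        p_o[o] = p_o.get(o, 0.0) + p
    return sum(p * math.log2(p / (p_s[s] * p_o[o]))
               for (s, o), p in joint.items() if p > 0)

# Example: an error message ("found"/"not found") that partially reveals
# whether a username exists in the system.
joint = {
    ("exists", "found"): 0.45, ("exists", "not found"): 0.05,
    ("absent", "found"): 0.05, ("absent", "not found"): 0.45,
}
print(f"leakage: {mutual_information(joint):.3f} bits")  # ~0.53 of 1 possible bit
```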
-
PyTorch 1.9 Release Includes Mobile, Scientific Computing, and Distributed Training Updates
PyTorch, Facebook's open-source deep-learning framework, announced the release of version 1.9, which includes improvements for scientific computing, mobile support, and distributed training. Overall, the new release contains more than 3,400 commits since the 1.8 release.
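As an illustration of the scientific-computing direction of the release, the snippet below assumes the torch.linalg and torch.special modules promoted in 1.9, which offer NumPy/SciPy-style linear algebra and special functions:

```python
# Illustrative sketch, assuming torch.linalg and torch.special are available.
import torch

a = torch.randn(4, 4)
sym = a @ a.T                                          # symmetric matrix for eigvalsh

print(torch.linalg.norm(a))                            # Frobenius norm of a matrix
print(torch.linalg.eigvalsh(sym))                      # eigenvalues of a symmetric matrix
u, s, vh = torch.linalg.svd(a)                         # singular value decomposition
print(torch.special.expit(torch.linspace(-3, 3, 5)))   # logistic sigmoid
```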
-
Stack Overflow’s 2021 Developer Survey Uncovers New Trends in Tech and Work
Stack Overflow's 2021 Developer Survey focuses mostly on work outside the traditional office. With a younger pool of respondents, this year's survey shows shifts in the way developers learn and work, along with a greater interest in health. On the technology side, it has been a year of consolidation, with React, Rust, and Clojure more widely used and prominent, while Redis keeps attracting attention.
-
Three Tracks Not to Miss at QCon Plus - Interview with Karen Casella
During a recent interview, Karen Casella, director of engineering at Netflix and QCon Plus November 2021 Program Committee member, shared with us the three topical tracks she felt software leaders should be paying attention to.
-
OpenAI Announces 12 Billion Parameter Code-Generation AI Codex
OpenAI recently announced Codex, an AI model that generates program code from natural language descriptions. Codex is based on the GPT-3 language model and can solve over 70% of the problems in OpenAI's publicly available HumanEval test dataset, compared to 0% for GPT-3.
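As a hypothetical illustration of the prompt-to-code task (not an actual HumanEval problem or real Codex output), a model like Codex receives a function signature and docstring as the prompt and generates the body:

```python
# Hypothetical illustration only: prompt is the signature plus docstring,
# and the model is asked to complete the function body.

# --- prompt given to the model ---
def running_max(numbers):
    """Return a list where element i is the maximum of numbers[: i + 1]."""
# --- plausible model completion ---
    result, current = [], float("-inf")
    for n in numbers:
        current = max(current, n)
        result.append(current)
    return result

print(running_max([1, 3, 2, 5, 4]))  # [1, 3, 3, 5, 5]
```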
-
Microsoft Warns Customers about a Critical Vulnerability in Azure Cosmos DB
Azure Cosmos DB is a globally-distributed and fully-managed NoSQL database service. Recently, Microsoft warned thousands of its Cosmos DB customers of a vulnerability that exposes their data. A flaw in the service could grant a malicious actor the access keys needed to steal, edit, or delete sensitive data.
-
MLOps: Continuous Delivery of Machine Learning Systems
Developing, deploying, and keeping machine learning models productive is a complex and iterative process with many challenges. MLOps combines the development of ML models, and especially of ML systems, with the operation of those systems. To make MLOps work, we need to balance the iterative and exploratory components of data science with the more linear components of software engineering.
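A minimal sketch of the continuous-delivery idea, with a hypothetical production metric and scikit-learn standing in for a real training pipeline: a candidate model is trained, evaluated, and only promoted if it beats the model currently in production.

```python
# Illustrative sketch only: dataset, threshold, and promotion logic are
# placeholders for a real pipeline's training, evaluation, and deployment steps.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

PRODUCTION_ACCURACY = 0.93   # hypothetical metric of the model in production

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

candidate = LogisticRegression(max_iter=5000).fit(X_train, y_train)
candidate_accuracy = accuracy_score(y_test, candidate.predict(X_test))

if candidate_accuracy > PRODUCTION_ACCURACY:
    print(f"promote candidate ({candidate_accuracy:.3f} > {PRODUCTION_ACCURACY})")
else:
    print(f"keep production model ({candidate_accuracy:.3f} <= {PRODUCTION_ACCURACY})")
```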
-
Uncover What's Next for Software Engineering at QCon Plus Online Software Conference (Nov 1-12)
QCon Plus gives you access to a curated learning experience that covers the topics that matter right now in software development and technical leadership. Learn from the laser-focused, first-hand experiences of 64+ software practitioners from early-adopter companies to help you adopt the right patterns and practices.
-
DeepMind Open Sources Data Agnostic Deep Learning Model Perceiver IO
DeepMind has open-sourced Perceiver IO, a general-purpose deep-learning model architecture that can handle many different types of inputs and outputs. Perceiver IO can serve as a "drop-in" replacement for Transformers that performs as well as or better than baseline models, but without domain-specific assumptions.
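A conceptual sketch of the idea, not DeepMind's implementation: a fixed-size latent array cross-attends to arbitrarily shaped inputs, is processed by self-attention, and is decoded by task-specific output queries, so the core never depends on the input or output modality. All module and parameter names below are hypothetical.

```python
# Conceptual sketch only (hypothetical sizes, not DeepMind's code).
import torch
import torch.nn as nn

class TinyPerceiverIO(nn.Module):
    def __init__(self, input_dim, latent_dim=64, num_latents=32, output_dim=10):
        super().__init__()
        self.latents = nn.Parameter(torch.randn(num_latents, latent_dim))
        self.encode = nn.MultiheadAttention(latent_dim, 4, kdim=input_dim,
                                            vdim=input_dim, batch_first=True)
        self.process = nn.TransformerEncoderLayer(latent_dim, 4, batch_first=True)
        self.out_query = nn.Parameter(torch.randn(1, latent_dim))
        self.decode = nn.MultiheadAttention(latent_dim, 4, batch_first=True)
        self.head = nn.Linear(latent_dim, output_dim)

    def forward(self, inputs):                   # inputs: (batch, seq, input_dim)
        b = inputs.shape[0]
        z = self.latents.expand(b, -1, -1)       # fixed-size latent array
        z, _ = self.encode(z, inputs, inputs)    # cross-attend: latents <- inputs
        z = self.process(z)                      # latent self-attention block
        q = self.out_query.expand(b, -1, -1)     # task-specific output queries
        out, _ = self.decode(q, z, z)            # cross-attend: queries <- latents
        return self.head(out.squeeze(1))

model = TinyPerceiverIO(input_dim=3)             # e.g. flattened RGB values
logits = model(torch.randn(2, 100, 3))           # works for any input length
print(logits.shape)                              # torch.Size([2, 10])
```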
-
OpenAI Releases Triton, Python-Based Programming Language for AI Workload Optimization
OpenAI released Triton, an open-source programming language that enables researchers to write highly efficient GPU code for AI workloads. Triton is Python-compatible and allows new users to achieve expert-quality results in only 25 lines of code. Kernels are written in Python using Triton's libraries and are then JIT-compiled to run on the GPU.
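A minimal sketch in the style of the Triton tutorials, assuming the triton and triton.language packages and a CUDA-capable GPU: an element-wise vector add written in Python and JIT-compiled to run on the device.

```python
# Sketch of a Triton kernel in the style of the official vector-add tutorial.
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    pid = tl.program_id(axis=0)                        # which block this instance handles
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements                        # guard out-of-range lanes
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

x = torch.rand(4096, device="cuda")
y = torch.rand(4096, device="cuda")
out = torch.empty_like(x)
grid = (triton.cdiv(x.numel(), 1024),)                 # one program per 1024 elements
add_kernel[grid](x, y, out, x.numel(), BLOCK_SIZE=1024)
print(torch.allclose(out, x + y))
```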
-
Cloudera Announces the General Availability of Cloudera DataFlow for the Public Cloud
The enterprise data cloud company Cloudera recently announced the general availability (GA) of Cloudera DataFlow for the Public Cloud, a cloud-native data-flow service for processing hybrid streaming workloads on the Cloudera Data Platform (CDP).
-
Oracle Introduces MySQL Autopilot with Machine-Learning Capabilities for MySQL HeatWave
Oracle released updates to its MySQL HeatWave service earlier this month. The new MySQL Autopilot feature uses machine learning to automate database provisioning and optimization tasks, making recommendations and query optimizations based on each database's usage patterns.
-
MIT Demonstrates Energy-Efficient Optical Accelerator for Deep-Learning Inference
Researchers at MIT's Quantum Photonics Laboratory have developed the Digital Optical Neural Network (DONN), a prototype deep-learning inference accelerator that uses light to transmit activation and weight data. At the cost of a few percentage points of accuracy, the system can achieve a transmission-energy advantage of up to 1000x over traditional electronic devices.