Blogs

FEB 14, 2020

AI in the 2020s Must Get Greener—and Here’s How

By Ameet Talwalkar

The environmental impact of artificial intelligence (AI) has been a hot topic lately, and I believe it will be a defining issue for AI this decade.

NOV 19, 2019

AI Leadership And The Positive Impacts On Economy, Privacy, Environmental Health

By Evan Sparks

Decades ago, Japan faced an unavoidable, long-term economic challenge. Even as its economy reached record highs in the late 1980s (fueled by strong auto sales, the rise of innovative companies like Nintendo, and real estate speculation), it was preparing for the coming day when more than a quarter of its population would be over age 65.

NOV 12, 2019

Deep in the Trenches: What Are We Reading? (November 2019)

By Evan Sparks

In the first of a series of posts, we share some thoughts on papers and blog posts that we’re reading right now that have generated some fiery internal discussion at Determined AI.

OCT 29, 2019

The squeeze on AI talent could cripple America’s most important companies

By Evan Sparks

With the AI revolution solidly underway, tech’s top 5 companies are investing huge amounts of money into AI development and AI engineering talent.

AUG 19, 2019

Specialized AI chips hold both promise and peril for developers

By Evan Sparks

In the next few years, chipmaking giants and well-funded startups will race to gain market share.

AUG 13, 2019

[Product feature series] One-click access to TensorBoard for model development and experimentation

By Desmond Chan

Training a massive deep neural network can be daunting. Many deep learning (DL) engineers rely on TensorBoard for visualization so that they can better understand, debug, and optimize their model code.

JUN 04, 2019

The cloud giants have an AI problem

By Evan Sparks

The general perception of cloud computing is that it makes all compute tasks cheaper and easier to manage.

MAY 20, 2019

Stop doing iterative model development

By Yoav Zimmerman

Imagine a world in which gradient descent or second-order methods have not yet been invented, and the only way to train machine learning models is to tune their weights by hand.

MAR 05, 2019

Random Search is a hard baseline to beat for Neural Architecture Search

By Ameet Talwalkar

In a previous post on “What’s the deal with Neural Architecture Search?”, Liam Li and I discussed Neural Architecture Search (NAS) as a promising research direction that has the potential to replace expert-designed networks with learned, task-specific architectures.

FEB 20, 2019

Addressing the challenges of massively parallel hyperparameter optimization

By Ameet Talwalkar, Desmond Chan

As most deep learning engineers know, it can take days or weeks to train a deep learning model, costing organizations considerable time and money. But what if we could speed up training and achieve better results in the process?