
Once you understand what hyperparameters (HPs) are, you may notice that your code contains far more of them than you had realized. In a typical deep learning model, in addition to the frequently debated choices of optimizer (SGD, Momentum, Adam, etc.), learning rate, and the optimizer's own variables, there are also the number of layers, the activation functions between the layers, the number of nodes in each layer, and so on. Not all of these HPs have a major effect on the output of the algorithm, but they may have effects in other ways, such as the amount of storage space or time required…
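To make this concrete, here is a hypothetical inventory of the hyperparameters hiding in even a small feed-forward network (all names and values below are illustrative, not taken from any particular model):

```python
# Illustrative hyperparameters for a small feed-forward network.
# Every entry is a decision that is fixed before training begins.
hyperparameters = {
    "optimizer": "adam",             # or "sgd", "momentum", ...
    "learning_rate": 1e-3,           # step size used by the optimizer
    "momentum": 0.9,                 # an optimizer-specific variable
    "n_layers": 3,                   # number of hidden layers
    "hidden_units": [128, 64, 32],   # number of nodes in each layer
    "activation": "relu",            # activation function between layers
    "batch_size": 64,                # affects speed and memory more than accuracy
}

# Some of these mainly affect accuracy; others mainly affect
# training time or memory use, as noted in the comments above.
assert len(hyperparameters["hidden_units"]) == hyperparameters["n_layers"]
```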


This article is a translation of the Japanese blog post authored by Makoto Hiramatsu of Cookpad Inc.

I am Makoto Hiramatsu (Twitter: @himkt or @himako_h) from the Business Development Department at Cookpad. 👋 I usually work on natural language processing (NLP) on real-world data.

In this article, I introduce nerman, a named entity recognition system for recipe texts. The name nerman is derived from ner (Named Entity Recognition) + man. It is a system that automatically extracts cooking terminology from recipes posted on Cookpad, built with a combination of AllenNLP and Optuna. …


Gluon + Optuna!

Optuna is a hyperparameter optimization framework applicable to machine learning frameworks and black-box optimization solvers. The Gluon library in Apache MXNet (incubating) provides a clear, concise, and simple API for deep learning. It makes it easy to prototype, build, and train deep learning models without sacrificing training speed. Combining the two allows hyperparameters to be tuned automatically to find the best-performing models.

Creating the Objective Function

Optuna is a black-box optimizer: it needs an objective function that returns a numerical value evaluating the performance of the hyperparameters, and it uses that value to decide where to sample in upcoming trials.

In our example…
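The example itself is truncated above, but the black-box pattern can be sketched in plain Python, with a stub Trial standing in for Optuna's real trial object (the real API offers trial.suggest_float and relatives; everything below is a simplified sketch, not the actual Optuna code):

```python
import random

class Trial:
    """Minimal stand-in for an Optuna trial: it samples a value for
    each hyperparameter the objective function asks about."""

    def suggest_float(self, name, low, high):
        # Optuna's samplers are smarter than this; the stub samples uniformly.
        return random.uniform(low, high)

def objective(trial):
    # The hyperparameter being tuned; the "model" here is just (x - 2)^2,
    # so the best possible score is 0, reached at x = 2.
    x = trial.suggest_float("x", -10.0, 10.0)
    return (x - 2.0) ** 2  # lower is better

# Naive random search standing in for study.optimize(objective, n_trials=...).
random.seed(0)
best = min(objective(Trial()) for _ in range(200))
```

Optuna replaces the `min(... for ...)` loop with a study object that records every trial and samples new candidates based on the scores seen so far.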


PyTorch Ignite + Optuna!

Optuna is a hyperparameter optimization framework applicable to machine learning frameworks and black-box optimization solvers. PyTorch Ignite is a high-level library for PyTorch that helps you write compact, but full-featured, code in fewer lines. Combining the two allows hyperparameters to be tuned automatically to find the best-performing models.

Creating the Objective Function

Optuna is a black-box optimizer: it needs an objective function that returns a numerical value evaluating the performance of the hyperparameters, and it uses that value to decide where to sample in upcoming trials.

In our example, we will be doing this for identifying MNIST characters from the Optuna GitHub examples folder…


MXNet + Optuna!

Optuna is a hyperparameter optimization framework applicable to machine learning frameworks and black-box optimization solvers. MXNet is an open source machine learning framework for flexible research prototyping and production. Let’s see how they can work together!

Creating the Objective Function

Optuna is a black-box optimizer: it needs an objective function that returns a numerical value evaluating the performance of the hyperparameters, and it uses that value to decide where to sample in upcoming trials.

In our example, we will be doing this for identifying MNIST characters. In this case, the objective function looks like this:
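The code block itself did not survive in this preview; structurally, though, such an objective follows this shape (a schematic pure-Python sketch with placeholder build/train/evaluate steps, not the actual MXNet code):

```python
import random

class Trial:
    """Stub trial: samples hyperparameter values like Optuna's trial object."""

    def suggest_int(self, name, low, high):
        return random.randint(low, high)

    def suggest_float(self, name, low, high):
        return random.uniform(low, high)

def objective(trial):
    # 1. Ask the trial for this run's hyperparameters.
    n_layers = trial.suggest_int("n_layers", 1, 3)
    lr = trial.suggest_float("lr", 1e-5, 1e-1)

    # 2. Build and train a model with them.  The real example uses MXNet;
    #    these placeholders keep the sketch self-contained.
    model = {"n_layers": n_layers, "lr": lr}  # stands in for build_model(...)
    score = 1.0 / (1.0 + abs(model["lr"] - 0.01) + model["n_layers"])  # train + evaluate

    # 3. Return a single number for Optuna to maximize (or minimize).
    return score

score = objective(Trial())
```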

Notice that the objective function is passed an Optuna specific argument…


This post uses XGBoost v1.0.2 and optuna v1.3.0.

XGBoost + Optuna!

Optuna is a hyperparameter optimization framework applicable to machine learning frameworks and black-box optimization solvers. XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. Let’s see how they can work together!

Creating the Objective Function

Optuna is a black-box optimizer: it needs an objective function that returns a numerical value evaluating the performance of the hyperparameters, and it uses that value to decide where to sample in upcoming trials.

In our example, we will be doing this for a cancer detection example from the Optuna GitHub pruning examples folder. …
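The pruning idea behind that example can be sketched without XGBoost: the objective reports intermediate scores, and the trial is stopped early once it looks unpromising. Everything below is a simplified stand-in for Optuna's trial.report / trial.should_prune API, not the real implementation:

```python
class TrialPruned(Exception):
    """Raised to stop an unpromising trial early, as Optuna's pruners do."""

class Trial:
    """Stub trial that prunes once the reported error stops improving."""

    def __init__(self, patience=3):
        self.history = []
        self.patience = patience

    def report(self, value, step):
        self.history.append(value)

    def should_prune(self):
        h = self.history
        if len(h) <= self.patience:
            return False
        # Prune if the last `patience` reports failed to beat the earlier best.
        return min(h[-self.patience:]) >= min(h[:-self.patience])

def objective(trial, errors_per_round):
    for step, error in enumerate(errors_per_round):
        trial.report(error, step)  # intermediate validation error
        if trial.should_prune():
            raise TrialPruned()
    return errors_per_round[-1]    # final error if the trial was never pruned

# A trial whose error plateaus is stopped before all rounds run.
try:
    objective(Trial(), [0.9, 0.5, 0.5, 0.5, 0.5, 0.1])
    pruned = False
except TrialPruned:
    pruned = True
```

In the real XGBoost example, the reporting happens via a callback after each boosting round, so bad hyperparameter settings waste only a few rounds of training instead of a full run.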


This post uses PyTorch v1.4 and optuna v1.3.0.

PyTorch + Optuna!

Optuna is a hyperparameter optimization framework applicable to machine learning frameworks and black-box optimization solvers. PyTorch is an open source machine learning framework used by many deep learning programmers and researchers. Let’s see how they can work together!

Creating the Objective Function

Optuna is a black-box optimizer: it needs an objective function that returns a numerical value evaluating the performance of the hyperparameters, and it uses that value to decide where to sample in upcoming trials.

In our example, we will be doing this for identifying MNIST characters. In this case, the objective function looks like this:

Notice…


This post uses tensorflow v2.1 and optuna v1.1.0.

TensorFlow + Optuna!

Optuna is a hyperparameter optimization framework applicable to machine learning frameworks and black-box optimization solvers. TensorFlow is Google’s open source platform for machine learning, with a deep ecosystem of tools, libraries and community resources. Let’s see how they can work together!

Creating the Objective Function

Optuna is a black-box optimizer: it needs an objective function that returns a numerical value evaluating the performance of the hyperparameters, and it uses that value to decide where to sample in upcoming trials.

In our example, we will be doing this for identifying MNIST characters from the Optuna GitHub pruning examples folder. …

Crissman Loomis
