# Indirect Hints

The first way to communicate with the oracle is to give it hints about the problem. The oracle uses these hints to holistically infer the context of the problem and determine reasonable hyper-parameters for the type and complexity of the task.

In an NML script, indirect hints are placed outside of the constructs. Typically, for readability, they are placed at the beginning of the script, but they can also appear between blocks. These hints cannot appear inside other constructs.
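For example, a script might open with a small block of indirect hints before any constructs. This is an illustrative sketch using the hints documented below; the values themselves are arbitrary:

```
oracle("mode") = "regression"
oracle("complexity") = 0.3
oracle("regularization") = 0.5
```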

## Problem type and complexity hints:

The most general hints that the oracle accepts are about the type of problem, and how complex it is. Based on these hints, the oracle will fill in knowledge about architecture and hyper-parameters for the user. The available hints and their default values are:

### mode

`oracle("mode") = "classification"`

This hint is used to tell the oracle what type of machine learning task the user wants to perform. The available values are shown below.

`"classification"`

Classification predictive modeling is the task of approximating a mapping function (f) from input variables (X) to discrete output variables (y), where y is a single integer. That means each input (X) belongs to exactly one class.

`"multi_class"`

Similar to classification, but the output variable (y) can be more than one integer. That means each input variable (X) can belong to multiple classes.

`"regression"`

Regression predictive modeling is the task of approximating a mapping function (f) from input variables (X) to continuous output variables (y).

`"vector_capsule"`

Vector Capsule Networks with Dynamic Routing.

`"matrix_capsule"`

Matrix Capsule Networks with EM Routing.

`"SSD"`

Single Shot MultiBox Detector for image detection.

`"YOLO"`

You Only Look Once (YOLO) for image detection.

`"Retina"`

Focal Loss for Dense Object Detection.

`"Count"`

The task of predicting the number of instances of a specific object in images.

`"unsupervised"`

Unsupervised learning tasks.

`"spectral_opt"`

Training with Spectral Hyper-parameter Optimization. A spectral optimization tutorial can be found here.

Note that other problem types can be solved using AI Studio, but the oracle doesn't have a complete handle on those problems... yet.

**Coming Soon:** More problem types.

### complexity

`oracle("complexity") = 0.5`

At a high level, this hint tells the oracle the complexity of the task the model must perform. Complexity is defined as a floating point value between 0 and 1, where lower is simpler. For example, binary classification of tweet sentiment (positive or negative) would be a relatively simple task (0.1), whereas image classification into a thousand categories would be a very complex one (0.99).
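Following the tweet-sentiment example above, such a task could be hinted as (illustrative values):

```
oracle("mode") = "classification"
oracle("complexity") = 0.1
```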

### regularization

`oracle("regularization") = 0.5`

At a high level, this hint tells the oracle how complex the architecture being designed is, so it can decide how much to regularize to prevent overfitting. It expects a floating point value in the range [0.0, 1.0], where lower is simpler.

## Hyper-parameter optimization hints:

The default behavior is to randomly sample four models from the space of all possible hyper-parameters. In most cases this will generate a decent model that can then be fine-tuned by optimizing individual hyper-parameters using direct hints (see below).

### generated

`oracle("generated") = 4`

This sets the limit on the number of models with different hyper-parameters/architectures that the oracle will generate. The default value is 4.

### generated_strict

`oracle("generated_strict") = True`

**WARNING: EXERCISE CAUTION WHEN USING THIS HINT**

Setting this to False tells the oracle to generate models with **EVERY** possible combination of hyper-parameters. Unless you restrict the number of possible hyper-parameters (either by fixing them to specific values or by using direct hints), setting this to False can generate a large number (possibly thousands) of models.
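If you do disable strict generation, do so only after the hyper-parameter space has been narrowed. The hint itself is set like any other (illustrative):

```
oracle("generated_strict") = False
```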

### hyperalgo

`oracle("hyperalgo") = "random"`

Setting this to `brute` will perform a brute-force grid search over the available space of hyper-parameters. This is useful when fine-tuning models.
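For example, when fine-tuning an existing model, a script could switch to grid search (a sketch, assuming string values are quoted as in the `mode` hint):

```
oracle("hyperalgo") = "brute"
```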