Click the Column details tab if you want to configure the training parameters. Because you can specify different training parameters for each table in your dataset, first select the table you want to configure from the table list, then scroll down to the Training parameters section.

Training parameters

With these parameters, you can:

  • Optimize AI model training for accuracy or speed.

  • Tweak the training parameters if the results of an earlier run were not of the desired accuracy or took too long to generate.

  • Disable the generation of detailed accuracy and privacy charts if the table’s synthetic data accuracy and privacy are in good shape.

Read on below to learn more.

For Batch size and Learning rate, we highly recommend using the default values. Only in rare instances do different settings lead to better and faster results.

Maximum training epochs

An epoch is one complete forward and backward pass of the table through the neural network. MOSTLY AI starts new epochs until the neural network has optimally learned your dataset's features. Unfortunately, it is not possible to know beforehand how many epochs this will take.

This setting allows you to limit the number of epochs to, for instance, 2, 5, or 10, which significantly reduces the time to generate your synthetic dataset, at the cost of accuracy.
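The interaction between the epoch cap and convergence-based stopping can be sketched as follows. This is an illustrative toy loop, not MOSTLY AI's actual implementation; all names and values are hypothetical.

```python
# Illustrative sketch: training stops either when the network has
# converged or when the maximum-epochs cap is reached, whichever
# comes first. Not MOSTLY AI's internals; names are hypothetical.

def train(max_epochs, converged_after):
    """Return the number of epochs actually run."""
    for epoch in range(1, max_epochs + 1):
        if epoch >= converged_after:  # network has learned the features
            return epoch
    return max_epochs  # cap reached before convergence

# A model that would need 40 epochs to converge:
print(train(max_epochs=10, converged_after=40))   # capped: faster, less accurate
print(train(max_epochs=100, converged_after=40))  # runs until convergence
```

With a generous cap, the cap never triggers and training runs to convergence; with a tight cap, you trade accuracy for a predictable, shorter training time.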

Batch size

MOSTLY AI won’t pass the entire dataset into the neural net at once. Instead, it divides your dataset into batches and updates the neural network’s parameters after each batch.

Setting the batch size to 1 updates these parameters after every single training example. This results in the longest training time but the lowest memory usage, which allows you to train the largest possible models.

Setting a large batch size can significantly speed up the training but at the cost of memory.

If you run into out-of-memory errors during the training stage, you can try to resolve them by decreasing the batch size.
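Because each batch triggers one parameter update, the batch size directly determines how many updates happen per epoch. The sketch below illustrates this arithmetic; the row and batch counts are made-up examples, not product defaults.

```python
# Hypothetical illustration of how batch size trades update frequency
# (training time) against memory. Not MOSTLY AI's implementation.
import math

def updates_per_epoch(num_rows, batch_size):
    # One parameter update per batch; the last batch may be partial.
    return math.ceil(num_rows / batch_size)

# For an example table with 100,000 rows:
print(updates_per_epoch(100_000, 1))    # one update per row: slowest, least memory
print(updates_per_epoch(100_000, 512))  # far fewer updates: faster, more memory
```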

Learning rate

The learning rate controls the extent to which the neural network learns from its mistakes after each batch of training subjects. Suitable values lie between 0 and 1 on an exponential scale: 0.1, for instance, is a high value, and each step lower gives 0.01, 0.001, and 0.0001.

A learning rate of 1 would let the neural network process the training subjects very quickly, but it would fail to learn your dataset correctly.

Conversely, a very low learning rate, such as 0.00000001, would learn your dataset precisely but would take forever to complete the training.

The learning rate is automatically optimized during the training process.
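To see what the learning rate actually scales, consider a single gradient-descent update. This is a generic textbook sketch with made-up numbers, not MOSTLY AI's optimizer.

```python
# Sketch of one gradient-descent update: the learning rate scales how
# far a weight moves against its gradient. Values are illustrative.

def sgd_step(weight, gradient, learning_rate):
    # A high rate takes a large step; a low rate takes a tiny one.
    return weight - learning_rate * gradient

w, g = 1.0, 0.5
print(sgd_step(w, g, 0.1))     # high rate: large step away from w
print(sgd_step(w, g, 0.0001))  # low rate: barely moves
```

Large steps converge quickly but can overshoot the optimum; tiny steps are precise but need far more updates, which is exactly the trade-off described above.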

Optimize for Speed or Accuracy

This toggle allows you to select a suitable training stopping condition for your synthetic data generation task.

By default, MOSTLY AI will aim to achieve the highest attainable synthetic data accuracy. It stops the training when the validation loss stops improving.

When you switch the toggle to Speed, MOSTLY AI stops the training as soon as the rate of improvement decreases. This significantly reduces the training time, but at the cost of accuracy.
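The difference between the two stopping rules can be made concrete with a toy validation-loss curve. This sketch only contrasts the two conditions conceptually; the loss values and function are hypothetical, not MOSTLY AI's stopping logic.

```python
# Toy contrast of the two stopping rules: accuracy mode waits until the
# validation loss no longer improves; speed mode stops as soon as the
# *rate* of improvement shrinks. Purely illustrative.

def stop_epoch(losses, mode):
    for i in range(1, len(losses)):
        improvement = losses[i - 1] - losses[i]
        if mode == "accuracy" and improvement <= 0:
            return i  # loss stopped improving
        if mode == "speed" and i >= 2:
            previous = losses[i - 2] - losses[i - 1]
            if improvement < previous:
                return i  # improvement is slowing down
    return len(losses) - 1  # never triggered: use all epochs

losses = [1.0, 0.6, 0.45, 0.40, 0.39, 0.39]
print(stop_epoch(losses, "speed"))     # stops early, while still improving
print(stop_epoch(losses, "accuracy"))  # trains until improvement vanishes
```

On this curve, speed mode stops several epochs earlier than accuracy mode, which is the trade-off the toggle exposes.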

Generate full QA report

The Generate full QA report toggle allows you to disable the generation of detailed accuracy and privacy charts for this table. An executive summary stating the synthetic data accuracy and whether the privacy tests passed or failed will always be available.

This option allows you to speed up the analysis of the synthetic data if its accuracy and privacy are in good shape. However, if the accuracy is lower than 90% or the privacy tests fail, the detailed accuracy and privacy charts will be generated anyway.