This field represents a training algorithm step. Given the required inputs, it computes outputs to update initializers in its own initializer list or in the inference graph's initializer list. In general, this field contains loss nodes, gradient nodes, optimizer nodes, and an increment of the iteration count.
An execution of the training algorithm step is performed by executing the graph obtained by combining the inference graph (namely "ModelProto.graph") and the "algorithm" graph. That is, the actual input/initializer/output/node/value_info/sparse_initializer lists of the training graph are the concatenation of "ModelProto.graph.input/initializer/output/node/value_info/sparse_initializer" and "algorithm.input/initializer/output/node/value_info/sparse_initializer" in that order. This combined graph must satisfy the normal ONNX conditions. For clarity, here is a visualization of the graph combination. Let the inference graph (i.e., "ModelProto.graph") be
tensor_a, tensor_b -> MatMul -> tensor_c -> Sigmoid -> tensor_d
and the "algorithm" graph be
tensor_d -> Add -> tensor_e
The combination process results in
tensor_a, tensor_b -> MatMul -> tensor_c -> Sigmoid -> tensor_d -> Add -> tensor_e
Notice that an input of a node in the "algorithm" graph may reference the output of a node in the inference graph (but not the other way round). Also, a node in the inference graph cannot reference inputs of the "algorithm" graph. With these restrictions, the inference graph can always be run independently, without any training information.
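As an illustration only (not part of the specification), the two graphs above could be built with the onnx Python helpers roughly as sketched below. The tensor shapes and the constant "one" used as the second operand of Add are assumptions introduced solely to make the sketch self-contained.

    import onnx
    from onnx import TensorProto, helper

    # Inference graph: tensor_a, tensor_b -> MatMul -> tensor_c -> Sigmoid -> tensor_d
    inference_graph = helper.make_graph(
        nodes=[
            helper.make_node("MatMul", ["tensor_a", "tensor_b"], ["tensor_c"]),
            helper.make_node("Sigmoid", ["tensor_c"], ["tensor_d"]),
        ],
        name="inference",
        inputs=[
            helper.make_tensor_value_info("tensor_a", TensorProto.FLOAT, [2, 2]),
            helper.make_tensor_value_info("tensor_b", TensorProto.FLOAT, [2, 2]),
        ],
        outputs=[helper.make_tensor_value_info("tensor_d", TensorProto.FLOAT, [2, 2])],
    )

    # "algorithm" graph: tensor_d -> Add -> tensor_e
    # tensor_d is produced by the inference graph, so the algorithm graph may read
    # it directly; the reverse direction is not allowed. The scalar initializer
    # "one" is an assumed second operand for Add, added only for this sketch.
    algorithm_graph = helper.make_graph(
        nodes=[helper.make_node("Add", ["tensor_d", "one"], ["tensor_e"])],
        name="algorithm",
        inputs=[],
        outputs=[helper.make_tensor_value_info("tensor_e", TensorProto.FLOAT, [2, 2])],
        initializer=[helper.make_tensor("one", TensorProto.FLOAT, [], [1.0])],
    )

    # Attach the "algorithm" graph to the model through TrainingInfoProto.
    training_info = onnx.TrainingInfoProto()
    training_info.algorithm.CopyFrom(algorithm_graph)

    model = helper.make_model(inference_graph)
    model.training_info.append(training_info)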
By default, this field is an empty graph and its evaluation does not produce any output. Evaluating the default training step therefore never updates any initializers.