I am having a look at neural networks in Arraymancer. After training the model, how do I evaluate it at a particular point? See `???` on the last line of the code below:

```nim
let (N, D_in, H, D_out) = (10, 2, 3, 1)

# Create the autograd context that will hold the computational graph
let ctx = newContext Tensor[float32]

# Create Tensors to hold inputs and outputs, and wrap them in Variables.
let
  # x = ctx.variable(randomTensor[float32](N, D_in, 1'f32))
  xx = [0.000783'f32, 0.153779, 0.560532, 0.865013, 0.276724,
        0.895919, 0.704462, 0.886472, 0.929641, 0.469290,
        0.350208, 0.941637, 0.096535, 0.457211, 0.346164,
        0.970019, 0.114938, 0.769819, 0.341565, 0.684224].toTensor.reshape(10, 2)
  x = ctx.variable(xx)
  y = [3.000602'f32, 5.738535, 4.316186, 6.618693, 6.045585,
       4.771489, 3.230002, 4.798756, 3.455619, 4.285201].toTensor.reshape(10, 1)

# ##################################################################
# Define the model

network TwoLayersNet:
  layers:
    fc1: Linear(D_in, H)
    fc2: Linear(H, D_out)
  forward x:
    # x.fc1.relu.fc2
    x.fc1.sigmoid.fc2

let
  model = ctx.init(TwoLayersNet)
  optim = model.optimizer(SGD, learning_rate = 1e-4'f32)

# ##################################################################
# Training

for t in 0 ..< 500:
  let
    y_pred = model.forward(x)
    loss = y_pred.mse_loss(y)
  # echo &"Epoch {t}: loss {loss.value[0]}"
  loss.backprop()
  optim.update()

# ##################################################################
# Test at (0.25, 0.5)

let res = [0.2'f32, 0.5].toTensor.reshape(2, 1)
???
```
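A minimal sketch of what could replace `???`, assuming the same Arraymancer API used above: wrap the test point in a Variable on the same context and call `model.forward` on it, then read the `.value` tensor of the result. Note that the input must have shape `(1, 2)` (one sample, `D_in` columns) to match the training input, so `reshape(1, 2)` rather than `reshape(2, 1)`:

```nim
# Sketch (untested): evaluate the trained model at a single point.
# The test point is a 1 x D_in tensor, matching the shape of one training row.
let
  testPoint = [0.25'f32, 0.5].toTensor.reshape(1, 2)
  prediction = model.forward(ctx.variable(testPoint))

# prediction is a Variable; .value is the underlying Tensor[float32]
echo prediction.value
```

Since no gradients are needed at inference time, the forward pass alone is enough; the result is a `1 x D_out` tensor holding the model's output at that point.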