API
This page documents the public Python API for AutoNeuroNet.
Core types
- `Var`: scalar value with gradient tracking and reverse-mode automatic differentiation.
- `Matrix`: 2D matrix of `Var` values with math, activation functions, gradients, and more.
Neural Network
- `Layer` (base class)
- `Linear`, `ReLU`, `LeakyReLU`, `Sigmoid`, `Tanh`, `SiLU`, `ELU`, `Softmax`
- `NeuralNetwork`
Optimizers
- `GradientDescentOptimizer`
- `SGDOptimizer`
Losses
- `MSELoss`
- `MAELoss`
- `BCELoss`
- `CrossEntropyLoss`
- `CrossEntropyLossWithLogits`
Utilities
- `matmul(A, B)`
- `numpy_to_matrix(array, as_column=False)`
- `from_numpy(array, as_column=False)` (alias)
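A minimal usage sketch of the utilities; the `autoneuronet` import path is an assumption, only the function signatures above come from this page:

```python
import numpy as np

# Assumed import path; adjust to the actual package layout.
from autoneuronet import matmul, numpy_to_matrix

A = numpy_to_matrix(np.array([[1.0, 2.0], [3.0, 4.0]]))    # 2x2 Matrix
x = numpy_to_matrix(np.array([5.0, 6.0]), as_column=True)  # 2x1 column Matrix

y = matmul(A, x)  # 2x1 Matrix product
```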
Automatic Differentiation
Var
- Create a value: `Var(1.0)`
- Read/write value: `v.val`
- Read/write gradient: `v.grad`
- Backpropagation: `v.setGrad(1.0); v.backward()`
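A minimal sketch of the `Var` workflow. The `autoneuronet` import path and operator overloading between `Var` objects are assumptions; the attribute and method names come from the list above:

```python
from autoneuronet import Var  # assumed import path

x = Var(1.0)
w = Var(2.0)

# Build a small expression; operator support on Var is assumed here.
y = x * w + x          # y.val == 3.0

y.setGrad(1.0)         # seed the output gradient
y.backward()           # reverse-mode backpropagation

print(x.grad)          # dy/dx = w + 1 = 3.0
print(w.grad)          # dy/dw = x = 1.0
```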
Matrix
- Create: `Matrix(rows, cols)`
- Index: `M[i, j]` or row slices `M[i]`
- Math: `+`, `-`, `*`, `/`, `@`, `pow`, `sin`, `cos`, `tanh`, `relu`, `softmax`, etc.
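A sketch of basic `Matrix` use; the import path and the method-call form of the activations (e.g. `P.tanh()`) are assumptions:

```python
from autoneuronet import Matrix  # assumed import path

M = Matrix(2, 3)      # 2x3 matrix of Var values
N = Matrix(3, 2)

v = M[0, 1]           # single element (a Var)
row = M[0]            # row slice

P = M @ N             # 2x2 matrix product
Q = P.tanh()          # elementwise activation; method form is an assumption
```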
Neural Networks
NeuralNetwork
- Construct with a list of layers.
- Forward pass: `model.forward(x)`
- Save/restore weights: `model.saveWeights(path)`, `model.loadWeights(path)`
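A sketch of constructing and running a model. The `Linear(in_features, out_features)` signature, the import paths, and the weights file name are assumptions; the layer names and `NeuralNetwork` methods come from this page:

```python
# Assumed import paths; adjust to the actual package layout.
from autoneuronet import NeuralNetwork, Linear, ReLU, Sigmoid, Matrix

model = NeuralNetwork([
    Linear(4, 8),      # assumed signature: Linear(in_features, out_features)
    ReLU(),
    Linear(8, 1),
    Sigmoid(),
])

x = Matrix(4, 1)               # one input column
y = model.forward(x)           # forward pass

model.saveWeights("weights.json")   # persist parameters (file name is illustrative)
model.loadWeights("weights.json")   # restore them later
```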
Optimizers
- Create with a `learning_rate` and a model.
- Call `optimize()` after backprop.
- Call `resetGrad()` between steps if needed.
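A hedged sketch of one training step tying the pieces together. The optimizer constructor argument order, the loss call convention, and the assumption that the loss returns a `Var`-like value (so it can be seeded and backpropagated) are not confirmed by this page; only the method names are:

```python
# Assumed import paths, constructor signatures, and loss call convention.
from autoneuronet import NeuralNetwork, Linear, Tanh, SGDOptimizer, MSELoss, Matrix

model = NeuralNetwork([Linear(2, 4), Tanh(), Linear(4, 1)])
optimizer = SGDOptimizer(model, learning_rate=0.01)   # assumed argument order

x = Matrix(2, 1)        # one input column
target = Matrix(1, 1)   # matching target

loss = MSELoss(model.forward(x), target)   # assumed call convention
loss.setGrad(1.0)                          # seed and backpropagate, as with Var
loss.backward()

optimizer.optimize()    # apply the gradient update
optimizer.resetGrad()   # clear gradients before the next step
```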