Neural
DSL for defining, training, and debugging neural networks.

Neural is a domain-specific language (DSL) designed for defining, training, debugging, and deploying neural networks. With declarative syntax, cross-framework support, and built-in execution tracing (NeuralDbg), it simplifies deep learning development.

Why It Matters

  • Core Value: Catch critical blockers like shape errors early and debug models with built-in tooling.

  • Strategic Edge: Switch between frameworks and run hyperparameter optimization without rewriting code.

  • User-Friendly: Lower the barrier to entry with declarative syntax and practical workflows.

Feedback

Help us improve Neural DSL! Share your feedback: Typeform link.

🚀 Features

  • YAML-like Syntax: Define models intuitively without framework boilerplate.

  • Shape Propagation: Catch dimension mismatches before runtime.

    • ✅ Interactive shape flow diagrams included.

  • Multi-Framework HPO: Optimize hyperparameters for both PyTorch and TensorFlow with a single DSL config (#434).

  • Multi-Backend Export: Generate code for TensorFlow, PyTorch, or ONNX.

  • Training Orchestration: Configure optimizers, schedulers, and metrics in one place.

  • Visual Debugging: Render interactive 3D architecture diagrams.

  • Extensible: Add custom layers/losses via Python plugins.
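Shape propagation can be pictured as folding each layer's shape rule over the declared input shape, so a dimension mismatch surfaces symbolically, before any tensors exist. A minimal Python sketch of the idea (illustrative only, not Neural's actual implementation), walking the layers of the MNIST Quick Start example with "valid" convolution padding:

```python
# Each layer maps an input shape to an output shape; chaining the rules
# propagates shapes through the whole network. Channels-last (H, W, C).
# Illustrative only -- not Neural's actual implementation.

def conv2d(shape, filters, kernel, stride=1):
    h, w, _ = shape
    kh, kw = kernel
    if kh > h or kw > w:  # 'valid' padding: kernel must fit the input
        raise ValueError(f"kernel {kernel} larger than input {shape}")
    return ((h - kh) // stride + 1, (w - kw) // stride + 1, filters)

def maxpool2d(shape, pool):
    h, w, c = shape
    return (h // pool[0], w // pool[1], c)

def flatten(shape):
    n = 1
    for d in shape:
        n *= d
    return (n,)

def dense(shape, units):
    if len(shape) != 1:
        raise ValueError(f"Dense expects a flat input, got {shape}")
    return (units,)

shape = (28, 28, 1)
shape = conv2d(shape, filters=32, kernel=(3, 3))  # (26, 26, 32)
shape = maxpool2d(shape, pool=(2, 2))             # (13, 13, 32)
shape = flatten(shape)                            # (5408,)
shape = dense(shape, units=128)                   # (128,)
print(shape)
```

Because the rules run on shapes alone, an error such as feeding an unflattened 3-D tensor into `Dense` raises immediately instead of failing mid-training.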

🛠 NeuralDbg: Built-in Neural Network Debugger

NeuralDbg provides real-time execution tracing, profiling, and debugging, allowing you to visualize and analyze deep learning models in action.


  • Shape Propagation Debugging – Visualize tensor transformations at each layer.
  • Gradient Flow Analysis – Detect vanishing & exploding gradients.
  • Dead Neuron Detection – Identify inactive neurons in deep networks.
  • Anomaly Detection – Spot NaNs, extreme activations, and weight explosions.
  • Step Debugging Mode – Pause execution and inspect tensors manually.

📦 Installation

Clone the repository:

```bash
git clone https://github.com/yourusername/neural.git
cd neural
```

Create a virtual environment (recommended):

```bash
python -m venv venv
source venv/bin/activate   # Linux/macOS
venv\Scripts\activate      # Windows
```

Install dependencies:

```bash
pip install -r requirements.txt
```

Or install the released package from PyPI (see v0.2.4 for bug fixes):

```bash
pip install neural-dsl
```

Prerequisites: Python 3.8+, pip

🛠️ Quick Start

1. Define a Model

Create mnist.neural:

```
network MNISTClassifier {
  input: (28, 28, 1)  # Channels-last format
  layers:
    Conv2D(filters=32, kernel_size=(3,3), activation="relu")
    MaxPooling2D(pool_size=(2,2))
    Flatten()
    Dense(units=128, activation="relu")
    Dropout(rate=0.5)
    Output(units=10, activation="softmax")

  loss: "sparse_categorical_crossentropy"
  optimizer: Adam(learning_rate=0.001)
  metrics: ["accuracy"]

  train {
    epochs: 15
    batch_size: 64
    validation_split: 0.2
  }
}
```

2. Run or Compile the Model

```bash
neural run mnist.neural --backend tensorflow --output mnist_tf.py
# Or for PyTorch:
neural run mnist.neural --backend pytorch --output mnist_torch.py
```

3. Visualize the Architecture

```bash
neural visualize mnist.neural --format png
```

This will create architecture.png, shape_propagation.html, and tensor_flow.html for inspecting the network structure and shape propagation.

4. Debug with NeuralDbg

```bash
neural debug mnist.neural
```

Open your browser to http://localhost:8050 to monitor execution traces, gradients, and anomalies interactively.

5. Use the No-Code Interface

```bash
neural --no_code
```

Open your browser to http://localhost:8051 to build and compile models via a graphical interface.

🛠 Debugging with NeuralDbg

🔹 1️⃣ Start Real-Time Execution Tracing

```bash
python neural.py debug mnist.neural
```

Features:
✅ Layer-wise execution trace
✅ Memory & FLOP profiling
✅ Live performance monitoring
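Memory and FLOP profiling boils down to a per-layer cost model. A rough sketch of the idea for fully connected layers (illustrative only, not NeuralDbg's actual accounting):

```python
# Cost model for a Dense layer: one multiply-accumulate per weight
# (counted as 2 FLOPs), plus parameter memory for weights and biases
# at 4 bytes per float32 parameter.

def dense_cost(in_features, out_features, bytes_per_param=4):
    macs = in_features * out_features            # multiply-accumulates
    flops = 2 * macs                             # multiply + add
    params = in_features * out_features + out_features  # weights + biases
    return flops, params * bytes_per_param

for i, (fan_in, fan_out) in enumerate([(784, 128), (128, 10)]):
    flops, mem = dense_cost(fan_in, fan_out)
    print(f"layer {i}: {flops:,} FLOPs, {mem:,} bytes of parameters")
```

Summing these per-layer figures over a traced run is what makes hotspots visible at a glance.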

🔹 2️⃣ Analyze Gradient Flow

```bash
python neural.py debug --gradients mnist.neural
```

🚀 Detect vanishing/exploding gradients with interactive charts.
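The mechanics behind vanishing gradients are easy to demonstrate: backpropagation multiplies each layer's local derivative into the running gradient, and sigmoid's derivative never exceeds 0.25. A toy illustration of the effect (not NeuralDbg's tracer):

```python
import math

# Even at sigmoid's best case (derivative 0.25 at x = 0, unit weights),
# the gradient shrinks geometrically with depth: 0.25 ** n.

def sigmoid_grad(x):
    s = 1.0 / (1.0 + math.exp(-x))
    return s * (1.0 - s)

norms = []
grad = 1.0
for _ in range(10):              # 10 stacked sigmoid layers
    grad *= sigmoid_grad(0.0)    # 0.25 per layer
    norms.append(grad)

vanished = [i for i, g in enumerate(norms) if g < 1e-4]
print(norms[-1], vanished)       # ~9.5e-07; layers 6-9 flagged
```

A tracer flags layers whose gradient norm falls below such a threshold (or explodes above one), which is the pattern the interactive charts surface.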

🔹 3️⃣ Identify Dead Neurons

```bash
python neural.py debug --dead-neurons mnist.neural
```

🛠 Find layers with inactive neurons (common in ReLU networks).
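A dead ReLU unit is one whose pre-activation is negative for every sample, so it never fires and receives zero gradient. A plain-Python sketch of the check (a real tracer would read the activations from the framework):

```python
def dead_neurons(pre_activations):
    """pre_activations: list of samples, each a list of neuron inputs."""
    n = len(pre_activations[0])
    fired = [False] * n
    for sample in pre_activations:
        for j, x in enumerate(sample):
            if max(0.0, x) > 0.0:  # ReLU output
                fired[j] = True
    return [j for j, f in enumerate(fired) if not f]

# Neuron 2 only ever sees negative inputs -> dead.
batch = [[0.5, -0.1, -2.0], [1.2, 0.3, -0.7], [-0.4, 0.9, -1.5]]
print(dead_neurons(batch))  # [2]
```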

🔹 4️⃣ Detect Training Anomalies

```bash
python neural.py debug --anomalies mnist.neural
```

🔥 Flag NaNs, weight explosions, and extreme activations.
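Anomaly detection is essentially a per-step tensor-health scan. A minimal sketch of the idea (the threshold here is illustrative, not NeuralDbg's default):

```python
import math

# Scan a flat list of values for NaN, Inf, and suspiciously large
# magnitudes; return (index, kind) pairs for each finding.

def scan_tensor(values, max_abs=1e4):
    anomalies = []
    for i, v in enumerate(values):
        if math.isnan(v):
            anomalies.append((i, "nan"))
        elif math.isinf(v):
            anomalies.append((i, "inf"))
        elif abs(v) > max_abs:
            anomalies.append((i, "extreme"))
    return anomalies

print(scan_tensor([0.1, float("nan"), 3.0, 1e6, float("inf")]))
# [(1, 'nan'), (3, 'extreme'), (4, 'inf')]
```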

🔹 5️⃣ Step Debugging (Interactive Tensor Inspection)

```bash
python neural.py debug --step mnist.neural
```

🔍 Pause execution at any layer and inspect tensors manually.
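Step debugging rests on a simple mechanism: a hook that runs after each layer, giving execution a point to pause while the intermediate tensor is inspected. A minimal sketch of the pattern (frameworks expose this as forward hooks; this is not NeuralDbg's code):

```python
class TracedModel:
    def __init__(self, layers, hook=None):
        self.layers = layers        # list of (name, fn) pairs
        self.hook = hook            # called after every layer

    def forward(self, x):
        for name, fn in self.layers:
            x = fn(x)
            if self.hook:
                self.hook(name, x)  # pause point: inspect x here
        return x

trace = []
model = TracedModel(
    [("double", lambda x: x * 2), ("inc", lambda x: x + 1)],
    hook=lambda name, x: trace.append((name, x)),
)
out = model.forward(3)
print(out, trace)  # 7 [('double', 6), ('inc', 7)]
```

Replacing the recording hook with one that blocks on user input yields the interactive pause-and-inspect behavior.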

Support

If you find Neural useful, please give the project a star ⭐️ — stars help us get noticed on GitHub and reach more developers. Sharing the project with friends and colleagues helps just as much: the larger the community, the better the tool we can build together. 🚀

📬 Community