
Neural
DSL for defining, training, and debugging neural networks.
Neural is a domain-specific language (DSL) designed for defining, training, debugging, and deploying neural networks. With declarative syntax, cross-framework support, and built-in execution tracing (NeuralDbg), it simplifies deep learning development.
Why It Matters
Core Value: Catch shape errors before runtime and debug training issues with built-in tooling instead of guesswork.
Strategic Edge: Switch between backends and run hyperparameter optimization from a single model definition.
User-Friendly: Lower the barrier to deep learning with declarative syntax and a no-code interface.
Feedback
Help us improve Neural DSL! Share your feedback: Typeform link.
🚀 Features
YAML-like Syntax: Define models intuitively without framework boilerplate.
Shape Propagation: Catch dimension mismatches before runtime.
✅ Interactive shape flow diagrams included.
Multi-Framework HPO: Optimize hyperparameters for both PyTorch and TensorFlow with a single DSL config (#434).
Multi-Backend Export: Generate code for TensorFlow, PyTorch, or ONNX.
Training Orchestration: Configure optimizers, schedulers, and metrics in one place.
Visual Debugging: Render interactive 3D architecture diagrams.
Extensible: Add custom layers/losses via Python plugins.
🛠 NeuralDbg: Built-in Neural Network Debugger
NeuralDbg provides real-time execution tracing, profiling, and debugging, allowing you to visualize and analyze deep learning models in action.
✅ Shape Propagation Debugging – Visualize tensor transformations at each layer.
✅ Gradient Flow Analysis – Detect vanishing & exploding gradients.
✅ Dead Neuron Detection – Identify inactive neurons in deep networks.
✅ Anomaly Detection – Spot NaNs, extreme activations, and weight explosions.
✅ Step Debugging Mode – Pause execution and inspect tensors manually.
📦 Installation
Clone the repository
git clone https://github.com/yourusername/neural.git
cd neural
Create a virtual environment (recommended)
python -m venv venv
source venv/bin/activate   # Linux/macOS
venv\Scripts\activate      # Windows
Install dependencies (see the v0.2.4 release for bug fixes)
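The exact install command isn't reproduced here; assuming the repository ships a standard requirements.txt, it would typically be:

```bash
# Install Python dependencies (assumes a requirements.txt at the repo root)
pip install -r requirements.txt
```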
Prerequisites: Python 3.8+, pip
🛠️ Quick Start
1. Define a Model
Create mnist.neural:
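The contents of the model file aren't reproduced here. As a rough, hypothetical sketch of the declarative, YAML-like syntax (layer names and keywords are assumptions and may differ from the actual grammar):

```
# Hypothetical mnist.neural — syntax is illustrative, not authoritative
network MNISTClassifier {
  input: (28, 28, 1)
  layers:
    Conv2D(filters=32, kernel_size=(3, 3), activation="relu")
    MaxPooling2D(pool_size=(2, 2))
    Flatten()
    Dense(128, activation="relu")
    Dense(10, activation="softmax")

  loss: "categorical_crossentropy"
  optimizer: "adam"

  train {
    epochs: 10
    batch_size: 64
  }
}
```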
2. Run or Compile the Model
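The commands themselves are omitted above; assuming the package installs a neural CLI, compiling or running the model would look roughly like this (subcommand and flag names are assumptions):

```bash
# Compile the DSL file into backend-specific code (flag names are illustrative)
neural compile mnist.neural --backend tensorflow

# Or execute the model directly
neural run mnist.neural
```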
3. Visualize Architecture
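The visualization command is omitted above; a hedged guess at the invocation (subcommand name is an assumption):

```bash
# Render architecture and shape-propagation diagrams (subcommand is illustrative)
neural visualize mnist.neural
```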
This will create architecture.png, shape_propagation.html, and tensor_flow.html for inspecting the network structure and shape propagation.
4. Debug with NeuralDbg
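The debugger is launched from the CLI; the exact invocation is omitted above, so the following is an assumption:

```bash
# Launch the NeuralDbg dashboard (invocation is illustrative)
neural debug mnist.neural
```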
Open your browser to http://localhost:8050 to monitor execution traces, gradients, and anomalies interactively.
5. Use the No-Code Interface
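The command that starts the no-code builder is omitted above; a hedged guess (subcommand name is an assumption):

```bash
# Launch the no-code model builder (subcommand is illustrative)
neural no-code
```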
Open your browser to http://localhost:8051 to build and compile models via a graphical interface.
🛠 Debugging with NeuralDbg
🔹 1️⃣ Start Real-Time Execution Tracing
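The command that starts tracing isn't shown here; assuming the same neural debug entry point as in the Quick Start:

```bash
# Start real-time execution tracing (invocation is illustrative)
neural debug mnist.neural
```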
Features:
✅ Layer-wise execution trace
✅ Memory & FLOP profiling
✅ Live performance monitoring
🔹 2️⃣ Analyze Gradient Flow
🚀 Detect vanishing/exploding gradients with interactive charts.
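A hedged sketch of the invocation, assuming a dedicated flag for gradient analysis (the flag name is an assumption):

```bash
# Inspect gradient flow across layers (flag name is an assumption)
neural debug --gradients mnist.neural
```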
🔹 3️⃣ Identify Dead Neurons
🛠 Find layers with inactive neurons (common in ReLU networks).
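Again assuming a dedicated flag (the name is an assumption):

```bash
# Scan for inactive (dead) neurons (flag name is an assumption)
neural debug --dead-neurons mnist.neural
```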
🔹 4️⃣ Detect Training Anomalies
🔥 Flag NaNs, weight explosions, and extreme activations.
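A hedged invocation, with the flag name assumed:

```bash
# Watch for NaNs, weight explosions, and extreme activations (flag name is an assumption)
neural debug --anomalies mnist.neural
```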
🔹 5️⃣ Step Debugging (Interactive Tensor Inspection)
🔍 Pause execution at any layer and inspect tensors manually.
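A hedged invocation, with the flag name assumed:

```bash
# Pause at each layer and inspect tensors interactively (flag name is an assumption)
neural debug --step mnist.neural
```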
Support
Please give us a star ⭐️ on GitHub; the more attention the project gets, the better its chances of making a real difference. Sharing the project with your friends helps too: every share reaches more developers and grows the community. 🚀
📬 Community
Discord Server: Chat with developers
Twitter @NLang4438: Updates & announcements