
GitHub - Differentiable sign/round/floor for PyTorch

PyTorch's `sign`, `round`, `floor`, and `argmax` break gradients. Existing fixes (STE, Gumbel-Softmax) require rewriting your model. SLL-Core is zero-intrusive:

```python
with sll.linearize():
    y = torch.sign(x)  # now differentiable!
    loss.backward()
```

It only linearizes inside an ε-band around decision boundaries; everywhere else stays exactly hard, with zero deployment overhead. `pip install sll-core`
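If "linearize inside an ε-band" is unfamiliar: away from the boundary the true zero gradient is kept, and only within `eps` of the jump is it replaced by a straight line. A minimal, framework-free sketch of that idea for `sign` (function names and the default `eps` are illustrative, not SLL-Core's internals):

```python
def hard_sign(x: float) -> float:
    """Forward pass: the exact hard sign, completely unchanged."""
    return 1.0 if x >= 0 else -1.0


def sign_band_grad(x: float, eps: float = 0.01) -> float:
    """Surrogate backward pass: sign jumps from -1 to +1 at x = 0.
    Linearizing that jump of height 2 over the band (-eps, eps), which
    has width 2*eps, gives slope 2 / (2 * eps) = 1 / eps inside the band.
    Outside the band the true zero gradient is kept."""
    return 1.0 / eps if abs(x) < eps else 0.0
```

In a real framework this pair would be registered as a custom forward/backward, so inference sees only `hard_sign` and training sees the banded gradient.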


Replies

Maker
Hey Product Hunt! 👋 I built SLL-Core because I was tired of rewriting model code every time I needed to backprop through a discrete operation like `sign`, `round`, or `floor`.

Existing solutions (STE, Gumbel-Softmax) work, but they force you to change your model architecture or add custom gradient hacks. SLL-Core takes a different approach: it only linearizes inside a tiny ε-band around decision boundaries, leaving everything else exactly hard. The result? You wrap your training loop in `with sll.linearize():` and your discrete model just works, with no rewrites and no deployment residue.

Would love to hear: what's the most painful non-differentiable operation you've run into with PyTorch?
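To make "decision boundaries" concrete for `round`: its jumps sit at the half-integers, so only inputs within `eps` of a half-integer would get a surrogate slope. A hedged sketch of the general technique (my assumption about how such a band could be detected, not SLL-Core's actual code):

```python
import math


def round_band_grad(x: float, eps: float = 0.01) -> float:
    """round() jumps by 1 at every half-integer. Linearizing a jump of
    height 1 over a band of width 2*eps gives slope 1 / (2 * eps)
    inside the band; elsewhere round is locally constant, so the true
    zero gradient is kept."""
    dist = abs(x - (math.floor(x) + 0.5))  # distance to nearest half-integer
    return 1.0 / (2.0 * eps) if dist < eps else 0.0
```

The same pattern generalizes: each discrete op contributes its own boundary set, and everything far from every boundary keeps its exact hard behavior.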