
Melogen - Turn AI-generated melodies into playable MIDI — instantly

Melogen lets you import melodies generated by ChatGPT, Claude, Gemini, or Grok: paste the JSON, hear it play on a piano roll, edit it, and export as MIDI. No music theory required. No DAW needed. Just paste, play, and create.

✦ Works with ChatGPT, Claude, Gemini, Grok
✦ Browser-based piano roll editor
✦ Export to MIDI for any DAW
✦ Free to use

adam
Maker
📌
Hey Product Hunt! 👋

I'm Adam, and I built Melogen because I kept running into the same problem: I'd ask ChatGPT or Claude to write a melody, get back a wall of JSON — and have absolutely no way to hear it.

So I built the missing piece. Melogen lets you paste any LLM-generated melody JSON, hear it play instantly on a piano roll, edit individual notes, and export to MIDI for your DAW.

No music theory required. No DAW needed just to preview. Just paste, play, and create.

I tested it across ChatGPT, Claude, Gemini, and Grok — the differences in how each model "thinks" about music are genuinely surprising. (Full breakdown in our YouTube video if you're curious.)

Melogen is free to use. Pro plan unlocks higher limits and more features.

Would love your honest feedback — especially from anyone who's tried getting usable music out of an LLM before. What was your biggest pain point?

Try it free → melogen.app
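To make the "paste the JSON" step concrete, here's a minimal sketch of what an LLM melody payload and its conversion to MIDI-style tick events might look like. The schema, field names, and helper functions are illustrative assumptions on my part, not Melogen's actual format:

```python
import json

# A hypothetical melody payload, similar in spirit to what an LLM might return.
# (Melogen's real schema is not documented here; this layout is an assumption.)
MELODY_JSON = '''
{
  "tempo": 120,
  "notes": [
    {"pitch": "C4", "start": 0.0, "duration": 0.5},
    {"pitch": "E4", "start": 0.5, "duration": 0.5},
    {"pitch": "G4", "start": 1.0, "duration": 1.0}
  ]
}
'''

# Semitone offsets of the natural note letters within an octave.
NOTE_OFFSETS = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}

def name_to_midi(name: str) -> int:
    """Convert a note name like 'C4' or 'F#3' to a MIDI note number."""
    letter, rest = name[0], name[1:]
    offset = NOTE_OFFSETS[letter]
    if rest.startswith("#"):
        offset += 1
        rest = rest[1:]
    elif rest.startswith("b"):
        offset -= 1
        rest = rest[1:]
    octave = int(rest)
    return 12 * (octave + 1) + offset  # C4 = 60 in the common convention

def to_events(melody: dict, ppq: int = 480) -> list:
    """Turn beat-based notes into (start_tick, duration_tick, midi_note) events."""
    return [
        (int(n["start"] * ppq), int(n["duration"] * ppq), name_to_midi(n["pitch"]))
        for n in melody["notes"]
    ]

melody = json.loads(MELODY_JSON)
print(to_events(melody))
# → [(0, 240, 60), (240, 240, 64), (480, 480, 67)]
```

From tick events like these, writing an actual `.mid` file is a small step (e.g. with a library such as `mido`); the point is that beat-based JSON maps cleanly onto MIDI's pulses-per-quarter-note timing.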
adam
Maker

Just shipped a small but important update to Melogen 🎵

New: "Try Demo" button — hear what Melogen does before you paste anything.

No AI needed. No JSON. Just click and listen.

melogen.app

#AIMusic #MIDI #IndieHacker

adam
Maker

https://youtu.be/sWqYa98yHH8

Most AI music tools generate audio that you can't really edit.

But music producers usually want something different:
they want to change notes, fix melodies, and refine ideas.

I made a short demo showing how AI-generated melodies can be edited directly in a piano roll — deleting notes, adjusting pitch/timing, and adding new ones to improve the result.

The idea behind Melogen is simple:

AI generates the structure → you refine it like MIDI.
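For anyone curious what "refine it like MIDI" could look like under the hood, here's a minimal sketch of piano-roll-style edits (transpose, nudge timing, delete) on a plain note list. The `(pitch, start_beat, duration_beats)` tuple layout is my own assumption for illustration, not Melogen's internal representation:

```python
# Hypothetical in-memory notes: (midi_pitch, start_beat, duration_beats).
notes = [(60, 0.0, 0.5), (64, 0.5, 0.5), (67, 1.0, 1.0)]

def transpose(notes, semitones):
    """Shift every note's pitch up or down by a number of semitones."""
    return [(p + semitones, s, d) for p, s, d in notes]

def nudge(notes, index, beats):
    """Move one note earlier or later in time by a number of beats."""
    out = list(notes)
    p, s, d = out[index]
    out[index] = (p, s + beats, d)
    return out

def delete(notes, index):
    """Remove one note from the melody."""
    return notes[:index] + notes[index + 1:]

# Example session: raise the melody a whole tone, then drop the second note.
refined = delete(transpose(notes, 2), 1)
print(refined)
# → [(62, 0.0, 0.5), (69, 1.0, 1.0)]
```

Each edit returns a new list rather than mutating in place, which makes undo/redo trivial to layer on top.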

Curious what people here think:

Should AI music be fully editable, or should it stay more like “instant audio”?

Here’s the demo 👇