Comments on “Deep Elon Musk”
Hey! I'm here if you have any questions.
Ramil Bakhishli@ramil_bakhishli
@berpj Hi Pierre, just followed you, I think this is incredible and funny. Just wondering how you did this in non-technical terms so I could make one for my celebrities. Thank you
@ramil_bakhishli Thanks! It's based on machine learning. Basically, it's a program that learns to write the way a child learns to: it reads this dataset over and over to learn how to create new sentences: https://github.com/berpj/elon-mu... You can also have a look at this article, which contains some interesting ideas: http://karpathy.github.io/2015/0...
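The "read a dataset over and over, then sample new sentences" idea can be illustrated without any ML framework. This is a hypothetical toy sketch using a character-level Markov chain, not the project's actual model: it only captures short-range character statistics, which is precisely the limitation an LSTM is meant to overcome. The corpus string here is invented for illustration.

```python
import random
from collections import defaultdict

def train_char_model(text, order=3):
    """Record which character tends to follow each `order`-length context."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        context = text[i:i + order]
        model[context].append(text[i + order])
    return model

def generate(model, seed, length=80):
    """Sample new text one character at a time from the learned counts."""
    order = len(seed)
    out = seed
    for _ in range(length):
        choices = model.get(out[-order:])
        if not choices:
            break  # context never seen during training
        out += random.choice(choices)
    return out

# Toy corpus; a real run would use the full tweet dataset.
corpus = "tesla will make great cars and great rockets will fly to mars"
model = train_char_model(corpus, order=3)
print(generate(model, "tes", length=40))
```

The output is locally plausible but drifts quickly, since each character depends only on the last three; an LSTM keeps a learned hidden state instead, so it can carry context across a whole sentence.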
Jevin Sew@jevinsew · Rails / iOS developer
@berpj Thanks for the links. I might try that considering the tweets sound eerily true.
@jevinsew To be a bit more technical: it's an LSTM neural network, built with TensorFlow in Python. Have a look at this tutorial: https://www.tensorflow.org/versi...
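The project's code isn't shown in this thread, so as a sketch of what one LSTM time step computes, here is a minimal NumPy version of the standard gate equations (the same math TensorFlow's LSTM cells implement internally). All dimensions, weight names, and the random initialization are illustrative assumptions, not the project's actual configuration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step (standard equations).

    x: input vector (d,); h_prev, c_prev: previous hidden/cell state (n,)
    W: (4n, d) input weights; U: (4n, n) recurrent weights; b: (4n,) biases.
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b        # pre-activations for all four gates
    i = sigmoid(z[0:n])               # input gate
    f = sigmoid(z[n:2 * n])           # forget gate
    o = sigmoid(z[2 * n:3 * n])       # output gate
    g = np.tanh(z[3 * n:4 * n])       # candidate cell update
    c = f * c_prev + i * g            # new cell state
    h = o * np.tanh(c)                # new hidden state
    return h, c

# Smoke test with small random weights (shapes chosen arbitrarily).
rng = np.random.default_rng(0)
d, n = 5, 4
x = rng.standard_normal(d)
h, c = np.zeros(n), np.zeros(n)
W = rng.standard_normal((4 * n, d))
U = rng.standard_normal((4 * n, n))
b = np.zeros(4 * n)
h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)
```

For character-level generation, `x` would be a one-hot character vector and `h` would feed a softmax over the next character; training the weights is what the repeated passes over the dataset do.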
Vladislav Arbatov@vladzima · Founder http://en.arb.digital
@berpj @jevinsew I'm pretty sure all tweets are post-processed and corrected AND specifically hand-picked from a big generated set. An LSTM (word- or character-level) is not currently capable of reliably generating perfect sentences with at least some meaning; that only happens by relatively small chance. I used my 17.7k tweets as a dataset for experiments with models, whi…
@vladzima @jevinsew Yes! Tweets are post-processed, and I generally generate a few dozen of them, then put the best ones in Buffer (great app, by the way).
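The author describes manual hand-picking from a batch of generated tweets. A hypothetical pre-filter (an assumption for illustration, not what the project does) could rank candidates before human review, e.g. by how corpus-like their characters are under a simple unigram model. The candidate strings and reference corpus below are invented.

```python
import math
from collections import Counter

def char_logprob_scorer(reference_text):
    """Return a scorer rating text by average per-character log-probability
    under a unigram character model of the reference corpus."""
    counts = Counter(reference_text)
    total = sum(counts.values())

    def score(candidate):
        # Unseen characters get a count of 1 as crude smoothing.
        return sum(math.log(counts.get(ch, 1) / total)
                   for ch in candidate) / max(len(candidate), 1)

    return score

# Hypothetical generated candidates; higher (less negative) = more corpus-like.
candidates = ["we will land on mars", "zzqxv jkk wqp", "tesla is accelerating"]
score = char_logprob_scorer(
    "tesla will make great cars and rockets that land on mars")
ranked = sorted(candidates, key=score, reverse=True)
print(ranked[0])
```

This only weeds out obvious gibberish; judging whether a sentence is actually funny or "sounds like Musk" still needs a human pass, which matches the hand-picking workflow described above.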
Carlos Cabada@cabada · Founder @ Talkbot.io
@berpj I find this amazing; I've always wanted to know how to do NLP/ML with text and language. Are you planning on open-sourcing it? It would be really nice to learn how to do it.