Would you be able to post your own solutions for the nonsense-spewing TED bot? I'd like to jump into hacking on something that already works while also working through the theory from the lectures.
Yes, happy to share once the evaluation period is over, but sadly I don't think I'm allowed to share beforehand. I'll have to check when that is, but I expect it should be in approximately two weeks.
In the meantime, you might want to check out this excellent blog post: http://r2rt.com/recurrent-neural-networks-in-tensorflow-ii.h.... It provides skeleton code for implementing a character-level generative model (similar to Practical 3, Task 2, except that there you generate words rather than characters). Andrej Karpathy's blog post on the effectiveness of RNNs is also excellent, and I believe he provides code in his repository as well: http://karpathy.github.io/2015/05/21/rnn-effectiveness/
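If it helps to have something minimal to poke at before then, here is a rough sketch of a character-level generative model, not our official solution and not the code from either blog post. It uses tf.keras for brevity; the corpus filename, sequence length, layer sizes, and sampling temperature are all arbitrary assumptions you'd want to tune.

```python
# Sketch of a character-level LSTM language model (illustrative only).
import numpy as np
import tensorflow as tf

text = open("input.txt").read()  # assumption: any plain-text training corpus
chars = sorted(set(text))
char_to_ix = {c: i for i, c in enumerate(chars)}
ix_to_char = {i: c for i, c in enumerate(chars)}
vocab = len(chars)

# Cut the corpus into fixed-length sequences; the target is the input shifted by one.
seq_len = 50
encoded = np.array([char_to_ix[c] for c in text])
starts = range(0, len(encoded) - seq_len - 1, seq_len)
X = np.stack([encoded[i:i + seq_len] for i in starts])
y = np.stack([encoded[i + 1:i + seq_len + 1] for i in starts])

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab, 64),
    tf.keras.layers.LSTM(128, return_sequences=True),
    tf.keras.layers.Dense(vocab),  # logits over the next character at each step
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.fit(X, y, batch_size=64, epochs=10)

def sample(seed, n=200, temperature=1.0):
    """Generate n characters by repeatedly sampling from the model's softmax."""
    out = list(seed)
    for _ in range(n):
        x = np.array([[char_to_ix[c] for c in out[-seq_len:]]])
        logits = model.predict(x, verbose=0)[0, -1] / temperature
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        out.append(ix_to_char[np.random.choice(vocab, p=probs)])
    return "".join(out)

print(sample("The "))
```

The R2RT post builds the same idea at a lower level (explicit TensorFlow graph, dynamic_rnn, state handling), which is closer to what the practical asks for, so treat the above as a warm-up rather than a template.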