Twitter bots are fun, especially the ones that generate amusingly random content based on other sources. But what's the best way to generate the content for your Twitter bot? Join me for a quick run through Markov chains, neural networks and other methods for generating the best word soup for your bot.
I run a Twitter bot that aims to be amusing. It generates tweets based on a mixture of other Twitter feeds, Facebook posts and some other sources. One of the challenges in making this work has been getting decently amusing output.
This presentation will go through some of the options available, ranging from simple Markov systems built with markovify, through more sophisticated Markov approaches, to systems based on torch-rnn and other neural networks.
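To give a flavour of the simplest option: a Markov text generator just maps each word to the words observed to follow it, then random-walks that map. Here's a minimal standard-library sketch of that idea (markovify wraps the same concept in a richer API; the corpus and function names here are illustrative, not from the talk):

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the corpus."""
    chain = defaultdict(list)
    words = text.split()
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=8, seed=None):
    """Random-walk the chain from `start`, one successor at a time."""
    rng = random.Random(seed)
    word = start
    out = [word]
    for _ in range(length - 1):
        successors = chain.get(word)
        if not successors:
            break  # dead end: no observed successor for this word
        word = rng.choice(successors)
        out.append(word)
    return " ".join(out)

corpus = "the bot tweets the weirdest word soup the bot can make"
chain = build_chain(corpus)
print(generate(chain, "the", seed=1))
```

With a real corpus (thousands of tweets rather than one sentence), the same walk produces the characteristic "almost grammatical" word soup that makes these bots entertaining.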
You'll leave this presentation with a decent overview of the ways you can transform content into new content and the strengths and weaknesses of each.
Benno is a longtime FreeBSD committer and, more recently, Core Team member. He’s also been part of the Python community for a fair old while. He kicked off FreeBSD’s port to the PowerPC architecture a long time ago and co-created Python’s behave project. Lately he’s been working with FreeBSD’s Core Team to improve FreeBSD’s community processes.
He currently works in Dell EMC’s Isilon division on their FreeBSD-based clustered storage appliance.