
Beginner's Questions, Answers, & Tutorials #152

Open
ikarth opened this issue Nov 11, 2015 · 3 comments
ikarth commented Nov 11, 2015

While there are a ton of resources in the #1 Resources thread, many of them assume that you know what fancy terms like NLP, NLTK, RNN, or ConceptNet mean. This thread is for introductory tutorials and starting points, plus answering questions.

Feel free to chime in if you feel you have anything to contribute. I'm not going to try to duplicate the already super-useful resources, so go check out that thread if you want more tools and links to interesting things.

DariusK Livecoding Sessions

Darius has recorded several live coding sessions. When you're just starting out, it can be helpful to watch someone else code like this, because you'll see all of the little details that even introductory tutorials tend to leave out.
https://www.youtube.com/watch?v=V9XwFYwyunw
https://www.youtube.com/watch?v=y-YIdzaG4OE
https://www.youtube.com/watch?v=_DMa_ve3N6o


ikarth commented Nov 11, 2015

Markov Chains

Markov chains are a well-known way to do text generation. When used to generate text, the basic operation is to look at the last few words and then pick the next word based on which words have been observed to follow that sequence in the source text.

Here's an interactive explanation of Markov chain generation by tullyhansen
Allison Parrish's tutorial on N-grams and Markov chains, with Python code
Jeff Atwood explains Markov chains via Garfield
A visual explanation of Markov chains
Andrew Plotkin's Fun With Markov Chains
Bookmerge, an example Markov generator that lets you combine two books (uses Ruby)
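The idea in the links above can be sketched in a few lines of Python. This is a minimal illustration, not taken from any of the tutorials listed: it records which words follow each pair of words in a sample text, then walks the chain to generate new text (the sample sentence and function names here are made up for the demo).

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each run of `order` words to the words observed following it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=20, seed=None):
    """Walk the chain from a random starting state, one word at a time."""
    rng = random.Random(seed)
    state = rng.choice(list(chain.keys()))
    out = list(state)
    while len(out) < length:
        followers = chain.get(state)
        if not followers:  # dead end: nothing ever followed this state
            break
        out.append(rng.choice(followers))
        state = tuple(out[-len(state):])
    return " ".join(out)

sample = ("the cat sat on the mat and the cat ate the rat "
          "and the rat sat on the cat")
chain = build_chain(sample, order=2)
print(generate(chain, length=10, seed=1))
```

With a longer `order` the output stays closer to the source text; with `order=1` it gets more surreal, which is often what you want for NaNoGenMo.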


ikarth commented Nov 12, 2015

Reading and Writing Electronic Text

The course notes from this class at NYU's Interactive Telecommunications Program cover a wide range of subjects and are a good introduction if you're wondering where to start. The course also includes lots of example code in this GitHub repository.


ikarth commented Nov 12, 2015

Templating and Grammars (with Tracery)

A replacement grammar is a set of rules for replacing symbols. It works kind of like Mad Libs: we start with a sentence like "Was it done by #suspect# in the #room# with the #murder weapon#?", and then we apply a rule that says that every time we see the symbol #suspect#, we replace it with one of the names from our list of suspects.
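A replacement grammar can be sketched in a few lines of Python. This is a toy in the spirit of Tracery, not the Tracery library itself: the grammar, symbol names, and `expand` helper below are made up for the demo (and the symbols here are single words, so #murder weapon# would need to become something like #weapon#).

```python
import random
import re

# A toy grammar: each symbol maps to a list of possible expansions,
# and expansions may themselves contain #symbol# references.
GRAMMAR = {
    "origin": ["Was it done by #suspect# in the #room# with the #weapon#?"],
    "suspect": ["Colonel Mustard", "Miss Scarlett", "Professor Plum"],
    "room": ["library", "conservatory", "billiard room"],
    "weapon": ["candlestick", "rope", "revolver"],
}

def expand(symbol, grammar, rng):
    """Pick a rule for `symbol`, then recursively expand any #symbol# in it."""
    rule = rng.choice(grammar[symbol])
    return re.sub(r"#(\w+)#",
                  lambda m: expand(m.group(1), grammar, rng),
                  rule)

print(expand("origin", GRAMMAR, random.Random(0)))
```

Because expansion is recursive, rules can reference other rules to arbitrary depth, which is how the novels mentioned below get so much mileage out of a small set of templates.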

While this starts out pretty simple, many of the successful novels from past NaNoGenMos have been made using similar techniques. Aggressive Passive, Redwreath and Goldstar Have Traveled to Deathsgate, Recipe Book Generator, and Threnody for Abraxas are a few examples of novels that use recursive replacement grammars in one way or another.

Tracery is a library that makes replacement grammars easy. You can try out the interactive tutorial for a playful introduction.

Some examples of what you can do with Tracery, including a demo of the in-progress no-programming-required visual editor

hugovk added the admin label Nov 13, 2015