Ibrahim Hashmat

ICM Media Entry #5: Markov Chains and Text

The prompt for this week felt very open-ended (more so than usual), and I felt like I should try something completely different from what we did in class, so I picked Markov chains. For my assignment this week, I went over Dan Shiffman's videos on Markov chains and the different ways they're used. Going into this task, I had no idea that Markov chains could become so complex, but that didn't deter my goal of making a Markov generator.


The reason I chose Markov chains is that I found the idea of using code to weed out words and characters from a text to be an interesting intersection of human and computer interaction. Having computer language (code) make changes to human language (text) sounds like something close to a cyberpunk dystopia. The genesis of this idea came when we watched Jürg Lehni's Apple Talk.


For my assignment I wanted to see how code could change and reformat written human language. My idea then shifted a little when I discovered this website where an AI generates text from a user's prompt. So I decided I wanted to see how my computer code could change text written by an AI. I took my original idea and flipped it on its head.


To that end, I followed Shiffman's videos and constructed my generator. I refactored his code so that I could understand it with a bit more clarity, and I annotated it in case anyone wants to use it for their own project.


This is the main setup for the generator. Like Shiffman's, the idea is to set the order of the n-grams to look for in a given text, then have a button that can be pressed to generate and display new text built from those n-grams. There's also a console log that prints every n-gram found in the text along with the characters that can follow it.


for (var i = 0; i < txt.length - order; i++) { // step through every n-gram in the text (stopping one short so a next character always exists)
    var gram = txt.substring(i, i + order);

    if (!ngrams[gram]) { // first time seeing this n-gram: start an empty list for it
      ngrams[gram] = [];
    }
    ngrams[gram].push(txt.charAt(i + order)); // record the character that follows this n-gram
  }

  button = createButton("generate"); // clicking the button runs the Markov generator
  button.mousePressed(marKov);

  console.log(ngrams); // every n-gram in the text and its possible next characters
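
For context, here's a minimal sketch of the surrounding p5.js code I'm assuming around that snippet. The variable names (txt, order, ngrams, button) match the code above, but the file name and the way the text is loaded (loadStrings in preload) are just placeholders, not part of the original project.

// Assumed p5.js wrapper for the setup snippet above
var lines;        // raw lines loaded from the text file
var txt;          // the full source text the generator learns from
var order = 6;    // how many characters each n-gram contains
var ngrams = {};  // maps each n-gram to the characters that can follow it
var button;

function preload() {
  lines = loadStrings("ai-text.txt"); // placeholder file name: any plain-text file works
}

function setup() {
  noCanvas();            // output is plain HTML paragraphs, so no canvas is needed
  txt = lines.join(" "); // safe to join here: preload guarantees the file has loaded
  // ...the n-gram loop, button, and console.log from the snippet above go here...
}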

Next we have the function that I refactored for the generator. It starts from the first n-gram of the text, looks up the possible next characters for the current n-gram, picks one at random, appends it to the result, and then slides the window forward so the last few characters of the result become the new current n-gram. It keeps doing this until 100 characters have been generated, or until it hits an n-gram with no recorded possibilities.


function marKov() {

  var currentGram = txt.substring(0, order); // start with the first `order` characters of the text (6, or whatever order we selected)

  var result = currentGram;

  for (var i = 0; i < 100; i++) { // generate up to 100 more characters

    var possibilities = ngrams[currentGram]; // the characters that can follow the current n-gram

    if (!possibilities) { // no recorded possibilities for this n-gram, so stop generating
      break;
    }
    var next = random(possibilities);
    result += next;              // add the randomly chosen character to the result
    var len = result.length;
    currentGram = result.substring(len - order, len); // the next n-gram is the last `order` characters of the result
  }

  createP(result); // display the generated text as a paragraph on the page
}
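
The code above hard-codes the order, but since the whole point is letting a user choose how many n-grams to use, one possible variation (just a sketch, not part of my original code) is to swap the fixed value for a p5 slider and rebuild the n-gram table each time the button is pressed:

// Assumes the globals (txt, order, ngrams, button) and marKov() from above
var orderSlider;

function setup() {
  noCanvas();
  orderSlider = createSlider(2, 10, 6, 1); // let the user pick an order between 2 and 10
  button = createButton("generate");
  button.mousePressed(regenerate);
}

function regenerate() {
  order = orderSlider.value(); // update the global order that marKov() also reads
  ngrams = {};                 // throw away the old table and rebuild it for this order
  for (var i = 0; i < txt.length - order; i++) { // same loop as above, re-run per order
    var gram = txt.substring(i, i + order);
    if (!ngrams[gram]) {
      ngrams[gram] = [];
    }
    ngrams[gram].push(txt.charAt(i + order));
  }
  marKov(); // generate with the freshly built table
}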

In the end, pressing the button displays what the AI-generated text looks like after passing through a Markov chain with an order of 6. The text still makes sense but feels more artificial, with grammatical mistakes and repeated statements. You can try it for yourself with some autogenerated text and my sketch.


