Tokenization is the process by which Dgraph analyzes a search query string, yielding a sequence of distinct query terms.
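As an illustration of the idea (not Dgraph's actual implementation), a term-style tokenizer can be sketched as: lowercase the input, split on non-alphanumeric runs, and keep each term once. The function name and exact splitting rules below are assumptions for the sketch.

```python
import re

def tokenize_terms(query: str) -> list[str]:
    """Sketch of term tokenization: lowercase, split, deduplicate."""
    # Split the lowercased query on runs of non-alphanumeric characters
    tokens = re.split(r"[^0-9a-z]+", query.lower())
    seen: set[str] = set()
    terms: list[str] = []
    for tok in tokens:
        # Drop empty fragments and repeated terms, preserving first-seen order
        if tok and tok not in seen:
            seen.add(tok)
            terms.append(tok)
    return terms

print(tokenize_terms("Hello, hello world!"))  # ['hello', 'world']
```

The deduplication step reflects the "distinct query terms" property described above: repeated words in the query contribute only one term to the resulting sequence.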