Solution 1:

Although LIKE-style (partial match) queries are not supported in Full Text Search, you can hack around the limitation.

First, tokenize the data string into all of its possible substrings (hello = h, he, hel, lo, etc.):

def tokenize_autocomplete(phrase):
    """Return every substring of every word in the phrase."""
    tokens = []
    for word in phrase.split():
        for j in range(1, len(word) + 1):       # substring length
            for i in range(len(word) - j + 1):  # substring start
                tokens.append(word[i:i + j])
    return tokens
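A quick sanity check of the tokenizer (copied inline so the snippet runs on its own) shows why indexing every substring enables partial matching:

```python
def tokenize_autocomplete(phrase):
    tokens = []
    for word in phrase.split():
        for j in range(1, len(word) + 1):       # substring length
            for i in range(len(word) - j + 1):  # substring start
                tokens.append(word[i:i + j])
    return tokens

tokens = tokenize_autocomplete("hello")
print(len(tokens))        # 15 substrings for a 5-letter word
print("ello" in tokens)   # True -- any partial fragment is now an exact token
```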

Then build an index and documents (Search API) using the tokenized strings:

from google.appengine.api import search

index = search.Index(name='item_autocomplete')
for item in items:  # item is an ndb model instance
    name = ','.join(tokenize_autocomplete(item.name))
    document = search.Document(
        doc_id=item.key.urlsafe(),
        fields=[search.TextField(name='name', value=name)])
    index.put(document)

Finally, perform the search, and voilà!

results = search.Index(name="item_autocomplete").search("name:elo")
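The query above needs a deployed App Engine app to run, but the reason it matches can be sketched locally by simulating the index with a plain dict (the doc ids and item names here are made up for illustration):

```python
def tokenize_autocomplete(phrase):
    """Return every substring of every word in the phrase."""
    return [w[i:i + j] for w in phrase.split()
            for j in range(1, len(w) + 1)
            for i in range(len(w) - j + 1)]

# Toy in-memory "index": token -> set of doc ids
index = {}
items = {"doc1": "melon", "doc2": "hello"}  # hypothetical item names
for doc_id, name in items.items():
    for tok in tokenize_autocomplete(name):
        index.setdefault(tok, set()).add(doc_id)

# The full-text engine only does exact token matches, but since every
# substring was indexed, the partial fragment "elo" is itself a token:
print(index.get("elo"))  # {'doc1'} -- "elo" is a substring of "melon"
```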

https://code.luasoftware.com/tutorials/google-app-engine/partial-search-on-gae-with-search-api/

Solution 2:

Just like @Desmond Lua's answer, but with a different tokenize function:

def tokenize(word):
    """Return comma-joined prefixes (length >= 2) of each word."""
    token = []
    words = word.split(' ')
    for word in words:
        for i in range(1, len(word)):
            if i == 1:
                token.append(word[:2])
            else:
                token.append(token[-1] + word[i])
    return ",".join(token)

It will parse hello world as he,hel,hell,hello,wo,wor,worl,world.
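A quick check of that output (function copied inline so the snippet runs on its own):

```python
def tokenize(word):
    """Return comma-joined prefixes (length >= 2) of each word."""
    token = []
    words = word.split(' ')
    for word in words:
        for i in range(1, len(word)):
            if i == 1:
                token.append(word[:2])
            else:
                token.append(token[-1] + word[i])
    return ",".join(token)

print(tokenize("hello world"))
# he,hel,hell,hello,wo,wor,worl,world
```

Note this indexes prefixes only, so it produces far fewer tokens than the all-substrings approach, at the cost of only matching from the start of each word.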

It's good for lightweight autocomplete purposes.

Solution 3:

As described at Full Text Search and LIKE statement, no, it's not possible out of the box, since the Search API implements full-text indexing rather than substring matching.

Hope this helps!