Sunday, January 3, 2021

Can you make this faster? Sentence to Numpy Array mapping

I have a string of Korean text. I need to map it into a numpy array where each letter is encoded in one-hot fashion.

import numpy as np

def embed_letter(x: str) -> np.ndarray:
    # input: one Korean letter
    # maps the letter into ...
    # output: one-hot encoded np.array
    ...

def embed_sentence(sentence: str, max_length: int) -> np.ndarray:
    embedded_char_list = []
    append = embedded_char_list.append

    # Truncate to max_length characters
    end = len(sentence) if len(sentence) < max_length else max_length

    for i in range(end):
        append(embed_letter(sentence[i]))

    stacked = np.stack(embedded_char_list, axis=0)
    return stacked

I wanted to make embed_sentence faster by somehow vectorizing embed_letter. Could you do this, and if so, how? Is there any other way to make this faster? Thank you.
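One common way to vectorize this kind of per-character one-hot loop (a sketch, not the original poster's code) is to build a character-to-index lookup once, map the whole sentence to an index array, and fancy-index an identity matrix so each row is the one-hot vector for one character. The alphabet below is a small placeholder assumption; a real implementation would use the full set of Korean letters the model expects:

```python
import numpy as np

# Assumed placeholder alphabet; the real one would cover all Korean
# letters (jamo or syllables) used by embed_letter.
ALPHABET = "가나다라마바사"
CHAR_TO_IDX = {ch: i for i, ch in enumerate(ALPHABET)}

# Identity matrix whose row i is the one-hot vector for index i.
ONE_HOT = np.eye(len(ALPHABET), dtype=np.float32)

def embed_sentence_vectorized(sentence: str, max_length: int) -> np.ndarray:
    # Convert the (truncated) sentence to an array of letter indices.
    truncated = sentence[:max_length]
    indices = np.fromiter(
        (CHAR_TO_IDX[ch] for ch in truncated),
        dtype=np.intp,
        count=len(truncated),
    )
    # Fancy indexing pulls out all one-hot rows in a single NumPy call,
    # avoiding a Python-level call per character.
    return ONE_HOT[indices]
```

This replaces the per-letter function calls and np.stack with one dictionary pass and one array indexing operation, which is where most of the speedup would come from.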

https://stackoverflow.com/questions/65556557/can-you-make-this-faster-sentence-to-numpy-array-mapping January 04, 2021 at 09:06AM
