Here are a few use cases that Writer covers with its generative AI capabilities.
Fill mask is a generative AI technique in which some of the words in a sentence are masked out and the model predicts which words should replace those masks. This is possible because the AI system has been trained on a large amount of text and has learned which words are likely to appear in a given context.
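As a rough sketch of the idea (not Writer's actual model), here's a toy fill-mask example: it scores candidate words for the masked slot using bigram counts from a tiny hand-built corpus. The corpus, candidate list, and scoring rule are all illustrative assumptions; a real model learns these statistics from billions of words.

```python
# Toy fill-mask sketch. A real masked language model (e.g. a BERT-style
# model) learns word statistics from a huge corpus; here a tiny hand-built
# corpus and simple bigram counts stand in for that knowledge.
from collections import Counter

corpus = (
    "i like to eat apples . i like to eat bread . "
    "i like to swim in the ocean . the sky is blue ."
).split()

# Count how often each word follows each preceding word.
bigrams = Counter(zip(corpus, corpus[1:]))

def fill_mask(sentence: str, candidates: list[str]) -> str:
    """Replace the single <mask> token with the most likely candidate."""
    tokens = sentence.lower().split()
    i = tokens.index("<mask>")
    prev_word = tokens[i - 1]
    # Score each candidate by how often it follows the previous word.
    best = max(candidates, key=lambda w: bigrams[(prev_word, w)])
    tokens[i] = best
    return " ".join(tokens)

print(fill_mask("i like to <mask> apples", ["eat", "swim", "blue"]))
# → i like to eat apples
```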
Text2text generation is a method of producing new text from a given text input using a neural network. For example, a text2text model could generate a summary of a news article from the original article, or a user could input a short story and the AI could generate a continuation of it.
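The defining feature of text2text generation is a single text-in, text-out interface that can serve many tasks. Here's a toy sketch of that interface, in the style of models that read a task prefix from the input; the prefixes and hand-written rules are illustrative assumptions, since a real text2text model learns the mapping from data.

```python
# Toy text2text sketch: one text-in, text-out function handling multiple
# tasks framed as prefixed input strings. The rules below are hand-written
# stand-ins for what a real text2text model learns during training.
def text2text(prompt: str) -> str:
    task, _, body = prompt.partition(": ")
    if task == "summarize":
        # Crude "summary": keep only the first sentence.
        return body.split(". ")[0].rstrip(".") + "."
    if task == "continue":
        # Crude "continuation": append a stock follow-on clause.
        return body.rstrip(".") + ", and the story went on."
    return body  # unknown task: echo the input unchanged

print(text2text("summarize: The merger closed today. Shares rose. Analysts cheered."))
# → The merger closed today.
```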
Sentence similarity is the process of determining how similar two sentences are in meaning. This is often done by comparing the words in each sentence and determining how often those words are used in similar contexts. For example, the sentences "I like to eat apples" and "I enjoy eating apples" are more similar than the sentences "I like to eat apples" and "I swim in the ocean."
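To make the comparison above concrete, here's a minimal sketch that computes cosine similarity over bag-of-words vectors. Real systems compare learned embeddings that capture meaning (so "like" and "enjoy" count as close); plain word overlap is a simplifying assumption used here to keep the example self-contained.

```python
# Toy sentence-similarity sketch: cosine similarity of word-count vectors.
# Real sentence-similarity models use learned embeddings instead of raw
# word counts, so they also catch synonyms; this is a simplified stand-in.
from collections import Counter
import math

def similarity(a: str, b: str) -> float:
    """Cosine similarity of the two sentences' word-count vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

close = similarity("I like to eat apples", "I enjoy eating apples")
far = similarity("I like to eat apples", "I swim in the ocean")
print(close > far)  # → True
```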
Writer’s AI is capable of answering questions based on information that's been fed to our model. Question answering usually involves taking information from a given text and using it to generate a response that's relevant to the question. For example, you might ask Writer’s AI for a plot summary of a book, or ask why the sky is blue.
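As a rough illustration of the extractive side of this (not Writer's actual pipeline, which generates its own response), here's a toy question-answering sketch that picks the sentence in the source text sharing the most words with the question. Sentence-level word overlap is a simplifying assumption; real QA models locate or generate the precise answer.

```python
# Toy extractive question answering: return the sentence from the context
# that overlaps most with the question's words. A real QA model extracts
# or generates the exact answer rather than a whole sentence.
def answer(question: str, context: str) -> str:
    q_words = set(question.lower().replace("?", "").split())
    sentences = [s.strip() for s in context.split(".") if s.strip()]
    # Pick the sentence with the largest word overlap with the question.
    return max(sentences, key=lambda s: len(q_words & set(s.lower().split())))

context = ("The sky is blue because air scatters blue light more than red light. "
           "The ocean often looks blue for a similar reason.")
print(answer("Why is the sky blue?", context))
# → The sky is blue because air scatters blue light more than red light
```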
Text classification is the process of assigning a class label to a piece of text. This is often done by first building a large language model, which is then used to generate labels for new pieces of text. For example, a language model might label a piece of text as "positive" or "negative" based on its sentiment.
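Here's a toy version of the sentiment example above: a keyword-based classifier whose hand-picked word lists stand in for the associations a real model learns from labeled examples or from large-scale pretraining. It's a sketch of the task, not how Writer's classifier works.

```python
# Toy sentiment classifier: count sentiment-bearing keywords and assign
# "positive" or "negative". The word lists are hand-picked assumptions;
# a trained model learns these associations from data instead.
POSITIVE = {"great", "love", "excellent", "enjoy", "good"}
NEGATIVE = {"terrible", "hate", "awful", "bad", "poor"}

def classify_sentiment(text: str) -> str:
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score >= 0 else "negative"

print(classify_sentiment("I love this product, it is excellent"))  # → positive
```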
Zero-shot classification is a method of text classification that doesn't require any task-specific training data. Instead, it relies on large language models that have been pretrained on vast amounts of text. Because these models learn the general structure of language, they can classify new text without ever being shown labeled examples for the task.
A relatable example: you could take a large corpus of text, such as all of Wikipedia, and train a language model on it. That model could then take any new text, such as a news article, and classify it by topic.
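A toy sketch of the zero-shot setup: the candidate labels are supplied only at call time, and the text is scored against a small word-association table that stands in for a pretrained model's general knowledge of language. The table and labels are illustrative assumptions; the point is that no task-specific labeled examples are used.

```python
# Toy zero-shot classifier: labels arrive at call time, and a small
# word-association table plays the role of the general language knowledge
# a pretrained model would bring. No task-specific training data is used.
ASSOCIATIONS = {
    "sports": {"team", "game", "score", "championship", "player"},
    "politics": {"election", "vote", "senate", "policy", "campaign"},
    "weather": {"rain", "sunny", "storm", "forecast", "temperature"},
}

def zero_shot(text: str, labels: list[str]) -> str:
    words = set(text.lower().split())
    # Pick the label whose associated vocabulary overlaps the text most;
    # unknown labels fall back to matching the label word itself.
    return max(labels, key=lambda lbl: len(words & ASSOCIATIONS.get(lbl, {lbl})))

print(zero_shot("The team won the championship game", ["politics", "sports"]))
# → sports
```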
Simply put, Writer’s AI is capable of summarizing any piece of text. For example, a reader who wants the gist of a long article without reading the entire thing can request a summary of it.
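For a feel of how summarization can work, here's a toy extractive summarizer: it ranks sentences by the frequency of their words across the whole text and keeps the top ones. Writer's summarization is generative (it writes new sentences); this frequency heuristic is a simplified assumption used only to show the idea of condensing text.

```python
# Toy extractive summarizer: score each sentence by the total frequency of
# its words across the text, and keep the n highest-scoring sentences.
# A generative summarizer writes new sentences instead of selecting them.
from collections import Counter

def summarize(text: str, n: int = 1) -> str:
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    freq = Counter(text.lower().replace(".", " ").split())
    # Sentences whose words are frequent overall are assumed to be central.
    ranked = sorted(sentences,
                    key=lambda s: sum(freq[w] for w in s.lower().split()),
                    reverse=True)
    return ". ".join(ranked[:n]) + "."

article = ("The model reads the article. The model writes a short summary "
           "of the article. Readers save time.")
print(summarize(article))
# → The model writes a short summary of the article.
```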
Lastly, text generation is what most people associate with generative AI: producing new text from large language models. Emails, summaries, essays, and the like can all be generated.
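The core mechanic, predicting the next word from what came before, can be sketched with a toy bigram Markov chain. Large language models do the same thing at vastly greater scale and with far richer context; the tiny corpus and greedy "most frequent next word" decoding here are illustrative assumptions.

```python
# Toy text generator: a bigram Markov chain trained on a tiny corpus,
# decoded greedily. LLMs likewise predict the next token from context,
# just with enormously more data, context, and model capacity.
from collections import Counter, defaultdict

corpus = ("the model writes the email . the model writes the summary . "
          "the email is short .").split()

# Map each word to a counter of the words that follow it.
next_words = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    next_words[a][b] += 1

def generate(start: str, length: int = 5) -> str:
    out = [start]
    for _ in range(length):
        options = next_words[out[-1]]
        if not options:
            break
        # Greedy decoding: always take the most frequent next word.
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

print(generate("the"))
# → the model writes the model writes
```

Greedy decoding quickly falls into loops, as the output shows; real systems sample from the predicted distribution to keep the text varied.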