Shared conversation
Generate a human-understandable comparison scale of different bodies of text, in terms of how many tokens they would embed to.
Think about the average lengths of tweets, news articles, blog posts, wiki pages, the Bible, encyclopedias, etc., and generate a table with the average length of each in tokens.
GPT-4 (Poe)
Here's a human-understandable comparison scale of different types of text and their estimated average length in tokens (the sub-word units a model actually embeds; for English prose, one token is roughly 0.75 words, or about 4 characters). Keep in mind that these are approximate estimates and can vary greatly depending on the specific content.
| Type of Text      | Average Length in Tokens |
|-------------------|--------------------------|
| Tweet             | 20-30                    |
| SMS/Text Message  | 10-20                    |
| News Headline     | 8-15                     |
| News Article      | 400-800                  |
| Blog Post         | 500-1,500                |
| Wikipedia Page    | 1,000-5,000              |
| Short Story       | 3,000-8,000              |
| Novel             | 80,000-120,000           |
| The Bible         | ~800,000                 |
| Encyclopedia      | 1-10 million+            |
Please note that these numbers are rough estimates and may vary depending on the specific instance of each text type.
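If you want to check where a particular text falls on this scale, you can count its tokens directly. The sketch below uses OpenAI's tiktoken library with the cl100k_base encoding (the one used by GPT-4-era models); the example tweet string and the helper function name are illustrative, and the exact count will differ with other tokenizers.

```python
# Minimal sketch: count the tokens in a text with OpenAI's tiktoken library.
# Assumes `pip install tiktoken`; cl100k_base is the encoding used by
# GPT-4-era models, so counts from other tokenizers will differ slightly.
import tiktoken


def count_tokens(text: str, encoding_name: str = "cl100k_base") -> int:
    """Return the number of tokens `text` encodes to under the given encoding."""
    encoding = tiktoken.get_encoding(encoding_name)
    return len(encoding.encode(text))


if __name__ == "__main__":
    tweet = "Just finished a 10k run in the rain. Worth it for the coffee after."
    # A typical tweet lands in the low tens of tokens, matching the table above.
    print(count_tokens(tweet))
```

Running the same function over a news article, a blog post, or a full book is a quick way to see how the rough ranges in the table hold up for your own documents.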