Do More With AI - LLMs With Big Token Counts
- Published Jul 8, 2024
- Join Scott and CJ as they dive into the fascinating world of AI, exploring topics from LLM token sizes and context windows to understanding input length. They discuss practical use cases and share insights on how web developers can leverage larger token counts to maximize the potential of AI and LLMs.
Show Notes
00:00 Welcome to Syntax!
01:31 Brought to you by Sentry.io.
02:42 What is a token?
04:22 The context window, sometimes called "max tokens".
10:42 Understanding input length.
11:59 Models + services with big token counts.
13:22 Generating OpenAPI documentation for a complex API.
17:29 Generating JSDoc style typing.
21:07 Generating seed data for a complex database.
24:34 Summarizing 8+ hours of video.
29:35 Some things we've yet to try.
31:32 What about cost?
All links available at syntax.fm/789
------------------------------------------------------------------------------
Hit us up on Socials!
Scott: / stolinski
Wes: / wesbos
Randy: / @randyrektor
Syntax: / syntaxfm
www.syntax.fm
Brought to you by Sentry.io
#webdevelopment #webdeveloper #javascript - Science & Technology
I actually made a typo when writing the word "token" in Quizgecko's token counter, and it turned out to be more tokens than the correctly spelled word. Interesting.
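That matches how subword tokenizers behave: a common word is usually a single vocabulary entry, while a misspelling has to be broken into several smaller pieces. A minimal sketch with a greedy longest-match tokenizer and a toy, hand-picked vocabulary (real BPE tokenizers, like the ones GPT models use, learn their vocabularies from a corpus; this one is invented purely for illustration):

```python
# Toy greedy longest-match subword tokenizer. The vocabulary below is
# invented for illustration; real tokenizers learn theirs from data.
VOCAB = {"token", "to", "ken", "ne", "k", "e", "n", "t", "o"}

def tokenize(text: str, vocab: set[str]) -> list[str]:
    """Greedily match the longest vocabulary entry at each position."""
    tokens, i = [], 0
    while i < len(text):
        for j in range(len(text), i, -1):
            if text[i:j] in vocab:
                tokens.append(text[i:j])
                i = j
                break
        else:
            tokens.append(text[i])  # fall back to a single character
            i += 1
    return tokens

print(tokenize("token", VOCAB))  # the whole word is one vocabulary entry
print(tokenize("tokne", VOCAB))  # the typo splits into smaller pieces
```

Here "token" comes back as one token but the typo "tokne" comes back as three, which is the same effect the comment describes.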
This was really interesting and a great antidote to the singularity-obsessed "programmers are out of a job now" posts on Reddit.
Going to 12:00 skips the introduction content
Good that they have nice chapters for scrubbing through.
Where is wesbos?
The new kid is filling in; Wes is on a break atm.
Y'all should build custom GPTs: WesGPT, ScottGPT, and CJGPT.
Always appreciate the new content