Nicole Koenigstein: Transformers in Action, Paperback
Transformers in Action
You can order this title now. It will ship to you as soon as it becomes available.
- Publisher: Manning Publications, 10/2025
- Binding: Paperback
- Language: English
- ISBN-13: 9781633437883
- Item number: 12380056
- Extent: 393 pages
- Weight: 467 g
- Publication date: October 28, 2025
- Note: This item is not in German.
Blurb
Take a deep dive into Transformers and Large Language Models, the foundations of generative AI!
Generative AI has set up shop in almost every aspect of business and society. Transformers and Large Language Models (LLMs) now power everything from code creation tools like Copilot and Cursor to AI agents, live language translators, smart chatbots, text generators, and much more.
In Transformers and LLMs in Action you'll discover:
• How transformers and LLMs work under the hood
• Adapting AI models to new tasks
• Optimizing LLM model performance
• Text generation with reinforcement learning
• Multi-modal AI models
• Encoder-only, decoder-only, encoder-decoder, and small language models
This practical book gives you the background, mental models, and practical skills you need to put Gen AI to work.
What is a transformer?
A "transformer" is a neural network model that finds relationships in sequences of words or other data using a mathematical technique called attention. Because the attention mechanism allows transformers to focus on the most relevant parts of a sequence, transformers can learn context and meaning from even large bodies of text. LLMs like GPT, Gemini, and Claude, are transformer-based models that have been trained on massive data sets, which gives them the uncanny ability to generate natural, coherent responses across a wide range of knowledge domains.
About the book
Transformers and LLMs in Action guides you through the design and operation of transformers and transformer-based models. You'll dive immediately into LLM architecture, with even the most complex concepts explained clearly through easy-to-understand examples and clever analogies. Because transformers are rooted in mathematics, author Nicole Koenigstein carefully guides you through the foundational formulas and concepts one step at a time. You'll also appreciate the extensive code repository that lets you instantly start exploring LLMs hands-on.
As you go, you'll learn how and when to use different model architectures such as decoder-only, encoder-only, and encoder-decoder. You'll also discover when small language models make sense, for example for classification tasks, in resource-constrained environments, or when privacy is paramount. You'll push transformers further with tasks like refining text generation with reinforcement learning, developing multimodal models (including multimodal RAG pipelines), and fine-tuning. You'll even learn how to optimize LLMs to maximize efficiency and minimize cost.
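As a rough illustration of the kind of classification task where a small language model is often sufficient, the sketch below uses the Hugging Face transformers pipeline API with a distilled sentiment model. The specific model chosen here is an assumption for demonstration, not one prescribed by the book.

```python
# A hedged sketch: text classification with a small, task-specific model.
# The model name below is an assumption for illustration, not the book's choice.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # distilled, CPU-friendly model
)

print(classifier("The attention mechanism chapter was surprisingly easy to follow."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```

A distilled model like this runs comfortably on a CPU and can be deployed on-premises, which is the kind of resource-constrained or privacy-sensitive setting where small language models are a good fit.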
About the reader
For software engineers and data scientists comfortable with the basics of ML, Python, and common data tools.
About the author
Nicole Koenigstein is CEO and Chief AI Officer at Quantmate, an agentic ecosystem for hypothesis testing, trading strategy evolution, and dynamic algorithmic intelligence.
Get a free eBook (PDF or ePub) from Manning as well as access to the online liveBook format (and its AI assistant that will answer your questions in any language) when you purchase the print book.
