Buy new: $44.99 (10% off). Ships from and sold by Amazon.com.
Save with Used - Good: $38.75, plus $3.99 delivery July 29 - August 2. Ships from and sold by HPB-Red.
Pretrain Vision and Large Language Models in Python: End-to-end techniques for building and deploying foundation models on AWS
Master the art of training vision and large language models with conceptual fundamentals and industry-expert guidance. Learn about AWS services and design patterns, with relevant coding examples.
Key Features:
- Learn to develop, train, tune, and apply foundation models with optimized end-to-end pipelines.
- Explore large-scale distributed training for models and datasets with AWS and SageMaker examples.
- Evaluate, deploy, and operationalize your custom models with bias detection and pipeline monitoring.
Book Description:
Foundation models have forever changed machine learning. From BERT to ChatGPT, CLIP to Stable Diffusion, when billions of parameters are combined with large datasets and hundreds to thousands of GPUs, the result is nothing short of record-breaking. The recommendations, advice, and code samples in this book will help you pretrain and fine-tune your own foundation models from scratch on AWS and Amazon SageMaker, while applying them to hundreds of use cases across your organization.
With advice from seasoned AWS and machine learning expert Emily Webber, this book helps you learn everything you need to go from project ideation to dataset preparation, training, evaluation, and deployment for large language, vision, and multimodal models. With step-by-step explanations of essential concepts and practical examples, you'll go from mastering the concept of pretraining to preparing your dataset and model, configuring your environment, training, fine-tuning, evaluating, deploying, and optimizing your foundation models.
You will learn how to apply scaling laws to distribute your model and dataset over multiple GPUs, remove bias, achieve high throughput, and build deployment pipelines.
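As a rough illustration of the scaling-law reasoning mentioned above, the sketch below uses the widely cited approximation that training cost is C ≈ 6·N·D FLOPs (for N parameters and D tokens) together with a Chinchilla-style heuristic of roughly 20 training tokens per parameter. The constants and the helper function are illustrative assumptions, not taken from this book:

```python
# Sketch: split a fixed FLOP budget between model size and dataset size.
# Assumes C ≈ 6 * N * D training FLOPs and a Chinchilla-style ratio of
# ~20 tokens per parameter. Both constants are illustrative assumptions.

def compute_optimal_split(flop_budget: float, tokens_per_param: float = 20.0):
    """Return (params, tokens) that exhaust the budget at the given ratio."""
    # C = 6 * N * D with D = r * N  =>  N = sqrt(C / (6 * r))
    n_params = (flop_budget / (6.0 * tokens_per_param)) ** 0.5
    n_tokens = tokens_per_param * n_params
    return n_params, n_tokens

if __name__ == "__main__":
    # Example: a 1e21 FLOP training budget.
    n, d = compute_optimal_split(1e21)
    print(f"params ~ {n:.3e}, tokens ~ {d:.3e}")
```

A calculation like this is only a starting point; real budgets also depend on hardware efficiency and the parallelism strategy, topics the book covers in depth.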
By the end of this book, you'll be well equipped to embark on your own project to pretrain and fine-tune the foundation models of the future.
What You Will Learn:
- Find the right use cases and datasets for pretraining and fine-tuning
- Prepare for large-scale training with custom accelerators and GPUs
- Configure environments on AWS and SageMaker to maximize performance
- Select hyperparameters based on your model and constraints
- Distribute your model and dataset using many types of parallelism
- Avoid pitfalls with job restarts, intermittent health checks, and more
- Evaluate your model with quantitative and qualitative insights
- Deploy your models with runtime improvements and monitoring pipelines
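The data-parallelism idea in the list above can be sketched in miniature: each worker computes a gradient on its own shard of the data, and the gradients are averaged, which is the role an all-reduce plays in a real distributed job. This is a conceptual toy in plain NumPy; an actual training job would use SageMaker's distributed libraries or torch.distributed, neither of which is shown here:

```python
import numpy as np

# Toy data-parallel step for linear regression: shard the rows across
# simulated "workers", compute per-shard gradients, then average them
# (a stand-in for the all-reduce in a real distributed job).

def local_gradient(w, X, y):
    """Gradient of the mean squared error on one shard."""
    n = len(y)
    return X.T @ (X @ w - y) / n

def data_parallel_step(w, shards, lr=0.1):
    grads = [local_gradient(w, X, y) for X, y in shards]
    avg = np.mean(grads, axis=0)  # simulated all-reduce
    return w - lr * avg

rng = np.random.default_rng(0)
X = rng.normal(size=(128, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w
# Split the 128 rows evenly across 4 simulated workers.
shards = list(zip(np.array_split(X, 4), np.array_split(y, 4)))
w = np.zeros(3)
for _ in range(500):
    w = data_parallel_step(w, shards)
print(w)  # converges toward true_w
```

Because the shards are equal-sized, the averaged shard gradients equal the full-batch gradient, which is why data parallelism preserves the mathematics of single-device training while spreading the work.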
Who this book is for:
If you're a machine learning researcher or enthusiast who wants to start a foundation modelling project, this book is for you. Applied scientists, data scientists, machine learning engineers, solution architects, product managers, and students will all benefit from it. Intermediate Python is a must, along with introductory concepts of cloud computing. A strong understanding of deep learning fundamentals is needed; the advanced machine learning and cloud techniques that build on them are explained in an actionable, easy-to-understand way.
- ISBN-10: 180461825X
- ISBN-13: 978-1804618257
- Publisher: Packt Publishing
- Publication date: May 31, 2023
- Language: English
- Dimensions: 9.25 x 7.52 x 0.54 inches
- Print length: 258 pages
Customers who bought this item also bought
- Quick Start Guide to Large Language Models: Strategies and Best Practices for Using ChatGPT and Other LLMs (Addison-Wesley Data & Analytics Series)
- Generative Deep Learning: Teaching Machines To Paint, Write, Compose, and Play
- Generative AI on AWS: Building Context-Aware Multimodal Reasoning Applications
- Natural Language Processing with Transformers, Revised Edition (Lewis Tunstall)
- Transformers for Natural Language Processing and Computer Vision - Third Edition: Explore Generative AI and Large Language Models with Hugging Face, ChatGPT, GPT-4V, and DALL-E 3
- Designing Machine Learning Systems: An Iterative Process for Production-Ready Applications
Editorial Reviews
About the Author
Emily Webber is a Principal Machine Learning Specialist Solutions Architect at Amazon Web Services. She has assisted hundreds of customers on their journey to ML in the cloud, specializing in distributed training for large language and vision models. She mentors Machine Learning Solutions Architects, authors countless feature designs for SageMaker and AWS, and guides the Amazon SageMaker product and engineering teams on machine learning best practices for customers. Emily is widely known in the AWS community for a 16-video YouTube series on SageMaker with 160,000 views, plus a keynote at O'Reilly AI London 2019 on a novel reinforcement learning approach she developed for public policy.
Product details
- Publisher : Packt Publishing (May 31, 2023)
- Language : English
- Paperback : 258 pages
- ISBN-10 : 180461825X
- ISBN-13 : 978-1804618257
- Item Weight : 1.01 pounds
- Dimensions : 9.25 x 7.52 x 0.54 inches
- Best Sellers Rank: #1,259,094 in Books (See Top 100 in Books)
- #445 in Natural Language Processing (Books)
- #1,314 in Python Programming
- #3,706 in Computer Science (Books)
About the author
![Emily Webber](https://cdn.statically.io/img/m.media-amazon.com/images/S/amzn-author-media-prod/kjkqumm0phf7rg5cg57b3mn4ba._SY600_.jpg)
Emily Webber is a Principal Machine Learning Specialist Solutions Architect and keynote speaker at Amazon Web Services, where she has led the development of countless solutions and features on Amazon SageMaker. She has guided and mentored hundreds of teams, developers, and customers in their machine learning journey on AWS. She specializes in large-scale distributed training in vision, language, and generative AI, and is active in the scientific communities in these areas. She hosts YouTube and Twitch series on the topic, regularly speaks at re:Invent, writes many blog posts, and leads workshops in this domain worldwide.
Customer reviews
Top reviews from the United States
Reviewed in the United States on July 19, 2023
In the fast-paced world of ML, where LLMs are at the forefront of innovation, keeping up with the rapid advancements can be challenging. However, Emily's book, announced just before the LLM boom, provides a solid foundation that enables you to navigate these developments with ease. It has proven to be a timely resource for those keen on understanding and leveraging the power of LLMs.
Book Summary:
Comprehensive Coverage: The book offers an in-depth exploration of training vision and large language models, covering every stage from project ideation and dataset preparation through training and evaluation to deployment for large language, vision, and multimodal models.
Expert Guidance: Authored by Emily Webber, a seasoned AWS and machine learning expert, the book provides industry-expert guidance and practical advice, making it a valuable resource for both beginners and experienced practitioners.
Practical Approach: The book is replete with practical examples and code samples that help readers understand how to pretrain and fine-tune their own foundation models on AWS and Amazon SageMaker.
Bias Detection: A unique feature of the book is its focus on bias detection and pipeline monitoring, which are critical aspects of model development and deployment.
Advanced Topics: The book delves into advanced topics like large-scale distributed training, hyperparameter selection, and model distribution, providing readers with a deep understanding of these complex areas.
Future Trends: The final chapter on future trends in pretraining foundation models gives readers a glimpse into what's next in the field, keeping them ahead of the curve.
In conclusion, if you're looking to ride the wave of LLMs and want to do so using AWS, this book is a must-read. It's more than just a guide, and it's not a beginner's book: it's a comprehensive resource that empowers you to navigate the fast-paced world of ML with confidence and proficiency. Emily Webber's expertise shines through each page, making this book an invaluable asset for anyone in the field. As LLMs continue to evolve and revolutionize various sectors, this book stands as a testament to their transformative potential and a guide for those looking to be part of this exciting journey. This is just the start!
Reviewed in the United States on March 23, 2024
Customizing data sets to be unique in purpose, depth, and completeness is recommended to stay competitive in this rapidly evolving landscape. My first pass through the content of this book helped me to understand that the incorporation of continuous learning and keeping humans in the loop throughout the application workflow is key to leveraging the full potential of LLMs.
The most important takeaway to me was the relationship and scaling between open-source ML models and fine-tuning them for specific tasks. This book provided an understanding of this relationship, describing it, identifying constants involved, and determining the appropriate scaling needed for optimization.
The author did a great job of breaking down the components of Pretraining for this novice. I can only assume that there are tidbits in this book for all levels of AI enthusiasts delving into the preparation of LLMs for purpose. Thank you for this solid information!
However, it might be beneficial to have some prior knowledge in deep learning, as the book can get intricate at times. Some sections could be more streamlined for easier navigation.
Overall, if you're diving into large foundation models and AWS deployment, this is a worthy guide.
Top reviews from other countries
The first chapters are a good introduction, but the book tries to cover too much and ends up not going into detail on anything. There's a lot of "will cover more later in the book", but it never gets covered to any meaningful depth. Also, it often reads like a sales advert for SageMaker.
I would find it hard to recommend this book, as you could just visit four or five blogs and get the same content while saving yourself £40.