<!DOCTYPE html><html lang="en" xmlns="http://www.w3.org/1999/xhtml" xmlns:v="urn:schemas-microsoft-com:vml" xmlns:o="urn:schemas-microsoft-com:office:office" style="font-size:16px;"><head><meta charset="utf-8"/><!--[if !mso]><!--><meta http-equiv="X-UA-Compatible" content="IE=edge"/><!--<![endif]--><meta name="viewport" content="width=device-width,initial-scale=1"/><meta name="x-apple-disable-message-reformatting"/><meta name="format-detection" content="telephone=no,address=no,email=no,date=no,url=no"/><meta name="color-scheme" content="light"/><meta name="supported-color-schemes" content="light"/><title>Video models are zero-shot learners and reasoners</title><!--[if mso]><xml><o:OfficeDocumentSettings><o:AllowPNG/><o:PixelsPerInch>96</o:PixelsPerInch></o:OfficeDocumentSettings></xml><![endif]--><style>
:root { color-scheme: light; supported-color-schemes: light; }
body { margin: 0; padding: 0; min-width: 100%!important; -ms-text-size-adjust: 100% !important; -webkit-transform: scale(1) !important; -webkit-text-size-adjust: 100% !important; -webkit-font-smoothing: antialiased !important; }
.body { word-wrap: normal; word-spacing:normal; }
table.mso { width: 100%; border-collapse: collapse; padding: 0; table-layout: fixed; }
img { border: 0; outline: none; }
table { mso-table-lspace: 0px; mso-table-rspace: 0px; }
td, a, span { mso-line-height-rule: exactly; }
#root [x-apple-data-detectors=true],
a[x-apple-data-detectors=true],
#MessageViewBody a { color: inherit !important; text-decoration: inherit !important; font-size: inherit !important; font-family: inherit !important; font-weight: inherit !important; line-height: inherit !important; }
span.MsoHyperlink { color: inherit !important; mso-style-priority: 99 !important; }
span.MsoHyperlinkFollowed { color: inherit !important; mso-style-priority: 99 !important; }
.a { background-color:#dedede; }
.b { background-color:#2a2a2a; }
.c { background-color:#ffffff; }
.d { background-color:#fff0c8; }
.d2 { background-color:#FFFFFF; }
.d3 { background-color:#FFFFFF; }
h1 a { text-decoration:none;color:#2C81E5;font-style:italic; }
h2 a { text-decoration:none;color:#2C81E5;font-style:italic; }
h3 a { text-decoration:none;color:#2C81E5;font-style:italic; }
h4 a { text-decoration:none;color:#2C81E5;font-style:italic; }
h5 a { text-decoration:none;color:#2C81E5;font-style:italic; }
h6 a { text-decoration:none;color:#2C81E5;font-style:italic; }
h1, h1 a, h2, h2 a, h3, h3 a, h4, h4 a, h5, h5 a, h6, h6 a, ul, li, ol, p, p a { margin: 0;padding: 0; }
h1 { font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif;font-weight:700;font-size:28px;color:#2A2A2A;line-height:42px;padding-bottom:4px;padding-top:16px;mso-margin-top-alt:16px;mso-margin-bottom-alt:4px }
h2 { font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif;font-weight:700;font-size:24px;color:#2A2A2A;line-height:36px;padding-bottom:4px;padding-top:16px;mso-margin-top-alt:16px;mso-margin-bottom-alt:4px }
h3 { font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif;font-weight:400;font-size:20px;color:#2A2A2A;line-height:30px;padding-bottom:4px;padding-top:16px;mso-margin-top-alt:16px;mso-margin-bottom-alt:4px }
h4 { font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif;font-weight:400;font-size:18px;color:#2A2A2A;line-height:27px;padding-bottom:4px;padding-top:16px;mso-margin-top-alt:16px;mso-margin-bottom-alt:4px }
h5 { font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif;font-weight:400;font-size:16px;color:#2A2A2A;line-height:24px;padding-bottom:4px;padding-top:16px;mso-margin-top-alt:16px;mso-margin-bottom-alt:4px }
h6 { font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif;font-weight:400;font-size:14px;color:#2A2A2A;line-height:21px;padding-bottom:4px;padding-top:16px;mso-margin-top-alt:16px;mso-margin-bottom-alt:4px }
p { font-family:'Georgia','Times New Roman',serif;font-weight:400;color:#2D2D2D;font-size:16px;line-height:24px;padding-bottom:8px;padding-top:8px;mso-margin-top-alt:8px;mso-margin-bottom-alt:8px; }
p a, .e a, ul a, li a, .h a, .h2 a, .h3 a { word-break:break-word;color:#2C81E5 !important;text-decoration:none;font-style:italic; }
p a span, .e a span, ul a span, li a span { color: inherit }
p .bold { font-weight:bold;color:#2D2D2D; }
p span[style*="font-size"] { line-height: 1.6; }
.f p { font-size:12px;line-height:15px;color:#2D2D2D;padding:0; }
.f p a { color:#2D2D2D !important; }
.g p { font-family:'Helvetica',Arial,sans-serif;font-size:14px;line-height:20px;font-weight:normal;margin:0; }
.g p a { text-decoration: underline; }
.i p { font-family:'Helvetica',Arial,sans-serif;line-height:23px;font-size:15px;color:#2D2D2D; }
.i p a { color:#2D2D2D !important; }
.i2 p { font-family:'Helvetica',Arial,sans-serif;line-height:23px;font-size:15px;color:#2D2D2D; }
.i2 p a { color:#2D2D2D !important; }
.i3 p { font-family:'Helvetica',Arial,sans-serif;line-height:43px;font-size:24px;color:#2D2D2D; }
.i3 p a { color:#2D2D2D !important; }
.h p a { color:#595959 !important; }
.h2 p a { color:#595959 !important; }
.h3 p a { color:#595959 !important; }
.f p a, .i p a, .i2 p a, .i3 p a, .h p a, .h2 p a, .h3 p a { text-decoration:underline; }
.j { border-top:3px solid #ffeb2d; }
.k p { padding-left:15px;padding-bottom:0px;padding-top:6px;mso-margin-top-alt:6px;mso-margin-bottom-alt:0px;mso-margin-left-alt:15px; }
.o { background-color:#FFFFFF;border:1px solid #F1F1F1;border-radius:5px; }
.o p { font-family:'Helvetica',Arial,sans-serif;padding:0px;margin:0px; }
.l p,
.l p a, .l a { font-size:14px;line-height:20px;font-weight: bold;color:#2D2D2D;padding-bottom:6px;mso-margin-bottom-alt:6px;text-decoration:none; }
.m p,
.m p a { font-size:13px;line-height:18px;font-weight:400;color:#2D2D2D;padding-bottom:6px;mso-margin-bottom-alt:6px;text-decoration:none; }
.n p,
.n p a { font-size:12px;line-height:17px;font-weight:400;color:#2D2D2D;padding-bottom:6px;mso-margin-bottom-alt:6px;text-decoration:none; }
.p { background-color:#FFFFFF;max-width:520px;border:1px solid #E1E8ED;border:1px solid rgba(80, 80, 80, 0.3);border-radius:5px; }
.q { font-size:16px;font-family:Helvetica,Roboto,Calibri,sans-serif !important;border:1px solid #e1e8ed;border:1px solid rgba(80, 80, 80, 0.3);border-radius:10px;background-color:#FFFFFF; }
.q p { font-size:16px;font-family:system-ui,Helvetica,Roboto,Calibri,sans-serif !important;color:#222222;padding:4px 0; }
.r { border:1px solid #E1E8ED !important;border-radius:5px; }
.s p { font-size: 14px; line-height: 17px; font-weight: 400; color: #697882; text-decoration: none; }
.t p { font-family:'Helvetica',Arial,sans-serif;font-size:12px;line-height:18px;font-weight:400;color:#000000;font-style:italic;padding:4px 0px 0px; }
.v { border-radius:10px;border:solid 0px #DFD150;background-color:#2C81E5;font-family:'Open Sans','Segoe UI','Apple SD Gothic Neo','Lucida Grande','Lucida Sans Unicode',sans-serif;color:#FFFFFF; }
.v a { text-decoration:none;display:block;color:#FFFFFF; }
.w p { font-size:12px;line-height:15px;font-weight:400;color:#FFFFFF; }
.w p a { text-decoration: underline !important;color:#FFFFFF !important; }
ul { font-family:'Helvetica',Arial,sans-serif;margin:0px 0px 0px 25px !important;padding:0px !important;color:#2D2D2D;line-height:24px;list-style:disc;font-size:16px; }
ul > li { font-family:'Helvetica',Arial,sans-serif;margin:10px 0px 0px 0px !important;padding: 0px 0px 0px 0px !important; color: #2D2D2D; list-style:disc; }
ol { font-family:'Helvetica',Arial,sans-serif;margin: 0px 0px 0px 25px !important;padding:0px !important;color:#2D2D2D;line-height:24px;list-style:decimal;font-size:16px; }
ol > li { font-family:'Helvetica',Arial,sans-serif;margin:10px 0px 0px 0px !important;padding: 0px 0px 0px 0px !important; color: #2D2D2D; }
.e h3,
.e p,
.e span { padding-bottom:0px;padding-top:0px;mso-margin-top-alt:0px;mso-margin-bottom-alt:0px; }
.e span,
.e li { font-family:'Helvetica',Arial,sans-serif;font-size:16px;color:#2D2D2D;line-height:24px; }
.rec { font-family: ui-sans-serif, system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, "Noto Sans", sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Noto Color Emoji" !important; }
.rec__button:hover { background-color: #f9fafb !important; }
.copyright a {color: inherit !important; text-decoration: none !important; font-size: inherit !important; font-family: inherit !important; font-weight: inherit !important; line-height: inherit !important;}
.txt_social p { padding: 0; word-break: break-all; }
.table, .table-c, .table-h { border: 1px solid #C0C0C0; }
.table-c { padding:5px; background-color:#FFFFFF; }
.table-c p { color: #2D2D2D; font-family:'Helvetica',Arial,sans-serif !important;overflow-wrap: break-word; }
.table-h { padding:5px; background-color:#F1F1F1; }
.table-h p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important;overflow-wrap: break-word; }
@media only screen and (max-width:667px) {
.aa, .w100pc { width: 100% !important; }
.bb img { width: 100% !important; height: auto !important; max-width: none !important; }
.cc { padding: 0px 8px !important; }
.ee { padding-top:10px !important;padding-bottom:10px !important; }
.ff ul, .ff ol { margin: 0px 0px 0px 10px !important;padding: 0px !important; }
.ff li { margin:10px 0px 0px 10px !important; }
.r {height:140px !important;}
.s p { font-size:13px !important;line-height:15px !important; }
.mob-hide {display:none !important;}
.mob-show {display: block !important; width: auto !important; overflow: visible !important; float: none !important; max-height: inherit !important; line-height: inherit !important;}
.mob-stack {width:100% !important;display:block !important;}
.mob-w-full {width:100% !important;}
.mob-block {display:block !important;}
.embed-img {padding:0px 0px 12px 0px !important;}
.socialShare {padding-top:15px !important;}
.rec { padding-left:15px!important;padding-right:15px!important; }
.bodyWrapper { padding:7px 4px 7px 4px !important; }
.social-mobile {float:left !important;margin-top:10px !important;}
}
@media screen and (max-width: 480px) {
u + .a .gg { width: 100% !important; width: 100vw !important; }
.tok-heart { padding-top:75% !important; }
.tok-play { padding-top: 250px !important; }
}
@media screen and (max-width: 320px) {
.tok-heart { padding-top:65% !important; }
}
.u { border: 1px solid #CACACA !important; border-radius: 2px !important; background-color: #ffffff !important; padding: 0px 13px 0px 13px !important; font-family:ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,"Noto Sans",sans-serif !important;font-size: 12px !important; color: #767676 !important; }
.u a { text-decoration: none; display: block !important; color: #767676 !important; margin: 0px !important; }
.u span, .u img { color: #767676 !important;margin:0px !important; max-height:32px !important;background-color:#ffffff !important; }
</style><!--[if mso]><style type="text/css">
h1, h2, h3, h4, h5, h6 {font-family: Arial, sans-serif !important;}
body, table, td, p, a, span {font-family: Arial, sans-serif !important;}
sup { font-size: 100% !important;vertical-align: .5em !important;mso-text-raise: -1.5% !important;line-height: 0 !important; }
ul { margin-left:0px !important; margin-right:10px !important; margin-top:20px !important; margin-bottom:20px !important; }
ul li { margin-left: 0px !important; mso-special-format: decimal; }
ol { margin-left:0px !important; margin-right:10px !important; margin-top:20px !important; margin-bottom:20px !important; }
ol li { margin-left: 0px !important; mso-special-format: decimal; }
li.listItem { margin-left:15px !important; margin-top:0px !important; }
.paddingDesktop { padding: 10px 0 !important; }
.edm_outlooklist { margin-left: -20px !important; }
.embedImage { display:none !important; }
</style><![endif]--><!-- __merge_tags_in_links__ --><style>
@font-face {
font-family: 'Open Sans';
font-style: normal;
font-weight: 700;
font-display: swap;
src: url('https://fonts.gstatic.com/s/opensans/v40/memSYaGs126MiZpBA-UvWbX2vVnXBbObj2OVZyOOSr4dVJWUgsg-1x4gaVIUwaEQbjA.woff2') format('woff2');
}
@font-face {
font-family: 'Open Sans';
font-style: italic;
font-weight: 700;
font-display: swap;
src: url('https://fonts.googleapis.com/css2?family=Open+Sans:ital,wght@1,700&display=swap') format('woff2');
}
</style></head><body class="a" style="margin:0px auto;padding:0px;word-wrap:normal;word-spacing:normal;background-color:#dedede;"><div role="article" aria-roledescription="email" aria-label="email_name" lang="en" style="font-size:1rem"><div style="display:none;max-height:0px;overflow:hidden;"> Plus more about Thinking Augmented Pre-training and Reinforcement Learning on Pre-Training Data  ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ </div><table role="none" width="100%" border="0" cellspacing="0" align="center" cellpadding="0" class="gg"><tr><td align="center" valign="top"><table role="none" width="670" border="0" cellspacing="0" cellpadding="0" class="aa" style="width:670px;table-layout:fixed;"><tr><td class="bodyWrapper" align="center" valign="top" style="padding:7px 7px 7px 7px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td align="center" valign="top" style="border-width:0px 0px 0px 0px;border-style: solid; border-color: #2a2a2a;border-radius:10px 10px 0px 0px;background-color:#ffffff;" class="c"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr id="header"><td style="padding:15px 15px 0px 15px;"><div style="padding-top:0px;padding-right:0px;padding-bottom:20px;padding-left:0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td class="f" align="right" valign="top"><p> September 30, 2025 | <a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.c6q0w4g5sodbtO4I1B_pxSdB5RCIH6yy1Fm1CYma3Ew6WuYwEpeeXtRaboM_v3XZZGh9F0l_L3D6ggA_wBcYKU5xGC1l0fixLDfg0g5cO3LPqD-zA8RKEXaNGHVX020CjosuUHqDcMU_5gqrheojVhpQQV1_WL3IccOCdGGWGJ-zt9aNDGekxE5iUC4F8cm52N6hiQygaznQn-YsuTHHsTmflEQ9R614EQgmABzGwJcE-XvcBnynnI0EVWHdptxgK8lo61isLiNyd2og9-6aj4mbk_ujFPGU7IRYRDT3xy0PLIuWZaFPosfU11iS_jdYDLu0dX6LHElj8UmjqCR7DF5UQq2r_xs6MS6uB2xNQFfohe6V29K7Y6W7hBeCiDiUTdqkJqSC3ObZi7GMB2wU_6PXbjwSS9JWgnekm1H72K88_AuVNqOaJswZl6cgTgpjyxgitsO_3l-FUA0myyHKEKAplpyRJw8sD_hUerawMBQSFm2I3Ks9vdC-SXqe7TFT7RDG2eQnTqxoUrn_YTHRjtHoyyDthRL8ZdTDwT3cJGuCcxCZntonAWnq6dheUvo67qHDMWggyb7JG8jhWxgEIjqqgBomsqOuUowvjZkaWft7PFFUtEmrLFE7Pk9rdKuazye8afyErp6fzqNTk7KNPqjKfqDIDq5VbFdO-OVhejeLQ59lKEjxRgObHczvFRUyenM2_x-jC8R9XNehICIFnWEIZs0HvQRgVSOTFAgI6B2pOA-2NufPMnueU5Gmzq7LYQGF1kk_uqFG59ThrrEMRmszTwEU_Rz_PQOh07GMlcuBrYwuc9cMAEsQRSYfa1OCB57xG7DnC3G0gwaxW-zYgg/4kc/R1NR_ruUQ6G9mf4EocZKug/h0/h001.Xn4Nl-wMT4mbQIRLoZJ3TaAOwQOVVOVMTnZonj5eRyk"><span class="translation_missing" title="translation missing: en.templates.posts.email.header.read_online">Read Online</span></a></p></td></tr><tr><td class="dd" align="center" valign="top" style="padding:15px 0;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td align="center" valign="top"><h1 style="text-align:left;font-family:'Open Sans','Segoe UI','Apple SD Gothic Neo','Lucida Grande','Lucida Sans Unicode',sans-serif;font-weight:Bold;font-size:32px;color:#2A2A2A;padding:2px 0;line-height:38px;"> Video models are zero-shot learners and reasoners </h1><p style="text-align:left;font-family:'Helvetica',Arial,sans-serif;font-weight:normal;font-size:20px;color:#3E3E3E;padding:5px 
0;line-height:24px;"> Plus more about Thinking Augmented Pre-training and Reinforcement Learning on Pre-Training Data </p></td></tr></table></td></tr><tr><td style="line-height:0;"><div data-open-tracking="true"> <img src="https://elink4f7.mail.bycloud.ai/ss/o/u001.3wmUuY8gEWd4_869a_eXcg/4kc/R1NR_ruUQ6G9mf4EocZKug/ho.gif" alt="" width="1" height="1" border="0" style="height:1px !important;width:1px !important;border-width:0 !important;margin-top:0 !important;margin-bottom:0 !important;margin-right:0 !important;margin-left:0 !important;padding-top:0 !important;padding-bottom:0 !important;padding-right:0 !important;padding-left:0 !important;"/> </div></td></tr></table></div></td></tr><tr id="content-blocks"><td class="email-card-body" align="center" valign="top" style="padding-bottom:28px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td align="center" valign="top" style="padding: 20px 28px 20px;" class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin: 0 auto 0 auto"><tr><td align="center" valign="top" style="width:300px;"><p style="opacity: 0.8;"><b>In partnership with</b></p></td></tr><tr><td align="center" valign="top" style="width:300px;"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.c6q0w4g5sodbtO4I1B_pxU54leatptHde-0GADX_7DSobFyAC7h-HCdhgztGfT-nwYcacUb1JZ-EAHFgM-7VGie3eFjyUsET_Ih2l0inC7cHOhViNa7_xhD8WkNRH3HHTV125LN4Vl4hAIWk7LqDfvUP54iFbr9sVFnlQqIE-7JAVfmxx5Fa7KJhxiJBPNT_CTTxDEzEgHCGy94n-vbFNbnob2-eTTHmycTw1JZ2RUF1QhJ0vwT0GPjQNcScjNdsNiJRDJ0cZ95FAzlYleqSUFYrbgsj-uLq8eGiArympIfSURaIOcFXRvlxOpgOSgdgt6LygEC4NdkznjbuhL5W56l_9Qlpcq1Wk_JcK1E1fPg97EVLZjpir8bgguDbBmu6pLInTQo_1TAO75N83ExKi-rbJWHEgqfSkgwn_Ui4ttRSq31amMiTdP1TUTIr1nMPLUO1opIb83ipe6X7ZXioxA/4kc/R1NR_ruUQ6G9mf4EocZKug/h1/h001.VAnPm682f0OI6MIcV_08VkNcCIM2U5v0_35d80tczug" target="_blank" rel="noopener noreferrer nofollow" style="text-decoration:none;"><img src="https://beehiiv-images-production.s3.amazonaws.com/uploads/ad_network/advertiser/logo/321956fd-38aa-446b-871d-e492fd453e44/600x300.png" height="auto" width="300" style="display:block;" lborder="0"/></a></td></tr></table></td></tr><tr><td id="nov-18-th-nov-24-th-33-latest-ai-re" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h6 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:87.5%;"><i>Sep 22nd ~ Sep 29th</i><br><i>#75 Latest AI Research Explained Simply</i></h6></td></tr><tr><td><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" style=""><tr><td bgcolor="#222222" style="background-color:#222222;padding:0.0px 0.0px 0.0px 0.0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0"><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"></p></td></tr></table></td></tr></table></td></tr><tr><td id="industry-news-in-1-line" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:Bold;padding:0px 28px;text-align:left;"><h2 style="color:#2A2A2A;font-weight:Bold;mso-line-height-alt:150.0%;">🗞️ Industry News in 1 Line</h2></td></tr><tr><td style="padding-bottom:12px;padding-left:50px;padding-right:40px;padding-top:12px;" class="ee"><div style="margin-left:0px;" class="edm_outlooklist"><ol start="1" style="list-style-type:decimal;margin:0px 0px;padding:0px 0px 0px 0px;"><li class="listItem ultext"><p 
style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"><span style="background-color:#e0e0e0;"><span style="color:rgb(255, 58, 58);font-size:0.6rem;">♥ 6.3k</span></span> <code>DeepSeek-V3.2-Exp</code> is now available on App, Web, and API platforms with <a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.VomAAYwkCjux8i_FMc4kJenfjI5JRwchmV1otB2zAaTDB5__nYLk0BTe8kqy7cVJzuDV_1nxp__jeCFRtCAyVOyKpvpFEy69LOf8HiSerYlAsgdbW0o3SUaqZ7Z8N_Wrbj0xf4VTbv2e23R-1P4qqVqDT8ceQSEKBmp48M3KyPPhf_tv7oNZHRIqItxoiXrnDSWmfWtj2rTdJOnxbuK9NZSYMJ8LZW5ymqaGHH9lkUp_3PNbSviIMKY7T57Qp3RrRrcfns8yAtfdBG1odcAdOvfs47of0_zfBSNOPcMxR-ey5-qLs9Mr2QKCpMDivkc8FPd65c1qlfcBaShKPkqxzQ/4kc/R1NR_ruUQ6G9mf4EocZKug/h2/h001.IQqt_KZiMFxp8LZC_xU0Fm_0NA02KSR6v9EODJUYHZk" target="_blank" rel="noopener noreferrer nofollow"><span>DeepSeek Sparse Attention (DSA)</span></a> technology that enables faster training and inference on long-context tasks. The model is fully open source with technical documentation and GPU kernels available on <a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.CxDkkVpJsBdVoe83c_tBWoNV4Z0gzLAqdXCVgcyx3wsmEMg5nw7JnMGZ7SvhfwkVx8yMFa1bifPFFrqfVRsnKkMT8ShHfkliLvV129o0FCbHCP1vvtPfGjdkRBn6x5ww0kTNkhy0CBSSsBq8niuvmwEmvEilzfVA-z3g4J-3anK1GBnx0FlKUMcagbskRPmrJ44fMYJRp2OFdEukVGuD3jr5QiBjO58SJAP77GoCcTZAEH88Rdocsu4PxfgBMEMan0deOZ4aUGku053fLd4FNE_1GkDRC_ldM8_yWgZD7xI/4kc/R1NR_ruUQ6G9mf4EocZKug/h3/h001.-BnUSJWPtCt7DQt51TqGyCvrbR2eN7Mh7VIOjhU1E_I" target="_blank" rel="noopener noreferrer nofollow"><span>Hugging Face</span></a>. You can test out <a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.fUNb4GdFo9D3F8WuLArtofIbyzMYf_oYQfHTkZZcLAyBX07xnlwlHKswM75AjG-ZVuE1vtF6XTVZtRUa2KV5s3fR47WP9u4Ds3oEEhQtuEV9Q2gji6wErPtK-3JXBi58v6ACWKZL6fkEk8DW48hFoagmVODrnLY8MPdX-9ut3Xa3_3oo1_OSGi5ImJhENDjsLmEwrvRAUtc_pBgkA-gtB-NeOBE-WCQI2gXVt7vkXs0AWvYYBYgi-Yov9aGxxbbFBdF1mZHLKj1ZIKHkkISXC1Ycjlfg9V2ztLQYonVvG70/4kc/R1NR_ruUQ6G9mf4EocZKug/h4/h001.XivtYZsEch9Zfx9aRkzrQKYgtyWZlKDYco2-tgexbrU" target="_blank" rel="noopener noreferrer nofollow"><span>DeepSeek-V3.2-Exp through the DeepSeek API right now</span></a> (<b>API pricing has been reduced by over 50%)</b>. 
</p><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/2480bcc4-101e-4919-ba6b-c46a7376aa25/v3_2_benchmark.jpg?t=1759255866" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr></table></li><li class="listItem ultext"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"><span style="background-color:#e0e0e0;"><span style="color:rgb(255, 58, 58);font-size:0.6rem;">♥ 19k</span></span> Anthropic has announced significant upgrades to Claude, starting with <a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.DUiN96-Eq7pUHzwEhy5j2yo571wOI2ayM48Skcu9lJLrZxl9wj07gwQhp6CxHXLyOsuY31YcRKGWGQ3Ri2SQOD-yFhLj7SpS2BBrd3w_Q-np9sMghp3i2JXyoP7jmyGQ1IB7WaH0ADw4Q66q_g2UVOVIZ15uklF3LoJGTWz9aYejzH7Sagal2ZyDXdT17AUKH-7zHrFZFzl6ThygxsjC1hu8LFFqa52FdbFS0hfSNM8gGws9x3AwSWlBpuZDwS7vD4_MzyYXzJbYPaFGo2pCyX19f6KUlnnsEuK7mE-fbPc/4kc/R1NR_ruUQ6G9mf4EocZKug/h5/h001.rU6FhSEqRpJowlOvOkKO-vTFOZwvawsckvTm5i_ws1c" target="_blank" rel="noopener noreferrer nofollow"><span>Claude Sonnet 4.5</span></a>, which brings enhanced capabilities including code execution for data analysis and visualization, a redesigned Claude Code terminal interface, and a <b>checkpoints feature</b> that enables users to save progress and roll back when needed during large tasks. </p><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> Additionally, the <a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.DUiN96-Eq7pUHzwEhy5j2yo571wOI2ayM48Skcu9lJJDvx2zq_89rFpOTTSMQKUrhUBDVR7D7NgTOrT4P6jN7ps9AKj0KMAb99GTr2FhsKjtYo0k39cgo3vXcBqYeGTKTnnBU2OyruPHfwDA83hGc7UQhBcurWRoRZQYtdFcsPJIy9txel9FgRSL8WvisKHi7CyPbvNZz-hm-7V5Stx81g8O88DxWi0gT9Kc7W_nBTDMCL-NnpVKlM7OjWhdKr_2a5g75lTbu1Yr0H7KU5HefgnqeIo2GuBgUiZZIKzqJh0/4kc/R1NR_ruUQ6G9mf4EocZKug/h6/h001.hTbOgo4dPeJFsnkd68Pni-BpHBLm6Tkx_C-zkKipW6U" target="_blank" rel="noopener noreferrer nofollow"><span><b>Claude for Chrome extension</b></span></a> is now available to all waitlist members. Max users can access a five-day research preview of "<a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.ZsobsZmG6kUZ4LjqczYBVGsLf83KRGEhNe9KyuxhFsMAiLH3Q_Ud95pgkXk6BtalTRULHTv3iGimLmIWLLlMKyXHnFhifFRf_VKAzDfEs02vjxGIIiIokJ32fmr6L0s3su1zqkrOD6xVL9I4pCSEN83DVaLXHDBTi1At9o9PbU7g9Y5cvUSu7YOldAdkrn8sN6OLmtkmKVytle49mI2153FzMWu2e18bGjauQt07rqOn4M0auIioDMKKHih0cAi-LdhNjl4Yp8IruFQAYLfdXH3C7FOpqB3nC-maTuEUwvNUNYyA4VMXslp7n68CjgzCKqWdW5VCgQ3nZOglgQlZ2A/4kc/R1NR_ruUQ6G9mf4EocZKug/h7/h001.aCaaL2LmPRIAMwqG_TR3dB7m0eDzrmcTtuz4yNL3hDU" target="_blank" rel="noopener noreferrer nofollow"><span><b>Imagine with Claude</b></span></a>", which is an experimental feature that generates software dynamically without predetermined functionality or prewritten code. 
</p><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/13249de5-d143-499b-b819-625178b8244e/6421e7049ff8b2c4591497ec92dc4157b2ac1b30-3840x2160.jpg?t=1759256241" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr></table></li><li class="listItem ultext"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"><span style="background-color:#e0e0e0;"><span style="color:rgb(255, 58, 58);font-size:0.6rem;">♥ 1.8k</span></span> inclusionAI has released <code>Ring-1T-preview</code>, which is the first open-source thinking model with 1 trillion parameters. This model solved IMO25 Question 3 in a single attempt with partial solutions for Questions 1, 2, 4, and 5. You can visit Hugging Face to <a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.CxDkkVpJsBdVoe83c_tBWiHxAxLcebl_xZNuLdFlDkCsTXMgmvUmTVZWhD2E9FvV3PYsRHdu0j5MlyxSrZKp6rxzgfF8KomNt5T2fSDBmPO9QSDadmgEJRWMCH8QNA9S0VyldDWW8LNndqLp_z5FdjNkO-H2CkNVuhCq022l1LlbSvRiM-B13eMkmccrPuL5xMtttrdBAm0rxG5HDdHgHpub-lpTDufAX8vI7QGEvSqodZIn9IB2B4s4rpUrE1Vnjqhmi5K28CSc2v-BH3Hz2z0Mkkb4uDxhb-eVutArwfI/4kc/R1NR_ruUQ6G9mf4EocZKug/h8/h001.U_QD_gonNtwHFWCYMnmmdudBaGm4J8yoQR8okCRhN_8" target="_blank" rel="noopener noreferrer nofollow"><span>test Ring-1T-preview</span></a> yourself and experience trillion-parameter reasoning in action.<br></p><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/d4ba1220-cc0b-435b-992b-cfc71d8e7ce2/image.png?t=1759256568" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr></table></li></ol></div></td></tr><tr><td><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" style=""><tr><td bgcolor="#222222" style="background-color:#222222;padding:0.0px 0.0px 0.0px 0.0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0"><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"></p></td></tr></table></td></tr></table></td></tr><tr><td id="not-actively-job-hunting-great-most" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h3 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:125.0%;">Not actively job hunting? 
Great, most people on Dex aren’t.</h3></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.c6q0w4g5sodbtO4I1B_pxU54leatptHde-0GADX_7DSobFyAC7h-HCdhgztGfT-nwYcacUb1JZ-EAHFgM-7VGie3eFjyUsET_Ih2l0inC7cHOhViNa7_xhD8WkNRH3HHTV125LN4Vl4hAIWk7LqDfvUP54iFbr9sVFnlQqIE-7JAVfmxx5Fa7KJhxiJBPNT_CTTxDEzEgHCGy94n-vbFNbnob2-eTTHmycTw1JZ2RUF1QhJ0vwT0GPjQNcScjNdsNiJRDJ0cZ95FAzlYleqSUFYrbgsj-uLq8eGiArympIfSURaIOcFXRvlxOpgOSgdgt6LygEC4NdkznjbuhL5W56l_9Qlpcq1Wk_JcK1E1fPg97EVLZjpir8bgguDbBmu6VLN51xaLSHUkFC2mBaea8az5MzjuzkAGUEgpNQZzRQ2cZGTeHMCb145ZmsnEMgfuye00HW66PTv_BAOUowP4lQ/4kc/R1NR_ruUQ6G9mf4EocZKug/h9/h001.Wf9VOrpIhLvPZsqBQJy6rfPlo8h68rgURr4TVLjSnkg" rel="noopener noreferrer nofollow" style="text-decoration:none;" target="_blank"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/9e55427d-cd09-4996-ab1a-5d03e798fe12/1200x600_-_Not_actively_job_hunting.png?t=1758728696" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></a></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.c6q0w4g5sodbtO4I1B_pxU54leatptHde-0GADX_7DSobFyAC7h-HCdhgztGfT-nwYcacUb1JZ-EAHFgM-7VGie3eFjyUsET_Ih2l0inC7cHOhViNa7_xhD8WkNRH3HHTV125LN4Vl4hAIWk7LqDfvUP54iFbr9sVFnlQqIE-7JAVfmxx5Fa7KJhxiJBPNT_CTTxDEzEgHCGy94n-vbFNbnob2-eTTHmycTw1JZ2RUF1QhJ0vwT0GPjQNcScjNdsNiJRDJ0cZ95FAzlYleqSUFYrbgsj-uLq8eGiArympIfSURaIOcFXRvlxOpgOSgdgt6LygEC4NdkznjbuhL5W56l_9Qlpcq1Wk_JcK1E1fPg97EVLZjpir8bgguDbBmu6kZcjgu4ag7y0sVV5F_SCHK8Fia0PPbhfiFWeKj4irJEd1s3aycfEc24l4f7N5Y7sGOvOm1C0FHMoH9VDGEGrqQ/4kc/R1NR_ruUQ6G9mf4EocZKug/h10/h001.khQ7wRC8BJ1e29bs3uMHNUDeMDT7QKYdrbqQFyaGqmE" target="_blank" rel="noopener noreferrer nofollow"><span>Dex</span></a> is a conversational AI and career matchmaker that works on behalf of each person. You spend 15-20 minutes on the phone with him, talking about your experience, your ambitions and your non-negotiables. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Dex then scans thousands of roles and companies to identify the most interesting and compatible opportunities. 
</p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Once we’ve found a match, <a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.c6q0w4g5sodbtO4I1B_pxU54leatptHde-0GADX_7DSobFyAC7h-HCdhgztGfT-nwYcacUb1JZ-EAHFgM-7VGie3eFjyUsET_Ih2l0inC7cHOhViNa7_xhD8WkNRH3HHTV125LN4Vl4hAIWk7LqDfvUP54iFbr9sVFnlQqIE-7JAVfmxx5Fa7KJhxiJBPNT_CTTxDEzEgHCGy94n-vbFNbnob2-eTTHmycTw1JZ2RUF1QhJ0vwT0GPjQNcScjNdsNiJRDJ0cZ95FAzlYleqSUFYrbgsj-uLq8eGiArympIfSURaIOcFXRvlxOpgOSgdgt6LygEC4NdkznjbuhL5W56l_9Qlpcq1Wk_JcK1E1fPg97EVLZjpir8bgguDbBmu6pOczdF6OFN141_CLBuPkS_vhnKWFy1IH04hQIiN4OtlT5eH0OCrP_NpB3F3W9WpyhZjZIquVUSmlikxBs0cxIg/4kc/R1NR_ruUQ6G9mf4EocZKug/h11/h001.iRGSLwcGxFbQN88UxTqjJtvcQefjn3feCGGLFi41Ehk" target="_blank" rel="noopener noreferrer nofollow"><span>Dex</span></a> connects you to hiring managers and even helps you prep for interviews. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Thousands of exceptional engineers have already signed up and we’re partnered with many of the UK’s leading Start-ups, Scale-ups, hedge funds and tech companies. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Don’t waste another day at a job you hate. Speak with Dex today. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.c6q0w4g5sodbtO4I1B_pxU54leatptHde-0GADX_7DSobFyAC7h-HCdhgztGfT-nwYcacUb1JZ-EAHFgM-7VGie3eFjyUsET_Ih2l0inC7cHOhViNa7_xhD8WkNRH3HHTV125LN4Vl4hAIWk7LqDfvUP54iFbr9sVFnlQqIE-7JAVfmxx5Fa7KJhxiJBPNT_CTTxDEzEgHCGy94n-vbFNbnob2-eTTHmycTw1JZ2RUF1QhJ0vwT0GPjQNcScjNdsNiJRDJ0cZ95FAzlYleqSUFYrbgsj-uLq8eGiArympIfSURaIOcFXRvlxOpgOSgdgt6LygEC4NdkznjbuhL5W56l_9Qlpcq1Wk_JcK1E1fPg97EVLZjpir8bgguDbBmu6u4XVsdBa-OvQ5YocZAfBw662zOnGU-jvhjHzQAWM79I-OKu-m_km3yipeujRPT-rV5HGIIsGYbA-d2KFhuJTPQ/4kc/R1NR_ruUQ6G9mf4EocZKug/h12/h001.kL5HPvogFuVJisq652sNseTthsp3R2dRr0DvYkXoxGU" target="_blank" rel="noopener noreferrer nofollow"><span>Try for Free</span></a></p></td></tr><tr><td><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" style=""><tr><td bgcolor="#222222" style="background-color:#222222;padding:0.0px 0.0px 0.0px 0.0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0"><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"></p></td></tr></table></td></tr></table></td></tr><tr><td id="video-models-are-zeroshot-learners-" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:Bold;padding:0px 28px;text-align:left;"><h2 style="color:#2A2A2A;font-weight:Bold;mso-line-height-alt:150.0%;">Video models are zero-shot learners and reasoners</h2></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><i>Wiedemer</i><span style=""><i> et al. 
[</i></span><i>Google DeepMind</i><span style=""><i>]</i></span></p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><span style="background-color:#e0e0e0;"><span style="color:rgb(255, 58, 58);font-size:0.6rem;"> ♥ 485 </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span><span style="background-color:#e0e0e0;"><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> LLM Reasoning </span></span></p></td></tr><tr><td id="introduction-to-video-models-as-zer" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h3 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:125.0%;">Introduction to Video Models as Zero-Shot Learners</h3></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> For years, computer vision has depended on specialized tools (one model for segmentation, another for object detection). This made the field fragmented and less adaptable. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> The Veo 3 research paper shows that when video models are trained on vast amounts of video data with simple generative objectives, they can become general-purpose foundation models for vision, much like LLMs did for language. </p></td></tr><tr class="embed-gen-img-r"><td align="center" valign="top" style="padding:12px 27px 12px 27px;" class="dd"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td align="center" valign="top" class="o" style="padding:12px 12px 12px 12px;;background-color:#FFFFFF;border-color:#F1F1F1;border-radius:5px 5px 5px 5px;border-width:1px 1px 1px 1px;"><!--[if !mso]><!--><div style="display:none; float:left; overflow:hidden; width:0; max-height:0; line-height:0;" class="mob-show"><table role="none" border="0" cellspacing="0" cellpadding="0" align="right" width="100%"><tr><td align="center" valign="top"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001._C0nEVD-E08zSV7Iuq9_0tWkjGWcp4JFqflhDfmAly3ES0_gPKMWoFiutdJr-9JO7GTUHOC1uVffroBva14LeM8kM1HxGOcF5_1DsIXLgStoohn5gdKUaUT7D03xmbxWAJBObZs0EGThxKcZhptgSYBDIrCWCKZ-SNgk5msNDL0kaO624eomUNSGHnjEYszsQEgvg-scsrS5h4Z7gSmh7ZwrmfwJTgfp-Y4qSHH2zu-ssnJaOVvaCmH2W_8iW9ZtCcvj-AMfBtiCZBqMSFVEpw/4kc/R1NR_ruUQ6G9mf4EocZKug/h13/h001.BhDrVhAgOLZVElMnwbCphJOSgdBvHH678A6fXSbq0_M" target="_blank"><img src="https://video-zero-shot.github.io/assets/preview.png" width="100%" style="height:auto;display:block;"/></a></td></tr><tr><td height="16" style="font-size:16px;line-height:16px;"> </td></tr></table></div><!--<![endif]--><table role="none" border="0" cellspacing="0" cellpadding="0" align="right" width="100%"><tr><td width="57%" align="center" valign="middle" class="mob-stack"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td align="left" valign="middle" class="l"><p><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001._C0nEVD-E08zSV7Iuq9_0tWkjGWcp4JFqflhDfmAly3ES0_gPKMWoFiutdJr-9JO7GTUHOC1uVffroBva14LeM8kM1HxGOcF5_1DsIXLgStoohn5gdKUaUT7D03xmbxWAJBObZs0EGThxKcZhptgSYBDIrCWCKZ-SNgk5msNDL0kaO624eomUNSGHnjEYszsQEgvg-scsrS5h4Z7gSmh7QxeBdp4Ux2RCFUxyRUASng9MMA_-0ppfi22szZ8k7mpE5hpTJQWKfSGK912wphmbg/4kc/R1NR_ruUQ6G9mf4EocZKug/h14/h001.3GQ4Xd_QWBrmD_0DfE34gTO2BFFgpVzdCeVfF7eo5j4" 
style="text-decoration:none;font-style:normal;color:#2D2D2D !important;font-size:14px;line-height:20px;" target="_blank"> Video models are zero-shot learners and reasoners <tr><td align="left" valign="top" class="m"><p style="font-size:13px;line-height:19px;color:#2D2D2D;"> Video models like Veo 3 are on a path to become vision foundation models. </p></td></tr><tr><td align="left" valign="bottom" class="n" style="vertical-align:bottom;padding-top:12px;"><p style="word-break:break-word;">video-zero-shot.github.io</p></td></tr></a></p></td></tr></table></td><td width="3%" style="font-size:16px;line-height:16px;" class="mob-hide"> </td><td width="40%" align="left" valign="top" class="mob-hide"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001._C0nEVD-E08zSV7Iuq9_0tWkjGWcp4JFqflhDfmAly3ES0_gPKMWoFiutdJr-9JO7GTUHOC1uVffroBva14LeM8kM1HxGOcF5_1DsIXLgStoohn5gdKUaUT7D03xmbxWAJBObZs0EGThxKcZhptgSYBDIrCWCKZ-SNgk5msNDL0kaO624eomUNSGHnjEYszsQEgvg-scsrS5h4Z7gSmh7e41uYHY5Y_CuGoBT-cD7AfjpDjReRdiLurCB4BAYGA9H34g1EbG8kWAJ4vx7p6arA/4kc/R1NR_ruUQ6G9mf4EocZKug/h15/h001.5hQwI30BKVaIy1WSSSIQ66b86jzStkPkRgVHkMU1sU4" target="_blank"><img src="https://video-zero-shot.github.io/assets/preview.png" width="230" style="height:auto;display:block;"/></a></td></tr></table></td></tr></table></td></tr><tr><td id="inner-working-of-veo-3-s-zero-shot-" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h3 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:125.0%;">Inner Working of Veo 3's Zero-Shot Mechanism</h3></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> The approach behind Veo 3 is very straightforward: users provide an initial image and a text instruction, and the model generates a short video in response. This method mirrors the prompting strategy that made LLMs so versatile, avoiding the need for fine-tuning or custom architectures. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Veo 3 processes both spatial and temporal information, which allows it to animate scenes frame by frame based on the prompt. This frame-by-frame generation acts like a "chain-of-frames," where each step in the video can represent a logical progression, similar to how chain-of-thought reasoning works in language models. 
</p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/39bffc01-5727-471c-898e-3e70fa3b6016/image.png?t=1759253419" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr><tr><td align="center" valign="top" class="t" style="width:626px; padding: 4px 0px 4px 0px;"><p>Veo 3 zero-shot learning and reasoning examples.</p></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> This capability enables Veo 3 to handle a hierarchy of visual tasks, starting with basic perception, like identifying edges or segmenting objects, and then moving to modeling physical properties, such as buoyancy or material interactions. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> From there, it progresses to manipulation tasks, such as editing images by changing colors or removing backgrounds, and finally to visual reasoning, where it solves puzzles or navigates mazes over multiple frames. The model's training on diverse video data gives it a broad understanding of visual concepts, which it applies dynamically through this structured generation process. </p></td></tr><tr><td id="evaluation-and-benchmark-performanc" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h3 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:125.0%;">Evaluation and Benchmark Performance of Veo 3</h3></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Veo 3 shows impressive zero-shot performance in tests across various tasks, and sometimes rivals specialized models like <b>Nano Banana</b>. For instance, in edge detection, Veo 3 achieved a pass@10 rate of 0.77, and in segmentation, it reached a mean Intersection over Union of 0.74, comparable to dedicated image editing models. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> It excelled in object extraction, correctly identifying and lining up animals in 92% of cases with multiple attempts, and demonstrated strong abilities in image editing, though it sometimes introduced unintended animations. 
</p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/63b3c9d5-ab41-4f35-a5e3-1af53af59a9e/image.png?t=1759253468" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr><tr><td align="center" valign="top" class="t" style="width:626px; padding: 4px 0px 4px 0px;"><p>Testing visual symmetry</p></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> On reasoning tasks, Veo 3 solved mazes with up to 78% accuracy on 5x5 grids and handled visual symmetry problems with high success rates, <b>outperforming Veo 2 by wide margins</b>. However, complex analogies occasionally make errors, and better control over unintended scene changes is needed. </p></td></tr><tr class="btn_row"><td valign="top" style="padding-bottom:14px;padding-left:28px;padding-right:28px;padding-top:14px;text-align:center;width:100%;word-break:break-word;" class="dd"><table width="100%" role="none" border="0" cellspacing="0" cellpadding="0" style="margin:14px auto 14px auto;"><tr><td align="center" valign="middle"><table role="none" border="0" cellspacing="0" cellpadding="0"><tr><td style="background-color:#2C81E5;border-radius:8px;mso-padding-alt:14px 20px;" class="btn"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.fUNb4GdFo9D3F8WuLArtoV5sElgytBlvJRzI9WtI92behby1i2J6xEhXN5Eyl9ZPe8ktZvFnyrvWQqtAJS_iqgXh13dGkj5G5QpLSFU6ta21NakB5fjxEXahDJRAHe7SgZTLZGAvypvP-d1UmHGpx2L12-YVjVyw9SwCyWjfG7vsFiPPgv5HcE1pSGQ1WoVHS_S5pIgEz7_zOSgoTWoI5-QQOWVxEI6Nx1DNkvqR8_hb8YsGd7QxcFS7U601pt5YXkUqCsG6nGHJMasg4Vx2hg/4kc/R1NR_ruUQ6G9mf4EocZKug/h16/h001.QoTyXP_Xmt7Miw67n1DkvzrumlzrgBsOgjyYMGwE28k" target="_blank" rel="noopener noreferrer nofollow" style="background-color:#2C81E5;border-radius:8px;color:#FFFFFF;display:inline-block;font-family:'Open Sans','Segoe UI','Apple SD Gothic Neo','Lucida Grande','Lucida Sans Unicode',sans-serif;font-size:16px;font-weight:normal;line-height:18px;padding:14px 20px;text-decoration:none;"> Read Full Paper </a></td></tr></table></td></tr></table></td></tr><tr><td><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" style=""><tr><td bgcolor="#222222" style="background-color:#222222;padding:0.0px 0.0px 0.0px 0.0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0"><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"></p></td></tr></table></td></tr></table></td></tr><tr><td id="thinking-augmented-pretraining" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:Bold;padding:0px 28px;text-align:left;"><h2 style="color:#2A2A2A;font-weight:Bold;mso-line-height-alt:150.0%;">Thinking Augmented Pre-training</h2></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><i>Wang</i><span style=""><i> et al. 
[</i></span><i>Microsoft Research</i><span style=""><i>]</i></span></p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><span style="background-color:#e0e0e0;"><span style="color:rgb(255, 58, 58);font-size:0.6rem;"> ♥ 22k </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span><span style="background-color:#e0e0e0;"><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> Pre-training </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span></p></td></tr><tr><td id="introduction-to-thinking-augmented-" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h3 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:125.0%;">Introduction to Thinking Augmented Pre-Training</h3></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> As large language models grow, the demand for high-quality training data is quickly outpacing the supply of human-written text on the web. Within that text, some valuable tokens are inherently difficult for a model to learn directly because they compress a long chain of reasoning into a single step. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Thinking Augmented Pre-Training, or TPT, tackles this by enriching existing text data with automatically generated thinking trajectories. These trajectories act like a step-by-step reasoning guide, breaking complex ideas down into simpler parts that are easier for models to digest. This training method <b>boosts data efficiency without requiring more raw documents</b>. </p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/d73a5270-fead-4022-b8a9-fc76dfb8c529/tpt.png?t=1759253936" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr><tr><td align="center" valign="top" class="t" style="width:626px; padding: 4px 0px 4px 0px;"><p>The average few-shot accuracy scores on the GSM8k and MATH datasets with respect to total training tokens. </p></td></tr></table></td></tr><tr><td id="inner-working-of-thinking-augmented" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h3 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:125.0%;">Inner Working of Thinking Augmented Pre-Training</h3></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> TPT augments each document in the training set with a thinking trajectory generated by an existing language model. For a given text, such as a math problem or an explanatory passage, the system prompts an off-the-shelf model to simulate an expert’s thought process as they analyze the content. 
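</p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> A minimal sketch of that augmentation step, assuming a generic text-generation callable; the <code>llm</code> function, the prompt wording, and the delimiter below are illustrative, not the paper's exact template: </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><pre style="margin:8px 0;padding:12px;background-color:#F6F6F6;border:1px solid #E1E8ED;border-radius:5px;font-family:'Courier New',Courier,monospace;font-size:13px;line-height:19px;color:#2D2D2D;overflow-x:auto;">
# Hedged sketch of TPT-style data augmentation. `llm` is any
# text-generation callable (e.g. a chat-completion client); prompt
# wording and the "Thinking:" delimiter are illustrative choices.

THINKING_PROMPT = (
    "You are an expert reading the document below. Think step by step,"
    " working through why its key claims hold and unpacking the hardest"
    " steps in detail.\n\nDocument:\n{doc}"
)

def augment_document(llm, doc: str) -> str:
    """Return the document with a generated thinking trajectory appended,
    forming one extended sample for ordinary next-token training."""
    thinking = llm(THINKING_PROMPT.format(doc=doc))
    return doc + "\n\nThinking:\n" + thinking
</pre></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;">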
</p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> This thinking text is then appended to the original document, which forms a single, extended training example. The model is then trained on these augmented samples using the standard next-token prediction objective. It means that the model is learning not only from the original content but also from the detailed reasoning that accompanies it. </p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/7c8ce6d7-0180-4087-a90a-4c40a3c0bd09/image.png?t=1759254169" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> This approach naturally directs more training attention toward high-value or difficult concepts. For example, in domains like mathematics and physics, the generated thinking trajectories tend to be longer, meaning the model spends more time processing and learning from these reasoning-intensive sections. </p></td></tr><tr><td id="evaluation-and-benchmark-performanc" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h3 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:125.0%;">Evaluation and Benchmark Performance of Thinking Augmented Pre-Training</h3></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Experiments with TPT show substantial improvements in both data efficiency and final model performance. When pre-training an 8-billion-parameter model from scratch on 100 billion tokens, the TPT-enhanced version reached performance comparable to LLaMA-3.1-8B, which was trained on 15 trillion tokens (<b>3x improvement in data efficiency</b>). </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> On reasoning-heavy benchmarks like GSM8k and MATH, TPT models more than doubled the scores of vanilla pre-training, and achieved 50.1% and 21.8% respectively, compared to 19.2% and 9.1% for the baseline. </p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/49c50a7c-fec7-48a8-a6d4-6e7748e2e057/image.png?t=1759254192" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Ablation studies confirmed that even when using smaller models to generate the thinking trajectories, the performance remained strong. 
The approach consistently improved results as training data increased, with no signs of plateauing even at 100 billion tokens. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> One limitation is that thinking trajectories for expert-level texts were sometimes shorter. This is possibly because such content assumes prior knowledge and requires fewer explanatory steps. </p></td></tr><tr class="btn_row"><td valign="top" style="padding-bottom:14px;padding-left:28px;padding-right:28px;padding-top:14px;text-align:center;width:100%;word-break:break-word;" class="dd"><table width="100%" role="none" border="0" cellspacing="0" cellpadding="0" style="margin:14px auto 14px auto;"><tr><td align="center" valign="middle"><table role="none" border="0" cellspacing="0" cellpadding="0"><tr><td style="background-color:#2C81E5;border-radius:8px;mso-padding-alt:14px 20px;" class="btn"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.fUNb4GdFo9D3F8WuLArtoV5sElgytBlvJRzI9WtI92bGFV_BOIlUYKeXfxMvFZyPpUVwJF8QiBL4hL0Om6VAjnqMXU-ErhADvRW5wM_lbTwVq232oUalwP076G_eDvO2usYpvez1j4dnQTGqiQr4D1S13QLjlcdOPhKsXSEKB4K1Vw3gzUde-UJFOBNrExqHZg_2tTvnoOvuD4_f02QTesLLBc10nnzAPoSyu-UhUbKveAzWv6yEs74NBQDNDJIYCluPnQ0XNOwxce6wVhQJig/4kc/R1NR_ruUQ6G9mf4EocZKug/h17/h001.-ckbDtMVGbuKg_PR5kLuEvXq4Lilj8kqOWi13dgxPs0" target="_blank" rel="noopener noreferrer nofollow" style="background-color:#2C81E5;border-radius:8px;color:#FFFFFF;display:inline-block;font-family:'Open Sans','Segoe UI','Apple SD Gothic Neo','Lucida Grande','Lucida Sans Unicode',sans-serif;font-size:16px;font-weight:normal;line-height:18px;padding:14px 20px;text-decoration:none;"> Read Full Paper </a></td></tr></table></td></tr></table></td></tr><tr><td><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" style=""><tr><td bgcolor="#222222" style="background-color:#222222;padding:0.0px 0.0px 0.0px 0.0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0"><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"></p></td></tr></table></td></tr></table></td></tr><tr><td id="reinforcement-learning-on-pre-train" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:Bold;padding:0px 28px;text-align:left;"><h2 style="color:#2A2A2A;font-weight:Bold;mso-line-height-alt:150.0%;">Reinforcement Learning on Pre-Training Data</h2></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><span style=""><i>Li et al. 
[</i></span><i>Tencent, HunYuan Infra Team, The Chinese University of Hong Kong</i><span style=""><i>]</i></span></p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><span style="background-color:#e0e0e0;"><span style="color:rgb(255, 58, 58);font-size:0.6rem;"> ♥ 424 </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span><span style="background-color:#e0e0e0;"><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> LLM Training </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span><span style="background-color:#e0e0e0;"><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> bycloud’s pick </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span></p></td></tr><tr><td id="introduction-to-reinforcement-learn" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h3 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:125.0%;">Introduction to Reinforcement Learning on Pre-Training Data (RLPT)</h3></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> As large language models grow, simply adding more parameters or training tokens no longer guarantees major gains. This challenge has sparked interest in new ways to use existing data more effectively. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> One promising direction comes from a method called Reinforcement Learning on Pre-Training Data, or RLPT. Instead of relying only on supervised learning, RLPT applies reinforcement learning directly to the raw text that models are already pre-trained on. </p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/f006c168-1da3-4218-b998-b1daba770156/image.png?t=1759255101" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr><tr><td align="center" valign="top" class="t" style="width:626px; padding: 4px 0px 4px 0px;"><p>Overview of RLPT. Raw data from internet corpora is processed into training samples.</p></td></tr></table></td></tr><tr><td id="how-does-reinforcement-learning-on-" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h3 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:125.0%;">How Does Reinforcement Learning on Pre-Training Data Work</h3></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> RLPT works by having the language model predict the next segment of text (like a full sentence or a reasoning step) based on the context that comes before it. This is different from standard next-token prediction, which only looks one word ahead. 
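</p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> To make this concrete, here is a minimal Python sketch of how such segment-prediction samples can be cut from raw text, together with the semantic reward check described after the figure below. The naive sentence splitting, 50/50 task mix, and judge prompt are illustrative choices, not the paper's exact recipe; the two task names used in the code are defined in the next paragraph: </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><pre style="margin:8px 0;padding:12px;background-color:#F6F6F6;border:1px solid #E1E8ED;border-radius:5px;font-family:'Courier New',Courier,monospace;font-size:13px;line-height:19px;color:#2D2D2D;overflow-x:auto;">
# Hedged sketch of RLPT-style sample construction plus the reward check.
import random

def make_rlpt_sample(doc: str) -> dict:
    """Cut one segment-prediction sample from a raw pre-training document."""
    # Naive sentence split, for illustration only.
    sents = [s.strip() for s in doc.split(". ") if s.strip()]
    assert len(sents) >= 3, "need at least a prefix, target, and suffix"
    i = random.randrange(1, len(sents) - 1)
    if random.random() >= 0.5:
        # Autoregressive segment reasoning: predict the next sentence
        # from the preceding context alone.
        return {"task": "ASR", "context": ". ".join(sents[:i]),
                "target": sents[i]}
    # Middle segment reasoning: fill in a missing sentence using both
    # the preceding and the following text.
    return {"task": "MSR", "prefix": ". ".join(sents[:i]),
            "suffix": ". ".join(sents[i + 1:]), "target": sents[i]}

def reward(judge_llm, prediction: str, target: str) -> float:
    """Generative reward model: score 1.0 when the predicted segment
    conveys the same content as the ground-truth continuation, even if
    the wording differs; exact string match is deliberately not used."""
    verdict = judge_llm("Do these two texts convey the same content? "
                        "Answer YES or NO.\nA: " + prediction +
                        "\nB: " + target)
    return 1.0 if verdict.strip().upper().startswith("YES") else 0.0
</pre></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;">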
</p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> By predicting larger chunks of text, the model is encouraged to build more coherent and meaningful thought processes. The training uses two types of tasks: one where the model predicts the next sentence given only the prior context, and another where it fills in a missing middle segment using both preceding and following text. These are called <b>Autoregressive Segment Reasoning</b> and <b>Middle Segment Reasoning</b>, respectively. </p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/33600086-c32c-4fe8-bd90-2775d2f10006/image.png?t=1759255144" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> To guide learning, a generative reward model checks whether the predicted segment matches the meaning of the actual text that follows, even if the wording isn’t identical. This reward signal helps the model explore different reasoning paths while staying semantically accurate. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> By alternating between the two task types during training, RLPT balances the model’s ability to generate text step-by-step and to understand broader contextual relationships, which improves generalization. </p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/c148106f-7f48-420f-9174-521b93421623/image.png?t=1759255238" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr></table></td></tr><tr><td id="evaluation-and-benchmark-performanc" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h3 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:125.0%;">Evaluation and benchmark performance of RLPT</h3></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> RLPT was tested on both general knowledge benchmarks like MMLU and MMLU-Pro, and on mathematical reasoning tests such as AIME and MATH-500. Across different model sizes, including Qwen3-4B and Llama3.2-3B, RLPT produced consistent gains. For example, on Qwen3-4B, it improved MMLU accuracy by 3.0 points and MMLU-Pro by 5.1 points. In mathematical reasoning, Pass@1 scores on AIME24 and AIME25 rose by 6.6 and 5.3 points respectively. 
</p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/b1f23db9-edd4-428c-a101-1a8b9062779d/image.png?t=1759255170" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr><tr><td align="center" valign="top" class="t" style="width:626px; padding: 4px 0px 4px 0px;"><p>Performance on general-domain tasks across different models, with the best results highlighted.</p></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Additionally, when used as a foundation for reinforcement learning with verifiable rewards (RLVR), RLPT provided an extra boost, improving both exploitation and exploration in mathematical problem-solving. </p></td></tr><tr class="btn_row"><td valign="top" style="padding-bottom:14px;padding-left:28px;padding-right:28px;padding-top:14px;text-align:center;width:100%;word-break:break-word;" class="dd"><table width="100%" role="none" border="0" cellspacing="0" cellpadding="0" style="margin:14px auto 14px auto;"><tr><td align="center" valign="middle"><table role="none" border="0" cellspacing="0" cellpadding="0"><tr><td style="background-color:#2C81E5;border-radius:8px;mso-padding-alt:14px 20px;" class="btn"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.fUNb4GdFo9D3F8WuLArtoV5sElgytBlvJRzI9WtI92ZioieOK5pMIY2IH6tEmKXSbELyn4hqDlLvRmo5Z4Q0enYI5I4JeYxEuxH7OQIvACb4C4hAxUjCT2hfv_T1OwcbuupsGMp3RM_ejyHfavmaCbx40293TfBcv6x1a-tRdbTuyiMHhg2FGKeFdbVQ1MEHXq_1bWQAFODC65JpZ4WoKiThBK4DrWbxWMlYphaZPo3SEw7fw1LbDdUsZzpLyJlRDGDsbpZMXzpJrG9OfeMQQQ/4kc/R1NR_ruUQ6G9mf4EocZKug/h18/h001.OOg2aQGwt_kIN4_32NMYuHmM-g6crzU0ZNjba3sUjYM" target="_blank" rel="noopener noreferrer nofollow" style="background-color:#2C81E5;border-radius:8px;color:#FFFFFF;display:inline-block;font-family:'Open Sans','Segoe UI','Apple SD Gothic Neo','Lucida Grande','Lucida Sans Unicode',sans-serif;font-size:16px;font-weight:normal;line-height:18px;padding:14px 20px;text-decoration:none;"> Read Full Paper </a></td></tr></table></td></tr></table></td></tr><tr><td class="dd" align="center" valign="top" style="padding:20px;"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.DUiN96-Eq7pUHzwEhy5j25aF_udDsq8EAwNLhMGYMEBXZBfJWd9af3vqHMeCc7Xxjvl8B_2kcqo5rI2ku4o3P7W5hweBVcrLsGzwg35YUxRgMrraou8I8epn4aQtG25uz09FcPbJ4R-sVrBtNWOMA21wDtxPFFsrZaStVjbt3nkVwhVzHTHmO0ARZZzdTt7gaSXLEyL-JCfPBY2S_qUDkXrg0n2KN9htTeydXvtQi5KdAAqi29JW3zjMJTJ7sqLbBJv0o0yKS77U-6YXcHxK7wLH8I9m69U1kw0l8PdDURs/4kc/R1NR_ruUQ6G9mf4EocZKug/h19/h001.UbTJEvpMDZusYLrqnrYlzO8sw2DSCn_KHSCdXplWeWY" style="text-decoration:none;"><table align="center" width="100%" cellpadding="0" cellspacing="0" border="0" role="none" style="max-width:520px;margin:0 auto;"><tr><td class="p" width="100%" style="padding:2px;border:none;"><table width="100%" cellpadding="0" cellspacing="0" border="0" role="none"><tr><td align="center" valign="top" style="width:100%;"><div style="max-height:0;position:relative;opacity:0.999;width:100%;mso-hide:all;"><div style="display:inline-block;width:100%;padding-top:25%;"><img width="20%" height="auto" loading="lazy" alt="" style="border:0;" 
src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/static_assets/youtube_play_icon.png"/></div></div><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.DUiN96-Eq7pUHzwEhy5j25aF_udDsq8EAwNLhMGYMEBXZBfJWd9af3vqHMeCc7Xxjvl8B_2kcqo5rI2ku4o3P7W5hweBVcrLsGzwg35YUxRgMrraou8I8epn4aQtG25uz09FcPbJ4R-sVrBtNWOMA21wDtxPFFsrZaStVjbt3nkVwhVzHTHmO0ARZZzdTt7gaSXLEyL-JCfPBY2S_qUDkQwv881DMJhIPsYzf1z0oeF25iFvKLNWL8Vq9OO-nEwy5mi9D66eCPz0FOWzvZHHRLwtnTxVW9lBcY8rRLZpV6M/4kc/R1NR_ruUQ6G9mf4EocZKug/h20/h001.mOdF4bZopc483IojY5YNCBK2GrFUQzp11COhMyBpVik" style="text-decoration:none;"><img src="https://i.ytimg.com/vi/SvIJ-BIAPNI/maxresdefault.jpg" width="480" height="auto" loading="lazy" alt="YouTube video by bycloud" style="display:block;height:auto;border:0;outline:none;text-decoration:none;background-color:#000000;width:100%;"/></a></td></tr><tr><td><p style="font-size:12px;font-weight:500;font-style:italic;font-family:Helvetica, Calibri, sans-serif;color: #686a6d; padding-top:0 !important;padding-bottom:6px !important; padding-left:4px !important;"> The Final Boss of Making AI Wrapper: Context Engineering Explained </p></td></tr></table></td></tr></table></a></td></tr></table></td></tr></table></td></tr><tr><td align="center" valign="top"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td><tr><td class="b" align="center" valign="top" bgcolor="#2a2a2a" style="padding:0px 0px 0px 0px;border-style:solid;border-width: 0px 0px 0px 0px;border-color: #2a2a2a;border-bottom-left-radius:10px;border-bottom-right-radius:10px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td align="center" valign="top" bgcolor="#73ddff" style="padding:12px"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td><span style="padding-left:1px;"></span></td><td align="center" valign="middle" width="75" style="width:75px;"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.1muhFWIqieRYpaJ-FbWSCQqcWoV4NNHHr5SkP9THApWUO4S9eWSDBFDMKQ83N4CY1l4kXQTU9YnEEqXRrg_2uhS94rQOKDl60C6UO57Zu1mJCFi_zhfD-a_hnJHdTQ7EMuFzTXyHHoMrXkgQApIEja6W-P_UQ-lzCpTLoYr-m2jTQM1I55GeX0wbku5ikoOok3mxFulJp98_0Kw_2UQA6v7zrMINBypp-6OzNe20xZn2XlwHK8eL3qqw_2gmGyLhiZcqe8RWLUS0HxZLfM8ZhQ/4kc/R1NR_ruUQ6G9mf4EocZKug/h21/h001.rzRJiYqRp_FhBRnwmmQtD9G5OcepqGWKq0NOZ7nwMCU" style="text-decoration:none;"><img width="22" height="22" alt="tw" border="0" style="display:block;max-width:22px;color:Dark" src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/static_assets/x_dark.png"/></a></td><td align="center" valign="middle" width="75" style="width:75px;"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.amatuKKICSickUKplYJXmBoQnQ9VXnB2zTxBG4HeHBi5iti4l06m5fR1UTFq_vFgQaGMmutCjJbuBFU8WHbRj6heToGsiZHlry3dxu5DEimeQbpBAMyhKdSbaWrmIf3bDAVa9Uol06ZI3EWdGVjRg_nZcpSnraXxk9jUZsMFGQxMIGWVEczGFQPgwuYc6r9GsyPlzmg_XJKPXF4SwjDuEfp8aoIiR8DT0LXC07DRLfKwxuzi_3RUmJKXsWNUn90Fjc2Ad1Ss6A3Fw8xxUiruXA/4kc/R1NR_ruUQ6G9mf4EocZKug/h22/h001.fIrnw4lDYV2yn-eYrdJNsDT-B2D6KGdTpzGyHnjkqIA" style="text-decoration:none;"><img width="22" height="16" alt="yt" border="0" style="display:block;max-width:22px;color:Dark" src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/static_assets/youtube_dark.png"/></a></td><td><span style="padding-left:1px;"></span></td></tr></table></td></tr><tr><td height="10" style="line-height:1px;font-size:1px;height:10px;"> 
</td></tr><tr><td class="w" align="center" valign="top" style="padding:15px 15px 15px 15px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td align="center" valign="top"><p style="font-family:'Verdana',Geneva,sans-serif;color:#FFFFFF!important;"> Update your email preferences or unsubscribe <a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.c6q0w4g5sodbtO4I1B_pxWc4htTObwdorovK0nFHVH-4pUdVE0ELYH5DsNemk732SjNwhPNJ25r0O8B5vYifsBhEpz-DJgyVFmavJPa0OyKRRnvw4o7XGyvIv7PRofnmgHs3WEvG0EU6G-Riujw8KQ76zwAV-RNQuQs1p88OdRC8pqWaEPogXDA275kvxhzuej-7ebTGafgXk1Hurvc3MuiImsDAOBKF2Fk0cfogVBQ9X_zrBXyw-_tvLwk8iKVwD2VJSRo52ruC8cknOU36qlGV-IcproHcGgOIzzQjKo6yOWjvI6tgYU2BaPcY64RimKc0UEhOFqnVbsjOzDMAjiZtIkobCs1mTq4zxC_MY5-_xtbYm0F5sxQUBtTivzqTPGIrIJq5dAKc20dUGbl9h4_C9JXBg_znjKsG7p12EqULFdykCmTVLfz49ByFaRzWnEtjWGrYp4QIRohfPg7GYucFEYgOdN-kMnv6fQXU2XOy2bfyfE5DvupNDGSSECq77zt7CnnMsx77J2jvTJJyE_6DWofEFu5KxYB4L38Iwz-8SAxKmfpitazWD5wRyNl54MR21ApehuZn4R4M6xTo6y7eXob81Trw6in1npGXAL3ACLrXsk3dsHKm9dBOIshJvqzL9eBdhTq98Z9PsFo1m8NEGG4mgsDNs9oTwgtwDexbdCT_clf2lFsNKzvSfPo65KSvKVqTqcSSsk5c8wQFPth76pCT5ZYvob8lu8411_Dbv61KBhDDriadSz_srSmaNGtgCYSEsgfDB8WelOr_NuD6S9KtbFiL5K_ofZcdxZN2PV-VJro8HxCscbNiZy0JRCiVQaZz-8F3TSi0BKymMO18d8RRpoM8u18kd493urLFwCfeLdsjxm7FO_YDxTnsoVrJe6Fj_PUBtDEf9BmYGSvZMPZFgW1flKAvwIPKZR9yX2at4F1gKA0XuHnqWvDHhoLSskvp8RPygIPAblJ7DzAZVN5NmIHRqKl-evhp91I/4kc/R1NR_ruUQ6G9mf4EocZKug/h23/h001.CNXqfcMn5KOBiBcjkDbcVJHtQ-aujpKEj7POAH7mjZA" style="text-decoration:underline;text-decoration-color:#FFFFFF!important;color:#FFFFFF!important;"> here</a></p><p class="copyright" style="font-family:'Verdana',Geneva,sans-serif;color:#FFFFFF!important;"> © 2025 bycloudai </p><p style="font-family:'Verdana',Geneva,sans-serif;color:#FFFFFF!important;"> 228 Park Ave S, #29976, New York, New York 10003, United States </p></td></tr><tr style="display: table-row !important;"><td align="center" valign="top" style="padding-top:20px;" style="display:table-cell !important;"><table role="none" border="0" cellspacing="0" cellpadding="0" align="center" style="display:table !important;"><tr style="display:table-row !important;"><td class="u" align="center" valign="middle" height="32" style="height:32px;display:table-cell !important; max-height: 32px !important;margin:0px !important; background-color: #ffffff !important;"><a style="line-height:32px !important;text-decoration:none;display:block !important;" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.DUiN96-Eq7pUHzwEhy5j28olDWFpV5DDKfdk_OdOKOhTKE-fiA5R_YmxyaRVrCJR3dYrkw_b3ZyIhCpE0TBiDxn6ETsUDYucW9q5gAsgsCcL-2oqRoelReoIBCGPs0E6TzolxAQKEZkF-mjt83gdm65EPZt65XtyvzD56X1l-zJCr94ylokmzWOxoBuPLXNLmxlLls6nLCX2qi_NKGf-DH650Ha0J5x4BYZT19gZX2m1Gvdciu3F9QLejEzO-yrN/4kc/R1NR_ruUQ6G9mf4EocZKug/h24/h001.g-h2HjzIPs15pgldgHimAnB-HiadMgFVR_9alPgg5NI"><img src="https://media.beehiiv.com/output-onlinepngtools.png" width="16" alt="beehiiv logo" style="display:inline-block !important;max-width:16px !important; vertical-align:-3px !important;width: 16px !important;" border="0"/><span style="padding-left:11px !important;display: inline-block !important;">Powered by beehiiv</span></a></td></tr></table></td></tr><tr><td align="left" valign="top" height="2" style="height:2px;"><a href='https://elink4f7.mail.bycloud.ai/ss/c/u001.CxDkkVpJsBdVoe83c_tBWsHIaP4XNp0WgUYqLvHcKk_3uqk_KIkz4ddLinhFbud6JuxLFdSUhYnR7b1NSsmbtzXNGNblnEEMKUtkCAjkn8Y/4kc/R1NR_ruUQ6G9mf4EocZKug/h25/h001.UdbxMxKmak-aJQxuFdYXtY9KAkLAKGED-o-V4UYJki8' style="color: #2a2a2a !important; cursor: default; 
font-size: 1px; text-decoration: none;"> Terms of Service </a></td></tr></table></td></tr></table></td></tr></td></tr></table></td></tr></table></td></tr></table></td></tr></table></div></body></html>