<!DOCTYPE html><html lang="en" xmlns="http://www.w3.org/1999/xhtml" xmlns:v="urn:schemas-microsoft-com:vml" xmlns:o="urn:schemas-microsoft-com:office:office" style="font-size:16px;"><head></head><head><meta charset="utf-8"/><!--[if !mso]><!--><meta http-equiv="X-UA-Compatible" content="IE=edge"/><!--<![endif]--><meta name="viewport" content="width=device-width,initial-scale=1"/><meta name="x-apple-disable-message-reformatting"/><meta name="format-detection" content="telephone=no,address=no,email=no,date=no,url=no"/><meta name="color-scheme" content="light"/><meta name="supported-color-schemes" content="light"/><title>Language Models are Injective and Hence Invertible</title><!--[if mso]><xml><o:OfficeDocumentSettings><o:AllowPNG/><o:PixelsPerInch>96</o:PixelsPerInch></o:OfficeDocumentSettings></xml><![endif]--><style>
:root { color-scheme: light; supported-color-schemes: light; }
body { margin: 0; padding: 0; min-width: 100%!important; -ms-text-size-adjust: 100% !important; -webkit-transform: scale(1) !important; -webkit-text-size-adjust: 100% !important; -webkit-font-smoothing: antialiased !important; }
.body { word-wrap: normal; word-spacing:normal; }
table.mso { width: 100%; border-collapse: collapse; padding: 0; table-layout: fixed; }
img { border: 0; outline: none; }
table { mso-table-lspace: 0px; mso-table-rspace: 0px; }
td, a, span { mso-line-height-rule: exactly; }
#root [x-apple-data-detectors=true],
a[x-apple-data-detectors=true],
#MessageViewBody a { color: inherit !important; text-decoration: inherit !important; font-size: inherit !important; font-family: inherit !important; font-weight: inherit !important; line-height: inherit !important; }
span.MsoHyperlink { color: inherit !important; mso-style-priority: 99 !important; }
span.MsoHyperlinkFollowed { color: inherit !important; mso-style-priority: 99 !important; }
.a { background-color:#dedede; }
.b { background-color:#2a2a2a; }
.c { background-color:#ffffff; }
.d { background-color:#fff0c8; }
.d2 { background-color:#FFFFFF; }
.d3 { background-color:#FFFFFF; }
h1 a { text-decoration:none;color:#2C81E5;font-style:italic; }
h2 a { text-decoration:none;color:#2C81E5;font-style:italic; }
h3 a { text-decoration:none;color:#2C81E5;font-style:italic; }
h4 a { text-decoration:none;color:#2C81E5;font-style:italic; }
h5 a { text-decoration:none;color:#2C81E5;font-style:italic; }
h6 a { text-decoration:none;color:#2C81E5;font-style:italic; }
h1, h1 a, h2, h2 a, h3, h3 a, h4, h4 a, h5, h5 a, h6, h6 a, ul, li, ol, p, p a { margin: 0;padding: 0; }
h1 { font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif;font-weight:700;font-size:28px;color:#2A2A2A;line-height:42px;padding-bottom:4px;padding-top:16px;mso-margin-top-alt:16px;mso-margin-bottom-alt:4px }
h2 { font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif;font-weight:700;font-size:24px;color:#2A2A2A;line-height:36px;padding-bottom:4px;padding-top:16px;mso-margin-top-alt:16px;mso-margin-bottom-alt:4px }
h3 { font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif;font-weight:400;font-size:20px;color:#2A2A2A;line-height:30px;padding-bottom:4px;padding-top:16px;mso-margin-top-alt:16px;mso-margin-bottom-alt:4px }
h4 { font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif;font-weight:400;font-size:18px;color:#2A2A2A;line-height:27px;padding-bottom:4px;padding-top:16px;mso-margin-top-alt:16px;mso-margin-bottom-alt:4px }
h5 { font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif;font-weight:400;font-size:16px;color:#2A2A2A;line-height:24px;padding-bottom:4px;padding-top:16px;mso-margin-top-alt:16px;mso-margin-bottom-alt:4px }
h6 { font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif;font-weight:400;font-size:14px;color:#2A2A2A;line-height:21px;padding-bottom:4px;padding-top:16px;mso-margin-top-alt:16px;mso-margin-bottom-alt:4px }
p { font-family:'Georgia','Times New Roman',serif;font-weight:400;color:#2D2D2D;font-size:16px;line-height:24px;padding-bottom:8px;padding-top:8px;mso-margin-top-alt:8px;mso-margin-bottom-alt:8px; }
p a, .e a, ul a, li a, .h a, .h2 a, .h3 a { word-break:break-word;color:#2C81E5 !important;text-decoration:none;font-style:italic; }
p a span, .e a span, ul a span, li a span { color: inherit }
p .bold { font-weight:bold;color:#2D2D2D; }
p span[style*="font-size"] { line-height: 1.6; }
.f p { font-size:12px;line-height:15px;color:#2D2D2D;padding:0; }
.f p a { color:#2D2D2D !important; }
.g p { font-family:'Helvetica',Arial,sans-serif;font-size:14px;line-height:20px;font-weight:normal;margin:0; }
.g p a { text-decoration: underline; }
.i p { font-family:'Helvetica',Arial,sans-serif;line-height:23px;font-size:15px;color:#2D2D2D; }
.i p a { color:#2D2D2D !important; }
.i2 p { font-family:'Helvetica',Arial,sans-serif;line-height:23px;font-size:15px;color:#2D2D2D; }
.i2 p a { color:#2D2D2D !important; }
.i3 p { font-family:'Helvetica',Arial,sans-serif;line-height:43px;font-size:24px;color:#2D2D2D; }
.i3 p a { color:#2D2D2D !important; }
.h p a { color:#595959 !important; }
.h2 p a { color:#595959 !important; }
.h3 p a { color:#595959 !important; }
.f p a, .i p a, .i2 p a, .i3 p a, .h p a, .h2 p a, .h3 p a { text-decoration:underline; }
.j { border-top:3px solid #ffeb2d; }
.k p { padding-left:15px;padding-bottom:0px;padding-top:6px;mso-margin-top-alt:6px;mso-margin-bottom-alt:0px;mso-margin-left-alt:15px; }
.o { background-color:#FFFFFF;border:1px solid #F1F1F1;border-radius:5px; }
.o p { font-family:'Helvetica',Arial,sans-serif;padding:0px;margin:0px; }
.l p,
.l p a, .l a { font-size:14px;line-height:20px;font-weight: bold;color:#2D2D2D;padding-bottom:6px;mso-margin-bottom-alt:6px;text-decoration:none; }
.m p,
.m p a { font-size:13px;line-height:18px;font-weight:400;color:#2D2D2D;padding-bottom:6px;mso-margin-bottom-alt:6px;text-decoration:none; }
.n p,
.n p a { font-size:12px;line-height:17px;font-weight:400;color:#2D2D2D;padding-bottom:6px;mso-margin-bottom-alt:6px;text-decoration:none; }
.p { background-color:#FFFFFF;max-width:520px;border:1px solid #E1E8ED;border:1px solid rgba(80, 80, 80, 0.3);border-radius:5px; }
.q { font-size:16px;font-family:Helvetica,Roboto,Calibri,sans-serif !important;border:1px solid #e1e8ed;border:1px solid rgba(80, 80, 80, 0.3);border-radius:10px;background-color:#FFFFFF; }
.q p { font-size:16px;font-family:system-ui,Helvetica,Roboto,Calibri,sans-serif !important;color:#222222;padding:4px 0; }
.r { border:1px solid #E1E8ED !important;border-radius:5px; }
.s p { font-size: 14px; line-height: 17px; font-weight: 400; color: #697882; text-decoration: none; }
.t p { font-family:'Helvetica',Arial,sans-serif;font-size:12px;line-height:18px;font-weight:400;color:#000000;font-style:italic;padding:4px 0px 0px; }
.v { border-radius:10px;border:solid 0px #DFD150;background-color:#2C81E5;font-family:'Open Sans','Segoe UI','Apple SD Gothic Neo','Lucida Grande','Lucida Sans Unicode',sans-serif;color:#FFFFFF; }
.v a { text-decoration:none;display:block;color:#FFFFFF; }
.w p { font-size:12px;line-height:15px;font-weight:400;color:#FFFFFF; }
.w p a { text-decoration: underline !important;color:#FFFFFF !important; }
ul { font-family:'Helvetica',Arial,sans-serif;margin:0px 0px 0px 25px !important;padding:0px !important;color:#2D2D2D;line-height:24px;list-style:disc;font-size:16px; }
ul > li { font-family:'Helvetica',Arial,sans-serif;margin:10px 0px 0px 0px !important;padding: 0px 0px 0px 0px !important; color: #2D2D2D; list-style:disc; }
ol { font-family:'Helvetica',Arial,sans-serif;margin: 0px 0px 0px 25px !important;padding:0px !important;color:#2D2D2D;line-height:24px;list-style:decimal;font-size:16px; }
ol > li { font-family:'Helvetica',Arial,sans-serif;margin:10px 0px 0px 0px !important;padding: 0px 0px 0px 0px !important; color: #2D2D2D; }
.e h3,
.e p,
.e span { padding-bottom:0px;padding-top:0px;mso-margin-top-alt:0px;mso-margin-bottom-alt:0px; }
.e span,
.e li { font-family:'Helvetica',Arial,sans-serif;font-size:16px;color:#2D2D2D;line-height:24px; }
.rec { font-family: ui-sans-serif, system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, "Noto Sans", sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Noto Color Emoji" !important; }
.rec__button:hover { background-color: #f9fafb !important; }
.copyright a {color: inherit !important; text-decoration: none !important; font-size: inherit !important; font-family: inherit !important; font-weight: inherit !important; line-height: inherit !important;}
.txt_social p { padding: 0; word-break: break-all; }
.table, .table-c, .table-h { border: 1px solid #C0C0C0; }
.table-c { padding:5px; background-color:#FFFFFF; }
.table-c p { color: #2D2D2D; font-family:'Helvetica',Arial,sans-serif !important;overflow-wrap: break-word; }
.table-h { padding:5px; background-color:#F1F1F1; }
.table-h p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important;overflow-wrap: break-word; }
@media only screen and (max-width:667px) {
.aa, .w100pc { width: 100% !important; }
.bb img { width: 100% !important; height: auto !important; max-width: none !important; }
.cc { padding: 0px 8px !important; }
.ee { padding-top:10px !important;padding-bottom:10px !important; }
.ff ul, .ff ol { margin: 0px 0px 0px 10px !important;padding: 0px !important; }
.ff li { margin:10px 0px 0px 10px !important; }
.r {height:140px !important;}
.s p { font-size:13px !important;line-height:15px !important; }
.mob-hide {display:none !important;}
.mob-show {display: block !important; width: auto !important; overflow: visible !important; float: none !important; max-height: inherit !important; line-height: inherit !important;}
.mob-stack {width:100% !important;display:block !important;}
.mob-w-full {width:100% !important;}
.mob-block {display:block !important;}
.embed-img {padding:0px 0px 12px 0px !important;}
.socialShare {padding-top:15px !important;}
.rec { padding-left:15px!important;padding-right:15px!important; }
.bodyWrapper { padding:7px 4px 7px 4px !important; }
.social-mobile {float:left !important;margin-top:10px !important;}
}
@media screen and (max-width: 480px) {
u + .a .gg { width: 100% !important; width: 100vw !important; }
.tok-heart { padding-top:75% !important; }
.tok-play { padding-top: 250px !important; }
}
@media screen and (max-width: 320px) {
.tok-heart { padding-top:65% !important; }
}
.u { border: 1px solid #CACACA !important; border-radius: 2px !important; background-color: #ffffff !important; padding: 0px 13px 0px 13px !important; font-family:ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,"Noto Sans",sans-serif !important;font-size: 12px !important; color: #767676 !important; }
.u a { text-decoration: none; display: block !important; color: #767676 !important; margin: 0px !important; }
.u span, .u img { color: #767676 !important;margin:0px !important; max-height:32px !important;background-color:#ffffff !important; }
</style><!--[if mso]><style type="text/css">
h1, h2, h3, h4, h5, h6 {font-family: Arial, sans-serif !important;}
body, table, td, p, a, span {font-family: Arial, sans-serif !important;}
sup { font-size: 100% !important;vertical-align: .5em !important;mso-text-raise: -1.5% !important;line-height: 0 !important; }
ul { margin-left:0px !important; margin-right:10px !important; margin-top:20px !important; margin-bottom:20px !important; }
ul li { margin-left: 0px !important; mso-special-format: decimal; }
ol { margin-left:0px !important; margin-right:10px !important; margin-top:20px !important; margin-bottom:20px !important; }
ol li { margin-left: 0px !important; mso-special-format: decimal; }
li.listItem { margin-left:15px !important; margin-top:0px !important; }
.paddingDesktop { padding: 10px 0 !important; }
.edm_outlooklist { margin-left: -20px !important; }
.embedImage { display:none !important; }
</style><![endif]--><!-- __merge_tags_in_links__ --><style>
@font-face {
font-family: 'Open Sans';
font-style: normal;
font-weight: 700;
font-display: swap;
src: url('https://fonts.gstatic.com/s/opensans/v40/memSYaGs126MiZpBA-UvWbX2vVnXBbObj2OVZyOOSr4dVJWUgsg-1x4gaVIUwaEQbjA.woff2') format('woff2');
}
@font-face {
font-family: 'Open Sans';
font-style: italic;
font-weight: 700;
font-display: swap;
src: url('https://fonts.googleapis.com/css2?family=Open+Sans:ital,wght@1,700&display=swap') format('woff2');
}
</style></head><body class="a" style="margin:0px auto;padding:0px;word-wrap:normal;word-spacing:normal;background-color:#dedede;"><div role="article" aria-roledescription="email" aria-label="email_name" lang="en" style="font-size:1rem"><div style="display:none;max-height:0px;overflow:hidden;"> and more on Kimi Linear, Looped Transformer, How FP16 fixes RL...  ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ ‌ </div><table role="none" width="100%" border="0" cellspacing="0" align="center" cellpadding="0" class="gg"><tr><td align="center" valign="top"><table role="none" width="670" border="0" cellspacing="0" cellpadding="0" class="aa" style="width:670px;table-layout:fixed;"><tr><td class="bodyWrapper" align="center" valign="top" style="padding:7px 7px 7px 7px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td align="center" valign="top" style="border-width:0px 0px 0px 0px;border-style: solid; border-color: #2a2a2a;border-radius:10px 10px 0px 0px;background-color:#ffffff;" class="c"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr id="header"><td style="padding:15px 15px 0px 15px;"><div style="padding-top:0px;padding-right:0px;padding-bottom:20px;padding-left:0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td class="f" align="right" valign="top"><p> November 04, 2025 | <a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.c6q0w4g5sodbtO4I1B_pxSdB5RCIH6yy1Fm1CYma3Exnhym0HKspXjKKg-Oz9p4I0bH-ztso3z9gATk7Ps0MurHbZd9d98i_B96L1kVjKZnux8bGtpgMCQnFHHoxx7S_10DzbYRuK9MVt07UflT9x_MeQoQzN7eL02JFsuE0W1LGgHTau6ssfyVRjbAwXO0IcfwxeigcVonH2ciS5HVYrF7keF8efb47CYLIbWR9e6WafUdg74vfsfOS71uhY0vLNDLOhNkaVsLSvenmWZPFdxuG8VEyE-7xOp9vE0hnOjRQSZFM_r13rWkK-35vfZr49VAlZAz_L06tGqSrZq7AYp0lkdUNUKTQqeagBdvAYKqPiVCfzrypRMxNZq_rZn5j6f9Lnh8MZcpLOiUDjH3NwthUzmAe2Ey8ldEXAiiADjpu-srBe76UWHm7E8N7IPcVxnHn42i3vnBnQNiUACVXTLQtmwP8KkVTjJkzxHJS0XI9GTYJ2CgqUiGFE4aEpuxlOwHTv_PJUSMD7Uo7vy4_WUxl8D98WIUPEfu0GLVRDXiUmpd335EV0QR4U80N9yS_4U6LPbLxTrwukpcvQKCBQZ_HJi8bT0Qccgs4frNdkfQy1yXer6j6emtxFOq_pwmoK4ig3R6SfRG9LbxY8dhNr6N1D_lQWo_AU9EGCN_pP34BHmWclxLSGRnOgGWmRbPBktqAY3M_7bDNtZMMtlvtJh-XcqImBzJKqqcw3Yr8HYoU9sHo1jvcC30Nn7qJceOTRqILMRk_LAiH8PEcZHbjG0-W7yv9OgEEVwirds_6FhiHwgByYgL_3BvwJw03P02Caxs2vg23k17FvXHjEvkSgw/4lb/twXBdovCTqiYTsaJEd24KQ/h0/h001.I7Fwi8keXaitHn5SOKtlqOkHmgylNHCzqdaKy2I6Iho"><span class="translation_missing" title="translation missing: en.templates.posts.email.header.read_online">Read Online</span></a></p></td></tr><tr><td class="dd" align="center" valign="top" style="padding:15px 0;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td align="center" valign="top"><h1 style="text-align:left;font-family:'Open Sans','Segoe UI','Apple SD Gothic Neo','Lucida Grande','Lucida Sans Unicode',sans-serif;font-weight:Bold;font-size:32px;color:#2A2A2A;padding:2px 0;line-height:38px;"> Language Models are Injective and Hence Invertible </h1><p style="text-align:left;font-family:'Helvetica',Arial,sans-serif;font-weight:normal;font-size:20px;color:#3E3E3E;padding:5px 0;line-height:24px;"> and more on Kimi Linear, 
Looped Transformer, How FP16 fixes RL... </p></td></tr></table></td></tr><tr><td style="line-height:0;"><div data-open-tracking="true"> <img src="https://elink4f7.mail.bycloud.ai/ss/o/u001.3wmUuY8gEWd4_869a_eXcg/4lb/twXBdovCTqiYTsaJEd24KQ/ho.gif" alt="" width="1" height="1" border="0" style="height:1px !important;width:1px !important;border-width:0 !important;margin-top:0 !important;margin-bottom:0 !important;margin-right:0 !important;margin-left:0 !important;padding-top:0 !important;padding-bottom:0 !important;padding-right:0 !important;padding-left:0 !important;"/> </div></td></tr></table></div></td></tr><tr id="content-blocks"><td class="email-card-body" align="center" valign="top" style="padding-bottom:15px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td align="center" valign="top" style="padding: 20px 28px 20px;" class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin: 0 auto 0 auto"><tr><td align="center" valign="top" style="width:300px;"><p style="opacity: 0.8;"><b>In partnership with</b></p></td></tr><tr><td align="center" valign="top" style="width:300px;"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.fUNb4GdFo9D3F8WuLArtoQWM4WfOezTCPUxm2nqVUZlaiEDFf-W5CDjqrpRm4tTzPb7DXfnMcezjkfQEOx8bcmlXP_DmDI9tlkbDFpPlJjdC2Heb516Px4gHf_lz6z73Lah_A2008BqTUMYQ7KSSoMmQge5Fe7Q3TV1FHliMGX8KpY7O6ic8MtPEQP89215bEF3Udlm9aIl-ezMzkF4MBNhJhbmWEEjvwygde1iNkYLGsUI_z8vbY416VrV6gb54fZz9WPwrvSzLJqSe4Ltj_6mJRRooniPS_3HR1H3UOo-4h_CdZepg13DG8CLdhqaMJBogTo-zzJD6x0CbiXz6r23z0GyxxuLpABFUC0hhZHdHiDFpQajhtoDO0Dnv-DHLwOVxyjRb9oOhZ6KC3msgBRnoOZsRHJJwzlYpw-YyZbqq1Q7P5q8TYs-qp7kJYAEwdQQMvuGGmDZgaoTGGkhzr1rflvuUQIjsNRWQNktyvADdz5cpkT9PCE3APfYDA2d1-ZZk7hvIzeltVvVy_5EcmO3Ljbu1y691h75-CuYXLtj6PCOwA-7l8E2KQjfSvfTG/4lb/twXBdovCTqiYTsaJEd24KQ/h1/h001.leooEk4cVLejQx0iqvZ3ews7o9Ft96gYbx6rCyG-vvI" target="_blank" rel="noopener noreferrer nofollow" style="text-decoration:none;"><img src="https://beehiiv-images-production.s3.amazonaws.com/uploads/ad_network/advertiser/logo/4da7c419-66a7-4e83-a82b-2aef0cfc99ac/RokuAdsManagerlogo.png" height="auto" width="300" style="display:block;" lborder="0"/></a></td></tr></table></td></tr><tr><td id="nov-18-th-nov-24-th-33-latest-ai-re" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h6 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:87.5%;"><i>Oct 27th ~ Nov 3rd</i><br><i>#80 Latest AI Research Explained Simply</i></h6></td></tr><tr><td><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" style=""><tr><td bgcolor="#222222" style="background-color:#222222;padding:0.0px 0.0px 0.0px 0.0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0"><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"></p></td></tr></table></td></tr></table></td></tr><tr><td id="language-models-are-injective-and-h" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:Bold;padding:0px 28px;text-align:left;"><h2 style="color:#2A2A2A;font-weight:Bold;mso-line-height-alt:150.0%;">Language Models are Injective and Hence Invertible</h2></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><i>Nikolaou [Sapienza University of Rome, EPFL</i>, <i>University of Athens</i>, <i>Archimedes RC]</i></p></td></tr><tr><td class="dd" align="left" 
style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><span style="background-color:#e0e0e0;"><span style="color:rgb(255, 58, 58);font-size:0.6rem;"> ♥ 22k </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span><span style="background-color:#e0e0e0;"><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> LLMs </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span></p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> It's often assumed that transformers lose information as they process text, since their components (attention and normalization) map different inputs to the same output. But this research shows that's not the case. In fact, decoder-only transformers are inherently lossless, meaning every distinct input sequence produces a unique internal representation. </p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/43d20498-1007-462b-a644-29aba3fc65e6/image.png?t=1762246427" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Transformers are built from smooth, structured components, which mathematically ensures that different prompts almost never collide into the same hidden state. This property is maintained from initialization through training, providing the model reliably preserves input identity across its layers. Because of this, we can trace back from any hidden state to the exact input that created it. </p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/62b0364d-aff9-4e64-899b-0f5be93fc5b8/image.png?t=1762246390" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> In practice, the authors introduce <b>SIPIT</b>, an algorithm that <b>recovers the original prompt</b> from hidden activations by stepping through the vocabulary token by token. It uses the causal structure of transformers: at each position, only one vocabulary candidate will match the observed hidden state given the preceding context. Experiments across multiple models and billions of prompt pairs confirmed zero collisions, and SIPIT achieved <b>perfect reconstruction in linear time</b>, which offers a practical tool for model transparency and interpretability. 
</p></td></tr><tr class="btn_row"><td valign="top" style="padding-bottom:14px;padding-left:28px;padding-right:28px;padding-top:14px;text-align:center;width:100%;word-break:break-word;" class="dd"><table width="100%" role="none" border="0" cellspacing="0" cellpadding="0" style="margin:14px auto 14px auto;"><tr><td align="center" valign="middle"><table role="none" border="0" cellspacing="0" cellpadding="0"><tr><td style="background-color:#2C81E5;border-radius:8px;mso-padding-alt:14px 20px;" class="btn"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.DUiN96-Eq7pUHzwEhy5j28yjf9KIXZdsXoh1WlHvvKltQ5dneCya7wc1TN3SEVjbdSyJESb0g7HI64dYOB6_nGvbhZ_9aSisBuRlHF9nwrulsi_SIeLq6hHm028ysSKgq3PpWeTLuxzDJvgyG74fK3H3HFmm0rCMkWIiSn84n_HK6s7K7ui2zq_VLaFNwSixtye6UF1fw9LDsZih-Vp-jpcL-Be8X_TKf-qmhRUtTn14IVwT5h9FjRFiN---eNls3PlZpOVJhbKuVu_pCmUHMA/4lb/twXBdovCTqiYTsaJEd24KQ/h2/h001.stgTYnBtozh8D1hCAGoCVXde5Y9nEC3N9kXy0kCgVfA" target="_blank" rel="noopener noreferrer nofollow" style="background-color:#2C81E5;border-radius:8px;color:#FFFFFF;display:inline-block;font-family:'Open Sans','Segoe UI','Apple SD Gothic Neo','Lucida Grande','Lucida Sans Unicode',sans-serif;font-size:16px;font-weight:normal;line-height:18px;padding:14px 20px;text-decoration:none;"> Read Full Paper </a></td></tr></table></td></tr></table></td></tr><tr><td><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" style=""><tr><td bgcolor="#222222" style="background-color:#222222;padding:0.0px 0.0px 0.0px 0.0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0"><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"></p></td></tr></table></td></tr></table></td></tr><tr><td id="find-your-customers-on-roku-this-bl" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:normal;padding:0px 28px;text-align:left;"><h3 style="color:#2A2A2A;font-weight:normal;mso-line-height-alt:125.0%;">Find your customers on Roku this Black Friday</h3></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.fUNb4GdFo9D3F8WuLArtoQWM4WfOezTCPUxm2nqVUZlaiEDFf-W5CDjqrpRm4tTzPb7DXfnMcezjkfQEOx8bcmlXP_DmDI9tlkbDFpPlJjdC2Heb516Px4gHf_lz6z73Lah_A2008BqTUMYQ7KSSoMmQge5Fe7Q3TV1FHliMGX8KpY7O6ic8MtPEQP89215bEF3Udlm9aIl-ezMzkF4MBNhJhbmWEEjvwygde1iNkYLGsUI_z8vbY416VrV6gb54fZz9WPwrvSzLJqSe4Ltj_6mJRRooniPS_3HR1H3UOo-4h_CdZepg13DG8CLdhqaMJBogTo-zzJD6x0CbiXz6r23z0GyxxuLpABFUC0hhZHdHiDFpQajhtoDO0Dnv-DHLwOVxyjRb9oOhZ6KC3msgBRnoOZsRHJJwzlYpw-YyZbqq1Q7P5q8TYs-qp7kJYAEwdQQMvuGGmDZgaoTGGkhzr1rflvuUQIjsNRWQNktyvABmWw8YZLIQhoRMCMt3kESc8IxdpGA9fM-boWeV0tRfrQUhVRMUwypP6De_9e6CJJX3ERSjuGO1ljI79ezQZQ5c/4lb/twXBdovCTqiYTsaJEd24KQ/h3/h001.5dFoCaZ2HAg5TpxbFET7Xrl60fdw5UI6WH0EZ_A8Gn4" rel="noopener noreferrer nofollow" style="text-decoration:none;" target="_blank"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/344c4d7f-9567-48b9-a09a-8513e6e1dfaa/1200x600_V2.jpg?t=1761760126" alt="" height="auto" width="626" style="display:block;width:100%;border-radius:0px 0px 0px 0px;border-style:solid;border-width:0px 0px 0px 0px;box-sizing:border-box;border-color:#E5E7EB;" 
border="0"/></a></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> As with any digital ad campaign, the important thing is to reach streaming audiences who will convert. To that end, <a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.fUNb4GdFo9D3F8WuLArtoQWM4WfOezTCPUxm2nqVUZlaiEDFf-W5CDjqrpRm4tTzPb7DXfnMcezjkfQEOx8bcmlXP_DmDI9tlkbDFpPlJjdC2Heb516Px4gHf_lz6z73Lah_A2008BqTUMYQ7KSSoMmQge5Fe7Q3TV1FHliMGX8KpY7O6ic8MtPEQP89215bEF3Udlm9aIl-ezMzkF4MBNhJhbmWEEjvwygde1iNkYLGsUI_z8vbY416VrV6gb54fZz9WPwrvSzLJqSe4Ltj_6mJRRooniPS_3HR1H3UOo-4h_CdZepg13DG8CLdhqaMJBogTo-zzJD6x0CbiXz6r23z0GyxxuLpABFUC0hhZHdHiDFpQajhtoDO0Dnv-DHLwOVxyjRb9oOhZ6KC3msgBRnoOZsRHJJwzlYpw-YyZbqq1Q7P5q8TYs-qp7kJYAEwdQQMvuGGmDZgaoTGGkhzr1rflvuUQIjsNRWQNktyvADyn1JB_Q1gfvCmseFz2mPTmM_d9JH4XvCIs7798H0txJpsJ7p_3eSIhl1YkE0XI5h7qMoykzAAOi-X9XFmMbW9/4lb/twXBdovCTqiYTsaJEd24KQ/h4/h001.yrzOvUqXdbn7brhSFyzYsXGV0N67cVNH5tIiqnxdVvs" target="_blank" rel="noopener noreferrer nofollow"><span>Roku’s self-service Ads Manager</span></a> stands ready with powerful segmentation and targeting options. After all, you know your customers, and we know our streaming audience. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Worried it’s too late to spin up new Black Friday creative? With Roku Ads Manager, you can easily import and augment existing creative assets from your social channels. We also have AI-assisted upscaling, so every ad is primed for CTV. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Once you’ve done this, then you can easily set up A/B tests to flight different creative variants and Black Friday offers. If you’re a Shopify brand, you can even run shoppable ads directly on-screen so viewers can purchase with just a click of their Roku remote. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Bonus: we’re gifting you $5K in ad credits when you spend your first $5K on Roku Ads Manager. Just sign up and use code GET5K. Terms apply. 
</p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.fUNb4GdFo9D3F8WuLArtoQWM4WfOezTCPUxm2nqVUZlaiEDFf-W5CDjqrpRm4tTzPb7DXfnMcezjkfQEOx8bcmlXP_DmDI9tlkbDFpPlJjdC2Heb516Px4gHf_lz6z73Lah_A2008BqTUMYQ7KSSoMmQge5Fe7Q3TV1FHliMGX8KpY7O6ic8MtPEQP89215bEF3Udlm9aIl-ezMzkF4MBNhJhbmWEEjvwygde1iNkYLGsUI_z8vbY416VrV6gb54fZz9WPwrvSzLJqSe4Ltj_6mJRRooniPS_3HR1H3UOo-4h_CdZepg13DG8CLdhqaMJBogTo-zzJD6x0CbiXz6r23z0GyxxuLpABFUC0hhZHdHiDFpQajhtoDO0Dnv-DHLwOVxyjRb9oOhZ6KC3msgBRnoOZsRHJJwzlYpw-YyZbqq1Q7P5q8TYs-qp7kJYAEwdQQMvuGGmDZgaoTGGkhzr1rflvuUQIjsNRWQNktyvABvBg3JexHwFFysEBTV1KOsJn1FEK_G3K1TxWKszn7WPwOUdCROxpC2Ynb8qdZxmrCnhS_jzOMyRIh10GriHl1n/4lb/twXBdovCTqiYTsaJEd24KQ/h5/h001.FcuikBEAYYYeEEdn6XXoYNu8dGseFzZZdwwDr6NSYWU" target="_blank" rel="noopener noreferrer nofollow"><span>Use code GET5K now</span></a></p></td></tr><tr><td><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" style=""><tr><td bgcolor="#222222" style="background-color:#222222;padding:0.0px 0.0px 0.0px 0.0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0"><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"></p></td></tr></table></td></tr></table></td></tr><tr><td id="defeating-the-training-inference-mi" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:Bold;padding:0px 28px;text-align:left;"><h2 style="color:#2A2A2A;font-weight:Bold;mso-line-height-alt:150.0%;">Defeating the Training-Inference Mismatch via FP16</h2></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><i>Qi et al. [Sea AI Lab, National University of Singapore]</i></p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><span style="background-color:#e0e0e0;"><span style="color:rgb(255, 58, 58);font-size:0.6rem;"> ♥ 1.2k </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span><span style="background-color:#e0e0e0;"><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> LLM Training </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span><span style="background-color:#e0e0e0;"><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> bycloud’s pick </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span></p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> When fine-tuning large language models with reinforcement learning, even minor numerical inconsistencies can lead to significant training instability. Researchers have observed that the policies used during training and inference often don't align perfectly, resulting in models performing poorly or collapsing unexpectedly. This paper identifies a surprisingly straightforward fix: switching the floating-point precision from BF16 to FP16 eliminates this mismatch at its source, leading to more reliable and effective training. 
</p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/5fa63cda-0e57-4ff2-a4a0-123740ee1142/bf16_vs_fp16_training.png?t=1762246233" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr><tr><td align="center" valign="top" class="t" style="width:626px; padding: 4px 0px 4px 0px;"><p>Training reward comparison between BF16 and FP16.</p></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> There is a difference in how BF16 and FP16 handle precision. BF16 is designed with a wide dynamic range, which helps in pre-training, but it uses fewer bits for precision. This means that small rounding errors can accumulate during the auto-regressive generation of text, causing the training and inference policies to diverge over time. FP16, on the other hand, allocates more bits for precision, ensuring calculations remain consistent between the training and inference engines. This higher fidelity reduces the tiny errors that can disrupt the learning process, allowing the model to optimize more smoothly. </p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/21344c96-4f88-4a75-8542-800c42cfee63/fp16_comparison.png?t=1762246266" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr><tr><td align="center" valign="top" class="t" style="width:626px; padding: 4px 0px 4px 0px;"><p> Comparisons between various algorithms based on FP16.</p></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> The researchers tested it on a range of benchmarks, including different algorithms, model sizes, and specialized setups such as Mixture-of-Experts or LoRA-based training, and FP16 consistently delivered better results. It achieved higher rewards, faster convergence, and near-perfect accuracy on solvable tasks where BF16 often led to collapse. By addressing the root cause numerically, this approach avoids the need for complex algorithmic patches and could make RL fine-tuning more accessible and stable for future AI development. 
</p></td></tr><tr class="btn_row"><td valign="top" style="padding-bottom:14px;padding-left:28px;padding-right:28px;padding-top:14px;text-align:center;width:100%;word-break:break-word;" class="dd"><table width="100%" role="none" border="0" cellspacing="0" cellpadding="0" style="margin:14px auto 14px auto;"><tr><td align="center" valign="middle"><table role="none" border="0" cellspacing="0" cellpadding="0"><tr><td style="background-color:#2C81E5;border-radius:8px;mso-padding-alt:14px 20px;" class="btn"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.DUiN96-Eq7pUHzwEhy5j28yjf9KIXZdsXoh1WlHvvKlm1H8yWZc-vW33kYhh0G1ReaqJ9CWWc-jxHquHdXyv1_zEEVG4hV8N_shWPv7CuVkiSeTito7JYUrRf7eDYyOXxTqVXNVA3AG7e86maCLNySKbZnzb9kVWiNzsneX9h8EN9Giavn0d7ghGsKw6I9zSvnDgofUO6j98wVRT0sbSG8V415p3cx-KcczzwaLTQbJ--2MMnux5MiHLp_HsDX971v6fg7bQsJIhC0yND75fFw/4lb/twXBdovCTqiYTsaJEd24KQ/h6/h001.gvbRkueo1DT9Imk-1yUCvC96jual1Dud1KE3NGgl_PU" target="_blank" rel="noopener noreferrer nofollow" style="background-color:#2C81E5;border-radius:8px;color:#FFFFFF;display:inline-block;font-family:'Open Sans','Segoe UI','Apple SD Gothic Neo','Lucida Grande','Lucida Sans Unicode',sans-serif;font-size:16px;font-weight:normal;line-height:18px;padding:14px 20px;text-decoration:none;"> Read Full Paper </a></td></tr></table></td></tr></table></td></tr><tr><td><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" style=""><tr><td bgcolor="#222222" style="background-color:#222222;padding:0.0px 0.0px 0.0px 0.0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0"><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"></p></td></tr></table></td></tr></table></td></tr><tr><td id="not-all-bits-are-equal-scale-depend" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:Bold;padding:0px 28px;text-align:left;"><h2 style="color:#2A2A2A;font-weight:Bold;mso-line-height-alt:150.0%;">Scaling Latent Reasoning via Looped Language Models</h2></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><i>Zhu et al. [ByteDance Seed, UC Santa Cruz, Princeton University, Mila- Quebec AI Institute, University of Montreal, Peking</i> <i>University, Carnegie Mellon University, University of Pennsylvania, Conscium, University of Manchester, M-A-P]</i></p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><span style="background-color:#e0e0e0;"><span style="color:rgb(255, 58, 58);font-size:0.6rem;"> ♥ 577 </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span><span style="background-color:#e0e0e0;"><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> LLM Scaling </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span></p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> What if language models could learn to reason during pre-training, not just afterward? Current models rely heavily on chain-of-thought prompting, which delays reasoning to inference and doesn't fully use pre-training data. 
The Ouro research introduces a new architecture called Looped Language Models (LoopLM), which builds reasoning directly into pre-training using iterative computation in a latent space, a learned depth allocation system, and training on 7.7 trillion tokens. </p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/9a875ac9-4394-4be6-9a64-6f94f69018af/ouro_main.png?t=1762245679" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr><tr><td align="center" valign="top" class="t" style="width:626px; padding: 4px 0px 4px 0px;"><p>Ouro Looped Language Model Architecture and Performance</p></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> The model works by reusing the same set of layers multiple times in a loop, with each pass refining its internal understanding of the input. An entropy-regularized training objective encourages the model to explore different numbers of loops. At the same time, a learned gating mechanism enables it to decide when to stop processing based on the task's complexity. This means that simpler inputs can be handled quickly with fewer loops, while harder problems require more computational effort, all without increasing the model's parameter count. </p></td></tr><tr class="embed-gen-img-r"><td align="center" valign="top" style="padding:12px 27px 12px 27px;" class="dd"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td align="center" valign="top" class="o" style="padding:12px 12px 12px 12px;;background-color:#FFFFFF;border-color:#F1F1F1;border-radius:5px 5px 5px 5px;border-width:1px 1px 1px 1px;"><!--[if !mso]><!--><div style="display:none; float:left; overflow:hidden; width:0; max-height:0; line-height:0;" class="mob-show"><table role="none" border="0" cellspacing="0" cellpadding="0" align="right" width="100%"><tr><td align="center" valign="top"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.CxDkkVpJsBdVoe83c_tBWgJ2TYlk6-rvnh5aA3abf3DJH3Qgi9P2y5cws7bnsM0FhdN8OxWasjfytIlzb4BDixJfT6D9umSlZhHFfa5U8h5pqNcwkM1iFh5OXQhAdcNGTusoEQQJNQhi9vUj9b4tIziI3BI7wIkudwOuHtty7UFgmVNYfcbbZP6COi2fwLzGb7VKpkxmDFmsY7_efi2f-kAmqgoilwcpxEmJVGN0DDUyg-OacxiAZLxnsUT0FDsB1J0zfwXW1fwdxFPtQ4Ymsx5eVCfvm4aMMUZ4fGlw9x8/4lb/twXBdovCTqiYTsaJEd24KQ/h7/h001.l5DTxMudM0ke9aeva358loxsLhlFN5twm5t4KSSTCJk" target="_blank"><img src="https://cdn-thumbnails.huggingface.co/social-thumbnails/models/ByteDance/Ouro-1.4B-Thinking.png" width="100%" style="height:auto;display:block;"/></a></td></tr><tr><td height="16" style="font-size:16px;line-height:16px;"> </td></tr></table></div><!--<![endif]--><table role="none" border="0" cellspacing="0" cellpadding="0" align="right" width="100%"><tr><td width="57%" align="center" valign="middle" class="mob-stack"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" align="center"><tr><td align="left" valign="middle" class="l"><p><a 
href="https://elink4f7.mail.bycloud.ai/ss/c/u001.CxDkkVpJsBdVoe83c_tBWgJ2TYlk6-rvnh5aA3abf3DJH3Qgi9P2y5cws7bnsM0FhdN8OxWasjfytIlzb4BDixJfT6D9umSlZhHFfa5U8h5pqNcwkM1iFh5OXQhAdcNGTusoEQQJNQhi9vUj9b4tIziI3BI7wIkudwOuHtty7UFgmVNYfcbbZP6COi2fwLzGb7VKpkxmDFmsY7_efi2f-kAmqgoilwcpxEmJVGN0DDXgWgVFn8d2cxlYek-oH5RRX_Erk3MWWsDvl67f7eEzyN0gbuRVmM4not5rs0c5LJ0/4lb/twXBdovCTqiYTsaJEd24KQ/h8/h001.nfJ5MtTW2RneNgDSN9wCAqL98VknXrvzJ-ptGU9FZ04" style="text-decoration:none;font-style:normal;color:#2D2D2D !important;font-size:14px;line-height:20px;" target="_blank"> ByteDance/Ouro-1.4B-Thinking · Hugging Face <tr><td align="left" valign="top" class="m"><p style="font-size:13px;line-height:19px;color:#2D2D2D;"> We’re on a journey to advance and democratize artificial intelligence through open-source and open-science approaches. </p></td></tr><tr><td align="left" valign="bottom" class="n" style="vertical-align:bottom;padding-top:12px;"><p style="word-break:break-word;">huggingface.co/ByteDance/Ouro-1.4B-Thinking</p></td></tr></a></p></td></tr></table></td><td width="3%" style="font-size:16px;line-height:16px;" class="mob-hide"> </td><td width="40%" align="left" valign="top" class="mob-hide"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.CxDkkVpJsBdVoe83c_tBWgJ2TYlk6-rvnh5aA3abf3DJH3Qgi9P2y5cws7bnsM0FhdN8OxWasjfytIlzb4BDixJfT6D9umSlZhHFfa5U8h5pqNcwkM1iFh5OXQhAdcNGTusoEQQJNQhi9vUj9b4tIziI3BI7wIkudwOuHtty7UFgmVNYfcbbZP6COi2fwLzGb7VKpkxmDFmsY7_efi2f-kAmqgoilwcpxEmJVGN0DDVQWE7QekA19Th_5NYgwhYgFVBV2AhTi6vb4M0nsm9xwbaZ9Rmsqb9U0wGSWvBpGvg/4lb/twXBdovCTqiYTsaJEd24KQ/h9/h001.LPhFfc655FRMiR0kSDfrhYkqAo9GSrKUtzt-cWDfkNk" target="_blank"><img src="https://cdn-thumbnails.huggingface.co/social-thumbnails/models/ByteDance/Ouro-1.4B-Thinking.png" width="230" style="height:auto;display:block;"/></a></td></tr></table></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> In tests, the 1.4B and 2.6B parameter Ouro models performed as well as standard models up to 12B parameters across a range of reasoning, math, and coding benchmarks. This research suggests that looped architectures offer a promising new direction for scaling AI, which can improve both performance and safety as the number of computational steps increases. 
</p></td></tr><tr class="btn_row"><td valign="top" style="padding-bottom:14px;padding-left:28px;padding-right:28px;padding-top:14px;text-align:center;width:100%;word-break:break-word;" class="dd"><table width="100%" role="none" border="0" cellspacing="0" cellpadding="0" style="margin:14px auto 14px auto;"><tr><td align="center" valign="middle"><table role="none" border="0" cellspacing="0" cellpadding="0"><tr><td style="background-color:#2C81E5;border-radius:8px;mso-padding-alt:14px 20px;" class="btn"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.DUiN96-Eq7pUHzwEhy5j28yjf9KIXZdsXoh1WlHvvKmMli9zcGDZ11TufmOv-zdkP2ZGwonKy6i2RF35yDbYbnF56IpZY_ry0JDfLlbu2Dbtjn5oWA5HufllHfkY6c6djgJlqd_sPErrKgZLuE7pzzoZ9hf9hOc-U0DIkSTpNkPxVP3ANOLowNrJHIW74vxGZBkv_U-PCyV_qq4S84o-resQKscJ482d_M2BGa93A9qeJnyoqT-bfydP-LDAjcvBgFMooc1nDhhgI4FenRTG_g/4lb/twXBdovCTqiYTsaJEd24KQ/h10/h001.1A13BlFlrK5sj8ezgCjy84X-RI5sQTdUrBa2Qhj1Koc" target="_blank" rel="noopener noreferrer nofollow" style="background-color:#2C81E5;border-radius:8px;color:#FFFFFF;display:inline-block;font-family:'Open Sans','Segoe UI','Apple SD Gothic Neo','Lucida Grande','Lucida Sans Unicode',sans-serif;font-size:16px;font-weight:normal;line-height:18px;padding:14px 20px;text-decoration:none;"> Read Full Paper </a></td></tr></table></td></tr></table></td></tr><tr><td><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" style=""><tr><td bgcolor="#222222" style="background-color:#222222;padding:0.0px 0.0px 0.0px 0.0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0"><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"></p></td></tr></table></td></tr></table></td></tr><tr><td id="kimi-linear-an-expressive-efficient" class="dd" align="left" valign="top" style="color:#2A2A2A;font-weight:Bold;padding:0px 28px;text-align:left;"><h2 style="color:#2A2A2A;font-weight:Bold;mso-line-height-alt:150.0%;">Kimi Linear: An Expressive, Efficient Attention Architecture</h2></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><i>Developed by the Kimi Team at Moonshot AI</i></p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"><span style="background-color:#e0e0e0;"><span style="color:rgb(255, 58, 58);font-size:0.6rem;"> ♥ 1.2k </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span><span style="background-color:#e0e0e0;"><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> LLM Attention </span></span><span style="color:rgb(44, 129, 229);font-size:0.6rem;"> </span></p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Running large language models for complex tasks, such as reinforcement learning and long conversations, can slow down inference due to the growing memory demands of standard attention mechanisms. Kimi Linear addresses this by introducing a hybrid architecture that combines a new linear attention module with full attention layers. It can exceed the performance of full attention models while significantly reducing memory use and enhancing speed. 
</p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/6ed840f0-e431-4330-8154-408cdae1add0/arch.png?t=1762245285" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr><tr><td align="center" valign="top" class="t" style="width:626px; padding: 4px 0px 4px 0px;"><p>Kimi Linear Attention Architecture</p></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> Kimi Linear uses the Kimi Delta Attention (KDA), which improves on earlier linear attention methods by using a fine-grained, channel-wise gating mechanism. This allows the model to more precisely manage its finite memory state more precisely, selectively retaining or forgetting information across different feature dimensions. KDA relies on a specialized form of diagonal-plus-low-rank transition matrices, enabling a custom chunk-wise computation process that reduces computational load compared to general approaches while maintaining alignment with the established delta rule for stable learning. </p></td></tr><tr><td align="center" valign="top" style="padding-bottom:20px;padding-left:15px;padding-right:15px;padding-top:20px; " class="dd"><table role="none" border="0" cellspacing="0" cellpadding="0" style="margin:0 auto 0 auto;"><tr><td align="center" valign="top" style="width:626px;"><img src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/21d597c2-e68a-4508-9a74-828aa4713ccd/perf_speed.png?t=1762245269" alt="" height="auto" width="626" style="display:block;width:100%;" border="0"/></td></tr><tr><td align="center" valign="top" class="t" style="width:626px; padding: 4px 0px 4px 0px;"><p>(a) Performance vs. acceleration. (b) Time per output token (TPOT) vs. decoding length.</p></td></tr></table></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> The hybrid design alternates three KDA layers with one full attention layer, balancing local processing with global information flow. This structure reduces the key-value cache memory footprint by up to 75% during long-sequence generation. In tests, a 3-billion-parameter Kimi Linear model trained on 1.4 trillion tokens outperformed a comparable full-attention model across short-context, long-context, and reinforcement learning tasks, while achieving up to six times higher decoding throughput for a one-million-token context. </p></td></tr><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"> These results show us that Kimi Linear can serve as a drop-in replacement for full attention and provide better performance and efficiency, particularly in settings with lengthy inputs and outputs. However, it's worth noting that another hybrid baseline, Gated DeltaNet-Hybrid, did experience a performance drop in long-context evaluations. 
</p></td></tr><tr><td style="padding-bottom:14px;padding-left:15px;padding-right:15px;padding-top:14px;"><table class="table" width="100%" style="border-bottom-width:1px;border-collapse:collapse;border-left-width:1px;border-right-width:1px;border-top-width:1px;table-layout:fixed;"><tr><th class="table-h" width="20%" valign="middle" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> Model </p></th><th class="table-h" width="20%" valign="middle" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> #Total Params </p></th><th class="table-h" width="20%" valign="middle" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> #Activated Params </p></th><th class="table-h" width="20%" valign="middle" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> Context Length </p></th><th class="table-h" width="20%" valign="middle" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> Download Link </p></th></tr><tr><td class="table-c" width="20%" valign="top" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> Kimi-Linear-Base </p></td><td class="table-c" width="20%" valign="top" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> 48B </p></td><td class="table-c" width="20%" valign="top" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> 3B </p></td><td class="table-c" width="20%" valign="top" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> 1M </p></td><td class="table-c" width="20%" valign="top" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> 🤗<a class="link" 
href="https://elink4f7.mail.bycloud.ai/ss/c/u001.CxDkkVpJsBdVoe83c_tBWpwuXiC8bRxIIctJGlFDhSeR99RsNeeom1MTiDVme1ASTAw36Ea-wwoqfUpFh4NdmrVjvMVdUAMZKwz5ehrxSTRl-UzXKEGwBSGbfbFYFS5m65vKLccO1zul-Y9z7rvXck42HKoaJLxn2-WVXfuUeS_Ib_R6PFv0fe2FFNVLDOQsToTppb94z8iEsRt99nk2Zti2lJzwr8Xy8RxpOJb_9enocJjQF2GvhU6JsTX1ZRW5Ohm2I4Lq-1_UHOT8EGwrV-di6NhDcHM9iF71TW0Zea1n7mei-611uZHhJ0krt64D/4lb/twXBdovCTqiYTsaJEd24KQ/h11/h001.Kl2YuQzoxwrlMHAQYOWPTd3Xt_kNSEzvx6qVZTb81Qk" target="_blank" rel="noopener noreferrer nofollow"><span> Hugging Face</span></a></p></td></tr><tr><td class="table-c" width="20%" valign="top" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> Kimi-Linear-Instruct </p></td><td class="table-c" width="20%" valign="top" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> 48B </p></td><td class="table-c" width="20%" valign="top" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> 3B </p></td><td class="table-c" width="20%" valign="top" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> 1M </p></td><td class="table-c" width="20%" valign="top" style="border-bottom-width:1px;border-left-width:1px;border-right-width:1px;border-top-width:1px;padding-bottom:5px;padding-top:5px;"><p style="mso-line-height-alt:150.0%;padding:0px;text-align:left;word-break:break-word;"> 🤗<a class="link" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.CxDkkVpJsBdVoe83c_tBWpwuXiC8bRxIIctJGlFDhSeR99RsNeeom1MTiDVme1ASWHCqQSfoXBtd_LCPDrJbX4OtdY4W-CZlQQ2qejxYEIouum7fw9N8medeT0LVHVxzlrlI41-eHTPltFns2s5FoJ4hcjeq9n0NPNIOmgqopqmciZNPA0V-geO4exUC6PoEK0f2HdPrSZzmLSI4Y86IErOfGLi8JAYHfgYP0ya76ecOLxJHounpqw9ckDIyPn0xQlhZnvyciX9cUypVCCceGW2EUKO4j84Pyz1gCeb1rfJNrAhIO9iSe9dpu7QD5aCU/4lb/twXBdovCTqiYTsaJEd24KQ/h12/h001.7Hlf0PRn41PInPO1xdJNeJa69PKMQVE5bPnGS5XwnZA" target="_blank" rel="noopener noreferrer nofollow"><span> Hugging Face</span></a></p></td></tr></table></td></tr><tr class="btn_row"><td valign="top" style="padding-bottom:14px;padding-left:28px;padding-right:28px;padding-top:14px;text-align:center;width:100%;word-break:break-word;" class="dd"><table width="100%" role="none" border="0" cellspacing="0" cellpadding="0" style="margin:14px auto 14px auto;"><tr><td align="center" valign="middle"><table role="none" border="0" cellspacing="0" cellpadding="0"><tr><td style="background-color:#2C81E5;border-radius:8px;mso-padding-alt:14px 20px;" class="btn"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.DUiN96-Eq7pUHzwEhy5j28yjf9KIXZdsXoh1WlHvvKlEKDz_3ibd5aaBZ5ISSTUyE258AW_ICzHZ78MYEvmA2YcD7zRPXmh6OJQhod7HDlP-pkmYib9huuF6bNU4gIoG_ewzfooPewYx9ehuCjycsjCR387_ykNXa9ObiJvC-WZ8Em0m64S9WH6ndPXX0tbOXmkFECg0WS4tfN4VcL48D_TbjSbMzyKIzsoPwZo3uHfdmXFItYqu9HgpG52S821Qqp0unLbkEkNevzRErKUb2Q/4lb/twXBdovCTqiYTsaJEd24KQ/h13/h001.kaB-UsH5owH7F9OlQXq0RLCZTnS2tcHEFgk_84IH8bA" target="_blank" rel="noopener noreferrer nofollow" 
style="background-color:#2C81E5;border-radius:8px;color:#FFFFFF;display:inline-block;font-family:'Open Sans','Segoe UI','Apple SD Gothic Neo','Lucida Grande','Lucida Sans Unicode',sans-serif;font-size:16px;font-weight:normal;line-height:18px;padding:14px 20px;text-decoration:none;"> Read Full Paper </a></td></tr></table></td></tr></table></td></tr><tr><td><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0" style=""><tr><td bgcolor="#222222" style="background-color:#222222;padding:0.0px 0.0px 0.0px 0.0px;"><table role="none" width="100%" border="0" cellspacing="0" cellpadding="0"><tr><td class="dd" align="left" style="padding:0px 28px;text-align:left;word-break:break-word;"><p style="mso-line-height-alt:150.0%;"></p></td></tr></table></td></tr></table></td></tr><tr><td class="e" align="left" valign="top" style="padding:28px 28px 8px;"><h3 style="">Do you like the "AI Industry News" section in this newsletter?</h3></td></tr><tr><td class="ee e " style="padding:0px 28px 28px;"><div style="margin-left:0px;" class="edm_outlooklist"><table role="none" border="0" cellspacing="4" cellpadding="0" align="left" style="min-width:300px;"><tr><td style="width:100%;border:1px solid rgba(50, 50, 50, 0.17);padding:6px 12px; border-radius:4px;"><a style="font-style:normal;text-decoration:none;" href="https://elink4f7.mail.bycloud.ai/ss/c/u001.c6q0w4g5sodbtO4I1B_pxSdB5RCIH6yy1Fm1CYma3ExfM6YV1rmw92B0KWhoZ--Xyow7-XcGNYb7lm5lGs_dfTvnDgoGNrq0SMJRUPeaLdvyojCq7eBfEpUh_elLubQZff68dmFXQUYW0VCJX9oJbQAObl5tdOE_qzkIyuhPLkUJ0fOmUliJCu5_IcAvY6hicce65WFQMl0tz1k1jjsDZP93QOGKjS5o45RhosmQuHsgS7S1RBve8FKnHB9phWTL3FHws1fewO4GbfqiWaNBjnWQxodsRf4z-pjbHufQWROKgqsS6aDtSiSADN_Fa16CtE39CncOSf0e9um_ajXNN8t_T-cOwWVYkPOwVAUZrEkLKvkpIpbZ6et6ZAC2uI1444OOCW80-s6t6tquPwkmndz92A8aDMylQh1XoJYCIKOcbjZz8iN5I8jTpfRDKCUtD-Wyai7csu_af7uKbwvM1MkkGZUU6RndThRkjW7_Fl8o659fN9x8lDmNnTT1NbFE70CjjxtxFFYgfN4l0rXsit-Gx7rFRKhIjKsU1LYIKD9GsSBWhk4XgreTMln3_RsnX8bCRCCoISRKKUDeUul20czaBJm2ICxmNDYLuJLkgRFSCtYXdjOrbzMROz9gHpjxRdWnFkp18CsWxepuB4SYj73UG1LDKRFQisf8qdO56Yux7pUQCC_MRnOZe_Q6cqFiypXmHP_bK7VTNZw8wrLjwzgbO788_JX9VQA9fdh0I-T88aXs74Aqni9t9cNhWTQSPSus6-21Qne10MghzKJQ1MpynGDU9iXyRAAiCiEF-0wGAPRHjImnatlSM358m_OCWHAMK17CTWiAm3ZjiIp9wHawXWpNnVWXz3eX9_tjbXj9NDDM6iEUTo9tj1aj3_QQAgPphZCOg2yN3MCZfUqgwoYYNA0eMWuSbPr5NiNKkC-s1cVtcMmOX7IC2PVsUy9AvPFQH8qakog0XieTgZGEPcd5rReyPOjlbphxyMsy5ycga1i_POKfbf4e7C2P3qwETqxsEcNdvsDWeMHnExEaNpyc4S9QrV8ql1o2lTDfikc/4lb/twXBdovCTqiYTsaJEd24KQ/h14/h001.xjV9H4Cvmjduk4csPgAcE7B-YaPmnPNRhP3G_lxbzpU"><p style="font-size:14px;color:inherit;"> No, I don't read AI news </p></a></td></tr><tr><td style="width:100%;border:1px solid rgba(50, 50, 50, 0.17);padding:6px 12px; border-radius:4px;"><a style="font-style:normal;text-decoration:none;" 
href="https://elink4f7.mail.bycloud.ai/ss/c/u001.c6q0w4g5sodbtO4I1B_pxSdB5RCIH6yy1Fm1CYma3ExfM6YV1rmw92B0KWhoZ--Xyow7-XcGNYb7lm5lGs_dfTvnDgoGNrq0SMJRUPeaLdtCw1F0PgGsW1FmsZ-20Ah9o-SroEl8CROTsL63DkqteaSBXpV-qrcy57XaA-sXnD-Q3yF0dXuHQkZWFdUZUm25oO4fD6L9ow-7hioTNAANcy53bGCEDudCgD3MzNyEvc2vx_j_Owkbia_pvtcbyV16_DbccAfg6dr6SeyXT4kOw_eibdsPXCe-ywpTlARvLgunvHonEUu6ZNdYH7tXsk7T9gDU9jdOtFv2Au2jSLouib1E4PGacri1I6m5x0H95uIC0u7Y1_ZGDRLrw7JWjat9ZpAVSL0LfLEx9HLPs8xQMnGztkjvacECvIwR6KzWJ5ct-rsmB0xhryT9Q8w9Irx0bDWJYbBf9LymOgyQwd-A3OW0SsAn_QzESL5H07HW2IES9wvpipcpEDdmn_Bzrtku7E9LaTcod_8CZ7i0fyszTRlRiFAcxSlEgKbPFcEHeTIdSVSqdQVg9wEP3sPlFTzaaYIXrT1esenu6mAsLYoweAP8gjIkYgJ0kzrtM1TvqvCEfubQTAhOOjdNNtB3W9syDzWmknv6uhgT-3WGxx6EH6Lvgm13zDTTowj_y-VclWpSnQJl1mPlx8OHGXVns17HYBHI_YE6l08_BbQAouVXVGapfOULJFtSaudJcbB8-ueg6UolRYLJC_T0TDNIm4CTCzYEan08U1W87O9NETUqgkJn9xVisQt-5lrMimh7LZZLw1m1U8CQ1A0yw5kM5uM8ZOPeH_Ml_1D8VcfLpp2MtFfhxXAe3iLNgP6xydrirMBNwoLO4M5Z_wudYw3wxn070mqCCWxkSbFMJZ1Ll5kIpGQWOzgMPm5jYxYNObxIc5HGvDvB8-yK9X2lGIS4rJMuRCRaHQMGh0J-vCDdfnrrIiLiRhlsXp_JHOdzYyd339QUkm85PgVxZEtP_u4pFDJJFdAssQR-5jPYQUfVVqTswLaq1lUz13Q__BGhS_234u0/4lb/twXBdovCTqiYTsaJEd24KQ/h15/h001.mvWHu_HmvHtlVyr_GW-GdAQxRVi0ff5cTVr_cGdUBj0"><p style="font-size:14px;color:inherit;"> Yes, Bring back "AI Industry News" </p></a></td></tr></table></div></td></tr><tr><td class="dd" align="center" valign="top" style="padding:20px;"><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.amatuKKICSickUKplYJXmF6U5Gfz7Ypkzogc36GdU_0eT53nfmgdnkpCjc9k3ouSVOz5lDbjOlb6qmLx4K0sc2hKSO-4Aw4XxVST4fwRQwCoWxNayw0DpPMEQh63lNCXyKI8iSiHmcLP42UseOdLB5PY9Pu7ZZZT0t6dAfTsnCHBwNQZC26TvwWhfC3u5QBYB_sJHnMj5qYwNdi0fB6Xh1Q60Ri9otjFiI9DAGRBumIUPcm_I6kLO2_9qmiUbd5hDrNb3wEg_e5J0yfG4mCvrUJK7jynWjwne2vlMNxqYUg/4lb/twXBdovCTqiYTsaJEd24KQ/h16/h001.1iYIMYUDouMTWUnMMlRYOGd8nbB9bNX8OLTrwyZ1i7w" style="text-decoration:none;"><table align="center" width="100%" cellpadding="0" cellspacing="0" border="0" role="none" style="max-width:520px;margin:0 auto;"><tr><td class="p" width="100%" style="padding:2px;border:none;"><table width="100%" cellpadding="0" cellspacing="0" border="0" role="none"><tr><td align="center" valign="top" style="width:100%;"><div style="max-height:0;position:relative;opacity:0.999;width:100%;mso-hide:all;"><div style="display:inline-block;width:100%;padding-top:25%;"><img width="20%" height="auto" loading="lazy" alt="" style="border:0;" src="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/static_assets/youtube_play_icon.png"/></div></div><a href="https://elink4f7.mail.bycloud.ai/ss/c/u001.amatuKKICSickUKplYJXmF6U5Gfz7Ypkzogc36GdU_0eT53nfmgdnkpCjc9k3ouSVOz5lDbjOlb6qmLx4K0sc2hKSO-4Aw4XxVST4fwRQwCoWxNayw0DpPMEQh63lNCXyKI8iSiHmcLP42UseOdLB5PY9Pu7ZZZT0t6dAfTsnCHBwNQZC26TvwWhfC3u5QBYB_sJHnMj5qYwNdi0fB6Xh1Q60Ri9otjFiI9DAGRBumLujCwQ_9DeXPkNjO9Td7-gQP43QNwn0Os_LhCYjcCVB5xqWvYvM0ZyWa0JhTE-zgo/4lb/twXBdovCTqiYTsaJEd24KQ/h17/h001.ZGna7_uTiWGiwCMwqBOL2CR49X1cqtAnxQaDU_5-b9A" style="text-decoration:none;"><img src="https://i.ytimg.com/vi/XFhUI1fphKU/maxresdefault.jpg" width="480" height="auto" loading="lazy" alt="YouTube video by bycloud" style="display:block;height:auto;border:0;outline:none;text-decoration:none;background-color:#000000;width:100%;"/></a></td></tr><tr><td><p style="font-size:12px;font-weight:500;font-style:italic;font-family:Helvetica, Calibri, sans-serif;color: #686a6d; padding-top:0 !important;padding-bottom:6px !important; padding-left:4px !important;"> The Chinese AI Iceberg 
</p></td></tr></table></td></tr></table></a></td></tr></table></td></tr></table></td></tr></table></td></tr></table></td></tr></table></td></tr></table></div></body></html>