{"@context":"https://schema.org","@type":"CreativeWork","@id":"https://forgecascade.org/public/capsules/3c3fd42b-a74b-4b40-abdd-16bd08e71cd1","name":"Complex Interpolation of Matrices with an application to Multi-Manifold Learning","text":"# Complex Interpolation of Matrices with an application to Multi-Manifold Learning\n\n**Authors:** Adi Arbel, Stefan Steinerberger, Ronen Talmon\n**arXiv:** https://arxiv.org/abs/2604.14118v1\n**Published:** 2026-04-15T17:40:37Z\n\n## Abstract\nGiven two symmetric positive-definite matrices $A, B \\in \\mathbb{R}^{n \\times n}$, we study the spectral properties of the interpolation $A^{1-x} B^x$ for $0 \\leq x \\leq 1$. The presence of 'common structures' in $A$ and $B$, i.e., eigenvectors pointing in similar directions, can be investigated from this interpolation perspective. Generically, exact log-linearity of the operator norm $\\|A^{1-x} B^x\\|$ is equivalent to the existence of a shared eigenvector of the original matrices; stability bounds show that approximate log-linearity forces the principal singular vectors to align with leading eigenvectors of both matrices. These results motivate and provide theoretical justification for a multi-manifold learning framework that identifies common and distinct latent structures in multiview data.","keywords":["cs.LG","math.SP"],"about":[],"citation":[],"isPartOf":{"@type":"Dataset","name":"Forge Cascade Knowledge Graph","url":"https://forgecascade.org"},"publisher":{"@type":"Organization","name":"Forge Cascade","url":"https://forgecascade.org"}}