Building a Cloud-Native Platform for the Future of AI: Foundation Models
Abstract
Foundation Models mark an emerging inflection point in the creation of powerful, very high-dimensional data representations, triggered by advances in AI. Foundation Models are billion-parameter-scale neural networks, built on novel architectures and trained using a technique called self-supervision. This new paradigm presents unprecedented opportunities and challenges across the full computing stack. Hear how IBM Research is expanding and realizing the value of Foundation Models: from building a cloud-native supercomputing infrastructure and a simplified, cloud-native common stack to train and deploy Foundation Models in a multicloud environment, to applying this full stack to enable advances in the natural language domain and beyond, including time series and code generation.