Live Interactions with Jan Chorowski and Adrian Kosowski | Bonus Resource
As part of this bootcamp, we hosted a Fireside Chat featuring Jan Chorowski, CTO of Pathway. Jan is a renowned figure in Artificial Intelligence, with a Ph.D. in Neural Networks, more than 10,000 citations, and research collaborations with AI pioneers such as Geoff Hinton and Yoshua Bengio.
During the chat, our host, Anup Surendran, explored the realm of Large Language Models (LLMs) together with Jan. Their discussion spanned a wide spectrum of LLM topics: their diverse applications, the operational hurdles they face, and the intriguing concept of 'learning to forget.' The conversation also covered the real-time capabilities of LLMs and why they matter in today's fast-moving technology landscape.
Key Highlights:
Gain insights into the evolution of LLMs and their practical applications.
Explore the operational challenges faced by LLMs in real-time scenarios.
Understand the concept of 'learning to forget' and its role in LLM development.
Dive into the discussion on the real-time nature of LLMs and their relevance.
Get valuable answers to audience questions about LLMs, document versioning, and more.
Don't miss this illuminating Fireside Chat, which offers a unique perspective on the evolving world of Large Language Models. Watch the recording to explore the possibilities of LLMs in today's tech-driven world.
This is a session from Pathway's archives: a live interaction between Jon Krohn (Chief Data Scientist at Nebula | GitHub) and Adrian Kosowski (CPO at Pathway | Google Scholar). Adrian, notable for completing his PhD at an early age and for co-founding Spoj.com, brings over 15 years of diverse research experience to the discussion. The conversation delves into real-time data processing, contrasting stream and batch processing and exploring practical ML applications.
Key Highlights:
Gain a deep understanding of real-time data processing nuances through a discussion on reactive data processing.
Explore the key differences and practical applications of stream versus batch processing (a brief illustrative sketch follows this list).
Understand the role of transformers in data engineering, especially in managing and streaming data.
Discover emerging machine learning tools and approaches that are particularly beneficial for startups.
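To make the stream-versus-batch distinction discussed in the session more concrete, here is a minimal sketch using Pathway's open-source Python framework. It assumes Pathway's pw.io.csv connectors and their mode="static" / mode="streaming" options; the input directory, schema, and column names are purely illustrative, not taken from the session.

```python
# Minimal sketch: the same aggregation run as a batch job or as a streaming job.
import pathway as pw


class EventSchema(pw.Schema):
    user: str
    value: int


# Batch ("static") mode reads the CSV files once and computes a one-off result.
# Streaming mode keeps watching the directory and updates results incrementally
# as new data arrives; switching between the two is a one-word change.
events = pw.io.csv.read(
    "./events/",          # illustrative input directory
    schema=EventSchema,
    mode="streaming",     # use mode="static" for a classic batch run
)

# Per-user running totals, maintained incrementally in streaming mode.
totals = events.groupby(pw.this.user).reduce(
    user=pw.this.user,
    total=pw.reducers.sum(pw.this.value),
)

pw.io.csv.write(totals, "./totals.csv")  # continuously updated output
pw.run()  # start the dataflow; in streaming mode this runs until stopped
```

The point of the sketch is that the pipeline definition stays the same in both modes; only the input mode decides whether results are computed once over a snapshot or kept up to date as new events stream in.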
Don't miss these illuminating Fireside Chats, which offer unique perspectives on the fast-evolving domains of Large Language Models and real-time data processing. These sessions provide valuable insights into the world of AI and machine learning.