

ML ∪ Systems

This blog is a cozy corner dedicated to exploring and collecting research, articles, and happenings around applied machine learning, systems, and their intersection.
It is an effort to keep myself updated with new findings, recent advances, and the peaks and valleys in these domains.
It occasionally delves into depth, is more often shallow and broad, and mostly offers pointers to papers, talks, and posts; with zero guarantee on the correctness of the content.

Feel free to leave comments, point out any mistakes (rest assured, there will be many), and share your own knowledge. Your contribution is greatly valued.


Latest


  • ICLR 2024: Numbers and My To Be Read List


    May 27, 2024

The twelfth International Conference on Learning Representations (ICLR 2024) took place during the second week of May in Vienna, Austria. ICLR is widely recognized as one of the premier conferences in the field of machine learning, with a high volume of submissions annually. This post first presents the conference's key statistics, followed by a list of papers that received outstanding paper awards and honorable mentions. Lastly, it shares a selection of papers that I find intriguing based solely on their titles and abstracts, with a few exceptions.

  • CoNEXT '23 SIGCOMM Rising Star Keynote: Application-Centric Networking


    December 28, 2023

Ravi Netravali, an Assistant Professor at Princeton University, won this year's ACM SIGCOMM Rising Star Award. In this keynote talk, given at the ACM CoNEXT '23 conference, he discusses the importance of application-centric networking. He argues that the networking community has neglected the impact of the application and its corresponding layer in the networking stack, and shares his insights through several projects he has worked on over the past few years.

  • Behind the Scenes Scaling ChatGPT


    December 26, 2023

In this non-technical talk, Evan Morikawa, an Engineering Manager at OpenAI, opens a window onto the behind-the-scenes of OpenAI since the first days of releasing ChatGPT. He explains a few engineering challenges that the team faced and their (sometimes funny) thoughts and efforts on addressing them.

  • Are Machine Learning Methods Better than Simple Heuristics for Time Series Prediction in Networking Problems?


    December 01, 2023

Machine learning methods, and deep learning in particular, have made their way into many other computer science domains, and research around networked systems is no exception. One key challenge that is shared, to some extent, across a considerable number of networking and systems tasks is forecasting different kinds of time series: incoming traffic, metrics extracted from system conditions over a time period, and/or events at different granularities, to name a few.

  • Which One Is the Winner CNN or Vision Transformer?


    November 27, 2023

Deep learning's advance started with recurrent neural networks like LSTMs in the 1990s, blew up with CNNs around 2010, and reached another level with Transformers in 2017. Since the emergence of Vision Transformers (ViTs) around 2020, there has been an ongoing debate between ConvNet fans (who design, develop, and support convolutional neural networks) and foundation model fans (who believe transformers are all you need) over which architecture performs better in terms of achievable accuracy, scalability, and accuracy-to-compute ratio.

Contact


Room 1912, Chair of Communication Networks, CE Department, Technical University of Munich, Munich, Germany.