Week of NeurIPS 2023 and Humanoids 2023
Bringing you daily snippets from these two top-tier conferences
This is a big week with two major conferences in the fields of Artificial Intelligence (AI) and Robotics intersecting, so there is a lot of interesting news to report. The biggest gathering for AI, NeurIPS 2023, runs all week in New Orleans, LA, while the biggest gathering for humanoid robots, Humanoids 2023, runs Tuesday through Thursday in Austin, TX. We will bring paid subscribers short posts like this one from both conferences. If you have not signed up for a paid subscription yet, please do at the following link.
Big advances for Large Language Models (LLMs)
In the case of AI, companies and research groups schedule big announcements around NeurIPS in order to make the loudest bang. Last week Google released Gemini, its latest multi-modal model (multi-modal models work with text, video, images, code, and other media formats simultaneously). While there was a lot of news about Gemini last week, Google is also receiving a lot of flak for a canned demo that made people believe Gemini was close to Artificial General Intelligence (AGI).
Mixtral
AI research moves fast, and Gemini has already been overshadowed by a new model from Mistral AI called Mixtral 8x7B. Mixtral is a mixture-of-experts model with 8 expert subnetworks (roughly 7 billion parameters each), of which only two are dynamically selected for each token during inference. With this design, Mixtral 8x7B beats the largest Llama 2 model (70 billion parameters) in accuracy, and it also achieves about 6x faster inference, because only a small subset of parameters (about 12.9 billion per token) is used for computation instead of the full model, as in Llama 2. If you want to know more about Mixtral, the Interconnects blog has a very nice and detailed article on it. If you are interested in testing Mixtral, Together AI has already made a free inference endpoint available for you.
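To make the "only two experts per token" idea concrete, here is a minimal sketch of top-2 mixture-of-experts routing in PyTorch. The layer sizes, class name, and gating details are illustrative assumptions for exposition, not Mixtral's actual configuration.

```python
# Minimal sketch of top-2 mixture-of-experts routing, in the spirit of Mixtral.
# Sizes and names are illustrative assumptions, not Mixtral's real configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoELayer(nn.Module):
    def __init__(self, d_model=64, d_ff=256, num_experts=8):
        super().__init__()
        # A small gating network scores every expert for each token.
        self.gate = nn.Linear(d_model, num_experts, bias=False)
        # Each expert is an independent feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (num_tokens, d_model)
        scores = self.gate(x)                          # (num_tokens, num_experts)
        top2_vals, top2_idx = scores.topk(2, dim=-1)   # pick the 2 best experts per token
        weights = F.softmax(top2_vals, dim=-1)         # renormalize over the chosen experts
        out = torch.zeros_like(x)
        # Only the two selected experts run for each token; the rest stay idle,
        # which is why the per-token compute is much smaller than the full model.
        for slot in range(2):
            for e, expert in enumerate(self.experts):
                mask = top2_idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

tokens = torch.randn(10, 64)   # 10 tokens with 64-dim embeddings
layer = Top2MoELayer()
print(layer(tokens).shape)     # torch.Size([10, 64])
```

The design choice to route each token to only two of the eight experts is what gives the speedup: the total parameter count stays large, but the compute per token is closer to that of a much smaller dense model.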
Topics from NeurIPS 2023
The attention given to LLMs can also be gauged by the fact that 4 of the 6 best paper awards at NeurIPS 2023 went to papers on LLMs.
In the context of robotics, the most interesting advancements are in the field of Reinforcement Learning (RL). There is an interesting paper from researchers at Stanford which discusses how a trained transformer model (like those used in LLMs) can be used to predict an optimal action given a query state and a dataset of past interactions; a sketch of the idea follows below. I also look forward to reading through a ton of research work from Prof. Sergey Levine’s RAIL research group at UC Berkeley. A thread with research work presented by his group is available here. If you want to track all the cool work from NeurIPS, you can search on X (Twitter) for the hashtag #NeurIPS2023.
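Here is a hedged sketch of the general idea: feed a transformer a context of (state, action, reward) interactions plus a query state, and read off a predicted action. The class name, dimensions, and architecture below are illustrative assumptions, not the Stanford paper's actual model or training setup.

```python
# Sketch of in-context action prediction with a transformer.
# All names and shapes are illustrative assumptions for exposition.
import torch
import torch.nn as nn

class InContextPolicy(nn.Module):
    def __init__(self, state_dim=4, action_dim=2, d_model=64):
        super().__init__()
        # Embed (state, action, reward) triples and the query state into tokens.
        self.embed_interaction = nn.Linear(state_dim + action_dim + 1, d_model)
        self.embed_query = nn.Linear(state_dim, d_model)
        encoder_layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.action_head = nn.Linear(d_model, action_dim)

    def forward(self, interactions, query_state):
        # interactions: (batch, context_len, state_dim + action_dim + 1)
        # query_state:  (batch, state_dim)
        ctx = self.embed_interaction(interactions)
        q = self.embed_query(query_state).unsqueeze(1)
        tokens = torch.cat([ctx, q], dim=1)   # query token appended last
        h = self.transformer(tokens)
        return self.action_head(h[:, -1])     # predicted action for the query state

policy = InContextPolicy()
dataset = torch.randn(1, 20, 4 + 2 + 1)      # 20 past interactions as context
query = torch.randn(1, 4)
print(policy(dataset, query).shape)          # torch.Size([1, 2])
```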
Humanoid Robotics competition
For me, besides the quality of the research work, a major attraction of Humanoids 2023 is the robot competition on Wednesday. For humanoid robots, there are two walking competitions in which the robot must walk end-to-end along a 15 m x 2 m corridor. In one competition there are no obstacles, while in the other, 5-gallon trash cans and large storage boxes are placed as obstacles. The fastest robot wins. The robots competing are the Unitree H1, the Artemis robot from UCLA, and the UNLV Hubo robot. There is a similar race for quadruped robots on Thursday. We hope to get you some cool videos from these competitions.
If any readers are attending either of these two conferences, please send me a pointer (via the comments below).
The real excitement at IEEE Humanoids is today's workshops (like Can We Build BayMax?), where we are hearing a lot about what is under the hood in the humanoids that have achieved real product status. There are still many unsolved research problems, but this is the year that commercial humanoids have arrived.