
Content provided by The Cloud Pod, Justin Brodley, Jonathan Baker, Ryan Lucas, and Peter Roosakos. All podcast content, including episodes, artwork, and podcast descriptions, is uploaded and made available directly by The Cloud Pod, Justin Brodley, Jonathan Baker, Ryan Lucas, and Peter Roosakos or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described at https://pl.player.fm/legal.

257: Who Let the LLamas Out? *Bleat Bleat*

1:01:47
 

Welcome to episode 257 of the Cloud Pod podcast – where the forecast is always cloudy! This week your hosts Justin, Matthew, Ryan, and Jonathan are in the barnyard bringing you the latest news, which this week is really just Meta’s release of Llama 3. Seriously. That’s every announcement this week. Don’t say we didn’t warn you.

Titles we almost went with this week:

  • Meta Llama says no Drama
  • No Meta Prob-llama
  • Keep Calm and Llama on
  • Redis did not embrace the Llama MK
  • The bedrock of good AI is built on Llamas
  • The CloudPod announces support for Llama3 since everyone else was doing it
  • Llama3, better known as Llama Llama Llama
  • The Cloud Pod now known as the LLMPod
  • Cloud Pod is considering changing its name to LlamaPod
  • Unlike WinAMP, nothing whips the llama's ass

A big thanks to this week’s sponsor:

Check out Sonrai Security's new Cloud Permission Firewall. Just for our listeners, enjoy a 14-day trial at www.sonrai.co/cloudpod

Follow Up

01:27 Valkey is Rapidly Overtaking Redis

  • Valkey continues to rack up support: AWS, Ericsson, Google, Oracle, and Verizon backed it initially, and they have now been joined by Alibaba, Aiven, Heroku, and Percona.
  • Numerous blog posts have come out touting Valkey adoption.
  • I’m not sure this whole thing is working out as well as Redis CEO Rowan Trollope had hoped.

AI Is Going Great – Or How AI Makes All Its Money

03:26 Introducing Meta Llama 3: The most capable openly available LLM to date

  • Meta has launched Llama 3, the next generation of their state-of-the-art open source large language model.
  • Llama 3 will be available on AWS, Databricks, GCP, Hugging Face, Kaggle, IBM WatsonX, Microsoft Azure, Nvidia NIM, and Snowflake, with support from hardware platforms offered by AMD, AWS, Dell, Intel, Nvidia, and Qualcomm.
  • It includes new trust and safety tools such as Llama Guard 2, Code Shield, and CyberSec Eval 2.
  • Meta plans to introduce new capabilities, including longer context windows, additional model sizes, and enhanced performance.
  • The first two Llama 3 models are the 8B and 70B parameter variants, which can support a broad range of use cases.
  • Meta shared benchmarks comparing the Llama 3 8B model against Gemma 7B and Mistral 7B, showing improvements across all major benchmarks, including math, where Gemma 7B scored 12.2 versus 30 for Llama 3.
  • The 70B model performed comparably against Gemini Pro 1.5 and Claude 3 Sonnet, scoring within a few points on most benchmarks.
  • Jonathan recommends using LM Studio to get started playing around with LLMs, which you can find at https://lmstudio.ai/
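If you do try LM Studio, it can run a local server that speaks an OpenAI-compatible API (by default at http://localhost:1234/v1). As a minimal sketch, assuming you have downloaded a Llama 3 8B build in LM Studio and started the local server (the model id string below is an assumption; use whatever id LM Studio shows for your download), you can query it with just the Python standard library:

```python
import json
import urllib.request

def build_chat_request(prompt, base_url="http://localhost:1234/v1"):
    """Build an OpenAI-style chat completion request for a local LM Studio server."""
    payload = {
        # Model id as displayed in LM Studio -- an assumption; adjust to your download.
        "model": "llama-3-8b-instruct",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def ask(prompt):
    """Send the prompt to the local server and return the model's reply text."""
    req = build_chat_request(prompt)
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# ask("Why llamas?")  # requires the LM Studio local server to be running
```

This keeps everything on your own machine, so there are no API keys or per-token costs while you experiment.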

04:42 Jonathan – “Isn’t it funny how you go from an 8 billion parameter model to a 70 billion parameter model but nothing in between? Like you would have thought there would be some kind of like, some middle ground maybe? But, uh, but… no. But, um…”

