Foundation Base LLMs

The foundation base LLMs for DAMN, grouped into open-source and proprietary models:

Open Source Models

Meta AI Models

  • LLaMA 3.1: Latest version with 8B, 70B, and 405B parameter variants

  • LLaMA 2: Previous version with 7B, 13B, and 70B parameter variants

Google Models

  • BERT: Early encoder-only transformer model from Google, widely used for language-understanding tasks rather than text generation

  • Gemma: Open-weights family built from the same research as Gemini, available in 2B and 7B parameter sizes

Mistral AI Models

  • Mistral 7B: 7.3B parameter model, reported to outperform larger models such as Llama 2 13B across standard benchmarks

  • Mixtral 8x22B: Sparse mixture-of-experts model with 141B total parameters, of which roughly 39B are active per token (see the sketch after this list)

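What "active parameters" means here: a sparse mixture-of-experts model routes each token to only a few experts (Mixtral uses 2 of 8 per layer), so per-token compute touches the shared weights plus the selected experts rather than all 141B parameters. The sketch below backs an approximate per-expert/shared split out of the published totals; the split itself is an illustrative estimate, not an official breakdown.

```python
# Illustrative estimate of Mixtral 8x22B's parameter split, assuming
# top-2 routing over 8 experts. Only the totals (141B / 39B) are published;
# the derived per-expert and shared sizes are approximations.
TOTAL_B = 141.0    # total parameters, in billions
ACTIVE_B = 39.0    # parameters active per token, in billions
N_EXPERTS = 8
TOP_K = 2

# total  = shared + N_EXPERTS * per_expert
# active = shared + TOP_K     * per_expert
per_expert_b = (TOTAL_B - ACTIVE_B) / (N_EXPERTS - TOP_K)  # ~17B per expert
shared_b = TOTAL_B - N_EXPERTS * per_expert_b              # ~5B shared

print(f"per-expert ~ {per_expert_b:.0f}B, shared ~ {shared_b:.0f}B")
print(f"active per token ~ {shared_b + TOP_K * per_expert_b:.0f}B")
```

This is why a 141B parameter model can run with roughly the per-token compute cost of a ~39B dense model, even though all 141B parameters still have to fit in memory.
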
Stability AI Models

  • StableLM: Series including 3B, 7B variants, with larger models in development

  • Stable LM 2: Available in 1.6B and 12B parameter versions

Other Open Source Models

  • BLOOM: 176B parameter multilingual model developed by the BigScience project, available through Hugging Face (see the loading sketch below)

  • Falcon: Available in 11B (Falcon 2) and 180B parameter versions

  • OPT: Meta's series ranging from 125M to 175B parameters

  • XGen-7B: 7B parameter model developed by Salesforce, trained with an 8K-token context window

  • GPT-NeoX: 20B parameter model by EleutherAI

  • GPT-J: 6B parameter model by EleutherAI

  • Pythia: EleutherAI's suite of models ranging from 70M to 12B parameters

  • DBRX: 132B parameter mixture-of-experts model by Databricks (MosaicML), with 36B active parameters

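Most of the open-source models above publish their weights on the Hugging Face Hub, which is the most common way to use them as a foundation base. A minimal loading sketch, assuming the transformers and accelerate packages and using mistralai/Mistral-7B-v0.1 as the example checkpoint (some checkpoints, such as the Llama and Gemma families, are license-gated and require accepting terms first):

```python
# Minimal sketch: load an open-source base model from the Hugging Face Hub.
# "mistralai/Mistral-7B-v0.1" is used as an example checkpoint; the other
# decoder-only models in the list above follow the same pattern.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # half precision to reduce memory use
    device_map="auto",            # spread layers across available GPUs/CPU
)

# Base (non-instruct) checkpoints do plain next-token continuation, so
# prompt them as text to be completed rather than as chat turns.
inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The one exception in the list is BERT, which is encoder-only and would be loaded with AutoModel rather than AutoModelForCausalLM.
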
Proprietary Models

OpenAI Models

  • GPT-4o: Latest multimodal model with text, image, video, and voice capabilities

  • GPT-3.5: Previous generation model

Anthropic Models

  • Claude 3.5: Anthropic's flagship family (led by Claude 3.5 Sonnet), known for strong reasoning and coding performance alongside Anthropic's safety-focused design

Google Models

  • PaLM 2: Successor to the original 540B parameter PaLM, reported to have roughly 340B parameters

  • Gemini 1.5: Multimodal model family with a long context window (up to 1M tokens) and improved multilingual capabilities

Other Proprietary Models

  • Grok-1: Developed by xAI, with 314B parameters

  • Inflection-2.5: Powers the conversational AI assistant Pi

  • Jamba: 52B parameter model by AI21 Labs that combines Mamba (SSM) layers with Transformer layers, with roughly 12B parameters active per token

  • Command (Cohere): Cohere's model family, specialized in enterprise applications such as retrieval-augmented generation

  • Luminous: 70B parameter model developed by Aleph Alpha

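Unlike the open-source models, these proprietary models are reached through vendor APIs rather than downloadable weights. A minimal sketch of one such call, assuming the official openai Python package and an OPENAI_API_KEY environment variable; Anthropic, Google, Cohere, and the others expose similar but vendor-specific client libraries:

```python
# Minimal sketch: query a proprietary model (GPT-4o) through its vendor API.
# Assumes the official `openai` Python package and an OPENAI_API_KEY
# environment variable set in the shell.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize what a foundation base LLM is."},
    ],
    max_tokens=100,
)

print(response.choices[0].message.content)
```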