A Generative AI Engineer is tasked with developing an application that is based on an open-source large language model (LLM). They need a foundation LLM with a large context window. Which model fits this need?
- DBRX
- Llama2-70B
- DistilBERT
- MPT-30B
DBRX has a larger context window than MPT-30B: DBRX supports a 32K-token context window, while MPT-30B supports an 8K-token context window. However, the answer key lists MPT-30B. Can anyone please help clarify? Thanks in advance.