# Dolly-2.0-LLM

Databricks’ dolly-v2-12b is an instruction-following large language model trained on the Databricks machine learning platform and licensed for commercial use. Based on pythia-12b, Dolly is fine-tuned on ~15k instruction/response records (databricks-dolly-15k) generated by Databricks employees across the capability domains described in the InstructGPT paper, including brainstorming, classification, closed QA, generation, information extraction, open QA, and summarization. dolly-v2-12b is not a state-of-the-art model, but it exhibits surprisingly high-quality instruction-following behavior not characteristic of the foundation model on which it is based.
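
As a rough usage sketch (not part of this README), the model can typically be loaded with the Hugging Face `transformers` library. The model ID, dtype, and device settings below are assumptions and may need adjustment for your hardware:

```python
# Minimal sketch: load dolly-v2-12b from the Hugging Face Hub and generate a response.
# Assumes `transformers`, `torch`, and `accelerate` are installed, and that the model is
# published as "databricks/dolly-v2-12b" (an assumption based on the model name above).
import torch
from transformers import pipeline

generate_text = pipeline(
    model="databricks/dolly-v2-12b",
    torch_dtype=torch.bfloat16,   # reduced precision to fit the 12B model in memory
    trust_remote_code=True,       # allow any custom pipeline code shipped with the model
    device_map="auto",            # spread weights across available GPUs/CPU
)

# Instruction-style prompt, matching the fine-tuning domains listed above.
print(generate_text("Explain the difference between open QA and closed QA."))
```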