Generative Pre-Trained Transformer 4 omni (English)
- A transformer-based large language model trained on an extensive dataset of general text that produces human-like free-text output. It can accommodate input of up to 128,000 tokens in length.
- AI, CEC, FDA, HF-NLP, NLP, SCTI
- Artificial intelligence, Electronic health record, Cardiology
- https://doi.org…LURE.124.012514