UT Spark: Model Availability

UT Spark provides access to a variety of Azure-deployed Large Language Models (LLMs). In the coming months, we plan to expand availability to include ChatGPT 5, Claude Sonnet 4, and Claude Opus 4, while maintaining the flexibility to add or remove models based on community feedback.

Process

When a new model or a new version of an existing model becomes available, the AI Studio team will test and evaluate it for use within UT Spark. Once a model is deemed acceptable, the team will release it to production and retire the oldest version from that vendor's lineup, so that two versions from each vendor are available at all times. For example, when ChatGPT 5 has been evaluated and is ready for production, the team will remove the oldest version (ChatGPT 4o), keep ChatGPT 4.1, and deploy ChatGPT 5. Note: there may be a delay between when a new model or version is released to the public and when it becomes available in UT Spark, while it is being tested and evaluated.

OpenAI

ChatGPT 5

Coming soon…

ChatGPT 4o

Release Date: May 13, 2024 (Business Insider, Wikipedia)

Knowledge Cutoff: Initially October 2023; extended to June 2024 as of January 29, 2025 (Seifeur Guizeni, DeepNewz, Wikipedia)

When to Use:
  • Multimodal tasks (text, image, audio); see the usage sketch after this table
  • Conversational, exploratory, or creative workflows
  • Real-time, interactive dialogue (voice- and image-enabled)
  • Quick, general-purpose summarization, brainstorming, or learning assistance

Why Use It:
  • Intuitive and responsive; great for live interaction and ideation
  • Multimodal support enhances flexibility
  • More up-to-date training data (to mid-2024) improves recency and relevance
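
Below is a minimal sketch of a multimodal request, assuming access is backed by an Azure OpenAI deployment reachable through the openai Python SDK; the endpoint, API version, deployment name (gpt-4o), and image URL are illustrative placeholders rather than confirmed UT Spark values.

    import os

    from openai import AzureOpenAI

    # Hypothetical connection details; replace with the endpoint, key, and
    # deployment name issued for your Azure OpenAI resource.
    client = AzureOpenAI(
        azure_endpoint="https://<your-resource>.openai.azure.com",
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-06-01",
    )

    # Send text and an image together in a single chat request (multimodal input).
    response = client.chat.completions.create(
        model="gpt-4o",  # deployment name (assumed)
        messages=[
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": "Summarize what this chart shows."},
                    {"type": "image_url", "image_url": {"url": "https://example.com/chart.png"}},
                ],
            }
        ],
    )
    print(response.choices[0].message.content)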

ChatGPT 4.1

Release Date: April 14, 2025 (API launch); available via ChatGPT Plus/Pro since May 14, 2025 (Wikipedia)

Knowledge Cutoff: June 2024 (shared across the GPT-4.1 model family) (Wikipedia)

When to Use:
  • Projects involving very long or complex prompts (large context)
  • Precise, structured, or code-heavy outputs
  • Automated or backend uses via the API; see the usage sketch after this table
  • Scalable workflows: large-scale content, technical documents, multi-step reasoning

Why Use It:
  • Massive context window (~1 million tokens) handles long documents and codebases
  • Stronger accuracy, instruction following, and coding performance
  • Designed for efficiency and lower costs in API-driven workflows
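
As noted above, here is a minimal sketch of an API-driven, long-document workflow; it makes the same assumption of an Azure OpenAI deployment accessed through the openai Python SDK, and the endpoint, API version, deployment name (gpt-4.1), and file path are illustrative placeholders.

    import os

    from openai import AzureOpenAI

    # Hypothetical connection details; replace with the values issued for
    # your Azure OpenAI resource.
    client = AzureOpenAI(
        azure_endpoint="https://<your-resource>.openai.azure.com",
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-06-01",
    )

    # The large context window allows a long document to be sent in one request.
    with open("quarterly_report.txt", encoding="utf-8") as f:
        document = f.read()

    response = client.chat.completions.create(
        model="gpt-4.1",  # deployment name (assumed)
        messages=[
            {"role": "system", "content": "Summarize technical documents as concise, structured bullet points."},
            {"role": "user", "content": document},
        ],
    )
    print(response.choices[0].message.content)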

 

Anthropic

Coming soon…

 

Mistral

Coming soon…