While the track application versions guide showed how to track application versions using LoggedModel as a metadata hub linking to external code (e.g., Git), there are scenarios where you need to package your application code directly into the LoggedModel. This is particularly useful for deployment to Databricks Model Serving or through Agent Framework, both of which expect self-contained model artifacts.
When to Package Code Directly
Package your code into a LoggedModel when you need:
- Self-contained deployment artifacts that include all code and dependencies
- Direct deployment to serving platforms without external code dependencies
This is an optional step for deployment, not the default versioning approach for development iterations.
Getting started
MLflow recommends using the ChatAgent interface to package your GenAI applications.
Visit the build and deploy AI agents quickstart for a quick example, or see the author AI agents in code page for details.
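The guides above cover authoring in depth. As a rough orientation, a minimal sketch of a code-packaged agent is shown below. It assumes MLflow 3 with the ChatAgent interface and the models-from-code pattern; the `EchoAgent` class and its echo logic are placeholders for your own application logic.

```python
# agent.py -- minimal ChatAgent sketch (placeholder logic, not a full app).
import uuid

import mlflow
from mlflow.pyfunc import ChatAgent
from mlflow.types.agent import ChatAgentMessage, ChatAgentResponse


class EchoAgent(ChatAgent):
    """Toy agent that echoes the last user message; replace with real logic."""

    def predict(self, messages, context=None, custom_inputs=None) -> ChatAgentResponse:
        last_user = messages[-1].content if messages else ""
        return ChatAgentResponse(
            messages=[
                ChatAgentMessage(
                    id=str(uuid.uuid4()),
                    role="assistant",
                    content=f"You said: {last_user}",
                )
            ]
        )


# Models-from-code: register the agent instance that MLflow should load
# when this file is logged as the model.
AGENT = EchoAgent()
mlflow.models.set_model(AGENT)
```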
Following these guides will result in a deployment-ready LoggedModel that behaves in the same way as a metadata-only LoggedModel. Follow step 6 of the track application versions guide to link your packaged model version to evaluation results.
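As a hedged sketch of the packaging step (assuming MLflow 3, where `mlflow.pyfunc.log_model` accepts a `name` argument, and a hypothetical `agent.py` file like the one above), logging the code as a self-contained LoggedModel might look like this:

```python
import mlflow

# Log the agent code file itself so the LoggedModel is self-contained
# (models-from-code: the file, not a pickled object, is the artifact).
logged_model_info = mlflow.pyfunc.log_model(
    name="my_agent",            # name of the resulting LoggedModel
    python_model="agent.py",    # path to the file that calls set_model()
    pip_requirements=["mlflow"],
)

# The returned metadata identifies the version you can deploy and evaluate.
print(logged_model_info.model_uri)
print(logged_model_info.model_id)
```

The printed model identifier is what you reference when linking evaluation results to this version, as described in step 6 of the track application versions guide.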
Next Steps
- Deploy to Model Serving - Deploy your packaged model to production
- Link production traces to app versions - Track deployed versions in production
- Run scorers in production - Monitor deployed model quality