
Sunday, October 26, 2025

Mlflow Java client

Mlflow is a leading open source framework for managing AI/ML workflows. Mlflow allows tracking, monitoring and generally visualizing end-to-end ML project lifecycles, a handy ops-side tool that improves the interpretability of AI/ML projects.

Key Mlflow concepts include ML Projects, Models, Experiments and Runs, to name a few: several Runs of an Experiment are conducted on a Model. Experiments can also be Tagged with meaningful, human-relevant labels.

While Mlflow is a Python-native library with integrations for all the leading Python AI/ML frameworks such as OpenAI, Langchain, Llamaindex, etc., there are also Mlflow REST API endpoints for wider portability.

There is also a specific Mlflow Java API for use from the Java ecosystem. The corresponding Mlflow Java client (available as a Maven dependency) works well with the API. To get started with mlflow using Java:

(I) Install mlflow (Getting started guide)

        $ pip install mlflow 

This installs mlflow to the user's .local folder:

        ~/.local/bin/mlflow 

(II) Start Local mlflow server (simple without authentication)

        $ mlflow server --host 127.0.0.1 --port 8080

mlflow server should be running on 

        http://127.0.0.1:8080

(III) Download the mlflower repo (sample Java client code)

Next, clone the mlflower repo, which has some sample code demonstrating the mlflow Java client.

  • The class Mlfclient shows a simple use case of Creating an Experiment:

            client.createExperiment(experimentName);

Followed by a few runs logging some Parameters, Metrics and Artifacts:

        run.logParam();
        run.logMetric();
        run.logArtifact();
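Putting those calls together, a minimal sketch of the flow (assuming the org.mlflow:mlflow-client dependency, the local tracking server from step (II), and illustrative experiment/parameter/metric names):

```java
import java.io.File;
import java.io.PrintWriter;

import org.mlflow.api.proto.Service.RunInfo;
import org.mlflow.tracking.MlflowClient;

public class MlfClientSketch {
    static final String TRACKING_URI = "http://127.0.0.1:8080";

    public static void main(String[] args) throws Exception {
        MlflowClient client = new MlflowClient(TRACKING_URI);

        // Create an Experiment and a Run under it
        String experimentId = client.createExperiment("demo-experiment");
        RunInfo run = client.createRun(experimentId);
        String runId = run.getRunId();

        // Log a Parameter, a Metric and an Artifact (names/values are illustrative)
        client.logParam(runId, "learning_rate", "0.01");
        client.logMetric(runId, "accuracy", 0.95);

        File artifact = new File("notes.txt");
        try (PrintWriter out = new PrintWriter(artifact)) {
            out.println("run notes");
        }
        client.logArtifact(runId, artifact);

        client.setTerminated(runId); // marks the run FINISHED on the UI
    }
}
```

The experiment, run, params and metrics should then show up under the tracking server UI at the URI above.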

 

  • Run Hierarchy: The class NestedMlfClient shows the nesting hierarchy of Mlflow runs:

        Parent Run -> Child Run -> Grand Child Run ->.... & so on
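One way to express that hierarchy with the plain client (a sketch, not necessarily how NestedMlfClient does it) is tagging each child run with its parent's run id via the mlflow.parentRunId tag, which the Mlflow UI uses for nesting:

```java
import org.mlflow.tracking.MlflowClient;

public class NestedRunsSketch {
    // Tag the UI inspects to build the Parent -> Child -> Grand Child tree
    static final String PARENT_RUN_TAG = "mlflow.parentRunId";

    public static void main(String[] args) {
        MlflowClient client = new MlflowClient("http://127.0.0.1:8080");
        String experimentId = client.createExperiment("nested-demo");

        String parentId = client.createRun(experimentId).getRunId();

        String childId = client.createRun(experimentId).getRunId();
        client.setTag(childId, PARENT_RUN_TAG, parentId);

        String grandChildId = client.createRun(experimentId).getRunId();
        client.setTag(grandChildId, PARENT_RUN_TAG, childId);

        for (String id : new String[] {grandChildId, childId, parentId}) {
            client.setTerminated(id);
        }
    }
}
```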

(IV) Start Local mlflow server (with Basic Authentication)

While authentication is crucial for managing workflows, Mlflow only provided Basic Auth until very recently. Version 3.5 onwards has better support for various auth providers, SSO, etc. For now, only the mlflow Basic Auth integration is shown.

           # Start server with Basic Auth
            mlflow server --host 127.0.0.1 --port 8080 --app-name basic-auth

As before, the mlflow server should start running on

            http://127.0.0.1:8080

This time a login credential is required to access the page. The default admin credentials are mentioned in the mlflow basic-auth-http docs.

  • The class BasicAuthMlfclient shows the Java client using BasicMlflowHostCreds to connect to Mlflow with basic auth. 

            new MlflowClient(new BasicMlflowHostCreds(TRACKING_URI, USERNAME, PASSWORD));
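Fleshing that one-liner out into a self-contained sketch (the username/password placeholders are to be filled in from the basic-auth docs; listing experiments is just an arbitrary call to confirm the authenticated connection works):

```java
import org.mlflow.tracking.MlflowClient;
import org.mlflow.tracking.creds.BasicMlflowHostCreds;

public class BasicAuthSketch {
    static final String TRACKING_URI = "http://127.0.0.1:8080";

    public static void main(String[] args) {
        // Placeholder credentials; see the mlflow basic-auth docs for the defaults
        String username = "admin";
        String password = "changeme";

        MlflowClient client = new MlflowClient(
                new BasicMlflowHostCreds(TRACKING_URI, username, password));

        // Any authenticated call confirms the credentials are accepted
        client.listExperiments().forEach(e -> System.out.println(e.getName()));
    }
}
```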

(V) Deletes Soft/ Hard

  • Experiments, Runs, etc. created within mlflow can be deleted from the UI (and client). These deletes are however only Soft: the items go to a recycle bin that is not visible on the UI.
  • Hard/permanent deletes can be effected from the mlflow CLI:

    # Set mlflow server tracking uri 

    export MLFLOW_TRACKING_URI=http://127.0.0.1:8080

    # Clear garbage

    mlflow gc

(VI) Issues

  • MlflowContext.withActiveRun() absorbs exceptions without any logs, and simply sets the run status to RunStatus.FAILED.
    • So in case runs show failure on the mlflow UI, it's best to put an explicit try-catch in the client to find the cause.
  • Unable to upload artifacts since the CLI looks for python (and not python3) on the path.
    • Error message: Failed to exec 'python -m mlflow.store.artifact.cli', needed to access artifacts within the non-Java-native artifact store at 'mlflow-artifacts:
    • The dev box (Ubuntu 20.04) has python3 (and not python) installed.
    • Without changing the dev box, a simple fix is to set/export the environment variable MLFLOW_PYTHON_EXECUTABLE (within the IDE, shell, etc.) to whichever python binary is installed on the box:
               MLFLOW_PYTHON_EXECUTABLE=/usr/bin/python3 
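Given the swallowed-exception issue above, a defensive pattern is to wrap the body passed to withActiveRun() in an explicit try-catch that logs before re-throwing (the run body here is a hypothetical placeholder):

```java
import org.mlflow.tracking.MlflowContext;

public class SafeRunSketch {
    static final String EXPERIMENT = "demo-experiment";

    public static void main(String[] args) {
        MlflowContext mlflow = new MlflowContext("http://127.0.0.1:8080");
        mlflow.setExperimentName(EXPERIMENT);

        mlflow.withActiveRun("guarded-run", run -> {
            try {
                run.logParam("step", "train"); // illustrative work
                // ... actual training / logging code ...
            } catch (RuntimeException e) {
                // withActiveRun() would swallow this silently and mark the run
                // FAILED, so surface the cause explicitly before propagating
                e.printStackTrace();
                throw e;
            }
        });
    }
}
```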
 
So with that, keep the AI/ML projects flowing!

Sunday, April 6, 2025

Model Context Protocol (MCP)

A standardization protocol for AI agents. It enables them to act, inter-connect, process, parse and invoke functions; in other words, to crawl, browse, search, click, etc.

MCP re-uses the well-known client-server architecture, using JSON-RPC.

Apps use MCP Clients -> MCP Servers (which abstract the underlying service)
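As a concrete flavour of the wire format, an MCP tools/call request is a plain JSON-RPC 2.0 message (tools/call is a real MCP method; the tool name and arguments below are hypothetical):

```java
public class McpRequestSketch {
    // Build a minimal JSON-RPC 2.0 request by hand, just to show the shape
    static String toolsCallRequest(int id, String tool, String argsJson) {
        return "{\"jsonrpc\":\"2.0\",\"id\":" + id
             + ",\"method\":\"tools/call\""
             + ",\"params\":{\"name\":\"" + tool + "\",\"arguments\":" + argsJson + "}}";
    }

    public static void main(String[] args) {
        // A hypothetical "search" tool exposed by some MCP server
        System.out.println(toolsCallRequest(1, "search", "{\"query\":\"open source\"}"));
    }
}
```

The MCP server replies with a matching JSON-RPC response carrying the tool's result.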

Kind of API++ for an AI world!

Saturday, April 5, 2025

Open Weight AI

Inspired by Open Source Software (OSS), yet not fully open...

With Open Weight (OW), typically the final model weights (i.e. the fully trained model) are made available under a liberal licence: free to reuse, modify, distribute, non-discriminating, etc. This helps anyone wanting to start with the fully trained Open Weight model and apply it, fine-tune or modify its weights (LoRA, etc.), or augment it (RAG) for custom use-cases. To that extent, OW has a share & reuse philosophy.
 
On the other hand, with respect to training data, data sources, detailed architecture, optimization details, and so on, OW diverges from OSS by not making it compulsory to share any of these. These remain closed source with the original devs, with a bunch of pros & cons. Copyright material, IP protection, commercial gains, etc. are some stated advantages for the original devs/org. But lack of visibility for the wider community, and no white-box evaluation of model internals, biases, checks & balances, are among the downsides of not allowing a full peek into the model.

Anyway, that's the present, a time of great flux. As models stabilize over time OW may tend towards OSS...

References

  • https://openweight.org/    
  • https://www.oracle.com/artificial-intelligence/ai-open-weights-models/
  • https://medium.com/@aruna.kolluru/exploring-the-world-of-open-source-and-open-weights-ai-aa09707b69fc
  • https://www.forbes.com/sites/adrianbridgwater/2025/01/22/open-weight-definition-adds-balance-to-open-source-ai-integrity/
  • https://promptengineering.org/llm-open-source-vs-open-weights-vs-restricted-weights/
  • https://promptmetheus.com/resources/llm-knowledge-base/open-weights-model
  • https://www.agora.software/en/llm-open-source-open-weight-or-proprietary/