WOLFcon 2024 - Understanding and Using AI Workflows with FOLIO

23 September 2024


Edge AI Module and AI Workflows

This workshop introduces two key components of FOLIO's AI integration: the Edge AI Module and AI Workflows.

The FOLIO Edge AI Module provides a set of API endpoints for different FOLIO modules. The backend module supports the following Generative AI services:

  • OpenAI's ChatGPT (requires an API token)
  • Anthropic Claude (requires an API token)
  • Google Gemini (requires an API token)
  • Locally hosted Llama model (through a hosted GPT4ALL or LLaMA.cpp instance)
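As a sketch of how a client might target one of these backends, the snippet below builds a request body for a hypothetical inference endpoint. The endpoint shape, field names, and backend identifiers are illustrative assumptions, not the Edge AI Module's documented API; the token rule mirrors the list above (hosted services need an API token, a local Llama instance does not).

```python
import json

# Hypothetical backend identifiers -- assumptions for illustration,
# not names defined by the Edge AI Module.
SUPPORTED_BACKENDS = {"chatgpt", "claude", "gemini", "llama"}

def build_inference_request(backend: str, prompt: str, api_token=None) -> dict:
    """Build a JSON body for a hypothetical Edge AI Module inference call."""
    if backend not in SUPPORTED_BACKENDS:
        raise ValueError(f"Unsupported backend: {backend}")
    body = {"backend": backend, "prompt": prompt}
    # Hosted services (ChatGPT, Claude, Gemini) require an API token;
    # a locally hosted Llama model (GPT4All / LLaMA.cpp) does not.
    if backend != "llama":
        if api_token is None:
            raise ValueError(f"{backend} requires an API token")
        body["api_token"] = api_token
    return body

print(json.dumps(build_inference_request("llama", "Summarize this record")))
```

In practice the module would add its own authentication and routing on top of a payload like this; the point here is only that the client chooses a backend and supplies a token when one is required.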

Existing API endpoints

Inventory Instance Record

Most of the development on the Edge AI Module so far has focused on the requirements of the Automated Metadata Generation Enrichment use case for FOLIO Inventory Instance records.
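To make the metadata-enrichment use case concrete, the sketch below prepares (without sending) an HTTP request asking for enriched metadata for one Instance record. The endpoint path and payload shape are assumptions for illustration; the instance fields (`title`, `contributors`) follow the FOLIO Inventory Instance record schema.

```python
import json
import urllib.request

def enrichment_request(base_url: str, instance_uuid: str, instance: dict):
    """Prepare (but do not send) a request asking a hypothetical Edge AI
    Module endpoint to suggest enriched metadata for an Instance record."""
    payload = json.dumps({"instance": instance}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/instances/{instance_uuid}/enrich",  # hypothetical path
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example: a minimal Instance record (UUID and host are placeholders).
req = enrichment_request(
    "https://edge-ai.example.edu",
    "8c9a2f00-0000-0000-0000-000000000000",
    {"title": "Parable of the Sower",
     "contributors": [{"name": "Butler, Octavia E."}]},
)
print(req.full_url)
```

The response from such an endpoint would typically carry suggested fields (for example, subject headings) for a cataloger to review before anything is written back to Inventory.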

AI Workflows

AI Workflows is a proof-of-concept open-source repository that runs on the Apache Airflow workflow technology stack.
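Airflow expresses each workflow as a Directed Acyclic Graph (DAG) of tasks. To keep the illustration dependency-free, the sketch below models the kind of pipeline such a DAG might express using only the standard library's `graphlib`; the task names are hypothetical, and in the actual repository they would be Airflow tasks/operators.

```python
from graphlib import TopologicalSorter

# Each key is a task; the set holds the tasks it depends on.
# These task names are illustrative assumptions, not the repository's DAGs.
dag = {
    "fetch_instance": set(),                      # pull the Instance record from FOLIO
    "generate_metadata": {"fetch_instance"},      # call the Edge AI Module
    "review_suggestions": {"generate_metadata"},  # human-in-the-loop check
    "update_instance": {"review_suggestions"},    # write approved metadata back
}

# A valid execution order respecting all dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)
# → ['fetch_instance', 'generate_metadata', 'review_suggestions', 'update_instance']
```

A linear chain like this is exactly the "simple enough" shape mentioned below as a candidate for migration; Airflow's value shows when DAGs branch, fan out over many records, and need retries or scheduling.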

What about mod-workflow?

The Edge AI Module is compatible with the new mod-workflow module in FOLIO, which will be available in the upcoming months. The current Directed Acyclic Graphs (DAGs) in AI Workflows are simple enough to be migrated to mod-workflow.

The purpose of AI Workflows is to support more complex DAGs and additional features beyond the planned functionality of mod-workflow. The Edge AI Module should be able to complement and support both workflow technologies.

Connection to FOLIO AI Use-cases

From the use cases hosted in the Edge AI Module repository wiki, a set of requirements is being formulated (thank you for the help during this workshop!) to prioritize development of the Edge AI Module.

TODO

  • Automate the deployment of the Edge AI Module as an API service in a FOLIO environment, using either Okapi or Kong
  • Increase unit test coverage
  • Add integration tests