<div align="center"> <img src="https://github.com/CODAIT/max-central-repo/raw/master/images/title.png"> </div>

:exclamation: We are Open Source and We Welcome Contributions :exclamation:
Our mission at CODAIT is to democratize AI: to make AI technologies accessible to practitioners who understand real-world problems, and to enable them to develop AI solutions that solve those problems.
The core technologies behind today’s AI systems rely heavily on open-source software projects. Going from raw data to training data to models to solutions requires many open technologies, and it’s crucial that these technologies not only work well, but work well together. Our developers and data scientists are continually improving these frameworks with our targeted open source contributions, making them work better both individually and as an integrated pipeline. Hence our name — the Center for Open-Source Data and AI Technologies.
Subscribe to our newsletters here to stay up to date with recent announcements.
Model Asset Exchange
The Model Asset Exchange on IBM Developer is a place for developers to find and use free, open-source, state-of-the-art deep learning models for common application domains. The curated list includes deployable models that you can run as a microservice locally or in the cloud on containerization platforms like Docker, Kubernetes, or OpenShift, as well as trainable models that you can train on your own data.
Models are licensed under the Apache 2.0 License.
Domains covered: Text, Vision, Audio, and Time-Series.
Models can be consumed via the options listed in the Model Consumption column of the tables below (for example, web apps, Node-RED, CodePen, or direct calls to the model's REST API, as in the sketch below).
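As an illustration of the microservice workflow, here is a minimal sketch of starting a deployable model and querying it from Python. It assumes the Object Detector is used; the container image name, the `/model/predict` route, and the `image`/`threshold` request fields follow that model's published API and may differ for other models, so check each model's README.

```python
# Hedged sketch: query a MAX model microservice from Python.
# Assumes the MAX Object Detector is already running locally, e.g.:
#   docker run -it -p 5000:5000 quay.io/codait/max-object-detector
# (registry/image name and request format may differ per model -- see the model's README)
import requests

MODEL_ENDPOINT = "http://localhost:5000/model/predict"  # common MAX prediction route

with open("example.jpg", "rb") as image_file:
    # Most vision models accept a multipart form upload; the field name
    # ("image") and the optional "threshold" parameter are assumptions here.
    response = requests.post(
        MODEL_ENDPOINT,
        files={"image": ("example.jpg", image_file, "image/jpeg")},
        data={"threshold": "0.7"},
    )

response.raise_for_status()
print(response.json())  # e.g. {"status": "ok", "predictions": [...]}
```

Each running model also serves an interactive Swagger UI (typically at http://localhost:5000) where the same endpoint can be explored in the browser.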
Contact Us
If you have any questions that you would like to discuss with us:
- For general discussion, you are welcome to open an issue in our issue tracker.
- For a question about a particular model, you are welcome to open an issue in the corresponding repository. You can find a list of the repositories here.
Contributing to the Model Asset Exchange
We welcome anyone who wants to contribute to the Model Asset Exchange. Please review the contribution guidelines. This project adheres to the code of conduct mentioned here; by participating, you are expected to uphold this code.
- To start a general discussion, create an issue in this repo.
- For model-specific questions, go to the corresponding model repository and create an issue.
- To contribute code, documentation, or tests, please submit a pull request in the model's repository (a list is available).
- To contribute a model to MAX, here is a quick summary of the process (a minimal wrapping sketch follows this list):
  - Open an issue in this repo proposing the model.
  - Our maintainers will review your proposal.
  - If approved, wrap the model using the MAX-Skeleton as a guide.
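For orientation only, the sketch below shows roughly what wrapping a model with the MAX-Skeleton looks like. The `maxfw.model.MAXModelWrapper` base class, its `_pre_process`/`_predict`/`_post_process` hooks, and the metadata fields are assumptions based on the skeleton's layout; follow the MAX-Skeleton's own templates when contributing.

```python
# Hedged sketch of a MAX-Skeleton-style core/model.py (names are assumptions;
# defer to the actual MAX-Skeleton templates when contributing a model).
from maxfw.model import MAXModelWrapper


class ModelWrapper(MAXModelWrapper):
    # Metadata surfaced by the microservice's metadata endpoint
    MODEL_META_DATA = {
        'id': 'my-example-model',
        'name': 'My Example Model',
        'description': 'Example model wrapped for the Model Asset Exchange',
        'type': 'Example',
        'source': 'https://example.com/model',   # hypothetical source link
        'license': 'Apache 2.0',
    }

    def __init__(self, path='assets'):
        # A real wrapper would load serialized weights from the assets/ directory;
        # this sketch uses a trivial stand-in "model" so the file stays self-contained.
        self.model = lambda text: {'length': len(text)}

    def _pre_process(self, inp):
        # Convert the raw API input (e.g. uploaded text or image bytes) into model input
        return inp

    def _predict(self, x):
        # Run inference with the underlying framework
        return self.model(x)

    def _post_process(self, result):
        # Convert raw model output into the JSON-friendly API response
        return result
```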
We use GitHub pull requests for tracking requests and bugs; please direct any questions to our Slack channel.
Model Information
Deployable Models
Deployable and Trainable Models
Domain | Model | Framework | Training Dataset for Deployable Model | Application | Model Consumption |
---|---|---|---|---|---|
NLP | Text Sentiment Classifier | TensorFlow | IBM Claim Stance Dataset | Sentiment Analysis | |
NLP | Named Entity Tagger | Keras | Groningen Meaning Bank (GMB) Dataset | Named Entity Recognition | |
NLP | Question Answering | TensorFlow | SQuAD 1.1 Dataset | Question and Answer | |
NLP | Word Embedding Generator | TensorFlow | Random Text | Word Embeddings | |
Vision | Object Detector | TensorFlow | COCO | Object Detection | WebApp <br> Node-RED <br> CodePen |
Vision | ResNet-50 | Keras | ImageNet | Image Classification | |
Vision | Image Segmenter | TensorFlow | VOC2012 ~10k images | Semantic Image Segmentation | Demo <br> WebApp <br> Node-RED <br> CodePen |
General resources
- Check the current status of the Model Asset Exchange ecosystem here.
- MAX Framework: Python package that contains common code shared across all MAX models (link)
- MAX Skeleton: Docker-based deployment skeleton for deep learning models on the Model Asset Exchange (link)
- MAX Training Framework: WML training framework library for the Model Asset Exchange (link)
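To show how the MAX Framework and MAX Skeleton fit together, here is a rough, hedged sketch of the kind of service entry point the skeleton builds on top of `maxfw`. The `MAXApp` class, its `add_api`/`run` methods, and the `api`/`config` imports are assumptions drawn from the skeleton's layout, so defer to the MAX-Skeleton repository for the actual template.

```python
# Hedged sketch of a MAX-Skeleton-style app.py entry point (names are assumptions).
from maxfw.core import MAXApp

from api import ModelPredictAPI                       # hypothetical: your model's predict endpoint
from config import API_TITLE, API_DESC, API_VERSION   # hypothetical: service metadata

# MAXApp wires up the Flask/Swagger application with the standard MAX routes
max_app = MAXApp(API_TITLE, API_DESC, API_VERSION)
max_app.add_api(ModelPredictAPI, '/predict')  # served under /model/predict
max_app.run()                                 # listens on port 5000 inside the container
```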
Tutorials
- Get started with the Model Asset Exchange
- Deploy deep learning models on Red Hat OpenShift
- Deploy MAX models to the cloud with Kubernetes
- Leverage deep learning in IBM Cloud Functions
- Leverage deep learning in your Node-RED flows
- Use Node-RED Node Generator to create new nodes from APIs and services
Blogs
- Get an introduction to the Model Asset Exchange on IBM Developer
- Open source and AI at IBM
- Expanding the reach of the IBM Model Asset eXchange
- An introduction to the internals of the Model Asset eXchange
- Where are my new models for NLP? They’re here!
- Running MAX Deep Learning models on Raspberry Pi
Slides and Video Recordings
Slides
Videos
- Video - Image Cropping Web App (Nick Kasten) (based on the demo application code pattern)
  About: Use a free, open-source deep learning model to detect different types of objects in an image, then interact with them in a drag-and-drop web application.
- Video - Object Detector Web App demo (Alex Bozarth)
  About: The IBM Model Asset eXchange (MAX) has given application developers without data science experience easy access to prebuilt machine learning models. This web app uses the Object Detector from MAX and creates a simple web UI that displays bounding boxes around detected objects in an image and lets you filter the objects based on their label and the probability given by the model.
- Video - Lightning Talk: IBM Code Model Asset Exchange (Brendan Dwyer)
  About: This talk walks you through the process of building the Model Asset Exchange.
- About: In this talk, we’ll break down the challenges a domain expert faces today in applying AI to real-world problems. We’ll talk about the challenges that a domain expert needs to overcome in order to go from “I know a model of this type exists” to “I can tell an application developer how to apply this model to my domain.”
- Video - Deploying Machine Learning Models in Practice (Nick Pentreath)
  About: The talk will cover various options for the most popular and widely used ML libraries, including MLeap, TF Serving, and open standards such as PMML, PFA, and the recently announced ONNX for deep learning. I will also introduce Aardpfark, initially covering Spark ML pipelines, as well as experimental work for exporting Spark ML pipelines to TensorFlow graphs for use with TF Serving.
- Video - Lessons Learned Building an Open Deep Learning Model Exchange (Nick Pentreath)
  About: This talk walks you through the process of building MAX and shares the challenges and problems encountered, the solutions developed, and the lessons learned, along with best practices for cross-framework, standardized deep learning model training and deployment.
- About: The talk covers the Model Asset Exchange and the Data Asset Exchange, and explains the steps for wrapping a custom model using the MAX-Framework.