Sample iOS application for models exported from Custom Vision Service

Published by: Adam Behringer
Last updated: 2017-09-14

This sample application demonstrates how to take a model exported from the Custom Vision Service in the CoreML format and add it to a template iOS 11 application for real-time image classification.
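
To give a sense of what the template does, here is a minimal sketch of classifying a camera frame with a Core ML model through the Vision framework. It assumes the Fruit class that Xcode generates from the bundled Fruit.mlmodel and a capture pipeline that supplies pixel buffers; the sample's actual ViewController wiring may differ.

```swift
import Vision
import CoreVideo

final class ClassificationHandler {
    // Wrap the Xcode-generated Core ML model (here: Fruit.mlmodel) for Vision.
    private lazy var visionModel: VNCoreMLModel? = try? VNCoreMLModel(for: Fruit().model)

    // Classify one camera frame, e.g. a CVPixelBuffer from an AVCaptureSession.
    func classify(pixelBuffer: CVPixelBuffer) {
        guard let model = visionModel else { return }
        let request = VNCoreMLRequest(model: model) { request, _ in
            // Results come back sorted by confidence; report the top label.
            guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
            print("\(top.identifier): \(top.confidence)")
        }
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try? handler.perform([request])
    }
}
```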

Getting Started

Prerequisites

  • Xcode 9 beta
  • iOS device running iOS 11 beta
  • An account at Custom Vision Service

Quickstart

  • Clone the repository and open the project in Xcode
  • Launch the application on your iOS device

Replacing the sample model with your own classifier

The model provided with the sample recognizes some fruit (apples, bananas, coconuts, oranges, passionfruit, pineapples, strawberries). To replace it with your own model exported from the Custom Vision Service, do the following, then build and launch the application:
    1. Create and train a classifier with the Custom Vision Service. You must choose a "compact" domain, such as General (compact), to be able to export your classifier. If you have an existing classifier you want to export instead, change its domain by clicking the gear icon at the top right to open Settings. In Settings, choose a "compact" domain, Save, and Train your project.
    2. Export your model from the Performance tab. Select an iteration trained with a compact domain and an "Export" button will appear. Click Export, then iOS, then Export again. Click the Download button when it appears, and a .mlmodel file will download. (You can also do all of this programmatically with the Custom Vision Service Training API; see the first sketch after this list.)
    3. Drop your .mlmodel file into your Xcode project.
    4. Replace Fruit.mlmodel with the name of your model in ViewController.swift (see the second sketch below).
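
If you want the export step scripted rather than clicked through, here is a hedged sketch of the Training API calls. The host region, API version (v3.0 here), and exact paths are assumptions based on the REST reference and may differ for your resource; the key and IDs are placeholders. Check the current Training API documentation before relying on this.

```swift
import Foundation

// Placeholders: substitute your own training key, project ID, and iteration ID.
let trainingKey = "YOUR_TRAINING_KEY"
let projectId = "YOUR_PROJECT_ID"
let iterationId = "YOUR_ITERATION_ID"

// Assumption: v3.0 Training API on the South Central US endpoint; adjust for your resource.
let base = "https://southcentralus.api.cognitive.microsoft.com/customvision/v3.0/training"
let exportURL = URL(string: "\(base)/projects/\(projectId)/iterations/\(iterationId)/export?platform=CoreML")!

// POST requests a CoreML export of the trained iteration.
var request = URLRequest(url: exportURL)
request.httpMethod = "POST"
request.setValue(trainingKey, forHTTPHeaderField: "Training-Key")

URLSession.shared.dataTask(with: request) { data, _, _ in
    // The response describes the pending export. Poll the same path with GET
    // until the export status is "Done", then fetch the .mlmodel file from
    // the downloadUri field in the response.
    if let data = data, let body = String(data: data, encoding: .utf8) {
        print(body)
    }
}.resume()
```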
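
For step 4, the change is a single line in ViewController.swift. Assuming your exported file is named Produce.mlmodel (a hypothetical name used here for illustration), Xcode generates a Produce class and the Vision model wrapper changes like this:

```swift
import Vision

// Before: the bundled sample classifier generated from Fruit.mlmodel.
// let model = try? VNCoreMLModel(for: Fruit().model)

// After: your own classifier. "Produce" is the hypothetical class Xcode
// generates from Produce.mlmodel; use whatever name matches your file.
let model = try? VNCoreMLModel(for: Produce().model)
```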

Screenshot

The demo application includes a fruit recognition model. This is a screenshot.

Resources