Create an iOS app that uses built-in and custom classifiers

This code pattern is part of the Watson Visual Recognition learning path.

Summary

In this developer code pattern, you use IBM Watson™ Visual Recognition to showcase built-in and custom classifiers through an iOS app written in Swift. A user opens the app on an iOS device and chooses among the available classifiers (faces, explicit, food, and so on), including custom classifiers. The Watson Visual Recognition service on IBM Cloud classifies the image and returns the results to the app.

Description

The app in this code pattern has support for the following features of Watson Visual Recognition:

  • General — Watson Visual Recognition’s default classifier returns confidence scores for an image against thousands of classes.
  • Food — A classifier intended for images of food items.
  • Explicit — Returns the confidence that an image is inappropriate for general use.
  • Custom classifier(s) — Gives users the ability to create their own classifiers.
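As a rough illustration of how the app might invoke these classifiers, here is a minimal sketch using the Watson Swift SDK. The API key, version date, and image file name are placeholders, and the exact method signatures can vary between SDK releases, so treat this as an outline rather than the pattern's actual implementation.

```swift
import VisualRecognition

// Placeholder credentials; use the values from your own
// IBM Cloud Visual Recognition service instance.
let visualRecognition = VisualRecognition(
    version: "2018-03-19",   // API version date (illustrative)
    apiKey: "YOUR_API_KEY"
)

// Classify a bundled image with the default (General) classifier.
// "sample.jpg" is a hypothetical resource in the app bundle.
guard let imageURL = Bundle.main.url(forResource: "sample", withExtension: "jpg") else {
    fatalError("Missing sample image")
}

visualRecognition.classify(imagesFile: imageURL) { response, error in
    if let error = error {
        print("Classification failed: \(error)")
        return
    }
    // Print each returned class name with its confidence score.
    guard let images = response?.result?.images else { return }
    for image in images {
        for classifier in image.classifiers {
            for classification in classifier.classes {
                print("\(classification.className): \(classification.score ?? 0)")
            }
        }
    }
}
```

To target the food or explicit classifiers instead, the SDK lets you pass a different classifier ID to the same `classify` call; custom classifiers are addressed the same way once trained.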

Flow

Custom classifier flow architecture

  1. Clone the repo.
  2. Install dependencies with Carthage.
  3. Set up Watson Visual Recognition credentials.
  4. Run the app with Xcode.
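At the command line, the steps above look roughly like the following. The repository URL and project file name are placeholders; the detailed instructions on GitHub (linked below) have the actual values.

```shell
# 1. Clone the repo (substitute the real repository URL from GitHub).
git clone <repository-url>
cd <repository-directory>

# 2. Install dependencies with Carthage.
carthage bootstrap --platform iOS

# 3. Set up Watson Visual Recognition credentials: copy the API key
#    from your IBM Cloud service instance into the app's configuration.

# 4. Open the project in Xcode, then build and run on a device or simulator.
open <project-name>.xcodeproj
```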

Instructions

Ready to check it out? See all the detailed instructions on GitHub.

Conclusion

This code pattern showcased built-in and custom Watson Visual Recognition classifiers on IBM Cloud through an iOS app written in Swift. The code pattern is part of the Watson Visual Recognition learning path. To continue with the learning path, look at the next step, Build a custom visual recognition model and deploy to an iOS app.

Source: https://developer.ibm.com/patterns/create-ios-app-uses-builtin-custom-classifiers/
