PencilKit Meets Core ML

Source: Deep Learning on Medium

Setting Up

Setting up the canvas

Setting up the PKCanvasView takes just a few lines: create the view, add it to the hierarchy, and pin it with Auto Layout constraints, as the following code shows:

let canvasView = PKCanvasView(frame: .zero)
canvasView.backgroundColor = .black
canvasView.translatesAutoresizingMaskIntoConstraints = false
view.addSubview(canvasView)

NSLayoutConstraint.activate([
    canvasView.topAnchor.constraint(equalTo: navigationBar.bottomAnchor),
    canvasView.bottomAnchor.constraint(equalTo: view.bottomAnchor),
    canvasView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
    canvasView.trailingAnchor.constraint(equalTo: view.trailingAnchor)
])

Setting our tool picker

The following code shows how to set up the PKToolPicker UI. It must run once the view is attached to a window, which is why it lives in viewDidAppear:

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    guard let window = view.window,
          let toolPicker = PKToolPicker.shared(for: window) else { return }
    toolPicker.setVisible(true, forFirstResponder: canvasView)
    toolPicker.addObserver(canvasView)
    canvasView.becomeFirstResponder()
}

Setting our navigation bar buttons

The navigation bar was already added in the storyboard. In the following code, we add a few action buttons to it:

func setNavigationBar() {
    if let navItem = navigationBar.topItem {
        let detectItem = UIBarButtonItem(title: "Detect", style: .done, target: self, action: #selector(detectImage))
        let clearItem = UIBarButtonItem(title: "Clear", style: .plain, target: self, action: #selector(clear))
        navItem.rightBarButtonItems = [clearItem, detectItem]
        navItem.leftBarButtonItem = UIBarButtonItem(title: "", style: .plain, target: self, action: nil)
    }
}


The left bar button is where the final predicted output is displayed.
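The two selectors wired up above can be sketched as follows. This is a minimal sketch, not the article's exact implementation: the model name (MNISTClassifier) is an assumption — substitute whatever Core ML model you've added to the project. The drawing is rendered to an image, classified through Vision, and the top prediction is written into the left bar button.

```swift
import PencilKit
import Vision

// Resets the canvas to an empty drawing.
@objc func clear() {
    canvasView.drawing = PKDrawing()
}

@objc func detectImage() {
    // Render the current drawing to a UIImage.
    let image = canvasView.drawing.image(from: canvasView.bounds, scale: 1.0)

    // MNISTClassifier is a placeholder for your own Core ML model class.
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MNISTClassifier().model) else { return }

    let request = VNCoreMLRequest(model: model) { [weak self] request, _ in
        guard let result = request.results?.first as? VNClassificationObservation else { return }
        DispatchQueue.main.async {
            // Display the top prediction in the left bar button.
            self?.navigationBar.topItem?.leftBarButtonItem?.title = result.identifier
        }
    }
    try? VNImageRequestHandler(cgImage: cgImage).perform([request])
}
```

Rendering the drawing with `PKDrawing.image(from:scale:)` keeps the capture independent of the on-screen zoom level; Vision then handles resizing the image to whatever input size the model expects.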