airasoul.

Exploring Machine Learning and AI with Hugging Face

This post explores some of the fundamental tasks we are currently able to perform using AI models and the Hugging Face Inference API.

Hugging Face develops tools for building applications using machine learning. The Hugging Face Hub is a platform hosting around half a million open source models. Finding the appropriate model is simplified by a documentation project called Tasks, which describes each problem space, lists the available models, and shows how to use the Inference API with examples in Python and JavaScript.

Hugging Face provides a unified API for accessing and using pre-trained language models.

Let's start

The source code for this post can be found here; pull the repo and follow along with these examples.

exploring-machine-learning-and-ai-with-hugging-face

After cloning the above repository, let's install our single dependency, the Hugging Face Inference API.

npm install

The Hugging Face API works without an API key but is rate limited, so I suggest you register an account and create a key. The following config file holds the key.

examples/config.js

export const key = 'YOUR_KEY'

Translation

tasks/translation

The following model translates text into another language.

node examples/translate.js

import { HfInference } from '@huggingface/inference'
import { key } from './config.js'

const translate = async (model, text) => {
  const hf = new HfInference(key)
  return await hf.translation({
    model: model,
    inputs: text,
  })
}

// Helsinki-NLP/opus-mt-en-es Spanish
// Helsinki-NLP/opus-mt-en-de German
// Helsinki-NLP/opus-mt-en-fr French

const response = await translate("Helsinki-NLP/opus-mt-en-fr", 'hello')
console.log(response)
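The opus-mt model ids in the comments above follow a common naming convention, Helsinki-NLP/opus-mt-&lt;source&gt;-&lt;target&gt;, using ISO 639-1 language codes. As a sketch (this helper is hypothetical, not part of the repo), the model id can be derived from the language pair — though not every pair is published on the Hub:

```javascript
// Build a Helsinki-NLP opus-mt model id from ISO 639-1 language codes.
// Not every language pair exists on the Hub, so check the Helsinki-NLP
// organisation page before relying on a pair.
const opusMtModel = (source, target) => `Helsinki-NLP/opus-mt-${source}-${target}`

console.log(opusMtModel('en', 'fr')) // Helsinki-NLP/opus-mt-en-fr
```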

Text to Speech

tasks/text-to-speech

The following model converts text to speech and saves the result as a WAV audio file.

node examples/textToSpeech.js

import fs from 'fs'
import { HfInference } from '@huggingface/inference'
import { key } from './config.js'

const textToSpeech = async (text) => {
  const hf = new HfInference(key)
  const blob1 = await hf.textToSpeech({
    model: 'espnet/kan-bayashi_ljspeech_vits',
    inputs: text
  })

  const buffer1 = Buffer.from( await blob1.arrayBuffer() );
  fs.writeFileSync('audio.wav', buffer1)
}

await textToSpeech('The development of full artificial intelligence could spell the end of the human race….It would take off on its own, and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded. Stephen Hawking')

Summarization

tasks/summarization

The following model allows us to summarise text, constrained to a minimum/maximum length.

node examples/textSummarization.js

import { HfInference } from '@huggingface/inference'
import { key } from './config.js'

const textSummarization = async (text) => {
  const hf = new HfInference(key)
  return hf.summarization({
    model: 'facebook/bart-large-cnn',
    inputs: text,
    parameters: {
      max_length: 200,
      min_length: 100,
      do_sample: true
    }
  })
}

const summary = await textSummarization('A Love Supreme is an album by American jazz saxophonist John Coltrane. He recorded it in one session on December 9, 1964, at Van Gelder Studio in Englewood Cliffs, New Jersey, leading a quartet featuring pianist McCoy Tyner, bassist Jimmy Garrison and drummer Elvin Jones. A Love Supreme was released by Impulse! Records in January 1965. It ranks among Coltrane\'s best-selling albums and is widely considered his masterpiece.')

console.log(summary)

Token Classification

tasks/token-classification

The following model performs token classification, labelling named entities such as people, organisations and locations in a text string.

node examples/classification.js

import { HfInference } from '@huggingface/inference'
import { key } from './config.js'

const classify = async (text) => {
  const hf = new HfInference(key)
  return hf.tokenClassification({
    model: 'dbmdz/bert-large-cased-finetuned-conll03-english',
    inputs: text
  })
}

const classification = await classify('A Love Supreme is an album by American jazz saxophonist John Coltrane')

console.log(classification)
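The response is an array of entity objects, each with an entity_group label, a confidence score, the matched word, and its start/end offsets. A small helper (a sketch; the sample response below is illustrative, not actual model output) can pull out confident entities of a given type:

```javascript
// Illustrative sample of the token-classification response shape.
const sampleResponse = [
  { entity_group: 'MISC', score: 0.99, word: 'American', start: 30, end: 38 },
  { entity_group: 'PER', score: 0.99, word: 'John Coltrane', start: 57, end: 70 }
]

// Keep only entities of the given type whose confidence clears a threshold.
const entitiesOfType = (response, type, minScore = 0.9) =>
  response
    .filter((entity) => entity.entity_group === type && entity.score >= minScore)
    .map((entity) => entity.word)

console.log(entitiesOfType(sampleResponse, 'PER')) // [ 'John Coltrane' ]
```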

Text Generation

tasks/text-generation

The following model generates a text response to a prompt.

node examples/textGeneration.js

import { HfInference } from '@huggingface/inference'
import { key } from './config.js'

const textGeneration = async (text) => {
  const hf = new HfInference(key)
  return hf.textGeneration({
    model: 'gpt2',
    inputs: text
  })
}

const generated = await textGeneration('What is the meaning of life')
console.log(generated)

Text to Image

tasks/text-to-image

The following model generates an image from a text prompt and saves it in PNG format.

node examples/textToImage.js

import fs from 'fs'
import { HfInference } from '@huggingface/inference'
import { key } from './config.js'

const generateImage = async (text, negative) => {
  const hf = new HfInference(key)
  const blob = await hf.textToImage({
    inputs: text,
    model: 'runwayml/stable-diffusion-v1-5',
    parameters: {
      negative_prompt: negative,
    }
  })
  const buffer = Buffer.from( await blob.arrayBuffer() );
  fs.writeFileSync('image.png', buffer)
}

await generateImage('An old black and white photo of an ai', 'pixelated blurry')

Image to Text

tasks/image-to-text

The following model allows us to infer a text caption for an image.

node examples/inferCaptionFromImage.js

import fs from 'fs'
import { HfInference } from '@huggingface/inference'
import { key } from './config.js'

const inferCaptionFromImage = async (path) => {
  const hf = new HfInference(key)
  return hf.imageToText({
    data: fs.readFileSync(path),
    model: 'nlpconnect/vit-gpt2-image-captioning'
  })
}

const text = await inferCaptionFromImage('./image.png')
console.log(text)

Image to Text (from a URL)

tasks/image-to-text

The following model allows us to infer a text caption for an image fetched from a URL.

node examples/inferCaptionFromImage.js

import { HfInference } from '@huggingface/inference'
import { key } from './config.js'

const inferCaptionFromImage = async (url) => {
  const hf = new HfInference(key)
  const model = 'Salesforce/blip-image-captioning-base'
  const response = await fetch(url)
  const data = await response.blob()
  return await hf.imageToText({ data, model })
}

const caption = await inferCaptionFromImage('https://i.cbc.ca/1.4209267.1500324221!/fileImage/httpImage/coltrane-1.png')
console.log(caption)

Hugging Face Inference Endpoints

Hugging Face also offers Inference Endpoints, a secure production solution to easily deploy Machine Learning models on infrastructure managed by Hugging Face.

huggingface.co/inference-endpoints

In order to test this we can use the Hosted Inference API, which is free to use and rate limited.

node examples/inference.js

import { HfInference } from '@huggingface/inference'
import { key } from './config.js'

const inference = new HfInference(key)
const endpoint = inference.endpoint('https://api-inference.huggingface.co/models/bert-base-uncased')
const text = await endpoint.fillMask({inputs:"The answer to the universe is [MASK]."})
console.log(text)
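The fill-mask response is an array of candidate completions, each with a score, the predicted token_str, and the full sequence. As a sketch (the sample below is illustrative, not real model output), picking the most likely completion is a matter of comparing scores:

```javascript
// Illustrative sample of the fill-mask response shape.
const candidates = [
  { score: 0.02, token_str: 'life', sequence: 'the answer to the universe is life.' },
  { score: 0.05, token_str: 'no', sequence: 'the answer to the universe is no.' },
  { score: 0.01, token_str: 'yes', sequence: 'the answer to the universe is yes.' }
]

// Return the candidate with the highest score.
const topPrediction = (results) =>
  results.reduce((best, candidate) => (candidate.score > best.score ? candidate : best))

console.log(topPrediction(candidates).token_str) // no
```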