
Using OpenCLIP for Image Search and Automatic Captioning

How LAION used more data and new ML training techniques to improve image and text embeddings for various applications

open-clip cover image

By Robert A. Gonsalves

You can read my article on Medium.

The source code and generated images are released under the CC BY-SA license.
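
For a quick feel of how CLIP-style image search works before opening the Colab notebooks, here is a minimal sketch that uses the open_clip library to rank a few local images against a text query by cosine similarity of their embeddings. The ViT-B-32 model, the laion2b_s34b_b79k pretrained tag, and the image file names are illustrative assumptions, not necessarily the exact settings used in the notebooks.

```python
# Minimal CLIP-style image search sketch. The model name, pretrained tag,
# and image file names are illustrative assumptions, not the exact
# settings from the Colab notebooks.
import torch
import open_clip
from PIL import Image

# ViT-B-32 trained on LAION-2B; any open_clip model/tag pair works the same way.
model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-B-32", pretrained="laion2b_s34b_b79k")
tokenizer = open_clip.get_tokenizer("ViT-B-32")
model.eval()

# Hypothetical local images to search over.
paths = ["dog.jpg", "cat.jpg", "beach.jpg"]
images = torch.stack([preprocess(Image.open(p).convert("RGB")) for p in paths])
query = tokenizer(["a photo of a dog playing on the beach"])

with torch.no_grad():
    image_feats = model.encode_image(images)
    text_feats = model.encode_text(query)

# Cosine similarity: normalize both sides, then take dot products.
image_feats = image_feats / image_feats.norm(dim=-1, keepdim=True)
text_feats = text_feats / text_feats.norm(dim=-1, keepdim=True)
scores = (image_feats @ text_feats.T).squeeze(1)

# List the images from most to least similar to the text query.
for idx in scores.argsort(descending=True).tolist():
    print(f"{paths[idx]}: {scores[idx].item():.3f}")
```

Because the image embeddings do not depend on the query, they can be computed once and cached, so searching a larger collection reduces to a single matrix multiply per query.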

Google Colabs

Acknowledgements


  • A. Radford et al., CLIP, Learning Transferable Visual Models From Natural Language Supervision (2021)
  • M. Cherti et al., OpenCLIP, Reproducible scaling laws for contrastive language-image learning (2022)
  • G. Couairon et al., Embedding Arithmetic of Multimodal Queries for Image Retrieval (2022)
  • S. Wang and P. Kanwar, BFloat16: The secret to high performance on Cloud TPUs (2019)
  • J. Yu et al., CoCa: Contrastive Captioners are Image-Text Foundation Models (2022) (see the captioning sketch below)
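
Since the repository also covers automatic captioning, here is a rough sketch of generating a caption with a CoCa model through the open_clip interface. The coca_ViT-L-14 model name and the mscoco_finetuned_laion2B-s13B-b90k pretrained tag are assumptions taken from the public open_clip releases and may not match what the Colab notebook uses.

```python
# Rough CoCa captioning sketch via open_clip. The model name and pretrained
# tag are assumptions and may differ from the Colab notebook's settings.
import torch
import open_clip
from PIL import Image

model, _, transform = open_clip.create_model_and_transforms(
    "coca_ViT-L-14", pretrained="mscoco_finetuned_laion2B-s13B-b90k")
model.eval()

# Hypothetical input image.
image = transform(Image.open("beach.jpg").convert("RGB")).unsqueeze(0)

with torch.no_grad():
    generated = model.generate(image)  # caption token ids

# Decode the token ids and strip the special start/end markers.
caption = open_clip.decode(generated[0])
caption = caption.split("<end_of_text>")[0].replace("<start_of_text>", "").strip()
print(caption)
```

CoCa checkpoints in open_clip also expose encode_image and encode_text, so in principle one model can serve both captioning and retrieval.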


Citation

To cite this repository:

@software{open-clip,
  author  = {Gonsalves, Robert A.},
  title   = {Using OpenCLIP for Image Search and Automatic Captioning},
  url     = {https://github.com/robgon-art/open-clip},
  year    = 2023,
  month   = feb
}
