12 posts tagged with "ai"

Jesse Thompson · 6 min read

If you've been toying around in the AI space over the past few months, you've probably heard of Ollama. Ollama is a tool for running various LLMs locally on your own hardware, and currently supports a bunch of open models from Google, Facebook and independent sources.

Besides the basic terminal chat interface, Ollama exposes an API you can call from your favourite programming language. This means you can build your very own LLM-powered apps!
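
As a quick taste, here's a minimal sketch in Python of calling Ollama's local HTTP API, which listens on port 11434 by default. The model name and prompt are placeholders for illustration:

```python
# Minimal sketch: ask a locally running Ollama instance a question over its HTTP API.
# Assumes Ollama is running on the default port and the "llama3" model has been pulled.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",            # any model you've pulled with `ollama pull`
        "prompt": "Why is the sky blue?",
        "stream": False,              # return the full answer as one JSON object
    },
)
print(response.json()["response"])
```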

Let's say we've built the next killer LLM app: ChatWich, which lets you chat with your sandwich. People love it when you show it off on your laptop, but personally visiting every customer with your computer in hand is getting tiring, and the travel bills are starting to outweigh the (awesome) frequent flyer miles you're earning.

It's time to move to the cloud.

Brian Morrison II · 9 min read

Generative AI is a fantastic tool for quickly creating images from prompts.

One issue with many of these platforms is that they don't store the generated images in a way that makes them easy to retrieve later. Often you have to save an image immediately after it's generated, otherwise it's gone. Luckily, Stability offers an API for generating images programmatically, and Tigris is the perfect place to store those images for retrieval.
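
Before getting into the full Fly.io deployment, here's a rough sketch of the core idea in Python: request an image from Stability's REST API, then drop the result into a Tigris bucket over its S3-compatible API. The engine ID, bucket name, and prompt are assumptions for illustration, not the exact code from the article; check the current Stability and Tigris docs for specifics.

```python
# Sketch: generate an image with the Stability REST API and store it in Tigris.
# Assumes STABILITY_API_KEY plus AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY (your
# Tigris credentials) are set in the environment, and the bucket already exists.
import base64
import os

import boto3
import requests

# 1. Ask Stability to generate an image from a text prompt.
resp = requests.post(
    "https://api.stability.ai/v1/generation/stable-diffusion-xl-1024-v1-0/text-to-image",
    headers={
        "Authorization": f"Bearer {os.environ['STABILITY_API_KEY']}",
        "Accept": "application/json",
    },
    json={
        "text_prompts": [{"text": "a watercolor painting of a lighthouse"}],
        "height": 1024,
        "width": 1024,
    },
)
resp.raise_for_status()
image_bytes = base64.b64decode(resp.json()["artifacts"][0]["base64"])

# 2. Upload the result to Tigris, which speaks the S3 API.
s3 = boto3.client(
    "s3",
    endpoint_url="https://fly.storage.tigris.dev",  # Tigris S3-compatible endpoint
)
s3.put_object(
    Bucket="my-generated-images",  # hypothetical bucket name
    Key="lighthouse.png",
    Body=image_bytes,
    ContentType="image/png",
)
```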

In this article, you’ll learn how to deploy an app to Fly.io that will allow you to generate images using the Stability API and automatically store them in a Tigris bucket.