Chris Padilla/Blog
My passion project! Posts spanning music, art, software, books, and more. Equal parts journal, sketchbook, mixtape, dev diary, and commonplace book.
Peg O My Heart
Another swing at chord melody!
City Sunset
The Retrieval-Augmented Generation Pattern for AI Development
Yes, ladies and gentlemen, a post about developing with AI!
If your team is looking to incorporate an LLM into your services, the first challenge to overcome is how to do so in a cost-effective way.
Chances are, your business is already focused on a specific product domain, with resources dedicated to building that solution. That alone points you towards an off-the-shelf model you can integrate with through an API.
With your flavor of LLM picked, the next set of challenges centers on getting it to answer questions meaningfully from your business data. LLMs need to be told how to respond to requests, what data to draw on when forming their answers, and even what to do when they're tempted to guess.
The way forward is through prompt engineering, with the help of Retrieval-Augmented Generation.
Retrieval-Augmented Generation
The simplified procedure for RAG goes as follows:
- A request is made to your app with the message "How many Tex-Mex Restaurants are in Dallas?"
- Your application gathers context. For example, we may make a query to our DB for a summary of all restaurants in the area.
- We provide a summary of the context and instructions to the LLM in a prompt.
- We send the response along to the user.
That's an overly simplified walkthrough, but it should already get you thinking about the details involved in those steps, depending on your use case.
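To make that concrete, here's a minimal sketch of those four steps in Python. The helpers query_restaurant_summary and call_llm are placeholders I'm assuming for illustration, standing in for your own data layer and whichever LLM API you choose.

def query_restaurant_summary(city: str) -> str:
    # Placeholder for a real DB query that summarizes restaurant data.
    return f"A summary of all restaurants in {city}."


def call_llm(prompt: str) -> str:
    # Placeholder for a request to your chosen LLM provider.
    return "An answer drafted from the provided context."


def answer_question(question: str) -> str:
    # 1. The request arrives with the user's question.
    # 2. Gather context from our own data.
    context = query_restaurant_summary(city="Dallas")
    # 3. Pair instructions and that context in a prompt.
    prompt = (
        "Answer using only the context below. "
        "If the answer isn't in the context, say you don't know.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )
    # 4. Send the LLM's response back to the user.
    return call_llm(prompt)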
Another thing this pattern accounts for: requests to an API are not inherently stateful. The chat window of an AI app will remember our previous messages, but my API request to that third party won't automatically. We have to store and retrieve that context ourselves.
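As a rough sketch of that bookkeeping (reusing the call_llm placeholder from above, with an in-memory dict standing in for whatever storage you'd actually use):

conversation_history: dict[str, list[str]] = {}


def chat(session_id: str, user_message: str) -> str:
    history = conversation_history.setdefault(session_id, [])
    history.append(f"User: {user_message}")
    # Each request is stateless, so the prior turns have to be re-sent in the prompt.
    reply = call_llm("\n".join(history) + "\nAssistant:")
    history.append(f"Assistant: {reply}")
    return reply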
AI Agents
It's worth noting that step 2 may even require an LLM to parse the question and then interact with an API to gather data. There's still a fair amount of complexity to work through in developing these solutions. This is where you may lean on an AI Agent: an LLM that parses a request and determines whether a tool is required, such as pinging your internal APIs.
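As a loose illustration only (not any particular framework's API, and again reusing the call_llm placeholder), that decision loop might look something like this:

def restaurant_api_tool(city: str) -> str:
    # Placeholder for pinging an internal API.
    return f"Restaurant data for {city} from our internal service."


def run_agent(question: str) -> str:
    # Ask the model whether it needs a tool before answering.
    decision = call_llm(
        "If you need restaurant data to answer, reply exactly with "
        "'TOOL: <city>'. Otherwise, answer directly.\n\n"
        f"Question: {question}"
    )
    if decision.startswith("TOOL:"):
        city = decision.split(":", 1)[1].strip()
        tool_result = restaurant_api_tool(city)
        # Hand the tool output back to the model for the final answer.
        return call_llm(f"Question: {question}\nTool result: {tool_result}")
    return decision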
Prompt Engineering is emerging as a role and craft all its own, and there are many nuances to doing it well.
LangChain
The workflow is already so common that there's a framework ready to spin up and take care of the heavy lifting for you. LangChain (stylized as 🦜🔗) is just that tool.
For a hands-on experience building a RAG application on rails, their docs on building a chatbot are a good starting place.
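To give a feel for the ergonomics, here's about the smallest possible call through LangChain's OpenAI integration. This assumes the langchain-openai package is installed and an OPENAI_API_KEY is set; the model name and prompt are just example values.

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")
response = llm.invoke(
    "Context: a summary of restaurant data pulled from our DB.\n"
    "Question: How many Tex-Mex restaurants are in Dallas?"
)
print(response.content)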
For a more complex agentic tool, LangGraph opens up the hood for more control and plays nicely with LangChain when needed.
Campfire Folk Intro
Gather round, and listen to this tale...
Just a bit of noodling between practicing longer pieces.
Calm Sky
Extending Derived Class Methods in Python
Polymorphism! A sturdy pillar in the foundation of Object-Oriented Programming. At its simplest, it's the ability to change the implementation of specific methods on derived classes.
At face value, that could mean entirely rewriting the method. But what if we want a bit more nuance? What if we want to extend the method instead of replacing it entirely?
I'll continue on my Vehicle example from my previous post on polymorphism, this time in Python:
from abc import ABC


class Vehicle(ABC):
    def __init__(self, color: str):
        if not color:
            raise ValueError("Color string cannot be null")
        self._passengers = []
        self.color = color

    def load_passenger(self, passenger: str):
        # Logic to load passenger
        self._passengers.append(passenger)

    def move(self):
        # Some default code for moving
        print("Moving 1 mile North")
I've created an Abstract Base Class that serves as a starting point for any derived classes. Within it, I've defined a method move() that moves the vehicle North by 1 mile. All children will have this method available automatically.
Now, if I want to override that method, it's as simple as declaring a method of the same name in my child classes:
class Car(Vehicle):
    def move(self):
        print("Driving 1 mile North")


class Boat(Vehicle):
    def move(self):
        print("Sailing 1 mile North")
In the case that I want to extend the functionality, we can use super() to do so:
class Car(Vehicle):
    def move(self):
        super().move()
        print("Pedal to metal!")


class Boat(Vehicle):
    def move(self):
        super().move()
        print("Raising the sail!")
The benefit here is that I can pass all the same arguments received in the method call on either child instance straight through to the default implementation in the parent, and then build on them in the child's own custom additions.
car = Car("red")
car.move()
# Moving 1 mile North
# Pedal to metal!
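To see that argument pass-through in action, here's a quick sketch extending load_passenger instead; the seat-buckling detail is just an example of child-specific behavior.

class Car(Vehicle):
    def load_passenger(self, passenger: str):
        # Forward the same argument to the parent's default implementation...
        super().load_passenger(passenger)
        # ...then layer on the child's own behavior.
        print(f"Buckling {passenger} into the car")

car = Car("red")
car.load_passenger("Chris")
# Buckling Chris into the car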
Angel Eyes
'Scuse me while I disappear~ 🌫️
Parkway
All the Things You Are Chord Melody
My first swing at chord melody! Love this tune even more on guitar.
Night Lake
The veil is thin. 👻
Squeaked in one Inktober drawing this year! Very much directly inspired by the energetic ink work of Violaine Briat's Lil' Dee.
TypedDicts in Python
So much of JavaScript/TypeScript is massaging data returned from an endpoint through JSON. TypeScript has the lovely ability to type the objects and their properties that come through.
While Python's types aren't enforced as strictly as TypeScript's, we have this benefit built into the type hinting system.
It's easier shown than explained:
from typing import Union, TypedDict
from datetime import datetime
class Concert(TypedDict):
    """
    TypedDict for concert dictionaries.
    """

    id: str
    price: int
    artist: str
    show_time: Union[str, datetime]
All pretty straightforward. We're declaring a class that inherits from the TypedDict base class. Then we annotate our expected properties on that class.
It's ideal to store a class like this in its own types directory in your project.
A couple of nice ways to use this:
First, you can use this in your methods where you are expecting to receive this dictionary as an argument:
def get_concert_ticket_details(
    self, concert: Union[Concert, None] = None
) -> tuple[list[str], set[str]]:
    # Do work
    ...
You can also directly create a dictionary from this class through instantiation.
show_time = datetime.now()  # example value

concert = Concert(
    id="28",
    price=50,
    artist="Prince",
    show_time=show_time,
)
The benefit of both is, of course, that your editor can warn you when a property does not match the expected shape.
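For example, assuming you run a type checker such as mypy (or have Pylance watching in your editor), a mismatched value gets flagged right away; the values below are just for illustration:

wrong_concert: Concert = {
    "id": "28",
    "price": "fifty",  # flagged: expected "int", got "str"
    "artist": "Prince",
    "show_time": "2024-06-01",
}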
More details on Python typing in this previous post. Thorough details available in the official docs.
Sonny Rollins – Oleo
Today I learned that this jazz standard is named after margarine. Yum!
From the Other Side
Optimistic UI in Next.js with SWR
I remember the day I logged onto ye olde facebook after a layout change. A few groans later, what really blew me away was the immediacy of my comments on friends' posts. I was used to having to wait for a page refresh, but not anymore! Once I hit submit, I could see my comment right on the page with no wait time.
That's the power of optimistic UI. Web applications maintain a high level of engagement and native feel by utilizing this pattern. While making an update to the page, the actual form submission is being sent off to the server in the background. Since this is more than likely going to succeed, it's safe to update the UI on the page.
There are a few libraries that make this process a breeze in React. One option is Vercel's SWR, a React hook for data fetching.
Data Fetching
Say I have a component rendering data about several cats. At the top of my React component, I'll fetch the data with the useSWR
hook:
const {data, error, isLoading, mutate} = useSWR(['cats', queryArguments], () => fetchCats(queryArguments));
If you're familiar with TanStack Query (formerly React Query), this will look very familiar. (See my previous post on data fetching in React with TanStack Query for a comparison.)
To the hook, we pass our key, which will identify this result in the cache, then the function where we fetch our data (a server action in Next), and optionally some options (left out above).
That returns to us our data from the fetch, errors if failed, and the current loading state. I'm also extracting a bound mutate
method for when we want to revalidate the cache. We'll get to that in a moment.
useSWRMutation
Now that we have data, let's modify it. Next, I'm going to make use of the useSWRMutation
hook to create a method for changing our data:
const { trigger: insertCatMutation } = useSWRMutation(
  ['cats', queryArguments],
  // `insertCat` (assumed here to be a server action) receives the new cat via `arg` and persists it
  (key, { arg }) => insertCat(arg),
  {
    optimisticData: [generateNewCat(), ...(data ?? [])],
    rollbackOnError: true,
    revalidate: true
  }
);
Note that I'm using the same key to signal that this pertains to the same set of data in our cache.
As you can see, we have an option that we can pass in for populating the cache with our optimistic data. Here, I've provided an array that manually adds the new item through the function generateNewCat()
. This will add my new cat data to the front of the array and will show on the page immediately.
I can then call that trigger function, insertCatMutation, in any of my handlers:
await insertCatMutation(generateNewCat());
Bound Mutate Function
Another way of accomplishing this is with the mutate
method that we get from useSWR
. The main benefit is we now get to pass in options when calling the mutate method.
const handleDeleteCat = async (id) => {
  try {
    // Call delete server action
    await deleteCat({ id });

    // Mutate the cache
    await mutate(() => fetchCats(queryArguments), {
      // We can also pass a function to `optimisticData`.
      // removeCat returns the current cat data with the targeted cat removed.
      optimisticData: () => removeCat(id),
      rollbackOnError: true,
      revalidate: true
    });
  } catch (e) {
    // Here we'll want to manually handle errors
    handleError(e);
  }
};
This is advantageous in situations like deletion, where we want to pass along the specific piece of data targeted for removal. It can then go both to our server action and to SWR for the optimistic update.
For even more context on using optimistic UI, you can find a great example in the SWR docs.
Antônio Carlos Jobim – Once I Loved... Continued!
Finishing out this bossa 🎙️