
LLMing like the Rent Due

An exploration in Retrieval-Augmented Generation through AI-supported rapid prototyping

I recently made the decision to transition out of my job to explore new avenues in my career. I’ve been looking to use my skills for something closer to my own passions. This is the first in a series of projects meant to find that next step: a form of digital wayfinding toward the next chapter of my career.

You can find the app here:

Rents Due   

What was the inspiration behind the project, and how does it align with your goals?

One thing I always strive for is to get maximum value from minimal resources. In this case, the initial goal was to create my first LLM app. As for what that LLM would do, I let the universe guide me. While I was working on the fundamentals of the app, I came across LaRussell’s “Rent Due” challenge for his upcoming album, and it sparked the idea. I’ve always wanted to work in service of artists and creatives. LaRussell is a local hero in the town where I grew up and currently reside, so it made sense to honor him and his continuing legacy with this app. This choice gave me good motivation to work on the project, a target deadline, some assets, design direction, and a realistic use case (an album rollout).

Day 1 - Designs + Project setup


What technologies were used in developing the project, and how were they integrated?

My design tool of choice is Figma; it’s easy to use and gives my clients a simple way to provide feedback. It’s a tool I’m comfortable with, unlike the backend I chose.

My backend of choice is typically Ruby on Rails for more robust apps, but most LLM libraries seemed to use Python. I ended up going with Flask since it’s lightweight and fairly standard. I had never personally used it, but I also wanted to test how quickly I could get up and running with the help of tools like Copilot and ChatGPT.
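To give a sense of what "lightweight" means here, this is a minimal sketch of what a Flask endpoint for an app like this might look like. The route name, payload shape, and `pick_lyric` helper are hypothetical placeholders, not the app's actual implementation:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def pick_lyric(testimony: str) -> str:
    """Placeholder for the LLM call. The real app would hand the
    testimony to the retrieval + generation pipeline here."""
    return "placeholder lyric"

@app.route("/api/lyric", methods=["POST"])
def lyric():
    # Read the user's "Rent Due" testimony from the JSON request body
    testimony = request.get_json().get("testimony", "")
    return jsonify({"lyric": pick_lyric(testimony)})
```

A single file like this is enough to stand up the whole backend, which is a big part of why Flask suits a rapid prototype.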

The front end seemed simple enough, so it’s mostly vanilla HTML, JS, and CSS, with a bit of Tailwind for utility classes.

Lastly, I used ChatGPT as the large language model and LangChain to assist with the retrieval-augmented generation aspect.
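The core retrieval-augmented generation loop is simple in outline: index the lyric corpus, retrieve the snippets most relevant to the user's input, and hand only those to the model. Here is a dependency-free sketch of that retrieve-then-prompt pattern; the toy word-overlap scoring and the sample lyrics are purely illustrative (a real pipeline like the app's LangChain setup would use embeddings and a vector store), and none of the lines below are actual LaRussell lyrics:

```python
def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank snippets by naive word overlap with the query.
    Stand-in for embedding similarity search."""
    q_words = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda s: len(q_words & set(s.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Stuff only the retrieved snippets into the LLM prompt,
    instead of the entire lyric catalog."""
    context = "\n".join(retrieve(query, corpus))
    return (
        f"Lyrics:\n{context}\n\n"
        f"User testimony: {query}\n"
        "Pick the most fitting lyric."
    )

# Made-up snippets for illustration only
lyrics = [
    "grinding every day till the rent is due",
    "sunshine on the block, we made it through",
    "count my blessings up, never counting losses",
]
print(build_prompt("I grind every day", lyrics))
```

The payoff of this pattern is that the model only ever sees a small, relevant slice of the corpus, which keeps the prompt short and the answers grounded in the source material.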

Day 2 - Proof of concept


What challenges did you encounter during the project, and how did you overcome them?

Surprisingly, implementing the AI workflow was the least of my concerns. The most difficult task was getting a prompt to do what I needed it to do. I ran into issues like the LLM choosing from only a subset of the lyrics, which caused many repeated answers. Another issue was it selecting tone-deaf lyrics in certain contexts, such as when a user mentioned death. It took some massaging, but I finally arrived at a prompt that I felt did what I needed and covered all the bases.
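As a rough illustration of the kind of guardrails that massaging produced, here is a hypothetical prompt template addressing both failure modes described above; the wording is mine, not the app's actual prompt:

```python
# Hypothetical template; the real prompt's wording differs.
PROMPT_TEMPLATE = """You are selecting a lyric that matches a fan's testimony.

Rules:
- Consider the FULL list of lyrics below, not just the first few.
- Do not repeat any lyric listed under "Already used".
- If the testimony mentions grief, loss, or another sensitive topic,
  choose a respectful, uplifting lyric.

Lyrics:
{lyrics}

Already used:
{used}

Testimony: {testimony}
"""

def render_prompt(lyrics, used, testimony):
    """Fill the template, tracking previously served lyrics so the
    model is explicitly told not to repeat itself."""
    return PROMPT_TEMPLATE.format(
        lyrics="\n".join(lyrics),
        used="\n".join(used) or "(none)",
        testimony=testimony,
    )
```

Feeding the model its own past picks via the `used` list is one common way to cut down on repeat answers without changing the model itself.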

Day 3 - Basic Frontend + LLM tuning

What was some user-facing feedback you received, and how did you incorporate those learnings?

The first pieces of feedback I got were:
“How do I use this?”
“What do I do?”

So with the first iteration, I added a small typing animation that displays different “Rent Due” testimonies to give users an idea of what I was looking for.
I ended the example testimonies with some concrete instructions, hoping this would stay in line with the creative direction while giving users just enough to understand what to do.


Overall, it was a fun, short, and sweet passion project. Feel free to hit me up about the finer details or any other questions you may have. This was my first time working on an LLM app and really using AI to empower my process. I learned a lot, and I feel proud of what I’ve made.